How big are the biggest neural networks?
The largest neural network built so far is GPT-3, which holds the record with 175 billion parameters, roughly 100 times more than its predecessor GPT-2.
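To make "parameters" concrete, here is a minimal sketch of how the trainable parameters of a dense network are counted (one weight matrix plus one bias vector per layer). The layer sizes below are made up for illustration and have nothing to do with GPT-3's actual architecture:

```python
def count_params(layer_sizes):
    """Total trainable parameters of a fully connected network,
    given the width of each layer (input layer first)."""
    total = 0
    for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
        total += n_in * n_out + n_out  # weight matrix + bias vector
    return total

# An illustrative small network: 784 inputs, two hidden layers, 10 outputs.
print(count_params([784, 512, 512, 10]))  # → 669706
```

Parameter counts grow roughly with the product of adjacent layer widths, which is why scaling up width and depth makes counts explode into the billions.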
Jack Clark of Import AI reports that well-resourced, government-linked researchers in China apparently want to build some absolutely huge neural network models. Notably, a number of those researchers (seemingly all ethnic Chinese) are affiliated with Western institutions, and the knowledge gained there could help them win an AI arms race against the West.
Again: we are likely already in an AI arms race with China, and they appear to be planning a Manhattan-style project to potentially solve AGI. All of this plays out in the open, which makes it appear innocuous.
Can a neural network be too large?
Theoretical analyses of artificial neural networks sometimes consider the limiting case in which layer width becomes large or infinite. In this limit, simple analytic statements can be made about neural network predictions, training dynamics, generalization, and loss surfaces.
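One reason this limit is analytically tractable: by the central limit theorem, the output of a randomly initialized one-hidden-layer network at a fixed input converges to a Gaussian with stable variance as width grows, provided the output is scaled by 1/sqrt(width). The sketch below is only a numeric illustration of that scaling behavior; the widths, input, and activation are arbitrary choices, not part of any specific paper's setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def random_net_output(width, x, n_samples=2000):
    """Sample outputs f(x) = v . tanh(W x) / sqrt(width) over many
    random networks, with W rows scaled by 1/sqrt(input_dim)."""
    d = x.shape[0]
    W = rng.normal(0, 1, size=(n_samples, width, d)) / np.sqrt(d)
    v = rng.normal(0, 1, size=(n_samples, width))
    h = np.tanh(W @ x)                    # hidden activations, (n_samples, width)
    return (v * h).sum(axis=1) / np.sqrt(width)

x = np.ones(3)
for width in (10, 100, 1000):
    outs = random_net_output(width, x)
    # The empirical variance settles to a constant as width grows,
    # instead of blowing up or vanishing.
    print(width, round(float(outs.var()), 3))
```

As width increases, the printed variances stabilize around a fixed value, which is the empirical signature of the infinite-width Gaussian limit the theory describes.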