Very large neural network models such as GPT-3, with many billions of parameters, are on the rise, but so far only big tech companies have the resources to train, deploy, and study them. This needs to change, say Stanford AI researchers, who call for investment in academic collaborations to build and study large neural networks.