This week: DeepMind's small Chinchilla model outperforms GPT-3 and other giant models; what’s next for protein-folding AIs; automating Spot; earbuds that watch your brain; and more!
More Than A Human
Born from Alphabet's “moonshot” division, NextSense aims to sell earbuds that can collect heaps of neural data—and uncover the mysteries of gray matter.
There is a trend among AI companies to build ever-bigger AI systems with tens or even hundreds of billions of parameters. Researchers at DeepMind took a closer look at this trend and concluded that model size can be traded for more training data. To test this idea, they created Chinchilla, an AI model much smaller than current state-of-the-art systems yet similar or better in performance. “The conclusion is clear: Current large language models are ‘significantly undertrained,’ which is a consequence of blindly following the scaling hypothesis — making models larger isn’t the only way toward improved performance,” writes Alberto Romero in this article.
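The size-versus-data tradeoff can be sketched with the paper's rough rule of thumb: train on about 20 tokens per parameter, with training compute estimated as C ≈ 6·N·D FLOPs (N parameters, D tokens). The constants below are round-number approximations for illustration, not the paper's exact fitted coefficients:

```python
# Back-of-the-envelope Chinchilla-style sizing: given a FLOPs budget,
# balance parameters and training tokens at roughly 20 tokens/parameter.
# Constants are approximations from the DeepMind paper, not exact values.

import math

TOKENS_PER_PARAM = 20      # Chinchilla's ~20:1 tokens-to-parameters ratio
FLOPS_PER_PARAM_TOKEN = 6  # standard C ~= 6*N*D training-compute estimate

def compute_optimal(flops_budget: float) -> tuple[float, float]:
    """Return (parameters, tokens) that roughly balance a FLOPs budget."""
    # C = 6 * N * (20 * N)  =>  N = sqrt(C / 120)
    n_params = math.sqrt(flops_budget / (FLOPS_PER_PARAM_TOKEN * TOKENS_PER_PARAM))
    n_tokens = TOKENS_PER_PARAM * n_params
    return n_params, n_tokens

# Chinchilla itself: ~70B parameters trained on ~1.4T tokens
params, tokens = compute_optimal(5.9e23)
print(f"params ~ {params/1e9:.0f}B, tokens ~ {tokens/1e12:.1f}T")
```

Plugging in Chinchilla's approximate training budget recovers its reported shape (~70B parameters, ~1.4T tokens), while GPT-3 at 175B parameters saw only ~300B tokens, far below this ratio, which is what "significantly undertrained" refers to.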
AlphaFold and RoseTTAFold, two AI systems that solved the protein-folding problem, have made a huge impact on the scientific community. This article explains the impact of the AI revolution on biology and how scientists are using these tools to advance our knowledge of proteins.
If you have ever wondered where to start your journey in brain-based AI, check this post out. It covers the basics: where to begin and which skills to acquire to start building artificial brains.
Researchers from the Oxford Robotics Institute show, at a high level, which tools and techniques they used to let Spot move autonomously from point A to point B while mapping and learning about its environment.
In this post, researchers from Amazon Robotics describe how they use virtual environments to train their robots and why they incorporate real-life tests: even the best virtual test cannot fully recreate the messiness of real-world physics.