The Data Exchange Podcast: Neil Thompson on the computational demands, economic costs, and environmental impact of AI.
Subscribe: Apple • Android • Spotify • Stitcher • Google • RSS.

In this episode of the Data Exchange I speak with Neil Thompson, Research Scientist at the Computer Science and Artificial Intelligence Laboratory (CSAIL) and the Initiative on the Digital Economy, both at MIT. I wanted Neil on the podcast to discuss a recent paper he co-authored entitled “The Computational Limits of Deep Learning” (summary version here). The paper provides estimates of the computation, economic cost, and environmental impact that come with increasingly large and more accurate deep learning models.
Download the 2020 NLP Survey Report and learn how companies are using and implementing natural language technologies.
The authors present estimates of the computational, economic, and energy resources needed to achieve three sets of error rates on standard benchmarks used by deep learning researchers. The results suggest that the path we are currently on is unsustainable. As an example, let’s look at some cost estimates (I previously created charts based on tables from this paper):
I asked Neil about the many hardware startups that offer accelerators for deep learning. While he acknowledges that specialized hardware can lead to amazing speedups, he believes these new hardware initiatives alone won’t suffice:
- The problem with hardware specialization is that it’s an example of decreasing marginal returns. … When you first specialize you take the thing that gets you the most advantage, then you take the next one, then the next one. Incrementally they become less and less impactful on your overall outcome. So I am very skeptical that these improvements will lead us to the kind of massive increases that we will need.
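Neil’s point can be sketched with a simple Amdahl’s-law-style calculation. In this hypothetical example (the workload fractions and the 10x speedup factor are my own illustrative assumptions, not figures from the paper), each round of specialization accelerates the largest remaining hot spot, and the marginal gain from each successive round shrinks:

```python
# Illustrative sketch (hypothetical numbers): each round of hardware
# specialization accelerates the next-largest remaining fraction of the
# workload, Amdahl's-law style, so each round helps less than the last.

def overall_speedup(fractions, accel):
    """Overall speedup when each workload fraction in `fractions` runs
    `accel` times faster; the remainder runs at baseline speed."""
    specialized = sum(fractions)
    return 1.0 / ((1.0 - specialized) + specialized / accel)

# Hypothetical workload: 50%, 25%, and 12.5% of runtime in three hot spots.
hotspots = [0.50, 0.25, 0.125]
accel = 10.0  # assume each specialized unit is 10x faster

prev = 1.0
for i in range(1, len(hotspots) + 1):
    s = overall_speedup(hotspots[:i], accel)
    print(f"after specializing {i} hotspot(s): {s:.2f}x overall "
          f"(marginal gain {s / prev:.2f}x)")
    prev = s
```

Under these assumptions the first round roughly doubles overall speed, while the third adds only about 50% more, even though each unit is individually 10x faster.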
Subscribe to our Newsletter:
We also publish a popular newsletter where we share highlights from recent episodes, trends in AI / machine learning / data, and a collection of recommendations.
Related content:
- A video version of this conversation is available on our YouTube channel.
- Responsible AI in Practice: A virtual event
- One Simple Chart: Computational Limits of Deep Learning
- Nir Shavit: “The combination of the right software and commodity hardware will prove capable of handling most machine learning tasks”
- Towards an infinite laptop
- 2020 NLP Industry Survey Report
- One Simple Graphic: companies that offer deep neural network accelerators
- Catching a ball in a cup: The two presentations I mentioned in the episode (RL vs. model predictive control) are embedded in this post by Ben Recht (go to the section: Why are we all sleeping on model predictive control?)
[Image: MareNostrum 4 supercomputer at Barcelona Supercomputing Center from Wikimedia]