“Training a Human Takes 20 Years of Food”: Sam Altman on How Much Power AI Consumes
As artificial intelligence continues to expand across industries, concerns about its environmental impact are growing. The energy required to train and run large AI models has become a major topic of debate. Recently, OpenAI CEO Sam Altman added a new dimension to this discussion with a provocative comparison: “Training a human takes 20 years of food.”
His comment has sparked widespread conversation about how we measure the energy cost of intelligence — both human and artificial.
The Rising Concern Over AI Energy Consumption

Modern AI systems require vast computational resources. Training large language models means processing enormous datasets in data centers that draw significant electricity. Critics argue that as AI adoption grows, so will:
- Carbon emissions linked to energy production
- Water usage for cooling data centers
- Pressure on global power grids
- The environmental footprint of large-scale computing infrastructure
These concerns are especially relevant as AI becomes integrated into daily life, from search engines and chatbots to automation and predictive analytics.
Altman’s Perspective: Reframing the Comparison
During a public discussion, Sam Altman responded to questions about AI’s power demands by offering a broader comparison. He suggested that raising and educating a human being also requires substantial energy. Over roughly 20 years, a person consumes food daily — food that itself requires energy to grow, transport, and prepare.
His argument was not that AI consumes no energy, but that comparisons should consider the full lifecycle cost of intelligence. If society measures the energy required to “train” a human, then AI’s training costs may not appear as extreme in relative terms.
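The lifecycle framing invites a back-of-envelope calculation. The sketch below estimates the dietary energy a person consumes over 20 years, using an assumed average intake of 2,000 kcal per day (a common reference figure, not a number from Altman's remarks); it also ignores the much larger energy spent growing, transporting, and preparing that food.

```python
# Back-of-envelope: dietary energy one person consumes over ~20 years.
# Assumed figure (not from the article): 2,000 kcal/day average intake.
KCAL_PER_DAY = 2_000
JOULES_PER_KCAL = 4_184          # 1 kcal = 4,184 J
DAYS = 20 * 365                  # ~20 years, ignoring leap days

total_joules = KCAL_PER_DAY * JOULES_PER_KCAL * DAYS
total_kwh = total_joules / 3.6e6  # 1 kWh = 3.6 MJ

print(f"~{total_kwh:,.0f} kWh of dietary energy over 20 years")
```

This works out to roughly 17 MWh of food energy alone, which is the kind of baseline Altman's comparison gestures at; the true lifecycle figure would be several times higher once food production is included.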
Altman also emphasized the importance of measuring efficiency per task. Once an AI model is trained, the energy required to answer an individual question can be relatively small. He implied that, measured by output per unit of energy, AI may perform some tasks more efficiently than humans.
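The per-task framing amortizes the one-time training cost across every query the model ever serves. The sketch below illustrates the arithmetic; every figure in it (training energy, lifetime query count, per-query inference energy, human time per task) is an assumption chosen for illustration, not a measured value.

```python
# Illustrative per-task energy comparison. All figures are assumptions
# for the sake of the arithmetic, not measured or published values.
TRAIN_KWH = 1_000_000      # assumed one-time training cost (1 GWh)
QUERIES = 1_000_000_000    # assumed lifetime queries served
INFER_WH = 0.3             # assumed marginal energy per query (Wh)

# Amortized training energy per query, plus inference energy.
amortized_wh = TRAIN_KWH * 1_000 / QUERIES + INFER_WH

# A human spending 5 minutes on the same task, at 2,000 kcal/day
# of dietary energy (2,000 kcal = 8.368 MJ, spread over 1,440 min).
human_wh = 2_000 * 4_184 / 3_600 / (24 * 60) * 5

print(f"AI: ~{amortized_wh:.1f} Wh/query, human: ~{human_wh:.1f} Wh/task")
```

Under these assumed numbers the model's per-query energy is an order of magnitude below the human's dietary energy for the same five minutes, which is the shape of the argument Altman was making; change the assumptions and the ratio moves, which is exactly why critics want transparent measurements.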
Criticism and Debate
Altman’s analogy has drawn mixed reactions. Critics argue that comparing human biological energy consumption with electrical energy used in computing oversimplifies a complex issue. Humans contribute to society in ways that go beyond task completion — socially, emotionally, and creatively — making direct comparisons challenging.
Others believe the analogy distracts from the need to reduce AI’s carbon footprint. Instead of debating whether humans also consume energy, they argue that technology companies should focus on minimizing environmental impact.
The debate highlights a broader question: how should society measure the true cost of intelligence?
The Push for Sustainable AI
Regardless of the analogy, there is broad agreement that AI development must become more sustainable. Efforts currently underway include:
- Transitioning data centers to renewable energy sources
- Designing more energy-efficient hardware
- Optimizing algorithms to reduce computational demand
- Increasing transparency around AI’s energy usage
As AI continues to scale, the industry will face growing pressure to balance innovation with environmental responsibility.
Conclusion
Sam Altman’s statement that “training a human takes 20 years of food” was meant to challenge the way people think about AI’s energy consumption. Whether one agrees with the comparison or not, it has intensified an important discussion about sustainability, efficiency, and the future of intelligent systems.
The real issue is not whether AI uses energy — it clearly does — but how that energy is sourced, managed, and optimized. As artificial intelligence becomes more deeply embedded in society, responsible development will determine whether it becomes a burden on global resources or a driver of smarter, more sustainable progress.
