A groundbreaking study by MIT Technology Review has shed light on the immense energy demands of modern artificial intelligence tools, exposing sharp contrasts in energy use across types of AI-generated content, from chatbot replies to AI video.
💬 ChatGPT: From Light Queries to Heavy Loads
- Energy per response: 114 to 6,706 joules
- 🔌 Equivalent to running a microwave for 0.1 to 8 seconds
- Why the gap? Smaller models use fewer parameters, saving energy but reducing answer quality.
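The "0.1 to 8 seconds of microwave" comparison follows directly from the joule figures. A minimal sketch, assuming a roughly 800 W oven (the wattage is not stated in the article; ~800 W is what makes its equivalence work out):

```python
# Convert the per-response energy range into microwave run time.
# MICROWAVE_WATTS is an assumption (~800 W household oven), chosen
# to match the article's "0.1 to 8 seconds" equivalence.

MICROWAVE_WATTS = 800

for joules in (114, 6_706):
    seconds = joules / MICROWAVE_WATTS
    print(f"{joules} J ≈ {seconds:.1f} s of microwave use")
```

This prints roughly 0.1 s for the lightest responses and about 8.4 s for the heaviest, matching the range above.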
🎬 AI Video Generation: A Massive Power Guzzler
- Five-second AI video: ~3.4 million joules
- That’s over 700x the energy used for a single AI-generated image
- Equivalent to running a microwave oven for 1+ hour
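The "1+ hour" figure is a straightforward unit conversion. A quick sanity check, again assuming a ~800 W microwave oven (an assumed wattage, not given in the article):

```python
# Express a five-second AI video's ~3.4 MJ as microwave run time,
# assuming a ~800 W oven (assumption).

VIDEO_JOULES = 3_400_000
MICROWAVE_WATTS = 800

minutes = VIDEO_JOULES / MICROWAVE_WATTS / 60
print(f"≈ {minutes:.0f} minutes of microwave use")
```

That works out to just over an hour, consistent with the claim above.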
🔍 Example Scenario: What Your Daily AI Use Costs
Using AI for:
- ✅ 15 chatbot interactions
- ✅ 10 image generations
- ✅ 3 short (5-second) video clips
➡️ Consumes ~2.9 kilowatt-hours
That’s roughly the energy needed to run a microwave oven for 3.5 hours.
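The ~2.9 kWh total can be reproduced from the per-item figures above. A back-of-the-envelope sketch, where the per-image energy is an assumption derived from the "over 700x" ratio (one image ≈ 1/700th of a five-second video):

```python
# Tally the daily-use scenario: 15 chatbot responses, 10 images,
# 3 five-second videos. Per-item energies come from the figures above;
# the image energy is an assumed value (video energy / 700).

JOULES_PER_KWH = 3.6e6

chat_j = 6_706            # upper-end ChatGPT response
video_j = 3_400_000       # five-second AI video
image_j = video_j / 700   # assumption from the "over 700x" ratio

total_j = 15 * chat_j + 10 * image_j + 3 * video_j
total_kwh = total_j / JOULES_PER_KWH

print(f"{total_kwh:.1f} kWh")  # ≈ 2.9 kWh
```

Note that the three video clips alone account for nearly all of the total, which is why video generation dominates the scenario.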
🇺🇸 The Bigger Picture: Data Center Impact
- U.S. data center electricity use has doubled since 2017
- In 2024, U.S. data centers used as much power as all of Thailand
- 📈 By 2028, AI workloads are projected to account for nearly 50% of that power
🔄 What Changed?
For years, improvements in server efficiency kept data center energy use roughly flat. AI’s rise, especially models requiring intense GPU computation, has since triggered an explosion in power consumption.
🧠 Key Takeaways
- LLMs like ChatGPT are relatively efficient, but still energy-intensive at scale.
- AI video generation is one of the most power-hungry content types.
- The global energy footprint of AI is no longer theoretical—it’s measurable, growing fast, and posing real infrastructure challenges.
🔋 As AI becomes embedded in daily life, efficiency innovations will be key—not just for cost savings, but to prevent an environmental bottleneck.