Joanna Stern of the Wall Street Journal explores the question: "How Much Energy Does Your AI Prompt Use? I Went to a Data Center to Find Out." Her findings are helpful but not surprising.
Estimated Energy Usage by AI Task:
- Text generation: 0.17-1.7 watt-hours (depending on model size)
- Image generation: About 1.7 watt-hours for a 1024×1024 image
- Video generation: 20-110 watt-hours for just 6 seconds of video
For context: I can turn off an 8 W LED lamp (60 W incandescent equivalent) for an hour and save roughly enough energy to create 5 images. Or, if you have a 4-ton AC (which draws roughly 4 kW while running), you could turn it off for one hour and generate about 40 videos.
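The comparisons above are simple watt-hour arithmetic. Here is a minimal sketch of the math, assuming the per-task estimates quoted earlier (1.7 Wh per image, roughly 100 Wh per video at the high end of the 20-110 Wh range) and an assumed 4 kW draw for a 4-ton AC:

```python
# Energy saved by switching devices off for one hour, in watt-hours.
led_watts = 8          # 8 W LED lamp
ac_watts = 4_000       # assumed draw for a 4-ton AC (~4 kW)

# Per-task energy estimates from the article, in watt-hours.
image_wh = 1.7         # one 1024x1024 image
video_wh = 100         # near the high end of the 20-110 Wh range

images_per_led_hour = led_watts / image_wh    # ~4.7, "roughly 5 images"
videos_per_ac_hour = ac_watts / video_wh      # 40 videos

print(f"LED off 1 hr  ~= {images_per_led_hour:.1f} images")
print(f"AC off 1 hr   ~= {videos_per_ac_hour:.0f} videos")
```

With the low end of the video range (20 Wh), the same AC hour would cover around 200 videos, so the "40 videos" figure is a conservative comparison.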
In terms of consumption, a gallon of gasoline contains 33.7 kilowatt-hours of energy, meaning I could ask ChatGPT nearly 100,000 questions for the same energy cost as driving 26 miles (one gallon's worth for the average 2022 model-year vehicle).
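The gasoline comparison works out the same way. A quick sketch, assuming roughly 0.34 Wh per text question (within the 0.17-1.7 Wh range quoted above) and an average 2022-model-year fuel economy of about 26 mpg:

```python
# One gallon of gasoline, expressed in watt-hours.
gallon_kwh = 33.7
gallon_wh = gallon_kwh * 1_000          # 33,700 Wh

# Assumed per-question energy for a text prompt (within the 0.17-1.7 Wh range).
question_wh = 0.34

# Assumed average fuel economy for a 2022 model-year vehicle.
avg_mpg = 26

questions_per_gallon = gallon_wh / question_wh   # ~99,000, "nearly 100,000"
miles_per_gallon = avg_mpg                        # 26 miles on that gallon

print(f"One gallon ~= {questions_per_gallon:,.0f} questions")
print(f"One gallon ~= {miles_per_gallon} miles of driving")
```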
I think we ought to be mindful of the environment and be good stewards of our planet, but it's also important to put these numbers in context. The potential scope of use is huge (7+ billion people), but energy consumption per request remains low and is declining as silicon improves.
Nvidia has seen a jump in energy efficiency with its latest Blackwell Ultra chips, according to Josh Parker, the company’s head of sustainability. “We’re using 1/30th of the energy for the same inference workloads that we were just a year ago,” Parker said.
We saw this with the shift from incandescent to LED light bulbs: the cost of lighting a building, in both energy used and dollars spent, is far lower today than it was 20 years ago. I have every reason to expect the same in computing, particularly in AI.