Altman takes a philosophical, at times mystically reverent, tone as he considers the future of AI. Opening with “We are past the event horizon; the takeoff has started.” gives the piece a certain rhetorical flair, though it feels a bit too exuberant.
Quibbles aside, there are some really interesting nuggets in the post:
- “we have recently built systems that are smarter than people in many ways, and are able to significantly amplify the output of people using them”
- “2025 has seen the arrival of agents that can do real cognitive work; writing computer code will never be the same. 2026 will likely see the arrival of systems that can figure out novel insights. 2027 may see the arrival of robots that can do tasks in the real world.”
- “We already hear from scientists that they are two or three times more productive than they were before AI.”
- “The rate of new wonders being achieved will be immense. It’s hard to even imagine today what we will have discovered by 2035;”
- “OpenAI is a lot of things now, but before anything else, we are a superintelligence research company”
And then there is the detail many of us were wondering about: electricity consumption:
“People are often curious about how much energy a ChatGPT query uses; the average query uses about 0.34 watt-hours, about what an oven would use in a little over one second, or a high-efficiency lightbulb would use in a couple of minutes. It also uses about 0.000085 gallons of water; roughly one fifteenth of a teaspoon.”
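
As a quick sanity check, here is a back-of-the-envelope version of those comparisons in Python. The oven and bulb wattages (1200 W and 10 W) are assumptions of mine, since the post doesn't state them; the per-query energy and water figures come straight from the quote.

```python
# Back-of-the-envelope check of the per-query figures quoted above.
QUERY_WH = 0.34             # watt-hours per average ChatGPT query (from the post)
QUERY_WATER_GAL = 0.000085  # gallons of water per query (from the post)
OVEN_W = 1200               # assumed oven power draw, watts
LED_BULB_W = 10             # assumed high-efficiency (LED) bulb, watts
TSP_PER_GALLON = 768        # 1 US gallon = 768 US teaspoons

oven_seconds = QUERY_WH / OVEN_W * 3600   # seconds the oven runs on 0.34 Wh
bulb_minutes = QUERY_WH / LED_BULB_W * 60 # minutes the bulb runs on 0.34 Wh
water_tsp = QUERY_WATER_GAL * TSP_PER_GALLON

print(f"Oven time for 0.34 Wh: {oven_seconds:.1f} s")     # ~1.0 s
print(f"LED bulb time:         {bulb_minutes:.1f} min")   # ~2.0 min
print(f"Water per query:       {water_tsp:.3f} tsp (~1/{1 / water_tsp:.0f} tsp)")
```

With those assumed wattages the numbers line up: roughly one second of oven time, about two minutes of LED bulb time, and about a fifteenth of a teaspoon of water per query.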