Picture New York City on a sweltering summer night: every air conditioner straining, subway cars humming underground, towers blazing with light. Now add San Diego at the peak of a record-breaking heat wave, when demand shot past 5,000 megawatts and the grid nearly buckled.
That’s almost the scale of electricity that Sam Altman and his partners say will be devoured by their next wave of AI data centers—a single corporate project consuming more power, every single day, than two American cities pushed to their breaking point.
The announcement marks a “seminal moment” that Andrew Chien, a professor of computer science at the University of Chicago, says he has long seen coming.
“I’ve been a computer scientist for 40 years, and for most of that time computing was the tiniest piece of our economy’s power use,” Chien told Fortune. “Now, it’s becoming a large share of what the whole economy consumes.”
He called the shift both exciting and alarming.