Microsoft CEO Satya Nadella stated that the artificial intelligence (AI) sector has now overcome the chip production bottleneck but faces a new and critical obstacle.
Satya Nadella lamented the lack of electricity
Nadella spoke with OpenAI CEO Sam Altman during a podcast appearance in the US. He said that Microsoft has thousands of AI GPUs in inventory but lacks the electrical power to run all of that hardware.

Nadella described the situation on the Bg2 Pod podcast on YouTube. “Right now, I have chips that I can’t find an outlet to plug in,” the CEO said, emphasizing that the problem isn’t chip supply but the energy capacity to power the chips.
The statement signals that AI infrastructure has hit a new limit: energy. While the chip production bottleneck has largely eased over the last two years, the power needs of data centers have become the industry’s primary concern.
Large tech companies, especially Microsoft, Google, and Amazon, are struggling to power the massive server farms they have built for AI models.
Nadella’s remarks echo NVIDIA CEO Jensen Huang’s earlier prediction that “there won’t be a computing surplus in the next two or three years.” Nadella was emphatic that energy access, not hardware production, is now the primary limiting factor.
On the podcast, Sam Altman also raised the idea that GPT-5- or GPT-6-level models could eventually run natively on low-power devices. “One day,” Altman said, “we’ll develop an incredible consumer device that can run these models entirely natively.”
However, analysts warn that this scenario poses a short-term risk to the billions of dollars being invested in AI data centers. If AI models begin running efficiently on local hardware, the value of the massive centers being built today could plummet. In such a scenario, some observers suggest, as much as $20 trillion in tech-stock market capitalization could be at risk.

