Nvidia CEO Jensen Huang recently said that AI has “advanced a millionfold” in the last 10 years. Since Nvidia is a hardware-focused company, Huang’s statement mainly refers to progress in the hardware that runs AI systems.
AI has improved a millionfold in 10 years
A comparison between the company’s Blackwell B200 processor, which it introduced in 2024, and the Pascal P100 from 2016 reveals the scale of this development. According to Nvidia, the B200 is approximately 20,000 times faster than the P100 in inference performance.

Nvidia also says the B200 is roughly 42,500 times more energy efficient per token. These figures show how large a gap a few hardware generations can open in AI processing power.
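To put these multipliers in perspective, the short sketch below converts each of the figures quoted above into an implied average year-over-year improvement factor. This is purely illustrative arithmetic based on the numbers cited in this article, not Nvidia’s own methodology.

```python
# Illustrative arithmetic only, using the figures quoted in this article:
# ~20,000x inference speedup and ~42,500x better energy per token between
# the 2016 Pascal P100 and the 2024 Blackwell B200, plus Huang's
# "millionfold in 10 years" claim. The script converts each multiplier
# into the constant per-year factor that would compound to the same total.

def implied_annual_factor(total_gain: float, years: int) -> float:
    """Return the constant yearly multiplier that compounds to total_gain."""
    return total_gain ** (1 / years)

claims = {
    "B200 vs P100 inference (8 years)": (20_000, 8),
    "B200 vs P100 energy per token (8 years)": (42_500, 8),
    "Huang's overall AI claim (10 years)": (1_000_000, 10),
}

for label, (gain, years) in claims.items():
    print(f"{label}: ~{implied_annual_factor(gain, years):.1f}x per year")

# Output:
# B200 vs P100 inference (8 years): ~3.4x per year
# B200 vs P100 energy per token (8 years): ~3.8x per year
# Huang's overall AI claim (10 years): ~4.0x per year
```

In other words, every one of these claims amounts to performance multiplying by roughly 3 to 4 every year, which is far faster than classic Moore’s Law scaling.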
The increase in AI performance is not limited to hardware. The development of large language models, combined with software and algorithm optimizations, is pushing overall system performance up exponentially. Companies like Nvidia are no longer just manufacturing processors; they are also building AI supercomputers that link thousands of these chips together and run them simultaneously.
For example, Elon Musk’s xAI is already operating a supercomputer built from 200,000 Hopper GPUs, and Musk has announced plans to expand the system to one million Blackwell GPUs.
Nvidia’s growth is not limited to its own projects. The company has also begun making significant AI investments in the UK, where a new research center will focus on areas such as robotics, environmental modeling, and materials science.
While Huang’s “millionfold improvement” claim may sound exaggerated, how far AI has come in a decade has surprised many. What do you think? Share your thoughts with us in the comments section below.