The AI boom took off with the launch of ChatGPT in late 2022. The suddenness of this revolution popularized AI accelerator GPUs such as the Nvidia H100. However, according to data released by Google today, things have started to turn around: for AI inference in particular, CPUs are now preferred over GPUs.
According to Google, CPU usage in AI inference has overtaken GPU usage
Artificial intelligence technology took a big leap forward with ChatGPT in late 2022. The wave of AI tools that followed, and the billion-dollar budgets allocated by big companies, upended the industry's balance. In this first era of AI, Nvidia and its H100 GPUs came to the fore; OpenAI and Microsoft in particular used Nvidia H100 GPUs in their AI servers.
GPUs are still far more effective for the training side of AI today. However, the main burden in the AI industry has shifted to inference, that is, producing model outputs. To make those outputs faster and more reliable, the CPU has come to the fore. Google Cloud product manager Brandon Royal explained the CPU's role in AI technology at the TechFieldDay event.
GPUs, or graphics processors, generally have a much higher core count than CPUs, which lets graphics cards run far more operations in parallel. Brandon Royal emphasized how important this ability to perform many tasks simultaneously is for AI training, where GPUs still dominate.
When it comes to AI inference, however, the balance shifts. Although CPUs cannot match graphics cards at massively parallel work, they offer much stronger performance when focusing on a single task at a time, which is closer to how individual inference requests are served.
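The throughput-versus-latency tradeoff behind this argument can be sketched with simple arithmetic. The numbers below are purely illustrative, not figures from Google's data: a GPU-like device is modeled as waiting for a large batch to fill before computing it, while a CPU-like device starts each request immediately.

```python
# Illustrative throughput vs. latency arithmetic (made-up numbers).
# A batched device (GPU-like) is efficient per batch but makes each
# request wait for the batch to fill; a serial device (CPU-like)
# handles one request at a time with no fill wait.

def batch_latency(fill_wait_s, batch_compute_s):
    """Per-request latency when requests wait for a batch to fill."""
    return fill_wait_s + batch_compute_s

def batch_throughput(batch_size, fill_wait_s, batch_compute_s):
    """Requests served per second when processed in batches."""
    return batch_size / (fill_wait_s + batch_compute_s)

# GPU-like: batch of 32, 0.50 s to fill the batch, 0.10 s to compute it.
gpu_lat = batch_latency(0.50, 0.10)            # 0.60 s per request
gpu_tput = batch_throughput(32, 0.50, 0.10)    # ~53 requests/s

# CPU-like: batch of 1, no fill wait, 0.05 s to compute one request.
cpu_lat = batch_latency(0.0, 0.05)             # 0.05 s per request
cpu_tput = batch_throughput(1, 0.0, 0.05)      # 20 requests/s

print(f"GPU-like: latency {gpu_lat:.2f} s, throughput {gpu_tput:.0f} req/s")
print(f"CPU-like: latency {cpu_lat:.2f} s, throughput {cpu_tput:.0f} req/s")
```

With these made-up numbers, the batched device wins on raw throughput (about 53 versus 20 requests per second), but the serial device answers a single request 12 times sooner. That is the kind of tradeoff the article describes when it says CPUs are favored for serving individual outputs.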
Brandon Royal explained that processing speed and responsiveness matter most for AI inference. Graphics cards deliver highly optimized batch processing, but they can lag behind on the latency of individual requests. For this reason, the industry has begun turning to CPUs for inference workloads.
Another reason for the rise of CPUs in inference is cost. Server CPUs are cheaper and more accessible than AI-oriented server GPUs such as the H100. Especially in the wake of the artificial intelligence craze, the Nvidia H100 and higher-end GPUs have become harder to obtain.
For this reason, AI servers have started turning to CPUs to get results faster. Whether this shift will unseat Nvidia remains an open question. Currently, Intel server CPUs with AVX-512 vector extensions are used in AI servers. Meanwhile, Meta has already rolled up its sleeves on its own AI processor, and AMD and Microsoft are also preparing to invest heavily in this area. In addition, OpenAI CEO Sam Altman is reportedly trying to raise a huge investment of 5 to 7 trillion dollars to develop AI processors.
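Since the article singles out AVX-512, the vector instruction set that makes these Intel server CPUs attractive for inference, here is a minimal sketch of how one might check whether the host CPU advertises it. This reads the flags line of Linux's /proc/cpuinfo, so it is Linux-specific and simply reports False elsewhere; the function name is my own, not from any library:

```python
# Check whether the host CPU advertises an AVX-512 feature flag.
# Linux-specific sketch: parses /proc/cpuinfo; returns False on
# systems without that file (e.g. macOS, Windows).

def has_avx512(cpuinfo_path="/proc/cpuinfo"):
    """Return True if any 'avx512*' flag appears in the CPU flags list."""
    try:
        with open(cpuinfo_path) as f:
            for line in f:
                if line.startswith("flags"):
                    flags = line.split(":", 1)[1].split()
                    return any(flag.startswith("avx512") for flag in flags)
    except OSError:
        pass  # /proc/cpuinfo not available on this platform
    return False

print("AVX-512 available:", has_avx512())
```

Frameworks that target CPU inference typically perform a check like this at startup and dispatch to AVX-512 code paths only when the flags are present, which is why the same binary can run on both older and newer Xeon parts.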