As the race in artificial intelligence continues at full speed, software and hardware companies alike are teaming up on joint projects. OpenAI, which has already proven itself with ChatGPT, aims to strengthen its position in AI through a partnership with NVIDIA.
OpenAI and NVIDIA are joining forces for a massive artificial intelligence system
Previously, NVIDIA supported OpenAI with 20,000 graphics cards for the GPT-4 model. If the partnership expands as planned, however, that number will soon look quite modest.
According to Wang Xiaochuan, founder of the Chinese search engine Sogou, OpenAI is already working on a more advanced AI model that uses highly sophisticated training methods. Reports suggest this model would require the power of ten million AI GPUs to operate at full capacity.
Although ten million AI GPUs may sound astronomical, it could prove a necessity for OpenAI’s future language models. As AI systems continue to grow in size and capability, they demand ever more hardware.
Considering NVIDIA’s current production capacity, reaching that number in the near future seems unlikely. At its current pace, the company can produce about one million AI GPUs per year, meaning it would take roughly ten years just to supply this one system for OpenAI.
However, NVIDIA is working with TSMC to ramp up supply. Beyond production capacity and cost, interconnecting such a vast number of GPUs would itself present a major engineering challenge for the company.
A system on this scale is difficult even to imagine. What are your thoughts on the collaborative AI efforts of NVIDIA and OpenAI? You can share your opinions in the comments section.