According to a report by TrendForce, NVIDIA, one of the key players in the AI market, stands to benefit directly from ChatGPT's growing demand for GPUs. While earlier estimates suggested the GPT model was powered by around 10,000-20,000 GPUs, the latest model is expected to use significantly more: TrendForce estimates that ChatGPT's newest model will require 30,000 GPUs, and demand may exceed even that number.
The research firm's projection of more than 30,000 GPUs is based on NVIDIA's A100, one of the fastest AI accelerators available; an eight-GPU DGX A100 system built around it delivers up to 5 petaFLOPS of AI performance. The actual number of GPUs required may vary depending on the type of chips used. Either way, the demand puts NVIDIA in a difficult position: under this kind of supply pressure, it must decide whether to prioritize AI GPUs or gaming GPUs.
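To put the 30,000-GPU figure in perspective, a rough back-of-envelope calculation sketches the aggregate compute involved. The per-GPU throughput below (~312 TFLOPS of dense FP16 tensor performance for an A100) is an assumption drawn from NVIDIA's published A100 specifications, not from the TrendForce report itself:

```python
# Back-of-envelope estimate of aggregate AI compute for the reported GPU count.
# Assumptions (not from the report): ~312 TFLOPS dense FP16 tensor throughput
# per A100, and TrendForce's projected count of 30,000 GPUs.
A100_FP16_TFLOPS = 312   # dense FP16 tensor-core throughput per A100 (spec sheet)
GPU_COUNT = 30_000       # TrendForce's projected demand

total_pflops = GPU_COUNT * A100_FP16_TFLOPS / 1_000  # convert TFLOPS -> PFLOPS
print(f"{total_pflops:,.0f} PFLOPS")  # ~9,360 PFLOPS, i.e. roughly 9.4 exaFLOPS
```

The figure is a theoretical peak; real-world utilization in large-scale training and inference clusters is substantially lower.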
NVIDIA's CEO, Jensen Huang, has repeatedly praised OpenAI's ChatGPT in interviews and on earnings calls. The company has made significant advances in the AI segment: its latest RTX Video Super Resolution feature leverages the AI capabilities of its consumer GPUs to enhance video and streaming quality across applications, and game developers and production houses are using its tensor cores through DLSS 3 to improve performance while maintaining visual fidelity.
ChatGPT's surge in popularity comes at an opportune time for NVIDIA, as demand for PC hardware slows amid rising inflation worldwide. Growing demand for AI compute, as in ChatGPT's case, could offset that shortfall, and industry experts anticipate that demand for NVIDIA's products may outstrip overall supply in the coming quarters.