OpenAI turns to Google’s AI chips to power its products, The Information reports

(Reuters) - OpenAI has recently begun renting Google’s artificial intelligence chips to power ChatGPT and other products, The Information reported on Friday, citing a person involved in the arrangement.

The move, which marks the first time OpenAI has used non-Nvidia chips in a meaningful way, shows the Sam Altman-led company’s shift away from relying on backer Microsoft’s data centers, potentially boosting Google’s tensor processing units (TPUs) as a cheaper alternative to Nvidia’s graphics processing units (GPUs), the report said.

As one of the largest purchasers of Nvidia’s GPUs, OpenAI uses AI chips to train models and also for inference computing, a process in which an AI model uses its trained knowledge to make predictions or decisions based on new information.

OpenAI hopes the TPUs, which it rents through Google Cloud, will help lower the cost of inference, according to the report.

However, Google, an OpenAI competitor in the AI race, is not renting its most powerful TPUs to its rival, The Information said, citing a Google Cloud employee.

Neither OpenAI nor Google immediately responded to Reuters requests for comment.

OpenAI planned to add Google Cloud service to meet its growing needs for computing capacity, Reuters had exclusively reported earlier this month, marking a surprising collaboration between two prominent competitors in the AI sector.

For Google, the deal comes as it expands external availability of its in-house TPUs, which were historically reserved for internal use. That push has helped Google win customers including Big Tech player Apple as well as startups like Anthropic and Safe Superintelligence, two OpenAI competitors launched by former OpenAI leaders.

(Reporting by Juby Babu in Mexico City; Editing by Alan Barona)