A recent research paper published by Apple has revealed that for its upcoming Apple Intelligence service, the company used Google’s Tensor Processing Units (TPUs) rather than Nvidia’s widely used GPUs. The paper discloses that Apple employed 2,048 TPUv5p chips to train its on-device AI models and 8,192 TPUv4 processors for its server AI models.
Nvidia’s GPUs are renowned for their performance and computational efficiency, making them the most sought-after AI accelerators. They are typically sold as standalone products, leaving customers free to deploy them however they see fit.
In contrast, Google’s TPUs are only accessible through the company’s cloud services package. Customers do not own Google TPUs outright; instead, they lease access to them within the Google Cloud environment where AI models are developed.
Apple’s research team found that the cloud-only nature of Google’s TPUs actually benefited the training process, allowing them to harness the necessary processing power more efficiently than a standalone system would.
Apple’s decision to use Google’s hardware was unexpected given the historical rivalry between the two companies. Nvidia currently holds a dominant share of the AI chip market, with its accelerators estimated to account for between 70% and 95% of sales.
However, Apple’s choice could indicate a broader shift among tech companies away from Nvidia’s high-priced high-end chips. Amazon, for instance, recently announced a new line of AI chips claimed to be 50% more powerful than Nvidia’s offerings while consuming 50% less power.
Microsoft revealed in May its plan to provide cloud services built on AMD’s AI chips instead of Nvidia’s, while Google made similar strides in April.