Apple is making headlines in the technology world with its recently launched AI-based features, though there is one player Apple seems to prefer to keep out of the limelight: Nvidia. As the reigning king of GPU technology, Nvidia is vital to the AI revolution. Still, Apple appears to be quietly charting a different course. Rather than follow the industry trend of relying heavily on Nvidia, Apple is shopping for and developing alternatives, including its own AI server chips. The move reflects a mix of strategic ambition, financial restraint, and a rivalry that dates back to the era of Steve Jobs.
A Calculated Move Toward Independence
Apple’s shift away from Nvidia is deliberate. While many tech companies are snapping up Nvidia’s GPUs, Apple primarily rents access to them through cloud providers such as Amazon and Microsoft. According to reports, Apple has even used chips designed in-house by Google to train its largest AI models. The approach underscores Apple’s desire for independence and its wariness of overreliance on a single supplier.
Experts say that Apple’s decision has been influenced by two factors: cost-efficiency and control. Building in-house hardware allows Apple to integrate AI solutions more smoothly into its ecosystem while retaining greater control over its technology stack.
A Two-Decade-Old Rift
The tension between Apple and Nvidia is not new. Industry sources say the rift goes back at least 20 years, to disputes during Steve Jobs’s tenure. The fallout from those conflicts has lingered, with Apple seemingly reluctant to deepen its ties with Nvidia even as Nvidia has risen to dominance in AI hardware.