Apple is leveling up its chip ambitions. The Silicon Valley technology giant has spent years designing chips for its own hardware — for Macs, iPhones, iPads, and more. But running AI models requires higher-grade chips like NVIDIA's graphics processors, which have become the industry standard.
To keep up, and to fuel its own AI ambitions, Apple is working on chips designed to support AI applications from servers in large data centers, according to a report in the Wall Street Journal. Internally, the project is code-named ACDC, short for Apple Chips in Data Center, though it has no set timeline for completion.
Apple's chips are reportedly meant for running AI applications rather than training them, which makes sense given Apple's consumer focus. Apple has yielded the first leg of the AI race to upstarts like OpenAI and Anthropic, as well as to incumbents Microsoft and Meta, but the view from Cupertino is clearly better late than never.