Apple is reportedly throwing millions of dollars a day into AI development

The tech giant appears to have a multi-pronged approach to catch up with the generative AI boom.
Written by Maria Diaz, Staff Writer

Siri is Apple's most public AI tool for now.

Maria Diaz/ZDNET

Apple is known for playing the long game: releasing features and capabilities that other companies already offer, but marketing them with an innovative twist that makes them appear unique. Artificial intelligence looks to be the latest example. Almost a year after other major players released their AI tools, Apple is now reportedly spending millions of dollars a day on AI development.

Google and Microsoft jumped onto the generative AI bandwagon just weeks after OpenAI released ChatGPT, each releasing its own rival chatbot: Bard and Bing Chat, respectively.

Also: Everything we're expecting at Apple's September event (and what won't be unveiled)

Months after these AI chatbots grew in popularity (and controversy), The Information reports that Apple is working on several artificial intelligence models. The tech giant sees generative AI as an area of growth and an opportunity to compete with major companies like Amazon, Google, and Meta, which already have foundation models of their own.

Apple has several teams working on different AI models, including an image generation model, a multimodal model that handles both text and visual data, and a conversational AI unit led by John Giannandrea, the senior vice president of Machine Learning and AI Strategy, who was originally hired to improve Siri. 

Also: Anthropic unveils Claude Pro, a paid subscription plan for its ChatGPT rival

The Information also reports that these models could automate tasks through Siri and serve as AI customer service chatbots for AppleCare. 

Apple's largest foundational model, 'Ajax GPT', is reportedly more powerful than OpenAI's GPT-3.5, which powers the free version of ChatGPT. Ajax GPT is said to have over 200 billion parameters, while GPT-3.5 has about 175 billion. 

Also: Google just gave Android's most frustrating widget an AI facelift, and it's a relief

Large language models are expensive to train: they demand costly hardware like GPUs and TPUs, massive dataset storage, and steep energy bills, as these systems consume a lot of power. Add the time and expertise needed to fine-tune these machine learning models, and the result is an expensive endeavor -- and Apple is paying the price, literally.

Apple's hefty investment in generative AI has been kept under wraps, as the company hasn't officially announced or confirmed any of its progress.
