
Apple is quietly spending millions on AI to make Siri as smart as ChatGPT

Published Sep 6th, 2023 3:07PM EDT
Image: Siri on the Vision Pro headset. Credit: Apple Inc.

After reports about Apple GPT and the internal turmoil around Siri caused by the assistant’s complexity, The Information reports that Cupertino is spending millions of dollars a day to train its large language models (LLMs).

With a small team of 16 people overseen by its head of AI, John Giannandrea (formerly of Google), Apple has been training LLMs for a few years. Since this technology took off with OpenAI’s ChatGPT, Cupertino could consider adding some form of chatbot both internally and as a shipping product.

This conversational AI team is called Foundational Models, and Apple has reportedly created two new teams to develop language and image models. One of the teams might be working on software that generates “images, video, or 3D scenes,” while the other is working on “long-term research involving multimodal AI which can recognize and produce images or videos.”

The Information says that one of these LLMs could “eventually interact with customers who use AppleCare.” At the same time, the Siri team plans to incorporate these language models to make building complex Shortcuts integrations much easier.

What’s interesting about this story is that people on the Apple team believe its most advanced language model, Ajax GPT, which Bloomberg previously reported on, might be better than OpenAI’s GPT-3.5. Even though OpenAI has since moved on to more capable models, it’s good to know that Apple has dramatically improved in the conversational AI field.

The roadblock for Cupertino is that the company prefers to run software on-device, which improves privacy and performance, while the most capable large language models are currently too big to deliver without relying on the cloud.

The Information reports: “Ajax GPT, for example, has been trained on more than 200 billion parameters, as they’re known in AI parlance. (…) Parameters reflect the size and complexity of a machine-learning model; a higher number of parameters indicates greater complexity and requires more storage space and computing power. An LLM with more than 200 billion parameters couldn’t reasonably fit on an iPhone.”
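For a rough sense of what those numbers mean, here is a quick back-of-the-envelope calculation (our illustration, not a figure from the report), assuming the weights are stored as 16-bit floats and that a recent iPhone has roughly 6 GB of RAM:

# Rough estimate of the memory needed just to hold a model's weights.
# Assumptions: ~200 billion parameters (per The Information) stored as
# 16-bit floats (2 bytes each); the iPhone RAM figure is approximate.
params = 200e9
bytes_per_param = 2
weights_gb = params * bytes_per_param / 1e9
print(f"Weights alone: ~{weights_gb:.0f} GB")  # ~400 GB
# Even aggressive 4-bit quantization (~0.5 bytes per parameter) would still
# need ~100 GB, far beyond the ~6 GB of RAM in a recent iPhone.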

While Apple could be trying to shrink these large models to fit on an iPhone, we might have to wait a little longer to see some of these initiatives reach end users.

BGR will keep reporting on Apple’s AI efforts and how it will tackle OpenAI’s ChatGPT, Microsoft Bing, and Google Bard.

José Adorno Tech News Reporter

José is a Tech News Reporter at BGR. He has previously covered Apple and iPhone news for 9to5Mac, and was a producer and web editor for Latin America broadcaster TV Globo. He is based out of Brazil.