Forbes India 15th Anniversary Special

Privacy-first AI: Can Apple show the art of possible today?

Anticipation is at fever pitch with respect to the iPhone maker's AI roadmap, as WWDC, its annual developer jamboree, gets underway

Harichandan Arakali
Published: Jun 10, 2024 03:51:51 PM IST
Updated: Jun 10, 2024 05:48:22 PM IST

In the short term, starting with features that might be unveiled at the WWDC today, it’s likely that Apple will match features that Google and Samsung offer on the Korean tech giant’s latest flagship, the Galaxy S24. Image: Shutterstock

Apple iOS developers who aren’t physically present at the in-person event in the US will likely be glued to their computers today, as the company’s annual Worldwide Developer Conference, WWDC, gets underway at 10:30 p.m. India time.

This time, everyone is waiting for one thing: What will Apple unveil about its upcoming artificial intelligence (AI) features and longer-term roadmap? The general perception so far has been that Apple lags Google, Microsoft (read Microsoft + OpenAI) and Samsung in its AI features.

With Apple set to unveil updates to iOS 18, its operating system for the iPhone, and to all its other devices and products, anything short of a full set of new AI-based features will likely disappoint developers as well as industry analysts, not to mention investors.

“Frankly, Siri lags Google Assistant,” Navkendar Singh, associate vice president at IDC India, says, referring to Apple’s Siri voice assistant, adding that an important reason for this is “Apple’s intense privacy focus.”

For example, a few years ago, users found they could bar apps from tracking their activity via the App Tracking Transparency (ATT) feature Apple introduced, one that Meta Platforms, for example, stridently objected to, as it made targeted ads more difficult.

The feature benefited Apple too. Since it was announced at WWDC 2020, Apple watchers have pointed out that the company's own ad revenue has gone up. The feature can also be seen as in line with the company's walled-garden ecosystem (which the EU is now forcing Apple to open up).

In the context of AI, Singh says that “no tracking means no data,” which in turn means offering AI features that depend on data and machine learning will need innovation, as Apple looks to maintain its approach and brand image as a privacy-first tech company.

Apple is often not the first to offer a feature but when it does, the experience tends to be the best, Singh points out. The iPhone itself, for example, wasn’t the first smartphone, but it was the one that changed everything.

“Over the last five years, Apple has added a lot of native AI features which run in the background, they don't advertise it as such,” Neil Shah, a vice president of research at Counterpoint Technology Market Research, said in an interview with Forbes India recently.

These features reduce the number of clicks or taps, for instance, and improve the user experience. A simple example: on taking a photo of, say, a restaurant bill, one can easily copy the text.

“That is AI, basically, running an OCR and then running an algorithm to select exactly what was on the receipt,” Shah says. In such ways, there are many AI features baked in already within the Apple device, he says.
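The second step Shah mentions, the algorithm that selects what matters from the OCR'd text, can be pictured with a toy sketch. This is purely illustrative: Apple's actual on-device pipeline (a vision model plus layout analysis) is not public, and the function name `extract_total` and the regex heuristic below are invented for the example.

```python
import re

def extract_total(ocr_lines):
    """Given text lines recovered by OCR from a receipt photo,
    separate the line items from the total amount.

    Illustrative only: a real pipeline would use layout analysis,
    not a plain regular expression."""
    amount = re.compile(r"(\d+\.\d{2})")  # match prices like 7.75
    items, total = [], None
    for line in ocr_lines:
        m = amount.search(line)
        if not m:
            continue  # skip lines with no price on them
        value = float(m.group(1))
        if "total" in line.lower():
            total = value
        else:
            items.append((line, value))
    return items, total

# Lines as an OCR pass might return them from a bill photo
items, total = extract_total(["Espresso 3.50", "Croissant 4.25", "TOTAL 7.75"])
```

The point of the sketch is only that "copy text from a photo" is two models chained together: one that reads pixels into text, and one that decides which text the user actually wants.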

What’s happened today, however, is that “the conversation has changed, with the industry moving towards the next level of AI, which is generative AI,” he says.

This entails splitting the processing needed between what the on-device processor and storage can support and what needs to go to the cloud. Here, Google and Microsoft, for instance, already have the large cloud-based compute infrastructure, with powerful GPUs and so on, and they’ve been working on the large language models that underpin generative AI for several years, Shah says.
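One crude way to picture that split is a router that keeps workloads the phone can handle on the device and sends the rest to cloud GPUs. The function, field names and threshold below are assumptions made for illustration, not any vendor's real policy.

```python
def route_request(task, on_device_limit=3_000_000_000):
    """Decide where an AI request runs: a hypothetical heuristic in
    which models small enough for the phone's compute budget stay
    local, and larger ones are offloaded to cloud GPUs.

    `task` is a dict with a 'model_params' entry (parameter count);
    the 3-billion-parameter cutoff is an invented example value."""
    if task["model_params"] <= on_device_limit:
        return "on-device"  # fits in the phone's memory and compute
    return "cloud"          # needs datacenter-class GPUs

# A small summarisation model stays local; a frontier LLM goes out
print(route_request({"model_params": 1_000_000_000}))
print(route_request({"model_params": 70_000_000_000}))
```

Real systems would weigh latency, battery, connectivity and privacy alongside model size, but the basic on-device-versus-cloud decision is the one Shah describes.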

And they have data from internet search engines, office productivity software, like the Microsoft Office 365 suite and so on. What Apple has, on the other hand, is data on more than a billion iPhones. “But what Apple has been doing with that is not clear, whether they have been building their own AI model,” he says.

Shah thinks Apple is yet to make big strides in large language models (LLMs), and the broader class of AI models called foundation models, in the cloud. Meanwhile, Google and Microsoft are now looking to push their AI features onto the edge (meaning end devices such as phones and laptops).

For example, Microsoft recently unveiled a laptop that brings its Copilot AI assistant, which taps OpenAI’s GPT large language models, to the Windows PC. Similarly, Google is bringing its Gemini model, as Gemini Nano, into smartphones that use its Android software.

So far, with respect to LLM-based AI features, “Apple has not shown anything in cloud. What they have is basic AI, not generative AI. That is where Apple has to increase their R&D efforts and innovations,” Shah says.

Apple is widely expected to strike a deal with OpenAI, the maker of ChatGPT, to bring those features to iOS. Apple has an existing non-AI partnership with Google as well. Singh, at IDC, points out that Apple received some $20 billion in 2022 from Google as an annual payment for making Google the default search engine on Apple devices.

Also read: From enterprise to aam aadmi, Satya Nadella's plan for Microsoft AI

Here too, Apple is making changes with respect to privacy. In September 2023, we learnt that Apple was allowing users to select different search engines in the Safari browser in private mode.

In the short term, Apple could tap OpenAI to “power Siri to the next level from the cloud perspective,” Shah says. The new Apple Silicon chipsets in the company's devices are powerful enough to run generative AI tasks natively on the device as well as in the cloud. Therefore, Shah sees a lot of scope for enhancing Siri.

In the short term, starting with features that might be unveiled at the WWDC today, it’s likely that Apple will match features that Google and Samsung offer on the Korean tech giant’s latest flagship, the Galaxy S24, Singh says.

Such features could include AI-based editing of images, or productivity tools like predictive text for faster completion of emails, and even screenshots – like in Microsoft’s Copilot PC – that can help Siri tell the user what she did two weeks ago and so on.

“You’ll see more AI integration in the core apps,” such as messages, emails, notes and so on, Singh says. “iOS 18 is going to be big.” With the widely expected partnership with OpenAI, one might see a made-for-iPhone version of ChatGPT, he says.

With the privacy-first approach, Apple might incorporate a feature where the user decides if some data can be sent to the cloud for an enhanced experience, he says. “They may even provide a toggle.” That’s what Apple did with the app transparency feature, for example.
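The toggle Singh envisages could be as simple as a per-user setting that gates whether a query ever leaves the device, in the spirit of the ATT prompt. The sketch below is hypothetical; the class and function names are invented, not Apple's API.

```python
class AssistantSettings:
    """Hypothetical per-user setting: cloud processing is off by
    default, in keeping with a privacy-first posture, and the user
    must explicitly flip the toggle for data to leave the device."""
    def __init__(self, allow_cloud=False):
        self.allow_cloud = allow_cloud

def handle_query(query, settings):
    """Route a query based on the user's consent toggle."""
    if settings.allow_cloud:
        return f"cloud({query})"  # richer model; data leaves the device
    return f"local({query})"      # on-device model only

# Default user: everything stays local
print(handle_query("summarise my notes", AssistantSettings()))
# Opted-in user: the enhanced cloud path is used
print(handle_query("summarise my notes", AssistantSettings(allow_cloud=True)))
```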

This year’s WWDC will be “Apple AI’s coming-out party to the entire Apple ecosystem of devices. And the time couldn’t be more ripe,” Dipanjan Chatterjee, vice president and principal analyst at Forrester Research, wrote in a recent blog post.

The analyst makes three points in his post: First, “you’ll see AI everywhere. You may not even know it’s there,” Chatterjee writes. And he echoes Singh’s point that we’ll see Apple AI in Photos, Music, and its productivity applications.

Second, the combination of Siri and generative AI will offer helpful features. “We’ve all seen how ChatGPT has breathed new life into the miserable chatbot experience, and Apple’s talks with OpenAI (and also Google) may work the same wonders for the languishing Siri,” he says.

Third, “Apple has always been an outlier in the tech titan territory for its customer-first privacy commitment.” Apple has the golden opportunity to stand out further, as a bastion of consumer privacy at a time when ethical standards for AI have become an afterthought, he says.

In the long run, because Apple is such a vertically integrated player – it designs its own devices and makes its own software – the next stage and the biggest opportunity for Apple is to have its own foundation models, Shah at Counterpoint says.

Apple’s app ecosystem involves millions of developers, and with their help it can have a foundation model and an instance of that model in every application within its main operating system, iOS for iPhones, he says.

A simple example of how this might work for an end user of the iPhone: she could tell Siri to create an itinerary for her to travel from Delhi to Mumbai and Bengaluru, and the AI assistant could pull data from all the relevant apps, for flight tickets, hotel rooms and so on, and generate the plan.

In this scenario, Siri would be directly connected via the AI model to each and every app, and Apple can do this because it is fully vertically integrated. It can have developers integrate that level of AI prowess within their apps. Apple is privacy-led, and it will get access through Siri and the large foundation model to all the apps, he says.
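Very loosely, the scenario Shah describes is an assistant querying data that each app exposes to it. Everything in the sketch below, from `build_itinerary` to the app names and fields, is invented for illustration; no such Apple API has been announced.

```python
def build_itinerary(apps, route):
    """Hypothetical sketch: an assistant connected to every app via a
    shared foundation model queries each app's data for each leg of a
    trip and assembles a plan. `apps` maps a category ('flights',
    'hotels') to that app's per-leg data; names are invented."""
    plan = []
    for leg in route:
        plan.append({
            "leg": leg,
            "flight": apps["flights"].get(leg),  # ask the flights app
            "hotel": apps["hotels"].get(leg),    # ask the hotels app
        })
    return plan

# Toy data standing in for what real travel apps might expose
apps = {
    "flights": {"Delhi-Mumbai": "AI-101", "Mumbai-Bengaluru": "AI-202"},
    "hotels": {"Delhi-Mumbai": "Hotel A", "Mumbai-Bengaluru": "Hotel B"},
}
plan = build_itinerary(apps, ["Delhi-Mumbai", "Mumbai-Bengaluru"])
```

The vertical integration Shah points to is what would let one assistant reach across app boundaries like this, with the privacy model governing which data each query may touch.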

“That is the biggest AI play for Apple in the next few years and we’ll see the foundations this WWDC.”