Forbes India 15th Anniversary Special

From enterprise to aam aadmi, Satya Nadella's plan for Microsoft AI

AI is moving the world from humans needing to understand computers to computers understanding us

Harichandan Arakali
Published: May 22, 2024 05:28:45 PM IST
Updated: May 22, 2024 02:20:46 PM IST

Microsoft CEO Satya Nadella speaks during the Microsoft Build conference at the Seattle Convention Center Summit Building in Seattle, Washington, on May 21, 2024. Image: Jason Redmond / AFP
Microsoft CEO Satya Nadella on Tuesday offered a glimpse into his plan for how the Windows software maker will bring artificial intelligence (AI) to everyone, via its Copilot stack on its Azure cloud platform. This had industry analysts describing the company’s AI advances as “meaningful” and “democratising”.

At the 14th edition of the company’s Build conference, Nadella announced updates that were all linked to the underlying theme of bringing more access and capabilities on Microsoft’s AI platforms and products to customers, ranging from the largest businesses to individual developers to the aam aadmi.

From native availability of important software programs and tools to expanded hardware partnerships, Nadella painted a broad canvas in his keynote on what the maker of Windows software has been up to and where it’s headed. Reporters such as this writer should probably upgrade the descriptor of the company to “maker of Copilot”.

“How do you democratise AI in a big way? They've addressed the whole capability right through the stack, right from the software, the developer toolkits, the models, the orchestration capability, the runtime capabilities, as well as the hardware,” says Deepika Giri, associate vice president, head of research, Big Data & AI, at IDC Asia Pacific.

A day ahead of Build, on May 20, Microsoft launched its new Copilot+ PCs. The laptops—both Microsoft’s own Surface models and various OEM ones—will set one back about $1,000. Microsoft says Copilot+ PCs hook users up to OpenAI’s GPT-4 large language models (LLMs) and various small language models (SLMs), and use an onboard ‘neural processing unit’ (NPU) to bring powerful AI assistants to individual users in whatever they’re trying to do, whether research, gaming or shopping.

“I really believe they have achieved something meaningful in terms of a step function on making AI a lot more accessible, usable and customizable,” says Sidhanth Rastogi, president, technology services and platforms, at Zinnov, a management consultancy.

Microsoft has access to a massive base of PC customers, with most people using Windows machines at work and in their homes. Then it has a very large cloud business, trusted by large businesses and governments. Tapping into this lead, “they’ve made it possible to access AI straight from the PC, and second, they have natively embedded it into Azure,” Microsoft’s cloud platform, Rastogi points out. “So, an enterprise who is looking at accessing AI functions can directly do it from Azure.”

“Just as AI is democratising expertise, this is actually democratising AI for common folks like you and me to larger enterprises and even smaller enterprises who can just access it by using Azure,” he adds.

The message in the keynote was subtle, Rastogi says, “but to me it was one of the most impactful things, and we will see a lot of adoption increase because of this.”

Also read: Why 1,000 top executives voted for global AI standards

Assistant to agent

Over the last two years, generative AI has made ‘AI’ a household name: most folks with access to the internet know about ChatGPT, and many might even have experimented with generating images and videos. What Microsoft announced at Build is that it has embedded Copilot, which uses GPT, into not just an application but a platform, Rastogi says.

Therefore, Copilot can now be embedded into almost any application. Microsoft demonstrated this via a video in which a Minecraft enthusiast is helped by an AI assistant that shared his screen and walked him through the steps needed. In another example, a Microsoft executive demonstrated how quickly he could locate and purchase the right shoes for a trek based on the conditions he described; the AI assistant even offered insights on what kind of shoe he would need. The executive spoke in Spanish and interrupted the AI, just as humans might do in a natural conversation. It still worked.

Now consider Copilot in the office environment, embedded into everything that people use. “You don't need anything else: not only does it assist you, but, almost as an agent, it can configure things completely,” Rastogi says. For example, companies can embed Copilot into their HR onboarding process, sales tracking or what have you. The AI agent will provide updates on who or how many were hired, who’s completed an initial training programme, what has been rescheduled, and so on. “So, it’s not just a collaborator, but almost an agent,” he says.

Also read: How generative AI is changing Infosys from within

Small language models

So far, the focus has been on the capabilities that LLMs have made possible via the cloud. But Nadella’s keynote also outlined Microsoft’s plan to expand its SLMs—starting with its own Phi-3 family of AI models—as there are any number of possibilities once one starts looking at specific applications in specific industries.

“We are seeing the evolution of models from a size perspective and from a fit-for-purpose perspective,” Giri at IDC points out. “We call this an enterprise intelligence framework, the ability to not just synthesise information, but also the ability to deliver that information as insights to the right stakeholder at the right time.”

Again, while Microsoft is definitely not the only one developing SLMs, it is making them easy to access and use through Azure and Windows. More importantly, and harking back to the Copilot+ PC, such models can sit on the device itself—even mobile devices—and hence will be very fast, intuitive and real time.

“The whole edge AI capability is something that has gained a lot of prominence, and will gain a lot of prominence,” Giri says. And bringing together speech and video analytics and making such capabilities easy to tap via Copilot “right from Azure is very powerful,” she says.

And the next step is here, already. One can tap all these AI models “as a service” depending on the specific need. And “we don't necessarily need to use Copilot or any of this functionality the way Microsoft has designed it,” Rastogi says. “We can customise it.”

With a platform called Copilot Studio, Microsoft is offering a growing set of customisation features. This too will contribute to adoption of AI across the spectrum of users, from the biggest corporations in the world to the aam aadmi on smartphones.

Let’s leave you with some of Nadella’s opening comments on how he sees the broader changes unfolding in the world of AI, and Microsoft’s role in them.

There are “fundamental changes that you can sense in the air,” he said at the outset. “I still remember distinctly the first time Win32 was discussed, .NET, Azure … these are moments that I've marked my life with.”

“And it just feels like we're yet again at a moment like that. It's just that the scale, the scope is so much deeper, so much broader this time around. Every layer of this tech stack is changing.”

If one went back to the beginning of modern computing, say 70 years ago, there have been two aspirations at Microsoft (or perhaps he meant more broadly, among all computer scientists and engineers), he says: “First, can computers understand us instead of us having to understand computers.”

Second, in a world with a surfeit of information, “can computers help us reason, plan and act more effectively on all that information… I think that we have real breakthroughs. This is like maybe the golden age of systems.”