AI is not a one-size-fits-all approach: Intel India MD
Santhosh Viswanathan, vice president & managing director, Intel India, on running data centres more sustainably, focusing on affordability and building in a scalable manner


Intel India used the recently concluded India AI Impact Summit in New Delhi to unveil its ‘frugal AI’ strategy and next-generation AI PCs, highlighting a shift toward cost-efficient, distributed AI across PCs, edge devices and local systems. The company showcased on-device AI capabilities with improved performance and lower power use, while emphasising affordable PC access and vernacular, AI-enabled learning to bridge India’s digital fluency gap.
In a conversation with Forbes India during the AI Impact Summit in Delhi, Santhosh Viswanathan, vice president & managing director, Intel India, spoke about why India is different from the West, its key revenue areas and data centres and sustainability. Edited excerpts:
Q. What are Intel India’s key focus areas and where do you expect growth to come from over the next year?
Everyone’s on the AI wave, but the way we see AI in India is going to be different from what the West or Europe is following because we think that India is a market with its own unique needs. So, our model of AI adoption needs to be a little different. We call this the “frugal AI” approach, where we not only think about the large infrastructure that powers the big models, but also a scalable infrastructure all around—something akin to the UPI revolution India had. How do we create the universal AI revolution that India needs? So, the scalable, accessible model across all is something we’re working on. That starts with your computer being an AI device, your mobile phone being an AI device, your edge devices being AI devices, and then, in your data centre, your servers being AI devices. AI needs to be everywhere to make it meaningful for a market like India. The second aspect is: Which sectors will adopt AI at population scale and truly change the way India uses it?
Q. Could you give some examples?
The best example I can give you is education. It’s an area where the model has stayed the same over the years: a one-way download. Textbooks, notebooks, read, write, memorise.
But we’re now in a position where AI can change the whole aspect of education. Personalised learning, augmenting teachers, taking quality learning into rural areas, providing best-in-class education with local-language support—it completely changes the way we teach our kids.
Q. What are the key revenue-driving focus areas for Intel in the India market?
If you look at PC penetration in India, it’s less than 10 percent. In the US, it’s 95 percent; in China, it’s 60 percent. Clearly, there is a massive scope for growth. The second area is data centres. I’m so thrilled with the new tax holiday and policy. If you’re generating 40 percent of the world’s data but hosting only 3 percent of the world’s servers, it’s still a small market for the amount of data we’re generating. The new data centre tax support helps close that gap. But there’s much more to do—we have to build infrastructure and build data centres at scale. Because if you’re generating data—it’s like having a factory of data generation and processing—you need the refining to happen somewhere.
Q. On the chip side of the business, especially with Nvidia’s rapid AI-led rise, is Intel playing catch-up or pursuing another differentiator?
The key for us, from an India market perspective, is that with every key technology change, Intel has been a key participant. I go back all the way: In the 90s, we created the first channel IT ecosystem. If you go to Richie Street, to Nehru Place, to SP Road in Bengaluru—the first set of partners who assembled computers—Intel had the GID programme, the Genuine Intel Dealer programme, which enabled that whole ecosystem.
Same with Wi-Fi and USB. Even with UPI and Aadhaar. These scaled because you had server infrastructure that was scalable, affordable and built for population scale. So, on every key technology front, you’ve seen Intel play the inside role well. We’ve always been that ‘Intel Inside’ that makes the outside amazing. And I think that’s the role we are passionate about continuing, even in this AI journey.
So, building AI solutions that can scale—the AI PC is a big transition when it comes to laptops and computers. But it’s not just about laptops and computers; it also transforms the edge. So, I think we’re just starting that journey, and we will play a key role in many more transitions still to come.
Q. Data centre compute is expensive and a major concern for large organisations. What innovations is Intel working on to reduce that cost?
The beauty of Moore’s Law is that it’s not just a performance or power law; it’s also an economic one. When you double the transistors in the same size chip, you get more performance at the same cost, or even at a lower cost over time. That’s why what was a supercomputer 10 years ago is now in your lap, running at that size and efficiency. So, we are in that pursuit.
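As a back-of-the-envelope sketch, the economic reading of Moore’s Law described above can be put in code. All figures here are hypothetical placeholders for illustration, not Intel numbers: if transistor count doubles each generation while the chip’s cost stays roughly flat, the cost per transistor halves every generation.

```python
# Illustrative toy model of Moore's Law as an economic law:
# doubling transistors at roughly constant chip cost halves
# the cost per transistor each generation.
# All inputs are hypothetical, chosen only for illustration.

def cost_per_transistor(chip_cost, base_transistors, generations):
    """Cost per transistor after `generations` doublings at constant chip cost."""
    return chip_cost / (base_transistors * 2 ** generations)

# Hypothetical starting point: a 100-unit chip with 1 billion transistors.
base_cost = 100.0
base_transistors = 1e9

for gen in range(4):
    print(f"generation {gen}: {cost_per_transistor(base_cost, base_transistors, gen):.1e} per transistor")
```

The same doubling also explains the “supercomputer in your lap” point: at fixed cost, four generations of doubling deliver sixteen times the transistors for the money.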
And that’s why I say AI is not a one-size-fits-all approach. It’s not that you just need large GPUs and large data centres. It requires a heterogeneous architecture. You need large data centres, you need CPU-led servers in many cases, you need edge devices in many cases. When we look at AI, we look at it across the board—and then you build the right platform for AI to thrive in a country like ours.
That’s what Intel focuses on: Building an architecture that spans all of this—not one versus the other—and ensuring it is frugal, affordable and impactful for society.
Q. Data centres consume huge amounts of water and energy. How can they scale while becoming more sustainable?
I think there are many angles to sustainability. One is power, and India’s investments in sustainable sources such as solar provide a strong foundation. The second aspect is what I mentioned earlier: If you have AI applications and models that can run on your existing infrastructure, and don’t require a huge GPU- or accelerator-led compute infrastructure to be built, then you start there. Run your models there and see whether it’s relevant for your organisation before investing heavily. Earlier, the mindset was: If I don’t have a big GPU infrastructure, I can’t participate in the AI journey. That has changed completely.
Today, with an AI PC, you can participate in the AI journey by running a small model locally. For example, this conversation—why send 35 minutes of audio to the cloud, process it there, bring it back here and then write from it? You’re wasting resources. Instead, a small local model can summarise it, extract key points, generate two pages of notes, and even produce headline options—all locally, without using cloud compute or water-cooled data centre infrastructure. It’s efficient and frugal.
This approach saves a huge amount of compute and energy because you’re not sending unnecessary data to large data centres. Those centres can then reserve their GPU power for when it’s truly needed. So, in the long run, this architecture will help build the most sustainable ecosystem. And that’s why we need to rethink infrastructure slightly differently from everyone else.
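To make the local-processing idea above concrete, here is a minimal sketch of on-device summarisation: a toy extractive summariser that scores sentences by word frequency, entirely in local code with no cloud calls. It is an illustrative stand-in, assuming only the Python standard library; it is not Intel software and not the small language models the interview refers to.

```python
# Toy on-device extractive summariser: no network, no accelerator,
# just local CPU work. Picks the sentences whose words occur most
# often in the text, preserving their original order.
import re
from collections import Counter

def summarise(text, max_sentences=2):
    # Split on whitespace that follows sentence-ending punctuation.
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    # Count how often each word appears across the whole text.
    freq = Counter(re.findall(r"[a-z']+", text.lower()))
    # Score each sentence by the summed frequency of its words.
    scored = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"[a-z']+", s.lower())),
        reverse=True,
    )
    top = set(scored[:max_sentences])
    # Emit the chosen sentences in their original order.
    return " ".join(s for s in sentences if s in top)
```

A real AI PC would run a compact neural model on the NPU instead, but the shape of the workflow is the same: the audio or text never has to leave the device for a two-page summary.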
First Published: Feb 25, 2026, 11:22