Fireside Chat: Intel, Tata Electronics, CDAC & Asia Group | India AI Impact Summit

Executive Summary

This fireside chat addresses the critical gap between India's ambitious AI policy announcements (PAX Silica, $50B+ investments from Microsoft, Google, and others) and the practical challenges of enterprise AI deployment at scale. The discussion reveals that while government infrastructure and enterprise capability are advancing, significant barriers—including ROI uncertainty, talent gaps, MLOps maturity, and deployment complexity—prevent most Indian enterprises from moving AI projects beyond pilot phase into production.

Key Takeaways

  1. Infrastructure ≠ Deployment: India has built significant government compute capacity (CDAC), and private investment is flowing in, but the missing link is production-scale enterprise deployment. Policy announcements alone do not translate to ROI or adoption.

  2. The "Real Life" Gap is the Real Problem: Educational and enterprise AI practice focuses on clean, curated data and theoretical models. Real-world production faces data quality, MLOps, security, and constraint challenges that require practical, hands-on training—not just theoretical knowledge.

  3. Sovereignty is About Control Points, Not Independence: India should focus on controlling the AI stack above semiconductor design (models, orchestration, applications) rather than pursuing unrealistic vertical integration. Pragmatism beats aspiration.

  4. Cost Determines Scale: Frugal AI (using CPUs, edge deployment, and right-sized models) will likely drive broader adoption than high-end GPU clusters. Enterprises need permission to ask, "Do I actually need a GPU for this?"

  5. India's Demographic Advantage is Real but Time-Limited: The young, tech-exposed population is a genuine competitive advantage, but only if education systems adapt curricula to emphasize practical MLOps, deployment, and real-world data challenges alongside theory.

Key Topics Covered

  • India's AI Infrastructure & Sovereignty: Government R&D efforts through CDAC, the PARAM supercomputing series, and the strategy for building a "pragmatic" AI stack under sovereign control
  • Enterprise AI Deployment Challenges: Barriers preventing Fortune 500 Indian companies from scaling AI from pilot to production
  • Frugal AI & Cost Optimization: Intel's approach to efficient, resource-conscious AI deployment across edge, on-premises, and cloud environments
  • Data Sovereignty vs. Performance Trade-offs: How Indian enterprises navigate regulatory requirements alongside cost and performance considerations
  • Talent & Curriculum Gaps: Mismatch between theoretical AI education and practical MLOps deployment skills
  • Energy Efficiency & Sustainability: Power consumption implications of AI infrastructure, from chip design to data center operations
  • Deployment Architecture Decisions: When to use edge computing, on-premises, cloud, or hybrid approaches

Key Points & Insights

  1. Government Infrastructure Maturity: CDAC operates ~48 petaflops of compute capacity (scaling to 100 petaflops by end of 2024) across 60+ installations, serving 15,000+ researchers. Applications span drug discovery, weather prediction, oil exploration, and computational modeling—primarily research-focused rather than enterprise production.

  2. The "Pilot Trap": Indian enterprises have purchased powerful GPU systems but remain stuck in pilot phase due to ROI uncertainty, poor data quality, and inability to transition from curated datasets to real-world messy data. This is a critical bottleneck shared across industries globally, but particularly acute in India.

  3. Pragmatic Sovereignty Model: Rather than pursuing complete vertical integration (silicon through applications), India should focus on controlling "critical choke points"—models, orchestration, and applications—while sourcing advanced semiconductors from global suppliers (Nvidia, Intel, AMD) in the near term. CDAC's RISC-V-based GPGPU design targeted for 2029-2030 represents long-term aspirational independence.

  4. Frugal AI Framework: Intel advocates that enterprises ask whether they need GPU acceleration for every workload. Modern CPUs (Core Ultra, Xeon 6) can run 7-8 billion parameter models at edge and 20-80 billion parameters in data centers, potentially reducing infrastructure costs significantly without sacrificing ROI.

  5. Data Sovereignty ≠ Top Priority for Most: Contrary to policy emphasis, enterprise data localization requirements vary sharply by industry. Banking and healthcare prioritize data sovereignty; manufacturing and retail prioritize cloud accessibility for rapid development and API access. Cost and performance remain primary drivers for most sectors.

  6. MLOps & Deployment Maturity Deficit: Enterprises lack practical experience with real-world MLOps challenges—data cleaning, handling missing/skewed data, real-time constraints, security considerations. Most curriculum focuses on theory and curated datasets; production-scale deployment remains a major capability gap.

  7. Talent Demographics as an Asset: India's young population (a large cohort aged 13-25, highly exposed to AI) represents a learning advantage compared to aging populations in developed markets. While skill gaps exist today, they are expected to close rapidly within 2-4 years.

  8. Power Efficiency Progress: Industry techniques (liquid cooling, power-aware chip design, RibbonFET transistors) have improved data center efficiency. Intel's facilities achieve a PUE (Power Usage Effectiveness) of 1.06; typical water-cooled systems achieve ~1.2-1.5. Energy per token (for training and inference) should become a critical benchmark metric.

  9. Architecture Agnosticism: Neither enterprise nor government infrastructure should be dogmatically "on-prem vs. cloud." The right choice depends on specific use case, data sensitivity, scale requirements, and total cost of ownership (TCO).

  10. Success Metrics for India's AI Future: Real impact will be measured not by infrastructure capacity or policy announcements, but by widespread deployment across workflows—with tangible benefits for everyday users, small business owners ("subzi wallas"), and public services through Indic language models.


Notable Quotes or Statements

  • Vive Kanea (CDAC): "A more pragmatic approach is to have maybe the silicon coming from outside and everything above that should be under my control. So you should be able to control all your critical choke points... the sovereignty where you are getting the maximum ROI."

  • Vive Kanea (CDAC): "Pilots is fine but the real revenue will come only once you actually deploy it at scale."

  • Nathan Baj (Intel): "The biggest gap today is what to use... whether to use it on prem or whether to go on cloud... Then once those use cases are ready... what is the final cost of that deployment... the entire AI journey is changing so rapidly... the whole ecosystem is changing so fast."

  • Nathan Baj (Intel): "Do I need a GPU in every instance is the first question that we are trying to ask... they can reutilize [existing CPUs]... that's where the cost versus performance comes in."

  • Vive Kanea (CDAC) (on talent): "When it comes to actual deployments on field I think that's where we are lacking... we need to have a serious look at our curriculums... how do I train large models how do I deploy large models using MLOps."

  • Nathan Baj (Intel) (on success vision): "When we were like about 150 in terms of data usage... today we are number one... if we can consume the data for improving the general intelligence of public that will make a large scale impact... if a subzi walla can figure out how they can uplevel their state... that would be the best way."


Speakers & Organizations Mentioned

| Entity | Role | Affiliation |
|---|---|---|
| Aman Raj Khana | Moderator | Asia Group (Partner & MD for India) |
| Vive Kanea | Panelist | CDAC – Center for Development of Advanced Computing, Ministry of Electronics & IT (Executive Director) |
| Nathan Baj | Panelist | Intel India (Director of Sales for Conglomerate Accounts) |
| Brad Smith | Referenced | Microsoft (announced $50B global investment in Global South, $20B for India) |
| Google | Referenced | $15B+ investment announced |
| Anthropic | Referenced | Partnership with Infosys, Tata; OpenAI announcement also referenced |
| Sangeeta Reddy | Closing speaker (unfinished) | Apollo Hospitals (Joint Managing Director) |

Government Initiatives Referenced:

  • PAX Silica (announced morning of summit)
  • National Supercomputing Mission
  • National Knowledge Network (NKN)

Technical Concepts & Resources

Compute Infrastructure

  • PARAM Supercomputing Series: CDAC's branded supercomputers, from the original PARAM 8000 through the current generation
  • Current Capacity: ~48 petaflops across distributed installations; scaling to 100 petaflops by end of 2024
  • Installations: 60+ connected via National Knowledge Network (NKN)
  • PARAM Utkarsh: CDAC's open-access facility for startups and MSMEs

Processors & Silicon

  • Intel Core Ultra & Core CPUs: Integrated GPU + NPU + CPU; capable of running 7-8B parameter models at edge
  • Intel Xeon 6 Processors: Data center grade; can run 20B parameter models, scaling to 80B parameters
  • Intel Granite Rapids & Sapphire Rapids: Xeon data center generations; AMD also referenced as a global competitor
  • Nvidia H100, B200: High-end GPUs to be sourced from global suppliers in the near term rather than replicated domestically
  • CDAC RISC-V GPGPU: Custom GPGPU design in development, target launch 2029-2030

AI Models & Deployment

  • SLMs (Small Language Models): Emerging focus as enterprises move beyond LLMs
  • Indic Language Models: Localized AI mentioned as high-impact use case for India
  • Parameter Ranges Discussed:
    • 7-8B parameters (edge devices)
    • 20-80B parameters (data center CPUs)
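
The parameter ranges above map directly to memory footprint, which is what makes CPU-only serving plausible in the first place. As a back-of-envelope sketch (the helper functions, byte widths, and RAM budgets below are common rules of thumb for illustration, not figures cited in the session), weight memory is roughly parameter count times bytes per parameter at a given quantization:

```python
# Illustrative model-memory arithmetic behind "right-sizing" decisions.
# Assumed values: standard quantization byte widths; ignores KV cache
# and activation memory, which add real-world headroom requirements.

BYTES_PER_PARAM = {
    "fp16": 2.0,   # half-precision weights
    "int8": 1.0,   # 8-bit quantization
    "int4": 0.5,   # 4-bit quantization
}

def weight_memory_gb(params_billion: float, precision: str) -> float:
    """Approximate weight memory in GB (billions of params x bytes/param)."""
    return params_billion * BYTES_PER_PARAM[precision]

def fits_in(params_billion: float, precision: str, ram_gb: float) -> bool:
    """Do the quantized weights fit within a given RAM budget?"""
    return weight_memory_gb(params_billion, precision) <= ram_gb

# An 8B model at int4 needs ~4 GB — edge-class hardware territory.
print(weight_memory_gb(8, "int4"))   # 4.0
# An 80B model at int8 needs ~80 GB — data-center CPU memory territory.
print(weight_memory_gb(80, "int8"))  # 80.0
print(fits_in(8, "int4", 16))        # True
```

This mirrors the panel's framing of frugal AI: whether a workload needs a GPU often reduces to whether the quantized weights (plus headroom) fit in memory the enterprise already owns.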

Application Domains

  • Drug discovery & bioinformatics
  • Protein folding & molecular modeling
  • Weather prediction
  • Oil exploration
  • Finite element modeling & computational fluid dynamics
  • Smart manufacturing (surveillance, digital twins, "dark factories")
  • Smart retail (theft prevention, customer analytics)
  • Document search & policy analysis
  • Multimodal use cases

Key Metrics & Concepts

  • PUE (Power Usage Effectiveness): Intel data centers achieve 1.06; typical water-cooled facilities ~1.2-1.5
  • Frugal AI: Intel's cost-conscious deployment framework
  • MLOps: Machine Learning Operations maturity—identified as critical gap
  • Throughput Examples: Human reading speed ~10 tokens/second; 15-20 tokens/second is sufficient inference throughput in some use cases; high-performance serving targets 200+ tokens/second
  • Energy Benchmarking: Energy per token (training/inference) proposed as critical metric
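
Both proposed benchmarks are simple ratios, which is part of their appeal. A minimal sketch (the example inputs below are illustrative assumptions, not measurements from the talk):

```python
# PUE and energy-per-token expressed as plain ratios.
# All numeric inputs in the examples are hypothetical.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the theoretical ideal (all power reaches the IT load)."""
    return total_facility_kw / it_equipment_kw

def joules_per_token(server_power_watts: float, tokens_per_second: float) -> float:
    """Inference energy per token: sustained server power / token throughput."""
    return server_power_watts / tokens_per_second

# A facility drawing 1,060 kW to deliver 1,000 kW of IT load has PUE 1.06.
print(round(pue(1060, 1000), 2))   # 1.06
# A hypothetical 400 W server sustaining 200 tokens/s spends 2 J per token.
print(joules_per_token(400, 200))  # 2.0
```

Framed this way, the two metrics compose naturally: moving from a PUE of ~1.3 to 1.06 cuts facility energy per useful token by roughly 20%, which is why the panel treats chip-level and facility-level efficiency together.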

Cooling Technologies

  • Liquid Cooling: Roughly a 70:30 air-to-liquid mix in current deployments; industry moving toward pure liquid cooling
  • Air Cooling: Advanced variants exist
  • RibbonFET Transistors: Intel's gate-all-around transistor architecture, cited as improving power efficiency by ~15%

Context & Significance

This conversation captures a critical inflection point in India's AI maturity. While policy infrastructure, government compute capacity, and private investment are accelerating rapidly, the panelists highlight a persistent gap between capability announcements and production deployment. The discussion is notably candid about constraints:

  • The "pilot trap" reflects a real phenomenon in Indian enterprises, not just a perception
  • Talent and curriculum gaps are acknowledged as structural challenges, not easily fixed by policy
  • Data quality and real-world MLOps remain underestimated as barriers
  • Cost and ROI dominate enterprise decision-making more than data sovereignty rhetoric suggests

The panelists' emphasis on pragmatic sovereignty (control above the silicon layer) and frugal AI (right-sizing models and infrastructure) offers a more realistic roadmap than aspirational vertical integration. The demographic argument—that India's young, tech-exposed population will rapidly close skill gaps—is presented with cautious optimism, not certainty.

Success, according to both speakers, will not be measured in summit announcements or exaflop counts, but in widespread, tangible deployment across diverse sectors and populations.