Explainers
Tap-to-learn companions for people who land here wondering what any of this is. Not essays. Things you poke at.
AI compute
The full stack: chips, serving, post-training, model weights, and the terms.
Topic hub · Indian AI
Policy, compute, data, institutions, talent, applications, and gaps.
Reference · AI glossary
Search the vocabulary around AI chips and infrastructure.
India's AI build
Who does what across policy, compute, energy, models, applications, talent, and data.
Open map →
Decision tree · AI chip buyer's guide
Pick the right chip for training, inference, batch, latency, or budget constraints.
Open guide →
Training pipeline · The RL post-training map
See what happens after pre-training and where the 2026 compute growth sits.
Open pipeline →
Searchable terms · AI, term by term
Search the words people throw around when they talk about AI chips.
Search glossary →
- 22 Apr 2026 · The AI stack. From electrons to tokens, one layer at a time. Tap any of the nine layers to unfold what it is, who operates there, and what breaks.
- 22 Apr 2026 · India's AI build. The functional map of who does what in Indian AI. Policy, compute, energy, models, applications, talent, data – area by area, with the gaps drawn in.
- 22 Apr 2026 · AI chip buyer's guide. Which chip for which workload. Eight chips (H100 / H200 / B200 / GB300 / TPU / Trainium / MI300X / Groq) compared on specs, ecosystem, availability, and fit – plus a three-question decision tree.
- 22 Apr 2026 · The RL post-training map. What happens to an LLM after pre-training. SFT, RLHF, DPO, PPO, GRPO, RLVR explained – with the pipeline, the terms, and why post-training is the growth slice of 2026 compute.
- 22 Apr 2026 · AI, term by term. What the words mean when you read about AI chips. Search across 100-plus terms, filter by category, click through to definitions and context.