The framing is worth understanding because it clarifies where value is created and where the real competition is.

The Five Layers

Layer 1 — Chips: The physical substrate. GPUs, memory, networking. NVIDIA's core business. What's possible at every layer above depends on what happens here.

Layer 2 — Systems: How hardware is assembled and operated at scale. The hyperscalers (AWS, Azure, Google Cloud) dominate here. Energy and cooling economics are primary constraints.

Layer 3 — Frameworks: Software that abstracts hardware for AI developers — CUDA, PyTorch, inference servers. NVIDIA competes here too with NeMo and Triton.

Layer 4 — Models: The trained systems doing AI work — GPT-5, Claude, Gemini, Llama. Where OpenAI, Anthropic, and Google fight their most visible battles.

Layer 5 — Applications: Products built on models. Claude Code, ChatGPT, OpenClaw, every AI product. Where most startups and solo founders compete.

What Changes Everything

The key insight NVIDIA emphasizes: intelligence is now produced in real time. Software is no longer a pre-recorded algorithm retrieving stored answers; every response is generated fresh from context.

This changes the economics of every layer. Chips matter more. Inference efficiency matters more. The companies that control Layers 1-3 have leverage over everything built above them.
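The shift from retrieval to generation can be made concrete with a back-of-the-envelope comparison. All numbers below are illustrative assumptions, not actual vendor pricing: a cached lookup is assumed to cost fractions of a cent, while a generated response pays per output token.

```python
# Hypothetical unit economics: pre-recorded retrieval vs. real-time generation.
# Every price here is an assumption for illustration, not real pricing.

STORED_LOOKUP_COST = 0.000001        # assumed cost to serve one cached answer ($)
PRICE_PER_1K_OUTPUT_TOKENS = 0.01    # assumed inference price ($ per 1K tokens)
TOKENS_PER_RESPONSE = 500            # assumed length of a typical response

# Generation cost scales with tokens produced, on every single request.
generation_cost = (TOKENS_PER_RESPONSE / 1000) * PRICE_PER_1K_OUTPUT_TOKENS

print(f"retrieval:  ${STORED_LOOKUP_COST:.6f} per response")
print(f"generation: ${generation_cost:.6f} per response")
print(f"ratio: {generation_cost / STORED_LOOKUP_COST:,.0f}x")
```

Under these assumed numbers, each generated response costs thousands of times more than serving a stored one, which is why chip supply and inference efficiency in the lower layers now set the cost floor for everything above.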

For Builders

You're in Layer 5. Your success depends on what happens in Layers 1-4 — model quality, inference cost, hardware availability. Understanding what's changing in those layers gives you better intuition about where your category is going.

Model cost curves are falling. Inference efficiency is improving. The ceiling for what's economically viable to build at Layer 5 rises with every improvement below.
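The rising-ceiling effect can be sketched with a toy projection. The decline rate, token volumes, and subscription price below are all hypothetical assumptions; the point is only the shape of the curve: a product that loses money per user today crosses into viability as inference prices fall.

```python
# Sketch: how a falling inference price expands what is viable at Layer 5.
# The starting price, decline rate, usage, and revenue are assumptions.

price_per_1k_tokens = 0.02           # assumed starting inference price ($)
annual_decline = 0.5                 # assume the price halves each year
tokens_per_user_per_month = 2_000_000
revenue_per_user_per_month = 10.0    # hypothetical subscription price ($)

for year in range(5):
    monthly_cost = (tokens_per_user_per_month / 1000) * price_per_1k_tokens
    margin = revenue_per_user_per_month - monthly_cost
    status = "viable" if margin > 0 else "not yet"
    print(f"year {year}: cost ${monthly_cost:.2f}/user -> {status}")
    price_per_1k_tokens *= annual_decline
```

With these assumptions, the same product flips from money-losing to profitable around year three with no change at all in Layer 5 itself; the improvement comes entirely from Layers 1-3 below it.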