This is Part 4 of the NVIDIA Series. Parts 1–3 are available here.


75% of Google's Revenue Depends on One Habit

Start with the facts.

Alphabet's 2025 revenue was approximately $350 billion. Roughly 75% of that — on the order of $260 billion — came from advertising, predominantly search advertising: the business of showing relevant ads to people who search for things on Google.

The entire model rests on one behavioral assumption: people will keep using Google to search.

That assumption is under pressure. ChatGPT, Claude, Gemini, and dozens of AI assistants have begun substituting for search in exactly the scenarios where Google's advertising is most valuable — researching travel, comparing products, getting answers to specific questions. The habit of "asking AI" rather than "searching Google" is spreading, particularly among younger and more tech-literate users.

For Google's core business, this is an existential question in slow motion.

The Arsonist Selling Fire Hoses

Here is where the story becomes genuinely interesting.

Google has been developing its own AI chips — Tensor Processing Units, or TPUs — since 2016. The latest announced generation, TPU v7 (codenamed Ironwood), delivers a claimed 4x improvement in inference performance over the previous generation.

Unlike NVIDIA, Google does not sell TPUs as hardware. Instead, they are offered exclusively through Google Cloud as a computing service. If you want access to TPU compute, you pay Google to run workloads in their data centers.

The deals being signed are substantial. In 2025, Anthropic committed to a multi-billion-dollar TPU arrangement with Google, targeting access to one million TPU v7 chips by 2026. Meta has reportedly been in discussions about deploying TPUs in its own data centers. Apple disclosed in a technical paper that it trained Apple Intelligence models on TPU v5p hardware.

The company whose search advertising is being cannibalized by AI is simultaneously selling the infrastructure those AI models run on. Google set the fire. Google is also selling the hoses.

Can Advertising Servers Become AI Servers?

This is where we move from established fact to informed speculation — and the distinction matters.

What can be stated with confidence: Google operates one of the world's largest data center networks. The physical infrastructure — land, power grids, cooling systems, fiber networks — required to run advertising workloads is substantially the same infrastructure required to run AI workloads. The foundation transfers. Data centers built for AdWords can house TPUs.

What cannot: The chips themselves do not transfer. Advertising servers run on conventional CPUs. AI inference and training require TPUs or GPUs. Replacing the hardware inside existing data centers requires capital expenditure at significant scale.

The reasonable inference: If AI compute demand grows faster than advertising demand declines — which appears to be the current trajectory — Google has a plausible path to offsetting revenue erosion through Cloud AI services. And because Google designs its own TPUs rather than purchasing from NVIDIA, its cost structure for this transition is likely more favorable than Amazon's or Microsoft's.

Whether this actually plays out depends on the relative pace of advertising decline and AI cloud growth — two variables that remain genuinely uncertain.

Google vs. Meta: A Useful Contrast

Both companies derive the vast majority of revenue from digital advertising. Both face the same structural threat from AI displacing ad-supported search and social browsing. But their positions are meaningfully different.

Meta's advertising dependency is actually higher than Google's — roughly 97% of revenue. And while Meta does design a custom chip (MTIA, used only for internal workloads), it has no equivalent of the TPU-as-a-service business: no cloud computing arm, no infrastructure it can sell to others. Its AI investment is concentrated in Llama (an openly released model it distributes freely) and augmented reality hardware — both of which are difficult to connect to near-term revenue.

Meta's AI strategy appears to be a bet that better AI makes its advertising products more effective and that future AR devices create a new platform. Google's strategy includes that bet but also includes a direct revenue stream from selling AI infrastructure to the same companies competing with Google's own AI products.

From an investor's perspective, Google's AI exposure is more structurally coherent — though neither company has fully answered how AI ultimately affects their core advertising economics.

NVIDIA's Moat Intact — But Encircled

The core NVIDIA thesis from Parts 1 and 2 remains valid. CUDA's nearly two-decade ecosystem cannot be dismantled quickly. NVIDIA holds 80–90% market share in AI compute.

But the strategic landscape around NVIDIA has changed:

Company     Custom Silicon   Purpose
Google      TPU v7           Cloud AI infrastructure
Amazon      Trainium3        AWS AI training and inference
Meta        MTIA             Internal social AI workloads
Apple       Neural Engine    On-device AI inference
Microsoft   Maia             Azure AI services

Every major technology company is now developing chips specifically to reduce NVIDIA dependence. The motivation is consistent across all of them: supply security, cost control, and workload-specific optimization.

This does not threaten NVIDIA immediately. None of these chips has a general-purpose software ecosystem comparable to CUDA. The displacement, if it comes, is a decade-scale process. But the trajectory is established.

The Structural Picture

The most useful framework for investors may be to view this not as a zero-sum competition but as a layered ecosystem:

  • NVIDIA dominates general-purpose AI compute and holds the CUDA moat
  • Google/Amazon/Microsoft are building specialized infrastructure for their own cloud customers
  • Japanese supply chain companies benefit from growing AI infrastructure regardless of which compute platform wins

Google's potential transition from advertising company to AI infrastructure company — if it occurs — would represent one of the most significant business model pivots in corporate history. It is not certain. The advertising business may prove more resilient than feared, or AI cloud revenue may disappoint. But the structural pieces for such a transition exist in a way they simply do not for most other companies facing AI disruption.

The arsonist has unusually good fire-fighting equipment.


This concludes the NVIDIA Series. Parts 1–3 are available here.


Source: Alphabet and Meta public filings and press reports

Disclaimer | This article is for informational purposes only and does not constitute investment advice.