
Google will invest up to $40 billion in the AI company Anthropic to deepen their cooperation.
The relationship between Google and Anthropic is unusual: they are both partners and competitors. Anthropic is a major Google customer, making extensive use of Google's proprietary AI chips (TPUs) and its cloud services. Google, in turn, is eager to grow these newer businesses because growth in its core business, search advertising, has slowed and it needs new revenue streams.
Under the latest agreement, Google Cloud will supply Anthropic with 5 gigawatts of computing capacity (roughly the power draw of several large data centers) over the next five years, coming online gradually from 2027, with the option to add several more gigawatts after that.
The era of the "big three" is over
This large-scale investment by Google marks a fundamental change in the AI industry landscape.
For the past two years, the conventional wisdom has been that AI's "first tier" is a triad of OpenAI, Google, and Anthropic. That framing is now outdated.
Look at Anthropic's financing over the last six months and a surprising fact emerges: all four of Silicon Valley's tech giants have become its backers:
Amazon: invested $5 billion (up to $25 billion), pledged 5 GW of Trainium chip capacity, and signed a $100 billion AWS cloud services order.
Google: investing $10 billion initially (up to $40 billion), plus 5 GW of TPU capacity.
Nvidia: investing up to $10 billion and supplying 1 GW of GPU capacity.
Microsoft: investing up to $5 billion, while Anthropic commits to spending $30 billion on Azure computing capacity.
With all four giants on Anthropic's shareholder list, the cumulative commitments exceed 11 GW of computing capacity - roughly the output of ten nuclear power plants.
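The arithmetic behind that total can be checked directly. A quick tally of the figures quoted above (Microsoft's pledge is denominated in Azure dollars rather than gigawatts, so it is left out of the sum):

```python
# Gigawatt commitments quoted above, by backer.
# Microsoft's pledge ($30B of Azure spend) has no stated GW figure,
# so only three of the four giants appear here.
commitments_gw = {
    "Amazon (Trainium)": 5.0,
    "Google (TPU)": 5.0,
    "Nvidia (GPU)": 1.0,
}

total_gw = sum(commitments_gw.values())
print(f"Stated commitments: {total_gw:.0f} GW")  # 11 GW

# A large nuclear reactor produces roughly 1 GW of electrical power,
# so "ten nuclear power plants" is the right order of magnitude.
print(f"= about {total_gw / 1.0:.0f} reactor-sized power blocks")
```

With Microsoft's unquantified Azure capacity on top, "more than 11 GW" follows.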
The landscape is no longer a three-way contest; it has become a two-way confrontation between the "Anthropic plus the Big Four" camp and OpenAI.
A clear technical route dispute
Anthropic is taking the path of "large model + specialized chip (ASIC)".
It primarily relies on Google's TPU and Amazon's Trainium – both of which are chips specifically designed for AI tasks.
For Google, investing in Anthropic is not just a bet on a company; it is also a way to land a marquee customer for the TPU. Google plans to spend $185 billion this year building data centers and expanding TPU production. If no one uses those chips, they become very expensive inventory. Anthropic not only helps Google absorb that capacity but can also pull other enterprises toward the TPU - an anchor for the whole TPU business.
OpenAI is taking the route of "large model + GPU".
Its computing power comes from Nvidia, via a massive infrastructure project called Stargate, with a total investment target of up to $500 billion, all built around Nvidia GPUs. The advantages of this route are a mature ecosystem and ease of development; the disadvantage is equally obvious: it is slow to build. Stargate will not be fully operational until around 2029, and even the first data center, in Texas, has progressed slowly.
Two routes, each with its own advantages and disadvantages:
ASICs (such as TPU and Trainium): energy-efficient and cheaper per unit of computing power, but highly customized and less flexible.
GPUs (such as Nvidia's): highly versatile, with a large developer base and complete tooling, but power-hungry and expensive.
In the end, today's AI competition is no longer just a contest between models; it looks more like a proxy war between an ASIC camp and a GPU camp: Google and Amazon promote their own chips by backing Anthropic, while Nvidia relies on OpenAI and Stargate to keep GPUs dominant in AI infrastructure.
Google's calculus: borrow strength rather than fight head-on
Why is Google willing to spend $40 billion on a nominal rival? Three facts explain it:
First, Anthropic is growing too fast to ignore.
At the beginning of 2025 it had only about $1 billion in annualized revenue; by March 2026 annualized revenue had passed $30 billion - roughly a 30-fold increase in about a year. Its coding product, Claude Code, has spread from individual programmers to Fortune 500 customers.
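As a sanity check on that growth claim, the implied compound monthly rate can be worked out. This is a rough sketch: "beginning of 2025" to March 2026 is taken as 14 months, which is an assumption.

```python
# Annualized revenue figures from the article.
start_arr_b = 1.0   # ~$1B annualized, early 2025
end_arr_b = 30.0    # >$30B annualized, March 2026
months = 14         # assumed elapsed time (early 2025 -> March 2026)

# Compound monthly growth rate g satisfies: start * (1 + g)**months = end.
monthly_growth = (end_arr_b / start_arr_b) ** (1 / months) - 1
print(f"Implied compound monthly growth: {monthly_growth:.1%}")
```

That works out to revenue compounding at well over 20% per month, which is the scale of growth that makes four rivals invest at once.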
Second, the market values Anthropic at close to $1 trillion.
In the enterprise AI developer market, its share has steadily surpassed that of Google's own Gemini.
Third, Google has been anxious for some time.
It owns DeepMind, Gemini, and the world's largest TPU cluster, yet more than two years after Gemini's launch it has never managed to beat Claude in the enterprise market. Rather than go head-to-head, the logic runs: if you can't beat them, join them.
So this investment is essentially a shrewd hedge: if Anthropic wins the enterprise AI market, Google profits on its equity; if Gemini catches up, Google wins outright; and even if Gemini falls short, Anthropic's large TPU purchases keep Google's chips from piling up while giving Google a foothold in the enterprise AI market.
Google's open strategy: if you can't afford to lose, buy in
Main line 1: The "arms race" of AI infrastructure
Cloud computing giants (Google Cloud, AWS): computing power is the moat.
Chinese domestic AI chips: the ASIC route has now been validated, opening a strategic window for Huawei Ascend and Cambricon.
Main line 2: Pay attention to the synergy effect of "model + chip"
Companies that pair self-developed chips with their own large models (such as Google and Amazon) will gain a long-term cost advantage; pure model companies without strong computing power backing will see their room to maneuver squeezed.
Main line 3: Be wary of the uncertainty of the OpenAI ecosystem
The GPT series is strong, but if the computing power bottleneck is not solved, commercialization may stall; developers should evaluate multi-model deployment strategies rather than put all their eggs in one basket.





