Technology

A Thorough Review of AI Chips from Nvidia, AMD, Google, and Tesla

Published March 26, 2025

This article provides a detailed look at the AI chips developed by Nvidia, AMD, Google, and Tesla. Notably, Tesla's Dojo 3 chip has the potential to surpass AMD in both per-chip AI performance and the number of chips produced.

The significance of AI computing is highlighted on Tesla's Abundance Slide, which mentions the Dojo 2 and Dojo 3 training chips. These chips are designed to improve Full Self-Driving (FSD) capabilities and provide support for the Optimus robot. Additionally, Tesla's upcoming AI5 and AI6 inference chips will be integrated into the Optimus and Robotaxi platforms.

In terms of production, AMD is estimated to have shipped between 300,000 and 400,000 of its Instinct MI300 AI chips in 2024, generating approximately $5 billion in revenue.

The average selling price (ASP) of the chips can be calculated as follows:
$5 billion ÷ 300,000 units = ~$16,667 per chip
$5 billion ÷ 400,000 units = ~$12,500 per chip
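The ASP arithmetic above can be sketched in a few lines of Python; note that the revenue and shipment figures are the estimates quoted in this article, not AMD disclosures:

```python
# Back-of-the-envelope average selling price (ASP) for AMD's MI300,
# using the estimated 2024 figures above (assumptions, not AMD data).
revenue_2024 = 5_000_000_000               # ~$5 billion in MI300 revenue

asp_low_volume = revenue_2024 / 300_000    # ~$16,667 per chip
asp_high_volume = revenue_2024 / 400_000   # ~$12,500 per chip

print(f"ASP at 300k units: ~${asp_low_volume:,.0f}")
print(f"ASP at 400k units: ~${asp_high_volume:,.0f}")
```

The same arithmetic applied to the 2025 outlook ($7.5 billion over ~500,000 chips) implies an ASP of about $15,000, consistent with this range.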

Looking ahead, AMD anticipates selling around 500,000 AI chips in 2025, which could bring in about $7.5 billion in revenue.

Nvidia AI Chips in 2025

Nvidia's data center revenue is a useful indicator of their AI chip sales. In 2024, analysts projected that Nvidia would achieve approximately $110.36 billion in data center revenue. Given the company’s leadership in the AI sector and expected growth, estimates for revenue in 2025 reach around $120 billion.

The quantity of chips sold will depend on the ASP. Nvidia's H100 GPUs are reported to sell for prices ranging from $20,000 to $40,000 each. If we assume an average ASP of $30,000 per chip, the calculation would be:
$120 billion ÷ $30,000 = 4,000,000 chips (or 4 million chips).
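This estimate is easy to reproduce; both inputs (projected revenue and assumed ASP) are the article's assumptions rather than Nvidia figures:

```python
# Rough implied 2025 Nvidia AI chip volume from projected data center
# revenue; revenue and ASP are assumptions, not Nvidia disclosures.
revenue_2025 = 120_000_000_000   # ~$120B projected data center revenue
asp = 30_000                     # assumed midpoint of $20k-$40k H100 pricing

chips = revenue_2025 / asp
print(f"Implied chips sold: {chips:,.0f}")   # 4,000,000
```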

Google TPUs in 2025

Google develops Tensor Processing Units (TPUs) primarily for its own use, which means that 'made and installed' refers to production within its data centers rather than sales. In 2024, global shipments of self-developed cloud-based AI application-specific integrated circuits (ASICs), including TPUs, were estimated at around 3.45 million units, with Google capturing a 74% market share, equating to approximately 2.55 million TPUs. Assuming a market growth rate of 20% in 2025, total shipments may reach 4.14 million units, with Google maintaining its share:
4.14 million × 0.74 = 3.06 million TPUs.
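The TPU projection follows directly from these three estimates; the market size, growth rate, and Google's share are all assumptions from the article:

```python
# Sketch of the TPU shipment projection above; market size, growth
# rate, and Google's share are the article's estimates.
shipments_2024 = 3.45e6   # estimated cloud AI ASIC shipments in 2024
google_share = 0.74       # Google's estimated market share
growth = 1.20             # assumed 20% market growth in 2025

tpus_2024 = shipments_2024 * google_share   # ~2.55 million
shipments_2025 = shipments_2024 * growth    # ~4.14 million
tpus_2025 = shipments_2025 * google_share   # ~3.06 million

print(f"2024 TPUs: ~{tpus_2024 / 1e6:.2f}M")
print(f"2025 TPUs: ~{tpus_2025 / 1e6:.2f}M")
```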

In terms of performance, the current TPU v4 offers 275 teraFLOPs (bfloat16), while the TPU v5e provides 197 teraFLOPs (bfloat16). Looking ahead, the sixth-generation Trillium TPU, planned for a 2025 release, is expected to reach around 400 teraFLOPs per chip (bfloat16).

Tesla Dojo 2 in 2025

Tesla's Dojo 2 AI training chip is anticipated to be in high-volume production by late 2025. Tesla previously indicated that its Dojo 1 chips, each delivering 367 teraFLOPs, correspond to 5% of the performance of 50,000 to 100,000 Nvidia H100 chips. This roughly translates to between 15,000 and 30,000 Dojo 1 chips.

Tesla's investments in Dojo supercomputers are believed to be around $500 million annually. Assuming an ASP of $10,000 per chip (similar to high-end AI chips):
$500 million ÷ $10,000 = 50,000 chips.
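The implied chip count can be checked with the same kind of sketch; both the $500 million annual spend and the $10,000 ASP are assumptions stated above:

```python
# Implied annual Dojo chip volume from the spending assumption above;
# both the annual spend and the ASP are the article's assumptions.
annual_spend = 500_000_000   # assumed annual Dojo investment
assumed_asp = 10_000         # assumed cost per chip

chips_per_year = annual_spend / assumed_asp
print(f"Implied Dojo chips per year: {chips_per_year:,.0f}")  # 50,000
```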

Dojo 2 is expected to deliver roughly 10 times the performance of Dojo 1, which would put it somewhere between 3 and 4 petaFLOPs, likely surpassing the Nvidia H100's performance.

The future Dojo 3 chip, planned for production in 2026, is projected for another roughly 10-fold increase, which would put it near 40 petaFLOPs and make it competitive with Nvidia's B300. After adjusting for pricing, it could also be favorably positioned against Nvidia's Rubin chips.
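Under the article's 10x-per-generation assumption, the Dojo performance ladder works out as follows (the only measured input is Dojo 1's 367 teraFLOPs; the rest is projection):

```python
# Generational scaling sketch: each Dojo generation is assumed to be
# roughly 10x the previous one, per the projections above.
dojo1_tflops = 367                         # Dojo 1, teraFLOPs (stated)

dojo2_pflops = dojo1_tflops * 10 / 1000    # ~3.7 petaFLOPs (projected)
dojo3_pflops = dojo2_pflops * 10           # ~37 petaFLOPs (projected)

print(f"Dojo 2: ~{dojo2_pflops:.1f} petaFLOPs")
print(f"Dojo 3: ~{dojo3_pflops:.0f} petaFLOPs")
```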

Tesla and xAI are set to build large-scale AI data centers featuring one million chips by 2026. If the Dojo 3 chip proves successful and xAI along with Tesla become its major customers, Tesla would be positioned as the second-leading provider of AI chips, surpassing AMD.

AI, chips, technology