Google has detailed its next-generation AI accelerator, the Ironwood TPU Superpod, underscoring its push to lead in AI infrastructure. According to information the company shared at the Hot Chips 2025 conference, the system, built on its 7th-generation TPU architecture, delivers more than a 16x performance increase over the previous generation.
Google Ironwood Coming Soon
Each Ironwood pod contains 9,216 chips, with 192 GB of HBM memory per chip. A single chip delivers 4,614 TFLOPS of processing power, more than 16 times the performance of TPU v4, which was introduced in 2022.

This brings total computing power to 42.5 exaflops, which Google says makes Ironwood 24 times faster than a comparably sized segment of El Capitan, currently ranked the world's most powerful supercomputer. However, Google uses FP8 precision for this comparison and notes that El Capitan does not support it natively.
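As a back-of-the-envelope check, the pod-level figure follows directly from the per-chip numbers (assuming the 4,614 TFLOPS rating is per chip, as the generational comparison suggests):

```python
# Figures quoted for Ironwood (FP8)
chips_per_pod = 9216
tflops_per_chip = 4614

# TFLOPS -> exaFLOPS: 1 EFLOPS = 1e6 TFLOPS
pod_exaflops = chips_per_pod * tflops_per_chip / 1e6
print(f"{pod_exaflops:.1f} EFLOPS")  # ~42.5, matching the quoted total
```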
At the heart of Ironwood is the next-generation Ironwood SoC: each motherboard carries four chips, and 16 motherboards combine to form a rack. This modular structure is organized in blocks of 64 chips.
Google connects these blocks using a 3D torus network topology in a 4x4x4 arrangement, providing low latency and high flexibility in data communication. Furthermore, the system can scale up to 43 blocks using Google's Inter-Chip Interconnect (ICI) technology, enabling 1.8 petabytes of memory to be shared across the network.
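The block and memory numbers are internally consistent: a 4x4x4 torus holds exactly 64 chips, and 9,216 chips at 192 GB each comes out close to the quoted 1.8 PB (a quick sketch; that the 1.8 PB figure refers to aggregate HBM is an assumption here):

```python
# 3D torus: 4 chips along each of three dimensions
torus_chips = 4 * 4 * 4
print(torus_chips)  # 64, one block

# Aggregate HBM across a full pod
chips_per_pod = 9216
hbm_gb_per_chip = 192
total_pb = chips_per_pod * hbm_gb_per_chip / 1e6  # GB -> PB (decimal)
print(f"{total_pb:.2f} PB")  # ~1.77, close to the quoted 1.8 PB
```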
An Ironwood Superpod comprises 144 racks, along with optical switch enclosures and liquid cooling units. It uses hybrid interconnect technology: copper connections on the motherboard are combined with fiber-optic links to provide high bandwidth and flexibility.
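The rack count also squares with the chip total: at four chips per board and 16 boards per rack, 144 racks yield exactly the 9,216 chips quoted per pod (a simple consistency check on the article's figures):

```python
chips_per_board = 4
boards_per_rack = 16
chips_per_rack = chips_per_board * boards_per_rack  # 64, one block

racks_per_superpod = 144
total_chips = chips_per_rack * racks_per_superpod
print(total_chips)  # 9216, the quoted pod size
```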
On the power side, each rack has a dedicated power distribution system that converts 416 V AC input to DC. The entire system is liquid-cooled, and a single rack can draw more than 100 kW. These details highlight Google's ambitions in AI and the extensive infrastructure it has built to support them.