You can now rent Google’s most powerful AI chip: Trillium TPU underpins Gemini 2.0 and will put AMD and Nvidia on high alert

- Trillium has hit general availability just months after preview release
- Powerful AI chip offers more than four times the training performance
- Google uses it to train Gemini 2.0, the company’s advanced AI model
Google has been developing Tensor Processing Units (TPUs), its custom AI accelerators, for over a decade. A few months after releasing it in preview, the company has announced that its sixth-generation TPU, Trillium, has reached general availability and can now be rented.
Trillium doubles both the HBM capacity and the Interchip Interconnect bandwidth of its predecessor, and was used to train Gemini 2.0, the tech giant's flagship AI model.
Google reports it offers up to a 2.5x improvement in training performance per dollar compared to prior TPU generations, making it an appealing option for enterprises seeking efficient AI infrastructure.
Google Cloud’s AI Hypercomputer
Trillium delivers a range of other improvements over its predecessor, including more than four times the training performance. Energy efficiency has been increased by 67%, while peak compute performance per chip has risen by a factor of 4.7.
Trillium naturally improves inference performance as well. Google’s tests indicate over three times higher throughput for image generation models such as Stable Diffusion XL and nearly twice the throughput for large language models compared to earlier TPU generations.
The chip is also optimized for embedding-intensive models, with its third-generation SparseCore providing better performance for dynamic and data-dependent operations.
Trillium TPU also forms the foundation of Google Cloud’s AI Hypercomputer. This system features over 100,000 Trillium chips connected via a Jupiter network fabric delivering 13 Petabits/sec of bandwidth. It integrates optimized hardware, open software, and popular machine learning frameworks, including JAX, PyTorch, and TensorFlow.
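Because the Hypercomputer exposes Trillium through standard frameworks, a workload written in JAX runs unchanged on a TPU slice. As a minimal sketch (assuming JAX is installed), the snippet below checks which accelerator backend JAX sees and runs a trivial computation on it; on a machine without a TPU it simply falls back to CPU:

```python
# Minimal sketch: inspect the accelerator JAX is using.
# On a Cloud TPU VM this lists TPU devices; elsewhere it falls back to CPU.
import jax
import jax.numpy as jnp

devices = jax.devices()
print(f"JAX backend: {jax.default_backend()}, {len(devices)} device(s)")

# A trivial computation runs on whatever accelerator is available.
x = jnp.arange(8.0)
print(jnp.sum(x * 2))  # 56.0
```

The same code targets CPU, GPU, or TPU depending on the environment, which is what lets Cloud customers move existing JAX or PyTorch workloads onto Trillium without rewriting them.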
With Trillium now generally available, Google Cloud customers have the opportunity to access the same hardware used to train Gemini 2.0, making high-performance AI infrastructure more accessible for a wide range of applications.
