AI GPU accelerators with 6TB of HBM memory could appear by 2035 as AI GPU die sizes are set to shrink – but there’s far worse coming up


- Future AI memory chips could demand more power than entire industrial zones
- 6TB of memory in one GPU sounds amazing until you see the power draw
- HBM8 stacks are impressive in theory, but terrifying in practice for any energy-conscious enterprise
The relentless drive to expand AI processing power is ushering in a new era for memory technology, but it comes at a cost that raises practical and environmental concerns, experts have warned.
Research by the Korea Advanced Institute of Science & Technology (KAIST) and the Terabyte Interconnection and Package Laboratory (TERA) suggests that by 2035, AI GPU accelerators equipped with 6TB of HBM could become a reality.
These developments, while technically impressive, also highlight the steep power demands and increasing complexity involved in pushing the boundaries of AI infrastructure.
Rise in AI GPU memory capacity brings huge power consumption
The roadmap reveals that the evolution from HBM4 to HBM8 will deliver major gains in bandwidth and memory stacking, alongside new cooling techniques.
Starting in 2026 with HBM4, Nvidia's Rubin and AMD's Instinct MI400 platforms will incorporate up to 432GB of memory, with bandwidths reaching nearly 20TB/s.
This memory generation employs direct-to-chip liquid cooling and custom packaging methods to handle power draws of around 75W to 80W per stack.
HBM5, projected for 2029, doubles the input/output lanes and moves toward immersion cooling, with up to 80GB per stack consuming 100W.
However, the power requirements will continue to climb with HBM6, anticipated by 2032, which pushes bandwidth to 8TB/s and stack capacity to 120GB, each drawing up to 120W.
These figures add up quickly: full GPU packages are expected to consume up to 5,920W per chip, assuming 16 HBM6 stacks in a system.
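As a rough sanity check on that figure, here is a minimal sketch of the arithmetic; note that the split between memory power and GPU logic power is an inference for illustration, since only the per-stack and per-package totals are quoted:

```python
# Rough arithmetic behind the ~5,920W HBM6-era package figure.
HBM6_STACKS = 16
WATTS_PER_STACK = 120            # upper bound quoted for HBM6

memory_power = HBM6_STACKS * WATTS_PER_STACK     # 1,920W for memory alone
PACKAGE_POWER = 5_920                            # total package figure cited
gpu_logic_power = PACKAGE_POWER - memory_power   # ~4,000W implied for compute dies

print(f"HBM6 memory power: {memory_power} W")
print(f"Implied GPU logic power: {gpu_logic_power} W")
```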
By the time HBM7 and HBM8 arrive, the numbers stretch into previously unimaginable territory.
HBM7, expected around 2035, triples bandwidth to 24TB/s and enables up to 192GB per stack. The architecture supports 32 memory stacks, pushing total memory capacity beyond 6TB, but the power demand reaches 15,360W per package.
That estimated 15,360W draw represents a sevenfold rise in just nine years.
This means that a million of these in a data center would consume 15.36GW, a figure that roughly equals the UK’s entire onshore wind generation capacity in 2024.
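A minimal sketch of the arithmetic behind those HBM7 figures; the 2026-era baseline is inferred from the quoted sevenfold rise rather than stated directly in the research:

```python
# Arithmetic behind the HBM7-era figures quoted above.
HBM7_PACKAGE_WATTS = 15_360
STACKS, GB_PER_STACK = 32, 192

# 32 stacks of 192GB push total capacity past 6TB.
total_tb = STACKS * GB_PER_STACK / 1024
print(f"Total HBM7 capacity: {total_tb:.0f} TB")          # 6 TB

# A hypothetical fleet of one million such packages.
fleet_gw = HBM7_PACKAGE_WATTS * 1_000_000 / 1e9
print(f"Fleet draw: {fleet_gw:.2f} GW")                   # 15.36 GW

# The quoted sevenfold rise implies a 2026-era baseline of roughly:
print(f"Implied 2026 baseline: {HBM7_PACKAGE_WATTS / 7:.0f} W")  # ~2,194 W
```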
HBM8, projected for 2038, expands capacity and bandwidth further, with 64TB/s per stack and up to 240GB of capacity, using 16,384 I/O lanes at 32Gbps per pin.
It also features coaxial TSV, embedded cooling, and double-sided interposers.
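That 64TB/s per-stack figure follows directly from the quoted interface width and pin speed; a quick check of the arithmetic, using the standard conversion from lane count and per-pin rate to bytes per second:

```python
# HBM8 per-stack bandwidth from the quoted interface parameters.
IO_LANES = 16_384      # I/O count quoted for HBM8
GBPS_PER_PIN = 32      # per-pin signalling rate

bandwidth_gb_s = IO_LANES * GBPS_PER_PIN / 8     # bits -> bytes: 65,536 GB/s
print(f"{bandwidth_gb_s / 1024:.0f} TB/s")       # 64 TB/s
```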
The growing demands of AI and large language model (LLM) inference have driven researchers to introduce concepts like HBF (High-Bandwidth Flash) and HBM-centric computing.
These designs propose integrating NAND flash and LPDDR memory into the HBM stack, relying on new cooling methods and interconnects, but their feasibility and real-world efficiency remain to be proven.