Samsung develops HBM-PIM, an AI processor-embedded memory chip


Samsung Electronics today said that it has developed an artificial intelligence (AI) processor-embedded high bandwidth memory (HBM) chip that boasts low energy consumption and enhanced performance.
The new processing-in-memory (PIM) technology will help bring powerful AI computing capabilities inside high-performance memory.
The chip, christened HBM-PIM, doubles the performance of AI systems while reducing power consumption by over 70% compared to conventional HBM2, Samsung said in a statement.
The technology will accelerate large-scale processing in data centers, high-performance computing (HPC) systems and AI-enabled mobile applications, Samsung added.
HBM-PIM is said to use the same HBM interface as older iterations. As a result, customers will not have to change any hardware or software to integrate the chip into their existing systems.
New chip maximizes parallel processing
Providing background on standard computer architecture, Samsung's statement explained that the processor and memory are separate and data is exchanged between the two. In such a configuration, latency arises, especially when large amounts of data are moved.
To sidestep these issues, Samsung installs AI engines inside each memory bank, maximizing parallel processing to boost performance.
“The HBM-PIM brings processing power directly to where the data is stored by placing a DRAM-optimized AI engine inside each memory bank — a storage sub-unit — enabling parallel processing and minimizing data movement.”
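The contrast between the two architectures can be sketched conceptually. The snippet below is an illustrative analogy only, not Samsung's implementation: the bank count, workload, and function names are all hypothetical. It compares a conventional layout, where every word of data crosses the bus to a central processor, with a PIM-style layout, where each bank's engine reduces its own data and only small partial results move.

```python
# Conceptual sketch of PIM vs. conventional data movement.
# BANKS and WORDS_PER_BANK are illustrative assumptions, not HBM-PIM specs.

BANKS = 16             # hypothetical number of memory banks
WORDS_PER_BANK = 1024  # hypothetical words held in each bank

memory = [[i % 7 for i in range(WORDS_PER_BANK)] for _ in range(BANKS)]

def conventional_sum(banks):
    """Ship every word across the bus to the processor, then reduce."""
    bus_traffic = sum(len(b) for b in banks)  # all data crosses the bus
    total = sum(sum(b) for b in banks)
    return total, bus_traffic

def pim_sum(banks):
    """Each bank's engine reduces its own data; only partial sums move."""
    partials = [sum(b) for b in banks]  # computed inside each bank
    bus_traffic = len(partials)         # one result per bank crosses the bus
    return sum(partials), bus_traffic

total_a, traffic_a = conventional_sum(memory)
total_b, traffic_b = pim_sum(memory)
assert total_a == total_b
print(traffic_a, traffic_b)  # 16384 vs 16: same answer, far less movement
```

Both paths produce the same result, but the PIM-style path moves 1,024x less data across the bus in this toy setup, which is the intuition behind the latency and power savings Samsung describes.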
The chip is currently being tested inside customers' AI accelerators, with testing expected to be completed within the first half of the year.
An AI accelerator is computer hardware that is designed specifically to handle AI requirements.
Samsung’s paper on the chip will be presented at the virtual International Solid-State Circuits Conference to be held on February 22.
Via: Samsung