‘Feels like magic!’: Groq’s ultrafast LPU could well be the first LLM-native processor — and its latest demo may well convince Nvidia and AMD to get out their checkbooks


Groq, led by ex-Google engineer and CEO Jonathan Ross, claims to have created the first ever Language Processing Unit (LPU) which it says can deliver the fastest speeds for AI applications.
It’s a bold claim, but one that the latest demos more than back up, suggesting it could well become an absolute game-changer for AI.
Ross, who previously designed Google’s tensor processing unit (TPU), launched Groq in 2016 to create a chip capable of executing deep learning inference tasks more efficiently than existing CPUs and GPUs.
Lightning fast
The company’s Tensor Streaming Processor (TSP) is likened to an assembly line, processing data tasks in a sequential, organized manner. In contrast, a GPU is akin to a static workstation, where workers come and go to apply processing steps. The TSP’s efficiency became evident with the rise of generative AI, leading Ross to rebrand the TSP as the Language Processing Unit (LPU) to make it more recognizable.
Unlike GPUs, LPUs utilize a streamlined approach, eliminating the need for complex scheduling hardware, ensuring consistent latency and throughput. LPUs are also energy efficient, reducing the overhead of managing multiple threads and avoiding underutilization of cores. Groq’s scalable chip design allows multiple TSPs to be linked without traditional bottlenecks, simplifying hardware requirements for large-scale AI models.
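The assembly-line analogy can be made concrete with a toy simulation. This is purely an illustrative sketch, not Groq's actual design: it contrasts a statically scheduled pipeline, where every stage's cost is known up front and total latency is a fixed sum, with a dynamically scheduled pool, where a hardware scheduler dispatches work as resources free up and latency jitters from run to run. The stage costs are invented numbers.

```python
# Illustrative sketch only (hypothetical stage costs, not Groq's real design):
# deterministic static scheduling vs. variable dynamic scheduling.
import random

STAGES = [3, 2, 4, 1]  # hypothetical cycle cost of each pipeline stage

def static_pipeline_latency(stages):
    """LPU-style: latency is the fixed sum of stage costs -- fully deterministic."""
    return sum(stages)

def dynamic_schedule_latency(tasks, seed):
    """GPU-style: a scheduler assigns tasks to workers; queueing adds jitter."""
    rng = random.Random(seed)
    total = 0
    for cost in tasks:
        total += cost + rng.randint(0, 2)  # simulated contention/scheduling overhead
    return total

# The static pipeline produces the same latency every run...
assert static_pipeline_latency(STAGES) == static_pipeline_latency(STAGES)

# ...while the dynamically scheduled version varies with scheduling luck.
runs = {dynamic_schedule_latency(STAGES, seed) for seed in range(20)}
print(sorted(runs))  # several distinct latencies for identical work
```

That deterministic latency is also why linking many chips stays simple: if every unit's timing is known at compile time, the compiler can coordinate them without the cross-chip synchronization hardware that variable-latency designs need.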
The first public demo of Groq was a lightning-fast AI answer engine that generated factual, cited answers of hundreds of words in less than a second. Matt Shumer posted the test on X, noting that more than three-quarters of the time was spent searching, not generating.
“The first public demo using Groq: a lightning-fast AI Answers Engine. It writes factual, cited answers with hundreds of words in less than a second. More than 3/4 of the time is spent searching, not generating! The LLM runs in a fraction of a second.” https://t.co/dVUPyh3XGV https://t.co/mNV78XkoVB pic.twitter.com/QaDXixgSzp — February 19, 2024
While that’s impressive, watching Groq go head to head with ChatGPT is something else.
If you want to try Groq for yourself, to get an idea of just how fast it can be for AI, go to its chat page. Use the drop-down on the left to switch between the different available models.
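Beyond the chat page, Groq also offers a hosted API. The sketch below is a hedged example of calling it from Python: the endpoint URL and the "mixtral-8x7b-32768" model name are assumptions based on Groq's OpenAI-compatible interface at the time of writing, so check Groq's console for the current details. It only makes a network request if a `GROQ_API_KEY` environment variable is set.

```python
# Hedged sketch: querying Groq's hosted API. The endpoint and model name
# are assumptions -- verify against Groq's current documentation.
import json
import os
import urllib.request

API_URL = "https://api.groq.com/openai/v1/chat/completions"  # assumed endpoint

def build_payload(prompt, model="mixtral-8x7b-32768"):
    """Assemble an OpenAI-style chat-completion request body."""
    return {"model": model, "messages": [{"role": "user", "content": prompt}]}

def ask_groq(prompt):
    """Send the prompt and return the model's reply text."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Authorization": f"Bearer {os.environ['GROQ_API_KEY']}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

if __name__ == "__main__" and os.environ.get("GROQ_API_KEY"):
    print(ask_groq("In one sentence, what is an LPU?"))
```

Because the interface mirrors OpenAI's chat-completions shape, existing OpenAI client code can typically be pointed at Groq by swapping the base URL, key, and model name.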