AI marketing is a con – especially when it comes to CPUs

Artificial intelligence is increasingly making its presence felt in more areas of our lives, certainly since the launch of ChatGPT. Depending on your view, it’s that big bad bogeyman that’s taking jobs and causing widespread copyright infringement, or a gift with the potential to catapult humanity into a new age of enlightenment.
What many have achieved with the new tech, from Midjourney and LLMs to smart algorithms and data analysis, is beyond radical. It’s a technology that, like most of the silicon-based breakthroughs that came before it, has a lot of potency behind it. It can do a lot of good, but also, many fear, a lot of bad. And those outcomes are entirely dependent on how it’s manipulated, managed, and regulated.
It’s not surprising then, given how rapidly AI has forced its way into the zeitgeist, that tech companies and their sales teams are leaning into the technology just as hard, stuffing its various iterations into their latest products, all with the aim of encouraging us to buy their hardware.
Check out this new AI-powered laptop, that motherboard that uses AI to overclock your CPU to the limit, those new webcams featuring AI deep-learning tech. You get the point. You just know that from Silicon Valley to Shanghai, shareholders and company execs are asking their marketing teams “How can we get AI into our products?” in time for the next CES or the next Computex, no matter how modest the value will actually be for us consumers.
My biggest bugbear comes in the form of the latest generation of CPUs being launched by the likes of AMD, Intel, and Qualcomm. Now, these aren’t bad products, not by a long shot. Qualcomm is making huge leaps and bounds in the desktop and laptop chip markets, and the performance of both Intel and AMD’s latest chips is nothing if not impressive. Generation on generation, we’re seeing higher performance scores, better efficiency, broader connectivity, lower latencies, and ridiculous power savings (here’s looking at you, Snapdragon), among a whole slew of innovative design changes and choices. To most of us mere mortals, it’s magic way beyond the basic 0s and 1s.
Despite that, we still get AI slapped onto everything regardless of whether or not it’s actually adding anything useful to a product. We have new neural processing units (NPUs) added to chips: co-processors designed to accelerate the matrix-heavy math that neural networks rely on. These are then put into low-powered laptops, allowing them to use advanced AI features such as Microsoft’s Copilot assistant and tick that AI checkbox, as if it makes a difference to a predominantly cloud-based solution.
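For the curious, here’s a rough sketch (my own illustration, not anything these vendors actually ship) of how an application might ask ONNX Runtime for an NPU-backed execution provider and quietly fall back to the CPU when there isn’t one. The model file and the provider preference order are assumptions on my part:

```python
# Hypothetical sketch: try to run a small model on an NPU if one is exposed
# through an ONNX Runtime execution provider, otherwise fall back to the CPU.
# "model.onnx" and the provider preference order are placeholder assumptions.
import onnxruntime as ort

available = ort.get_available_providers()

# Prefer an NPU-backed provider (Qualcomm's QNN, or DirectML on Windows),
# then fall back to the plain CPU provider.
preferred = ["QNNExecutionProvider", "DmlExecutionProvider", "CPUExecutionProvider"]
providers = [p for p in preferred if p in available] or ["CPUExecutionProvider"]

session = ort.InferenceSession("model.onnx", providers=providers)
print("Running on:", session.get_providers()[0])
```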
The thing is, though: when it comes to AI, CPU performance is insignificant. Seriously insignificant, to the point where it’s not even mildly relevant. It’s like trying to launch NASA’s James Webb Space Telescope with a bottle of Coke and some Mentos.
Emperor’s new clothes?
I’ve spent the last month testing a raft of laptops and processors, specifically looking at how they handle artificial intelligence tasks and apps. Using UL’s Procyon benchmark suite (UL being the makers of 3DMark), you can run its Computer Vision inference test, which spits out a tidy score for each component. Intel Core i9-14900K? 50. AMD Ryzen 9 7900X? 56. 9900X? 79 (that’s a 41% performance increase gen-on-gen, by the way, which is seriously huge).
Here’s the thing, though: chuck a GPU through that same test, such as Nvidia’s RTX 4080 Super, and it scores 2,123. That’s a 2,587% performance increase compared to that Ryzen 9 9900X, and that’s not even using Nvidia’s own TensorRT SDK, which scores higher still.
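If you want to sanity-check those percentages, the arithmetic is trivial; a quick script using the scores quoted above does the job:

```python
# Quick check of the gen-on-gen and CPU-vs-GPU deltas quoted above.
scores = {
    "Core i9-14900K": 50,
    "Ryzen 9 7900X": 56,
    "Ryzen 9 9900X": 79,
    "RTX 4080 Super": 2123,
}

def pct_increase(old: float, new: float) -> float:
    """Percentage increase from the old score to the new score."""
    return (new - old) / old * 100

print(f"7900X -> 9900X:      {pct_increase(scores['Ryzen 9 7900X'], scores['Ryzen 9 9900X']):.0f}%")    # ~41%
print(f"9900X -> 4080 Super: {pct_increase(scores['Ryzen 9 9900X'], scores['RTX 4080 Super']):,.0f}%")  # ~2,587%
```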
The simple fact of the matter is that AI demands parallel processing performance like nothing else, and nothing does that better than a graphics card right now. Elon Musk knows this – he’s just installed 100,000 Nvidia H100 GPUs in xAI’s latest AI training system. That’s more than $1 billion worth of graphics cards in a single supercomputer.
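That gap comes down to parallel matrix math, which is the bread and butter of AI workloads. As a rough illustration rather than a proper benchmark, timing a single large matrix multiply on a CPU and then on a CUDA-capable GPU with PyTorch tells the same story; the matrix size here is an arbitrary choice of mine:

```python
# Rough illustration of why GPUs dominate AI workloads: one large matrix
# multiply, timed on the CPU and (if available) on a CUDA GPU. Sizes are arbitrary.
import time
import torch

N = 4096
a = torch.randn(N, N)
b = torch.randn(N, N)

start = time.perf_counter()
torch.matmul(a, b)
print(f"CPU: {time.perf_counter() - start:.3f} s")

if torch.cuda.is_available():
    a_gpu, b_gpu = a.cuda(), b.cuda()
    torch.cuda.synchronize()      # make sure the copies have finished before timing
    start = time.perf_counter()
    torch.matmul(a_gpu, b_gpu)
    torch.cuda.synchronize()      # wait for the kernel to complete
    print(f"GPU: {time.perf_counter() - start:.3f} s")
```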
Obscured by clouds
To add insult to injury, the vast majority of popular AI tools today require cloud computing to fully function anyway.
The full-scale LLMs (large language models) behind ChatGPT and Google Gemini require so much processing power and memory that running them on a local machine simply isn’t realistic. Even Adobe’s Generative Fill and the AI smart filters in the latest versions of Photoshop lean on cloud computing to process images.
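In other words, “using AI” on one of these machines mostly means sending a request to someone else’s data center. A minimal sketch with OpenAI’s Python client makes the point (the model name is just an example, and it assumes an API key is set in your environment): the hardware doing the real work isn’t in your laptop at all.

```python
# Minimal sketch of a cloud-backed AI request: the heavy lifting happens on the
# provider's servers, so the local CPU or NPU is largely irrelevant here.
# Assumes OPENAI_API_KEY is set; the model name is only an example.
from openai import OpenAI

client = OpenAI()
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize why GPUs suit AI workloads."}],
)
print(response.choices[0].message.content)
```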
It’s simply not feasible to run the vast majority of today’s most popular AI programs on your own home machine. There are exceptions, of course; certain AI image-generation tools can run locally without much fuss, but even then, in 99% of use cases you’re better off letting the cloud do the processing.
The one big exception to this rule is localized upscaling and supersampling. Nvidia’s DLSS and Intel’s XeSS, and to a lesser extent AMD’s FSR (although FSR isn’t really built on deep-learning models; it runs hand-tuned algorithms on standard shader hardware, so it doesn’t need any AI componentry at all), are fantastic examples of localized AI done well. Otherwise, though, you’re basically out of luck.
Yet still, here we are. Another week, another AI-powered laptop, another AI chip, much of which, in my opinion, amounts to a lot of fuss about nothing.