A new AI battlefield is opening up, not among all the GPU ‘farms’ in data centres around the world, but out at the network edge and in particular on the laptop.
This explains why Intel has invested so much in its AI-PC programme – I’d call it a reference design, but it’s both broader and more conceptual than that. Its glitzy “AI Everywhere” announcement of AI-PC and its AI-enabled Core Ultra processors last week clearly showed the chip giant’s desire, and need, to catch up in an area where it has been outstripped by rivals.
AI-enabled processors aren’t new, but they’re not mainstream either
The fact is that Core Ultra may look impressive, with its dedicated NPU (neural processing unit) built in alongside an Intel Arc GPU and multiple general-purpose CPU cores, but the idea is far from new.
For example, Apple’s processors gained support for real-time processing of machine learning (ML) algorithms as far back as 2017, and have been steadily getting more AI-capable since then. Qualcomm announced a version of its ARM-based (and Windows-capable) Snapdragon with an integrated NPU a few months ago, and AMD is already onto its second generation of ‘Ryzen AI’ powered chips for ultraportable laptops.
But Intel has been talking up AI-PC, and the fact that it's building AI-dedicated cores into its newest processors, for quite some time. Its hardware partners have also been dropping AI-PC mentions into their own messaging, and all this suggests that Intel thinks it can still take control of the narrative, or at least some of it.
Realistically, it is in a decent position here. That's partly because most laptops are still Intel-powered, despite the Mac making big inroads, which means we have a long way to go before AI-enabled processors become mainstream – and that leaves Intel time to catch up. It's also because of Intel's ecosystem approach and the number of developers it can sign up to build AI technologies into their applications – plus, of course, it says its AI hardware uses industry-standard APIs anyway.
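To give a feel for what "industry-standard APIs" can mean for a developer in practice, here is a minimal sketch using Intel's OpenVINO toolkit in Python. The device names, model file and fallback logic are illustrative assumptions on my part, not details from Intel's announcement; the point is simply that an application can ask the runtime which accelerators are present and pick the NPU when one is available.

```python
import numpy as np
import openvino as ov

# Ask the runtime which accelerators this machine exposes.
core = ov.Core()
print("Available devices:", core.available_devices)  # e.g. ['CPU', 'GPU', 'NPU'] on an NPU-equipped laptop (assumption)

# Prefer the NPU if the runtime reports one, otherwise fall back to the CPU.
device = "NPU" if "NPU" in core.available_devices else "CPU"

# 'model.onnx' is a placeholder for whatever model the application ships with
# (assumed here to have a single, statically shaped float32 input).
model = core.read_model("model.onnx")
compiled = core.compile_model(model, device_name=device)

# Run one inference on dummy data shaped to match the model's input.
dummy_input = np.zeros(list(compiled.inputs[0].shape), dtype=np.float32)
result = compiled(dummy_input)
print("Ran on", device, "- output shape:", result[compiled.outputs[0]].shape)
```

The same application code runs whether or not an NPU is present, which is exactly the kind of graceful fallback developers will expect before they build AI features in as standard.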
AI is more than just chatbots
Whether you should be worried about any of this depends on the application. If these chips are used to run Private AI – and in the future it's quite likely that they will be – then there will be the usual potential risks associated with chatbots and other generative AI technologies. But those risks will exist anyway, wherever you run those workloads.
So in the main, I see this as being similar to the way that the shift from text-based screens to graphical user interfaces and applications drove vendors to integrate a graphics capability as standard. More and more software developers are taking advantage of AI technologies in their applications, and PC suppliers want and need to respond to that.
In my next blog, I’ll take a look at what applications are being AI-enabled and why, and how AI-capable laptops can help.
Bryan Betts is sadly no longer with us. He worked as an analyst at Freeform Dynamics between July 2016 and February 2024, when he tragically passed away following an unexpected illness. We are proud to continue to host Bryan’s work as a tribute to his great contribution to the IT industry.