AI PCs to gain speed, cut cloud costs — and help workers upskill
Windows PCs have faced predictions of their demise for decades from a variety of rival devices, including tablets, Macs, Linux computers, and other hardware. But the rise of AI in recent years could help revive interest in PCs as companies contemplate upgrades in the near future.
The first “AI PCs” were introduced amid much fanfare in 2024, and shipments are growing. But enterprises that picked up early AI PCs have been stymied in their embrace of the technology as meaningful offline applications haven’t yet materialized.
While AI browsers from Perplexity and potentially from OpenAI have given AI PCs a lift, on-device AI chips aren't yet fast enough, and PC users still mostly go to the cloud for AI services.
“We’re aware that AI workloads, while they’re moving to the edge, [that’s] not going to happen overnight…, but that’s changing,” Jim Johnson, senior vice president and general manager of Intel’s Client Computing Group, said during a keynote address at last week’s CES trade show.
By late this year or early 2027, enterprises will have more applications to work with on AI PCs; that could reduce cloud costs and provide a path for employees to upskill in AI, said Zach Noskey, director of portfolio strategy and product management at Dell.
“The initial investment in AI PCs can be offset by long-term savings in cloud service fees, improved productivity, and enhanced security,” he said.
New hardware to make offline AI faster
A new Intel AI PC chip — Panther Lake — generated positive buzz at CES after Intel boasted it has enough speed to run large language models (LLMs) and AI applications on-device.
A faster chip translates into bringing more intelligence to applications and bolstering worker productivity, Johnson said during his CES keynote.
The Core Ultra Series 3 chips will support generative AI (genAI) models, including Alibaba’s Qwen 3 model, from day one, with Johnson boasting of 500-plus AI features. The chip also supports AI tools in applications such as Zoom and Adobe Premiere Pro, which uses the chip’s AI features to perform image search.
Specifically, Panther Lake has 12 GPU tiles, which will bring faster AI processing capability to PCs. The older Intel PC chip, Lunar Lake, had only 4 GPU tiles. (GPUs are also used in the cloud to run OpenAI’s ChatGPT, Microsoft’s Azure AI and other critical cloud services.)
Intel has redesigned its chip to run AI models out of the box by bypassing the neural processing unit (NPU), the other AI engine in Panther Lake. NPUs can run only specially targeted AI models, while GPUs don’t require specially programmed AI models. The newest NPU can deliver 50 trillion operations per second (TOPS); the NPU in the previous-generation Lunar Lake chips could handle only 40 TOPS.
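As a rough back-of-the-envelope illustration of the generational numbers above (assuming throughput scales linearly with TOPS and tile count, which real workloads rarely do), the gains can be compared like this:

```python
# Rough generational comparison of Panther Lake vs. Lunar Lake AI engines.
# Figures come from Intel's CES claims; linear scaling is a simplifying assumption.

LUNAR_LAKE = {"npu_tops": 40, "gpu_tiles": 4}
PANTHER_LAKE = {"npu_tops": 50, "gpu_tiles": 12}

def relative_gain(new: float, old: float) -> float:
    """Return the new/old throughput ratio."""
    return new / old

npu_gain = relative_gain(PANTHER_LAKE["npu_tops"], LUNAR_LAKE["npu_tops"])
gpu_gain = relative_gain(PANTHER_LAKE["gpu_tiles"], LUNAR_LAKE["gpu_tiles"])

print(f"NPU: {npu_gain:.2f}x ({PANTHER_LAKE['npu_tops']} vs {LUNAR_LAKE['npu_tops']} TOPS)")
print(f"GPU tiles: {gpu_gain:.2f}x ({PANTHER_LAKE['gpu_tiles']} vs {LUNAR_LAKE['gpu_tiles']} tiles)")
```

By this crude measure the NPU is about 1.25x faster, while the tripling of GPU tiles is the bigger jump for running general-purpose AI models.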
AI on PCs to cut cloud costs
AI models are getting increasingly sophisticated, but they are so large that there’s no chance of fitting them on devices, Dell’s Noskey said. “What you’re probably seeing is you have to work smarter, not harder,” he said.
But users should soon start getting better fine-tuned models for specific tasks. “It’s really exciting because the silicon is starting to come along to enable a lot of those use cases,” Noskey said.
Dell expects steady advances in AI on PCs, Noskey said. “It’s an exciting time and we’re really excited about 2026 and beyond there.”
Running AI on PCs has tangible benefits because it can localize services and cut the volume of queries to AI cloud services. That could help companies save money, Noskey said. “By processing AI tasks locally, AI PCs reduce reliance on cloud-based services, cutting down on data transfer and cloud computing costs,” he said.
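The cost argument can be sketched as a simple break-even calculation. The figures below are illustrative assumptions, not vendor pricing: a hypothetical per-seat hardware premium for an AI PC versus a hypothetical per-query cloud API cost.

```python
# Hypothetical break-even sketch: when does local inference pay off?
# Both figures are illustrative assumptions, not actual vendor pricing.

CLOUD_COST_PER_1K_QUERIES = 0.50   # assumed cloud API cost in dollars per 1,000 queries
AI_PC_PREMIUM = 300.0              # assumed extra per-seat hardware cost in dollars

def breakeven_queries(premium: float, cloud_cost_per_1k: float) -> float:
    """Number of queries after which local processing beats the cloud on cost."""
    return premium / cloud_cost_per_1k * 1000

n = breakeven_queries(AI_PC_PREMIUM, CLOUD_COST_PER_1K_QUERIES)
print(f"Break-even at roughly {n:,.0f} queries per seat")
```

Under these assumed numbers, an AI PC pays for its premium after about 600,000 local queries per seat; the real break-even point depends entirely on actual query volume and pricing.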
Data security is critical for companies, which are investigating on-site or edge AI processing to mitigate risks and other concerns related to sending data to the cloud, Noskey said.
AI on PCs will also give employees a chance to use the technology and learn new skills, Noskey said. AI knowledge has become increasingly important to find or retain jobs.
AI will allow users to create their own workflows
For its part, Microsoft is turning Windows into an agentic OS where users can work with data sets that reside on device. And users can create their own offline agentic workflows by combining multiple AI tools and models, said Namee Oberst, co-founder of AI software company LLMware.
“What if you want to start building agentic workflows that work for you? You can stack small language models together to create workflows for enterprise automation,” said Darren Oberst, another LLMware co-founder. The company is one of the few working on offline AI on PCs.
Their product, ModelHQ, includes more than 200 models that can be chained together into workflows, with no coding knowledge required. The AI tools don’t even need a Wi-Fi connection.
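Conceptually, the model-stacking the Obersts describe resembles a pipeline where each step's output feeds the next. The sketch below is hypothetical and is not LLMware's API; plain functions stand in for the small language models that a real offline workflow would run on-device.

```python
from typing import Callable, List

# Hypothetical sketch of chaining "models" into an offline workflow.
# Each step is a plain function here; in a real agentic pipeline each
# step would invoke a locally run small language model.

Step = Callable[[str], str]

def run_workflow(steps: List[Step], text: str) -> str:
    """Feed each step's output into the next step -- no network calls involved."""
    for step in steps:
        text = step(text)
    return text

# Stand-in steps (real ones would call on-device models):
summarize = lambda t: t.split(".")[0] + "."  # keep only the first sentence
classify = lambda t: f"[invoice] {t}" if "invoice" in t.lower() else f"[other] {t}"

result = run_workflow([summarize, classify], "Invoice #42 is overdue. Please pay.")
print(result)  # → "[invoice] Invoice #42 is overdue."
```

A no-code tool like the one described would let users assemble such chains visually rather than writing them by hand.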
“Two or three years from now, there’s no way a company like us will be dealing with [low-level chip optimization], because a lot of those layers will just start to get smoothed out,” Darren Oberst said.
In addition to Intel, LLMware has been testing Qualcomm’s next-generation PC chip. “I think in 2026 you’re gonna see a lot more fast stuff running on Qualcomm,” he said.
While it’s taken some time for AI chips to reach maturity, the enabling software and models are coming together at the right time, Darren Oberst said. “…We’re pretty optimistic heading into 2026 that those pieces are starting to come together,” he said.