Why it matters: As chipmakers embark on a widespread transition to locally processed generative AI, some users still question the need for the technology. NPUs have become the new buzzword as hardware vendors push the concept of the "AI PC," yet their arrival prompts speculation about whether the valuable die space they occupy could have been put to better use.
According to members of the Anandtech forums, AMD has significantly cut cache to make room for larger AI silicon in upcoming Strix Point processors. If the reports prove accurate, it would suggest that AMD and other chipmakers are placing considerable bets on a trend that remains unproven.
User “uzzi38” claimed that AMD initially intended to equip Strix Point APUs with system-level cache, which would have substantially improved CPU and integrated graphics performance. However, this plan was replaced with an emphasis on enhanced Neural Processing Units (NPUs), which are positioned as the central feature driving the new wave of AI-enhanced PCs.
Another forum member, “adroc_thurston,” added that Strix Point was originally intended to have 16MB of MALL cache.
Intel, AMD, and Qualcomm are heavily promoting AI as an integral feature of their upcoming CPU generations. They plan to use NPUs to process generative AI workloads locally, tasks typically handled by cloud services such as ChatGPT.
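For readers curious what "processing locally" looks like in practice, here is a minimal, hypothetical sketch using ONNX Runtime, a common way to dispatch a model to whatever accelerator a machine exposes. The provider priority list and model file are illustrative assumptions, not details confirmed by any of the vendors mentioned above.

```python
# Hypothetical sketch: run a small generative model locally, preferring an
# NPU-backed execution provider when the system exposes one.
# The model path, input shape, and provider priority are illustrative assumptions.
import numpy as np
import onnxruntime as ort

PREFERRED = [
    "QNNExecutionProvider",   # Qualcomm NPUs
    "DmlExecutionProvider",   # DirectML on Windows (GPU/NPU)
    "CPUExecutionProvider",   # always-available fallback
]

available = ort.get_available_providers()
providers = [p for p in PREFERRED if p in available]

# "tiny_llm.onnx" is a placeholder for whatever exported model you have on disk.
session = ort.InferenceSession("tiny_llm.onnx", providers=providers)
print("Running on:", session.get_providers()[0])

# Feed a dummy token sequence just to exercise the chosen provider.
input_name = session.get_inputs()[0].name
tokens = np.array([[1, 2, 3, 4]], dtype=np.int64)
outputs = session.run(None, {input_name: tokens})
print("Output shape:", outputs[0].shape)
```

The point of the fallback list is simply that the same model runs everywhere; the NPU is only picked up when the hardware and drivers advertise it.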
Intel led the charge with the launch of Meteor Lake late last year and aims to boost NPU performance with subsequent releases such as Arrow Lake, Lunar Lake, and Panther Lake. AMD's Strix Point is set to pair Zen 5 CPU cores and RDNA 3.5 graphics with expanded AI capabilities when it launches later this year. These chipmakers are aligning with Microsoft's initiative for AI-powered PCs, which includes requirements for a dedicated AI key and NPUs capable of at least 40 TOPS.
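To put the 40 TOPS floor in perspective, a rough back-of-envelope calculation (my own illustrative assumption about clock speed, not a vendor specification) shows how many parallel multiply-accumulate units such an NPU implies, since each MAC is conventionally counted as two operations:

```python
# Back-of-envelope: how many INT8 MAC units does a 40 TOPS NPU imply?
target_tops = 40e12   # Microsoft's reported 40 TOPS requirement
clock_hz = 2.0e9      # assumed 2 GHz NPU clock (illustrative only)
ops_per_mac = 2       # one multiply + one accumulate

mac_units = target_tops / (clock_hz * ops_per_mac)
print(f"~{mac_units:,.0f} MAC units needed at 2 GHz")  # ~10,000
```

In other words, hitting that figure takes on the order of ten thousand parallel MAC units at a plausible clock, which is why the NPU blocks in question consume meaningful die area.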
However, hardware makers and software developers have yet to fully explore how generative AI can benefit end users. Text and image generation are currently the primary applications, and both remain mired in copyright and reliability concerns. Microsoft envisions generative AI revolutionizing user interactions with Windows by automating tasks such as file retrieval or settings adjustments, but these concepts remain untested.
Many participants in the Anandtech thread view generative AI as a potential bubble that could negatively impact AI PCs and multiple generations of SoCs if it bursts. If the technology fails to attract mainstream users, it may leave numerous products equipped with NPUs of limited utility.