<![CDATA[ Latest from PC Gamer UK in Processors ]]> https://www.pcgamer.com 2025-02-14T16:20:30Z en <![CDATA[ Arm reportedly plans to make its own CPUs from this summer with future chips said to be powering a revolutionary Jony Ive-designed AI device ]]> Is Arm planning to make its own chips and not just sell rights to its IP and CPU designs to other companies? So says the FT in what would be a hugely disruptive development, if true.

In fact, the FT claims Arm could unveil its first in-house processor as early as this summer with future chips powering a revolutionary AI strategy—including a new Jony Ive-designed personal device.

The first chip is said to be a server CPU, so it won't be going into your next gaming PC. But it's still very big news with all kinds of implications.

The FT says the move is part of a broader plan by Arm's Japanese owner Softbank to move heavily into, yup you guessed it, AI, with a planned $500 billion to be spent on infrastructure in partnership with OpenAI.

That initial in-house Arm chip is actually said to be a server CPU that can be customized for clients, the most notable of which is claimed to be Meta. It's not clear how that chip, which does not appear to be overtly AI-aligned, fits in with Softbank's broader strategy for Arm.

That said, the FT mentions how Arm's move could be part of plans by former Apple designer Jony Ive in partnership with OpenAI and Softbank to create a new AI-powered personal device with a revolutionary, highly intuitive interface.

Back on the humble old dumb PC, Arm moving into making its own chips will surely only serve to accelerate the long-mooted annexation of the PC, with Intel and AMD's x86 processors eventually usurped by Arm chips.

That's been predicted for decades and yet never actually materialised, though Qualcomm's Snapdragon X chips are the most plausible usurpers yet. Meanwhile, Nvidia, which tried and failed to buy Arm outright a few years back, is also said to be planning a new PC chip of its own based on Arm IP.

It's worth noting that Arm currently licenses both its instruction set and actual CPU designs. But it doesn't actually commission the production of any chips itself.

What's more, it's not clear what the implications might be for Arm's ongoing legal fight with Qualcomm. During court fisticuffs with Qualcomm late last year, Arm pointed out that it has never built chips itself. However, it also said it is always considering new strategies for the future.

Anyway, this is ultimately a case of wait and see. Will Arm do its own chips? Will they eventually go into PCs? Could Arm become a major player in AI? Could your smartphone be replaced by a whole new device paradigm powered by an Arm-made AI chip? Honestly, who knows!


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/arm-reportedly-plans-to-make-its-own-cpus-from-this-summer-with-future-chips-said-to-be-powering-a-revolutionary-jony-ive-designed-ai-device/ 5bxuB3Rd7f7bqB6HfFVYCN Fri, 14 Feb 2025 16:20:30 +0000
<![CDATA[ Intel is reportedly in talks to spin off its chip factories into a partnership with arch rival TSMC and now I think I've seen everything ]]> I've seen pretty much everything in this industry over the years. And, yes, that does include a man eating his own head. But Intel and TSMC going into the chip foundry business together? Sorry, what?

This rumour, and it very much is a rumour, comes from an equities analyst at investment bank Baird who cited discussions between TSMC and Intel. According to the Wall Street Journal and to broadly précis this particular yarn, the rough idea is for TSMC to send some of its best engineers to Intel's fabs to sort them out. They can then be spun off into a separate entity managed by TSMC but co-owned by both companies.

If you're thinking this sounds like wild speculation, it does. But the markets are taking it seriously, with Intel's share price spiking by fully 6% when the rumour broke. So the question is, does it make sense?

For starters, if there's any truth to this story then it speaks volumes about the health of Intel's all-important upcoming 18A node. And not in a good way. If 18A were everything Intel cracks it up to be, there would be no need to parachute in TSMC engineers and Intel wouldn't be looking to spin off its fabs.

Personally, I doubt 18A is as healthy as Intel claims, otherwise CEO Pat Gelsinger would not have been ejected, sorry retired, from his role. So, let's take this rumour seriously for a moment and consider the implications.

On the one hand, the idea that TSMC in part or in whole takes control of Intel's fabs could be great for the US chip industry and the broader chip supply chain. The world's chip supply would no longer be so dependent on production in a location constantly teetering on the edge of a geopolitical crisis thanks to tensions between Taiwan and China.

More chips produced in the US would also sidestep concerns over rising electronics prices should the Trump administration impose hefty tariffs of up to 100% on Taiwanese chips.

On the other hand, handing over control of even more of the world's cutting-edge chip production to TSMC looks like a megamonopoly in the making. And it's hard to find any examples of long-term, overwhelming monopolies in important industries working out well for the average consumer.

You could take the view that, if Intel's fabs are in more trouble than the company is letting on, it might be a choice between partnering with TSMC or watching those fabs go up in proverbial smoke.

In other words, the real-world choice might be between partnering with TSMC or seeing Intel's fabs wind down and cease to exist in the medium term. In which case, it's got to be worth giving the TSMC thing a go. After all, the US government could take control of or shutter those fabs should it wish.

Moreover, if tensions between China and Taiwan increase yet further, TSMC may benefit from a larger US presence. It already has a 4nm fab of its own up and running in the US, with 3nm and 2nm fabs in the planning.

In the long run, that could once again make the US the global center of chip production. And yet the idea makes my spidey sense tingle. You only have to look at graphics card prices to see what happens when one company dominates. The idea that TSMC swallowing up Intel's fabs is going to be a good thing for us mere gamers doesn't feel at all convincing.

In the end, the only thing we can be fairly confident of is that the next decade or so looks likely to be volatile if not tumultuous in the tech industry. Tariffs, AI, geopolitics, scalpers, fake frames, a mooted mega monopoly in chips, it's all pretty baffling. Long gone are the innocent days when the most controversial aspect of building a new gaming PC was whether you were an Intel or AMD fan. It's all so much more complicated now.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/intel-is-reportedly-in-talks-to-spin-off-its-chip-factories-into-a-partnership-with-arch-rival-tsmc-and-now-i-think-ive-seen-everything/ kYvSCr4xwMdUC2jNM7DCCb Fri, 14 Feb 2025 13:36:10 +0000
<![CDATA[ New EU regulation finally cuts massive CPU boxes down to size ]]> Once upon a time I bought a single, very much regular sized eyebrow pencil online. It arrived in a needlessly MASSIVE cardboard box, presumably for ease of storing in the back of a delivery van. Thankfully most of that box was recyclable, but the same cannot be said for what's become typical for CPU boxes.

For me, excitement for new hardware is only matched by the existential dread that arises when confronted with a CPU box full of styrofoam packing peanuts that you know is just gonna go straight to landfill. Well, perhaps no longer; as of February 11, new EU regulation came into effect that will see those pesky CPU boxes finally cut down to size (via TechPowerUp).

The European Commission's refreshed Packaging and Packaging Waste Regulation (PPWR) has a number of aims, though the most pertinent here is, "Minimising the weight and volume of packaging and avoiding unnecessary packaging." Some are already wondering aloud whether this means those cooler bundles CPU manufacturers are so fond of may soon become a thing of the past, but I'm unsure whether those would fall under the designation of "unnecessary" packaging—surely some people use those bundled coolers, even if most of us rightly slap on a third-party one.

We already have EU regulation to thank for standardised charging ports on your phone, stronger legislation around 'right to repair,' and—with a bit of luck—the avoidance of anything like an AI monopoly in the future. This latest packaging regulation gives manufacturers far and wide an 18-month grace period to get their act together for a hopefully less wasteful, potentially greener future. That means the days of Destiny engram-esque packaging or any of the infamous examples seen in this story are decidedly numbered.

Furthermore, the PPWR aims to both "make all packaging on the EU market recyclable in an economically viable way by 2030," and "decrease the use of virgin materials in packaging and put the sector on track to climate neutrality by 2050." E-waste is something that continues to give me The Fear, so I definitely welcome the European Commission taking aim at wasteful packaging more broadly.

Practically speaking (and less full of existential dread), holding on to the original box can be handy for ensuring delicate tech survives, say, a stressful house move. For this reason, I've been wishing for a long time that hardware boxes were ever so slightly smaller—after all, if the boxes for my anime figures can form a pleasingly compact fort, then why not CPUs?


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/new-eu-regulation-finally-cuts-those-massive-cpu-boxes-down-to-size/ tMZeZsgmgupFcYE8DJiwnA Thu, 13 Feb 2025 13:30:04 +0000
<![CDATA[ AMD's CEO Dr Lisa Su claims 'highest sell-out in many years' for desktop processors and the company is 'catching up with some demand' for gaming CPUs ]]> While excitement (or derision) might currently be directed towards Nvidia's new graphics cards and AMD's upcoming ones, let's not forget that the past few months have seen the introduction of new CPU line-ups from both Intel and AMD. From Intel, we had the somewhat underwhelming Core Ultra 200S 'Arrow Lake' line-up, and from AMD, we had the first of its 9000-series 'Zen 5' processors. And on the latter front, AMD's now claiming pretty spectacular success.

During AMD's recent Q4 2024 earnings call (transcript here), AMD CEO Dr Lisa Su explained that the company "saw our highest sell-out in many years, as we went through the holiday season, launching our new gaming CPUs." Given we've been unable to find reasonably priced AMD Ryzen 7 9800X3D CPUs in stock since launch, this tracks.

Stock sell-outs are, of course, not always due to over-demand but can also be due to under-supply. This is something we're well aware of in the wake of the Nvidia RTX 5080 and RTX 5090 launch, which some are calling a paper launch, i.e. a launch in name only with little actual stock to back it up.

The 9000-series most certainly wasn't a paper launch, but stock has been hard to come by. Dr Su says: "We actually think that what we're seeing is very strong adoption of our new products [...] Frankly, [the gaming CPUs] have been constrained in the market, and we've continued shipping very strongly through the month of January as we are catching up with some demand there."

This is good news amidst a mixed bag, however, as the earnings call also highlights that the gaming graphics division isn't doing too well—although some of that could be down to the company nearing the end of a desktop GPU and console GPU generation. On this front, we'll have to see how well the RX 9070 and RX 9070 XT do when they launch in a month or so.

On the CPU front, we can't forget that the market has kind of been set in AMD's favour, too. Yes, the Ryzen 7 9800X3D is the best CPU for gaming on the market by quite a margin, but the competition hasn't exactly been compelling, either. Intel's latest Core Ultra 200S line-up hasn't pushed the boat out very far and many of its 13th and 14th Gen CPUs were plagued with instability issues.

In other words—and not to deny the genuine prowess of AMD's latest desktop and mobile processors—given Intel's offerings, I'd be more surprised if AMD's client CPU division wasn't doing well.

Of course, this is saying nothing of the mobile side. AMD's latest AI 300 'Strix Point' mobile processors have been adopted not only in laptops but also in the OneXPlayer OneXFly F1 Pro, with presumably more handhelds to come. Intel's not empty-handed on this front either, though, as a Core Ultra 200V 'Lunar Lake' mobile processor sits inside the MSI Claw 8 AI+. Lunar Lake processors are very efficient, but for gaming they seem to struggle to compete with the brute graphical horsepower of Strix Point chips.

The two companies' offerings here are much closer than on the desktop side, though. That's probably why Dr Su says: "Going into the first quarter, we do expect seasonality in there. But the part of our business that is performing better than seasonality is the desktop portion of the business."

Great stuff, Su, just get those shelves stacked with fresh Ryzen 7 9800X3D chips and let us satiate that demand. I've seen enough 'out of stock' signs already this year.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/amds-ceo-dr-lisa-su-claims-highest-sell-out-in-many-years-for-desktop-processors-and-the-company-is-catching-up-with-some-demand-for-gaming-cpus/ X4UCY2JomZvBdXCXftLgVV Wed, 05 Feb 2025 14:35:22 +0000
<![CDATA[ Old AM4 CPUs including the Ryzen 5000 still make up 50% of AMD's sales today ]]> "You'd be surprised. On a global scale, the split between AM4 and AM5 is not far off from 50/50." So says David McAfee, AMD's head of Client Channel Business, of the company's current CPU sales (via TechPowerUp).

In other words, half of the CPUs AMD is selling today are still made for the old AM4 socket. And, of course, the last family of chips made for the AM4 socket was the Ryzen 5000. The Ryzen 7000 and latest Ryzen 9000 CPU families both go into the current AM5 socket.

McAfee does concede that split isn't universal to all markets. "Different markets have different preferences. North America and Western Europe skew toward higher-end AM5 builds," he says. But still, it's near 50/50 overall.

For context, the AM4 socket first appeared back in 2016 while the newest family of Ryzen CPUs for AM4, the Zen 3-based Ryzen 5000 chips, was first released in late 2020. But here we are in 2025 and half of AMD's desktop CPU sales are still AM4-based.

Actually, that shouldn't be a huge surprise. CPU performance gains tend to be incremental. A Ryzen 7 5800X3D, for instance, is still a very good gaming CPU. Likewise, a 16-core Ryzen 9 5950X will tear through content creation in a thoroughly modern manner.

Indeed, in most gaming benchmarks, the 5800X3D is still roughly on par with and often faster than Intel's latest flagship CPU, the Core Ultra 9 285K. And that's running at 1080p to expose any CPU bottleneck. At 1440p or perhaps 4K, especially with a new and graphically demanding game running at high detail settings, you'd struggle to feel the difference between the 5800X3D and the latest 9800X3D beast because the limitation on frame rate won't be the CPU.

What's more, as recently as June last year, AMD actually launched two new-but-old Zen 3 chips for the AM4 socket, the eight-core Ryzen 7 5800XT and the 16-core Ryzen 9 5900XT. It followed those two up in October with the six-core Ryzen 5 5600XT, further demonstrating that there's life yet in the old AM4 dog.

In fact, the longevity of AM4 feels like it keys into a broader trend. I bored on recently about how much stamina the Nvidia RTX 4090 has. It's just been replaced by the new RTX 5090. But it's still a fundamentally very powerful GPU and I'd take it all day long over the new RTX 5080, with its minimal raster performance gains over the old 4080 and relatively measly 16 GB of VRAM.

Long story short, the 4090 will be very, very close to the top of the GPU performance tree for about four years. That's a long time. The 5800X3D is two and a half years old and still going strong.

Anyway, the point is that generational gains in performance from core PC components seem to have slowed somewhat. On the one hand, it's always slightly disappointing when a new CPU or GPU comes out and doesn't blow away existing chips. On the other, it means your PC stays current for longer. And that's probably a good thing overall.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/old-am4-cpus-including-the-ryzen-5000-still-make-up-50-percent-of-amds-sales-today/ JVfF3KtdbA5JYd7LJBpJsV Tue, 04 Feb 2025 17:25:38 +0000
<![CDATA[ AMD's Ryzen 9 9950X3D and 9900X3D CPUs are rumoured to launch at the end of March at roughly the same time as the RX 9070-series GPUs ]]> CES 2025 saw the announcement of some exciting new Zen 5 CPUs from AMD, not least two X3D additions that we've been anticipating for some time. The Ryzen 9 9950X3D and 9900X3D look like they might be some seriously powerful CPUs, and now the latest scuttlebutt suggests they might be coming at the end of March, alongside some long-awaited RDNA 4 graphics cards.

That's according to French hardware website Cowcotland, whose sources say that the new AMD 3D V-cache-sporting chips will make an appearance at roughly the same time as the RX 9070 and RX 9070 XT.

AMD's David McAfee had previously tweeted that gamers will get their hands on the new GPUs in March, so that all lines up rather nicely if true.

As does the naming scheme, given that these are the latest 9000-series CPUs, potentially launching next to the 9070-series GPUs. What a difference a digit makes, eh?

As for the chips themselves, the specs are pretty monstrous. The 9950X3D is a 170 W TDP chip with 16 cores and 32 threads, a boost clock of 5.7 GHz, and 144 MB of total cache, whereas the 9900X3D has 12 cores, 24 threads, and 140 MB of cache, with a slightly slower 5.5 GHz boost clock.

Compare those specs to our current best CPU for gaming, the Ryzen 7 9800X3D, and you can see why hearts are aflutter in anticipation of the new chips. The 9800X3D is the fastest gaming CPU we've ever tested, but with eight cores, 16 threads, and 104 MB of total cache, it looks firmly outmatched by its new siblings—when it comes to multi-threaded workloads at least.

Still, there's always price to consider. While the new AMD mega-CPUs are likely to be top performers, if the 9800X3D can still monster ahead of the Intel competition at a more reasonable price point, it might still be the chip to go for this generation to get the best bang for your buck. Time will tell. Either way, if these rumours prove out, it looks like we won't have too much longer to wait.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/amds-ryzen-9-9950x3d-and-9900x3d-cpus-are-rumoured-to-launch-at-the-end-of-march-at-roughly-the-same-time-as-the-rx-9070-series-gpus/ TGNKz4zD8dbAASLRpDuHrD Mon, 03 Feb 2025 16:42:17 +0000
<![CDATA[ Intel will be keen to forget 2024 despite its products selling well because its foundries still keep on swallowing money ]]> You'd be forgiven for thinking that 2024 was a year that Intel would prefer to just forget about. Raptor Lake chips self-destructing, Arrow Lake disappointing, and Pat Gelsinger getting the boot (sorry, retiring) were just some of the low points from last year. However, it wasn't all gloom and doom, as it turns out Intel's products all generated more revenue than the previous year, even though the company as a whole posted a substantial $19 billion loss.

So how did Intel manage to do well and poorly at the same time? In the case of the latter, the 2024 financial year statement shows the culprits to be its foundry service (Intel's chip-making plants) and something called intersegment eliminations. I'm certainly no financial expert but as I understand it, that refers to money being transferred between different segments of a business and in Intel's case, it was a pretty steep $17 billion in transferred revenue.

The real money loser was Intel Foundry with a net loss of $13.4 billion off a revenue of $17.5 billion. It's important to note here that Intel has been spending a lot of cash gearing up its fabrication and packaging plants around the world, especially those that will be using its 18A process node, the very thing that Gelsinger "bet the whole company on".

For example, in its financial report, Intel states that $23.9 billion and $37.9 billion were spent on "additions to property, plant, and equipment" and "purchases of short-term investments" respectively. In turn, it generated a little over $41 billion in cash through "maturities and sales of short-term investments".

While the other divisions (Client, Data Center and AI, Networking and Edge) only generated a little extra revenue apiece, compared to 2023, they all turned in a net profit. Networking and Edge did especially well in 2024, although Data Center was down a touch. So while PC gamers might not have liked the Core Ultra 200S range all that much, it's clear that OEMs, system builders, and others are still buying plenty of Intel processors.

All of which suggests to me that Intel, as a whole, is doing fine though as I mentioned before, I am absolutely no money magician. Share prices briefly rose after Intel posted its financials before plopping back down to where they were, so perhaps the stock market is inclined to disagree with me.

I suppose it does raise the question (again) of what the future holds for Intel Foundry. After all, no matter how much money one throws at something, it eventually has to start making a genuine profit but precisely because Intel is spending so much cash on its plants, it's hard to tell just how much money its foundry arm is actually making.

$17 billion in revenue isn't to be sniffed at but if that's entirely Intel's own cash (i.e. it's paying itself to make its own chips, rather than $17 billion in outside orders), then the picture is even muddier.

AMD will be reporting its 2024 financials tomorrow, so it will be interesting to see how Team Red's year has panned out compared to Team Blue's. Of course, TSMC manufactures everything for it, so AMD doesn't have a hugely expensive foundry service dragging it down (it learned that lesson many years ago) but that doesn't mean it's going to be Scrooge McDuck rich in cash.

Anyone fancy making a dollar bet that AMD's Client figures are better than Intel's? I'm game if you are.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/intel-will-be-keen-to-forget-2024-despite-its-products-selling-well-because-its-foundries-still-keep-on-swallowing-money/ YBo73BrtLjLtJhAatAN5ZD Mon, 03 Feb 2025 12:23:05 +0000
<![CDATA[ It looks like there will be no new Intel desktop CPUs until 2026 now that next-gen Nova Lake is officially a 2026 product ]]> Intel won't launch a new desktop CPU until 2026. That's the clear implication from comments made this week by the company's interim co-CEO.

Speaking on an earnings call to an assembly of the usual financial analyst types, Michelle Holthaus said, "2026 is even more exciting from a client perspective as Panther Lake achieves meaningful volumes, and we introduce our next-generation client family code-named Nova Lake."

Nova Lake is, of course, Intel's next desktop architecture, the replacement for the current Arrow Lake chips including the Intel Core Ultra 9 285K. Now, we only reviewed that back in October last year. So Arrow Lake is still very new. You wouldn't normally expect a new CPU architecture anytime soon.

In that sense, the fact that Nova Lake isn't due until next year—and we don't have any clear indication of which part of 2026—is hardly a surprise. That said, there are reasons to think Intel might be keen to move things along more quickly than usual when it comes to Nova Lake.

By way of example, Intel released its new Meteor Lake laptop chips in December 2023, only to follow them up in very short order with the Lunar Lake family in September 2024. Arguably, Lunar Lake wasn't a direct replacement for Meteor Lake. But, still.

Moreover, Meteor Lake wasn't super competitive for various reasons and you could say the same of Arrow Lake. Likewise, Arrow Lake is built using predominantly TSMC silicon, and Intel is on record saying that hurts its profit margins.

So, Intel has plenty of reasons to get Nova Lake out the door ASAP. The potential catch is that Holthaus also confirmed that at least some Nova Lake CPU models will have silicon made by Intel on its new ultra-advanced 18A node.

"Nova Lake will actually have die both inside and outside [Intel Foundry] for that process [18A]," she said. As we reported earlier, Intel actually plans to release Panther Lake mobile chips on 18A later this year.

But as we also reported in the same story, Intel only made about 5% of its chips on the latest Intel 4/3 node in 2024, despite releasing the first Intel 4 product, Meteor Lake, in 2023. The point being that Intel has hardly been ramping up its new nodes at speed.

That implies limitations as to Intel's ability to crank out lots of chips on 18A and therefore its ability to bring Nova Lake forward. In the meantime, we might expect to see Arrow Lake get some kind of refresh, though there are also rumours that Arrow Lake Refresh was planned but has now been canned.

Arguably, it doesn't really matter. Arrow Lake is what it is and a refresh is highly unlikely to turn it into an AMD Ryzen killer. It's not a terrible CPU by any means. It's just a little disappointing. Whether Nova Lake will change that, whenever exactly it arrives, is anyone's guess.

]]>
https://www.pcgamer.com/hardware/processors/it-looks-like-there-will-be-no-new-intel-desktop-cpus-until-2026-now-that-next-gen-nova-lake-is-officially-a-2026-product/ WiY784aUFqE9Bm9dvZCHYa Fri, 31 Jan 2025 16:09:16 +0000
<![CDATA[ Intel says next-gen Panther Lake laptop chips on its new 18A silicon are still on track for later this year but things are more complicated on the desktop ]]> Intel has been talking numbers this week with the announcement of some reasonably positive financials for the final quarter of 2024 (you can read the full transcript here). With that comes the usual chat around product and official confirmation that the Panther Lake mobile CPU is still due in the second half of 2025 using Intel's all-important new 18A silicon node.

But Intel said 18A wouldn't necessarily win all future CPU designs and also revealed how slow the company's transition to advanced silicon has been. In 2024, just over 5% of Intel's internal chip manufacturing was on its latest EUV-based nodes.

"Looking ahead to the rest of the year, we will strengthen our client road map with the launch of Panther Lake, our lead product on Intel 18A in the second half of 2025," Intel's new interim co-CEO Michelle Holthaus said on the latest earnings calls with the usual money men and spreadsheet soothsayers.

However, Holthaus also implied that Panther Lake using Intel's new 18A node doesn't automatically mean that all of the company's future CPUs will be coming back in house. For its next-gen desktop CPU Nova Lake, it seems Intel is planning to split production between its own 18A silicon and likely a TSMC node, perhaps N2.

"We look at each generation of products based on what's the right product, what's the right process, what's the right market window and what allows our customers to win. So, for Panther Lake, that was 18A," Holthaus explains.

"Then as you look forward, to our next-generation product for client after that, Nova Lake will actually have die both inside and outside [Intel Foundry] for that process. So, you'll actually see compute tiles inside and outside," she said.

The Lunar Lake CPU came out after Meteor Lake, but ditched the then-new Intel 4 node for production at TSMC. (Image credit: Intel)

During the call, Holthaus made generally positive noises about the 18A node which is critical to Intel's future, not only for its own chips but also as a foundry service offering to customers in competition to Taiwanese megafoundry TSMC. But the fact that Intel doesn't plan to make Nova Lake exclusively on 18A is, perhaps, revealing.

It is not, however, entirely surprising. Despite plans to begin selling Panther Lake chips built on 18A later this year, by Intel's own admission it has barely begun scaling up production of chips on its latest Intel 4 and Intel 3 nodes.

"EUV wafer revenue grew from 1% of total revenue in 2023 to greater than 5% in 2024," revealed Intel's CFO David Zinsner. Intel 4 is company's first commercially available node to use EUV or extreme ultraviolet lithography and is used for the Meteor Lake family of Laptop CPUs.

However, the follow up to Meteor Lake, Lunar Lake, somewhat surprisingly reverted to TSMC for production, as did the Arrow Lake desktop family including the Intel Core Ultra 9 285K.

In other words, Intel is planning on cranking out 18A chips shortly despite having barely begun the transition to Intel 4 (or Intel 3, which is a derivative of Intel 4). Given that previous CEO Pat Gelsinger admitted to betting the company on 18A, Intel needs to keep the new node on track and get product actually built on it, whatever state its manufacturing is in with the Intel 4/3 nodes.

With all that in mind, it remains a very mixed picture at Intel. Even if the company does manage to release some Panther Lake chips on 18A later this year, questions will remain about the viability of its production technology.

Intel doesn't seem to have been able to ramp Intel 4 and Intel 3, despite releasing Meteor Lake well over a year ago. So, who knows what will happen with 18A.

]]>
https://www.pcgamer.com/hardware/processors/intel-says-next-gen-panther-lake-laptop-chips-on-its-new-18a-silicon-are-still-on-track-for-later-this-year-but-things-are-more-complicated-on-the-desktop/ NqcQj2qdh4mEVoHMEttgzU Fri, 31 Jan 2025 13:42:51 +0000
<![CDATA[ Arrow Lake's had three months of Windows and BIOS updates to fix its performance, and my testing shows in some games, it's worse ]]> Almost immediately after the launch of its Arrow Lake desktop CPUs in October 2024, Intel said it would work on producing firmware and software updates to improve its lacklustre performance. The likes of the Core Ultra 9 285K not only ran games slower than Raptor Lake chips, the previous generation of processors, but also worse than Intel's claimed internal results.

Near the end of December, Intel released a report on what it had identified as going wrong with Arrow Lake and what fixes were coming to solve them all. So, as we approach the three-month mark since the Core Ultra 200S series of CPUs first appeared, I wanted to see just how much better things are.

To that end, I've used an MSI MAG Z890 Tomahawk WiFi motherboard, sporting the latest 1.A70 BIOS and all of Intel's most current drivers. I'm also using Windows 11 24H2, as some of the fixes are embedded within operating system updates.

In terms of RAM and GPU, it's the exact same setup I used when I first checked out the Core Ultra 9 285K in this MSI motherboard: 32 GB of DDR5-6000 CL32 and a GeForce RTX 4070. Enough jibber-jabber, let's go through all the new results with our CPU gaming benchmark suite.
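
A quick note on the two figures quoted throughout: the average frame rate and the 1% low. Exact methodologies vary between outlets, but as a rough illustration of what a '1% low' means, here's a minimal Python sketch working from a made-up list of per-frame render times (the numbers are purely illustrative and not from our test data):

```python
# Toy example: derive average fps and a '1% low' fps figure from frame times.
# The frame times below are invented purely for illustration.
frame_times_ms = [8.3, 9.1, 7.9, 25.4, 8.8, 8.5, 30.2, 8.7, 9.0, 8.4] * 100

# Average fps: total frames divided by total render time (converted to seconds).
average_fps = 1000 * len(frame_times_ms) / sum(frame_times_ms)

# One common '1% low' definition: the average fps across the slowest 1% of frames.
slowest_count = max(1, len(frame_times_ms) // 100)
slowest = sorted(frame_times_ms, reverse=True)[:slowest_count]
one_percent_low_fps = 1000 * len(slowest) / sum(slowest)

print(f"Average: {average_fps:.1f} fps, 1% low: {one_percent_low_fps:.1f} fps")
```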

Cyberpunk 2077

Since October of last year, Cyberpunk 2077 has been patched six times, so some of the changes you can see might be down to that. But even if they are, the 285K does run our CP2077 benchmark a little better than before, with an uplift of 15% and 12% to the average and 1% low frame rates respectively.

It's not a huge increase but it's better than nothing, and free performance is free. And it's also on par with the Ryzen 7 9800X3D, though still slower than the Core i9 14900K with the 1% lows. At least we're off to a good start with the updates!

Baldur's Gate 3

Well, this is disappointing. That 2% increase in the average frame rate isn't worth shouting about, especially in the face of the significant 19% decrease in the 1% lows. I've checked those figures multiple times but that performance deficit doesn't change.

Compared to the Ryzen 7 9800X3D, the Core Ultra 9 285K looks ridiculously weak in this test, and while it's not much slower than the 14900K, with regards to the average frame rate, the 1% low figures are 32% lower.

Why this is happening isn't clear and I can't identify anything specific, using my usual performance analysis tools (PIX on Windows, Nsight Systems), that suggests what the problem is. I suspect Windows 11 24H2 might have something to do with it, as I've noticed that minimum frame rates across multiple test PCs have all dropped with the OS update.

Homeworld 3

Homeworld 3 follows the same pattern as seen with Baldur's Gate 3—the average frame rate is a little better but the 1% lows are worse. The Core Ultra 9 285K isn't too far behind its predecessor, but once again, the 9800X3D makes a mockery of them both.

So far, then, we're 1-2 in terms of gains and losses with the Arrow Lake updates affecting game performance. This isn't looking all that great, yes?

Metro Exodus Enhanced

Enter stage left, Metro Exodus Enhanced to save the day! Admittedly, it's only a 9% improvement to the average frame rate and the 1% lows are just 5% better, but the 285K is now faster than the 14900K!

Well, the 14900K as it was back in August, which was when we last tested that power-munching monster. Meanwhile the 9800X3D is just sitting there, waving its 3D V-Cache at them both and going nyer-nyer.

Total War: Warhammer 3

Oh dear. Oh, this isn't good at all. That's a 20% improvement to the average frame rate but at a cost of an enormous 37% decrease to the 1% lows. And before you ask, I checked those figures time and time again, with system reboots, driver installs, and even a spot of cheating in the form of overclocking to see if there was something I've missed.

There wasn't—the 285K really is goosed in Total War: Warhammer 3 with all the latest BIOS, driver, and OS updates. The 9800X3D's 1% lows are 97% better. Ninety seven.

Factorio

Strictly speaking, our Factorio test isn't quite a full test of the game itself. Instead, we use a benchmark script on a massive, complex map that just runs the game's engine minus all rendering, over 5,000 cycles. It's a fantastic test for how good a CPU's last-level cache is (L3 in the case of these chips) and it's one that AMD's 3D V-Cache just loves.
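
If you fancy running something similar yourself, Factorio ships with a headless benchmark mode that can be driven from the command line. Here's a minimal Python sketch of the idea (the binary path and save file are placeholders to swap for your own, and it leans on Factorio's stock --benchmark and --benchmark-ticks options):

```python
# Minimal sketch: run Factorio's built-in benchmark mode from a script.
# The binary path and save file are placeholders; --benchmark runs a save
# without normal play and --benchmark-ticks sets how many engine updates to simulate.
import subprocess

FACTORIO_BIN = r"C:\Games\Factorio\bin\x64\factorio.exe"  # placeholder install path
BENCHMARK_SAVE = "big-base.zip"                           # placeholder save file

result = subprocess.run(
    [FACTORIO_BIN,
     "--benchmark", BENCHMARK_SAVE,
     "--benchmark-ticks", "5000",
     "--disable-audio"],
    capture_output=True, text=True, check=True,
)

# Factorio prints its timing summary to stdout once the run completes.
print(result.stdout)
```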

With a 23% decrease in the processing time, you'd think that this is some good news for Arrow Lake but it's not really. That's because the initial time of 33.3 seconds was due to a BIOS issue with the MSI motherboard that eventually got resolved late last year.

That time of 25.5 seconds just puts it in line with the Asus ROG Maximus Z890 Hero I used for the initial review of the Core Ultra 9 285K. In other words, the chip performed no better in our Factorio test than it did four months ago.

Intel Application Optimizer

One thing that Intel noted in its report about Arrow Lake's performance was that a missing power profile meant that its thread scheduling software (Intel Application Optimizer, APO) just didn't work as intended.

It only works with games that Intel has specifically tested and tweaked the scheduler for, and in this case, it's just Cyberpunk 2077, Metro Exodus Enhanced, and Total War: Warhammer 3. We don't include APO results in our processor reviews to keep things fair but any PC gamer with an Arrow Lake should be using it.

Or maybe not, if my test results are anything to go by. Who wants an 11% drop in the 1% lows in Cyberpunk 2077? And if it's not doing anything to help in Metro Exodus, why bother using it? At least it did help out in Warhammer 3, to the tune of 11% in the 1% lows.

Just as with the other tests, I reran the APO checks multiple times and used different power profiles to see if it would make a difference. Well, it didn't.

Arrow Lake in January 2025

I've only tested six games but it's clear that for these specific ones, Intel's fixes and Microsoft's Windows updates have barely made any difference and in some cases, it's made things significantly worse.

There are probably games out there that are markedly better now but I'm doubtful that Arrow Lake is ever going to be as good as Intel first claimed when it launched the CPU architecture.

But what really isn't helping matters is the fact that the 'best' Arrow Lake is hugely expensive.

If you head over to Amazon right now, you'll see that the Core Ultra 9 285K costs $600. With no movement in the price tag since launch, there's no reason for me to change my recommendation about the processor, i.e. it's just not worth buying it.

If you can find one at a sensible price, the Ryzen 7 9800X3D is the gaming CPU to buy. (Image credit: Future)

Arrow Lake is good for content creation and productivity applications—I've retested the 285K in Cinebench 2024, Blender, 7zip, and Handbrake and while there's no improvement worth noting, it's still just as strong as it was.

But for gaming? It's not great and the far cheaper Core Ultra 7 265K ($372 at Amazon) would be the sensible choice to pick if one absolutely must have an Arrow Lake processor. It's just as fast as the 285K in games, bar a few fps here and there, and it's not much slower in content creation as it only has four fewer threads.

The obvious choice when you're spending this kind of money for gaming, though, is the Ryzen 7 9800X3D. However, demand is so high that it's hard to find one at a sensible price right now. But that doesn't mean one should turn to Arrow Lake—instead, just be patient and wait for stocks to improve.

That will surely happen sooner than the Core Ultra 200S becoming a decent gaming chip, at this rate.

]]>
https://www.pcgamer.com/hardware/processors/arrow-lakes-had-three-months-of-windows-and-bios-updates-to-fix-its-performance-and-my-testing-shows-in-some-games-its-worse/ m8LA5sqptQUZo6Nhm854DZ Wed, 22 Jan 2025 15:58:04 +0000
<![CDATA[ Intel's next-gen desktop CPU Nova Lake allegedly spotted and can't come soon enough ]]> We're still getting our heads around Intel's latest Arrow Lake desktop CPUs (actually you could say, so is Intel). But here comes news of its successor, namely Nova Lake.

It's been spotted on a shipping manifest by X user X86 is dead&back (via Tom's Hardware). The manifest provides little detail, though it seems to indicate the chip in question may be a low-end i3 variant. Of course, Intel no longer uses the "i" prefix in its CPU nomenclature, so that may be some kind of coincidence or just a hangover from the previous branding era, a placeholder or similar.

Anyway, Intel itself hasn't revealed much about Nova Lake other than to say that it will mostly be built in house, with a few isolated models possibly farmed out to TSMC.

Nova Lake is due around the 2026 or 2027 time frame, and Intel hasn't explicitly said which production node it will use. But given that the company says its next mobile CPU, Panther Lake, will be produced internally on 18A silicon, and that Panther Lake comes before Nova Lake, 18A at least seems like a safe bet.

While there are some rumours that Nova Lake will be built on Intel's future 14A node, that is rather speculative, what with Intel yet to prove that 18A is working well. Moreover, an actual test chip like this being built on 14A doesn't seem likely.

The other big question involves Nova Lake's construction. It'll be a chiplet design again, but what kind of chiplets? Arrow Lake's architecture arguably suffers due to the memory controller being located on a separate die from the CPU cores, introducing performance-sapping latency.

Intel's more successful Lunar Lake laptop chip used just two chiplets and put the CPU cores on the same die as the memory controller. It's unclear exactly how Panther Lake and Nova Lake will be segmented. But some rumours claim Panther Lake will have CPU cores and memory controller on the same die, while Nova Lake will not.

As for the specifics of those cores, it's thought Nova Lake will get Panther Lake's Panther Cove P-cores, renamed to Coyote Cove, plus Arctic Wolf E-cores. At this stage, very little is known about how those cores perform. But, arguably, it'll be that 18A silicon that defines Nova Lake as much as anything.

How good will 18A be? Will Intel actually use 18A for Nova Lake? After all, Arrow Lake was originally meant to be at least in part built on Intel's 20A node before Intel summarily ditched that idea and shifted the whole thing bar the base tile to TSMC, with the CPU cores made on TSMC's N3B node.

Anyway, with Arrow Lake hardly setting the PC enthusiast world alight, we dare say Nova Lake can't come soon enough. Lunar Lake shows that Intel hasn't totally lost its mojo, so there's no reason why Nova Lake can't be a banger. But it does rather need to be just that to get Intel back on track and stop what currently feels like a bit of an inevitable slide towards an eventual demise.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/intels-next-gen-desktop-cpu-nova-lake-allegedly-spotted-and-cant-come-soon-enough/ gDwYqYVnVCRjpzs7PVrgDj Wed, 22 Jan 2025 11:13:08 +0000
<![CDATA[ AMD says it took four goes to get its new Strix Halo uber APU right and that included designing new CPU dies that 'put Threadripper in the palm of your hands' ]]> AMD's new Strix Halo uber APU for laptops was already pretty interesting, what with its 256-bit memory bus and monster sized iGPU. Now it turns out that its gestation was a little unusual, with AMD needing four goes at it to get it right and adding some trick tech to its CPU dies in the process.

In an interview with website Chips and Cheese, AMD Senior Fellow Mahesh Subramony revealed some new details about Strix Halo's inner workings. Subramony says AMD "took four iterations" to get Strix Halo right.

That's perhaps not a huge surprise, given Strix Halo had been rumoured for some time and arrived a little later than initial expectations. What is news is that Strix Halo's CPU CCD dies might not be exactly what you expected.

When the APU was first revealed, it looked like AMD had taken a pair of its eight-core Zen 5 CPU CCD dies and crammed them into a package with a new I/O die containing that huge (for an APU) 40 CU iGPU.

Well, that's not the case. Strix Halo has its very own CPU CCDs. They're still Zen 5 based, but AMD has tweaked the CCDs to suit Strix Halo's mobile remit.

For starters, they have a new interconnect. Subramony says the existing interconnect AMD uses between the CCDs in its desktop Zen 5 chips like the Ryzen 9 9950X is fast but has limitations when it comes to power efficiency, particularly in the range of power states it supports.

The new interconnect for Strix Halo is said to be better in every way. "Low power, same high bandwidth, 32 bytes per cycle in both directions, lower latency," Subramony explains. He also says that switching power states is now "almost instant".

The downside? It's a little more expensive to fabricate than the desktop interconnect. However, Subramony also says that Strix Halo is a full-feature Zen 5 implementation, including the 512-bit FPU.

"I almost joke about it saying it's a Threadripper to put in the palm of your hands. So we didn't pull any punches. These have the 512 bit data path. It is a full desktop architecture," he says.

The only exception to that is clockspeed. "We have binned the parts for efficiency. So it might not hit the peak frequency that you would see on the desktop," Subramony explains.

He also says that the 32 MB of Infinity Cache on the GPU die currently can't be directly accessed by the CPU; it's reserved for the GPU, though that might change in future. "We change that with a flip of a bit but we don't see an application right now where we need to amplify CPU bandwidth," he says.

There are further details about Strix Halo's inner workings in the interview. But suffice to say that what was already one of the most interesting chips in recent years just got a bit more intriguing.

The effort AMD has clearly put into Strix Halo also bodes well for its performance and battery life. If anything, it was the latter that was the greatest unknown with Strix Halo. Could AMD really cram 16 Zen 5 cores and a huge GPU into a power-efficient package?

I was doubtful, for sure. But after learning more about the technology AMD has put into Strix Halo, I can't wait to see just how good AMD's uber APU really is.

]]>
https://www.pcgamer.com/hardware/processors/amd-says-it-took-four-goes-to-get-its-new-strix-halo-uber-apu-right-and-that-included-designing-new-cpu-dies-that-put-threadripper-in-the-palm-of-your-hands/ TKnWPWBCFfKjr2zxvfxkVk Tue, 14 Jan 2025 11:53:15 +0000
<![CDATA[ Core i9 14900KF CPU hits a world record 9.12 GHz and proves Intel chips are still good at something ]]>

There's a new king of the HWBot overclocking hill and it's the Intel Core i9 14900KF. Actually, the 14900KF has held the frequency record for CPUs before. But a Chinese overclocker going by the name "wytiwx" just upped the ante to a new world record for any CPU ever of 9.12161 GHz.

Yup, once again it's an older Raptor Lake Intel CPU that's marginally increased on the previous record, the 9.11775 GHz achieved by a 14900KS in March last year. In other words, Intel's latest Arrow Lake CPUs, including the Core Ultra 9 285K, are nowhere to be seen in the top echelons of overclocking.

Is that because of Arrow Lake's architecture? Or is TSMC's N3 silicon, as used for Arrow Lake's CPU cores, not actually as good as Raptor Lake's Intel 7 node (the node formerly known as 10nm) at overclocking?

It's possibly a bit of both. But despite Intel's well-publicised difficulties when it comes to chip manufacturing technology in recent years, there's no doubting that Big Blue has a great track record when it comes to achieving top frequencies.

Intel is due to move back to its own 18A node for high-performance CPU manufacturing with Panther Lake. So it will be interesting to see if Intel's new 18A CPUs can beat these old Raptor Lake chips and edge us closer to the magic 10 GHz.

Anywho, wytiwx used an ASUS ROG Maximus Z790 Apex motherboard and, inevitably, liquid helium for cooling to achieve -258 degrees C. Chilly.

If you're wondering, the fastest ever AMD CPU is the Bulldozer-based AMD FX 8370, which hit 8.7228 GHz over 10 years ago. It's also interesting to note how the top frequency achieved with a PC processor has levelled off in recent years.

Between 1996 and 2007, it shot up from just 233 MHz to just over 8 GHz. But it took a further 15 years from there to top 9 GHz in 2022, and another two years or so to go from 9.008 GHz to today's 9.12-and-a-bit GHz.

Way back in the year 2000, Intel predicted it might hit 10 GHz by 2005. Of course, back then Intel was all about clockspeed. Its Netburst Pentium 4 chips were designed for raw frequency above all else and the assumption was that improved CPU performance hinged largely on operating frequencies going up.

Shortly afterward, Intel hit something of a clockspeed wall and the whole industry switched tack in favour of more cores running rather slower than 10 GHz. Then again, with improvements in transistor density also levelling off, perhaps we'll see a return to an emphasis on frequency and that 10 GHz barrier might finally be breached. Watch this space, peeps.

]]>
https://www.pcgamer.com/hardware/processors/core-i9-14900kf-cpu-hits-a-world-record-9-12-ghz-and-proves-intel-chips-are-still-good-at-something/ BCcz6yesqooihzp8t2Lkia Mon, 13 Jan 2025 17:56:18 +0000
<![CDATA[ If you're a content creator and gamer looking for a CPU upgrade, this Ryzen 9 9900X deal will get you $90 off ]]>

Ryzen 9 9900X | 12 cores | 24 threads | 5.6 GHz boost | 64 MB L3 cache | 120 W TDP | AM5 socket | $499 now $409.99 (save $89.01)

In our Ryzen 9 9900X review, our main criticism was the price, as it didn't majorly outshine the much cheaper Ryzen 9 7900X, but at this discount the value proposition is much better.

Over the last few years, with Intel picking up many losses in the CPU division, the divide between content creation and gaming has mostly come down to AMD chips with and without 3D V-Cache technology.

If you are looking for a dedicated content creation rig, which can also happily handle games, you end up paying a little more but, luckily, with a few sales, you can now get the rather impressive Ryzen 9 9900X at a reasonable price.

At Newegg right now, you can pick up this CPU for $409.99, which is $89.01 cheaper than its full retail price. With a relatively low TDP of 120 W, it's easy to cool and a great workhorse, but it unfortunately has a poor eco mode: bring it all the way down to 65 W and it gets outpaced by much cheaper CPUs. On top of that, the original price puts it way above comparable chips, which makes it not great for the money at full retail. This sale, however, puts it in a much more attractive range.

The 9900X performs productivity tasks fantastically, outpacing the Ryzen 9 7900X and Intel Core i9 14900K in single-core CPU rendering, though it's beaten in multi-core rendering by Intel's much more expensive chip. If you are considering a Zen 5 workstation, this CPU is strong for all kinds of encoding, rendering, and editing.

When you move to gaming, this chip's beefy specs aren't quite as well reflected in performance. It still does well, roughly matching the Ryzen 7 9700X and landing just a little behind the Intel Core i9 14900K.

When we originally reviewed this CPU, we recommended picking the Ryzen 9 7900X instead, not because it performs poorly but because the price increase wasn't fully reflected in performance gains. With that last-gen CPU now being $398.99 at Newegg, this upgrade makes a lot more sense.

If you want solid but not hugely noteworthy game performance, yet excellent content creation and productivity performance, and you have the AM5 socket to plug this thing in, you won't be disappointed in your upgrade.

]]>
https://www.pcgamer.com/hardware/processors/if-youre-a-content-creator-and-gamer-looking-for-a-cpu-upgrade-this-ryzen-9-9900x-deal-will-get-you-usd90-off/ jCd78XZWYEcBYgPT3yLAwE Fri, 10 Jan 2025 15:49:36 +0000
<![CDATA[ AMD accuses Intel's Arrow Lake of being a 'horrible' product and implies a lack of options for consumers has caused the Ryzen 7 9800X3D shortage ]]> The Ryzen 7 9800X3D, the current best CPU for gaming, launched just a little while ago and its impeccable performance has already caused shortages. AMD says the level of demand is partially down to Intel's launch of the rather underwhelming Arrow Lake.

As told to a writer at Tom's Hardware in a roundtable interview with AMD executives, AMD blames part of the severe demand for its current best chip on Intel launching a mediocre product in the hotly anticipated Arrow Lake. Okay, AMD put it a bit more harshly than that, saying "We knew we built a great part. We didn't know the competitor had built a horrible one."

On this, we tested out both the Intel Core Ultra 9 285K and Intel Core Ultra 5 245K back in October and they are certainly underwhelming, though we wouldn't quite call them horrible.

The former CPU is great for productivity needs but beaten out in gaming by cheaper, older AMD CPUs. The latter is a strong budget choice for content creation, but the use-case is far more niche than one might expect from a competitive Intel CPU.

However, this was a 'just okay' launch of a hotly anticipated set of CPUs, and especially when you consider the instability issues that still negatively impact the consumer view of Intel, these chips needed to be better to compete with AMD, which is currently performing really well.

Historically known for making the best CPUs in the world, Intel has had a bit of a fall from grace and the newest CPUs aren't helping it.

AMD itself didn't have a flawless showing at CES. Early Radeon RX 9070 benchmarks are certainly positive, but the cards themselves weren't given a release date, official performance figures, or even a price point. We can say the fps in some games looks healthy but won't be able to say much more until we know what we're comparing it to in the market.

It isn't yet clear when the 9800X3D will trickle into the normal market at its intended price point but it's a good chip, and one worth waiting for. I recently upgraded to the previous-generation chip, the AMD Ryzen 7 7800X3D, in my own personal rig and can happily state it still performs excellently.

Hopefully, Intel can start to claw its way out of the hole it has dug itself, even just so it's easier to get ahold of the best CPUs.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amd-accuses-intels-arrow-lake-of-being-a-horrible-product-and-implies-a-lack-of-options-for-consumers-has-caused-the-ryzen-7-9800x3d-shortage/ gEGi9MAjyTNxFPVcHNXbcF Fri, 10 Jan 2025 12:49:58 +0000
<![CDATA[ Nvidia CEO Jen-Hsun Huang's simple reminder that useful quantum computing is a long way off has somehow caused industry stocks to plummet ]]> Remember when a computer meant something that used traditional, familiar algorithms? Ah, simple times. Now we not only have machine learning—aka, supposed "artificial intelligence"—but also quantum computing, which uses microwaves to get qubits to do wacky and seemingly impossible things. Regarding the latter kinds of computers, though, Nvidia CEO Jen-Hsun Huang reckons we're quite far from seeing actually useful ones.

That's straight from the horse's mouth, so to speak, which you can witness for yourself by skipping to 40:00 in the video of the CEO's recent investor Q&A (via The Register) held at CES 2025.

In response to a question about quantum computing, Huang says: "We're probably somewhere between—in terms of the number of qubits—five orders of magnitude or six orders of magnitude away, and so if you kind of said '15 years' for very useful quantum computers, that would probably be on the early side. 30 is probably on the late side. But if you picked 20, I think a whole bunch of us would believe it."
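
To put those orders of magnitude into actual numbers, here's a quick back-of-envelope sketch in Python. The starting figure of roughly 1,000 physical qubits for today's biggest machines is our own ballpark assumption, not something Huang quoted.

```python
# Rough illustration of Huang's "five or six orders of magnitude" remark.
# Assumption (ours): today's largest machines have on the order of 1,000
# physical qubits; that starting point is not from the interview.
todays_qubits = 1_000

for orders in (5, 6):
    needed = todays_qubits * 10**orders
    print(f"{orders} orders of magnitude more: ~{needed:,} qubits")

# Prints roughly 100 million to 1 billion qubits, which is why estimates for
# genuinely useful, error-corrected machines stretch out over decades.
```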

The unfortunate side effect of Huang's words, as reported by Reuters, is that many quantum computing companies have seen their stocks drop. Reuters explains that "Rigetti Computing, D-Wave Quantum, Quantum Computing, and IonQ all fell more than 40%" and "the companies, in total, were set to lose more than $8 billion in market value."

Stocks for quantum computing companies had only recently shot up after Google introduced Willow, which it claimed "performed a standard benchmark computation in under five minutes that would take one of today’s fastest supercomputers 10 septillion years—a number that vastly exceeds the age of the Universe."

Huang's not wrong, though. Practically useful quantum computers—at least "useful" in the way people usually mean—really are a long way off. To think that's a mark against them, however, is to misunderstand what quantum computing is and what its purpose is.

One big thing that quantum computing investors who have now jumped ship might have overlooked is, as analyst Richard Shannon says (via Reuters), that there should be "considerable government-related revenues in the next few years." Investors might, therefore, be "missing a key part of the equation."

This is because quantum computers are very good at doing niche calculations on small amounts of data. Huang himself explains: "Quantum computing can't solve every problem. It's good at small data, big combinatorial computing problems. It's not good at large data problems, it's good at small data problems."

One such kind of problem is encryption/decryption, which makes quantum computing something governments and defence industries are very interested in. It's arguably more of an "arms race" contender than AI is, given the sheer potential compute power for these niche applications compared to traditional computing.

AI and quantum aren't at odds with each other, either. As Huang also explains in the Q&A, "It turns out that you need a classical computer to do error correction with the quantum computer. And that classical computer better be the fastest computer that humanity can build, and that happens to be us." In fact, Huang says, "We want to help the industry get there as fast as possible and to create the computer of the future."

Somehow I don't think Huang's "20 years" claim is going to stop the march of progress on the quantum computing front, regardless of any stock slides.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/nvidia-ceo-jen-hsun-huangs-simple-reminder-that-useful-quantum-computing-is-a-long-way-off-has-somehow-caused-industry-stocks-to-plummet/ Hg2rYyeAaSPiUTP3G8nj3K Thu, 09 Jan 2025 15:03:02 +0000
<![CDATA[ Nvidia seems to have just confirmed upcoming Arm and Blackwell laptop chips based on its new GB10 processor in collaboration with MediaTek ]]> With all the RTX 50-series chatter coming out of CES 2025, it might be easy to forget that we've been hoping for Nvidia to announce something else this year: namely, an all-Nvidia Arm laptop chip. On that front, it seems we now finally have confirmation that such a thing is in the works and seemingly just around the corner.

According to HardwareLuxx, Nvidia CEO Jen-Hsun Huang confirmed during a Q&A that Nvidia is working with MediaTek to create an end-user system on a chip (SoC) based on the just-announced Project Digits mini home-user AI supercomputer. An "end-user system" would presumably mean a mobile chip that could be used in a laptop.

Huang reportedly said: "We're going to make this a mainstream product. We'll support it with all the things that we do to support professional and high-quality software, and the PC (manufacturers) will make it available to end users."

Given the context, by "this", Huang presumably means the GB10 chip at the heart of the Project Digits mini supercomputer, though likely with different core configurations, GPU core counts, etc. In other words, it seems like he's saying we'll be seeing a (presumably scaled-back) version of this SoC hitting the end-user market, via PC manufacturers.

The GB10 SoC in the Project Digits supercomputer—like a GB100, sans a zero, geddit?—features a Blackwell GPU capable of one petaFLOP of FP4 AI compute and a Grace CPU with 20 Arm cores, plus 128 GB of LPDDR5X memory and up to 4 TB of NVMe storage.

The GB10, Nvidia says, is the "world’s Smallest AI Supercomputer Capable of Running 200B-Parameter Models". It's primarily for students, researchers, and hobbyists to try out powerful local AI that kind of replicates cloud-based AI, and all this runs on a Linux-based DGX operating system.
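
As a rough sanity check on that 200-billion-parameter claim, here's a short Python sketch estimating the memory footprint of a model stored at 4-bit (FP4) precision. The ~15% overhead figure is our assumption rather than an Nvidia number.

```python
# Back-of-envelope: can a 200B-parameter model fit in the GB10's 128 GB?
params = 200e9          # 200 billion parameters
bytes_per_param = 0.5   # FP4 = 4 bits = half a byte per weight
overhead = 1.15         # assumed ~15% extra for activations/KV cache (our guess)

weights_gb = params * bytes_per_param / 1e9
total_gb = weights_gb * overhead

print(f"Weights alone: ~{weights_gb:.0f} GB")  # ~100 GB
print(f"With overhead: ~{total_gb:.0f} GB")    # ~115 GB, which fits in 128 GB
```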

The GB10 is much more powerful than the Nvidia Jetson Orin Nano that Nvidia announced in December 2024, which is only capable of 67 INT8 TOPS. Project Digits is much more worthy of the "supercomputer" name, and that's probably why it costs $3,000 while Jetson Orin Nano costs just $249.

The idea that Nvidia might make an end-user SoC with Arm CPU cores in collaboration with MediaTek isn't new. In fact, it's one of the things we've been excited about potentially seeing in 2025, as it could mean getting our hands on an all-Nvidia laptop. We'd heard rumours of such chips going into production in 2025 since at least November last year, and we'd even heard talk of the first such SoC having RTX 4070 mobile and Strix Halo-level performance.

Of course, all that performance talk is still speculation, but it seems that what's not speculation now is that Nvidia's working on bringing an Nvidia x MediaTek SoC to market as a "mainstream product." And now that we have the actual GB10 chip that can act as a springboard for consumer chips, we might not have to wait too long.

HardwareLuxx mentions Computex 2025 (at the end of May) as a possible time for Nvidia to introduce such end-user mobile chips, and this might make sense. Nothing definite, of course, but here's hoping.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/nvidia-seems-to-have-just-confirmed-upcoming-arm-and-blackwell-laptop-chips-based-on-its-new-gb10-processor-in-collaboration-with-mediatek/ vg2DjACdDY8iVL8PNHstg8 Thu, 09 Jan 2025 12:05:48 +0000
<![CDATA[ AMD says there are no technical reasons for not having an X3D processor with 3D V-Cache on both CCDs, but we probably won't see such a dual-stacked chip anyway ]]> Alongside sparse RDNA 4 graphics card details and somewhat denser mobile chip announcements, AMD also announced the Ryzen 9 9950X3D and 9900X3D this CES 2025. Much like their 7000-series counterparts, however, and despite each CPU having two Core Complex Dies (CCDs) full of CPU cores, only one of the two CCDs has 3D V-Cache stacked underneath or on top (underneath for 9000X3D chips and on top for 7000X3D ones).

This, however, doesn't seem to be due to any technical limitations, but just because it's not worth it. HardwareLuxx says it asked AMD about it and "the answer was surprising: there are no technical reasons or challenges" why "we haven't seen a Ryzen processor with two CCDs and 3D V-Cache on each of the CCDs."

Apparently, "such a processor would simply be too expensive and games would not benefit from a second CCD with 3D V-Cache to the same extent as the step from 32 to 96 MB L3 cache for one CCD."

The AMD Ryzen 7 9800X3D, the current best CPU for gaming, has just one CCD and 64 MB of 3D V-Cache that sits underneath it. This 3D-stacked cache is great for gaming, which is famously cache-hungry.

The just-announced Ryzen 9 9900X3D and 9950X3D, however, have two CCDs—the former with six cores per CCD and the latter with eight cores per CCD. But the 3D V-Cache sits underneath just one of these chiplets, which means the top-end 9950X3D has the same 64 MB of 3D V-Cache as the 9800X3D, and only eight of its cores (again, like the 9800X3D) can access it.

The benefit of this is twofold. First, the cores on the CCD without the stacked cache can boost to a higher clock speed. Second, there are more cores, which is great for applications that require lots of multicore performance. Crucially, with the 9950X3D at least, in addition to these two benefits, we should also get similar gaming performance to the 9800X3D, given it still has eight cores with access to 64 MB of 3D V-Cache.

That's the theory, anyway. In practice, performance will depend on how well software handles picking between cores with access to 3D V-Cache and faster cores that lack such access. CPU core scheduling can cause issues, as was noted with the 7900X3D and 7950X3D.

One might naturally think the next step would be to chuck stacked cache underneath both chiplets. However, as AMD points out to HardwareLuxx, there's little benefit to doing so as it's "too expensive and games would not benefit."

That's primarily because thread schedulers try to keep all game threads running on the cores of a single CCD regardless. And inter-CCD latency is so high that it would make no sense for these cores to reach across to the other chiplet's stacked cache. As AMD reportedly says, games wouldn't benefit.

Which isn't to say that nothing would benefit. As HardwareLuxx points out, "there are applications that would definitely benefit from 192 MB of L3 cache with 16 cores." But a game won't be one of them, and AMD has clearly—until now, at least—judged that the market for those applications that might benefit isn't big enough to economically justify making dual-stacked X3D chips.
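
If you're curious what 'keeping game threads on one CCD' actually looks like in practice, here's a minimal sketch using the cross-platform psutil library. It assumes the V-Cache CCD maps to logical CPUs 0-15, which varies by system, and it's purely an illustration of manual core pinning rather than AMD's or Windows' actual scheduling logic.

```python
# Pin a running game to one CCD so its threads share the stacked L3 cache.
# Assumption: logical CPUs 0-15 belong to the V-Cache CCD on this machine
# (check your own topology first; the mapping differs between systems).
import psutil

VCACHE_CCD_CPUS = list(range(16))  # 8 cores x 2 threads on the cache CCD

def pin_to_vcache_ccd(process_name: str) -> None:
    for proc in psutil.process_iter(["name"]):
        if proc.info["name"] == process_name:
            proc.cpu_affinity(VCACHE_CCD_CPUS)  # restrict to the cache CCD
            print(f"Pinned PID {proc.pid} to CPUs {VCACHE_CCD_CPUS}")

if __name__ == "__main__":
    pin_to_vcache_ccd("game.exe")  # hypothetical process name
```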

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amd-says-there-are-no-technical-reasons-for-not-having-an-x3d-processor-with-3d-v-cache-on-both-ccds-but-we-probably-wont-see-such-a-dual-stacked-chip-anyway/ Mq2CHu8rMwFkWAhQuhmoo8 Wed, 08 Jan 2025 15:12:24 +0000
<![CDATA[ Intel on its next-gen laptop chip: 'Panther Lake will take everything you love about Lunar Lake to the next level' ]]> Almost 14 months ago, Intel launched its first multi-tile processor, codenamed Meteor Lake, for laptop PCs. Just nine months later, it was joined by the much-improved Lunar Lake chip, for the ultra-low power market and at this year's CES event, Intel briefly demonstrated its successor, Panther Lake. As this is the first processor to be made on Intel's 18A process node, the little chip garnered quite the attention.

2024 is almost certainly a year that Intel would prefer everyone to forget about, and at this year's mega-tech CES event, it's been singing its own praises regarding how it plans to turn things around in the coming months. Top of the list, despite only appearing briefly at the conference, was Panther Lake—the architectural successor to the well-received Lunar Lake.

Interim Co-CEO Michelle Johnston Holthaus held a sample of the tiny chip aloft for the assembled crowd of journalists to see. That was about all we really got to observe about Panther Lake, though Holthaus did her best to promote it.

"Panther Lake, our lead product on Intel 18A, will launch in the second half of this year. It will take everything you love about Lunar Lake, all the advances in the architecture, to the next level. We have systems already running Panther Lake, and we're sampling it across all of our major customers already."

A variety of different laptops using Panther Lake chips were on display in another booth but, given that the processor is still in development, nobody was permitted to play around with the machines or observe them running any applications.

Not that Panther Lake will really matter for PC gaming because while some gaming laptops have sported Meteor Lake CPUs, almost none are using Lunar Lake. The exception, and it is a very loose exception, is MSI's Claw 8 handheld gaming PC. That's clearly not a laptop but Lunar Lake does pretty well in that format, so there's a good chance that Panther Lake might appear in the odd handheld or two.

A photograph of Intel's Interim Co-CEO Michelle Johnston Holthaus standing on stage, with a background displaying Panther Lake and Intel 18A

(Image credit: Future)

The one thing we do know about Intel's next-gen processor is that it won't be using on-package DRAM like Lunar Lake does. This should help to make the chip more palatable to system vendors, who want to offer customers a broad range of products, especially in the budget sector. Compared to a typical laptop CPU, Lunar Lake is quite expensive and rather limited in scope, because it has two DRAM chips bonded to the package.

Investors will be looking very carefully at how good Panther Lake is because it's the first commercial chip to roll off Intel's 18A production line. Previous CEO Pat Gelsinger famously "bet the whole company on 18A" and if the little CPU turns out to be a miniature marvel then his gamble will be fully justified. If it isn't though, then it's difficult to see how Intel Foundry can continue operating.

It might just be for a fairly niche sector but an awful lot of Intel's fortunes are going to be riding on Panther Lake. For the consumer's sake (i.e. a competitive market), let's hope it's as good as Intel says it is.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/intel-on-its-next-gen-laptop-chip-panther-lake-will-take-everything-you-love-about-lunar-lake-to-the-next-level/ oak4jN42HqBT5HPEd4RGUU Tue, 07 Jan 2025 14:59:56 +0000
<![CDATA[ AMD Ryzen AI Max is finally here: 'the most advanced mobile x86 processor ever created' with 40 RDNA 3.5 CUs and 16 Zen 5 cores ]]> Well, it's finally here. We've been crossing our fingers for a long, long time waiting for AMD Strix Halo, and now the waiting's over. As one of its many rabbit hat-pulls for CES 2025, AMD's just announced the halo (ie, top) end of its Strix Point mobile processors, combining Zen 5 CPU cores, RDNA 3.5 integrated graphics, and an NPU.

These new "halo" processors are the Ryzen AI Max 385, AI Max 390, and AI Max+ 395. (Oh, and the AI Max 380, but that's a Pro-only, ie, business-focused, processor.) These will be available Q1-Q2 2025.

We've suspected a Strix Halo launch for a while now, and we'd heard about its AI Max naming and possible specs since September last year. As it turns out, those previously rumoured specs were almost entirely accurate. The only difference is that now we know the Ryzen AI Max 390 will have just 32 CUs, not 40 as the previous rumour had it. That number of CUs is the privilege of the top-end Max+ 395 alone.

Here's the full range of specs:

If you're wondering just how powerful those 40 RDNA 3.5 CUs on the AI Max+ 395 will be, bear in mind the Z1 Extreme found in the Asus ROG Ally X and a couple of other handhelds has just 12 RDNA 3 CUs, which are very similar in architecture. Even in the AI Max 385 and 390, you're getting close to three times the graphics capabilities of such handhelds.

But these AI Max chips probably won't be for handhelds—what use would a handheld have for so many CPU cores plus an NPU? Instead, AMD is explicitly targeting these chips at the laptop market, especially for creative and AI workloads.

(Image gallery: AMD charts comparing the Ryzen AI Max+ 395's 3D rendering performance to the Intel Core Ultra 9 288V and Apple M4, and its graphics performance to the Intel Core Ultra 9 288V. Image credit: AMD)

This is in part thanks to its "unified coherent memory architecture" which allows up to 96 GB of memory to be dedicated to graphics, with a bandwidth of 256 GB/s. Such bandwidth, AMD says, is "unprecedented in any x86 mobile device". Ultimately, all of this combined with the NPU means AI Max chips can run large AI workloads with "performance faster than high-end desktop graphics cards".
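
To give a feel for what that 256 GB/s figure means for local AI, here's a quick Python estimate of bandwidth-limited token throughput. The model sizes and the assumption that every weight is read once per generated token are our simplifications, not AMD's numbers.

```python
# Rough, bandwidth-bound estimate of local LLM speed on a 256 GB/s part.
# Simplification (ours): each generated token reads every weight once.
BANDWIDTH_GB_S = 256

def tokens_per_second(params_billions: float, bits_per_weight: int) -> float:
    model_gb = params_billions * bits_per_weight / 8  # weight size in GB
    return BANDWIDTH_GB_S / model_gb

for params, bits in [(70, 4), (34, 4), (13, 8)]:
    print(f"{params}B model @ {bits}-bit: ~{tokens_per_second(params, bits):.1f} tokens/s")
```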

AMD backs up these claims with its own graphs—which we should, of course, take with a pinch of salt. Comparing the top-end AI Max+ 395 to the Intel Core Ultra 9 288V, AMD claims 1.4x faster graphics performance and 2.6x faster rendering (including a whopping 402% faster performance in Blender Classroom). And it even seems to trade blows with, or beat, Apple M4 chips for rendering.

We can obviously expect good things for gaming, too, compared to other systems with integrated graphics. For context, we found the 16-CU 890M to have modern games playable at 1080p, and these new AI Max processors will have at least double the CUs of that mobile GPU.

That's why I certainly take the AMD representative seriously when they say it is "simply the most advanced mobile x86 processor ever created". The only question is price and target market.

I don't, unfortunately, see these AI Max chips being the best options for gaming. They'll be a dab hand at gaming, for sure, but they'll surely be expensive given an overall feature set aimed squarely at creative professionals and graphics/AI developers.

In which case, I'd suspect we'd get more gaming bang for our buck from gaming laptops with dedicated mobile GPUs in them. Then again, Strix Halo draws less power than a fully fledged gaming CPU and GPU combo, so maybe the battery life will justify the likely premium cost. We'll have to wait and see.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amd-ryzen-ai-max-is-finally-here-the-most-advanced-mobile-x86-processor-ever-created-with-40-rdna-3-5-cus-and-16-zen-5-cores/ HSWU84GitmvpNEaceNsnpM Mon, 06 Jan 2025 19:47:00 +0000
<![CDATA[ AMD has just announced the two new 9000-series X3D chips we were hoping for, the Ryzen 9 9950X3D and 9900X3D ]]> AMD's X3D chips have long been some of the best for gaming because they stack tons of 3D V-Cache either on top of or underneath the processor, and games absolutely love all that cache. We had the AMD Ryzen 7 9800X3D launch late last year, this being the current best CPU for gaming, and now, AMD's just announced two even more powerful versions.

The two new chips, the just-announced AMD Ryzen 9 9900X3D and Ryzen 9 9950X3D, should be available from Q1 this year.

They don't come as a complete surprise, either—indeed, they were two of the main PC gaming components we were anticipating for 2025. That's because the 7000-series also featured a Ryzen 9 7900X3D and Ryzen 9 7950X3D (although these actually launched before the Ryzen 7 7800X3D).

As with those 7000-series chips, AMD has now confirmed that the 9900X3D and 9950X3D will feature a split design where the 3D-stacked cache is attached to just one of the two dies, meaning half of the processor's cores can benefit from the 3D cache while the other half can—at least in theory—benefit from higher clock speeds.

At least, that's the theory. We'll have to see how they stack up (sorry) in practice. The "higher clock speed" argument for the non-stacked die made sense with the top two 7000-series X3D chips because the cache sat on top of the die, which stood in the way of cooling. But with these 9000-series X3D chips the cache is underneath, so cooling isn't as much of an issue—that's one of the big pluses of the latest chips compared to the 7000-series.

(Image gallery: AMD charts comparing the Ryzen 9 9950X3D to the 7950X3D for content creation and gaming. Image credit: AMD)

Nevertheless, the boost clocks are rated higher on the 9900X3D and 9950X3D than on the 9800X3D—300 MHz and 500 MHz higher, respectively.

But on the cache front, note that with the 9900X3D we're only getting six cores (half of its total 12) that can access the stacked cache. And of the 9950X3D's 16 cores, we're getting the same number with access to the stacked cache as we get with the 9800X3D—just eight.

While we won't know until we test it, this might mean little benefit for gaming compared to the 9800X3D, given the stacked cache and number of cores able to access it will remain the same.

One of AMD's charts shows the 9950X3D to be 8% faster on average than the 7950X3D for gaming, but that's the previous-gen X3D chip, not the current-gen one. However, another chart shows it to be 20% faster than the Intel Core Ultra 9 285K, and that's Intel's current best offering.

(Image gallery: AMD charts comparing the Ryzen 9 9950X3D to the Intel Core Ultra 9 285K for gaming and content creation. Image credit: AMD)

So what's the benefit? Well, we get the higher boost clocks, for sure, but we also get more total cores. Which should mean better multithreaded productivity performance.

And we can see that this bears out in AMD's testing—which should obviously be taken with a pinch of salt. In content creation apps, for instance, AMD claims an average of 10% faster performance across 20 tested apps compared to the Intel Core Ultra 9 285K, and 13% faster performance than the 7950X3D.

We really will have to test all this for ourselves, though, not least because mixing and matching dies and stacked caches can make it so that software has to decide whether and how to divvy up its workloads—whether to use more cache or faster core speeds. That's the same problem that faced the 7900X3D and 7950X3D.

Whatever the case, it's exciting to have some new, powerful X3D chips just around the corner.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amd-has-just-announced-the-two-new-9000-series-x3d-chips-we-were-hoping-for-the-ryzen-9-9950x3d-and-9900x3d/ PDmg65VMwKhCpbdD6bjoCK Mon, 06 Jan 2025 19:45:10 +0000
<![CDATA[ AMD's throwing the considerably hefty Ryzen 9 9950X3D at gaming laptops and calling it a Ryzen 9 9955HX3D ]]> AMD isn't pulling any punches in gaming laptops. The company's latest plan announced over at CES 2025 is to take its beefiest 16-core/32-thread processor, whack loads more cache on the top, and run it at a laptop-friendly TDP of 54 W.

It's called the AMD Ryzen 9 9955HX3D. It comes with a max boost clock of 5.4 GHz and 144 MB of L3 cache. It's a little bit slower than the desktop version, the Ryzen 9 9950X3D also just announced at CES, but really only a little bit. That desktop chip runs at 5.7 GHz and 170 W.

It really is just the desktop chip reborn for mobile, too. The same chiplet design with a laptop-friendly socket. A large package by laptop standards.

This chip follows on from the Ryzen 9 7945HX3D, which we were impressed with in testing but never saw much of again. It was available in a couple of ROG Strix Scar 17 laptops, one of which we tested, but little else. Here's hoping that changes with this newer version and that it's more widely available.

Though a beefy laptop chip will face other constraints. Thermals are going to be a challenge, and we don't yet know who will utilise this chip in a gaming laptop. It's a pretty extreme part, and might be less popular than some of the more sensible options available from AMD, including the new 16-core 9955HX, the new 12-core 9850HX, the new Ryzen AI Max, or one of the many Ryzen AI 300-series chips.

A table showing specifications for three new AMD mobile processors, led by the AMD Ryzen 9 9955HX3D.

(Image credit: AMD)

Speaking of the Ryzen AI 300-series, there are new Ryzen AI 7 and Ryzen AI 5 chips on the way. Here they are:

  • Ryzen AI 7 350 — 8C/16T, 5 GHz boost, 24 MB L3 cache
  • Ryzen AI 5 340 — 6C/12T, 4.8 GHz boost, 22 MB L3 cache

Those join the many Ryzen AI Max chips, including the top 16C/32T model with 40 (forty!) GPU compute units.

With so many options to choose from, some including beefy GPUs, AMD must be expecting a big uptick in gaming laptops stuffed with its chips. The dominant combination has long been an Intel CPU with an Nvidia GPU, though we've seen more of AMD's chips make it into the market in recent years. Not so much discrete GPUs, though, and AMD has nothing new to offer gaming laptops in this regard at CES.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/amds-throwing-the-considerably-hefty-ryzen-9-9950x3d-at-gaming-laptops-and-calling-it-a-ryzen-9-9955hx3d/ kVBqUjwKMS9a2mBmmVUVGL Mon, 06 Jan 2025 19:45:00 +0000
<![CDATA[ Intel unveils second round of updates intended to bring Arrow Lake desktop chips up to expectations: 'our software for the 200S has reached full performance' ]]> Intel has announced the final wave of updates aimed at its latest 200S Series processors, intended to improve performance for chips that have faced a rocky start and mixed results.

Robert Hallock, Intel's VP of Client Computing Group and general technical spokesperson, outlined at CES 2025 what the second wave of updates for Arrow Lake looks like (the first wave came back in November). Intel hopes these updates will close the gap between its internal performance expectations and the results seen by reviewers who tested these chips at release, ourselves included.

Hallock began with a general recommendation to turn on Intel's APO, or Application Optimization, tool. This uses various presets to make better use of the CPU in certain supported games, and since it's not exactly new, let's move on to the fresher announcements.

"We saw some certain motherboards that had some BIOS settings that didn't always convey the best performance, and that was on us."

"For some systems, a feature called AutoGV was inadvertently left on, and that was our mistake. CCF AutoGV controls the frequency at which one CPU core can talk to another, and on Arrow Lake S, that should be 3,800 MHz from one core to the next, not the frequency of the CPU cores themselves, but how quickly they can talk with this feature left on. Sometimes that frequency would go as low as 800 MHz in an effort to save power."

"That power saving makes a lot of sense for a mobile solution, where you might spend most of your time at the desktop, but it doesn't make a lot of sense for a gaming CPU," Hallock continues.

Robert Hallock, VP of CCG at Intel, on stage at CES 2025.

(Image credit: Future)

BIOS updates from December onwards rectify this issue, and Hallock says we can expect to see a "pretty healthy uplift" in certain workloads such as file compression and mathematical synthetic workloads.

As for the updates that were previously applied, Intel says that with the latest changes both the 'Best Performance' and 'Balanced' settings in Windows power options will provide the best performance by default. That, apparently, is a key step, as Intel found "most users basically just stick with the balance plan by default."

(Image gallery: Robert Hallock, VP of CCG at Intel, on stage at CES 2025. Image credit: Future)

Users on the balanced profile in Windows will see the most gain from the PPM, or performance and power management package, which is the means for controlling frequency ramping, waking up/sleeping specific cores, and how long they sustain maximum boost speeds.

"Any application that is sensitive to memory access or the performance of a small number of CPU threads will get a pretty generous kick from having this PPM system."

The Balanced-plan changes and the PPM were automatically applied in previous Windows updates, but this does underline how important it is for anyone rocking an Arrow Lake system to keep their Windows install updated to get the most out of the chip. Though, that said, Windows 24H2 has its own share of problems to work out.
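
If you want to check which power plan your own Arrow Lake system is actually running, Windows' built-in powercfg tool will tell you; the little Python wrapper below just shells out to it. This is a convenience sketch of ours, not anything Intel ships.

```python
# Print the active Windows power plan (wraps the built-in powercfg tool).
# Windows-only; run it from a normal terminal.
import subprocess

def active_power_plan() -> str:
    result = subprocess.run(
        ["powercfg", "/getactivescheme"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # e.g. "Power Scheme GUID: ...  (Balanced)"

if __name__ == "__main__":
    print(active_power_plan())
```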

In total, Intel expects all the upgrades of recent weeks and months to make a pretty substantial difference to a system's performance, including by reducing memory latency, which might help gaming performance in some systems.

"Our software for the 200S has reached full performance," Hallock notes.

Though he is also keen to point out that this really depends on the exact specs of a given system, so the exact benefit will vary a lot. Similarly, given what we already know of these chips and their performance, I wouldn't expect a panacea capable of knocking AMD's X3D chips off their lofty gaming performance perch.

How these updates now combine to boost performance for Arrow Lake, most of all in games, will require rebenchmarking these chips. Based on our tests over the past few months, we're not convinced we'll see major changes to the performance picture, due to Windows as much as anything else, but it'll all come out in testing.

In the meantime, this is about as good as any reason to keep your system and motherboard BIOS updated.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/intel-unveils-second-round-of-updates-intended-to-bring-arrow-lake-desktop-chips-up-to-expectations-our-software-for-the-200s-has-reached-full-performance/ K8tnTX6ahVLyFEjYhWqEFd Mon, 06 Jan 2025 19:28:02 +0000
<![CDATA[ Intel announces new mobile Core Ultra 200HX Series processors to power the next generation of gaming laptops ]]> Intel has announced the new Core Ultra 200HX and 200H laptop chips here at CES 2025. The HX parts are the most interesting to PC gamers, as the chips most likely to power your next gaming laptop, so let's get into those first.

There are six new HX models, with the top Intel Core Ultra 9 285HX offering up eight Performance-cores (Lion Cove) and 16 Efficient-cores (Skymont) for a total of 24 threads. Remember, there's no Hyper-Threading on the Lion Cove P-cores used here, so what you see is what you get. The 285HX features a P-core Turbo of 5.5 GHz.

That equates to an increase in single-threaded performance of roughly 5%, by Intel's internal testing, versus the last-generation Raptor Lake Refresh chips. Intel also reckons we'll see a big uplift in multithreaded performance of around 20%, which bears out with the desktop chips.

The HX-series will run at a base power draw of 55 W, though it can ramp up to 160 W when required. That's a pretty chunky power demand—a Turbo power actually 3 W higher than the 14900HX's. So we'll have to see how that plays out in the gaming laptop designs built around these chips.

That said, Jim Johnson, SVP of Intel's Client Computing Group, said of the HX Series: "plus it's about 40% more efficient than our previous generation, leaving more headroom for discrete graphics to stretch its legs."

"It's kind of a one-two punch."

Which all sounds more promising than the specs alone might suggest, but Intel says we'll find out more at a later date. Presumably when some new GPUs come out, which is expected to happen any day now.

What's a little more interesting is that these chips support at least one exciting new memory form factor: CAMM2. CAMM2 is like regular system RAM, only pancaked flat against the motherboard for better cooling. It's still removable, though, so it could make for some interesting applications in gaming laptops specifically.

In terms of integrated GPU performance, these are not equipped with the mighty powerful GPU that we've seen in Lunar Lake or its competitors. There are just four (or three, depending on the model) Xe GPU cores across this lineup, which won't be great for gaming. These chips will, of course, be paired with discrete GPUs instead.

And there's a lot of room for expansion. As Intel's Robert Hallock claims, the HX Series offers "an entire desktop worth of expandability on platform, like, a hilarious amount of USB ports."

When it comes to AI performance, which I know you're all clamouring to find out about, the NPU within the HX-series is only capable of a middling 13 TOPS. That's well short of the 40+ NPU TOPS needed to bag any sort of Microsoft Copilot+ certification.

Intel Core Ultra 200HX and 200H series chip renders.

(Image credit: Intel)

The H Series maxes out with fewer threads than the HX Series, at just 16 threads in total. The Core Ultra 9 285H has six P-cores (Lion Cove), eight E-cores (Skymont), and two low-power E-cores. Yep, it's quite a bit like Meteor Lake in that regard, though the newer architecture still offers some benefits—15% better single-thread and multithread performance versus Meteor Lake, according to Intel.

It also has a specially tweaked Xe GPU with more AI acceleration, courtesy of new XMX engines. Quite a bit more, in fact, up to 77 TOPS. Though what is more exciting for gamers is that Intel claims rough performance parity with AMD's Ryzen AI 9 365 across a range of gaming benchmarks, though AMD does take a few big wins, too.

We'll probably see a wide mix of chips in the next generation of gaming laptops, between these new HX Series, H Series, and V Series chips, and then AMD's options, including the Ryzen AI chips.

As for the HX Series gaming laptops, likely to make up a lot of gaming models alongside a discrete GPU, those are coming in late Q1. 200H chips will turn up "early Q1", which should be any day now.

We'll see more as CES 2025 unfolds. We're at the show, so stay tuned for all that.

Catch up with CES 2025: We're on the ground in sunny Las Vegas covering all the latest announcements from some of the biggest names in tech, including Nvidia, AMD, Intel, Asus, Razer, MSI and more.

]]>
https://www.pcgamer.com/hardware/processors/intel-announces-new-mobile-core-ultra-200hx-series-processors-to-power-the-next-generation-of-gaming-laptops/ ArRiet3ywpWaBV7AJukXvb Mon, 06 Jan 2025 17:04:54 +0000
<![CDATA[ A German retailer reports the Ryzen 7 9800X3D has outsold the entire non-X3D 9000 line ]]> AMD's X3D chips appear to be outselling all of its other Ryzen processors, and Intel's latest aren't even coming close, if sales figures shared by a German retailer are accurate.

As reported by Videocardz, the new Ryzen 7 9800X3D is selling rather well. German retailer Mindfactory displays sales figures alongside all of its CPU listings, which means you can see how many units have previously been sold as you browse each CPU.

The Ryzen 7 7800X3D, the previous best CPU for gaming, and the chip that drives my own rig, for example, shows 78,470 units sold to date on its listing. That's to be expected from such a well-regarded chip, but the most interesting sales figures are for the more recent competing Ryzen chips.

The Ryzen 7 9800X3D, which is currently the best CPU for gaming, has racked up 9,160 sales in just over a month, whereas the Ryzen 5 9600X, Ryzen 9 9900X, and Ryzen 9 9950X, all of which launched in August, are below 1,000 sales each. The Ryzen 5 7600X3D, which also launched in August, has 8,570 sales. The 7600X3D infamously had a bit of a staggered launch, due to originally only being available from a single seller in the US.

We thought all three of the named non-X3D chips were pretty solid in our testing, and the Ryzen 9 9950X is our choice for the best high-end CPU, but it's just not great value for money at around $700, which could account for those lacklustre sales. This follows on from a report in August claiming the 9600X and 9700X weren't selling very well.

Of course, it's important to note that these are just the sales figures from a single retailer in a single country and that there's a little bit of wiggle room with those figures due to changing sales and multiple listings of some chips. They don't necessarily indicate broader sales figures but can give a rough idea of how the chips are seen in the market.

It's also worth noting that the non-X3D chips launched in the same month and mostly offer marginal gaming returns for the extra cost. They are therefore a bit more of an enthusiast pick, or a good option for someone looking to game and heavily edit videos on the same rig.

However, these sales could also be a testament to AMD's cache chiplet technology. X3D processors get better gaming performance due to the 3D V-Cache tech. If you are a dedicated gamer who wants the highest frame rate per dollar, it's likely that an X3D chip is on your radar. It was on mine before I upgraded just a few months ago and my performance has been outstanding for the price.
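
If you want to put 'frame rate per dollar' on a slightly firmer footing when weighing up chips, the sums are trivial. Here's a throwaway Python helper; the numbers in it are entirely hypothetical placeholders, not benchmark results or real prices.

```python
# Toy "fps per dollar" comparison. The figures below are hypothetical
# placeholders, not real benchmark results or prices.
def fps_per_dollar(avg_fps: float, price_usd: float) -> float:
    return avg_fps / price_usd

candidates = {
    "Chip A (X3D-style)": (200, 480),  # (average fps, price in USD)
    "Chip B (non-X3D)":   (170, 330),
}

for name, (fps, price) in candidates.items():
    print(f"{name}: {fps_per_dollar(fps, price):.2f} fps per dollar")
```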

On the flip side, Intel's sales figures indicate its chips aren't all that popular, especially with the latest wave not performing great. All listings of the Intel Core Ultra 5 245K, which launched in October, have sold fewer than 100 units combined, and it's a similar figure for the Core Ultra 9 285. Mindfactory's best-selling Arrow Lake chip is the Core Ultra 7 265K, but that's managed just over 200 sales across all listings.

Not that it's been much better for Intel's last-gen processors, as the Core i9 14900K has sold a mere 2,950 units. Ouch.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/a-german-retailer-reports-the-ryzen-7-9800x3d-has-outsold-the-entire-non-x3d-9000-line/ qQGVhcnCEXGzWXm7eCGaHN Fri, 03 Jan 2025 15:48:46 +0000
<![CDATA[ PC Gamer Hardware Awards: The best gaming CPU of 2024 ]]>

Check out more of the year's best tech in our PC Gamer Hardware Awards 2024 coverage.

This year, there has been a glut of fresh CPUs from AMD and Intel for desktop PCs, with both vendors launching new processor architectures and model variants of older designs.

Over in the Team Red corner, 2024 started with a Zen 3 chip, in the form of the Ryzen 7 5700X3D. Essentially nothing more than a processor not quite good enough to be sold as a Ryzen 7 5800X3D, it's arguably better because it's a lot cheaper but not much slower.

Zen 5 made an official appearance in August, to a somewhat muted reception. That's because despite being a complete architectural overhaul, it isn't that much faster than Zen 4. But for sheer processing power, the Ryzen 9 9950X is hard to beat. Particularly when it uses less power than the competition.

We all had high hopes for Intel's Core Ultra 200S chips but it's fair to say that they missed the mark. Truth be told, most PC gaming enthusiasts were really only interested in one CPU and when the Ryzen 7 9800X3D finally appeared, we weren't disappointed. AMD's third generation of 3D V-Cache is just as magic as ever.

I've specifically mentioned these three processors because those are the nominees for best gaming CPU of 2024. We'll announce the winner on New Year's Eve but for now, let's see why they were chosen.

Best gaming CPU 2024: the nominees

AMD Ryzen 7 5700X3D
The very first Zen 3-based CPU appeared in November 2020 so when AMD announced yet another model in that old lineup, we were pleased to see the AM4 socket still being supported with new processors. Technically, this one wasn't new as its progenitor, the Ryzen 7 5800X3D, appeared in April 2022.

Not every chip off the manufacturing line makes the cut but that doesn't mean they can't be used. In this instance, the Ryzen 7 5700X3D is just a 5800X3D but with 400 MHz knocked off the base and boost clocks. Other than that, it's the same CPU. Except it's considerably cheaper—as much as $200 less at some points in the year.

As a drop-in upgrade to any AM4 gaming PC, it's a fantastic option, as that stack of 3D V-Cache can substantially boost the performance of many games. For anyone looking to build a budget gaming rig, it's a no-brainer.

Read our full AMD Ryzen 7 5700X3D review.

AMD Ryzen 7 9800X3D
We knew it would launch at some point in 2024 but after the slightly disappointing Zen 5 launch, we did wonder if the 3D V-Cache version of the Ryzen 7 9700X would be worth the wait. Well, it was and it's fair to say that the Ryzen 7 9800X3D single-handedly made everyone forget about Zen 5's muted uplift over Zen 4.

Most of that is down to AMD's third-generation 3D V-Cache. The full redesign shifted the placement of the extra L3 cache from on top of the core chiplet to underneath the whole thing. That removed the thermal barrier that prevented the previous-gen Ryzen 7 7800X3D from being clocked higher.

It's not particularly cheap, mind, and there's always the risk that the Zen 5 Ryzen 9 X3D models, which should appear early next year, will be even better. But even if that turns out to be the case, there's no denying that this is the CPU that most gamers want right now.

Read our full AMD Ryzen 7 9800X3D review.

AMD Ryzen 9 9950X
At first glance, AMD's Zen 5 CPU architecture doesn't seem to be notably different to Zen 4, and the Ryzen 9000-series of processors all seemed very 7000-series-like. But AMD had rejigged almost every part of its core design and in the case of the Ryzen 9 9950X, the updates result in a mighty powerhouse of a chip.

Of course, 16 cores and 32 threads aren't really needed for gaming, but if one games and works on the same PC, then there's nothing to touch the 9950X when it comes to sheer processing grunt. The fact that it does all of this without ever consuming more than 230 W of power is a remarkable feat of engineering.

3D V-Cache might get all the gaming kudos but the 9950X is no slouch—it's really only bettered by the power-devouring Core i9 14900K (and the 9800X3D, of course), and even then only in certain games. You'll need to pay a small fortune to own a Ryzen 9 9950X but you won't be disappointed by what it can do.

Read our full AMD Ryzen 9 9950X review.

The winner of the PC Gamer Hardware Award for the best gaming CPU will be announced on New Year's Eve. AMD wins no matter what but only one of its 2024 CPUs can be crowned best.

]]>
https://www.pcgamer.com/hardware/processors/pc-gamer-hardware-awards-the-best-gaming-cpu-of-2024/ aKiU8LutQesSzvTVTkVLbF Fri, 27 Dec 2024 14:00:00 +0000
<![CDATA[ Intel quietly slips out another Raptor Lake refresh with the Core 200-series mobile CPU lineup ]]> There are two key denominations in Intel's latest mobile processor lineup: Core and Core Ultra. The Core Ultra processors include newer architectures such as Lunar Lake, Arrow Lake, and Meteor Lake, while the plain Core chips contain an architecture we're extremely familiar with. Yep, it's Raptor Lake, and there are more chips on the way.

The new Core 200H-series has just hit the Intel website. These new product listings give us all the information we could need about the new mobile processors, which will make up the mainstream of the laptop market.

The Core 200H-series chips include up to 14 cores: six Performance-cores (Raptor Cove) and eight Efficient-cores (Gracemont). That's more than what's available in the socketed 100-series (up to two P-cores and eight E-cores) but a match for the 100-series embedded processors (up to six P-cores and eight E-cores).

The top 200H-series chip, the Core 9 270H, is rated to a significantly higher clock speed than any 100H-series chip at 5.8 GHz.

You can see some of the differences between the mobile (socketed) 200-series and 100-series chips in this table, spotted by momomo_us on X (via Videocardz).

So, why does the Core 200H-series exist? In some ways it's set to be the replacement for some of the 14th Gen mobile processors. It's tough to say exactly how laptop manufacturers will position the 200H-series but there are a few worthy replacements in the lineup. Take the Core 9 270H. This chip offers a decent improvement over the Core i5 14500HX, with the same number of cores but a much higher clock speed and a lower TDP.

The problem is that the Series 2 chip is billed with a recommended customer price of $697, while the 14th Gen chip is $337. That doesn't quite add up to me, considering both use the same, now-outdated architecture.

What makes things more complicated is that these aren't the only 200H-series processors we're expecting to hit the market. The Core 200H-series Raptor Lake chips are here first, but Intel has already confirmed that Core Ultra 200H-series and 200HX-series chips are on the way next year. These are built around the newer Arrow Lake architecture, which is a hybrid of its Meteor Lake (Series 1) and Lunar Lake (Series 2) designs.

If you find these codenames, architectures, and generations confusing, I dare say you wouldn't be the only one. Intel has really made a meal of it. These latest launches are only going to confuse matters for customers.

Here's a table of the most prevalent mobile series, to help clear things up:

There are actually more processors than those noted above, including the UL and HL embedded chips, but I've stuck to what you might feasibly run into as someone looking to buy a gaming laptop.

Your next machine

Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

Check your CPU spec before you head to the checkout—that's what I'd recommend from here on out. We might see a few more mainstream laptops take advantage of these socketed Core chips than we have seen to date, as the 14th Gen is generally pushed out of the market. Nevertheless, we're most likely to run into a Core Ultra 200H or 200HX chip inside a gaming laptop and those should be just fine for our frame rate requirements.

For a quick recap, Arrow Lake H/HX includes the same core architecture as Lunar Lake (Lion Cove and Skymont). They'll include up to 8 P-cores and 16 E-cores. However, these chips skip over the new Xe2 GPU found in Lunar Lake and feature an older Xe GPU with some AI acceleration thrown in for good measure.

Stay tuned for CES 2025 at the start of January if you're eyeing up a new gaming laptop. That's when we'll know more about the many models hitting the market.

]]>
https://www.pcgamer.com/hardware/processors/intel-quietly-slips-out-another-raptor-lake-refresh-with-the-core-200-series-mobile-cpu-lineup/ 9TtheTpxQF3MPMH7mLwni7 Thu, 19 Dec 2024 12:34:38 +0000
<![CDATA[ Arm pushes back against Qualcomm in court, claiming it's not out to be a chip competitor and the current licence situation is losing them $50 million in revenue ]]> The Arm vs Qualcomm legal battle has been ongoing for a couple of years now (can you believe it?). And while nothing's been settled, Qualcomm now seems to be assigning a motive to Arm, one which the semiconductor and software design company has pushed back against.

According to Reuters, in court on Monday, Qualcomm's legal team argued that part of the reason for the dispute is that Arm desires to design its own chips and Qualcomm would therefore be a competitor.

Arm CEO Rene Haas was apparently dismissive of the documents presented supporting this. He said that while Arm doesn't build chips, it is always considering different possible strategies, stating: "That’s all I think about, is the future."

The dispute all started in 2022 when Arm filed a lawsuit against Qualcomm, claiming the chip company didn't have the proper licence to use some of the Arm designs it was using. These designs came from a company Qualcomm had acquired in 2021 called Nuvia. Arm thinks that, given this, all chips made using these designs should be destroyed.

Over the course of the dispute, however, Qualcomm has argued that it already had its own broad-ranging licensing rights that should cover chip production based on such designs.

Arm argues that some Qualcomm devices, such as those with Snapdragon X processors, use designs that Nuvia held the licence for before it was bought by Qualcomm, and that once it was bought, that licence expired. Perhaps crucially, Nuvia's original licence was meant to be for server chips, not the consumer ones the designs have ended up in.

Because of this, Arm believes that Qualcomm isn't paying properly for the designs, despite Qualcomm's protestations that its own licensing should cover it.

The documents shown in court this week apparently showed, or attempted to show, that Arm has lost about $50 million in revenue because of Qualcomm's acquisition of Nuvia. That's $50 million that Arm could (so the argument goes) be getting in addition to what Bernstein analyst Stacy Rasgon (via Reuters) says is the $300 million that Qualcomm already pays the company.

Both companies obviously have plenty of incentive to win this court case, and I for one can see the argument on both sides. On the one hand, Qualcomm already had a licence and it now owns Nuvia, so one could argue these designs should fall under Qualcomm's pre-existing licensing. On the other hand, Qualcomm isn't paying what Arm seems to have originally intended these Nuvia-bought designs to be worth, nor what it originally agreed to sell them for.

Licensing is a complicated thing, and I'm glad I'm not the judge and jury presiding over this case. Especially when $50 million is the stake at the heart of the matter.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/arm-pushes-back-against-qualcomm-in-court-claiming-its-not-out-to-be-a-chip-competitor-and-the-current-licence-situation-is-losing-them-usd50-million-in-revenue/ JMWvsbsHnqJ7tZwBfeB6yc Thu, 19 Dec 2024 12:26:28 +0000
<![CDATA[ Intel reveals the four fails of Arrow Lake in a new blog post, promising more performance fixes in January ]]> If you're a regular reader of our hardware news and reviews, you'll know that Intel's much-vaunted Core Ultra 200S series of processors not only failed to impress at launch but it's seemingly been left to flag against the competition. In a new blog post today, which we've had early access to, Intel says that while the former is undoubtedly true, it has been busy identifying what went wrong, with four key issues noted and fixes published.

Arrow Lake's gaming performance was known to be lower than last-gen Raptor Lake's before the testing even commenced because Intel openly said that this would be the case when it launched its new CPU architecture in October. The primary goal for multi-tiled processors was lower power consumption in games, while still offering generational performance gains in multithreaded workloads.

The latter was certainly achieved but once we all got our hands on the Core Ultra 9 285K and Core Ultra 5 245K (replacements for the Core i9 14900K and Core i5 14600K), it was clear that Intel was experiencing very different gaming performance to reviewers. After a while, though, and after I had spoken with Intel about my results, the truth of the matter came to light: Arrow Lake wasn't as good as expected and something was obviously amiss.

In the update, Intel notes that its engineers have identified four specific issues that were causing significant problems with gaming performance.

Missing Performance & Power Management (PPM) package

A presentation slide details the core improvements in Intel's Arrow Lake CPU architecture

(Image credit: Intel Corporation)

Modern CPUs can operate in all kinds of performance states but they require active support by the operating system in order to function properly. This is done via a PPM package—think of it as a driver that controls clock speeds and other timings, depending on what power settings the OS is using.

Intel didn't schedule the PPM for Arrow Lake to appear in a Windows Update in time for reviews and retail availability of the Core Ultra 200S chips. The results? In Intel's words, "Unusual CPU scheduling behaviour; artificial performance increases when cores are disabled or affinitized; high run-to-run variation in benchmarks; reduced single-threaded performance; intermittent DRAM latency spikes; unexpected performance differences between Windows 11 23H2 and 24H2."

All of these problems stemming from a 'simple' power profile go to show just how complex today's CPUs really are, but the fact that Intel didn't schedule the profile to appear in a Windows update in time for launch is somewhat annoying to read.

At least these all should be a thing of the past now, as the Windows 11 update KB5044384 apparently contains the profile and Intel says that alone can address up to 30% of performance loss against expectations.
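
If you're not sure whether your machine already has that update, you can query installed hotfixes from PowerShell; the Python snippet below simply shells out to the standard Get-HotFix cmdlet. It's a convenience sketch of ours rather than part of Intel's tooling.

```python
# Check whether a specific Windows update (by KB number) is installed.
# Uses PowerShell's standard Get-HotFix cmdlet; Windows-only.
import subprocess

def is_update_installed(kb_id: str) -> bool:
    result = subprocess.run(
        ["powershell", "-NoProfile", "-Command", f"Get-HotFix -Id {kb_id}"],
        capture_output=True, text=True,
    )
    # If the KB is present, its ID appears in the cmdlet's table output.
    return kb_id.lower() in result.stdout.lower()

if __name__ == "__main__":
    print("KB5044384 installed:", is_update_installed("KB5044384"))
```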

Intel Application Performance Optimizer could not take effect

A cropped screenshot of Intel's Application Optimizer tool for games

(Image credit: Intel Corporation)

Intel's processors have been using a hybrid architecture (aka full performance, high power P-cores and lower performance, lower power E-cores) for four generations of processors, including Arrow Lake, and they all require a bespoke thread scheduler to ensure that the correct cores are handling a game's threads, to ensure best performance.

Application Performance Optimizer (APO) is a tool that manages that process for specific games and Intel was using it for its pre-launch performance data. However, the missing PPM meant that APO just wouldn't do anything, leaving up to 14% of game speed off the table.

With the PPM now in the KB5044384 update it should work but it's worth noting that few reviewers, myself included, ever use APO when first testing an Intel CPU. Oddly enough, when I did try APO on a 285K, it did work so I can only assume I had the Windows update by then.

For actual gaming, though, it does make sense to install APO and have it optimise thread scheduling in the games it supports. You're getting a free performance boost, after all.

BSODs when launching games using Easy Anti-Cheat service

The Fortnite Battlebus.

(Image credit: Epic Games)

I didn't experience this issue while reviewing the Core Ultra 9 285K and Ultra 5 245K but that's simply because no games in our CPU benchmark suite use Epic's Easy Anti-Cheat service. However, it turns out that there's a conflict between Windows 11 24H2 and older versions of EAC, which Epic has now addressed and is pushing the update out to developers.

That's good news for fans of Fortnite, for example, but this doesn't seem to be something that's symptomatic of Arrow Lake, just Windows. It may well be the case that it is, but only with the 24H2 update; for anyone using 23H2, the EAC fix won't change any game's actual performance.

Performance settings misconfigured on reviewer/early BIOSes

Testing Arrow Lake was an exercise in frustration, as motherboard vendors kept releasing new BIOS updates throughout the pre-review period. I noticed multiple differences between how motherboards were configured to run Core Ultra 200S chips—for example, the ring bus clock is meant to be constant but one manufacturer set it to scale down, as required with Raptor Lake.

Intel says it had noticed that so-called 'very important settings' were inconsistently being applied, such as resizeable BAR, Intel APO, compute tile ring clock, memory controller ratio (aka gear), and sustained and transient power limits.

That's no small list and together they account for issues like aberrantly high memory latency (up to two times expected values), erratic ring clocks, high run-to-run deviation for dynamic or unpredictable workloads, and no performance uplift from BAR or APO.

Apparently, this has also been resolved and "current BIOS releases for Intel Z890 motherboards have harmonized these settings." However, I'm currently using an MSI MEG Z780 Hero motherboard with its latest BIOS, but it's still letting the ring clock vary. Whether this is an Intel or MSI problem, I can't tell at this point because I only have MSI boards to hand.

More to come

(Image credit: Intel Corporation)

Intel rounds off its blog by saying that it has identified further BIOS improvements that it's currently in the process of validating and is targeting a firmware release, with all these updates and microcode 0x114, sometime in late January. It plans on presenting all of these findings in detail at CES 2025.

Once all of that takes place, I will retest the Core Ultra 9 285K's gaming performance, just to see how much better it is after all this work. Whether it's enough for me to be able to recommend an Arrow Lake chip over the likes of the Ryzen 7 9800X3D for gaming or the Ryzen 9 9950X for content creation is another matter entirely.

I'm genuinely pleased that Intel has worked on resolving Arrow Lake's problems but time will tell if it's a case of too little, too late.

]]>
https://www.pcgamer.com/hardware/processors/intel-reveals-the-four-fails-of-arrow-lake-in-a-new-blog-post-promising-more-performance-fixes-in-january/ pRMoX4fkx6sGwMAT5P5VdW Wed, 18 Dec 2024 19:00:10 +0000
<![CDATA[ All of today's mighty CPUs owe a debt of gratitude to the Intel 8080, which just turned 50 ]]> Just six years after its founding, Intel created a milestone in the history of computing. Instead of making chips for very specific purposes, it designed a processor that could be used in any scenario. Launched as the Intel 8080, it would go on to be recognised as the world's first general-purpose microprocessor and 50 years on, Team Blue is celebrating the success of the little chip.

For anyone relatively new to the world of gaming PCs, or just computers in general, it's probably hard to picture just how much processing technology has changed over the years. But as someone who's four years older than the Intel 8080, I've been fortunate to live through those advancements and experience them first-hand as they appeared.

My first PC was powered by an Intel 8080, though by that time the chip was already 15 years old. It was an 8-bit processor, not too dissimilar to the Zilog Z80 chip that I'd spent many years programming before getting a PC. There was a good reason for that similarity: both processors were designed by Federico Faggin, who left Intel in 1974 to set up Zilog.

Anyway, leaping forward to the present, Intel has marked the 50th anniversary of the 8080 with a short blog and a somewhat spurious infographic, in which some of the specifications of the world's first general-purpose processor are compared to those of Intel's latest Core Ultra 200S chips.

For example, where an Arrow Lake CPU comprises 17.8 billion transistors, with a minimum feature size of 3 nanometres, the Intel 8080 housed up to 6,000 transistors and a feature size no smaller than 6 micrometres. Or, if you want to use the same scale as the Core Ultra chip, the old processor had 0.000006 billion transistors and a minimum feature size of 6,000 nanometres.

Naturally, the modern CPU is quite a chunky fella in comparison to the old boy, with a total die area of 243 square millimetres. The Intel 8080 was just 20 square millimetres. But while Arrow Lake takes up 12 times more area, it does pack in roughly three million times more transistors. Thank the advances in silicon lithography for that, or in the case of Arrow Lake, thank TSMC.
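
Those ratios are easy to sanity-check from the figures quoted above (which come from Intel's own infographic, so treat them as marketing-grade rather than gospel); a quick back-of-the-envelope calculation:

```python
# Quick sanity check of the comparison using the figures quoted in the text.
arrow_lake = {"transistors": 17.8e9, "die_area_mm2": 243, "feature_nm": 3}
intel_8080 = {"transistors": 6_000, "die_area_mm2": 20, "feature_nm": 6_000}

print(f"Transistor ratio: ~{arrow_lake['transistors'] / intel_8080['transistors']:,.0f}x")       # ~2,966,667x
print(f"Die area ratio:   ~{arrow_lake['die_area_mm2'] / intel_8080['die_area_mm2']:.1f}x")      # ~12.2x
print(f"Feature size:     ~{intel_8080['feature_nm'] / arrow_lake['feature_nm']:,.0f}x smaller") # ~2,000x
```

So 'roughly three million times more transistors in 12 times the area' checks out, which also means transistor density has improved by a factor of around 244,000.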

Modern chip manufacturing allows transistors to switch at ridiculous speeds across the whole processor, and the P-cores in the likes of the Core Ultra 9 285K can reach 5.7 GHz. The Intel 8080 initially launched with a 2 MHz (0.002 GHz) clock speed, though later versions could reach over 3 MHz.

The Intel 8080 wasn't just for PCs, though, as it used to power devices like electronic cash registers and arcade game machines, and in some respects that hasn't changed. Intel's processors are the norm for embedded systems across the world, though when it comes to gaming machines, that market almost entirely belongs to AMD.

I have to say that I never quite got to grips with my old Intel 8080 and it put me off PCs in general for a few years. By the time I jumped back in, processors (think original Pentium) were so much better that the 8080 felt like an ancient relic. Which, of course, it is, but the modern x86 architecture can be traced all the way back to the old 8-bitter, so it's more than worthy of this celebration.

If you fancy seeing an Intel 8080 in action, this online emulator does a great job and you can load up some code to play Space Invaders—which just so happens to be one of the best-selling arcade games of all time, all powered by the little 8080.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/all-of-todays-mighty-cpus-owe-a-debt-of-gratitude-to-the-intel-8080-which-just-turned-50/ GC7VKbpdzW737beGfskGq3 Tue, 17 Dec 2024 13:29:46 +0000
<![CDATA[ Pat Gelsinger rails against claims some chipmakers are struggling to produce good wafers: anyone using yields as a % 'doesn't understand semiconductor yield' ]]> The $10 million he's reportedly receiving in severance pay must soften the blow, but I can't help thinking ex-Intel CEO Pat Gelsinger must feel the sting of Intel's current situation and all the reporting of its dire straits. The company is, after all, one he guided on a new journey towards increasing reliance on chip fabrication.

It therefore wouldn't surprise me if Gelsinger's recent comments on X (via Hot Hardware) regarding fabrication yields, while ostensibly about TSMC yields, were in part intended as a response to recent chatter about low Intel 18A yields.

Gelsinger says: "Speaking about yield as a % isn't appropriate. Large die will have lower yield, smaller die - high yield percentage. Anyone using % yield as a metric for semiconductor health without defining die size, doesn't understand semiconductor yield. Yields are represented as defect densities."

This X post comes just a few days after Korean outlet Chosun Daily reported that Intel's 18A process has just 10% yields, and just one day after it was picked up by the Western media at large—we also reported on it at PC Gamer but were sceptical about the claims because no sources were cited by Chosun Daily.

Whether coincidental or pointed, Gelsinger's clarification is apt. "Yield" is the proportion of a wafer that's usable for chips. But, as Gelsinger points out, the very same fabrication process can produce vastly different yields depending on the size of the dies being made: for a given defect density, the smaller the dies cut from the wafer, the better the yield. That's why defect density, rather than a bare yield percentage, is the more meaningful measure of a node's health.

The original 10% claims seemed to be related to Broadcom's tests with the 18A node. But, as Tom's Hardware points out, Broadcom is known for its use of gigantic chips. So one would expect lower yields for such customers. Plus, the 10% calculation only seems to factor in "perfect" dies, when in reality even many imperfect ones are usable as chips.

What really matters is that defect density number, which Gelsinger has previously been reported as putting below 0.4 defects per square centimetre. Even if that did lead to a "low" percentage yield for customers making giant dies, other comparable processes, such as TSMC's N2 at a similar point in its development, would show a similarly "low" yield. The point is that, from all we know (admittedly mostly based on Gelsinger's own words), 18A yields look perfectly fine compared to other new fabrication processes.
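
To make the die-size point concrete, here's a minimal sketch using the classic Poisson yield model, Y = exp(-D x A), where D is defect density and A is die area. The two die sizes are made-up illustrative values, and the 0.4 defects/cm² figure is simply the number attributed to Gelsinger above, not disclosed 18A data.

```python
import math

def poisson_yield(defect_density_per_cm2: float, die_area_mm2: float) -> float:
    """Classic Poisson yield model: fraction of dies with zero defects, Y = exp(-D * A)."""
    die_area_cm2 = die_area_mm2 / 100.0  # 100 mm^2 per cm^2
    return math.exp(-defect_density_per_cm2 * die_area_cm2)

D0 = 0.4  # defects per cm^2, the figure attributed to Gelsinger
for name, area_mm2 in (("small mobile die", 100), ("huge HPC die", 800)):
    print(f"{name} ({area_mm2} mm^2): ~{poisson_yield(D0, area_mm2):.0%} defect-free")
```

At the same defect density, the small die comes out around 67% defect-free while the big one is closer to 4%, which is exactly why quoting a bare yield percentage without stating die size tells you very little about the health of the process.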

It can be easy to jump on the "sinking ship" bandwagon when it comes to Intel. There's certainly plenty of reason to do so. But we can't say that about 18A yet. It could end up being the one thing that keeps Intel afloat, given Gelsinger admitted he "bet the whole company on 18A". And while I'm no gambling man, it still seems all bets are off.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/pat-gelsinger-rallies-against-claims-some-chipmakers-are-struggling-to-produce-good-wafers-anyone-using-yields-as-a-percent-doesnt-understand-semiconductor-yield/ ayKsf7dzaEpDDRR5iFiNEE Thu, 12 Dec 2024 10:52:26 +0000
<![CDATA[ AMD's flagship Strix Halo APU makes its Geekbench debut with a monstrous name and RTX 4060 levels of GPU compute ]]> We've been writing about Strix Halo for so long, without actually seeing any working hardware, that you'd be forgiven for believing that it's just vapourware. But with the first appearance of AMD's flagship laptop APU in the Geekbench database, we finally have some numbers to pore over. And a decidedly cumbersome name.

The moniker in question is the AMD Ryzen AI Max+ Pro 395 w/ Radeon 8060S, though the model's actual name doesn't have the Radeon part. But even so, saying 'I have a Ryzen AI Max+ Pro' before you even get to the specific version is hardly silky smooth marketing.

There's a good reason why desktop CPUs are just Ryzen 5 or Core i7, and this just feels like AMD's marketing division has been let loose with a box of crayons and a colouring book. I kid; it's clearly worked very hard to come up with the name.

Anyway, we already knew about AMD's clunky nomenclature and the name is ultimately irrelevant when compared to the fact that this is an APU with one heck of a CPU and GPU combination. Sporting two Zen 5 CCD chiplets for 16 cores, 32 threads, and 32 MB of L3 cache, the CPU side of things is going to be seriously potent. Not quite as good as a desktop equivalent, due to having half the amount of L3 cache (if the Geekbench info is correct), but still very nice indeed.

But it's the GPU that's the star of the show. RDNA 3.5 architecture, 40 compute units, and a 256-bit memory bus. That's not just massively better than any current laptop or handheld gaming PC APU—we're talking console-levels of performance here.

Well, potentially. We still don't know how well it'll run games, of course, but we now have a bit of insight into its compute performance, thanks to Geekbench (and BenchLeaks on X).

Someone has uploaded a score for the benchmark tool's Vulkan GPU compute test and the figure of 67,004 is not to be sniffed at. For example, this RTX 4060 laptop in the database achieved a score of 64,587 so purely based on my utterly unscientific sample of one, Strix Halo's GPU is better than an RTX 4060 (direct comparison).

Of course, that's a silly claim to be making, because one could spend a while browsing through the database (longer than necessary, given the lack of filters…) and eventually find an RTX 4060 result that's much better.

Geekbench doesn't show important values such as GPU clock speeds, power settings, or driver version. And it's purely a synthetic compute test, using Vulkan.

The computer's name in the Strix Halo result, "AMD MAPLE-STXH", strongly suggests that this is an engineering sample, so it's likely to be running with lower-than-retail clock speeds and a power setting that's aimed at stability, rather than performance.

One Geekbench result doesn't make for a mighty APU, no matter how good the score is, but I should imagine this will be the start of a flood of entries, and over the coming weeks we'll get a much better view of how good Strix Halo is going to be.

The Ryzen AI Max+ Pro 395 is going to be very expensive but I can't wait to see just how good it is at gaming. Let's just hope that there's some monstrous performance to match that monster of a name.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/processors/amds-flagship-strix-halo-apu-makes-its-geekbench-debut-with-a-monstrous-name-and-rtx-4060-levels-of-gpu-compute/ BJQMi47aLgPLbSaKR7XwNo Tue, 10 Dec 2024 14:48:12 +0000
<![CDATA[ Intel CEO Pat Gelsinger is gone but will Intel's chip fabs follow him out the door? ]]> When Intel announced the return of Pat Gelsinger as CEO back in 2021, it was to pretty universal acclaim. But now he's gone and it feels like all hope is in tatters. So, what does it all mean for Intel? If Intel's arch rival AMD is any guide, you can kiss goodbye to Intel's fabs.

Hold that thought. Gelsinger's return in 2021 seemed like a hugely positive development, and that arguably came down to two things. First, he was an engineer and not a money man. Second, he was a great communicator.

For what felt like the better part of a decade, Gelsinger's keynote was the unambiguous highlight at the annual Intel Developer Forum techfest in San Francisco. He was passionate, engaging, convincing. To hear Gelsinger speak was to believe the future of tech was bright and that Intel would be driving it.

But now Gelsinger is toast and it's not hard to understand why. As things stand right now, Intel has very little that's truly tangible to show for the Gelsinger era, at least from an outside perspective.

The company's financials have been getting worse and worse, with its most recent results recording the largest loss in Intel's history. Meanwhile, it's losing market share to AMD in its core CPU segments, and it doesn't even seem able to get the basics right, following a huge debacle involving crashing and instability problems with its bread-and-butter desktop CPUs and delays launching key server chips.

The final ignominy in all this was Intel being replaced by Nvidia in the totemic Dow Jones Industrial Average. Yes, there has been the odd ray of light. The new Lunar Lake laptop chip is pretty good, for instance. But it apparently doesn't make Intel much money, thanks to being mostly made by TSMC and using on-package RAM, so Gelsinger has said Intel wouldn't do it that way again.

At the heart of all this are Intel's problematic fabs, the industrial units where chips are actually manufactured. The harsh reality is that those fabs have, to a greater or lesser extent, been dysfunctional for the better part of a decade.

More importantly, there's limited hard evidence that the problem has been fixed. Gelsinger's plan involved what he claimed would be five new silicon production nodes in four years.

However, the reality was that the plan only involved two truly new nodes, of which just one has entered limited production in chips you can actually buy. Meanwhile, Intel increasingly relies on TSMC to manufacture its products, which says all you need to know about what Intel thinks of its own fabs.

Indeed, by Intel's own estimations, we're still three years away from the bulk of its production capacity being represented by those new nodes and that's presumably a best case scenario. So, the big question is where Intel goes from here.

Under Gelsinger, the plan was ultimately two-fold. First, return Intel to technology leadership when it comes to manufacturing chips, investing around $100 billion into its fabs in the process. Second, use that leadership to expand into contract manufacturing for customers and therefore take on Taiwanese uber-fab TSMC head on.

As Gelsinger departs, neither of those objectives has been even partially achieved, let alone unambiguously. Intel remains at least a node behind TSMC and it's yet to be proven that major players in the chip business are ready to give their business to Intel.

All the while, and perhaps most shockingly, Intel has conspired to almost entirely sidestep the most lucrative new trend in the computing industry, namely AI training and inferencing using GPUs. Intel barely has a foothold in that market and Gelsinger has essentially dismissed Nvidia's multi-billion dollar profits making AI chips as a fluke.

Nvidia Hopper GPU die

Intel has failed horribly to cash in on the AI trend that has seen Nvidia become the world's most valuable company. (Image credit: Nvidia)

Not good, is it? The counterpoint to all this is that Intel still dominates the market for traditional PC processors and server CPUs, a market that isn't going anywhere and, unlike AI GPUs, isn't at any risk of suddenly falling out of fashion.

What's more, there is actually a handy case study Intel can turn to, namely its arch rival AMD. If anything, AMD was in an even more parlous state than Intel is now and has managed to entirely turn its business around.

Notably, AMD did that by spinning off its own troublesome fabs and focusing on simplifying and rationalizing its product range. Internally, Intel has already separated its fabs into a standalone business unit.

Most recently, Intel announced various measures to scale back aspects of its massive investment into fabs including "pausing" plans to spend $30 billion on a fab in Germany, a supposedly temporary development that many observers suspect will become permanent.

So, a widespread assumption is that with Gelsinger's departure, the demise of his Intel Foundry dream is inevitable. The fabs will be sold off, maybe even pseudo-nationalised if the US government sees them as sufficiently important in strategic terms.

Indeed, chip fabs are generally seen as so important that the usual concerns over anti-trust may not apply. So, any number of mergers that normally wouldn't seem like a goer could be possible.

For sure, the product side of Intel's business looks healthy and could be making good money as a fabless operation farming all its production out to TSMC.

Intel wafer capacity

Intel's own chip production capacity plans are revealing to say the least. (Image credit: Tom's Hardware)

The problem is that the fab side of the business is losing $5 billion and upwards a quarter and isn't worth anything right now. The fabs' only substantial customer is Intel itself and the critical 18A process that will supposedly put Intel back in something approaching a leadership position isn't due to go into volume manufacturing until late 2026. And that's obviously a best case scenario.

Meanwhile, the list of companies with the technical prowess and financial resources to take on Intel's fabs in any kind of acquisition is extremely short and probably only contains a single name. Long story short, there is no easy way out of this, which is why Gelsinger fell on his sword. If it was easy, he'd have fixed it.

The future for Intel is therefore very hard to predict. My best guess is that the company probably will be split in two and that the future of what is now Intel Foundry will likely depend on government intervention.

Should the US government view Intel's fabs as strategically critical, it may seek something along the lines of a partnership with TSMC that sees the latter take over the burden of running those facilities, in return for a life raft for its own operations should tensions with mainland China spill over into conflict.

That could ensure the US retains cutting-edge domestic chip manufacturing technology. Otherwise, Intel's fabs may go the way of AMD's, becoming producers of legacy nodes for cheap, bulk chips with no aspirations to compete at the bleeding edge.

Anyway, all we can say with any confidence is that Gelsinger's departure signals that Intel's current plan isn't working. Exactly what the future holds is impossible to say. But the simultaneously sad but reassuring truth is that it probably doesn't matter for the PC. What with AMD's success and the increasing incursion of ARM CPUs into the PC platform, Intel's ability to dictate what happens to the platform is almost certainly in terminal decline, whatever happens to the business itself.


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/processors/intel-ceo-pat-gelsinger-is-gone-but-will-intels-chip-fabs-follow-him-out-the-door/ arJeDzCNNoNyRdpAjCymHQ Thu, 05 Dec 2024 13:34:38 +0000
<![CDATA[ Intel CEO Pat Gelsinger retires: 'Leading Intel has been the honor of my lifetime' ]]> Intel CEO Pat Gelsinger is retiring, effective immediately. The boss who sought to lead the company's resurgence in the ever-challenging chip market has stepped down, and a permanent successor has yet to be found.

"Leading Intel has been the honor of my lifetime—this group of people is among the best and the brightest in the business, and I’m honored to call each and every one a colleague," Gelsinger says of the decision. "Today is, of course, bittersweet as this company has been my life for the bulk of my working career. I can look back with pride at all that we have accomplished together.

"It has been a challenging year for all of us as we have made tough but necessary decisions to position Intel for the current market dynamics. I am forever grateful for the many colleagues around the world who I have worked with as part of the Intel family."

Gelsinger's retirement caps a lengthy career at Intel, where he cut his teeth as an engineer and led the creation of some of its top chips. He later left to become CEO of VMware before heading back to Intel.

Gelsinger returned to Intel in 2021, replacing then-CEO Bob Swan. Swan had been in the role since first being appointed interim CEO and then permanent CEO, after Intel could not find a suitable replacement.

The lack of candidates for the top job could be a concern for Intel's board today, too. For now, it's naming two of its current leadership team to take over at the helm: Michelle Johnston Holthaus, head of the Client Computing Group (CCG); and David Zinsner, chief financial officer.

Holthaus has also been handed a new role as CEO of Intel Products, which covers the whole span of product-facing groups at Intel. Effectively running the show, it seems.

Intel faces an incredibly difficult market with competition from not only AMD but also ARM-based designs. A CEO search couldn't have come at a worse time. But then again, would it be in this situation if everything was shaping up and its recovery plan was working?

Pat Gelsinger holds an Intel Arc A770 graphics card.

Here's Pat with an Intel Arc graphics card. (Image credit: Intel)

The simple answer is no, it wouldn't. Gelsinger felt to many like the right pick for a hard job at the time of appointment—someone with an engineering background capable of turning Intel around, including its struggling manufacturing arm. That recovery plan has not yet paid huge dividends, and though it still might be ultimately the right course of action, Intel remains in a tough spot.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

"We are grateful for Pat’s commitment to Intel over these many years as well as his leadership," Zinsner and Holthaus say in a joint statement. "We will redouble our commitment to Intel Products and meeting customer needs. With our product and process leadership progressing, we will be focused on driving returns on foundry investments."

The challenges facing Intel are huge and extend well beyond the chipmaking firm itself. For example, Intel is embroiled in a battle to defend x86 against Arm, and it is similarly central to US policy efforts to move some chipmaking back to American shores.

For now, however, it does feel a shame that Gelsinger never saw his recovery plan through to its end. At its most ambitious, it would've seen Intel competing with TSMC for the top chipmaking crown and x86 proven to be top dog. As it stands, he retires from a company with more question marks over its future than perhaps ever before.

]]>
https://www.pcgamer.com/hardware/processors/intel-ceo-pat-gelsinger-retires-leading-intel-has-been-the-honor-of-my-lifetime/ N2fepv2R7r3kHhkhPoUV4Z Mon, 02 Dec 2024 15:31:07 +0000
<![CDATA[ Qualcomm reckons it will be flogging $4 billion worth of PC CPUs annually by 2029 which is about what AMD sold in 2023 ]]> Qualcomm had its annual investor day yesterday and with that comes news that the company expects to be selling no less than $4 billion in PC processors annually by 2029 (via CNBC).

If that sounds like a big number, it is. For context, AMD's most recent full-year earnings figures showed that it brought in $4.7 billion in sales for its "client segment" across the whole of 2023, a segment which mostly comprises PC processors and excludes PC graphics cards and chips for consoles.

AMD has admittedly upped its game since then, netting $1.9 billion in the last quarter alone for client PC sales. But Qualcomm's claims would very much put the company in the same ballpark as AMD for client PC processor sales. That would be some achievement, given Qualcomm was virtually starting from zero in the PC market when it launched the Snapdragon X earlier this year.

Intel, meanwhile, reported nearly $30 billion in sales for its "Client Computing Group" in 2023. But that includes not only PC processors, but chips for motherboards, cellular modems, Wi-Fi controllers and more.

You'd still expect Qualcomm's $4 billion to be quite a bit smaller than whatever Intel is currently pulling in from just CPU sales. But however you slice it, Qualcomm is expecting to sell a very large number of its PC processors by 2029. And that means a very large number of Arm-based PCs, presumably mostly laptops, being bought by actual end users.

For now, it's very hard to say how well sales of laptops with Qualcomm Snapdragon X chips have been going. But one key question is whether Qualcomm's success would grow the overall market for PC processors, or whether it would be a zero-sum affair, taking sales away from Intel, AMD, or presumably both.

The likely reality is a bit of both. But the other imponderable is whether Qualcomm will even be making PC processors by 2029. Qualcomm and Arm are currently fighting it out in the courts over Qualcomm's very right to make CPUs based on Arm's instruction sets.

Arm has said it intends to cancel Qualcomm's licence to make Arm-based CPUs. But we suspect that even if Qualcomm loses the legal case, the result is more likely to be a larger licence fee to Arm than Qualcomm actually ceasing to make the chips.

While all this is going on, we also have rumours that Nvidia is planning to enter the PC processor market, perhaps as soon as late 2025, with its own Arm-based chip. How that factors into Qualcomm's supposed $4 billion in annual PC processor sales isn't clear. But if Qualcomm is selling $4 billion worth and Nvidia is selling billions of dollars more in PC processors, well, the market sure is going to be crowded.

In the meantime, we're really waiting to see if anyone can get PC games running reliably and consistently on an Arm CPU of any kind. Both Apple and Qualcomm have proved that Arm cores can be more than competitive with AMD and Intel's x86 CPU cores for raw performance. But software compatibility, and game support in particular, remains the main stumbling block.

It will be interesting to see if big games publishers begin to release native Arm builds of their latest games any time soon. If PC gaming on Arm is to really take off, game support is going to be absolutely critical.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/qualcomm-reckons-it-will-be-flogging-usd4-billion-worth-of-pc-cpus-annually-by-2029-which-is-about-what-amd-sold-in-2023/ RD8gYUmHd3CDwQ6ZxmV8vj Wed, 20 Nov 2024 17:15:08 +0000
<![CDATA[ Nvidia's new 'AI PC' social media channel has me foaming at the bit for the in-house CPU we all want ]]> Everyone knows Nvidia's king of the very large and very frightening AI castle, but it hasn't stretched those tendrils fully into the consumer PC market just yet. Nvidia AI has usually been more of a datacentre thing.

That's until now, it seems, because Nvidia's just started a new social media channel angled squarely at the home computing AI segment, which lends more credence to recent rumours surrounding an all-Nvidia consumer PC processor.

The channel's on X and it's called "NVIDIA AI PC" (as spotted by VideoCardz). As its proverbial "hello, world", Nvidia explains: "Welcome to the new NVIDIA AI PC channel! Whether you are deep into AI or just a little curious, we're here to explore the power of AI on your local PC. Follow along for the latest news, tech, and AI inspiration."

It makes sense for Nvidia to lean its AI chops more heavily into the PC market. I mean, why wouldn't it? It's already secure atop the AI datacentre market and it certainly has enough money and expertise to smash into any other AI market segment.

The PC segment—which is already starting to become the AI PC segment—is a big one, and while there's lots of competition there already, mostly from the likes of Qualcomm, I'd bet Nvidia has the AI infrastructure and resources to weather that particular storm.

As for what form Nvidia's pushing into the AI PC segment might take, we're hopeful it'll result in a nice, shiny, black-and-green Arm-based processor running Windows. We've heard talk of just such a thing for over a year, so it's not a fool's hope.

It wouldn't just mean this, of course. Nvidia's got a whole AI infrastructure cooked up already for its datacentre servers, so the main thing will presumably be how it can bring this more directly, and in more varied fashion, to consumer PCs equipped with Nvidia hardware. Just like DLSS, for example.

On this front, at Microsoft Ignite, Nvidia just announced lots of new tools that "enable application and game developers to harness powerful RTX GPUs to accelerate complex AI workflows for applications such as AI agents, app assistants and digital humans." We're talking, for example, updates to ModelOpt to enable faster and more accurate AI models for PCs using RTX GPUs and updates to Nvidia ACE, which "brings life to agents, assistants and avatars."

But what would make for a better home for such things than an Nvidia AI PC? Especially if, as the latest rumours claim, Nvidia's first Arm APU will offer Strix Halo and RTX 4070 mobile performance. That would mean not only some great gaming performance, but also presumably some stellar AI compute, a perfect stomping ground for all the new AI PC tools Nvidia's launching.

That very same rumour also says the company is "trying to rush this thing out by late 2025 or 2026 at the latest," a claim that doesn't seem impossible given recent Windows on Arm improvements such as the fact it now supports AVX and AVX2 instructions.

This new Nvidia channel at the very least lends some credence to the notion that there might be an all-Nvidia AI PC sitting on our laps before too long. And if it does offer RTX 4070 mobile performance, that's an exciting prospect for us gamers, regardless of the AI aspect.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/nvidias-new-ai-pc-social-media-channel-has-me-foaming-at-the-bit-for-the-in-house-cpu-we-all-want/ 5FPFEgzQ4kzRG4eXZzCFdE Wed, 20 Nov 2024 15:26:50 +0000
<![CDATA[ The top ten best selling CPUs on Amazon are all AMD chips, with the two-year old Ryzen 7 5700X sitting at the tippety-top and Intel's best effort relegated to 12th place ]]> When you've been following PC gaming hardware for as long as I have, there are certain norms that have been around for so long that it's difficult to shake them off. For example, in the CPU market, traditional wisdom when I was younger was that AMD was often the plucky underdog, nipping at Intel's heels.

Over the past few years, though, that dynamic seems to have seriously shifted. A look at Amazon's best-selling CPU list right now reveals a top ten dominated by AMD processors, with Intel's best effort scraping in at number 12 (via Tom's Hardware). Topping the list is the AM4-socketed Ryzen 7 5700X, a Zen 3 CPU that (at its current price of a mere $130) seems to be treading the price/performance line rather nicely.

Let's not forget, while the 5700X might be on an older platform and have a couple of years under its belt, it's still a formidable eight-core, 16-thread processor that seems to have caught buyers' attention with its attractive pricing and excellent performance. Right under that is the six-core Ryzen 5 5600X, while the mighty Ryzen 7 7800X3D sits pretty in third place.

As for Intel? The Core i5 13600KF puts in the best Team Blue showing, but it only manages 12th place. It's an attractive chip for the money (currently selling at a discount for $175), but it's notable mostly for the sheer weight of AMD chips sitting above it.

An interesting new addition is the Ryzen 7 9800X3D, a processor that's only been out for five minutes yet has already made its way to fifth position. Given the stunning performance on offer that should probably come as no surprise, but it's not exactly cheap at $479.

Still, it seems to be shifting in serious numbers right now—which should probably come as some relief for AMD, as the rest of the Ryzen 9000 series reportedly suffered poor sales at launch earlier this year.

Intel's Arrow Lake chips have also only just made it to market, but their reception was less than glowing. Our Nick came away impressed with the power efficiency but little else when he reviewed the Core Ultra 9 285K and Core Ultra 5 245K, and the sales seem to reflect the poor reviews from multiple outlets around the chips' troubled launch.

You have to scroll down all the way to number 39 to find an Arrow Lake chip, where the aforementioned Core Ultra 9 285K makes a belated appearance. At $699, and with less than phenomenal performance in many benchmarks compared to the Core i9 14900K of the previous generation (currently available for $438 and sitting at number 15 in this list), it's a hard sell, that's for sure.

More bad news for Intel then. Whether this is a reflection of a lack of consumer confidence in Intel chips since the (now resolved) crashing debacle over the summer, or simply an acknowledgement of price/performance ratios driving sales overall, is unclear.

Still, what with Intel's ongoing woes over its operation as a whole, it doesn't bode well that the biggest retailer on the planet is showing buyers' preference for the competition in the desktop CPU market.

If I were a time-traveller, showing Young Andy™ this list would blow his tiny little mind. Either way, the times they are a-changin', and if this list is any indication of the market overall, it looks like Intel is on the back foot when it comes to CPU sales this time around.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/the-top-ten-best-selling-cpus-on-amazon-are-all-amd-chips-with-the-two-year-old-ryzen-7-5700x-sitting-at-the-tippety-top-and-intels-best-effort-relegated-to-12th-place/ 9DkVGuyyR9CpgYaotGyEwA Mon, 18 Nov 2024 12:17:56 +0000
<![CDATA[ MSI responds to reports of 9800X3D CPUs 'burning out' in its motherboards but the problem seems limited for now ]]> Is AMD's hot new Ryzen 7 9800X3D gaming CPU going into meltdown? Two reports from gamers experiencing dramatic CPU "burn outs" involving AMD's new wonder chip have emerged in recent days (via Videocardz). However, early indications are that user error or a narrow manufacturing defect is the likely explanation.

The 9800X3D aside, common to both burn outs is MSI's MAG X870 Tomahawk WiFi motherboard, prompting the company to put out a brief statement. "Recently, we received a user report indicating damage to an AMD Ryzen™ 7 9800X3D processor on an MSI MAG X870 TOMAHAWK WIFI motherboard. At MSI, we are fully committed to the quality of our products and have begun investigating this incident."

So, what exactly is going on? For now, nothing is certain. The two incidents, one reported on Reddit and the other on the Quasarzone forums, appear to be very similar. In both cases, the 9800X3D chip suffered what appears to be a short, resulting in substantial burn marks on the bottom of the CPU and on the CPU socket pins, and, shall we say, a failure to operate.

Perhaps more intriguing is visual evidence of fracturing or damage to the border of the CPU socket. Again, this is apparent to some degree in both cases and also appears to have occurred at the same or similar points on the socket border.

The implication, therefore, is that the CPU was not correctly seated when the socket bracket was levered down, leading the CPU to be incorrectly aligned. From there, it's not hard to imagine how various pins and pads could have shorted out, leading to burn marks and damage.

What's less certain, for now at least, is whether that misalignment was down to user error. That's certainly a possibility. Another option is that there could be a bad batch of AM5 sockets with manufacturing errors afflicting the socket border, in turn causing the socket misalignment.

One of the afflicted Ryzen 7 9800X3D owners commented, "At first, I thought I did it wrong, but there are more unnecessary parts that are not in the normal socket guide. I've assembled hundreds or thousands of units at my current job, but this is the first time I've encountered a guide injection defect like this."

If that's the case, it's possible that other motherboard models, and indeed motherboards from other manufacturers, could be affected. You would also expect the problem to hit not just the 9800X3D but other CPU models too.

If there is a large batch of bad sockets out there, it's likely we'll know soon enough, as reports of similar failures and burn outs will surely emerge. But based on the limited number thus far, we don't think message boards are about to be hit with an epidemic of burned-out AMD CPUs. And there's little reason, for now, to think the problem is specific to the 9800X3D.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/reports-emerge-of-amd-9800x3d-cpus-burning-out-in-msi-motherboards-and-a-batch-of-bad-cpu-sockets-could-be-to-blame/ Dsav2PgXntdihT2uv5L9z3 Fri, 15 Nov 2024 14:28:19 +0000
<![CDATA[ Nvidia's first Arm APU is said to offer Strix Halo and RTX 4070 mobile performance, with Alienware already onboard to create an all-Nvidia gaming laptop ]]> We've suspected that Nvidia's making an Arm processor for over a year now, but how it will be built, what form it will take, and what market it will target have remained something of a mystery. Now, however, there's possibly good news for us gamers because, rumour has it, these chips will be used for relatively low-power but decently performing gaming laptops.

This rumour does come from tech YouTuber Moore's Law is Dead (MLID), so prepare to crank those salt and pepper grinders. Apparently, though, a source from an Nvidia partner says the chip is "targeting up to 80 W", and MLID quotes another unknown source as saying, "Behind the scenes, Nvidia is comparing their new APU to an RTX 4070 laptop GPU running at ~65 W in gaming performance."

That's not all, though, because again according to MLID, the previously quoted Nvidia partner also claims Nvidia is "at least partnering with Dell under the Alienware brand" for the new Arm-based APU. What this would mean, presumably, is a low-power but high-performing Alienware gaming laptop using an Nvidia CPU + GPU and Arm on Windows.

MLID also quotes an Nvidia source as saying "we're trying to rush this thing out by late 2025 or 2026 at the latest". That ties in with what we'd heard previously, that an Nvidia APU should be entering production in 2025.

The Nvidia source also reportedly says this chip's going to be a "direct competitor to AMD's Halo APUs" and will have a "powerful NPU". And MLID clarifies that "they think it will be at Strix Halo performance at most, maybe a little lower".

Regarding the NPU: Of course it's going to have one. Because Nvidia and AI go together like Cherry and Bakewell (or Apple and Pie for the Americans in the audience). But an Nvidia APU with RTX 4070 mobile-level performance that will compete with AMD Strix Halo? That's the real surprise, here, and it certainly tickles my fancy.

AMD's Strix Halo chips (aka Ryzen AI Max chips) are going to be the company's most powerful mobile chips to hit the market, featuring a healthy dollop of RDNA 3.5 compute units. The latest leaks point towards at least three such chips, with the top-end one, the Ryzen AI Max+ 395, featuring a whopping 40 RDNA 3.5 CUs and 16 CPU cores.

For context, the Z1 Extreme APU found in handhelds such as the Asus ROG Ally X has 12 RDNA 3 CUs and 8 cores, and although it's not an apples-to-apples comparison, the RX 7700 XT has 54 CUs.

With Strix Halo, then, we're looking at close to high-end discrete GPU performance in a mobile chip. If this is what Nvidia's competing against, it might make sense for them to ensure their upcoming Arm APU delivers graphics performance akin to an RTX 4070 mobile GPU as these latest rumours suggest.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

Whether the chips themselves will be built by Intel or TSMC is still an open question, but with Microsoft's Qualcomm-only Windows on Arm exclusivity deal rumoured to be ending soon, we might not have to wait long to find out. If that exclusivity is indeed running out, it certainly makes sense that Nvidia is said to be partnering with Qualcomm rival MediaTek on these chips.

Another potentially exciting element of this, if it's true, is the Alienware aspect, and that's because an all-Nvidia Alienware gaming laptop could surely only work if Windows on Arm is up to snuff.

According to MLID's supposed Nvidia source, "there's a HUGE effort underway to make it work". I can kind of buy it, too, given all kinds of improvements are underway, such as an Insider build of Windows now supporting AVX and AVX2 instructions that should get more games up and running on Windows on Arm. Could Nvidia's APU prompt a proper transition to Arm x Windows gaming? Let's wait and see.

]]>
https://www.pcgamer.com/hardware/processors/nvidias-first-arm-apu-is-said-to-offer-strix-halo-and-rtx-4070-mobile-performance-with-alienware-already-onboard-to-create-an-all-nvidia-gaming-laptop/ X3jMLhs2DJssxx4uiesoZD Thu, 14 Nov 2024 13:14:18 +0000
<![CDATA[ Not that any PC gamer will care but Intel is lining up low-power Arrow Lake chips for launch in January ]]> Intel's new Core Ultra 200S chips don't have much going for them in terms of gaming performance, but they're generally pretty good at sipping power in games. Compared to Raptor Lake's cavernous hunger for energy, that's a major plus. Now the chip giant is getting ready to launch Arrow Lake processors with even lower power demands.

Strictly speaking, these new models aren't inherently any less power-hungry; it's just that their power limits will be significantly reduced compared with the 125 W Ultra 9 285K, Ultra 7 265K, and Ultra 5 245K. The details on the new Arrow Lake variants were posted by X user Momomo_us (via Videocardz), who's well known for getting hold of such information ahead of schedule.

What we're getting are non-K and T variants of the existing Core Ultra 200S chips. For example, the 285K will be joined by a Core Ultra 9 285 and 285T, with the former having a 65 W TDP and the latter just 35 W. Everything else—core count, cache, number of PCIe lanes—will still be the same.

Well, the clock speeds will be lower: in the case of the 285, the P-core base clock will be 2.5 GHz compared to the 285K's 3.7 GHz, and the 285T will be right down at 1.2 GHz. There's no word on how the reduced power limits will affect the boost clocks, but we can take a reasoned guess at them based on how Intel set the clocks for the likes of the Core i9 14900T.

Where the full-fat 14900K has P-core clocks of 3.2 GHz (base) and 6.0 GHz (max turbo), the T variant is 1.1 and 5.5 GHz respectively. For reference, the non-K 14900 is 2.0 and 5.8 GHz. Since Arrow Lake is more energy efficient than Raptor Lake, we might not see quite as large a decrease in the peak boost clocks, but we'll have to wait for the official launch to know for sure.
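
Purely as a back-of-the-envelope illustration of that reasoned guess, here's a sketch that applies the 14900K-to-14900/14900T max-turbo ratios above to the 285K's boost clock. The 5.7 GHz figure is the 285K's listed max turbo; everything derived from it is speculation, not an Intel spec.

```python
# Back-of-the-envelope only: scale the 285K's max turbo by the same ratios Intel
# used between the 14900K and its non-K/T siblings. A guess, not a spec.
RAPTOR_MAX_TURBO_GHZ = {"14900K": 6.0, "14900": 5.8, "14900T": 5.5}  # figures quoted above
ARROW_LAKE_285K_GHZ = 5.7                                            # Core Ultra 9 285K max turbo

for sibling, arrow_name in (("14900", "Core Ultra 9 285"), ("14900T", "Core Ultra 9 285T")):
    ratio = RAPTOR_MAX_TURBO_GHZ[sibling] / RAPTOR_MAX_TURBO_GHZ["14900K"]
    print(f"{arrow_name}: ~{ARROW_LAKE_285K_GHZ * ratio:.1f} GHz max turbo (estimate)")
```

That pegs the 285 at around 5.5 GHz and the 285T at around 5.2 GHz, though, as noted, Arrow Lake's better efficiency means the real figures could land a touch higher.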

The leaked details also point to some additions to the Core Ultra 5 range, with the 225 (and iGPU-less 225F) sporting just 10 cores, base clocks of 3.3 GHz, 20 MB of L3 cache, and a 65 W TDP.

Not that any of these forthcoming chips are aimed at PC gaming enthusiasts. Traditionally, T-models have been sold to manufacturers who need capable but ultra-low power chips for embedded computers, and the non-K versions are normally snapped up by system integrators looking to save a few bucks here and there.

Given that I don't recommend anyone buys a normal Core Ultra 200S (the 245K isn't bad but there are better options for gaming), you're not going to see me ever suggesting one should rush out for a non-K chip.

Yes, they'll be cheaper than the K-models but quite frankly, who cares when they'll have second-tier gaming performance at a brand-new price?

Videocardz also reckons Intel will launch the new range in January, probably at CES 2025, and we should see its more affordable B860 motherboard chipset announced at the same time.

You never know, Team Blue might do an AMD and significantly drop its prices to undercut the sales of Zen 5, especially the mighty Ryzen 7 9800X3D. Does anyone fancy taking a punt on that happening?


Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

]]>
https://www.pcgamer.com/hardware/processors/not-that-any-pc-gamer-will-care-but-intel-is-lining-up-low-power-arrow-lake-chips-for-launch-in-january/ DzuzjRY6bszQTXZw9C2iAR Tue, 12 Nov 2024 15:38:29 +0000
<![CDATA[ Nvidia has reportedly killed production of all RTX 40 GPUs apart from the 4050 and 4060 as affordable 50-series GPUs could arrive earlier than expected ]]> Nvidia has stopped making almost all its current-gen RTX 40 GPUs. Only the AD107 chip remains in production, the GPU used in the RTX 4050 and 4060 graphics cards, the former a mobile-only model.

At least, so says a report on Board Channels (via Videocardz), claiming "Nvidia has completely shut down the AD106 production line, with all its capacity reallocated to the RTX 50 series lines. Only a single AD107 line is temporarily retained. As a result, the RTX 40 series has entered its final quarter of clearance, with mid-to-high-end RTX 40 GPUs gradually halting production and supply. "

This news follows earlier reports in September of Nvidia's plans to end production of the AD102 chip found inside RTX 4090 and 4090D graphics boards. For the record, the AD106 chip is used for the desktop RTX 4060 Ti and the laptop RTX 4070, while the AD107 is used for the desktop and laptop 4060 and the laptop 4050.

The implications of all this are clear enough. If Nvidia is winding down the RTX 40-series family, whatever follows it must be nearly ready for launch. If you assume a typical launch schedule, you would indeed expect Nvidia to wind down high-end GPU models like the 4090 first, given the company typically rolls out premium members of any new GPU generation on day one, with more affordable variants emerging over the following months.

If there is anything surprising about this rumour, then, it's not that some Nvidia RTX 40-series GPUs have stopped production, but that quite so many have. Current rumours suggest Nvidia will unleash the new RTX 5090 and 5080 graphics cards in January, very likely at the CES show in Las Vegas.

Now, if we wind back to the launch of the RTX 40-series, there was a six-month gap between the RTX 4090 and the RTX 4070, with the 4060 following a little later still. To put dates on that, Nvidia released the 4090 in October 2022 and the 4080 a month later, with the 4070 arriving the following April.

Anyway, the point is that if Nvidia has indeed already knocked RTX 4060 Ti and 4070 production on the head, leaving only RTX 4060 desktop and RTX 4050 and 4060 laptop, then that implies a much compressed launch timetable for the RTX 50 family, with the 5090 in January and the 5070 in February or March and the 5060 and 5060 Ti perhaps a month later.

That would mean some elements of the RTX 50 family will launch later than their respective RTX 40 counterparts, while others arrive earlier. Given most gamers will be looking at the more affordable end of the range, this is all pretty good news.

It means that just because the RTX 5090 and 5080 probably won't appear until January, it doesn't necessarily mean we'll have to wait until July or August next year for the likes of the RTX 5060 and 5070. Hurrah and huzzah, though we'll probably want to see how Nvidia prices all these new cards before we get too excited.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/nvidia-has-reportedly-killed-production-of-all-rtx-40-gpus-apart-from-the-4050-and-4060-as-affordable-50-series-gpus-could-arrive-earlier-than-expected/ bEUoUNkrsgtHv69LFioCeJ Tue, 12 Nov 2024 11:35:41 +0000
<![CDATA[ AMD's 2025 laptop plans sure do include a lot of refreshed and rebranded APUs, but who cares when you've got Fire Range, Strix Halo, and four RDNA 4 mobile GPUs heading our way next year ]]> It's hard to believe that 2025 is only two months away now but with lots of new gaming PC stuff scheduled for release next year, it's two months too long. If you were hoping for AMD to bring some of its desktop CPU magic to laptops, though, you might be disappointed to see that current plans point to a lot of refreshes and rebrands of current APUs. But countering them will be some seriously great gaming chips.

According to Wccftech, citing a now-removed video from Weibo user Golden Pig Upgrade Pack, the current Ryzen AI 300 series will still consist of Strix Point APUs—a CPU with up to four Zen 5 and eight Zen 5c cores, and a GPU with 16 RDNA 3.5 compute units (CUs). However, they will be joined by a new chip, Kraken Point, that has all the hallmarks of being a partially disabled Strix Point processor. That's because it just seems to have four fewer Zen 5c cores and CUs.

Even the new Ryzen AI 200 series of chips are just rebranded Hawk Point Ryzen 8040-series processors, with eight Zen 4 cores and 12 RDNA 3 CUs. But it's not all gloomy news, as AMD is planning on making laptop versions of its Ryzen 9000-series chips under the codename of Fire Range, including 3D V-Cache variants.

Just like the current Dragon Range, these will use the same chiplets as their desktop equivalents, just in a smaller package (and presumably with lower clocks and power limits). The Ryzen 9 7945HX is one heck of a gaming laptop CPU (as used in the Asus ROG Scar 17) so the Zen 5 version should be at least as good.

The real stars of the laptop show, however, will be the Strix Halo chips, aka Ryzen AI Max. We've covered leaks about the chips before but the specs are still worth mentioning again, especially in light of how disappointing the other 'new' APUs seem to be. The range will start with the lowly Ryzen AI Max Pro 380, with six Zen 5 cores and 16 RDNA 3.5 CUs. At the other end of the scale is the AI Max+ 395, with 16 Zen 5 cores and 40 (yes, 40!) RDNA 3.5 CUs.

However, the video does add some additional information. Rather than extend the current naming scheme for integrated graphics in laptops (e.g. Radeon 780M and Radeon 890M), AMD will use Radeon 8060S and 8050S for 40 and 32 CU iGPU variations.

I understand why AMD felt the need to have a notably different name for these new graphics chips (the performance difference between a 780M and an 8050S will be huge, thanks in no small part to the 256-bit memory bus), but it's just more confusion in AMD's ever more complex nomenclature.

In that same video, there's news about the next generation of discrete laptop GPUs from AMD, too. X user Everest (via Igor's Lab) managed to grab a screenshot of a slide, before the video was taken down, that shows the current RX 7000M variants all being swapped for one of four 'R25M' chips, although there's not an awful lot of information.

They will be (or should that be 'hopefully will be'?) RDNA 4 GPUs, and the lowest-spec one will come with 8 GB of VRAM on a 128-bit memory bus and a 50 to 130 W power budget. I know that doesn't sound spectacular, but Nvidia's lowest model laptop GPU is the RTX 4050, which has just 6 GB on a 96-bit memory bus.

Higher-end laptops will be served by an R25M variant sporting 16 GB of VRAM, a 256-bit memory bus, and up to 175 W of power. Without any words on the CU count and clock speeds, though, the details don't tell us anything about performance. That said, I'd be surprised if they were lower than their equivalent RX 7000M models.

Your next machine

Gaming PC group shot

(Image credit: Future)

Best gaming PC: The top pre-built machines.
Best gaming laptop: Great devices for mobile gaming.

All of this will be for nought if AMD can't get laptop vendors to use its chips, though. If one browses through Newegg's new laptop offers, discounting third-party sellers, around 50 models are sporting Intel CPUs and Nvidia RTX GPUs.

Searching again, but this time for those with AMD CPUs, shows just 11 models, and only one of those has an AMD discrete laptop GPU. It's a far better picture if one searches across all sellers, but trying to find a model that houses a Radeon RX 7900M, for example, is a frustrating affair.

AMD has the goods for 2025, even if some of the new chips are just rehashes of older ones; now we just need more gaming laptop manufacturers to actually use them.

]]>
https://www.pcgamer.com/hardware/processors/amds-2025-laptop-plans-sure-do-include-a-lot-of-refreshed-and-rebranded-apus-but-who-cares-when-youve-got-fire-range-strix-halo-and-four-rdna-4-mobile-gpus-heading-our-way-next-year/ UtAaKmrBSdTYD6mfgELqBN Mon, 11 Nov 2024 14:16:26 +0000
<![CDATA[ Intel admits the Arrow Lake launch missed the mark and promises performance fixes by December, but my testing suggests you shouldn't get your hopes up ]]>

In an interview about its Arrow Lake processor launch, Intel admitted that the release didn't go as planned, citing a disparity between its own results and those from reviewers. It also said it has some performance fixes coming soon that should address those differences. But my own meeting with Intel, and additional tests I've carried out since, suggest you shouldn't expect to see any big gains heading your way.

The promise came from Robert Hallock, Intel's vice president and general manager of Client AI and Technical Marketing, while chatting with HotHardware. He began by pointing out where Intel has discovered problems. "I can't go into all the details yet, but we identified a series of [multifactor] issues. They're at the OS level, they're at the BIOS level."

Hallock also remarked on Arrow Lake's test results, saying that "the performance we saw in review—and to be very clear, through no fault of reviewers—was not what we expected and not what we intended. The launch just didn't go as planned."

The phrase "not what we expected" is particularly interesting because in my meeting with Intel, after I provided the company with my full set of results and benchmarking methods prior to the review's release, it said those figures were in line with its internal testing, albeit a little behind in some tests.

But even so, I'm not convinced Arrow Lake can be so easily fixed, for gaming at least. I've been experimenting with changing the multitude of clocks in Core Ultra 200S processors, as well as testing them with different RAM speeds. The most stable, overclocked configuration I could achieve, with some super-fast RAM, was an 11% increase in the compute tile's cache ring clock, a 15% increase in the uncore clock (NGU), and a 19% increase in the die-to-die (D2D) clock.

Paired with a 48 GB kit of DDR5-8000 RAM running in Gear 2 mode, the average gain was just 2% to the mean frame rate, along with a 7% decrease in the 1% low figures. Mind you, the culprit for the latter was Total War: Warhammer 3. Games such as Cyberpunk 2077 and Baldur's Gate 3 both improved by around 5%.

However, those gains also came with a mean power increase of 15% in gaming (26% higher in the case of Baldur's Gate 3). Fortunately, since Arrow Lake isn't anywhere near as power-hungry as Raptor Lake is in gaming, that power rise is perfectly acceptable.
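To spell out the arithmetic behind those averages, here's a minimal sketch of how per-game before-and-after results roll up into a single mean percentage change. The frame rates in it are illustrative placeholders rather than my actual benchmark data, but they show how one regressing title can drag the overall figure down even when most games improve.

# Minimal sketch: roll per-game results into a mean percentage change.
# The frame rates below are illustrative placeholders, not real benchmark data.
stock = {"Cyberpunk 2077": 100.0, "Baldur's Gate 3": 100.0, "Total War: Warhammer 3": 100.0}
overclocked = {"Cyberpunk 2077": 105.0, "Baldur's Gate 3": 105.0, "Total War: Warhammer 3": 96.0}

changes = [(overclocked[game] / stock[game] - 1.0) * 100.0 for game in stock]
mean_change = sum(changes) / len(changes)
print(f"Mean frame rate change: {mean_change:+.1f}%")  # two games up ~5%, one down 4%, average works out at +2%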

I've also been experimenting with Intel's Application Optimization tool (APO), which manages the threads generated by certain games to work better on its hybrid architecture. In the marketing slides for the Core Ultra 200S series, all of Intel's performance results were taken with APO enabled, where relevant.

Intel slide showing Arrow Lake gaming chart.

(Image credit: Intel)

On my Arrow Lake test rig, using an MSI MAG Z890 Ace and DDR5-6000 CL30, APO makes absolutely no difference with Cyberpunk 2077. It does improve how well Metro Exodus and Total War: Warhammer 3 run, though not massively so.

So if overclocking, high-speed RAM, and APO can't bring Arrow Lake's gaming performance up to a level where it's competitive against Raptor Lake, let alone AMD's Zen 5, will Intel's incoming Windows and BIOS fixes make a huge difference?

That's certainly possible, especially regarding the firmware on motherboards, as the Asus board I used for reviewing the Ultra 9 285K and Ultra 5 245K was hugely better in our Factorio test than either of the MSI Z890 boards I have.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

There is still lots to like about Arrow Lake—the greatly reduced power consumption and the content creation performance—but I think the architecture is fundamentally just not ideal for gaming. In many ways, Intel's new design strikes me as being similar to Ryzen 3000-series processors, AMD's first desktop CPUs to use a chiplet design.

Zen 2 and Arrow Lake both have the memory controller in a separate die from the CPU cores, which isn't great for latency, but AMD is now three generations on from Zen 2 and has improved timings and clock speeds all around to minimise the latency issue. In the case of its 3D V-Cache chips, a massive slab of L3 cache helps out even more.

I hope that Intel does manage to bring Arrow Lake up to the point where it's a decent enough gaming CPU, if only because most new prebuilt Intel gaming PCs are going to be sporting a Core Ultra 200S chip. But with the Ryzen 7 9800X3D exceeding expectations and selling like proverbial hot cakes, those Intel chips are never going to be the best gaming CPUs money can buy.

]]>
https://www.pcgamer.com/hardware/processors/intel-admits-the-arrow-lake-launch-missed-the-mark-and-promises-performance-fixes-by-december-but-my-testing-suggests-you-shouldnt-get-your-hopes-up/ eW4HmHRksTzw9vDWtKkeff Mon, 11 Nov 2024 12:22:46 +0000
<![CDATA[ AMD's desktop CPU market share jumps by nearly 10% in a year, all at the expense of poor old Intel ]]> This is getting a teensy bit repetitive now, but there's yet more bad news for Intel. Its arch rival AMD has clawed back nearly 10 percentage points of desktop x86 CPU market share over the last year. And that has us wondering: could those crashing 13th and 14th Gen CPUs be hurting Intel's sales?

The market share figures are according to the long-time PC hardware soothsayers and data analysts at Mercury Research (via Tom's Hardware). AMD now owns 28.7% of the desktop CPU market, up from 19.2% a year ago, though its mobile CPU share is a fair bit lower at 22.3%.

This time last year, AMD owned 19.5% of the mobile CPU market. So, its share of that segment is rising, too, just not as fast.

To put all that into context, if you go back to the second half of 2016, about six months before AMD began to turn things around with the first Ryzen-branded CPUs, AMD's desktop CPU market share was just 9.1%.

Of course, with Intel still owning over 70% of the market, reports of its comprehensive demise are clearly overstated. But bar one or two fleeting blips, AMD has since been chipping away at Intel, quarter by relentless quarter.

This most recent quarter saw AMD snag an additional 5% desktop market share from Intel, the biggest quarterly jump in at least a decade. If this carries on much longer, those doom-laden narratives around Intel will be much closer to reality.

The explanations for that are somewhat speculative. But it certainly seems like no coincidence that the first quarterly figures that might have been impacted by the PR fallout from those crashing 13th and 14th Gen CPUs have indeed registered a big fall for Intel.

One caveat to all this is that Intel told Mercury Research that some of its market share loss was due to an "inventory correction" at one of its customers. It's not totally clear what that means, but the implication is that we could see a bit of a bounce back for Intel next quarter.

If that does indeed happen, then the whole "crashing CPU craters Intel market share" narrative will be a little harder to push. Still, however you slice it, the data has only been moving in one direction: AMD has been taking market share from Intel with relentless consistency for the better part of a decade.

Inventory corrections or no, it's hard to see that trend changing, what with Intel's new Arrow Lake CPUs proving pretty disappointing. As things stand and given Intel has only just launched an all-new desktop chip, we're not expecting a major update from Intel in terms of desktop processors until Nova Lake in 2026. So, it's hard to see anything but further market share losses until at least then.

That said, the mobile market is increasingly important and Intel's Lunar Lake CPUs are much more competitive there. Arguably, they are more appealing than AMD's chips thanks to excellent efficiency, at least for thin-and-light laptops.

The slight problem with Lunar Lake is that, according to Intel itself, it isn't as profitable as the company would like. That's thanks to two factors. The first is that it's mostly manufactured by Taiwanese foundry TSMC rather than in Intel's own fabs.

The other factor increasing costs and hitting margins is the on-package memory, which is why Intel has said it won't take that approach again. As it happens, Intel's next-gen Panther Lake mobile CPU addresses both problems by using more in-house Intel silicon and ditching the on-package memory.

Panther Lake is due next year, likely ahead of AMD's next major mobile CPU update. So, 2025 could be a better overall picture for Intel, even if it looks like it will be fighting a losing battle on the desktop for some time to come.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/amds-desktop-cpu-market-share-jumps-by-nearly-10-percent-in-a-year-all-at-the-expense-of-poor-old-intel/ RgZ8f4wjvkYLNeHAz9vUr5 Fri, 08 Nov 2024 16:30:24 +0000
<![CDATA[ AMD's new Ryzen 7 9800X3D gaming CPU sold out almost instantly and scalpers are already selling the chips for up to $999 ]]> Hop onto all the big etailers and you'll find that AMD's new Ryzen 7 9800X3D CPU is sold out. Amazon, Newegg, Best Buy, the lot. Nobody has stock, I just checked!

In fact, some observers reckon the chip, which Dave describes as the "new king of gaming CPUs", sold out in minutes. Meanwhile, other sources reckon Intel's new Arrow Lake CPUs have been catastrophically slow sellers, with German etailer Mindfactory reportedly not having shifted a single chip in the first week of sales.

That said, Intel Arrow Lake chips like the Core Ultra 9 285K have also been sold out at times. Indeed, right now it's sold out on Amazon.

So, the nuance here involves availability. How many chips were actually available for launch? Unfortunately, that's very hard to say with any certainty. Allegedly, for instance, there were 3,000 Ryzen 7 9800X3D chips available at launch for the whole of Germany.

Is that a lot or a little? Even that isn't easy to say definitively. Stories have also circulated regarding very low availability of Intel's Arrow Lake CPUs, so it's pretty hard to be sure exactly where these new CPUs fall along the demand and supply curve.

However, one indication is the annoying scalper market. If there's evidence that people are willing to pay over the odds for something to scalpers, there's probably quite a bit of demand.

Well, wouldn't you know it, a Ryzen 7 9800X3D popped up on eBay for $999, miles above its $479 recommended retail price. And it appears to have sold yesterday. A current eBay search throws up a number of Ryzen 7 9800X3D chips for sale around the $675 mark for chips that are on "pre order".

Actually "in hand" and ready to ship CPUs seem to be more expensive. This eBay listing claims to have a 9800X3D ready to go for $735 and the listing indicates 10 have already sold.

But what of the Intel alternative? Actually, eBay has listings for those well above list, too. How about this one for $1,299? Yes, really. A Core Ultra 9 285K for $1,299. In fact, if you want one that ships from the US, the cheapest is $899, and that's a pre-order. Suppliers in Israel will do you one for about $760.

All of which means that we can fairly confidently say that all of these chips are in pretty short supply, right now. Are any of them worth paying a huge premium? We'd say no, not even the AMD Ryzen 7 9800X3D and certainly not an Arrow Lake CPU.

Ultimately, chips like these are often in short supply just after launch and if you can wait a month or two, it's very likely there will be plenty available at MSRP. There really is little reason to pay massively over the odds.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/amds-new-ryzen-7-9800x3d-gaming-cpu-sold-out-almost-instantly-and-scalpers-are-already-selling-the-chips-for-up-to-usd999/ D2BxVPz7X9UDbrhGv3hhBQ Fri, 08 Nov 2024 13:53:01 +0000
<![CDATA[ New class action lawsuit alleges Intel knew its CPUs were crashing even before they went on sale ]]> This feels like something of an inevitability, but now it's actually happening. A class action lawsuit accusing Intel of knowingly selling CPUs prone to crashing has been filed in a federal court in California (via Techspot).

The plaintiff is one Mark Vanvalkenburgh of Orchard Park, New York. His beef is with an Intel Core i7-13700K he picked up from Best Buy in January 2023.

"After purchasing the product, Plaintiff learned that the processor was defective, unstable, and crashing at high rates," the suit states. Vanvalkenburgh applied the Intel microcode patch designed to fix the problems, but apparently this failed to resolve the instability and crashes. Vanvalkenburgh's lawyer's expect other Intel customers to join the suit.

Just for clarity, all this pertains to the well documented problems with Intel's 13th and 14th Gen Raptor Lake CPUs. Intel now claims that the problems are resolved with the aforementioned patch, though it took Intel three tries at releasing a patch before it settled on what is now claimed to be the overall solution. But even if that is true, it will surely be problematic should it be proven that Intel knew the chips were duds but sold them anyway.

Thus the killer passage in the suit could be the following claims:

"By late 2022 or early 2023, Intel knew of the defect. Intel’s Products undergo prerelease and post-release testing. Through these tests, Intel became aware of the defect in the processors. In addition, Intel monitors return rates, press reports, and user reports of defects. By late 2022 and early 2023, there were numerous reports that the Intel chips were failing at high rates. Thus, Intel knew that its Products were defective by late 2022 or early 2023."

If that is proven, it will surely be very ugly for Intel. Of course, it is perhaps not totally surprising that this particular Intel customer found that his problem was not resolved by Intel's microcode patches.

Intel has conceded that once the clock tree circuit in a CPU is damaged by voltage spikes, those patches won't fix it and the chip needs to be replaced. That's why Intel extended the warranties on Raptor Lake chips by two years and implemented an enhanced RMA process.

Anywho, the suit seeks damages including punitive damages, restitution, disgorgement, and an order broadly awarding the plaintiff and all other class members compensation to be determined at trial.

Given Intel's well publicised struggles, which most recently have seen it drop out of the Dow Jones stock index, this is hardly good news. But it was almost certain to happen. It will be interesting to see if Intel fights this one to the death or offers a speedy settlement. Watch this space.


Best CPU for gaming: Top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game first.

]]>
https://www.pcgamer.com/hardware/processors/new-class-action-lawsuit-alleges-intel-knew-its-cpus-were-crashing-even-before-they-went-on-sale/ q7ugzw8HiH99D96AiBs9Vb Fri, 08 Nov 2024 11:34:17 +0000
<![CDATA[ The top AMD Ryzen 7 9800X3D overclock just hit 7.24 GHz on HWBOT, but also a stable 6.9 GHz that delivers literally thousands of in-game fps ]]> Update November 8, 2024: Records are made to be broken—as noted right-wing libertarian, Norris McWhirter, was wont to say—and dedication is what you need, if you want to be a record breaker, as Roy Castle would sing before blowing his own trumpet. And so, the 6.9 GHz overclock we saw yesterday from the Ryzen 7 9800X3D has now been surpassed by Taiwanese overclocker TSAIK, who used an MSI MEG X870E Godlike motherboard to take the number 1 position on the HWBOT leaderboard for the chip's CPU frequency (via Videocardz).

The chip was pushed to 7.24 GHz, achieved by disabling SMT (something we've found helpful for getting higher frequencies out of the Ryzen 7 9800X3D) and using a massive core voltage of 1.969 V.

Obviously, liquid nitrogen was also needed to hit that clock speed, and it isn't a frequency TSAIK has been able to sustain while actually running benchmarks such as Cinebench: the top HWBOT Cinebench score was achieved with 'only' a 6.8 GHz overclock. That's the difference with the 6.9 GHz achieved by Asus China GM Tony Yu, who actually showed applications running at the chip's peak clock speed.

Original story November 7, 2024: The AMD Ryzen 7 9800X3D is now officially with us. PC Gamer's hardware command and conqueror Dave's been testing it for a while and reckons it's "the new king of gaming CPUs". That much we expected, or at least hoped for, given its predecessor, the Ryzen 7 7800X3D, held the crown for the longest time. Now we're seeing just how far it can be pushed, and the answer is far indeed.

Asus China's GM Tony Yu has shown (via Wccftech) the 9800X3D reaching a whopping 6.9 GHz at just over 100 W and achieving—at points—over 1,000 fps in CS2 and Valorant at max settings. That's 1080p, of course, but these are esports titles and we like our frames more than our pixels for competitive gaming.

The main takeaway isn't the frame rates achieved, though, nice as they are; the main takeaway is that 6.9 GHz overclock. For reference, the 9800X3D comes with a stock base clock of 4.7 GHz and a boost of 5.2 GHz, so that's a 1.7 GHz increase over the boost clock.

What's important about this is that the 9800X3D is the first unlocked 3D V-Cache CPU from AMD. Previous-gen X3D processors had the V-Cache sitting on top of the chips, which would prevent adequate heat transfer for cooling when overclocking, so those chips were kept locked. You could overclock them somewhat by messing with PBO, but you couldn't get much out of them.

With the Ryzen 7 9800X3D, however, the cores sit proudly on top of their V-Cache, leaving the hot core silicon comparatively flush against the IHS above. This means better heat transfer for cooling, which has ultimately meant an X3D CPU that's finally unlocked for overclocking. Plus, clock speeds in general, even without manual overclocks, can now creep higher.

Given we've never been able to properly overclock an X3D chip before, the Ryzen 7 9800X3D really is the first time we're seeing what X3D tech is capable of when no holds are barred. It's obviously worth noting this overclock was hit using liquid nitrogen, though from our own messing around with the new Ryzen chip's overclocking capabilities, we've managed to get it almost stable at 5.6 GHz and rock solid at 5.57 GHz. That's without getting into any curve optimizer shenanigans, either, though obviously with only around 400 MHz on top of the standard 5.2 GHz boost clock, you're not seeing much in the way of frame rate increases, while a fair bit more power is required.

But still, judging by these overclocks, that's pretty impressive stuff, which is saying something for a chip that, even without any overclocks, already handily bests other non-X3D 9000-series chips and Intel Arrow Lake ones in gaming performance.

Speaking of which, now that we know the Ryzen 7 9800X3D can overclock so well, there really is no reason to opt for Arrow Lake for gaming. Although, I suppose there wouldn't be a reason to anyway, given the Ryzen 7 9800X3D performs better, runs cooler, and consumes less power. This latter point is the one thing that Arrow Lake had going for it prior to the Ryzen 7 9800X3D, which has now taken it away. Big oof.

Maybe not so much for Intel, but it's exciting times for the rest of us. An X3D chip overclocked by 1.7 GHz, who'd have thunk it?


Best AIO cooler for CPUs: Keep your chip chill.
Best air cooler for CPUs: Classic, quiet cooling.
Best PC fans: Quiet and efficient.

]]>
https://www.pcgamer.com/hardware/processors/the-amd-ryzen-7-9800x3d-just-hit-6-9-ghz-and-thousands-of-in-game-fps-with-an-overclock-and-it-barely-even-broke-a-sweat/ fFxdvbBAVAXiA6aHG2hxk4 Thu, 07 Nov 2024 17:34:49 +0000
<![CDATA[ AMD finally beats Intel in server revenue, but surprise surprise, Nvidia's still miles ahead ]]> Much as I might instinctually recoil at the suggestion, the server rather than the desktop market is the big growth area right now. I'm told that this is thanks to a newfangled thing called "artificial intelligence", which is why Nvidia, proverbial emperor of the AI crowd, is doing so well and expects to keep doing well with Blackwell. But Nvidia's not the only player in the server market, and there's a change afoot elsewhere, over in the land of datacentre CPUs.

The change in question is that AMD has, for the first time ever, outsold Intel in the datacentre market, bringing home a trend that's been a long time coming. This is coming from X user Sravan Kundojjala of SemiAnalysis (via Tom's Hardware), who charts the datacentre revenue for the two companies against Nvidia's networking revenue.

The reason SemiAnalysis charts it against Nvidia's similar amount of networking revenue is to demonstrate that AMD's and Intel's server revenue is still very far behind Nvidia's overall datacentre revenue—Nvidia's networking division is small beans compared to its compute division, after all.

Talking actual numbers, now, Intel's Q3 AI and datacentre revenue was $3.3 billion while AMD's was $3.5 billion. For reference, Nvidia's Q2 datacentre revenue was $26.3 billion—in other words, over three times AMD's and Intel's Q3 datacentre revenue combined.

That's no surprise, though. GPUs—sorry, "AI accelerators"—are expensive, and AI workloads require a whole load of them to churn through all that data. But CPUs are needed for servers, too, and in this area, AMD's finally caught up with and surpassed Intel.

AMD's been taking bites out of Intel's lead for a while now, but it wasn't always so close. AMD's Epyc processor revenue once barely held a candle to Intel's Xeon numbers. We can see as much from SemiAnalysis's chart: follow the yellow line back towards 2021 and watch the cavern open up between AMD's and Intel's revenue. It's only really since about mid-2023 that AMD started to make real inroads.

Part of that's down to AMD offering great high-end value for datacentres with its latest Epyc processors, and part of it's down to Intel's general financial struggles—struggles that are present despite the company beating its Q3 financial expectations. AMD's doing better overall, too, having recently posted record Q3 revenues.

It's important not to infer the wrong things from revenues alone, however. Intel still dominates the CPU market in terms of actual shipments—as in, chips actually sitting on server racks. For context, back in Q2, AMD's share of the server market was 23% to Intel's 77%. Intel still commands the lion's share, and this is true in the overall computer CPU market, too, with the Blue Team commanding a roughly 60-40 lead.

Your next upgrade

Nvidia RTX 4070 and RTX 3080 Founders Edition graphics cards

(Image credit: Future)

Best CPU for gaming: The top chips from Intel and AMD.
Best gaming motherboard: The right boards.
Best graphics card: Your perfect pixel-pusher awaits.
Best SSD for gaming: Get into the game ahead of the rest.

Revenue is good for assessing trends, however, and obviously for assessing a company's financial standing. Under this light, AMD's doing well, and Intel not so well.

For us gamers, the clearest sign of all this is Intel's line-up of Arrow Lake desktop processors, which has had its fate sealed by AMD's 9000-series, especially now the AMD Ryzen 7 9800X3D is out. Intel's Arrow Lake chips are just too expensive for too little performance, even compared to Intel's own previous-gen chips.

And don't think I'm taking much joy in this assessment, either. As fun as it might be to rag on big companies, I can't help but feel pangs of guilt now, given how Intel seems to be receiving blow after proverbial blow. Too much and it's just sad, you know? Plus I'm sure we all want to see some actual competition in the CPU space.

On that front, to get a little more positive, as Wccftech points out, Intel's only just bringing Granite Rapids CPUs into the market, and they're meant to be great performers. So we'll just have to see how that goes, I suppose. Time for a comeback?

]]>
https://www.pcgamer.com/hardware/processors/amd-finally-beats-intel-in-server-revenue-but-surprise-surprise-nvidias-still-miles-ahead/ RVr2Pvt6gX2zDr4uD5Mnsa Thu, 07 Nov 2024 16:39:45 +0000
<![CDATA[ AMD just took the one thing Intel's Arrow Lake CPUs had going for them and slapped it right out of their hands ]]> The new AMD Ryzen 7 9800X3D launches today, after a fanfare of review scores accompanied the full reveal yesterday, and with that the best gaming CPU is now out in the wild and available to anyone looking to build the ultimate gaming rig. And the final nail in the coffin of Intel's brand new Arrow Lake CPU generation has been hammered into place, taking the one thing it had left—namely, gaming efficiency—and easily stealing it away.

Intel must be looking at the CPU market ashen-faced right now because, when it comes to helping a GPU fling frames around a screen at a spectacular rate, its once seemingly unassailable lead in gaming performance has now completely evaporated. Intel's latest processors, code-named Arrow Lake and practically all made by TSMC, arrived only recently, putting their all into low-power, efficient running.

Unfortunately, that has come at the expense of practically everything else. Sure, in some heavily multithreaded applications the Intel Core Ultra 9 285K and its impressive Efficient cores do some serious work and deliver great benchmark numbers, but when it comes to gaming the chips are often well behind Intel's own previous generations of processors.

But they've been mightily, impressively efficient, and therefore very cool-running.

That was some consolation for the Intel faithful (what few are left), especially after successive generations of CPUs that relied almost exclusively on higher clock speeds and ever-higher power consumption to hit their performance numbers.

But here comes AMD saying "Hold my pint" while it rolls out the Ryzen 7 9800X3D with the highest gaming frame rates of any tested CPU of this generation, the lowest gaming power draw, and the lowest operating temps.

For our latest CPU benchmarking suite we've been using Baldur's Gate 3 as our gaming temperature and efficiency benchmark; it scales really well with different processors and can really highlight a given chip's high and low points. Within that test, the Ryzen 7 9800X3D delivers nearly 50% higher gaming frame rates alongside those lower temp and power draw numbers.

It was the last thing Intel had to offer gamers, a potential small form factor mini-beast of a CPU. But even that is now gone thanks to the new 3D V-Cache chip. Where does Intel go from here? Back to the CPU schematics board, I would guess.

Best SSD for gaming: The best speedy storage today.
Best NVMe SSD: Compact M.2 drives.
Best external hard drive: Huge capacities for less.
Best external SSD: Plug-in storage upgrades.

]]>
https://www.pcgamer.com/hardware/processors/amd-just-took-the-one-thing-intels-arrow-lake-cpus-had-going-for-them-and-slapped-it-right-out-of-their-hands/ MpxcMXypu6i2z8DWErSr5H Thu, 07 Nov 2024 16:26:56 +0000
<![CDATA[ Five things I always tell people before they buy a new gaming CPU ]]> Every computer has a central processor or CPU, be it a tiny tablet or a hulking workstation, but when it comes to desktop gaming PCs, oh boy is there a lot to choose from. Despite there being only two vendors in the whole market, you've got hundreds of different models vying for your attention.

Browse through the listings of any major retailer and you'll be faced with three to four generations' worth of chips from AMD and Intel. You can get them with as few as four or six cores, or with as many as 24. And there's a bewildering number of varieties of the same chip—you can find up to nine variants of AMD's Ryzen 5 5000-series chips.

It's easy enough to find out what the very best gaming chips are (just check out our best CPU for gaming guide for that) but what if you want something different? As the go-to person in my family and circle of friends when it comes to tech-related guidance, I've been asked more times than I can count for advice on getting a new CPU.

So before you charge off into the Black Friday sales to get your perfect processor, here are five things I always tell anyone before they splash the cash on a cache-toting chip.

  1. Do you want to keep your current motherboard?
  2. Eight cores are more than enough for gaming, even six is fine
  3. The best way to save money is to buy a last- or two-gen old CPU
  4. You don't actually need a big AIO liquid cooler, even with a high-end chip
  5. Modern CPUs work best with super-fast RAM but it's not a must-have

1. Motherboards

A close-up photo of AMD's AM4 CPU socket

(Image credit: Future)

Do you want to keep your current motherboard?

There are two approaches to picking a new gaming CPU: (1) get one that will work in the motherboard you already have or (2) get a new CPU, motherboard, and RAM kit altogether. Rather obviously, the second option is going to be a lot more expensive than the first one, but it greatly simplifies the process of choosing which new CPU to get.

That's because every motherboard has a finite range of processor models it can support. In some cases, it's a huge number—for example, the Asus ROG Strix B550-E supports every single desktop Ryzen 3000, 4000, and 5000-series chip (as long as it has the latest BIOS installed). Nearly 70 different CPUs in total!

But the very latest processors from AMD and Intel use a different socket from those a few years ago. For example, if your gaming PC has a Core i5 10600K inside it, the motherboard that houses it won't accept a 13th or 14th Gen Core chip, nor the new Core Ultra models.

So if your gaming PC is five or more years old, you're probably better off getting a whole new setup if you want to get a current or last-gen CPU. That will mean replacing the motherboard and RAM, and possibly even the CPU cooler. You'll then need to reinstall Windows (although you can just risk shoving the storage drives in as they are), which makes the whole thing a lot more hassle than just dropping in a new CPU.

This is why it's the first thing I ask anyone if they're planning on getting a new CPU. If you're not sure what motherboard is in your gaming PC, then download CPU-Z (the free system information tool from CPUID) and click on the Mainboard tab. It'll tell you everything you need to know about your CPU, motherboard, and RAM.

Should you decide to keep your motherboard, use CPU-Z to find out which model it is, and then look up the manufacturer's webpage for it. Head to the support section and you should see a list of which CPUs you'll be able to install.
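If you'd rather not install anything at all, Windows can report the same motherboard details itself. Here's a minimal sketch that shells out to PowerShell and queries WMI's Win32_BaseBoard class; the board name in the comment is just an example of the kind of output you'd see, not a recommendation.

# Minimal sketch: read the motherboard make and model on Windows without extra software.
# Assumes PowerShell is available, which it is on any standard Windows 10/11 install.
import subprocess

result = subprocess.run(
    ["powershell", "-NoProfile", "-Command",
     "Get-CimInstance Win32_BaseBoard | Select-Object Manufacturer, Product | Format-List"],
    capture_output=True, text=True, check=True,
)
print(result.stdout.strip())
# Example output (yours will differ):
#   Manufacturer : ASUSTeK COMPUTER INC.
#   Product      : ROG STRIX B550-E GAMING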

2. Core count

Intel Core i5 14600K inside a Z790 motherboard.

(Image credit: Future)

Eight cores are more than enough for gaming, even six is fine

While the latest high-end CPUs are pretty expensive, they're a whole lot cheaper than anything in the equivalent sector of graphics cards. So you might be tempted to splash out and get yourself an AMD Ryzen 9 9950X or an Intel Core i9 14900K.

The thing is, for gaming, the huge number of cores they both offer are wasted. Content creation tools, like Blender for rendering or Handbrake for video encoding, will take as many cores as you can throw at them but games are primarily designed around six- or eight-core CPUs.

That's partly because many of them are developed for consoles as well as gaming PCs, and partly because games just don't create dozens of demanding threads for the CPU to crunch through. A thread is basically a string of instructions—e.g. add these numbers, put the answer here, get this data, add the results together—and game engines only use four to six big threads for everything.

Some will use more than this, but the extra threads are typically very light in nature, and the game won't suffer if there aren't enough cores available to crunch them. What happens is that the whole processor just gets worked a bit harder and, unless you have a very old chip, it'll cope just fine.
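If the idea of a thread still feels abstract, here's a trivial sketch of the concept in code: a handful of independent instruction streams being kicked off and waited on. It has nothing to do with how any real game engine schedules its work, and note that CPython's GIL means these particular threads won't run truly in parallel; the point is only to show what "a string of instructions" looks like.

# Trivial sketch of threads: independent strings of instructions handed to the CPU.
# Not representative of any real game engine's job system.
import threading

def worker(name: str, iterations: int) -> None:
    total = 0
    for i in range(iterations):
        total += i  # add these numbers, put the answer here...
    print(f"{name} finished with total {total}")

threads = [threading.Thread(target=worker, args=(f"worker-{n}", 100_000)) for n in range(6)]
for t in threads:
    t.start()   # kick off six light threads, a bit like a game spawning helper work
for t in threads:
    t.join()    # the main thread waits for them all to finish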

So don't worry about cores and instead focus on things like what generation of model the CPU is, its maximum clock speed, and how much cache it has. The very best CPUs for gaming are all AMD 3D V-Cache models, with the stars of the show being the Ryzen 7 5700X3D, the Ryzen 7 7800X3D, and the new Ryzen 7 9800X3D. All three have eight cores, and you'll never want for more in gaming.

3. CPU generation

Intel Rocket Lake CPU back

(Image credit: Intel)

The best way to save money is to buy a last- or two-gen old CPU

On the point of AMD's 3D V-Cache chips, the Ryzen 7 5700X3D uses the Zen 3 microarchitecture that first appeared four years ago. At the time of writing, that particular CPU is $210 at Amazon but it's been cheaper than that, and the price will probably drop further still in the Black Friday sales. Cheaper still is the six-core, 12-thread Ryzen 5 5600X, just $131 at Amazon, which also uses Zen 3.

Nobody would expect them to be on par with the very latest models and yet both chips stack up well against AMD's 9000-series and Intel's 14th Gen and Core Ultra 200S when it comes to gaming performance.

While the Ryzen 5 5600X is clearly the slowest of all the CPUs shown above, it's not like the figures are really bad. And as for the 5700X3D, well that puts the Core Ultra 5 245K ($319 at Amazon) to shame in both Cyberpunk 2077 and Baldur's Gate 3.

Once you increase the game's resolution to 1440p or higher, the CPU becomes far less of a limiting factor in the performance—it's almost all about the GPU then (an RTX 4070 was used for the above figures).

So, if you're looking to save a bit of cash when choosing a new CPU, don't dismiss older chips, as they're still more than good enough in today's games. And in the case of the 5700X3D, it'll probably be fine for at least another three or four years, before it becomes the main limit in how fast your games run.

If consoles work fine with 8-core CPUs using a microarchitecture that's five years old, and clocked a lot slower than most desktop equivalents, your PC gaming experience will be just fine with something that's a couple of years old. Your wallet experience will be even better!

4. Cooling

Noctual NH-D12L CPU cooler installed in a case with other Noctua products

(Image credit: Noctua)

You don't actually need a big AIO liquid cooler, even with a high-end chip

CPUs require electrical energy to work but all of it (to any measurable degree, at least) ends up being transferred into heat. All modern processors have systems to prevent themselves from getting damaged if their temperature gets too high but you still need a cooler to dissipate the heat.

AMD's processors and Intel's new Core Ultra 200S chips don't generate a lot of heat when used in games. Even those that have a small mountain of cores don't—for example, the 16-core Ryzen 9 9950X averages less than 130 W of heat in Baldur's Gate 3. However, Intel's last-gen Core i9 14900K is a little more serious, pumping out 177 W in the same test.

So you might think that you need a big and expensive all-in-one (AIO) liquid cooler to keep temperatures in check. The truth of the matter is that you don't, or at the very least, definitely not with AMD CPUs. Something like the Thermalright Peerless Assassin 120 SE air cooler (just $39 at Amazon) will have no problems cooling any Ryzen processor.

It'll cope with Core Ultra 200S chips just fine too, but with Intel 13th and 14th Gen Core i7 and Core i9 CPUs, you just need to give it a helping hand by reducing the power limits. Modern CPUs have at least two power limits: one that acts as a baseline figure, usually called the TDP (thermal design power, or just PL1), and another that acts as a short-term limit (aka PL2).

Intel's Core i7 and i9 chips are designed to run with a TDP of 125 W or so, and 253 W for PL2, but motherboard vendors often set both limits to 253 W. Most Intel motherboards will let you set PL1 and PL2 to a different value in the BIOS.

Reducing them will mean losing a little bit of performance, but you're unlikely to notice it much in games. For example, my Core i7 14700KF is always set to 175 W on both limits for normal use, though I switch back to Intel's values when testing. In doing so, I get a cooler and quieter PC, and air coolers have had no issue dealing with it all.
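As a toy illustration of how those two limits interact, here's a heavily simplified sketch. The 56-second boost window is a made-up example value (the real "Tau" turbo time parameter varies by board and BIOS), and real chips track a rolling average of package power rather than a hard time cutoff, but the shape of the behaviour is the same: short bursts can draw up to PL2, while sustained loads settle back at PL1.

# Toy model of Intel-style power limits. Heavily simplified: real hardware uses a
# rolling average of package power over the boost window, not a hard time cutoff.
PL1_WATTS = 125     # sustained limit, roughly what "TDP" refers to
PL2_WATTS = 253     # short-term burst limit
TAU_SECONDS = 56    # illustrative boost window only; the real value is board/BIOS dependent

def allowed_package_power(requested_watts: float, seconds_under_load: float) -> float:
    limit = PL2_WATTS if seconds_under_load <= TAU_SECONDS else PL1_WATTS
    return min(requested_watts, limit)

print(allowed_package_power(300, 10))   # 253: a short burst is clamped to PL2
print(allowed_package_power(300, 120))  # 125: a sustained load falls back to PL1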

5. RAM

Corsair DDR5 RAM up-close

(Image credit: Future)

Modern CPUs work best with super-fast RAM but it's not a must-have

AMD and Intel design their processors to work with a particular type of RAM, and they're all rated to work 100% normally up to a certain memory speed. For example, Intel 14th Gen Core chips can be paired with either DDR4-3200 or DDR5-5600, while for AMD's Ryzen 9000-series it's just DDR5-5600.

However, you can buy faster RAM than this, and I have an Intel 14th Gen Core i7 14700KF using DDR5-6400, an Intel Core Ultra 9 285K with DDR5-7800, and a Ryzen 9 9950X running with DDR5-6000 in front of me as I write this. Technically speaking, this means the memory controller in each of these chips is overclocked.

The rest of the CPU is running normally, though, which is why they're all fine with such speeds, but there are no guarantees that every combination of CPU and motherboard will cope with faster-than-normal RAM.

Speedier memory does equate to better performance, but the actual gains one experiences depend an awful lot on what game is being used to measure the impact. Some will show practically no difference between running at the vendor default speed (e.g. DDR5-5200) and an overclocked mode (often called EXPO mode with AMD chips and XMP mode with Intel). Others will show a marked improvement, even with just a minor overclock.

It also depends on what quality settings and resolution you game at—running something like Cyberpunk 2077 at 4K with path tracing enabled will show almost no change with faster RAM because it will be so GPU-limited.

Slower RAM is cheaper, but not massively so. A 32 GB kit of Corsair Vengeance DDR5-5200 is $97 at Amazon at the moment, while the DDR5-6000 version is $120, only $23 more expensive. However, if you wanted to save as much as possible, to get a better CPU for example, then going with slower RAM is one way to do it (and the same is true if you're looking at a processor that uses DDR4).

So don't think that you absolutely must get the fastest RAM money can buy with your new CPU. I usually recommend DDR5-5600 and DDR4-3200 as the best balance between price, performance, and compatibility. It all comes down to the CPU and motherboard combination, so check the motherboard vendor's support pages to see what RAM the board is happy with.
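For a rough sense of what those DDR5 numbers actually mean, here's a back-of-the-envelope peak bandwidth calculation for a dual-channel kit. It covers theoretical throughput only; it says nothing about latency or how much of the difference a given game will actually feel, which is exactly why the gains are so title-dependent.

# Back-of-the-envelope peak bandwidth for a dual-channel DDR5 kit.
# Theoretical maximum only: real-world gaming gains depend on latency, the game, and the GPU.
def peak_bandwidth_gb_s(transfer_rate_mt_s: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    # MT/s x 8 bytes per 64-bit channel x number of channels, converted to GB/s
    return transfer_rate_mt_s * bytes_per_transfer * channels / 1000

for speed in (5200, 5600, 6000):
    print(f"DDR5-{speed}: ~{peak_bandwidth_gb_s(speed):.1f} GB/s peak")
# DDR5-5200 -> ~83.2, DDR5-5600 -> ~89.6, DDR5-6000 -> ~96.0 GB/s: a modest step, not a doubling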

]]>
https://www.pcgamer.com/hardware/processors/five-things-i-always-tell-people-before-they-buy-a-new-gaming-cpu/ 7hgsETbreUAFAg8JeEs3ZC Thu, 07 Nov 2024 16:20:53 +0000