Intel and Nvidia deny conspiracy rumors against AMD

Screenshot: Nvidia

If you've visited the AMD subreddit, the Linus Tech Tips forums, or anywhere else in the past few months, you may have come across a conspiracy theory that Intel and Nvidia made a secret deal with each other to keep high-end GPUs out of AMD Ryzen 4000 series laptops. Look at the list of AMD laptops launched last year and it's easy to believe. The Asus ROG Zephyrus G14, Lenovo Legion 5, and others came with an AMD processor, but nothing more powerful than an RTX 2060. Conspiracy theories are tempting, but this one seems to be nothing more than a product of the Intel/AMD/Nvidia fan wars. It doesn't help that unsubstantiated claims from blogs and news sites around the world continue to promote the same narrative. A little digging is enough to see that there is no juicy scandal here, just a complicated web of how CPUs and GPUs work together.


In April 2020, Frank Azor, AMD's chief architect of gaming solutions and marketing, answered a question from a Twitter user about the lack of next-generation GPUs in AMD laptops, saying, "You would have to ask your favorite OEMs and PC builders that question." This is when the conspiracy theory began to take shape, but Azor was right. Laptop configurations are decided by the OEMs, not the chip makers. Those configurations are usually determined by cost, but they also need to make sense. A low-powered CPU paired with a high-powered GPU is not a good combination, and that is the kind of mismatch that something like the Ryzen 9 4900HS, or anything below it, falls into.

Azor even sat down with The Full Nerd in May 2020 to address the issue again, specifically talking about OEMs' confidence in Ryzen processors. "I think Ryzen 4000 exceeded everybody's expectations, but for the longest time everybody kept us at arm's length. So it was hard to imagine a world where we were the fastest mobile processor," Azor said. "I think that when you're planning your notebook portfolio as an OEM and you haven't yet come to that conclusion, and remember, all of the planning for these notebooks was done last year, you hedged a little bit on AMD."

Essentially, OEMs simply did not have confidence that AMD could deliver an extremely fast mobile processor. So why would they pair a state-of-the-art GPU with a processor they believed to be inferior? The middle ground, the "meat of the market," as Azor put it, was laptops with an RTX 2060 or below. Yet even with this reasonable explanation, the rumor persists, with believers looking to the processors' specifications for answers.

Gizmodo contacted Intel and Nvidia about these rumors, and both companies vehemently denied them. An Nvidia spokesperson told Gizmodo: "The claim is not true. OEMs decide on system configurations, selecting the GPU and then the CPU to pair with it. We support both Intel and AMD CPUs across our product stack."

An Intel spokesperson expressed the same sentiment: "These claims are false and there is no such agreement. Intel is committed to conducting business with uncompromising integrity and professionalism."

The firm denials from Nvidia and Intel certainly suggest this theory holds little to no water, but I don't think you necessarily need their denials to know the theory is bullshit. The fact is, the Ryzen 4000 series was never going to be a strong contender for next-generation mobile gaming.


There are three aspects of AMD's Ryzen 4000 series that likely factored into OEMs' decision not to pair it with a high-end graphics card: PCIe limitations, CPU cache, and the most obvious, single-core performance.

Games rely more on single-core performance than on multiple cores, and Intel typically has the best single-core performance. This is true both historically and when comparing Intel's 10th generation to AMD's Ryzen 4000 series. Hell, gaming benchmarks on the 10th-gen Core i9-10900K are on par with the newer Ryzen 9 5950X when both are paired with an RTX 3080.

In our previous laptop tests, AMD's Ryzen 9 4900HS in the Asus ROG Zephyrus G14 had weaker single-core performance than the Intel Core i7-10875H in the MSI Creator 15. The Core i7-10875H is not at the top of Intel's 10th-gen mobile line, but the Ryzen 9 4900HS is at the top of AMD's. Yet with nearly the same GPU (RTX 2060 Max-Q in the G14, RTX 2060 in the Creator 15), the Intel system still averaged 1-3 fps higher (1080p, ultra settings). Pairing a more powerful GPU with the Ryzen 9 4900HS would likely create a bottleneck in some games because of that single-core performance.

That would lead to less-than-stellar performance compared to Intel's offerings, especially when combined with the Ryzen 4000 series' measly L3 cache: just 8MB, half of what Intel offers. That means the average time it takes to fetch data from main memory would be longer than on Intel's mobile processors.
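To see why cache size matters, here's a rough back-of-the-envelope sketch in Python. The latencies and hit rates below are illustrative assumptions, not measured values for these specific chips; the point is just that a larger L3 catches more requests before they fall through to much slower DRAM:

```python
# Back-of-the-envelope average memory access time (AMAT) sketch.
# All numbers are illustrative assumptions, not measured values for
# the Ryzen 4000 or Intel 10th-gen parts.

L3_HIT_NS = 10.0   # assumed L3 hit latency
DRAM_NS = 80.0     # assumed main-memory latency on an L3 miss

def amat(l3_hit_rate: float) -> float:
    """Average time to service a request that reaches the L3."""
    return l3_hit_rate * L3_HIT_NS + (1 - l3_hit_rate) * DRAM_NS

# A bigger L3 holds more of a game's working set, so assume a higher hit rate.
print(f"8MB-class L3 (70% hit rate):  {amat(0.70):.0f} ns average")   # 31 ns
print(f"16MB-class L3 (80% hit rate): {amat(0.80):.0f} ns average")   # 24 ns
```

Under these made-up numbers, the larger cache cuts average access time by roughly a quarter, which is the flavor of advantage a 16MB L3 has over an 8MB one.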

The Ryzen 4000 series' PCIe limitations may also have contributed to OEMs' reluctance, though this idea is a little shakier. It originated from a blog post at igor'sLAB, which explained that because Ryzen 4000 CPUs have only eight PCIe 3.0 lanes dedicated to a discrete GPU, they could create a bottleneck when paired with anything more powerful than an RTX 2060. Every PCIe device needs a certain number of lanes to operate at full capacity, and high-end Nvidia and AMD GPUs want 16. Since 10th-gen Intel processors support 16 lanes for the GPU, that made them a better match for the RTX 2070 and higher GPUs in last year's gaming laptop lineup.
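The raw bandwidth math behind that concern is easy to work out. PCIe 3.0 runs at 8 GT/s per lane with 128b/130b encoding, which comes out to roughly 0.985 GB/s of usable bandwidth per lane in each direction, so halving the lane count halves the theoretical ceiling. A quick sketch:

```python
# Theoretical one-direction PCIe 3.0 bandwidth per link width.
# 8 GT/s per lane with 128b/130b encoding is ~0.985 GB/s of usable
# bandwidth per lane.

GB_PER_LANE_GEN3 = 8 * (128 / 130) / 8  # ~0.985 GB/s per lane

for lanes in (8, 16):
    print(f"PCIe 3.0 x{lanes}: ~{lanes * GB_PER_LANE_GEN3:.1f} GB/s")
# PCIe 3.0 x8:  ~7.9 GB/s
# PCIe 3.0 x16: ~15.8 GB/s
```

Whether that lower ceiling actually matters depends on how much data a game streams across the bus each frame, which is exactly the point of contention.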

However, many people on Reddit and other online forums pointed out that the performance drop from pairing a Ryzen 4000 CPU with an RTX 2070 or higher would be very small, if noticeable at all, so to them the explanation made no sense. (More fuel for the conspiracy theory.) Naturally, I had to test it myself to see whether performance really dropped going from 16 lanes to eight.

Running my own tests, I found that 16 lanes really do offer better GPU performance, but the difference can be negligible. I used a much more powerful processor than the Ryzen 9 4900HS, so it was fully capable of handling an RTX 2060 and above regardless of how many PCIe lanes were available.

My test PC was configured with an Intel Core i9-10900K, an Asus ROG Maximus XII Extreme motherboard, 16GB (8GB x 2) of G.Skill Trident Z Royal DDR4-3600 RAM, a 500GB Samsung 970 Evo M.2 PCIe SSD, a Seasonic 1000W PSU, and a Corsair H150i Pro RGB 360mm AIO for cooling.
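If you want to replicate this kind of setup check on an Nvidia card, nvidia-smi can report the PCIe link's current generation and width. Here's a small Python wrapper, assuming an Nvidia GPU with the nvidia-smi utility on your PATH (the function name is mine, not part of any library):

```python
# Query an Nvidia GPU's current PCIe link generation and width.
# Assumes nvidia-smi (shipped with the Nvidia driver) is on the PATH.
import subprocess

def pcie_link_status() -> str:
    result = subprocess.run(
        [
            "nvidia-smi",
            "--query-gpu=name,pcie.link.gen.current,"
            "pcie.link.width.current,pcie.link.width.max",
            "--format=csv,noheader",
        ],
        capture_output=True,
        text=True,
        check=True,
    )
    # One line per GPU, e.g. something like "GeForce RTX 2070 SUPER, 3, 8, 16"
    return result.stdout.strip()

if __name__ == "__main__":
    print(pcie_link_status())
```

One caveat: GPUs downshift their link generation (and sometimes width) at idle to save power, so run this while a game or benchmark is loading the GPU to see the link's true operating width.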

Gaming performance hardly changed after I switched the PCIe configuration from x16 to x8, but the difference was noticeable in synthetic benchmarks. Comparing an RTX 2060 with an RTX 2070 Super (the closest GPU I had on hand to an RTX 2070), I ran benchmarks in Geekbench 5, 3DMark, PCMark 10, Shadow of the Tomb Raider, and Metro Exodus, some of which are part of our usual test suite.

Frame rates increased by at most 4 fps, with the most notable difference in Shadow of the Tomb Raider at 1080p. This reinforces what many have said: gaming performance isn't substantially affected by cutting the GPU's PCIe lane count in half until you reach something as powerful as an RTX 2080 Ti.

The synthetic benchmark results did not change much from eight lanes to 16 with the RTX 2060, but the difference in scores was more pronounced with the RTX 2070 Super, suggesting there is a measurable difference that could matter in other applications. The RTX 2070 Super's Geekbench score jumped by 3,000 points when all 16 lanes were available to the GPU. Time Spy produced results in line with the gaming benchmarks, and strangely, the RTX 2060 saw a bigger boost in the PCMark test than the 2070 Super did.

Of course, synthetic benchmarks are not a measure of real-world performance, and PCIe bandwidth is not the main factor that will slow your system down. But considering that many reviewers use these metrics to paint a picture of a laptop or desktop, any AMD 4000 series processor paired with anything more powerful than an RTX 2060 would post lower-than-expected results. For high-end GPUs marketed on performance, every extra point and every extra frame matters, especially when so many OEMs are competing for your money.

This suggests that, yes, OEMs will favor the "best" CPU, even if the best CPU is only slightly better. The scarcity of AMD 4000 series processors paired with next-generation Nvidia graphics may partly be the result of OEMs underestimating how many consumers were actually interested in that kind of laptop configuration last year, but it more likely comes down to the 4000 series' smaller L3 cache and slower single-core speeds. Sure, the RTX 2070 and above can work fine on PCIe x8, but if the CPU doesn't have the muscle to keep the GPU fed, none of that matters.


There is one final point that debunks this theory. If Intel and Nvidia were in cahoots to shut out AMD, then why are more OEMs wholeheartedly embracing the AMD/Nvidia combination this time around? Many of their AMD Ryzen 5000 series laptops will even have an RTX 3070 or 3080; the newest mobile AMD Ryzen processors have 20MB of combined L3 and L2 cache and support up to 24 PCIe Gen 4 lanes (16 of them dedicated to a discrete GPU), exactly what's needed to pair well with something more powerful than a mid-range card.

Companies are regularly caught engaging in all sorts of shady activities that pad their bottom lines while harming consumers and shaping the choices we make every time we walk into a Best Buy with money burning in our pockets. But no, Intel and Nvidia are probably not to blame for OEMs' slow adoption of AMD CPUs. AMD has had to spend the past few years rebuilding its reputation and creating processors that genuinely rival Intel's in the mobile space and can keep up with the powerful GPUs Nvidia makes for laptops.

The Ryzen 4000 series was very good, but it wasn't quite ready to compete in the areas that matter most to gamers and gaming laptop OEMs. The Ryzen 5000 series, if OEM adoption is any indication, will be a totally different beast. It will probably be found in all the big gaming laptops the 4000 series wasn't. Nvidia and Intel have nothing to do with it.
