AMD Ryzen 5 5600X Up to 10% Faster than the Intel Core i5-12600K in eSports Titles

u/looncraz

This caught me a bit off guard considering the 12600K clocks to 4.9GHz... but it only has 20MB of L3... These particular games scale well with cache, and the extra 12MB of L3 available to the 5600X is probably what's responsible for this unexpected performance win.

This is exactly why AMD needs some mid-range V-Cache CPUs, though I doubt that's what we will see. AMD is aiming for the premium and high-end markets to maximize profit and help change their image as the value-oriented underdog... But they need to be careful about being seen as the overpriced option given the close proximity in performance with Intel and Intel's massive locked-in user base.

u/riderer

When everyone wants their current and new server CPUs, new V-Cache CPU availability is going to be limited on desktop.

u/looncraz

I am quite concerned that could be the case.

And you forgot cryptominers as well.

Not to mention I'm not sure AMD will put V-Cache on mid-range CPUs; when we first learned about it, we only expected it on the 5900X and 5950X, and maybe the 5800X to a lesser degree.

But with how Intel's 12th gen is doing, AMD might think about Ryzen 5 too.

Comment deleted by user

Yeah, but there's some bad news from Raptoreum, which has been targeting high-cache CPUs for a while already.

https://www.reddit.com/r/Amd/comments/qrejqq/amd_ryzen_cpus_could_be_in_threat_from_raptoreum/?utm_source=share&utm_medium=ios_app&utm_name=iossmf


Crypto mining on CPUs is not worth it. Too much power (and heat) for too little reward.

u/asterics002

Can confirm that mining Monero is worth it on Zen 2 or Zen 3, but GPU mining is more profitable overall.

It would take some crypto that can't be properly mined on GPUs to change that. Something that can't be scaled across the thousands of cores on a GPU and requires the multitude of different instructions CPUs can handle. If it can't be mined on GPUs, the difficulty won't adjust to GPU-level hash rates, and it stays viable on CPUs.
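As a toy illustration of what a GPU-hostile workload looks like (purely hypothetical, not any real coin's proof-of-work), here's a sketch: each step's memory access and branch depend on the previous hash, which defeats the lock-step parallelism GPUs rely on and is roughly the property RandomX-style CPU coins aim for.

```python
import hashlib
import os

# Toy sketch of a CPU-friendly hash loop (illustrative only, NOT a real
# proof-of-work). Two properties hurt GPUs here: a large per-worker
# scratchpad, and branches/addresses derived from the data itself, which
# break the lock-step execution GPU warps depend on.
SCRATCHPAD_WORDS = 1 << 18  # 2 MiB of 8-byte words per worker (made-up size)

def cpu_friendly_hash(seed: bytes, rounds: int = 10_000) -> bytes:
    pad = bytearray(os.urandom(8 * SCRATCHPAD_WORDS))  # per-worker state
    h = hashlib.blake2b(seed).digest()
    for _ in range(rounds):
        idx = int.from_bytes(h[:8], "little") % SCRATCHPAD_WORDS
        word = bytes(pad[8 * idx : 8 * idx + 8])
        if word[0] & 1:  # data-dependent branch: divergent across workers
            h = hashlib.blake2b(h + word).digest()
        else:
            h = hashlib.sha3_256(word + h).digest()
        pad[8 * idx : 8 * idx + 8] = h[:8]  # mutate the scratchpad as we go
    return h

print(cpu_friendly_hash(b"block header").hex())
```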


My only thought on your concerns is that AMD is playing the long game, given how short-sightedness has bitten them in the ass in the past. In the short term, the lack of V-Cache on the lower-end SKUs (if it turns out that way) will suck for us consumers. However, as it trickles down the SKUs over the years, it won't be a thought in folks' minds. Eventually it'll simply be included in the CCD/CCX or whatever it's referred to as these days.

[deleted]

When did short-sightedness bite them in the past?

AMD's GPU designs were future-oriented, more so than Nvidia's.
AMD's CPU designs (including Bulldozer) were future-oriented.

The issue with AMD's past CPU designs is that they were SUPER ambitious, made strong assumptions that weren't accurate, and required more funding to do even semi-well than AMD could muster.

AMD rested on their laurels with the Phenom series and pissed away their war chest. So when Bulldozer had teething issues they didn't have the money to fix them. AMD nearly fell apart because they thought they had Intel's number and ended up having to spin off their entire fabrication capacity just to stay afloat.

AMD is not going to make that mistake again.

[deleted]

AMD wasn't resting on their laurels with Phenom.

Bulldozer was originally meant for a 2009 release date. https://news.softpedia.com/news/AMD-to-Release-Bulldozer-in-2009-83957.shtml

The issue is that the follow-up product, Bulldozer, was 2 years delayed (it wasn't feasible to manufacture on the node it was originally intended for), performed worse than expected, used more power than expected...

Same goes with Llano. It was also delayed.

There's a big difference between resting on your laurels and being incompetent.

"AMD nearly fell apart because they thought they had Intel's number"

Intel was ~5x larger in early 2006 by market cap (this was when AMD was at a peak and Intel was at a lull) and had a strong product line-up. By mid 2007 Intel was almost 20x larger.

NO ONE with any dash of business sense (or engineering expertise) thought AMD (which was maxed out on manufacturing capacity and not able to instantly get 10x larger in a profitable manner) had Intel's number just because Intel had one shoddy product (Prescott); heck, Intel even had a decent product (Yonah) in mass production at the same time, and even allowed media outlets to test Conroe early.

K10 was good, but still not close to Intel at the time. It performed about as well as Penryn (45nm Core 2), while Intel already had Nehalem out, which did better. The good ones (Phenom II) came out in 2009/10, while Nehalem launched in 2008. So they were about 2-3 years behind.

It wasn't a bad architecture; the price was good and there was plenty of headroom for overclocking, but it wasn't 1:1 with the competition.

The last time they were the absolute leader was during the Athlon 64 / late Pentium 4 era, and that was countered pretty quickly, tbh.


Are you sure it wasn't Intel's anti-consumer practices that were the teething issues?


It would've hurt them even without Intel's sabotage (albeit slightly less drastically); it's why there's still a lingering "AMD BAD" sentiment among select folks looking at DIY builds. Folks on AMD's board who had no knowledge of the engineering side issued cost-cutting directives that resulted in the high-power-draw, low-performing parts we saw for about a decade. Lisa Su was actually the director of the team responsible for the Cell processor used in the PS3. The reason I bring that up is that once she was at the helm of AMD, she helped guide the board into starting the complete overhaul of architectures, not only on the CPU side but also the GPU side of AMD. That engineering insight is crucial, and unfortunately after the original founder left and Hector Ruiz took over, the original engineering insight was kaput.

I know folks still reference the Intel sabotage, and yes, those were harmful practices; however, AMD earned plenty of warranted criticism, as their own past decisions were a major contributor to their sharp decline, which obviously isn't really an issue anymore.

[deleted]

Pretty much everywhere Hector Ruiz went (CEO of AMD, then GlobalFoundries) almost imploded by the end of his tenure (with all the future products in development hamstrung as well). Could be a coincidence, could be a pattern.

Intel's practices hurt AMD but AMD's big gambles (all of which went poorly) did more to harm them than anything else.


Attempting a modular design with Phenom I and FX whilst cutting corners led to some serious shortfalls in the long term, even though the short-term benefit was a major reduction in R&D requirements. This came directly after two very high-priced acquisitions which, although beneficial in the long run, contributed to a major decrease in long-term mindshare. Even without Intel's sabotage, AMD sabotaged themselves. It's why they had to pull a hail mary with a from-scratch design and start over, as their iterative approach on flawed designs set them back for more than a decade.

That's not even touching the ATI/RTG front, where they banked on compute being the mainstay for gaming whilst being the minority in market share, completely deviating from graphical workloads and harming their own competitiveness in the long term on an architecture they banked on for a decade as well.

AMD has obviously learned from their mistakes and is making carefully drawn-out solutions to their historical issues.

Tbh, the other side also used modular designs (Pentium D, Core 2 Quad) and even got flak for not having a "true" dual or quad core design.

On the other hand, AMD had some pretty good acquisitions, like NexGen, which basically did the K6 design, or ATI, which kept them afloat during the Bulldozer era.

u/puz23

I can't imagine stacking chips is terribly easy or cheap at this point. I'm not going to begrudge them keeping that tech high-end for the first generation.

u/looncraz

It's not as easy as not doing it, but it's still relatively easy. HBM can be stacked 8 dies high, but usually ships in 4-deep stacks, for example.

The copper-to-copper bond requires precise lapping and surface prep, not to mention alignment, but once the process is cracked it should be relatively low cost (though obviously still an added cost).

I would expect $30/die to be the high end of V-Cache cost... And I can certainly imagine a decent number of SRAM dies having issues and needing reduced capacity, which would make sense to couple with a quality six-core chiplet (this could also be the result of incomplete bonding, for example).

The biggest hurdle with 3D stacking is cooling. You need to get rid of all that heat. HBM had serious heat issues that turned the Vega cards into space heaters.

u/looncraz

HBM doesn't have a heat issue, Vega was hot because the core was super inefficient and AMD pushed it hard. The HBM would be at 45C and the core would be at 110C.

The heat issue is real, of course, but it doesn't make it into production products; TSVs and, now, C2C connections transfer heat very well.


Lmao. What caught you off guard? The fact you clearly haven’t read the article? The headline is so misleading it’s ridiculous.

u/looncraz

The 5600X shouldn't be remotely close to the 12600K in games. This article isn't the only one showing these types of results.

The 12600K is still the better CPU by a mile, but that's not showing up in games.

Comment deleted by user

u/looncraz

I ignore teething issues when I trust they will be fixed, or that at least a simple workaround will be available.


You think the enthusiast PC community cares about price? When it comes to the high end, price doesn't seem to matter, even for a 5% difference lol

u/looncraz

There's a difference between being price sensitive and caring about price.

Everyone cares about price to some degree... Unless you don't mind me charging you $100k for a 5950X.

Even the enthusiast PC community cares about price, believe it or not, and even more than your budget-friendly community.


Something like a 5300X would be nice as a successor to the 3300X, which is practically nonexistent on the market. Just because someone is looking for a lower-end CPU doesn't mean they don't have their old card available; they don't necessarily need an APU. Sure, the 5300G is fine, but it's OEM only.

Or how about a 5500 that runs 6 cores with 16 MB of cache?

It would also help use up dies with fewer than 6 functional cores or partially defective cache. Maybe even go further and make an 8-core chip that runs 2x4.

u/SmokingPuffin

The headline is pretty wrong, as the data says 5600X is 32% faster in Valorant than 12600K. So, it can be more than 10%, clearly.

However, looking at the data, it looks more like "5600X is faster in Valorant" than any general claim. The other tests all look like a wash.

Yeah, this is blatant misuse of statistics. The outlier makes the entire result.
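To see how one outlier manufactures an "up to 10%" headline, here's a quick sketch with made-up per-game FPS ratios (not the article's numbers), shaped the way the thread describes: one big Valorant win, everything else a wash.

```python
import statistics

# Hypothetical 5600X/12600K FPS ratios per game (illustrative only)
ratios = {
    "Valorant": 1.32,   # the outlier
    "CS:GO": 1.01,
    "Fortnite": 0.99,
    "Rainbow Six": 1.02,
    "Overwatch": 1.00,
}

print(f"arithmetic mean: {statistics.fmean(ratios.values()):.3f}")
# ~1.07 -> the "up to ~10% faster" headline
print(f"geometric mean:  {statistics.geometric_mean(ratios.values()):.3f}")
# ~1.06 -> still dragged up by the single outlier
print(f"median:          {statistics.median(ratios.values()):.3f}")
# ~1.01 -> "basically a tie", which matches the rest of the data
```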

u/karl_w_w

Little faster in CoD as well, but I think the 5 people who play CoD already knew that.

5 people play CoD? Lol, isn't Warzone the biggest FPS shooter right now? It really is. Yet no one seems to benchmark it, even though it has the biggest player base. It seems like some kind of CoD bias exists in the tech community. I don't blame them, tbh; I've hated CoD since the Black Ops 2 days. However, Warzone is genuinely the most impressive take on the BR genre. It's not perfect, I'm not saying that, but it's an amazing idea with great foundations. The game is just ruined by cheaters and bugs and such. The underlying game is incredible, however.

It is hard to test Warzone as it's not single-player.

Fake news. Yes, the error is present to a slightly higher degree in Warzone, but it's really not that big of a difference. Game to game, FPS is pretty close.


How about Fortnite? I have an R5 5600X.

u/Resouledxx

Clickbait title; it performs significantly worse in Valorant for some reason, but that's likely related to something else.

[deleted]

Probably some weird scheduling thing going on in the game engine with P vs E cores. I'd be really curious to see another Valorant benchmark in a few months, after some driver updates from Intel.

That being said, the 5600X does hold its own pretty well in most of these esports titles.

u/Jamessuperfun

It would be interesting to see these tests run again with affinity set to disable E-cores for the game.
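For anyone wanting to try this: on Windows you can do it per launch with `start /affinity <hexmask> game.exe`, or at runtime with something like the psutil sketch below. The core numbering is an assumption (on a 12600K the 6 SMT-enabled P-cores are commonly logical CPUs 0-11 and the 4 E-cores 12-15, but verify on your own machine), and the process name is hypothetical.

```python
import psutil

P_CORE_CPUS = list(range(12))  # assumed P-core logical CPUs on a 12600K
GAME_EXE = "VALORANT-Win64-Shipping.exe"  # hypothetical process name

# Pin every matching process to the P-cores so the game never lands on an
# E-core, taking the scheduler out of the equation for the benchmark run.
for proc in psutil.process_iter(["name"]):
    if proc.info["name"] == GAME_EXE:
        proc.cpu_affinity(P_CORE_CPUS)
        print(f"pinned PID {proc.pid} to CPUs {P_CORE_CPUS}")
```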

The scheduling issue sounds plausible but the 12900k doesn't seem to have the same problem.

u/MakeItGain

It says up to 10% when Valorant is the only one over 5%, and Valorant is only that high due to an obvious outlier. The 12600K shouldn't be that low even compared to the 12900K.

u/exccc

Yup... the 12600K's Valorant score seems like an outlier, and without it the 5600X is only a few % higher, so within margin of error.

u/Darkomax

It's rather Zen 3 that is the outlier; it scales insanely with (probably) cache and is like 2x as fast as Zen 2 in this game. Here the 12900K is also abnormally faster than the 12600K, which also hints that cache helps a lot.

u/exccc

You're most likely right. Now I wish they had tested the 12700K as well, to see how the FPS scales with cache size on Alder Lake and whether cache size is the reason for the FPS difference in Valorant.

u/Darkomax

I suppose it would land right in the middle. It's most likely cache, because all Zen 3 CPUs perform equally in Valorant, so it's not thread-count related.

u/ImSkripted

Feels like this shows a relook at Rule 9 might be worth considering. You want to post an article because it has interesting information but a shit title. Then again, I think the shit title is what caused this to get upvoted so much by people not reading the actual source.

u/kasimoto

Daily copium post, because there's no other way in life than being a die-hard fan of a single company.

u/runfayfun

Honestly, I can find many use cases for Intel's latest 12xxx chips where they'd be better than AMD's price equivalent. But I don't want anyone to "win." I want brutal head-to-head competition.

I mean, with staggered releases, you should absolutely hope that each new release 'wins', as it forces the competitor to try and one-up them, either on performance or at least value. So long as progress gets made, I'm happy.

[deleted]

I mean, I don't understand why people care that Intel made something that is slightly better than a year old chip.

I don't think many of you honestly grasp how petty and laughable posts like this come across to anybody not invested in this brand-war nonsense.

[deleted]

What a dumb take. And I can hardly think of a more pathetic use of time than lurking in the AMD subreddit until you can call out instances of fanboyism in an attempt to 'shame' the community.

u/kasimoto

I lurk the subreddit for any news that might be interesting or important to me. I have an AMD CPU and GPU, yet somehow I don't need to reassure myself with posts like these just because a newer competing product performs slightly better (as you would expect with new stuff).


Copium?

Intel is on a new platform, with a new architecture and a better memory channel design; it draws twice as much power, has to be operated unlocked and permanently boosted, and uses an OS whose scheduler is specifically designed to handle their cores and architecture.

And after all that, it’s within margin of error?

Lmfao. Yeah, there's some copium happening, but it sure as fuck isn't from AMD fans.

u/Montezumawazzap

"draws twice as much power"

nope, not in games.

This 100% reads like copium.

"Intel is on a new platform"

And they change motherboards like every two generations. Your point?

"better memory channel design"

For now, DDR5 is basically on par with DDR4 in gaming. And latency on Alder Lake is actually still as high as it was on Rocket Lake, which might hurt gaming... hoping Intel will fix that in Raptor Lake.

"draws twice as much power"

In gaming, it is just as efficient as Zen 3 (it might actually end up slightly ahead, if I remember the graphs right).

And even in MT tasks, only the 12900K is extremely inefficient, because they clocked it so high to compete with the 5950X. The 12700K and 12600K, once again, are just as efficient as Zen 3 (TechPowerUp).

" has be operated unlocked and permanently boosted"

Idk what you mean by this. You can overclock alder lake k skus?

"and using an OS who’s scheduler is specifically designed to specially handle their cores and architecture."

Funnily enough according to gamers nexus the 12900k actually gets more performance from windows 10 than windows 11. But don't forget, windows also had to work with AMD to get some of the early bugs out from working with the chiplets technology. Either way, it's not like windows is handicapping zen 3 performance so alder lake can edge it out.

"And after all that, it’s within margin of error?"

3dcenter.org meta review of (17?) alder lake has average gaming performance at 10 percent faster. Alder lake has the ST crown- that's the consensus by basically all reviewers. This article is specifically about esports, and valorant is the outlier. But even then, the 12900k still beats the zen3 stack because it has more cache than the 12600k.

On efficiency while gaming... well, it's a bit of a bad test, at least for me. I don't remember the last time I had just a game open, without a browser with 20+ tabs, Discord, Steam, some add-ons for the game... It adds up. Still probably not much of a difference, but as long as they're in the same ballpark, I'll just keep my alternating pattern. If one is outright better, I'll just go for that.

[deleted]

Gotta love Intel fans defending their fav company these days.


Posts like this make me feel secondhand embarrassment.

u/BobisaMiner

LMFAO dude go outside and let the rain cool you a bit.

u/IrrelevantLeprechaun

Yep, Intel fans are trying to scrounge up anything at all to try to feel better about Elder Lake... erm, sorry, Alder Lake.

I revel in the shintel fanboy tears as they get beaten by AMD yet again. As always, it's a good day to be loyally team red

please lord say this post is made ironically

u/IrrelevantLeprechaun

I'll never tell ;)

u/TT_207

Behold the world's most hollow victory!

That extra 50 FPS on top of 600 in esports really makes all the difference.

I mean a year old CPU being faster than a brand new one is significant, especially considering they're in a similar price bracket.

In one esports title? The rest of the esports titles are basically tied; the Valorant outlier just skewed the results.

Also, idk why so many people are saying "Alder Lake just barely beats a one-year-old CPU". Sure, but AMD hasn't released anything on desktop since, so what else are we supposed to compare it to?

Lastly, the 12600K and the 5600X are priced similarly, but the 12600K is a lot more expensive in terms of platform costs. Until Intel and motherboard vendors come out with lower-end motherboards, I would hesitate to call the 5600X a direct 12600K competitor.

You can also see this in the lackluster sales of Alder Lake, according to TechEpiphany on Twitter. Yes, the CPUs themselves are priced extremely competitively, but when you have to buy a 200-dollar Z690 board minimum vs. the still extremely good 130-150 dollar boards for Zen 3... then is it really a value product?

u/DudethatCooks

From a gaming standpoint probably not, but from a productivity standpoint, yeah, the extra $50-$70 is definitely worth it. Seriously, if you're having to upgrade to an all-new platform anyway, the 12600K is an easy pick over the 5600X even if the mobo is more expensive. You get similar gaming performance, but on average around 40% more productivity performance. Let's say you snag a B550 for $140 or a Z690 for $230. You're looking at roughly a 17% difference in total platform cost for 40% better productivity and better single-core performance (20ish%) for things like Photoshop or games like Age of Empires.

The 5600X is not a great value pick IMO right now, and even with the more expensive platform cost I think the 12600K and 12700K are the better buy over what AMD offers. The only way I'm recommending a Zen 3 chip to someone at this point is if they already have a Zen-compatible mobo; otherwise I think Intel is the better buy overall.
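Sanity-checking that 17% figure with the board prices quoted above and assumed (hypothetical) CPU street prices:

```python
# Back-of-envelope platform cost comparison; CPU prices are assumptions.
amd_cpu, amd_board = 300, 140      # 5600X + B550
intel_cpu, intel_board = 290, 230  # 12600K + Z690

amd_total = amd_cpu + amd_board        # 440
intel_total = intel_cpu + intel_board  # 520

extra = (intel_total - amd_total) / amd_total
print(f"platform cost difference: {extra:.1%}")  # ~18%, near the quoted 17%
```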

u/TT_207

You're probably right, to be fair. I have a 5600X because I built a new system, and the overall price increase of that over a 3600 compared to the general performance uplift made it an easy choice. If I built today, Alder Lake would likely be a strong contender.

But I do some productivity work, and the games I like are physics-based and heavily CPU-bound, so the CPU makes a massive difference for me. I'm still running a GTX 950 because the GPU doesn't matter enough in what I play. If I were doing esports titles, though? I think I would have been trying hard to justify even a 3600 (more likely a 10400F) and looking to spend the remainder on a GPU upgrade.

Don't forget that Z690 motherboards also come with more M.2 slots, so you can have even more SSDs.

u/seriousbangs

If it gets you to 144 FPS @ 4K, then no, it's not a hollow victory.

I used to play with friends a long, long time ago, and higher resolutions coupled with higher frame rates made a big difference. You might see diminishing returns past 4K (the human eye can only do so much, even close to a screen), but not until then.

u/demi9od

Resolution doesn't affect CPU-based frame rate limitations.

Until you throw in ray tracing; then it can absolutely hammer the CPU.

u/kasimoto

No one (at a competitive level) plays esports titles with RTX or at 4K.

[deleted]

Indeed. 4K @ 120 is glorious. But you still get jaggies at 4K, so more power for AA is always welcome.

A 3700X will get you to 144fps at 4K easily too.

u/seriousbangs

Right, but will your 1% lows be 144fps? How about your 0.1% lows?

In the vast majority of games, capped at 144Hz, the difference between the 3700X and 5600X in those metrics will be minimal.
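For reference, 1% and 0.1% lows come from captured frame times; one common definition averages the slowest 1% (or 0.1%) of frames and reports that as FPS. A sketch with synthetic frame times (real data would come from a capture tool like CapFrameX or PresentMon):

```python
import random

random.seed(42)
# Synthetic per-frame render times in milliseconds, ~145 fps on average
frametimes_ms = [random.gauss(6.9, 0.8) for _ in range(10_000)]

def low_fps(frametimes, fraction):
    """FPS of the slowest `fraction` of frames (0.01 -> the '1% low')."""
    worst = sorted(frametimes, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * fraction))
    return 1000.0 / (sum(worst[:n]) / n)

avg_fps = len(frametimes_ms) / (sum(frametimes_ms) / 1000.0)
print(f"average fps: {avg_fps:.0f}")
print(f"1% low:      {low_fps(frametimes_ms, 0.01):.0f} fps")
print(f"0.1% low:    {low_fps(frametimes_ms, 0.001):.0f} fps")
```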

u/jortego128

It is what it be's.


It has to do with the amount of L3 cache on the CPU. It becomes more of a bottleneck as the GPU pushes frames out faster.

I would be surprised if this didn't change with updates to the games that allow for better support.

If Valorant really does love extra cache this much, imagine how much performance could be gained with Zen 3D...

u/Pillokun

Disable the E-cores and 12th gen flies.

The advantage Zen 3 still has is its 32MB of L3$; that's a massive advantage in esports titles.

The 12900K is the closest with its 30MB of L3$, and it shows in gaming if you look at, say, HU's bar charts.

[deleted]

The website's subtitle, "It just works", perfectly summarizes this completely unprofessional article :)

If the 12400 turns out significantly cheaper with reasonable operating characteristics, either the 5600X's price will have to come down or the 12400 will be an interesting choice.

u/SuperMrBlob

Seems silly to do an 'eSports Titles' benchmark and not include League of Legends. Would've liked to see the performance.

Half the reason I went 5600X over 11400F etc. is for LoL.

u/Raster02

You can run that on your toaster, it’s a waste of video time.

u/SuperMrBlob

Not in chaotic teamfights if you want 144+ fps. You couldn't hit that with Zen 2.

u/MonstrousYi

True, I've seen a few Korean benchmarks, and there is no CPU atm that can keep the minimum FPS above 200 in big teamfights. That said, some results showed 12900K > 5950X > 5900X > 5800X > 12700K > 5600X > 12600K in League of Legends. I don't know how accurate that is, as it's rare to find a full gameplay benchmark with a minimum-FPS overlay shown on top. The 1% low FPS that 95% of reviewers show is useless in esports games when the FPS jumps between 100 and 500 all the time; it's the minimum FPS that matters here.

u/karl_w_w

League isn't an esports title, it's an online cancer ward.

u/SuperMrBlob

Ah, a DOTA player XD

u/BobisaMiner

Hey, DotA is not cancer.. it's AIDS.


If they used equal DDR4 memory on both platforms, I bet it would even things out between the two CPUs.

u/xthelord2

Expected, considering Intel is not giving these CPUs much cache.

There's a reason these CPUs look both good and bad in workstation loads:

- They force more wattage, which makes efficiency much worse; a stock 12900K consumes the amount of power an overclocked 5950X would, while being at best 10% faster.

Why they look bad here:

- They're starved for cache; 30MB of L3$ causes pipeline problems, hence why Intel sees such improvements from overclocking memory.

What gives Intel such an uplift:

- The scheduler does at the hardware level what Process Lasso would do in software (hence why DRM was such a pain to work with); it effectively increases the data hit rate, meaning the CPU wastes less time looking for the data chunks it needs.

What can we expect from AMD:

- AMD just shoves in more cache and calls it a day. Their future will probably involve the Windows 11 task scheduler, meaning we could see CPPC 3.0, which would let us dedicate CCXs to efficiency and CCXs to performance, so your weak CCX can work on background tasks while the good cores work on the main workload. That should be at least a 10% improvement, with much better frametime consistency.

Please actually read the article lmao

u/ZealousGhost

Faster: Yessss Esports: boooooo

Those framerates are too high to be relevant, and I'm not seeing the system config.

u/jortego128

I thought they had made an error in the title, but apparently the 12600K isn't nearly as potent as the 12700K and 12900K.

u/ODoyleRulesYourShit

No, it wasn't an error, it was intentionally trashy clickbait. They meant to say "30% faster in Valorant and slower or negligible everywhere else".

u/xpk20040228

I think it might be because Intel cut the cache heavily on the i5.

[deleted]

Didn't Intel release the 11 and 12 series AFTER AMD's 5000 series? They released 2 LOSING series of CPUs?

The fact that we're talking about a new architecture competing with, and barely winning or flat-out losing to, a 2-year-old product is sad and isn't being talked about enough.

Why should it be talked about? Just curious.

Because it shows how far Intel has fallen behind in recent years?

Sure, but most consumers don't care about that. They just want the best price/perf, not "which company has the better reputation".

That's like saying "Zen 3 finally beating a half-decade-old architecture in gaming" wasn't talked about enough when Zen 3 launched. It's technically true, and it shows how far behind AMD was before, but it's completely useless to the end user other than for making fun of a company's reputation.

If anything, lots of people are excited for the returning competition from Intel in the desktop space in terms of MT performance, in hopes of decreasing prices. Idk why more people aren't excited about that and are instead fanboying over AMD ("hahah, Alder Lake finally beat a 1-year-old processor") AND Intel ("hahah, Alder Lake has a 5 percent lead in gaming, it's so much better").

u/IrrelevantLeprechaun

Reputation is everything. The reason Intel is getting their shit pushed in is because their habit of scamming and bribing everyone finally became widely known and everyone abandoned them.

I'll go AMD every time even when they are worse purely because they are actually morally good.

Reputation is not everything; if anything, it's the last thing. Performance per dollar is.

Also, companies are not people. You say "Intel scammed and bribed people", and sure, that did happen, but most of the executives involved in that are gone now. Intel has reorganized their top-level management quite a bit and rehired a lot of old talent. If Intel changed the people who led it, why is Intel still morally bad?

And most companies did not leave Intel when that became "widely known". They left Intel when Intel stopped creating innovative products and had 18-core HEDT chips against 64-core AMD chips. But despite that, most companies didn't even leave Intel; even now, yes, AMD is gaining market share, but Intel still holds a good bit above 50 percent.

Let's say everyone decided to stick it to Intel and only buy AMD CPUs, like you are doing. And let's say AMD then becomes a near-monopoly, and Intel finally brings out a CPU that is competitive with them. I have no doubt that AMD would do the shady anti-consumer shit that Intel did back in the 2000s. Why? Because companies are not people, and you should only expect them to do what is good for their shareholders.

Lastly, maybe you have the money to care about which company does the moral thing, but many people don't. They want the best perf/dollar. Maybe someone needs a CPU for work, and saving that extra render time could save them money. Or maybe someone is building a budget rig, and going for Intel could bring them an extra 10% FPS. I don't blame them for going with the company that has the best perf/dollar, and neither should anyone else.

If you want to go AMD because it's morally better or something (even though it really isn't, because it's a company regardless), sure, go ahead. But that doesn't mean "reputation is everything", or even that you're in the majority of consumers.


Will it even be a competition when the next Ryzen releases though?

Took them 2 years and 2 gens to barely surpass them, only to possibly get crushed again in 6 months to a year. Doesn't seem very competitive to me.

Most people seem to think Raptor Lake will be at least somewhat competitive with Zen 4. Alder Lake has around 16 percent higher IPC than Zen 3 according to AnandTech, and Raptor Lake supposedly has new big cores, which could bring a 5-10 percent improvement on top of that. So let's go with the low end, say Raptor Lake brings only a 5 percent uplift, and call its IPC gain over Zen 3 roughly 21 percent...

Matching that would require another Zen 2 to Zen 3 style IPC uplift (23 percent, AnandTech), which is certainly possible. Either way, it seems like at least in ST it would be close. As for MT, Raptor Lake seems to double the small cores to an 8+16 configuration, so unless AMD goes for a 24-core Zen 4 (all leaks point to 16), I doubt we'd see any "crushing" here.
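For what it's worth, the compounding behind those numbers (the 16% is AnandTech's figure per the comment above; the Raptor Lake uplift is pure speculation):

```python
# Compounding the claimed IPC uplifts: Alder Lake over Zen 3, then a
# speculative Raptor Lake gain on top of that.
adl_over_zen3 = 1.16  # AnandTech's ~16% figure, per the comment

for rpl_uplift in (1.05, 1.10):  # assumed 5-10% Raptor Lake improvement
    total = adl_over_zen3 * rpl_uplift
    print(f"Raptor Lake +{rpl_uplift - 1:.0%} -> {total - 1:.1%} over Zen 3")
# +5%  -> 21.8% over Zen 3 (the "21 percent" low end above)
# +10% -> 27.6% over Zen 3
```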

u/Bobjohndud

I wanna see the Valorant test redone on more processors. A 30% difference with at most a 10% frequency difference on the same architecture is ridiculous.

It is very scary that Intel must clock very high, at 241W, in order to win. Esports performance is good on AMD, so the buying decision should be made there.

Comment deleted by user

u/IrrelevantLeprechaun

Literally any Alder Lake benchmark. The damn things suck down immense power even under mild load. It's a joke, just like Intel has always been.

u/Hailene2092

~240W for the 12900K is when everything is running. In gaming it maxes out around 125W.

I guess you actually haven't seen any efficiency benchmarks, or you would have seen that Alder Lake is actually very competitive in most workloads in terms of efficiency. For example, check out this review, which focuses on efficiency tests in gaming: https://www.igorslab.de/en/intel-core-i9-12900kf-core-i7-12700k-and-core-i5-12600k-review-gaming-in-really-fast-and-really-frugal-part-1/5/
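Efficiency comparisons like that review's boil down to frames per watt; a trivial sketch, with made-up numbers rather than the linked article's data:

```python
# Hypothetical gaming FPS and package power figures (illustrative only)
results = {
    "12900K": {"fps": 160, "watts": 125},
    "5950X":  {"fps": 155, "watts": 110},
    "5600X":  {"fps": 150, "watts": 75},
}

for cpu, r in results.items():
    print(f"{cpu}: {r['fps'] / r['watts']:.2f} fps/W")
```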

u/f0xpant5

So are we forgetting the multiple years in a row where AMD was also a joke? I really dislike the term fanboy, but if there ever was one, it's you.

u/IrrelevantLeprechaun

The only reason shintel ever beat AMD is because of the bribes that made software developers intentionally nerf AMD performance. If things were coded properly, AMD would have been in the lead every time in the past.

u/BobisaMiner

Have you tried reading benchmark articles beyond the titles?


What if you don't play esports titles?

What if you have a Mac?

Comment removed by moderator

I can’t tell if you’re baiting or being genuine

Absolutely genuine.

u/Razhad

amen

u/cuttino_mowgli

Ohh, it's Hardware Times, eyyy.

The 12600K is better than the 5600X, but then I'm looking at the price of LGA 1700 motherboards... holy fk...

u/dudeoftrek

Is the Battlefield series considered esports? I really like the new Battlefield, but I want to make sure I choose the best platform when I get to upgrade.

The 12600K is the better CPU, but the compatibility issues with old games make it worse than the 5600X imo.

I have seen this title before, back when it was 7700K vs 1800X.

The 12600K is the new gaming king despite your efforts to convince people otherwise. The Ryzen 7 5800X is the more comparable CPU in terms of performance and MT, and it's still beaten by the i5, as shown in essentially any review you can find on the internet.

Don't get me wrong, the Ryzen 5 5600X is a hell of a CPU and I've been recommending the Ryzen 5 3600 to friends for a long time, but Intel has made a statement with this new gen, especially with the i5-12600K. Let's try not to fall into fanboyism.

u/jortego128

Not my efforts dude, I just linked the article.
