Amazing what happens when your primary competitor spends 18 months stepping on every rake they can find.
And then, having run out of rakes, they invested deeply in a rake factory so they could keep right on stepping on them.
This’ll probably be a lot more interesting a year from now, given that the product lines for the next ~9 months or so are out and uh, well…
18 months? Lol.
Intel has been stagnating since the 4th gen Core uarch in 2014 with little competition. They knew they were top dog and they sat on their hands until their hands went numb. There’s a reason “14nm++++++++++” was a running joke. This is a decade of monopolistic market behavior finally coming home to roost.
So you’re telling me that milking my 4770k until this year when I built a new rig with AMD was in fact a genius move?
Perfect market timing.
Basically yeah. Up until Zen 2, Intel didn’t do much innovating, and only around the Zen 2 era did those 4th/6th gen chips start to really struggle in modern workloads.
TBH the only thing that caused me grief with that old beast of an i7 (other than the fact it would have bottlenecked my new GPU) was playing Stellaris.
That’s a wee bit revisionist: Zen/Zen+/Zen 2 were not especially performant, and Intel still ran circles around them with Coffee Lake chips, though in fairness that was probably because Zen forced them to stuff more cores on them.
Zen 3 and newer, though, yeah, Intel has been firmly in 2nd place, or 1st place with asterisks.
But the last 18 months has them fucking up in such a way that if you told me that they were doing it on purpose, I wouldn’t really doubt it.
It’s not so much failing to execute well-conceived plans as it is shipping meltingly hot, sub-par-performing chips that turned out to self-immolate, combined with giving up on being their own fab, and THEN torching the relationship with TSMC before launching the first products TSMC is fabbing for them.
You could write the story as a malicious evil CEO wanting to destroy the company and it’d read much the same as what’s actually happening (not that I think Patty G is doing that, mind you) right now.
Early Zen wasn’t performant in lower-core-count loads, but it was extremely competitive in multi-core workloads, especially once performance per dollar was added into the equation. Even if you revisit heavy multi-core benchmarks, it fared fairly well. It’s just that at a consumer level it wasn’t up to snuff yet: in gaming, developers were still optimizing for an 8-thread console, and in laptops AMD’s presence was near nonexistent.
Not only that, but it was vastly more power efficient, and didn’t have the glaring security vulnerabilities that Intel had. All while being on a worse Global Foundries manufacturing process.
Unless you were a PC gamer who also didn’t care about $/perf, Zen1 was the better architecture.
It’s chronic underinvestment in engineering to “maximize shareholder value” for a decade before AMD launched Zen. Intel ended up 5 years behind on engineering, and has only clawed back 2 of those 5. The newest tile-based architecture only just matches the performance of AMD’s 3-year-old AM4 chips.
There’s also a little tidbit that I think gets overlooked: the Arrow Lake CPUs are on a better TSMC node than AMD’s Ryzen 9000 series. You wouldn’t know it from any of the charts.
Which puts into perspective any Intel fanbois saying this is their Zen 1 moment. They’re on a better node but still doing worse. There are no signs of life here, which was not the case for Zen 1.
In single-core workloads Intel still had the lead, but in multi-core (or just multitasking) Zen 1 was a beast. By Zen 2 there was hardly a reason to get Intel even for gaming, especially at normal setups (nobody is using a top-of-the-line GPU at 1080p). Even when you’re “just” playing a game you still have stuff running in the background, and those extra cores helped a lot.
Plus newer games are much more multi threaded than when zen first came out so those chips aged better as well.
Zen1 was slower in gaming and most 1-2 core workloads, but it was immediately far faster in server, faster in highly-threaded tasks, was hugely cheaper to manufacture, didn’t have the huge security flaws Intel chips had, and was way more power efficient.
They achieved that while still being on an inferior Global Foundries manufacturing process.
Zen1 was overall better than Coffee Lake. Just not to PC gamers, the loudest online PC hardware demographic.
Also, PC gamers are loud, but they make up a pretty small portion of the market. There was a time when Intel’s server division made more revenue than all of AMD. Even now, AMD as a whole is only a little above that. That’s not even considering the OEM market, which is far, far larger than PC gamers.
I got really annoyed with /r/buildapc. Everyone is a gamer and thinks they’re the center of the universe. They haven’t the faintest conception that someone would do a build for anything other than gaming and how that changes the choices.
Zen 2 was only a little slower for gaming, but it cooked the 8-core Intel 9900K in multi-core performance. You could stick a 16-core 3950X into a normal mobo. The chiplet design was a revolution.
Huuuuuh
All of my computers had been Intel for many, many years. About a year and a half ago I got my first AMD machine; I’d seen other people’s computers with AMD processors but had never owned one myself. Now I have one with an AMD Ryzen 5.
If you look at who is manufacturing silicon, the numbers look even worse for Intel. All of these competitors are using TSMC fabs. AMD, Apple, Qualcomm, etc.
TSMC is the real 500lb gorilla in the room.
It’s gonna suck so hard for the whole world when they get invaded :(
Pray they don’t, but I’m almost certain they will now that the US is appointing complete morons to every portion of the US government. The US won’t really be able to help until this rot gets cleaned out. China has four years before we can really help Taiwan again. (Or at least give them air superiority)
Biden just finalized the Arizona TSMC plant.
If that gets invaded, I think semiconductors are the least of our problems.
But the Arizona plant wouldn’t be allowed to manufacture the most cutting edge chips.
How/why is that blocked?
Would it still be if China invades Taiwan?
Taiwan is incentivized to keep the latest and greatest local, so they can hopefully get protection from the USA and Europe
Taiwan’s rule: foreign TSMC fabs have to be a gen behind. This would definitely change if China took over Taiwan, but who knows what China would do or allow at that point. They could shut the whole US fab down if they wanted. And even if someone (Taiwan, China, or TSMC) tried to re-tool the US fab in a few years, it would cost billions and take a lot of time.
You also have to keep in mind that the client who purchases cutting-edge nodes first is Apple. AMD currently only uses the newest node for Zen 5c, and Qualcomm uses it for the Snapdragon Elite/8 Gen 4. Mobile usually gets new nodes first for efficiency reasons (and better yields due to smaller dies). Other markets have historically been a node behind already (e.g., despite the 9800X3D being new, it’s only an N4 die with an N6 IO die).
And Intel. Intel has been using TSMC fabs for a while.
They used to get a 40% discount, too, but that stopped recently when Pat Gelsinger said people should stop buying from TSMC because there’s a good chance they’ll be invaded.
TSMC’s CEO didn’t like that, and said “ok, no more 40% discount for you. Effective immediately.” (TL;DR’d, obviously).
Even some of Intel’s Arrow Lake/Lunar Lake chips are being fabbed at TSMC.
I just built a computer for a friend and she decided to get an AMD when I told her it was about the same performance but used half as much electricity.
This is a person who knows nothing about computers. Intel is losing their “household name” status in a big way judging by that.
People like long battery life and computers that don’t cook your crotch.
What are the chances they were building a laptop?
Wait you don’t straddle your desktop tower?
Good point.
Not surprised. I switched to AMD CPU and GPU about a year ago. Could not be happier. Ryzen sips power and I run mine in Eco mode (since I’m on an air cooler). Performance is still fantastic.
Invested in a water cooler setup back when I had a Bulldozer chip, which was near essential. Now on a Ryzen, and getting it to exceed about 35 degrees is very difficult. It’s been very good for the long-term stability of my desktop: all the niggling hard-disk issues seem to just go away when they’re not subjected to such thermal cycling any more.
Fantastic chips.
it’s the year of the linux des-
oh wait, wrong thread
The joke: every year is the year of the Linux desktop
head tap
because Linux is rather awesome.
get away from me you filthy crocodile
I feel like they are dropping the ball in the GPU space though, both on desktop and in servers.
They’re not really leveraging it. They killed the Steam Deck line of “small core count, GPU-heavy” APUs, which is why Valve hasn’t updated it and competitors seem so power hungry. They all but killed server APUs, making them mega expensive and HPC-only. They’re finally coming out with an M Pro-like consumer APU, but it took until 2025, and pricing will probably be a joke just like their Radeon Pro GPUs…
And I don’t even wanna get into the AI space. They get like 99% there and then go “nah, we don’t really care about this market, let Nvidia have their monopoly and screw everyone over.” It makes me want to pull my hair out.
The fact they pulled ROCm support for older cards boggles the mind.
It wouldn’t be possible to dethrone nvidia in AI anyway, at least not alone.
My thought process:
Desktop: I need cost for performance…
Server: fps for the Jellyfin, transcodes for the transcode god
I’d drop in an old Nvidia GPU for transcoding, anyway. There’s lots of old cards that support nvenc. Don’t neglect the Quadro cards, either. Lots of them are cheap on ebay and will transcode just fine without even needing their own cooling fan.
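If you go that route, here’s a minimal sketch of the kind of NVENC transcode you’d run on one of those old cards. This is an assumption-laden example, not any particular setup: the filenames, bitrate, and preset are placeholders, and it needs an NVENC-capable card plus an ffmpeg build with nvenc support.

```python
import shutil
import subprocess
from pathlib import Path

# Hypothetical NVENC transcode; filenames and bitrate are placeholders.
cmd = [
    "ffmpeg",
    "-hwaccel", "cuda",        # decode on the GPU where possible
    "-i", "input.mkv",
    "-c:v", "h264_nvenc",      # NVIDIA hardware H.264 encoder
    "-preset", "p5",           # middle-of-the-road quality/speed tradeoff
    "-b:v", "4M",
    "-c:a", "copy",            # pass audio through untouched
    "output.mp4",
]

# Only attempt it when ffmpeg and the input file actually exist.
if shutil.which("ffmpeg") and Path("input.mkv").exists():
    subprocess.run(cmd, check=True)
```

Audio is copied rather than re-encoded since the GPU only helps with video anyway.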
Transcodes worked vastly better with QuickSync last time I bought a machine.
Does the AMD transcoder work as well these days?
I don’t think so. The Jellyfin documentation still says it sucks lol.
Damnit.
I wonder if that’s because the transcoding hardware is crap or they just aren’t concentrating on it in the software.
I’ve been using my AMD 5600G’s iGPU to do hardware decode and encode in Jellyfin and it works pretty well. Only downside is that it doesn’t support AV1, but it works well with H264 and H265.
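For reference, this is roughly the shape of the VAAPI transcode that setup runs under the hood. Everything here is an assumption for illustration: `/dev/dri/renderD128` is the usual render node but varies per system, and the filenames/bitrate are placeholders.

```python
import shutil
import subprocess
from pathlib import Path

RENDER_NODE = "/dev/dri/renderD128"  # typical, but check your system

# Hypothetical VAAPI transcode on an AMD iGPU (e.g. the 5600G's Vega).
cmd = [
    "ffmpeg",
    "-init_hw_device", f"vaapi=va:{RENDER_NODE}",
    "-hwaccel", "vaapi",
    "-hwaccel_output_format", "vaapi",  # keep decoded frames on the GPU
    "-i", "input.mkv",
    "-c:v", "h264_vaapi",               # hardware H.264 encode
    "-b:v", "4M",
    "-c:a", "copy",
    "output.mp4",
]

# Only attempt it when ffmpeg, the render node, and the input all exist.
if (shutil.which("ffmpeg")
        and Path(RENDER_NODE).exists()
        and Path("input.mkv").exists()):
    subprocess.run(cmd, check=True)
```

Swapping `h264_vaapi` for `hevc_vaapi` gives H.265, which the 5600G also handles; AV1 is where it falls over, as noted above.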
Sad but true. Intel’s performance was poor over the last year. I shudder thinking about my Mac with an Intel CPU; there must be burn victims from that thing. Still, less competition is never a good thing.
Why do desktops lag behind servers, which are at 50%?
It’s taken this long for Intel to lose gamer trust.
Intel also have lower power consumption iirc, which is useful for laptops etc.
AMD have the best server chips: https://www.cpubenchmark.net/high_end_cpus.html
You have to remember that most people aren’t “choosing a CPU” as much as buying a PC. If the majority of pre-built retail PCs have Intel, then the majority of purchases will be Intel.
I don’t think Intel is more efficient, if their desktops and this one link are anything to go by:
https://www.cpu-monkey.com/en/cpu_benchmark-cpu_performance_per_watt
But I’m not up to date on laptop stuff at all so might be wrong
That’s under load. At idle (which is where your average home PC spends most of its time) I think Intel still has the edge.
It’s certainly a consideration for a battery-powered device. Watching a video, reading emails, or staring at a spreadsheet will likely get better battery life on an Intel machine than on a similar-spec AMD device.
We’ve reached a point where most everyday computing tasks can be handled by a cheapo N100 mini PC.
Actually, AMD’s mobile parts are pretty good at idle power consumption, and so are their desktop APUs. Their normal CPUs, which use the chiplet design, are rather poor at idle. Intel isn’t really any better than the monolithic parts at idle, and Intel CPUs have horrible power consumption under load. Their newest CPUs are more efficient than the 13th and 14th gen, but still don’t match AMD, let alone exceed them.
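If you want to sanity-check idle-power claims like this yourself, the Linux kernel exposes cumulative RAPL energy counters in sysfs on both Intel and recent AMD. A rough sketch, with the caveats that the zone path is an assumption (it varies by machine), the counter usually needs root to read, and it wraps eventually (ignored here):

```python
import os
import time
from pathlib import Path

# Typical package-level RAPL zone; check /sys/class/powercap on your box.
RAPL = Path("/sys/class/powercap/intel-rapl:0/energy_uj")

def avg_watts(e1_uj: int, e2_uj: int, interval_s: float) -> float:
    """Turn two cumulative microjoule readings into average watts."""
    return (e2_uj - e1_uj) / 1_000_000 / interval_s

# Only sample when the counter exists and we're allowed to read it.
if RAPL.exists() and os.access(RAPL, os.R_OK):
    e1 = int(RAPL.read_text())
    time.sleep(5)
    e2 = int(RAPL.read_text())
    print(f"avg package power over 5 s: {avg_watts(e1, e2, 5):.1f} W")
```

Leave the machine alone during the sample window if you want an actual idle number rather than a “me wiggling the mouse” number.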
I would have to ask for a source on that. I can’t really find anything comparing many cpus.
However this video compares top end models on otherwise pretty much identical laptops and amd definitely wins in YouTube playback on battery https://youtu.be/X_I8kPlHJ3M?si=8a4Tkmd556hQh7BZ
But if you’ve got anything to better compare I’m all ears
It may well be the case that they’re similar or even swapped now. I can see that the N100 is pretty low power compared to the newest low end AMD chips, but then the AMD chips are better in terms of what they can do.
This one reckons they’re pretty similar.
https://www.reddit.com/r/Amd/comments/10evt0z/ryzen_vs_intels_idle_power_consumption_whole/
This one reckons Intel are better.
https://news.ycombinator.com/item?id=32809852
I doubt there’s much in it either way. Even if AMD are ahead now, laptops don’t get replaced right away, normies replace shit when it fails or is too slow to run whatever shit Google shoehorned into Chrome this year, and the most popular laptops are probably the ones with the lowest sticker price.
Ah yeah, I should have specified I was looking at the laptop side of things, as the person I originally replied to mentioned that power usage matters more there (which is understandable). There are only a handful of laptop chips I can recognize in that first link, and all of them are AMD, but I don’t know the naming scheme of modern Intel laptop parts anymore.
Servers need very high uptime. Also, when something is documented to work a certain way, it had damn well better work as stated.
Intel had a long reputation of solid engineering. Even when they were losing at both performance and performance per watt, they could still fall back on being steady. The 13th/14th gen degradation problems have shot that argument to hell, and server customers are jumping ship.
Note: I’m not from the US, so in a lot of cases going to a manufacturer’s website and purchasing computers is not an option. Resellers are still the ones in charge here.
I work in IT, and when it was time for a hardware refresh, the reseller we’re in contact with said they don’t stock AMD as there’s no demand. Which in a way creates a chicken-and-egg problem. I asked them if it would be possible to get laptops with AMD chips, and the reseller said yes, but we’d have to wait. So we bought 4 Intel machines for the meantime and placed a custom order for ones with AMD chips. The ThinkPads we’re buying are significantly cheaper if they come with AMD chips, so I was honestly a bit baffled there was no demand. Regardless, we’re happy with the purchase, and so are the users, who say the computers run noticeably cooler than their Intel 8th gen predecessors. It just goes to show that, for the most part, enterprise makes up a huge chunk of the desktop market share nowadays (as younger generations tend to simply not use a computer and do everything on their phone), and that market just isn’t ready for the transition yet. They’ve been going strong with Intel for 30-40 years. Weaning off that tit is gonna take some time.
Thank you, that was enlightening
Not entirely sure, but afaik their EPYC CPUs are good.
Businesses make decisions based on money. People make decisions based on vibes.
Except businesses are run by people.
EDIT: Sorry, the article isn’t about GPUs; rather, it’s about the CPU market, where AMD is projected to overtake Intel in the far future.
When the AI Crash wipes out nVidia’s demand in the server market they’re not gonna have any loyal customers in the desktop market right as the tech boom comes to places formerly reliant on only smartphones. Then they’re gonna be like surprised_pikachu.jpg
Still not ready to trust AMD/ATI again. I used them exclusively right up until they bought ATI and then decided fuck open source and the drivers for Linux tanked.
I hear about all the issues folks have had with Intel/NVIDIA, but I have yet to experience any of them. From where I’m sitting they’re still working great. And their open source hasn’t been perfect, but it’s consistent, instead of going from golden to “fuck you, Linux folks” overnight.