
Your GPU is Useless Now
It's been a growing trend that more and more games are using your CPU in a PRETTY dramatic way. You would think that since graphics are getting better, your GRAPHICS CARD would have to work proportionally harder, but that doesn't seem to be the case. General-purpose computing on the CPU has become the go-to. Even a top-end CPU from just 3 years ago can't push 60 FPS in some games.
On top of that, most people don't want to upgrade their CPU frequently because it is a lot more expensive than just buying a new GPU; you might have to upgrade your motherboard and RAM at the same time. Is this only going to get worse?
VV RX 6800XT I tested as well (affiliate links) VV
PowerColor Red Dragon RX 6800 XT 16GB:
RX 6800 XT (in general):
HUB: –
Nvidia:
BLAndrew575:
Daniel Owen:
Digital Foundry:
Unreal Sensei:
0:00 - WTH is going on??
1:34 - The Irony of CPU Utilization
2:47 - New Features are More "CPU Demanding"
4:37 - CPUs are for General-purpose Computing
5:44 - What Inefficient CPU Usage ACTUALLY Means
8:20 - We'll see if things get better
10:04 - CPUs are becoming MORE important than GPUs
11:04 - Silver-lining
Nothing will improve. They will continue to make more and more unoptimized shitty games for the price of $80. This is a systemic crisis in the gaming industry. There has been nothing really new in AAA gaming in terms of gameplay for a decade at least, and an increasingly photorealistic picture no longer creates any "wow effect" like it did in the 2000s. If their new shiny game, which looks and feels the same as thousands of others, requires me to buy new hardware every year, I will not buy it. I just refuse to play it, that's it.
I agree optimization is the issue, but I would strongly argue the CPU is less of a factor than you suggest. My supporting evidence is modern consoles, excluding the Switch: the PlayStation and Xbox systems have weak CPUs and still manage playable frame rates. Furthermore, your onscreen usage numbers weren't showing peak CPU usage; your GPU was struggling. Drop your resolution and re-evaluate. But once again, I agree optimization is an issue.
The problem is that the jump from 1080p to 1440p is basically double the pixels, and 4K is 4 times the pixels. It took a lot of years to go from 720p to 900p to 1080p, and those jumps aren't even 50% more pixels per step. Add to that RT, which is horrible for performance, and gaming has become extremely expensive.
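For reference, the raw pixel counts behind that comparison (standard resolutions, not figures quoted in the video):

```latex
\begin{aligned}
1920 \times 1080 &= 2{,}073{,}600 \ \text{px (1080p)} \\
2560 \times 1440 &= 3{,}686{,}400 \ \text{px} \approx 1.78 \times \text{1080p} \\
3840 \times 2160 &= 8{,}294{,}400 \ \text{px} = 4.00 \times \text{1080p}
\end{aligned}
```

So 1440p is about 78% more pixels than 1080p, and 4K is exactly four times as many.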
I wouldn't say that CPUs have become more important than GPUs. Look at Starfield: it destroys every single GPU. Even the RTX 4090 and RX 7900 XTX struggle without FSR at 4K. Poor GPU optimization is still very much a thing.
I think that if anything's a portent of what's to come, just look at the recent reaction to BG3 by all the "big" studios.
People with big bucks will just throw money at less than excellent creators and force them into making something they hate, because it will make money. When the creators don't want to put up with it anymore, they'll just be replaced.
This scum that has infiltrated the gaming industry is a cancer that will devour the spirits of creators and destroy the art of game creation.
It's a regression.
Every step outside the graphics card increases latency and inefficiency.
GPU at ~80% and CPU at 24% in Remnant at 1440p – how is there a CPU bottleneck? Neither is being used that much.
Same goes for Valheim, I guess.
I think all your videos are very good, but this one….this one's exceptional. Great analysis, Vex.
The problem is greed: the technology is not being used to make games better, it's being used to make things cheaper, easier, faster, and even worse, as long as they can get away with it.
The thing about a free market is that if you keep buying things you don't like, you are going to get screwed harder and harder.
So realistically, the only way the vast majority of AAA games will get better and not exponentially worse is if we stop buying them. Sadly, I can't see that sort of movement happening any time soon, not until it gets really, really bad.
Looking at the graphic, the CPU and GPU aren't maxed out.
AI frame generation will be the future: less powerful cards with astronomical frame-rate increases.
I see similar problems with my 9900K and 3070; I've thought about changing to a 13500.
Games on PC were never optimized, and that's the biggest failing of most games.
5:02 Now DirectStorage comes into play, right?
My brotha, that was 70
Yeah, it has become a cost-cutting solution. Why spend the money to optimize a game when we gamers keep buying buggy and/or low-performance games? E.g. with ray tracing, why would you have your level designers set up direct lighting composition anymore?
Look at the CPU usage in those games: 25-40%.
In my opinion the target should be 100/100 utilisation unless you limit FPS. I don't understand why the CPU shouldn't be fully utilised.
If you limit frames with Radeon Chill or the Nvidia equivalent to about 15% below your highest FPS, the power, heat, usage, etc. all go down drastically. You don't need more in PvE games like The Witcher 3.
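A rough sketch of the idea behind that kind of frame cap (not the actual Radeon Chill or Nvidia implementation, just an illustration of the principle): finish the frame's work early, then sleep out the rest of the frame budget so the CPU and GPU sit idle for part of every frame.

```cpp
#include <chrono>
#include <thread>

int main() {
    using clock = std::chrono::steady_clock;

    const double target_fps = 120.0;  // e.g. ~15% below an uncapped ~140 FPS
    const std::chrono::duration<double> frame_budget(1.0 / target_fps);

    for (int frame = 0; frame < 600; ++frame) {  // stand-in for the game loop
        const auto start = clock::now();

        // update_game(); render_frame();  // real work would happen here

        const auto elapsed = clock::now() - start;
        if (elapsed < frame_budget) {
            // The time spent sleeping here is where the power and heat savings come from.
            std::this_thread::sleep_for(frame_budget - elapsed);
        }
    }
}
```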
I think it's executives rushing releases that's ruining modern gaming. Like they always do. Those goddamn cum stains.
The music is so faint in the background. Is the song in the `New Features are More "CPU Demanding"` chapter from Risk of Rain 2?
one word: consoles.
It's just a game, and a shitty one at that
Although I only got it like 5 years ago, I feel like my 2700X became obsolete overnight. It can't keep up with my 3060 Ti.
Obviously an up-and-coming report at 98K views. Is that x86 upside down? mb
This phenomenon isn't new. It's because CPUs weren't capable enough that dedicated graphics hardware was developed back in the 1960s and '70s. It was a problem for early 8-bit home computers: people would buy a new machine to play new games. On PC, games like Doom, Quake and Half-Life were prime examples. There are some things that ONLY a general-purpose CPU can do, and some tasks simply can't be easily broken down and spread over several cores. GPUs are a bit like the external floating-point co-processors of old: they are designed to do only a limited set of fixed tasks, but do them extremely fast. Actual game logic isn't one of them, as every game is different. If you made a GPU that could do it… you'd end up with a general-purpose CPU with a GPU attached.
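A toy sketch of that distinction (hypothetical names, purely illustrative, not from the video): the simulation loop below is inherently serial because each step depends on the previous state, while the per-pixel work is independent and is exactly the kind of fixed, repetitive job that maps onto thousands of GPU cores.

```cpp
#include <cstddef>
#include <cstdio>
#include <vector>

struct World {
    double position = 0.0;
    double velocity = 1.0;
};

// Serial by nature: step N needs the state that step N-1 just produced,
// so these iterations cannot simply be handed to separate cores.
void simulate(World& w, int steps, double dt) {
    for (int i = 0; i < steps; ++i) {
        w.velocity += -0.1 * w.position * dt;  // force depends on the current position...
        w.position += w.velocity * dt;         // ...which feeds straight into the next step
    }
}

// Trivially parallel: every pixel is independent of every other pixel.
void shade(std::vector<float>& pixels) {
    for (std::size_t i = 0; i < pixels.size(); ++i)
        pixels[i] = static_cast<float>(i % 256) / 255.0f;
}

int main() {
    World w;
    simulate(w, 1000, 0.016);

    std::vector<float> frame(1920 * 1080);
    shade(frame);

    std::printf("position after 1000 steps: %.3f\n", w.position);
    return 0;
}
```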
This shows how little the hardware actually matters today. If the software (games, in this case) isn't using the hardware resources properly, it doesn't matter how powerful your PC is. The output will be only as good as the software allows it to be.
A good example: somehow I can run RDR2 on my old Ryzen 1600 + 1050 Ti on low to medium at perfectly playable frame rates. Sure, it won't look as good as it does on Ultra and it won't run at 60 FPS, but it still looks great and runs at a consistent 30-40+. Now cut to today's newest titles: I can either upgrade my system to what was 4K Ultra level some years ago, or forget about PC gaming and get a console instead. Thanks, but no thanks; there are plenty of older games that still look great and run great on low-end hardware.
All these new development shortcuts like UE Lumen and Nanite are understandable and justifiable for indie devs with little to no resources, who often don't even have a friend to help them out, let alone a dev team. But for studios, especially AAA ones, using such development shortcuts is just shameful. They're simply cutting costs and saving development time at the price of a good chunk of the PC gaming market. Hopefully there will be a big enough community backlash that decreasing sales numbers will force them to get back to proper, quality development.
Odd thing with Starfield: my CPU rarely goes above 60%. I have a 10750H with an RTX 2070 Max-Q.
The Tech Deals channel really nailed it when he said, "More cores! You'll need more cores sooner than you think!" Six cores in a new PC is not enough for tomorrow's games.
This has been my number one gripe with PC games lately!
I think nowadays they're making games bad like this so people will talk about them, and that makes more sales in the end, so it's a win-win for devs: less work, more cash.
UE5 needs more gatekeeping.
What I like most about this channel is how he takes the normal person and the normal person's budget into account in his videos and reviews. Not everyone can afford more expensive parts or can put €1000/$1000 into a single part.
You have a cute cat! 🙂
This is why we stick to consoles. Everyone gets the same FPS and the same issues, so it's balanced. On PC, if you don't have a 1440p 120Hz setup you're at a disadvantage in competitive games.
And don't come at me with XIM issues. A XIM can be used on PC with a mouse too to negate recoil. I would rather play against XIM users than hackers on PC.
Game engine dev here: it's hard to utilize the full potential of the computer simply because every PC is very different, and you need to deliver somewhat comparable performance on all architectures.
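A minimal sketch of one small piece of that problem (assumed for illustration, not something the commenter described): at startup an engine has to discover what it's running on and size its worker pool accordingly, because a 4-core laptop and a 24-core desktop need very different job setups.

```cpp
#include <algorithm>
#include <cstdio>
#include <thread>

int main() {
    // hardware_concurrency() may return 0 when the value is unknown, so clamp it.
    const unsigned hw = std::thread::hardware_concurrency();
    const unsigned threads = std::max(1u, hw);

    // Leave one core for the main/render thread; the rest become job workers.
    const unsigned workers = std::max(1u, threads - 1);

    std::printf("detected %u hardware threads, spawning %u job workers\n", threads, workers);

    // A real engine would size its job system from this number and then still
    // tune per architecture: cache sizes, hybrid (P/E) cores, memory bandwidth, etc.
    return 0;
}
```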
Man, do I have some BAD news for ya. Nope. Look at who is doing the development now, and what is their bottom line? Think about the Linus fiasco: they had to pump out like 20 pieces of content a day.
Video game companies are having to do the same thing.
So what are they going to do? Pass the buck onto the hardware to deal with it. Before long, our CPUs are going to be 32 cores on average, with 64 being the pro-spec version of gaming rigs. Why?
Because developers are just going to push more and more for the CPU to do the work, as the API is EASIER to program for.
This has always been the case with CPUs in video games, where objects, view distance and NPCs can fuck up your frames, just not as severely.
But games nowadays promote big worlds, detailed buildings, crowded hubs and seamless travel with no loading screens, and now it becomes heavy duty for the CPU.
Starfield is the ultimate example of a big world and big cities, and that ambition comes with a cost.
I have a sense that APUs like the PS5 and Series X hardware will survive much longer than PCs; they generally do a better job at world loading and texture streaming.
This has been a thing for a decade or so; it's because consoles are the industry leader and they are GPU-bottlenecked. Games that are released primarily on PC run well, but as we can see, games made for the PS5 are poorly optimized in insane ways. It's going to be interesting in the coming years, with machine learning taking over GPUs, to see if anything actually changes.
I have a 5800X and a 4080, and I thought I'd finally be able to play Cyberpunk with ray tracing above 60 FPS, but guess what: my CPU couldn't handle it.
Spaghetti and forgetti! Deadlines met.
It should be noted that this is mostly an Nvidia issue. AMD video cards don't rely on the CPU anywhere near as much for their performance.
It's definitely CPUs; this is happening to well-optimized games too, especially open worlds.
It looks like CPUs can't process the amount of data these new games have.
CPUs are ripping us off more than GPUs 😄
TL;DR: yes, game companies don't optimize their games very well. This is why Fortnite and the newer Resident Evil games run very well: the developers cared and actually took the time to optimize their games properly without relying on DLSS or FSR.
3080 and 5800X3D here too. I've felt something was wrong with my rig, but this is definitely helping to explain some things.
The main reason games need newer CPUs is to increase system memory bandwidth. You can see big gains going from AMD's 5000 series to the 7000 series, especially if you add V-Cache.
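As a rough back-of-the-envelope illustration of that bandwidth jump (typical dual-channel configurations, not numbers from the video):

```latex
\begin{aligned}
\text{DDR4-3200, 2 channels:}\quad & 3200\ \text{MT/s} \times 8\ \text{B} \times 2 = 51.2\ \text{GB/s} \\
\text{DDR5-6000, 2 channels:}\quad & 6000\ \text{MT/s} \times 8\ \text{B} \times 2 = 96.0\ \text{GB/s}
\end{aligned}
```

That is roughly 1.9x the theoretical memory bandwidth on the newer platform, before any effect from the extra 3D V-Cache, which mainly reduces how often the CPU has to go out to memory at all.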
The CPU is not the problem since it's only used at 30% max; the optimisation is shit, that's all.
No, games will not run better. You know it doesn't work like that.
Good programmers retire, code monkeys just can't optimize things, and managers are okay with that.