
This is Destroying Our FPS
Vex
Views: 122253
Like: 6295
DirectX 12 could be destroying our FPS. This is for many reasons, but top of which is the API’s sheer amount of control that it grants to developers, while at the same time introducing SO many NEW and graphically demanding features like raytracing. Devs are having to learn faster than ever to adapt and optimize properly. What do you think?
==JOIN THE DISCORD!==
Valorant:
MuleSoft:
Nvidia Explains DX12 Ultimate:
Daniel Owen:
Gameranx:
Vulkan:
My Spotify:
==WINDOWS AT A HUGE DISCOUNT!==
Windows 10 Pro ($15):
Windows 11 Pro ($21):
Code: “vex” for 25% off!
0:00- Current State of New Games
0:30- The Common Denominator
0:57- DX11 vs DX12 in new games
2:03- How APIs affect games
3:15- DX12 is SUPPOSED to be better
3:53- DX12 Ultimate go brrrrr
5:40- Warning signs
6:43- Inexperience working with DX12
7:35- What about Vulkan?
9:47- We want better
24.08.2023
English is NOT my main language. Sorry.
I am like, "DX11 drives my games better, I run that." I understand that people wanna have movie-like graphics, but it's only really good in single-player games, when you have time to stop and smell the flowers and see the sunset.
All pro gamers turn down the graphics anyway to get an advantage over the players with high graphics. World of Tanks, Hell Let Loose, Armored Warfare, Tarkov, and so on. Players turn off grass, trees, and things like that. It gives you a better view = faster spots = more kills. This goes for almost all competitive shooter games. Same in MMOs and all PvP games.
Sun shafts and ray tracing are nice in airplane games, when looking at replays or taxiing on the ground: nice and realistic. BUT as soon as you are in an air fight, all that had better be turned off, for the win. It's only annoying getting blinded by the sun when you can't turn your head, because it is a game.
Games like RDR2 love to show you the graphics and try to make the plains look realistic and grand, and so on. But they know that with bushes, trees, sunsets and all, those hunting missions would be impossible, so they had to add a "power" so you can see the route the animals have traveled.
Again, the conflict between super graphics and playability. So no, I don't need DX12, at the moment.
Vulkan is a godsend. It is a massive help in emulation; games like Tears of the Kingdom and other recent Nintendo titles would run like ass without it.
We are basically in 2013 in terms of how console ports translate to PC.
There were similar issues back then – performance that was acceptable by the standards of the era on consoles, and an awful slideshow on PC.
About Vulkan vs DX12: try running the games on Linux 😉
About game performance: newer developers seem to be learning less and less about how the low-level stuff works. The problem is not the API; the problem is relying too much on engines (e.g. UE).
The DX12 vs Vulkan comparison with id is flawed:
You said it yourself: there are examples of DX12 having a benefit over DX11. That is, IF the devs optimize it properly. Same for Vulkan (and DX10, for that matter, compared to DX9). And id is known for making efficient engines with low-level optimization, whilst the other games you mention were built mainly for DX11 with DX12 "attached". If the engine and optimization were built for DX12 from scratch, the story would be different. So we are seeing a very slow transition, that's all.
Also, what you said about devs never having had to adapt as fast as today is wrong: look at the 90s and the development of 3D technology back then.
And in general, just a tip: you said it yourself that you don't really know a lot about APIs beyond Wikipedia, so I would prefer you formulate your opinions not as definitively as you do and more speculatively (or at least quote your sources, because they are out there, like dev interviews and such).
You also compare titles that are DX11-to-DX12 ports, meaning it's not a native DX12 build but a DX11 one (The Last of Us, The Witcher 3).
I would be happy if you reuploaded this video with the corrections you receive from the community and did not leave it as it is, since it is wrong, or at least not totally right, in many things, and is therefore spreading false info.
DX12 (and Vulkan) are technically superior to DX11 (and especially OpenGL) in terms of possible performance gains when implemented correctly.
The issue here is "correctly". The API works on a lower level than DX11, so it's really easy to screw up. 😐 It's the same as writing in assembly instead of C++ or a higher-level language. You may write incredibly fast code, but you are more likely to write complete garbage that doesn't even work right, and whatever you wrote is much harder for the compiler to optimize, for the same reason.
Additionally, GPU drivers are heavily optimized specifically for DX11. It will take time for them to catch up. I think MS should've given DX12 a different name, since it is not a direct successor of DX11.
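To illustrate how easy the "correctly" part is to get wrong, here is a toy Python model of the explicit resource-state transitions that D3D12/Vulkan-style APIs demand from the developer. All class and function names here are invented for the sketch; this is the general idea, not real API code.

```python
# Toy model of explicit resource-state tracking, loosely inspired by
# D3D12/Vulkan barriers. Every name here is made up for illustration.

class Resource:
    def __init__(self, name, state="common"):
        self.name = name
        self.state = state

class ValidationError(RuntimeError):
    pass

def transition(resource, before, after):
    """Explicit barrier: the *developer* must state the old and new state."""
    if resource.state != before:
        # A debug/validation layer flags this; on real hardware the same
        # mistake may "work" on one GPU and corrupt rendering on another.
        raise ValidationError(
            f"{resource.name}: expected state {before!r}, was {resource.state!r}"
        )
    resource.state = after

def sample_texture(resource):
    if resource.state != "shader_read":
        raise ValidationError(f"{resource.name} not readable by shaders")
    return f"sampled {resource.name}"

tex = Resource("shadow_map", state="render_target")

# Correct: transition before sampling (what a DX11-era driver did for you).
transition(tex, "render_target", "shader_read")
print(sample_texture(tex))  # sampled shadow_map

# The easy mistake: forgetting one barrier somewhere in a large codebase.
tex2 = Resource("gbuffer", state="render_target")
try:
    sample_texture(tex2)
except ValidationError as e:
    print("bug caught:", e)
```

In DX11 the driver tracked this bookkeeping implicitly on every call; in the lower-level APIs, one missing transition among thousands is the developer's bug.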
So get a 4090 and play Ultra in native 8K while getting 60-65 FPS, or in most games 90+ with DLSS.
Yes, you heard me: 8K at around 90-100 FPS. 4090.
The only issue really is that most devs now are lazy, stupid, evil parasites… just look at Diablo 4 or Total War, utter complete and total garbage, and if you think otherwise you are just wrong and need to take a serious deep look at what's wrong with you…
DX12 has sucked since it released.
It has never outdone DX11.
Microsoft are the kings when it comes to downgrades.
What a video
I'm a software engineer.
I've used both D3D12 and Vulkan. They're pretty much the same level of complexity (with different resource locking strategies in terms of API usage).
Vulkan is great BUT D3D12 will run on more Intel GPUs in Windows. And D3D12 has better feature support on older AMD GPUs as well (like Linked GPUs [aka GPUs in cross-fire mode]).
Assuming all APIs are correctly optimized for (which many game engines don't do), it comes down to hardware & driver combinations.
D3D9 and D3D10 can be faster than D3D11, for example, if the hardware was designed around those APIs. And on Linux, OpenGL can be faster than Vulkan on 4th-gen Intel chips.
All this stuff gets complex, and to talk about it correctly with a fine degree of accuracy you will sound like a stuttering, scatterbrained nerd.
That said, a lot of the issues people see in games are poor game-engine graphics systems and script kiddies just dumping things into the mix without understanding or caring much about how things work, leading to poor results.
Need ai to optimize code
nice buzzcut
DX 12 is shit 😂
DX12 is more complicated, and game studios usually do not want to put in the effort of learning the API or optimizing in it
this is why i play indie games only now, a lot are free and u get a better time if u want something short and fun
@2:46 Exactly, and in the old days, when your parents planned their parenthood, devs had to know the specifics of each existing graphics card to program their games. You could argue games were better optimized, but on the other hand it is pretty inefficient. 😊
@3:32 The idea is great that the successor should be somewhat superior. 🤭
The developers aren't doing enough; it's not the API
i had a lot more fps with dx12 in fortnite because i had an old multi-core cpu (i7 5960X)
In Red Dead Redemption, the 1% lows and the frame times were much better and smoother on Vulkan. Average FPS means nothing when the FPS keeps dropping.
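To put numbers on why averages mislead, here is a small Python sketch computing average FPS versus the "1% low" (here taken as the average FPS over the slowest 1% of frames) for two frame-time captures. The frame times are invented for illustration, not measured from the video.

```python
def fps_stats(frame_times_ms):
    """Return (average FPS, 1% low FPS) from per-frame times in milliseconds."""
    fps = sorted(1000.0 / t for t in frame_times_ms)  # ascending: worst first
    n = max(1, len(fps) // 100)                       # slowest 1% of frames
    avg = sum(fps) / len(fps)
    low = sum(fps[:n]) / n
    return round(avg, 1), round(low, 1)

# Stuttery capture: mostly 10 ms frames with an occasional 40 ms spike.
stutter = [10.0] * 99 + [40.0]
# Smooth capture: consistent 11 ms frames, no spikes.
smooth = [11.0] * 100

print(fps_stats(stutter))  # higher average, but the 1% low collapses to 25 FPS
print(fps_stats(smooth))   # slightly lower average, 1% low equals the average
```

The stuttery run "wins" on average FPS yet feels far worse, which is exactly the Vulkan-vs-DX12 frame-time argument.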
Ready or Not runs better with DX12, I get 10-20 more FPS compared to DX11
the same problem exists with Resident Evil, where there's a 30 FPS difference between DX11 and DX12
Although I think it depends on the game: in Rise of the Tomb Raider, which also has DX11 and DX12, I get FPS drops in DX11 but not in DX12. Just like how in RDR2, Vulkan sometimes crashes while DX12 doesn't.
The only devs I see optimizing their games are the ones chasing after the Switch market. I'm happy to see that, but man, it is sad when devs will only optimize their games when they absolutely need to
I want more games made with Glide!
Consider trying DXVK 😉
I'd rather just turn the settings down slightly to eliminate the lows and slap a 59.9 cap on it to get a more consistent framerate. Consistency is always going to look better than a series of peaks with highs and lows: 59.9 99.9% of the time is better than 110 half the time and 70 the other half. But then, I'm older, my eyes aren't quite as good as those of the teenagers playing games these days, and I remember when getting 20 FPS out of old games was something you were glad to be able to do. 59.9 at 1440p with reasonable settings is more than good enough for me. I grew up playing NES and SNES games, so… I don't need super duper ultra 4K individual-hair-strand ray-tracing quality. I can enjoy games just fine without it.
Idk about that fortnite one, I get 50fps max settings as long as I turn off Anti-ailiasing or however u spell it
Nah just look at Tears of the Kingdom that's like the most optimized game on this planet
The Vulkan API is better, but Xbox does not support it…
You forgot to mention that Vulkan has higher lows and averages. DX12U fluctuates all over the place. Check your footage
Welcome to the competency crisis. This won't go away until companies start hiring real devs instead of grabbing intern-level people and not caring about the end product.
Bro Im happy with what I have. RTX 3050 gives me enough happiness while I play my favorite games.
Linux users just use Vulkan or OpenGL to play these games, no problem.
Nanite and Lumen are not DX features but Unreal Engine 5 features. High-res textures are unrelated, as DX9 can render HQ textures…
I also wanted to add that this trend has existed since the beginning of computer graphics; just look at GTA and the bumps in hardware requirements between the 2D -> 3D -> next-gen eras
Good info and all but damn ! Slappin with that music at the end ! Damn bro lets go
In game development timeframes, 3 years isn't much. We should start seeing games built for DX12U from the ground up, by devs experienced in DX12U, in a couple of years. In three, we'll be in a good position to judge. It's still early days for it.
The API has nothing to do with the state of games. A lot of games run better in DX12, but the optimization is what the developers give us. Or rather, what they won't deliver.
2023 forced me to start playing at a locked 120 FPS, because with what came out this year, 144 Hz is useless
Vulkan and DX12 both originate in AMD's Mantle API.
The idea was to give developers more hardware control, but in exchange, they had to plan for every GPU out there.
What we're seeing now is devs that programmed for one single GPU.
The condescending way you explained APIs bruh . . .
the vulkan option in rdr2 is massively better because you get a lot less stutter, as you can see in your comparison video. look at the frame time: a lot more consistent and A LOT better feeling than dx12
AMD engineer here – from our perspective, we can't do any of the optimizations that are possible in the driver in DX11/OGL. Because DX12 is much lower level, it's entirely on the game developer's shoulders to write optimized graphics code. They are doing the job we did for the older APIs… unfortunately, just a lot worse. While we have hundreds of graphics experts working on these drivers, an average game studio may only have a handful of them. It's infeasible to expect the same level of optimization, frankly.
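For a concrete sense of the kind of work a DX11-era driver did behind the app's back, here is a toy Python sketch of one classic driver-side optimization: silently dropping redundant state changes. Under DX12, the engine has to do this filtering itself. The names and numbers are invented for illustration, not taken from any real driver.

```python
# Toy sketch of redundant state-change elimination, one of many
# optimizations a DX11-style driver performed implicitly.

class StateCache:
    def __init__(self):
        self.current = {}      # last state the "GPU" was put in
        self.submitted = []    # commands actually sent to the GPU

    def set_state(self, key, value):
        if self.current.get(key) == value:
            return             # redundant change: silently dropped
        self.current[key] = value
        self.submitted.append((key, value))

cache = StateCache()

# A naive engine re-binds the same pipeline and texture on every draw call.
for _ in range(1000):
    cache.set_state("pipeline", "opaque_pbr")
    cache.set_state("texture0", "albedo_atlas")

print(len(cache.submitted))  # 2, not 2000
```

Without this kind of caching, the naive loop would push 2000 state changes to the GPU; with it, only the 2 that matter get through. In DX12 that responsibility moves from the driver team to each game studio.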
i like vulkan because it has no boundaries to any particular version of windows. i prefer windows 7/8.1 over windows 10 and newer, and being able to enjoy rdr2 and several emulators with vulkan support on the operating systems i'd rather use is why i'm for that api over directx
the fact that fortnite is in the thumbnail, when the game used to run far better on the exact same settings, really makes it feel less like a me issue
I'll tell you why we have badly performing games: it's both developer incompetence and management carelessness. Some companies, like Blizzard, demanded that their remote developers come back to the office, which means that if you're working remotely from another town, you have to move house if you wish to continue working at Blizzard, and they offered no compensation in return. That's terrible treatment if you ask me. And there are also developers who make a lot of mistakes and use really bad practices consistently, even if you warn them and show them the correct practice. Some keep making mistakes you'll have to fix later before it goes live. Bad stuff happens at work, and we get worse software for more money. There are some really bad people, man. Unless those people change, we'll keep getting games that run terribly. UE5 games will suck in those terms, to be honest, unless Epic Games changes something; every single UE5 game runs really badly. The issue isn't limited to the DX12 Ultimate API.
If games run well on our hardware, we're keeping our hardware for longer, and that's what the gaming industry does not like at all.
Enabled DX12 in BF1 and BFV. Went from 200 FPS to 300 FPS. Actually had the surprise Pikachu face when I saw it.
Just to clarify, the Witcher 3 DX12 update (released Dec, 2022) does seem like it is particularly bad compared to the original DX11. But, what does this mean for other recent releases?
Lol also it’s “Application Programming Interface” 🙂
Feeling like LTT with these corrections 🫠😳