https://old.reddit.com/r/Games/comments/1bjfj3q/dragons_dogma_2_review_thread/kvqq87k/
The reviewer from Eurogamer has a 7700X (which has exactly 8 cores, like the PS5) and had a pretty good performance experience, which kinda matches what I guessed.
It’s probably some glitch from the performance settings tuned for PS5/Xbox Series X carrying over into the PC port. Chances are that on a modern CPU, if we limit the max cores available to the game, performance would actually be better than letting it use all of them.
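If anyone wants to test that theory, here’s a rough sketch of limiting a process to 8 logical cores before launching the game (Linux-only API; this is purely an illustration, not anything Capcom ships — on Windows you’d use `start /affinity` instead):

```python
import os

# Sketch: pin this process to at most 8 logical cores (matching the PS5's
# 8-core CPU). A game launched from here would inherit the affinity mask.
# os.sched_setaffinity is Linux-only.
cores = sorted(os.sched_getaffinity(0))[:8]
os.sched_setaffinity(0, set(cores))

# subprocess.Popen(["./game_binary"])  # hypothetical game launch, inherits mask
print(len(os.sched_getaffinity(0)))  # <= 8
```

Same idea as `taskset -c 0-7 ./game` from a shell.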
Plus whatever anti-cheat/anti-modding stuff they put on the PC version.
source: https://gpuopen.com/fidelityfx-super-resolution-3/
The FSR Quality preset at 1080p only renders at 720p, while DLSS Quality renders at whatever next available resolution fits, so it would likely render at 1600x900. It is possible to ship an Ultra Quality preset like the Avatar developers did at a 1.2x upscale, but that doesn’t come standard.

The video has over half of its benchmarks on an outdated version (only 2 are on FSR 3), and it’s comparing pretty much different tech. Anyone with decent graphics-tech knowledge knows FSR will not have good results there, since it doesn’t use AI to help fill in the missing pixels. AI is surprisingly good at upscaling and filling in details (look at what those image generators do at their internal res, then keep upscaling and tweaking the image), while a shader-based solution relies on the input information to do its work. Meaning: if the input frame rate or resolution is too low, it’s not gonna work as intended. (Similarly, the newer optical-flow frame gen also relies on the frame rate being stable for better results.)
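For reference, here’s the internal-resolution math using the per-axis scale factors from the GPUOpen docs linked above (the preset names/factors are FSR 2/3’s standard ones; this is just arithmetic, not any real API):

```python
# Internal render resolution for FSR 2/3 quality presets.
# Per-axis scale factors as documented on GPUOpen.
PRESETS = {
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_res(out_w, out_h, preset):
    """Output resolution divided by the preset's scale factor."""
    s = PRESETS[preset]
    return round(out_w / s), round(out_h / s)

print(render_res(1920, 1080, "Quality"))      # (1280, 720)
print(render_res(1920, 1080, "Performance"))  # (960, 540)
```

So at a 1080p output, Quality mode really does only have a 720p image to work from.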
You don’t have to like AMD or FSR’s results, but comparing results while ignoring the implementation details and what each tech is for is like comparing beer and scotch and then complaining that scotch doesn’t give you the burp.
Disclaimer: I fully support universal basic income even though I’m in a higher tax bracket, because people need a buffer during mass layoffs (in cases where unemployment benefits don’t really cover basic living costs), and so do disabled/retired people without support.
The modern computer tech trend is based on 4 basics:
Not every new release gets all 4 checked, but that’s the goal. Computer components actually have a longer lifespan compared to other tech gadgets that come with a processor. For example, your 970 was released 10 years ago, in 2014, and during that period I’ve gone through 4 phones — not because I chase newer ones, but because of battery degradation (I happened to have a Nexus whose battery died after a year). The average shelf life of a modern phone battery is around 3~5 years. The thing is, as a video game developer myself, we constantly struggle with that power creep as well. (The Steam hardware survey helps a lot, honestly.)
So it’s not really that game developers no longer want to make games compatible with older hardware — there are plenty of retro-style games, or games with clever use of graphics, that can still run on old machines. It’s ultimately a time/cost and return-on-investment calculation. We don’t work for the component manufacturers; we survey the market, see how we can make games up to our standard, and try to reach as many potential customers as possible. It’s the same for all other software companies. And for smaller companies that can’t afford to build their own engine: when you use Unreal or Unity, even before you put anything into the game, it already comes with a min spec for your first prototype, just to support, say, rendering 1 light and 1 square ground plane. Anything you then add slowly churns away the HW budget (storage, RAM/VRAM, frame time).
Like you, many companies simply want to make money and survive; they aren’t really big enough to make any of those decisions (like the average battery shelf life).
All the butthurt people watching their old GPUs go obsolete: just do a quick Google of “is 8GB VRAM enough” — there are so many breakdowns out there that I won’t get into details. The PS4 is a joke of a comparison anyway, because the Switch has even less RAM and there are still games being made to run on it (like the newer Prince of Persia). Anyway, old GPUs’ days are numbered because of low VRAM and the lack of mesh shader support. Nvidia is basically running a “how stupid can the market go” test.
lol, you might as well just use the integrated GPU at that point. 6GB means you can only play games from, like, the PS4 era or before, since the PS4 has 8GB of RAM.
I know — it’s that they probably left the console optimization in, so any CPU that doesn’t have 8 cores might get the short end of the stick. We’ll see how quickly Capcom can patch that.