• 0 Posts
  • 7 Comments
Joined 1 year ago
Cake day: June 11th, 2023




  • source: https://gpuopen.com/fidelityfx-super-resolution-3/

    FSR’s Quality preset at 1080p renders at only 720p, while DLSS Quality renders at the next available resolution that fits, so it would likely render at 1600x900. It is possible to render with an Ultra Quality preset similar to what the Avatar developers did with a 1.2x upscale, but that does not come standard. Over half of the benchmarks in the video use an outdated version (only 2 are on FSR 3) and are effectively comparing different tech. Anyone with decent graphics knowledge knows FSR will not produce good results here, since it does not use AI to help fill in the missing pixels. AI is surprisingly good at upscaling and filling in details (look at what those image generators do at their internal resolution, then keep upscaling and tweaking the image), while a shader-based solution relies on the quality of its input to do better work. That means if the input frame rate or resolution is too low, it’s not going to work as intended. (Similarly, the newer optical-flow frame generation also relies on the frame rate being stable for better results.)
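    The preset math above can be sketched quickly. The per-axis scale factors below are the ones AMD publishes for the FSR quality modes (see the GPUOpen link); the 1.2x entry is the custom Ultra Quality mode the Avatar developers shipped, not a standard preset.

```python
# Per-axis scale factors for AMD's published FSR presets;
# the 1.2x entry is the custom mode from the Avatar example.
FSR_PRESETS = {
    "Ultra Quality (custom, 1.2x)": 1.2,
    "Quality": 1.5,
    "Balanced": 1.7,
    "Performance": 2.0,
    "Ultra Performance": 3.0,
}

def render_resolution(output_w, output_h, scale):
    """Internal render resolution before upscaling."""
    return round(output_w / scale), round(output_h / scale)

for name, scale in FSR_PRESETS.items():
    w, h = render_resolution(1920, 1080, scale)
    print(f"{name:>28}: {w}x{h}")
```

    Note that 1080p at Quality (1.5x) lands on exactly 1280x720, while the 1.2x mode lands on 1600x900 — which is why the 1.2x upscale comparison matters.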

    You don’t have to like AMD or FSR’s results, but comparing outputs while ignoring the implementation details and what each tech is for is like comparing beer and scotch and then complaining that the scotch doesn’t give you a burp.


  • Disclaimer: I fully support universal basic income even though I am in a higher tax bracket, because people need a buffer during mass layoffs (in cases where unemployment benefits don’t really cover basic living costs), and so do disabled and retired people without other support.

    Modern computer tech trends are built on 4 basics:

    • more instructions run per second
    • smaller size
    • more operating (RAM/VRAM) and storage space
    • less energy used per TFLOP

    Not every new release gets all 4 checked, but that’s the goal. Computer components actually have a longer life span than most other tech gadgets that come with a processor. For example, your 970 was released 10 years ago in 2014, and over that same period I’ve been through 4 phones — not because I chase newer ones, but because of battery degradation (I happened to have a Nexus whose battery died after a year). The average shelf life of a modern phone battery is around 3~5 years. The thing is, as a video game developer myself, we constantly struggle with that power creep as well. (The Steam hardware survey has helped a lot, honestly.)

    • Do we use the newer features? A recent example is mesh/compute shaders, which some older cards don’t have the circuitry for. By using a newer feature you automatically cut away a certain population that owns older hardware. (Spider-Man 2 is a PS5 exclusive.)
    • Do we support older hardware? What can we cut while still maintaining our vision and craft quality? (i.e., we can’t push so far down that the low spec becomes a PS1-era game — it’s simply not feasible, and the art dept will hunt you down.)
    • How do we balance the game from low to ultra spec? (e.g., in FPS games there are cheats that disable bush or grass rendering, and not rendering grass/bushes helps frame rate a lot. If you allow it, ultra-spec players will still play on low grass settings for better visibility; if you disallow it, your min spec needs to bump up to cover the minimum amount your game needs to render.)
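    The grass/visibility trade-off in that last point can be sketched as a settings clamp. This is a hypothetical illustration (the preset names, densities, and floor value are made up, not from any real engine): either every preset may drop grass to zero and low-grass becomes the competitive default, or the engine enforces a floor — which then becomes part of the min spec.

```python
# Hypothetical grass density per graphics preset (illustrative values).
GRASS_DENSITY_BY_PRESET = {"low": 0.0, "medium": 0.5, "high": 0.8, "ultra": 1.0}

# Gameplay-mandated floor: no player may render less grass than this.
# Enforcing it means even min-spec hardware must afford this much.
COMPETITIVE_GRASS_FLOOR = 0.3

def effective_grass_density(preset, enforce_floor=True):
    """Density actually rendered, after the optional fairness clamp."""
    density = GRASS_DENSITY_BY_PRESET[preset]
    return max(density, COMPETITIVE_GRASS_FLOOR) if enforce_floor else density
```

    With the floor on, the “low” preset renders 0.3 instead of 0.0 — fairer, but the minimum spec now has to budget for that extra rendering cost.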

    So it’s not really that game developers no longer want to make games compatible with older hardware — there are plenty of retro-style games, or clever uses of graphics, that can still run on old machines. It’s ultimately a time/cost and return-on-investment calculation. We don’t work for the component manufacturers; we survey the market, see how we can make games up to our standard, and try to reach as many potential customers as possible. It’s the same for every other software company. And for a smaller company that can’t afford to build its own engine, using Unreal or Unity means that even before you put anything into the game, it already comes with a min spec just to render, say, 1 light and 1 square ground plane in your first prototype. Anything you then put into the game slowly churns away at the hardware budget (storage, RAM/VRAM, frame time).

    Like you, most companies simply want to make money and survive; they aren’t really big enough to drive any of those decisions (like the average battery shelf life).


  • All the butt-hurt people watching their old GPUs become obsolete. Just do a quick Google of “is 8GB VRAM enough” — there are so many breakdowns out there that I won’t get into details. The PS4 is a joke of a comparison, because the Switch has even less RAM and there are still games being made to run on it (like the newer Prince of Persia). Anyway, old GPUs’ days are numbered because of low VRAM and missing mesh-shader support. Nvidia is basically running a “how stupid can the market go” test.
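    To see why 8GB fills up fast, here is a back-of-the-envelope sketch (illustrative assumptions, not measurements from any real game): an uncompressed RGBA8 texture costs 4 bytes per pixel, and a full mip chain adds roughly one third on top. Real games use block compression (often 4–8x smaller) and stream textures, so treat these numbers as an upper-bound intuition only.

```python
def texture_mib(width, height, bytes_per_pixel=4, mip_chain=True):
    """Approximate memory for one texture in MiB.

    A full mip chain adds ~1/3 of the base size (geometric series
    1 + 1/4 + 1/16 + ... = 4/3).
    """
    base = width * height * bytes_per_pixel
    total = base * 4 // 3 if mip_chain else base
    return total / (1024 * 1024)

# One 4K (4096x4096) RGBA8 texture with mips is ~85 MiB uncompressed;
# a hypothetical scene streaming 200 of them (before render targets,
# geometry, and the OS/compositor's share) already dwarfs an 8GB card.
one_4k = texture_mib(4096, 4096)
print(f"{one_4k:.0f} MiB each, ~{200 * one_4k / 1024:.1f} GiB for 200")
```

    Even if block compression cuts that by 4x, the budget is tight once render targets and frame-time headroom are counted — which is the point of all those “is 8GB enough” breakdowns.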