How does this KEEP GETTING WORSE??

  • barsoap@lemm.ee · 9 months ago

    We had 4-way split on 20-inch tube TVs, on hardware that measured its RAM in MBs

    And we were still compute-bound. Things like the N64 pretty much used resources per pixel: mesh data was so light that the whole level could sit in the limited RAM at once, and it needed to, because there weren’t CPU cycles left over to implement asset streaming. Nowadays the only stuff in RAM is what you actually see, and with four perspectives, yes, you need four times the VRAM, because every player can be looking at something completely different.

    Sure, you can write the game to use a quarter of the resources, but then you either use that for single-player too and get bad reviews for bad graphics, or you develop two completely different sets of assets, exploding development costs. I’m sure there also exist shady kitten-drowning marketing fucks who would object on the grounds of “but hear me out, let’s just sell them four copies instead”, but they don’t even get to object, because production-wise, split-screen isn’t an option nowadays for games that aren’t specifically focussing on that kind of thing. You can’t just add it to any random title for a tenner.
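    The “four times the VRAM” claim is about asset working sets, not framebuffer pixels, and it can be sketched with back-of-the-envelope arithmetic. All numbers below (texture count, texture size, mip-chain overhead) are illustrative assumptions, not real engine figures:

```python
# Back-of-the-envelope VRAM estimate for split-screen texture working sets.
# Every number here is a made-up illustration, not a measured figure.

def texture_mb(width, height, bytes_per_texel=4, mip_overhead=4/3):
    """Approximate size of one texture plus its full mip chain, in MB."""
    return width * height * bytes_per_texel * mip_overhead / 2**20

# Suppose each player's view keeps ~300 unique 2k textures resident.
per_view_mb = 300 * texture_mb(2048, 2048)

single_player = per_view_mb       # one camera, one working set
split_screen = 4 * per_view_mb    # four cameras looking at different things

print(f"single view : {single_player / 1024:.1f} GB")
print(f"4-way split : {split_screen / 1024:.1f} GB")
```

    The total number of shaded pixels stays the same, but because the four cameras share almost nothing, the resident asset set roughly quadruples.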

      • barsoap@lemm.ee · 9 months ago

        I completely agree, but I doubt you can afford a Star Wars license if you’re making an indie game. It needs oomph and AAA budgets to repay itself, and that’s before Disney marketing gets their turn to say no, because they’ve seen the walk cycles in Clone Wars and went “no, we can’t possibly go worse than that, that’d damage the brand”. I’m digressing, but those walk cycles really are awful.

    • isles@lemmy.world · 9 months ago

      Aren’t you also effectively down-resing the 4 screens? You’re not running 4x 1080p streams, you’re running 4x 540p, and textures can be downscaled with no perceptual loss. Non-console games are already designed to adapt to system resources, so I don’t see why you’d need two completely different sets of assets. (IANA dev)

      • barsoap@lemm.ee · 9 months ago (edited)

        Textures can be scaled quite easily, at least when talking about 4k down to 1k; below that, automatic processes are bound to lose the wrong details and the GFX people will shout at you. Meshes, OTOH: forget it.

        Also, I was kinda assuming 4x 1080p to throw at a 4k screen, since people were talking about home theatre setups. The “uses 4x VRAM” equation is for displaying four views at a quarter of the resolution each, as opposed to one view at full resolution, whatever that resolution may be, and it assumes you don’t have a second set of close-up assets. You can’t just take LOD assets: being in the distance is a different thing than being in the foreground at a lower resolution. Foreground assets get way more visual attention, that’s just how human perception works, so you can’t get away with auto-decimating and whatnot.
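        The LOD-vs-foreground point can be illustrated with rough similar-triangles arithmetic. All sizes, distances, and the pixel-FOV factor below are made-up assumptions:

```python
# Why distant-LOD assets don't substitute for close-up assets at lower
# render resolution. All numbers are illustrative assumptions.

def screen_coverage(object_size_m, distance_m, fov_px):
    """Very rough on-screen width in pixels, via similar triangles."""
    return fov_px * object_size_m / distance_m

# A 2 m object 100 m away in a full-res view: tiny on screen, so a
# decimated LOD mesh is fine there.
distant = screen_coverage(2.0, 100.0, fov_px=1080)

# The same object 3 m away in a half-res split-screen viewport still
# covers a huge swath of the screen, where decimation artifacts show.
foreground = screen_coverage(2.0, 3.0, fov_px=540)

print(f"distant: ~{distant:.0f} px wide, foreground: ~{foreground:.0f} px wide")
```

        Even after halving the viewport resolution, the foreground object covers an order of magnitude more pixels than the distant one, which is why a separate set of close-up assets (or hand-tuned LODs) is needed rather than reusing the distance meshes.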