Currently trying to refund the new Indiana Jones game because it's unplayable without raytracing cri. My card isn't even old, it's just that 8GB of VRAM is apparently the absolute minimum now, so my mobile 3060 is useless. I miss when I could play new games in 2014 on my shitty AMD card at 20fps. Yeah, it didn't look great, but developers still included a very low graphics setting for people like me. Now you need to be upgrading every 2 years to keep up.

  • RNAi [he/him] · 7 days ago

    People were saying this about Morrowind

    • Kaputnik [he/him] hexagon · 7 days ago

      Yeah but they were right, Morrowind looks too good, every game should look like Cruelty Squad

    • 7bicycles [he/him] · edit-2 · 7 days ago

      They were kind of correct back then too, with the amount of upgrading the industry expected you to do. That just petered off for a while, luckily. Seems to be back in full force now though.

      That said, at least back then all that shit gave you actual graphical improvements you could see, instead of like raytracing on retinas or some bullshit you'd never notice.

      • KobaCumTribute [she/her] · edit-2 · 6 days ago

        I think that has to do with consoles: when a console generation is outdated mid- or low-range hardware, that forces more general optimization and less added bullshit, especially when the generation drags on way too long and devs end up targeting what is basically a decade-old gaming computer by the end. When consoles are loss leaders and there's a shorter window between generations (or upgraded same-generation versions), devs only optimize enough to run on a modern mid-range gaming rig, and specifically the console configuration of that.

        Although there's some extra stuff to it too, like how the NVidia 10 series was an amazing generation of GPUs that stayed relevant for like a decade, and the upper end of it is still sort of relevant now. NVidia rested on their laurels after that and has been extremely stingy with VRAM, because their cash cow is now high-end server cards for AI bullshit and they want businesses buying $5000+ cards instead of <$1000 ones that would work well enough if they just had a bit more VRAM. GPUs have also gotten more and more expensive because crypto and AI grifters showed NVidia they can keep raising prices and delivering less and people will still buy their shit, with AMD just grinning and following after them, delivering better cards at lower prices, but not that much lower, since they can get away with it too.