Currently trying to refund the new Indiana Jones game because it's unplayable without raytracing cri. My card isn't even old, it's just that 8GB of VRAM is apparently the absolute minimum, so my mobile 3060 is now useless. I miss when I could play new games in 2014 on my shitty AMD card at 20fps; yeah, it didn't look great, but developers still included a very low graphics option for people like me. Now you need to upgrade every 2 years to keep up.

  • Mindfury [he/him] · edit-2 · 5 hours ago

    nah, the problem is that these games are demanding but don't push boundaries graphically at all, due to absolutely no optimisation.

    i know i really shouldn't play this slop, but Black Ops 6 legitimately looks worse than new-MW3, yet requires me to massively turn settings down to achieve similar framerates/avoid hitching compared to the game it literally replaced. I may as well be playing a game from 10 years ago, and I have a 3070 and a ryzen 5000 series cpu. barely anything i've played in the last 5 years looks "boundary-pushing amazing", save for maybe Control, which was more down to how it used certain effects i guess

    i know i'm talking about activision, but it's not unique to them and their shitty engine. Halo Infinite looked like ass and ran worse. I didn't even play Starfield because lmao bethesda engine. Shit like Dead by Daylight even takes up 50gb. And i know they blow every single game launch, but given that Frostbite can look very good in some settings, BF2042 was an ugly, empty mess to the point that it killed the game harder than BFV. basically all AAA devs have regressed to inserting the slop into engines cobbled together 15 years ago and releasing games that would immediately crash if anything was compressed, because treatpigs (like me) just accept 100gb installs and having to juggle which game you can play each week