Currently trying to refund the new Indiana Jones game because it's unplayable without raytracing. My card isn't even old, it's just that 8GB of VRAM is apparently the absolute minimum now, so my mobile 3060 is useless. I miss when I used to be able to play new games in 2014 on my shitty AMD card at 20fps. Yeah, it didn't look great, but developers still included a very low graphics option for people like me. Now you need to be upgrading every 2 years to keep up.
I got a special fucking bone to pick with Cities Skylines 2. I've never had a game look like such vaseline-smeared ass while making my computer sound like it's about to take off. It's a shame because it's definitely come a long way as a game and has some really nice buildings now, but when I play it I start to get nervous after like half an hour and have to let my computer cool down. Fuck that shit.
There are a number of YouTube videos examining how shockingly badly CS2 was designed, to the point that it murders your GPU
I want my games to be able to be rendered in software, I want them to be able to run on a potato from the early 2000s and late 90s, is this too much for a girl to ask for
Todd Howard made Morrowind run on 64MB of RAM in a cave. With a box of scraps.
The new Indiana Jones is actually pretty decently optimized. I run it at 1080p, all high/ultra settings, on my RTX 3060 12GB with DLAA enabled, at a mostly locked 60fps. It is leagues better than any UE5 game; it's just the hard VRAM requirements that suck.
I feel like a lot of the issues game graphics have nowadays come down to GPU prices being ridiculously inflated over the last decade because of crypto/AI. It's not surprising that devs follow the newest trends and technologies when it comes to graphics, but the hardware needs of raytracing, global illumination, and the like are just too high for the GPU performance/dollar you can get in 2024. I just recently upgraded from an AMD RX 480 to a used Nvidia RTX 3060 12GB (which seemed to be the best bang for the buck; an RTX 4060 would have been much more expensive for not a lot more performance), and that upgrade gets you maybe double the performance in games, for a GPU that is nearly five years newer (and no VRAM upgrade at all if you get the base model). These cards simply shouldn't cost as much as they do. If you don't have unlimited money to spend, you are going to have a much worse experience today compared to half a decade or a decade ago.
I just want ps2-level graphics with good art direction (and better hair, we can keep the nice hair) and interesting gameplay and stories. Art direction counts for so much more than graphics when it comes to visuals anyway. There are Playstation 1 games with good art direction that imo are nicer to look at than some "graphically superior" games.
There are so many games that don't have this problem. How about you play those?
that one is atrocious, but another thing I also find nasty is the amount of disk space new games need. Sure, buying more disks is way cheaper than getting more graphical power, but downloading 100+ GB for a game I might play just once feels like an incredible waste
games should have a lo-fi version with lower-resolution textures and fewer graphical features, for the people who can't actually see the difference in graphics after the PS2 era
Rainbow 6 Siege had to downgrade map assets because the skins take up too much space lol
Cosmetics are a wholly separate clown show. Dota 2 used to be a few gigabytes. Now, because of all the hats, it's like 30GB compressed.
There's been a couple games I've decided to just not buy because the disk space requirement was too high. I don't think they care much about a single lost sale, unfortunately.
- ∞ 🏳️⚧️Edie [it/its, she/her, fae/faer, love/loves, ze/hir, des/pair, none/use name, undecided]·5 hours ago
War Thunder (garbage game, don't play) does this. You can choose to download higher quality textures. I don't care, I haven't noticed the difference
Never. Always ftp
Same, I managed to get to BR 8.7 in the Soviet ground tree as a free-to-play player, but I stopped making progress because the higher-BR matches just aren't that fun, so I stick around at 3.7-4.0 and gain like no research points, lol
Real. I have my first 8.0 in the Soviet tree, but it's just a grind no matter what BR I play. 2.7 or so is my go-to
Garbage game. Yet I continue to play
The gamers yearn for Forward+ rendering...
Yeah, I think gaming as an industry is becoming 'more specialized', which is not necessarily good. All the engine developers are working on very generic graphics stuff for Unreal and Unity, rather than engine dev being an in-house position at companies that make games themselves, where the engine can be heavily optimized for a specific game.
I'm finding the latest visual advancements feel like a downgrade because of image quality. Yeah, all these fancy technologies are being used, but it's no good when my screen is a mess of blur, TAA, and artifacting from upscaling or framegen. My PC can actually play Cyberpunk with path tracing, but I can't even begin to appreciate the traced paths WHEN I CAN'T SEE SHIT ANYWAY.
Currently binging Forza Horizon 4, which runs at 60fps on high on my Steam Deck and 165fps maxed on my PC with 8x MSAA, and it looks beautiful. And why is it beautiful? Because the image is sharp, so I can actually see the details the devs put into the game. Half-Life: Alyx is another game on another level with crisp, clear visuals, and it ran on a 1070 Ti with no issues. Today's UE5 screen vomit can't even compare
All games these days know is stutter, smeary image, dx12 problems and stutter
TAA, DOF, chromatic aberration, motion blur, vignetting, film grain, and lens flare. Every modern dev just dumps that shit on your screen and calls it cinematic. It's awful and everything is blurry. And sometimes you have to go into an ini file to turn it off because it's not in the settings.
Chromatic aberration! When I played R&C: Rift Apart on PS5, I was taking screenshots and genuinely thought there was some kind of foveated rendering in play because of how blurry the corners of the screen looked. Turns out it was just chromatic aberration, my behated.
Hate film grain too because I have visual snow and I don't need to stack more of that shit in my games.
Dev: should we make our game with a distinctive style that's aesthetically appealing? Nah, slap some noise on the screen and make it look like your character is wearing dirty Oakleys and has severe astigmatism and myopia. That'll do it.
I despise TAA. I remember back when I played on PS4, I could immediately spot a UE4 game because they almost always had awful TAA ghosting.
I feel like that trend is actually past us. Maybe I haven’t followed gaming too closely but there doesn’t seem to be a benchmark game that is as overwhelmingly demanding, considering the landscape of tech during its time, as something like Crysis.
The most popular games nowadays don't seem to be prohibitively demanding for commonly bought PCs. Maybe I'm wrong
nah, the problem is that they are demanding but don't push boundaries graphically at all, due to absolutely no optimisation.
i know i really shouldn't play this slop, but Black Ops 6 legitimately looks worse than new-MW3 yet requires me to massively turn settings down to achieve similar framerates/avoid hitching compared to the game it literally replaced. I may as well be playing a game from 10 years ago, and I have a 3070 and a Ryzen 5000-series CPU. Barely anything i've played in the last 5 years looks "boundary-pushing amazing", save for maybe Control, which was more down to how it used certain effects, i guess
i know i'm talking about Activision, but it's not unique to them and their shitty engine. Halo Infinite looked like ass and ran worse. I didn't even play Starfield because lmao bethesda engine. Shit like Dead by Daylight even takes up 50gb. And i know they blow every single game launch, but given that Frostbite can look very good in some settings, BF2042 was an ugly, empty mess to the point that it killed the game harder than BFV. basically all AAA devs have regressed to inserting the slop into engines cobbled together 15 years ago and releasing games that would immediately crash if anything was compressed, because treatpigs (like me) just accept 100gb installs and having to juggle what game you can play each week
There is some boundary pushing, but I feel like the time period between 2005 and 2015 was like... If you had a two year old graphics card you'd struggle with the latest games. Or something. Certainly, a 5 year old graphics card in 2005 would have been rough (as some people mention in this thread).
I think graphics cards have comparatively gotten more expensive though, compared to the rest of the computer.
I am lucky enough that I'm not that interested in high-specs AAA titles to begin with: of the 100+ games I've put on a DIY wishlist, I'd say less than 10 of them fall in this category. It's mostly indie/retro titles, older titles or mid-budget.
My card isn't even old
It's about 4 years old, which is pretty old for an entry-tier card to be running the latest AAA titles. My Radeon 7770 was only three years old when The Witcher 3 came out and it couldn't hit minimum spec. Only two years for AC: Unity, but that was especially demanding
I do think now is an awkward time where we're shifting to new tech that isn't quite ready for prime time, but it's never going to be until we shift to it
Yea that's fair, it's just hard to think of it being outdated when I paid so much for it. Also it's the first time I've experienced VRAM size being the chokepoint of what I can run, but maybe that's just the new normal.
Yeah it feels arbitrary, especially given how cheap VRAM is. Common Nvidia L (not that they care given the stacks they're making with data centers)
There are some good videos out there that also explain how UE5 is an unoptimised mess. Not every game runs on UE5 but it's the acceptable standard for game engines these days
This is the main one I saw. It's kind of an ad for this guy's game company, but clouds in a skybox shouldn't cause performance issues https://youtu.be/6Ov9GhEV3eE
That and DX12 in general, in my experience. Almost every game where I've had the option to use DX11 instead of DX12, the difference has been night and day. Helldivers 2 especially had an absurd improvement for me.
mobile 3060
If you're using a laptop, you need to do maintenance on it. If you don't reapply thermal paste/pads and clean your fans, your GPU and CPU will throttle.
Second, addressing the thread more generally: if you have a 1440p or 4K monitor, you will struggle. Lowering resolution is a very quick way to get performance back. This is why the Steam Deck and Switch use low-res (roughly 720p) screens. My laptop has a 3K screen and I have to downscale any game I play on it.
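To put rough numbers on that (my own back-of-the-envelope sketch, not from the thread): when a game is fill-rate or shader bound, frame cost scales roughly with pixel count, so dropping resolution buys performance almost linearly.

```python
# Rough sketch: relative per-frame pixel work at common resolutions,
# assuming cost scales ~linearly with pixel count (fill-rate/shader bound).
resolutions = {
    "4K (3840x2160)": (3840, 2160),
    "1440p (2560x1440)": (2560, 1440),
    "1080p (1920x1080)": (1920, 1080),
    "720p (1280x720)": (1280, 720),
}

base = 3840 * 2160  # 4K pixel count as the reference
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.1f} MP, ~{base / px:.2f}x less pixel work than 4K")
```

Going from 4K to 1080p cuts the pixel work by 4x, and 4K to 720p by 9x, which is why resolution is usually the single biggest performance lever.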
Also, "can it run Crysis?" was a thing almost 20 years ago now
"3K" doesn't translate to any specific resolution, and it's exclusively a thing in higher-end laptops afaik. Anything from 2560x1600 to 2736x1824 to 2880x1620 to 3260x1834, or other totally random display resolutions, is marketed under this label.
Totally unusable terminology.
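For illustration, here's the arithmetic on the "3K" resolutions named above (just the ones listed in the comment) — they span a wide range of actual pixel counts, which is why the label tells you nothing about GPU load:

```python
# Pixel counts for the "3K" resolutions mentioned above.
variants = {
    "2560x1600": (2560, 1600),
    "2736x1824": (2736, 1824),
    "2880x1620": (2880, 1620),
    "3260x1834": (3260, 1834),
}
pixels = {name: w * h for name, (w, h) in variants.items()}
for name, px in sorted(pixels.items(), key=lambda kv: kv[1]):
    print(f"{name}: {px / 1e6:.2f} MP")
spread = max(pixels.values()) / min(pixels.values())
print(f"largest '3K' panel has ~{(spread - 1) * 100:.0f}% more pixels than smallest")
```

The biggest of these panels pushes nearly half again as many pixels as the smallest, and they don't even share an aspect ratio (16:10, 3:2, 16:9).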
All of the boomer game devs that had to code games for like a 486 have now retired, replaced with people who nVidia or AMD can jangle shiny keys in front of to make their whole games around graphics tech like cloth physics and now ray tracing.
This is why solo or small team indie devs are the only devs I give a shit about. Good games that run well, are generally cheap, and aren't bloated messes.
I just want to punch nazis why does it have to matter if the reflection of a pigeon off screen appears in Indiana Jones' eyes??
Yeah, I'm getting real fucking tired of struggling to get 60fps in new games even with DLSS cranked to max. They don't even look much better. There's plenty of older games that look better and run better that you don't need to subject yourself to DLSS ghosting and frame gen latency to play. I've been telling my main co-op buddy that I might just stop playing new games (at least larger releases) because this shit is so frustrating.
I decided I'm not gunna' play any game that isn't released on PS4 or Switch. My PC isn't really powerful enough to play PS5-exclusive games anyway, so that sets a similar hard-limit. I was gunna' decide no game post-2020 but this choice made more sense.
I remember reading an article by Cosmo about music/metal on his ex-blog Invisible Oranges about how since we have a limited amount of time on Earth we're all making limits or acting within limits like these anyway. Some are just more conscious of it than others, so it shouldn't be treated as weird for someone to consciously decide they want to engage with a hobby in a specific manner like this.
I'm in a similar boat but I'm letting my budget dictate what I play instead. Tekken 8 looks cool but why pay $70+DLC when I have Tekken 7, also very cool, for $6? I'm always a version behind but my wallet thanks me.
Also Peggle is extremely cheap and I've gotten an RPG number of hours out of that series.
DLSS created an excuse for developers to throw optimization to the side and just do whatever they please. I figured this is what would happen when it was created and it’s definitely happening now. I’m glad that I don’t play AAA games for the most part, cause this shit sounds annoying.