it's because it doesn't have HDR, which doesn't work in linux with amd anyway. FUCK NVIDIA :up-yours-woke-moralists:
it's better because it supports MST, the DisplayPort feature that allows daisy-chaining. the equivalent 2022 monitor costs double the price lmaoooooo. I don't game :hentai-free: and HDR is something I have literally never seen or used, so I don't fucking care. I got exactly what I wanted.
Always buy slightly "outdated" tech. It works fine, so why would you bother paying for super expensive shit?
Especially displays and graphics cards. Personally I don't give a flying fuck about 4k or whatever; ever since we reached 1920×1080 I've been happy enough and I've stayed there. I barely see a change at higher res anyway.
4k is such a waste for gaming outside of VR, especially if you don't have DisplayPort or HDMI 2.1.
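The port complaint checks out with back-of-envelope math. A sketch of the uncompressed data rates involved (the link capacities below are the commonly quoted effective rates after encoding overhead, and real video timings add blanking on top, so treat these as ballpark figures):

```python
# Rough uncompressed video data rate: width * height * refresh * bits per pixel.
# Ignores blanking intervals, which add a few percent in real timings.
def data_rate_gbps(w, h, hz, bpp=24):
    return w * h * hz * bpp / 1e9

r4k120 = data_rate_gbps(3840, 2160, 120)  # ~23.9 Gbps
r4k60 = data_rate_gbps(3840, 2160, 60)    # ~11.9 Gbps

# Approximate effective link capacities (after 8b/10b or 16b/18b encoding):
links = {
    "HDMI 2.0 (~14.4 Gbps)": 14.4,
    "DP 1.4 (~25.9 Gbps)": 25.92,
    "HDMI 2.1 (~42.7 Gbps)": 42.67,
}
for name, cap in links.items():
    print(name, "fits 4k120:", r4k120 <= cap)
```

So 4k60 squeaks through HDMI 2.0, but 4k120 needs DisplayPort 1.4 or HDMI 2.1, which is why older ports cap you.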
Keep your fucking pixels. I'd rather play at 120hz.
I prefer 4k, but not for gaming pixels. it's because I want to fit more code on the screen, and fonts are noticeably blurry at 1920×1080. at 4k I might have to size up my fonts a little, but I can fit more on the screen and avoid eyestrain.
This is why I like 3440×1440 ultrawide: it's so much screen real estate, close to the 2.35:1 the film nerds love, it obviates multi-monitor setups, and it's still fewer pixels to drive than 4k.
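The arithmetic behind both the "fewer pixels to drive" and "blurry fonts" claims can be sketched quickly (the 27" diagonal below is an assumed panel size for illustration; pixel density obviously depends on your actual monitor):

```python
# Pixel-count comparison and pixel density at an assumed 27" diagonal.
from math import hypot

def pixels(w, h):
    return w * h

def ppi(w, h, diag_inches):
    # Pixels per inch along the diagonal.
    return hypot(w, h) / diag_inches

print(pixels(3440, 1440) / pixels(3840, 2160))  # ultrawide drives ~60% of 4k's pixels
print(ppi(1920, 1080, 27))  # ~82 PPI: low enough that font edges look soft
print(ppi(3840, 2160, 27))  # ~163 PPI at the same size
```

At the same panel size, 4k roughly doubles the pixel density, which is the eyestrain point, while the ultrawide renders about 40% fewer pixels than 4k, which is the GPU-load point.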
To get the most out of the pixels they had, artists extensively used the slight blurring, roundness of pixels, and the way brighter colors bled into darker ones to create detail that's lost on modern displays.
They make the spyware TVs cheaper because they expect to make money off your data over the life of the TV.
But every TV is dumb if you never connect it to the Internet.
Don’t doubt for a second that they’ll farm out unregulated spectrum for always-on connectivity that you cannot opt out of without a Faraday cage.
A CRT from the 90s I got off Craigslist for 40 bucks in the early 2000s does 2048×1536@60Hz or 1024×768@240Hz, and it was only surpassed in picture quality when I got a 4K LCD.
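Those two modes are less different than they look: they push the same number of pixels per second. A rough pixel-clock estimate (the ~30% blanking overhead is a ballpark assumption; real GTF/DMT timings vary per mode):

```python
# Rough pixel clock for a CRT mode: active pixels * refresh rate,
# inflated by blanking overhead (30% is an assumed ballpark figure;
# actual CRT timing standards vary per mode).
def pixel_clock_mhz(w, h, hz, blanking=0.30):
    return w * h * hz * (1 + blanking) / 1e6

print(pixel_clock_mhz(2048, 1536, 60))   # ~245 MHz
print(pixel_clock_mhz(1024, 768, 240))   # ~245 MHz, same throughput
```

That's why a monitor capable of one mode can usually do the other: the tube trades resolution against refresh at a fixed maximum bandwidth.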
I still use it for emulating old video games.
I once found a CRT by a dumpster so I salvaged it. This was about 11 years ago. By then I had long been using LCD displays even though I had memories of CRTs from when I was a kid.
Anyway, I hooked it up to my desktop and started playing Starcraft 2. That refresh rate was SO SMOOTH. It was absurd how good it looked. These days I have a 144Hz display which definitely does help. Looking forward to a future where I have 4k 144Hz.
Tragically I had to dumpster that display myself at some point, because I was moving apartments so frequently (poor college student lifestyle) that I just couldn't keep up with my own stuff even though I barely owned much at all.
my work laptop has an nvidia card and I've been continually perplexed that it feels like 2009 again: scanning X server logs and mucking with xorg.conf to try and fix it. so uhh, nvidia more like novideo has literally been my life lately.
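For anyone else stuck doing this: Xorg tags log lines with `(EE)` for errors and `(WW)` for warnings, so filtering for those two markers finds the problems fast. A minimal sketch with an embedded sample log, since real log locations vary (`/var/log/Xorg.0.log`, or `~/.local/share/xorg/Xorg.0.log` for rootless Xorg); the sample lines below are illustrative, not real driver output:

```python
# Filter an Xorg log down to its error (EE) and warning (WW) lines.
# sample_log stands in for the contents of /var/log/Xorg.0.log;
# the lines are made up for illustration.
sample_log = """\
[    10.001] (II) NVIDIA(0): example info line
[    10.002] (WW) NVIDIA(0): example warning line
[    10.003] (EE) NVIDIA(0): example error line
"""

def problems(log_text):
    return [line for line in log_text.splitlines()
            if "(EE)" in line or "(WW)" in line]

for line in problems(sample_log):
    print(line)
```

In practice you'd read the real file with `open(path).read()` and feed it to `problems()`.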
amd does support hdr fwiw
and eh, it's an okay monitor, but its specs are just okay and the price where I live really isn't that good. I quickly found a comparable display for $100 less, new. pricing tends to swing pretty wildly on displays regionally, though, since they're bulky, so it's possible it's much better where you are.
generally I'm pretty pro going for older tech, but the trend I've found with monitors is they really don't drop enough if you're still buying them firsthand. secondhand/refurb is the way to go for savings, since monitors are pretty reliable unless you're going for a panel type that suffers burn-in.
The HDR you get in almost every monitor isn't worth it anyway. It usually dims the screen too much, or it isn't true HDR unless you spend thousands on it.
still using the three monitors I picked out of the recycling bin for free :fidel-cool:
I remember playing DMC 5 on my PS5 and noticing that my colors weren't as vibrant as they were on my laptop. Turns out it was the HDR, and once I turned it off I never turned it back on.