Currently trying to refund the new Indiana Jones game because it's unplayable without raytracing. My card isn't even old, it's just that 8GB of VRAM is apparently the absolute minimum, so my mobile 3060 is now useless. I miss when I could play new games in 2014 on my shitty AMD card at 20fps. Yeah, it didn't look great, but developers still included a very low graphics option for people like me. Now you need to be upgrading every 2 years to keep up.
There is some boundary pushing, but I feel like the period between 2005 and 2015 was worse. If you had a two year old graphics card back then, you'd struggle with the latest games. Certainly, a 5 year old graphics card in 2005 would have been rough (as some people mention in this thread).
I do think graphics cards have gotten more expensive, though, relative to the rest of the computer.