I distinctly remember gaming on CRTs and then on LCD screens, and it was a night-and-day difference in favor of LCDs. Monitors have only gotten better since, and I certainly don't miss CRTs, least of all how hot they ran.
I'm curious what the primary causes of that are. Like, I had a similar experience growing up in the '90s. I think it was just the sheer increase in resolution: text looked so much better, and you could fit more on a screen.
And then they got BIGGER.
I'm a _bit of a snob_ when it comes to that, both due to my film & TV background and my game collection (jesus, that's a lot of games, including full SNES and N64 sets, Mega Drive, NES, etc.). I have various broadcast monitors, from PVMs to BVMs, as well as some of the finest consumer ones, including B&O, etc. I can say that with ultrafast OLEDs (240Hz) we're finally 95% of the way there. With high-quality shaders or hardware gadgets it's really nice. For that last 5%, I think things like ultra-high-DPI OLEDs and phosphor-dot-level emulation shaders with black frame insertion will get us there. Until then, good ol' Trinitron is still a superb choice if you want 100%. Another thing, outside of the actual display itself: an old console plus a CRT is an almost zero-lag input-to-screen experience, which I think plays a significant role in the overall feel.
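For anyone curious what "phosphor-dot-level emulation" and black frame insertion actually mean, here's a minimal sketch in Python/NumPy. It's not any real shader's code (the real ones, e.g. in RetroArch, run per subpixel on the GPU), and the mask layout and attenuation factor are assumptions for illustration:

```python
import numpy as np

def phosphor_mask(frame: np.ndarray) -> np.ndarray:
    """Aperture-grille-ish mask: each pixel column mostly lights one
    of R/G/B and attenuates the other two subpixels."""
    out = frame.astype(np.float32)
    for x in range(out.shape[1]):
        lit = x % 3                    # column 0 -> R, 1 -> G, 2 -> B
        for c in range(3):
            if c != lit:
                out[:, x, c] *= 0.25   # assumed attenuation factor
    return out.clip(0, 255).astype(np.uint8)

def with_black_frame_insertion(frames):
    """Follow every frame with a black one: at 240Hz output this gives
    a 120Hz source a 50% duty cycle, mimicking a CRT's brief phosphor
    flash instead of an LCD's sample-and-hold blur."""
    for frame in frames:
        yield phosphor_mask(frame)
        yield np.zeros_like(frame)
```

This is also why the 240Hz figure matters: BFI halves your effective refresh rate, so you need the headroom to spare.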
Agreed 100%. CRTs were a wobbly, flickery mess, especially in the 60Hz era. Everything below 90Hz on a CRT gave me horrible migraines when working longer than 4 hours.
LCDs, being constantly lit, were so much easier on the eyes than a CRT, where every bright pixel flashes at 60Hz.
But one thing is true: a low-res game designed to look good on a CRT looks much worse on an equally low-res LCD. CRTs being a blurry mess gave you free 'antialiasing'.
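You can see the effect yourself with a rough sketch using Pillow (the filename, scale factor, and blur radius here are all placeholder assumptions): a nearest-neighbour upscale keeps every hard stair-step the way a sharp LCD does, while interpolation plus a mild blur approximates the CRT's soft spot and smooths the edges for free.

```python
from PIL import Image, ImageFilter

sprite = Image.open("sprite.png")          # placeholder: a 320x240 screenshot
big = (sprite.width * 4, sprite.height * 4)

# Sharp LCD-style upscale: every source pixel becomes a hard-edged square.
lcd_like = sprite.resize(big, Image.NEAREST)

# CRT-ish upscale: interpolation plus a mild blur softens the stair-steps,
# roughly the "free antialiasing" a CRT's spot gave you.
crt_like = sprite.resize(big, Image.BILINEAR).filter(
    ImageFilter.GaussianBlur(radius=1.5))

lcd_like.save("lcd_like.png")
crt_like.save("crt_like.png")
```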