It's probably just me being out of touch, but I don't think the GeForce RTX 4000 or 5000 series really mattered/matters that much.
At the same time I'd add the S3 ViRGE and the Matrox G200. Both mattered a lot at the time, but not long term.
I remember there was a kernel module for the Matrox/MPlayer combination that exposed a new device MPlayer could use. You got `-vo mga` for the console and `-vo xmga` for X11; you couldn't tell the difference between them, and both produced high-quality hardware YUV output.
For a moment, a Matrox G400 DualHead was THE card to have for a multi-monitor setup.
My contributions: the Matrox Parhelia, the first card to support triple monitors, and the ATI All-in-Wonder, which did TV out back when media centre TVs weren't really a thing.
Recency bias, probably. IIRC the 3000 and 4000 series made significant improvements in RTX performance, so compared to the 2000 series they're far more useful today.
> S3 ViRGE and the Matrox G200
Both were only really famous for how terrible they were, though. I think the S3 ViRGE might even qualify as a 3D decelerator ;)
Matrox G200 GPUs came integrated with servers for absolute ages, well into the 2010s.
The G200 mattered to some degree for a long time, because most x86 servers up until a few years ago would ship a G200 implementation or at least something pretending to be a G200 card as part of their BMC for network KVM.
This is an ad from a viral marketing company and everyone here is falling for it.
>S3 ViRGE
decelerator?
>Matrox G200
Because it never got an OpenGL driver? Because it was 2x slower than even the Savage3D? The Nvidia TNT released a month later, offering 2x the speed at a lower price:
https://www.tomshardware.com/reviews/3d-chips,83-7.html
Truly a graphics card that mattered! :)
Or the S3 Savage3D, which, while inferior to the TNT2, pioneered texture compression.
https://en.wikipedia.org/wiki/S3_Texture_Compression
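For anyone who hasn't looked at it: S3TC is what later became the DXT1/BC1 format adopted by DirectX and OpenGL. Each 4x4 texel block gets reduced to two RGB565 endpoint colours plus 2-bit per-texel indices into a small interpolated palette. A rough decoding sketch for a single block (illustrative only, not any driver's actual code):

```python
import struct

def rgb565_to_rgb888(v):
    # Expand the 5/6/5-bit channels to 8 bits each
    r = (v >> 11) & 0x1F
    g = (v >> 5) & 0x3F
    b = v & 0x1F
    return (r * 255 // 31, g * 255 // 63, b * 255 // 31)

def decode_dxt1_block(block):
    """Decode one 8-byte DXT1/BC1 block into a 4x4 grid of RGB tuples."""
    c0_raw, c1_raw, indices = struct.unpack("<HHI", block)
    c0 = rgb565_to_rgb888(c0_raw)
    c1 = rgb565_to_rgb888(c1_raw)
    if c0_raw > c1_raw:
        # Four-colour mode: two colours interpolated between the endpoints
        c2 = tuple((2 * a + b) // 3 for a, b in zip(c0, c1))
        c3 = tuple((a + 2 * b) // 3 for a, b in zip(c0, c1))
    else:
        # Three-colour mode plus "transparent black"
        c2 = tuple((a + b) // 2 for a, b in zip(c0, c1))
        c3 = (0, 0, 0)
    palette = (c0, c1, c2, c3)
    # 16 texels, 2 bits each, packed LSB-first, row by row
    return [[palette[(indices >> (2 * (y * 4 + x))) & 0b11] for x in range(4)]
            for y in range(4)]

# A 4x4 block costs 8 bytes instead of 48 for raw 24-bit RGB: 6:1 compression
print(decode_dxt1_block(bytes([0x1F, 0x00, 0xE0, 0x07, 0b00011011, 0, 0, 0])))
```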