
What is a more "interesting" choice for an AGP card?

The Rage 128 was meant to compete with the Voodoo 2, Nvidia TNT, and Matrox G200. It was an OK card that handled 32-bit rendering better than 16-bit. ATI later upgraded it to the Rage 128 Pro, which was faster.
 
Nope, Rage cards were horrid. They barely worked when rendering DirectX, and had virtually zero OpenGL support.

The Rage XL and ATI ES1000 (also a Rage XL) did indeed stick around on server motherboards until the 2010s, and they sucked. The last drivers for those chips were for Windows XP; on anything newer, the desktop was basically a slide show. They were kept around because they were dirt cheap and deemed "good enough" by server board OEMs. I used to have a few boards with those crap chips on them, and I always threw in a PCI video card because performance was so bad. Without drivers they couldn't accelerate the desktop, so everything fell back to software rendering on the CPU, which cratered performance.
 
I used to have an ATI Rage Fury Maxx, which, when it worked, was a decent card. Sadly, the drivers held it back, and most games didn't support it properly. Having no shaders also really hurt its performance by the time the drivers became mature. I ended up replacing it with a 32 MB DDR Radeon card when those were first released.

Nonetheless, it might be up your alley, since it was unique: a dual-GPU design on a single card.
 
I actually have one and similarly have had issues "making it work". It posted once and then never again.
 
You do not need an S3 card to use S3TC compression. S3 licensed its compression algorithm, and it was incorporated into DirectX 6.0 and OpenGL 1.3. Most cards that support those rendering standards also support S3TC compression in some form. Even though S3 has been defunct as a graphics chip maker for decades, it was still collecting royalties from anyone who used S3TC up until the patent expired around 2017-2018.
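For what it's worth, checking for S3TC support at runtime is simple on the OpenGL side. A minimal sketch in C (not from the post above), assuming a legacy GL 1.x context has already been created; window/context setup is not shown:

/* Check whether the driver exposes S3TC via the legacy
 * OpenGL extension-string query. Assumes a current GL context. */
#include <string.h>
#include <GL/gl.h>

int has_s3tc(void)
{
    const char *ext = (const char *)glGetString(GL_EXTENSIONS);
    if (!ext)
        return 0;
    /* The de facto extension name for the S3TC/DXTn texture formats. */
    return strstr(ext, "GL_EXT_texture_compression_s3tc") != NULL;
}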
Thanks for the clarification. I am not remembering correctly, then. I seem to remember that, back around that time, some card had a technology for using larger or more detailed textures than other cards could manage. I always thought it was the Savage 3D...
 
That is true, there were video cards at the time that had texture limits lower than other cards. Some examples would be the Voodoo2 and 3, which had a maximum texture size of 256x256. The Voodoo4 and 5 increased this to 2048x2048. Nvidia's TNT had a texture limit of 1024x1024, which was increased on the TNT2 to 2048x2048. ATI's Rage 128 had a 2048x2048 texture limit. There were other cards with other limits, but those are some quick examples.
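If you want to see that limit for yourself on old hardware, the driver will report it. A small C sketch, again assuming a live OpenGL context (the figures in the comment are the ones quoted above):

#include <stdio.h>
#include <GL/gl.h>

void print_max_texture_size(void)
{
    GLint max_size = 0;
    /* Largest width/height of a 2D texture the driver accepts,
     * e.g. 256 on a Voodoo2/3, 2048 on a TNT2 or Rage 128. */
    glGetIntegerv(GL_MAX_TEXTURE_SIZE, &max_size);
    printf("Max texture size: %d x %d\n", (int)max_size, (int)max_size);
}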

But while larger supported texture sizes looked good on paper, in practice they couldn't really be used. Neither the video hardware nor the PCs of the time could handle more than a couple of huge textures without either running out of video memory or tanking performance. As an example, a 32-bit 2048x2048 Targa image (a format common at the time) would use 16 megabytes of memory in the worst case. That was as much as the entire video memory on many cards of the era. And since cards didn't have a T&L unit to offload geometry processing from the CPU, larger textures taxed rendering that much more. This is why many games in the late 90s capped textures at 256x256: partly because of video card limits, partly because of practical performance limits.
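The arithmetic behind that 16 MB figure is just width x height x bytes per texel; a quick C check:

#include <stdio.h>

int main(void)
{
    unsigned long w = 2048, h = 2048;
    unsigned long bytes_per_texel = 4;              /* 32-bit RGBA */
    unsigned long bytes = w * h * bytes_per_texel;  /* 16,777,216 */
    printf("%lux%lu @ 32bpp = %lu bytes (%lu MB)\n",
           w, h, bytes, bytes / (1024UL * 1024UL));
    return 0;
}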

S3's gimmick with S3TC compression was that it allowed more, and larger, textures to be used while consuming far less memory. Using the previous example of a 16 MB 2048x2048 texture, S3TC gets that down to 2 MB with DXT1 (a fixed 8:1 ratio against 32-bit RGBA), or 4 MB with the alpha-carrying DXT3/DXT5 formats. But as with anything, there's no free lunch here: what you save in memory, you lose in quality. S3TC compression is VERY lossy, roughly equivalent to a bad JPEG compression job. Another problem it causes is with texture modding. Since the source texture is already compressed, every edit-and-recompress cycle distorts the image further, and you can only do that so many times before you have a garbled mess.
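For the curious, the size math is easy to reproduce: DXT1 and DXT3/DXT5 both work on 4x4 texel blocks, with DXT1 packing each block into 8 bytes and DXT3/DXT5 into 16. A small C sketch:

#include <stdio.h>

/* S3TC/DXTn compressed size: 4x4 texel blocks, 'block_bytes' per block
 * (8 for DXT1, 16 for DXT3/DXT5). */
unsigned long dxt_size(unsigned long w, unsigned long h, unsigned long block_bytes)
{
    unsigned long blocks_x = (w + 3) / 4;   /* round up to whole blocks */
    unsigned long blocks_y = (h + 3) / 4;
    return blocks_x * blocks_y * block_bytes;
}

int main(void)
{
    printf("2048x2048 RGBA uncompressed: %lu MB\n",
           2048UL * 2048UL * 4UL / (1024UL * 1024UL));       /* 16 MB */
    printf("2048x2048 as DXT1:           %lu MB\n",
           dxt_size(2048, 2048, 8) / (1024UL * 1024UL));     /*  2 MB */
    printf("2048x2048 as DXT5:           %lu MB\n",
           dxt_size(2048, 2048, 16) / (1024UL * 1024UL));    /*  4 MB */
    return 0;
}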

But S3TC solved a major problem with huge texture sizes, and was widely adopted in the game industry. It's still in use today.
 