You know, I think we're going to have to agree to disagree here, because we're talking right past each other. Your position seems to boil down to the Mach64 "sucking" because it wasn't in the same league as Nvidia's or 3Dfx's contemporary offerings and was over-marketed. My point is that the same chip was a perfectly great GUI accelerator that certainly had its flaws when it came to 3D, but still beat the **** out of having something by Cirrus Logic or S3 soldered to your motherboard, and therefore cannot reasonably be described as "laughable". If your entire basis of comparison is the two *very best 3D cards on the market* at its introduction then, yes, it sucks ****s, but otherwise, no.
Just generally speaking, when you're describing the value/"goodness" of a thing there are in fact many shades of grey between "wonderful" and "laughable". I'm not saying it was wonderful, far from it, but really... "laughable"? In a world where Trident Microsystems existed, do you *really* think "laughable" is a defensible description of pretty much *anything* ATI's ever made? Sorta don't think so, sorry.
Not really sure what your angle is; you basically reiterated why I think the Rage was awful. Those benchmarks are also skewed by the fact that the CPU is several years newer than what was available at the time of the card's release (mostly Pentiums, K5s, K6s, and Cyrix chips).
Look, I don't know what to tell you there. I pulled that link out of the air as a source for benchmarks of ancient cards, and I guess I can't really help it if you don't like the fact that the Rage Pro pretty consistently manages a decent showing in those game benchmarks. I was there at the time, and while I wasn't a "real gamer" I did play casually. My data points are pretty much:
A: The Rage card I found on sale in... 1998? to replace the S3 Trio in my K6-233 seemed like a freaking miracle by comparison. Later I found a Riva 128, and yes, it played games better, but I actually swapped the Rage back in and put the Riva into a slower Pentium, because the K6 spent most of its time in Linux and for that the Rage was less fussy.
B: This is in reference to the laptop version, but in 2000 I was able to quite effectively murder coworkers during after-hours Counter-Strike tournaments with just my Rage LT-equipped laptop. Being a Rage Pro, it had some amusing visual defects in certain games; one of my favorites was the lack of fog in Star Wars Episode I: Racer, which let you see the track being drawn off in the distance ahead of you. But the frame rate was perfectly acceptable, which was something you couldn't say about *many* of the chipsets being stuffed into laptops at the time.
Have you ever tried to get hardware-accelerated video modes on Rage cards working under *nix? It's nigh impossible. Even back in the day, at minimum you had to spend hours mucking with xorg.conf and hope the planets were in alignment. It's only gotten harder over time, as lots of older cards (including the Rage series) have had support dropped from Xorg and Mesa. You either have to stick with really old Linux distros or do some transmogrification to bolt support for those old cards onto a modern Xorg server and Mesa, then recompile the mess and hope it works.
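For flavor, the kind of stanza you had to hand-write looked something like this (a sketch from memory, not a working config for any particular machine; option names and the driver split varied a lot between server versions):

```
Section "Device"
    Identifier "ATI Rage"
    Driver     "ati"                 # later releases split this into mach64/r128/radeon
    Option     "AccelMethod" "XAA"   # the old 2D acceleration architecture
EndSection

Section "Screen"
    Identifier "Screen0"
    Device     "ATI Rage"
    DefaultDepth 16
    SubSection "Display"
        Depth  16
        Modes  "1024x768" "800x600"  # and pray your monitor's timings agree
    EndSubSection
EndSection
```

And if the server didn't come up, you got to read the log, guess which option was wrong, and try again.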
If you mean *standard* video output, then of course the vesafb driver will work; it works on just about any GPU with sufficient video memory. But that has nothing to do with ATI.
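(For anyone following along: vesafb is just the kernel's generic VESA framebuffer, and enabling it was nothing more than a boot parameter along these lines; the mode number comes from the card's VESA BIOS tables, not from anything ATI-specific:)

```
# Boot-loader kernel line sketch: vga=791 asks the VESA BIOS for
# 1024x768 at 16 bpp, which works on any card with a VESA 2.0 BIOS
kernel /boot/vmlinuz root=/dev/hda1 vga=791
```

Which is exactly why it proves nothing about the quality of ATI's drivers, good or bad.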
I was there, and I have no idea what you're talking about. On x86 at least, the Mach64 driver was easy-peasy to get working *except for 3D*. I'll totally admit that was a problem (the Mach64 DRI driver was always something of a red-headed stepchild), but 3D was hard to get working at all under Linux back then, so "problem" was a matter of degree. The 2D driver was fine, and the acceleration significantly helped with tasks like playing DVDs. (But wait, we're talking right past each other again! I'm talking about the Mach64 making for a perfectly good accelerated X server, while you're making 3D the one and only measure of a video card's worth.)
On non-x86 platforms I do think the Mach64 driver *may* have suffered from the same issue a number of XFree86's drivers did at the time, in that they relied on the BIOS to set up video modes (this was specifically something of a Linux-ism). Whether it was for that reason or just a bullheaded obsession with sticking to the Open Firmware framebuffer to "keep it simple", it was indeed a pain in the **** to get the native drivers for *any* card working under the early versions of Linux on the Mac. And yes, Xorg has been regularly shedding drivers for older cards, so *today* you're not going to have a prayer of getting 3D working at all (there was a huge purge in... 2011?) and 2D might be a long shot, but you're not going to be in a whole lot better a position if your machine sports an S3/Trident/Matrox/Voodoo/whatever.
And of course, if we're going to bag on the Mach64 for iffy 3D support under Linux we could *totally* go down that rabbit hole with those other cards. How's Mesa working for you on that RIVA 128? Oh, wait, there never was a driver for that, proprietary *or* open source. And the tdfx driver was an atrocious house of cards and yes, I know first-hand because I had a Voodoo3 and am well aware of just what a schizophrenic piece of hardware it was.
If there were ever any motherboards...
I did realize as soon as I said it that I was probably wrong about there *never* being motherboards integrating those chips because, well, when you're talking PCs, everything that can exist did exist at some point, no matter how terrible an idea. In any case, you proved my point: by no means were those chips *ever* the sort of thing you could have expected Apple to solder onto a Mac's system board.
Aaaanyway, my other point, which appears to have been completely lost in this discussion, is that even if the original tray-loading iMac had been equipped with Richard Scarry's BEST GPU EVER! (circa 1998), it would have sucked running the early versions of OS X, because *all* Macs sucked running the early versions of OS X. The Mach64 was basically a non-issue until Jaguar. In fact, take note: the Public Beta came out in September 2000, and at that point every Mac Apple was selling came with either a Rage Pro or a Rage 128-based GPU as standard equipment. Those are equally unusable with Quartz Extreme, which, again, was two years in the future, so unless you specifically got the Radeon BTO option in a G4 tower or Cube, your machine was going to be "forever crippled" by its video card as far as OS X is concerned, just like that original iMac. The iMac did not suck because of its video chipset. It sucked for plenty of other reasons, but I'd put things like "two USB ports" and "terribly unreliable analog boards" way higher on the list.