

I have an old Mac Performa 550. No CD caddy, keyboard, or mouse. It's from '93, and I am trying to sell it on Craigslist. But what do you guys think it's worth?
 
I ran YDL on the iMac someone left on my doorstep in 2000 for a while. It was... underwhelming. Ran a lot faster than the OS X Public Beta, though!
 
I ran YDL on the iMac someone left on my doorstep in 2000 for a while. It was... underwhelming. Ran a lot faster than the OS X Public Beta, though!

It's probably because of the ATI Rage GPUs used in all iMac G3s (and most Macs going back to the earlier PowerPC 60x models). They were awful to begin with and didn't really have proper driver support in Xorg. I have an iMac G3 Snow with a 600 MHz CPU and 1 GB of RAM; it flies on anything that doesn't require video acceleration.

The original iMac G3s with 233/266/333 MHz CPUs have an even worse time because they have such a limited amount of video memory. The original 233 MHz variant only had 2 MB of video memory, which is barely enough to run at 800x600 with 32-bit color (you'd be using roughly 1.8 MB just for the framebuffer). The later 233B/266/333 MHz variants weren't much better with 6 MB (1024x768@32bpp would chew through half of it). 3D games are pretty much out, because the slowness of constantly swapping textures to system memory turns everything into a slide show.
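To put numbers on that, here's a quick back-of-the-envelope sketch (plain Python, purely illustrative; it assumes a single linear framebuffer at 4 bytes per pixel and ignores whatever else the driver carves out of VRAM):

```python
# Rough framebuffer-size math for the VRAM figures above.
# Assumes one linear framebuffer at 4 bytes/pixel (32-bit color); real drivers
# also need VRAM for cursors, off-screen pixmaps, textures, etc.

def framebuffer_mb(width, height, bits_per_pixel):
    """MiB needed to hold a single frame."""
    return width * height * (bits_per_pixel // 8) / (1024 * 1024)

for w, h in [(800, 600), (1024, 768)]:
    print(f"{w}x{h} @ 32 bpp: {framebuffer_mb(w, h, 32):.2f} MB")

# 800x600  -> ~1.83 MB, nearly all of the original 2 MB
# 1024x768 ->  3.00 MB, half of the later 6 MB
```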

It's a really hard choice to make: Use OS X/OS 9 and be stuck with legacy software or use Linux and have up to date software at the expense of video performance.
 
It's probably because of the ATI Rage GPUs used in all iMac G3s ...

The performance of OS X Public Beta (and 10.0 and 10.1, for that matter) pretty much sucked on every Mac in existence at the time. Remember, "Quartz Extreme", i.e., GPU acceleration of the Quartz Compositor, didn't exist until 10.2, so prior to that point the entire GUI essentially consisted of huge bitmaps rendered by the main CPU and stuffed into the framebuffer, making fairly minimal use of any sort of acceleration. So, yeah, the Rev. B iMac I was playing with sucked, but its Rage Pro was only one of *many* reasons why.
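To make the "huge bitmaps rendered by the main CPU" point concrete, here's a toy illustration of what software compositing boils down to: a per-pixel alpha blend done for every translucent window on every update. (This is just a NumPy sketch for illustration; it has nothing to do with Apple's actual Quartz code, and the sizes and values are made up.)

```python
import numpy as np

# Toy software compositor: blend one translucent window over the desktop.
# Only an illustration of the per-pixel work involved, not Apple's code.
H, W = 768, 1024
desktop = np.zeros((H, W, 3), dtype=np.float32)            # background pixels
window  = np.random.rand(300, 400, 3).astype(np.float32)   # a window's bitmap
alpha   = 0.9                                              # window opacity

y, x = 100, 200                                            # window position
region = desktop[y:y+300, x:x+400]
desktop[y:y+300, x:x+400] = alpha * window + (1 - alpha) * region

# Without Quartz Extreme, blends like this ran on the CPU and the result was
# copied into the framebuffer; with QE the GPU does the blending instead.
```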

(And really, I generally like the 3D Rage Pro, in both its desktop and mobile versions. By today's standards it's pretty laughable, sure, but at the time it was roughly as good as its contemporaries; on the iMac it played "Nanosaur" competently enough under OS 9. Under Linux it wasn't so good, because it was *hard* to make Yellow Dog even try to use the accelerated drivers; by default it used a simple framebuffer akin to the contemporary generic VESA framebuffer XFree86 driver. PCs with the same chip didn't have that problem.)

Public Beta was so lovably broken. Back then you had the option to install on a "UNIX" filesystem instead of HFS+; that's how I did it the first time, and very shortly discovered that a significant percentage of the small base of available software, including the X11 port, broke on filesystems that actually enforced case sensitivity. Good Jorb.
 
very shortly discovered that a significant percentage of the small base of available software, including the X11 port, broke on filesystems that actually enforced case sensitivity. Good Jorb.

That joy lives on today. The main FS has to be case-insensitive if you want to install Acrobat Pro.
 
By today's standards it's pretty laughable, sure, but at the time it was roughly as good as its contemporaries;

It was laughable when it was new. The 3D Rage Pro was released in 1998, yet it was easily beaten by GPUs released two years prior (like the Riva 128 from Nvidia and the Voodoo1 from 3dfx). And, in common with every Rage GPU, it had pretty much zero support for OpenGL. This wouldn't have been a problem if it weren't for the fact that ATI touted their later Rage cards as gaming chips (which they weren't). Considering that pretty much every game in that era used either OpenGL or Glide, it killed ATI's position in the market.

ATI never got their drivers together to get OpenGL working properly on the Rage, if it was even possible on their anemic GPUs. By the time it was used in the iMac G3, it was years out of date. The only reason it was used in the iMac, or any PC for that matter, is that it was probably the cheapest GPU that could get anything onto the screen.

What boggles my mind is that Rage GPUs were STILL found in servers as late as 2011. When stocks of those chips finally ran out, Chinese companies started ripping off the Rage XL core and producing clones instead. These chips have no place in servers; no OS today has proper drivers for them, leaving horribly slow VESA modes that chew up tons of CPU time just to redraw the screen.
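For a sense of scale, here's a rough back-of-the-envelope on what an unaccelerated VESA console costs (the figures below are illustrative assumptions, not measurements of any particular server board):

```python
# Back-of-the-envelope: cost of repainting an unaccelerated VESA framebuffer.
# All figures below are illustrative assumptions, not measurements.

WIDTH, HEIGHT, BYTES_PER_PIXEL = 1024, 768, 4   # assumed 32-bit VESA mode
REDRAWS_PER_SECOND = 25                         # assumed modest update rate
PCI_PEAK_MB_S = 133                             # theoretical 32-bit/33 MHz PCI peak

frame_mb = WIDTH * HEIGHT * BYTES_PER_PIXEL / (1024 * 1024)
traffic_mb_s = frame_mb * REDRAWS_PER_SECOND

print(f"{frame_mb:.1f} MB per full redraw, ~{traffic_mb_s:.0f} MB/s of CPU copies")
print(f"that's ~{traffic_mb_s / PCI_PEAK_MB_S:.0%} of the PCI bus's theoretical peak")
# An accelerated driver keeps fills, blits, and scrolls on the card instead.
```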
 
It was laughable when it was new.

Uhm, no. It wasn't. Yes, it had its flaws and, yes, it could be beaten by full-blown GAMERZ cards, but its performance was more than adequate for something positioned as a "mainstream" graphics solution. Yes, ATI may have seriously oversold its gaming credentials in their marketing BS, but that doesn't detract from the fact it's a basically competent "GUI accelerator". (And regardless of whatever problems it had with Windows drivers the Linux/Unix support for the hardware was always very solid.)

In any case, it's totally beside the point. The early versions of OS X performed terribly on the most powerful Macs you could buy at the time. It just made it that much worse that I happened to first experience it on the slowest officially supported machine. It was seriously hardly better on a 450 MHz Sawtooth; I played with it on one of those too. 3D performance was basically irrelevant in OS X until 10.2.
 
My strawberry iMac (admittedly with a 600 MHz Sonnet HARMONi) performed substantially better in 10.2 with ShadowKiller. The blur effects under menus and windows brought it to its knees. With that monkey off its back, graphics performance isn't bad, considering.

That said, the 7300 with the Rage 128 card next door wipes the floor with it. I bought that Rage Orion when it was new, and it was probably some of the best money I spent at the time. But it's running OS 9, so.
 
That said, the 7300 with the Rage 128 card next door wipes the floor with it. I bought that Rage Orion when it was new, and it was probably some of the best money I spent at the time. But it's running OS 9, so.

Well, yeah. If you're talking about 3D performance/feature completeness the Rage 128 is really the point where ATI became a serious-for-reals player. But for onboard/integrated graphics chips in that era you could do soooo much worse than the Mach64 Rage. So far as I'm aware no one ever soldered a Voodoo or Riva 128 to a motherboard.
 
Uhm, no. It wasn't. Yes, it had its flaws and, yes, it could be beaten by full-blown GAMERZ cards, but its performance was more than adequate for something positioned as a "mainstream" graphics solution. Yes, ATI may have seriously oversold its gaming credentials in their marketing BS, but that doesn't detract from the fact it's a basically competent "GUI accelerator".

Not really sure what your angle is; you basically reiterated why I think the Rage was awful. Those benchmarks are also skewed by the fact that the CPU is several years newer than what was available at the time of the card's release (which was mostly Pentiums, K5s, K6s, and Cyrix chips).

(And regardless of whatever problems it had with Windows drivers the Linux/Unix support for the hardware was always very solid.)

Have you ever tried to get hardware-accelerated video modes on Rage cards working under *nix? It's nigh impossible. Even back in the day, at minimum you had to spend hours mucking with xorg.conf and hope the planets were in alignment. It's only gotten harder over time as lots of older cards (including the Rage series) have had support dropped from Xorg and Mesa. You either have to stick with really old Linux distros or do some transmogrification to bolt support for those old cards onto a modern Xorg server and Mesa, then recompile the mess and hope it works.

If you mean *standard* video output, then of course the vesafb driver will work, it works on most any GPU with sufficient video memory. But that has nothing to do with ATI.

Well, yeah. If you're talking about 3D performance/feature completeness the Rage 128 is really the point where ATI became a serious-for-reals player. But for onboard/integrated graphics chips in that era you could do soooo much worse than the Mach64 Rage. So far as I'm aware no one ever soldered a Voodoo or Riva 128 to a motherboard.

If there were ever any motherboards with a Voodoo or Voodoo2 chip on them, they're so rare that they might as well not exist. However, while extremely rare, there were motherboards with Voodoo3 IGPs.

Here's an example:

http://www.hw-museum.cz/view-vga.php?vgaID=56

Likewise with the Riva 128, the Intel RC440BX has one:

http://www.nitroware.net/images/stories/zotac_z77/rc440bx.jpg
 
You know, I think we're going to have to agree to disagree here, as we're talking right past each other. Your position seems to boil down to the Mach64 "sucking" because it wasn't quite in the same league as Nvidia or 3Dfx's contemporary offerings and was over-marketed, whilst my point is that the same chip was a perfectly great GUI accelerator that when it came to 3D certainly had its flaws but still beat the **** out of having something by Cirrus Logic or S3 soldered to your motherboard and therefore cannot be reasonably described as "laughable". If your entire basis of comparison consists of the two *very best at 3D cards on the market* at its introduction then, yes, it sucks ****s, but otherwise, no.

Just generally speaking, when you're describing the value/"goodness" of a thing there are in fact many shades of grey in between "wonderful" and "laughable". I'm not saying it was wonderful, far from it, but really... "Laughable"? In a world where Trident Microsystems existed do you *really* think "laughable" is a defensible position for describing pretty much *anything* ATI's ever made? Sorta don't think so, sorry.

Not really sure what your angle is; you basically reiterated why I think the Rage was awful. Those benchmarks are also skewed by the fact that the CPU is several years newer than what was available at the time of the card's release (which was mostly Pentiums, K5s, K6s, and Cyrix chips).

Look, I don't know what to tell you there. I pulled that link out of the air as a source for benchmarks of ancient cards, and I guess I can't really help it if you don't like the fact that the Rage Pro pretty consistently manages a decent showing on these game benchmarks. I was there at the time and while I wasn't a "real gamer" I did play casually, and my data points are pretty much:

A: The Rage card I found on sale in... 1998? to replace an S3 Trio in my K6-233 seemed like a freaking miracle by comparison. Later I found a Riva 128 and yes, it played games better, but I actually swapped the Rage back in and put the Riva into a slower Pentium because the K6 spent most of its time in Linux and for that the Rage was less fussy.

B: This is in reference to the laptop version, but in 2000 I was able to reasonably effectively murder coworkers during after-hours Counter-Strike tournaments with just my Rage LT-equipped laptop. Being a Rage Pro, there were some amusing visual defects in certain games; one of my favorites was the lack of fog in Star Wars Episode I Racer, which let you see the track being drawn off in the distance ahead of you, but the frame rate was perfectly acceptable, which was something you couldn't say about *many* of the chipsets that were being stuffed in laptops at the time.

Have you ever tried to get hardware-accelerated video modes on Rage cards working under *nix? It's nigh impossible. Even back in the day, at minimum you had to spend hours mucking with xorg.conf and hope the planets were in alignment. It's only gotten harder over time as lots of older cards (including the Rage series) have had support dropped from Xorg and Mesa. You either have to stick with really old Linux distros or do some transmogrification to bolt support for those old cards onto a modern Xorg server and Mesa, then recompile the mess and hope it works.

If you mean *standard* video output, then of course the vesafb driver will work, it works on most any GPU with sufficient video memory. But that has nothing to do with ATI.

I was there, and I have no idea what you're talking about. On x86 at least the Mach64 driver was easy-peasy to get working *except for 3D*; I'll totally admit that was a problem (the Mach64 DRI driver was always sort of a red-headed stepchild), but 3D was hard to get working at all back then under Linux, so "problem" was a matter of degree. But the 2D driver was fine, and the acceleration significantly helped with tasks like playing DVDs. (But wait, we're talking right past each other again! I'm talking about Mach64 making for a perfectly good accelerated X server while you're making 3D the one and only measure of a video card's worth.)

On non-x86 platforms I do think the Mach64 driver *may* have suffered from the same issue that a number of XFree86's drivers did at the time, in that they relied on the BIOS to set up video modes (this was specifically something of a Linux-ism). Whether it was for that reason or just a bullheaded obsession with sticking with the Open Firmware framebuffer to "keep it simple", it was indeed a pain in the **** to get the native drivers for *any* card working under the early versions of Mac Linux. And yes, Xorg has been regularly shedding drivers for older cards, so *today* you're not going to have a prayer of getting 3D working at all (there was a huge purge in... 2011?) and 2D might be a long shot, but you're not going to be in a much better position if your machine sports an S3/Trident/Matrox/Voodoo/whatever.

And of course, if we're going to bag on the Mach64 for iffy 3D support under Linux we could *totally* go down that rabbit hole with those other cards. How's Mesa working for you on that RIVA 128? Oh, wait, there never was a driver for that, proprietary *or* open source. And the tdfx driver was an atrocious house of cards and yes, I know first-hand because I had a Voodoo3 and am well aware of just what a schizophrenic piece of hardware it was.

If there were ever any motherboards...

I did realize as soon as I said it that I was probably wrong about there *never* being motherboards integrating those chips because, well, when you're talking PCs everything that can exist did exist at some point, no matter how terrible an idea. In any case, you proved my point: by no means were those chips *ever* the sort of thing you could have expected Apple to solder onto a Mac's system board.

Aaaanyway, my other point, which appears to have been completely lost in this discussion, is that even if the original Tray-Loader iMac had been equipped with Richard Scarry's BEST GPU EVER! (circa 1998) it would have sucked running the early versions of OS X because *all* Macs sucked running the early versions of OS X. The Mach64 was basically a non-issue until Jaguar and in fact take note: Public Beta came out in September 2000, and at that point every Mac Apple was selling was equipped with either a Rage Pro or Rage 128-based GPU as standard equipment. Those are equally unusable with Quartz Extreme, which, again, was 2 years in the future, so unless you specifically got the Radeon BTO option in a G4 Tower or Cube your machine was going to be "forever crippled" by its video card so far as OS X is concerned, just like that original iMac. The iMac did not suck because of its video chipset. It sucked because of plenty of other reasons, but I'd put things like "2 USB ports" and "terribly unreliable analog boards" way higher on the list.
 
You know, I think we're going to have to agree to disagree here, as we're talking right past each other.

Yes, we are. I was talking about the 3D performance and drivers of the Rage, and you keep going off on the 2D performance of the Mach64, which isn't even in the Rage family and which I never said anything about to begin with. The only relation the Mach64 has to the Rage is that it was its direct predecessor.

Your position seems to boil down to the Mach64 "sucking" because it wasn't quite in the same league as Nvidia or 3Dfx's contemporary offerings and was over-marketed, whilst my point is that the same chip was a perfectly great GUI accelerator that when it came to 3D certainly had its flaws but still beat the **** out of having something by Cirrus Logic or S3 soldered to your motherboard and therefore cannot be reasonably described as "laughable". If your entire basis of comparison consists of the two *very best at 3D cards on the market* at its introduction then, yes, it sucks ****s, but otherwise, no.

My position is that the Rage was terrible for those reasons. If a card is marketed as a direct competitor to the Riva 128, TNT/TNT2, Voodoo, Voodoo2, etc., I expect it to be able to do similar things to those cards, which is to have good 3D performance without the headaches of bad drivers. The fact is that no Rage card could do this, and I'm pretty sure of it because I had almost every model of Rage card at some point or another, starting from the IIc all the way up to the Fury. I still have a handful of Rage 128 cards as well.

If you buy a product that promises a certain feature and that feature turns out not to exist, it's false advertising. I'm not going to go easy on a company and let them off the hook just because their product is cheaper than everyone else's.

Just generally speaking, when you're describing the value/"goodness" of a thing there are in fact many shades of grey in between "wonderful" and "laughable". I'm not saying it was wonderful, far from it, but really... "Laughable"? In a world where Trident Microsystems existed do you *really* think "laughable" is a defensible position for describing pretty much *anything* ATI's ever made? Sorta don't think so, sorry.

I have several Trident and S3 cards and actually prefer them over Rage cards. The performance leaves a lot to be desired, but at least the drivers were solid and they weren't trying to pretend to be something they weren't.
 
Yes, we are. I was talking about the 3D performance and drivers of the Rage, and you keep going off on the 2D performance of the Mach64, which isn't even in the Rage family and which I never said anything about to begin with. The only relation the Mach64 has to the Rage is that it was its direct predecessor.

Uhm, no. I have a habit of using "Mach64" to avoid confusion with the Rage 128 chipset, and because the Rage (up through "Pro"/"Mobility"/whatever) is a direct descendant/member of the Mach64 family: the Rage I and Rage II were even *branded* "Mach64 GT/264GT" and "Mach64 GT-B/264GT-B" respectively, and the same XFree86 driver was used for all Mach64 chipsets up to and including the 3D Rage Pro and its laptop brethren. It's a Mach64, sorry.

As I said, I'm fully cognizant of the fact that you don't cut the Rage-labeled chips a fig of slack for being decent 2D chips; it's all about how they TOTALLY SUCK at 3D. But they *are* decent 2D chipsets with excellent alternate-OS compatibility *and*, although it's not very good, they at least had a DRI driver under Linux that sorta works despite some warts, unlike the Riva 128 or any Voodoo prior to the Voodoo3. (And the DRI driver for that was bad, really bad. I wasted so much time trying to make that work with *anything*; the Matrox G400 and Rage 128 were easy as pie by comparison and the video quality was far better.)

Anyway, whatever. Apparently you must *really* have gotten burned by buying one because of ATI's evil false advertising, as I can't imagine why someone would hold this much of a grudge against an inanimate object. I'm sorry?
 
Uhm, no. I have a habit of using "Mach64" to avoid confusion with the Rage 128 chipset, and because the Rage (up through "Pro"/"Mobility"/whatever) is a direct descendant/member of the Mach64 family: the Rage I and Rage II were even *branded* "Mach64 GT/264GT" and "Mach64 GT-B/264GT-B" respectively, and the same XFree86 driver was used for all Mach64 chipsets up to and including the 3D Rage Pro and its laptop brethren. It's a Mach64, sorry.

Just because early Rage chips were sold under the older Mach64 moniker doesn't mean they're in the same family. It's a common marketing tactic: older parts get rebranded to sell under a new brand name or, vice versa, newer parts get sold under an older, established name "to test the water" or because the company doesn't want to go through the pains of making a new brand name yet.

Here's a 3D Rage II under the old Mach64 moniker and under the new Rage II moniker:

Mach64:
http://www.vgamuseum.info/images/palcal/ati/34_ati_3d_rage_ii_pn.109-38200-00_top_hq.jpg

Rage II:
http://www.vgamuseum.info/images/zaatharen/ati/rageII_fhq.jpg

This marketing scheme has been used by all video card manufacturers on nearly all of their lines, and the same reasoning applies to them.

And the reason the Mach64 Xorg drivers support Rage chips is that Rage chips were used on Mach64-branded video cards. The naming of the driver is for sanity's sake, not because it defines what generation the chips are. It's the same reason why Nvidia drivers that support the GT200 core also support the G92 core: because of the GTS250, which was a rebranded 9800GTX+.

Anyway, whatever. Apparently you must *really* have gotten burned by buying one because of ATI's evil false advertising, as I can't imagine why someone would hold this much of a grudge against an inanimate object. I'm sorry?

Anyone is going to be mad if they pay for promised features that turn out to be non-existent. I can't really imagine anybody paying for an advertised 3D card that doesn't do 3D and being like "oh ok whatever, it still does 2D fine!"; that makes no sense whatsoever. People letting companies roll over on them and not complaining about it is why they keep getting rolled over on and ripped off.
 
Just because early Rage chips were sold under the older Mach64 moniker doesn't mean they're in the same family....

And the reason the Mach64 Xorg drivers support Rage chips is that Rage chips were used on Mach64-branded video cards. The naming of the driver is for sanity's sake, not because it defines what generation the chips are. It's the same reason why Nvidia drivers that support the GT200 core also support the G92 core: because of the GTS250, which was a rebranded 9800GTX+.

Again, clearly I'm not going to convince you so I give up... but the fact is you're completely wrong. Read the sources for the Linux drivers; a "Rage" is a Mach64 with 3D support grafted on, and the same 2D code is reused for all of them. (In XFree86 3.3 all the ATI support was bundled into an "ATI" driver to keep it simple for the user, but that itself was composed of different modules for different families of chips, i.e., Mach8, Mach32, etc. If the Rage had really merited its own family designation on technical grounds, it would have had its own module.) There were multiple generations of the non-3D Mach64 chips as well, with different feature sets: which one was the REAL Mach64? I mean, for crying out loud, the NAME OF THE DRI MODULE WAS "MACH64":

http://dri.freedesktop.org/wiki/ATIMach64/

Talk until you're blue in the face about it not being a "genuine" Mach64 if you really want to but it seems pretty clear that most of the world disagrees. (It's an incredibly stupid thing to argue about in the first place so, yeah, I don't care anymore. You win!)
 