IBM 8512 vs Modern Monitors

evildragon

Veteran Member
Joined
May 29, 2007
Messages
1,646
Location
Tampa Florida
I recently found my first VGA monitor, my good ol' trusty IBM 8512. My understanding is that this is pretty much the first VGA monitor.

So, since I use my SEGA Dreamcast with VGA monitors (and it outputs the standard 31.5 kHz scan rate, 640x480 at 60Hz), I figured I'd use the 8512 for games instead.
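For reference, those numbers fall straight out of the published VGA timing parameters. Here's a quick sanity check (using the standard 25.175 MHz dot clock and the 800x525 total raster from the VGA spec, not anything measured from the Dreamcast itself):

```python
# Back-of-the-envelope check of standard 640x480 VGA scan rates.
# The constants are the published VGA mode timings (pixel clock and
# total raster size including blanking), not measured values.

def vga_timings():
    pixel_clock_hz = 25_175_000   # standard VGA dot clock, 25.175 MHz
    h_total = 800                 # pixels per scanline, incl. blanking
    v_total = 525                 # lines per frame, incl. blanking

    h_freq_khz = pixel_clock_hz / h_total / 1000        # horizontal scan rate
    v_freq_hz = pixel_clock_hz / (h_total * v_total)    # vertical refresh
    return round(h_freq_khz, 2), round(v_freq_hz, 2)

print(vga_timings())  # -> (31.47, 59.94)
```

So "31.5 kHz / 60 Hz" is the usual shorthand for what is really about 31.47 kHz and 59.94 Hz.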

Almost instantly, I was blown away. Sure, the 8512 may not have high resolution or high DPI, but dang, its contrast and colors were MUCH better than any of my modern CRT monitors!

Is this really a case of "they don't make them like they used to"?

I mean, most of my modern CRTs go bad after a few years, yet this 8512 still works as well as the day I originally got it (and mine is dated 1987).

Here's some example pictures of how the colors on the 8512 are just plain "richer" and "deeper" than on a modern CRT... (both monitors had their white balance professionally calibrated)

Modern Monitor: (notice how the floor has a weird purple-ish hue)
http://blackevilweredragon.spymac.com/moderncrt.jpg

IBM 8512 Monitor: (notice how the colors are deeper)
http://blackevilweredragon.spymac.com/8512crt.jpg

(Please ignore the moire pattern from the shadowmask's DPI differences)
 
You've set my mind to working (which could be dangerous) and reminded me that back when the 486 was king, you couldn't beat a VESA local bus video card with the Tseng Labs ET-4000 chipset.

I worked in a small computer store at that time, and we compared several video cards (including some very expensive ones) and in my opinion, none were better than the ET-4000. Hard to quantify the differences, but for one thing the colors looked "richer." Sadly, the next generation of Tseng Labs chipsets looked like everyone else's. It's only the ET-4000 that really shines.

Now, if you had a Tseng Labs ET-4000 video card for your 8512 monitor, just think how that would look. :)

Kent
 
If it looked richer, chances are it was the DAC it used... you could probably find another video card with the same DAC... ;)

Anyway, I ran a test on my 8512, and sure enough, it has better contrast than my modern monitors... it's kind of scary actually, since my 8512 has never been serviced...
 
The older VGA monitors were probably overengineered (they could get away with that because of the money they charged back then).

As far as video quality on the same monitor goes, it is the RAMDAC on the video card that counts. Later video chips like the Tseng ET6000 started incorporating the RAMDAC into the video chip itself, with varying results. You can generally tell the difference between RAMDACs at higher resolutions, and the chip used will limit your refresh rates and resolution at the high end.
 
Would that be the same as a Palette DAC? I have a video card that appears to have a GPU and an IBM -- thing that says Palette DAC. Same thing or is that totally different?
 
i think i saw that card, where i used to work, in the stock room..

does the IBM thing look like a PowerPC chip?
 
Yup, that's the IBM thingy. The card I have used to be high end and is Mac only, so tragically I have no use for it.
 
I wonder if yours has the PCI connector or the NuBus connector... the card I saw was PCI, and would fit my Mac (which runs OS X and OS 9, dual booted)...

I can't help but wonder why they would use such a big DAC, but I used to have a Diamond Weitek card (long gone; I used an SMD rework station and removed all the chips on it), and its RAMDAC was huge...
 
I'm really curious about the chip... I wonder if it has raster effects, like how game consoles do some raster effects...

On another note, why do so many people hate the IBM 8512? I mean, it has great picture quality, so what's the hate about? Is it a failure rate or something? Mine's never died, so maybe I'm lucky...
 
I have no way to put the card through its paces; the last Mac I had capable of using it malfunctioned a lot before it gave out. I suppose I could try to revive it.
 
I believe IBM used similar RAMDAC technology on some of their RS/6000 PPC workstation graphics cards too. If nothing else, AIX CDE was rather speedy on a Model 250 compared to Solaris CDE on a SS10. I can't recall if it had more vibrant colours.

That would be MCA bus, but surely there neither was a Mac with MCA nor would you confuse PCI and MCA cards... :)
 
Concerning the "unpopularity" of the 8512 Display: I wasn't really aware it was unpopular, but if memory serves correctly this model only does 640x480 in non-interlaced mode, and *may* do higher resolutions with interlacing. I believe the vertical refresh rate is not very high. So, while it may have good contrast, it's probably not really great unless you're playing DOS games. For a 286 or 386 system an 8512 would have been fine. However, I think around the time of the 486, most people who were serious about their system wanted 800x600 72Hz non-interlaced graphics with a decent dot pitch (.28 or finer).

As for the ET4000 graphics cards... yes, the DOS performance is great. But not all of us just played DOS games. The ET4000 is rather mid-range when not running VGA modes. Also, a lot of the ET4000 chips went onto budget cards with crappy RAMDACs... but some companies like Hercules had better quality. Anyway, back then if you were serious about accelerated graphics you'd be running a VRAM card with an S3 or ATi chipset.
 
The thing is, though, when connected to a Sega Dreamcast, which outputs standard VGA sync with 16.7 million colors, this 8512 monitor just looks amazing... my modern monitor couldn't even compete with its awesome colors; the modern monitor suffered from major blue push...
 
Some observations on this thread:

I think that people are misusing the term RAMDAC. The RAMDAC is the part of the graphics card that converts the binary color number to a voltage level for display on the monitor. It has nothing to do with acceleration. It does control image quality, though: a good RAMDAC is essential for a clear display.
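As a rough illustration (a toy model, not any particular chip's datasheet behavior), the two halves of the name literally describe the job: a small RAM holding the color palette, feeding a DAC that produces the roughly 0-0.7 V analog levels VGA monitors expect:

```python
# Toy model of what a VGA-era RAMDAC does (illustrative only -- real
# parts like the Brooktree Bt471 add pipelining, overlays, and sync
# handling). The "RAM" is a 256-entry palette of 6-bit-per-gun RGB;
# the "DAC" maps each 6-bit value onto the 0-0.7 V VGA analog range.

PALETTE = [(v, v, v) for v in range(64)] * 4  # dummy 256-entry greyscale LUT

def ramdac(pixel_index):
    r6, g6, b6 = PALETTE[pixel_index]            # RAM: palette lookup
    to_volts = lambda v: round(v / 63 * 0.7, 3)  # DAC: 6 bits -> volts
    return tuple(to_volts(c) for c in (r6, g6, b6))

print(ramdac(63))  # full-scale entry -> (0.7, 0.7, 0.7)
print(ramdac(0))   # black -> (0.0, 0.0, 0.0)
```

The quality differences people notice come from the analog half: how cleanly and linearly those voltage steps are produced, which a lookup table can't capture.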

Second, you may be enamoured with your 8512 Display, but it's 20 years old. Electronics age and materials degrade, so it is not the same as when it was new, even if you did not use it. I'm sure that if you hooked your Sega up to a professional quality CRT tube today it would look quite nice too.

(If that's possible .. I don't think that too many manufacturers are concerned with CRT displays anymore.)

I have two fairly high quality Mitsubishi displays in the house: an old 17" ($600 in 1998) with BNC inputs and a large 19" ($480 in 2000), which is the primary monitor now. (Wow, time flies... it's 7 years old already!) Both use a derivative of the Sony Trinitron tube that Mitsubishi licensed. Even these monitors, pro quality at the time, are aging. But they are aging gracefully. :)

My next monitors will be flat panels. No doubts about it. We use flat panels at work now, and although they are not great for vintage systems, they are just wonderful and clear when used with a good video card through a DVI interface.
 
All my Trinitrons and clones are dim; they all lost their brightness, even the ones manufactured only 10 years ago (they were getting dim even at 5 years old). I adjusted their flybacks to get more brightness out of them, but I know it won't be long before that's gone too... either I'll need new flybacks, or something else is out of whack in them...

I know my 8512 isn't as good as it was, but it sure seems like it to me. I've got Trinitrons, flat CRTs, semi-flat CRTs (ones that aren't truly flat, but the lens makes them look flat), LCDs, etc., and the 8512 still just has better colors than all of them... especially the color red, it's just better. A professional calibrator used those things that clamp onto the CRT, and he said my 8512's primaries were excellent, the best he's seen in a while (compared to HDTVs too)...
 
I generally don't like CRTs. My vision is not so great, and sometimes I can watch them refresh with my glasses off (rarely). LCD is much nicer...
 
Without turning this thread into a religious war between CRT and LCD technologies: aren't LCDs more picky about which resolutions they will display perfectly? I mean, a panel is manufactured for one or a few optimal resolutions and has to emulate the others, resulting in more or less bleed and blur. My coworker (him with the IBM AT) is using some 20" Samsung LCD which has a picture format of 16:10, but so far he is unable to find a resolution or driver that fully utilizes the screen. Regular resolutions are a-OK, but a bit blurred. It could be a limitation of the graphics card in his laptop too.
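That blur can be sketched numerically: unless the requested mode divides evenly into the panel's fixed pixel grid, the scaler has to interpolate. The 1680x1050 figure below is just an assumed native resolution for a typical 20" 16:10 panel, not the actual model mentioned above:

```python
# Why a fixed-pixel LCD blurs non-native modes: any mode whose scale
# factor to the native grid isn't a uniform integer must be interpolated.
# 1680x1050 is an assumed native resolution for a 20" 16:10 panel.

def scale_factor(src, native):
    """Horizontal and vertical scale factors from source mode to panel."""
    return (native[0] / src[0], native[1] / src[1])

NATIVE = (1680, 1050)

for mode in [(1680, 1050), (840, 525), (1280, 1024), (640, 480)]:
    sx, sy = scale_factor(mode, NATIVE)
    sharp = sx == sy and sx == int(sx)   # uniform integer scale -> no blur
    print(mode, "-> scale", (round(sx, 3), round(sy, 3)),
          "sharp" if sharp else "interpolated (blurry)")
```

So 640x480 on that panel scales by 2.625 horizontally and 2.1875 vertically: every source pixel gets smeared across a fractional number of panel pixels, which is exactly the blur described above.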

Regarding what it does to your eyes, I've heard that if you use a flat screen you should take care to sit straight in front of it, or as close to that as the viewing angle allows. A receptionist or someone at a different workplace got her CRT replaced with an LCD, and was used to watching the computer from the corner of her eye, not straight in front of it. After a while she got problems with her sight and went to the eye doctor, who gave this explanation, having apparently seen it before.

Evildragon: If your 8512 is so good, you may want to use it sparingly so it will work and last for many more years to come.
 
My problem was a pre-existing condition that arose when I worked around CRTs a lot. From what I was told, the flicker of the refresh rate was hard on your eyes. I've looked at both from odd angles and was never really bothered.
 
Why I hate (and love) the 8512

My first VGA monitor was also an 8512; I still have it, and it's true what you say about the color saturation and contrast. However, it was also the first monitor that I used to try X Windows on Slackware. The biggest problem, as I recall, besides the interlace, was the fixed refresh. I have the conf file somewhere (on a Travan TR-1 tape, probably), but I seem to recall that I had to run it interlaced for 640x480, and it was very picky about the refresh rate, which was not the same as the "published" rate (i.e., it was off slightly, or I was just using the wrong one). Most likely it was all my fault, as I was still very green, but it left a whole bunch of frustrating memories. The next VGA monitor that I bought was just a 'cheap' 14 inch, but it autosynced and took all of five minutes to get up and running... hence the bias.

Still have it, but it has developed a "wiggle line" due, I'm told, to a drying capacitor, so I've moved it from my workbench and given it an easy retirement, occasionally displaying Windows 3.0 and DesqView on an old '386.

-Scrappy
 