
Representing IBM 5153 color output more accurately

VileR

Veteran Member
Joined
Jul 21, 2011
Messages
647
Location
Israel
(Preamble: to most of you, this will seem like OCD nitpicking of the worst kind, but it ain't so. Fact is, I'm working on a graphical project which targets the IBM Color Display, and accurate colors become important for the artwork, especially when you factor in things like dithering.)
Some of you are familiar with the commonly-accepted mapping of RGBI CGA colors to 24-bit (s)RGB values. This "canonical" palette assigns a level of 2/3 to the original R,G,B components, treats I (intensity) as an additional 1/3 in all three channels, and implements the "brown fix" by halving G for color #6:

5153colors-canonical.png

"Canonical" RGBI -> rgb24 palette (all values are in RGB hex triads)




However, this doesn't necessarily reflect the appearance of a CGA monitor like the 5153. Rather, it's derived from the way *VGA* rendered these 16 colors for backward compatibility - it's simply expanded from VGA's 6 bits per channel to 8.

It's canonical in the sense that it follows exactly what IBM did with VGA. But since I got my 5153, I've become convinced that this isn't a good enough model for the colors' original RGBI appearance. At least with *my* 5153, no amount of fiddling with the controls really gets the output to match that palette perceptually.


So, is it possible to get a better representation?

T-squared's recent Sanyo thread reminded me of Hugo Holden's excellent investigations of the 5153's guts. In Using an IBM EGA card in the IBM 5155 (see https://www.worldphaco.com/), pp. 21-24 present his measurement of the voltages driving the gun amplifiers for each RGBI color.

Here's the relevant chart (I hope he doesn't mind me reproducing it here):

5153drivelevels.png

These measured drive levels deviate from expected values in a number of interesting ways, which the article goes into. But for me, it's even more interesting to try deriving a more accurate palette based on these results. At least, they give us a much more realistic starting point.

Normalizing the voltage range of 0–1.30 V to the rgb24 range of 0–255 gets me this:

5153colors-holden-voltage.png

RGBI to rgb24, as derived from Hugo Holden's 5153 gun drive level measurements
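
(To be explicit about what "normalizing" means here, this is the whole conversion; the voltages in the example call are placeholders, not Hugo Holden's actual measurements, which are in his chart above.)

```python
V_MAX = 1.30  # full-scale gun drive voltage, per the chart

def drive_to_rgb24(r_volts, g_volts, b_volts):
    """Linearly map measured 0-1.30 V drive levels to 0-255 per channel.

    No extra gamma step is applied; see the note on gamma below.
    """
    def scale(volts):
        return round(max(0.0, min(volts, V_MAX)) / V_MAX * 255)
    return scale(r_volts), scale(g_volts), scale(b_volts)

# Placeholder voltages only, NOT measured values:
print("#%02X%02X%02X" % drive_to_rgb24(0.85, 0.85, 0.85))
```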



Things to note:
  • The levels were measured after the input processing stage, and I'm assuming that they don't require any further gamma correction.
    That assumption only holds if the 5153's gamma matches sRGB's (~2.2). It very likely doesn't exactly, but I couldn't find any concrete information either way, and the results are close enough to what I'm actually seeing on my monitor. (If a correction did turn out to be necessary, it would look something like the sketch after this list.)
  • The non-intensified colors are brighter (closer to the intensified ones) when compared to the "canonical" palette. But keep in mind that the measurement was done with the contrast knob in the 'full' position; dialing it down makes the non-intensified colors darker.
  • The "brown fix" for color #6 reduces green to 64%, not 50%.
  • The hi-intensity colors are also affected by the *count* of primary colors combined to create them (Hugo Holden's document explains why).
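
As for the gamma caveat in the first point above: if the 5153's transfer curve did turn out to differ from sRGB's, the correction would be something along these lines. The 2.5 is purely a placeholder; I have no measured gamma value for the 5153.

```python
def regamma(value_0_255, crt_gamma=2.5, srgb_gamma=2.2):
    """Re-encode a linearly-normalized drive level for an sRGB display.

    Treat the normalized voltage as a signal the CRT raises to crt_gamma
    to produce light, then encode that light level with the approximate
    sRGB gamma. If crt_gamma == srgb_gamma this is a no-op, which is the
    assumption the palette above relies on.
    """
    signal = value_0_255 / 255.0
    light = signal ** crt_gamma              # hypothetical 5153 light output
    return round((light ** (1.0 / srgb_gamma)) * 255)

print(regamma(0xA8))  # placeholder input and placeholder gamma
```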

This mapping already looks a lot closer to reality. But does it really tell the full story? I'm not sure.

Here's a photo of the 16 colors on my 5153. I've tried to make sure that the camera settings reflect what I'm seeing as closely as possible, and to correct and white-balance the image so that gray is really gray.
Even so it's not perfect, but it gets kinda-sorta close enough:

5153drivelevels.png

One thing you may notice is that the contribution of each primary color to overall luminance seems to be somewhat different than expected.

For example: in my derived colormap, #2 and #3 (green and cyan) are very close in brightness. Same goes for #4 and #5 (red and magenta). But on my monitor, both of those pairs show a more noticeable difference.

It's as if blue contributes more than expected to the luminance, and/or red/green contribute less. Interestingly this doesn't have a huge effect on hue... although the magentas do appear slightly more bluish than they do in my derived palette. I think. However, anything I can come up with at this point is strictly subjective.

So if Hugo Holden is reading this, or for that matter anyone else with the right knowledge...

- Is my above mapping valid?
- Can it be fine-tuned even further, based on any published information or measurable data?
- Can I even trust what I'm seeing on my monitor, or should I just ignore all those differences and chalk them up to aging components and so on?
 
However, this doesn't necessarily reflect the appearance of a CGA monitor like the 5153. Rather, it's derived from the way *VGA* rendered these 16 colors for backward compatibility - it's simply expanded from VGA's 6 bits per channel to 8.
That's not the actual issue. Look at the color sets in this post on 10 different monitors and you'll see 10 different color sets... RGB does not represent absolute colors. E.g. "#FF0000" means red at maximum intensity. That's it. Every monitor will render that how the manufacturer thinks is correct and it's also affected by the user settings (brightness etc.).

Since I'm a web designer and graphic artist, I'm using a profiled and color-calibrated display. And on that, the "Canonical" colors match your 5153 photo almost exactly, while your version of the RGB colors looks off. Moreover, no two 5153s will display the colors exactly the same either. You simply can't do this perceptually.
 
That's not the actual issue. Look at the color sets in this post on 10 different monitors and you'll see 10 different color sets... RGB does not represent absolute colors. E.g. "#FF0000" means red at maximum intensity. That's it. Every monitor will render that how the manufacturer thinks is correct and it's also affected by the user settings (brightness etc.).

Since I'm a web designer and graphic artist, I'm using a profiled and color-calibrated display. And on that, the "Canonical" colors match your 5153 photo almost exactly, while your version of the RGB colors looks off. Moreover, no two 5153s will display the colors exactly the same either. You simply can't do this perceptually.

That's fine, I've done graphics and web design myself for long enough both professionally and otherwise - I'm perfectly aware that (s)RGB is a device-dependent color space, and that there's no hope of representing absolute color with it. What I'm aiming for is a better relative representation, so that a given set of colors (displayed on an arbitrary monitor) preserves the *internal* relationship between them in terms of hue, contrast and levels.

I'm explicitly trying to avoid deriving it perceptually. That's why I went to the only source I know of for hard data based on concrete measurements of the 5153's internals. That's the right way to go about things if you're trying to emulate the behavior of such devices, and it has already been successfully done for many others.

And yes, using a VGA-derived palette *is* an actual issue if you're trying to represent the output of a very specific device (the 5153) which was never based on analog VGA signals, and which processed its input entirely differently internally.
Personally and subjectively, I have the VGA version of those 16 colors pretty much ingrained into my mind, having done ANSI art since the '90s; so I can tell when they appear 'off' in relation to each other. ;) But the only relevant part here is the available data, which objectively tells us that the 5153 displays them somewhat differently. What I'd like to know is whether any additional information exists which could improve the results' accuracy.
 
Another attempt at a color-corrected photo of my 5153; this time I used a better camera, and actually made sure that the contrast knob was in the full position for the shoot.

cgacal4-good.jpg

Now the same photo next to the two palettes, in one image (to eliminate any weirdness such as non-matching ICC color profiles, etc.)

To my eyes, the revised palette (derived from the voltages) does a somewhat better job than the "canonical" palette, at least in terms of brightness relationships. When it comes to saturation, it's still not quite accurate to what I see on the monitor... and I'm still not 100% convinced about exact hue, either.

compare.png

Hoping for Hugo Holden's insight regarding my interpretation of his voltage measurements. As mentioned, my assumption of identical non-linearity between the 5153 and sRGB/VGA (as usually approximated by a gamma law) is likely not quite right.

One more thing I noticed: when I move the contrast knob away from the full position, brown becomes slightly more greenish (i.e. less reddish). It's not really a continuous change - it just happens around a specific position of the knob, and then the same hue is retained all the way down. I'd be curious to know what that one's all about.
 
Funnily enough, I had a very similar idea a couple of months back, albeit for a Commodore 1084 rather than an IBM 5153.
My approach would have been slightly different, though:
I would have taken the schematic for the circuitry that converts the digital IRGB signal to an analog RGB signal to try and derive the relevant information mathematically.
 
Yeah, since the 1084 already supports both RGBI and analog I would assume that's what it does internally. You'd still end up with numbers that have to be mapped to generic VGA/sRGB levels somehow, but you already have an analog signal in both cases, so that's probably much easier. I guess the gamma response curve is even more likely to be similar.

The 1084 is also fairly common as a CGA monitor, so the results may be interesting too. Although my own target still remains the One True Big Blue display. :)
 
On my Tandy CM-11 the brown color is much more reddish. I'm not sure if there's something wrong with it or if they're all like that. Mine has some vertical ringing which suggests it may be due for a recapping.
 
On my Tandy CM-11 the brown color is much more reddish. I'm not sure if there's something wrong with it or if they're all like that. Mine has some vertical ringing which suggests it may be due for a recapping.
I guess not all 'brown fix' circuits were created equal, so that could be just how Tandy designed it. I've got a Samsung Samtron SC-452 where brown is nearly indistinguishable from red... but that monitor has known color/contrast problems anyway (all reds fade out and disappear within 5 minutes of power-up). Needs repair for sure, although with the huge, fuzzy dot pitch it's probably not worth it.
 
It looks pretty close to me. It is interesting what IBM did in the 5153 VDU: it is the only VDU I have seen with these unique mixing combinations, along with the oddball feature of dynamically deactivating the contrast control for intensified colors. The mixing system they use gives the intensified colors a pastel look, as it effectively adds white to the color.

I have restored two 5153 VDUs, and to one of them I added an EGA scan modification; it produces a good 16-color EGA image which is about 9" diagonal on the screen. The intrinsic CRT resolution and frequency response of the 5153's gun amplifiers are very close to the 5154's.

Oddly, although the 5154 VDU, being dual standard, is more technically capable than the 5153, its hardware design was bungled from the thermal management perspective IMHO: the case is less well ventilated, and the PCBs are prone to cooking up around the video output amplifiers and the vertical scan output IC. If I owned a 5154 I would add ventilation holes to its case and a cooling fan. The 5153 also had better quality internal metalwork than the 5154.

The 5153 can be improved by eliminating the electrolytic capacitors that couple the gun amplifiers to the CRT and replacing those with film capacitors.
 
It looks pretty close to me. It is interesting what IBM did in the 5153 VDU: it is the only VDU I have seen with these unique mixing combinations, along with the oddball feature of dynamically deactivating the contrast control for intensified colors. The mixing system they use gives the intensified colors a pastel look, as it effectively adds white to the color.
I've always found those slightly pastel bright colors to be a little easier on the eyes, compared to some other schemes that just cranked up the saturation to 11 on everything. Like the ZX Spectrum's, or for that matter Microsoft's choice of the default 16-color VGA palette in Windows.
IIRC, a few third-party manufacturers had TTL RGBI monitors available for CGA some time before IBM introduced the 5153. I don't think I've seen any of those myself, but they must have had their own interpretations of intensity and contrast (and brown).

About my interpretation of your voltage measurements, would you say it's valid? After all, I simply normalized the range of 0-1.30 V to sRGB values of 0-255. But it's not a given that the 5153 has the same response curve as a more modern display, tuned for >=VGA signals and a gamma of ~2.2.

Your input is appreciated - I think the result does look pretty close, but if you're aware of any other relevant data, it might help towards a more definitive answer.

I have restored two 5153 VDUs, and to one of them I added an EGA scan modification; it produces a good 16-color EGA image which is about 9" diagonal on the screen. The intrinsic CRT resolution and frequency response of the 5153's gun amplifiers are very close to the 5154's.
Yes, I remember your writeup about that modification too - was surprised that the 5153 was able to pull it off! I guess the build quality also explains why 5153s seem to age relatively gracefully... most mentions of the 5154 these days seem to involve magic smoke and other pyrotechnics.
 
About my interpretation of your voltage measurements, would you say it's valid? After all, I simply normalized the range of 0-1.30 V to sRGB values of 0-255. But it's not a given that the 5153 has the same response curve as a more modern display, tuned for >=VGA signals and a gamma of ~2.2.

Yes, I remember your write up about that modification too - was surprised that the 5153 was able to pull it off! I guess the build quality also explains why 5153s seem to age relatively gracefully... most mentions of the 5154 these days seem to involve magic smoke and other pyrotechnics.

The interpretation of it converted into a scale of 256 steps is about as close as you can ever get without an exact analog copy of what IBM did applied to another type of VDU. Probably it would be difficult to tell the difference comparing your synthesized version of it on another brand VDU to the original 5153.

There was another interesting feature of the 5153 CRT, the picture tube itself. It had a particularly dark tinted glass faceplate compared to most color CRT's of the time. The idea being that incident light is less well reflected from the phosphor because it has to pass through the layer twice, but light from the phosphor only once to exit the CRT. So it gave a very high contrast ratio look in relatively high ambient lighting. Whenever I get a CRT VDU, I always buy a spare CRT for it (that spare parts disease I have). So I started hunting around for a new old stock CRT from my usual CRT suppliers. Interestingly, while I could get a compatible CRT with the same phosphor pitch and a gun assembly that will work, I could not get one with the same dark-colored glass faceplate. It must have been a special that Tatung had made for IBM.

The world has certainly seen some great vintage VDUs; Conrac in the USA made some amazing ones. There is one here with a green CRT that makes a great computer VDU (it is awkward to use as it requires a 3-phase power supply). I got some and fitted white P4 CRTs. Being a mil-spec VDU, it came with no manual, so I had to trace out the schematic by hand. When I did, I found some very clever black level clamp circuitry in there that turned out to have been originally invented by Tektronix. This still remains the most well constructed VDU I have ever seen, so Conrac gets the prize:

https://www.worldphaco.com/uploads/The_1987_Vintage_Avionics_Conrac_Video_monitor..pdf
 
I have seen my Tandy CM-5 display a darker brown color than my IBM 5153. If the contrast and brightness knobs are not carefully set, brown and red will be hard to tell apart on those monitors. My 5153 does not have this issue. My friend Cloudschatze has also shown me photos where the brown on his Tandy monitors (CM-5 and CM-11) is also dark brown, and my friend NewRisingSun has shown me his RGBI-capable Commodore 1084 with an IBM 5153-like brown. It seems like there are two standards for brown: a 1/2-green brown and a 2/3-green brown.
 
Very interesting! I take issue with one statement: "Rather, it's derived from the way *VGA* rendered these 16 colors for backward compatibility". As I understand it, the "halving green to get brown" story comes from the ECD (6-bit) color model, where brown is represented as primary red plus secondary green. However, this raises the interesting question: how does the Enhanced Color Display (IBM 5154) display brown, and does it differ between 200-line modes (which have the EGA output 4-bit colors) and 350-line modes (which have the EGA output 6-bit colors)? My hypothesis is: the 5154's "RGB brown" circuit is the same as the 5153's, so 200-line modes will display the more pleasing brown with green at 44% (or 67% of red), while 350-line modes, being based on the ECD color model, will display the more familiar VGA-like reddish brown with green at 33% (or 50% of red). To test this hypothesis, somebody in possession of both an EGA card and an IBM 5154 Enhanced Color Display would have to run the same color bar program in a 200-line text mode (achieved by setting the EGA card's DIP switch to indicate an RGBI monitor) and then in a 350-line text mode (achieved by setting the EGA card's DIP switch to indicate an ECD monitor), and take a picture with the same lightness settings of each situation.
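
Spelling out the arithmetic behind those percentages (restating the numbers above, nothing more):

```python
# The two "brown" variants as fractions of full output.

red = 2 / 3                       # primary-only level, i.e. 67%

ecd_brown_green  = 1 / 3          # ECD model: secondary green only
rgbi_brown_green = red * (2 / 3)  # 5153-style: green at 2/3 of red

print(f"ECD brown:  red {red:.0%}, green {ecd_brown_green:.0%}"
      f" ({ecd_brown_green / red:.0%} of red)")   # 67% / 33% / 50%
print(f"5153 brown: red {red:.0%}, green {rgbi_brown_green:.0%}"
      f" ({rgbi_brown_green / red:.0%} of red)")  # 67% / 44% / 67%
```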
 
The interpretation of it converted into a scale of 256 steps is about as close as you can ever get without an exact analog copy of what IBM did applied to another type of VDU. Probably it would be difficult to tell the difference comparing your synthesized version of it on another brand VDU to the original 5153.

There was another interesting feature of the 5153 CRT, the picture tube itself. It had a particularly dark tinted glass faceplate compared to most color CRT's of the time. The idea being that incident light is less well reflected from the phosphor because it has to pass through the layer twice, but light from the phosphor only once to exit the CRT. So it gave a very high contrast ratio look in relatively high ambient lighting.
True, I've noticed that the glass face is nicely darker than most pre-SVGA monitors, or color CRT television tubes. Interesting choice compared to the 5151, where they opted for etching (I believe) to reduce reflection, while the glass is tinted much lighter.

Another quirk I've noticed with the 5153: it seems that the beam takes slightly longer to switch on than to switch off. Most evident when two colors alternate at a high frequency, say by using character 0xB1 ("▒") in 80-column text mode. The resulting color "blend" is darker than expected, at least when compared to a crisp pixel-perfect rendering on a modern display. How much darker depends on the number of RGBI components that differ between the two colors: the effect accumulates as more of them keep switching on and off, so it appears that the voltage takes longer to rise than to fall. Then again I suppose it could be partly due to aging components.
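
To make that concrete, here's a toy model of the effect; the timing numbers are pure guesses on my part, not measurements:

```python
# Toy model: one channel alternating on/off every pixel, where the drive
# takes t_rise to reach full level but only t_fall to drop back to black.
# Edge shapes are idealized as linear ramps; all timings are made up.

def dither_average(pixel_ns, t_rise_ns, t_fall_ns):
    """Approximate average output of a 1-on/1-off dither for one channel."""
    on_area  = pixel_ns - t_rise_ns / 2   # 'on' pixel, minus what the slow ramp-up eats
    off_area = t_fall_ns / 2              # small tail added during the ramp-down
    return (on_area + off_area) / (2 * pixel_ns)

# 80-column text mode: ~14.3 MHz pixel clock, so roughly 70 ns per pixel
print(dither_average(pixel_ns=70, t_rise_ns=40, t_fall_ns=15))  # ~0.41, not 0.50
```

With equal rise and fall times this comes out to exactly 0.50, so any asymmetry shows up directly as a darker blend, and each RGBI channel that keeps switching contributes its own share, which would match what I'm seeing.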

This still remains the most well constructed VDU I have ever seen, so Conrac gets the prize:
https://www.worldphaco.com/uploads/The_1987_Vintage_Avionics_Conrac_Video_monitor..pdf
Looks excellent. Great research once again, and those Time Tunnel shots are on point. :)
 
Very interesting! I take issue with one statement: "Rather, it's derived from the way *VGA* rendered these 16 colors for backward compatibility". As I understand it, the "halving green to get brown" story comes from the ECD (6-bit) color model, where brown is represented as primary red plus secondary green.
True, thanks for the correction. Yes, VGA was simply following EGA in that respect.

However, this raises the interesting question: how does the Enhanced Color Display (IBM 5154) display brown, and does it differ between 200-line modes (which have the EGA output 4-bit colors) and 350-line modes (which have the EGA output 6-bit colors)? My hypothesis is: the 5154's "RGB brown" circuit is the same as the 5153's, so 200-line modes will display the more pleasing brown with green at 44% (or 67% of red), while 350-line modes, being based on the ECD color model, will display the more familiar VGA-like reddish brown with green at 33% (or 50% of red).
A very good question - I was going to ask the exact same thing after re-reading the previous replies which reminded me of the 5154, but you beat me to it. :)

It could also be interesting to compare the 5153 with IBM's other RGBI displays - the color monitors for the PCjr and the Convertible. I can't think of a reason that they'd be any different, except that IBM might have sourced them elsewhere, so the little details re: brown and intensity may not be the same. At least it would indicate whether or not they bothered to be consistent...
 
However, this raises the interesting question: how does the Enhanced Color Display (IBM 5154) display brown, and does it differ between 200-line modes (which have the EGA output 4-bit colors) and 350-line modes (which have the EGA output 6-bit colors)?
Refer to [here]. The contents of the ROM within the IBM 5154 determine any colour changes.

As for 200-line mode (a.k.a. CGA, a.k.a. mode 1), IBM have published the RGBI-to-rgbRGB mapping done by the ROM.
That mapping is shown in a table on page 4 (PDF page 8) of IBM's technical document for the IBM 5154 (at [here]).
In the 'Brown' row of that table, the reduction of green to achieve brown results from 'Gg' being '01' (G=0,g=1).
It would be '10' for yellow.
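
If I'm reading that correctly, the 200-line-mode translation amounts to something like the following sketch. Only the brown row is taken directly from the table cited above; the pattern for the other rows is my paraphrase, so check IBM's document for the authoritative values.

```python
def rgbi_to_rgbRGB(index):
    """Sketch of the 5154 ROM's 200-line-mode mapping, as described above.

    Returns the 6-bit ECD colour as bits (R, G, B, r, g, b): uppercase =
    primary (2/3 level), lowercase = secondary (1/3 level).
    """
    i = (index >> 3) & 1
    R = (index >> 2) & 1
    G = (index >> 1) & 1
    B = index & 1
    r, g, b = i, i, i            # intensity feeds all three secondaries (my paraphrase)
    if index == 6:               # brown row from the table: 'Gg' = '01'
        G, g = 0, 1
    return R, G, B, r, g, b

def ecd_to_rgb24(R, G, B, r, g, b):
    """2/3 weight for primary bits, 1/3 for secondary bits."""
    return tuple((2 * hi + lo) * 255 // 3 for hi, lo in ((R, r), (G, g), (B, b)))

print("#%02X%02X%02X" % ecd_to_rgb24(*rgbi_to_rgbRGB(6)))   # brown -> #AA5500
```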
 
All right, so that little ROM translates the RGBI color number to ECD color number. It would still be useful to have those camera pictures taken off the CRT to be absolutely sure.

I have been thinking about what determines the subjective reddishness of the #AA5500 brown. It will depend on
  • gamma, because gamma will determine the relative perceived brightness of the #55 value versus the #AA value (see the quick calculation after this list);
  • color temperature, because color temperature determines the relative brightness of green versus red;
  • the purity of the red phosphor, because a washed-out red phosphor -- as used on cheaper monitors -- will make red drift towards brown.
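
Regarding the first point, a quick illustration of how much the display gamma alone moves the red/green balance of the #AA5500 brown (arithmetic only, no claim about the actual 5153 gamma):

```python
# How much light the #55 green emits relative to the #AA red at a few
# hypothetical display gammas; higher gamma means a redder-looking brown.

ratio = 0x55 / 0xAA              # 0.5 in signal terms

for gamma in (1.0, 2.2, 2.8):
    print(f"gamma {gamma}: green at {ratio ** gamma:.0%} of red's light output")
# gamma 1.0 -> 50%, 2.2 -> ~22%, 2.8 -> ~14%
```
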
Ideally, one would obtain a colorimeter to perform a full colorimetric analysis of the IBM 5153 and 5154 displays. It will be difficult to obtain such a device and to know how to correctly use it. Measuring the true gamma will be difficult as well because unless the monitor is new old stock, its electro-optical transfer function ("gamma") will have changed after 30+ years of use. The original color temperature can be properly measured on a monitor whose gains of the red, green and blue guns have not been subject to readjustment over the years; it will simply be the CIE 1931 x/y chromaticity of light gray (color #7) or full white (color #15); measuring light gray might be preferable to avoid distortions from phosphor saturation. Measuring the CIE 1931 x/y chromaticities of the red, green and blue phosphors should be the easiest, as the chromaticity depends neither on gamma, color temperature nor age, as far as I know. One only needs to be sure to select a pure red (color #4), pure green (color #2) or pure blue (color #1) color; the intensity-bit colors are not pure because they set the respective other two channels to 33%, and chromaticity measurement is not disturbed by colors #4/#2/#1 being only at 67% of the full output.
 
I have a color chart for the IBM 4055, which is basically a high-end EGA monitor, since it uses the IBM EGA card for display. The color chart is from the IBM maintenance and service guide for the monitor, for use if it ever needed repair. Not sure if the 5154 or 5153 had such a manual with a color chart. But basically, I can say the 16 colors on the chart, and the brown in question, look like the canonical RGBI image in the first post.

Does anyone know if there is such a chart IBM printed for the 5153?
 
A colorimeter was used on Conrac color studio CRT TV monitors; you could plug it onto the front of the monitor. The idea was to color-balance an array of monitors in the one TV studio. I have one of these monitors, but not the colorimeter attachment. It is a well made monitor; I will list the type number later. It has a part in it that "drives me up the wall" because it violates my spare parts disease, in that I cannot make or acquire the part (well, I have one, but it is of no use as a spare part). What it is: a Motorola CPU that has its UV EPROM built into it, so I cannot read out the EPROM file to make a duplicate. There was a fellow who offered to help by using the original programmer circuit and doing a type of error check, but it is too risky to send overseas, because if anything happens to the file in the ROM, the entire monitor is toast. I wrote to Conrac to try to get the .bin file (1980's vintage now) but they did not reply. The CPU is on the scan board and controls "everything", including the gun set-ups, RGB channel gain, brightness, contrast, scan sizes, etc.

Of course, if you do colorimetry on same-brand VDUs you would find differences due to the static gun bias in the R, G & B channels (which sets the low-light color balance) and the high-light balance set by the individual gains of the three channels. This changes with age, of course, or if people have been tweaking these adjustments after repairs, etc. In the 5153, for example, the gun bias is affected by leaky electrolytic caps on the CRT neck board, and if the adjustments there were tweaked to compensate (as they often are), they need to be re-adjusted after the electros get replaced.
 
The 4055 manual describes how to set up and use the Minolta Chroma Meter to perform the calibration. It is used much like what you are describing for TVs: overlaying the front of the screen, then turning bunches of pots until the meter's x/y values balance to zero. If there's a manual like this for the 5153, it would hopefully show the colors to expect, and you could even attempt to calibrate a monitor and see what was intended from the factory, given healthy enough components.
 