
Monochrome VGA monitors: different types of phosphor

AlexC
I have three monochrome VGA monitors, all apparently 'white' but only one has white phosphor, as far as I can tell. The other two appear to be a composite of yellow and (possibly) blue. It's not easy to tell, except by moving one's eyes quickly while a high-contrast image is on the screen.

The white on the composite-phosphor displays is slightly greyer too. These two have shorter persistence (the pure white one ghosts a lot) and are a little more flickery.

I'm guessing that in the late 80s/90s, a monitor described as paper-white or paperwhite actually had white phosphor, whereas standard mono VGA displays may not have done. Does anyone know of any brands - preferably European as I'm in Germany - that were definitely made with white phosphor?

Or is the yellow/blue effect a visual artefact of a fast-decay phosphor, and did paperwhite displays just have longer persistence? I used to own an Amstrad paperwhite VGA monitor and that was definitely truly white.
 
P4 is the standard short-persistence white phosphor, used by black & white TVs. But some "white" CRTs, such as in the Apple Lisa, are actually blue with green persistence:

[embedded video clip of an Apple Lisa display]
Many older monochrome CRTs used medium- or long-persistence phosphor to reduce flicker, but VGA text mode refreshes at 70 Hz, so flicker isn't much of an issue for most people, allowing the use of short-persistence white phosphor.
 
I've got this one:

[photo attachment]
 
Or is the yellow/blue effect a visual artefact of a fast-decay phosphor, and did paperwhite displays just have longer persistence? I used to own an Amstrad paperwhite VGA monitor and that was definitely truly white.
See here on "Paper-White Phosphors"... apparently they did use a blend of blue and yellow (and some pink).
 
See here on "Paper-White Phosphors"... apparently they did use a blend of blue and yellow (and some pink).

Ah, that explains it beautifully, thanks. Though I'll have to take a microscope to my 'true' white one and see what colours it actually uses.
 
P4 is the standard short-persistence white phosphor, used by black & white TVs. But some "white" CRTs, such as in the Apple Lisa, are actually blue with green persistence:


Many older monochrome CRTs used medium- or long-persistence phosphor to reduce flicker, but VGA text mode refreshes at 70 Hz, so flicker isn't much of an issue for most people, allowing the use of short-persistence white phosphor.

I always thought the old Apple screens looked more blue-ish than later PC monitors, though I guess the camera accentuates it in that clip.
 
I have no knowledge of the phosphor makeup of the mono VGA monitors, but I do remember having 256 shades of gray on some of our mono VGA workstations back in the day. IIRC, they didn't last very long in the workplace. I seem to remember 17" color VGA with progressive scan going for slightly under $600 in the early to mid '90s. Apologies for slightly derailing this thread.
 
I have no knowledge of the phosphor makeup of the mono VGA monitors, but I do remember having 256 shades of gray on some of our mono VGA workstations back in the day.

Standard VGA only has 64 shades of gray because a monochrome VGA monitor only uses the green pin; it doesn't combine R+G+B.
 
Radius designed their monochrome monitors to do 256 shades of gray with special drivers.

Having had an IBM 8503, I found it difficult to distinguish more than about six gray shades, counting white and black.
 
No argument from me on that. Maybe I confused it with the 256-color VGA palette in mono mode?

Does it depend on the monitor or only on the graphics card? I've just plugged one of the mono VGA monitors into a 24-bit Linux machine with a VGA port. Viewing full-colour images, I can't see any apparent dithering or posterization. Presumably the electron gun can be modulated continuously, since the VGA output is an analogue signal. Or am I missing something?
 
Yes, the VGA signal is analog, but only after it leaves the card. VGA cards (in the pre-'high'/'true'-color days) used 6-bit DACs, so R, G, and B each had 64 possible values, and if the mono monitor only uses the green pin, that's what you'd get.
 
Yes, the VGA signal is analog, but only after it leaves the card. VGA cards (in the pre-'high'/'true'-color days) used 6-bit DACs, so R, G, and B each had 64 possible values, and if the mono monitor only uses the green pin, that's what you'd get.

Yup, which is why VGA was quoted as having 262,144 colours: 2^(6*3).
If you only tap green, it's only 2^6 = 64.

Since the monitors are indeed analog, you could use them with the later 15-, 16-, 24- and 30-bit colour standards over the same analog VGA interface.
Most LCDs actually digitize the VGA signal and requantize it down to something like 18-bit, since the panels can't handle the full 24+ bit accuracy. With many of them, you can see some dithering or posterizing because of this.
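
To put numbers on all that, here's a quick C sketch of the arithmetic; the 6-bit DAC width, the green-pin-only mono path and the 18-bit LCD truncation come from the posts above, and the rest is purely illustrative:

[CODE]
/* Palette arithmetic for standard VGA, as discussed above.
 * Illustrative only; not tied to any particular card. */
#include <stdio.h>

int main(void)
{
    int bits_per_gun = 6;                    /* classic VGA DAC width */
    long levels = 1L << bits_per_gun;        /* 64 levels per gun */
    long colors = 1L << (3 * bits_per_gun);  /* 2^18 = 262,144 colours */

    printf("levels per gun:          %ld\n", levels);
    printf("displayable colours:     %ld\n", colors);
    printf("greys on green pin only: %ld\n", levels);

    /* An 18-bit LCD effectively truncates each 8-bit channel to
     * 6 bits, which is where the visible dithering/posterizing
     * comes from: */
    unsigned c8 = 200;                       /* arbitrary 8-bit level */
    unsigned c6 = c8 >> 2;                   /* quantized to 0..63 */
    printf("8-bit level %u quantizes to 6-bit level %u\n", c8, c6);
    return 0;
}
[/CODE]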
 
So by running a 24-bit VGA output I'm probably seeing more levels of grey than these monitors have ever displayed before.

I wonder if there's a way to drive the monitor with a true monochrome signal, i.e. to send an average of the RGB intensity signals to the green pin. I read about an xorg tweak but I think that was for displaying greyscale on a colour monitor, which may not have the same effect.

Yes, I've seen a few LCDs like that.
 
How would you weight the colors? Also note that CRT phosphors are not usually very linear in their conversion efficiency.

Back in the day, I used Tatung mono VGA monitors because they were inexpensive compared to the full-color ones. I gave up after a while because so many programs color-coded their content, e.g., "Click on the red box." To get a similarly confusing effect, try swapping the analog RGB signals around on a conventional full-color VGA monitor.
 
How would you weight the colors? Also note that CRT phosphors are not usually very linear in their conversion efficiency.

Back in the day, I used Tatung mono VGA monitors because they were inexpensive compared to the full-color ones. I gave up after a while because so many programs color-coded their content, e.g., "Click on the red box." To get a similarly confusing effect, try swapping the analog RGB signals around on a conventional full-color VGA monitor.

A gamma tool to vary RGB until it looks good? Something comparable to joining the RGB wires together with resistors, but in software.

My first PC gaming experience was with a mono VGA monitor. It made seeing the number of 'lives' left in Prince of Persia remarkably difficult, amongst other things.

I'm actually using the mono VGA monitor to post this, in Linux at 640x480. It's not as awful as it sounds.
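
On the weighting question: a weighted sum in software is the equivalent of joining the wires through resistors, where equal resistors would give equal thirds; the classic Rec. 601 luma weights (0.299, 0.587, 0.114) track perceived brightness better. A minimal C sketch of the idea; the 2.2 gamma is an assumed CRT-ish response (phosphors aren't linear, as noted above), not anyone's actual driver code:

[CODE]
/* Minimal sketch: mix R, G, B into one mono value in software,
 * the way resistors on the wires would, but with tunable weights.
 * Rec. 601 luma weights below; equal thirds would mimic identical
 * resistors. The 2.2 gamma is an assumed CRT-ish response. */
#include <math.h>
#include <stdio.h>

#define GAMMA 2.2

/* r, g, b in 0..255; returns the mono level to feed the green pin */
static unsigned char mix_to_mono(unsigned char r, unsigned char g,
                                 unsigned char b)
{
    const double wr = 0.299, wg = 0.587, wb = 0.114;  /* Rec. 601 */
    double lin = wr * pow(r / 255.0, GAMMA)           /* decode to    */
               + wg * pow(g / 255.0, GAMMA)           /* linear light */
               + wb * pow(b / 255.0, GAMMA);
    return (unsigned char)(pow(lin, 1.0 / GAMMA) * 255.0 + 0.5);
}

int main(void)
{
    printf("pure red   -> %u\n", (unsigned)mix_to_mono(255, 0, 0));
    printf("pure green -> %u\n", (unsigned)mix_to_mono(0, 255, 0));
    printf("pure blue  -> %u\n", (unsigned)mix_to_mono(0, 0, 255));
    return 0;
}
[/CODE]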
 
In a way, this reminds me a bit of the old 2-color processes used in early movie films (e.g. Technicolor Process 2). The results were not nearly as awful as you'd expect.
 
In a way, this reminds me a bit of the old 2-color processes used in early movie films (e.g. Technicolor Process 2). The results were not nearly as awful as you'd expect.

I had never heard of that... Is that this? Looks quite reasonable indeed.
 
I wonder if there's a way to drive the monitor with a true monochrome signal, i.e. to send an average of the RGB intensity signals to the green pin.

That's what VGA cards are supposed to do. On power-up, if the Monitor ID pins in the cable indicate that a monochrome monitor is attached, the card should operate in grayscale mode, combining R+G+B and sending the result out the green pin.

Or at least that's the way it used to work, before those ID pins were reassigned to VESA DDC: https://en.wikipedia.org/wiki/Display_Data_Channel
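
For what it's worth, the IBM VGA technical reference calls this "gray-scale summing" and gives weights of 30% red, 59% green and 11% blue, written back to all three components of each DAC palette entry. A rough C sketch of the idea; the integer rounding is an assumption, not the actual ROM code:

[CODE]
/* Sketch of VGA BIOS gray-scale summing: with a mono monitor
 * detected, each 6-bit DAC palette entry is replaced by a
 * weighted sum, 30% R + 59% G + 11% B (per the IBM VGA docs).
 * The rounding below is illustrative, not the real ROM code. */
#include <stdio.h>

struct dac_entry { unsigned char r, g, b; };  /* 6-bit values, 0..63 */

static void sum_to_gray(struct dac_entry *e)
{
    /* weights scaled to integers: 30 + 59 + 11 = 100 */
    unsigned gray = (30u * e->r + 59u * e->g + 11u * e->b + 50u) / 100u;
    e->r = e->g = e->b = (unsigned char)gray;
}

int main(void)
{
    struct dac_entry red = { 63, 0, 0 };      /* brightest pure red */
    sum_to_gray(&red);
    printf("bright red maps to gray level %u of 63\n", (unsigned)red.g);
    return 0;
}
[/CODE]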
 