
CGA and monochrome monitors compatibility and CGA/mono mode of EGA

Good day,

I've got two questions about CGA and EGA video.

1. Was the IBM CGA really incompatible with MDA monochrome monitors (like the IBM 5151)? A number of sources say it was compatible, and others say it wasn't:

"MDA monitors will work perfectly with both CGA and EGA video cards."
http://www.uncreativelabs.net/xtreview/xtreview.htm

"some people also used 5151 monitors with CGA or EGA boards for their mono modes."
http://en.wikipedia.org/wiki/IBM_5151

Are all those mentions mistakes?

2. The second question is about the EGA. According to InfoWorld (1985, 1), the EGA had a jumper with two modes: the first for EGA, the second for CGA/monochrome. But the CGA mode of the EGA is color at 640x200, while EGA mono is 640x350. So how did the EGA detect that exactly a monochrome (or CGA) monitor was connected, and not a CGA (or monochrome) one? I mean, if the IBM 5151 could be damaged by a CGA (question #1), then it could be damaged by the EGA's CGA mode too, couldn't it? So how did the EGA know it wasn't a CGA monitor?

Thanks.
 
I'll answer the first question first.

Out of the box, the 5151 is not compatible with the CGA. There are two reasons for this. The first is that it's designed for a horizontal scan rate of about 18 kHz, while the CGA uses about 15.6 kHz. You'll get an image if you try it, as the 5151 has no horizontal oscillator--it just takes whatever is sent over the horizontal sync line and uses it. You could conceivably end up damaging the 5151.
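As a rough back-of-the-envelope check (approximate numbers of mine, not from the IBM docs): the 5151 draws on the order of 370 total scan lines per frame at a 50 Hz refresh, so roughly 370 x 50 = 18,500 lines per second, i.e. about 18.4 kHz horizontal; the CGA draws about 262 total lines at 60 Hz, so roughly 262 x 60 = 15,720 lines per second, i.e. about 15.7 kHz. The 5151's horizontal deflection is built for the faster rate, which is why feeding it the slower CGA sync is the risky part.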

The other reason is that the CGA outputs RGBI signals and the 5151 only uses two signals: video and intensity (compare the connector pinouts at http://www.pinouts.ru).

Now, it's certainly possible to find a monochrome CGA monitor. They were available as a low-cost option and merely rendered colors as gray shades. VGA monochrome monitors were also offered.

The answer to your second question is that the EGA card didn't sense the monitor type--it was usually set with DIP switches or jumpers on the card itself.
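For what it's worth, software could at least read back how those switches had been set: the EGA BIOS has an "alternate function select / return EGA information" call (INT 10h, AH=12h, BL=10h). A minimal sketch, assuming a real-mode DOS C compiler with Turbo C-style int86() (the BIOS call is the standard EGA one; the compiler specifics are my assumption):

#include <dos.h>      /* union REGS, int86() -- Turbo C style, assumed */
#include <stdio.h>

int main(void)
{
    union REGS r;

    r.h.ah = 0x12;             /* alternate function select               */
    r.h.bl = 0x10;             /* subfunction 10h: return EGA information */
    int86(0x10, &r, &r);

    /* BH = 0 if the card is strapped for color (3Dxh ports),
       BH = 1 if strapped for mono (3Bxh ports);
       CL = the configuration switch settings as the BIOS saw them. */
    printf("EGA strapped for %s, switch setting = %X\n",
           r.h.bh ? "mono" : "color", r.h.cl);
    return 0;
}

(It only reports what the switches say, of course--the card still isn't sensing the monitor.)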
 
I believe there were also clone monochrome monitors that could take a CGA signal; I think mine's one, though I haven't done any real testing.
 

Thank you for the answer.

About the second question, I'm confused with this:

"You don't have to replace your existing display adapter if you have an extra slot for the EGA because IBM lets you operate with two display adapters. You could then have two monitors attached to the computer. This may cause a problem, however, if you want to attach the Enhanced Color Display along with one of the standard monitors. The EGA requires a special switch setting depending on what kind of monitor is used, one setting for the IBM Enhanced Color Display and another for the two old IBM monitors."

Source: http://books.google.ru/books?id=iS8EAAAAMBAJ&lpg=PA1&pg=PA49#v=onepage&q&f=false

So could the EGA decide, based on which pins were used, whether a CGA or an MDA signal should be sent--just like modern DVI chooses between analog and digital signals? According to pinouts.ru, the MDA and CGA pins are almost the same, except for Mono Video (btw, what was this pin used for, if the intensity pin was used for monochrome mode?), so could the EGA choose between CGA and MDA based on the pins in use? Moreover, according to the EGA pinout (http://pinouts.ru/Video/EGA.shtml), the jumper only chose between Secondary Green and Intensity for pin 6 (= Intensity in the MDA/CGA pinouts). Could it be like this: you set the jumper to the CGA/Mono position, and the EGA card detects whether the color pins or the intensity pin are in use, and based on that sends either a CGA or an MDA signal?

Btw, another idea. If pin 7 (Mono Video) was used for monochrome mode (was it?), then on the EGA it's Secondary Blue. In modern virtual machines (with EGA emulation), when running Windows 1.0 in monochrome mode, I get the following:

[screenshot: Windows 1.0 under EGA monochrome emulation, rendered in blue]

If monochrome monitors didn't understand color signals, could they simply take "any" signal (let's say the EGA's blue) and process it as the Mono Video signal (and is that why current EGA mono emulation gives us a "blue" Windows)?

Thanks.
 
I believe that manual excerpt is actually referring to the capability to run a "twin-head" system with both a Hercules/MDA card and a CGA/EGA card (my 286 box is such a setup.) The switch in that case is to prevent the EGA from taking up the memory area that the monochrome card uses for its framebuffer. I'm not 100% sure on that, though - if that is the case, they're making it confusing by referring to monitors and cards interchangeably :/
 
Yes, essentially that's it--not only the display memory mapping, but the I/O ports used to access the controller. EGA is quite a bit smarter than CGA or MDA/MGA and so can change some things to emulate an MDA or CGA, but basically the rule follows that of not being able to have two CGA cards or two MDA cards in the same box at the same time.

The (broad) rule of thumb is that you can't have two color cards (EGA/CGA) or two monochrome cards (EGA/MDA/MGA) operating in the same box. For EGA, read "operating in mono or color mode, as set by the configuration switches." One note is that if you want to use a Hercules-graphics-type card in the same box as an EGA or CGA, you need to set it to "half graphics" mode so its display memory doesn't conflict with the CGA's. Not all Herc-type cards can do this.
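To make the "no conflict" point concrete, here's a minimal twin-head sketch (real-mode DOS with Turbo C-style far pointers assumed, untested): the two cards decode different memory segments (and different CRTC ports, 3D4h/3D5h vs 3B4h/3B5h), so a program can simply poke text into both buffers.

#include <dos.h>                /* MK_FP() -- Turbo C style, assumed */

int main(void)
{
    /* Color card (CGA, or an EGA strapped for color): text buffer at B800:0000. */
    unsigned char far *color = (unsigned char far *) MK_FP(0xB800, 0);

    /* Mono card (MDA/Hercules, or an EGA strapped for mono): text buffer at B000:0000. */
    unsigned char far *mono = (unsigned char far *) MK_FP(0xB000, 0);

    color[0] = 'C';  color[1] = 0x07;   /* character + attribute on the color screen */
    mono[0]  = 'M';  mono[1]  = 0x07;   /* character + attribute on the mono screen  */
    return 0;
}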
 
I see.

Btw, back to the pins on the EGA. According to pinouts.ru:

6 Secondary Green / Intensity

Isn't this the only pin changed by the switch? I mean, there are no other variants in the table:
http://pinouts.ru/Video/EGA.shtml

If the MDA/CGA mode also had to be selected by the switch (jumper), what exactly did the switch do in each of the scenarios (CGA and MDA)?

And, finally, what was pin 7 (Mono Video) used for on the MDA (reserved on the CGA, Secondary Blue on the EGA)? Was it what drove monochrome monitors? Does this information explain the "blue" look of EGA monochrome mode in emulation?

Thanks.
 
While pins (and their meanings) are changed by the EGA configuration switches, recall that the scan frequencies are as well.

If you have questions concerning exact IBM meanings of pinouts, recall that things are called out in glorious (or painful) detail in the Options and Adapters reference, available online.
 
Thanks, the link was very useful.

I've got a number of new questions. :)

1. According to the document there, the 640x200 graphics mode of the CGA was "black and white only". However, according to Wikipedia, "by default the colors are black and bright white, but the foreground color can be changed to any other color of the CGA palette." So was it possible to change the foreground color on original CGA cards in the 640x200 hi-res mode?

2. According to the same document, only the color modes of the CGA supported home televisions, and the hi-res 640x200 mode required a color monitor. So did the RCA (composite) output actually not work in the 640x200 mode when connected to a TV or an NTSC-compatible monochrome monitor?

3. The EGA documentation lists an "Enhanced Display Emulation Mode" among the other available modes. I couldn't find any additional information about this mode in the doc. What was this mode?

Thanks.
 
1. Yes. The border/background color register (bits 0-3 of port 3D9h) controls the foreground color in 640x200 mode instead (a small sketch at the end of this post shows the idea).

2. Composite video output technically works in 640x200 mode, but creates a lot of color artifacting, making text only somewhat readable: like so.

A good source of information on CGA capabilities is this page.
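Since the register-level detail may help, here's a minimal sketch of point 1 (real-mode DOS with Turbo C-style int86()/outportb() assumed, untested): set mode 6 and then write the color-select register.

#include <dos.h>     /* union REGS, int86(), outportb() -- Turbo C style, assumed */

int main(void)
{
    union REGS r;

    r.x.ax = 0x0006;            /* INT 10h, AH=00h: set video mode 6 (640x200) */
    int86(0x10, &r, &r);

    /* Bits 0-3 of the color-select register at port 3D9h pick the color of
       the "on" pixels on a direct-drive RGBI monitor; 0x0C would give bright red. */
    outportb(0x3D9, 0x0C);
    return 0;
}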
 
1. No. Read the document carefully--in particular, read page 1-152 in the techref: "Bit 4: A 1 bit selects the high-resolution (640x200) black-and-white graphics mode. One color of 8 selected on direct-drive (i.e. TTL color monitor) in this mode by using register hex 3D9." In particular, read the description of register 3D9 on page 1-150, where 640x200 mode is specifically mentioned in connection with the color-select register.

2. A matter of expediency. 640x200 mode going through both a modulator and the NTSC IF of a standard broadcast set exceeds what the circuitry can do. (Hint: compute how many dots per second are sent out in hi-res mode and then compare that to the bandwidth of an NTSC signal--a rough calculation follows below.)

3. I assume that you're talking about the switch settings. If you look at page 108 in the EGA BIOS listing, you'll see that 0010 is taken to be the same as 0001--indeed, page 110 shows that the two switch settings jump to the same location.
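Working that hint from point 2 with rough numbers of my own (not from the techref): in 640x200 mode the CGA shifts pixels out at about 14.3 MHz, so a worst-case pattern of alternating on/off pixels needs on the order of 14.3 / 2 = 7 MHz of video bandwidth. Broadcast NTSC luminance tops out at roughly 4.2 MHz, and it only gets worse after an RF modulator and the TV's IF strip, so 80-column detail simply can't survive the trip.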

Really, it's all in there, you just have to RTFM. :)
 
Unfortunately, I don't understand everything in there correctly, so thanks a lot for the explanation :)

Btw, you said that I could find IBM's meaning of the pinouts there in the manuals; however, in the MDA/CGA/EGA docs I could only find the same pinout diagrams, not an explanation of them. Could you prod me with a stick toward the explanation of IBM's pinouts, please? :)
 
Now you have me at a disadvantage. I don't understand. It's all there--BIOS listings, schematics, register descriptions. For the few chips whose function isn't obvious, there's always a web datasheet search. The schematic lays bare the mind of the engineer who designed the boards.

So, I don't understand how you don't understand.
 

The problem is that I'm not an engineer, I'm just an enthusiast learning some stuff of interest :)

I'd be thankful if you could answer one more question regarding CGA support on the EGA. According to IBM's docs, the EGA supported all the CGA modes plus the 16-color modes, and there are switch configurations for Color Display support, also described there. However, Wikipedia says:

The EGA uses a female 9-pin D-subminiature (DE-9) connector which looks identical to the CGA connector. The hardware signal interface, including the pin configuration, is largely compatible with CGA. The differences are in the repurposing of three pins for the EGA's secondary RGB signals: the CGA Intensity pin (pin 6) has been changed to Secondary Green (Intensity); the second ground of CGA (pin 2) has been changed to Secondary Red (Intensity), and pin 7 (Reserved on the CGA) is used for Secondary Blue (Intensity). If the EGA is operated in the modes having the same scan rates as CGA, a connected CGA monitor should operate correctly, though if the monitor connects pin 2 to ground, the shorting of the EGA's Secondary Red (Intensity) output to ground could conceivably damage the EGA adapter. Similarly, if the CGA monitor is wired with pin 2 as its sole ground (which is poor design), it will not work with the EGA, though it will work with a CGA. Finally, because of the use of the CGA's Intensity pin as Secondary Green, on a CGA monitor connected to an EGA, all CGA colors will display correctly, but all other EGA colors will incorrectly display as the standard CGA color which has the same values for the g, R, G, and B bits (ignoring the r and b bits.) Conversely, an EGA monitor should work with a CGA adapter, but the Secondary Red signal will be grounded (always 0) and the Secondary Blue will be floating (unconnected), causing all high-intensity CGA colors except brown to display incorrectly and all colors to perhaps (but probably not) have a blue tint due to the indeterminate state of the unconnected Secondary Blue.

So did the EGA actually require the switch settings to be changed in order to use a Color Display? As I understand it, according to Wikipedia, the 16-color mode (with CGA palette support) worked fine with the "default" EGA pinout, didn't it? Then, did choosing the Color Display switch configuration also emulate the CGA palette system, with only 4 colors in a fixed combination available, instead of the 16-color mode?

Thanks.
 