
Fixing CGA composite monochrome

resman

Veteran Member
Joined
Dec 31, 2013
Messages
705
Location
Lake Tahoe
An annoyance back in the '80s, and an annoyance again now that I have my backup Compaq Deskpro hooked up to an Amdek 300A: CGA firmware that doesn't disable color when you ask it to. Back in the day with my PC XT clone, I always had to issue a 'mode bw80' to get the CGA to disable composite color so I could read the text on my cheap green screen monitor. The Compaq CGA card is even worse - 'mode bw80' has no effect at all, and you have to force the BW bit in the mode select register yourself. To add insult to injury, I hooked up a color composite monitor (from my Apple IIe) and it only displays in black and white - no color burst signal. Another color monitor had the same result. I don't even get to play with NTSC color artifacting! I suppose nobody ever used the composite output on a Deskpro. I guess if you were going to lay out that kind of money for a Compaq, you'd get their fancy monochrome dual-mode monitor. In fact, I have one on my main Deskpro, and it works great, but the backup Deskpro has to make do with composite video.

Here is what I had to look at:
IMG_2373.jpg

Unusable. Since I had just finished a fresh install of Compaq DOS 3.31, MASM 5.1, and MCS 5.1, I needed a coding project and this fit the bill: write a TSR to hook Video INT 10 and fix up the CRT mode register whenever a mode set happens. Hours later, after poring over BIOS listings, CGA registers, and MS-DOS functions, I got this:
IMG_2370.jpg

Interestingly, the Compaq CGA card only outputs three levels on composite: black, medium intensity, and high intensity. No 16 shades of grey. So none of the fun video modes are really worth playing with. Even 160x100 mode looks horrible.

For anyone interested, here is the code for my simple TSR MONOCGA.COM:

Code:
INTSEG  SEGMENT AT  0
INT10VEC        EQU     40H
INTSEG  ENDS

BIOSDATA SEGMENT AT 40H
CRT_MODE_SET    EQU     65H
BIOSDATA ENDS

COMSEG  SEGMENT
        ASSUME  CS:COMSEG
        ORG     100H
START:  JMP     INIT
INT10:  OR      AH, AH                          ; Mode set if AH == 0 (sets ZF)
        PUSHF                                   ; Simulate an INT: flags + far call
        CALL    DWORD PTR CS:OINT10             ; Chain to original INT 10 handler
        JNZ     EXIT                            ; ZF survives (BIOS IRET pops our
                                                ; pushed flags); skip if not mode set
MONO:   PUSH    AX
        PUSH    DX
        PUSH    DS
        MOV     AX, BIOSDATA
        MOV     DS, AX
        ASSUME  DS:BIOSDATA
        OR      BYTE PTR BIOSDATA:CRT_MODE_SET, 04H      ; Force BW
        MOV     AL, BIOSDATA:CRT_MODE_SET
        MOV     DX, 3D8H
        OUT     DX, AL
        POP     DS
        POP     DX
        POP     AX
EXIT:   IRET
OINT10: DD      ?
INIT:   MOV     AX, CS
        MOV     DS, AX
        ASSUME  DS:COMSEG
        MOV     DX, OFFSET HELLOS
        MOV     AH, 9
        INT     21H
        XOR     AX, AX
        MOV     ES, AX
        MOV     AX, ES:INT10VEC
        MOV     WORD PTR OINT10, AX
        MOV     AX, ES:INT10VEC+2
        MOV     WORD PTR OINT10+2,AX
        MOV     AX, OFFSET INT10
        MOV     ES:INT10VEC, AX
        MOV     AX, CS
        MOV     ES:INT10VEC+2, AX
        PUSHF                                   ; Set up return by IRET
        PUSH    CS
        CALL    MONO
        MOV     DX, OFFSET INIT
        INT     27H                             ; Terminate but Stay Resident
HELLOS: DB      "Monochrome CGA override TSR.$"
COMSEG  ENDS
        END     START

Finally, here is a glamour shot for that cool '80s vibe:
IMG_2381.jpg

Dave...
 
Interestingly, the Compaq CGA card only outputs three levels on composite: black, medium intensity, and high intensity. No 16 shades of grey. So none of the fun video modes are really worth playing with. Even 160x100 mode looks horrible.

Which is the way both MGA and CGA cards worked back then--signals are TTL and you get one line for video (on/off for whatever color) and the level (normal/hi intensity). Real variability didn't start until the EGA--and even then, it still was TTL, but with 2 bits per color. VGA was analog output.

If your monitor has an analog video input, you might want to experiment with your own DAC circuit to take the color outputs and weight them accordingly.
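As a rough illustration of what such a weighting network would be aiming for, here is a sketch that maps each 4-bit IRGB index to a luminance level using the standard NTSC luma coefficients for R/G/B. The intensity-line weight is an arbitrary assumption for the sketch, not taken from any real card:

```python
# Hypothetical sketch: grey levels a luminance-weighted RGBI-to-analog DAC
# might aim for. R/G/B weights are the standard NTSC luma coefficients;
# the I-line weight is an assumption made up for this example.
R_W, G_W, B_W = 0.299, 0.587, 0.114   # NTSC luma coefficients
I_W = 0.33                             # assumed contribution of the I line

def rgbi_luma(index):
    """Map a 4-bit IRGB colour index (I=bit 3, R=bit 2, G=bit 1, B=bit 0)
    to a 0..1 luminance value, normalised so white (0xF) = 1.0."""
    i = (index >> 3) & 1
    r = (index >> 2) & 1
    g = (index >> 1) & 1
    b = index & 1
    y = r * R_W + g * G_W + b * B_W + i * I_W
    return y / (R_W + G_W + B_W + I_W)

for n in range(16):
    print(f"colour {n:2d}: luma {rgbi_luma(n):.3f}")
```

Note that with luminance-based weights, colour 7 (light grey, R+G+B) comes out brighter than colour 8 (dark grey, I alone), which matters for picking resistor values.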
 
Which is the way both MGA and CGA cards worked back then--signals are TTL and you get one line for video (on/off for whatever color) and the level (normal/hi intensity). Real variability didn't start until the EGA--and even then, it still was TTL, but with 2 bits per color. VGA was analog output.

If your monitor has an analog video input, you might want to experiment with your own DAC circuit to take the color outputs and weight them accordingly.

That makes sense. For some reason, I recall my XT clone outputting 16 shades of intensity out of the composite port. But that was 30 years ago and I can barely remember last week.

Dave...
 
That makes sense. For some reason, I recall my XT clone outputting 16 shades of intensity out of the composite port. But that was 30 years ago and I can barely remember last week.

Yes, new CGA has 16 different luminance values over composite, see here: http://nerdlypleasures.blogspot.nl/2013/11/ibm-pc-color-composite-graphics.html
Also, some CGA/Hercules clones can display CGA on an MDA/Hercules monitor with 16 shades of intensity.
The ATi Small Wonder is one such example. See this link: http://www.vogons.org/viewtopic.php?f=46&t=41856#p440757
 
Didn't there used to be a trick for tapping the composite output directly off the card, before the colour burst was added to it? Could have sworn there was an article about that in PC World back in the day. Supposedly the results were even better than what you could do software-side? I know I did that on the CoCo; simply grabbing composite was'nae enough, ditching the colour signal helped even more.

Same as the separate luma and chroma output for things like the C64, where if you left the chroma disconnected you got a ridiculously sharp and beautiful black and white? (and why I have a pot and switch on my SVHS converter instead of a fixed resistor?)

I actually noticed that with my Junior that has the busted composite -- it only outputs monochrome... 80 column mode on a modern TV looks GREAT. Certainly far better than the composite out on my Tandy 1000.
 
Didn't there used to be a trick for tapping the composite output directly off the card, before the colour burst was added to it?

Not on IBM CGA - the carrier isn't added to a mono composite signal, the signal is synthesized (multiplexed) with the carrier in place. But enabling the +BW bit (port 0x3d8 bit 2) in software has essentially the same effect (though on old CGA you only get 4 shades of grey instead of 16, as colours 1-6 are the same as 7 and colours 9-14 are the same as 15 with +BW set).
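The grey-level collapse on old CGA with +BW set can be written out as a small mapping, just to make the "4 shades" claim concrete:

```python
# Sketch of the grey-level collapse described above for old CGA with the
# +BW bit (port 0x3d8, bit 2) set: colours 1-6 render the same as 7, and
# 9-14 the same as 15, leaving only four distinct shades.
def old_cga_bw_shade(index):
    """Collapse a 4-bit colour index to its +BW grey equivalent."""
    if 1 <= index <= 7:
        return 7     # all low-intensity colours -> light grey
    if 9 <= index <= 15:
        return 15    # all high-intensity colours -> white
    return index     # 0 (black) and 8 (dark grey) stay distinct

shades = sorted({old_cga_bw_shade(n) for n in range(16)})
print(shades)  # the four distinct levels: [0, 7, 8, 15]
```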

It's also not difficult to make a circuit that converts the output from the RGBI port to a mono composite signal - I've done this, and it works quite well.
 
Just throwing this out there; have you tried the old trick Control-Alt-> ? That would cause the VDU to switch from the 9x14 cell to an 8x8 cell on the original Portable.

One of my Compaq desktops (both 286s) runs the monochrome multi-scan, and the other just has a clone CGA card in it. The latter works fine, but on an actual CGA display. I don't have a mono CGA monitor over here.
 
Just throwing this out there; have you tried the old trick Control-Alt-> ? That would cause the VDU to switch from the 9x14 cell to an 8x8 cell on the original Portable.

One of my Compaq desktops (both 286s) runs the monochrome multi-scan, and the other just has a clone CGA card in it. The latter works fine, but on an actual CGA display. I don't have a mono CGA monitor over here.
I have the Deskpro's MB switches set to use a standard monitor so that it doesn't try and use the high resolution mode (which wouldn't work on the composite output). So that key sequence is disabled. I do have the Compaq dual mode monitor on another Deskpro. It plugs into the 9 pin connector and is able to display 16 levels of intensity just fine. It looks like I will just be playing with the 640x200 mono graphics on this rig.
 
It's also not difficult to make a circuit that converts the output from the RGBI port to a mono composite signal - I've done this, and it works quite well.

I made one, but it's kind of jank (it works, but only gives 8 shades and isn't optimized for warm vs. cool palettes). I would like to see yours in order to improve mine.

thanks
 
I made one, but it's kind of jank (it works, but only gives 8 shades and isn't optimized for warm vs. cool palettes). I would like to see yours in order to improve mine.

Mine is not optimized for warm vs cool palettes either - it just converts the RGBI index into a voltage linearly (so colour 7 is one shade darker than colour 8, not lighter as it would be if I was converting based on the luminance of the corresponding colour). That's because of the application I had in mind for it, but it should be easy to fix (just use the resistor values from the new CGA circuit). Here's the schematic, here's my notes and veroboard layout, here's a picture of the board and here's a photo of what the output looks like with an image optimized for it.

Edit: This circuit is actually much more complicated than it needs to be to just output a mono composite image - I also wanted my circuit to output a white pulse after the hsync in order to calibrate levels for a capture device. The OR gates, and the circuit that creates the delayed hsync pulses, can be removed. The new-CGA resistor values mentioned above are: 2K for R, 1K for G, 3K for B, 680 ohms for I, and 680 ohms for sync - adjust to taste.
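Out of curiosity, the levels implied by those resistor values can be estimated by treating the network as a simple current-summing DAC, where each line contributes in proportion to 1/R (output impedance and the sync resistor ignored for simplicity - this is a back-of-the-envelope sketch, not an analysis of the real circuit):

```python
# Sketch: relative composite levels implied by the new-CGA resistor values
# quoted above (2K for R, 1K for G, 3K for B, 680 ohms for I), modelled as
# a current-summing DAC where each line contributes in proportion to 1/R.
R_RES, G_RES, B_RES, I_RES = 2000.0, 1000.0, 3000.0, 680.0

def level(index):
    """Relative level for a 4-bit IRGB index, normalised so white = 1."""
    i = (index >> 3) & 1
    r = (index >> 2) & 1
    g = (index >> 1) & 1
    b = index & 1
    full = 1/R_RES + 1/G_RES + 1/B_RES + 1/I_RES
    return (r/R_RES + g/G_RES + b/B_RES + i/I_RES) / full

levels = [level(n) for n in range(16)]
# With these values, every one of the 16 indices lands on a distinct level:
assert len({round(v, 6) for v in levels}) == 16
```

Interestingly, under this model colour 7 comes out brighter than colour 8, so these values already give the luminance-style ordering rather than the linear-index ordering.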
 
ah ok, I did an R-2R DAC. What messes me up is that "warm" is R&G while "cool" is R&G&B, so you get warm looking perfect and cool blown out, or cool perfect and warm too dark. So you either fiddle with a pot fine-tuning it or find a middle ground, which is still eh
 
This is a follow up to my OP about composite color on my Compaq CGA card

I was perusing the premier issue of PC Tech Journal (one of my favorites back in the day), and while reading the article on the CGA card, I came across an interesting passage near the very end. It made a reference to composite monitors and TV sets that have a problem displaying color in high resolution alpha mode. Since I was having no luck getting color in the alpha and hybrid 160x100x16 modes, this caught my eye. The solution, according to the article, was to set the border color to yellow ('1110') in the color select register.

Huh?

Sure enough, I set it to yellow and got colors. Very dim colors, though. So I tried normal yellow (brown, '0110') for the border. The colors were a little brighter (but still pretty dim). Green gave brighter colors yet. Yep, the border color is being used as a stand-in for the color burst signal. The border color you choose shifts the hue and brightness of the displayed colors.

One thing I noticed on my AppleColor monitor (the one I'm using for color composite) is that the image is shifted very far to the left. I don't want to mess with the internal horizontal position because I use this monitor on my Apple II's as well. I'm wondering if different timings for the CRTC would shift the image toward the center and get the color burst signal working. Oddly, the hires graphics mode (640x200) gets composite color artifacts just fine when I disable the BW bit in the mode register.
 
Sure enough, I set it to yellow and got colors. Very dim colors, though. So I tried normal yellow (brown, '0110') for the border. The colors were a little brighter (but still pretty dim). Green gave brighter colors yet. Yep, the border color is being used as a stand-in for the color burst signal. The border color you choose shifts the hue and brightness of the displayed colors.

The final version of 8088MPH has a calibration screen which lets you try various different values of the border colour, horizontal sync width and phase of the CRTC horizontal sync with respect to the CGA's internal +LCLK signal to get the best possible colours in hi-res text mode.

One thing I noticed on my AppleColor monitor (the one I'm using for color composite) is that the image is shifted very far to the left. I don't want to mess with the internal horizontal position because I use this monitor on my Apple II's as well. I'm wondering if different timings for the CRTC would shift the image toward the center and get the color burst signal working.

Tweaking the horizontal sync position register will shift the image but won't make any difference to the color burst other than flipping the phase.

Oddly, the hires graphics mode (640x200) gets composite color artifacts just fine when I disable the BW bit in the mode register.

That's because as far as the CRTC is concerned, 1BPP graphics mode isn't actually a "hi-res" mode at all. The graphics modes and 40-column text mode use 80 bytes of data per scanline with a normal width, but the high-res text mode uses twice as much. This is directly related to the sync pulse in 80-column text mode being half the width that it should ideally be.
 
The final version of 8088MPH has a calibration screen which lets you try various different values of the border colour, horizontal sync width and phase of the CRTC horizontal sync with respect to the CGA's internal +LCLK signal to get the best possible colours in hi-res text mode.



Tweaking the horizontal sync position register will shift the image but won't make any difference to the color burst other than flipping the phase.



That's because as far as the CRTC is concerned, 1BPP graphics mode isn't actually a "hi-res" mode at all. The graphics modes and 40-column text mode use 80 bytes of data per scanline with a normal width, but the high-res text mode uses twice as much. This is directly related to the sync pulse in 80-column text mode being half the width that it should ideally be.

Thanks for all the great information! You guys are truly the NTSC masters. I was able to get a nice color display with the 8088MPH CGA tweaking and run it on my Deskpro. Everything worked great except the chasing-the-beam demo at 7.2 MHz. The polygons really flew, though. At 4.77 MHz the beam chasing looked better, but I'm not sure it was exact (probably 8086 vs. 8088).

I will have to spend some effort on my CRTC timings to match 8088MPH's.
 
That's because as far as the CRTC is concerned, 1BPP graphics mode isn't actually a "hi-res" mode at all. The graphics modes and 40-column text mode use 80 bytes of data per scanline with a normal width, but the high-res text mode uses twice as much. This is directly related to the sync pulse in 80-column text mode being half the width that it should ideally be.

Thank you, you have succinctly explained the bandwidth issue that causes CGA snow!

40 columns of text cell bytes + 40 attribute bytes = 80 bytes
320 pixels / 4 pixels per byte = 80 bytes
640 pixels / 8 pixels per byte = 80 bytes
80 columns of text cell bytes + 80 attribute bytes = 160 bytes
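The list above as a quick sanity check in code:

```python
# Per-scanline video memory fetch counts for the CGA modes listed above.
modes = {
    "40-col text": 40 + 40,    # character + attribute bytes
    "320x200x4":   320 // 4,   # 2bpp: 4 pixels per byte
    "640x200x2":   640 // 8,   # 1bpp: 8 pixels per byte
    "80-col text": 80 + 80,    # character + attribute bytes
}
for name, nbytes in modes.items():
    print(f"{name:12s}: {nbytes} bytes/scanline")
# Only 80-column text needs 160 bytes; every other mode fits in 80.
```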
 
Perhaps a better question is why the crappy PC CGA design *does* have snow.

Because the 160 characters per line take up pretty much all bandwidth on the CGA memory.
With 80 characters per line, they could insert wait states for the CPU, to avoid CPU and CGA accessing the memory at the same time (which is what causes the snow in this case).
If they did that with 160 characters per line, the system would basically freeze, because you'd constantly be in a wait state.
So instead, they left it up to the programmer not to access the video memory during DISPLAY_ENABLE.
If you access the memory anyway, then the CPU gets precedence. Which means that the output is whatever data the CPU happened to put on the bus during that particular cycle. This is what leads to the 'random' characters and colour patterns, which we have come to know as snow.

As for the PCjr... My guess is that it uses a simple internal buffer, much like how the C64 handles its colorram. The C64 fetches the colorram once every 8 scanlines into an internal buffer, and it re-uses it for the next 8 scanlines. This fetch-line is known as a 'badline' because it steals many cycles from the CPU.
CGA does a dumb bruteforce fetch of the same memory every scanline, even though the contents will generally be the same in textmode (or at least, you don't care whether or not it changes on a given scanline, you can do row-oriented rendering). The character bitmaps are different every scanline, so you'll still have to fetch those.
So I wouldn't be surprised if they did that trick on PCjr as well: fetch the attributes once every 8 scanlines and re-use them. That way you've effectively reduced the bandwidth by almost 50%, with just one 'badline' where you have quite a lot of waitstates, but the other 7 scanlines are as fast as in 40 column or graphics modes.
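The arithmetic behind that guess, with illustrative numbers (not measured from real hardware): re-fetch character codes every scanline, but fetch attributes only on the first scanline of each character row.

```python
# Rough arithmetic for the attribute-caching idea above. Assumes character
# codes are fetched every scanline and attributes only once per row; the
# numbers are illustrative, not measured from any real machine.
CHAR_BYTES = 80     # 80-column text: one character code per column
ATTR_BYTES = 80     # one attribute byte per column
ROW_HEIGHT = 8      # scanlines per character row on CGA

badline = CHAR_BYTES + ATTR_BYTES                      # first line of a row: 160
other = CHAR_BYTES                                     # remaining 7 lines: 80
average = (badline + (ROW_HEIGHT - 1) * other) / ROW_HEIGHT
print(average)  # 90 bytes/scanline on average vs 160, a ~44% saving
```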

Various CGA clones also have a snow-free implementation, and generally they don't use fancy expensive memory to get there (with more bandwidth or 'dual port', so you can read and write at the same time).
 
Various CGA clones also have a snow-free implementation, and generally they don't use fancy expensive memory to get there (with more bandwidth or 'dual port', so you can read and write at the same time).

Yeah, that was my point. It was typical for the Apple II 80 column cards to generate snow when you accessed them, too. But those were designed by some guy in a basement, not IBM. For the PCjr, wasn't it shared memory? It probably did something like the Apple II and interleaved access between the CPU and video HW.
 
Actually, come to think of it, the PCjr has more bandwidth anyway, because of its 16 colour graphics mode.
So perhaps it can just use that for 80 column mode as well, in a way that remains compatible with the CGA memory layout.

Still, I wouldn't be surprised if that's actually how they did it... You could do interleaved memory access... access two banks of memory at a time (even and odd bitplanes). That way you double the bandwidth without requiring faster memory chips.
You'd need an internal buffer to 'linearize' it.
 