CGA 640x200 (Screen mode 2) - Black & White or Greyscale?

Huh, I can see it that way. I guess without a real dedicated pixel clock there's no way to call it digital, when there's no fixed way to map the pixels.

I've done a lot of work on CRTs and video signals in my time, for video production, video generators, etc., but also just flat-out messing with old monitors to see what can be done. For example, my Model 25 CRT hooked up to a Dreamcast playing Quake III: https://youtu.be/U7hWvBc3nD4?t=2m51s
 
Screen 2 gives only two colors. However, they can be any two colors from the 16 in the RGBI palette - so you can have black and white, black and grey, purple and blue if you want.

Not on a real IBM CGA. In Mode 6 (640x200x2), IBM CGA allows you to use the color select register to pick the foreground color. The background is always black.

On some clones with embedded CGA-compatible hardware, it was possible to change the 640x200x2 foreground and background colors independently of each other; I have vague memories of doing this with OPTIK on my AT&T 6300. But that's not standard.
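For anyone who wants to try it, here's a minimal GW-BASIC sketch (assuming a real IBM CGA or a faithful clone). Port 3D9h is the CGA color select register, and in mode 6 its low four bits pick the foreground color:

Code:
10 SCREEN 2                      ' BIOS mode 6: 640x200, 2 colors
20 OUT &H3D9, 4                  ' color select register: low 4 bits = foreground (4 = red)
30 LINE (0, 0)-(639, 199), 1, B  ' box drawn in attribute 1, now displayed in red
40 A$ = INPUT$(1)                ' wait for a key...
50 SCREEN 0                      ' ...then back to text mode

Whatever value you write there, the background stays black, which is exactly the limitation described above.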

But personally I think that is kind of a hack tied to the specific hardware; I don't think it will work on other machines like the IBM Convertible...

Yes, some of this conversation was muddled due to treating the HP 200LX as a true clone when what it actually does is map 320x200x4 to the existing grayscale screen with shades of gray.
 
Huh, I can see it that way. I guess without a real dedicated pixel clock there's no way to call it digital, when there's no fixed way to map the pixels.

All display adapters use a pixel clock: MDA 16.257 MHz, CGA 14.318 MHz, EGA 14.318 & 16.257 MHz, VGA 25.175 & 28.322 MHz, etc. It is only the digital interfaces that make it readily available on the video output port.
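To make that concrete, here's the back-of-the-envelope math for standard VGA 640x480, in Basic to stay on topic (assuming the usual totals of 800 clocks per scanline and 525 lines per frame, blanking included):

Code:
10 PCLK# = 25175000#                              ' 25.175 MHz pixel clock
20 HTOTAL = 800: VTOTAL = 525                     ' totals including blanking
30 PRINT "hsync:"; PCLK# / HTOTAL; "Hz"           ' ~31469 Hz, the familiar 31.5 kHz
40 PRINT "vsync:"; PCLK# / HTOTAL / VTOTAL; "Hz"  ' ~59.94 Hz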
 
That's what I'm referring to, being available on the video port, to get true 1:1 pixel mapping, without having to "match" the phase manually (or automatically like most VGA monitors tried to do).

I guess I need to be more specific, since people always look past what I say. lol
 
Yup, 'legacy' display signals only sync with a pulse at the start of each scanline (hsync), and another one at the start of each frame (vsync).
Traditionally, the display expected a fixed frequency for hsync and vsync, and the signal had to be within a certain tolerance, where the display circuitry could sync to it.
With VGA, 'multi-sync' displays were introduced, which could sync to signals in a wide range of frequencies. They have some electronics to measure the vsync and hsync frequencies, and then sync to them (which is why many later multisync monitors have either an LCD panel or an on-screen display which can tell you what frequencies and effective resolution you're using).
For VGA this was more or less required, since even standard VGA has 2 crystals already, and SVGA cards often had 3 or more, to enable higher resolutions and refresh rates.
 

And to top it off, the resolution it displayed was more of a guess. For example, it showed "720x400" a lot of times in the VGA text mode, when only 640 dots are present horizontally. If it had a true pixel clock it could access on the video port, it'd know exactly how many pixels were available.

I have one LCD, an AvidAV, and on DVI it shows you exactly how many pixels there are; on VGA it only guesstimates the lines, but never guesstimates the horizontal resolution in the OSD.

Fun fact: the Model 25 CRT only looks at the vertical sync polarity, NOT the horizontal, which is why, if you modify one with a VGA card, EGA 350-line mode will not work even though the CRT can sync to it. To make 350-line mode work you can invert the vertical sync polarity; it then syncs up, but the picture looks squished vertically, as it's 50 lines short of what the monitor now thinks is 400-line mode.
 
Interesting, so that would make a VGA monitor a triple-sync monitor because it has to support 350, 400 and 480 line resolution modes. EGA monitors would be dual sync by supporting 200 and 350 line modes.
 
No, it's still fixed frequency, because it goes by the horizontal frequency: 31.5 kHz.

The more lines, the slower the vertical rate, which is why the 350- and 400-line modes are 70 Hz. On VGA cards there are extra blank lines in 350-line mode to make it really 400 lines, but the polarity of the horizontal sync tells the monitor to stretch the picture vertically, based on a separate adjustment pot. The 480-line mode runs at 60 Hz because, like I said, more lines equals a slower vertical rate if the horizontal isn't changing. (Remember 800x600 at 56 Hz? I wish I didn't.)
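The arithmetic behind that, with the standard VGA line totals (449 total lines for the 350/400-line modes, 525 for the 480-line mode, blanking included):

Code:
10 HSYNC = 31469                              ' fixed ~31.5 kHz horizontal rate
20 PRINT "350/400-line:"; HSYNC / 449; "Hz"   ' ~70.1 Hz
30 PRINT "480-line:"; HSYNC / 525; "Hz"       ' ~59.9 Hz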
 
And to top it off, the resolution it displayed was more of a guess. For example, it showed "720x400" a lot of times in the VGA text mode, when only 640 dots are present horizontally.
Yes, a guess, and the logic seems very simple:
- if measured vertical resolution is 350, assume it's 640x350
- if measured vertical resolution is 400, assume it's 720x400
- if measured vertical resolution is 480, assume it's 640x480
- and so on for SVGA resolutions...

But in this case it's a correct guess: VGA text mode by default uses a 9x16 character box, so 80x25 = 720x400.
However, try setting some 320x200 or 640x200 mode (using SCREEN 2 in Basic, just to stay on topic :mrgreen: ) on VGA, and the monitor will still think it's 720x400.

Edit: note that all that frequency measuring only applies to SVGA monitors; plain VGA monitors don't need to measure anything, as they know the vertical resolution by looking at the polarity of the HSYNC/VSYNC signals.
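For reference, the polarity combinations defined for the original IBM VGA work out like this; just a little Basic sketch of what the monitor's decoding amounts to, with HP/VP as made-up flags standing in for the measured polarities:

Code:
10 INPUT "HSYNC positive (1=yes, 0=no)"; HP
20 INPUT "VSYNC positive (1=yes, 0=no)"; VP
30 IF HP = 1 AND VP = 0 THEN PRINT "350 lines (e.g. 640x350)"
40 IF HP = 0 AND VP = 1 THEN PRINT "400 lines (e.g. 720x400 text)"
50 IF HP = 0 AND VP = 0 THEN PRINT "480 lines (e.g. 640x480)"
60 IF HP = 1 AND VP = 1 THEN PRINT "reserved on the original VGA"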
 
You know, it's been a while since I last looked at how large the text was; I thought it was 8x16, which would have been 640 dots wide. My memory is fading in that regard. Then again, I run my Model 25 at 132x50 in 400 lines using an ATI VGA Wonder card.
 
If you run that Model 25 with on-board MCGA, the character box will be 8x16 indeed.
On VGA, default is 9x16, but 8x16 is also possible.
 

Correct, and on MCGA, in graphics mode, it's usually 8x8 double scanned to 8x16 (and no you can't cheat it).
 
What do you mean?
If you set 320x200 or 640x200 graphics mode, and then output characters using BIOS calls, then indeed, BIOS uses 8x8 font, and the result is double-scanned to 400 lines, exactly as in VGA.
But in 640x480 mode, BIOS should use 8x16 font, shouldn't it?
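Easy enough to check from Basic; QBasic mode numbers assumed here, since GW-BASIC doesn't have SCREEN 11 (which needs MCGA or VGA):

Code:
10 SCREEN 2     ' 640x200: 80x25 text cells, i.e. an 8x8 box (double-scanned on MCGA/VGA)
20 PRINT "80x25 cells here"
30 A$ = INPUT$(1)
40 SCREEN 11    ' 640x480: 80x30 text cells, i.e. an 8x16 box
50 PRINT "80x30 cells here"
60 A$ = INPUT$(1): SCREEN 0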
 
Interesting, so that would make a VGA monitor a triple-sync monitor because it has to support 350, 400 and 480 line resolution modes. EGA monitors would be dual sync by supporting 200 and 350 line modes.

Yes, except EGA doesn't really 'sync'... They just invert the polarity for 350-line mode. So it still 'assumes' 200 line or 350 line mode, rather than actually syncing to the signal.

As for VGA... Is there a 350-line mode? I didn't think there was. I thought they just faked it with 400-line mode (there are only two crystals on VGA, so it can only support two pixel clocks... everything else has to be done by changing the number of scanlines and the length of each scanline).
 
But yes, from the monitor's point of view, there is a 350 line mode.
Timings are the same as in 400 lines, but the monitor needs to vertically stretch the 350 lines to fill the entire screen area.
From the card's point of view... still yes, as it has to provide the appropriate HSYNC/VSYNC polarity combination on its output.
 

Correct. That's why a Model 25-XT CRT (monochrome or color) is incapable of syncing to 350-line mode, as it doesn't read the horizontal sync polarity, only the vertical. You can invert the polarity to make it work, as it'll think it's 400-line mode, but it'll look widescreen.
 