
Help me understand early VGA

Benson86

Member
Joined
Jun 1, 2022
Messages
20
So I'm trying to wrap my head around early 90's PC gaming, in particular resolutions. I collect games and hardware, and part of that is I really like to play games on period-correct displays in their native resolutions. I get how CGA/Tandy/EGA work: they support certain modes and you just need a matching monitor, simple. Same with vintage consoles, there's 240p/480i/480p, just match the TV. But the PC VGA era just seems like a poorly documented wild west. Whenever I google "VGA" everything just says 640x480, but that's obviously not the case, as most early 90's VGA games ran at lower than that. I'm assuming 320x200 or something similar? I currently have a Tandy setup for 80's games, and then a more modern PC hooked up to a late-model Compaq FS7600 "VGA" monitor for playing GOG/DOSBox/PCEmu stuff. For everything 640x480 and above it's awesome, but it seems like that's the lowest resolution that display can go; I have to 2x-scale lower-res games and then they just look emulated. I can trick it into lower resolutions with super resolutions, but then it has huge sharp scanlines and I know that's not period correct.

I'm beginning to think that to really see the early 90's correctly I need to get another setup, yes? Did early VGA monitors support resolutions lower than 640x480 natively? When browsing, say, eBay, how would one even know what to look for? I imagine it's not really possible to do that from the modern box, so I'll prob be needing to put something together from that era, late 486/early Pentium. But with everything after EGA just being called VGA, I quickly get confused trying to figure out what supports what lol. Help 😂
 

Agent Orange

Veteran Member
Joined
Sep 24, 2008
Messages
6,111
Location
SE MI
When I first got a VGA display in the early 90's (an Oak card), VGA was in fact 640x480. That's the point where VGA kicked things off; next was 800x600 and so on. EGA was almost as good at 640x350. Then, of course, there's CGA, which compelled me to go EGA in the first place. The Hercules Graphics Card was available at 720x348 but very pricey for its time.
 

rmay635703

Experienced Member
Joined
May 29, 2003
Messages
464
Location
Wisconsin
Video devices supporting different resolutions to gain more color depth go right back to the beginning with CGA:
640x200 - mono
320x200 - 4 color
And a strange 16-color modified text mode (ignoring composite)

Storage, memory and performance were at a price premium when VGA released in 1987.

Initially VGA was made to support:
256-color MCGA 320x200
640x480 - 16 colors
720x400 or 350 for text

It was made to be backward compatible, so you could also run most of the previous EGA and CGA modes.

At the beginning most games used 16-color VGA 320x200 to share assets with EGA, but with the added benefit that you could choose a better set of 16 colors.

As time moved on, more games took advantage of the 256-color mode, while card games / SimCity / simulations / productivity software used the "hi-res" 16-color mode.

Eventually programmers found they could drive other 256-color VGA modes using Mode X techniques:
320x240
360x240
320x480
were generally possible on plain 256K VGA moving into the later years.
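A quick back-of-the-envelope sketch (mine, not from the post) of why those Mode X resolutions fit on a stock 256K VGA card: in unchained modes the 64K limit applies per plane, not to the whole frame.

```python
# Sketch (assuming the standard VGA memory layout): mode 13h chains the
# four 64K planes so only 64K of pixels is addressable, while Mode X
# unchains them, spreading one byte-per-pixel frame across all four.

PLANE_SIZE = 64 * 1024  # bytes per plane on a stock 256K VGA

def bytes_per_plane(width, height):
    """Unchained 256-color mode: pixels are split evenly over 4 planes."""
    return (width * height) // 4

for w, h in [(320, 200), (320, 240), (360, 240), (320, 480)]:
    per_plane = bytes_per_plane(w, h)
    print(f"{w}x{h}: {per_plane:5d} bytes/plane "
          f"({'fits' if per_plane <= PLANE_SIZE else 'too big'})")
```

All four resolutions land well under 64K per plane, which is the whole point of the unchained layout.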
 

Benson86

Member
Joined
Jun 1, 2022
Messages
20
Video devices supporting different resolutions to gain more color depth go right back to the beginning with CGA:
640x200 - mono
320x200 - 4 color
And a strange 16-color modified text mode (ignoring composite)

Storage, memory and performance were at a price premium when VGA released in 1987.

Initially VGA was made to support:
256-color MCGA 320x200
640x480 - 16 colors
720x400 or 350 for text

It was made to be backward compatible, so you could also run most of the previous EGA and CGA modes.

At the beginning most games used 16-color VGA 320x200 to share assets with EGA, but with the added benefit that you could choose a better set of 16 colors.

As time moved on, more games took advantage of the 256-color mode, while card games / SimCity / simulations / productivity software used the "hi-res" 16-color mode.

Eventually programmers found they could drive other 256-color VGA modes using Mode X techniques:
320x240
360x240
320x480
were generally possible on plain 256K VGA moving into the later years.

Ok, this makes sense, so VGA does indeed support those lower resolutions. But how does this translate to monitors? Are there VGA CRTs that support those resolutions?
 

rmay635703

Experienced Member
Joined
May 29, 2003
Messages
464
Location
Wisconsin
Ok, this makes sense, so VGA does indeed support those lower resolutions. But how does this translate to monitors? Are there VGA CRTs that support those resolutions?
Everything pretty much runs the same horizontal scan rate, so VGA is for all practical purposes fixed-frequency, even though the different resolutions listed can be driven.
 

Benson86

Member
Joined
Jun 1, 2022
Messages
20
Everything pretty much runs the same horizontal scan rate, so VGA is for all practical purposes fixed-frequency, even though the different resolutions listed can be driven.
So is there a reason my current monitor can't? Or is that because it's a higher-resolution CRT?
 

SomeGuy

Veteran Member
Joined
Jan 2, 2013
Messages
4,160
Location
Marietta, GA
Basic VGA monitors had to support three main line counts: 350, 400, and 480 scan lines; everything else was upscaled using scan-line doubling. 350 was used for EGA compatibility (640x350), 400 was used for text and for doubled 320x200, 640x200, and such, and 480 was 640x480.

The only standard was "copy what IBM is doing".

Many early VGA cards came with tools to tweak sync signals and timings because it wasn't all really standard.

Some application software could futz with sync timing and might not display properly on all monitors.

Higher resolutions such as 800x600 and 1024x768 were "Super VGA" and came in various timings and sync settings; you always had to check your manuals for compatibility. Lower resolutions, such as non-doubled 640x200, were not supported on VGA.
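The scan-line doubling described above is just each source row sent to the tube twice; a toy illustration (mine, not SomeGuy's):

```python
# Toy illustration of scan-line doubling: VGA shows a 200-line frame on
# a 31 kHz monitor by scanning every row twice, for 400 lines total.

def double_scan(frame):
    """Return the frame with each row repeated once."""
    return [row for row in frame for _ in (0, 1)]

frame_200 = [[y % 256] * 320 for y in range(200)]  # fake 320x200 frame
frame_400 = double_scan(frame_200)
print(len(frame_200), "->", len(frame_400))  # 200 -> 400
```

This is why 200-line content looks "2x scaled" on VGA even on real period hardware: adjacent output lines are identical, not separated by a dark scanline like on a 15 kHz CGA monitor.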
 

Benson86

Member
Joined
Jun 1, 2022
Messages
20
Basic VGA monitors had to support three main line counts: 350, 400, and 480 scan lines; everything else was upscaled using scan-line doubling. 350 was used for EGA compatibility (640x350), 400 was used for text and for doubled 320x200, 640x200, and such, and 480 was 640x480.

The only standard was "copy what IBM is doing".

Many early VGA cards came with tools to tweak sync signals and timings because it wasn't all really standard.

Some application software could futz with sync timing and might not display properly on all monitors.

Higher resolutions such as 800x600 and 1024x768 were "Super VGA" and came in various timings and sync settings; you always had to check your manuals for compatibility. Lower resolutions, such as non-doubled 640x200, were not supported on VGA.
So everything was upscaled anyway, even back then? There really was no way to play those old games at what I would consider "native" resolutions? What got me thinking about this was that I fired up Loom and Monkey Island on the Tandy and they looked stunning in native Tandy 16-color, but they ran awful on the 8088 lol. Maybe a 386 machine with an EGA card is more what I'm looking for. Thanks!
 

Eudimorphodon

Veteran Member
Joined
May 9, 2011
Messages
4,630
Location
Upper Triassic
Did early VGA monitors support resolutions lower than 640x480 natively?

So... yes, but strictly speaking, not by much. I think you may be getting a little confused by what "320x200" means in a VGA context, so let me explain:

Original VGA supported three vertical line counts, which matched up with different refresh rates because the horizontal line rate (the number of scan lines per second) was the same for all of them: 31.5 kHz. Bog-standard, non-multi-scanning VGA monitors can ONLY display a 480-line mode @60Hz, or 400- and 350-line modes @70Hz. Technically those last two are actually the same mode; the same number of lines are scanned out the back door of the VGA card, and a change in the sync pulse polarities tells the monitor to use different vertical sizing to "stretch" the 350 lines to more fully fill the screen. These three line counts are used in various combinations with two standard pixel clocks (25.175 and 28.322 MHz, each of which also had a 1/2 divider available) to create the standard modes that "SomeGuy" just mentioned above, i.e.:


Line mode                  | Video modes      | What for
400 @70Hz                  | 720x400          | Standard VGA text mode
350 @70Hz                  | 640x350          | EGA high resolution
200 @70Hz (double-scanned) | 320x200, 640x200 | VGA mode 13h; CGA and low-res EGA modes
480 @60Hz                  | 640x480          | High-res VGA

So, to be clear, the original 1987 IBM VGA standard double-scanned the 200 line modes; it is perfectly normal and expected that 200 line software won't look the same on a VGA monitor as it does on an EGA or CGA monitor; they're effectively "2x scaled" even on the original hardware. If you want "wide scanlines" like you see on a CGA monitor then, yes, you shouldn't be using a VGA card. (Or you can run it in an emulator that gives you wide scanlines.)
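To put numbers on that fixed line rate (my arithmetic, using the commonly cited totals of 525 and 449 scan lines per frame including blanking, which are not stated in the post):

```python
# Sketch: one horizontal line rate (~31.469 kHz) produces both standard
# VGA refresh rates; only the total scan lines per frame (active plus
# blanking) changes between the 60Hz and 70Hz modes.

H_RATE = 31_469  # scan lines per second

modes = [
    ("640x480 (525 total lines)", 525),
    ("720x400 text (449 total lines)", 449),
    ("640x350 EGA-compat (449 total lines)", 449),
]

for name, total in modes:
    print(f"{name}: {H_RATE / total:.2f} Hz")
```

Dividing the line rate by the frame's total line count gives roughly 59.94 Hz for the 480-line mode and 70.09 Hz for the 400/350-line modes, matching the 60Hz/70Hz figures above.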
 

Eudimorphodon

Veteran Member
Joined
May 9, 2011
Messages
4,630
Location
Upper Triassic
... Re: the above, this business about the "200-line" modes actually being double-scanned remixes of the 70Hz text mode is why those inexpensive VGA-to-HDMI adapters often only work in Windows and other graphical environments, not for DOS. Computer monitors that have HDMI ports in addition to other connectors can sometimes make sense of the resulting nonstandard-for-HDMI signal, but TVs and video capture devices, not so much.
 

Agent Orange

Veteran Member
Joined
Sep 24, 2008
Messages
6,111
Location
SE MI
VGA cards and monitors were expensive in the early 90's. My first EGA monitor was about $350 for a no-name product with a discount. The EGA card was an 8-bit ISA Matrox and went for about the same. That was a fairly big investment back then.

Some monitors will not respond to lesser video cards for whatever reason. Case in point: I have a W7 gamer build that will not display with an XFX 7970 video card on a 28" IPS monitor, yet it runs perfectly with an Nvidia 1080. That same 7970 ran okay with the same monitor on an Intel Z170 motherboard and a 6700K chip. This all probably has to do with how the video card's BIOS interacts with the system; looking back, there may have been an update or two for that 7970 which never got installed. Confusing? Yes, it really was.
 

Benson86

Member
Joined
Jun 1, 2022
Messages
20
So... yes, but strictly speaking, not by much. I think you may be getting a little confused by what "320x200" means in a VGA context, so let me explain:

Original VGA supported three vertical line counts, which matched up with different refresh rates because the horizontal line rate (the number of scan lines per second) was the same for all of them: 31.5 kHz. Bog-standard, non-multi-scanning VGA monitors can ONLY display a 480-line mode @60Hz, or 400- and 350-line modes @70Hz. Technically those last two are actually the same mode; the same number of lines are scanned out the back door of the VGA card, and a change in the sync pulse polarities tells the monitor to use different vertical sizing to "stretch" the 350 lines to more fully fill the screen. These three line counts are used in various combinations with two standard pixel clocks (25.175 and 28.322 MHz, each of which also had a 1/2 divider available) to create the standard modes that "SomeGuy" just mentioned above, i.e.:


Line mode                  | Video modes      | What for
400 @70Hz                  | 720x400          | Standard VGA text mode
350 @70Hz                  | 640x350          | EGA high resolution
200 @70Hz (double-scanned) | 320x200, 640x200 | VGA mode 13h; CGA and low-res EGA modes
480 @60Hz                  | 640x480          | High-res VGA

So, to be clear, the original 1987 IBM VGA standard double-scanned the 200 line modes; it is perfectly normal and expected that 200 line software won't look the same on a VGA monitor as it does on an EGA or CGA monitor; they're effectively "2x scaled" even on the original hardware. If you want "wide scanlines" like you see on a CGA monitor then, yes, you shouldn't be using a VGA card. (Or you can run it in an emulator that gives you wide scanlines.)
Ok, that makes much more sense, thank you. So in essence I'm asking for an answer to a problem that doesn't really exist lol. Just to make sure I'm understanding this correctly: if you take something like The Dig or Full Throttle, which were VGA releases but run at 200 lines, they were more or less always meant to be seen double-scanned? So I'm not missing out on anything by playing them on a more modern multi-scan CRT?
 

Benson86

Member
Joined
Jun 1, 2022
Messages
20
Ok, ignore my previous, I get it now. I was able to flash a bootable DOS to a USB stick and boot native DOS on that computer. And lo and behold, it's running at 720x400 and the monitor is happy. I took a chance and booted Dune, and while the game runs way too fast and has no sound, it looks AWESOME. The monitor reports 70Hz user mode, which is obviously the double-scanned 200@70Hz you mentioned, and it looks perfect, no aspect issues. So the issue is not the monitor at all; I was thinking about this backwards. It's just Windows and DOSBox not playing nice. Looks like there may be a 486 build in my future, just what I needed, another computer lol 😂
 

Eudimorphodon

Veteran Member
Joined
May 9, 2011
Messages
4,630
Location
Upper Triassic
Just to make sure I'm understanding this correctly: if you take something like The Dig or Full Throttle, which were VGA releases but run at 200 lines, they were more or less always meant to be seen double-scanned? So I'm not missing out on anything by playing them on a more modern multi-scan CRT?

Yep. I remember actually being a little weirded out the first time I saw what CGA graphics running on VGA looked like back in the 80's; 640x200 in particular looks a little "off" because it's more obvious that the pixels are effectively very tall rectangles when they're double-scanned.

(I mean, obviously I *knew* that mode is whack when it comes to drawing graphics, but on a real CGA monitor you can kind of pretend you're looking at a sane resolution through a set of very fine blinds because the individual *pixels* are single scanline dots, not rectangles. The 320x200 modes don't really have that problem, it just looks like, well, a lower resolution mode displayed on a finer grid. Something you're used to if you had, say, a Tandy Color Computer.)

That the double-scanned 200-line modes run at 70Hz vs. 60Hz does technically introduce some subtle differences between "native VGA 200-line" software and older software running on VGA; it's rare, but if a given piece of software plays timing tricks to update the screen only between frames, the refresh rate being different from CGA/EGA can occasionally introduce subtle visual artifacts. These are not something most people will perceive, but it crops up when people try to do screen captures and the like. Software *written* for VGA should know what the native refresh is, obviously.
 

SomeGuy

Veteran Member
Joined
Jan 2, 2013
Messages
4,160
Location
Marietta, GA
So everything was upscaled anyway, even back then? There really was no way to play those old games at what I would consider "native" resolutions? What got me thinking about this was that I fired up Loom and Monkey Island on the Tandy and they looked stunning in native Tandy 16-color, but they ran awful on the 8088 lol. Maybe a 386 machine with an EGA card is more what I'm looking for. Thanks!
A lot of games were specifically designed for these modes. Doom, for example, ran at 320x200x256, but I'm not aware of any system that could actually run it with actual 200 scan lines. It was intended to be run on double-scanned VGA.

Even Commander Keen, which used the EGA 320x200x16 mode, was never intended to run on genuine IBM EGA. As I recall, the timings on genuine EGA are different, and the game's internal timing was synced to the VGA double-scanned interpretation of EGA, so it didn't quite work perfectly on genuine EGA.
 

Eudimorphodon

Veteran Member
Joined
May 9, 2011
Messages
4,630
Location
Upper Triassic
I guess there's one "mode" I didn't mention in my post about how the line counts and pixel clocks are remixed, and that's "Mode X". Mode X actually describes a technique for programming the VGA registers to display nonstandard modes not supported by the IBM BIOS and make better use of the available RAM, but colloquially it usually refers to a double-scanned 320x240 mode using said techniques. This turned up in some mid-90's VGA games; it's mostly of interest because these games run at 60Hz, like the corresponding high-res line mode, instead of 70Hz like the normal VGA games.

(There are some picture viewers and similar software that support really oddball higher resolutions like 360x350 by leveraging weird combinations of the available clocks and dividers, but with all Mode X stuff compatibility was kind of hit or miss.)
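For the clock-and-divider combinations mentioned in this thread, the arithmetic (mine, using the standard 25.175/28.322 MHz dot clocks rather than anything stated in the post) works out like this:

```python
# Sketch: dividing each standard VGA dot clock by the fixed line rate
# gives the total pixels per scan line (active plus blanking); the
# half-rate divider yields the 320/360-pixel-wide modes.

H_RATE = 31_469  # scan lines per second

clocks = [
    ("25.175 MHz (640-wide modes)", 25_175_000),
    ("28.322 MHz (720-wide modes)", 28_322_000),
]

for name, clk in clocks:
    total = clk / H_RATE
    print(f"{name}: ~{total:.0f} pixels/line, half-clock ~{total / 2:.0f}")
```

That works out to roughly 800 and 900 total pixels per scan line, of which 640 and 720 are active, which is why the two clocks map to the 640- and 720-wide mode families.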
 