
Why did IBM create CGA - What user was their target?

This is not necessarily a question of the graphics card. The color choice could also be done in the monitor; the dark yellow to brown conversion of the IBM CGA is an example.

Yes. That seems to be how the IBM 5271 generates its 8 colors, for instance (like the color 3270 terminals): the monitor nominally accepts digital R, G, and B, but internally transforms the results to a palette that's quite different from a 1:1 mapping of the input channels to the electron guns.

The 5154 EGA monitor does a similar translation while it's in 200-line (CGA) mode, with the RGBI input converted to 6 bits using a ROM. In theory you could put a modified ROM in there and select your CGA 'palette' from the full 64-color EGA range.
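Conceptually that ROM is just a 16-entry lookup table: 4 bits of RGBI in, 6 bits of colour out. As a sketch (these values mirror the EGA card's default palette registers rather than the monitor ROM's actual contents, which I don't have):

Code:
/* Sketch of the kind of 16-entry lookup the 5154 does in 200-line mode:
 * 4-bit RGBI index in, 6-bit colour out (bits 0-2 = primary B/G/R,
 * bits 3-5 = secondary b/g/r).  These are the EGA card's default palette
 * register values, not the actual ROM contents -- note entry 6 is 0x14,
 * i.e. brown instead of dark yellow.  A "modified ROM" palette would
 * just be a different table. */
#include <stdio.h>

static const unsigned char rgbi_to_ega6[16] = {
    0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x14, 0x07,   /* normal colours  */
    0x38, 0x39, 0x3A, 0x3B, 0x3C, 0x3D, 0x3E, 0x3F    /* intense colours */
};

int main(void)
{
    for (int i = 0; i < 16; i++)
        printf("RGBI %2d -> 6-bit 0x%02X\n", i, rgbi_to_ega6[i]);
    return 0;
}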
 
I read on this thread that the Apple II was something that IBM wanted to match or slightly exceed with its first CGA PC.

So it makes sense that the composite mode offers the same palette as the Apple II. I was aware the composite mode used the Apple II method, but I didn't know it produced identical colours. It's really something that doing so basically gave it the IIe capabilities!

It's probably a stretch to assume that IBM made a deliberate choice to match the Apple color selection (or anything else in particular).

In both cases, the colors are just an artifact of the method - pushing out white pixels at 4 times the frequency of the NTSC color carrier. As long as the phase relationship is close enough to a multiple of 90 degrees from some reference phase (and a simple design lends itself to doing that, w.r.t. the color burst), the 16 possible dot patterns are going to be decoded to the same set of colors.... or close enough.

The bit-order is different between the Apple II and CGA, too:

[Images: artifact colour bit patterns, Apple II vs. CGA]
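If you want to convince yourself of the "same set of colors" claim, the arithmetic is small enough to sketch out. This is just the NTSC math, not anything either machine literally does, and the phase reference (hence the exact hues) is an arbitrary assumption here:

Code:
/* Sketch of why any machine pushing dots at 4x the colour carrier gets
 * roughly the same 16 artifact colours: to an NTSC decoder, a repeating
 * 4-bit dot pattern is a luma level (its average) plus a chroma vector
 * (its fundamental at the subcarrier frequency).  The phase reference
 * below is arbitrary, so the hues only approximate the real palettes. */
#include <stdio.h>
#include <math.h>

static double clamp01(double x) { return x < 0 ? 0 : (x > 1 ? 1 : x); }

int main(void)
{
    const double pi = 3.14159265358979;
    for (int pat = 0; pat < 16; pat++) {
        double y = 0, ci = 0, cq = 0;
        for (int k = 0; k < 4; k++) {
            int on = (pat >> k) & 1;              /* dot k of the pattern  */
            double ph = 2.0 * pi * k / 4.0;       /* its subcarrier phase  */
            y  += on / 4.0;                       /* luma = average level  */
            ci += on * cos(ph) / 2.0;             /* chroma fundamental, I */
            cq += on * sin(ph) / 2.0;             /* chroma fundamental, Q */
        }
        /* standard YIQ -> RGB conversion, clamped */
        double r = clamp01(y + 0.956 * ci + 0.621 * cq);
        double g = clamp01(y - 0.272 * ci - 0.647 * cq);
        double b = clamp01(y - 1.106 * ci + 1.703 * cq);
        printf("pattern %d%d%d%d -> R=%.2f G=%.2f B=%.2f\n",
               pat & 1, (pat >> 1) & 1, (pat >> 2) & 1, (pat >> 3) & 1,
               r, g, b);
    }
    return 0;
}

Swap the bit order around (as in the images above) and the same 16 results just come out in a different order.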

At any rate, IBM never documented any of this artifact color stuff. It's also worth remembering that there was no single composite 'mode'... for the average programmer tinkering with CGA on a TV, this particular 'Apple-like' palette would have been *extra* obscure, because it requires setting mode 6 (640x200) and then clearing the BW bit in the Mode Select register. Other artifact color palettes, which are possible in mode 4 (320x200), would be more immediately obvious because this mode already enables the color burst by default - no direct hardware programming needed.

The reason this palette is seen most often is simply because it's useful, in that it covers the color gamut more evenly than other options. But again that's a consequence of the method.
 
@Eudimorphodon that's really the kind of 'addon' idea I would expect somebody would have produced back in the day.
The fact that there wasn't such a solution, kinda points to - there was no problem? Again I'd like to stress my opinion that 'ugly CGA' is backlash from users using it after its time, due to having no better option.

Even existing binaries could be easily patched to send a few OUTs here and there to change the palette, and there is no failure if the device isn't present...
 
@Eudimorphodon that's really the kind of 'addon' idea I would expect somebody would have produced back in the day.
The fact that there wasn't such a solution, kinda points to - there was no problem? Again I'd like to stress my opinion that 'ugly CGA' is backlash from users using it after its time, due to having no better option.

Yeah. I think a thing people forget today is that color, period, was kind of a novelty to have at all, and the "business case" for it was to make things like graphs easier to interpret, not necessarily to make them "pretty". Honestly a lot of early color machines have pretty whack palettes that were chosen mostly for expedience instead of artistic value. The Commodore/MOS VIC-II, for instance, has a remarkably dull/brown palette; there's a quote from one of the designers in response to a query from someone who apparently overthought a theory on why it has the colors it does:

"I'm afraid that not nearly as much effort went into the color selection as you think. Since we had total control over hue, saturation and luminance, we picked colors that we liked. In order to save space on the chip, though, many of the colors were simply the opposite side of the color wheel from ones that we picked. This allowed us to reuse the existing resistor values, rather than having a completely unique set for each color."

Any system using RGB is of course in a particularly tough spot because you don't get to have additional colors for "free" like composite gives you; every increase in color gamut costs you memory. One interesting thing to think about is the fact that IBM did choose(*) to go with a two bit "chunky" framebuffer for the low-res mode; in some alternate universe we might have ended up with a version of CGA that used a bitmap/attribute system like Mode 2 on the Sam Coupe. (I.e., the "graphics" layer would be a monochrome 320x200 bitmap, and every 8-pixel group would have its foreground/background colors choosable from the 16 color text mode palette.) You could probably make a case that a system like that would be better for games, because you'd be able to create very colorful 16 color displays using no more memory than our CGA's 4-color mode, but it would be worse for business graphics like charts and graphs because you'd have the problem of attribute clash where different color objects met. (Granted, with 1-pixel-high attribute blocks the problem wouldn't be as bad as it is on systems with character-cell-size color regions; with a little care you could make it work in most circumstances. Pie charts would be a particular challenge, though.)
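(Quick back-of-envelope check on that "no more memory" claim, assuming one attribute byte per 8x1-pixel cell as described:)

Code:
/* Both layouts fit the same 16 KB of CGA video RAM (my arithmetic,
 * assuming one attribute byte per 8x1-pixel cell as described above). */
#include <stdio.h>

int main(void)
{
    int chunky  = 320 * 200 * 2 / 8;   /* 2 bpp chunky frame:  16000 bytes */
    int bitmap  = 320 * 200 / 8;       /* 1 bpp bitmap:         8000 bytes */
    int attribs = (320 / 8) * 200;     /* fg/bg byte per cell:  8000 bytes */
    printf("chunky 4-colour mode: %d bytes\n", chunky);
    printf("bitmap + attributes:  %d bytes\n", bitmap + attribs);
    return 0;
}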

(*)Actually... now I'm really kind of wondering why they didn't go with an attribute system? Technically it would have been a snap to reuse the 40-column color text mode pixel pipeline for that: just switch out the character generator so the raw bits of the "character" memory cells go straight into the shift register, use the attribute pipeline completely unchanged, and do the same thing they do in graphics mode where the first character row address line becomes A13 for the graphics memory. They could have saved a significant chunk of circuitry. The memory mapping would be a little gross, because you'd have the "bitmap" and "attribute" memory cells interleaved, not in separate planes like they are on most graphics+attribute systems, but horrible memory layouts were par for the course back then.

Apparently someone thought having a "pure" bitmap mode was worth adding all that additional circuitry to the card to implement the 2-bit chunky system. Just as an aside, I wonder if that decision to have the "Red" and "Green" lines be the ones directly mapped through (with blue and intensity as all-or-nothing "tints" for the non-background colors) was inspired by the old 2-strip Technicolor process that only used red and green. That system was remarkably good at doing flesh tones... so is that the answer, CGA was designed to be the best possible 2-bit solution for displaying p0rn!? Now that would indicate some ahead-of-the-curve thinking for 1981. ;)

Even existing binaries could be easily patched to send a few OUTs here and there to change the palette, and there is no failure if the device isn't present...

Part of me wants to breadboard it up just for LOLs in spite of, or even because of, how pointless it would be. It *is* legitimately the sort of thing someone might have built in the 80's.
 
Yeah. I think a thing people forget today is that color, period, was kind of a novelty to have at all, and the "business case" for it was to make things like graphs easier to interpret, not necessarily to make them "pretty". Honestly a lot of early color machines have pretty whack palettes that were chosen mostly for expedience instead of artistic value. The Commodore/MOS VIC-II, for instance, has a remarkably dull/brown palette; there's a quote from one of the designers in response to a query from someone who apparently overthought a theory on why it has the colors it does:

"I'm afraid that not nearly as much effort went into the color selection as you think. Since we had total control over hue, saturation and luminance, we picked colors that we liked. In order to save space on the chip, though, many of the colors were simply the opposite side of the color wheel from ones that we picked. This allowed us to reuse the existing resistor values, rather than having a completely unique set for each color."

Yeah, I mean, what sub-8-bit standard is pretty? For me, PC graphics start getting vibrant and nice with 256 colours. The NES is definitely tons better than CGA, but it's still not as pretty and vibrant as the PC Engine. I really like the blue tones on that system.

The IBM PC with CGA has more 'fidelity' than arcade machines from 5 years before it, which is quite good.

Any system using RGB is of course in a particularly tough spot because you don't get to have additional colors for "free" like composite gives you; every increase in color gamut costs you memory. One interesting thing to think about is the fact that IBM did choose(*) to go with a two bit "chunky" framebuffer for the low-res mode; in some alternate universe we might have ended up with a version of CGA that used a bitmap/attribute system like Mode 2 on the Sam Coupe. (I.e., the "graphics" layer would be a monochrome 320x200 bitmap, and every 8-pixel group would have its foreground/background colors choosable from the 16 color text mode palette.) You could probably make a case that a system like that would be better for games, because you'd be able to create very colorful 16 color displays using no more memory than our CGA's 4-color mode, but it would be worse for business graphics like charts and graphs because you'd have the problem of attribute clash where different color objects met. (Granted, with 1-pixel-high attribute blocks the problem wouldn't be as bad as it is on systems with character-cell-size color regions; with a little care you could make it work in most circumstances. Pie charts would be a particular challenge, though.)

You could get them for free if you could shift the palette easily before each line.

What I would do if I were IBM: CGA analogue RGB. The monitor would be a tad easier to produce, a standard 15kHz RGB screen. The ISA card circuitry would not be more complex; it could even be slightly less. It would have a 0.7V and a 0.35V source, and pass them on to R, G, B depending on the intensity bit. Of course this wouldn't yield anything different at that point, but it would take the digital TTL scheme out of the equation, and that was a dead end after all. After all, it's what the TTL monitor does internally; the beams are not digital...
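Roughly this kind of mapping, just as a sketch (the exact levels are placeholders, and the brown fix-up and dark grey would now need handling on the card rather than in the monitor):

Code:
/* Sketch of the RGBI -> analogue mapping described above: one 0.7 V and
 * one 0.35 V source, routed to whichever guns have their bit set, with
 * the intensity bit picking the source.  Levels are placeholders; the
 * brown special-case and dark grey would need their own handling. */
#include <stdio.h>

static void rgbi_to_volts(int r, int g, int b, int i, double v[3])
{
    double level = i ? 0.70 : 0.35;   /* intensity bit selects the source */
    v[0] = r ? level : 0.0;           /* red gun   */
    v[1] = g ? level : 0.0;           /* green gun */
    v[2] = b ? level : 0.0;           /* blue gun  */
}

int main(void)
{
    double v[3];
    rgbi_to_volts(1, 1, 0, 0, v);     /* colour 6: dark yellow on this scheme */
    printf("R=%.2f V  G=%.2f V  B=%.2f V\n", v[0], v[1], v[2]);
    return 0;
}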

Once the basic signalling stops being a gamut limiter, I'm sure people would have come up with ideas for simple circuits to get more than 4/16 colours out.

Part of me wants to breadboard it up just for LOLs in spite of, or even because of, how pointless it would be. It *is* legitimately the sort of thing someone might have built in the 80's.

It does sound pretty fun and ultimately pointless :)
 
What I would do if I were IBM: CGA analogue RGB. The monitor would be a tad easier to produce, a standard 15kHz RGB screen.

It is kind of strange in retrospect that TTL color monitors were ever a thing. Three-channel analog RGB with common composite sync was part of the RS-170A standard (which also covers composite and S-video) in the 1970's, and analog RGB did exist in a television studio production context. The only justification that really stands out to me for putting the DAC in the monitor instead of the computer is that doing TTL over the cable makes it more resistant to interference, which, eh, might have actually been useful in the 1970's given how many early computers simply *oooozed* RF interference.

Because of the requirement for better shielding and cabling, I would guess that pricewise the difference between a digital RGBI and an analog RGB monitor is probably a wash. (And of course the price of the DAC and better output drive circuitry gets punted into the computer.) But once you start adding color depth you run into that whole issue of needing an unwieldy number of wires, which starts eroding the advantage of their needing less shielding for short runs and introduces the risk of skew, so... yeah. Weird little cul-de-sac we went through there.

Of course now with DVI/HDMI/Displayport/etc we're back to digital but we're using serial transmission to keep the number of lines down. Maybe the next evolution in monitor technology will involve paper cups and string after someone discovers it was, in fact, the perfect signal transmission system the whole time, kids were just doing it wrong.
 
I'm a little surprised that nobody built one. Third-party PCGs were quite popular in Japan. HAL Laboratory started out by building and releasing a PCG for the PET 2001, and went on to release seven more for various systems.
Worth remembering is that this was the only way to freely display any shape on a PET or other text-mode-only computers. It would have made more sense to have this as an add-on for the MDA card than the CGA card.

Yeah, an SRAM character generator would have been a big improvement (it could have been just 1K for the upper character set with "standard" characters in a low 1K ROM, but I don't know if that would have saved any money). Complicated dual-ported access wouldn't really be necessary - just require apps to only change the SRAM when the display is disabled, and updates to the SRAM do not need to be fast. The ROM BIOS could have had multiple compressed character sets in it and a shared decompression routine (it is interesting that contemporary ROMs back then rarely compressed anything, probably reflecting the premium on RAM at the time making it not worth it).
A way to update the contents of a character generator RAM could have been to just write to the regular screen RAM, and have a mode where the character generator is written rather than read, kind of sort of.
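From software it could look something like this - everything here is of course hypothetical, since the "load character generator" bit and the exact procedure don't exist on any real CGA or MDA (Borland-style dos.h assumed):

Code:
/* Hypothetical sketch only: an imaginary "write the character generator
 * instead of reading it" bit in the CGA Mode Select register (bit 7 is
 * unused on the real card).  Glyphs are then written through the normal
 * B800:0000 window while the display is disabled. */
#include <dos.h>

#define MODE_SELECT   0x3D8     /* real CGA Mode Select register     */
#define CHARGEN_WRITE 0x80      /* imaginary "chargen writable" bit  */

void load_font(const unsigned char far *glyphs)   /* 256 chars x 8 bytes */
{
    unsigned char far *window = (unsigned char far *)MK_FP(0xB800, 0x0000);
    unsigned n;

    outportb(MODE_SELECT, 0x01 | CHARGEN_WRITE);  /* 80-col, video off, chargen in */
    for (n = 0; n < 256 * 8; n++)
        window[n] = glyphs[n];
    outportb(MODE_SELECT, 0x29);                  /* back to 80x25 colour text */
}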

==========================

Some general thoughts:
We have to remember that at the time the microcomputer companies did more or less just wing it, and some things became successes, and others became failures. In-house, IBM knew that compatibility was key to selling computers, from their 360 mainframe line, where all models could run software written for the lowest-spec models, and other software would run on all but the lowest-spec ones.

All other companies that lacked this experience seemed to just wing it. Commodore and Apple had successes with their PET and Apple II, and both tried an incompatible and in-theory-better follow-up that flopped: the CBM-II/B series and the Apple III. Later on, many companies learned the hard way that creating a microcomputer not compatible with others really required the computer to have killer features to take any market share. I.e., once the PC was well established as a business computer, and a few of the successful home computer models were available in enough quantity, there was almost no market for anything else unless it was superior to what already existed (say, the Amiga with its superior features, or the Atari ST that beat the price of more or less all similarly specced systems at the time, and so on).

IBM just realized that they wanted a monochrome and a color option.

A thing that this thread seems to miss is that some early MDA cards are said to actually support color text, i.e. all four attribute bits are routed all the way to the DE9 D-sub connector. (Don't quote me on these being early cards, just that a few cards do this.) This makes me think that IBM probably initially had some thoughts of making a graphics card and a text-only card, and having both be usable with either monochrome or color displays, but then reality hit them: they really wanted a better-than-15kHz monitor, and it was likely hard to buy a monitor chassis (or have an OEM manufacture monitors to your spec) for color with anything other than 15kHz TV frequencies. Technically I think it wouldn't have been that hard for a qualified TV/monitor manufacturer to create a color monitor that would run at something other than 15kHz, but it would likely have taken some time and whatnot, and wouldn't have fitted with IBM's intention to use off-the-shelf parts.

There are so many "what if"s here. Like, for example: since MDA has a full 8-bit attribute byte, what if the signal to the monitor had been analogue instead of two-bit digital? That way we could have had 8 or 16 grey-scale levels rather than 4.

========================
Re memory and whatnot: If IBM had opted for the 8086 rather than the 8088, the PC might have had a 16-bit data bus, and that would have made a 32k CGA card more reasonable, or for that matter UMA graphics on the main board.

A problem with UMA is how it interacts with any faster versions of a computer. The AT would either have required non-UMA graphics hardware, or the bus clock would have had to be a multiple of the PC/XT bus clock, more or less.

Also, re the UMA vs. NUMA discussion: to make things more complicated, there were computers where the main/default memory was UMA, but add-on memory was separate. In particular, on a VIC-20 the internal RAM is UMA and sits on one side of the bus buffers (shared with the video chip and the character ROM), while expansion RAM sits on the other side (shared with the CPU, I/O chips and ROM).

========================

Speaking about "what if"s, and also the comment that some Intel support chips were less than stellar (the DRAM controller vs. the 8085 vs. 64kbit DRAM chips not fulfilling the timing margins, and so on), I would say that a major mistake with the PC was to use the Intel-style active-high interrupt lines rather than pulled-up, active-low ones. IBM should, for example, have added an inverter to the IRQ lines on the ISA bus, removing all problems with conflicting interrupts and whatnot. The 6800/6502 systems use active-low interrupts and you just connect all interrupt sources to the same line (and poll each chip to determine what caused the interrupt). This could have been a thing on the PC too. Btw, since IDE/ATA is basically a stripped-down ISA bus, it's worth mentioning that Commodore solved the interrupt polarity issue for the IDE/ATA interface on the Amiga 600, 1200 and 4000 by simply having a pull-down resistor on the interrupt line, i.e. if you didn't have a hard disk connected, it wouldn't try to hog the interrupt line. (Can't remember the details, but IIRC that interrupt line is shared with one of the CIA I/O chips.)
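In software the "poll each chip" part is nothing exotic - a made-up sketch (device names and status-bit layout invented here for illustration):

Code:
/* Sketch of the shared, active-low, wired-OR interrupt model: any device
 * may pull the one line low, and the handler just asks each one in turn
 * whether it did.  Devices and status bits are made up for illustration. */
#include <stdio.h>
#include <stddef.h>

static volatile unsigned char uart_status  = 0x80;  /* pretend the UART interrupted */
static volatile unsigned char timer_status = 0x00;

static void uart_service(void)  { uart_status  &= 0x7F; puts("UART serviced");  }
static void timer_service(void) { timer_status &= 0x7F; puts("timer serviced"); }

struct device {
    const char *name;
    volatile unsigned char *status;   /* bit 7 set = "I pulled the line low" */
    void (*service)(void);
};

static struct device devices[] = {
    { "uart",  &uart_status,  uart_service  },
    { "timer", &timer_status, timer_service },
};

static void shared_irq_handler(void)
{
    for (size_t i = 0; i < sizeof devices / sizeof devices[0]; i++)
        if (*devices[i].status & 0x80)    /* did this one raise it?       */
            devices[i].service();         /* servicing clears the request */
}

int main(void)
{
    shared_irq_handler();
    return 0;
}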

I think the issues with the CGA card seem minuscule compared to the interrupt polarity blunder.
 
This makes me think that IBM probably initially had some thoughts of making a graphics card and a text only card, and have both be usable with either monochrome or color displays, but then reality hit them and they really wanted a better than 15kHz monitor, and it was likely hard to buy a monitor chassis (or have an OEM manufacture monitors according to your spec) for color with anything else than 15kHz TV frequencies.
There was actually a color text-only video card, made for Tandy by STB -- several versions of it, in fact: the Deluxe Text Display Adapter:

 
You could get them for free if you could shift the palette easily before each line.

Tricks like that either murder your CPU (and therefore tend not to be much good for interactive software, mostly just static pictures or demos) or need a lot more circuitry than would have been reasonable on the CGA card. (The VGC chip in the Apple IIgs is all about per-line palettes, with one option that allows every line to have one of 16 palettes with no CPU overhead at all, or an "expensive" option of doing a completely unique palette per line. But of course the VGC is a big custom IC, not discrete logic.) I mean, anything's possible; you could have done something like defining a graphics line as actually being 41 character positions wide instead of 40 in the CRTC (which would have added 2 bytes to every active line) and built circuitry that would have grabbed those two bytes at the beginning of each active line and loaded 4 nibbles into that '670 we're using as a palette register before displaying the actual active line? Actually... wait, no; CGA has only 384 bytes to spare, and if we need two bytes per scanline for these palette flips we're 16 bytes short. So we narrow the display to 39 characters wide?
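(The arithmetic, for anyone checking: 16,384 bytes in the buffer minus 320x200 at 2 bpp = 16,000 bytes of pixels leaves 384 spare; an extra character position costs 2 bytes on each of 200 active lines, i.e. 400 bytes, so we're 16 short. Dropping one visible character per line frees another 2 x 200 = 400 bytes, which is why 39-wide fits.)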

I'm kind of thinking the graphics/attribute model makes more sense compared to this, since it gives you the ability to use all 16 RGBI colors on every line. It just puts some semi-nasty limitations on where you can have different colors next to each other, but even that is pretty minor compared to a lot of other contemporary systems.
 
Thanks, everyone, for these recent comments.

I read on this thread that the Apple II was something that IBM wanted to match or slightly exceed with its first CGA PC.

So it makes sense that the composite mode offers the same palette as the Apple II. I was aware the composite mode used the Apple II method, but I didn't know it produced identical colours. It's really something that doing so basically gave it the IIe capabilities!



I can inform people here that subtle EGA palette changing has been used in a recent Commander Keen fangame, to nice effect.

[Attachments: two screenshots of the fangame showing the palette changes]

It's surprising that new retro DOS games haven't utilized basic CGA palette swapping very much, never mind the advanced palette hacking that's being discussed above.
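For reference, the basic swaps really are only a few instructions each - a minimal sketch, assuming a Borland-style dos.h:

Code:
/* Minimal sketch of both "basic" palette tricks (Borland-style dos.h assumed).
 * CGA: the Color Select register at port 3D9h picks palette 0/1, the bright
 *      variant, and the background/border colour.
 * EGA: INT 10h, AX=1000h remaps one of the 16 palette registers to any of
 *      the 64 colours (given a mode/monitor that can show them). */
#include <dos.h>

void cga_palette(int palette1, int bright, int background)
{
    unsigned char v = (unsigned char)((background & 0x0F)
                    | (bright   ? 0x10 : 0)     /* intensified palette     */
                    | (palette1 ? 0x20 : 0));   /* 1 = cyan/magenta/white  */
    outportb(0x3D9, v);
}

void ega_palette_register(unsigned char reg, unsigned char colour /* 0..63 */)
{
    union REGS r;
    r.x.ax = 0x1000;      /* BIOS: set palette register */
    r.h.bl = reg;         /* which of the 16 registers  */
    r.h.bh = colour;      /* new 6-bit colour value     */
    int86(0x10, &r, &r);
}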



BTW I was always amazed by what the Apple II could accomplish via its hacky composite colour, even with a handful of colours per screen. With IBM / PC, though, the CGA monitor eventually became the "default" entry-level colour display mode, not composite, which I think is why CGA is still discussed so much.

Foray in the Forest uses the "standard" 16 colour palette on an actual EGA. On a VGA, it uses the wider selection, since the EGA doesn't allow a wider selection in 200 line mode (a rather foolish limitation that has been discussed at length elsewhere).

You could implement a 320x350 mode and then choose from the 64 colours, but a game like Foray in the Forest is going to have major struggles with bandwidth to update that. Not to mention you'd have to redraw everything for the different aspect ratio.
 
Foray in the Forest uses the "standard" 16 colour palette on an actual EGA. On a VGA, it uses the wider selection, since the EGA doesn't allow a wider selection in 200 line mode (a rather foolish limitation that has been discussed at length elsewhere).

You could implement a 320x350 mode and then choose from the 64 colours, but a game like Foray in the Forest is going to have major struggles with bandwidth to update that. Not to mention you'd have to redraw everything for the different aspect ratio.

Yes, FITF is locked-in to the CGA palette because of the 200-line mode on EGA monitors. The developers were explicit that the wider selection of colours is strictly a feature of the game rendering within VGA.

As you say, the game was programmed for a 200-line mode, so it's either CGA/EGA 200-line, or the same resolution pulling from the VGA palette. If there was a HW hack allowing the EGA expanded palette in 200-line mode, games like FITF would be able to exploit it.

What this does highlight is how retro-gaming does not really utilize the EGA 350p 16/64-colour mode. Most 'EGA' gaming is really 'super-CGA.' We hardly got any 'EGA' [hi-res] software, then or now. There aren't many demos either. We may not actually know what a 1984 or later IBM / PC with EGA can really do.
 
My impression is that most non-game software that used EGA at all ran it at 640x350.

Also I would say that Windows 3.x is a quite important piece of software that runs at 640x350 with EGA.

Re the hardware hack: with a simple BIOS hack that sets the palette registers to full CGA-compatible values by default in the 200-line modes, rather than setting the unused bits to zero (which is done to reduce problems when a true CGA monitor that grounds pin 2 is used with an incorrectly set jumper for pin 2), it would just be a matter of modifying the monitor to always use EGA palette mode, even in the 200-line CGA-compatible resolution.

Today it would be super simple. Back in the day you'd probably have wanted a switch to enable/disable it, rather than patching the BIOS or loading a TSR that hooks the set-video-mode INT.

But also: I don't know about North America, but over here in Europe anyone gaming on a PC before the 90's would have been seen as a really weird oddball. Few people had PCs at home in the 80's, and when they did it was for business purposes. Think either someone who used it for word processing, or a super boring, stiff, upper-class middle-aged man who did whatever with his PC and would never let his kids touch it, kind of sort of, to exaggerate a bit.

Many companies that eventually offered PCs at prices suitable for home users would also sell computers way more suitable for gaming. Thinking about companies like Commodore, Amstrad (which also owned the Sinclair brand from the mid-80's) and whatnot.
 
If there was a HW hack allowing the EGA expanded palette in 200-line mode, games like FITF would be able to exploit it.
This was possible with specialized (multisync/multi-standard) monitors which could be set to interpret the signals this way, *if* the software bothered to support this specifically.
A handful of games still did it:

[Screenshots: Rambo III (DOS) title screen in EGA enhanced colour, and one more example]

What this does highlight is how retro-gaming does not really utilize the EGA 350p 16/64-colour mode. Most 'EGA' gaming is really 'super-CGA.'
I wouldn't say that, because many games used EGA-specific tricks even at the 320x200 mode. Things like the different write modes, video latches, and multiple screen pages. Quite a bit more advanced than what a 'super-CGA' like the Tandy 1000 would support, even if the resolution and palettes used were the same.
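(For anyone unfamiliar with the latch trick, a rough sketch, assuming a Borland-style dos.h: switch the Graphics Controller to write mode 1 and each read/write pair moves 8 pixels across all four planes at once.)

Code:
/* Sketch of the EGA latch-copy trick (write mode 1): a read loads the four
 * per-plane latches, a write stores them, so each byte access moves 8 pixels
 * across all 4 planes.  Borland-style dos.h assumed; offsets are within the
 * A000h segment. */
#include <dos.h>

void ega_latch_copy(unsigned src_off, unsigned dst_off, unsigned bytes)
{
    volatile unsigned char far *vram =
        (volatile unsigned char far *)MK_FP(0xA000, 0);
    unsigned i;

    outportb(0x3CE, 5);        /* Graphics Controller index: Mode register     */
    outportb(0x3CF, 0x01);     /* write mode 1: CPU data ignored, latches used */

    for (i = 0; i < bytes; i++) {
        unsigned char latched = vram[src_off + i];   /* load the latches      */
        vram[dst_off + i] = latched;                 /* written value ignored */
    }

    outportb(0x3CF, 0x00);     /* back to write mode 0 */
}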

You have to remember that EGA had a fairly short period as the chief PC video standard to support. It only really became a popular option once the cheap clones started flooding the market in ~1986 or so. The following year you already had VGA, and the third-party chipset makers were quicker to upgrade their EGA solutions to the new standard.
 
Most 'EGA' gaming is really 'super-CGA.' We hardly got any 'EGA' [hi-res] software, then or now.
I believe the prevailing opinion was that EGA's hi-res 640x350 mode was too slow for games, especially on a real EGA card rather than VGA running in EGA compatibility mode.

There certainly were hi-res EGA games, but they were mostly non-fast-action games, like card and puzzle games. Flight Simulator 3.0 added hi-res EGA mode, but you were lucky if the screen updates were any better than a slide show.
 
I wouldn't say that, because many games used EGA-specific tricks even at the 320x200 mode. Things like the different write modes, video latches, and multiple screen pages. Quite a bit more advanced than what a 'super-CGA' like the Tandy 1000 would support, even if the resolution and palettes used were the same.

That's a good point. Some of the HW tricks enabling smooth-scrolling were EGA-specific for some games. I guess I'm just upset that there weren't many hi-res EGA games.

Yes, I've only seen adventure / simulation games using hi-res, for the most part. Because of the aforementioned speed limitations.


Miam: Also I would say that Windows 3.x is a quite important piece of software that runs at 640x350 with EGA.

Another good point. I was thinking of standalone DOS applications, primarily. I did remember the UI capabilities that EGA brought upon release.

Apple made that (in)famous ad about their competitor being a monochrome totalitarian state. But I was just thinking that I'd rather run Win 3.0 in EGA than a Macintosh I. It's interesting that EGA allowed for a more Apple-like UI in games like Simcity Classic, or especially SimAnt.

IBM PC sure wasn't the preferred gaming PC of the 1980s, but it's fun to think about what could be done on that hardware. Many of the popular early-90s games ran on what was essentially 1980s hardware. Even a 1984-5 EGA PC should be a decent competitor to the Commodore 64 (not cost-wise), although that opens a whole new discussion.

Some of the EGA features / hidden features are so obscure that you can read about them in an article, and then not remember them during a discussion.


You have to remember that EGA had a fairly short period as the chief PC video standard to support. It only really became a popular option once the cheap clones started flooding the market in ~1986 or so. The following year you already had VGA, and the third-party chipset makers were quicker to upgrade their EGA solutions to the new standard.

Yes, I lived it. My first computer was a VGA 286 in 1989. Skipped over EGA entirely. Likely because of those developments.

I probably didn't see many computers with EGA at all, not in people's homes at least.
 
I probably didn't see many computers with EGA at all, not in people's homes at least.

EGA cards actually kind of outlived EGA monitors; it was briefly popular to buy a cheap (because nobody wanted them anymore) EGA card to use specifically with a CGA monitor, because there were plenty of *those* in circulation in the late 80's into the early 90's. With most games targeting the 200-line modes it was a good strategy for getting a little extra mileage out of an AT, especially as a second computer for the kids.
 
Yes. That's what prompted my "most EGA gaming was like super-CGA" comment. Even though that wasn't really the case.

A lot of the lower-end PC gaming in that time was EGA running on CGA monitors, as you say. It strikes me that's a display output mode compatible with the first CGA monitors. But many of those games were using EGA-specific tricks on the CGA monitors, as others have pointed out.

There are some amusing videos out there of people asking "why does my CGA monitor work with my EGA card?"
 
IBM built CGA because they wanted to compete for the home computer market where others also had color graphics.
Right on - yes, they wanted a color video display that could compete in the home computer market, which was not dominated by IBM.
 
This is not necessarily a question of the graphics card. The color choice could also be done in the monitor; the dark yellow to brown conversion of the IBM CGA is an example.
Ya, but the monitor's video logic must be able to support the additional color choices. Most old broadcast-grade composite monitors only support a 1 V peak-to-peak video signal, so 16 colors is about the maximum for them. Some other monitors will support 5-volt TTL-level signals on their color and contrast control inputs, which allows for more color selections (64 colors, 64 intensity levels, 64 different contrast levels), but this was mostly found on more expensive graphics workstations with proprietary hardware.
..
But I get what you are hinting at.
 
Ya, but the monitor's video logic must be able to support the additional color choices. Most old broadcast-grade composite monitors only support a 1 V peak-to-peak video signal, so 16 colors is about the maximum for them. Some other monitors will support 5-volt TTL-level signals on their color and contrast control inputs, which allows for more color selections.
You have this exactly backwards. "Broadcast quality" monitors, or any monitor that has analogue inputs, support effectively unlimited colours. TTL monitors support much fewer colours.

The most basic TTL RGB is 0 V or 5 V for red, and the same for green and blue, so you have eight colours. (1 V is treated the same as 0 V: black. 4 V on blue is treated the same as 5 V on blue: full blue.) On an analogue input, you have as many shades of blue as you can think of between, say, 0.2 V (black level) and 0.7 V (max brightness level). E.g., 0.20000 V, 0.20001 V, 0.20002 V, etc.
 