
Why did IBM create CGA - What user was their target?

Basically all monitors support an infinite number of discernible shades on any of the beams they use; the driver circuit adapts the signal.
TTL monitors are instead triggered to set up a few exact intensities.

I think the issue in the late 70s/early 80s was getting some sort of proper DAC onto the graphics card. A simple, fixed, non-integrated circuit could be built for 2- or 4-bit depth, but for more than that, and for true flexibility (palette choice), a real DAC would have to be there. And that was expensive. It was easier to build the TTL decoder into the monitor itself.
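To make that concrete, here is a minimal C sketch (all names hypothetical) of the difference: a fixed 2-bit "DAC" is just a hard-wired mapping from bit patterns to output levels, while a palette DAC needs a software-loadable lookup in front of the output stage.

```c
/* Sketch: why a fixed resistor network suffices for low bit depths,
   but palette choice requires a real (programmable) DAC stage. */
#include <stdio.h>

/* Fixed "resistor ladder": the level is a hard-wired function of the bits. */
static double fixed_dac_2bit(unsigned bits) {
    static const double levels[4] = {0.0, 0.33, 0.66, 1.0}; /* set by resistors */
    return levels[bits & 3];
}

/* Programmable palette DAC: the index selects an entry software can change. */
static double palette[16]; /* hypothetical 4-bit palette, software-loadable */

static double ramdac(unsigned index) {
    return palette[index & 15];
}

int main(void) {
    for (int i = 0; i < 16; i++)
        palette[i] = i / 15.0;  /* load a linear ramp... */
    palette[1] = 0.8;           /* ...or freely remap any entry */
    printf("fixed 0b10 -> %.2f, palette 1 -> %.2f\n",
           fixed_dac_2bit(2), ramdac(1));
    return 0;
}
```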

Five years pass by, and now the problem is that there is no DAC on the board, so any extension of the gamut requires more pins on the video port.

Of course, it's worth mentioning that as the transistor count of an average IC grew over time, the multiplexing problems suddenly went away, because it became cheap to serialize bitstreams. Within about 20 years of the digital/analog monitor shenanigans we had moved everything to a serial approach, both for external peripherals and for video.

But yes, analog VGA allowed some very high resolutions in the meantime, just by using a rather simple RAMDAC. The standard lasted a good 15-20 years.
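For a sense of how simple that RAMDAC interface ended up being, here is a hedged C sketch of programming one VGA palette entry. The ports 0x3C8/0x3C9 and the three 6-bit writes are the standard VGA DAC interface; outb() stands in for whatever port-I/O primitive your environment provides.

```c
/* Sketch of the VGA RAMDAC interface: one index port, one data port,
   three 6-bit writes (R, G, B) per palette entry. */
#include <stdint.h>

extern void outb(uint16_t port, uint8_t value); /* assumed port-I/O helper */

static void set_palette_entry(uint8_t index, uint8_t r, uint8_t g, uint8_t b) {
    outb(0x3C8, index);     /* DAC write index register */
    outb(0x3C9, r & 0x3F);  /* 6-bit red */
    outb(0x3C9, g & 0x3F);  /* 6-bit green */
    outb(0x3C9, b & 0x3F);  /* 6-bit blue */
}
```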
 
Also, using TTL reduces the quality requirements for the cable. As long as ghosting and other disturbances stay weaker than the margin between valid TTL logic levels, they are invisible.
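A tiny sketch of that noise immunity, using the standard TTL input thresholds (low at or below 0.8 V, high at or above 2.0 V); decode_ttl() is a hypothetical model of the receiver:

```c
/* A receiver only cares whether the voltage is below V_IL (0.8 V) or
   above V_IH (2.0 V); disturbances weaker than the margin never
   change the decoded bit. */
#include <stdio.h>

static int decode_ttl(double volts) {
    if (volts <= 0.8) return 0;  /* valid TTL low */
    if (volts >= 2.0) return 1;  /* valid TTL high */
    return -1;                   /* undefined region */
}

int main(void) {
    printf("%d\n", decode_ttl(3.5 - 0.4)); /* 3.1 V after ghosting: still 1 */
    printf("%d\n", decode_ttl(0.2 + 0.4)); /* 0.6 V after ghosting: still 0 */
    return 0;
}
```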

I remember that in the 1980s it was more or less impossible to find "VGA wires" as just wires, without connectors. (It was even worse for SCART, where you wanted even more signals. Say you wanted to make a SCART cable for an Amiga: you really wanted three 75-ohm coax wires, one separate shielded wire for the CSYNC signal, preferably two wires for +12V to pin 8 and, via a series resistor, +1..+3V to pin 16, and preferably two shielded wires for audio, unless you ran that as a separate wire all the way to the computer rather than as a pigtail from the 23-pin D-sub at the Amiga end.)

On the other hand, wires with enough conductors for CGA or later EGA were available "everywhere".

Anecdote: back in the 90s I used TV-antenna coaxial wire to make a VGA-to-3xBNC adapter, as I couldn't really find any other suitable wire readily available. The VGA DE15-to-DE15 cables tended to have too poor a signal quality for a 20" workstation monitor.

Nowadays, or rather for the last 20 years or so, flat-screen monitors analyze the signal and compensate for quality problems related to the wiring!
 
One of the comments in the EGA thread was how CGA was not for gaming. In "your" opinion... what was CGA for? On one hand my first thought is that it's incredibly limiting... but... for when it was introduced... it did have an 80-column, 16-color text mode. It also had 640x200 monochrome (yes, also 4-color 320x200 and 40-column text, but the 4-color mode was limiting and 40-column text was largely unused/not on anyone's wishlist that I know of, especially with the 80-column mode sitting there). Considering the graphics modes were pretty bad for gaming and the CGA dot pitch was/is largely bad for text, why did they create CGA? What was their target demographic/use?

Can I try answering this?

You are looking at 1981. There weren't many competitors much more powerful than CGA. The Apple II, sure, but it used NTSC artifact bleeding to implement color. Others... the Commodore PET, TRS-80, etc... used only a text mode. The early 16-color computers had only just been released in 1979/80: the Atari 400/800 and the TI-99/4.

So, CGA was not obsolete at the time it was released. It was pretty contemporary. And it was made quickly, from off-the-shelf components, for cheap. Knowing that, it's actually a pretty capable piece of hardware.

It featured two text modes: the standard 80x25 for high-res monitors, and 40x25 for low-res monitors and TVs. It featured three official graphics modes: 160x100 (which was officially documented in the CGA docs, despite really being a half-character-height text mode hack), 2-bit 320x200 with three official palettes plus a changeable background color and intensity bit, and 1-bit 640x200 for high-res graphics and GUIs (as used by Windows 1.0/2.0 and, with less success, Windows 3.0, which was optimized for the squarer pixels of later graphics adapters).

Also, you could use NTSC artifacting on the composite output to get 16 colors (or more, as some of the demos mentioned earlier show). While this wasn't widely publicized and the official CGA docs don't mention it, it became common knowledge by the mid 80s, and there are a number of games that support 16 colors on composite output. Most famously, the early Sierra graphical adventures like the King's Quest and Space Quest series used it quite effectively.
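As a rough illustration (not a faithful NTSC model), the trick boils down to the fact that four adjacent 640-mode pixels span one color subcarrier cycle, so each 4-bit pattern reads as a color on the TV side:

```c
/* Rough sketch of why 640x200 monochrome yields 16 composite colors:
   four adjacent pixels fit in one NTSC color subcarrier cycle, so the
   TV's decoder reads each 4-bit pattern as a hue/brightness
   combination. The exact colors depend on the decoder; this shows
   only the grouping. Pixels are 0/1 values. */
unsigned artifact_index(const unsigned char *pixels, int x) {
    /* x is a multiple of 4: the start of one subcarrier cycle */
    return (unsigned)(pixels[x]     << 3 | pixels[x + 1] << 2 |
                      pixels[x + 2] << 1 | pixels[x + 3]);
}
```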

So, CGA was actually quite OK and competitive for some time. Its biggest problems were the interleaved addressing, the strange aspect ratio and the low resolutions. The graphics adapters that followed solved those problems and CGA was forgotten. Although I guess at some point it may have been resurrected for a bit, as the cards were becoming obsolete and cheap (and so affordable to a lot of people who could never have afforded the shiny new (S)VGA + SB + CD-ROM multimedia PC of the 90s).
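For the curious, that interleaved addressing looks like this in a minimal C sketch of the 320x200 4-color mode (function name hypothetical):

```c
/* CGA's interleaved framebuffer: even scanlines start at offset 0 of
   the 0xB8000 segment, odd scanlines live in a second bank at 0x2000. */
#include <stdint.h>

uint32_t cga_offset(int x, int y) {
    return (uint32_t)((y & 1) * 0x2000  /* odd lines: second bank */
                    + (y >> 1) * 80     /* 80 bytes per scanline */
                    + (x >> 2));        /* four 2-bit pixels per byte */
}
```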
 
Agree. Had CGA had some sort of palette registers for the bitmap modes, it would have been as good as or even better than the competition.

A tiny correction: Atari had a 128 (or was it 256?) color palette, all of which could be visible on screen at the same time (although not on the same line without tricky programming).

The main disappointment with CGA, as compared to many 8-bit computers from the early 80s (mostly slightly newer ones, but still), seems to be how it's obviously made from discrete chips except for the 6845, while Atari, Texas Instruments, Commodore and whatnot made custom chips that allowed the graphics to be more "elegant" from a software perspective.
Things weren't super straightforward for the competitors though. We all know the weird memory map the Apple II had, but in bitmap mode the Commodore 64 also stored data oriented as if it were characters, i.e. it's not 40 bytes in a row to draw 320 horizontal pixels (or 160 4-color pixels); rather, eight consecutive bytes fill the same space as a character does in text mode (see the sketch below). And this was about the limit of chip manufacturing at the time.
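A small C sketch of that C64 layout; the address math is the well-known hires bitmap formula:

```c
/* Character-oriented bitmap layout: in C64 hires (320x200) mode,
   eight consecutive bytes fill one 8x8 cell, so stepping 40 bytes
   does NOT move you one scanline down. */
#include <stdint.h>

uint16_t c64_bitmap_offset(int x, int y) {
    return (uint16_t)((y >> 3) * 320   /* cell row: 40 cells x 8 bytes */
                    + (x >> 3) * 8     /* cell column: 8 bytes per cell */
                    + (y & 7));        /* scanline within the 8x8 cell */
}
```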

In hindsight I think IBM made a big mistake when they made their PS/2 range as similar to the existing PCs as they did. Sure, they wouldn't have been able to sell something that wasn't backwards compatible, but instead of VGA it would have been better if they had made something more like the Amiga: a large chunk of memory shared between the graphics hardware and the CPU, some form of hardware acceleration, and, while at it, the same hardware handling audio too.
If they had done this, I think chipset manufacturers would have made compatible chipsets, but for ISA bus computers rather than MCA computers, and it might have caught on as a standard.

Going off on this tangent even more: every now and then I toy with the idea of what Commodore could have done by combining their other computers with their PC range. I think it would likely have been fairly easy to add some glue logic to make the Amiga hardware appear EGA compatible in bitmap mode. The hard nut to crack would have been text mode.
Or for that matter, how about a "VIC-III" chip, like the VIC-II in the C64 but running at twice the clock speed, allowing 640x200 monochrome or 320x200 in four colors, with a memory map like CGA's, but also with user-definable characters in text mode, sprites, a readable raster row register, raster interrupts, sprite collision detection and whatnot.

Continuing further on this tangent, I find it surprising that Commodore didn't make a PC sound card with two SID chips. The PC and the C64 were in way different market segments, so a PC sound card wouldn't have eaten into C64 sales, and if they had released it in, say, '85 or earlier, it would have been great compared to the nonexistent alternatives. (I just watched The 8-Bit Guy's video about two IBM sound cards. One was a speech synthesis thing, not much to say there, but the other was a "MIDI sound card" of sorts, and boy did they all sound lackluster at the time.)
 
The graphics adapters that followed solved those problems and CGA was forgotten.
But not nearly soon enough. Desktop PCs with CGA were sold as late as 1992 in major U.S. stores, and laptops until 1994, plus palmtops like the HP 200LX until 1999!
 
But not nearly soon enough. Desktop PCs with CGA were sold as late as 1992 in major U.S. stores, and laptops until 1994, plus palmtops like the HP 200LX until 1999!

Wait till you find out when they stopped selling Commodore 64s and Apple IIs.
 
I find it surprising that Commodore didn't make a PC sound card with two SID chips. The PC and the C64 were in way different market segments, so a PC sound card wouldn't have eaten into C64 sales, and if they had released it in, say, '85 or earlier, it would have been great compared to the nonexistent alternatives.
There was a SID sound card for the PC, but not from Commodore and not until 1989:

 
In hindsight I think IBM made a big mistake when they made their PS/2 range as similar to the existing PCs as they did. Sure, they wouldn't have been able to sell something that wasn't backwards compatible, but instead of VGA it would have been better if they had made something more like the Amiga: a large chunk of memory shared between the graphics hardware and the CPU, some form of hardware acceleration, and, while at it, the same hardware handling audio too.
If they had done this, I think chipset manufacturers would have made compatible chipsets, but for ISA bus computers rather than MCA computers, and it might have caught on as a standard.

Curious that you would say that: that very difference brought death to the Amiga and Atari once 3D exploded.

The IBM PC had raw CPU power and raw pixel-pushing power, so a 3D engine could compute and render everything itself.

On the other hand, the PC had a very vibrant 2D experience with VGA too, without ever resorting to blitters and other added features. The Amiga was already stagnating when PCs got their first 2D-accelerated VGA cards en masse, and those weren't even used for games :)

The Amiga and Atari could not follow the mid-90s 3D craze because their design was too specialized for 2D. They're great computers, but they're the product of a mindset of building the best possible 80s machine from the experiences of the early 80s.
 
Continuing further on this tangent, I find it surprising that Commodore didn't make a PC sound card with two SID chips. The PC and the C64 were in way different market segments, so a PC sound card wouldn't have eaten into C64 sales, and if they had released it in, say, '85 or earlier, it would have been great compared to the nonexistent alternatives. (I just watched The 8-Bit Guy's video about two IBM sound cards. One was a speech synthesis thing, not much to say there, but the other was a "MIDI sound card" of sorts, and boy did they all sound lackluster at the time.)

Digging up this fact is interesting for the topic of discussion, not tangential; to me it shows how relevant the PC "gaming" market really was.

The NES exploded in the latter half of the 80s. I had a peanuts-cheap NES clone before I ever had a sound card in a PC. Yes, if people had been seeking chiptunes on a PC, there would have been cards out in 1985. Customers press software manufacturers, who press hardware manufacturers in turn; as soon as real demand occurs, a real market opens up. That's how it goes IRL, or at least I believe it should. There were sound circuits fit for home use on various PCs, such as the PCjr, Tandy and Olivetti Prodest, and again nobody in the real PC market cared too much... you'd have to have had an IBM PC AT user sitting at his expensive machine, looking at some tune-playing PCjr, actually being envious of it. Which I guess did not happen frequently.
 
Curious that you would say that: that very difference brought death to the Amiga and Atari once 3D exploded.

The IBM PC had raw CPU power and raw pixel-pushing power, so a 3D engine could compute and render everything itself.
It would technically have been relatively easy to make the hardware switchable between chunky and planar mode.

My main point is that it would have been better if IBM had used an architecture that shares the same RAM for regular code/data and for video + audio. But also, the Amiga way of using a simple coprocessor acting as a glorified display list processor, rather than having memory counters that reset each frame, made the hardware way more flexible without being that much more complicated. Instead of registers stating where in memory your display starts at some hard-coded intervals, plus a separate counter for the lower bits that steps through memory while outputting video, the Amiga just has a pointer that can be written to, and the coprocessor writes a suitable value into it each frame before the actual picture output starts. This way the picture can start at any arbitrary location in memory (aligned to the bus width, obviously), as the sketch below shows.
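A minimal C sketch of that idea, using the real Amiga custom-register offsets BPL1PTH/BPL1PTL (0x0E0/0x0E2); build_copper_list() is a hypothetical helper that builds the classic two-MOVE-plus-WAIT list:

```c
/* Copper as a display list: two MOVE instructions reload the
   bitplane 1 pointer, then a WAIT that can never match ends the
   list. Run every frame, this lets the picture start at any
   (word-aligned) chip RAM address. */
#include <stdint.h>

#define BPL1PTH 0x00E0  /* bitplane 1 pointer, high word */
#define BPL1PTL 0x00E2  /* bitplane 1 pointer, low word */

void build_copper_list(volatile uint16_t *cop, uint32_t bitmap) {
    cop[0] = BPL1PTH; cop[1] = (uint16_t)(bitmap >> 16); /* MOVE hi word */
    cop[2] = BPL1PTL; cop[3] = (uint16_t)bitmap;         /* MOVE lo word */
    cop[4] = 0xFFFF;  cop[5] = 0xFFFE;                   /* WAIT: end of list */
}
```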

Given that very little software except OSes and special tools talked directly to the floppy controller, IBM could have gotten rid of the old-style ISA DMA too.
 