
Why did IBM create CGA - what user was their target?

There's a lot IBM could have done differently, but hindsight is 20/20; they didn't know how things would go.

I don't think many people would have assumed the PC was going to be a target for arcade-quality games ten years on, or that those games would still target CGA graphics.

For example, making an Olivetti-esque CGA+ card that could connect to an MDA monitor made a lot of sense in 1984; if only they had known how long CGA code and mono monitors would stay in circulation...

I feel that most of these topics are nonsense, as we're really talking about the rise of the "low-cost gaming" phenomenon.
Imagine building a dumpster gaming PC from 2010 parts and installing a 2020 AAA title on it just to watch it lag. Nobody would do that. They'd know what it's all about and would carefully pick games to go with it.
I just wonder why the "gaming community" at large isn't capable of thinking the same way in the CGA case.

And by the way, even using CGA out of place, like running Prince of Persia on a 4.77 MHz XT, is better than nothing; I hope people get that. Obviously, if you were running CGA in 1990, you had no money for better.
Just the other day I visited a school friend who used to run Medal of Honor: Allied Assault at 5 FPS just because his little brother really, really wanted to play it and didn't care. I also didn't care about slow-motion Prince.
 
I really have no idea, although I favor the comment about business graphics. That makes sense, since IBM was primarily about business. They may have wanted some of the home market later, but I doubt that CGA was about capturing the home market. Heck, most homes could not afford IBM's stuff.

I can say why I bought into CGA, although a few years later, after CGA was pretty much dead in the marketplace. I spent years running CP/M word processing on my NorthStar Advantage. The characters were not great; we could clearly see the pixels. And it was a lot worse than my father-in-law's IBM PC with the 5151 green monitor, where we did our database work. We upgraded to Hercules on the PC and played some games after hours.

When I bought my Tandy 1400 FD in about 1989, it came with a CGA port to supplement the internal LCD display. I used the laptop LCD for word processing and database entry, but it was not very good if we wanted to play games after hours. So I bought the CGA monitor - for games. And I ended up using it for database forms and file management in color. It was a lot nicer using color for business apps than that blue-gray LCD screen, and the pixels were not much different than my old NorthStar Advantage's. Then I started getting into DTP and found the CGA 640x200 mono mode adequate to get me started.

Seaken
 
Was there any business graphics software for microcomputers in 1980? I tried looking, but everything I can find arrived after IBM had settled on the PC.

The design drawings provided two planned computer configurations.
The Business System had 32K, cassette, BASIC*, and an 80x24 display. Disk drives and extra memory were planned options.
The Personal System had no RAM on the planar board, no BASIC, and a cassette. 16K of RAM was supplied with the color TV adapter, which had planned maximum resolutions of 40x16 (text) and 280x192 (graphics). Lower resolutions were detailed on page 6, with indications of the amount of RAM needed for each video mode, which suggests that the rest of the video RAM would be used by the system. I wonder what type of high-resolution game could fit into about 2K. All the options for the Business System could be added.

* Okay, it reads "extra ROS of 24K", so it's possible that something else might have filled those chips.
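
For a rough sense of how that 16K splits, here's a back-of-the-envelope sketch (assuming 1 bit per pixel for graphics and 1 byte per character for text; the drawings may well specify otherwise):

```c
/* Back-of-the-envelope for the Personal System's 16K video RAM.
 * Assumptions: 1 bit per pixel at 280x192, 1 byte per character at 40x16. */
#include <stdio.h>

int main(void) {
    int total    = 16 * 1024;      /* 16384 bytes shipped with the adapter */
    int graphics = 280 * 192 / 8;  /* 6720 bytes for the full bitmap */
    int text     = 40 * 16;        /* 640 bytes for a character screen */
    printf("left after 280x192 graphics: %d bytes\n", total - graphics); /* 9664 */
    printf("left after 40x16 text:       %d bytes\n", total - text);     /* 15744 */
    return 0;
}
```

So even the maximum-resolution bitmap leaves over 9K of the 16K; if most of that went to the system, a game squeezing into "about 2K" doesn't sound far-fetched.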
 
I remember the Tandy DeskMate that came with my 1000 SX. When you ran it in the native 40-column mode, the words looked like something out of "Dick and Jane" ("Run, Spot, run"). You could see the pixels, and they resembled ping-pong balls - an exaggeration, but you get the picture. Also, the presentation resembled Mrs. Murphy's music class, where she would take that holder with 5 pieces of chalk, rake it across the blackboard, and then draw musical notes on the lines. You could actually do a letter in DeskMate, and it didn't look too bad on a DMP-130. Things have come a long way in the last 35-40 years.
 
I doubt that the display was blurrier on a monochrome 15.7 kHz monitor; without the bandwidth limitations introduced by NTSC colour and the shadow mask or aperture grille used in colour monitors, it should have been quite clean, as it was on other systems such as the Apple II with an 80-column card.

If you set a typical composite monitor side by side with a TTL interface *mono* monitor (obviously shadow masks completely break any comparison to mono monitors), even one that runs at NTSC scan rates (like the internal monitors you'd find in a lot of terminal-shaped computers and portables) in my experience the composite monitor usually looks just a *little bit* softer around the edges, which I think you can mostly just chalk up to "analog-ness", or sometimes "cheap-ness"... but in this context when I said "blurry" I was using it inexactly to describe the fact that the characters are going to be lower resolution compared to MDA.

But yes, theoretically speaking at least there's no inherent resolution limit with composite monitors as a *protocol*. The softness you generally see comes from a lot of sources, like capacitance in the cabling, the fact that some computers *intentionally* add a little smearing to their composite output because too-sharp edges can make the output look worse (by causing overshoots), etc, etc. And honestly I think I might prefer it to too-pin-sharp pixels if I'm just looking at text. Might be annoying with precise graphics work.
 
Was there any business graphics software for microcomputers in 1980? I tried looking, but everything I can find arrived after IBM had settled on the PC.

The design drawings provided two planned computer configurations.
The Business System had 32K, cassette, BASIC*, and an 80x24 display. Disk drives and extra memory were planned options.
The Personal System had no RAM on the planar board, no BASIC, and a cassette. 16K of RAM was supplied with the color TV adapter, which had planned maximum resolutions of 40x16 (text) and 280x192 (graphics). Lower resolutions were detailed on page 6, with indications of the amount of RAM needed for each video mode, which suggests that the rest of the video RAM would be used by the system. I wonder what type of high-resolution game could fit into about 2K. All the options for the Business System could be added.

* Okay, it reads "extra ROS of 24K", so it's possible that something else might have filled those chips.

Indeed. From those early specs and the buzz around IBM, it's pretty clear that the PC was meant to be a way for them to enter the home computer market. The people coming up with these concept drawings were really thinking about ways to reduce the cost of the machine to make it really sell. For whatever reason, management seems to have dialed that back a bit, to focus more on what they were good at. I thought I heard there was talk inside IBM that they did not think the PC would be successful. So the PC that was released, despite the low-end configurations on offer, was still priced above what the home market was paying at the time ... and it still sold lots of units.

I'd hate to imagine them deciding to really cut costs and really limit the machine as their /first/ machine. It probably would have sucked and not sold. So you do have to give IBM's rigidity some credit: they kept their focus on what they were good at, despite the flak the machine took for being too expensive for a home computer. Because we all know what a cost-reduced "real" home computer from IBM looked like... and what happened to that... (Again, the design of the PCjr wasn't exactly a disaster, but it came in as a "me-too" in a sea of home computers, so it just wasn't good enough for the time, and wasn't what IBM was good at.) If their first big splash into the market, the PC, had been closer to the PCjr, that would have been disastrous, and likely killed the whole thing.

Anyway, it was an "expensive home computer" that succeeded beyond their expectations, and not in ways they could exactly stop, or really wanted.

CGA was what they wanted for this machine. It was a key offering to make it a success.
 
Are there statistics on what percentage of 5150s shipped with a CGA vs. an MDA?
The lack of a CGA monitor at release seems like it would be relevant there.
 
If you set a typical composite monitor side by side with a TTL interface *mono* monitor (obviously shadow masks completely break any comparison to mono monitors), even one that runs at NTSC scan rates (like the internal monitors you'd find in a lot of terminal-shaped computers and portables) in my experience the composite monitor usually looks just a *little bit* softer around the edges, which I think you can mostly just chalk up to "analog-ness", or sometimes "cheap-ness"... but in this context when I said "blurry" I was using it inexactly to describe the fact that the characters are going to be lower resolution compared to MDA.

But yes, theoretically speaking at least there's no inherent resolution limit with composite monitors as a *protocol*. The softness you generally see comes from a lot of sources, like capacitance in the cabling, the fact that some computers *intentionally* add a little smearing to their composite output because too-sharp edges can make the output look worse (by causing overshoots), etc, etc. And honestly I think I might prefer it to too-pin-sharp pixels if I'm just looking at text. Might be annoying with precise graphics work.

I managed to repair and get a CGA card running last week. It's still plugged into an NTSC display at the moment.

@cjs noted it should approach the Apple output on NTSC in mono, and it does, even on the 5" LCD composite screen fed from the CGA output.

[attached photo: 20250401_081140.jpg]

If you allow for the digitisation issues, and the fact that a Samsung Ultra can't take a detailed photo without screwing it up a few times over with invasive AI, then it's still pretty good, and to my eye it looks better - so @cjs isn't wrong there. About the same as the Apple; maybe even slightly sharper, in my opinion.
 
Here's a shot of my homebrew video dingus displaying a 512x384 interlaced image on my old Tandy VM-something (VM-4?) green mono composite monitor:

[attached photo]


It's not as sharp as VGA, but it's pretty darn close. If IBM had supported an interlaced mode with CGA, there really wouldn't have been a need for the MDA monitor/card; 30 Hz interlace flickers like crazy on a color monitor, but on long-persistence green it's perfectly tolerable.

(In terms of tech specs, this is 32 lines of an 8x12 font, 64 columns, using a 12 MHz pixel clock; the dingus will do 80 columns fine with a faster crystal.)
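
As a quick check on those specs (assuming standard NTSC-ish timing of ~15.7 kHz lines and 60 Hz fields; the rates are my assumption, not a stated spec of the dingus):

```c
/* Sanity check of the dingus's claimed timings under assumed NTSC-ish rates. */
#include <stdio.h>

int main(void) {
    double pixel_clock = 12e6;         /* 12 MHz dot clock */
    int    columns = 64, cell_w = 8;   /* 64 columns of an 8-wide font */
    int    rows    = 32, cell_h = 12;  /* 32 rows of a 12-high font */
    double line_time = 1.0 / 15700.0;  /* ~63.7 us per scan line */

    double active = (columns * cell_w) / pixel_clock;  /* ~42.7 us */
    printf("active width: %.1f of %.1f us (%.0f%% of the line)\n",
           active * 1e6, line_time * 1e6, 100.0 * active / line_time);
    printf("visible lines: %d per frame = %d per 60 Hz field (30 Hz frames)\n",
           rows * cell_h, rows * cell_h / 2);
    return 0;
}
```

The 512 active pixels occupy only about two-thirds of the scan line, which matches the healthy border visible in the shot, and the 384 lines split neatly into two 192-line fields.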
 
Did any business applications of the time leverage the 16 colors available in the CGA text mode?
@cj7hawk's post of Lotus 1-2-3 above doesn't really count as a "business application of the time," I think; Lotus 1-2-3 itself didn't come out until 1983 and in the original version the spreadsheet itself was still black and white. (If you go a little further forward in the video you can see close-ups that show the 8-dot-high character box, indicating that the display being used for the demo was CGA.)

That said, at some point heavy use of colour in text user interfaces became widespread; it would be interesting to see when this first started and which early programs used it. It no doubt started with games; the PC version of Rogue, released in 1983, used coloured text.

[attached screenshot: Rogue: The Adventure Game (Artificial Intelligence Design Systems, 1983), IBM PC]


Considering the graphics modes were pretty bad for gaming and the CGA dot pitch was/is largely bad for text, why did they create CGA?
Going back to this, the dot pitch was not bad for text at the time; it was more or less standard. The VT100, released in 1978 and still in heavy use for years after the PC came out, had an 8×10 character cell (including the spacing between lines and characters), making most characters 7 pixels wide by 8 pixels high. (And it used a standard 15.7 kHz monochrome display.)

Soon after came Hercules, with great text and mono graphics that were arguably BETTER than CGA... so what was going on with CGA??
So you're asking why the designers of CGA didn't anticipate, 2-3 years later, a product from another company?

Seems like an afterthought and a quick punt out the door.
No, it was a pretty decent implementation of microcomputer display technology at the time. It was MDA that was unusual; I can't think of anybody else in North America using a custom higher-resolution display like that on a consumer computer. (In Japan, of course, people started moving up to 8-colour 640×400 in 1981, but they were simply far ahead of North America during the early and mid '80s.)
 
I'm no expert, but this is my recollection. CGA was probably intended primarily for business use. I remember the advertising pictures of it showing colored pie charts. The text mode has some potential for business users to differentiate data as well. I don't think IBM really considered the game market very closely. They knew PCs were an important, evolving market and wanted to have a product.

I don't think they intended to compete directly with the Atari 800 or the TI-99/4A (released the same year), let alone the C64 (released after). Yes, I agree CGA should have had 16 colors on screen in graphics mode, but it was really not intended for gaming (and the PC had no useful audio for that either).
 
The VT100, released in 1978 and still in heavy use for years after the PC came out, had an 8×10 character cell (including the spacing between lines and characters), making most characters 7 pixels wide by 8 pixels high.

FWIW, 8x10 was pretty standard on TTL monitors that ran at NTSC frequencies (resulting in 240 lines out of the possible 262 for a 24-line screen), but most computers using composite limited themselves to ~200 lines or so because they needed to account for the television-level degree of overscan many composite monitors were tuned to. (And, also, there was always the possibility that the machine was actually being used on a real TV; IBM actually sold an RF modulator for the 5150, as crazy as that seems in the rear-view mirror.) Thus the reason most early-'80s "home" computers and even many business computers settled for 8x8. Even adding one more line, like the Tandy 1000 does by default, causes problems with a lot of monitors, pushing characters into the bezels unless the centering is *just* right.

When you build the computer/terminal into the monitor then, sure, you can tweak the sizing so you can get right up into the porches. They did that horizontally, too; for instance, the TRS-80 Model III has a pixel clock of only a hair over 10 MHz for its 64-column screen. That's going to go deep into the overscan on most normally adjusted composite monitors. The 12 MHz of my homebrew dingus (which is using the Model III's font in that shot) is needed to give a decent border.
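
To put rough numbers on why the slower clock digs into the overscan (the clock values are approximate, and 63.5 us is the nominal NTSC line period):

```c
/* How much of a ~63.5 us NTSC-rate scan line 512 active pixels occupy
 * at two approximate pixel clocks (Model III-ish vs. the dingus). */
#include <stdio.h>

int main(void) {
    double line_us  = 63.5;              /* nominal NTSC line period */
    double clocks[] = { 10.1e6, 12e6 };  /* approximate dot clocks */
    int i;
    for (i = 0; i < 2; ++i) {
        double active_us = 512 / clocks[i] * 1e6;  /* 64 cols x 8 px */
        printf("%4.1f MHz: %.1f us active (%.0f%% of the line)\n",
               clocks[i] / 1e6, active_us, 100.0 * active_us / line_us);
    }
    return 0;
}
```

Roughly 80% of the line at 10.1 MHz versus 67% at 12 MHz; that extra margin is the border.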
 
Are there statistics on what percentage of 5150s shipped with a CGA vs. an MDA?
The lack of a CGA monitor at release seems like it would be relevant there.
That was a moving target; in 1981, for example, the balance was likely very different than at the end of production. (For the first two years there was only a single IBM-branded monitor for the machine, and compatible color TTL screens were extraordinarily expensive.)

Do not underestimate the "IBM think" and full standards lock-in of the first few years. Businesses, at least initially, agreed that business machines shouldn't have graphics, which made it shocking that Hercules succeeded as a non-standard from a third party.
Past '85, the IBM lock-in and the aversion to graphics were mostly over.

It's worth noting that a not-insignificant number of PCs sold barebones - PSU, CPU, RAM, and motherboard - an option where you added your own components.
 
I can see where sprites might have been nice to have - remember that the VIC-20 didn't have them either - but the big thing that made CGA disappointing was the palette choices in 320x200.

If they had set up a handful of registers, so the four colours could be freely mapped from the 16-colour space, it would have dramatically increased flexibility and avoided the "look" distinct to CGA software. I suspect you'd also see a lot more demoscene-esque "bash the registers during the right part of a frame" techniques to get much closer to a true 16-colour APA mode.
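
For what it's worth, a limited form of that trick is already possible with the one colour-select register stock CGA does have (port 3D9h, where bit 5 picks between the two fixed palettes). A minimal sketch, assuming a real-mode DOS compiler with Turbo C-style inportb()/outportb():

```c
/* Minimal sketch of per-scanline palette bashing on stock CGA. Assumes a
 * real-mode DOS compiler providing Turbo C-style inportb()/outportb();
 * with freely mappable palette registers this would be far more powerful. */
#include <dos.h>

#define CGA_STATUS 0x3DA  /* bit 0: in horizontal/vertical blank; bit 3: vsync */
#define CGA_COLOR  0x3D9  /* colour-select: bit 5 picks the fixed 4-colour palette */

void palette_bash_one_frame(void)
{
    int line;
    while (!(inportb(CGA_STATUS) & 0x08)) ;      /* wait for vertical sync */
    for (line = 0; line < 200; ++line) {
        while (inportb(CGA_STATUS) & 0x01) ;     /* wait until blanking ends */
        while (!(inportb(CGA_STATUS) & 0x01)) ;  /* then wait for the next blank */
        /* flip between palette 0 (green/red/brown) and 1 (cyan/magenta/white) */
        outportb(CGA_COLOR, (line & 1) ? 0x20 : 0x00);
    }
}
```

Timing is tight on a 4.77 MHz 8088, which is part of why productions that push this sort of trick (8088 MPH being the famous one) are so celebrated.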
 
I can see where sprites might have been nice to have - remember that the VIC-20 didn't have them either - but the big thing that made CGA disappointing was the palette choices in 320x200.

If they had set up a handful of registers, so the four colours could be freely mapped from the 16-colour space, it would have dramatically increased flexibility and avoided the "look" distinct to CGA software. I suspect you'd also see a lot more demoscene-esque "bash the registers during the right part of a frame" techniques to get much closer to a true 16-colour APA mode.
The PCjr / Tandy mostly solved this: many early Jr-compatible programs were still 4-colour, but you could freely choose those colours.

IBM did sort of botch CGA by not having freely chosen palettes.

It's worth noting that CGA has enough extra memory for 196 full-colour palette changes (almost one for every row); such a capability would be similar to Atari systems, which could also change 4-colour palettes on each row.
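
A rough check on that "extra memory" figure (assuming the standard 16 KiB of CGA video RAM and the 2-bit-per-pixel 320x200 mode):

```c
/* Bytes left in CGA's 16 KiB after a 320x200 4-colour (2 bpp) frame --
 * the slack a per-row palette table would have to live in. */
#include <stdio.h>

int main(void) {
    int total = 16 * 1024;          /* 16384 bytes of video RAM */
    int frame = 320 * 200 * 2 / 8;  /* 16000 bytes at 2 bits per pixel */
    printf("spare: %d bytes\n", total - frame);  /* 384 bytes */
    return 0;
}
```

384 spare bytes works out to roughly two bytes per change at ~196 changes, consistent with the "almost one for every row" figure.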
 
If you set a typical composite monitor side by side with a TTL interface *mono* monitor (obviously shadow masks completely break any comparison to mono monitors), even one that runs at NTSC scan rates (like the internal monitors you'd find in a lot of terminal-shaped computers and portables) in my experience the composite monitor usually looks just a *little bit* softer around the edges, which I think you can mostly just chalk up to "analog-ness", or sometimes "cheap-ness"... but in this context when I said "blurry" I was using it inexactly to describe the fact that the characters are going to be lower resolution compared to MDA.
Well, if we set aside the "lower [vertical] resolution than MDA" thing (obviously 350 lines is going to produce a nicer 25-line text display, vertically at least, than 200 lines), I think there is no difference.

I don't see what difference a TTL interface (by which I am taking you also to mean separated sync signals) is going to make. Composite sync is both outside the display area and much lower frequency than the video signal itself. (A horizontal sync pulse is ~200 kHz, versus >10 MHz for an alternating on-off 640 pixel/line display, if I've got my math right.) And the monitor quantising the luminance input to "on/off" versus an analogue black/white range doesn't make any difference at all to the frequency that has to be handled by the generation system and the cable connected to the monitor. (In fact, it might make it worse if it's forcing you to use 0-5V TTL levels rather than 0-0.7 V video levels.)
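
Working that comparison out with typical numbers (assuming CGA's 14.318 MHz dot clock in 640-pixel mode and a nominal ~4.7 us hsync pulse; both are standard values I'm supplying, not figures from the posts above):

```c
/* Rough orders of magnitude for the sync-vs-pixel bandwidth comparison. */
#include <stdio.h>

int main(void) {
    double dot_clock = 14.318e6;  /* CGA 640-pixel-mode dot clock */
    double sync_s    = 4.7e-6;    /* nominal NTSC-rate hsync pulse width */

    /* alternating on/off pixels form a square wave at half the dot clock */
    printf("pixel fundamental: %.2f MHz\n", dot_clock / 2 / 1e6);  /* ~7.16 */
    /* a pulse of width t needs bandwidth only on the order of 1/t */
    printf("sync bandwidth:    ~%.0f kHz\n", 1.0 / sync_s / 1e3);  /* ~213 */
    return 0;
}
```

The pixel figure comes out nearer 7 MHz than 10 for CGA's clock, but the point stands either way: the sync content is a good thirty times slower than the video.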

We can have a look at what the horizontal bandwidth limitations look like on an MDA display by looking at the output from the MDA Video SOC-FPGA Project. This feeds out a 720×350 image stored at one bit per pixel (so it's not using the intensity input, just the video input, on the monitor); thus all pixels should theoretically be the same brightness. But have a look at some crops of the high-resolution photos of the output on both the original IBM 5151 monitor and an amber clone (a "GM-1230"):

[attached crops: crop-green-1688521681366884937.png, crop-amber-6270781681367035376.png]

Here you can clearly see the bandwidth limitations of the 5151: the off pixels do not go anywhere near full black when surrounded by adjacent on pixels, and the on pixels are perceptibly below standard brightness when surrounded by adjacent off pixels.

Interestingly, the clone monitor (presumably cheaper, but I am also guessing newer) shows this issue less. It does make me wonder if perhaps the issues with the 5151 might be related to phosphor behaviour, instead of, or as well as, bandwidth limitations.

We can compare this to other more standard (15.7 kHz) systems of the day by looking at an Apple IIe running in 80 column mode with the standard Apple monitor. From this blog entry I've grabbed a couple of screenshots (you'll want to click on these to blow them up to full resolution because I'm too lazy to do more cropping):

[attached screenshots: 1743473929971.png, 1743473974648.png]

The system is running at about 20% lower horizontal resolution (560 pixels instead of 720), and you can pretty clearly see the horizontal bandwidth limitations on characters such as the 'm' and the '0'. Here, too, a single on pixel isn't as bright as a sequence of adjacent on pixels (though the difference seems less to me) and off pixels do seem to go fully off but for a much shorter time (essentially, the on pixels creep into adjacent off pixels).

Unfortunately for this comparison, it's unclear what's causing the "on pixel extension" on the Apple IIe system, and whether that might be intentional, so it's hard to tell if this is really a bandwidth issue or not. (I suspect something else might be doing this.)

I will try at some point soon to get my workbench clean enough that I can haul out some old computers and try them on my old monitor, and maybe 'scope out the signal to see what the actual widths are between the on and off pixels.
 
I don't see what difference a TTL interface (by which I am taking you also to mean separated sync signals) is going to make.

No, I was specifically referring to the input being digitally off or on for the pixel data. I get what you're arguing here: if you have a clean single-level analog composite signal, it theoretically should be able to 'snap' between black level and fully on, on timescales roughly similar to the TTL switching threshold. But… I simply stated my vague impression that machines using a digital input in practice have slightly sharper edges, an impression largely based on memory, because I haven't had my hands on an MDA monitor for the better part of thirty years. I'm not going to get into some kind of a contest trying to prove it one way or the other, especially not with photos of other people's monitors, and especially since I have no way to control for variables like brightness or contrast settings, or the fact that any picture taken today involves a forty-year-old piece of equipment.

(And sure, there are undoubtedly TTL mono monitors, the 5151 might be one, that were bandwidth limited to something less than their pixel clock when new. I never actually owned a 5151 so I can’t say much about using one for extended periods. To be blunt, CGA seemed a lot more common in the wild.)

Again, what I remember is that the first PC our family had as a kid was equipped with a somewhat unusual video system combining a Princeton Graphics MAX-12 TTL monochrome monitor with a CGA card. The MAX-12 was one of the first digital monochrome monitors that supported *both* MDA and CGA scan rates, with 16 gray (amber) tones in CGA mode, and I have *never* seen CGA look as sharp on a composite mono monitor as on the MAX-12 or other digital monochrome CGA monitors (this "supports CGA" feature was not that uncommon for a period in the later '80s). Maybe it's just selection bias of some sort, but I've also owned some really nice composite monitors (like that NEC green screen you commonly see with Japanese computers), and while I love looking at composite green, they always feel "softer", in my memory at least, than digital.

FWIW, someone posted a thread a couple days ago about a visual artifact they noticed in their Commodore PET, which also has a TTL monitor; it boils down to being able to see on the screen the ~10-15 ns switching delay between the character shift register latching and the "invert" switch that reverses the characters when the high bit is set. That's the kind of defect in the source material I'd expect would probably be "smeared out" on most composite monitors. But, sure, I imagine you could probably find one able to resolve it if you looked hard enough; not going to bother debating it.
 