
Fantasy CGA redesign

One point, though, about the 16K RAM limit: referencing a July 1980 issue of "80 Microcomputing", it looks like a set of 16K DRAM chips was retailing for about a hundred bucks in mid-1980. By the following year, still prior to the PC's official unveiling, that was down to about $45. That doesn't really seem cost-prohibitive in the grand scheme of things.

Very interesting. I wonder if IBM limited the CGA to 16K to avoid competing with other (later?) graphics cards rather than because of the RAM cost. Upping it to 32K would have required more than just the cost of the DRAM chips themselves though - there would have to be additional logic to control the extra chips.

Also, not all DRAM is equal - the CGA card used 2118-4 chips instead of the 4116 chips used on the PC system board. I'm not 100% sure but I think this is because the 2118s were faster - the CGA needed to be able to read or write a byte every 279ns but the corresponding figure for system memory is three times that (838ns). I have no idea how much difference that would have made to the price in 1980 though.
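As a quick sanity check on those figures, here's a sketch of the arithmetic, assuming the usual clock derivations on the PC (character clock = master/8 in 80-column text mode, CPU clock = master/3, one system-RAM access per 4 CPU cycles):

```python
MASTER_HZ = 14_318_180  # NTSC-derived master clock shared by the PC and CGA

# 80-column text: one character every 8 master clocks, and the CGA
# fetches 2 bytes (character code + attribute) per character period.
char_period_ns = 8 / MASTER_HZ * 1e9
byte_period_ns = char_period_ns / 2

# System RAM: one access every 4 CPU cycles, with CPU clock = master / 3.
system_period_ns = (4 * 3) / MASTER_HZ * 1e9

print(round(byte_period_ns))    # 279
print(round(char_period_ns))    # 559
print(round(system_period_ns))  # 838
```

So 279ns per byte on the CGA side versus 838ns on the system side, exactly the three-to-one ratio mentioned above.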
 
Also, not all DRAM is equal - the CGA card used 2118-4 chips instead of the 4116 chips used on the PC system board. I'm not 100% sure but I think this is because the 2118s were faster - the CGA needed to be able to read or write a byte every 279ns but the corresponding figure for system memory is three times that (838ns). I have no idea how much difference that would have made to the price in 1980 though.

A close look at a picture of an IBM CGA card on eBay showed a unit sporting Mostek 4516-12 chips, which are 16K DRAMs with a 120ns cycle time. Enough Googling found me the datasheet for an *M2118-4*, which is apparently a mil-spec version of the 2118-12 and has identical timing specs, so... I'm not sure what's up with Intel's labelling. In any case, comparing the datasheets to that of a plain-jane TMS4116, the cycle times for a given ns rating are about the same. The difference is that the chips used on the CGA card are single-supply 5V-only chips while 4116s used the three-voltage +12/+5/-5 setup. (The Mostek datasheet actually calls out "pinout compatibility with the 4164" as a feature.) I imagine one would have to dig a little to find out what those cost relative to 4116s, but speed isn't why they were used. (Undoubtedly the reason had to do with power consumption.)

As to the cost of supporting chips, I doubt that would amount to much. This is oversimplifying it a bit, but it seems to me that if one used a 16-bit-wide arrangement for the additional memory, all you'd need is a twice-as-wide shift register on the output side and an even/odd byte-to-word selector on the bus, which could be basically free. (Or... maybe you could insert a two-byte buffer to collate writes of successive bytes into word-size accesses. That *could* possibly help reduce snow somewhat, but I suppose you'd have to have some mechanism to determine whether two successive writes were consecutive, so... scratch that.) The memory access speeds for modes with double the resolution or double the colors would be the same as for the otherwise-corresponding mode on CGA.
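To put a rough number on that last point, here's a toy calculation (assuming the 559ns character period of 80-column text mode as the fixed per-access budget each DRAM chip has to meet): doubling the fetch width doubles the sustained fetch bandwidth without asking anything faster of the chips themselves.

```python
ACCESS_NS = 559  # one 80-column character period, the per-access budget

def bandwidth_mb(bus_bytes, access_ns=ACCESS_NS):
    """Sustained video fetch bandwidth in MB/s for a given bus width."""
    return bus_bytes / (access_ns * 1e-9) / 1e6

print(round(bandwidth_mb(2), 2))  # original CGA: 2 bytes per character period
print(round(bandwidth_mb(4), 2))  # 16-bit-wide arrangement: 4 bytes per period
```

The doubled figure is exactly the headroom a double-resolution or double-color mode would consume, which is why the chip timing works out the same.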
 
Cost for 2116/2117/2118/4116 etc. around 1980 wasn't predictable. The US, prompted by the SIA, was in a trade squabble with Japan over 16K parts, and getting parts in any quantity at all became more a matter of who you knew had parts. Prices surged terrifically. I recall that our Intel sales guy asked if we'd be happy with 8K DRAMs for our prototypes. 8K? I'd never heard of them, but anything was better than nothing.

It turns out that Intel wasn't having spectacular yields on their 16K parts, so they took those that tested fault-free in one half or the other and relabeled them as 2109-x, where if x was even, you pulled an address line high, or low if it was odd.

Desperate times--trucks were hijacked, parts cribs were burgled--a very interesting time. Mostly because of stupid trade actions.

Japanese DRAMs at the time really were much better than the US variety. I think it was Mostek who accused the Japanese of "quality dumping"--selling better parts in the US than US makers could afford to sell!
 
I recall that our Intel sales guy asked if we'd be happy with 8K DRAMs for our prototypes. 8K? I'd never heard of them, but anything was better than nothing...

When restoring a dynamic-RAM-board PET I actually encountered some of those magical 8K DRAMs, labelled as 4108s. Likewise I'd never heard of such a thing before seeing them with my own eyes.

I'm actually curious in that case whether *all* the 4108s were actually "half-bad" 4116s or if Commodore "needed" so many of them (to fulfil their requirement of having 8K in one bank so they could punch holes through the empty bank to prevent end-user upgrades) that at least some percentage might have been completely good 4116 dies.
 
Well, I have an S-100 board that I mostly populated with 2109s that tested in that particular application all-good and they ran as 2117s just fine. I haven't revisited them in 20 years, but maybe I should one of these years...
 
What about PCjr (Tandy 1000) compatibility?

What about PCjr (Tandy 1000) compatibility?

Since this topic is letting us go wild with our creativity anyway, here's a crazy idea:

How about adding support for PCjr graphics modes such as 320x200x16 and 640x200x4 to such a card? I know that the Video Gate Array was based on shared video memory, but it's mapped to the standard B8000 area. Putting 128KB of dedicated video memory on said fantasy/alternate-universe CGA card would be cheating, but what if this card had a gimmick where it would "sit between" the memory sockets for the first 128KB of on-board memory and the RAM chips? So this card would go into an 8-bit ISA slot and have empty memory sockets of the same type as a standard IBM PC's, able to take up to 128KB of regular IBM PC RAM. It would also have a ribbon cable attached to the card which would then connect to the RAM sockets on the motherboard.

The card would come without RAM when bought, keeping it cheap, and people installing such a card would be instructed to remove the first 128KB of memory from the motherboard, place the removed RAM chips in the sockets on the card, and then attach the ribbon cable coming out of the card into the vacated RAM sockets on the motherboard.

For 8086-based clones, such as the Olivetti M24 and AT&T 6300, a special version could be developed that would support a 16-bit memory bus, and would fit in the proprietary 16-bit expansion slots on those models.

And while we're designing such a CGA+ card anyway, we might as well throw in three-voice audio compatibility, which would install in a similar way: people would have to disconnect the wires from the internal speaker, connect them to input pins on the sound card, and then either connect the output of the card to the existing internal speaker, or use an external audio output on the back of such a card. That would pretty much complete after-market game-mode PCjr and Tandy 1000 compatibility.

Would this be too crazy an idea? And would this be "cheating", in the sense that such a card would have to be feasible with the same technology and hardware budget available to IBM back when they designed the original CGA card? Shipping the card without memory and letting people reuse part of the RAM on their motherboards would keep the card cheap enough to remain within that budget, right?

Or would this be too complicated to implement, or are there perhaps other technical hurdles that I've missed?
 
Of course, IBM's answer to those who wanted improved CGA video was to sell them a 64K EGA card and a 5153 CGA monitor. That way you would get snow-free text mode and full 16-color 320x200 and 640x200 graphics.
 
I know that the Video Gate Array was based on shared video memory, but it's mapped to the standard B8000 area. Putting 128KB of dedicated video memory on said fantasy/alternate-universe CGA card would be cheating, but what if this card had a gimmick where it would "sit between" the memory sockets for the first 128KB of on-board memory and the RAM chips? So this card would go into an 8-bit ISA slot and have empty memory sockets of the same type as a standard IBM PC's, able to take up to 128KB of regular IBM PC RAM. It would also have a ribbon cable attached to the card which would then connect to the RAM sockets on the motherboard.

That is a really interesting idea and definitely within the spirit of the game (assuming you can figure out a way to do it without the ULA). I suspect the main objection would be that the original CGA card worked just fine even with the low-end 5150s that came with just 16KB of system RAM, but obviously you would need more than that to use this redesigned card (and, if you had a 32KB floppy machine, for example, adding the CGA card would mean you could no longer use the floppy drives). So it might not have fulfilled the design requirements for that reason.

The ribbon cable might not be necessary - I can imagine just having sockets for the RAM chips on the CGA card that the RAM chips you take off the motherboard would fit in to, and any RAM not in use for video would be available as system RAM just as if it was on an expansion card.

You'd also have to make sure that the system RAM is suitable for use as video RAM - in particular the access times. CGA needs an access time of 559ns (1 character period in 80-column text mode) but system RAM needs an access time of 838ns (4 CPU cycles). I'm not sure if that's the only reason that the CGA card used different RAM chips than the (16-64KB) 5150. Refresh time might also be an issue - the CGA card uses the CRTC accesses for DRAM refresh and the system board uses DMA, so that complicates things too.
 
The ribbon cable might not be necessary - I can imagine just having sockets for the RAM chips on the CGA card that the RAM chips you take off the motherboard would fit in to, and any RAM not in use for video would be available as system RAM just as if it was on an expansion card.

Yeah, I thought about that too, but since the original PCjr actually uses the first 128KB of system RAM as shared video memory, at first I wasn't sure if it would be compatible without such a ribbon-cable hack, since add-in memory on ISA cards always gets added on top of existing system RAM (or isn't that necessarily true?).

However, the shared memory is always mapped to base address B8000 and up anyway, and, according to this archived Usenet post, the Tandy 1000 in fact places the shared video memory on top of whatever RAM upgrade that is added to it, while managing to maintain compatibility with PCjr graphics modes. So I guess you'd be right: a ribbon cable might indeed not be necessary. Simply having sockets on the card to accommodate 128KB of RAM chips taken out of the motherboard might suffice.

In that case, an optional feature could be implemented where people willing to buy additional RAM chips could leave the full 640KB in the motherboard, and add RAM chips to the card, which would then map the memory directly to B8000, without having to share any memory in the 640KB region. Of course at that point, we'd be potentially cheating again, with the card having its own dedicated memory, but at least people would have the choice. ;-)
 
add-in memory on ISA cards always gets added on top of existing system RAM (or isn't that necessarily true?).

It's not true - ISA cards can implement whatever addresses they decode, and system board RAM addresses correspond to particular RAM banks. So you could certainly unplug the lowest bank of RAM and plug in an ISA card which implements it. Memory cards of that era had DIP switches or jumpers to set which addresses the card should map to.
 
I suppose something that cuts to the heart of this thread would be the question: What is the goal? Is it to improve the CGA's competency as a gaming platform, or to provide better "business oriented" graphics? If it's the former the biggest things I can think of that might help without breaking the budget (or necessarily requiring more video RAM) would probably be:

1: Programmable raster interrupts. You could do this with a simple counter.

2: "Tile Mode" graphics. (IE, redefinable character sets, with the option of pointing the hardware at different character tables for different areas of the screen or allowing characters in sizes other than 8x8.) This would require clever programming of the CRTC and a few small hardware modifications to the pixel generation hardware but I don't think it would "cost" much.

3: An arbitrary RGBI palette register that can be reloaded on a per-region/per-scanline basis. (See item #1.)

4: Per-region or scanline video mode swapping. (Also see item #1.)

5: More/better "chunky" low-res modes. (Like an official 16 color 160x200 framebuffer mode, or an "official" semigraphic mode that's more easily mixed with text characters.)

Item #1 would allow for both snow prevention (only redraw the whole screen during the vertical blanking period without having to waste time polling the CRTC register) and would facilitate getting more colors onscreen at once via palette-swapping tricks. Apparently there were at least a few games that used palette swapping on CGA by polling the CRTC and then using very precise timing loops, but said tricks break easily. Interrupts would make them much easier.

Item #2 would have allowed the fast character-based techniques used by systems like the VIC-20 (and to some extent most home computers) to be used instead of, or in addition to, the standard interleaved framebuffer. Character graphics aren't as good as sprites and certainly have their limitations, but for a certain class of programs they work really well.

Item #3's already been discussed, but it is worth emphasizing that it should have been pretty cheap to include. One 16 "nibble" SRAM chip/buffer and a couple latches and you're good to go.
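The datapath for item #3 is small enough to model in a few lines. Here's a sketch of what that SRAM-plus-latches arrangement computes (the palette contents are arbitrary example values, and a real card would reload them during blanking):

```python
# Minimal model of item #3: a 4-entry palette RAM of 4-bit RGBI
# nibbles, indexed by the 2-bit pixel stream and reloadable per
# region or scanline. The entries below are arbitrary examples.

palette = [0b0000, 0b0001, 0b0100, 0b1110]

def map_scanline(pixels, palette):
    """Translate 2-bit pixel values to 4-bit RGBI via the palette RAM."""
    return [palette[p & 0b11] for p in pixels]

print(map_scanline([0, 1, 2, 3, 3, 0], palette))  # [0, 1, 4, 14, 14, 0]
```

Swap the four entries between scanlines (via item #1's interrupt) and you get different 4-color sub-palettes in different screen regions for the cost of one tiny SRAM.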

Item 4: I'm not sure how practical it would be to include this on a CRTC-based card; I'm trying to think of a CRTC-based system that *did* allow it. Think of how machines like the C64, Atari 800 series and Apple IIgs allow different character or graphics modes to be mixed on the same screen. The latter two machines included video ASICs which allowed this to happen without CPU involvement, but to do it with the VIC-II in the C64 you used the programmable raster interrupt. *IF* it were possible to rewrite the CRTC registers on the fly at regular intervals during the horizontal blanking period using a raster interrupt, then you could pull off some fun tricks like using a colorful low-resolution framebuffer mode for two-thirds of the display and a real text mode for the remainder, both saving RAM and possibly providing support for smooth playfield scrolling (by adjusting the starting point in RAM for the framebuffer or text data when writing the CRTC registers).

Item 5: Doable with some more fairly minor changes to the pixel generation hardware. The "semigraphics" idea may be moot if you added the support for user-defined character sets.

The unspoken #6 would be "sprites", but obviously that gets into la-la land unless we're willing to assume that IBM was interested enough in the project to cook up a custom chip, which they quite clearly weren't. (The Atari 800's ANTIC and CTIA chips existed back in 1979, so it certainly would have been possible for them to do it.) The CGA card is actually sort of notable for its sheer chip count even by 1981's standards.

If the goal is better "business" graphics then it all goes back to wanting more VRAM.
 
Item #1 would allow for both snow prevention (only redraw the whole screen during the vertical blanking period without having to waste time polling the CRTC register) and would facilitate getting more colors onscreen at once via palette-swapping tricks. Apparently there were at least a few games that used palette swapping on CGA by polling the CRTC and then using very precise timing loops, but said tricks break easily. Interrupts would make them much easier.

It is actually quite possible to do this with the PIT already on CGA (assuming that the CGA/CRTC clock and PIT clock are derived from a common timebase, and I don't know of any CGA implementations for which this isn't true). You can use the status register to figure out where you are on the screen and then program the PIT frequency for 1 frame - the CGA frame is 912*262 pixels, and each PIT tick is exactly 12 pixels, so you can program the PIT count to 19912 for 1 interrupt per frame. Subdivisions of that are also possible as 19912 = 2*2*2*19*131.
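The arithmetic above is easy to verify, using only the numbers already quoted (912×262 pixels per frame, 12 pixels per PIT tick):

```python
# Verify the PIT-per-frame figure: one CGA frame is 912x262 pixels
# including blanking, and one PIT tick is exactly 12 pixels.

frame_pixels = 912 * 262
ticks_per_frame = frame_pixels // 12
print(ticks_per_frame)  # 19912

# Factorize to see which even subdivisions (interrupts per frame)
# can be programmed as exact PIT counts.
def factorize(n):
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

print(factorize(ticks_per_frame))  # [2, 2, 2, 19, 131]
```

Any product of those factors divides the frame evenly, so 2, 4, 8, 19, 38, 131 etc. interrupts per frame all stay locked to the raster.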

This is in fact how the palette change for California Games worked, but there was a bug in the "am I on a genuine CGA" test routine which made it much more fragile than it really needed to be - the programmers forgot or didn't know that the default mode for PIT channel 0 is 3 (square wave generator) and that in this mode the PIT counts down twice as fast, so they weren't doing the test they thought they were (which was the wrong test anyway) - see http://www.reenigne.org/blog/geeky-video-timing-stuff/ for details.

The same trick can be used to switch between high/low resolution graphics modes, and even between graphics modes and 40-column text mode (with some CRTC reprogramming to get the right character row height). 80-column text mode uses a different CRTC clock, though, so that's a whole different ballgame.
 
Could support for smooth scrolling have been implemented smartly and cheaply in an alternative CGA design?

We can't do much better than the CGA already does in that respect while still being based around the 6845 CRTC. The 6845 does actually have facilities for hardware scrolling which (for reasons I've never quite understood) were very rarely used (I think Super Zaxxon used it; I don't know of any others). The 6845 only allows scrolling with a resolution of 1 character, though (so 8 scanlines vertically in text modes, 2 scanlines in graphics modes, 1 character width horizontally in text modes, 16 pixels horizontally in high-res graphics mode or 8 pixels horizontally in low-res graphics mode).

To get perfect smooth scrolling, you also need to be able to change the position of the sync pulses relative to the image data to 1-pixel resolution, horizontally and vertically. The VGA (and EGA?) have facilities for this but the 6845 doesn't. With some CRTC tweaks you can probably do smooth scrolling vertically with 1 pixel resolution, but 1 pixel horizontal scrolling is impossible as you can't change the CRTC input clock except between 1 text character width in 80-column mode and 1 text character width in 40-column mode.
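To make the character-granularity limitation concrete, here's a hypothetical helper (the function name and interface are made up; the R12/R13 split and 14-bit width are per the 6845's register layout) that computes the start address for a given scroll position:

```python
# Hypothetical helper for 6845 hardware scrolling: the start address
# (registers R12/R13) only moves in whole characters, so the finest
# steps are 1 character horizontally and 1 character row vertically
# (8 scanlines in text mode, 2 in graphics mode).

def crtc_start(rows, cols, width=80):
    """Start address split into R12 (high 6 bits) / R13 (low 8 bits)."""
    addr = (rows * width + cols) & 0x3FFF  # start address is 14 bits
    return addr >> 8, addr & 0xFF

# Scroll 3 character rows down and 5 characters right in 80-column text:
print(crtc_start(3, 5))  # (0, 245)
```

Anything finer than those steps would need the pixel-level sync adjustments described above, which the 6845 simply doesn't have.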
 
Here's an idea of what would have made the design of the original CGA card better: what if IBM had allowed the option of a video RAM upgrade? Ship the system with 16KB of RAM on the graphics card, but allow it to be upgraded later through the addition of RAM chips into empty sockets, thereby increasing the memory to, for instance, 32KB, which would then enable extras such as 16-color graphics modes?

Yes, I know, from a business standpoint it would be less attractive, since it would remove an incentive for customers to just upgrade to EGA or VGA later. But IBM already offered the flexibility for customers to upgrade their system RAM, so allowing the same flexibility w.r.t. graphics memory wouldn't be that far-fetched.
 
Here's an idea of what would have made the design of the original CGA card better: what if IBM had allowed the option of a video RAM upgrade?

That's a really nice idea! I think it would have been extremely easy to add such sockets to allow more pages of video data in existing modes (which is nice for games and animation). Allowing different video modes is also a matter of adding more pixel sequencers, which is not impossible but might have been just difficult enough to warrant doing it in a ULA instead of with discrete logic, yielding the EGA.
 
From the original:

"Make the palette fully adjustable in 640x200x2 and 320x200x4 modes (i.e. have 16 bits of palette registers instead of 6) so CGA games don't all have the same ugly green/red/yellow or cyan/magenta/white palettes. In the alternate universe where this change had been made, most CGA games would look dramatically better!"

You know, since I read this I haven't been able to stop thinking about it. There has to be an open port that could be captured and used to control a daughterboard of some kind that would allow the 4-color mode to be remapped to any of the 16 colors. I don't understand the hardware enough, but you could cheat and have an LPT adapter that you route the output signal through, with a programmable color shifter. You could set it to a screen mode and have a power circuit process the CGA signal. The tricky part is that you would have to use a digital controller to modify an analog signal - without modern chips you couldn't digitize the input signal. Building the analog notch filters wouldn't be that hard, or some type of voltage multiplier might work. Hmmmm... LPT is fast enough because you wouldn't change the colors on the fly. Somewhat cheating, but it would have been possible to implement back in the day, does not require modification of the CGA board, and could be easily reproduced and provided for others to share.

e-CGA adapter :)
 
You know, since I read this I haven't been able to stop thinking about it. There has to be an open port that could be captured and used to control a daughterboard of some kind that would allow the 4-color mode to be remapped to any of the 16 colors. I don't understand the hardware enough, but you could cheat and have an LPT adapter that you route the output signal through, with a programmable color shifter.

It could be done, though the easiest way (except from an installation point of view) would probably be to cut some traces on the CGA card and add some chips to create the additional palette memory.

An external device could be made that modified the digital output, but trying to modify the analog composite output in this way wouldn't really work - the colour signals are ambiguous by that point, so you wouldn't be able to tell if a bit of cyan was actually cyan or an artifact of the placement of other pixels.
 