
New CGA controller IP brainstorming

what would be involved in developing an actual VGA card
The ao486 core has a relatively self-contained VGA component. It relies on dual-port block RAM, so it requires a Cyclone V A4 or larger to get a 256K frame buffer. The bus interface isn't ISA, but it looks like it can be adapted with some minimal glue. The BIOS is broken out separately in the project, but the SeaVGABIOS project might also be worth looking at.
 
Here again, where is the value add in creating yet another VGA (except for the challenge of actually doing it)? I've got stacks lying around and plenty of (working!) LCD panels with VGA inputs. Unlike my carefully hoarded shelf of working composite color monitors. A SuperCGA using DVI/HDMI output with emulated NTSC color artifacting fills a unique niche in the low-end PC space. The CGA is special in that it is well matched to the 8088/8086/80286 performance wise. An emulated NTSC color artifacting ~16 color graphics mode @ ~160x200 is supported in many games including MS flight simulator and I think Prince of Persia, to name a couple. Running flight simulator on an 8088 with a VGA is pretty painful. Not to mention the 16 color 160x100 hacked text mode that is close to my heart. 640x400 mono windows drivers already exist. And the CGA BIOS code is already baked into all PCs. Adding support for enhanced text modes and graphics modes can be done with a small, loadable shim. Check out the A2DVI project for emulated NTSC color artifacting: https://github.com/ThorstenBr/A2DVI-Firmware
 
I haven't bought one (yet), but doesn't the RGB2HDMI have an artifact color mode that can make a fair college try at replicating CGA composite artifacts from the RGBI output?
I have the chain of converters to finally output VGA or HDMI. It works but only from RGBI input. I have the A2CDVI for my Apple //c, which still needs some tweaks to the artifact color emulation, but being all code running on a Pi Pico, it's super easy to update. A great product.
 
I have the chain of converters to finally output VGA or HDMI. It works but only from RGBI input.

I was referring specifically to the Hoglet67 "RGB2HDMI" that uses a CPLD and a Raspberry Pi Zero as a scaler. Here's a blog entry that has a screenshot of it faking artifact colors from the RGBI input; it's certainly a clever idea and strictly speaking the information you'd need to do it should be present on the RGBI port, even the tricky cases where you're mixing "real" color with artifacts, but... again, I can't testify as to how well it actually works. Honestly I don't really care much about those demos because I can't run them anyway on my Tandy 1000s (I mean, maaaybe I could assemble something that could, I've heard rumors that 8088mph is *mostly* compatible with an IBM Convertible's CGA slice...), and I'm not sure I care that much about CGA composite games because... I have to be truly honest, I think they mostly look *pretty bad*, and I'm not huge into games anyway. (I mostly "hack on" my retro computers; I still enjoy playing the occasional game, sure, but I kind of feel like I already have the tee shirt from playing them 40 years ago.)

(I'm not sure why CGA's artifact colors look so terrible to me even compared to the Apple II's, even though it's a roughly equivalent palette. Sure, they're an improvement over the awful 4-color "real" palette of CGA, but somehow the Apple's version of them is less... I'll go ahead and say the juvenile word that popped into my head, barfy.)
 
I think the reason CGA artifact color games weren't as prevalent as they could have been was due to the CGA not really being designed for it. @Trixter and @VileR can probably give you a much better explanation as to why. They had to write quite a utility to get the colors to sync properly to the monitor. Something about the color burst being half as long as it should be. So a few hacks to make it work across the board. Nevertheless, there was some support for it commercially. Interestingly, many of the emulators that support NTSC colors take the difficulty out of getting a decent image, and of course the colors don't look "barfy" ;-) I don't really play games on my CGA machine either but do play with "what could have been": https://forum.vcfed.org/index.php?threads/more-fun-with-cga-composite-color.1251640/ (honestly, I struggled to get these images displayed on a real CRT)

Having a card that would take the challenge out of getting quality artifact colors would be interesting to me, for sure. But that would only be part of the interest in getting a CGA card that could output DVI/HDMI.

[Update] And I just ordered the RGBtoHDMI. I'll let you know how it goes...
 
The claim was that 1K colors were possible with artifact coloring out to composite. In theory the FPGA could have a sensing circuit which then generated the 1K colors over VGA/HDMI/DVI. I wonder how many bits of R G and B would be needed to support this.
 
Here’s a page showing the palettes achieved by the demo techniques:


Take this with the grain of salt that they’re going to be different on every monitor… that can display them at all because of the sync/colorburst problems.

Anyway, a 4 bit per color DAC (that’s what the Amiga had) would give 4096 colors, numerically enough, but when you look at those palettes and see just how many of the colors are microscopically different shades of puke green and magenta you probably have to concede that you’re going to need 24 bit color to capture the sheer majesty of it.
 
Technically the CGA outputs four bits per chroma cycle. But due to numerous factors, including filtering of the signal, the interaction of the color generating circuitry in text mode, and the sliding window NTSC monitors use to generate colors, you can do a lot better than the simplistic 16 colors at 160 pixel horizontal resolution. How much better is the debate.
 
Technically the CGA outputs four bits per chroma cycle. But due to numerous factors, including filtering of the signal, the interaction of the color generating circuitry in text mode, and the sliding window NTSC monitors use to generate colors, you can do a lot better than the simplistic 16 colors at 160 pixel horizontal resolution. How much better is the debate.

Yep. In the page I linked to above, refer to the section "solid artifact colors" for a description of how most of the old-tyme games that used composite color did it on the PC. The most straightforward version of it forces the colorburst on in the 640x200 monochrome graphics mode and *only* uses the monochrome pixel information to trick the NTSC color detection circuitry to think said pixels are also color phase information. This is exactly what the Apple II did in graphics mode; IE, there's no separate color circuitry in that machine, it just whacks a colorburst on in front of its 280x192(*) monochrome graphics mode. Both the Apple and the PC have a pixel clock that's a multiple of the 3.579545 MHz colorburst frequency, so by setting a monochrome pattern you can get a predictable "artifact color" for every group of pixels that align with the colorburst frequency on the screen; 2 pixels in the case of the low-res Apple II, four pixels in the case of CGA. This is why you'll see the Apple II's *color* resolution often quoted as "140x192" in 6 (* I'll explain this below) colors, and CGA Composite's color resolution is said to be "160x200" in 16; in CGA's case the 640 pixel monochrome mode has 4 pixels for every turn of the NTSC color wheel (IE, the pixel clock is 4x the colorburst reference), there's 16 possible bit patterns in 4 pixels, and 160 blocks total, so boom, 160 pixels with 16 colors. Of course these quotes are actually kind of a lie, because whether you get the block of pixels interpreted as a color can depend on the state of neighboring pixel blocks, so your true resolution can vary a lot. (And impose some arduous limits on *where* you can switch colors and what colors you can have bordering each other.)

(* More about the Apple: the asterisk is there because the machine has a special trick: you might have noticed that 280 divided by 40, the number of character/byte cells across the screen, is 7, not 8. The Apple II uses that 8th bit to control whether the pixels in a given byte cell are offset by half a pixel width, IE, the Apple kind of actually has a 560x192 resolution with the limitation that all pixels are double-width and each group of seven are bound together. The offset doubles the number of colors to 8 instead of the 4 you'd get if it didn't have this shift ability, but there are two duplicates so it's really six... and again, those six have to come in 4 pixel groups and there's goofy interactions at the borders... fun stuff. The Apple IIe added a "double-high resolution" mode that actually uses the full 14 MHz pixel clock for the pixel width, so it has the same "16 colors" as CGA composite if you treat it as "140x192"... but because it retains 7 pixel wide byte groups instead of 8 the memory mapping for complex pictures can be even more brain-melting.

... Of course this is all for the benefit of the OP, @resman is well aware of all this stuff...)
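To make the CGA grouping arithmetic above concrete, here's a minimal Python sketch (purely illustrative, not a model of real NTSC decoding) of how a 640-pixel monochrome line divides into 160 four-pixel groups, with each 4-bit pattern nominally selecting one of the 16 artifact "colors":

```python
# Sketch: 640 monochrome CGA pixels -> 160 four-bit artifact groups.
# The pixel clock is 4x the 3.579545 MHz colorburst, so every 4
# hi-res pixels span one turn of the NTSC color wheel. This only
# models the idealized grouping, not what a monitor actually decodes.

def artifact_groups(pixels):
    """pixels: list of 640 0/1 values -> 160 four-bit pattern indices."""
    assert len(pixels) % 4 == 0
    groups = []
    for i in range(0, len(pixels), 4):
        a, b, c, d = pixels[i:i + 4]
        # Pack 4 pixels into one nibble; each of the 16 possible
        # nibbles is one nominal artifact color for that group.
        groups.append(a << 3 | b << 2 | c << 1 | d)
    return groups

line = [1, 0, 1, 0] * 160           # repeating 0b1010 pattern
print(len(artifact_groups(line)))   # 160 groups per scanline
print(artifact_groups(line)[0])     # every group is 0b1010 = 10
```

As the post notes, the real picture is messier: neighboring groups interact, so this clean 160x16 accounting is only the starting point.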

Anyway, that page I linked does a pretty good job explaining how it's possible to take advantage of various techniques to "hack the gibson" and squeeze both alternative color palettes and "more" colors out of CGA. The nub of it is that because the CGA card has hardware for generating "Direct Colors" (see here again) in addition to a pixel clock suited for generating artifacts, you can stack the pulses that come out of the color generation hardware together with pixel patterns to trick the poor monitor into displaying more colors. I suspect the reason these palettes are so dominated by greens and magentas is because that's what the "direct colors" on CGA are dominated by; there are relatively few paths that pull the color phase to some other parts of the color wheel, while there's a *ton* of those puke greens to play with.

The thing that's ultimately important to remember is that all a composite monitor sees is a lumpy sine wave, and its ability to distinguish "pixels" (IE, areas with distinct edges and luma values) from the color information is limited. (Trying to understand the circuits that do it gives me a big headache.) The techniques that add up to that supposed "1K colors" on CGA rely on the fact that the various parts of the physical card (IE, the pixel generation chain, the output paths that convert the RGBI signals to luma values(*) on the composite wire with a crude resistor DAC, the direct color circuitry that imposes its own digital pulses on top of the luma values via another resistor, etc) don't operate perfectly synchronously with each other, and mixing them all together ends up moving around the "blip" that the NTSC decoder detects as a color change.

(* this is another significant difference between CGA and the Apple II. The Apple II doesn't have "luma", the pixels under the artifact color are always just black or white, not gray. So not only can you move "around" the color wheel, you can get different brightnesses of what would otherwise be the same color.)

So...

The claim was that 1K colors were possible with artifact coloring out to composite. In theory the FPGA could have a sensing circuit which then generated the 1K colors over VGA/HDMI/DVI. I wonder how many bits of R G and B would be needed to support this.

I guess the TL;DR is if you were building a card specifically to *fake* these palettes you probably *would* need a decently high-res DAC to capture every possible green and pink you can get out of this. That is not to say that all those greens and pinks are actually artistically/practically useful. The fun part to ponder here is, if you *were* going to program an FPGA to emulate this, *how* you'd go about it. The "full" method would be to: A: build a state machine that perfectly replicates all the analog delays and voltage amplitudes of the old CGA's composite output and B: feed that into a second state machine that emulates the NTSC decoder circuitry inside a vintage TV. In the real world I imagine you'd be looking for shortcuts.
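As a toy illustration of that "full" method, here's a minimal Python sketch that treats a line of monochrome pixels as a crude composite signal (luma only, no separate chroma, which is exactly why patterned luma "artifacts" into color) and then recovers chroma by quadrature demodulation at the subcarrier, roughly the way an NTSC set does. All constants and the missing filtering are idealized assumptions, not measurements of real CGA hardware:

```python
import math

# NTSC color subcarrier and the CGA hi-res pixel clock (4x subcarrier).
SUBCARRIER = 3.579545e6
PIXEL_CLOCK = 4 * SUBCARRIER

def decode_chroma(pixels):
    """Average I/Q chroma recovered from a 'composite' pixel line."""
    i_sum = q_sum = 0.0
    for n, s in enumerate(pixels):
        t = n / PIXEL_CLOCK
        # Multiply the signal by the subcarrier's cos/sin references;
        # a pattern repeating at the subcarrier rate leaves a DC term.
        i_sum += s * math.cos(2 * math.pi * SUBCARRIER * t)
        q_sum += s * math.sin(2 * math.pi * SUBCARRIER * t)
    return i_sum / len(pixels), q_sum / len(pixels)

solid_white = [1] * 640          # no energy at the subcarrier
patterned = [1, 0, 0, 0] * 160   # repeats at the subcarrier rate

i0, q0 = decode_chroma(solid_white)
i1, q1 = decode_chroma(patterned)
print(abs(i0) < 1e-6 and abs(q0) < 1e-6)   # True: no artifact color
print(i1 > 0.2)                            # True: strong chroma
```

A real emulation would add the analog delays, luma DAC levels, and a proper band-pass/comb filter, but the demodulation step is the core of "state machine B".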
 
One possible FPGA architecture:
- ISA bus controller asynchronous to everything else
- Video mode specific state machine which reads the video RAM and builds the scan line
- Either a single DVI output stage or two of them - one for RGBI equivalent and the other for composite equivalent which supports artifact colors
- DVI output stage. I believe 640x480 @60 Hz is supported by all displays, so maybe this mode is enough for this design.

For the artifact color decoder it would look at the character byte plus the foreground and background colors - so 512/1K of possible colors.
- If 9 bits of RGB to DVI was enough then a single block RAM could be used for this translation.
- FPGA would probably fetch multiple bytes/characters at a time, so one translation block RAM for each one.

A PCB with the FPGA, configuration ROM, some buffers, an oscillator, and one or two digital video interface connectors would probably do it.
 
DVI output stage. I believe 640x480 @60 Hz is supported by all displays, so maybe this mode is enough for this design

Suggestion if you go DVI: Fix the output resolution at 1920x1080, and scale the active CGA area to a horizontally centered 1280x1000. (IE, in the highest res 640x200 mode each pixel will be 2x5.) This will give the pixels almost exactly the same aspect ratio they have on a real CRT. This is a scaling preset that the GBScontrol firmware for those cheap arcade scaler boards implements and it looks *really good*. You could offer this both “solid” or with scanline emulation, either way you get to have integer pixel scaling from the original.
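A quick check of the arithmetic behind that preset, just to make the border sizes explicit:

```python
# 640x200 active area, 2x horizontal and 5x vertical integer scaling,
# centered in a fixed 1920x1080 DVI frame.

src_w, src_h = 640, 200
active_w, active_h = src_w * 2, src_h * 5
print(active_w, active_h)         # 1280 1000
print((1920 - active_w) // 2)     # 320-pixel left/right borders
print((1080 - active_h) // 2)     # 40-line top/bottom borders
```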
 
Artifact color question(s):

Is there any math to the quick phase changes and the resulting composite artifact color or is it all done by experiment?
Have all values already been calculated by someone? (Is there a table of 512/1024 entries that have as an address (a combination of the character byte and the foreground and background colors) and an output which is the resulting artifact color?)
I'm curious how to populate 512-1024 values in a translation table...

What is the "window" of consecutive pixels that should be grouped to decide on an artifact color - which bits to feed to the table?

I see from the @VileR 512 colors example that a character like 0xB1, which repeats a 01010101 pattern of a certain color on a black background, will result in a block of a certain artifact color.

Was this resulting artifact color block the smallest block achievable? What if the pattern was 0101_1010? Would there be two artifact colors? What if it was 01_10_01_10? Would this be four artifact color changes?

I'm curious what the limit is - what is the smallest phase shift that produces an artifact color?

I guess this is just academic as there are only a limited number of text values in the character ROM which have alternating 1's and 0's.
 
Suggestion if you go DVI: Fix the output resolution at 1920x1080,
I'm not sure an inexpensive FPGA can reach that speed - a 148.5 MHz pixel clock / 1.485 GHz bit clock is a bit fast. 640x480 would be 25.175 MHz and 251.75 MHz, which is a lot easier to implement. (Or something in between.)
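For reference, the bit clocks quoted above follow directly from DVI's TMDS link serializing 10 bits per pixel per channel:

```python
# TMDS sends each 8-bit pixel component as a 10-bit symbol, so the
# serial bit clock is 10x the pixel clock.

def tmds_bit_clock_mhz(pixel_clock_mhz):
    return pixel_clock_mhz * 10

print(tmds_bit_clock_mhz(148.5))              # 1485.0 MHz (1920x1080@60)
print(round(tmds_bit_clock_mhz(25.175), 2))   # 251.75 MHz (640x480@60)
```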
 
Is there any math to the quick phase changes and the resulting composite artifact color or is it all done by experiment?
Have all values already been calculated by someone? (Is there a table of 512/1024 entries that have as an address (a combination of the character byte and the foreground and background colors) and an output which is the resulting artifact color?)
I'm curious how to populate 512-1024 values in a translation table...

What is the "window" of consecutive pixels that should be grouped to decide on an artifact color - which bits to feed to the table?

My old 1024 colors article (see post #48) goes into the very rudimentary basics of how artifact colors are generated in 'standard' modes, before it goes into the tweaked text mode/more-color hacks.

Some important things to remember: there are 160 NTSC color cycles within an active 320- or 640-pixel CGA scanline, and a composite waveform at the NTSC color frequency will generate a "solid" artifact color. So any repeating pattern of four hi-res pixels (or two low-res pixels) will do that. That's half a character width in 80-column text mode, or one quarter of a character in 40 columns.
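A small check of that "solid color" condition, as a Python sketch: a pattern that repeats every 4 hi-res pixels presents the same sample value at each of the 4 carrier phases, so the decoded hue is constant along the line, while a longer repeat mixes values at each phase:

```python
# Collect the distinct (carrier phase, sample value) pairs the NTSC
# decoder would see across one 640-pixel line. With 4 hi-res pixels
# per color cycle, the carrier phase is just n mod 4.

def phase_sample_pairs(pattern, length=640):
    """Distinct (phase, sample) pairs seen across the line."""
    line = (pattern * (length // len(pattern)))[:length]
    return {(n % 4, s) for n, s in enumerate(line)}

solid = phase_sample_pairs([1, 0, 0, 1])                 # 4-pixel repeat
print(len(solid))    # 4: one value per phase -> one solid artifact color

varying = phase_sample_pairs([1, 0, 0, 1, 0, 1, 1, 0])   # 8-pixel repeat
print(len(varying))  # 8: both values at every phase -> hue varies
```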

However, artifact color emulation can't be done right by just rendering blocks of solid colors from a lookup table. A few early attempts did something along those lines (see e.g. very old AppleWin or DOSBox circa 20 years ago), and it looks completely off. The reason lies in how NTSC is encoded and decoded: there's no perfect separation between luma and chroma (brightness and hue if you prefer), and the latter varies smoothly across the color cycle. Basically every transition between colors will be somewhat artifacted - among other things, that's why you get color "fringing" on text, even when it's white on a black background and contains no neat repeating patterns.

In other words, the trick of lining up characters, dot patterns, etc. may be useful for the graphic artist (for solid blocks of color), but not as a basis for simulating the actual rendering... it's just one specific 'high-level' phenomenon resulting from what's really going on at the signal level. After all, the NTSC decoder doesn't (and can't) know anything about repeated pixel patterns, let alone character bitmaps. For a passable emulation, you'd really have to forget about these 'logical' units and consider the physical signal.

So what's the optimal way to go about it? The best approach I've seen is @reenigne's: it *is* based on a table, which IIRC has 1024 entries - each representing one possible color transition (between the 16 'direct' CGA colors) at one of the 4 possible phases within the color carrier cycle (640/160), or 16*16*4 values. The actual values themselves were sampled from the CGA multiplexer's output, so they basically allow you to reconstruct the raw composite signal, and decode that (using a filter) much like a true NTSC device.
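The index arithmetic for a table shaped that way can be sketched in a few lines of Python; the entry count matches the description above (one entry per pair of adjacent direct colors at each of the 4 carrier phases), while the actual entry contents would have to come from sampled hardware data, as described:

```python
# One table entry per (previous direct color, next direct color,
# carrier phase): 16 * 16 * 4 = 1024 entries.

NUM_COLORS, NUM_PHASES = 16, 4

def transition_index(prev_color, next_color, phase):
    """Flat index into the 1024-entry color-transition table."""
    return (prev_color * NUM_COLORS + next_color) * NUM_PHASES + phase

print(NUM_COLORS * NUM_COLORS * NUM_PHASES)   # 1024 entries
print(transition_index(0, 0, 0))              # 0: first slot
print(transition_index(15, 15, 3))            # 1023: last slot
```

On an FPGA this maps naturally onto a single block RAM addressed by two 4-bit color codes plus a 2-bit phase counter.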

There should be a few versions of this code around, e.g. in MartyPC, variants of 86box and DOSBox (I could try to find the specific ones if needed), and reenigne's own CGAArt. I'm guessing this approach should be somewhat adaptable to FPGA hardware - at least I hope so!

I see from the @VileR 512 colors example that a character like 0xB1, which repeats a 01010101 pattern of a certain color on a black background, will result in a block of a certain artifact color.

Was this resulting artifact color block the smallest block achievable? What if the pattern was 0101_1010? Would there be two artifact colors? What if it was 01_10_01_10? Would this be four artifact color changes?

I'm curious what the limit is - what is the smallest phase shift that produces an artifact color?

As mentioned above, you get *some* color artifacting at pretty much every transition. But for a solid 'block' of color, it's anything that matches the frequency of the NTSC color carrier - 160 cycles per line (four hi-res/80-column pixels). The patterns in my 512/1024 examples are twice as long, but that's only because they use text mode, so the limitation is the width of the character cell.

With the above mentioned low-level approach, you could get a faithful reproduction no matter the video mode - the 512/1024 color trick would simply 'fall out' of that.

I guess this is just academic as there are only a limited number of text values in the character ROM which have alternating 1's and 0's.

Yes, but if you choose to go one better and allow arbitrary/'soft' character sets, then nobody's stopping us from picking more "suitable" character bitmaps. ;)
 
Looks like the RGB2HDMI project does a decent job of converting RGBI to artifact color.
I don't have an rgb2hdmi, but if it's like the CGA and Apple screenshots at their gallery (https://github.com/hoglet67/RGBtoHDMI/wiki/Gallery-of-Screencaps), those still look quite blocky and not really close to what you get from the real thing, or nowadays even from a software emulator.

Not to be dismissive... that's still decent, considering that the RGBI signal is all it has to work with. Using just that to simulate a composite picture is a pretty neat hack, and when you're sitting between an RGB signal and the display, you probably can't do much better than this (at least not without introducing lag). But when you're the actual display adapter, you should have much more flexibility...
 