
VGA to MDA Adapter for IBM Monochrome Monitor?

Thanks for the information, krebizfan. How would I switch to the monochrome VGA mode? It seems unlikely that a newer computer would have such an option.

As for the actual connection from the computer (VGA) to the monitor (MDA), what kind of pinout would be necessary?

How new a computer is this being tried with? I think monochrome VGA drivers were available through Win 95 and OS/2 Warp; sorry, I never really looked at it on more recent systems. My monochrome IBM VGA monitor failed, which cut short my ability to research the topic. Some drivers should be able to do it. Windows Vista and later expect more capable video as a minimum, so they aren't good options.

You will need to create an adapter to convert the signals; the more similar the signals, the easier the converter should be. I have seen some VGA-to-black-and-white-TV adapters which could be a useful starting point. Actually building the adapter is a bit beyond my skill set.
 
Yes, I've seen quite a few adapters that convert MDA to VGA (for using a newer monitor with an older computer), but not the other direction.

Can you provide some additional info on such adapters? That would be a handy adapter to have...even though it is the opposite of what the OP is looking for... :-)

Thanks,

Wesley
 
I have an IBM 5151 monochrome monitor with a DE-9 connector for MDA. I would like to connect this monitor to a newer computer with a DE-15 connector for VGA.
...
The VGA output from the computer will display a full screen terminal window with white text and a black background.
...
Does this thinking make sense?

No, it doesn't. It would be a lot simpler and much less time-consuming to just find a vintage system to connect your vintage monitor to. You can run terminal programs on old PCs; you can even run telnet over ethernet (google "mTCP").
 
Hello,

I have an IBM 5151 monochrome monitor with a DE-9 connector for MDA. I would like to connect this monitor to a newer computer with a DE-15 connector for VGA.


Stone,

I am planning to display a full screen terminal window on the IBM monitor, so there will only be text.


As I understand it, VGA has an effective resolution of 720x400 in text mode and MDA has 720x350. Each character cell in VGA text mode is 9x16, and in MDA it is 9x14. On VGA you can load a custom font and tweak the character size, IIRC, so it may be possible to set a VGA card up in a format that is close enough to MDA to avoid requiring a full scan converter.
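To put rough numbers on that: 80 columns x 9 pixels = 720 across in both cases, while 25 rows x 16 scanlines = 400 for VGA text mode versus 25 rows x 14 scanlines = 350 for MDA.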

Some VGA cards can emulate MDA; what this actually outputs to the screen in terms of raster size, I am not sure of. It would appear that sending 720x350 to a VGA monitor would work, but I don't know. I do know that in Linux, with custom modelines and the right VGA card, you can get a VGA card to output directly to a TV via the SCART connector (in a graphics mode).
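For what it's worth, a back-of-envelope (and completely untested) X modeline for MDA-ish timing, assuming the card and its DAC will actually sync down to roughly 18.4 kHz horizontal / 50 Hz vertical (many won't), might look something like:

Modeline "720x350_mda" 16.257 720 738 846 882 350 354 358 369 +hsync -vsync

(16.257 MHz dot clock over 882 total pixels per line gives about 18.4 kHz; 18.4 kHz over 369 total lines gives about 50 Hz. The porch and sync-width numbers are guesses, not measured MDA values.)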

Either way you would still need some external components to do the analogue-to-digital conversion. The Amdek monitor in the link *may* support standards other than MDA. I used to own a "CGA" monitor that, through an adapter, could connect to a BBC Master. I don't know how it did this, but I would guess that the adapter exploited some additional functionality of the monitor.

If all you want is text, by far the easiest thing, as already stated, would be to have a PC with an MDA card connected to the monitor running a terminal program. You could have something small footprint like this:

http://www.ebay.com/itm/233-MHz-LEI...351044?hash=item488912b984:g:zI0AAOSwL7VWq-UQ

Not saying I would buy that, as it is very expensive for what it is, and I don't even know if it would work.

If you want graphics, the most practical method I can think of is to get a 486/Pentium system with a Hercules card. I seem to recall a Herc driver for Win95, so with this you can output a raster. You could then use something like VNC to connect to a desktop of your choice. I'm sure Linux has options too. Yes, it may be a bit slow, but the phosphor on that monitor is very slow anyway.

Andrew
 
Hi, I found an interesting article that talks about exactly what we need. It's from 1989, from a magazine called Elektor, and it describes an "easy to build" adapter, but I'm not so sure about that. Do you know of anyone who has built a circuit like this? I'm attaching the Google Drive link to the article. Kind regards

 
It’s certainly an interesting circuit, especially the sync separation portion. I might actually have a use for that…

But there are a couple of things to make clear here: first off, this circuit does no scan conversion. The test case is that they're using it to display output from a PAL BBC Micro on a TTL monitor; the difference in line rate between those two standards is about 15% (slower, for PAL) and they run at the same vertical frame rate (50 Hz), so this is far closer to within spec for the monitor than VGA is, whose 400-line text mode has a line rate roughly 70% higher (about 31.5 kHz against MDA's 18.4 kHz). I doubt many MDA monitors are going to lock onto that. And…

Point #2: the video conversion relies on a single-bit brightness cutoff for deciding whether a pixel is on or off. This circuit produces *no* shades of gray, so for anything but a DOS prompt it's likely to be essentially useless.
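Just to illustrate what that single-bit cutoff does, here's a trivial Python sketch (not anything from the article, and the threshold value is made up): every analog level gets compared against one threshold, so all the in-between grays collapse to full black or full white.

# single-bit brightness cutoff: anything above the threshold is "on"
THRESHOLD = 0.5   # assumed value, normalized 0.0..1.0

def to_ttl(level):
    return 1 if level > THRESHOLD else 0

print([to_ttl(v) for v in (0.0, 0.3, 0.6, 1.0)])   # prints [0, 0, 1, 1]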
 
I think it's not even talking about MDA monitors. 15 kHz mono TTL monitors were common-ish in the mid-'80s, meant to be used with non-PC platforms. This circuit is just separating composite video into video and separate syncs to work with one of those monitors.
 
I think it's not even talking about MDA monitors.

The bold subtitle of the article specifically says “Among the welcome side effects of the current invasion of IBM PCs and Compatibles are the drastic price cuts for high resolution 12 and 14 inch, TTL compatible monochrome monitors.”

It's worth noting that by the second half of the '80s, MDA monitors that could also emulate CGA weren't particularly uncommon, so you didn't necessarily have to take your chances on whether a given MDA-only monitor would happen to sync down to the lower line rate. But considering the machine in the article is a PAL BBC Micro, I'm not sure a CGA-frequency monitor would be any happier, given they're meant for NTSC(-ish) framing. (Definitely pick a monitor with vertical size adjustments.)

Maybe ‘generic’ 15 kHz TTL monitors were more common in Europe, I dunno, but in the US most computers with them had them in the form of bare chassis units installed internally.
 
"Grayscale" can be achieved with PWM.


Also, my info about the VGA Wonder earlier was incorrect; it can only do CGA and EGA on monochrome monitors.

Sorry to quote your old post,

I have a question: how exactly does this PWM thing drive a standard MDA screen to different shades? Wouldn't IC201 just trigger a single level regardless of input, or am I mistaken about its purpose in the 5151?

In a video about driving MDA from a VGA signal, the author says in the comments:

I am guessing your monitor is built like mine, there is a logic gate (Exclusive OR type) that will turn any analog signal into a digital (full off or full on) signal. To get AmberScale, you will probably need to do like I did, bypass this gate

Clearly ATI advertises universal monitor support for EGA Wonder 800+, no mods needed.
 
I have a question: how exactly does this PWM thing drive a standard MDA screen to different shades? Wouldn't IC201 just trigger a single level regardless of input, or am I mistaken about its purpose in the 5151?

The pixel clock in a standard MDA is around 16 MHz; toggle bits on and off at full speed across the 720 pixels in the active area of a line and you get 360 black and 360 white pixels. Take a few steps back from the screen and you'll see a shade of gray. Pair the pixels up and light zero, one, or two in each pair and you'll have three shades… and if you really go nuts and utilize the intensity line that MDA also has, you'll be able to fake quite a few shades of gray at a resolution half that of the pixel clock. (Remember, MDA natively supports three or four shades; why am I saying "three or four"? Apparently MDA monitors aren't entirely consistent about how they display "intense black". Some will show it as dark gray, like CGA monitors do, while others show both black and intense black as plain black.) So…

One method cards like those ATI boards can use to fake grayscale is to multiply the pixel clock relative to the mode they're displaying and dither. I.e., if you're faking a 640x350 16-color mode, you shove out 1280x350 dithered pixels and rely on the fact that the monitor isn't going to be able to fully resolve them, thus creating a grayscale blur. You can reasonably describe this as "PWM".
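A rough Python sketch of that spatial trick (purely illustrative; the three-level pattern table and the names are my own, not whatever ATI actually does in hardware): each source pixel's gray level is turned into a pair of TTL subpixels at twice the pixel clock, and the monitor's inability to fully resolve them does the blending.

# fake three gray levels with two full-speed subpixels per source pixel
PATTERNS = {0: (0, 0), 1: (1, 0), 2: (1, 1)}

def dither_scanline(gray_levels):
    out = []
    for g in gray_levels:           # g is 0, 1 or 2
        out.extend(PATTERNS[g])     # emit the subpixel pair
    return out

print(dither_scanline([0, 1, 2]))   # prints [0, 0, 1, 0, 1, 1]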

Some cards may also use a method where they change the intensity state for a given pixel between frames, relying on persistence of vision to average it? At the frame rate MDA runs at, I can imagine that would cause significant flicker, although the long phosphor persistence of the 5151 might help.
 
If I understand this correctly, the dithering comes from sending twice as many pixels per scanline. Two subpixels are sent, and the end result is a single real pixel on the screen with an artifact colour.

Also, in my understanding, this works by dividing a TTL signal period in two. So the concern about the logic trigger IC isn't relevant; the signal is still 1 or 0. It just switches mid-pixel, so the beam lights only part of the physical pixel's geometry rather than all of it, which can be worked into getting shades algorithmically.

The issue I have is that the oscilloscope signal looks like two sine waves with different peak-to-peak amplitudes on Intensity and Video. If I got the gist of your message, PWM isn't the goal and the screens are not driven in an analogue way; it just appears as PWM because the divided TTL becomes irregular, and the oscilloscope itself might be acting as a low-pass filter. The IC in the IBM 5151 (the reference design all MDA screens follow) will always trigger high from its input; an unmodified MDA screen cannot interpret information from an analogue signal.

Does this sound reasonable?

I'm trying to figure out how to get current RGB upscalers to be able to digest this. If the above is the correct line of thinking, only a software patch would be needed.
 
When ATI cards simulate 16 shades of gray on an MDA monitor, there is no dithering -- you still get the full 720x350 resolution. The darker shades do have some visible flicker, but with the 5151's long-persistence phosphor, you hardly notice it.

It's the same trick that monochrome LCDs in old laptops used to display multiple shades of gray. If you rapidly flash a pixel on and off, it will appear as a lighter shade of gray than a pixel that is fully on. Adjusting the amount of on time vs. off time adjusts how much lighter it is. That's what PWM (Pulse Width Modulation) is.
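A minimal Python sketch of that temporal PWM idea (illustrative only; the 4-frame cycle is an arbitrary choice, not what any particular card or LCD controller uses): over a repeating group of frames, a pixel is lit for some fraction of them, and the eye, helped here by the 5151's slow phosphor, averages that into a gray.

# temporal PWM: duty cycle over a repeating group of frames
FRAMES_PER_CYCLE = 4

def frame_states(duty):
    # duty = number of frames (out of 4) in which the pixel is lit
    return [1 if f < duty else 0 for f in range(FRAMES_PER_CYCLE)]

for duty in range(FRAMES_PER_CYCLE + 1):
    print(duty, frame_states(duty))   # 0 = black ... 4 = fully on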
 
So they just drop the TTL line from high to low before the end of a pixel period, and the sooner that happens, the dimmer the pixel?
 