
AT to XT Keyboard Converter

Yes, I like lean software too... Wirth's Law is in full effect. But I like lean software on the principle that modern apps are written poorly in the presence of nearly unlimited memory. Nothing wrong with using all 640K, though, IMHO :p.

I imagine that if you're prepared to swap ports between input and output mode, you could do this with a 4-bit microcontroller.
 
I imagine that if you're prepared to swap ports between input and output mode, you could do this with a 4-bit microcontroller.

4 bit or 4 pin? The 6-pin PICs have 4 I/O pins (you need 2 pins for power), but there was something about those pins that made the job impossible. It's been too long since I did my design work to remember what it was. Besides, there wasn't a significant cost differential between the 6- and 8-pin PICs.

Of course, today, you'd throw a 120MHz ARM4 at the job and write the code in Javascript. :)
 
4-bit (i.e. 6 pin, like you said)...

I was born in the wrong era. I would've been a successful programmer for DOS, but not for Windows... Linux, maybe... and it doesn't help that everything I know about multitasking is self-taught. It wasn't covered in my ECE curriculum.
 
Finally have some time to work on this again. I also have a PCB design that I'm going to get made. Chuck: in your XTATKEY.asm code, I can see that for each possible scenario in your CLK interrupt, you take care of reading or sending all of the bits from/to the AT keyboard in a single invocation. Where do you clear the bit that prevents the interrupt from being triggered again when the clock transitions from high to low? Is this PIC12-specific behavior, where each interrupt is serviced once and an interrupt cannot be invoked from within an interrupt?
 
You've got it. The lower PICs are pretty primitive; there's one interrupt and no auto-saving of status--you've got to do that via a bit of mumbo-jumbo--remember that the "stack" only holds 8 return addresses and is not accessible by program code. You end the interrupt with a RETFIE instruction (after restoring things). While it's possible to nest interrupts, it's some pretty ugly code and, given the small stack used for both interrupts and calls, not really worth the trouble. Remember that there are only 256 8-bit memory addresses, broken into two banks of 128 (there's a bit in the status register to change banks).

If you're used to AVR code, it seems pretty brutal. If I had to do it over again today, I'd probably use an ATtiny MCU, but the PIC was cheap and adequate for the job. It's a little bit of entertainment seeing how much can be squeezed out of one.
 
You end the interrupt with a RETFIE instruction (after restoring things). While it's possible to nest interrupts, it's some pretty ugly code and, given the small stack used for both interrupts and calls, not really worth the trouble...

If you're used to AVR code, it seems pretty brutal. If I had to do it over again today, I'd probably use an ATtiny MCU, but the PIC was cheap and adequate for the job. It's a little bit of entertainment seeing how much can be squeezed out of one.

The MSP430 is also cheap if you keep requesting free samples at regularly-spaced intervals ;).

So, by nesting interrupts, the best you can do on a PIC (with some ugly stack manipulation, as you said) is invoke a second interrupt after the first is finished, i.e. akin to a lower-priority interrupt being invoked after a higher-priority one finishes?

EDIT: Apologies, I should actually read the datasheet next time; it's explained in the Interrupts section on page 61. How appropriate: http://ww1.microchip.com/downloads/en/devicedoc/41190c.pdf


Well, I'm attempting to translate the AT send-command-to-keyboard portion of your code to C, and while my code looks superficially similar to yours, I call my interrupt on a bit-by-bit basis. Looking at your code and comparing it to mine made me realize that my code wouldn't work unless I disabled interrupts within the interrupt. My code is divided into a 'host mode' (send to AT) and a 'not host mode' (receive from AT) using an if-statement, but I can still see that getting ugly fast :p. So I think I'll just keep the interrupts in 'send by bit' mode.
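
For reference, here's the bit-per-edge sequencing boiled down to a standalone C sketch (hypothetical names and a printf harness, not my actual MSP430 code): the host clocks out 8 data bits LSB-first, then odd parity, then the stop bit; the keyboard's ACK follows on the eleventh edge.

#include <stdint.h>
#include <stdio.h>

/* DATA level the host should drive at falling clock edge n (0-based)
   when sending 'byte' to an AT keyboard: edges 0..7 are the data bits,
   LSB first; edge 8 is odd parity; edge 9 is the stop bit (release
   DATA high).  Edge 10 is the keyboard's ACK; the host only samples. */
static int at_host_tx_level(uint8_t byte, int n)
{
    if (n < 8)
        return (byte >> n) & 1;
    if (n == 8) {                    /* odd parity over the data bits */
        int ones = 0;
        for (int i = 0; i < 8; i++)
            ones += (byte >> i) & 1;
        return !(ones & 1);
    }
    return 1;                        /* stop bit: release DATA        */
}

int main(void)
{
    for (int n = 0; n < 10; n++)     /* 0xED = the set-LEDs command   */
        printf("edge %2d: data=%d\n", n, at_host_tx_level(0xED, n));
    return 0;
}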

Also, in your interrupt, you have a particular comment that stands out:
; See if we're asking to send a byte. If we get a collision,
; (a byte coming in when we're trying to send), it's not fatal

You're referring to the fact that it's the keyboard's responsibility to yield to the host, correct? What about the case where the keyboard has already sent all 11 bits but has not yet released the clock? Unless you specifically wait for that condition (keyboard has released the clock) before re-enabling interrupts.
 
Yeah, nesting on the PIC would be pretty ugly.

When the PIC interrupt hits, all that's known is that the clock has transitioned from high to low. While we might have a character queued up to send to the keyboard at that point, there's always the possibility that the keyboard dropped the clock line in preparation to send a character. In that case, the PIC yields to the keyboard and accepts the character; the PIC will send its character later.
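
Expressed as a C sketch (hypothetical names; the real logic is the assembly in XTATKEY.asm), the decision at that first falling edge amounts to:

#include <stdbool.h>

enum isr_action { START_RECEIVE, START_SEND, STAY_IDLE };

/* Decision at the first falling clock edge: if DATA is already low,
   the keyboard has begun a start bit, so yield and receive; a queued
   byte (tx_pending) stays queued and goes out afterwards. */
static enum isr_action on_first_clock_edge(bool tx_pending, bool data_low)
{
    if (data_low)
        return START_RECEIVE;
    return tx_pending ? START_SEND : STAY_IDLE;
}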
 
I'm beginning to think that emulating the host on the MSP430 just isn't possible... In a last-ditch effort, I took the 'send AT command byte' ISR and made it into a polling routine that checks for the clock going low, then high. No matter what I do, I simply cannot get the keyboard to respond correctly (send 11 bits) when I send a particular command byte... the ISR (which now only contains the 'receive byte from AT keyboard' logic) always receives at most 10 bits... and furthermore, it appears that the keyboard is trying to send a resend code back (0xFE) but stalling before the stop bit.

Additionally, I can verify that there is a full clock pulse between sending the stop bit (and releasing the data line) and the keyboard driving the data line low (so instead of 11 clock pulses, the send-byte-to-keyboard routine registers 12 clock pulses before the keyboard drives data low)... it's as if the keyboard is refusing to send an ACK bit at all but is still sending the rest of its response to the host!

EDIT: Some progress. I had to change the clock speed of the MSP430 from 1.1 MHz to approximately 2.3 MHz... now all bits are being sent and received properly. There literally wasn't enough time to service the interrupt when the MSP430 served as both AT host emulator and keycode converter.

However, I'm still consistently getting 0xFE (resend command request) in response from the keyboard :/... close but no cigar...

EDIT2: One of my constants was wrong... I'm pointing out my mistakes only as a warning to other people to watch their code carefully.

The offending line was the following:
//P1OUT |= ((P1Out_buffer & 0x01) << (PORT1_DATA - 1));

PORT1_DATA is a constant that aliases the value 0x10, or BIT4 in the MSP430 headers... P1OUT is the register holding the output values of the I/O pins... P1Out_buffer holds the command byte to send (with parity).

The above line shifts the data by 0x10 - 1 = 15 places (the PORT1_DATA mask minus one), when I intended to shift it by 4, which is the pin number behind PORT1_DATA.

The correct constant I needed to use was PORT1_DATA_PIN, which equals 4:
P1OUT |= ((P1Out_buffer & 0x01) << PORT1_DATA_PIN);

And now I can communicate with the keyboard.
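
Reduced to a standalone demo (the two constants are from my headers; the rest of this snippet is just for illustration):

#include <stdio.h>

#define PORT1_DATA      0x10   /* BIT4: the *mask* for P1.4         */
#define PORT1_DATA_PIN  4      /* the *pin number* of P1.4          */

int main(void)
{
    unsigned buffer = 0x01;    /* LSB of the byte being clocked out */
    printf("wrong: 0x%04x\n", (buffer & 0x01) << (PORT1_DATA - 1)); /* shifts by 15 */
    printf("right: 0x%04x\n", (buffer & 0x01) << PORT1_DATA_PIN);   /* 0x0010 = BIT4 */
    return 0;
}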

Chuck: can you see why I'm not a C programmer by profession XD? I certainly know the language; I just make stupid mistakes... Mike Brutman has already seen the tricks I use when coding in DOS, and let's just say he thought I found rather creative ways to trash DOS.
 
This is my last question... no, seriously, it is! Can anyone else verify that the IBM PC and XT BIOSes (and perhaps some clones) reset the keyboard twice during POST?

When doing a soft reset, a debug variable I keep shows the PC host requesting two keyboard resets. My code therefore sends two resets to the AT keyboard without giving it enough time to process the first one. The keyboard sends 0xFE (resend) in return, but sends 0xAA soon after, as if it had received the command just fine. So it's a case where the code works, but not necessarily for the right reason. I may just leave it as is, since no one using the keyboard converter could tell the difference.

EDIT: I lied... THIS is my last question... how can the microcontroller reliably distinguish an AT host from an XT host (e.g. send a byte, and if there's a response back, assume an AT host)?

I have everything working, and I sent out for three prototype PCBs on the cheap. When I get them and test them out, I will upload my Gerber files and C source to this thread as an alternative implementation of an AT to XT keyboard converter. I would offer to sell completed PCBs, but unfortunately I have neither the time nor the money to make a large number of them. However, I can certainly point anyone interested who has an MSP430 Launchpad in the right direction :D.

The code (including the table) is now 1700+ bytes :D! It is in dire need of optimization by removing certain compiler intrinsics and debugging symbols, so that is my next task.

Edit: 1398 bytes of code and 218 bytes of data in the current Release configuration (I'm using Code Composer Studio as my IDE, which uses Eclipse as its backend). Note that 132 of those bytes are counted in both code and data: the AT to XT translation table is declared as a constant char array 132 bytes long...

So I got 300 bytes back... not bad. Still not hand-optimized assembly quality though :p!
 
The XT clone I have here (on loan) does not RESET twice, but it powers the unit up twice, which is interesting.

Sorry I have not been following this thread for a few days, but I just finished AVR C code that implements AT to parallel/serial/XT conversion. It offers the following features:

  • Supports 83-key and extended XT keyboards
  • ASCII output via RS232
  • ASCII output via parallel, with configurable strobe sense and length
  • Configure parameters from keyboard (Ctrl-ALT-BS). A full list is at: http://www.go4retro.com/products/ps2-encoder/
  • AT and XT protocols handled via state machines
  • C code
  • GPL license
  • Configuration saved in EEPROM
  • Ctrl/Alt/Delete to enable RESET output supported

The code is at: http://www.go4retro.com/downloads/PS2Encoder/PS2Encoder_v0.7.3.zip

The code runs on the PS2Encoder, which I do sell, but it will run on any ATMEGA8/88/168/328 and can be quickly ported to the M16/M32 and others. It requires 2 INTs and 2 timers.

For posterity, here is some additional XT KB info I don't think I've seen elsewhere (but maybe it is out there and I just did not understand it).

On an enhanced (read: more than 83 keys) XT keyboard, there are some rules for the scancodes associated with END, HOME, INSERT, DELETE, PGDN, PGUP, and the cursor keys:

* If Num Lock is on, each of these keys is preceded by an extended shift (0xe0, 0x12).
* If the key is still being held down and another key is depressed, the extended shift is removed (0xe0, 0x92), even if the new key is also one of the extended keys noted (see the sketch below).
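
A minimal sketch of the first rule (a hypothetical helper, not from my converter code); it only decides whether the fake-shift make bytes get emitted:

#include <stdint.h>
#include <stddef.h>

/* Prefix bytes, if any, to emit before an extended key's make code.
   Returns the number of bytes written to out (at most 2).  The
   complementary rule, emitting 0xe0, 0x92 when a second key goes down
   while one of these keys is held, is not shown. */
static size_t extended_key_prefix(int num_lock_on, int other_key_down,
                                  uint8_t *out)
{
    if (num_lock_on && !other_key_down) {
        out[0] = 0xe0;          /* extended shift make: 0xe0, 0x12 */
        out[1] = 0x12;
        return 2;
    }
    return 0;
}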

Also, of general note:

* This IBM Model M puts .767ms between scancodes. It does not look like it is needed, but FWIW.
* The clock cycle is .117ms: 80.375uS high, 36.625uS low, except for the first low hold time, which is 100uS.
* The start bit is set 7.5uS after CLK goes low the first time.
* Data bit transitions happen 25uS after the high-to-low transition. (I am assuming that in the start-bit case the code is easy (bring high), so 7.5uS is just a few instructions; in the latter case, the extra time is needed to determine the data line's state after the high-to-low transition.)
* After the last CLK fall, the low is held for 147uS. Data is kept at the same level for 106uS. When attached to this PC clone, the data line is dragged low by the PC right after the CLK line goes low for the last time, and then the PC releases it (and then the KB brings it low after the 106uS mark).
* The .767ms is made up of the 147uS of CLK holdoff + 620uS of delay before sending another character.
* The Model M startup is very different, since it is autosensing. I've saved a trace and am looking at what it is doing.
* On a clone XT KB I have here, after the reset condition is found and the CLK line goes high, the KB waits 3ms before sending data. (I assume it uses that time to "reset" the KB.)
* The clone KB waits 2.5ms between chars.
* The clone KB doesn't bother setting DATA low until CLK goes high. Then, 3ms later, it sends reset (0xaa).

More minutiae for those who care.

My emulation was working fine, but I was leaving DATA high at idle, so I went in tonight and rewrote the code to set it low at idle.

I added the 3ms after-RESET delay, changed the CLK periods to match the Model M, and added the .6mS inter-character delay. None appear to be needed, but adding them probably makes things more robust. I did not add the 106/147uS wait between character finish and IDLE, and I performed all of my DATA transitions on CLK rise, instead of at the midpoint.

Jim
 
The most difficult approach is to parse the set 2 scancode sequences and work out which keys they correspond to. In this scenario, the fake shifts E012 / E0F012 can be ignored completely, and you'll need to keep track of at least two more items of state: 'was the last code E0?' and 'is fake Ctrl pressed?' (if it is, scancode 77 is Pause; if it isn't, it's Num Lock).
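
For concreteness, a minimal C sketch of tracking just those items of state (all identifiers are hypothetical; Pause's make sequence is E1 14 77 E1 F0 14 F0 77, and a real converter would do more than print):

#include <stdint.h>
#include <stdbool.h>
#include <stdio.h>

static bool last_was_e0;     /* 'was the last code E0?'               */
static bool last_was_e1;     /* inside the E1-prefixed Pause sequence */
static bool fake_ctrl_down;  /* 'is fake Ctrl pressed?' (E1 14 seen)  */
static bool pause_latch;     /* how the last 0x77 make was classified */

static void feed(uint8_t code)
{
    static bool breaking;    /* saw F0: the next code is a release    */

    if (code == 0xE0) { last_was_e0 = true; return; }
    if (code == 0xE1) { last_was_e1 = true; return; }
    if (code == 0xF0) { breaking    = true; return; }

    if (last_was_e0 && code == 0x12) {        /* fake shift: ignore   */
        last_was_e0 = breaking = false;
        return;
    }
    if (last_was_e1 && code == 0x14) {        /* fake Ctrl make/break */
        fake_ctrl_down = !breaking;
    } else if (code == 0x77) {                /* Pause or Num Lock?   */
        bool is_pause = breaking ? pause_latch : fake_ctrl_down;
        if (!breaking)
            pause_latch = is_pause;
        printf("%s %s\n", is_pause ? "Pause" : "Num Lock",
               breaking ? "break" : "make");
    }
    last_was_e0 = last_was_e1 = breaking = false;
}

int main(void)
{
    const uint8_t pause[] = { 0xE1, 0x14, 0x77,
                              0xE1, 0xF0, 0x14, 0xF0, 0x77 };
    for (unsigned i = 0; i < sizeof pause; i++)
        feed(pause[i]);                  /* prints Pause make/break */
    return 0;
}
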
Hehe, I didn't know any better, and so I did just that. (I created a state machine to track all of this information.)

Jim
 
Good heavens, man--the PIC version takes up 469 (decimal) 13-bit program words--less than half of the 1K program memory--and 160 of that is the translation table. What on earth are you doing with all that code?

Hehe, then you'll hate my version, which I think is 7K

In my defense, it's that big because:

* I decided against lookup tables in favor of easier-to-modify case statements; tables would have saved a few K, in my opinion.
* The code does two conversions, AT to ASCII and AT to XT.
* I have included partial support for switches and a keyboard matrix, as the code will eventually include raw matrix to AT, raw matrix to XT, etc.

A hat tip to you, sir, for writing very compact code.

Jim
 
Hehe, I didn't know any better, and so I did just that. (I created a state machine to track all of this information.)

Jim

My loss of sanity was all for naught T_T... I should've just waited for your version. I did an MSP430 version (I never really jumped on the AVR bandwagon), and am waiting for some test PCBs to come in.

I also used an FSM.

I basically used the first approach JohnElliot described. I pass E0 and E1 through as-is from the AT side to the XT side and leave it to the host to interpret them, so that my circuit basically mimics a 101-key keyboard that only understands Scan Code Set 1 (which is legal, according to Microsoft's scancode.doc). Sorry John, but I STILL don't quite get your explanation, unless what you're implying is that Scan Code Set 2 on an AT keyboard was changed slightly for 101-key keyboards and beyond. I used scancode.doc and these tables as my reference:
http://www.computer-engineering.org/ps2keyboard/scancodes1.html
http://www.computer-engineering.org/ps2keyboard/scancodes2.html
Only PAUSE is handled; BREAK isn't given an entry in either table. Likewise, PRNT SCRN is handled but SYS REQ is not.

That being said, I found it's not actually necessary to test whether fake CTRL was pressed, and I added special logic to NUM LOCK to prevent the LED from going out of sync inside the PAUSE scancode. The only keys that act 'weird' in this regard are PRNT SCRN and PAUSE... well, E1 does nothing, but my IBM 5150, which is a 256k model, handles the remaining codes gracefully enough (NUM LOCK is NOT triggered on the host while the PAUSE scancode is being interpreted; perhaps IBM was looking ahead?). The duplicated keypad codes (i.e. the arrow keys) ALSO behave as expected regardless of the state of NUM LOCK, unless a piece of software (such as CHECKIT testing the keyboard as if it only had 83 keys) causes the duplicated keys to go out of sync and behave as if NUM LOCK were perpetually on.
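
The core of the pass-through approach, sketched standalone (the table here is a hypothetical three-entry excerpt, not my real 132-byte one, and printf stands in for my send-to-XT routine):

#include <stdint.h>
#include <stdio.h>

/* Tiny excerpt of a set-2 -> set-1 make-code table; the real table
   covers every code.  Entries shown: A, Esc, Num Lock. */
static const uint8_t set2_to_set1[0x80] = {
    [0x1C] = 0x1E,    /* A        */
    [0x76] = 0x01,    /* Esc      */
    [0x77] = 0x45,    /* Num Lock */
};

static void emit_xt(uint8_t b) { printf("%02X ", b); }

/* Feed one raw byte from the AT keyboard: E0/E1 pass through untouched,
   and F0 turns the following code into a set-1 break (bit 7 set). */
static void translate(uint8_t code)
{
    static int breaking;

    if (code == 0xE0 || code == 0xE1) { emit_xt(code); return; }
    if (code == 0xF0)                 { breaking = 1;  return; }
    emit_xt(set2_to_set1[code & 0x7F] | (breaking ? 0x80 : 0));
    breaking = 0;
}

int main(void)
{
    const uint8_t a_key[] = { 0x1C, 0xF0, 0x1C };  /* press and release A */
    for (unsigned i = 0; i < sizeof a_key; i++)
        translate(a_key[i]);
    printf("\n");                                  /* prints: 1E 9E       */
    return 0;
}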
 
So what's next? Someone want to do a USB-to-XT converter? You could probably do it on the cheap with an AT90USB162 MCU (I use them for USB-to-parallel interfaces; they have an internal voltage regulator and 5V-tolerant inputs, and they come with a USB flash bootloader installed, so no special programmer is needed).
 
Sorry John, but I STILL don't quite get your explanation, unless what you're implying is that Scan Code Set 2 on an AT keyboard was changed slightly for 101-key keyboards and beyond.

Which bit of the explanation are you having trouble with?
 
I thought the 90USB162 only supported device mode? To connect a USB KB to an XT, you'd need host mode or OTG, right?

I picked up a PIC32 that has host mode the other day, so I can implement the USB KB portion. It was the most economical option: the 90USB647 was $10.00 in 100s, and some ARMs were cheaper, but nothing beat this little MIPS-based PIC32 on price.
 
Jim, you caught me! The AT90USB647 and AT90USB1287 have OTG mode, which means that they'll act as hosts. The 162 unfortunately is doomed to be a slave forever.

Probably better is to use an ARM, such as an STM32F105 device--even cheaper than an AT90USB162, depending on your memory requirements. 5V-tolerant inputs, too.

I like the PIC32MX for big memory and use the '795 parts. But be careful to read the errata sheets! Some of the discovered "features" are rather unpleasant. After a "you can't get there from here" erratum, I'm back to looking at ARM3 and ARM4 MCUs. Traditionally, they've been miserly with onboard RAM, but that's improved quite a bit in the last couple of years.
 
Oh, no worries.

Yeah, I priced the STM32 (I have a Discovery dev kit here), but it was $4.50 in 100s, and this little PIC was $2.08 in 100s, so I thought I'd at least give it a go. I agree, though; I'm transitioning to ARM on many new projects, since the cost is right and the silicon offers a lot. The STM32 is my preference.

My main criterion is a sensible C compiler for the target. That's why I initially went AVR as opposed to PIC8. I might try PIC8 again, since some parts are just dirt cheap and offer a mix of specific features I want; lack of a good C compiler is a major turnoff, though. I wrote some PIC assembly for the Ubicom/Scenix parts a while back, and my ASM skills are just not efficient enough to make it worth my time. My day job is nothing related, so I need to use tools I can work fast in, and ASM is just not it.

Jim
 
I don't much care for the AVR assemblers either. You'd figure that since they were planning on a cross-assembler, it would be feature-rich--at least as good as, say, MASM.

No such luck, I guess.
 