I also took a midi cable and used that to connect to the XT.
I didn't even have to chop one -- I actually have a bin of the plugs unsoldered and ready for action... though most of them are left over from when I was making TRS-80/similar cassette port cables.
I've ordered a PS/2 board online that has 4 pins on the other side (easier than soldering it myself)
Nice approach... I was sitting here questioning the PC/XT timings I'm seeing, so I'm turning my Teensy 3.0 into an el-cheapo graphing logic probe -- actually got it down to 0.46µs granularity for one channel, 0.63µs for two at once! There's data and clock -- but I was searching my scraps for a 5 pin female...
Completely forgetting this bag a friend sent me two years ago filled with 30 or so 5 pin female to PS/2 male adapters. NOT the most useful direction to convert in this day and age, but handy to chop one up, especially since I'm still at the breadboard stage.
When I press A on the PS/2 side, translate that into an A on the XT side, and when I press B, send a B. (may be too simplistic though)
I see a lot of code (which I don't understand (yet)).
Most of the code you are seeing deals with the fact that the communications protocols used are two-pin clocked serial, and the protocols are DIFFERENT. Lemme see if I can explain it a bit.
Both are clock timed -- what that means is that one side (the keyboard, aka 'client') generates a clock when data is to be passed. Each time the clock ticks off (by pulling the signal low to indicate data is changing, then high when the data is set) the data line sets one bit.
For the AT, that bit pattern is: (0 is low, 1 is high)
Bit 0 == start bit, always 0.
Bits 1..8 == data bits (8 bits, LSB first)
Bit 9 == parity bit. This is "odd" parity.
Bit 10 == stop bit, always 1
Bit 11 == acknowledge bit, pulled low (0)... SENT BY THE OTHER SIDE, and only when the computer is transmitting to the keyboard!
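A minimal sketch of checking that frame in C, assuming the usual 11-bit keyboard-to-host shape (start, 8 data bits LSB first, odd parity, stop; the acknowledge bit only applies when the computer transmits). The function name and the idea of collecting the clocked-in bits into one 16-bit word are mine, not from any particular implementation:

```c
#include <stdint.h>
#include <stdbool.h>

/* raw holds the bits in the order they were clocked in:
   bit 0 = start, bits 1..8 = data (LSB first),
   bit 9 = odd parity, bit 10 = stop.
   Returns true if framing and parity check out,
   leaving the data byte in *out. */
bool at_frame_decode(uint16_t raw, uint8_t *out)
{
    if ((raw & 1) != 0)     return false;  /* start bit must be 0 */
    if (!((raw >> 10) & 1)) return false;  /* stop bit must be 1  */

    uint8_t data = (raw >> 1) & 0xFF;      /* bits 1..8, LSB first */
    uint8_t ones = 0;
    for (uint8_t b = data; b; b >>= 1)
        ones += b & 1;

    /* odd parity: data bits plus parity bit must total an odd count */
    if (((ones + ((raw >> 9) & 1)) & 1) == 0)
        return false;

    *out = data;
    return true;
}
```

In a real converter the clock-line interrupt would shift each sampled data bit into `raw`, then call this once 11 bits are in.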
On the XT, that pattern is WAY simpler:
Bits 0..1 == start bits, always 1 (the data line is held high for two clock cycles)
Bits 2..9 == data bits (8 bits, LSB first)
After that == stop state, the data line drops back low.
Also when no data is being sent, the clock line is held high.
Also since it's held for two cycles, I'd call it two start bits, not one! That was confusing me a lot.
The XT's data protocol is dead simple. It is one-directional: the keyboard sends messages to the computer and just blindly hopes the computer is there. There is no response and no way for the computer to tell the keyboard anything. The data message itself is just the keyboard scancode in bits 0..6, with bit 7 indicating whether it's a keyup (1) or keydown (0).
So if you hit the A key and then released, the data would be:
0x1E, 0x9E
for down and release... the release just being the same value with the high bit set.
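That rule is a one-liner in C (the function name is mine):

```c
#include <stdint.h>

/* XT break (keyup) code is just the make code with the high bit set */
uint8_t xt_break(uint8_t make)
{
    return make | 0x80;
}
```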
The AT protocol on the other hand is bidirectional... this required some pretty hefty changes to how things work, some of them not for the better. Not only are the scancodes different, bit 7 is not used for up/down. Instead, if bit 7 is set it indicates the data is a "command code"... 0xFA is acknowledge a message, 0xFE is resend last message, 0xF0 means the next code is a release, and so forth.
So that same example of pressing A and releasing sends:
0x1C, 0xF0, 0x1C
The scancode is different, with the 0xF0 saying the next code is a keyup message, which then sends the same scancode over again.
So you have to decode the AT signalling; if it's just a scancode you translate and send using the XT signalling... but if it's a command code you have to handle it as appropriate... like trapping 0xF0 so that you "scancode | 0x80" the next scancode transmission.
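A rough sketch of that decode-and-translate step in C. The lookup table here is a stand-in with only the A key filled in (AT set 2 0x1C to XT 0x1E); a real converter needs the full table, and all the names are mine:

```c
#include <stdint.h>

/* Partial AT-set-2 to XT-set-1 translation table -- only A is
   filled in as an example; 0 means "no translation". */
static const uint8_t at2xt[128] = {
    [0x1C] = 0x1E,  /* A */
};

static uint8_t next_is_break = 0;

/* Feed each byte received from the AT keyboard; returns the XT
   byte to transmit, or 0 when the byte was a prefix/command that
   produces no output by itself. */
uint8_t translate_at_byte(uint8_t b)
{
    if (b == 0xF0) {        /* break prefix: flag the next code */
        next_is_break = 1;
        return 0;
    }
    if (b & 0x80)           /* other command codes (0xFA, 0xFE...) */
        return 0;           /* would be handled elsewhere */

    uint8_t xt = at2xt[b & 0x7F];
    if (next_is_break) {
        next_is_break = 0;
        xt |= 0x80;         /* XT keyup = scancode | 0x80 */
    }
    return xt;
}
```

Feeding it the A press/release sequence 0x1C, 0xF0, 0x1C from above yields 0x1E, nothing, 0x9E.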
Likewise on the AT side you have to send 0xFA back, or wait until the keyboard times out and says "oh well". Ideally one should actually be checking the parity and sending 0xFE if the parity doesn't match.
Checking that parity bit is fun too since it's odd parity. Personally I prefer to calculate it as each bit is received, whereas everyone else prefers to screw around with shifts after the fact...
Me, on initialization I'd set my parity variable to one (odd parity), then just "if (bit) parity++;" on each data bit, with parity verification checking "if ((parity & 1) != bit) {" to decide whether we need to send 0xFE (resend) or 0xFA (acknowledge) after the stop bit.
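That running-parity idea might look like this in C (names are mine; note the parentheses in the check, since in C "==" binds tighter than "&"):

```c
#include <stdint.h>

static uint8_t parity;

/* odd parity: seed the counter at 1 */
void parity_init(void)
{
    parity = 1;
}

/* bump the counter for every set data bit as it arrives */
void parity_feed(uint8_t bit)
{
    if (bit) parity++;
}

/* after 8 data bits, the low bit of the counter is what the
   received parity bit should be; returns 1 on mismatch
   (i.e. time to ask for a resend) */
uint8_t parity_bad(uint8_t parity_bit)
{
    return (parity & 1) != parity_bit;
}
```

For example, 0x1C has three set bits, so after feeding them the counter sits at 4, and a received parity bit of 0 passes while 1 fails.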
Which is why I'd have an output buffer as well; since the transmission clock is handled by the keyboard, we'd need our ISR to handle sending data. To that end we'd probably have a routine called by loop() to trigger pulling the clock line low when commands are waiting in the buffer.
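A minimal output buffer along those lines might look like this; all the names are mine, and the ISR/loop() split is just the division of labor described above:

```c
#include <stdint.h>

/* Ring buffer of command bytes waiting to go to the keyboard.
   loop() pushes; the clock ISR pops once a transmit slot is open. */
#define OUT_SIZE 16
static volatile uint8_t out_buf[OUT_SIZE];
static volatile uint8_t out_head, out_tail;

/* loop() polls this, and pulls the clock line low when it's true */
uint8_t out_pending(void)
{
    return out_head != out_tail;
}

/* called from loop(); returns 0 if the buffer is full */
uint8_t out_push(uint8_t b)
{
    uint8_t next = (out_head + 1) % OUT_SIZE;
    if (next == out_tail) return 0;
    out_buf[out_head] = b;
    out_head = next;
    return 1;
}

/* called from the ISR once the keyboard starts clocking */
uint8_t out_pop(void)
{
    uint8_t b = out_buf[out_tail];
    out_tail = (out_tail + 1) % OUT_SIZE;
    return b;
}
```

Single-producer/single-consumer like this needs no locking: loop() only moves the head, the ISR only moves the tail.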
Really it's the AT side that makes it so complex... though again I'm REALLY questioning the information I have about the XT side since all the charts I have --
like this one -- the math doesn't add up.
Seriously, look at it... how can (9 cycles of 95µs) + 66µs + 30µs + 120µs + what appears to be 25 extra µs of a down-clock at the end == 975µs? That's BS, not µs. My math comes to 1001µs!
The 5µs of rise/fall time -- I recognize why it's so high. Under ideal circumstances TTL devices have a rise/fall time of 2.5µs. Atmega rates the digital outs on most of their 8-bit chips at 3µs, and some devices can push it to 4 depending on things like extra resistors for safety and line capacitance. Like all good engineers, the folks who made the PC keyboard spec were smart and Mr. Scott'ed the numbers: figure out how much it needs, then double it.

Likewise it seems to be good practice to wait that full rise/fall time BEFORE you change the data value. So far NO implementation I've seen on the Arduino or PIC seems to be doing that, and that could bite you with some keyboards or some devices. If anything there should be a 5µs delay after pulling the clock low, and 5µs after setting the data, in addition to the peak/trough times. 95µs seems to be accurate for each pulse, and the initial PC/XT start pulse does NOT violate this near as I can tell. (I'll be posting up some actual output from my homebrew logic probe later.)
As such, MY interpretation of the XT timing is:
5µs for each rise/fall
20µs floor for each trough
70µs ceiling for each peak
So for all clocks except the first -- which is inverted and stays low -- that's:
5µs fall, 20µs low, 5µs rise, 65µs high
A bit off from the oddball 66 number that people seemed to be throwing in there.
So... my interpretation of the XT start "bit" is:
Pull clock low, wait 5µs
Pull data high, wait 90µs (rest of cycle)
Pull clock high, wait 95µs (entire cycle with rise time)
Which is a far cry from what I'm seeing elsewhere. The peak for the second clock pulse on the start "bit" is out of sync with the rest of the data stream. The fact that combined they add up to more than the 190µs two normal pulses would take also makes it suspect in my mind... and the data stream I'm seeing says that too.
... although I'm using a cheap AT/XT switchable as my exemplar as I don't have any model F's in my collection.
To that same end, and the charts agree with this visually but don't spell it out, I'm likely going with this for each of the data pulses:
Pull clock low, wait 5µs
Set data bit, wait 20µs
Pull clock high, wait 70µs (which includes the 5µs rise/fall time on data)
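Keeping the framing separate from the timing makes it easy to check. Here's a sketch that expands a scancode into the data-line level for each clock pulse under my two-start-bit interpretation; the function name is mine, and the actual 5/20/70µs delays and pin twiddling would wrap around this:

```c
#include <stdint.h>

/* Expand a scancode into the XT frame as read above: two start
   bits held high, then the 8 data bits LSB first. Each entry is
   the level the data line holds for one clock pulse (5µs fall,
   20µs low, 5µs rise, 65µs high per pulse in my interpretation). */
void xt_frame_bits(uint8_t scancode, uint8_t bits[10])
{
    bits[0] = 1;                            /* start bit, cycle 1 */
    bits[1] = 1;                            /* start bit, cycle 2 */
    for (int i = 0; i < 8; i++)
        bits[2 + i] = (scancode >> i) & 1;  /* data, LSB first */
}
```

A transmit routine would then walk the array, pulling the clock low, setting the data line to each level, and releasing the clock with the delays above.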
It also seems "off" that you hold the stop bit until the next cycle, and that the stop is low. Very interesting approach as it lets you monitor data OR clock for the start of the next cycle.
Though I imagine that the interface has a LOT of tolerance, since it likely does NOT take the full 95µs for either side to recognize the data... especially not if they can handle a 20µs low pulse on the clock line.
I think at one point, when talking about making something like ADT Pro but for PCs, someone (was it Chuck (g)?) mentioned they were able to push the AT keyboard interface WAY past spec for data transmission, so the timings are probably NOT as important as my brain wants them to be. I mean hell, it's TTL logic, you should be able to blast that sucker as fast as it takes to rise, fall, and trigger! That's the entire reason you'd have a clock line in the first place; otherwise single-line transmission would be adequate.
But I'm a real stickler for consistent timings and accurate specifications... both seem to be a lost art.