
Arduino PS2 to XT converter

I have no clue on what the script is actually doing, I just grabbed the codes from this topic and put them together.
I would love to learn what does what and why, but for now I really don't know.
I have some basic BASIC knowledge, some PHP knowledge and some Lua knowledge, but that's about it.
 
I would love to learn what does what and why, but for now I really don't know.
Ah, well... a project like this is a very good place to start learning C and things like circuit interfacing since the Arduino makes it relatively simple.

You know, normally when I write a website in HTML/CSS/PHP I'll do a breakdown section by section explaining why I chose the tags I did, why/how the CSS works and is applied, and the logic of the underlying PHP code. I can probably do that with what you have in C to better explain what that is doing for you if that might help.

I know that with PHP and web technologies in general, people are able to get by blindly pasting together other people's code, but when you're down to 16 or 32K of flash for program space, 1 or 2K of RAM, and 16MHz or less on an 8-bit RISC processor, doing that can work against you VERY quickly.

For now, the "big" problem is how you are turning your _read into _write to the XT. You're doing read only on the AT side, and the AT really isn't meant for that -- but the bigger issue is that the AT returns a slew of values higher than 0x7F that are control codes. Your program is trying to look them up regardless and not acknowledging any control messages.

OH wait, you are trapping 0xF0... ok, that's sloppy.

Strangest part though is this:

Code:
    _write(0xAA);

That's sending a keyup for 0x2A, which on the XT is the left Shift key... The device that should be sent the 0xAA is the AT keyboard, and even that doesn't make sense as that's a reset. After you check parity and reply with the ack bit (yes, the host replies to the AT keyboard with the ack bit), the code should be sending 0xFA if everything went fine, or 0xFE if there was a parity error so we ask for the code again.

I'm out the door in a while, but when I get home tonight I'll take a stab at doing a quick rewrite with my nano here, and see what I come up with... digging through my bin I'm a bit shocked, I'm out of PS2 female plugs... got an overabundance of 5 pin males though since I'm a MIDI guy.

Maybe I'll just see if I can remove one from one of the dead mobo's I have lying around rather than ordering some.
 
I also took a midi cable and used that to connect to the XT.
I've ordered a PS/2 board online that had 4 pins on the other side (easier than soldering myself), I connected an old cd-rom audio cable to it and connected that to the Arduino.

Can't it be just as simple as:

When I press A on PS2 translate that into A on XT and when I press B send a B. (may be too simplistic though)
I see a lot of code (which I don't understand (yet)).
 
Ah, well... a project like this is a very good place to start learning C and things like circuit interfacing since the Arduino makes it relatively simple.

Hello and what an interesting topic.

I would like to throw a curved ball in to the mix.

a. my C programming is ultra rusty, and I simply do Not have time to learn it

b. I purchased recently a very early Sirius with monitor BUT no keyboard. As far as I understand it uses a very unusual keyboard and also floppy disk format.

I would really like to know if someone like yourself, or another programmer here can spend 1hr seeing if there is a way to make a XT keyboard work via this Arduino keyboard converter.

regards
David
 
I also took a midi cable and used that to connect to the XT.
I didn't even have to chop one -- I actually have a bin of the plugs unsoldered and ready for action... though most of them are left over from when I was making TRS-80 /similar cassette port cables.

I've ordered a PS/2 board online that had 4 pins on the other side (easier than soldering myself)
Nice approach... I was sitting here questioning the PC/XT timings I'm seeing so I'm turning my Teensy 3.0 into an el-cheapo graphing logic probe -- actually got it down to 0.46μs granularity for one channel, 0.63μs for two at once! There's data and clock -- but I was searching my scraps for a 5 pin female...

Completely forgot about this bag a friend sent me two years ago filled with 30 or so 5 pin female to PS/2 male adapters. NOT the most useful direction to convert in this day and age, but handy to chop one up, especially since I'm still at the breadboard stage.

When I press A on PS2 translate that into A on XT and when I press B send a B. (may be too simplistic though)
I see a lot of code (which I don't understand (yet)).

Most of the code you are seeing has to deal with the fact that the communications protocols used are two pin clocked serial, and the protocols are DIFFERENT. Lemme see if I can explain it a bit.

Both are clock timed -- what that means is that one side (the keyboard, aka 'client') generates a clock when data is to be passed. Each time the clock ticks off (by pulling the signal low to indicate data is changing, then high when the data is set) the data line sets one bit.

For the AT, that bit pattern is: (0 is low, 1 is high)
Bit 0 == start bit, always 0.
Bit 1..8 == data bits
Bit 9 == parity bit. This is "odd" parity.
Bit 10 == stop bit, always 1
Bit 11 == acknowledge bit, should be 1... SENT BY OTHER SIDE!

On the XT, that pattern is WAY simpler:

bit 0..1 == start bit, always 1, held high for two cycles
bit 2..9 == data bits
stop == the data line is pulled low (always 0) and held there until the next frame

When no data is being sent, the clock line is held high. Also, since the start is held for two cycles, I'd call it two start bits, not one! That was confusing me a lot.

The XT's data protocol is Dick simple. It is monodirectional in that the keyboard sends messages to the computer and just blindly hopes the computer is there. There is no response and no way for the computer to tell the keyboard anything. The data message itself is just the keyboard scancode on bits 0..6, with bit 7 indicating if it's a keyup (1) or key down (0).

So if you hit the A key and then released, the data would be:
0x1E, 0x9E

for down and release... the release just being the same value with the high bit set.
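
In code terms that's about as trivial as it sounds -- assuming some xt_write() routine of your own that bit-bangs one byte out the XT clock/data pins:

Code:
// bit 7 set == keyup, clear == keydown; bits 0..6 are the scancode
void xt_key(uint8_t scancode, bool released) {
  xt_write(released ? (scancode | 0x80) : scancode);
}

// pressing then releasing 'A':
//   xt_key(0x1E, false);   // sends 0x1E (make)
//   xt_key(0x1E, true);    // sends 0x9E (break)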

The AT protocol on the other hand is bidirectional... this required some pretty hefty changes to how things work... some of them not for the better. Not only are the scancodes different, bit 7 is not used for up/down. Instead if bit 7 is set it indicates the rest of the data is a "command code"... 0xFA is acknowledge a message, 0xFE is resend last message, 0xF0 is next press is a release, and so forth.

So that same example of pressing A and releasing sends:

0x1C, 0xF0, 0x1C

Scancode is different, the 0xF0 saying the next code is a keyup message, which then says the same scancode over again.

So you have to decode the AT signalling, if it's just a scancode you translate and send using the XT signalling... but if it's a command code you have to handle it as appropriate... like trapping 0xF0 so that you "scancode | 0x80" on the next scancode transmission.
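
A rough sketch of that decode step, where at2xt[] is an assumed AT-to-XT lookup table and xt_write() the assumed XT-side sender (only the command codes mentioned above are handled):

Code:
static bool next_is_release = false;

void handle_at_byte(uint8_t code) {
  if (code & 0x80) {                     // control / command codes
    switch (code) {
      case 0xF0: next_is_release = true; break;  // next scancode is a keyup
      case 0xFA: break;                          // keyboard acknowledged something
      case 0xFE: break;                          // keyboard wants a resend
      default:   break;                          // 0xAA self test, 0xEE echo, etc.
    }
    return;
  }
  uint8_t xt = at2xt[code];              // translate AT scancode to XT scancode
  if (next_is_release) { xt |= 0x80; next_is_release = false; }
  xt_write(xt);
}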

Likewise on the AT side you have to send 0xFA back, or wait until the keyboard times-out and says "oh well". Ideally one should be actually checking the parity and sending 0xFE if the parity doesn't match.

Checking that parity bit is fun too since it's odd parity. Personally I prefer to calculate that as each bit is received, whereas everyone else prefers to screw around with shifts after the fact...

Me, on the initialize I'd set my parity variable to one (odd parity), then just "if (bit) parity++;" on each data bit, with the parity verification checking "if ((parity & 1) == bit) {" to decide whether we send 0xFA or 0xFE after the stop bit.
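
As a rough sketch of that bit-at-a-time approach -- the pin name, how the ISR gets attached to the AT clock's falling edge, and what eventually happens to at_reply are all assumptions here, not gospel:

Code:
// Parity-as-you-go: start the counter at 1 (odd parity), bump it for every
// set data bit, then compare its low bit against the received parity bit.
volatile uint8_t at_bitcount = 0, at_data = 0, at_parity = 1, at_reply = 0;

void at_clock_isr() {                             // falling edge of AT clock
  uint8_t bit = digitalRead(AT_DATA_PIN) ? 1 : 0;
  at_bitcount++;
  if (at_bitcount == 1) return;                   // start bit, always 0
  if (at_bitcount <= 9) {                         // 8 data bits, LSB first
    at_data = (at_data >> 1) | (bit << 7);
    if (bit) at_parity++;
  } else if (at_bitcount == 10) {                 // parity bit
    at_reply = ((at_parity & 1) == bit) ? 0xFA : 0xFE;  // ack, or ask for a resend
  } else {                                        // bit 11: stop bit
    // a real version would stash at_data in a buffer here and queue at_reply
    at_bitcount = 0; at_data = 0; at_parity = 1;
  }
}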

Which is why I'd have an output buffer as well; since the transmission clock is handled by the keyboard, we'd need our ISR to handle sending data. To that end we'd probably have a routine called by loop() to trigger pulling the clock line low when commands are waiting in the buffer.
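
Something along these lines for the loop() side -- at_cmd_pending(), at_line_idle(), and the pin names are hypothetical placeholders, but the request-to-send dance (clock low for 100µs or more, data low, release the clock) is the part that matters:

Code:
void loop() {
  if (at_cmd_pending() && at_line_idle()) {
    // Host request-to-send: hold clock low >100us, pull data low, release the
    // clock -- the keyboard then generates the clock and our ISR shifts the
    // buffered command out bit by bit.
    pinMode(AT_CLOCK_PIN, OUTPUT);  digitalWrite(AT_CLOCK_PIN, LOW);
    delayMicroseconds(120);
    pinMode(AT_DATA_PIN, OUTPUT);   digitalWrite(AT_DATA_PIN, LOW);
    pinMode(AT_CLOCK_PIN, INPUT_PULLUP);    // release clock, keyboard takes over
  }
}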

Really it's the AT side that makes it so complex... though again I'm REALLY questioning the information I have about the XT side since with all the charts I have -- like this one -- the math doesn't add up.

Seriously, look at it... how can (9 cycles of 95μs) + 66μs + 30μs + 120μs + what appears to be 25 extra μs of a down-clock at the end == 975μs... that's BS, not μs. My math says that comes to 1001μs!

The 5μs of rise/fall time -- I recognize why it's so high. Under ideal circumstances TTL devices have a rise/fall time of 2.5μs. Atmel rates the digital outs on most of their 8 bit chips at 3μs, and some devices can push it to 4 depending on things like extra resistors for safety and line capacitance. Like all good engineers, the folks who made the PC keyboard spec were smart and Mr. Scott'ed the numbers: figure out how much it needs, then double it.

Likewise it seems to be good practice to wait that full rise/fall time BEFORE you change the data value. So far NO implementation I've seen on the Arduino or PIC seems to be doing that, and that could bite you with some keyboards or some devices. If anything there should be a 5μs delay after pulling the clock low, and 5μs after setting the data, in addition to the peak/trough times. 95μs seems to be accurate for each pulse, and the initial PC/XT start pulse does NOT violate this near as I can tell. (I'll be posting up some actual output from my homebrew logic probe later.)

As such, MY interpretation of the XT timing is:

5µs for each rise/fall
20µs deck for each trough
70µs ceiling for each peak

So for all clocks except the first -- which is inverted and stays low -- that's
5 fall, 20 low, 5 rise, 65 high

A bit off from the oddball 66 number that people seemed to be throwing in there.

So... my interpretation of the XT start "bit" is:

Pull clock low, wait 5µs
pull data high, wait 90µs (rest of cycle)
pull clock high, wait 95µs (entire cycle with rise time)

Which is a far cry from what I'm seeing elsewhere. The peak for the second clock pulse on the start "bit" is out of sync with the rest of the data stream. That, combined with the fact that the two add up to more than the 180µs two normal pulses would, makes it suspect in my mind... and the data stream I'm seeing says that too.

... although I'm using a cheap AT/XT switchable as my exemplar as I don't have any model F's in my collection.

To that same end -- and the charts agree with this visually but don't spell it out -- I'm likely going with this for each of the data pulses (quick sketch below):

pull clock low, wait 5µs
set data bit, wait 20µs
pull clock high, wait 70µs (which includes the 5µs rise/fall time on data)
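
In code, one data pulse under that interpretation would look something like this (pin names assumed, and delayMicroseconds() standing in for whatever timing mechanism ends up being used):

Code:
void xt_send_bit(uint8_t bit) {
  digitalWrite(XT_CLOCK_PIN, LOW);
  delayMicroseconds(5);                        // fall time
  digitalWrite(XT_DATA_PIN, bit ? HIGH : LOW);
  delayMicroseconds(20);                       // rest of the trough
  digitalWrite(XT_CLOCK_PIN, HIGH);
  delayMicroseconds(70);                       // peak, incl. the rise/fall on data
}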

It also seems "off" that you hold the stop bit until the next cycle, and that the stop is low. Very interesting approach as it lets you monitor data OR clock for the start of the next cycle.

Though I imagine that the interface has a LOT of tolerance since it likely does NOT take the full 95µs for either side to recognize the data... especially not if they can handle a 20µs low pulse on the clock line.

I think at one point when talking about making something like ADT Pro but for PC's, someone (was it Chuck (g)?) mentioned they were able to push the AT keyboard interface WAY past spec for data transmission, so the timings are probably NOT as important as my brain wants them to be. I mean hell, it's TTL logic, you should be able to blast that sucker as fast as it takes to rise, fall, and trigger! That's the entire reason you'd have a clock line in the first place, otherwise single line transmission would be adequate.

But I'm a real stickler for consistent timings and accurate specifications... both seem to be a lost art.
 
I would like to throw a curved ball in to the mix.
All balls are curved. I'm so slow when it comes to broken English, took me half a minute before I realized you meant a curve-ball.

I would really like to know if someone like yourself, or another programmer here can spend 1hr seeing if there is a way to make a XT keyboard work via this Arduino keyboard converter.
That would involve having the scancode specifications of the Sirius, as well as knowing what bit-wise protocol it is using.

I'd be willing to take a stab, but it's gonna be a lot more than an hour and I'd likely need someone to research the documentation for said device.

It would also likely be more useful to adapt an AT keyboard to it, as they are usually easier to find than XT ones at this point.

Besides, why settle for less than a Model M?

While I never had one, I'm at least aware of the Victor's quirks... the keyboard might be really challenging as I know its mapping was programmable -- you could literally assign any key any value you liked! I was never quite clear if that was done on the keyboard or on the machine itself. If it's anything like the chip inside the PC keyboard, if they gave it more than just a scratchpad worth of RAM it would have been very easy (and very efficient) to do keyboard side -- just like how the AT keyboard lets you choose between two different mappings. It would also have lessened the load on the host machine itself -- and given that the Victor/Sirius 1, much like the Tandy 2000 and DEC Rainbow, was in fact a better DOS machine than the PC on the spec and performance side, it wouldn't surprise me at all if they went that route.

Which again touches on something I'm surprised these converters are NOT doing: setting the AT keyboard to mapping 1 (PC/XT) by sending 0xF0 0x01 at it. Then these translation tables wouldn't even be necessary! Internally that's how the keyboards with the switches on them work, they lock the keyboard into mapping 1 and then use the older/simpler XT protocol since the physical wiring is the same... but still ALL PC/AT keyboards to my knowledge support mode 1 (mode 3 is the default). (Only mode 2 was dropped on later keyboards, and really, that hybrid of mode 1 and 3 mappings was a confusing mess nobody ever used!)

(The laugh being Linsux STILL forces AT keyboards into XT mapped behavior! That's why the numpad acts so funky/useless if you try to use it as arrows -- a habit I never broke myself of. I'm still useless with the arrow-pad... Most people chop off the number pad when shrinking keyboards; I would get rid of the arrow-pad area. I just don't use those keys!)
 
I do have some Samsung keyboards that have a spare hole for an XT/AT switch, though the switch itself is not in place.
The PCB also has holes and traces for the switch, but when I bridge them as if the switch were 'ON' it doesn't work.

Very interesting story about timings, I thought it was just a simple thing like for example an Amiga joystick, just switches that send stuff.
 
Very interesting story about timings, I thought it was just a simple thing like for example an Amiga joystick, just switches that send stuff.

Most joysticks use one wire for each button, in addition to ground... take the 9 pin atari interface that uses 6 wires -- one for each direction, one for the button, and ground... or the Sega expansion to this that recycles the two paddle lines for two more buttons.

Now, think on trying to apply that to a 89 to 104 key keyboard... you really think they can cram 105 separate wires into a keyboard cable?

INTERNALLY a keyboard in fact does this, though they use something called "matrixing" that creates rows and columns of connections and looks for shorts between them. This reduces the number of wires needed for a 101 key keyboard from 102 to only 14... but that's still more wires than you'd really want to put into a keyboard cable -- particularly over any distance as crosstalk can become an issue.

A lot of older microcomputers (pretty much all 8 bit ones) do in fact access said matrix directly since usually the computer was in the keyboard. Usually their internal cable is two, maybe three inches in length at the most.

The problem with directly accessing the keyboard switches is how do you detect when the value changes? Sure, you could hook every key or use a string of gates to connect them to an interrupt, but what if another interrupt is hogging things? If the system is too busy to check the keyboard right that second, you lose keypresses.

Worse, you need to monitor VERY closely and average the responses to avoid something called "bounce" -- where a newly connected switch will often lose contact then regain it. You have to average responses over time in order to eliminate that which means a VERY active polling of the device. That's why -- generally speaking -- most 8 bit home systems were cute toys, but unsuited to long term business data entry use.
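
For what it's worth, here's a toy illustration of the matrix scan and crude debounce idea -- NOT what a real keyboard's controller actually runs, and select_row()/read_columns() are made up for the example:

Code:
uint8_t debounced[8], last_raw[8];              // an 8x8 matrix for the example

void scan_matrix() {
  for (uint8_t row = 0; row < 8; row++) {
    select_row(row);                            // drive one row line active
    uint8_t raw = read_columns();               // sample the 8 column lines
    if (raw == last_raw[row]) debounced[row] = raw;   // two matching reads = settled
    last_raw[row] = raw;
  }
}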

To address this, "real" systems would give the keyboard its own separate computer whose only real task was to monitor the matrix and spit out the corresponding keystrokes in serial. Going serial you can get away with as little as two wires, though more is usually a good idea. I liken it to how Commodore handled disk drives, basically giving them their own 6502 processor.

To that end the original PC keyboard had an Intel 8048 microcontroller inside it.... AND the AT later put an 8042 on the other end as well to handle serial comms so the system CPU didn't have to waste time on it. As all the 8048 has to do is loop endlessly checking the matrix, the odds of it missing a keystroke are pretty damned slim.

Which is part of what makes the Junior's keyboard such a pain in the ass and part of what makes the Junior slow -- they skipped the extra processor in the system and instead hooked the non-maskable interrupt on the CPU to handle data from the keyboard. It's also what makes the internal serial port on the Jr. screw up if you press a key while trying to receive data.

Having a separate processor on each end means cheaper cabling, more reliable transmission, and best of all no risk of missed keypresses unless you fill up the buffer. That last part is the real thing that made such methodology "win out" over all other methods.

Though it would have been nice if they just settled on a common format and specification like RS-232 for it.
 
Alright, my version of this is coming along nicely, in a measure 50 times cut once kind of way. Got my Teensy 3.0 based recording logic probe up to 0.1567μs accuracy, and was able to record from the one PC/XT capable keyboard I have. It's an old "Micro Express" XT/AT switchable I've had forever, used it on real 5150's and 5160's as well as a slew of clones, so I'm pretty confident that its numbers are compatible.

It's not that I don't trust other people's timing charts, I just don't trust other people's timing charts.

This is the result:
http://www.deathshadow.com/AT2XT/microExpressTimings.html

The above chart is triggered by the clock hitting the Teensy's interrupt. (I'm using pin 3 for clock and 4 for data.) In the process of responding to that interrupt and setting up to record the elapsed time and data, I did the math on the generated assembly and I'm losing somewhere around 24 or 25 microseconds -- that is SPOT ON with that little sliver of the data line holding low for 3μs... There's our ~27..30μs accounting for the rise/fall times.

What I'm coming up with is 92μs per cycle... there's some interesting bits in there too. It looks like they wait a lot longer before flipping the data bit than the graphs I've seen online for this; it almost appears like they wait half the down-pulse on the clock. I think I'm going to go with that, using a full 15µs before I flip to make sure the client sees the change.

Because the code likely makes you lose two or three μs over the write operation (possibly more if the AT interrupt fires while outputting to XT), I'm going to round down to 30 for the troughs and 60 for the peaks, so 90 total.

So my new timing plan is (there's a code sketch after the list):

HEADER
pull clock low, wait 15µs
pull data high, wait 105µs
pull clock high, wait 60µs

DATA (repeat 8 times)
pull clock low, wait 15µs
set data, wait 15µs
pull clock high, wait 60µs

END OF DATA
clock low, wait 15µs
data low, wait 15µs
clock high, wait 990µs, then wait until new data available.
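
As a literal, blocking transcription of that plan (pin names are assumptions, and a real version would probably hang this off a timer rather than delayMicroseconds()):

Code:
void xt_send_byte(uint8_t data) {
  // HEADER
  digitalWrite(XT_CLOCK_PIN, LOW);   delayMicroseconds(15);
  digitalWrite(XT_DATA_PIN, HIGH);   delayMicroseconds(105);
  digitalWrite(XT_CLOCK_PIN, HIGH);  delayMicroseconds(60);
  // DATA -- 8 bits, LSB first
  for (uint8_t i = 0; i < 8; i++) {
    digitalWrite(XT_CLOCK_PIN, LOW);                     delayMicroseconds(15);
    digitalWrite(XT_DATA_PIN, (data & 1) ? HIGH : LOW);  delayMicroseconds(15);
    digitalWrite(XT_CLOCK_PIN, HIGH);                    delayMicroseconds(60);
    data >>= 1;
  }
  // END OF DATA
  digitalWrite(XT_CLOCK_PIN, LOW);   delayMicroseconds(15);
  digitalWrite(XT_DATA_PIN, LOW);    delayMicroseconds(15);
  digitalWrite(XT_CLOCK_PIN, HIGH);  delayMicroseconds(990);
}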

Figure in the rise/fall times, and that should be near identical to what this keyboard is generating, and again I've never had this one not work anywhere.

Though what I'm going to do is tie the outputs of the little Atmega 328 DCCDuino Nano I'm using to the Teensy and adjust the timings until they match what the real keyboard is outputting as close as possible. I think that's a reasonable approach for the next step. I also set up a little test to try and find the shortest delay between key-up and key-down as I figure that's the most likely way to find the minimum delay between messages, and the shortest time I found was around 910μs -- below the 1ms used in most people's code... I'm gonna call it 990μs to be the same as ten full data pulses. I think that makes sense -- make the delay between messages the same length as a full message. I may store micros() globally and put the wait at the START of the next byte, so that if other processing is going on the processor doesn't waste time with its thumb up its proverbial arse waiting around for nothing.
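
That last idea -- putting the wait at the START of the next byte -- could be as simple as this, where xt_send_byte() is assumed to be the earlier sketch minus its trailing 990µs delay:

Code:
static uint32_t xt_last_done = 0;              // micros() when the previous byte finished

void xt_send_byte_gapped(uint8_t data) {
  // burn only whatever is left of the 990us inter-message gap; by the time we
  // get here most of it has usually already elapsed doing other work
  while ((uint32_t)(micros() - xt_last_done) < 990) { }
  xt_send_byte(data);
  xt_last_done = micros();
}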

I'm going to be doing the same analysis on the AT keyboard's input routine just to double-check what everything online says. Nope, not paranoid and taking my time at ALL.

Though... anyone care to donate a model F to the project? Eh, might see if I can pick one up on the cheap somewhere. That or anyone out there with a model F have a Teensy 3.0? I can send you the code.

I'm probably also going to enhance that chart page with some JavaScript to let you alternate between screen width and full sample per pixel data... actually, this is 2016, for modern browsers I shouldn't even need JavaScript for that.

-- edit -- Yeah, didn't need JavaScript for that at all. Won't work in IE8/earlier, but really I could give a rat's ass at this point. In proper modern browsers, you click the full size option and instead of scaling to screen width, it will give you one pixel per sample resolution.
 
Ok, going through all the available codebases I could find on the topic, all the documentation I could find, and pulling genuine timings from actual keyboards with my little home-brew recording logic probe (since it's on/off TTL only I won't call it an oscilloscope) I see a number of worrisome flaws and false assumptions.

1) Timing info seems to be way off. The biggest issue seems to be that real devices wait until HALFWAY through the clock trough to flip the data line... but also the negative clock start on XT is the same width as a normal clock.

2) No buffering... if the interrupt for the AT data fires while the XT is being written to, it will overwrite the variable storing the current keycode!

3) Failure to leverage the internal timers and use of delayMicroseconds leads to wildly inaccurate timing! Atmega chips have a perfectly good timer interrupt -- HELL, they have THREE of them!... Why not use that to service the XT output side so you have accurate responses? At the 16MHz default that's commonplace now, we could use 240 clock ticks to give us a nice 15μs accuracy... half our clock trough, a quarter our clock peak. Removes worries about execution time OR the interrupts screwing with our bit-banging timings. (There's a quick timer sketch after this list.)

4) I've yet to see anyone reprogram the keyboard to use scancode set 1. Everyone seems to be wasting time on lookup tables or nested conditionals when AT class keyboards -- ALL OF THEM -- have a "mode 1" that will output PC/XT instead of AT scancodes! Only mode 2, aka "extended XT", is dropped on newer devices; modes 1 and 3 (3 is the default) are retained on pretty much all of them I can find.
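
Here's the kind of thing item 3 is talking about -- a minimal sketch assuming an ATmega328 at 16MHz, with Timer2 in CTC mode giving a 15µs tick. xt_tick() is a made-up name for whatever state machine ends up driving the XT pins, and note Timer2 is also what the Arduino core's tone() uses:

Code:
void setup_xt_timer() {
  cli();
  TCCR2A = (1 << WGM21);             // CTC mode
  TCCR2B = (1 << CS20);              // no prescaling, clk/1
  OCR2A  = 239;                      // 240 ticks of 62.5ns = 15us
  TIMSK2 = (1 << OCIE2A);            // compare-match A interrupt on
  sei();
}

ISR(TIMER2_COMPA_vect) {
  xt_tick();                         // advance the XT output state machine one step
}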

So... my measure 30 times cut once proceeds apace. Testing the AT side right now to make sure I can indeed set mode 1, acknowledge messages properly instead of leaving the keyboard to timeout, and "play" with other values as needed.

Running two interrupts at once should be fun too... as will the use of two circular buffers, one for commands to be sent to the AT keyboard, one for scancodes to be sent to the XT side.
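
For the circular buffers, something this small should do -- a power-of-two ring with a single writer on each index (sizes and names are just for illustration, and there's no overflow check here):

Code:
#define RB_SIZE 16                                  // must be a power of two
typedef struct {
  volatile uint8_t data[RB_SIZE];
  volatile uint8_t head, tail;
} ringbuf;

ringbuf xt_out, at_cmd;                             // scancodes out, commands to keyboard

static inline bool    rb_empty(ringbuf *b)          { return b->head == b->tail; }
static inline void    rb_put(ringbuf *b, uint8_t v) { b->data[b->head++ & (RB_SIZE - 1)] = v; }
static inline uint8_t rb_get(ringbuf *b)            { return b->data[b->tail++ & (RB_SIZE - 1)]; }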

I'm also playing with the idea of storing the XT scancodes bitwise reversed to save time. Since I have to

Code:
if (data & 1) xt_data_high; else xt_data_low;
data >>= 1;

why not just:
Code:
if (data & 0x80) xt_data_high; else xt_data_low;
data <<= 1;

instead, that way when storing the data I can just:

Code:
data = (data << 1) | bit;

instead of the painfully stupid:
Code:
data = (data >> 1) | (bit << 7);

... or worse, trying to index it as I've seen some people do:
Code:
data |= bit << count;

Though doing so would mean I'd need my AT command codes to be bitwise reversed too. Gah, I hate how ATmega processors lack a proper bitwise rotate or shift... what they call ROL and ROR (their only two "shifts") are what us x86 folks would call RCL and RCR... rotate through carry. There is no proper shift, so to shift you have to CLC first... and there is no proper rotate so you have to try and set the carry flag to the low bit first... but I'm one of those guys who thinks RISC can take a sugar frosted **** off the end of my ****!

Hell, still bugs me most high level languages don't even have the concept of them being three different things! I'd settle for two!

But what's the old joke? CISC was created for people who write programs, RISC was created for people who write compilers?

Ever since I started playing with ARM and AVR, the more I've become convinced that's not a joke. Kind of like every time I play with C and Unix, the more I'm convinced the joke about them being a hoax isn't a joke either.
 
So every AT keyboard has an XT mode, but it lacks a switch or something to use that mode, is that what you're saying?
 
So every AT keyboard has an XT mode, but it lacks a switch or something to use that mode, is that what you're saying?

It... half has the mode. It can be programmed to output the correct scancodes, but it still does so using the AT wiring and transmission protocol. All mode 1 does is change what scancodes it spits out, not how it spits them out.

On the keyboards that have the switch, they likely force the keyboard into mode 1 for the scancodes, and then use the different output protocol as well. Setting mode 1 from software doesn't change the AT protocol itself -- it's still 11 bits bidirectional with release being a prefixed command code instead of stored on bit 7.

Still, setting that mode will GREATLY simplify the process and reduce the memory footprint, as the whole lookup table thing can be pitched.
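
Roughly what that would look like at startup -- at_send_byte() and at_wait_for() are placeholders for the host-to-keyboard write and the ack check described earlier, not real library calls:

Code:
bool at_use_set1() {
  at_send_byte(0xF0);                     // "select scancode set" command
  if (!at_wait_for(0xFA)) return false;   // expect an ack from the keyboard
  at_send_byte(0x01);                     // set 1 == PC/XT scancodes
  return at_wait_for(0xFA);
}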
 
As told in a previous post I do have a few Samsung keyboards (they look like IBM). They use a simple membrane though, but they do have the full Intel CPU on board and a hole where the AT/XT switch could have been. Also on the small PCB in the keyboard there are holes and traces where the switch would have been.
They have a regular 5 pin DIN though, not PS/2. Also when I connect the 2 empty pins (where the switch would have been) it doesn't directly work, but it may change the mode though. Don't know how to test that.
 
Seems like a fun project/challenge, but if it is for a real IBM 5160 PC/XT, wouldn't it just be easier to pop in a pair of BIOS chips with the 5/86 BIOS or flash one, which provided native PS/2 keyboard support, rather than go through all of this, or just get a 5160 mobo with the right BIOS?

Mike
 
I believe that the 1000's protocol is almost identical to the XT protocol except for the timing and the use of a stop bit as opposed to a start bit. Have you ever looked into it deathshadow?

I do not believe that the 5160 can really handle the 11-bit AT/PS2 protocol. The 1986 IBM PC/XT Technical Reference indicates it uses the 9-bit XT protocol, so you still need a keyboard that can switch between the two.
 
Also when I connect the 2 empty pins (where the switch would have been) it doesn't directly work, but it may change the mode though. Don't know how to test that.

It probably does not -- when they omit the switch they usually cheap out with a smaller ROM, leaving off the code required to output the 10 bit (that people call 9 bit) monodirectional XT data format instead of the 11 bit AT format.

but if it is for a real IBM 5160 PC/XT, wouldn't it just be easier to pop in a pair of BIOS chips with the 5/86 BIOS or flash one, which provided native PS/2 keyboard support, rather than go through all of this, or just get a 5160 mobo with the right BIOS?

1) Pretty sure a 286 BIOS isn't gonna fly on an 8088.

2) PC/XT's hardware interface is monodirectional -- System side is read only. AT interface is bidirectional and requires an ACK bit to be set to function properly. The two aren't just different scancodes and bit encoding, the hardware interfaces themselves are fundamentally different!

Pretty sure the PC/XT hardware would be incapable of even acknowledging the AT keyboard's power on and test ok message, much less setting the mode or ack bits.

3) ATs have an Intel 8042 processor dedicated to handling the keyboard input. The PC/XT uses a shift-in parallel-out chip along with an 8255. Not even close to the same hardware.

So no, it would not be "easier" -- much less how is burning a ROM and installing it easier?!? Much less that's ASSUMING you know what you are plugging it into, which I can't guarantee since I've got three XT clones here and one keyboard -- and Christmas only knows what the clones did or what ROMs they'd be compatible with.

I swear, every time someone uses the word "easier" I start to think I have an entirely different definition of the word from the rest of the world!

I believe that the 1000's protocol is almost identical to the XT protocol except for the timing and the use of a stop bit as opposed to a start bit. Have you ever looked into it deathshadow?

It's on my to-do list. Since I've got a perfectly good working 1000SX keyboard here, I'll be able to investigate what it REALLY outputs since again, the data I'm finding online doesn't make sense and/or doesn't seem to line up with reality... though the most complete and accurate data (which is still pretty inaccurate) seems to be about going the other direction; from every other type of keyboard to AT which... doesn't seem to me like the most useful of things to do.

I'm also thinking on seeing what I can figure out about the Junior... if any computer ever desperately needed some lovin' from a model M...

But... PC/XT first. I'm actually having trouble reading from the AT interface side of things reliably. None of the code I have that claims to work is doing a damned thing, nor is my own code... I can read it fine with my little homebrew scope, but in terms of translating it into useful data it goes tits up to the point of the DCCduino going off to never never land.

Gonna try reading it using a Teensy instead. Be a laugh if my one dollar Arduino nano knockoff is the actual stumbling block. You'd THINK an ATmega328P is an ATmega328P... but NO... (and yes, it's a newer one with the CH340, not the old FTDI knockoff that the driver updates now brick!)
 
What do you mean "Pretty sure a 286 BIOS isn't gonna fly on a 8088"? I've got the 5/86 BIOS in my IBM 5160 PC/XT and use a PS/2 keyboard - an IBM Model-M with the optional 5 pin DIN connector instead of the PS/2 connector, attached to my IBM PC/XT system and it works beautifully with no other mods needed. The 5/86 BIOS was the last native IBM BIOS update for PC/XTs that added support for 101-key PS/2 style keyboards. I don't understand your statement - can you please clarify?

NC_Mike
 
What do you mean "Pretty sure a 286 BIOS isn't gonna fly on a 8088"? I've got the 5/86 BIOS in my IBM 5160 PC/XT and use a PS/2 keyboard - an IBM Model-M with the optional 5 pin DIN connector instead of the PS/2 connector, attached to my IBM PC/XT system and it works beautifully with no other mods needed. The 5/86 BIOS was the last native IBM BIOS update for PC/XTs that added support for 101-key PS/2 style keyboards. I don't understand your statement - can you please clarify?

NC_Mike

You are one of the lucky people to get an autosensing Model M which works with both XT and AT systems. Not all Model M keyboards do that.
 
I have a Model M as well, but never tested it on an XT. How can I know if it does autosense? Just test, or can you see it (type number or something)?
 
It has nothing to do with autosense, that is nonsense. Either your XT BIOS is late enough to support 101-key keyboards or it isn't. I have many IBM Model-M keyboards - they all work. You have to make sure you use a 5 pin DIN cord, and not use a PS/2 to 5-pin DIN adapter.

Mike
 