
MS-DOS Reading keyboard port (60 61) without interrupts

Mills32

Experienced Member
Joined
Sep 25, 2018
Messages
149
Location
Spain
I'm adding joystick support to my engine, and it is simpler to read the joystick and keyboard at the same time after waiting for vsync.

All the keyboard functions and info I can find suggest I should read port 0x60 by installing a custom handler at interrupt vector 9.

Will there be any problem with the keyboard if I only try to read the 0x60 port once per frame after vsync?
I'm testing in DOSBox and it is working fine; I'll try later on 86Box, PCem, and a real 6 MHz 286.

Thanks!
 
Will there be any problem with the keyboard if I only try to read the 0x60 port once per frame after vsync?
IRQ1 will still happen at each key press, invoking the keyboard interrupt handler. The handler will read the byte in the keyboard shift register, via 0x60 [8255 port A], put that byte into the keyboard buffer within RAM, then reset the keyboard interface circuitry. When the keyboard interface circuitry is reset, that clears the keyboard shift register.

Rhetorical: Could there be times when the aforementioned sequence happens within two of your 0x60 reads? If so, that would result in a key press not being picked up by your code.
 
Rhetorical: Could there be times when the aforementioned sequence happens within two of your 0x60 reads? If so, that would result in a key press not being picked up by your code.
I don't think so. VSYNC is roughly 50..70 Hz. If you can beat that.....
 
IRQ1 will still happen at each key press, invoking the keyboard interrupt handler. The handler will read the byte in the keyboard shift register, via 0x60 [8255 port A], put that byte into the keyboard buffer within RAM, then reset the keyboard interface circuitry. When the keyboard interface circuitry is reset, that clears the keyboard shift register.
This. Although as long as you're taking care to read the port yourself, you can mask off IRQ1 at the PIC via port 21h, ensuring that INT 9 will not be invoked until you re-enable IRQ1.
 
I don't think so. VSYNC is roughly 50..70 Hz. If you can beat that.....
It's not a matter of beating it: for a key pressed at any random time, it's a question of whether your vsync routine triggers before the interrupt is handled, which seems very unlikely to me.

You can switch to polling easily enough by simply vectoring INT 9 to a return instruction, so that the interrupt is effectively ignored. I'd cons up a little test program though to see if there's enough buffering in place that you won't be dropping key down or key up events when a user mashes a bunch of keys at the same time: it seems to me that could easily produce several events within the ~16 ms period between your polls and, without something immediately retrieving the scan code before the next one is sent, one of the two might get lost.

If that is the case, probably the easiest thing to do is just have a proper interrupt service routine that stuffs the scan codes into a small buffer, and read from that buffer once per frame.
 
It's not a matter of beating it: for a key pressed at any random time, it's a question of whether your vsync routine triggers before the interrupt is handled, which seems very unlikely to me.

You can switch to polling easily enough by simply vectoring INT 9 to a return instruction, so that the interrupt is effectively ignored. I'd cons up a little test program though to see if there's enough buffering in place that you won't be dropping key down or key up events when a user mashes a bunch of keys at the same time: it seems to me that could easily produce several events within the ~16 ms period between your polls and, without something immediately retrieving the scan code before the next one is sent, one of the two might get lost.

If that is the case, probably the easiest thing to do is just have a proper interrupt service routine that stuffs the scan codes into a small buffer, and read from that buffer once per frame.
I inserted a null function at interrupt 9, so I guess I disabled the keyboard.

Then I used this every vsync (set to 60Hz):

Code:
void Key_Handler(void)
{
    asm {
        cli
        in   al, 060h      //READ KEYBOARD PORT
        mov  keyhit, al    //Store in keyhit
        in   al, 061h      //READ SYSTEM CONTROL PORT
        mov  bl, al        //Store state in bl
        or   al, 080h
        out  061h, al      //DISABLE KEYBOARD (send "received" ack, bit 7)
        mov  al, bl
        out  061h, al      //RE-ENABLE KEYBOARD (original state)
        sti
    }
    //Some lines to process if a key is down or up, and store that in array "keys[256]"
}

This engine only needs at most two keys/buttons at the same time (up+right, right+jump). I tested all the emulators I could, and it worked fine; I don't think the real PC will have trouble.

The only trouble is an unintended exit of the program, which leaves the keyboard disabled.
 
This engine only needs at most two keys/buttons at the same time (up+right, right+jump). I tested all the emulators I could, and it worked fine; I don't think the real PC will have trouble.
Is it a problem that, at the time the shift register is sampled, one of the quoted key sequences has only been partially clocked into the shift register?
 
Do you actually need to disable/hook int 9? Reading the scan code directly from port 60 was a trick commonly used with (Q)BASIC. And the only thing required was to clear out the keyboard buffer in the BDA so it didn't overflow. For example:

Code:
CLS
PRINT "Hold enter key to increase tone"
PRINT "Press esc to end"

DO
  SOUND (50 + inc), .5

  k = INP(&H60)
  SELECT CASE k
    CASE 1
      ' esc pressed    
      END
    CASE 28
      ' enter pressed
      inc = inc + .5
      IF inc > 120 THEN inc = 120
    CASE 156
      ' enter released
      inc = inc - .5
      IF inc < 0 THEN inc = 0
  END SELECT
  
  ' clear keyboard buffer
  DEF SEG = &H40
  POKE &H1A, PEEK(&H1C)

LOOP
 
When I wrote ConFormat (cf SIMTEL20), I had to do just this because the idea was that the utility ran in the background, independent of MSDOS. My recollection is that there was a slight difference in XT and AT key code handling. That also involved using a separate stack for the code. But heck, that was 1988, so memory is a bit dim on that.
 
Here's a little TSR that intercepts keystrokes and displays them as hex before forwarding them to the regular keyboard ISR.
Code:
	page	,132
	title	Testkey - Test keyboard strokes
 
pkdata	equ	60h			; keyboard data

cseg	segment para public
	assume	cs:cseg

	org	100h
Entry:
	jmp	Start

Revec	label	dword
	dw	(0)
	dw	(0)

Start:
	push	ds
	xor	ax,ax
	mov	ds,ax
	mov	bx,ds:[9*4]
	mov	dx,ds:[9*4+2]
	mov	word ptr cs:Revec,bx
	mov	word ptr cs:Revec+2,dx
	cli
	mov	ds:[9*4],offset Serve
	mov	ds:[9*4+2],cs
	sti
	pop	ds
	lea	dx,Endmem
	int	27h


Serve:
	push	ax
	push	bx
	push	cx
	push	dx
	push	si
	push	di
	push	bp
	in	al,pkdata
	test	al,al
	js	Serve8		; if release
 
;	change it to hex.

	push	ax
	shr	al,1
	shr	al,1
	shr	al,1
	shr	al,1
	call	dnib		; display a nibble
	pop	ax
	call	dnib		; ditto
	mov	al,' '
	call	dchar
Serve8:
	pop	bp
	pop	di
	pop	si
	pop	dx
	pop	cx
	pop	bx
	pop	ax
	jmp	cs:Revec		; to the real servicer

dnib:
	and	al,15
	add	al,'0'
	cmp	al,'9'
	jbe	dchar
	add	al,'a'-'9' -1		; correct
dchar:
	mov	ah,0eh			; output tty
	mov	bh,7
	int	10h			; show the byte
	ret

Endmem	label	byte

cseg	ends

	end	Entry
 
This. Although as long as you're taking care to read the port yourself, you can mask off IRQ1 at the PIC via port 21h, ensuring that INT 9 will not be invoked until you re-enable IRQ1.
I tried that, and it did not work as I expected; this should ignore the keyboard, but it does not:

Code:
asm in  al, 21h    //Read existing bits.
asm or  al, 02h    //Turn off IRQ 1 (KEYBOARD)
asm out 21h, al    //Write result back to PIC.
 
What did you do afterwards? Could you have gone back to running code that might have re-enabled the interrupt?
 
Do note that keyboard handling is somewhat different between an XT-type system and an AT+ one. In the XT, it's a simple 9 bit shift register; in the AT+, it's handled by the 8042 (or similar) MCU.
 
What did you do afterwards? Could you have gone back to running code that might have re-enabled the interrupt?
I tested that in a simple program which does nothing else after that. I think I'll just use interrupt 9 to read the keyboard and then update my "pushed/released keys" array only at vsync. It seems the least buggy option.

I just realized it works on some emulators and it doesn't on others (dosbox-x).
 
All the keyboard functions and info I can find suggest I should read port 0x60 by installing a custom handler at interrupt vector 9.
If you are already on MS-DOS, why don't you want to use its "int 21h" API to get keystrokes?

Will there be any problem with the keyboard if I only try to read the 0x60 port once per frame after vsync?
Depends on how well you know your hardware, and how much you want to tie your code to it.

The blog post (https://www.os2museum.com/wp/how-fast-is-a-ps-2-keyboard/) on the excellent os2museum page provides some more details: A PS/2 keyboard needs approximately 0.66 to 1.1 ms per byte (the PS/2 hardware layer is clocked by the keyboard using a somewhat unstable clock source). The keyboard controller translates the incoming byte stream to a common scan code set, which takes a variable amount of time depending on the number of bytes it needs to consume/produce. USB keyboards in legacy mode are limited by the USB-HID polling interval (e.g. 10 ms rounded down) plus processing time, and Michal has observed 16 ms.

With a 60 fps frame rate, your vsync events are ~17 ms apart, so you will most likely lose bytes; your approach will not work.

Also note that real hardware behaviour may or may not apply to emulators. They may send keystrokes much faster (or slower) than real hardware ever could, for example when "pasting" an input file.
 
If you are already on MS-DOS, why don't you want to use its "int 21h" API to get keystrokes?


Depends on how well you know your hardware, and how much you want to tie your code to it.

The blog post (https://www.os2museum.com/wp/how-fast-is-a-ps-2-keyboard/) on the excellent os2museum page provides some more details: A PS/2 keyboard needs approximately 0.66 to 1.1 ms per byte (the PS/2 hardware layer is clocked by the keyboard using a somewhat unstable clock source). The keyboard controller translates the incoming byte stream to a common scan code set, which takes a variable amount of time depending on the number of bytes it needs to consume/produce. USB keyboards in legacy mode are limited by the USB-HID polling interval (e.g. 10 ms rounded down) plus processing time, and Michal has observed 16 ms.

With a 60 fps frame rate, your vsync events are ~17 ms apart, so you will most likely lose bytes; your approach will not work.

Also note that real hardware behaviour may or may not apply to emulators. They may send keystrokes much faster (or slower) than real hardware ever could, for example when "pasting" an input file.
I finally used interrupt 9 with a custom handler, because reading keys only at vsync was causing trouble on some emulators/real machines. It is also very fast and did not affect the game speed (even at 4.77 MHz). The only bad thing is that I have to read the joystick with a separate function, but keyboard and joystick will work at the same time. This is good enough.
 
You'd think 16.7 ms would be a short enough period of time, but when I was implementing keyboard support in my emulator I was sending a scancode per frame to the core and quickly noticed a fast typist can generate multiple scancodes per frame, especially when you consider keyup/keydown.
 
You'd think 16.7 ms would be a short enough period of time, but when I was implementing keyboard support in my emulator I was sending a scancode per frame to the core and quickly noticed a fast typist can generate multiple scancodes per frame, especially when you consider keyup/keydown.

I'm not sure about the exact figures, but the button-mashing techniques Nintendo speedrunners use are downright insane. They manage to achieve what was once possible only in tool-assisted runs, where the tool assistance is just scripting frame-perfect input at 60 FPS.
 