
The best way to identify the Olivetti M24 and its derivatives?

Krille

Veteran Member
Joined
Aug 14, 2010
Messages
1,104
Location
Sweden
How can I differentiate the Olivetti M24, AT&T PC 6300, Xerox 6060 and the Logabax Persona 1600 from all other systems?

Recently I got the idea to improve the Auto Configure option in XTIDECFG so that when it finds a Lo-tech XT-CF class controller it should select the most compatible, but also slowest, transfer mode (XTCF_8BIT_PIO_MODE) only on the computers that actually need it, namely the above mentioned systems. The rest should use the faster XTCF_8BIT_PIO_MODE_WITH_BIU_OFFLOAD by default. This requires that the machines can be identified with absolute certainty as I don't want to aggravate XUB-users more than necessary. ;)

The best I can come up with is to checksum the system BIOS with something simple like CRC-CCITT. AFAICT, these machines all use the same BIOS (right?) and there are only three versions publicly available: 1.21, 1.36 and 1.43? The downside is that a machine with a modified BIOS won't be detected, but that's got to be extremely rare.

So, any suggestions for a better method? Or should I just scrap this idea?
 
Why use the BIOS? The 6300/M24/etc. has several hardware differences. For example, there's a fairly unique clock/calendar chip that's not present at the I/O address in other systems. i.e. an MM58174AN at 70h-7fh.

On 80286 systems, (AT class and greater) this I/O port address is occupied by the MC146818 CMOS/RTC, but it's a very different chip and not present on 8086 systems.
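For what it's worth, a probe along those lines might look something like this. Treat it as a sketch only: the register map comes from the MM58174A datasheet, but the choice of port 73h and the assumption that an empty ISA port floats to 0FFh are mine, and bus-float behaviour varies between machines.

```nasm
; Hypothetical probe for an MM58174A at I/O 70h-7Fh.
; Register 3 is tens-of-seconds; only the low nibble is driven by
; the chip, and its legal values are 0-5. With no chip present, an
; ISA read usually floats high (0FFh) - but that is NOT guaranteed.
        mov     dx, 73h             ; base 70h + register 3 (tens of seconds)
        mov     cx, 4               ; sample a few times
.sample:
        in      al, dx
        and     al, 0Fh             ; the chip only drives D0-D3
        cmp     al, 5
        ja      .no_clock           ; impossible value -> chip absent
        loop    .sample
        ; all samples plausible -> MM58174A (M24 family) likely present
```

Sampling more than once guards against a single lucky bus float; a stricter test could also watch the tenths-of-seconds register actually tick.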
 
I'm not sure, but you could try function 0C0h of Int 15h. On return, ES:BX points to a ROM configuration table which contains the model, submodel and BIOS revision. I don't know the model/submodel constants for the Olivetti M24, AT&T PC 6300, Xerox 6060 and Logabax Persona 1600 computers, but perhaps they differ from system to system.

Also, you can check the byte at address F000h:FFFEh on these machines. It also contains the model type.
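For reference, the two lookups in assembly (the table layout follows RBIL; the model/submodel values for these particular machines are exactly what's unknown here, so this is only a skeleton):

```nasm
; Int 15h/AH=C0h - get system configuration table
        mov     ah, 0C0h
        int     15h
        jc      .no_table           ; CF set: function unsupported
        mov     al, [es:bx+2]       ; model byte
        mov     ah, [es:bx+3]       ; submodel byte
.no_table:
; BIOS model byte, directly from ROM
        mov     ax, 0F000h
        mov     es, ax
        mov     al, [es:0FFFEh]     ; model byte at F000h:FFFEh
```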
 
Chuck's suggestion of accessing the Olivetti RTC clock is likely the best way to do this, since that hardware/interface was only in those systems. You can't check the model byte at FFFF:000E reliably because it's 0 in those systems, and might be 0 in others. On such systems, int 15h,c0h also returns 0. (That said, I sheepishly admit that I do exactly those two things in my own code; I'll get around to it the next time I have a 6300 on the bench.)
 
Isn't there a quirk in the address decoder logic that maps the "extended" CGA to two port regions at the same time? I think $3Cx and $3Dx are supposed to both hook to the video chip, so if you modify the non-standard 3Cx and can read back from $3Dx the change... that might be all you need.

Though I'm not sure if any other systems using a 'compatible' extended CGA, like certain Compaqs, have that flaw.

-- edit -- yeah, found it. Came across that gem when I was researching the 160x100 mode.

http://www.seasip.info/VintagePC/cga.html

Right there under M24 about two-thirds down the page.

Addresses are incompletely decoded, so ports 03C0h-03CFh are the same as 03D0h-03DFh. This means an EGA or VGA can't coexist with the IDC.
 
This might be a good way to test, although writing to the legal port and then reading from the illegal port would be a better strategy, since writing to 3Cx on a non-M24-based system might lock it up.
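Something like this, perhaps. It's a sketch under two assumptions of mine: that the 6845 cursor registers read back (not every CRTC clone honours that), and that a floating bus won't happen to return the same byte.

```nasm
; Hypothetical mirror test: program a readable CRTC register through
; the legal CGA ports (3D4h/3D5h), then read it back through the
; 3C5h alias that only exists on the M24's incomplete decode.
        mov     dx, 3D4h
        mov     al, 0Fh             ; R15: cursor address low (readable)
        out     dx, al
        inc     dx                  ; 3D5h: CRTC data
        in      al, dx              ; current cursor-low value
        mov     ah, al              ; remember it
        mov     dx, 3C5h            ; alias of 3D5h on the M24
        in      al, dx
        cmp     al, ah              ; equal -> ports mirrored (M24-like)
```

On an EGA/VGA, 3C5h is the sequencer data port, but we only read from it here, which should be harmless.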
 
Isn't there a quirk in the address decoder logic that maps the "extended" CGA to two port regions at the same time? I think $3Cx and $3Dx are supposed to both hook to the video chip, so if you modify the non-standard 3Cx and can read back from $3Dx the change... that might be all you need.

Doesn't help that guy who disables the integrated display controller for a VGA card, does it? You want your determination of M24-edness to be based on components on the motherboard.
 
True, but it could be useful to determine if the display adapter is one of the enhanced M24 ones. I ran across one game that performed this check and used it to display two pages of CGA graphics via pageflipping, as the M24 and variants have enough video memory for two pages (32K). Not surprisingly, this game was programmed by an Italian development company.

Also, I have to apologize, I do indeed have an example of detecting the M24 via the RTC -- it's exposed via the BIOS:

Code:
      ;Int 1A/AH=FEh
      ;AT&T 6300 - READ TIME AND DATE
      ;AH = FEh
      ;Return:
      ;BX = day count (0 = Jan 1, 1984)
      ;CH = hour
      ;CL = minute
      ;DH = second
      ;DL = hundredths

Embarrassingly, I use this code in my video detection routine, and not in my system detection routine. Doh! I'll get around to fixing it when I have my 6300 on the bench.

I also have this code:

Code:
      mov     dx,03dah         {monitor ID in bits 4-5 (2=colour, 3=mono)}
      in      al,dx            {if a DEB is present, bits 6 and 7 will be 0}

...however, I seem to recall that this was not reliable -- I think I tested it on my 6300 and it didn't work.
 
I thought that the point of this exercise was to differentiate the M24 ilk from the rest of the herd for the XTIDE.

Forgive me for losing the thread of this discussion. :blush:
 
No, you're right -- and the RTC clock, through the interrupt interface, is likely the best way to do this. I just expanded on the other solutions for posterity.

The only other thing I can think of is to search the ROM for the string "OLIVETTI 1984", which shows up in the 6300 and most likely in the M24 as well.

Hey, I just had another thought: Krille, if the point of this exercise is to make sure the M24/variants aren't using 16-bit word reads (which I understand come up endian-swapped because M24 sucks), is there a way to test for that behavior and fall back if found? (Or did you just want to know this for the setup program so that the correct BIOS can be burned?)
 
Checking should be fairly easy; pick a pair of 8-bit registers on the disk, write both with byte writes and read back with a word read. If you don't get what you expect, fall back to 8-bit mode.
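As a sketch, assuming a plain (non-shifted) 8-bit IDE register mapping; the base address of 300h is only a placeholder:

```nasm
; Write the cylinder-low/high pair as two byte writes, then read
; both back with one word read. Base 300h is an assumption.
        mov     dx, 304h            ; base+4: cylinder low
        mov     al, 55h
        out     dx, al
        inc     dx                  ; base+5: cylinder high
        mov     al, 0AAh
        out     dx, al
        dec     dx
        in      ax, dx              ; word read spans both registers
        cmp     ax, 0AA55h          ; expect AL=low reg, AH=high reg
        jne     .use_8bit_mode      ; anything else -> fall back
```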
 
Thanks for all the replies and sorry for being a bit absent.

Why use the BIOS?

When looking in RBIL I couldn't find any mention of hardware unique to Olivetti M24 machines except for the graphics card, which isn't useful in this scenario (for the same reason you stated earlier in this thread). I also couldn't find any mention of what these machines return for INT 15h/AH=C0h, which would have been the most intuitive way, so I figured the next best thing would be to checksum the entire BIOS (basically, when all you've got is a hammer...)

The 6300/M24/etc. has several hardware differences. For example, there's a fairly unique clock/calendar chip that's not present at the I/O address in other systems. i.e. an MM58174AN at 70h-7fh.

On 80286 systems, (AT class and greater) this I/O port address is occupied by the MC146818 CMOS/RTC, but it's a very different chip and not present on 8086 systems.

I agree that this is clearly a better way. I'm somewhat familiar with programming standard RTC chips for PCs like the MC146818 but the MM58174AN is completely new territory for me. I've downloaded the datasheet and also disassembled the last version of the BIOS (v1.43) but I'm struggling with how to easily discern the differences in a safe and reliable way.

Hey, I just had another thought: Krille, if the point of this exercise is to make sure the M24/variants aren't using 16-bit word reads (which I understand come up endian-swapped because M24 sucks), is there a way to test for that behavior and fall back if found? (Or did you just want to know this for the setup program so that the correct BIOS can be burned?)

The latter. The idea is that the configurator should select the fastest possible, but still safe, transfer mode when selecting Auto Configure. It will still be possible to override this decision by changing the transfer mode manually (even if this means that you can shoot yourself in the foot - I hate when software is designed to "outsmart" the user).

But now that you mention it, I realize I've been going about this the wrong way. Instead of trying to detect the machine type, I should just try to verify that 16-bit transfers can be done reliably. This could be useful as a simple confidence test for regular 16-bit controllers as well.

Checking should be fairly easy; pick a pair of 8 bit registers on the disk, write both with byte writes and read back with a word write. If you don't get what you expect, fall back to 8 bit mode.

Unfortunately, this doesn't work with XT-CF controllers since the register offsets are left-shifted: I can write to two 8-bit registers but not read them back as a single word. However, the basic idea is sound, so I've come up with the following test using DEBUG. (As I recall, the problem was not only that the bytes are swapped but also that a byte is lost or inserted somewhere, so this should catch that too, I think.)
Code:
150C:0100 BA0003        MOV     DX,0300        ; Or whatever base address the XTCF controller is at
150C:0103 B855AA        MOV     AX,AA55
150C:0106 B90001        MOV     CX,0100
150C:0109 89C3          MOV     BX,AX
150C:010B EF            OUT     DX,AX
150C:010C EB00          JMP     010E
150C:010E ED            IN      AX,DX
150C:010F 39D8          CMP     AX,BX
150C:0111 E1F8          LOOPZ   010B
150C:0113 CC            INT     3

Anyone up for testing this in DEBUG? If so, run this as many times as possible. Also make note of what CX is after each run. If this test works then CX should always be non-zero on Olivetti M24 and friends and always zero on other machines.
 
I can get to this by the weekend if nobody beats me to it. I have a 6300 and I understand what the code does and what you're testing for. However, I'm confused about XT-CF registers being left-shifted: I have two XT-CF cards (hi James!), one "white board" variant that floated around ebay earlier in the year, and an original XT-IDE v1 (I don't believe it has the chuck mod, if that was a physical mod). Any specific board I should/shouldn't test with?
 
I can get to this by the weekend if nobody beats me to it. I have a 6300 and I understand what the code does and what you're testing for. However, I'm confused about XT-CF registers being left-shifted: I have two XT-CF cards (hi James!), one "white board" variant that floated around ebay earlier in the year, and an original XT-IDE v1 (I don't believe it has the chuck mod, if that was a physical mod). Any specific board I should/shouldn't test with?

AFAIK, none of those are Lo-tech boards (which are all of the XT-CF class regardless of whether they use CF media or not - yes, I realize it's confusing :) ). If, by "white board", you mean the Optima, then that is based on Glitch's XT-IDE v3 design, so both of your boards are based on the original XT-IDE design, which is not the same as XT-CF. IIRC, you also have an XT-CF v2 from Lo-tech? That should fit the bill.

Also, I've changed the test code slightly. This is what it looks like in XTIDECFG;
Code:
	mov		ax, 0AA55h
	mov		cx, 256
	mov		bx, ax
.Loop:
	out		dx, ax
	rol		bx, 1
	JMP_DELAY
	in		ax, dx
	rol		ax, 1
	cmp		ax, bx
	loope	.Loop

JMP_DELAY is just a 'jmp $+2'.
 
(Does this problem only affect Lo-tech XT-CF boards?)

Of all the 8-bit boards, only the Lo-tech ones can do 16-bit transfers both ways. That's why they are faster than the XT-IDE boards. The XT-IDE v2 (and v3 and v4...) can do 16-bit transfers when in HI-SPEED mode but only when doing reads. That's why they need to be set to COMPATIBLE mode in the Olivetti M24 since that forces 8-bit transfers both ways.
 