
Anyone have errata notes for the Intel 8251A UART chip?

BradN

Member
Joined
Aug 27, 2006
Messages
32
Hi, I'm getting to the point in writing my Sanyo MBC-55x BIOS where I'd like to add proper interrupt-driven, buffered serial port support, and I noticed that while the NEC 8251AC chip has a ton of hardware bugs, it's the Intel chip that was used on the serial board. The NEC chip is used on the motherboard, but as a keyboard controller, where none of the problems really matter.

So, my question is: if the NEC chip was that bad, are there any such hardware bugs in the Intel 8251A? I'd find it hard to believe there aren't any, so if I can't find such info I'll implement workarounds for all of the NEC bugs just in case.

I hate running into stuff like that and thinking my assembly is wrong, because debugging assembly is kinda a bitch :) For example, I found two glitches in the Mitsubishi M5W1793 floppy controller that were a bit counter-intuitive: writing a sector returns a spurious error flag at the beginning, and the chip does not reliably establish sync with the data track before beginning operations, resulting in impossible-size sectors being transferred with garbage data if enough delay isn't used after the track seek or spin-up is done. It's bad enough trying to write code fast enough to transfer data when the CPU is barely fast enough to do it (only polled I/O is supported on this motherboard); running into inconsistencies makes it even harder.


If anyone's curious what kind of bugs the NEC chip has, here are the notes I made about them and possible workarounds in my application (note: the NEC uPD8251ACF and uPD8251AF have fixed these bugs):

Code:
;notes for working around uPD8251AC hardware bugs (this chip is ugly):

;1)
;If the remote side cancels CTS while transmitting, you must stop transmission
;(by shutting off txenable) before the next byte is sent or it will be a
;duplicate of the last byte, until the next byte is loaded from the CPU.
;Basically, the hardware confuses non-empty tx buffer with a buffer that
;hasn't begun transmission.
;Workaround:  Always shut off txEN after starting a transmission or
;(maybe) immediately after the character finishes sending. Shutting off txEN
;immediately prevents interrupt on completed transmission though.  Shutting
;it off after completed byte transmission is safe as long as remote side always
;keeps CTS cancelled for at least one byte at a time.  HW mod is possible
;to enforce this condition.

;2)
;Break detect can latch up and in some cases requires a device reset.
;Always reprogram the device after receiving a break.

;3)
;Bugs relating to synchronous transmission - irrelevant here.

;4)
;Status register can return garbage if it was being updated.
;Read status register until it returns the same byte twice.
;Mask off bit 2 (txEmpty) if txEN is off (see item 6), or it could
;oscillate and cause unpredictable delays.

;5)
;RxRDY clears after 2 cycles instead of instantly - irrelevant.

;6)
;TxEmpty doesn't return proper value when transmission is stopped.
;This can be worked around by using TxReady instead.

;7)
;TxRDY and TxEmpty clear half a cycle too late - irrelevant.

;8)
;Enter hunt command causes receive problems in async mode - shouldn't
;be using it anyway.

;9)
;control/data select line has marginal timing - irrelevant because our
;system clock is about as slow as an 8088 has been run in a mass-produced
;PC - we should be fine.

;10)
;data overrun is not always reliably detected and can cause garbage data.
;Workaround:  Never allow a hardware overrun but emulate it through software
;when the received byte cannot be buffered.

;11)
;In async mode, the first bit sent is delayed by one transmit clock period after
;a reset - irrelevant.

;12)
;RxRDY can glitch when clk doesn't have a fixed phase relationship to receive
;clock - this shouldn't be a problem because all clocks are derived from one
;source.

;13)
;Receiver sometimes generates an extra character after a break ends.
;No proper workaround is possible since the end of a break cannot be
;reliably timed without polling the chip constantly.  Every character
;immediately following a break can be discarded unless the break's end was
;found during status polling.

;Problem due to implementation on serial card:
;rxRdy and txRdy share an interrupt line.  Transmit enable must be shut off to
;allow reception of rxRdy interrupt.  rxRdy must be polled at the end of
;transmission after shutting off txEN to make sure a byte isn't missed, since
;txRdy going high would trigger the interrupt, and rxRdy can trigger just
;after the interrupt handler checks for it.  This can however be done in the
;interrupt handler - only exit once txEN is shut off and no rxRdy occurs.
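Two of the items above (the item 4 double-read and the txEmpty masking from item 6) can be sketched in C for readability; the real BIOS is 8088 assembly, and the simulated port, bit values, and function names here are my own illustrative assumptions, not BradN's code.

```c
#include <assert.h>
#include <stdint.h>

/* Simulated status port: the first read returns a garbled transient,
 * later reads return the settled value (models the item-4 bug).
 * Real code would read the actual 8251A status port instead. */
static uint8_t settled_status = 0x07;   /* txRdy|rxRdy|txEmpty, illustrative */
static int first_read = 1;
static uint8_t read_status_raw(void)
{
    if (first_read) { first_read = 0; return 0xFF; }  /* garbage transient */
    return settled_status;
}

static int tx_enabled = 0;      /* software shadow of the txEN command bit */

#define BIT_RXRDY   0x02        /* 8251A status bit 1 */
#define BIT_TXEMPTY 0x04        /* 8251A status bit 2 */

/* Item 4 workaround: read the status register until two consecutive
 * reads agree; mask txEmpty when txEN is off (item 6), since it does
 * not report correctly while transmission is stopped. */
static uint8_t read_status_stable(void)
{
    uint8_t a, b = read_status_raw();
    do {
        a = b;
        b = read_status_raw();
    } while (a != b);
    if (!tx_enabled)
        b &= (uint8_t)~BIT_TXEMPTY;
    return b;
}
```

The shared-interrupt-line problem at the end of the notes would then be handled by looping on `read_status_stable() & BIT_RXRDY` after clearing txEN, and only returning from the interrupt handler once no rxRdy is pending.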
 
They're the same chip. NEC and Intel cross-licensed the designs--just as the 8272A is the same chip as the uPD765A.

I believe the NEC uPD71051 may have some of the glitches fixed. All in all, the 8251A was a huge improvement over the 8251. Both chips are very old designs.

Are you trying to write an IBM-compatible BIOS? If so, IBM didn't use interrupt driven code for theirs--it's insanely simple. Most comms packages, for that reason, had their own drivers.
 
I'm trying to make the BIOS as compatible at an API level as is reasonable given the hardware. Since no comm program that uses its own driver will ever work with this machine unless it was designed specifically with it in mind, I'm trying to make the BIOS implementation as useful as I can.

The reason I want to use interrupt-driven I/O is that this machine is significantly slower than the IBM XT (3.58MHz clock, CPU<->bus access only once every 4 cycles - the onboard video is a bus hog), and because of the different serial chip, the status byte comes back all mixed around compared to what the PC BIOS needs to return. The best operation I've come up with to fix this ordering uses 16 rotates and shifts, which takes 128 cycles on an 8088 here (16 two-byte instructions -> 32 code bytes -> 128 clock cycles just to fetch them).

I could use a lookup table, but I'd rather spend 128 cycles than 128 bytes - this is a bit of a superoptimization project... right now my BIOS is under 4KB UPX compressed and supports keyboard (using the ROM mapping table), text onboard video with underline support (using the ROM character table), disk with most error handling, AES-256 disk encryption with a simple breakout key sequence to configure, a CGA init that makes some CGA programs happy, and contains a FreeDOS boot sector since the real boot sector is used to load the BIOS.
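As a sketch of what that reordering does (in C for readability; the BIOS itself is 8088 assembly): the source bits below follow the published 8251A status layout and the target follows the INT 14h line-status layout, but the exact mapping is my reconstruction for illustration, not the actual code.

```c
#include <assert.h>
#include <stdint.h>

/* Reorder an 8251A status byte into INT 14h line-status order.
 * 8251A:   bit0 TxRDY, bit1 RxRDY, bit2 TxEMPTY, bit3 PE,
 *          bit4 OE, bit5 FE, bit6 break detect, bit7 DSR.
 * INT 14h: bit0 data ready, bit1 overrun, bit2 parity, bit3 framing,
 *          bit4 break, bit5 THR empty, bit6 TSR empty, bit7 timeout. */
static uint8_t remap_status(uint8_t s)
{
    uint8_t r = 0;
    if (s & 0x02) r |= 0x01;   /* RxRDY   -> data ready    */
    if (s & 0x10) r |= 0x02;   /* overrun -> overrun       */
    if (s & 0x08) r |= 0x04;   /* parity  -> parity error  */
    if (s & 0x20) r |= 0x08;   /* framing -> framing error */
    if (s & 0x40) r |= 0x10;   /* break   -> break detect  */
    if (s & 0x01) r |= 0x20;   /* TxRDY   -> THR empty     */
    if (s & 0x04) r |= 0x40;   /* TxEMPTY -> TSR empty     */
    return r;
}

/* The space-for-time alternative: one 256-byte table, built once,
 * turns the whole remap into a single XLAT on the 8088. */
static uint8_t remap_table[256];
static void build_remap_table(void)
{
    for (int i = 0; i < 256; i++)
        remap_table[i] = remap_status((uint8_t)i);
}
```
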

So my plan is to update the status only when bytes finish transferring and let the BIOS calls pull it and the data from copies rather than the chip. This will allow the program to poll faster and be more likely to succeed at higher baud rates. This will still ensure that the flow status bits and post-receive bits are correct enough, but I'm not sure what to do with the DSR signal since it may change without bytes being transferred. It might not be important enough for me to care. Since I'm going through all this effort, it's not much more to implement buffering to improve throughput when data is processed in bursts, and maybe an extension to allow moving/sizing these buffers anywhere in RAM.
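The plan above - the interrupt handler drains the chip, and the BIOS calls read from copies - can be sketched in C (again, the real thing is 8088 assembly; the buffer size, names, and overrun bit value here are my illustrative assumptions). A 256-byte ring lets the 8-bit indices wrap for free, a common 8088 trick.

```c
#include <assert.h>
#include <stdint.h>

#define RXBUF_SIZE 256            /* power of two: uint8_t indices wrap free */
static uint8_t rxbuf[RXBUF_SIZE];
static uint8_t rx_head, rx_tail;  /* head == tail means empty */
static uint8_t shadow_status;     /* updated only by the ISR side */
#define SW_OVERRUN 0x02           /* emulated overrun flag, illustrative */

/* ISR side: always take the byte off the chip immediately, so the
 * unreliable hardware overrun (item 10 in the notes) never occurs;
 * when our own buffer is full, drop the byte and raise a software
 * overrun flag instead. */
static void rx_isr_byte(uint8_t data)
{
    uint8_t next = (uint8_t)(rx_head + 1);
    if (next == rx_tail) {
        shadow_status |= SW_OVERRUN;   /* emulate the overrun in software */
        return;
    }
    rxbuf[rx_head] = data;
    rx_head = next;
}

/* BIOS-call side: pull data from the copy, never from the chip. */
static int rx_get(uint8_t *out)
{
    if (rx_tail == rx_head)
        return 0;                      /* empty */
    *out = rxbuf[rx_tail++];
    return 1;
}
```
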

Maybe I will look for other serial drivers in Ralf Brown's Interrupt List and implement one I like; maybe some existing software can be used here after all... Do you know of any comm programs that use an external driver that might be worth trying to make work?
 
Erratic transmit baud behavior due to aging or frequency wear?

QUOTE
So, my question is, if the NEC chip was that bad, are there any such hardware bugs in the intel 8251A chip? I'd find it hard to believe there's not any, so if I can't find such info I'll implement workarounds for all of the NEC bugs just in case.

I hate running into stuff like that and thinking my assembly is wrong, because debugging assembly is kinda a bitch :)
UNQUOTE

I ran into issues with an 8251A chip I used 30 years ago in a MIDI implementation, that is to say at 31250 baud as MIDI demands. This worked perfectly in 1992, but for some reason that same chip now refuses to function properly at 31250 baud. Mind you, it is exactly the same physical chip, which I saved for 30 years. I ordered replacements via eBay and they show the same odd behavior.

8251A clock: 3.27 MHz
RxTx clock: varied between 125 kbaud and 300 baud
Factor: x1
Asynchronous transmission
Separate unit to produce the RxTx clock and change it to arbitrary values between a few hundred and 62500.
Works: 15625 baud, 62500 baud, 28800 baud, and some low values.

Garbled transmission: 31250 baud and a few lower bauds.

This is incredible since it defies logic.

I use an Arduino to produce the RxTx clock, which is properly within specs. Everything is within specs. The amazing thing is that I have succeeded in transmitting at 125000 baud without issues using over-spec clocks. So I suspect NEC numbered failed uPD71051s as 8251As, because the 8251A specs point to a maximum asynchronous baud of 19.2 kbaud, while I succeeded in transmitting at 125000 baud. Crazy but true. However, it refuses to function properly at 31250 baud, then functions nice and neat again at 28800 baud in the same setup, just by changing the RxTx clock.
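For reference, the relationship being exercised in that setup is simply that in x1 mode the RxTx clock must equal the baud rate exactly, while x16 and x64 multiply it up; a trivial sketch (pure arithmetic, no particular hardware assumed):

```c
#include <assert.h>

/* Required RxTx clock in Hz for a given baud rate and 8251A clock
 * factor (the mode instruction allows factors of 1, 16, or 64). */
static unsigned long rxtx_clock_hz(unsigned long baud, unsigned factor)
{
    return baud * factor;
}
```

So 31250 baud at x1 needs a 31.25 kHz RxTx clock, and at x16 would need 500 kHz.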

My take is frequency wear because that's the only possible conclusion.
 
I'm not quite sure if this helps, but those of us running BBS software used a FOSSIL driver to overcome many of the issues with the 825x chips. Maybe that was only for the 8250, not the later series?

I've heard it stated somewhere that without special drivers, the 825x chips couldn't be reliable by themselves above 9600 baud.

Again, my memory is poor on this issue (and I can't say how well I knew it back then!)
 
The 8251 actually precedes the 8250 in time. The 8250 was actually a National Semiconductor chip. The 8251 (non-A) is really primitive and is best left to the glass cases. The 8251A has some quirks, but there are usually workarounds.

For vintage stuff, I like the Signetics 2651--a USART somewhat akin to the Z80 SIO chip--in a 28-pin package, which includes a baud-rate generator. The 2661 is the next generation of that chip, and there's a Motorola incarnation of it as well.
 