
Design flaws in classic computers

How about Halt and Catch Fire on the TRS-80 Model II? There was a set of OUT instructions in BASIC that could actually burn out the video circuitry.

Wikipedia says the TRS-80 video flaw related to the relay used on the Model III to switch between 32- and 64-column video modes. Supposedly, repeatedly switching between 32- and 64-column modes at high speed could damage the relay.

A more serious flaw was the Commodore PET's infamous "smoke POKE" or "killer POKE". In the original revision of the PET, the command modified the video timing in order to display text more quickly on the screen. However, when a new revision of the PET was introduced, it was discovered that this command would instead cause the video display chip to produce an out-of-limits video signal. This would strain the circuitry of the PET's built-in CRT to such an extent that it could burn out if left in that state for too long.
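
For anyone curious, the usually quoted form of the killer POKE is a single BASIC statement. The address and value below are the ones commonly cited for the original PET ROMs; this is purely an illustration of the flaw, not something to try on a later-model PET:

10 REM The "killer POKE" as commonly quoted for the original PET
20 REM Speeds up screen updates on early units; on later revisions it
30 REM pushes the video circuitry out of spec -- illustration only!
40 POKE 59458,62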
 
My biggest gripe with IBM--floppy disk choices.

There was no practical reason to make a single-sided floppy drive as an option for the 5150. Double-sided drives had been around for a long time by the time the 5150 came out and the incremental price increase between single- and double-sided drives was minor.

The high-density 5.25" floppy drive. A grand mess. Different media for high-density that wasn't obvious by looking; the stupidity of "you can read 360K disks on a 1.2MB drive, but don't try writing them (although we won't stop you from doing it--or even warn you) if you expect to read them with a 360K drive. It would have been better to use a 3.5" drive for the new media--they had been around several years before the 5170 made its appearance.
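
For reference, the two formats work out like this (cylinders x sides x sectors x bytes per sector); a trivial check you can run in any BASIC:

10 REM 360K format: 40 cylinders x 2 sides x 9 sectors x 512 bytes
20 PRINT 40*2*9*512 : REM prints 368640
30 REM 1.2M format: 80 cylinders x 2 sides x 15 sectors x 512 bytes
40 PRINT 80*2*15*512 : REM prints 1228800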

The insane PS/2 floppy drives that were host-select for media type, with the result that you got 1.44M diskettes written to DS2D media and later, 2.88M written to DSHD media. The result, of course, was that the reliability of later getting the data back was that of getting a Muntz TV to operate in a fringe reception area.

On 3.5" diskettes, the competition (sensibly) used media-select and IBM eventually adopted that--with the result that 1.44M diskettes written to 2D media were no longer readable on new systems.

Every so often, I get in a job with a 360K format written to 5.25" HD media or 1.2M format written on DD media. The former is far more successful in recovery than the latter. In the case of 1.2M on 360K media, your data is pretty much toast after cylinder 40.

NEC adopted a much more sensible approach with their PC98 series (which actually preceded the 5170): 1.3MB floppies with an identical format on 8", 5.25" and 3.5" media (80 x 2 x 8 x 1024 bytes).
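
Just to put a number on that format:

10 REM NEC format: 80 cylinders x 2 sides x 8 sectors x 1024 bytes
20 PRINT 80*2*8*1024 : REM prints 1310720 -- roughly 1.3MB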
 
Wikipedia says the TRS-80 video flaw related to the relay used on the Model III to switch between 32- and 64-column video modes. Supposedly, repeatedly switching between 32- and 64-column modes at high speed could damage the relay.

I have (somewhere) documented an instance of someone killing their Model II video hardware using OUT and IN from BASIC. Wikipedia isn't definitive. Just because something is or isn't there doesn't make it so or not so.

Here you go (one reference):
http://groups.google.com/group/alt....k=gst&q=trs-80+ii+kill+video#26bee7ee973e716d

And another:
http://groups.google.com/group/comp...nk=gst&q=halt+and+catch+fire#bf36ec0cd2f1d0c6

And see the one by Frank Durda early in:
http://groups.google.com/group/comp...nk=gst&q=halt+and+catch+fire#126d805d05ab3bad
 
Re: the BBC Model B, of which not much has been said so far - it used the less common symmetrical DIN-5 connector for the serial port, and this could be plugged in two different ways... only one of which worked correctly, of course.

Oh, there is another one like that: on the same machines, the 5-pin DIN plug for Econet can also be plugged into the 7-pin DIN socket for the tape interface. This, of course, shorts out the network, and Econet being a bus network means all traffic stops until the idiot responsible is found.

No one has mentioned the ZX80 and its then-famous external RAM pack, which would come loose when it was least expected and wipe out all your work; a blob of Blu-Tack was the official fix. Come to think of it, the entire range of machines had so many flaws: a cheap, nasty keyboard, and the complete lack of a video chip, which meant the screen went blank when the CPU was busy.
 
I always assumed that it was implied you press F1 after plugging the keyboard back in.

Anyway, one that has always kinda irked me is metal chassis edges that act like a meat cleaver. I used to have an 8088 that was always out for blood and a 386 that wasn't any better.

The same happened to me; I cut myself about a week ago.

Nothing serious, but it shows how dangerous vintage computers are :D
 
I don't know whether anybody has said this or not, but those "Twiggy" floppy drives in the Apple Lisa were terrible, and the solder in the first Macintosh power supply was extremely prone to cracking.
 
Here's one: the six gazillion different proprietary serial-port connectors on home computers. Even more annoying when the protocol itself is standard (i.e. RS-232) but the connector is proprietary. Look, I can understand not wanting to take up board space with one or more DB-25 connectors, but could they at least have settled on a standard mini serial connector? Gah.
 
The Amstrad PCW printer.

To keep costs down the printer itself contained only the motors and high-current components. All the print processing was done in software by the PCW, which then drove the print mechanism directly. Consequently the printer was specific to the PCW and could not be used with any other computer, nor could the PCW use any other type of printer. This would not have been a big deal, except that later PCWs switched to using a DB25 printer port in order to allow the use of standard printer cables rather than having to manufacture the previous proprietary cable. The result was a computer and printer with standard connectors and cable, but with different pinouts and weird voltages, resulting in a fried PCW or ruined printer if you tried to use them with other hardware as you might reasonably expect to be able to.
 
In the Compaq Presario 3020, the heatsink they put on the stock CPU puts so much pull on the socket that once you remove it, there's a good chance the clips on the socket will break, forcing you to find a heatsink that uses the other set of clips (and we're lucky that those alternative clips exist!). This happened to mine.
 
The result was a computer and printer with standard connectors and cable, but with different pinouts and weird voltages, resulting in a fried PCW or ruined printer if you tried to use them with other hardware as you might reasonably expect to be able to.

Oh, connectors that look the same but are completely different are always good for a laugh. Amstrad did a similar trick with the joystick ports on their later Spectrums, where they used the standard Atari connector but with different pin assignments, so that only their own joysticks would work. And there was also an add-on Teletext adaptor for the Spectrum that used the same power connector as the Spectrum, but at about twice the voltage. Get them the wrong way round and *zap*.
 
The joystick ports in the 128K Spectrums are the same pinout as the old Sinclair Interface 2 joysticks, which isn't totally unreasonable as that's the interface the built-in ports emulate.

The Interface 2 was the only commonly supported adaptor that supported two ports as well, so it makes sense to copy it instead of the other interfaces.
 
The joystick ports in the 128K Spectrums are the same pinout as the old Sinclair Interface 2 joysticks, which isn't totally unreasonable as that's the interface the built-in ports emulate.

The Interface 2 was the only commonly supported adaptor that supported two ports as well, so it makes sense to copy it instead of the other interfaces.

But the original Interface 2 used the Atari pinout. It's only the Amstrad +2 and +3 that changed the pin assignments.
 
My favorite is the "pizzabox" line of Macintoshes that were "DOS Compatible". Specifically the Quadra 610 DOS Compatible and Power Macintosh 6100 DOS Compatible. What made these especially fail-worthy was the fact that the power button is right next to the floppy drive.

The auto-eject floppy drive.

That doesn't have a physical eject button.

Which meant that a regular PC user who had just switched would be very likely to push the button next to the floppy drive when they wanted to eject the floppy. The button that instantly turns off the computer.


I remember doing that in a Mac lab at Ball State in the mid-90s!
 
The TI-99/4A was my first computer, and it had a well-known keyboard function that would cause headaches. When you were trying to output a '+' you hit 'SHIFT =', but the 'FCTN' key was right underneath 'SHIFT', and 'FCTN =' caused a warm reboot! I had this computer from when I was 8 in 1983, and once I started entering BASIC programs I'm sure I lost plenty of typing time to this flaw ;).

I won't get into any other flaws because this one affected me the most!
 
I was flummoxed as to why Commodore didn't use an external power supply for the 1541 drive. Also no track zero sensor (head banging) and the snail-slow interface. Why the BASIC 2.0 on the 64? Why not a cooling fan for the floppies on the TRS-80 Model III/4? Why no lower case or double-density on the TRS-80 Model I? Why 64 columns instead of 80 columns on the TRS-80 Model III? Why did Tandy do such a bad job on TRSDOS for Model I? What possessed Atari to use the awful keyboard on the 400, and why the flaky BASIC on the Ataris? Whose idea was it for the super-slow double-interpreted BASIC on the TI-99/4(A), and why the kibbled PEB?
 
I was flummoxed as to why Commodore didn't use an external power supply for the 1541 drive.
Well, maybe not a bad thing. There were enough cables and bits and pieces as it was!

Also no track zero sensor (head banging) and the snail-slow interface.
You got me there. Something to do with saving money I would imagine.

Why the BASIC 2.0 on the 64?
See above. Apparently Commodore had some kind of once-only payment license from Microsoft for BASIC 2.0, which meant they could keep using it at no extra cost.

Why no lower case or double-density on the TRS-80 Model I?
Easy. This computer came out in 1977. Lower case wasn't standard on most micros, and neither was double density. By the time they started to become so, the Model I was about to be retired in favour of the Model III... which had those things. Later Model Is had lower case (once l/c drivers were run).

Why 64 columns instead of 80 columns on the TRS-80 Model III?
For compatibility with the Model I?

Why did Tandy do such a bad job on TRSDOS for Model I?
You got me there. But it did open the door for some great 3rd party DOSes!

What possessed Atari to use the awful keyboard on the 400
To save money, and also because the Atari 400 was the games machine. It was possible kids with sticky fingers would be using it, and membrane keyboards are easy to wipe. The Atari 800 was the computer for serious stuff.

Tez
 
Also no track zero sensor (head banging) and the snail-slow interface.
As tezza suggested, the omission of a track-0 sensor was to reduce cost - drive mechanisms featuring one cost extra, and Commodore was all about cheap, accessible hardware in those days. The famously slow serial interface is a slightly more complicated story.

The original plan, back when the interface was being designed for the VIC-20, was to use the shift register on one of the 6522 chips to transmit and receive data a byte at a time with no work from the CPU, but apparently the people responsible hadn't heard about a fairly well-known bug in the shift register, and didn't discover it until a bunch of VIC boards had already been fabricated. Rather than replace the 6522s with the non-buggy 6526 chips, they opted to re-work the software to bit-bang the data manually rather than use the nice automatic transfer, making it over eight times slower! Unfortunately, when the C64 came around, they decided that, rather than change the software now that they had a machine with 6526 chips and change the drives to match, they'd stick with the VIC protocol so people could use their existing drives. Understandable, but a pain nonetheless :/
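
To make "bit-banging" concrete, here's a rough sketch in BASIC of what sending one byte a bit at a time looks like. The port address and bit assignments are made up for illustration -- they are not the real VIC-20/C64 serial-bus registers, and there's no handshaking here:

10 REM Send one byte, least significant bit first, by bit-banging
20 REM P is a HYPOTHETICAL output port: data line on bit 0, clock on bit 1
30 P = 49152 : REM illustrative address only (free RAM on a C64)
40 B = 65 : REM byte to send (ASCII "A")
50 FOR I = 0 TO 7
60 POKE P,(B AND 1) : REM put the current data bit on the line, clock low
70 POKE P,(B AND 1) OR 2 : REM raise the clock so the receiver latches the bit
80 B = INT(B/2) : REM move on to the next bit
90 NEXT I
100 REM With a working shift register the CPU would just write the whole byte
110 REM to one register and let the hardware clock the eight bits out itself.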

So to sum up: caused by uninformed engineers facing a hardware error, maintained by a focus on backwards-compatibility.
 
I was flummoxed as to why Commodore didn't use an external power supply for the 1541 drive. Also no track zero sensor (head banging) and the snail-slow interface. Why the BASIC 2.0 on the 64? Why not a cooling fan for the floppies on the TRS-80 Model III/4? Why no lower case or double-density on the TRS-80 Model I? Why 64 columns instead of 80 columns on the TRS-80 Model III? Why did Tandy do such a bad job on TRSDOS for Model I? What possessed Atari to use the awful keyboard on the 400, and why the flaky BASIC on the Ataris? Whose idea was it for the super-slow double-interpreted BASIC on the TI-99/4(A), and why the kibbled PEB?

The other points have mostly been covered, but Atari's BASIC was a result of their inability to cram Microsoft BASIC onto the cartridge. Atari purchased the Microsoft "8K" 6502 BASIC, but when they got the code it turned out to be more like 10K in size and was undocumented, and as the introduction date of the 400/800 loomed, Atari couldn't downsize it to fit onto the 8K cartridge ROM without crippling important features like support for graphics and sound.

So, they enlisted the help of Shepardson Microsystems, who had previously written Cromemco BASIC, and Shepardson wrote a new 8K BASIC from scratch for Atari. It's slow and has a few bugs, but it generally works OK, and it has full support for graphics and sound without having to resort to PEEKs and POKEs the way the C64's BASIC did.
 