Assmebly programming

The lack of memory protection on the 8086 didn't really matter, since with only a 1 MB address space there was no room for any decent multitasking OS.
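For anyone rusty on where the 1 MB ceiling comes from: a real-mode address is a 16-bit segment shifted left four bits plus a 16-bit offset, which the 8086's 20-bit address bus wraps at 1 MB. A quick sketch (Python, purely for illustration, not any particular toolchain):

```python
# Illustrative sketch of 8086 real-mode address formation:
# a 16-bit segment shifted left 4 bits, plus a 16-bit offset,
# wrapped by the 20-bit address bus at 1 MB.

def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # 20-bit bus on the 8086

assert physical_address(0x1234, 0x0005) == 0x12345
assert physical_address(0xF000, 0xFFF0) == 0xFFFF0  # near the top of 1 MB
assert physical_address(0xFFFF, 0xFFFF) == 0x0FFEF  # wraps past 1 MB on an 8086
```

Note that many segment:offset pairs alias the same physical byte, which is part of what made segmented code so awkward to reason about.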

There was MP/M-86, Concurrent CP/M, and Concurrent DOS.

Aside from PCs, the x86 line was only used in rather obscure PC-lookalikes such as the Tandy 2000 and Sanyo MBC-550.

The NEC APC, which morphed into the PC98 line, and the IBM Displaywriter were two notable mainstream examples. I don't think anyone ever considered the 8086 for a game machine. The Mitsubishi Multi-16 preceded the 5150 by a few months and sold well in Japan. Columbia, Eagle, TeleVideo and Xerox all had 8086 systems out during this time.

The x86 line was really the only choice for the IBM PC, since in 1980-1981 the 68000 and Z8000 were too new and not ready for use in a production machine.

Actually, the Z8000 and 8088 came out the same year. One thing that worked in Intel's favor was the availability of second sources (which Motorola was not willing to allow) and the bundling of existing peripheral chips with CPU sales, which Intel was happy to do. I think Intel initially made more money from its sales of peripheral chips to IBM than from the CPUs.

The Z8000 had only AMD as a somewhat confused and unwilling partner, together with Siemens (the venture was called AMC). Zilog was not doing well in 1979-1981; they'd lost Ralph Ungermann in 1979 and Federico Faggin in 1980 and were hemorrhaging money. AMC didn't last long--Siemens became frustrated and pulled out (they ended up using the 8086 in their PLCs).

Similarly, Intel had software support for the 8086 in place from the start--the first coding I did was assembled on an MDS-200 8-bit system. They also had an 8080-to-8086 converter--the first sample job I submitted at the local sales office took 7 hours to translate about 3000 lines of 8080 assembly. It was slow, but it was there.

One could not say the same thing for Zilog or Motorola.

I agree that if IBM needed to introduce a system in 1981 and use low-cost commodity parts, the 8086 was the only 16-bit system that could practically be deployed. But I had the suspicion that IBM corporate didn't really take the project seriously for quite a while and that the sales, even in the face of the Apple Mac that came along a couple of years later, surprised them.

So the question for me boils down to "Was there a compelling reason for IBM to introduce the 5150 in 1981?"
 
I started off using UCSD Pascal running on a PDP-11. 64k code; 64k data; segments don't scare me. The 8086 did all that at a nice cheap price.

The 16-bit versions of OS/2 were quite stable. I guess having functions to handle cross-segment addressing worked better than trusting programmers to do the math correctly.
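To illustrate the kind of math those helpers took out of programmers' hands, here's a hypothetical real-mode sketch (Python, not OS/2 code): adding to the 16-bit offset alone wraps silently inside a 64 KB segment, while correct "huge" arithmetic carries the overflow into the segment value. (In protected-mode OS/2 the segment is a selector and the shift trick below doesn't even apply, which is exactly why you wanted library functions rather than hand-rolled math.)

```python
# Hypothetical illustration (real-mode 8086 style, not actual OS/2 code)
# of why cross-segment pointer math was error-prone.

def naive_add(segment, offset, n):
    # Arithmetic on the offset only: wraps at 64 KB, silently pointing
    # back near the start of the same segment.
    return segment, (offset + n) & 0xFFFF

def huge_add(segment, offset, n):
    # Carry the overflow into the segment (16 bytes per paragraph),
    # the way "huge" pointer helpers normalized addresses.
    linear = (segment << 4) + offset + n
    return (linear >> 4) & 0xFFFF, linear & 0xF

# Adding 0x20 bytes near the top of a 64 KB segment:
assert naive_add(0x1000, 0xFFF0, 0x20) == (0x1000, 0x0010)  # wrapped!
assert huge_add(0x1000, 0xFFF0, 0x20) == (0x2001, 0x0000)   # carried correctly
```

The failure mode is nasty precisely because the naive version returns a perfectly valid-looking pointer.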
 
There was MP/M-86, Concurrent CP/M, and Concurrent DOS.

What I should have said was that there wasn't enough memory to multitask major applications, not that a multitasking OS wasn't possible. You could add Windows 3.0 to the list of multitasking OSes that run on an 8086, but you can't really do much with it unless you have a 286.

The NEC APC, which morphed into the PC98 line, and the IBM Displaywriter were two notable mainstream examples. I don't think anyone ever considered the 8086 for a game machine. The Mitsubishi Multi-16 preceded the 5150 by a few months and sold well in Japan. Columbia, Eagle, TeleVideo and Xerox all had 8086 systems out during this time.

As I said, those are all pretty obscure machines compared to the Mac or Amiga. Most would fit into the category of PC workalikes that ran DOS or CP/M-86. Although they usually offered better hardware than the IBM PC, the lack of compatibility killed them. People came to believe that any x86 machine that ran DOS should be IBM-compatible.

I'd think one look at the 8086's segmented addressing was enough to scare anyone away from using it in an arcade machine or a game console. The x86 line did nonetheless see use in that area: on the Xbox long after paged memory had arrived and 64k segments were a thing of the past.

Actually, the Z8000 and 8088 came out the same year. One thing that worked in Intel's favor was the availability of second sources (which Motorola was not willing to allow) and the bundling of existing peripheral chips with CPU sales, which Intel was happy to do. I think Intel initially made more money from its sales of peripheral chips to IBM than from the CPUs.

From what I've heard, the 68000 did not yet have a complete set of peripheral chips in 1980. It would have also necessitated IBM's using all 16-bit components, which were rare and expensive at the time. For that reason, they finally settled on the 8088, which allowed the use of cheap, readily available 8-bit components.

The Z8000 had only AMD as a somewhat confused and unwilling partner, together with Siemens (the venture was called AMC). Zilog was not doing well in 1979-1981; they'd lost Ralph Ungermann in 1979 and Federico Faggin in 1980 and were hemorrhaging money. AMC didn't last long--Siemens became frustrated and pulled out (they ended up using the 8086 in their PLCs).

Unlike the 8086 and 68000, the poor Z8000 was condemned to obscurity. It was used mainly as an embedded processor and in workstations. Some of Namco's arcade games used them as well.

I agree that if IBM needed to introduce a system in 1981 and use low-cost commodity parts, the 8086 was the only 16-bit system that could practically be deployed. But I had the suspicion that IBM corporate didn't really take the project seriously for quite a while and that the sales, even in the face of the Apple Mac that came along a couple of years later, surprised them.

There was supposedly an IBM spokesman who said in 1983 or 1984 that (and I'm paraphrasing) "When we started, we expected to sell at most 100,000 PCs. Now we're selling 100,000 a month."

If that's true, it would work out to a little over a million PCs a year, which is not an unrealistic figure considering nearly that many Commodore 64s were sold per year at their peak in 1984-1986.
 
There was supposedly an IBM spokesman who said in 1983 or 1984 that (and I'm paraphrasing) "When we started, we expected to sell at most 100,000 PCs. Now we're selling 100,000 a month."

That would fit. The 5100 from a few years earlier was a huge flop for IBM, and the idea that IBM was going to sell a system built from nothing but commodity components and running third-party software must have been a major culture shock to the suits at IBM. Everything about the 5100 was IBM, right down to the CPU and the peripherals.

The introduction of the PC RT and MCA systems shows that IBM had a hard time shaking the NIH mindset.
 
Everything about the 5100 was IBM, right down to the CPU and the peripherals.

The 5100 was intended for the engineering and scientific markets (as evidenced by its $10,000+ price tag) and never meant to be a general-purpose home or office computer. It was made entirely out of custom hardware in traditional IBM style. Lee Felsenstein (the designer of the Sol-20 and Osborne) said, "My experience was that whenever you found IBM parts in a junk box, you threw them away because they were all custom jobs and you couldn't find any information about them."

The introduction of the PC RT and MCA systems shows that IBM had a hard time shaking the NIH mindset.

The MCA was a well-meaning idea as the 386 had just come out and needed a 32-bit bus for optimum performance. Making companies pay IBM a licensing fee to use it was not a bright idea (Tandy and NEC were the only takers), but then the open-standard EISA bus proved no more successful than the MCA was. When the PS/2 line came out, IBM also foolishly tried to demand that all clone makers pay them retroactively for using the ISA bus (sound of crickets chirping).

Even the original IBM PC had a nonstandard parallel port and also introduced the cable twist to set drive letters (instead of using jumpers).
 
Was IBM the first with the cable twist? I thought that was pretty clever--it allowed the system to control the drive motors individually--you can't do that with a "flat" cable.
 
Twisted Cable...explained!

Was IBM the first with the cable twist? I thought that was pretty clever--it allowed the system to control the drive motors individually--you can't do that with a "flat" cable.

Alright Chuck...you set yourself up :wink: ! I have heard this story so many times, and now is a good time for a great master to give us a plain explanation as to what the "twisted cable" is all about.

Thank you in advance

ziloo
 
No, now would be a good time to resist going off topic in a thread labeled 'Assembly programming'.

;-0
 
Alright Chuck...you set yourself up :wink: ! I have heard this story so many times, and now is a good time for a great master to give us a plain explanation as to what the "twisted cable" is all about.
Thank you in advance

I'll post in another thread a bit later today.

No, now would be a good time to resist going off topic in a thread labeled 'Assembly programming'.

You didn't read the thread title, did you, Mike? ;)
 
I did notice the spelling, but that's what contributes to the character ...
 