
What if IBM didn't choose the Intel CPU!

gerrydoire

Veteran Member
Joined
Aug 25, 2008
Messages
1,145
Food for thought, useless but hey!

What if IBM had chosen the 6502 back then? Where would Intel be today? Where would the PC industry have gone, and where would the company that held the patents to the 6502 be today?

Since most other computers back then used a 6502-type CPU, it would have been interesting.

:)
 

NobodyIsHere

Veteran Member
Joined
Dec 21, 2006
Messages
2,394
Hi

IBM did make a 68000-based PC-like system in the early 1980s:

http://en.wikipedia.org/wiki/IBM_System_9000

It was a system for lab instrumentation but had a lot of PC characteristics.

Ironically, it has many of the characteristics of the "what if" IBM system you sometimes hear discussed.

It used the 68000 CPU with large VERSAbus based boards (the predecessor to VME)

http://en.wikipedia.org/wiki/Versabus

I think had IBM pursued a lower cost version of it instead of the PC it would have crushed all competitors including Apple.

Thanks and have a nice day!

Andrew Lynch
 

krebizfan

Veteran Member
Joined
May 23, 2009
Messages
5,430
Location
Connecticut
Motorola couldn't build enough 68000 CPUs to meet the limited volume requirements of Apple, Atari, Commodore, Apollo, etc. I doubt they could have quickly ramped up production to meet the huge numbers of PCs and clones that got sold. A 68000 based IBM Workstation would have been an expensive flop especially in 1981.

The 8-bit CPUs were by 1981 starting to show their limits. So if IBM used one, the 1984 refresh would have had to use a different CPU architecture just to remain viable. The whole PC -> AT -> 386 progression would not have happened and PC deployment might have stalled when all existing software had to be tossed.
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
39,441
Location
Pacific Northwest, USA
Used as we are to Intel pushing the limits of integration, the 68K was a huge gamble, using high-density short-channel NMOS (HMOS). I suspect that yields were pretty low and that the usual second sources weren't ready to take the gamble.

I think it was WESCON where I got my hands on 68K documentation--this was the same WESCON where TI was giving away samples of the PACE, complete with documentation (Moto wasn't giving away samples of the 68K). Fairchild didn't want to talk to you about the 9440 unless you were in the defense industry, etc. The 8086 and Z8000 had been out for a while and got mostly yawns as being souped-up versions of the 8-bit CPUs.

The 68K was exciting--large, linear address space, generous in the register department, a simple instruction set and--this one got everyone's notice--an asynchronous bus, in addition to "user" and "supervisor" modes of operation. That is, there is a "handshake" between the CPU and addressed devices, including memory. An address must be answered with an acknowledge that the addressed device actually exists. No acknowledge--you get a bus error trap on the CPU that throws you into the privileged supervisor mode.

What this meant was that a device or memory address not present would cause an error exception. Can you say "virtual memory"? One of the most-asked questions at that WESCON was "Can I use this to implement virtual memory?" In other words, if I get a Bus Error interrupt, is there enough information present for me (the supervisor-mode executive) to restart the instruction that caused the error once I remedy the error condition?

In short, "Can I run Unix on this chip?" (Usually meaning 4BSD) was a big item of interest.

Sadly, the answer was "not really--there are some instructions (mostly memory-to-memory) that cannot be restarted with the information provided by the BERR interrupt." I think that cost Motorola dearly--and they remedied the situation later in the 68K family. Some vendors (Apple, Apollo) went ahead anyway. Apple simply proscribed use of the instructions that couldn't be restarted; Apollo on their Domain workstations ran two 68000s; one that ran slightly ahead of the other--when the first got a bus error, the condition could be satisfied before the second CPU encountered it. Clever, but expensive.

The only other plausible contender at the time was National Semiconductor, which contracted out the design of a very advanced CPU, the NS32000 series. But the story was always "Real Soon Now" and the thing missed the boat completely by coming out too late--it later found embedded use in things needing lots of CPU, such as laser printers. And National didn't have the best track record for support--it had lost interest in the SC/MP, then the PACE, and the new CPU might well be next. In the digital world, NS was always better at second-sourcing CPUs.

Intel, at the time, was involved in its own "Real Soon Now", the iAPX 432--a hugely expensive "next generation" CPU chip set. The 8086 was initially intended only as a stopgap "bridge" product until the really serious people jumped on the 432 bandwagon.

So when IBM came out with the 9000 lab computer, you could almost taste the hope that the yet-to-be-announced Personal Computer would use it. But that was extremely unlikely--IBM had already deployed the 8085-based Datamaster systems, and the Displaywriter was 8086-based. That the PC would use Intel bits and pieces was never really in doubt--Intel could supply chips in quantity, and I suspect IBM got a real sweetheart deal by incorporating other Intel chips in the contract. In fact, the early 5150s didn't use a NEC 765 floppy controller, but rather the Intel-labeled clone--the 8272. Intel and NEC were involved in a number of cross-licensing deals--I've heard that Intel couldn't get the 8272 to work and so let NEC complete the job in exchange for a manufacturing license for NEC's advanced display controller.

I don't think that the 6502 was even in the running. How would IBM have competed against the likes of Apple and Atari with yet another 6502 design? In particular, IBM initially had no interest in developing software for the PC. Trot the thing out, make the internals public right from the start, offer development tools, and stand back to see what happens.
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
39,441
Location
Pacific Northwest, USA
I remember getting a reaction of utter disgust from an Intel engineer to the effect of "we give you a memory-managed 32-bit state of the art design and the software people piss it all away running DOS".

I couldn't really argue with him.

The 65816 didn't come out until too late--1984.
 

yuhong

Experienced Member
Joined
Mar 2, 2010
Messages
333
I remember getting a reaction of utter disgust from an Intel engineer to the effect of "we give you a memory-managed 32-bit state of the art design and the software people piss it all away running DOS".
And the sad thing is that MS was to blame. MS turned what was originally OS/2-386 into an outright fiasco, even attacking 32-bit OS/2 later on--and MS was the one that originally shipped the MS OS/2 2.0 SDK to developers back in 1990!
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
39,441
Location
Pacific Northwest, USA
Yeah, Microsoft really screwed with the OS/2 developers. When the pre-orders for the new OS/2 SDK came out (about $3K IIRC, but my mind may be playing tricks), Microsoft shipped NT 3.1 SDKs instead, until it was threatened with a lawsuit to either refund in full or supply the real thing. Almost right up to the day that MS announced NT, BillG was still making smoochies with Cannavino, telling the world how committed MS was to OS/2 and how wonderful the world would then be.

To their credit, IBM shipped out the new SDK for OS/2 2.0 for just a modest upgrade fee if you already had the 1.x kit.

(cf. Section VII of "The OS/2 Notebook", (c) 1990 by Microsoft Press)
 

NeXT

Veteran Member
Joined
Oct 22, 2008
Messages
6,884
Location
Kamloops, BC, Canada
An MC68000 would have made the 5150 far too expensive. A 6502 was a bit of a weakling in IBM's eyes (and for the most part was a "hobbyist" CPU found in inexpensive home computers), but the Zilog Z80 would have been an excellent alternative. Right off the bat they would have had CP/M and its software.
Potentially, this could have completely changed history, because unless Microsoft ported DOS to the Z80, CP/M would have had the advantage instead of the other way around, as it is now.
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
39,441
Location
Pacific Northwest, USA
...because unless Microsoft ported DOS to the Z80, CP/M would have had the advantage instead of the other way around, as it is now.

...and you'd be stuck with 16-bit address buses, bank-switching for more memory, small I/O address spaces, and a world filled with competitors doing exactly the same thing. Let's not forget that in the early days of DOS there wasn't a whole heckuva lot of difference between it and CP/M--and in view of CP/M-86, DOS 1 was inferior in API support. Interestingly, if you're running 32-bit Windows, you can still write 16-bit programs that place many system request ordinals in CL and long-call location 5 in your TPA, er, PSP--just like 8-bit CP/M. CP/M-86, by contrast, went straight for a program interrupt--and a better choice of one--224. IBM had already done an 8085 box and obviously wasn't convinced that it would capture any market as a home computer.
 

NobodyIsHere

Veteran Member
Joined
Dec 21, 2006
Messages
2,394
Hi
IBM was a major chip manufacturer at the time. If they had needed to second-source the MC68000 from Motorola to meet production demands, I am confident they could have negotiated a good deal similar to the one they had with Intel at the time. IBM had enormous resources that dwarfed anything that Apple, Amiga, Atari, Apollo, Sun, etc. could ever hope to muster. Production capacity can be ramped up over time; with enough resources, I think they could have done it easily.

Instead of the plodding development of the Intel 8088 -> 80386, with all its broken architecture, I think we would have seen an accelerated 68000 -> 68020 -> 68030 progression and beyond on a similar time line, if not sooner. The key would have been adding an MMU into the mix early. The major difference is that IBM would have used VERSAbus instead of the awful 8-bit ISA bus on the PC. The next logical progression would have been VME, which is highly suited to the MC68000 and a much better bus than the 16-bit AT ISA bus could ever hope to be. Modern VME bus systems are still in production and, with the middle connector, are at least comparable to PCI if not better.

It is all speculation in any event, but I maintain IBM really screwed up by not adopting the 68000 when they had the chance. Put a decent Sys V or BSD derivative OS on it (Xenix or AIX) and it would have smoked any competition. I think IBM lost 10 or more critical years trying to remediate the deep brokenness of the IBM PC legacy architecture. Even today, its effects are still lingering in the background. We'll probably never be completely free of them unless we migrate to ARM or something equally drastic.

Instead, Apple semi-led the adoption of the 68000 and so screwed it up with the Mac that it basically killed the architecture. I had a bit of hope for them after they fired Jobs and released the Mac II line, but they managed to kill that too. And the PPC clones. It's amazing to me that Apple is even still in business after the mess they made in the 1990s.

Just think, we could have started with 6U chassis VME based 68K CPU boards with full 32 bit support from day one. I think it would have been a better world. Oh well, I guess it was never meant to be.

Thanks and have a nice day!

Andrew Lynch
 

yuhong

Experienced Member
Joined
Mar 2, 2010
Messages
333
Yea, that was bad enough, but the really bad one (and definitely much worse than the JDA) is the "Microsoft Munchkins" and other unethical attacks on 32-bit OS/2 later on that I mentioned in my blog post.
 

krebizfan

Veteran Member
Joined
May 23, 2009
Messages
5,430
Location
Connecticut
Just think, we could have started with 6U chassis VME based 68K CPU boards with full 32 bit support from day one. I think it would have been a better world. Oh well, I guess it was never meant to be.

IBM tried that with the 9000 lab computer. Not a success. Would the potential benefits of 32 bits have been worth paying an extra $4,000 in 1982? (Comparing dual-floppy systems with 128kB of RAM: roughly $6,000 for a stripped 9000 versus roughly $2,000 for an expanded PC.) The magic price point for a PC was $1,000, which would have been difficult to achieve as long as Motorola priced the 68000 higher.
 

commodorejohn

Veteran Member
Joined
Jul 6, 2010
Messages
3,162
Location
California, USA
By what logic? It's not like the PC won out because of design superiority (*cough*640KB*cough*). And it wouldn't have been an issue of compatibility, because there wasn't anything for it to be compatible with in the first place. x86 wasn't even close to the juggernaut it became later on; the only things that used it at the time were some of the million zillion different microcomputers that got squashed by the PC anyway.
 

Unknown_K

Veteran Member
Joined
Sep 11, 2003
Messages
8,559
Location
Ohio/USA
The IBM PC was a hit because it was cheap (for IBM, anyway), sold by IBM (which gave the cheap business computer legitimacy with the Fortune 500 of the day), completely open and documented with off-the-shelf parts, and backed by a huge corporation that knew how to provide support and that you knew wasn't going away in a few years. It wasn't really a superior design, just an open one. And once IBM started making a killing selling open designs, they decided to come out with the PS/2 series of proprietary, expensive stuff and went downhill from there.

It's not like home users were buying this thing, and the PC Jr was a flop because IBM knew corporate needs, not home users'.
 

krebizfan

Veteran Member
Joined
May 23, 2009
Messages
5,430
Location
Connecticut
By what logic? It's not like the PC won out because of design superiority (*cough*640KB*cough*). And it wouldn't have been an issue of compatibility, because there wasn't anything for it to be compatible with in the first place. x86 wasn't even close to the juggernaut it became later on; the only things that used it at the time were some of the million zillion different microcomputers that got squashed by the PC anyway.

640kB was a fine amount of memory. I remember reading a SHARE article saying that the typical 1980 mainframe had 512kB of RAM. PC lines were expected to run about 5 years and then be replaced with something new. Would anyone have expected that any PC would need more RAM than a 5-year-old mainframe?

IBM needed a CPU that was cheap enough that IBM could sell 250,000 units but also sufficiently more capable than existing 6502/Z-80 machines that users would be willing to switch. The CPU also needed a complete set of working support chips and software that could be quickly introduced. AMD, National Semi, and Zilog all had chips that were running into difficulties. Motorola had the 6809 but IBM hadn't been working with it before. Other chips, including IBM's own designs, would have cost more than IBM's planned price for the complete system. It was pretty much either 808x or do normal IBM development and get some form of very expensive workstation that ships in 1986 at the earliest.
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
39,441
Location
Pacific Northwest, USA
Intel did offer a more-or-less complete supply of chips (except for the video controller), so they offered the easy way. A 16-bit bus right out of the box would have been highly desirable--and other manufacturers had done just that.

The Zilog Z8000 was certainly a possibility--AMD had begun second-sourcing it and even set up a cooperative venture with Siemens (called Advanced Microcomputers "AMC"), but Zilog lacked the depth in the support chips. The 68000 was similarly crippled by lack of full-speed support chips.

Who knows--had IBM kicked the idea around another year or two, the choices might have been quite different. But to be very clear--IBM was not the first vendor to offer an 8086-based personal computer.
 