Thoughts on minicomputer to microcomputer transition (PDP to Macintosh)

voidstar78

I had in mind a kind of "mural" of the origins of the personal computer. Attached below is a concept/prototype of that thought.

[Attached image: origin_of_the_PC_small.jpg, a concept mural of the origins of the PC]

I don't intend to debate which computer was "first." Just like cars, airplanes, pianos - any engineered product - there was a series of prototypes and refinements, and eventually a final general form emerged. I once saw a drawing of the first steam-engine train in the 1700s, some kind of tractor-looking thing that didn't even have a track to run on yet! Was it really a train? And to me, in a way, I kind of consider the upright piano the first personal computer - it has a keyboard for input, the sheet music is the software, and the duration of the string vibrations is the memory. In any case, "first" isn't that important and is very subjective.

And there was certainly a lot of pioneering hardware and software work prior to the 1970s that led up to this point of a "personal digital computer." A computer, of course, "computes" - but a computer has become more than just a numerical calculator. Computers can execute programs, and the concepts of expressing a program and debugging it were pioneered throughout the 1950s (FORTRAN, COBOL, and system hooks to let you step through the instructions).

To me, the "personal" part of the PC means: (1) it can be bought/financed by an individual, (2) it can be installed/set up by an individual, and (3) it can be used/operated by an individual.

The PDPs were a bit before my time, so I don't know that much about them. I've seen the PDP-1 in operation, and once one was purchased and set up, I can see it being operated by an individual. So I can certainly agree it is a form of personal computer - but practically speaking, I don't think it could have been afforded (and then maintained) by an individual. I've seen images of PDP-8s fitting in the back seat of cars, so in a sense they are portable. And across the 1960s, it seems the price of the PDPs went from ~$120k down to ~$10k (depending on accessories and options, I'm sure).


Then there is the Wang 2200/HP9830/IBM 5100... In parallel are the Kenbak-1, Micral, and Altair. I draw a line between these systems (and the Kenbak-1) marking the difference between "classic" TTL and "off the shelf" components. Of course, that distinction isn't exactly clear cut. But view it this way: we can make modern replicas of the Altair, but we would struggle to make a replica of the IBM 5100 (and its famous "tincans"). And there are emulators of the Wang 2200 (I think only as far back as the "B" model), but I don't think anyone is up for replicating its proprietary processor.

And the Micral N is interesting, as it was developed in France by a Vietnamese-born engineer. Remember, Vietnam was a French colony for decades, and there is a lot of French cultural influence in Vietnam. My understanding is that it was the French who "helped" the Vietnamese transition from traditional Asian-style characters to more Latin-style lettering. [Interesting side note: the first photograph is said to have come from France.] Did the concept of the Micral N make its way over to the folks at MOS? Or, rather like the light bulb and the airplane, were they independently developed in a couple of places at nearly the same time?

The rest of my mural - the KIM-1, Altair, Apple, and eventual IBM PC - is all fairly well known and discussed. I don't mean to downplay the significance of the Altair - my view is that the Sol-20 represents the epitome of what an Altair kit could become (many of those accessories weren't readily available, since they just couldn't be built fast enough or had quality issues). My only real issue with the Sol-20 is that BASIC had to be loaded; it wasn't built in. That's understandable for a 4K system (especially one designed more as a terminal frontend), but I think it does mark a distinction between "still a kit" and "a desktop appliance" that anyone can sit down and use right away.

Then of course the 1977 Trinity. 1978 was largely about fulfilling orders for the Trinity, and a mad dash toward making an affordable disk drive system - which was starting to become fairly standard by 1979/1980. Even by 1982, 64KB systems were still common. But then IBM changed all that, with a segmented 16-bit architecture and a simple OS that made it all seem seamless.

I saw a video once of the Commodore 64 production line (in Germany), which to me seemed very sophisticated for the early 1980s. I'm not sure if the PDPs, Wang, or IBM 5100 had that kind of large-scale production formality (clean rooms and automated testing) - maybe someone here has insight about that? Plus, how many pre-1977 systems were "hand-built"? (The Apple-1 for sure was hand built, with no automated assembly line.)

It was the process of getting to a single-mainboard design that made large-scale production possible. That required some ingenuity, but it also required the industry to scale up and make the necessary chips available in quantity.

And I could have stopped at 1982... but I wanted to show the 1984 highlights: the Tandy 1000 and the Macintosh. Although, IIRC, Macintosh sales struggled compared to the ][gs? But the point of including those is that the "meta" of PCs had been established, and that overall desktop design has stood for 40+ years (detached keyboard, central CPU box with expansion cards, and a video display). From 1984 on, the industry explodes outward with many refinements on that formula (especially in portable PCs).


So in terms of "first personal computer": if you needed to make some machine to control another industrial machine, the Micral N may have been able to do it (with recognizable instructions and external I/O; I'm not sure if any were sold internationally, so you'd probably have to read the operating manual in French). But for a system that your mother could sit down at to store and recall recipes - well, a typical home couldn't fit a PDP. And a Wang or IBM 5100 might be up to that task, except at $7500+ they also weren't very affordable - and I don't recall coming across any verified production numbers for those early systems.

The IBM 5100 is sort of the "last of an era" in that it is like all the components of a previous decade's mainframe packaged up into a desktop unit. But the 5100 has all the merits of, say, the Commodore PET: built-in screen, keyboard, and storage device, and it can be programmed within a generous 64KB space.


Lastly, I wanted to note VisiCalc in 1979 - a very pioneering "killer app" that really woke things up in terms of what these microcomputers could be used for (even with only 16KB, they could keep track of local team scores and other small-group needs). There was some digital music production going on, and some industrial automation work (like controlling lights in a theater production), but image processing hadn't yet come of age. With the line printer replaced by a CRT, the "visible calculator" (VisiCalc) is still extremely useful to this day. Then it was on to accounting, payroll, patient databases - all the stuff big mainframes had been doing for insurance companies, now software that small companies could enjoy.

There are lots of systems missing (CoCo, Amiga), but I'm wondering what folks think of this "mural"?
 
Of course, these machines are all from your side of the pond.

We had the Sinclair computers (in various guises, from the MK14 through the ZX series and then the QL), and the BBC computer (and then the Electron, etc.). The BBC computer, being targeted at schools, inspired a whole new generation of computer engineers.

I suppose the trouble is, your mural will get rather large and cluttered unless you define your boundaries carefully.

Dave
 
In terms of usage, the LINC computer was intended to be directly operated by one person. There was even an example of it being used at home. It was way outside your affordable-by-an-individual criterion, though; it was "personal" at the university/lab/business level.
https://en.wikipedia.org/wiki/LINC
 
Back in 1980 or so, I didn't think of Commodore as the third brand, I thought of Atari. It wasn't until the C64 that I accepted them as having surpassed Atari. At the very least Atari needs to be in there.

It's still a US-centric view, though, at least the post-1980 collection. I was always astonished that they got along for so long with cassette storage over in Europe with their little doorstop wedges. (Yes, I know not all were wedges, there was the BBC, but there were a lot of "toy breed" computers, especially from Sinclair.)

EDIT: and the Z80 needs to be there with the 6502. While I may love the 6809, it wasn't nearly as important as it deserved to be.
 
There were minicomputers before the term "minicomputer" was coined. Consider, for example, the PB250 or the CDC 160A. Similarly, "microprocessor" was often used to refer to the part of a machine that executed microcode. And it seems everyone has forgotten the "midicomputer" (yes, that was a thing).
 
Thanks for the notes! Agreed it is rather "US-centric"; maybe one of the Acorn systems could be squeezed in.

The Z80 is represented by the TRS-80. But, IIRC, Zilog came about as a kind of "rebellion" against Intel (ex-Intel engineers going off on their own)? Which is also sort of what MOS was (ex-Motorola employees)? I did want to squeeze in a CoCo, but then it's awkward how the TRS-80 line morphed from being Z80-based to 6809-based.

I never had an Amiga (couldn't afford one), but I think they were from 1985 on?

Didn't Jack Tramiel (of Commodore) end up buying Atari? Or did Atari morph into the Activision software company? I recall the Intellivision and ColecoVision, but (IIRC) they had no BASIC or weren't reprogrammable, so not quite PCs. To me, Atari was always more of a software-focused company. But you're right, I think their early hardware work (1972-ish) inspired the Apple-2?

One of the first things my grandmother used a PC for was family genealogy records. A crisp CRT makes that viable for writing the long descriptions for entries. Then my father ran his pool service on a CoCo 2 - just an example of the type of small business that could be run from one of the "attaches-to-a-television" type PCs (though he had to upgrade equipment by the mid-1980s).

Throughout the 1970s, I can see how "normal" people couldn't quite see what a computer could be used for. I recall some statement along the lines of an IBM executive saying "the world only needs about 5 computers" - which is interesting, because we've sort of come full circle on that: we have a few mega data-centers, and we have enough bandwidth these days that we can stream our processing (even gaming) from those data-centers. (Personally, I still do a lot of local image processing for astrophotography work, where a single image is composed from over 300GB of raw data.)

Based on archived articles at LO*OP, there was explicit concern about the use of computers throughout the 1970s - it seems there was some wisdom there that foresaw some of the issues we're seeing now in things like online social media and "mysterious accounting" by agencies. But then there were late-1970s initiatives by the BBC (and similar) to give PCs a more "friendly" and approachable face. And once the price went below $1000, that was the threshold where they could be obtained for hobbies.


Anyhow, I'm still trying to understand the PDP-8 vs. PDP-10/11/12 better. It seems to me the "core" of a PDP was its processor, in a container a bit taller than, say, the Altair 8800, with the rest of the cabinet space taken by additional accessories like memory or I/O cards? The OS and compilers have to be stored and loaded from somewhere. Conceptually, could one run a multi-line BBS on a PDP? (I'm not sure if anyone ever did.) But still, a base PDP was at least $10k (originally) - not too practical for hobby/home use.

Before cloud computing really took off, I imagined that a new business might be homes having their own mini data-centers installed - imagine a kind of mini-fridge-sized device holding a few hundred processing cores and 20TB or so of archived media (family photos and such). So maybe cabinet-style computing might yet make a comeback :D The big work now is transcoding - locally we make a 4K HD video, but we want to serve it in a more device-friendly format (and only do that conversion once).
 
Bottom line for the "personal computer revolution" is the development not of microprocessors, but of communications. There's not much fun playing against yourself if you're into games. Once telephone lines and modems came within the range of affordability, the revolution was on. Before that, PCs were still niche products when it came to everyday work.
 
I think an important sideline to the development of computers was the development of high-end calculators. Calculators provided the volume necessary to push out lower-power technologies like CMOS, LCDs, and large-capacity ROMs. Calculators also proved that there was a sizable market for standalone systems, instead of the Minitel-style dreams of home terminals. Having some of the smaller calculator companies switch to making early microcomputers when the calculator industry consolidated might make them even more relevant.
 
Bottom line for the "personal computer revolution" is the development not of microprocessors, but of communications. There's not much fun playing against yourself if you're into games. Once telephone lines and modems came within the range of affordability, the revolution was on. Before that, PCs were still niche products when it came to everyday work.

I guess it depends on how you define "niche", Chuck. Anyone who wrote documents as part of their job (I was one of them), be they creators or typists, found word processing via computers a boon to everyday work. It was a revolution to ditch the pencil (and numerous hard-copy drafts) and have an electronic copy you could store, re-edit, and cut and paste from. This was before the communication potential of computers became mainstream.

Tez
 
Anyone who wrote documents as part of their job (I was one of them) be they creators or typists, found word processing via computers a boon to everyday work.

I would second the observation that word processing may well have been *the* killer app that took computers from things people used for "number crunching" into the mainstream appliance category. You didn't need to be an engineer or accountant to see the value proposition; it turns a computer into a useful tool for just about anybody who needs to generate or manipulate written communications. I'd also throw "database management" out there, although I would add the asterisk that one of the biggest uses I remember for early database software was storing mailing address information to be consumed by word processor mail-merge functions.

Obviously the "Information Revolution" as a whole has relied on computers gaining the capability to communicate, but to me it feels like that wasn't totally mainstream until the 1990s, when the Internet started supplanting the old private online services. (And the $5+ an hour connect charges for such...)
 
Depends on what field you were in. POS was becoming a big deal, as was the standard business suite (AP, GL, payroll, inventory...). If you were in finance, the killer app was the spreadsheet - and that drove quite a bit of the market development before communications got to be a big thing. But even at a company that made PCs, one of the standard bits of kit was a Model 32 ASR TTY for handling Telex and TWX. Police departments still used teleprinters. There was also a small but growing market in automation (PLCs, for example).

When someone buys a computer nowadays, it's for communications (phone, web, etc.) In fact, it's communications that makes much of modern banking (e.g. credit cards) possible. When is the last time you saw a counter check?
 
Anyone else remember that 15 minutes around 1990 where they were pushing the line that the one true and best use evar for a personal computer was as a FAX machine? You couldn't swing a dead cat without knocking over a 9600/2400 fax/modem card.
 
Bottom line for the "personal computer revolution" is the development not of microprocessors, but of communications. There's not much fun playing against yourself if you're into games. Once telephone lines and modems came within the range of affordability, the revolution was on. Before that, PCs were still niche products when it came to everyday work.
I have to disagree also. The average PC user wasn't on the Internet until the mid 90s. Dialing up BBSes was for nerds. Lots of games were "multiplayer" with just a second controller.
 
Still have our FAX that takes thermal paper. And a box that can receive FAXes that sits between a parallel interface printer and a PC. Uses a V40 inside and has a 1.44M floppy drive.

But the world of microcomputers was a lot wider than the home market - and that's my point.
 
Ultimately that's the thing with the term "Personal Computer", it means whatever you bring to it. Computers are an enabling technology for all sorts of useful AND useless mischief, and really the only thing that separates a "Personal" computer from what came before is that "personal" computers were cheap enough that a mere mortal could lay hands on one without filling out a corporate PR or filing a grant request.

Much of what we do with computers today in our daily lives was at least conceptualized and demo'ed in some form by the mid-1960s; the problem was that the hardware to do it cost a few million bucks. (Inflation adjusted... or sometimes not.) If your definition of a personal computer is that it has to be able to do hypertext and video conferencing for a price that makes it practical for grandma to own one, then the PC wasn't invented until the late 1990s. But of course that's a really silly place to set the bar. Even the most brain-dead little 8-bit microcontroller can be massively enabling technology in the right hands, if having it lets someone spend just a few hours writing software instead of days/weeks/months implementing some arcane process control system for... whatever they're interested in, be it business or, well, personal. A simple computer like this might not look like a "personal computer" to someone who sets the minimum bar at something they can sit down at and do (insert task here) without needing to crack a manual, but it certainly was a revolutionary thing in the right hands at the time. So... yeah, I don't think you can even remotely draw a line and say that "this is the point at which personal computers became 'real'".
 
I was reading through the Altair 8800 operator's manual from 1975. At the very end it has this:
[Attached image: 1663649308414.png, the options/price list from the Altair 8800 operator's manual]

That $500 kit quickly escalates in price :D But the fact that these options even existed (outside of governments, banks, and universities) is telling.

But even by 1977, things were still rough for "PCs". I was reading back through one of the PET magazines from c. 1979, with various write-in complaints about quality issues (a keyboard stops working or the CRT goes out, and the unit has to be sent back under warranty; I'm sure all the brands then had their QC issues), shady software shops not delivering (or just lousy software that was returned), and lots of demand for memory expansion - 4K is really rough for any meaningful application (beyond a fancy calculator that can store some variables and run a formula). And even with a CRT, text processing in 40 columns is rough. (Though as I recall, the 80-col PET was technically made available before the 40-col version?)

With all those issues in mind, that kind of makes me respect the IBM 5150 even more for what it was at the time. (Sure, it too had its issues in 1981.)


As for minicomputers not just being home PCs: some people are perfectly happy with a PC that has no screen - kind of an "embedded processor" that can do things like home automation, stage/studio light management, or industrial tasks like driving plotter-type machines (large ones, with saws). But to me, while those are computers in a pure sense, they are a different category of device than the "appliance-style" computer that sits on a desk (and we can now look at the long path it took to find the formula for that easy-to-use appliance).

One of the liberating aspects of personal computers: no per-hour usage charge, yet enough capability to do a few interesting things. But it's interesting how "cloud computing" is now returning to that pay-for-usage concept - renting storage space and maybe processing cores, or leasing software per month.

The Wang systems were apparently renowned for their word processing capability - some decent software and decently sized CRTs. Just still not affordable for home use. There is a story at one of the Wang archive sites about how a sales engineer rolled a Wang 2200 into an IBM VP's office to show first-hand how something that sat on a cart could be a useful machine that didn't cost $100k+.
 
I bought the "special deal" Altair kit with 2x4K DRAM (horrible stuff), an SIO card, and lots of crappy white wire that was used to hook the 4-slot backplane up to the front panel. Still have it. What was the cost of the "special deal"? $1,000 - and that was in 1976 dollars.
I had a friend who picked up an IMSAI 8080 a couple of years later and used it to run his vacuum-molding shop.
When the IBM PC came out, there were very useful, lower-priced systems out there. Consider, for example, Bill Morrow's package deals that included basic accounting, CP/M, CBASIC, WordStar, etc.
 
Though as I recall, the 80-col PET was technically made available before the 40-col version?

That’s a claim I’ve never heard before. Citation needed?

A lot of the early video systems for small computers used 64 columns, not 40 or 80. And I'm kind of surprised it mostly died out before the '70s were over. (The TRS-80 was one of the only holdouts.) If you only have 1K of video memory, 64x16 is for a lot of purposes a more useful layout than 40x25, and as a bonus the addressing circuitry is simpler. I used to do word processing on a TRS-80, and 64 columns is a huge improvement over 40 columns; it's enough to show a letter-size page with 1-inch margins in the standard 10 CPI font with no horizontal scrolling, so it's effectively no loss compared to 80 columns.

The one place 40 columns wins is it’s more readable on a TV if you’re stuck going through an RF modulator.
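As an aside on the addressing point above, here is a minimal sketch in C (purely illustrative, assumed layouts only, not taken from any actual TRS-80 or PET schematic) of why 64 columns fits 1K of video RAM so neatly: 64x16 fills the 1024 bytes exactly, and because 64 is a power of two the cell-to-address mapping is just a shift and an OR, whereas a 40x25 layout uses 1000 of the 1024 bytes and needs a multiply-by-40 (typically built from shifts and an add, since 40 = 32 + 8).

```c
#include <assert.h>
#include <stdio.h>

/* Hypothetical illustration only: mapping a text-cell position to an
 * offset within a 1 KB video RAM, for two common character layouts. */

#define VRAM_SIZE 1024u

/* 64x16: 64 is a power of two, so the offset is (row << 6) | col.
 * In hardware that is just wiring the row counter to the upper
 * address lines; no multiplier or adder chain is needed. */
static unsigned offset_64x16(unsigned row, unsigned col)
{
    assert(row < 16 && col < 64);
    return (row << 6) | col;              /* row * 64 + col */
}

/* 40x25: 40 is not a power of two, so the hardware has to form
 * row * 40, typically as (row * 32) + (row * 8): two shifts and an add. */
static unsigned offset_40x25(unsigned row, unsigned col)
{
    assert(row < 25 && col < 40);
    return (row << 5) + (row << 3) + col; /* row * 40 + col */
}

int main(void)
{
    printf("64x16 uses %u of %u bytes\n", 64u * 16u, VRAM_SIZE);    /* 1024 of 1024 */
    printf("40x25 uses %u of %u bytes\n", 40u * 25u, VRAM_SIZE);    /* 1000 of 1024 */
    printf("last 64x16 cell -> offset %u\n", offset_64x16(15, 63)); /* 1023 */
    printf("last 40x25 cell -> offset %u\n", offset_40x25(24, 39)); /* 999  */
    return 0;
}
```

The real machines did this with counters and gates rather than software, of course, but the arithmetic is the same, which is what makes the power-of-two row width attractive.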
 