
Why did DOS/86 overtake CP/M-z80?

I want to say the 8086 has something like DJNZ using CX. Don't quote me on that one; I'm an x86 ASM n00b.
You are correct. The Z80 has DJNZ (Decrement B and Jump if Not Zero). The 8086 has LOOP (decrements CX and jumps if not zero) and its variants LOOPE/LOOPNE, as well as JCXZ (Jump if CX Zero). The 8080 has nothing comparable.
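For anyone who wants to see the two side by side, here's a minimal sketch (the labels and the count of 16 are just placeholders):

; Z80: DJNZ uses B as the implicit loop counter
        ld   b, 16          ; loop count (1..256; 0 means 256)
zloop:  ; ... loop body ...
        djnz zloop          ; B = B - 1, branch back while B <> 0

; 8086: LOOP is the closest equivalent, with CX as the implicit counter
        mov  cx, 16         ; loop count
xloop:  ; ... loop body ...
        loop xloop          ; CX = CX - 1, branch back while CX <> 0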
 
I think I brought this up before, a long time ago, but I think you’re going to be disappointed trying to do what you’re planning here. TL;DR, the high address lines on a CF card are a lie. You won’t be able to use this to fake 128 byte blocks.
I don't recall that discussion before. I might have missed it. I know the high address lines on many card interfaces are gone, and they typically support about three for ATA register compatibility, but I didn't know the remainder up to A8 weren't implemented on many CFs themselves... That is of interest. I haven't seen that before. Kind of disappointing if it's correct. I will clarify I'm only looking at early era cards but I haven't built an interface for one yet.
 
I don't recall that discussion before. I might have missed it. I know the high address lines on many card interfaces are gone, and they typically support about three for ATA register compatibility, but I didn't know the remainder up to A8 weren't implemented on many CFs themselves... That is of interest. I haven't seen that before. Kind of disappointing if it's correct. I will clarify I'm only looking at early era cards but I haven't built an interface for one yet.

Read footnote #2 on page 4-3 of this datasheet carefully. Notice how A1 through A9 are "don't care" on the chart, and these ominous words as the last line:

"Note that this entire window accesses the Data Register FIFO and does not allow random access to the data buffer within the CompactFlash Memory Card."

Here's another PDF describing a "memory mapped" CompactFlash interface for a Motorola 68HC912 SBC, and it explicitly explains the same thing: even though the sector buffer "presents" as if it were RAM, it's still actually just a door to a FIFO and you have to read/write 512 bytes at a time to it; it does *not* do random access.

Basically the memory-mapped mode is a Potemkin Village to allow essentially-ATA-centric disk flash devices to work with PC card slot controllers that were originally designed to use the old Linear SRAM/FLASH PC cards. The one and only scenario where it works properly is with 512 byte memory block moves. That's it.
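To make that concrete, here's a minimal Z80 sketch of the only access pattern that works. The window base address, the buffer label, and the assumption that a READ SECTOR command has already been issued and DRQ is set are all mine, purely for illustration:

; Hypothetical memory-mapped CF data window; the base address is an assumption.
; A1-A9 are "don't care" inside the window, so every read within it just pops
; the next byte from the card's sector-buffer FIFO - no random access.
CFWIN   equ  8000h          ; assumed base of the card's data window

getsec: ld   hl, CFWIN      ; source: the FIFO "door", not real RAM
        ld   de, secbuf     ; destination: 512-byte buffer in system RAM
        ld   bc, 512        ; always a whole sector, never a partial read
        ldir                ; 512 sequential reads of the window, in order
        ret

secbuf: ds   512            ; sector buffer in ordinary RAM

Trying to poke at byte 37 of the sector directly just gets you whatever happens to be next in the FIFO, which is exactly the "does not allow random access" warning in the datasheet.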
 
I want to say the 8086 has something like DJNZ using CX. Don't quote me on that one; I'm an x86 ASM n00b.
LOOP is what you're looking for. Oddly, on 386+, it's slower than a simple DEC, JNZ 2-instruction sequence. And, of course, you're limited to CX as the counting register with LOOP, just like the REPxx prefix on the string instructions.
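For illustration, the two-instruction form mentioned above looks like this; unlike LOOP it also lets you pick any register as the counter:

        mov  cx, 16         ; CX shown for comparison, but any register works
top:    ; ... loop body ...
        dec  cx             ; decrement the counter (sets ZF, leaves CF alone)
        jnz  top            ; branch back while the counter is non-zero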
 
There's a reason I called it a thought experiment. Z180 is not 16-bit, but in the alternate reality of this thought experiment, Z180 wasn't necessarily the Z180/HD64180 that we actually got, but the Z800 that should have been built instead of the Z8000.

Sometimes I like to indulge fantasies that if Tandy had played their cards right (and a bunch of other moving parts had worked out, one of which would have been things happening a *lot* differently at Zilog) we'd all be using descendants of the original TRS-80 Model I. But, realistically, I don't think there's any universe in which it really makes sense that one of the architectures that started as 8-bit would have really been able to make it through all the subsequent evolution in one piece... any better than the 8088 did, anyway. And it had to cheat to do it.

Aside: My TRS-80 fantasy mostly revolves around Tandy *not* responding to the FCC crackdown on the Model I by splitting their product line into the fundamentally limited Color Computer and the hermetically-sealed-and-boring Model III. Instead someone at Tandy HQ was smart enough to realize that the thing that was great about the Model I was its modularity, acknowledged that Apple was nibbling hard at the public's mindshare with color and graphics, and thus ordered the Model I be remade into a modular Model III with a handful of internal expansion slots (with the full CPU bus on them, and also including a PHANTOM line to allow for paged memory to be implemented after the fact), a 4x NTSC system clock, a 3.579 MHz Z80 CPU, and a CRTC-based video system that could do 512x192 mono graphics as well as artifact colors similar to the Apple II or CGA by cycle-stealing main memory. (Text display would still come from a couple K of SRAM, which could be overlaid on the graphics mode if desired.) I'm reasonably sure they could have sold this for about what our world's Model III cost, and considering Radio Shack's huge retail presence could have stolen quite a lot of Apple's thunder.

Combine this with the thriving third-party innovations in TRS-80 operating systems and software, the determined efforts of cloners to leverage the fact that the only part of a TRS-80 that wasn't freely cloneable was the ROM, and even that *could* be solved by just licensing a Level II BASIC work-alike straight from Microsoft... I think there's a real chance they could have kept the Model I's trajectory going up at the late-70's pace into 1983 or so. And if that worked, and Zilog actually had the Z800 ready for volume shipments that year, and it really had turned out to be a performance winner with real 16 bit capabilities... I dunno, maybe they could have kept Tandy's platform relevant a lot longer than it was in the real world? Of course, getting beyond the 80's still presumes that the Z800 itself would have some kind of 386-level successor that takes it cleanly into the 32 bit era. Which, yeah, didn't happen. Certainly not on a timeline that mattered.

...Anyway. The Apple IIgs is probably the closest thing we have in the real world to what we're dreaming about here, and let's face it, it demonstrates pretty clearly the problems of trying to take a 1970's architecture into the latter half of the 80's. Leaving aside the whole issue of it being way, way too late to really be relevant, it's also a pretty awkward machine, forced to be a dual-mode beast with an entire Apple IIe swallowed up inside, the presence of which imposes all kinds of nonsense and performance sacrifices. Sometimes you just need to flip the table and start over again... which is, in fact, why it's so amazing that the PC platform never actually did that. And I think pretty much all the credit for that lies with Intel. You can say a lot of bad things about them, but nobody came even close to grokking the importance of backwards compatibility and upward growth paths the way they did.
 
This is quite an interesting thread & apologies for any inaccuracies from a faded memory.
I still remember other x86 machines that ran MSDOS but died out because of Lotus 1-2-3 and its direct access to hardware such as Expanded Memory. There were cases such as the NEC APC III that needed an additional IBM PC compatibility card to run 1-2-3.

In its pure form via BDOS and BIOS calls, I see both CP/M and DOS as environments with ASCII conversational character-based console I/O, printer output and simple file I/O (to be later augmented with networking functionality). Additional features such as graphics & sound were driven by loaded applications external to the DOS.

The Z80 Microbee was designed to host CP/M-80 and its 68K Gamma variant ran CP/M-68K for its development phase. (There was also an HD64180 variant.)
If I recall correctly, the Z80 LDIR/LDDR instructions weren't very efficient. They essentially ran the LDI/LDD instruction BC times, with an opcode fetch (of the same LDIR/LDDR instruction) between each byte transfer, which kept the Z80 DRAM refresh running.
The Z80 DMA chip was much quicker, but it was very expensive and, where applicable, it also had to deal with DRAM refresh, so it wasn't often used.
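To put a rough number on that (my own back-of-the-envelope arithmetic, so treat it as a ballpark): LDIR costs 21 T-states per byte while it is repeating, so at 4 MHz that is about 4,000,000 / 21 ≈ 190 kbytes/s, which lines up with the ~200 kbytes/s figure quoted below for the INIR/OTIR block I/O forms. The DMA chip's speed advantage comes precisely from not re-fetching an opcode for every byte moved.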

As for 8080/Z80 compatibility, there is a glaring difference. The DAA instruction behaves differently on the two CPUs and is often used to sense the current CPU.
This is because the 8080 DAA bug was fixed in the Z80.
 
As for 8080/Z80 compatibility, there is a glaring difference. The DAA instruction behaves differently on the two CPUs and is often used to sense the current CPU.
This is because the 8080 DAA bug was fixed in the Z80.
Depends on what you call a bug. It works as advertised, in my experience.
 
LDIR/LDDR aren't *that* bad, but sure, they don't compare to a DMA chip which is, what, about 10 times faster?

They do make for good code though, because they instantly show the programmer's *intent*, which is gold when looking through code.

I find the INDR, INIR, OTDR, OTIR commands more useful, as they affect hardware operation with the upper registers, which is great for moving code around, and 200 kbytes/s is fast enough for most floppy access. I wonder if they were thinking of including DMA-like elements in later chip versions and this was an easy command to upgrade?

I suppose this counts as microcode the way it was implemented in the z80?
 
I believe that the "bug" has to do with using DAA after a subtraction. Intel was quite specific that the DAA is used only after an ADD, ADI or ACI instruction. The Z80 put a twist on this by including a "subtract" flag in the PSW and handling decimal adjust after subtraction.

It's not a bug, and if subtraction is desired, the decimal complement of the subtrahend should be added.

If things are done this way, code works on both CPUs without changes.

The 8086, in the same vein, has a "DAS" instruction, keeping DAA working the way it always has.
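As a concrete illustration (my own sketch, not from either manual; the operands in B and C are arbitrary): packed-BCD subtraction done by adding the ten's complement and then adjusting, so the same bytes behave identically on an 8080 and a Z80.

; Packed-BCD subtraction A = B - C that works the same on 8080 and Z80.
; Z80 mnemonics shown, but every instruction has a direct 8080 equivalent
; (MVI/SUB/ADI/ADD/DAA), and DAA is only ever used after an addition.
; (The ADDs also clear the Z80's N flag, so its DAA takes the addition path.)
        ld   a, 99h         ; nine's complement base
        sub  c              ; A = 99h - C (no inter-digit borrow for valid BCD)
        add  a, 1           ; A = ten's complement of C, i.e. 100 - C
        add  a, b           ; A = B + (100 - C); last ALU op is an addition
        daa                 ; decimal adjust the addition; carry set = no borrow

For example, with B = 23h and C = 45h this leaves 78h in A with the carry clear, i.e. 23 - 45 = -22 expressed as 78 in ten's-complement BCD with a borrow signalled.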
 
I believe that the "bug" has to do with using DAA after a subtraction. Intel was quite specific that the DAA is used only after an ADD, ADI or ACI instruction. The Z80 put a twist on this by including a "subtract" flag in the PSW and handling decimal adjust after subtraction.

It's not a bug, and if subtraction is desired, the decimal complement of the subtrahend should be added.

If things are done this way, code works on both CPUs without changes.

The 8086, in the same vein, has a "DAS" instruction, keeping DAA working the way it always has.
Oh, that makes sense... So in the case of Zilog, it was really just an enhanced instruction, though it might lead to some interesting z80-to-8086 automated code conversions.
 
Call it what you like, but being unable to use DAA after a subtraction instruction sure sounds like a design oversight.
The need to add a negative number in lieu of a subtraction instruction seems like a workaround.
Also, apparently the NEC 8080A behaves differently to the Intel 8080A by affecting the AC flag for the DAA instruction.

The 8080 is not alone with quirks:
WDC’s latest version of the 65x51 ACIA has issues with its TX Data Buffer Empty flag.
TI’s 9918 VDP F Interrupt flag can’t be software polled to get the current status.
The Intel Pentium had a floating-point bug which required a global recall.
The first ARM chip didn’t have its voltage pin connected but managed to still work.
The Zilog Z80 OTDR/OTIR apparently can change the Carry flag when it should remain unaffected.
 
"What really killed z80 based CP/M machines."

Well, I've read this whole thread and I still don't know. But that's okay. After all, it's not really a current topic of relevance, is it? Nevertheless, I was there, as a young businessman trying to figure out what to buy as my first personal computer.

I was surprised that Commodore did not end up as the computer of choice, since that was the first real personally accessible computer I got my hands on in junior high school, and I took classes on how to program it. Then the VIC-20 and C64 became popular. But then I was away from personal computers for a few years, during which time I mostly had contact with minicomputers that were purpose-built by the company or leased and were not really "personal" or accessible to me.

In about 1986 I got my hands on a NorthStar Advantage, which my soon-to-be father-in-law purchased along with a package of software for his small local retail business. By then the IBM PC had replaced the Advantage. But I was just getting started. I didn't know what CP/M was or how it compared to DOS. All I knew was that there were a lot of people using CP/M in business and the NorthStar was still being supported by a consulting company and reseller. I frequented a few bulletin boards and was able to use the Advantage and CP/M very effectively in my business for word processing and a database and some communications and research functions.

Then I started to notice that my FIL was getting more into his IBM PC and was using programs passed on to him by some of his fellow teachers and students in the local school district. He used Professional Write, R:Base, and Sidekick, and some games. We soon upgraded his PC with Hercules graphics and a hard drive. Then we started using it for accounting. As I was the most knowledgeable in computers in the family, I became the de facto IT guy.

It wasn't long before I could see that everything had to be compatible with the IBM PC or it was not worth buying. By 1987 and 1988 even Radio Shack was suggesting IBM compatible was best, even though they were still selling non-IBM DOS-compatible or Unix systems. NorthStar was offering an 8088 add-in card to be MS-DOS compatible. But after a while I knew that such a card was not "IBM" compatible. All the articles I was reading were saying "IBM" compatible was the only smart move.

So, my first personal purchase (for my own computer) was a Tandy 1400FD, which was a full IBM compatible and came with Tandy MS-DOS 3.3 and two floppy drives. I then standardized on IBM-compatible software and soon stopped using CP/M and the Advantage, sending it back into the attic from whence it came. I learned to program our databases in R:Base and then in Alpha Four and Access. I learned Professional Write and MS Word and DacEasy, all DOS programs. No more WordStar or dBase or any other CP/M programs from that point on.

So, my answer is "IBM" and IBM-compatible software. No one was passing me CP/M disks to say "try this". No one was suggesting NorthStar and CP/M was the future for business users. And they were right. The IBM PC killed CP/M (not necessarily the Z80, which was still a thing even after CP/M died in the business community).

Seaken
 
The other question (and I may have said this earlier in this thread and forgotten!) is why people were saying (and moving) to IBM. I suspect you could just do more, because most CP/M systems were 64K only and that limit started being felt. Add to that the compatibility of disk media and video (no terminal type) and it is easy to see why there was a migration in the IBM PC's direction. Also, DOS had a FAT filesystem that was much more suited to hard drives.
 
The way I remember things, if your company or user base was small and wanted in on computing in some form or another during the early going in the '70s, you probably had a CP/M-based system for your accounting and such. As IBM became somewhat affordable and spawned the compatibles, it was all DOS in some form from there on out. There were always those minorities who preferred the roll-your-own systems and went with CP/M. I worked in an area back in the '80s where I had to support CP/M systems and they were a kludge, so to speak. Users were just getting into DOS apps and then they had to deal with the CP/M syntax, which was slowly becoming somewhat arcane. The bottom line is that if DOS hadn't shown up, CP/M would have been just as viable until the next big thing showed up.
 
Everywhere throughout this thread, IBM seems to be the common answer, but I think the other elements that came out of the discussion, especially Zilog's short-sighted approach to larger processors, make a better story. Had Zilog just continued to produce z80s - faster, pipelined, with more memory and the capability to use it - in competition with IBM, it might not have been enough, but z80 opcodes would have hung around for a bit longer. The focus on DR-DOS seems to have demonstrated DRI's shift to 86. And Microsoft's shift to Windows cut them out of future markets anyway...

But I think there's also a subtext: CP/M may have addressed what was missing in future versions, but it didn't really innovate. Nor was there a single dominant PC manufacturer to force a concept of standardisation across the market as IBM did - as noted, running MS-DOS wasn't enough in the end. People wanted true PC compatibility and Microsoft Flight Sim was the benchmark.

Standard architectures led to graphics capabilities - something we saw play out all over again when Microsoft began creating support for games in the mid-90s and slowly everything moved their way once more, and the old graphics standards fell by the wayside.

I still think there's anecdotal information out there that can fill in the gaps between the puzzle pieces like grout in a mosaic, but backwards compatibility seems an absolute, as does newer software supporting only older architectures to reach the lowest common denominator across a platform. It takes a lot for software developers to let go of something old and broad for something newer that only early adopters can buy, and it requires a lot of faith in the newer platform that only comes with large companies.

If examined from this perspective, CP/M *could* have survived a lot longer, and so could z80-based CP/M systems, had a major manufacturer supported them... Which makes me wonder whether a Z80-based PC with a different ROM BIOS might have survived a while longer had Zilog moved that way and designed a more powerful z80 that worked well and hit the market early enough to compete with the 86... Even that approach would have been wrong though. It seems the world abandoned the z80 at around the right time, even if it was slightly premature.

Many thanks to everyone for the input to this question. It's been very much appreciated -
David.
 
The way I see it, the Z-80's success was largely due to Tandy's Model 1, which sold around 200,000 units alone. As a stand-alone company, Zilog just didn't have the public presence, advertising pomp, or a finished product to compete with the likes of IBM and Motorola. It's akin to WordPerfect, which more or less owned the DOS-era word processing scene but could not fully realize the next big thing was already looming.
 
The way I see it, the Z-80's success was largely due to Tandy's Model 1, which sold around 200,000 units alone. As a stand-alone company, Zilog just didn't have the public presence, advertising pomp, or a finished product to compete with the likes of IBM and Motorola. It's akin to WordPerfect, which more or less owned the DOS-era word processing scene but could not fully realize the next big thing was already looming.

I wonder if you may have overstated the extent to which Tandy's Model 1 contributed to z80 popularity. I'd say the z80 was *wildly* successful in the late 70s and early 80s outside of the US. Spectrums sold in the millions, and there was a world where the PC never took root: Eastern Europe... And there we can see where the technology led - basically Sinclair Spectrums with 1 MB of memory and CP/M-based operating systems. They kept extending that platform for quite some time... It's probably a good model to predict what may have happened had IBM not come along. Japan hung in there with the z80 until the mid-90s also and produced some amazing architectures and systems that took the PC systems another decade to fully challenge.

I think the US-centric view on PC adoption is a little distorted due to the close proximity it had - it took the rest of the world a little time to catch up and during that time, other PCs were still competitive... And the convergence between home and office PCs was yet to happen, and I think the PC wasn't truly dominant and without challengers until the mid-90s. I'm even surprised that the consoles lasted as long as they did and came into the current era.

At the grassroots level, I can say that while I knew about the PC, I don't think it really started going crazy until the mid-80s, which was when it started regularly showing up as a home PC.

Though the Model 1 may have significantly contributed in the early days, the real revolution didn't feel like it hit until the 80s. About when the Model 1 was discontinued.

I'd say Zilog well and truly outpaced Intel for a while and I'd be interested to know how Intel viewed Zilog during the late 70s and early 80s. Zilog could have become what AMD became. It also could have been the first to branch out into parallel architectures as an alternative to wider buses... I don't know how many GHz z80's you could fit onto a die, but I'm guessing it's a few. Parallel architectures never really took off for home PCs, including modern software, though much has evolved to multithreading over the past decades. It's also curious to muse about what may have become of the 86 architecture had the PC not come along...
 
Your points are well taken. The Sinclair which was released here in the US was known as a Timex. It was a boon to hobbyists at under 100 USD but needed a TV monitor and adapter. Perfect for the youngsters, but a stretch for serious users. The Tandy Model 1, Model II, and Models III & 4 all used a form of the Z80. Although the 5 million or so European Sinclair units used the Z80, it seems to have eventually hit a dead end. There was no advancement, and it boils down to a "shoulda, woulda, coulda" scenario for Zilog. Motorola, a huge conglomerate in its own right, couldn't sustain the pressure from Intel and AMD in the PC world, and their somewhat slow chips wound up mostly in washing machines and the like. So, I'm not dishing on Zilog - it's just a matter of the level of technology and sales performance that one brings to the table. The US-made Tucker automobile from the late 1940s is a good example; he, Tucker, had the product that could potentially rock the entire US auto industry but not the wherewithal to become a viable factor against big business.
 
The cause of Zilog's success was Intel's penchant for not improving the current chip design because of the promise of the chip of the future, even before the future chip existed. Intel kept repeating that mistake every few years, though mostly for AMD's benefit.
 