
Kickstarter for Amiga/Atari Game Player

I like how "software synthesis mixing multiple channels down to a single-channel DAC" equals "a built-in polyphonic sound generator" in Apple marketing-speak...
 
This is how the systems were commonly viewed historically. The CPU was not the defining factor.

This wasn't what was originally stated, though. The statement was "None of them were marketed as such" which is untrue. They were most definitely marketed as such.

The topic of what makes a system N-bit is a topic for an entirely new -- and undoubtedly long -- thread.
 
This wasn't what was originally stated, though. The statement was "None of them were marketed as such" which is untrue. They were most definitely marketed as such.

No, they were not, the statement is still true.
Both examples merely mention the *microprocessor*, not the *system* (I know this distinction is hard to understand for people who grew up on PCs and have no clue about other systems, but that doesn't make it any less true).

Also, if PC/XT systems were marketed as being 16-bit, then clearly the common view would not have been that they were 8-bit, historically.
 
ISTR that my Amiga box literature referred to the 68000 as a 16/32-bit system, and I'm quite certain Atari saw it that way as well (since ST is short for Sixteen/Thirty-two).

Exactly, they all just mention the CPU as 32-bit. Amiga and Atari ST even specifically mention the 16-bitness of the rest of the system. Motorola itself also describes the 68000 as a "16/32-bit" processor (which is the only correct way to refer to it, because unlike a 386SX or an 8088, it's not just the data bus that is 16-bit. The ALU is not entirely 32-bit either: many 32-bit operations take longer than 16-bit ones because they are broken up into 16-bit operations internally, and in some cases, e.g. MUL/DIV, the 32-bit variants are not implemented at all, and were only added with the 020).
Commodore started marketing the 32-bitness with the Amiga 1200/4000/CD32, when the whole system was 32-bit.
Which pretty much implied that they never saw the earlier models as 32-bit. Of course, back in 1985 16-bit was impressive enough already.
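To make the "broken up into 16-bit operations internally" point concrete, here is a toy sketch (function name and structure are mine, not the actual 68000 microarchitecture) of how a 16-bit ALU can perform a 32-bit add in two passes, low word first, carrying into the high word. This is conceptually why ADD.L costs more cycles than ADD.W on a 68000.

```python
# Toy illustration, NOT the real silicon: a 16-bit ALU doing a
# 32-bit add as two 16-bit adds with a carry between them.

MASK16 = 0xFFFF

def add32_via_16bit_alu(a: int, b: int) -> int:
    lo = (a & MASK16) + (b & MASK16)                # first ALU pass: low words
    carry = lo >> 16                                # carry out of the low half
    hi = ((a >> 16) + (b >> 16) + carry) & MASK16   # second pass: high words + carry
    return (hi << 16) | (lo & MASK16)

assert add32_via_16bit_alu(0x0001FFFF, 0x00000001) == 0x00020000
assert add32_via_16bit_alu(0xFFFFFFFF, 0x00000001) == 0x00000000  # wraps at 32 bits
```

The two-pass structure is the whole point: each extra pass through the ALU is extra cycles, which is exactly what the 68000 timing tables show for .L versus .W operations.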

Regardless, I've always thought of a machine's bitness as the word size the processor could support, rather than the data bus width.

It's not necessarily the data bus width but rather the chipset that is taken as the 'bitness' of the system. However, technically the chipset can't really use more bits than the data bus width allows... although there are some exceptions. E.g., you can have a 32-bit VGA chip with its own 32-bit memory controller and 32-bit memory, but connected to a 16-bit bus.

But yes, all 8088 systems have an 8-bit chipset, and Amiga/Atari ST/early Apple 68k machines all have a 16-bit chipset.
Therefore calling them 16-bit or 32-bit systems respectively would be wrong, and I have never seen this done (of course, in my country it is actually illegal to do so. There has been an instance of an AMD commercial being pulled off the air here because it was too misleading, claiming that its no-execute bit prevented viruses. So I may have seen fewer misleading adverts than people in other countries).
 
Oh and the humble 386DX25

Speaking of which, some early 386DX models had a bug in 32-bit mode, and were labeled "16 bit s/w only" ;)
[Photo: Intel A80386-16 CPU marked "16 BIT S/W ONLY"]
 
No, they were not, the statement is still true.
Both examples merely mention the *microprocessor*, not the *system* (I know this distinction is hard to understand for people who grew up on PCs and have no clue about other systems, but that doesn't make it any less true).

Also, if PC/XT systems were marketed as being 16-bit, then clearly the common view would not have been that they were 8-bit, historically.

You're making an irrelevant distinction. In the PC/XT era, the CPU was effectively the entire machine because it did all the work. Except for specialized industrial applications, there were no dedicated graphics processors or DSP chips. There was no local bus for devices to "talk" to each other; everything was interrupt driven. Hard drive controllers didn't have DMA and in real-world use were more limited by the speed of the drive than the speed of the bus. So for all intents and purposes, if a computer could run 16-bit code and could access more than 64K of RAM without bank switching, it was a fully 16-bit machine, as Gary Kildall attested.

It became a much different ballgame when machines like the Macintosh and Amiga came along with highly specialized custom chipsets that offloaded much of the work from the CPU. That's when the speed and word length of the CPU's external data bus really started to become important.
 
You're making an irrelevant distinction. In the PC/XT era, the CPU was effectively the entire machine because it did all the work.

By that logic, an 8088 would be equivalent to an 8086. Which it clearly is not.

Except for specialized industrial applications, there were no dedicated graphics processors or DSP chips.

As primitive as the PC may be, it DOES have quite a few components connected to the bus, such as the MDA/CGA card, PIT, PIC, and DMA controller, ROM, RAM, floppy controller etc.
And these all have to be 8-bit. Which means they perform like an 8-bit class machine. Which is why a 4.77 MHz PC struggles to keep up with a 1 MHz C64.
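As a back-of-the-envelope sketch of why the bus width dominates here (my own simplification: one transfer per bus cycle, ignoring wait states, refresh, and instruction overhead), the cycle count for moving a block of data scales directly with the bus width:

```python
# Rough sketch, assuming one transfer per bus cycle and ignoring
# wait states and prefetch: cycles needed to move a block of data.

def bus_cycles(num_bytes: int, bus_width_bits: int) -> int:
    bytes_per_cycle = bus_width_bits // 8
    return -(-num_bytes // bytes_per_cycle)  # ceiling division for odd sizes

# An 8-bit bus needs twice the cycles of a 16-bit bus for the same block:
assert bus_cycles(64_000, 8) == 2 * bus_cycles(64_000, 16)
```

So even though the 8088 executes 16-bit instructions, every 16-bit memory operand still costs two bus accesses, which puts it in the same memory-bandwidth class as the 8-bit machines.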

There was no local bus for devices to "talk" to each other; everything was interrupt driven.

Technically the ISA bus *was* a local bus. It was certainly possible for devices to talk to each other directly. The devices just were rather primitive and did not make use of this functionality.

Hard drive controllers didn't have DMA

Even floppy controllers worked with DMA on a PC. Hard drive controllers certainly did (as CPUs became faster, they moved to polled I/O, since the DMA controller did not evolve).

So for all intents and purposes, if a computer could run 16-bit code and could access more than 64K of RAM without bank switching, it was a fully 16-bit machine, as Gary Kildall attested.

Which is completely arbitrary and meaningless.

It became a much different ballgame when machines like the Macintosh and Amiga came along with highly specialized custom chipsets that offloaded much of the work from the CPU. That's when the speed and word length of the CPU's external data bus really started to become important.

Really? Get a clue. Firstly, advanced chipsets have been around long before the Mac and Amiga. Look at early Atari 8-bit machines for example, or the C64.
Secondly, the Mac is about as primitive as the PC is, regarding chipset. It just happens to be based on the more powerful 68000. But it is nowhere near an Amiga.
 
Bitness is never something people will agree on as it was affected by locality/environment, marketing, technical specifications, personal experience, and opinion. It's like trying to argue which color is the best color -- you can't. Let's please drop this before people enter an utterly pointless flamewar with no positive outcome.

Moderators, please lock this topic.
 
As primitive as the PC may be, it DOES have quite a few components connected to the bus, such as the MDA/CGA card, PIT, PIC, and DMA controller, ROM, RAM, floppy controller etc.
And these all have to be 8-bit. Which means they perform like an 8-bit class machine. Which is why a 4.77 MHz PC struggles to keep up with a 1 MHz C64.

This is kind of a pointless and/or silly argument really. Using this logic a modern PC is an 8-bit machine because the simplest peripherals connected to the LPC bus are 8-bit only.

I tend to agree with Trixter though... having witnessed many of these discussions over the years I think you will not arrive at any significant level of consensus and the discussion will quickly devolve into a flamewar.

Back on topic, you'd be crazy to back this kickstarter. ;)
 
Not in terms of floppy drive performance!
Indeed. Of course anyone with a bit of nous would go for an HDD, which is quite trivial to do with a PC.

As the others have mentioned, this has turned into another chest-bashing thread ;)

Yeah and the kickstarter is a hopeless effort.
 
By that logic, an 8088 would be equivalent to an 8086. Which it clearly is not.

Yes it is, as the 8088 runs the same software as the 8086. The 8088's instructions are the same as the 8086's, not just an 8-bit subset. The same goes for the 68000/68010 versus the 68020 and later. 8088 code makes full use of the capabilities of the 8086, and 68000 software immediately uses the full data bus width of the 68020 if you run it on one, without any change. No modification needed. Even more striking, the 68008 has only an 8-bit data bus, yet a single machine instruction can still load or save a 32-bit value over that narrow bus (four bytes, one after the other) and automatically place it correctly in a 32-bit register.
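What that 68008 long-word fetch amounts to can be sketched as follows (function name is mine, purely illustrative): four sequential byte fetches assembled big-endian, 68k byte order, into one 32-bit value. The software only ever sees a single 32-bit move.

```python
# Sketch of a MOVE.L on an 8-bit data bus: four bus cycles,
# one byte each, assembled big-endian into a 32-bit value.

def fetch_long_over_8bit_bus(memory: bytes, addr: int) -> int:
    value = 0
    for i in range(4):                      # four byte-wide bus cycles
        value = (value << 8) | memory[addr + i]
    return value

mem = bytes([0x12, 0x34, 0x56, 0x78])
assert fetch_long_over_8bit_bus(mem, 0) == 0x12345678
```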
 
So for all intents and purposes, if a computer could run 16-bit code and could access more than 64K of RAM without bank switching, it was a fully 16-bit machine, as Gary Kildall attested.

Technically speaking, the "definitive" 16-bit machine, the PDP-11 (and other minicomputer-centric designs like the TI-990), couldn't support more than 64K without bank switching either; i.e., the large-memory versions all required MMUs combined with ISA extensions that are largely analogous to x86-16's memory segmentation.
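For reference, the x86-16 segmentation being alluded to is just physical = segment × 16 + offset, stretching 16-bit registers into a roughly 1 MB space. A minimal sketch (function name mine):

```python
# Real-mode x86 address calculation: segment shifted left 4 bits
# plus offset, wrapping at the 8086/8088's 20-bit address bus.

def real_mode_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF

assert real_mode_address(0xB800, 0x0000) == 0xB8000  # CGA text buffer
assert real_mode_address(0xFFFF, 0x0010) == 0x00000  # wraps at 1 MB
```

Conceptually it plays the same role as the PDP-11's MMU page registers: a small amount of extra state that relocates a 64K-limited address into a larger physical space.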

In any case, yes, it's a very silly argument. I'm sure any number of people here could find examples of eight-bit CPU-equipped machines bristling with chipset-based acceleration capable of outperforming minimally-equipped 16 (or even 32) bit machines at certain specific tasks. And even if we're just talking about CPUs there are all sorts of other factors in play: the purely 16-bit 80286 can comfortably outrun the "sorta-32-bit" 68000 on most arithmetic-centric benchmarks simply because its almost-four-years-newer architecture gives it a substantial IPC advantage.

(For tasks like calculating a spreadsheet the original Macintosh-through-Mac Plus was often right in the same ballpark as the 8088-based IBM PC, and if you threw an 8087 into the PC you could potentially run rings around said Mac. Granted, some of that is due to the DMA overhead of the Mac's video system, which was excessive enough to merit a redesign in the 1987-vintage SE, but at a design level the 68000 is still solidly 8086-era technology. The Amiga looks so good almost *entirely* because of all the goo around the CPU, not the CPU itself.)

In short, the whole "number of bits" thing might be vaguely useful shorthand for describing CPU ISAs and video game eras but by itself it's wholly inadequate for accurately comparing the performance and capabilities of two random machines. I've thrown away plenty of 32-bit Pentium 4 machines capable of hanging just about any 64-bit Ultrasparc II or MIPS R1x000 box out to dry on most any practical benchmark you want to name, does that demonstrate that 32 bit machines are superior to 64 bit ones? Obviously not, it just proves that 7 or 8 years is a *long* time when you're talking about CPU designs no matter how many bits the ISA's integers are.

I've arbitrarily decided that my old Xeon Mac Pro is a 10/14 bit machine, based on the number of serial lines it has running to the FB-DIMM RAM's onboard buffers. That's the number of bits it can electrically transfer at the same time so that's what it is, don't care about all that jazz about frames and channels and whatever. Therefore it is clearly in the same ballpark as an Intellivision console.
 
... and, yeah, back on topic, that Kickstarter was exceedingly painful. (I lost it *way* before they even got to the NASCAR cross promotion and then... wow.) It may actually be even more painful than the Commodore PET smartphone, and that takes a lot of work. Should Commodore fans take it as a compliment that people keep dragging pieces of the corpse out of the crypt to crudely desecrate in desperate attempts to wring just a little more money out of the franchise? Because, well, it kinda doesn't feel like one.
 
... and, yeah, back on topic, that Kickstarter was exceedingly painful. (I lost it *way* before they even got to the NASCAR cross promotion and then... wow.) It may actually be even more painful than the Commodore PET smartphone, and that takes a lot of work. Should Commodore fans take it as a compliment that people keep dragging pieces of the corpse out of the crypt to crudely desecrate in desperate attempts to wring just a little more money out of the franchise? Because, well, it kinda doesn't feel like one.

Well, with the changes at Apple, all that Reality Distortion Field had to end up somewhere and Amiga proves to be the lucky recipient. If advocacy Usenet groups from 20 years ago could be monetized, nothing could generate the revenue stream of Amiga.
 
This is kind of a pointless and/or silly argument really. Using this logic a modern PC is an 8-bit machine because the simplest peripherals connected to the LPC bus are 8-bit only.

No, that is not the same logic.
My logic is based on the widest bus available in the system being 8-bit, not the narrowest one.
A difference of best-case vs worst-case.

I tend to agree with Trixter though... having witnessed many of these discussions over the years I think you will not arrive at any significant level of consensus and the discussion will quickly devolve into a flamewar.

I wasn't expecting any discussion at all. As the book I linked earlier says, PC/XT systems were widely regarded as 8-bit. I figured most people here have been around in those days, and were used to this 8-bit/16-bit division between XTs and ATs, and the book also gives a ton of arguments as to why this is.

Besides, my point was mainly about marketing, regardless of what anyone (including Gary Kildall) thinks, my point was only that I have never seen an IBM PC/XT marketed as a 16-bit machine. I have yet to see evidence to the contrary.
 
Yes it is, as the 8088 runs the same software as the 8086.

From a software point-of-view it is, obviously. But we were talking about hardware, and things like capabilities and performance.
The 8086 is faster, and is capable of 16-bit transfers.
This is not something you have to explain to me btw. I was one of the authors of 8088 MPH, remember? I'm quite familiar with the programming environment of an 8088, and fully aware of its 16-bit nature. And also fully aware of the fact that it doesn't really buy you anything over a Z80 or 6502 in terms of performance, since the bottleneck is that 8-bit memory.
 
(For tasks like calculating a spreadsheet the original Macintosh-through-Mac Plus was often right in the same ballpark as the 8088-based IBM PC, and if you threw an 8087 into the PC you could potentially run rings around said Mac. Granted, some of that is due to the DMA overhead of the Mac's video system, which was excessive enough to merit a redesign in the 1987-vintage SE, but at a design level the 68000 is still solidly 8086-era technology. The Amiga looks so good almost *entirely* because of all the goo around the CPU, not the CPU itself.)

This is something I don't quite agree on. Yes, a 286 is indeed faster than a 68000, but there's a huge leap from an 8088/8086 to 286 (and even a bit of a leap from 8088 to 8086).
The Mac was a poor design indeed. The Amiga... the irony of the system is that because the chipset is so advanced, it actually gets in the way of the CPU. The blitter and display hardware steal a lot of memory cycles, which makes the CPU come to a grinding halt in some cases.
The Atari ST is a good example of what the 68000 is capable of with a simple but efficient chipset. And as advanced as the Amiga's chipset is, quite a lot of its tricks can be done on the Atari ST as well with clever code. The 68000 is quite far removed from an 8088/8086, mainly because it has a much more advanced instruction set, allowing you to write the same routines with fewer instructions and to use more registers to reduce memory accesses.
 
my point was only that I have never seen an IBM PC/XT marketed as a 16-bit machine. I have yet to see evidence to the contrary.

Actually you said, "machines built around an 8088 were never marketed as 16-bit machines." -- not just IBMs.

InfoWorld, Oct. 22, 1984:

[Image: InfoWorld advertisement, Oct. 22, 1984, marketing an 8088-based machine as 16-bit]
 