
Sinclair PC200

I think you are probably right... That was the machine with MCGA support, wasn't it? At the very least, it was a widely supported game graphics option. Very similar to the PCjr.

If based on points(*), though, what points are you thinking of? Just video and speed?

It wasn’t MCGA (that was the low-end PS/2s and like one Epson machine), the Tandys had graphics similar to and mostly compatible with the PCjr. This gave them a 320x200 16 color mode which looks as good as EGA’s low res mode, which is a huge boost over plain CGA. The other winning feature they have is a 4 voice sound chip (same one used in the Sega Master System 8-bit console and still present in the Sega Genesis/MegaDrive), which, yeah, the combination thereof was well supported for games in the late 80’s.
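
For a sense of scale, here's some back-of-envelope arithmetic (mine, not from any Tandy documentation) on why that 16 color mode was such a jump over CGA; it's just pixels times bits per pixel:

Code:
/* Rough framebuffer sizes: CGA's 320x200 4-color mode packs 2 bits per
   pixel, the PCjr/Tandy 320x200 16-color mode packs 4 bits per pixel. */
#include <stdio.h>

int main(void) {
    int cga_bytes   = 320 * 200 * 2 / 8;  /* 16,000 bytes */
    int tandy_bytes = 320 * 200 * 4 / 8;  /* 32,000 bytes */
    printf("CGA 320x200x4 colors:    %d bytes\n", cga_bytes);
    printf("Tandy 320x200x16 colors: %d bytes\n", tandy_bytes);
    return 0;
}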

The keyboard-shaped 1000EX and HX only have a 7.16 MHz 8088, so speedwise I imagine the Sinclair gets a win there. Does the Sinclair’s video chip support greater color depth like the vaguely related Amstrad PC1512 did, or is it really just nerfed to CGA and an MDA text mode? (I don’t see any references saying it does better than CGA but maybe I missed something.) If it doesn’t then speed is really its only win.

The “L” series Tandy 1000s had an improved version of Tandy video that added a 640x200 16 color mode and the ability to do Hercules on a mono monitor, along with switching to an 8086 at 8 or 10 MHz, so, yeah, it’s a shame they put the RL into a pizza box instead of a keyboard; it would have been the undisputed ultimate (Super)CGA keyboard XT.
 
That's still highly impressive...

Back when I was designing PCBs on my early PCs, most of it was in CGA - so we could see the pad detail on both sides and have a background and overlay color. Never pretty, but better than nothing. We all wanted EGA, at least for the 640x200 mode with 16 colors. Eventually we got VGA, but it took time. I ran VGA on a 15KHz monitor for two years before I could afford a secondhand VGA monitor.

The desire for more colors and better use of that color was really strong around 1987... And I remember all the games with a TANDY option as well as CGA...
 
Thing is, the Amiga 1000 had shown the way graphics could look, and in 1987 the A500 made it available to everyone. Ironically, that was at the same time as VGA became available, but PCs were still too expensive as games machines.
In 1988, the PC 200 didn't stand a chance.
 
Thing is, the Amiga 1000 had shown the way graphics could look, and in 1987 the A500 made it available to everyone. Ironically, that was at the same time as VGA became available, but PCs were still too expensive as games machines.
In 1988, the PC 200 didn't stand a chance.

Except Sinclair had originally intended to release a Spectrum with similar capabilities to the Amiga in 1986, with 64 colors and hardware bit operations, as well as hardware vector graphics. Memory necessary to support that cheaply was available from 1985 onwards. So it was going to happen anyway - It was only because Sinclair was sold off to Amstrad by its creditors that this didn't happen.

But once VGA hit in 1987, the future was pretty quick to shift away from EGA color schemes... And then we had the same photographic images on every PC in every computer shop... And the 24 bit palette became standard.
 
Except Sinclair had originally intended to release a Spectrum with similar capabilities to the Amiga in 1986, with 64 colors and hardware bit operations, as well as hardware vector graphics. Memory necessary to support that cheaply was available from 1985 onwards. So it was going to happen anyway - It was only because Sinclair was sold off to Amstrad by its creditors that this didn't happen.

But once VGA hit in 1987, the future was pretty quick to shift away from EGA color schemes... And then we had the same photographic images on every PC in every computer shop... And the 24 bit palette became standard.
Hah! That'd never happen... Unless they stopped using RF.
 
The tech associated with the “Sinclair Loki” did eventually come out (in extremely small quantities as an arcade cabinet board) as the Flare One, and after being gifted with a CPU upgrade from a Z80 to an 8086 they tried to sell it as a home console as the Konix Multisystem.

The TL;DR with that whole story is it was a terrible idea that relied on assuming that 16/32 bit CPUs would always be “too expensive” for home computers and that the money you’d save would permanently justify all the painful workarounds you’d have to deal with when your framebuffer is as big as your CPU’s entire address space. (Not to mention all the other applications that were starting to take advantage of large memory and were already starting to find even the comparatively minor annoyances of the 8086’s segmented model limiting.) Amstrad was very much right leaving that one in the recycle bin.
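
To put rough numbers on "as big as your CPU's entire address space" (a quick sketch of mine, using the display modes from the Loki spec that gets quoted later in this thread):

Code:
/* Framebuffer size vs. the Z80's 64K flat address space. Both display
   modes in the spec work out to the same byte count. */
#include <stdio.h>

int main(void) {
    int fb1 = 256 * 212 * 8 / 8;  /* 8 bits/pixel -> 54,272 bytes */
    int fb2 = 512 * 212 * 4 / 8;  /* 4 bits/pixel -> 54,272 bytes */
    int z80 = 64 * 1024;          /* 65,536 bytes */
    printf("256x212x8bpp: %d bytes\n", fb1);
    printf("512x212x4bpp: %d bytes\n", fb2);
    printf("that is %.0f%% of the Z80's %d-byte address space\n",
           100.0 * fb1 / z80, z80);
    return 0;
}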
 
The tech associated with the “Sinclair Loki” did eventually come out (in extremely small quantities as an arcade cabinet board) as the Flare One, and after being gifted with a CPU upgrade from a Z80 to an 8086 they tried to sell it as a home console as the Konix Multisystem.

The TL;DR with that whole story is it was a terrible idea that relied on assuming that 16/32 bit CPUs would always be “too expensive” for home computers and that the money you’d save would permanently justify all the painful workarounds you’d have to deal with when your framebuffer is as big as your CPU’s entire address space. (Not to mention all the other applications that were starting to take advantage of large memory and were already starting to find even the comparatively minor annoyances of the 8086’s segmented model limiting.) Amstrad was very much right leaving that one in the recycle bin.

That wasn't quite the idea...

By 1985, Sinclair had realized that a games machine that could deliver "cartoon quality" graphics was the future of video games, and that even a home video game machine needed to be able to provide basic business functionality.

The Loki would have run at 7 MHz (most likely) - so a little faster than the XTs of the era - but notably it would offload the heavy processing of graphics and audio to dedicated hardware, which he realized didn't require the extent of custom silicon that everyone of that era thought was necessary.

This made a lot of sense in 1985 - and he was right that while computer sales were slumping, game sales continued to climb...

It wasn't until around 1995 that the PC got all of what was planned for the Loki. Had low-cost hardware accelerated graphics hit in 1985 without the costs associated with the Amiga, it would have changed things quite a lot.

The Amiga itself is a good example of how this can be successful.

Custom silicon wasn't always needed - Sinclair believed in choosing his objectives carefully to reduce gate counts. He did use custom chips, but mainly for cost reasons and to make hardware piracy a little more difficult, since his early TTL-only designs were widely cloned and copied.

The ideas that materialized in the Konix did come from the original ideas behind the LOKI, as did the RAM MUSIC MACHINE - which also had limited success. Sinclair had considered how to gain synergies in the improvement of graphics and audio to reduce gate requirements further also. But as with all great ideas, they need to manifest in the era which spawned them, and the Konix was too late and went a little too far trying to catch up to what had changed in the world in the meantime.

The Konix also did not have any capabilities to be used as a normal computer - or the software support and history necessary to make it successful.

But the general idea of adding more color to graphics displays primarily intended for games? And hardware facilitated sound capabilities? Sinclair was spot-on with that idea, as later machines demonstrated.
 
But the general idea of adding more color to graphics displays primarily intended for games? And hardware facilitated sound capabilities? Sinclair was spot-on with that idea, as later machines demonstrated

So basically he looked outside, saw what everyone else was already doing, and said “do that but cheaper!” as his company corkscrewed into the ground. Tale as old as time.

By 1985, Sinclair had realized that a games machine that could deliver "cartoon quality" graphics was the future of video games, and that even a home video game machine needed to be able to provide basic business functionality.

That doesn’t really make any sense, at least the “basic business functionality” part? I would agree that it was around 1985, in the hangover from the “crash of ‘83” that decimated the video game console business for a while and generated a fair amount of collateral damage in the home computer space, that people were starting to wake up to the fact that home computers that were *only* good for playing video games weren’t really going to have a long term future, but I would say that Sinclair actually totally failed to grasp the importance of creating *balanced* machines that were more powerful in all areas.

Dedicated video game consoles during the second half of the 1980’s nearly all retained 8 bit CPUs not much more powerful than the second gen consoles from the late 70’s and early 80’s but glued them to more powerful video processors. The Sega Master System, NEC PC Engine, heck, even arguably the Super Nintendo (the CPU in that is only as 16 bit as the 8088 is) are all examples of this philosophy, and it works fine when all the CPU is doing is running game logic and acting as a traffic cop for data DMA-ing between a ROM socket and the video/sound hardware. But it’s a lousy design for a general purpose computer. Sure, depending on the feature set a powerful video processor might help you run a GUI, but outside of that all that acceleration hardware isn’t going to buy you anything.

The Amiga chipset, being a big pile of DMA engines that substitutes RAM for the ROM socket, *also* would have been a lousy general purpose computer if it had been stuck with an 8-bit CPU, but it wasn’t, it had arguably the best 16/32 bit CPU of the era in it, which is what made it such an attractive platform for innovating ways to use its video-game optimized architecture for productivity, art, and creativity. The Loki, if it had ever existed, would have been a *huge* pain in the rear to program for compared to the Amiga; Motorola used to love touting benchmarks using large memory structures that a 68000 could iterate through almost an order of magnitude faster than an 8086 that otherwise could compete roughly equally with it in terms of IPC, a z80 with an external memory pager is going to be *even worse* for these transformative applications.

Everyone at the time thought Sinclair was on crack with his promises, and they were right. A system like Loki *kind of* works for a video game (although the problems with using RAM and mass storage instead of ROM cartridges are readily apparent to anyone paying attention to the piracy issues that invites*), but it’s a terrible computer even if it can run Wordstar.

(* I guess it looks like Sinclair was in fact planning to distribute Loki software on cartridges similar to HuCards, so… again, this is just silly. A Sega Master System could run CP/M on its Z80 if someone made a cartridge and disk drive to do it, does anyone seriously think that would be something anyone would pick over an XT clone, let alone an Amiga, in 1986? That is literally the comparison we’re looking at here.)
 
So basically he looked outside, saw what everyone else was already doing, and said “do that but cheaper!” as his company corkscrewed into the ground. Tale as old as time.

It wasn't a corkscrew. It was a nosedive powered by the C5... And yeah, he even beat Segway to that end also. The problem is that he assumed C5s would be useful - and they were - but councils and other interests rose up against them (repeated with the Segway later) - and killed them dead. Interestingly, they were worth twice as much as kids' toys - as the person who bought the failing stock quickly realized - they just needed a different market.

That doesn’t really make any sense, at least the “basic business functionality” part? I would agree that it was around 1985, in the hangover from the “crash of ‘83” that decimated the video game console business for a while and generated a fair amount of collateral damage in the home computer space, that people were starting to wake up to the fact that home computers that were *only* good for playing video games weren’t really going to have a long term future, but I would say that Sinclair actually totally failed to grasp the importance of creating *balanced* machines that were more powerful in all areas.

Dedicated video game consoles during the second half of the 1980’s nearly all retained 8 bit CPUs not much more powerful than the second gen consoles from the late 70’s and early 80’s but glued them to more powerful video processors. The Sega Master System, NEC PC Engine, heck, even arguably the Super Nintendo (the CPU in that is only as 16 bit as the 8088 is) are all examples of this philosophy, and it works fine when all the CPU is doing is running game logic and acting as a traffic cop for data DMA-ing between a ROM socket and the video/sound hardware. But it’s a lousy design for a general purpose computer. Sure, depending on the feature set a powerful video processor might help you run a GUI, but outside of that all that acceleration hardware isn’t going to buy you anything.

The Amiga chipset, being a big pile of DMA engines that substitutes RAM for the ROM socket, *also* would have been a lousy general purpose computer if it had been stuck with an 8-bit CPU, but it wasn’t, it had arguably the best 16/32 bit CPU of the era in it, which is what made it such an attractive platform for innovating ways to use its video-game optimized architecture for productivity, art, and creativity. The Loki, if it had ever existed, would have been a *huge* pain in the rear to program for compared to the Amiga; Motorola used to love touting benchmarks using large memory structures that a 68000 could iterate through almost an order of magnitude faster than an 8086 that otherwise could compete roughly equally with it in terms of IPC, a z80 with an external memory pager is going to be *even worse* for these transformative applications.

For what reason would it have been a problem to program?

It wouldn't have been too far removed from segmented '86 architectures. Different approach sure... But not too different. Still a 64K code segment.

The Jaguar had a far superior processor to the 68000 - about 50 times faster, so no, your assertion that it was the best is not even close. The problem is that there were no coders for the Jaguar chips, so people just used the 68000 chip in the Jaguar to run games, which is funny... Why? Because they were more familiar with it.

Had the z80 based Loki appeared in 1985, the huge base of programmers for Sinclair z80 machines would have shifted to it, generating content.

We live in a world where technical capabilities mean nothing compared to lots of desirable content. It's why the black-and-white Game Boy, with a slow Z80-class CPU, smashed the far superior Atari Lynx and others of the era in the mid-90's.

And Sinclair's Pandora was heading in the same direction as the Gameboy a decade earlier.

The Amiga was a good machine - it sold well - but it sold well because it was a Commodore - C64 owners automatically preselected their next upgrade... of course, it all fell apart a little bit later, but for a while, the Amiga did pretty well.

History is littered with good machines that went nowhere...

Everyone at the time thought Sinclair was on crack with his promises, and they were right. A system like Loki *kind of* works for a video game (although the problems with using RAM and mass storage instead of ROM cartridges are readily apparent to anyone paying attention to the piracy issues that invites*), but it’s a terrible computer even if it can run Wordstar.

Well, the first part is true, but disks were fine for video games. Would Sinclair have created some kind of hardware protection? Who knows. I think he was starting to realize business opportunities existed in other directions there - that piracy might actually help him. But who knows?

(* I guess it looks like Sinclair was in fact planning to distribute Loki software on cartridges similar to HuCards, so… again, this is just silly. A Sega Master System could run CP/M on its Z80 if someone made a cartridge and disk drive to do it, does anyone seriously think that would be something anyone would pick over an XT clone, let alone an Amiga, in 1986? That is literally the comparison we’re looking at here.)

This is closer to the truth than I think you intended. You *can* run CP/M on a Z80 on a Sega Master System *if* they made it do that - but they didn't. And it was expensive. And it would have cost a LOT more to run CP/M on it... All things that are different to Sinclair's intent.

Sinclair on the other hand had a pre-existing market of people who wanted a product and was going to make this all from day 0.

It's the difference between the very capable RAM music machine, which had a part of the LOKI system in 1986 but wasn't that popular, and the built-in AY-3-8912 chip in the 128K. Had the 128K come out with the same hardware, it would have been picked up on a much larger scale.

Companies didn't even write software for the 128K most of the time. They wrote 48K games with AY-3-8912 support based on the 128K ports, which resulted in a lot of people adding a music chip to older Spectrums. The market dynamics are completely different.

An equivalent? If the Apple IIGS had been released for US$500, that's about how it would have been in the UK/Europe.

Technical capabilities aren't the primary driver of markets... Just look at the PC.
 
I feel like all this was already hashed to death in previous threads, but I have to make one thing crystal clear here, which I'm surprised you keep ignoring:

Had the z80 based Loki appeared in 1985, the huge base of programmers for Sinclair z80 machines would have shifted to it, generating content.

Everyone who was actually there in April 1986 when Amstrad acquired the whole of Sinclair Research's product suite has been very explicit in interviews that Loki never existed in anything even remotely approaching a shippable state at the time of the acquisition, let alone in 1985. It was an ambitious wishcasting specification that *probably* wasn't even directly dictated by Clive Sinclair(*) and that at most resulted in a bunch of breadboards and design sketches. Even if Sinclair had been healthy and flush with money there's no way a system this complex could have been out the door before mid-1987 at the absolute earliest, and given Sinclair's track record with the QL it probably would have been a quality control nightmare if they'd tried.

(* The authors of the Konix Multisystem tribute site are convinced that the Loki specs were directly authored by the engineers that moved on to building the Flare One after Sinclair imploded, and given the degree of correspondence between the Loki specs and what Flare produced this interpretation holds a *lot* of water. And the blunt facts are that Flare didn't have sellable silicon until late 1988. Even if we allow for a full year's worth of setback from uncertainty as to any trade secrets/etc rights from the Sinclair purchase this puts the actual debut of a "Loki Spectrum" *way past* any 1986, or probably 1987, timeframe.)

Loki was NEVER GOING TO HAPPEN IN 1985. Full stop.

The Jaguar had a far superior processor to the 68000 - about 50 times faster, so no, your assertion that it was the best is not even close. The problem is that there were no coders for the Jaguar chips, so people just used the 68000 chip in the Jaguar to run games, which is funny... Why? Because they were more familiar with it.

What? I was referring to the Amiga, a machine that came out in 1985 based on technology that had been in development since 1982. The Jaguar didn't come out until 1993, with development starting in 1989. And anyway, that the Jaguar's DSPs can perform some operations "50 times faster" isn't even remotely the point I was making; the problem even with the Jaguar is that utilizing the performance of those custom chips for general purpose operations is really hard and, especially if we're imagining what you've laid out as your idea of what the "Loki" would have had, may even be impossible. The Jaguar's custom chips both had Turing-complete DSP cores so they could at least execute arbitrary code to some extent; the Flare One's (and certainly by extension the Loki's) were for the most part not. So even if you want to accuse a developer that runs game logic on the Jaguar's 68000 of "laziness"(*), you wouldn't be able to do that with a hypothetical developer on a Loki; they'd have no choice but to use the CPU from 1976 to run *everything* not directly related to line drawing, sprite placement, or music mixing.

(* It's actually a pretty interesting rabbit hole to read about the challenges of getting good performance out of the Jaguar. And this is with a system that took about four years to develop, not counting whatever was carried over from the Flare One. All this handwaving about a 1985 Loki completely underestimates just how much work it takes to refine even the greatest ideas into workable products, especially when you expect outside programmers to be able to actually use the system. Even if Sinclair had a big box with a fully working discrete-chip Loki sitting on a workbench in September 1985, a month after the Amiga 1000 first went on sale, it would have been at least eighteen months to a few years before you'd have it in a state where you could have sold it. And every account says it was *nowhere close* to that baked at the time of the Amstrad acquisition.)

Here's a page from a (1989) magazine scan on that Konix site that shows some hypothetical benchmark comparisons between the Flare One and the sort of computers the Loki was supposed to compete with:

[Attached: Screenshot 2026-01-07 at 11.35.04 AM.png (magazine benchmark table)]

The Flare is very good at drawing vector lines, but loses to or only slightly edges the Amiga everywhere else; even a humble Atari ST can splat sprites almost as fast and it doesn't even have a blitter. Notably the Acorn Archimedes absolutely turns it inside out everywhere except for line drawing, and we know that is entirely due to its ARM CPU being a monster compared to the 68000 in the other two. (So far as I can tell the video in the Archimedes is an entirely dumb framebuffer other than the mouse pointer sprite.) In other words, it's going to get its butt absolutely handed to it by an Amiga running any software that's not *intricately tailored* to whatever fixed-function go-fasts the Loki chipset would have had.

(Touching on the Jaguar again, this seems pretty relevant when you consider the systems that absolutely drove a stake through the heart of the Jaguar were the Sega Saturn and the Sony PlayStation, both of which had RISC CPUs far stronger than the 68000.)

This is closer to the truth than I think you intended. You *can* run CP/M on a Z80 on a Sega Master System *if* they made it do that - but they didn't. And it was expensive. And it would have cost a LOT more to run CP/M on it... All things that are different to Sinclair's intent.

Here's another tidbit from an August 1988 article that reinforces what I said about the usefulness, or lack thereof, of CP/M to a "Loki":

[Attached: Screenshot 2026-01-07 at 11.54.53 AM.png (August 1988 article excerpt)]

In other words here are the designers of "Loki" saying very bluntly that the *last* thing the system is good for is running general-purpose CP/M applications. I suppose if Loki had actually gone on sale they would have *tried* packaging Wordstar up for it to make the case that by buying a Loki you weren't just buying a toy for your kids, you were getting a "real computer". William Shatner would be proud.

So that all said... what *do* you think Sinclair's actual intent was, anyway? Was it actually going to be a game console or was it supposed to be a computer? I will fully allow that the Loki/Flare One was a decent idea for a gaming console that was probably somewhat ahead of its time in trying to implement higher color depths and better vector support than was standard practice in the second half of the 80's (if you read these articles about the Flare that I linked to above their explanation about why they wanted to go with "fat pixels" instead of planar graphics because of the advantages that provides for manipulating 3D graphics is pretty legit and, in fact, 3D rendering is a place where the original Amiga chipset actually faceplants pretty hard), but considering the other limitations of the time in storage and overall system performance it is completely foreseeable that actual implementation of these ideas in mass-market affordable consoles wasn't going to happen until the early 90's. But at the same time the Flare/Loki hardware does not actually offer that much as a computer platform; there's a lot more to life than drawing lines and the Loki/Flare wasn't actually that good at anything but that, and, again, it was never, ever, ever going to happen in 1985.

And Sinclair's Pandora was heading in the same direction as the Gameboy a decade earlier.

What? Every single account of the Pandora describes it as an abortive attempt to put a computer roughly like the Spectrum into a clamshell case equipped with a goofy series of lenses and mirrors that would blow up the display of one of Sinclair's mini TV tubes into something vaguely like a modern laptop screen. (Except every account of the prototypes describes it as worse in almost every possible way, forcing the user to try to de-focus their eyes into infinity to be able to see ghostly output hovering in space behind the glass.) Drawing any kind of line between this and the Gameboy is... a stretch.

If you're referring to earlier incarnations, like when they showed a ZX-81 with one of their mini TVs in it, well, Commodore did the same thing with stuffing a tiny TV into a VIC-20 and showing it off at computer electronics shows. When asked they'd always say the mockup was an example of something they *could do*, not that they were seriously considering releasing something this ridiculous. I guess that also counts as a proto Gameboy?

And yeah, he even beat Segway to that end also.

That is definitely not a flex. I first learned about the C5 specifically because it was mentioned a lot in stories about how Segway's final product completely failed to live up to the "revolutionary" hype that was going around pre-launch. As for the legal issues both faced, both products posed similar serious safety risks. (IE, both are dangerous to mix with pedestrian traffic, and likewise both really couldn't operate safely in the street; the C5 specifically being infamous for putting the rider's head pretty much at bumper level.)

I get that Clive Sinclair is revered at roughly the level of Steve Jobs in certain circles, but... and this is coming from someone who's *far* from a Jobs fan, I think Jobs may have had a better sanity filter when it came to his ideas.
 
It wouldn't have been too far removed from segmented '86 architectures. Different approach sure... But not too different. Still a 64K code segment.

A Z80 has a 64K "segment", full stop. An 8086 has separate code and data segment registers, and if you need to do a far call to get something outside of your current range it's almost always going to be less overhead than having to bang on a memory mapper. The 8086's segment registers also give you a *16 byte* resolution for defining where you want your segments to be, which is far better granularity than most memory mappers, and the system also mostly automatically takes care of making code modules location-independent. It's just automatically better than any reasonable Z80 MMU even with all the warts.
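
To make that concrete, here's a tiny C sketch of the 8086's address arithmetic (the seg:off values are just arbitrary examples of mine):

Code:
/* 8086 physical address = segment * 16 + offset: 20 bits total, with
   16-byte ("paragraph") granularity for where a segment can start. */
#include <stdio.h>

static unsigned long phys(unsigned seg, unsigned off) {
    return ((unsigned long)seg << 4) + off;
}

int main(void) {
    /* Different seg:off pairs can alias the same physical byte. */
    printf("1234:0010 -> %05lX\n", phys(0x1234, 0x0010)); /* 12350 */
    printf("1235:0000 -> %05lX\n", phys(0x1235, 0x0000)); /* 12350 */
    return 0;
}

A Z80 bank mapper, by contrast, can only swap whole fixed windows, and every window change is an out-of-band write to a mapper register.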

Honestly the *only* reason to stick with a Z80 after 1985 is if you're trying to retain compatibility with some legacy codebase, and this is actually why I'm especially sure that the Flare One *is* in fact the genuine Loki. The Z80 doesn't make any sense otherwise; if it were a clean-sheet design they probably should have gone with a better CPU from day one. (Yes, a Z80 was cheaper than an 8088, but not by much.) I think it's in there because the design accommodated a Sinclair mandate that there had to be a Spectrum backwards compatibility mode, and only stuck around because it was "good enough" for the tech demos they were using to try to get buyers. (Flare was promoting the tech, they never really were in a position to bring a console to market by themselves.) Notably they were perfectly willing to immediately throw it overboard for x86 when Konix showed interest but was skeptical the Z80 was really enough oomph for the games they wanted to run on the chipset.
 
It does seem incredible that the A500 came out before the PC200 when you put them next to each other.

For sure, a CGA XT can't help but come across as kind of hilariously bad if you set one next to an Amiga or Atari ST, especially if they're all running game demos or other graphical stuff.

(This is another area where the Tandy 1000's enhanced graphics saved them; even though its 16 color 320x200 is based on the IBM PCjr from 1984, that's still actually mostly a tie for what most games written for the Amiga and ST used, other than those machines having bigger color palettes. And its sound chip was also roughly on par with the ST's. The 1000 EX/HX's CPU was probably optimistically about half the speed, but a game at least could look and sound far better than it could on a regular 4-color CGA/PC Speaker XT.)

Regarding that side trip above about substituting specialized chipset hardware for more powerful CPUs, though... it might be instructive to note that the two consumer computer standards left standing in the mid-late 90's were the PC and the Apple Macintosh, both of which were platforms that for the most part completely dispensed with blitters, sprites, etc, and relied on Moore's Law to give them more muscle to improve their graphical talents as the bar moved upwards. The Amiga was a great system for the mid-late 80's but the close coupling of CPU to chipset made it more difficult to iterate forward. Commodore Amiga snobs were right in laughing at how much ahead of the curve their computers were compared to a similarly priced entry level PC in 1987-1988, but time did catch up when the Amiga tech basically ended up frozen in amber for half a decade, with future incarnations being too little, too late. Ah well.
 
It does seem incredible that the A500 came out before the PC200 when you put them next to each other.

Doesn't it... But given the earlier Amigas that were already out, the later models were pretty amazing and took the premise even further by reducing the cost for the same (more or less) capabilities.

Nothing about the PC at the time was remotely advanced or even good value - with the exception of VGA pushing lower-cost photo-realism further - and the PC200 came out with CGA and a single expansion port, without even the ability to place a full card inside the case.

It was rumored to be the "Super Spectrum" final form at the time also, though history has shown they were simply using the Sinclair name to push product...


I get that Clive Sinclair is revered at roughly the level of Steve Jobs in certain circles, but... and this is coming from someone who's *far* from a Jobs fan, I think Jobs may have had a better sanity filter when it came to his ideas.

And *this* is the real reason the Sinclair PC200 existed... Because neither Sinclair nor Jobs were the "incredible" people that they were thought to be. Both were deeply flawed individuals. Though Sinclair was an early Jobs/Wozniak hybrid rather than simply "Jobs", and he never really understood business. He was deeply fortunate other engineers came along to take things further than he could have imagined.

And his name was deeply ingrained into the UK and European markets - far more than Alan Michael Sugar TRADing ever was going to be - though AMSTRAD did slowly build a name for itself with the early PC1512 and PC1640s, far outside of what it achieved with the PCWs and earlier CPCs. The main difference though is that Sinclair was an ICON. AMSTRAD was a BRAND.

But Sinclair, like Jobs, had visions of new concepts and the one common uniting idea behind them was "affordable"... And in this respect, he truly surpassed Jobs and others like them - this is exactly what he managed to do. His ability was to find a single cost-related edge over his competition and to exploit this for business purposes. Whether it was "out of spec" transistors or new ways to achieve something complex like advanced computer video circuitry without the heavy custom-chip requirements, that was his objective.

Because of this, I don't think FLARE is what he would have made... I think FLARE is what happens when engineers get to run ahead of what the company wants to achieve.

The only outcome from the Loki that ever hit the market was the RAM music machine, but it wasn't combined with the planned video capabilities. Much of the Loki concept could have been made into a simple expansion device, but that would have failed no matter how amazing, because it would never have been a "Sinclair Spectrum" - just another interface.

I get you're far more obsessed with this story than I... And I love your discussion points and really want to hear more about what you just said, since you've already dug up some things I wasn't aware of - but I only brought up the elements of this story because they were *directly* related to the PC200 and how the media reported it back in the day... Which I remember because I was buying Sinclair User (or maybe it was Crash or something else?) back in the years after the Super Spectrum was touted, and many people simply assumed the PC200 was it. Magazine veracity of the era was truly dismal.

And yes, I was as much a Sinclair fan-boy as many Apple users are today... In fact, it was around 1988 I realized I should just switch to the winning side for my next acquisition - and so I went with the AT clones and never looked back... Even by this time, it was clear the PC would eventually surpass the Amiga and Mac - And to that extent, the PC200 still made sense in a world where it was under-powered, late to market and limited in capabilities. I'm honestly surprised they didn't also make it work with normal TV sets - which was possible even with only Composite Video back then. Difficult to read, sure, given the broad dot pitch of most home TV sets, but not impossible.

Tell you what - I'll make up a new thread so we can hash out all the fun bits outside of the PC200 thread - :)

And I'd love to know where you got that article claiming the Loki was going to have a coprocessor from... I mean, I remember discussion around that at the time, but I never found any evidence supporting it - that's the first time I've seen it.
 
And I'd love to know where you got that article claiming the Loki was going to have a coprocessor from... I mean, I remember discussion around that at the time, but I never found any evidence supporting it - that's the first time I've seen it.

This is a really weird nit to pick. You said this:
Except Sinclair had originally intended to release a Spectrum with similar capabilities to the Amiga in 1986, with 64 colors and hardware bit operations, as well as hardware vector graphics.

so you are right here acknowledging that the intent was to equip the Loki with "coprocessor" hardware. I realize that there are plenty of hairs to split here, it's very much open to debate how complex a blitter needs to get before you call it a full-fledged "coprocessor", but you had no hesitation in insisting that these hardware assists are what would magically make the Loki "competitive" somehow with an Amiga-class computer despite its rotgut CPU.

The Flare One chipset has an actual DSP, if a very crude one, embedded in the chipset element that handles the sound mixing. If you want to insist that the Loki is in fact not the Flare One then fine, Loki only has the fixed-function blitter, and if we're going to say a blitter doesn't quite qualify as a "coprocessor" then just edit any implication that I said it had one out of the discussion. Although I will point out here that both the "Copper" and "Blitter" components of the Amiga chipset are regularly referred to as "coprocessors", and between being able to, well, blit, and having vector line drawing functions, the thing that's in the Loki specifications sounds *very much* like it would fall into the same category as the Amiga's blitter. In short, it's a fixed function coprocessor.

Because of this, I don't think FLARE is what he would have made... I think FLARE is what happens when engineers get to run ahead of what the company wants to achieve.

Do you have any source for this? Because I have yet to find *anything* that points to Clive Sinclair being particularly involved with the nuts-and-bolts of Sinclair Research's computer business other than as a manager. The ZX-80 was designed by a guy named Jim Westwood, and Richard Altwasser is credited for being the main designer of the ZX Spectrum. I'm not saying that Clive wasn't "interested" in what was happening here and, I dunno, maybe with those early projects he was the most obnoxious micromanager that ever lived (just like Steve Jobs was prone to being), but this kind of statement basically reads like a "What would Jesus Do?" to me.

That Konix site has the first few paragraphs of the actual LOKI "Specification of the Super-Spectrum Entertainment Engine" proposal document. Read the feature bullet points, all of them except item number 4 (the one about Spectrum 48/128 compatibility)

Code:
1.1 Features

1. Z80 main processor running at 7 Mhz (twice Spectrum speed).
2. 128K of RAM, composed of 2 64Kx4 dual port video RAMs and 2 64Kx4 ordinary dynamic RAMs. It is possible to expand the machine to have 1 Mbyte of address space.
3. Display resolutions of 256x212 with 8 bits per pixel, or 512x212 with 4 bits per pixel.
4. Spectrum 48/128 software compatibility mode (128 sound may be a problem).
5. Rasterop hardware performing screen animation, scrolling and line draw.
6. Polyphonic synthesiser quality sound.

and compare them to the Flare One and Konix Slipstream developer documentation available on the downloads page. There are some minor differences, IE, they appear to have eliminated the requirement for dual ported RAM for the video system, the CPU clock divider results in a 6 MHz clock instead of 7 MHz, and they added a DSP in front of the audio DACs (assuming that wasn't in the original Loki spec, although the "Polyphonic synthesizer quality sound" bullet point is kind of nebulous about if there's any hardware assist for sound or if it's just DACs like the "RAM Music Machine"), but when you add it all up the Flare One seems to be dead on with both the Loki specification and what was "leaked" in the breathless article about the "SuperSpectrum" that appeared in the July 1986 issue of Sinclair User... that I will get back to later.
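
As a rough illustration of what bullet point 2's "1 Mbyte of address space" has to mean in practice for a Z80: some external mapper has to turn 16-bit CPU addresses into 20-bit physical ones. The 16K-window layout below is purely my own guess for illustration; the spec excerpt doesn't say how the paging actually worked.

Code:
/* Hypothetical Z80 bank mapper: four 16K CPU windows, each pointing at
   one of 64 physical 16K pages (64 x 16K = 1 MB). */
#include <stdio.h>

static unsigned char bank[4] = {0, 1, 2, 3};  /* mapper registers */

static unsigned long translate(unsigned cpu_addr) {
    unsigned window = (cpu_addr >> 14) & 3;  /* which 16K slot */
    unsigned offset = cpu_addr & 0x3FFF;     /* offset within the slot */
    return ((unsigned long)bank[window] << 14) | offset;
}

int main(void) {
    bank[2] = 40;  /* map physical page 40 into the 0x8000-0xBFFF window */
    printf("CPU 0x8123 -> physical 0x%05lX\n", translate(0x8123));
    return 0;
}

Every cross-bank access costs an out-of-band write to one of those mapper registers, which is the overhead being compared to an 8086 far call earlier in the thread.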

So with all that said, what do you think Clive Sinclair thought about Loki? It's obvious he didn't design it, he didn't design the Spectrum either, so... do you think the whole thing was a debacle that only got away with wasting company money because he was busy fooling around with the C5? If that's what you think then, well, why do you keep bringing it up? Or if you do think Clive wanted Loki, then what exactly is wrong with the Flare One? If you're just going to say it's that they added a DSP to offload sound generation that singlehandedly ruined the whole thing, well... I guess that's an opinion?

I get you're far more obsessed with this story than I.

I'm just amused that it keeps getting brought up in times like this, and every time it not only gets better and more perfect/special/innovative, but earlier in time. I mean, again, you said this:

Had the z80 based Loki appeared in 1985, the huge base of programmers for Sinclair z80 machines would have shifted to it, generating content.

When doubling down on how this mysterious never-happened machine would have been the bee's knees if only Saint Clive hadn't been beset upon by forces greater than even his inhuman genius could overcome, and it's just ridiculously ahistorical. The first breathless article about the so-called "SuperSpectrum" was published in a July 1986 issue of Sinclair User, after the Amstrad acquisition, while every more modern account I've seen of the situation (apparently you're admitting I've put way more research into this than you have?) from the engineers involved says that the whole thing was little past the vaporware stage. So... that's really all I'm asking for, here? If you have even the tiniest scrap of evidence you can link to which would make a "Retail Loki" circa 1985 (versus 1988+) a realistic possibility I would absolutely love to see it. Otherwise talking about it in a 1985/86 context is basically Steampunk science fiction.

But I only brought up the elements of this story because they were *directly* related to the PC200 and how the media reported it back in the day... Which I remember because I was buying Sinclair User (or maybe it was Crash or something else?) back in the years after the Super Spectrum was touted, and many people simply assumed the PC200 was it. Magazine veracity of the era was truly dismal.

Okay, sure. I've heard tell of just how trashy the Sinclair magazines were at the time, I don't suppose it would surprise me a whole lot if one of them would have been stupid enough to somehow link the PC200 to the Loki... but I'm going to guess it wasn't Crash!, because they actually published an article a month after Sinclair User's article back in August 1986 completely shredding it. But... again, you say this:

Except Sinclair had originally intended to release a Spectrum with similar capabilities to the Amiga in 1986, with 64 colors and hardware bit operations, as well as hardware vector graphics. Memory necessary to support that cheaply was available from 1985 onwards. So it was going to happen anyway - It was only because Sinclair was sold off to Amstrad by its creditors that this didn't happen.

citing that bogus 1986 date, which all the actual documentation says was never in the cards, and you state as if it were a fact that it only didn't come out because of the Amstrad acquisition, IE, you're repeating the falsehood spread by one of those trashy magazines that this was tech that was practically finished and ready to roll out the door when it was nowhere close to that.

So... again, I dunno. All I'm really wondering here is if you have any evidence at all that counteracts the mountain of paperwork and research there is at the Konix site (and a few other stashes around the web) that seems to thoroughly debunk the idea that Loki could have realistically come out in 1986... let alone 1985. 1985 seems like it literally would require time travel technology.
 
But Sinclair, like Jobs, had visions of new concepts and the one common uniting idea behind them was "affordable"... And in this respect, he truly surpassed Jobs and others like them - this is exactly what he managed to do. His ability was to find a single cost-related edge over his competition and to exploit this for business purposes. Whether it was "out of spec" transistors or new ways to achieve something complex like advanced computer video circuitry without the heavy custom-chip requirements, that was his objective.

... one more thing, I guess: Ever hear of a guy named "Madman Muntz"? The way you're painting Sir Clive here I'm starting to think that Muntz was the One True Ur-Clive, with a long history of Cliving before Sir Clive even started Cliving. There's even a word for that kind of Cliving: Muntzing

In the 1940s and 1950s, television receivers were relatively new to the consumer market, and were more complex pieces of equipment than the radios which were then in popular use. TVs often contained upwards of 30 vacuum tubes, as well as transformers, rheostats, and other electronics. The consequence of high cost was high sales pricing, limiting potential for high-volume sales. Muntz expressed suspicion of complexity in circuit designs, and determined through simple trial and error that he could remove a significant number of electronic components from a circuit design and still end up with a monochrome TV that worked sufficiently well in urban areas, close to transmission towers where the broadcast signal was strong. He carried a pair of wire clippers, and when he felt that one of his builders was overengineering a circuit, he would begin snipping out some of the electronics components. When the TV stopped functioning, he would have the technician reinsert the last removed part. He would repeat the snipping in other portions of the circuit until he was satisfied in his simplification efforts, and then leave the TV as it was without further testing in more adverse conditions for signal reception.

In this light I feel like your main objection to the Flare One is it's insufficiently Muntzed to live up to Sir Clive's high standards.

Basically all I'll say there is, sure, believe me, you'll get no argument from me, Sinclair Research took Muntzing to amazing new heights with the ZX-80 and the Spectrum... but Muntzing always comes with a cost. (Madman Muntz's TVs worked well enough if you lived in downtown New York and could see the transmission tower from your window. They didn't work so well anywhere else.) I guess all I would ask is what is your basis for assuming that the Flare One isn't Clive-compliant. After you've read the technical manual let me know what you'd cut out with pliers given the chance?

(Edit: in addition to the tech manual read the magazine articles; the Flare One actually looks pretty Muntzed. There are interviews with the developers criticizing the Amiga for having “too many ways to do the same thing”, touting how *their* blitter is properly stripped to the bone. And, also, the early versions of the chipset didn’t even have a palette register, if you wanted a color cycling effect you used the blitter/vector engine or CPU to cycle every pixel. Which they both claimed their blitter was fast enough to do while simultaneously dismissing color cycling as a gimmick. Smells Sir Clive approved to me…)
 
... one more thing, I guess: Ever hear of a guy named "Madman Muntz"? The way you're painting Sir Clive here I'm starting to think that Muntz was the One True Ur-Clive, with a long history of Cliving before Sir Clive even started Cliving. There's even a word for that kind of Cliving: Muntzing

Similar - I had a boss who had similar ideas to this, but I think it was because his engineering knowledge in this area wasn't as good as that of many of his staff.

One significant difference though - Sinclair would go one step further... "What if we buy the reject tubes for next to nothing since they throw them out anyway?" would have been Sinclair's approach.

Neither is the first in history to take this approach.

Sinclair did the early analog design himself, but very quickly applied his doctrine to the more capable staff he employed to build later technology... He was a big supplier of Amstrad in the early days and probably gave Alan Sugar much of his start building hi-fi systems.
https://en.wikipedia.org/wiki/Muntzing

In this light I feel like your main objection to the Flare One is it's insufficiently Muntzed to live up to Sir Clive's high standards.

Basically all I'll say there is, sure, believe me, you'll get no argument from me, Sinclair Research took Muntzing to amazing new heights with the ZX-80 and the Spectrum... but Muntzing always comes with a cost. (Madman Muntz's TVs worked well enough if you lived in downtown New York and could see the transmission tower from your window. They didn't work so well anywhere else.) I guess all I would ask is what is your basis for assuming that the Flare One isn't Clive-compliant. After you've read the technical manual let me know what you'd cut out with pliers given the chance?

I don't know enough about the specific detail of the Flare One to comment too deeply - but everything said suggests that, without Sir Clive to hold them back, they achieved their vision over Sir Clive's... Note that they seem particularly proud of the Flare One prototype - that it actually existed (reflecting earlier comments that no full Loki prototype ever existed - not even full specifications...)

But take a look at the board... It's pretty advanced by common specs of the era. I think it's far more advanced than the Loki would have been. But it would have failed because of a lack of legacy support.

[Attached photo: flare_one_proto.jpg]


FOUR large custom chips... There's no way Sir Clive would have accepted that... One maybe... And probably he would have stuck with a Ferranti ULA.

I think that the video itself is relatively easy to obtain using the lowest cost components of that era. Two options existed. One was VIDEO DRAM - already out in 1985 - with BIT operation capabilities supporting BLIT even back then... Not too cheap though. The other option that would meet specs for a frame buffer, and be cheaper than a multi-plane memory setup, was 64Kx1 SRAMs - fast enough to run at 210ns to the CPU under all conditions for read (worst case) and writable with zero wait at up to 140ns... They would even have worked with a Z80 at around 14 MHz, had such a part existed; 8 MHz Z80s certainly did, and Sir Clive is likely to have gone for 7 MHz.

Some specs would change as the engineering team changed - they influenced matters greatly - but once you have the capability to run automated memory operations at 7 MHz and create a video output of 256xN (N being arbitrary from 192 to 256) then the video question is complete. The video for the LOKI can be reproduced with 2x22 pin PAL chips from 1985 and 8 x 64Kx1 frame buffer chips, with the rest being COTS 74 series logic - even LS would be fine for the speeds. This much I built myself to test the theory using chips from 1985. The video is the most expensive part of the solution and would have cost around GBP40 to GBP50 just for those components.
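
Here's the arithmetic behind those timing figures, assuming a PAL line of 64us with roughly 52us of active display - my numbers, so treat them as ballpark:

Code:
/* Video fetch budget for a 256-pixel, 8 bits/pixel line read byte-wide
   from 8 x 64Kx1 SRAMs, vs. a 7 MHz Z80's 3 T-state memory cycle. */
#include <stdio.h>

int main(void) {
    double active_us   = 52.0;                        /* visible PAL line */
    double ns_per_byte = active_us * 1000.0 / 256.0;  /* ~203 ns per byte */
    double z80_ns      = 3.0 / 7.0e6 * 1.0e9;         /* ~429 ns per cycle */
    printf("video needs one byte every %.0f ns\n", ns_per_byte);
    printf("7 MHz Z80 memory cycle:   %.0f ns\n", z80_ns);
    return 0;
}

Which is roughly where the 210ns worst-case read figure lands, with slack left over to interleave CPU accesses against the video fetch.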

HVG (hardware vector graphics) requires around 10 TTL chips to draw pixels at 7 per microsecond, and also provides the interface glue to the CPU. Add a simple 2-dimensional DMA capability and you have sprite-like capabilities, even if you ignore other things like screen planes, and you don't need fills like the Amiga had - you can achieve that in a single part of the output chip by holding the last color when a "transparent" color is present. This also supports other potential future add-ons such as the Genlock they touted, and you can reduce the burden on the system by dropping the video system clock to 12 MHz instead of 14 MHz, giving you a broader horizontal image space also.

This is how I think Clive Sinclair would have approached the problem - not by removing circuitry, but by removing circuit complexity. A single fixed video mode with fixed timing and clever use of that timing. I think he would have used magic muntz clippers that could remove "logic" gates from the equation - not components. A single ULA, but nothing too complex - and a few supporting PAL chips as per the 128K design.

And... he would have changed the spec as often as it suited the magic "muntz" clippers he had - without a doubt. No DSP in sight. Especially in 1985.

But most of the specs are obtainable with the same hardware - and vertical lines are just a count function in the logic implementation with fixed logic - i.e., if it had 192 lines, this would be fixed. If it was 212 lines, this would be fixed. If it was 256 lines, this would be fixed. No register allowing "selection".

There's other things. He would have made it VERY modular. Modular costs more, but it lets you do sneaky stuff, like leave components out... If making it 48K compatible wasn't cheap? Then this would have been an "Upgrade" - potentially even requiring you to use your 48K Speccy as a "compatible keyboard" so that you can upgrade at the price... And use it as the 48K compatible mode also. There's not too many gates more for 48K compatibility - it basically doubles to triples the gate count - maybe a ULA could handle it, but he knew from Chris Curry's woes not to overload ULAs. The biggest problem in having 48K compatibility is the attribute memory - so much so they might have just used shadow RAM for that.

Was it going to happen? No. No question from me there. Sinclair was finished in 1985. It would have taken a massive drive to make it possible - but it was possible - and at the price touted, without complex silicon. Sinclair might have stuffed it up too - always a risk - look at how well the QL fared.

He had three projects. The LCC, the Loki and the Pandora.

The LCC? A cut-down Spectrum. Minimal. Games console. A cheap Christmas present - and I think he realized the market wouldn't support it.

The Pandora? He did have a LOT of flat-tube CRT technology - he was a pioneer there. You can buy tubes today that would have fit right into a Pandora - I tried one here - it would be more than suitable. But the geometry of the screen is a bigger problem with computing than with video broadcasts, and he never got it working right - even on the bench. As was noted, his Z88 was a much better idea and is what the Pandora became.

The Loki? It was a computer with games capabilities. It probably would have been capable of running CP/M, but there's no way Sinclair would have supplied CP/M. He would have written his own version from scratch like I did - something that is clearly NOT CP/M but has enough compatibility that most CP/M software will still work - the DOS exception was too well known by then.

I've probably gotten about as far as Sinclair did with my own Loki project. I've tested parts on the bench, but no working full computer. OS is written and working. I went a little further there. I built an emulator and completed the OS. Still buggy, but working. And I'm presently converting Spectrum BASIC to CP/M as the next step - which isn't canon since Sinclair said he would have converted SuperBASIC from the QL. But since Spectrum BASIC is already z80, I figure it's close enough that I can "insist" he would have changed his mind by then.

If he built it would it have been a success?

Hmmm. On the balance of probabilities, yes. Enough people would have bought it, despite missing most of the promises he made, to have made a difference. Whatever he didn't need to supply could be an add-on... If he made it an "upgrade" to the ZX Spectrum on day one, people would have been mad, but they would have bought it still, even at 200 pounds. Especially if it came with a Disk Interface and the capability to run a real monitor with 80x25 video, and run CP/M software. I know you think CP/M was dead by then, but there was a small window around this time when it was still popular outside the US. I think he would have effectively leveraged that sentiment. And those without an existing Spectrum 48K or 128K? Well, they could buy a keyboard and a 48K compatible module for the video card... That's very much how I think Sinclair would have done it. It's not like there wasn't a history of those kinds of decisions.

Then, if successful, he could have moved up to the z280 and, if (another problem that history brings up) Zilog hadn't imploded, maybe we'd have seen similar competition to Intel as happened earlier. Maybe we'd see 64 bit z80 derivatives in modern computers. Or maybe he'd also go with the '86 architecture at some point, converging on the PC. He never really spoke about it much after 1985.
 
… so, yeah, there we go. This whole thing about treating Loki as if it were a real Sinclair product that would have come out in 1985 for reals if it wasn’t for Amstrad *is* literally fan fiction.

I mean, that’s really the nub of it, I guess. This discussion kicked off because you stated as if it were fact that Sinclair *actually* was on the verge of releasing a machine with “cartoon quality” graphics in 1986, and now you’re backtracking and saying that it would have been totes *possible* to make a 256 color framebuffer with a z80 glued to it in 1985 under some arbitrary price point and that justifies the statement. But… again, the actual *truth* is it never happened at Sinclair, not even close to it. I guess I would simply suggest we try to have a higher firewall between history and fantasy.
 
… so, yeah, there we go. This whole thing about treating Loki as if it were a real Sinclair product that would have come out in 1985 for reals if it wasn’t for Amstrad *is* literally fan fiction.

We don't even need to question this - it's history.

But as a thought experiment, would it have come out if Sinclair had continued?

Absolutely... The specs weren't that unreasonable. They only needed to build the computer.

Given the Loki specifically was what Sinclair used to pitch to his creditors to give him time, there's no way it wouldn't have had to come out if they supported him. That much is ALSO history... Sinclair wouldn't just have "offered" it to Alan Sugar either. He would have been looking for a partnership of some kind. But I think he would have preferred to have avoided Amstrad entirely and if he had succeeded, 100% the Loki would have existed.

As for the final specifications? They would likely have been close to what he said at the time. It wasn't that much of a stretch... I think there would have been some flexibility in how it was delivered and how much was optional - much as the original Spectrum didn't come with the promised drives (Interface 1 / Microdrives). And as much as I hate to admit it, microdrive support would have been included as the first option for storage - that's unavoidable - but we know they were working with floppy disk drives at the time also, so we can assume this too would have happened at launch.

Would it have been better than the Amiga? Overall, no. Maybe here or there in a cherry-picked spec, but generally you'd be asking "Would TTL-like video acceleration have been better than the Amiga's custom video hardware?" - we don't need a crystal ball to answer that question.

But would it have had enough performance to bring in a new era of games? Yeah... 100%... I think Sinclair Flight Simulator would have been one of the first programs written for it. They probably would have asked Psion to upgrade the code to take advantage of hardware improvements and more memory.

As for the clues? Not just within Sinclair or even Amstrad. You need to look at the entire state of the computing industry in 1985 and how the changes affected it from 1984 - especially around strategic decisions that made sense to Sinclair in 1984 that wouldn't have made sense anymore in 1985.
 