Is there a goal here, or is this just an elaborate experiment? Also: painful compared to what? The competition is the pain you and (possibly) Tim Paterson experienced writing an assembler from scratch on a near-dead machine. I'm not sure if you step through multiple versions of the assembler as the first steps, but that's what I envision doing with Sector C: I want to use Sector C to write a better C compiler, and then use that C compiler to write an even better one. I know that once I hit the most primitive version of SubC (about 4,000 lines of code) shipped with PDOS/386, I will be in a position to take two more steps to get a better SubC (about 6,000 lines of code).
There's a story that when Chuck Moore developed ColorFORTH, he started with just MS-DOS, DEBUG, and a floppy. Working through enough cycles of that, he got ColorFORTH working, booting off the floppy (he dumped MS-DOS), and once he got it firing on one cylinder, all bets were off.
Of course, Chuck Moore is the kind of person that can do that kind of thing. It takes a lot of attention to make it work, as well as patience (especially if you're restarting the hardware a lot, which anyone would be doing at that level, depending on their approach). It also takes a lot of paper.
Woz is storied to have created Apple Integer BASIC hand-assembled and hand-keyed. He was renowned for being able to assemble 6502 code in his head.
Forth is an excellent candidate for this kind of thing simply because it forgoes any elaborate file system. In order to start iterating, you need persistence. The Forth block system is clever and straightforward. And a single 1.44 MB 3.5" DOS floppy holds a LOT of Forth screens (1440 of them). The released implementation of FIG-Forth for the 6502 is only about 50 screens total (sans the assembler, but that's only, what, 4 or 5 screens of code?).
You could certainly write a C compiler, and its runtime, in Forth. There used to be a version of Pascal in Forth floating around back in the day.
But this is why folks used other development systems: just getting access to things like real text editors and file systems. Everything else can be shipped over a serial cable to the new host.
Emulators also improved turnaround time.
Speaking of BASIC, if you have access to BASIC, you can bootstrap from that as well. Back in the day, someone wrote an Ada subset, both the compiler and the P-code runtime, in BASIC.
Others trying to bootstrap C tend to use simple Lisp languages, writing small C compilers in Lisp. Their goal is to compile GCC 2.95, I think? Since that was the last version that could be compiled by a generic cc on Unix systems, and then use that to compile a modern GCC. Of course, their goal is maintaining provenance of the build, not starting from scratch on a bare board.
All a matter of what the overall goals, motivations, and self-imposed limitations are.
I did some work with Turbo Pascal on CP/M in a simulator, with the Z80 clock turned down to a synthetic 4 MHz. You run into all sorts of strange limitations doing that, and, yea, you notice that slow CPU. The simulator wouldn't simulate I/O speed though. Boy were floppies slow.