
Does Turbo Assembler or MASM come with an IDE?

Also - painful compared to what? The competition is the pain you and (possibly) Tim Paterson experienced in writing an assembler from scratch on a near-dead machine. I'm not sure whether you stepped through multiple versions of the assembler as your first steps, but that is what I envision doing with Sector C - I want to use Sector C to write a better C compiler, and then use that C compiler to write an even better C compiler. I know that once I reach the most primitive version of SubC (about 4000 lines of code) shipped with PDOS/386, I will be in a position to take 2 more steps to get to a better SubC (about 6000 lines of code).
Is there a goal here, or is this just an elaborate experiment?

There's a story that when Chuck Moore developed ColorFORTH, he just started with MS-DOS, DEBUG and a floppy. Working through enough cycles of that, he got his ColorFORTH working, booting off the floppy (he dumped MS-DOS), and once he got it firing on one cylinder, all bets were off.

Of course, Chuck Moore is the kind of person who can do that kind of thing. It takes a lot of attention to make it work, as well as patience (especially if you're restarting the hardware a lot -- which anyone would be doing at that level, depending on their approach). It also takes a lot of paper.

Woz is storied to have created Apple Integer BASIC hand assembled and hand keyed. He was renowned for being able to assemble 6502 code in his head.

Forth is an excellent candidate for this kind of thing simply because it forgoes any elaborate file system. In order to start iterating, you need persistence. The Forth block system is clever and straightforward, and a single 1.44M 3.5" DOS floppy is a LOT of Forth screens (1440 of them, at 1 KB each). The released implementation of fig-FORTH for the 6502 is only about 50 screens total (sans the assembler, but that's only, what, 4 or 5 screens of code?).
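
As an aside, for anyone who hasn't seen the block model: the whole "file system" is just numbered 1 KB blocks on a raw disk. A rough sketch in C, illustration only (not any real Forth's code, and the image file name is made up; a real Forth would talk to the disk controller directly):

```c
/* Sketch of Forth-style block persistence. One "screen" is a fixed
   1024-byte block, addressed purely by number: a 1.44 MB floppy gives
   1474560 / 1024 = 1440 of them. */
#include <stdio.h>

#define BLOCK_SIZE 1024

/* hypothetical flat image standing in for the raw floppy */
static const char *image = "floppy.img";

static int block_read(unsigned n, unsigned char *buf)
{
    FILE *f = fopen(image, "rb");
    if (f == NULL) return -1;
    if (fseek(f, (long)n * BLOCK_SIZE, SEEK_SET) != 0
        || fread(buf, 1, BLOCK_SIZE, f) != BLOCK_SIZE) {
        fclose(f);
        return -1;
    }
    fclose(f);
    return 0;
}

static int block_write(unsigned n, const unsigned char *buf)
{
    FILE *f = fopen(image, "r+b");
    if (f == NULL) return -1;
    if (fseek(f, (long)n * BLOCK_SIZE, SEEK_SET) != 0
        || fwrite(buf, 1, BLOCK_SIZE, f) != BLOCK_SIZE) {
        fclose(f);
        return -1;
    }
    fclose(f);
    return 0;
}
```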

You could certainly write a C compiler, and its runtime, in Forth. There used to be a version of Pascal in Forth floating around back in the day.

But this is why folks used other development systems - just to get access to things like real text editors and file systems. Everything else can be shipped over via a serial cable to the new host.

Emulators also improved turnaround time.

Speaking of BASIC, if you have access to BASIC, you can bootstrap from that as well. Back in the day, a person wrote an Ada subset, both the compiler and the P-code runtime, in BASIC.

Others trying to bootstrap C tend to use simple Lisp languages and write small C compilers in Lisp. Their goal is to compile GCC 2.95, I think, since that was the last version that could be compiled by a generic cc on Unix systems, and then use that to compile a modern GCC. Of course, their goal is to maintain provenance of the build, not to start from scratch with a bare board.

All a matter of what the overall goals, motivations, and self-imposed limitations are.

I did some work with Turbo Pascal on CP/M in a simulator, with the Z80 clock turned down to a synthetic 4 MHz. You run into all sorts of strange limitations doing that, and, yeah, you notice that slow CPU. The simulator wouldn't simulate I/O speed, though. Boy, were floppies slow.
 
I don't think we teach programming the way we used to.

The IBM 1620 was a common small computer in schools (universities) in the very early '60s. It's the one that Dijkstra learned to despise.

But more to the point, if you bought a pad of coding forms for the 1620 assembler (they called it SPS, for Symbolic Programming System), it was printed on both sides of every sheet. The front had the standard 1620 SPS form, but the verso had something called the "Absolute System": a location field, an operation field and operand fields. This is where you started -- coding raw machine language and then punching it into cards.
After a time you learned that there were programming aids to allow you to assign symbolic values to all of the above.

How many schools start with programming machine language on bare iron today?
 
How many schools start with programming machine language on bare iron today?
Next question: How many schools tell you that bare iron is a possible approach today?

Our local university starts education with Scala (to level the field for all students, as nobody has experience with it). I guess most others use Java, C/C++ or Python nowadays, and many students will never see anything else. Embedded is a niche today.

I think emulators were used for very early bring-up (pre-silicon / early samples), and possibly in other cases if you had too much money. As soon as you have a monitor on real hardware, your productivity goes up a lot. For small targets, cross-development is the way to go.
 
Is there a goal here, or is this just an elaborate experiment?

I want to have sufficient understanding of computers such that if I were to go back in time to say 1950, then as each bit of computer hardware became available, I would be able to say "yeah - just use C90 - I am confident that it will work".

There's a story that when Chuck Moore developed ColorFORTH, he just started with MS-DOS, DEBUG and a floppy. Working through enough cycles of that, he got his ColorFORTH working, booting off the floppy (he dumped MS-DOS), and once he got it firing on one cylinder, all bets were off.

You are confident that Forth will work, but you haven't stated either way whether (a subset of) C90 would work (probably specifically starting with Sector C).
 
I don't think we teach programming the way we used to.

This is another thing - I don't want to end up with the movie "Idiocracy".

How many schools start with programming machine language on bare iron today?

I personally didn't even know what a "monitor" was until a couple of years ago (note - I started programming around the end of 1983, although I had read a book on BASIC around 1981).

When I looked it up on Wikipedia, it even said it was a "lost art" or similar.

Although - I am not actually particularly keen to do it that way myself. I'm more interested in what I am doing in PDOS/386 now: debugging using printf - taking advantage of C - but being able to debug at the assembler level too as required, by having assembler listings available and what amounts to an "internal monitor".
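
As a small illustration of the printf side of that (my own sketch, not PDOS source): a C90-friendly debug macro that compiles away entirely for the shipped binary. The double parentheses are because C90 has no variadic macros; the sector count is just an example value.

```c
/* Minimal sketch of compile-time-switchable printf debugging.
   C90 has no variadic macros, hence the double parentheses. */
#include <stdio.h>

#ifdef DEBUG
#define dprintf(args) printf args
#else
#define dprintf(args) ((void)0)
#endif

int main(void)
{
    int sectors = 9;                            /* example value only */
    dprintf(("loading %d sectors\n", sectors)); /* vanishes without -DDEBUG */
    return 0;
}
```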

I don't expect to go away from assembler - I just want C to generate that assembler for productivity reasons. But ultimately I expect to deliver a bunch of assembler/machine code as the final (infinitely) debuggable product.

As opposed to the "try rebooting"/"works for me"/"can't reproduce in dev" dead end of the modern world. No matter how many cores and frameworks you add to that, you'll never actually solve the problem - that's not where the bottleneck is.
 
Next question: How many schools tell you that bare iron is a possible approach today?
It's not necessary today. The whole idea of what a "computer" is today is vastly different from when a CPU was 3000 transistors. And there's a LOT of stuff that is simply not applicable to today's architectures. As they say, disk is the new tape, RAM is the new disk, and cache is the new RAM, but there are 3 levels of it. We used to say "computers are deterministic" but, truthfully, I'm not quite sure that's true any more, especially when you have dozens of cores on a die.

For 90+% of the work computer programmers do today, working with "bare iron" doesn't gain you a whole lot; there are a lot more important things the modern student should be studying to be successful in the market today.

Then, of course, even with "bare iron", how bare is it really? All of the modern interfaces are 300-page specifications. Everything is serial, and hand-rolling a USB driver is probably a bad idea. So you get controllers and such to do all of that for you, and then you talk to those at a high level and hope they work.

And then, of course, even with modern CPUs, it's not even your CPU any more. There are CPUs running the CPUs: entire subsystems and computers designed just to boot the "CPU" and monitor the CPU, running code you have no access to. But that's OK; it's likely your code will be running in a managed environment, running binary code the host doesn't even directly understand anyway.
 
I want to have sufficient understanding of computers such that if I were to go back in time to say 1950, then as each bit of computer hardware became available, I would be able to say "yeah - just use C90 - I am confident that it will work".



You are confident that Forth will work, but you haven't stated either way whether (a subset of) C90 would work (probably specifically starting with Sector C).
How sub of a subset are you considering? I was looking at the C90 design parameters and thinking much of that won't be viable on early hardware. Having the minimum "byte" of 8 bits might be inconvenient when the hardware uses a 6-bit character. 31-character variable names seem excessive when the machine would need to break the bank to achieve 4K of RAM.
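
For reference, the C90 minimums being alluded to look roughly like this (quoted from memory, so check the standard for the exact wording); on a 6-bit-character machine you can't meet CHAR_BIT >= 8 without faking bytes in software:

```c
/* Some C90 minimum requirements relevant to tiny hardware
   (from memory; consult the standard for exact wording):
   - CHAR_BIT must be at least 8
   - at least 31 significant initial characters in internal identifiers
   - only 6 significant characters (monocase) guaranteed for external names
   - signed int must cover at least -32767..+32767 */
#include <limits.h>
#include <stdio.h>

int main(void)
{
    printf("CHAR_BIT = %d\n", CHAR_BIT);   /* 8 on any conforming host */
    printf("INT_MAX  = %d\n", INT_MAX);    /* at least 32767 */
    return 0;
}
```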
 
You are confident that Forth will work, but you haven't stated either way whether (a subset of) C90 would work (probably specifically starting with Sector C).
I promote Forth simply because of the simplicity of getting someone from a monitor prompt to an interactive environment where one can start iterating. I think a Forth is the shortest path. Sector C looks fully capable as a language, but it still needs a supporting environment to make it work. Obviously, this can simply be a monitor. You can put Sector C someplace in RAM, store source text in RAM, point the compiler at it, and have it compile that code to another place in RAM.

Then you can execute that code.
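
That loop, sketched in C purely to illustrate the flow (everything here is invented for illustration: stub_compile stands in for Sector C, and on the real bare machine the regions would just be addresses you poke from the monitor):

```c
/* Illustration of "compiler in RAM reads source in RAM, emits code into
   another RAM region, then jump to it".  stub_compile() stands in for
   Sector C and just emits x86 "mov eax, 42; ret".  On a modern protected
   OS the code region would also need to be made executable, so treat
   this as a sketch of the flow, not a finished program. */
#include <stddef.h>
#include <string.h>

typedef int (*entry_fn)(void);

static size_t stub_compile(const char *src, unsigned char *code, size_t max)
{
    /* x86: B8 2A 00 00 00  mov eax, 42 ; C3  ret */
    static const unsigned char out[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };
    (void)src;
    if (max < sizeof out) return 0;
    memcpy(code, out, sizeof out);
    return sizeof out;
}

int run_from_ram(const char *src, unsigned char *code_region, size_t max)
{
    size_t n = stub_compile(src, code_region, max);
    if (n == 0)
        return -1;                   /* compile failed */
    /* jump into the freshly generated machine code (non-portable cast) */
    return ((entry_fn)(void *)code_region)();
}
```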

All that said, however, I think the complexity of a traditional grammar and compiler is higher than what's necessary for the Forth language. Obviously, one could just write down the code, hand compile it into the machine language of your choice, and off you go. But, cognitively, I think it's easier for someone to grasp and memorize the Forth language and runtime model to the point that they could sit down with a new computer and, with little more than a notepad, start coding up a rough Forth environment and get it interactive and iterating quite quickly -- much as Chuck is rumored to have done with DOS and DEBUG (and if anyone could, Chuck could. He could probably wire a Forth computer up out of NAND gates from memory).

Clearly, if someone writes enough small languages using the classic recursive descent compiler technique, then they'll have enough to build a new one from scratch. But Forth is much more incremental: it's easy to develop individual routines (and I'm talking at the primitive level) and build up your "it's alive!" base piecemeal, in contrast to the requirements of a classic compiler. With a classic compiler, that first step is a doozy.
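
To make "piecemeal" concrete, a toy sketch in C (word names invented, nothing to do with ColorFORTH's actual code): each primitive is a tiny routine over a data stack, and a "definition" is just a list of primitives run in order, so the system is testable from the very first word you get working.

```c
/* Toy sketch of the Forth idea: primitives are tiny stack routines,
   and a "definition" is just a list of primitives executed in order. */
#include <stdio.h>

static long stack[64];
static int sp = 0;

static void push(long v) { stack[sp++] = v; }
static long pop(void)    { return stack[--sp]; }

static void w_dup(void)  { long a = pop(); push(a); push(a); }
static void w_plus(void) { long b = pop(); push(pop() + b); }
static void w_dot(void)  { printf("%ld\n", pop()); }

typedef void (*word)(void);

int main(void)
{
    /* roughly:  2 DUP + .   which prints 4 */
    word def[] = { w_dup, w_plus, w_dot };
    int i;

    push(2);
    for (i = 0; i < 3; i++)
        def[i]();
    return 0;
}
```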

Once you have an interactive, iterative environment, where you can self host code and executables, then anything goes. You're "done", the hard work is finished, the machine is alive and ready for you to take it wherever you want to go. And Sector C can certainly get you there, I just think the trip is shorter, and easier to conceptualize using a Forth based system.
 
That's what I love about today. We have media figures opining on the various technological evils of our time when they can't even tell you accurately how the technology works. Politicians talking about climate change without the faintest knowledge of thermodynamics....I could go on.
 
It's not necessary today. The whole idea of what a "computer" is today is vastly different from when a CPU was 3000 transistors.
So we should stop teaching how "computers" work and instead teach how they are used?

How do you expect an engineer to understand race conditions in a multi-threaded environment if they aren't even aware what a single-threaded environment looks like? Or that new CPU architectures in a few thousand transistors are, in fact, designed and produced? (Just ask Padauk.) Assembly language on an AVR isn't too different from a RISC-V, either. Both are very viable targets for bare-metal programming, even if just as a learning exercise.
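
(For anyone who hasn't seen one, the canonical lost-update race looks like this; purely a throwaway illustration using POSIX threads: two threads increment a shared counter without a lock and the total comes up short, because the increment is a load, an add and a store underneath.)

```c
/* Classic lost-update race: counter++ is load/add/store, so two threads
   interleave and the final count usually falls short of 2000000. */
#include <pthread.h>
#include <stdio.h>

static long counter = 0;

static void *worker(void *arg)
{
    long i;
    (void)arg;
    for (i = 0; i < 1000000; i++)
        counter++;                 /* unsynchronized on purpose */
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    printf("counter = %ld (expected 2000000)\n", counter);
    return 0;
}
```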

I don't think pretending that nothing exists below Java/JavaScript is helping. Especially not when thinking about the next generations, who will run our world in the future.
 
For 90+% of the work computer programmers do today, working with "bare iron" doesn't gain you a whole lot; there are a lot more important things the modern student should be studying to be successful in the market today.

That may well be true and fine, but I consider the market itself to be corrupt. My wife has been unable to operate her Australian bank account for months because the SMS with the one-time PIN suddenly stopped getting through (noting that we are overseas in the Philippines using global roaming). She is only ever told "try again" or "no-one else is complaining".

Oh yeah - sometimes the SMS did get through, but it was 6 hours late or whatever, so it couldn't actually be used.

There probably needs to be a 2-step process:

1. Acknowledgement that the systems are too complicated to actually debug and fix and so here is the manual workaround for everything (in this case - phone the customer and tell them the code).

2. A return to basics and rebuild simple systems that people actually know how to debug.

A 6-digit SMS could be transmitted by a 1 MHz Commodore 64 with a 300 bps modem in 0.2 seconds. Even if you are located in the antipodes, the speed of light will only delay that transmission by a little less than 0.1 seconds. So a problem that could have been solved in 0.3 seconds in the 1980s takes months to resolve today.
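
The arithmetic, as a quick sanity check (assuming 10 bits per character for 8-N-1 framing and roughly 20,000 km for the antipodal great-circle distance):

```c
/* Back-of-envelope check of the figures above:
   6 characters * 10 bits (8-N-1 framing) at 300 bps, plus light delay
   over roughly 20,000 km (antipodal great-circle distance). */
#include <stdio.h>

int main(void)
{
    double bits    = 6.0 * 10.0;          /* 60 bits                 */
    double tx_time = bits / 300.0;        /* = 0.200 s               */
    double light   = 20000.0 / 300000.0;  /* km / (km/s) ~ 0.067 s   */

    printf("transmission: %.3f s\n", tx_time);
    printf("propagation : %.3f s\n", light);
    printf("total       : %.3f s\n", tx_time + light);
    return 0;
}
```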

I have been demystifying all the software that I am personally dependent on (I started with BBS software and then moved on to the operating system itself - MSDOS wasn't that big).

My wife (a non-programmer) made an interesting comment. She said that since she had been with me, she had learned that some things shouldn't happen. E.g. she was checking out clothes or something from an online shop and suddenly all the item quantities in her cart doubled. She said "that's a software bug, right?", and I said yes. And she reported it to the store, and obviously there was no procedure to actually fix any of this stuff. I asked her what she used to think before she met me, and she just said "nothing".

She gets very frustrated when the new TV doesn't connect to the Wi-Fi until you go through repeated iterations of switching both off and on, and she asks me what's wrong. I told her that 2 expensive and skilled software engineers need to answer that question, and because neither side is PDOS, no-one knows or cares. She doesn't like that answer.

There's no point having 802.11abngandeveryotherletterofthealphabet when you need a monkey to switch devices off and on repeatedly because no-one actually cares.

What SHOULD be happening is - WOW - you've found a bug! This is the first bug encountered in the last 15 years, and as per industry standards, here is your dollar reward of $500,000. We'll do a television show now as our team of engineers come in to solve this problem - this is what software engineers live for - the once in a lifetime opportunity to solve a bug that made it into the wild in mature software.

BTW - in PDOS-related work, two of us were analyzing an issue with a slightly-changed GCC 3.2.3 - approximately 20 years old.

When I was a kid (1970s) there was an emergency in Africa which I apparently needed to solve by handing over my pocket money. That alleged emergency never actually ended, and according to ads on TV, apparently a little child is still waiting for me.

And at the time I started my career, around 1986, I was following PC magazines for the latest and greatest technology that was coming out. But actually that was the 80386, and I didn't expect that anyone would need more than 4 GiB of memory to solve their problems - you can't write that much code in a lifetime, and the world was already operating fine (on 32-bit mainframes - register size). I guess I was expecting coalescing at that point.

Either way, I stopped buying magazines and focused on what we already had instead of looking for the next best thing. Consolidating. I notice in real life, in unrelated things, a lot of people don't appreciate what they already have and say they "need" xyz.

I'll let you know when I'm ready to leave 1986. If the 80386 isn't enough for your needs, I'd like to know why.

I've personally written enough code where 640k is too constraining. PDOS/86 (which is fairly minimal already) can load itself - sure. But there's not a lot of room left to do anything else. Yes, there are tricks that could be used to reduce the memory usage - but to me, that is butchering elegant (or perhaps "natural") code due to unfortunate resource restrictions. It's not actually making improvements. I really do need more memory - like 2 MiB. Not 4 GiB. And the people who "need" more than 4 GiB? Yeah, nah.
 
So we should stop teaching how "computers" work and instead teach how they are used?

How do you expect an engineer to understand race conditions in a multi-threaded environment if they aren't even aware what a single-threaded environment looks like? Or that new CPU architectures in a few thousand transistors are, in fact, designed and produced? (Just ask Padauk.) Assembly language on an AVR isn't too different from a RISC-V, either. Both are very viable targets for bare-metal programming, even if just as a learning exercise.

I don't think pretending that nothing exists below Java/JavaScript is helping. Especially not when thinking about the next generations, who will run our world in the future.

Exactly.

This is exactly "Idiocracy". We're already there.
 
I note that GitHub has gone to 2FA. However (and unlike Google), when you set up 2FA they send you a list of "backdoor" keys should your SMS number quit working. That seems to be prudent.
 
Once you have an interactive, iterative environment, where you can self host code and executables, then anything goes. You're "done", the hard work is finished, the machine is alive and ready for you to take it wherever you want to go. And Sector C can certainly get you there, I just think the trip is shorter, and easier to conceptualize using a Forth based system.

I'm not disputing (or affirming - I have no knowledge) that Forth is better than C in this respect.

What I'm saying is that at the moment, I can't "conceptualize" - or perhaps "be confident in" - that (longer - by how much?) trip using Sector C.

It may be some sort of learning difficulty, but normally I don't learn things until it is time for me to personally write them to replace whatever I am currently using. So e.g. I learned the details of MSDOS by writing PDOS/86. Even now I don't have output redirection in PDOS because I don't really understand how to do it neatly (and in fact, I don't know if I have painted myself into a hole). That isn't a priority though, so I don't really care. And solving it for PDOS/86 won't solve it for z/PDOS, so there's still a lot of effort involved.

And my goal is to understand (and personally see) that "trip" you spoke of - using the C language.

When I have completed everything that is possible to do in C - reached the limits of C90 - I may then go and "learn" Pascal, or C99, or Forth, or any other language, and redo the entire process that I just spent 30 years doing in C.

I haven't yet reached the point where I am ready to do that. I'm more interested in "Sector C can't possibly work because xyz - do you actually understand abc?".

Note that there are some concepts that I wasn't directly taught at any point - really important things, like a "three-way diff". And I've met plenty of others who didn't know either. I've seen diabolical things done because the people in charge didn't understand that.

In another situation (where I was a new starter), I was tasked with sifting through thousands of calls to a particular function and deciding whether each should be this one or the alternative one. And I was given "guidelines" of "xyz says there is no need to change this one". xyz was in America, and I explained to my Australian boss that I couldn't follow a direction that required someone in America to decide until he was actually online. I could work American hours, but I couldn't begin the task that day. My boss didn't seem to be very happy, but allowed me to wait until America was back.

I sat down with a bunch of Americans and asked them to explain what the actual issue was. It turns out that in some circumstances you need to call one version, and in others you need to call the other. So I asked: why not just set a global variable for that circumstance, and then have each function call the other one if that is the correct one? It turned out that there was someone who had previously worked in the company who hated global variables, but now that he was gone, yeah, we could do that. And then I mentioned that I had found a particular global variable already - was that not already set correctly for that circumstance? And indeed, it was. I just needed to check an existing global variable to see if it was NULL or not. A handful of lines of code and the problem went away. Instead of massive changes to the code base, which wouldn't have worked anyway, since a lot of the code was common between these 2 circumstances - although I guess it would have worked if they had started duplicating code as the "solution" or something - and that may well have been the original plan/task.
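
For what it's worth, the eventual fix amounted to something like this (all names invented here, obviously not the real code):

```c
/* Sketch of the shape of the fix, with invented names: instead of
   editing thousands of call sites, each entry point checks the
   pre-existing global and forwards to its counterpart when the other
   variant applies. */
#include <stdio.h>

static void *g_special_context = NULL;   /* the global that already existed */

static int do_thing_old(int arg) { return arg + 1; }   /* stand-in */
static int do_thing_new(int arg) { return arg + 2; }   /* stand-in */

static int do_thing(int arg)
{
    if (g_special_context != NULL)
        return do_thing_new(arg);        /* the "other" circumstance */
    return do_thing_old(arg);
}

int main(void)
{
    printf("%d\n", do_thing(1));                 /* old path */
    g_special_context = &g_special_context;      /* circumstance applies */
    printf("%d\n", do_thing(1));                 /* new path */
    return 0;
}
```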

And yes - they were a bit embarrassed, but I simply dismissed it as sometimes a fresh perspective helps - to be clear - they understood the huge application - I didn't - I had no ability to make functional changes to this product - only fix bugs - so ultimately they were more valuable employees because the business still required functional changes. But I got a cool (unsolicited) email to give to my next job application.
 
How sub of a subset are you considering? I was looking at the C90 design parameters and thinking much of that won't be viable on early hardware. Having the minimum "byte" of 8 bits might be inconvenient when the hardware uses a 6-bit character. 31-character variable names seem excessive when the machine would need to break the bank to achieve 4K of RAM.

Absolutely - so I will presumably keep quiet for the first 15 years after going back in time. Or maybe I will spend that time asking "wouldn't it be great to have xyz hardware?".

From the software side - I still don't have sufficient understanding of computers to know what the best thing to do in those 15 years would be. Maybe it's Forth. Or stick with assembler, as in the original timeline. I have no idea.

But arguing the toss about those 15 years is not even the priority for me. I'm more interested in what happens when I have my 2 MiB of memory - which means an 80286 at the earliest - and I haven't yet written a PDOS/286. Nor do I have the C compiler to run on it yet. Nor do I have the stepping stones to get from Sector C to that non-existent C compiler for the non-existent OS.

And I'll probably make the 386 the earliest, rather than attempting to intervene a little bit earlier with the 286. So a priority would be to go from 8086 Sector C to 80386 SubC, then to the still-non-existent 80386 C90-compliant public domain compiler. I would be interested to know/speculate on the technical plausibility of doing that, but it's not a priority to do the actual work. My current interest is getting PDOS/386 to generate minimal 64-bit code to drag UEFI back down to 32-bit. Currently I am dependent on Windows and mingw64 just to run the compiler.
 
MED is for Win 9x and I'm using a PC-XT. I think I'm going to try Programmer's Workbench.

Your original question seems to have gotten pushed aside while a discussion barely related to it has taken center stage. Let me suggest some places for you to look for programs that might be of help:

First I think you might find "Brand X" debugger to be pretty close to what you want:

http://cd.textfiles.com/simtel/simtel/DISK1/DISC2/ASMUTIL/BXD26.ZIP

Also, while I've never used these myself, these two programs are supposed to give you an IDE for MASM/TASM:

http://cd.textfiles.com/simtel/simtel/DISK1/DISC2/ASMUTIL/ASMED_1.ZIP
http://cd.textfiles.com/simtel/simtel/DISK1/DISC2/ASMUTIL/ASMENV19.ZIP

And you might look at the entire directory which has assembly language tools that might be of interest in trying/using (the first link gives a listing of the software with brief descriptions, the second gives links to the programs):

http://cd.textfiles.com/simtel/simtel/DISK1/DISC2/ASMUTIL/00_INDEX.TXT
http://cd.textfiles.com/simtel/simtel/DISK1/DISC2/ASMUTIL/
 
Hi acgs, thank you for those links. I'll try those programs. Also, I've tried Programmer's Workbench and yeah... it's veeeery sloooow on my XT, almost unusable!
 
You're welcome. Please post here your experiences with the programs. I'm sure the Brand X debugger will run at a decent speed since it was written back when 8088/8086 machines were still in common use. It's not a standard IDE, but it will probably feel close to one to you. BTW, Microsoft originally bundled MASM with a more powerful version of DEBUG called SYMDEB. I'd say it was somewhere between DEBUG and Brand X in usability.

Any experience you have with the other programs will be very interesting to hear about.
 
If you're going to develop anything in ML on a machine as slow as an XT, you may want to try the tools offered by Eric Isaacson: https://www.eji.com/a86/index.htm
His assembler and debugger are very capable; the A86 assembler is very fast indeed.
 