
What are you guys using for retroprogramming?

BASIC is the correct choice if you're starting from ground zero. In fact, "a simple recipe database program" is the most common beginner BASIC example I saw in the early 1980s in books and magazines.

Thanks for the advice.
 
For my current retro programming project (a DOS game for 8088/86 or higher) I'm using these:

Operating system:
- Windows 10 on an Intel i5 laptop from 2014

Emulators:
- DOSBox 0.74-3. Immediate results, fast compiling, direct file sharing between it and Windows... It's the one I use for daily work.
- PCem and 86Box. For testing on something more or less close to the real hardware.
- DOSBox-X. Just wonderful for testing features not present in the official DOSBox.

I used to own an 8086 machine when I was a kid. Unfortunately I no longer have it.

Programming languages:
- Turbo C++ 1.0 and Turbo Assembler 1.01 for the main program
- PowerBASIC 2.1 for an auxiliary program
- Turbo Pascal 6.0 for another auxiliary application

Code editors:
- Visual Studio Code. Pros: a time-saving autocompletion feature, (to me) nice code coloring, a well-organized workspace, and the ability to read/write files in the DOS 437 charset. Cons: it's slow and heavy to load the first time, and it's buggy; on many versions it hung the whole system, leading to a blue screen of death. The latest versions seem more stable... I hope...
- Notepad++. Very fast, does coloring for many languages... My tool of choice for quick editing.

Graphics manipulation:
- Macromedia Fireworks 8.0
- GIMP

Digital sound manipulation:
- Audacity. It has a very nice VOC saving option, and it also allows converting/saving files to 8-bit at any sample rate.
 
I thought I'd update the retro stuff I've done. I've used a 4004 emulator that I wrote to reconstruct a working piece of code written for the 4004 in 1973. It was printed on what I believe to be an ASR33 with either a worn platen or poor registration on the print head. Things like 0 and C looked the same, as did P and F.
It was 4K of code. The code was for an electronic maneuver board (used for steering a ship). I used the simulator to figure out where the code expected what. I then created a board using a 4289 to go from the 4004 PMOS levels to TTL levels (the original uses a 4008/4009 pair). It has a display and key entry similar to the original.
So, as a retro project, I recreated a design and debugged the code into a working maneuver board calculator. It is relatively complicated, as it needs to track up to 10 ships and calculate the closest point of approach. A desirable thing if you don't want to run your battleship into the aircraft carrier.
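For reference, the closest-point-of-approach math itself is compact: with relative position r and relative velocity v (both ships assumed to hold course and speed), the separation is minimized at t = -(r·v)/(v·v), clamped to zero if the ships are already opening. A minimal C sketch of that calculation, purely as an illustration (this is not the 4004 code; the function name and example values are made up):

#include <math.h>
#include <stdio.h>

/* Closest point of approach for two ships on straight courses.
   rx, ry: relative position (target minus own ship), nautical miles.
   vx, vy: relative velocity, knots.
   Returns the CPA range; *t_cpa gets the time to CPA in hours. */
double cpa(double rx, double ry, double vx, double vy, double *t_cpa)
{
    double v2 = vx * vx + vy * vy;
    double t = (v2 > 0.0) ? -(rx * vx + ry * vy) / v2 : 0.0;
    if (t < 0.0)            /* already past CPA; closest range is now */
        t = 0.0;
    *t_cpa = t;
    return sqrt((rx + vx * t) * (rx + vx * t) +
                (ry + vy * t) * (ry + vy * t));
}

int main(void)
{
    double t;
    /* hypothetical example: target 5 nm due north, relative velocity
       (-5, -10) kn; CPA comes out to about 2.24 nm after 0.4 h */
    double range = cpa(0.0, 5.0, -5.0, -10.0, &t);
    printf("CPA %.2f nm in %.2f h\n", range, t);
    return 0;
}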
I recently picked up another 4004 project. I was expecting to need to debug the EPROMs I had, because the fourth 1702 looked to have broken code. As it turned out, the code was actually functional. This was 4004 code written to run on a SIM4-01 (an early development card for the 4004). I had some help from a friend in the UK.
The code is an assembler that assembled 4004 code and ran on the SIM4-01, originally written by Tom Pittman in 1973. It is a remarkable piece of code: a complete two-pass assembler with labels and +/- address calculation, all fitting into 1K of 4004 code.
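The two-pass structure is worth spelling out for anyone who hasn't written one: pass 1 walks the source assigning an address to every label, and pass 2 emits code with all label references (including label+n / label-n arithmetic) resolved. A toy C sketch of just that skeleton, with a made-up one-word instruction format that bears no resemblance to real 4004 encoding:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define MAXSYM 64

static struct { char name[16]; int addr; } sym[MAXSYM];
static int nsym;

static int lookup(const char *name)
{
    int i;
    for (i = 0; i < nsym; i++)
        if (strcmp(sym[i].name, name) == 0)
            return sym[i].addr;
    fprintf(stderr, "undefined label %s\n", name);
    exit(1);
}

int main(void)
{
    static const char *src[] = {        /* tiny hard-wired source */
        "START:", "JUN END-1", "JUN START", "END:", "NOP 0"
    };
    int n = sizeof src / sizeof src[0];
    int pc, i;

    /* pass 1: record each label's address, counting instruction words */
    for (pc = 0, i = 0; i < n; i++) {
        size_t len = strlen(src[i]);
        if (src[i][len - 1] == ':') {
            strncpy(sym[nsym].name, src[i], len - 1);
            sym[nsym].name[len - 1] = '\0';
            sym[nsym++].addr = pc;
        } else {
            pc++;
        }
    }

    /* pass 2: emit, resolving LABEL, LABEL+n and LABEL-n operands */
    for (pc = 0, i = 0; i < n; i++) {
        char op[16], arg[16], name[16], sign;
        int val, off;
        if (src[i][strlen(src[i]) - 1] == ':')
            continue;
        sscanf(src[i], "%15s %15s", op, arg);
        if (sscanf(arg, "%15[A-Za-z]%c%d", name, &sign, &off) == 3)
            val = lookup(name) + (sign == '-' ? -off : off);
        else if (arg[0] >= 'A' && arg[0] <= 'Z')
            val = lookup(arg);
        else
            val = atoi(arg);
        printf("%03d: %s %d\n", pc++, op, val);
    }
    return 0;
}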
I expect to begin working on some of my other lagging projects. I need to get the 6502 setup I have with early Rockwell boards going (not AIM; these were development boards).
I also need to get my Nicolet, which has stopped reading and writing disks, working again. It was made around 1969 or so, with core memory and a mix of DTL and TTL logic.
Dwight
 
Well, I've been guilty of running CP/M MAC, RMAC and ASM under a CP/M emulator under DOSEMU under Xubuntu x64.

Could probably do the same with DOSBox under Armbian, but I have no reason to try.
 
Recently, when I was hacking together a couple of assembly-language boot-time fixes to run on my Tandy 1000, my positively dreadful workflow ended up being something like this, since I have *no* experience with whatever editor comes with MASM and it was driving me nuts that I couldn't have both DOS EDIT and a command line onscreen at the same time:

1: Loaded up my Ethernet packet driver on the Tandy 1000
2: Made edits to the source file on my MacBook Pro using BBEdit.
3: Ran "nc -bin listen 1234 > testfile.asm" on the T1000, then "nc {ipaddress} 1234 < working_file.asm" in a terminal on the Mac to push it over.
4: Assembled it with Microsoft MASM, and tested if it got that far.
5: Wash, rinse, repeat.

Not the most efficient thing in the world, but it did mean it was convenient to check the code into GitHub when I was finished. Using DOSBox (at least for the testing part) wouldn't have really worked, since these widgets were intended to run from config.sys, and an emulator would have been about as awkward as the netcat thing.

I've actually gotten a fair amount of use out of having the Tandy 1000 sitting there next to my "real" computer as I've been working on a project to build a video card targeted at 8-bit applications like S-100 machines. I've been generating test bitmaps to burn into Flash ROMs, and several times I've used nc to copy them over to the Tandy and written a trivial BASIC program there to load the contents of the .bin intended for ROM and poke it into a graphics screen to make sure I get what I expect.
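For flavor, that verification step can be just a handful of lines. The original program was BASIC; here is the same idea sketched as a DOS C program instead (assumptions: Turbo C-style far pointers and int86(), CGA 320x200 mode 4, and a made-up filename; raw bytes are copied straight into the B800:0000 frame buffer, interleave and all):

#include <stdio.h>
#include <dos.h>

int main(void)
{
    FILE *f = fopen("TESTIMG.BIN", "rb");   /* hypothetical ROM image */
    unsigned char far *vram = (unsigned char far *)MK_FP(0xB800, 0);
    union REGS r;
    unsigned i = 0;
    int c;

    if (f == NULL) {
        fputs("can't open TESTIMG.BIN\n", stderr);
        return 1;
    }
    r.x.ax = 0x0004;                /* INT 10h: 320x200 4-color mode */
    int86(0x10, &r, &r);
    while ((c = fgetc(f)) != EOF && i < 0x4000)  /* 16 KB of CGA VRAM */
        vram[i++] = (unsigned char)c;
    fclose(f);
    getchar();                      /* look at it, then press Enter */
    r.x.ax = 0x0003;                /* back to 80x25 text mode */
    int86(0x10, &r, &r);
    return 0;
}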
 
I feel like I don't hear enough about how other retro-programmers work, so I was wondering what you guys were all using for your retro-programming needs.

What hardware are you targeting?
What kind of compiler/interpreter software do you use?
Do you program on old-school systems themselves or do you use a modern editor?
And (hopefully without starting a flame war) what language(s)?
In the past I did some Z80 and a lot of 6502 asm, but these days I'm taking a break from that and doing 16/32-bit stuff. I'm using my own compiler/assembler for my own language, and I use my own text editor running on a newer PC. Occasionally, when I'm testing something on the 286/486/Amiga, I will make small changes and recompile right there to avoid running back and forth :)

The text editor and the earliest version of the compiler (before it became self-hosting) were written in FreeBASIC. I've never released the editor due to a massive albatross of bugs and unfinished features. At one time, it was going to be a web browser... and every time I think about cleaning it up, I start leaning toward a complete rewrite instead.
 
3: Ran "nc -bin listen 1234 > testfile.asm" on the T1000, then "nc {ipaddress} 1234 < working_file.asm" in a terminal on the Mac to push it over.

If your Tandy 1000 has a hard drive, it seems faster to edit/compile/test on the Tandy and then push the finished file back to the Mac when done.

I use Turbo Pascal 7 as my Pascal and Assembler IDE. You can assemble from the IDE, and if there are any errors returned, the IDE will let you jump to the line of each error. The only drawback is that TP7's IDE won't syntax-highlight assembler files, so if I want that and I know I'll be writing code for a while, I use the Aurora editor. (Aurora can also be configured to shell+assemble+jump to errors, but I found it convoluted, so I haven't set that up yet.)
 
Reading some of this thread is hard. The idea of using modern hardware/software at all already gives me chills, but to use it for doing something that an older computer is BUILT TO DO?!?!? I couldn't. My Windows 2000 PC is my friend in all computer things I do. I use Windows 7 rarely, for some stupid Google "apps", but that's it.
 
The idea of using modern hardware/software at all already gives me chills, but to use it for doing something that an older computer is BUILT TO DO?!?!?

One of my projects that is geared towards older systems is 50,000+ lines of code. It takes my older system 3+ minutes to compile it, and that's with smart compiling/linking. My modern Windows 10 system running an emulator takes 2 seconds to make the entire project. So yeah, I'm going to develop the majority of it on the modern system, and only develop on the older system when it's absolutely necessary (like a speed-sensitive or hardware-unique section).
 
If your Tandy 1000 has a hard drive, it seems faster to edit/compile/test on the Tandy and then push the finished file back to the Mac when done.

The key there is having an editor and IDE you know and are proficient in, of course.

Shooting the file over to compile does take literally five seconds or so including issuing the commands, so there is that.
 
Recently, when I was hacking together a couple of assembly-language boot-time fixes to run on my Tandy 1000, my positively dreadful workflow ended up being something like this:

3: Ran "nc -bin listen 1234 > testfile.asm" on the T1000, then "nc {ipaddress} 1234 < working_file.asm" in a terminal on the Mac to push it over.
...

I've done similar things, but with the Old Box running terminal software and logged into a Linux box with getty running on a serial port. Then I copy files back and forth with rz and sz.
 
Reading some of this thread is hard. The idea of using modern hardware/software at all already gives me chills, but to use it for doing something that an older computer is BUILT TO DO?!?!? I couldn't. My Windows 2000 PC is my friend in all computer things I do. I use Windows 7 rarely, for some stupid Google "apps", but that's it.

Back with Windows 2000 and Windows 7, machines were crossing the threshold of "fast enough": enough CPU, enough memory, and a fast enough disk that the computer typically was not in the way of your work. Obviously, monster CPU tasks like video encoding, graphics rendering, etc. are the exception.

One of my projects that is geared towards older systems is 50,000+ lines of code. It takes my older system 3+ minutes to compile it, and that's with smart compiling/linking. My modern Windows 10 system running an emulator takes 2 seconds to make the entire project. So yeah, I'm going to develop the majority of it on the modern system, and only develop on the older system when it's absolutely necessary (like a speed-sensitive or hardware-unique section).

Folks don't appreciate how slow the older systems were, and we are absolutely jaded by speed. We're also jaded by modern tooling and "infinite" RAM. In hindsight, it's remarkable that software got written at all.

As I strive to get my SB180 up and running, all of my work has been done on a Z80 simulator running CP/M. I have the simulator cranked down to a simulated 4 MHz, but that doesn't really help with the I/O.

I'm using Turbo Pascal, and when working on large programs with multiple includes, and writing to disk, it takes a minute or two to do a turnaround. And we're talking < 2000 lines of code. I wish I could get Turbo Pascal 4 for CP/M; I'd really like something like UNITs. I may eventually switch to something like C, I dunno. And as an OS for development, CP/M is pretty stark. Makes things a little bit interesting. Certainly brings to light the things we take for granted today.

My SB180 just has floppies, so things will inevitably be even slower when I get it running and transition over to it, but it also has 256K of RAM, a chunk of which can be used as a RAM disk. So that will help with intermediate files.
 
I couldn't have both DOS EDIT and a command line onscreen at the same time:

1: Loaded up my Ethernet packet driver on the Tandy 1000
...

Your idea of using another machine is good, but I wonder whether it would be easier if the other machine were also a DOS machine, if you have a network. Maybe set up a DOS-native shared drive, run your editor and compiler on one machine, and keep the command prompt on the machine you're testing on. That would have been nice, but of course I never looked into that route when I was learning programming on DOS back in the day, and I've moved on since then.

As a bonus, it would be nice if you could also use the second machine, maybe through some kind of console, to single-step a program on the executing machine, in case you can't really use a second monitor.
 
What hardware are you targeting?
What kind of compiler/interpreter software do you use?
Do you program on old-school systems themselves or do you use a modern editor?
And (hopefully without starting a flame war) what language(s)?

Right now, I'm writing for an old AST 286 PC machine.
I use the Open Watcom compiler.
I use a modern PC with Visual Studio as my editor.
And I use C++ and some 8086 assembly, with a dash of a custom scripting language.

How about you guys?
  • Target - 16-bit PC
  • Open Watcom and wmake
  • 486SX (Toshiba fanless laptop - it is essential) - Watcom VI editor
  • C++ and some assembly
 
Your idea of using another machine is good, but I wonder whether it would be easier if the other machine were also a DOS machine, if you have a network.

Yeah. One of these days it would be interesting to set things up so I can NFS- or EtherDFS-mount a working directory that's shared on the modern machine, so I can skip the "virtual sneakernet" part with nc. It's just sheer inertia standing in the way right there. (I'd definitely try to get over that hump if I were doing this "a lot".)

(Having the other machine be a DOS machine isn't really a requirement, as long as the editor there can be polite about the differences in text end-of-line markers between DOS/Unix/whatever.)
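The "polite" part can be a ten-line filter on the modern side. A minimal sketch (an illustration only) that normalizes whatever comes in to DOS-style CRLF line endings before the file gets pushed over:

#include <stdio.h>

/* usage sketch, on the Unix side: lf2crlf < edited.asm > dos.asm */
int main(void)
{
    int c;
    while ((c = getchar()) != EOF) {
        if (c == '\r')          /* drop any CR already present */
            continue;
        if (c == '\n')
            putchar('\r');      /* DOS wants CR before LF */
        putchar(c);
    }
    return 0;
}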

Edit: I guess another example of "retro-development" that I did years ago is when I wrote the first creaky version of "PETTESTER" to help resurrect a badly brain-damaged Commodore 2001. That involved developing an alternate ROM image for the machine, so the tools I used were:

* xa65, a portable 6502 cross assembler.
* The VICE Commodore emulator.
* Various random online 6502 assembly language educational simulators, which I used to learn just enough 6502 assembly to take a crack at the problem.

VICE was *pretty easy* to convince to use alternate ROM images, so all I needed to do was assemble the code with xa65 with the correct ORG, pad the generated code with enough zeros to fill out the 4K ROM image, and then hack the last few bytes of the image with the correct jump address to redirect execution per the 6502's startup routine.
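That padding-and-patching step is simple enough to script. A minimal C sketch, assuming the 4K image sits at $F000-$FFFF so the 6502 reset vector at $FFFC/$FFFD falls in the file's last four bytes (the file names and entry address here are made up):

#include <stdio.h>

#define ROM_SIZE 4096
#define ENTRY    0xF000u        /* hypothetical code entry point */

int main(void)
{
    static unsigned char rom[ROM_SIZE];   /* zero-filled padding */
    FILE *in  = fopen("pettest.bin", "rb");
    FILE *out = fopen("pettest.rom", "wb");
    size_t got;

    if (in == NULL || out == NULL) {
        fputs("file error\n", stderr);
        return 1;
    }
    got = fread(rom, 1, ROM_SIZE, in);  /* assembled code, <= 4 KB */
    if (got == 0) {
        fputs("empty input\n", stderr);
        return 1;
    }
    rom[0xFFC] = ENTRY & 0xFFu;         /* reset vector, low byte  */
    rom[0xFFD] = (ENTRY >> 8) & 0xFFu;  /* reset vector, high byte */
    fwrite(rom, 1, ROM_SIZE, out);
    fclose(in);
    fclose(out);
    return 0;
}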

It was still pretty surprising when the resulting image worked when burned into an actual 2532 EPROM, of course...
 
I use Turbo Pascal 7 as my Pascal and Assembler IDE. You can assemble from the IDE, and if there are any errors returned, the IDE will let you jump to the line of each error. The only drawback is that TP7's IDE won't syntax-highlight assembler files, so if I want that and I know I'll be writing code for a while, I use the Aurora editor. (Aurora can also be configured to shell+assemble+jump to errors, but I found it convoluted, so I haven't set that up yet.)
This is good news! I used Turbo Pascal a lot back then, and I am still using it right now for Pascal coding. I've tried Borland's Brief for assembler and other languages, with mixed feelings: on one hand it is faster than Turbo Pascal; on the other hand, configuring compile-and-jump-to-error is clumsy and inflexible. You have to set an environment variable for the compile command that you cannot change in the editor. I will look at Turbo Pascal's support for assembling with TASM, and give Aurora a try (I vaguely recall that I may have used it at some point).

Editing and cross-compiling on a modern machine is definitely faster and more convenient, but I prefer to do everything on the real machine. I rarely work on big programs, though.
 
Reading some of this thread is hard. The idea of using modern hardware/software at all already gives me chills, but to use it for doing something that an older computer is BUILT TO DO?!?!? I couldn't. My Windows 2000 PC is my friend in all computer things I do. I use Windows 7 rarely, for some stupid Google "apps", but that's it.

I have no problem at all with that. Back in the old days, it wasn't uncommon to develop games using a higher-end system than the target platform. For example, the game studios that could afford it used the then ultra-expensive 386 and 486 computers just to make life easier, as they could compile the code several times faster than the target machines. They could also use local area networks to connect different computers, so one computer could compile the code while another tested it. If something went wrong, only the target machine would crash, which could save a lot of time.

They could also take advantage of more user-friendly (and time-saving...) graphical environments such as Windows/386 or 3.0 or, later, even NeXT machines, as id Software did while making Doom. id Software then dropped their NeXT computers and used Windows NT for their next projects, which still targeted DOS. Just as we do now with DOSBox and PCem, they could have several independent DOS instances in windows. In my opinion, Windows 10 is not very different at all from Windows NT 4.0. In fact, I think the program I use for graphics, Macromedia Fireworks 8.0, works on NT 4.0.

For example, my project is meant to be playable on the average home PCs that were already in homes, or could be bought, in the fourth quarter of 1990. Those weren't 386s or 486s, simply because those were very, very expensive; the average PC owner (as my father and I were at the time) could afford an 8088, an 8086 or, in a few cases, a 286.

To develop my project on the real hardware of the time, I would need two 386 machines for developing and compiling, and an 8088 or 8086 machine for testing, equipped with a few graphics cards or, even better, with a card able to emulate several graphics cards, such as the Oak VGA OTI-037 that I had at that time. That makes three big computers with their heavy and bulky CRT monitors, plus a composite monitor or an NTSC television. I would also need to set up a LAN, with a hub, network cards and RJ-45 cables all around my house (my wife would throw me out of the house, LOL). Yes, I could do all the development on one machine, but it would be desperately slow. Slow compiling, and slow testing, as I would have to exit the development environment (Turbo C++ for my project) to run the result, because there would not be enough memory left, then load TC again, and so on...

Now I can have all the advantages of all those development computers, plus some extras, in a small and convenient space.
 
Real-world example: Sierra's port of Silpheed (1989) was programmed on an 8 MHz 286 with EGA, even though more than half of Sierra's target audience were still using 8086/8088 systems at the time of development. It was common, when possible, to use the fastest/biggest machine possible just to speed development.
 
Real-world example: Sierra's port of Silpheed (1989) was programmed on an 8 MHz 286 with EGA, even though more than half of Sierra's target audience were still using 8086/8088 systems at the time of development. It was common, when possible, to use the fastest/biggest machine possible just to speed development.

Compiling code took a while back then, so you needed the fastest thing you could afford while programming, and probably more RAM than most desktop machines as well.
 
Just for anyone who may be interested in VSCode: I stopped using it about a week ago. The main reason: the more and more frequent blue screens of death. That's reason enough to stop using an application; it's so frustrating, so damaging, that I don't think it requires a deeper analysis. The other reason is that, even when the application didn't crash my entire system (and believe me, working with crossed fingers all the time isn't pleasant...), it was still very heavy and slow to load.

The good news is that while looking for an alternative, I found a VSCode fork named VSCodium. I learnt that the VSCode source code is open source, but the binaries distributed by Microsoft aren't. Microsoft includes several default extensions, not all of them open source and not all of them necessary, and it activates telemetry by default. VSCodium is a fork compiled straight from the official VSCode repository, but with telemetry disabled by default and free of all the Microsoft bloatware.

Bottom line: now I'm very happy with VSCodium. It's still slower to load than Notepad++, but it's way faster than the official VSCode, loading in a reasonable time. And, most important of all for me, it has never crashed in all the time I've been using it. My guess is that the culprit behind the blue screens and the extremely slow loading was the numerous unnecessary preloaded extensions in VSCode, or maybe the telemetry, or both; I don't know.
 