Chuck(G)
Has anyone ever seen one of these? Or did it die in the lab?
Well, there was of course CP/M-86... But running CP/M as a complete OS on a PC has never been done AFAIK. And THAT intrigued me.
It's all the same to me. Microcode vs. program emulation. Program emulation has the advantage of portability. If you have an 8088 or a Core I13 CPU, you're still good.
Well, just before the 5150 was officially revealed, the 68K lab computer made its debut. That started some people talking about IBM introducing a 68K-based PC. Of course, that didn't happen, and many people who were expecting something phenomenally earthshaking from IBM were disappointed. The 68K had to wait for the Atari ST debut, I guess.
Non-sequitur. The IBM PC was under development in 1981.

But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - i.e. ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line-mode programs and full-screen applications could have been portable everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).
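For concreteness, here is what that kind of portable full-screen code could look like: a minimal C sketch that relies only on the standard library and X3.64 control sequences (ESC[2J to clear the display, ESC[row;colH to position the cursor). The program itself is illustrative, not taken from anything in this thread.

```c
#include <stdio.h>

/* Clear the screen and home the cursor using only X3.64 (ANSI) controls. */
static void cls(void)
{
    fputs("\033[2J\033[1;1H", stdout);
}

/* Move the cursor to a 1-based row/column and write a string there. */
static void write_at(int row, int col, const char *s)
{
    printf("\033[%d;%dH%s", row, col, s);
}

int main(void)
{
    cls();
    write_at(12, 30, "Full-screen output, no BIOS calls");
    write_at(24, 1, "Press Enter to exit");
    fflush(stdout);
    getchar();
    return 0;
}
```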
...
I also wouldn't mind if there were a de facto standard/convention that, when writing ANSI controls, you write them in a way that is optimized to let ANSIPLUS or a custom replacement rapidly convert your buffer to direct screen writes. For example: an escape sequence to clear the screen followed by your full-screen data, and ANSIPLUS is specifically designed to do a memchr looking for an additional ESC char; so long as that isn't present, and the buffer is exactly 80*25 in size, it uses an efficient algorithm to write to the screen.
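A rough sketch of that fast-path check, assuming a flat buffer and a plain pointer standing in for text-mode video memory (ANSIPLUS's internals aren't public, so the helper below is purely hypothetical and only illustrates the convention being proposed):

```c
#include <string.h>

#define COLS  80
#define ROWS  25
#define CELLS (COLS * ROWS)
#define ESC   0x1b

/* Hypothetical fast path: if the output is exactly one clear-screen
 * sequence followed by a full 80*25 screen of plain text (no further
 * ESC anywhere), copy it straight into video memory instead of
 * interpreting it byte by byte.  Returns 1 if the fast path was used.
 * 'video' stands in for a far pointer to B800:0000 in real DOS code. */
static int try_fast_screen_write(const char *buf, size_t len,
                                 unsigned short *video, unsigned char attr)
{
    size_t i;

    if (len != 4 + CELLS || memcmp(buf, "\x1b[2J", 4) != 0)
        return 0;                     /* not the expected shape */
    if (memchr(buf + 4, ESC, CELLS) != NULL)
        return 0;                     /* more controls present: slow path */

    for (i = 0; i < CELLS; i++)       /* char in low byte, attribute in high */
        video[i] = (unsigned short)(((unsigned)attr << 8) |
                                    (unsigned char)buf[4 + i]);
    return 1;
}
```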
Some programs were able to get good performance with portable code. UCSD Pascal's major problem was the very limited memory space allowed, which meant a lot of swapping compared to other programs that could take advantage of more than 128K.
I would not want to program in C on any micro platform before 1985. The compilers yielded slow, buggy code. A fictional world where systems had enough memory to run complete, thoroughly debugged compilers would have resulted in different development decisions.
1978: Non-sequitur. The IBM PC was under development in 1981.
What you are saying is that every program, for every system, should either be written in C, or link with a C library. Or you're even assuming that this is already the case, as if C was somehow fundamental to how computers function and there was no way - or at least no reason - to avoid it, ever. That's also an attitude common among UNIX/Linux/GPL zealots...
The ANSI standard provides a lowest common denominator for console i/o, but if you want a program (in any language) to run well on a particular machine or OS, using its full capabilities, you have to adjust it anyway. Or have some abstraction layer that is a superset of what the hardware provides.
Many key combinations can't be represented by escape sequences, or at least there is no widely implemented standard for it. Certainly no ANSI one.
I have written a somewhat functional TSR for DOS that scans the video buffer for updates and sends them over the serial port, translated to UTF-8 and terminal escape sequences. It also attempts to translate input to the scan/ASCII codes expected by programs which use INT 16h. At 115200 bps, I'd call it "usable", but still with noticeable lag of course, and occasional screen corruption because of dropped characters.
Now I can use my 286 PC from a terminal window on Linux by running "screen /dev/ttyUSB0 115200".
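The core of that update scan is just a diff of the live text buffer against a shadow copy, emitting a cursor-position sequence plus the changed character for each cell. This is only a simplified sketch (send_serial() is a hypothetical stand-in for the real UART code, and the attribute-to-SGR and codepage-to-UTF-8 translation are left out):

```c
#include <stdio.h>

#define COLS 80
#define ROWS 25

/* Hypothetical: queue bytes for transmission on the serial port. */
void send_serial(const char *s, int len);

/* Diff the live text buffer (char + attribute per cell, as at B800:0000)
 * against a shadow copy, send a cursor move plus the new character for
 * every cell that changed, and remember the new contents. */
void flush_screen_diff(const volatile unsigned short *video,
                       unsigned short *shadow)
{
    char seq[16];
    int row, col;

    for (row = 0; row < ROWS; row++) {
        for (col = 0; col < COLS; col++) {
            int i = row * COLS + col;
            if (video[i] != shadow[i]) {
                int n = sprintf(seq, "\033[%d;%dH%c",
                                row + 1, col + 1,
                                (char)(video[i] & 0xff));
                send_serial(seq, n);
                shadow[i] = video[i];
            }
        }
    }
}
```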
But for example Ctrl+Enter (copies a filename into the command line in Norton/Volkov Commander) doesn't work, because there is no escape sequence for it that my Linux terminal emulator sends. Worse than that, it ignores the Ctrl and just sends ASCII CR, running whatever is in the partial command line. Yet somehow, the same key combination does work locally in Midnight Commander, and I actually looked at the source code to find out how this magic is possible -- turns out that when the DISPLAY environment variable is set, this console program will connect to the X server, and poll it for the state of modifier keys when receiving CR on stdin!
I'd call that a gross hack, one made necessary by the "standards-compliant" console i/o layer.
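For reference, the mechanism boils down to something like the following (a sketch, not mc's actual code): when DISPLAY is set and an X connection is open, ask the server for the current modifier mask whenever a bare CR shows up on stdin.

```c
#include <X11/Xlib.h>

/* Return nonzero if a Ctrl key is held down right now, by querying the
 * X server for the pointer/modifier state.  A console program can call
 * this when it reads CR on stdin to tell Enter and Ctrl+Enter apart --
 * exactly the out-of-band trick described above. */
int ctrl_is_down(Display *dpy)
{
    Window root_ret, child_ret;
    int root_x, root_y, win_x, win_y;
    unsigned int mask = 0;

    XQueryPointer(dpy, DefaultRootWindow(dpy),
                  &root_ret, &child_ret,
                  &root_x, &root_y, &win_x, &win_y, &mask);
    return (mask & ControlMask) != 0;
}
```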
I'm not expecting games to be ported.

Back in the day, people ported games written in assembly language from one processor architecture to a different one, by rewriting the machine instructions but keeping the logic the same. And rewriting the i/o (which was usually part of the code, not a separate driver or library) to interact with whatever hardware existed in the target system.
"Write code once, compile/run anywhere" may be convenient, but giving up so much for it as you want is hardly an optimal state.
And this would completely defeat the goal of portability, while still being inefficient compared to direct hardware access, and unable to deal with color/attribute changes.
My non-sequitur comment was about citing ANSI X3.159-1989--note the date. Was I familiar with VT100 escape sequences before the PC? I sure was--I even did terminal software on contract before the PC gained momentum (I have the Z80 source to prove it). I was using C in the late 70s as well, on Unix; Lattice C for the PC was not very good (2 floppy set)--one can hardly assert that its code generation was optimal.
One may as well ask why the PC didn't use NAPLPS or Videotex for graphics.
I think we can all agree that life would have been a lot easier if Michael Shrayer had had the foresight to design the original Electric Pencil word processor to leverage the completely contemporary-to-1976 Open XML file format and set a good example for subsequent editors like Wordstar. Could have saved us a good forty years of having to deal with incompatible document formats if everyone had just gotten it all 100% correct and portable from the start.
For that matter, why can't I CLOADM a .PNG directly into the framebuffer of my TRS-80 Color Computer without having to jump through all these ridiculous hoops to dumb it down into a specific size, color depth, and byte ordering using external software? I mean, jeeze, these standards exist for good reasons, they should have built COLOR BASIC 1.0 to handle all this.