NEC V20: BRKEM, CALLN and RETEM

resman

Veteran Member
Joined
Jan 1, 2014
Messages
555
Location
Lake Tahoe
And if you have wwaaayyyyy too much time on your hands, you could run multiple CP/M programs at once, a la MP/M.
 

wrljet

Experienced Member
Joined
Oct 31, 2008
Messages
205
Location
Maryland
Speaking of NEC V40...

I built this using a V40, ca. 1989.
Simple ECU for a car project, controlling the ignition and an automatic transmission I had added solenoids to.

[Attachment: wrljet1988-ecm40.jpg]
 

kerravon

Experienced Member
Joined
Oct 31, 2021
Messages
97
It's all the same to me. Microcode vs. program emulation. Program emulation has the advantage of portability. If you have an 8088 or a Core I13 CPU, you're still good.

Sorry for the necro, but I've been mulling this off and on since you wrote it, and I think I now have my own answer to this question.

I don't have a boatload of old CP/M source or executables that I want to run as fast as possible to do anything remotely resembling real work. In fact - I have none at all (and don't bother giving me a link to a boatload of the same - there's nothing I actually want to run even if I had every CP/M program in the world).

I have an interesting chip I wish to exercise. Actually two of them, because I bought two Book 8088s in case one of them broke. So I have the chip that was designed to run 8080 executables without recompilation, which was meant to provide an edge over Intel's 8088 and win in the marketplace.

They didn't actually win, but that doesn't bother me.

I predicted the Amiga would win over the IBM PC. I was wrong about that too. It still interests me today, though, and one aspect of my PDOS work is preparing C90- and ANSI X3.64-compliant software, ready to make the jump to an Amiga that can also support exactly that.

I'm very very late to market. Like 35 years late or something. But that doesn't bother me either.
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
42,664
Location
Pacific Northwest, USA
Well, just before the 5150 was officially revealed, the 68K lab computer was debuted. That started some people talking about IBM introducing a 68K-based PC. Of course, that didn't happen and many people who were expecting something phenomenally earthshaking from IBM were disappointed. The 68K had to wait for the Atari ST debut, I guess.
 

kerravon

Experienced Member
Joined
Oct 31, 2021
Messages
97
Well, just before the 5150 was officially revealed, the 68K lab computer was debuted. That started some people talking about IBM introducing a 68K-based PC. Of course, that didn't happen and many people who were expecting something phenomenally earthshaking from IBM were disappointed. The 68K had to wait for the Atari ST debut, I guess.

But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - ie ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line mode programs and fullscreen applications could have been portable to everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).

Ok, so I've heard that ANSI.SYS was slow because it went through MSDOS, which used the BIOS to write to the screen.

So? Replace ANSI.SYS with one that writes directly to the screen buffer, rather than changing every application to statically link libraries that write directly to it.

In fact, there's nothing wrong with still doing the static link, so long as it is isolated in the code. Whatever screen-writing package or functions you use (MicroEMACS 3.6 manages this) should result in ANSI codes being sent to a function called scrwrite, which takes the same parameters as fwrite, one of them being stdout. If scrwrite is not redefined via a compiler override, it defaults to fwrite, so that by default your app conforms to the spec. For MSDOS, feel free to define scrwrite to a different name that is a statically linked module.

You need to recompile for the Amiga anyway, so none of that matters there.

There could have been a scrwrite.c that has an #ifdef scrwrite surrounding it, so that by default it compiles to nothing, and the same command line could have been used to compile for both targets.
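To make that arrangement concrete, here is a minimal sketch of the default-to-fwrite idea. scrwrite comes from the discussion above; the SCRWRITE_DIRECT macro name is my own hypothetical stand-in for the "compiler override" a DOS build would use:

```c
#include <stdio.h>
#include <stddef.h>

/* Sketch only. Unless a platform-specific replacement is selected
   (here via a hypothetical SCRWRITE_DIRECT define) and linked in,
   scrwrite is just fwrite, so the application sends its ANSI codes
   to stdout and conforms to the spec by default. */
#ifndef SCRWRITE_DIRECT
size_t scrwrite(const void *buf, size_t size, size_t nmemb, FILE *stream)
{
    return fwrite(buf, size, nmemb, stream);
}
#endif
```

A DOS build would compile with SCRWRITE_DIRECT defined and link a scrwrite that pokes the video buffer instead; the portable build needs no changes at all.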

Similar issue for keyboard input - you need to translate to ANSI escape codes and interpret them rather than using the extended codes given by MSDOS.

Keystrokes aren't time-critical anyway.

ANSIPLUS (or equivalent) can give you the ANSI X3.64 keyboard strokes instead if (like me) you don't even want to see the pollution of kbdread.c or whatever you want to call it in your source tree. Ditto for not wanting scrwrite.c in your source tree.
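As a sketch of what that keyboard translation involves: the BIOS INT 16h services report arrow keys as extended scan codes, which a kbdread-style layer would map to the ANSI X3.64 cursor sequences before the application ever sees them. The function name scan_to_ansi is my own illustration, not anything in ANSIPLUS:

```c
/* Sketch: map the BIOS INT 16h extended scan codes for the arrow
   keys to ANSI X3.64 cursor sequences, so the application only ever
   sees escape codes on its input stream. Keys with no widely
   standardized sequence return NULL. */
const char *scan_to_ansi(unsigned char scan)
{
    switch (scan) {
    case 0x48: return "\x1b[A";  /* up    */
    case 0x50: return "\x1b[B";  /* down  */
    case 0x4D: return "\x1b[C";  /* right */
    case 0x4B: return "\x1b[D";  /* left  */
    default:   return 0;         /* no standard sequence */
    }
}
```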

I also wouldn't mind if there were a de facto standard/convention that when writing ANSI controls, you write them in a way that is optimized so that ANSIPLUS (or a custom replacement) can rapidly convert your buffer to direct screen writes. For example: an escape sequence to clear the screen, followed by your full screen of data. ANSIPLUS would be specifically designed to do a memchr looking for an additional ESC character, and so long as none is present, and the buffer is exactly 80*25 in size, it uses an efficient algorithm to write to the screen.
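A rough sketch of that fast-path check, assuming the convention above (ansi_fast_path is my own illustrative name, not anything in the real ANSIPLUS):

```c
#include <string.h>

#define COLS 80
#define ROWS 25

/* Sketch: if the buffer is "clear screen" (ESC [ 2 J) followed by
   exactly one full 80x25 screen of text with no further escape
   sequences, the driver could copy it straight into video memory
   instead of parsing it character by character. */
int ansi_fast_path(const char *buf, size_t len)
{
    if (len != 4 + COLS * ROWS)
        return 0;
    if (memcmp(buf, "\x1b[2J", 4) != 0)
        return 0;
    /* memchr for any additional ESC; a hit forces the slow parser */
    return memchr(buf + 4, 0x1b, len - 4) == NULL;
}
```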
 

kerravon

Experienced Member
Joined
Oct 31, 2021
Messages
97
Actually it just occurred to me that I could put both the keyboard and screen logic into pdpclib for MSDOS. I already do that in pdpclib for EFI.
 

Chuck(G)

25k Member
Joined
Jan 11, 2007
Messages
42,664
Location
Pacific Northwest, USA
But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - ie ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line mode programs and fullscreen applications could have been portable to everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).
Non-sequitur. The IBM PC was under development in 1981.
 

dreNorteR

Experienced Member
Joined
Dec 19, 2016
Messages
115
But IBM didn't need to be involved. I'm more interested in the culture. If people (programmers) had been writing their programs according to the appropriate specs - which were either available or close to being available - ie ANSI X3.159-1989 (as a draft, or else just use K&R 1) and ANSI X3.64 - then both line mode programs and fullscreen applications could have been portable to everywhere. I've even demonstrated EBCDIC ANSI terminals working on a mainframe (using emulation).

...

What you are saying is that every program, for every system, should either be written in C, or link with a C library. Or you're even assuming that this is already the case, as if C was somehow fundamental to how computers function and there was no way - or at least no reason - to avoid it, ever. That's also an attitude common among UNIX/Linux/GPL zealots...

The ANSI standard provides a lowest common denominator for console i/o, but if you want a program (in any language) to run well on a particular machine or OS, using its full capabilities, you have to adjust it anyway. Or have some abstraction layer that is a superset of what the hardware provides.

Many key combinations can't be represented by escape sequences, or at least there is no widely implemented standard for it. Certainly no ANSI one.

I have written a somewhat functional TSR for DOS that scans the video buffer for updates and sends them over the serial port, translated to UTF-8 and terminal escape sequences. It also attempts to translate input to the scan/ASCII codes expected by programs which use INT 16h. At 115200 bps, I'd call it "usable", but still with noticeable lag of course, and occasional screen corruption because of dropped characters.
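The core of that scan-and-send idea could be sketched like this. This is my own simplified illustration: plain arrays stand in for the real B800h text buffer, and the attribute bytes, CP437-to-UTF-8 translation, and serial output of the actual TSR are omitted:

```c
#include <stdio.h>
#include <string.h>

#define COLS 80
#define ROWS 25

/* Sketch: diff a shadow copy of the text screen against the live
   buffer and, for each cell that changed, emit an ANSI cursor-
   position sequence plus the new character (1-based row;column).
   Returns the number of bytes written to out. */
size_t emit_diffs(const unsigned char *live, unsigned char *shadow,
                  char *out)
{
    size_t n = 0;
    int i;
    for (i = 0; i < COLS * ROWS; i++) {
        if (live[i] != shadow[i]) {
            shadow[i] = live[i];
            n += sprintf(out + n, "\x1b[%d;%dH%c",
                         i / COLS + 1, i % COLS + 1, live[i]);
        }
    }
    return n;
}
```

A real version would batch runs of changed cells and track the remote cursor to avoid re-sending position sequences, which matters a lot at 115200 bps.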

Now I can use my 286 PC from a terminal window on Linux by running "screen /dev/ttyUSB0 115200".

But for example Ctrl+Enter (copies a filename into the command line in Norton/Volkov Commander) doesn't work, because there is no escape sequence for it that my Linux terminal emulator sends. Worse than that, it ignores the Ctrl and just sends ASCII CR, running whatever is in the partial command line. Yet somehow, the same key combination does work locally in Midnight Commander, and I actually looked at the source code to find out how this magic is possible -- turns out that when the DISPLAY environment variable is set, this console program will connect to the X server, and poll it for the state of modifier keys when receiving CR on stdin!

I'd call that a gross hack, one made necessary by the "standards-compliant" console i/o layer.

Back in the day, people ported games written in assembly language from one processor architecture to a different one, by rewriting the machine instructions but keeping the logic the same. And rewriting the i/o (which was usually part of the code, not a separate driver or library) to interact with whatever hardware existed in the target system.

"Write code once, compile/run anywhere" may be convenient, but giving up so much for it as you want is hardly an optimal state.

I also wouldn't mind if there were a de facto standard/convention that when writing ANSI controls, you write them in a way that is optimized so that ANSIPLUS (or a custom replacement) can rapidly convert your buffer to direct screen writes. For example: an escape sequence to clear the screen, followed by your full screen of data. ANSIPLUS would be specifically designed to do a memchr looking for an additional ESC character, and so long as none is present, and the buffer is exactly 80*25 in size, it uses an efficient algorithm to write to the screen.

And this would completely defeat the goal of portability, while still being inefficient compared to direct hardware access, and unable to deal with color/attribute changes.
 

krebizfan

Veteran Member
Joined
May 23, 2009
Messages
5,910
Location
Connecticut
Some programs were able to get good performance with portable code. UCSD Pascal's major problem was the very limited memory space allowed, which meant a lot of swapping compared to other programs that could take advantage of more than 128K.

I would not want to program in C on any micro platform before 1985. The compilers yielded slow yet buggy code. A fictional world where systems had enough memory to run complete, thoroughly debugged compilers would have led to different development decisions.
 