
Bit 7 in CP/M characters.

cj7hawk

Having solved the problem in my BDOS, I now have Zork and other Infocom adventures running in my terminal, at least to the extent that they seem OK. It seems Zork requires the DMA location to be reset to 0080 before execution, or it messes things up a bit, since it never sets the DMA location itself; I also had some errors in my direct console access routine...

That aside, once it starts working, I see a string of gobbledegook appear on the screen each time it waits for input. After having a look at some recordings, I wondered if it was a status bar, so I stripped bit 7, and sure enough, it is...

From what I can tell, CP/M only used 7 bits (0 to 6) for character values, but Zork seems to expect bit 7 to indicate "inverse" graphics.
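In case anyone wants to reproduce the check, here's a minimal sketch in Python of stripping bit 7 from recorded console output (the capture file name is just a placeholder):

    # strip_bit7.py - clear the high bit on every byte of a captured stream.
    # CP/M text is 7-bit ASCII, so masking with 0x7F recovers the characters
    # the interpreter emitted with the attribute bit set for its status line.
    with open("capture.bin", "rb") as f:
        raw = f.read()

    cleaned = bytes(b & 0x7F for b in raw)   # bit 7 off, bits 0-6 unchanged
    print(cleaned.decode("ascii", errors="replace"))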

Is it normal for CP/M to use the upper bit for inverse? Or is it a machine-specific implementation? And was there ever a standard for Infocom terminal settings (e.g. ANSI, ADM3A)?

Thanks
David.
 
There's nothing in CP/M that has anything to do with terminal-specific character sequences (other than ANSI).

Most Infocom games just used ANSI and didn't do any terminal-specific sequences. For the rare ones that did, an installer was included to configure the game for whatever terminal the user had.
 
Note, ANSI is a definition of ESC sequences, and CP/M knows nothing about that. CP/M uses ASCII characters, more specifically, ASCII-7. Using bit 7 for reverse video was something specific platforms did, usually ones that had built-in video (vs. a serial port to a terminal). That certainly indicates that your version of ZORK has been customized for a specific platform. Presumably, there is a defined configuration area one could patch so that it works on a different terminal. Or, ideally, a Zork-config program that guides you through that.
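To illustrate, this is what an ANSI-style configuration would send instead of setting bit 7 on each character; the status-line text below is purely illustrative, but the SGR escape sequences are standard ANSI:

    # ANSI/VT-100 alternative: bracket the status line with SGR escape
    # sequences instead of flipping bit 7 on every character.
    ESC = "\x1b"
    REVERSE = ESC + "[7m"    # SGR 7: reverse video on
    RESET   = ESC + "[0m"    # SGR 0: all attributes off

    status = "West of House                Score: 0   Moves: 0"  # illustrative
    print(REVERSE + status + RESET)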
 
The codes seem to be interpreted correctly, but most of the text is just normal ASCII with bit 7 set, and from what I can tell looking at copies of Infocom adventures on YouTube, it was intended to be printed at the top of the screen and tell you your progress through the game. It looks OK in 7-bit ASCII too - just not as differentiated from the rest of the text.

Thanks again -
David
 
From what I can tell, CP/M only used 7 bits (0 to 6) for character values, but Zork seems to expect bit 7 to indicate "inverse" graphics.
CP/M itself only uses 7-bit ASCII characters. Some implementations use the top bit as a parity bit or for extended characters, or simply strip/ignore it. Others use it as a single-character video attribute. Depending on your terminal hardware, this could be inverse, underlined, bright, bold or blinking. The advantage of using an attribute bit is that it is stateless and easy to implement in hardware, but you only get a single attribute.
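A tiny sketch (hypothetical, not modeled on any particular BIOS) of how a console routine on such hardware might treat each output byte:

    # Each byte carries a 7-bit character plus one stateless attribute flag.
    def classify(byte: int) -> tuple[str, bool]:
        """Split an output byte into its 7-bit character and attribute bit."""
        return chr(byte & 0x7F), bool(byte & 0x80)

    for b in (0x48, 0xC9, 0x21):      # 'H', 'I' with bit 7 set, '!'
        ch, attribute = classify(b)
        print(f"{ch!r}  attribute={attribute}")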

Is it normal for CP/M to use the upper bit for inverse? Or is it a machine-specific implementation? And was there ever a standard for Infocom terminal settings (e.g. ANSI, ADM3A)?
CP/M does not care, but many applications supported using the high bit as an attribute bit. The INFOCOM interpreter contains a patch area (see http://www.retroarchive.org/cpm/cdrom/CPM/GENDOC/ZORKNOTE.TXT) describing the terminal. If the byte at offset 010Fh is non-zero, the top bit will be set for all characters in the status line. WordStar also supported this style.

When configuring for an ANSI/VT-100 terminal, you set this byte to zero and enable/disable the attribute of choice in the "begin status" (0152h) / "end status" (0173h) sequences. If the patch area is not configured (010Fh .. 0193h set to zero), the status bar won't be displayed at all. This may have been the default configuration, as it is compatible with dumb terminals and teletypes.
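For anyone wanting to see what their copy is configured for, here is a rough inspection sketch; the file name and dump lengths are just guesses, so check the exact field layout against ZORKNOTE.TXT before patching anything. CP/M loads .COM files at 0100h, so the byte the notes call offset 010Fh sits at file offset 000Fh:

    # Dump the terminal patch area of an Infocom interpreter .COM file.
    COM_BASE = 0x0100                  # .COM files load at 0100h in the TPA

    def dump(path: str, addr: int, length: int) -> None:
        with open(path, "rb") as f:
            data = f.read()
        chunk = data[addr - COM_BASE : addr - COM_BASE + length]
        print(f"{addr:04X}h: " + " ".join(f"{b:02X}" for b in chunk))

    dump("ZORK1.COM", 0x010F, 1)       # status-line high-bit flag
    dump("ZORK1.COM", 0x0152, 16)      # "begin status" sequence area
    dump("ZORK1.COM", 0x0173, 16)      # "end status" sequence area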
 
That doesn't jibe with the DW floppies I process. All EBCDIC. I suppose that the firmware could translate all of that ASCII to EBCDIC for the disk and printer, but seems silly.
 
That doesn't jibe with the DW floppies I process. All EBCDIC.

Are they CP/M floppies? I vaguely remember looking up the DW hardware manual once, and it said the video and input hardware was switchable between ASCII and EBCDIC, so you could run an OS using either on it.
 
The samples I have are echt Displaywriter ones. Not CP/M. If the system could be switched, that would explain that. I assume that switchability extended to the printer also? The DW keyboard does betray an EBCDIC bent, however; look at shift-6--it's the "cents" character, not the caret. And no braces, only brackets.

But come to think of it, even the S/360 PSW had an ASCII mode bit, which, IIRC, mostly determined how the BCD instructions worked.
 
Was there ever anything other than ASCII (or a variation of ASCII) and EBCDIC that found common use? I recall always getting told about both and never seeing EBCDIC, but they hammered it into us like it would be the end of the world if we didn't pay attention, since we might run into it one day.
 
On mainframes, translation to ASCII (or EBCDIC) was usually done in the peripheral controller. On machines that did not have peripheral controllers, that job often fell to the peripheral driver instead. The peripherals themselves usually had a fixed character coding, e.g. ASCII, and so the machine had to deal with that. Timesharing hardware almost always had to support ASCII-only teletypes, and mag tape was often used for data interchange and so had to be able to read/write data in EBCDIC, ASCII, or the native computer codes. I believe there were some peripherals (e.g. printers) that had a ROM option for EBCDIC, but that was probably not the norm and probably did not sell enough to be worthwhile (this would have been in the age of micros, where things were almost always ASCII).
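For a feel of what that translation amounted to, Python happens to ship a US EBCDIC code page ("cp037", one of several EBCDIC variants), so the round trip is easy to demonstrate:

    # Round-trip a line of text through one EBCDIC code page.
    text = "HELLO, WORLD"
    ebcdic = text.encode("cp037")          # ASCII-side text -> EBCDIC bytes
    print(ebcdic.hex(" ").upper())         # C8 C5 D3 D3 D6 ...
    print(ebcdic.decode("cp037"))          # ... and back to "HELLO, WORLD"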

Most mainframes had their own character coding scheme, often something that could easily be translated from punch card codes. And the big "line printers" for those machines usually handled the native computer character coding, which usually dictated how "text" was interpreted by the machine. But some were rather split-brained about whether text was intended for punch cards or line printers. Things like EBCDIC and ASCII seemed to be an improvement, by making the computer programs all "think" of text in the same coding scheme. But it wasn't always that simple.
 
Up until the 1970s, ASCII (or USASCII) was largely viewed as something used on Teletype terminals. Even before EBCDIC and 8-bit codes, IBM had several versions of BCDIC and a couple of encodings that were unique to a given system (e.g. numeric blank, group mark, etc.). 6-bit codes reigned supreme and, on certain platforms, were still in use well into the 80s. Even this wasn't restricted to big mainframes. Consider the PDP-8/DECmate etc. running WPS--a 6-bit code with certain "shift" characters for lowercase, etc. Fieldata was very much in use on Univac-based systems well into the 80s. CDC stuck with Display Code right through their CYBER mainframe line.

I still deal with these old codes in my everyday work.


Whippersnappers--get off my lawn! :)
 
What was "common use" really depended on what "universe" you lived in (who's computers you used). But, EBCDIC (once it was created) was certainly well-known to pretty much everyone (at least, they knew what it was even if they didn't know that 0xE5 was 'V', etc). Of course, punch card coding predates EBCDIC and was certainly commonly used (for punch cards, but not for internal computer codes or mag tape). And as previously implied, the computer itself was not ASCII or EBCDIC or whatever, that was all a matter of the peripherals and the intent of the software.
 
Of course, punch card coding predates EBCDIC and was certainly commonly used (for punch cards, but not for internal computer codes or mag tape).
I take it that you don't deal with even-parity 7-track tape? The A and B tracks directly correspond to zone punches. Of course, something had to be done about the space character, as 00 is indistinguishable from blank tape, so IBM had a special encoding for that.
Few people understand that punched card code was initially a decimal system; binary was reserved for "scientific" mainframes (e.g. 7070 vs. 7090). On those decimal machines, there's no way to deal with binary, so coding is more-or-less hardwired into them.
EBCDIC, at its heart, is a punched-card code. How else could one explain the gap between I and J or R and S?
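For anyone who hasn't stared at an EBCDIC chart lately, a few lines of Python (using the cp037 code page again) make those gaps visible:

    # A-I, J-R and S-Z sit in separate "zones", so the codes are not contiguous.
    for ch in "ABIJRSZ":
        code = ch.encode("cp037")[0]       # EBCDIC value of the letter
        print(f"{ch}: {code:02X}")
    # Prints A: C1, B: C2, I: C9, J: D1 (gap), R: D9, S: E2 (gap), Z: E9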
 
Punch cards, and the basic coding, predate computers. Not sure what point you're trying to make, but what was written on 7-track or 9-track mag tape depended on the computer that generated it and - more importantly - the software that wrote it. There may have been a de facto standard coding for 7-track tape, but that - like EBCDIC - was probably an option on many machines, and intended for data interchange with other dissimilar computers. Or it was left entirely up to the software (the tape drives just took whatever binary codes were sent by the computer - the meaning of those codes was the responsibility of the software that wrote and read it). So there are plenty of systems that used 7-track tape but never used any "standard" (if there even was such a thing) character coding. It's the software that decided that. The only place that things like ASCII or EBCDIC came into play was for "human interface devices" - and the software that interacted with them. Storage devices certainly didn't care.
 
The point is that, at least up to industry adoption of ASCII, internal codes were essentially punched-card codes.
And the statement that "the computer itself was not ASCII or EBCDIC or whatever, that was all a matter of the peripherals and the intent of the software" doesn't exactly ring true. Didn't Dijkstra have a big issue with the 1620 in that it was possible to read certain codes (e.g. 8-2-1) but not recognize them as different from (8-2), nor produce them on output? Then there were hard-coded (in the machine itself) things like record and group marks, numeric blanks, etc.--there was no substitution--and woe betide you if you tried to use them in arithmetic. Using the BCD instructions on an S/360 depended a lot on the setting of the ASCII bit in the PSW--and that extended to instructions like EDMK.
 
... internal codes were essentially punched-card codes...
That's only superficially true; the codes were a conversion of the Hollerith codes into a form that used 6/7 bits.

The rest of your reply still makes me wonder what you're trying to get at. It seems like you are confusing peripherals and code conversion instructions with the nature of the computer itself.
 