
Real Programmers

Ken Vaughn

The first computer I ever programmed was a UNIVAC File Computer, vintage late 50's. It was a gigantic machine with drum memories -- a primary (fast) drum with fixed heads, and a secondary (slower, but larger) drum with flying heads on a moveable boom. The system had 16 tape drives, each separated from the next by around 6 feet to allow for proper cooling. The card reader and punch used UNIVAC 90 column cards (round holes) and the printer used a rotating drum rather than print trains.

The machine was programmed in machine code -- we called this absolute programming. Each machine instruction contained not only the op code and operands, but also the location of the next instruction somewhere around the drum, but not necessarily under the same read head. Proper programming technique took into account drum latency -- if the instruction was a simple add, then the location of the next instruction need not be very far around the drum, but if the instruction was a divide, then the location of the next instruction would be farther around the drum. I remember when UNIVAC first developed an assembler which took this all into account.
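
For anyone who hasn't run into minimum-latency coding before, the bookkeeping looked roughly like this little Python sketch. The word times and drum size here are invented for illustration -- they are not actual File Computer figures.

# Toy model of minimum-latency instruction placement on a drum memory.
# All numbers are invented for illustration; they are NOT UNIVAC File
# Computer figures.

WORDS_PER_TRACK = 64          # addressable words around the drum

# How many word-times each operation keeps the machine busy before it
# is ready to fetch the next instruction (purely illustrative values).
EXECUTION_TIME = {
    "ADD": 2,
    "MUL": 8,
    "DIV": 20,
}

def best_next_address(current_addr, opcode):
    """Place the next instruction just far enough around the drum that it
    arrives under the head the moment the current operation finishes."""
    busy = EXECUTION_TIME[opcode]
    # +1 because the drum has already advanced past the current word
    # while the instruction was being read.
    return (current_addr + busy + 1) % WORDS_PER_TRACK

# A slow divide forces the next instruction much farther around the drum
# than a quick add does.
print(best_next_address(10, "ADD"))   # 13
print(best_next_address(10, "DIV"))   # 31

The assembler I mentioned essentially automated exactly this calculation for every instruction.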

Today while looking for something in my old documents folder, I ran across a USENET (newsgroup) post which I had saved from the early 80's. This post was a classic in the computer science newsgroups. It describes this sort of programming in a humorous manner which I could never do. I thought you might enjoy this post. It is long, but well worth the read.

http://home.comcast.net/~kvaughn65c/Real_Programmers.txt
 
Nice read! Pieces like that are what keeps this hobby fun. Having programmed in machine language for a grade (one of our professors made us do it throughout the semester -- "I had to, so you have to, it's a rite of passage"), I certainly appreciate having access to an Assembler, not to mention high-level languages like C!

Unfortunately, at least at the university I attend, most people consider C to be too "low-level" to be practical anymore -- if it's not Java or C#, they don't want anything to do with it. Most of the newer professors seem to take the same attitude, as now all courses are taught in Java, even the Operating Systems course. The only exposure to anything "lower" is a semester of C and Assembly in one of the Computer Organization courses (it's touched on in the prerequisite course, but only just).
 
Unfortunately, at least at the university I attend, most people consider C to be too "low-level" to be practical anymore -- if it's not Java or C#, they don't want anything to do with it. Most of the newer professors seem to take the same attitude, as now all courses are taught in Java, even the Operating Systems course. The only exposure to anything "lower" is a semester of C and Assembly in one of the Computer Organization courses (it's touched on in the prerequisite course, but only just).

I'm a big believer in Java, but it is definitely too bad the students don't get more exposure to lower-level languages. Especially with so much being done on Linux systems, it seems important to be able to understand C, since the kernel and much of the rest of the OS is written in it.
 
I think all programmers should try low level programming. And all those who have no clue should then try something a little higher level until they can get things that work. You need to know your limitations.

While a genius programmer is nice, you should write code that others can understand and expand as needed, or you end up with code that nobody can work on and is dead. Makes you wonder how much of the code that makes the world work is written by people who can barely get the job done, but it still works.
 
I may still have a coding form that I used a while back. It's labeled "IBM 1620 Absolute Coding System" on one side and "IBM 1620 Symbolic Programming System" on the other. Back then, only wimps programmed in assembly.

But to bring this on topic, how many have the opcodes (not mnemonics) and modifier bits memorized for their favorite platform? A couple of years ago, I surprised myself by writing out the "99 bottles of beer" code for the 1620 from memory. Some things never leave you--on the other hand, I still don't remember where I put my glasses...
 
While a genius programmer is nice, you should write code that others can understand and expand as needed ....

Agreed. But the state of the art in the late 50's and early 60's had not evolved to that which we practice today. Assemblers were not even available when I first started to program. The last 25 years of my professional career were with Control Data Corp. Most of our operating systems and applications (on several platforms) were written in SYMPL -- pronounced "simple", it stood for Systems Implementation Language, a highly structured, strongly typed, self-documenting language similar to Jovial or Pascal, but with lots of low level extensions.

The USENET article that I included in my post plays up the "maverick" programmer image, but is sort of fun to read. It does, however, describe the programming of early vacuum tube machines with drum memories pretty well. An S-100 system, or an IBM 5150, looks pretty primitive when compared to modern platforms. The same was true of programming in the "good ol' days".
 
There is a major difference between an industry that has just formed and one that is mature. I would suspect that 99% of the programmers today could not have done a good job coding on a tube-based machine, and those old programmers would have a hard time coding today, where the compiler does most of the optimizing work. Different skills are needed over time as systems mature, so you have different types of people doing them.
 
I found this advertisement linked from Wikipedia:
http://www.sciencemag.org/cgi/issue_pdf/frontmatter_pdf/137/3523.pdf

It says the RPC-4000 was so good that "you'll never have to be dependent upon programming specialists, even non-technical personnel can master it". I suppose it depended on who programmed the system to begin with, i.e. an RPC-4000 loaded with compiled higher-level languages may have been easier to maintain than one with hand-written machine code, tailored to match the drum rotation.
 
So the only link I've found won't play on youtube (not sure if it's my problem or theirs), but does anyone have a video demonstration of working drum memory? I've read about it and how it works, but I'm curious to see it, if only to cure my own ignorance :)
 
Hi
No one seems to care anymore about how fast, how big, or even how well a program works. I remember thinking how compact the code was for the monitor in my Poly88. It was always an inspiration to me to do the best I could.
Dwight
 
So the only link I've found won't play on youtube

http://www.youtube.com/watch?v=eIpoA7Ir9p8

Works for me. The drum shown is an early ERA (Engineering Research Associates) device. ERA was acquired by UNIVAC, but the principals (William Norris, Seymour Cray, et al) became unhappy with UNIVAC and moved back to Minneapolis and formed Control Data Corp. Seymour Cray later founded Cray Research, maker of the famous Cray Supercomputers.

The drum shown is a fixed head device -- there is one head per track and consequently no head movement time, only drum latency due to rotation. Do a Google search for FASTRAND and you will see a description of a drum with moveable heads, attached to a boom which moves back and forth over the surface of the drum -- all heads on the boom can read a track at any given location. This, of course, means that there may be head movement time in addition to drum rotation latency.
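
To put rough numbers on why the distinction matters, here is a quick Python sketch. The figures below are invented for illustration -- they are not ERA or FASTRAND specifications.

# Rough comparison of average access time for a fixed-head drum versus a
# moving-head (boom) drum. All figures are invented for illustration,
# not FASTRAND or ERA specifications.

ROTATION_MS = 35.0        # one full revolution, milliseconds
AVG_SEEK_MS = 60.0        # average boom positioning time, milliseconds

fixed_head_avg = ROTATION_MS / 2                 # only rotational latency
moving_head_avg = AVG_SEEK_MS + ROTATION_MS / 2  # seek + rotational latency

print(f"fixed head : {fixed_head_avg:.1f} ms")   # 17.5 ms
print(f"moving head: {moving_head_avg:.1f} ms")  # 77.5 ms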
 
I have to admit that I get a little uncomfortable at times seeing the generated code for something written in, say, "C" and saying "I can do better than that".

Even among x86 assembly programmers, few appear to be aware that x86 assembly is not one-to-one (symbolic form to machine form). There are often several ways to express, in the machine encoding, the same operation given by a specific mnemonic.
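
A concrete example: "mov eax, ebx" has two equally valid encodings, because MOV exists in both a "register to r/m" form (opcode 89) and an "r/m to register" form (opcode 8B). A quick Python sketch of the two byte sequences:

# Two different byte sequences that disassemble to the same instruction,
# "mov eax, ebx". Different assemblers are free to pick either one.

enc_a = bytes([0x89, 0xD8])   # MOV r/m32, r32  -> opcode 89 /r, ModRM = D8
enc_b = bytes([0x8B, 0xC3])   # MOV r32, r/m32  -> opcode 8B /r, ModRM = C3

print(enc_a.hex(), enc_b.hex())   # 89d8 8bc3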

How many people have tried to write an ASCII sentence to execute as machine code?
 
It says the RPC-4000 was so good that "you'll never have to be dependent upon programming specialists, even non-technical personnel can master it". I suppose it depended on who programmed the system to begin with, i.e. an RPC-4000 loaded with compiled higher-level languages may have been easier to maintain than one with hand-written machine code, tailored to match the drum rotation.
Yes, but even when assembly language first came about, it was thought that it would eliminate virtually all programming errors!
 
Yes, but even when assembly language first came about, it was thought that it would eliminate virtually all programming errors!

Do you have a cite for that?

My recollection was that on early systems, assembly provided some valuable assist to the programmer and didn't simply translate mnemonics. For example, IBM 650 SOAP could optimally place instructions on the drum (1+1 addressing). This was a miserable job to do by hand.
 
1: Make program
2: ???
3: Profit!!

There is no need to care how well done your code is if it works and it generates money.
 
Although I'm too young to properly understand it, I suppose the switch to assembly language would at least eliminate numerical typos? I don't know what the machine language on these systems looks like, but suppose it is represented as either hexadecimal or decimal numbers, where each numeric code performs a different instruction. Logical mnemonics probably are easier to remember, so that kind of "programming error" would be eliminated even if the assembler did no additional optimizations.
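
My mental picture of those first assemblers is little more than a table lookup from mnemonic to number -- something like this toy Python sketch. The two opcodes shown (NOP = 00, JMP = C3) are real Intel 8080 values; everything else here is simplified for illustration.

# Toy one-pass "assembler": translate mnemonics into raw machine bytes.
# The two opcodes shown (NOP = 00, JMP = C3) are real Intel 8080 values;
# everything else about this sketch is simplified for illustration.

OPCODES = {"NOP": 0x00, "JMP": 0xC3}

def assemble(lines):
    out = bytearray()
    for line in lines:
        parts = line.split()
        op = parts[0].upper()
        out.append(OPCODES[op])           # mnemonic -> numeric opcode
        if op == "JMP":                   # 16-bit address, low byte first
            addr = int(parts[1], 16)
            out += addr.to_bytes(2, "little")
    return bytes(out)

# "JMP 1000" becomes C3 00 10 -- the numeric opcode is looked up, not typed.
print(assemble(["NOP", "JMP 1000"]).hex())   # 00c30010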
 
Back when I worked for the Post Office, somebody pointed out a defunct Postcode Sorting Training simulator that sat in the corner of a 'classroom'. It was a Commodore Pet (don't remember which model) with a serially attached device to simulate the hardware at the operator's station of the real Postcode Sorting machinery.

It was unused because nobody around could remember how to reprogram it, and since some of the postcodes had changed (and new ones added) it was no longer 'in synch' with the live system. Instead they were using a white board (for the teacher) and pen & paper for the students.

I took a look at the program (after loading it in from a C60 cassette) and it was written in reasonably clear, but completely undocumented BASIC. Anyway, I got the program working again, to match the live data, and I earned my brownie points for that month.

Another time I was asked to write a program that would import the 'missing international post bags' data, and create the heavily formatted 'request for compensation' documents to send to foreign Post Offices. Up to that point, the data was being ignored.

My first draft of the program took too long to run, and I took it home to work on over a weekend. At the end of the weekend, after writing some code to create an index of the data, it ran 10x quicker, and became an instant monetary success. Tens of thousands of pounds were recovered each month, and I got a 'well done' for my trouble.
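
I don't remember the exact layout now, but the idea was nothing exotic -- the same trick as this little sketch (modern Python, with invented field names rather than the real record layout): build a lookup structure keyed on the field you search by, once, instead of rescanning the whole file for every query.

# The speed-up came from indexing, not cleverness: build a dictionary keyed
# on the field you search by, instead of scanning every record per lookup.
# Field names here are invented for illustration.

records = [
    {"bag_id": "GB1234", "destination": "Paris", "value_gbp": 410},
    {"bag_id": "GB5678", "destination": "Bonn",  "value_gbp": 220},
    # ... tens of thousands more in the real data ...
]

# Slow way: a full scan per query -> O(n) every time.
def find_slow(bag_id):
    return next((r for r in records if r["bag_id"] == bag_id), None)

# Fast way: build the index once, then every lookup is O(1).
index = {r["bag_id"]: r for r in records}

def find_fast(bag_id):
    return index.get(bag_id)

print(find_fast("GB5678")["destination"])   # Bonn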
 
Programming microcontrollers for real-time applications is probably the nearest I've got to "real" programming. Limited memory and limited interrupt execution time meant that in one case I wrote 4K's worth of assembler about four times over. Towards the end of the job, I had to really search hard to find a redundant couple of bytes I could trim out, or a subroutine that could be done a different way, just to fit in the code for another function.
What "Next" says is true, but only for accountants.
 