
What makes COBOL well suited for commercial applications?

I wish I could find the quote, but I swear I read somewhere that, when pressed on the verbose English syntax of either COBOL or one of the -MATIC languages, Grace Hopper indicated that the alternative was trying to explain mathematical formulas and symbols to military commanders, and they would have none of it. While it is in name a "business" language, one must remember how big a business the military is, and that, for better or for worse, most of Grace Hopper's view of computers concerned their role in military, not civilian, affairs.
I think this is the major point: its similarity to English is what gave it credibility among management and among people who were intelligent but not programmers.

As well, the strictness of COBOL helped reduce common errors.
 
I have been on my own deep dive into early computer-language development, going down the LISP rabbit hole, but FORTRAN and COBOL had just as much influence, if not more. My good friend Tom Lahey made a career out of FORTRAN compilers. Ryan and McFarland also came from Digitek and expanded their FORTRAN compiler experience into COBOL as well. Tom lamented that he should have expanded Lahey's compiler product line to include more languages; bringing those mainframe languages to the PC in the '80s must have been fairly lucrative.
 
May as well advocate keying the machine code in through dozens of toggle switches, except that's just an abstraction of routing the patch cables instead.
Well, I think that my point is being missed. Some are discussing Python, COBOL, etc. as if they were ends in themselves--and in today's world, where knowledge of machine architecture and instruction sets is no longer required of programmers, that's a problem. Ultimately, it all boils down to what a given machine is doing--even if that machine is a human.

As to Turing-completeness, that can raise problems. In the case of Dijkstra and the 1620, the issue he complained about was basically that the set of output characters was smaller than the set of input characters. That is, a 1620 could not duplicate all input on output.
 
Well, I think that my point is being missed. Some are discussing Python, COBOL, etc. as if they were ends in themselves--and in today's world, where knowledge of machine architecture and instruction sets is no longer required of programmers, that's a problem.
I've not noticed any of this in the thread, and I'm not really seeing how it relates to the topic of how language syntax and semantics affect the difficulty of expressing ideas. As a 25-year professional programmer, an amateur programmer for years before that, and someone with relatively deep knowledge of machine architecture and instruction sets myself, I can say that low-level architecture knowledge isn't a requirement for a substantial number of things (though it can still be helpful), and that language design (syntax and semantics) always has a huge effect on the programs that are written, including their correctness and later maintainability.

Ultimately, it all boils down to what a given machine is doing--even if that machine is a human.
Not for the purposes of this thread, no. It's quite the opposite, in fact: here it boils down to, "how is what you want to do being expressed"?

As to Turing-completeness, that can raise problems. In the case of Dijkstra and the 1620, the issue he complained about was basically that the set of output characters was smaller than the set of input characters. That is, a 1620 could not duplicate all input on output.
1. I don't see how the IBM 1620's I/O issues are connected to Turing completeness (or the lack of it) in any way. 2. It is not the case ("basically" or not) that "the set of output characters was smaller than the set of input characters." 3. As you can see from a careful reading of EWD37, the problem was effectively one of language and representation design: the very kind of design you've been claiming in previous posts here isn't important.
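To make point 3 concrete, here's a toy Python sketch (the tape codes are invented for illustration; these are not the 1620's actual code tables). Once the input mapping is many-to-one, no program, however clever, can reproduce an arbitrary tape: the distinction between codes is gone before the program ever sees the data.

```python
# Toy illustration of the EWD37 problem: if a machine's input mapping is
# many-to-one, it cannot reproduce its input, no matter what it runs.
# The specific codes below are made up for illustration.

INPUT_MAP = {            # tape code -> internal character
    "A1": "A",
    "A2": "A",           # two distinct tape codes collapse to one character
    "B1": "B",
}
OUTPUT_MAP = {           # internal character -> code punched on output
    "A": "A1",
    "B": "B1",
}

def reproduce(tape):
    """The best possible 'tape duplicator': read it in, punch it out."""
    internal = [INPUT_MAP[code] for code in tape]   # all the CPU ever sees
    return [OUTPUT_MAP[ch] for ch in internal]

original = ["A2", "B1"]
copy = reproduce(original)
print(copy)                # ['A1', 'B1'] -- not the original tape
assert copy != original    # 'A2' was unrecoverable: a representation problem
```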
 
Well, he did observe: "As a result one comes to the shocking conclusion that it is impossible to use the IBM 1620 Data Processing System for one of the most trivial jobs: the reproduction of a punched paper tape, impossible even when it is known that the characters on the tape all belong to the restricted set of the IBM 1620."

He clearly used paper tape I/O. The situation was much worse with card I/O.

As far as the BT/BB hardware is concerned, it's not a deficiency if you consider BT/BB to be a diagnostic tool. Indeed, there are ISAs where a "trace register" is implemented to record the location of the last branch taken; its contents are readable, but overwritten by subsequent branches. The usual subroutine call sequence was BTM, using the return address as the Q operand, the return being made with a branch back to the transmitted address, either via the optional Indirect Addressing feature or by constructing a branch instruction to do the job.

Very similar to other architectures such as the PDP-8 or CDC 6600.

However, the BT/BTM instructions aren't essential to operation. A TF/TFM followed by an unconditional branch would accomplish the same thing.

Note that the 1620 is a memory-to-memory design; there are no directly readable hardware registers.
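For anyone who never programmed one of these machines, here's a rough Python sketch of that call convention (addresses and layout are invented; this is an illustration, not a 1620 simulator). The return address is deposited in memory at the subroutine's entry, and the return is an indirect branch through that stored word, much as with the PDP-8's JMS.

```python
# Sketch of a memory-to-memory subroutine linkage in the BTM style
# described above. All addresses are invented for illustration.

memory = {}   # address -> value; no readable hardware registers

def btm(return_addr, sub_entry):
    """'Branch and Transmit': deposit the return address (the Q operand)
    at the subroutine's link word, then continue just past it."""
    memory[sub_entry] = return_addr
    return sub_entry + 1                 # subroutine body starts here

def branch_back(sub_entry):
    """Return by branching indirectly through the stored link word;
    a TF/TFM followed by an unconditional branch does the same job."""
    return memory[sub_entry]

SUB = 100                                # link word at the entry point
pc = btm(return_addr=42, sub_entry=SUB)  # call from (say) location 41
# ... subroutine body executes from pc == 101 ...
pc = branch_back(SUB)                    # back to the caller
print(pc)                                # 42
```

One consequence worth noting: with the return address in memory rather than on a stack, such a routine is not reentrant, and recursion takes extra work.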

Only 25 years?
 
Well, he did observe: "As a result one comes to the shocking conclusion that it is impossible to use the IBM 1620 Data Processing System for one of the most trivial jobs: the reproduction of a punched paper tape, impossible even when it is known that the characters on the tape all belong to the restricted set of the IBM 1620."
He did indeed observe that, nor did I disagree with it. What's in question is not whether you got the headline, but whether you understand what the actual problem is.

Only 25 years?
Exactly! Far less than you, I'm sure, yet somehow even with less experience I seem to have a better understanding than you of what programming languages are all about. Sad, really.
 
(LOL) Didn't know that folks lived in ivory towers in Tokyo.
(LOL) Didn't know that so many people think that programming in assembly language, COBOL and Python are all the same thing.

(The especial irony here, of course, is that it's the theorist who thinks they are, and anybody who's actually done any amount of practical programming in any two of those three, or some number of other languages, understands how vastly different they are.)
 
I work in a COBOL shop at a large financial institution. I manage the teams that write the code that makes the updates. I can say this: when written correctly, COBOL is very reliable, easy to maintain, and compiles to very efficient code. COBOL is not a slow language to run.
 
(LOL) Didn't know that so many people think that programming in assembly language, COBOL and Python are all the same thing.

(The especial irony here, of course, is that it's the theorist who thinks they are, and anybody who's actually done any amount of practical programming in any two of those three, or some number of other languages, understands how vastly different they are.)
Well, the greater irony here is that you pounced on someone who has been at it for over 60 years and is more or less accepted as a "master" of the trade worldwide. You should consider rescinding your remarks about 'trolling', as they were embarrassing to all; of course this is an open forum and you are entitled to post as you please, but not without rebuttal.
 
Well, the greater irony here is that you pounced on someone who has been at it for over 60 years and is more or less accepted as a "master" of the trade worldwide.
Dude, you can't win an argument with me by making things up. Dijkstra he is not, far from it.

And if you seriously think he wasn't trolling, as your post implies, consider that this means he feels that writing any particular piece of software in assembly language, COBOL, or Python are all one and the same. This is not only trivial to disprove but, frankly, a ludicrous opinion.

You should consider rescinding your remarks about 'trolling', as they were embarrassing to all; of course this is an open forum and you are entitled to post as you please, but not without rebuttal.
Feel free to rebut with facts and analysis, rather than ad hominem arguments. Telling me that you consider so-and-so to be a great programmer, and claiming (with no evidence whatsoever) that others think so too, is basically admitting you can't argue the material itself.
 
Dude, you can't win an argument with me by making things up. Dijkstra he is not, far from it.

And if you seriously think he wasn't trolling, as your post implies, consider that this means he feels that writing any particular piece of software in assembly language, COBOL, or Python are all one and the same. This is not only trivial to disprove but, frankly, a ludicrous opinion.


Feel free to rebut with facts and analysis, rather than ad hominem arguments. Telling me that you consider so-and-so to be a great programmer, and claiming (with no evidence whatsoever) that others think so too, is basically admitting you can't argue the material itself.
Dude, you missed the whole point. The gentleman has been around a long time and possesses a vast knowledge of hardware and software. I never indicated anyone was 'great', and you must be blind if you think folks here don't respect what he has to offer. I've been on this forum about 15 years, and you, dude, are the first ever to take a swipe at him.
 
I work in a COBOL shop at a large financial institution. I manage the teams that write the code that makes the updates. I can say this: when written correctly, COBOL is very reliable, easy to maintain, and compiles to very efficient code. COBOL is not a slow language to run.

Especially when IBM mainframes are, effectively, COBOL machines. Many, many years ago when I did some System/370 assembly language, the assembly mnemonics were VERY close to COBOL.
It was the only assembly language I'd encountered that could manipulate packed and character format numbers directly.
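For those who never met them: packed decimal stores two BCD digits per byte with a sign nibble at the end; it's what COBOL's COMP-3 fields are, and what S/370 decimal instructions such as AP operate on directly. Here's a simplified Python sketch of the representation (the 0xC/0xD sign nibbles are the real convention; the rest is my simplification, with no overflow handling):

```python
# Simplified sketch of IBM packed-decimal format (COBOL COMP-3).
# Sign nibbles: 0xC = positive, 0xD = negative. No overflow checks.

def pack(n: int) -> bytes:
    """Encode an int as packed decimal: two digits per byte,
    sign in the final low nibble."""
    sign = 0xD if n < 0 else 0xC
    digits = str(abs(n))
    if len(digits) % 2 == 0:       # pad to an odd digit count so the
        digits = "0" + digits      # sign nibble completes the last byte
    nibbles = [int(d) for d in digits] + [sign]
    return bytes((nibbles[i] << 4) | nibbles[i + 1]
                 for i in range(0, len(nibbles), 2))

def unpack(b: bytes) -> int:
    """Decode packed decimal back to an int."""
    nibbles = [x for byte in b for x in (byte >> 4, byte & 0xF)]
    sign = -1 if nibbles[-1] == 0xD else 1
    return sign * int("".join(str(d) for d in nibbles[:-1]))

assert pack(-1234) == bytes([0x01, 0x23, 0x4D])
assert unpack(pack(-1234)) == -1234
# What AP (Add Packed) does in one instruction is, in effect:
total = unpack(pack(100)) + unpack(pack(-42))   # 58
```

The point being that decimal quantities never pass through binary floating point, so a COBOL shop's cents stay exact.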
 
That's not true, and it doesn't mean this opinion isn't shared by others. Rather than pile on, let's just leave it at that. Trying to defend this more is just going to blow it way out of control.
The reference was in response to 'trolling'. So relax a bit.
 
The reference was in response to 'trolling'. So relax a bit.
I suspect you may have entirely misinterpreted what I was saying, and why I accused Chuck of trolling. I don't think he's an idiot, and that's precisely why I think he was trolling; I consider it very unlikely that he would actually believe some of the ridiculous things he appeared to be saying.

This is why I encourage you to leave aside the people for a moment and go back to the actual technical points being made here. Ad hominem arguments and appeals to authority can cause you entirely to miss the actual discussion.

I should note that, despite the various flames flying back and forth, I've found this discussion to be fairly useful (including some of Chuck's contributions to it). I've learned significantly more about COBOL (though admittedly some of it has rather horrified me; I'm looking at you, ALTER), gained a little insight into control structures, and I am coming to suspect that the data formatting facilities of COBOL were rather more valuable than people give them credit for, at least in the age in which COBOL was commonly used.
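On that last point, a single edited PICTURE clause does declaratively what takes explicit formatting code in most other languages of the era. The PIC string below is a genuine COBOL form; my Python rendering of its semantics is a loose approximation, not a faithful one:

```python
# Approximating:  05 AMT-OUT PIC Z,ZZ9.99.   then   MOVE AMT TO AMT-OUT.
# Z = zero suppression (blank fill), ',' = insertion comma,
# '.' = fixed decimal point. Oversize values aren't handled here.

def move_edited(value: float) -> str:
    """Loose Python stand-in for a MOVE into the edited field above."""
    return f"{value:8,.2f}"    # width 8, comma insertion, two decimals

print(repr(move_edited(1234.5)))   # '1,234.50'
print(repr(move_edited(7.25)))     # '    7.25'  (leading zeros suppressed)
```

In COBOL that formatting lives in the data declaration, written once, rather than being scattered through the procedure code.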
 
Google:

According to available data, a significant number of large companies, particularly in the finance, insurance, and government sectors, still use COBOL for business operations, with estimates suggesting that around 70% of critical business logic and information is written in COBOL, impacting a large portion of Fortune 500 companies; a Reuters study even reported that 43% of banking systems still rely on COBOL. It's still used extensively: more than 95% of ATM swipes and 43% of banking systems are written in COBOL. Experts estimate that COBOL systems support more than $3 trillion in daily commerce through transaction processing. In fact, one study found more than 800 billion lines of COBOL code in daily use!
 
Google:

According to available data, a significant number of large companies, particularly in the finance, insurance, and government sectors, still use COBOL for business operations, with estimates suggesting that around 70% of critical business logic and information is written in COBOL, impacting a large portion of Fortune 500 companies; a Reuters study even reported that 43% of banking systems still rely on COBOL. It's still used extensively: more than 95% of ATM swipes and 43% of banking systems are written in COBOL. Experts estimate that COBOL systems support more than $3 trillion in daily commerce through transaction processing. In fact, one study found more than 800 billion lines of COBOL code in daily use!
Hi Jimmy! It's been a while, and I'm glad to see you are still kicking it here on the forum.
 