
Decided to start learning C

Low-end MCUs are generally best programmed in assembly, since speed and memory are minimal and you may be counting cycles for timing.

I'm actually kind of amazed how close C comes to assembly on the AVR 8-bits; if you need cycle *exactness* then, yeah, either pure assembly or slapping in some embedded ASM is a thing you'll need to do. But it's still pretty incredible that you can use a "higher-level" language on an eight-bit CPU and still do "a useful thing" several million times a second.
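For flavor, here's a minimal sketch of mixing the two with avr-gcc (assuming an ATmega-class part where writing a 1 to a PINx bit toggles the pin; the pin choice is arbitrary):

Code:
#include <avr/io.h>

int main(void) {
    DDRB |= _BV(DDB0);                /* make PB0 an output */
    for (;;) {
        PINB = _BV(PINB0);            /* hardware pin toggle on modern AVRs */
        __asm__ __volatile__("nop");  /* single-cycle pads, for when the */
        __asm__ __volatile__("nop");  /* loop timing has to come out exact */
    }
}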
 
Last year I bought a C++ book from Amazon and it came from Walmart. It was supposed to have the CD/DVD with it but I never received it. Still going round and round with Amazon and I don't think I'm going to win.

what is the name of the book? there's the possibility I have the cd/dvd.

yeah, buying used books and expecting to get the disk usually results in melancholy. Once in a while I'm pleasantly surprised though. That is, if it/they don't wind up cracked in half!
 
I've played around with BASIC since about 1979 but never really got super deep into it. A few years ago, I bought a book about assembly, but couldn't stay awake reading it. So, late in September, I started looking for online courses for programming. I stumbled across "C Programming with Linux" from Dartmouth College. I'm now three courses into a seven-course program and learning a LOT! Has anyone here checked this out or completed all seven courses? I'd love to hear feedback and suggestions as to where to go after completion. C++? Python? Java? Other C courses?

I really like C for programming on old boxen and for gnawing on the bare metal. It's low-level enough that you can kind of intuit how it's going to look in assembly language as you write it, but it's a lot easier to read and way more portable.

The good ol' K&R book was updated for ANSI C some years (decades?) ago. That was the book that I cut my C-teeth on, and I still like it best. It's a no-nonsense book that just tells you what you need to know in an understandable and straightforward way, without a bunch of extra stuff.

Do take care with pointers and buffer overflows, though. There are some compilers that can somehow insert a piece of protected memory at the end of all your arrays/etc, so if you overflow them it will raise a segmentation fault instead of just randomly smashing other variables and continuing. I can't remember what it was called or how it worked, though.
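(It might have been Electric Fence, which used guard pages; these days AddressSanitizer in gcc/clang does much the same thing with "redzones" around each allocation. A tiny demo of the idea:)

Code:
/* overflow.c -- compile with: gcc -g -fsanitize=address overflow.c
 * The out-of-bounds write below aborts with a report instead of
 * silently smashing whatever happened to be next in memory. */
#include <stdlib.h>

int main(void) {
    char *buf = malloc(8);
    buf[8] = 'X';   /* one past the end -- ASan traps this */
    free(buf);
    return 0;
}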

And be careful and systematic with your dynamic memory allocation. I find it useful to use a malloc replacement that keeps track of all your allocations and then complains at exit if you forgot to deallocate anything.
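A toy version of the idea, just for illustration (the dbg_* names are made up; real trackers also record the file and line of each allocation via macros):

Code:
#include <stdio.h>
#include <stdlib.h>

static long live_allocs = 0;

static void *dbg_malloc(size_t n) { live_allocs++; return malloc(n); }
static void  dbg_free(void *p)    { if (p) live_allocs--; free(p); }

static void leak_report(void) {
    if (live_allocs != 0)
        fprintf(stderr, "LEAK: %ld allocation(s) never freed\n", live_allocs);
}

int main(void) {
    atexit(leak_report);      /* complain at exit, as described */
    char *p = dbg_malloc(16);
    (void)p;                  /* "forgot" dbg_free(p) -- gets reported */
    return 0;
}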

(Both of these debugging aids to be removed in production builds, of course.)

C doesn't try to save you from yourself in either of these regards. I read somewhere that like 90% of C debugging time is spent dealing with memory allocation and buffer overflow issues, so if you can avoid them to begin with or make them easier to notice, you'll save yourself a lot of time and hair-pulling.
 
Back in the day Ada was the thing at the Pentagon. Might still be around in some areas.

I still have nightmares about Ada. Oh god. Such an unpleasant language named after such an interesting historical figure. Poor Ms. Lovelace's legacy is now forever tainted by the language. :3
 
CUTLASS was much easier than Ada to write real-time control programs in; a shame it wasn't widely promoted either. So easy to use and read.

A task could be scheduled to run on a clock, and a three-term controller was as simple as writing a=INCPID(Gain,IAT,DT). It automatically handled setting up all the dependent histories for the control functions and provided simple-to-code state machines for sequence controls, filters, I/O handling, etc.
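I don't have the actual CUTLASS INCPID definition to hand, but for flavor, here's roughly what that one-liner hides when you hand-roll an incremental (velocity-form) PID in C, histories and all:

Code:
/* Illustrative only -- not the CUTLASS INCPID algorithm. */
struct inc_pid {
    double kp, ki, kd;   /* gains */
    double e1, e2;       /* previous two errors (the "history") */
};

/* Returns the CHANGE in controller output for this sample period. */
double inc_pid_step(struct inc_pid *s, double error, double dt) {
    double du = s->kp * (error - s->e1)
              + s->ki * error * dt
              + s->kd * (error - 2.0 * s->e1 + s->e2) / dt;
    s->e2 = s->e1;
    s->e1 = error;
    return du;
}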

Still in use too.
 
What I have is "Visual C++ 6" for Dummies (meaning me). It's written by Michael Hyman & Bob Arnson (WTFRT2 anyway?). There is a chapter that explains the CD usage and a number to call at "Hungry Minds Customer Care", which I just now discovered. I know, I know, everyone is going to say I got the wrong book, but my tutor was off that day.
 
Well, the ISS hasn't fallen out of the sky yet and Nvidia is coding its automotive stuff in Ada, so it's still around. It's not a language that lends itself to coding "quickies". It's not a bad language for mission-critical stuff, unlike, say, C.

C has been described by many people as a "high-level assembly language", and indeed the original K&R C is a good match to the PDP-11 instruction set. It's C and its descendants that have had a strong influence on computer architecture. For example, ISAs without a stack architecture used to be quite common. Not today. Legacy FORTRAN certainly doesn't require one.

Any JOVIAL programmers out there? That was another good one for embedded stuff--and easy to read and program, but mostly used by the military. Sad that it's fallen out of favor.
 
OP here! Wow. Lots of great replies here. Thanks guys!! I've only quickly skimmed over all of them! One of the questions that came up was asking what hardware I wanted to program for. The answer to that is, well, mostly older stuff like PC XT and AT such as my 5155 and my 5162. The 5162 dual boots to DOS and XENIX. I have played a very little with a C compiler on XENIX and would like to do more there. Also, I've been playing with Arduino and other microcontrollers. Lately, I've been playing with Soarer's converter on a Pro-Micro, converting a Model F XT keyboard to USB. In fact, that's what I'm typing on now. I found reverse-engineered source code for Soarer's converter and have been able to compile it perfectly for the Pro-Micro. I would like to tweak it so that it can also convert to AT protocol as well as USB, so that I can connect the same XT keyboard to an AT computer or a newer USB-capable machine without replacing the firmware completely on the Pro-Micro.

Someone mentioned that I wouldn't learn much C from seven classes. I think he misunderstood. It's not seven classes. It's SEVEN COURSES. Each course has a bunch of lectures and tests, with a final exam in each course. The edX program that Dartmouth College is using is pretty cool. The lectures are done with Code Cast. If you click that link I provided and cut and paste the following tiny bit of code, you can compile and run it to see the output.

Code:
#include <stdio.h>

int main() {
    printf("hello, world\n");
    return 0;
}


It's a pretty good teaching tool. The main reason for going with an online course is that I need some structure and some goals to strive for, to try to get a good foundation to build upon. The great thing about the Dartmouth program is that they get the student coding right away and keep building to more advanced programming problems. I don't yet understand enough to manipulate the code for Soarer's converter, but I think I'm making headway.

Greg
 
Don't rule out Turbo Pascal for vintage-only programming. Loads of good learning materials for it. As well as Turbo C++. And if you weren't aware, not to insult your intelligence, any C++ compiler can gobble straight C code, but it may need its specific syntactical rules obeyed.
 
Would you please explain what "mission-critical" means as far as programming is concerned?

ziloo :mrgreen:

At its simplest: if the program crashes, people die (or the wrong people die, if it's a missile control system, I suppose), though it covers any application or system that, if it fails, brings down the whole operation.

So running an airliner on Windows 95 would be interesting every time the BSOD came up.

I remember reading about a brand new plane that needed a full throttle test against the brakes before delivery, only the software was too clever and refused to allow full throttle with the brakes on. So they applied an override to that function and engaged full throttle, however the software then had a second loop that assumed that with full throttle applied, the brakes should not be engaged and released them. The brand new plane then rammed into the adjacent building.
Mission Failed !

At work we use a CUTLASS control scheme to run the reactor and its feed & steam systems. Though in this case, failure of the computer system would lead to a reactor trip rather than a plane falling out of the sky. We do have a second computer system running the shutdown cooling sequence, but even that's backed up by a redundant relay-logic sequenced cooling system.
 
I suppose it does. If they had a system that coordinated all the activities and it stopped the business, then yes, it would be mission-critical.

I believe all trading was (is?) done using GPS clocks to order trades and account prices, and I believe there was a bit of a worry that GPS had become mission-critical for all world trading.

Not sure if anything has changed, so a debris storm in their orbit would be bad news.
 
Someone mentioned that I wouldn't learn much C from seven classes. I think he misunderstood. It's not seven classes. It's SEVEN COURSES. Each course has a bunch of lectures and tests, with a final exam in each course.

My basic point is this: at the end of the seven courses, you will ideally have a solid introduction to C syntax, its primitive data types, control structures, and memory model.

All of this is very nice.

But that doesn't mean you'll be able to program a computer. You might be able to understand and read code. (And don't take this personally, I don't mean YOU you, or anything like that; I'm talking in general.)

Computer programming is beyond syntax, far beyond syntax.

And, bluntly, you have to apply it, to problems that you care about, that interest you in order for it to stick.

I look back at my father, who lamented the limitations of variable names in BASIC. He used the variable A, then the variable AA, finally the variable AAA, and ran into trouble, because the early BASICs tended not to distinguish beyond the first two characters.

Now, anyone of experience would decry that what he wanted to do was Wrong. That he was coding the Wrong Way, and had unreasonable expectations.

His code was awful, but that doesn't mean it was the Wrong Way. As the Perl community likes to embrace as a first-class concept, "There's more than one way to do it". And, IMHO, end-user programming should be encouraged. In time, if he stuck with it, wrote more programs, used them for any length of time, I'm sure my Dad would have learned that naming all of your variables A, AA, AAA, B, BB, BBB would probably end up being a bad idea. But the best way to learn that is by understanding what's wrong with it, rather than just parroting some rote principle you heard somewhere.

It's one of my quibbles with the whole "Best Practices" thing. Many embrace them without understanding why. And if they did understand, they wouldn't necessarily need lists of Best Practices to follow. And, IMHO, they should understand.

As an anecdote, I was talking to a lady who had written a quite large data processing system, all in a BASIC that supported only 2 letter variable names. She was quite proud that Z9 was used as the global error code in her programs. That was a standard of consistency for her system. No doubt there were others that made best use of the limited variable naming space. Things we obviously take for granted.

Programming is a craft to be practiced. Your courses will tell you about the saw, the hammer, the drill, etc. of the C world. But they won't make you a carpenter. Experience is the only way to get that. Which is why it's important to write code. Write early, write often, mess things up, "do it Wrong". Results over process.

Which is also why you need to code things that interest you. With my Dad, it was either engineering calcs or financial/stock studies, taking formulas and such from books and trying to apply them. I knew one guy who leased a VAX 11/750 (which was larger than the VAX we had at the office) to do commodities trading analysis. He did it all in BASIC; I never saw his code. He was an engineer. By a number of "quality" standards, I imagine his code was "bad", but it gave him the results he wanted, and that's all that mattered.

But I will tell you he dropped it like a hot potato for a PC running Lotus.

So, to learn C, you need to apply C. Start with a blank file, start typing in code, and bumble your way through things until stuff works for you.

While software is changeable, code has momentum. Early decisions can affect the lifetime of a project. Which is why early on it's important to throw stuff away wholesale and start again, so as to not bring mistakes with you. But don't let that scare you from doing whatever you think will work. You will know when something becomes a problem. Because it then becomes YOUR problem. And those are the best lessons.

I'm a homespun, organic coder. Outside of fundamental FORTRAN syntax and, most importantly, I/O, I can honestly say a class has never taught me much of anything. It's all been book learning in the trenches, with no fear of applying stuff and making it work. I tried to learn FORTRAN on my own; we had a copy for the TRS-80, but I couldn't figure I/O out. It was all talk of I/O channels, and files, and whatnot, and I just wanted "Hello world" and "Guess the number". Couldn't figure that out when what few books I had were talking about tape drives.

I learned BASIC, then started writing BASIC in FORTRAN, FORTRAN in Pascal, Pascal in C, overlaying my knowledge of the old ways on to the new ways.

So, programming is a craft: the more you do, the better you get. Approach it without fear; you can't hurt anything.
 
I am all in favour of you learning a new language.

I was an assembler, BASIC, and Pascal coder and then got into Ada. At university we had to learn FORTRAN as part of the course.

When the Amiga A1000 came out - that was the point at which I had to start learning 'C'.

Now, I write most of my 'home spun' code in 'C'. Most machines have a 'C' compiler of one form or another available.

After you have done your courses and played with C a bit, I would urge you to read the book "C Traps and Pitfalls" by Andrew Koenig. This is an excellent book, and shows you how easy it is to write valid code that will get you into trouble.

Most modern compilers will detect some of these issues - but only if you switch the appropriate warnings and errors on...
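For example, with gcc (other compilers have their own equivalents), turning the warnings up and making them fatal catches classics like assignment-where-you-meant-comparison at build time:

Code:
/* pitfall.c -- compile with: gcc -Wall -Wextra -Werror pitfall.c */
#include <stdio.h>

int main(void) {
    int x = 0;
    if (x = 1)                  /* assignment, not ==; -Wall flags this */
        printf("always taken\n");
    return 0;
}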

My motto is "A C compiler warning is a run-time error waiting to happen"...

Dave
 
An occasional topic on the Usenet C language forum was Dennis Ritchie posting a snippet of code with the notation:

"What does this do?"

Occasionally, the answer would be "we don't know". ANSI cleared some of those up. The point is that C was designed as a "quick and dirty" language and not one that was lexically unambiguous.

Lexically, C is a minefield. Leave out a semicolon or insert one where it doesn't belong and you've got trouble. Substitute a comma and dreadful things can happen. That's one reason to turn on all of the warning levels when you write code. If it warns you of something, it's best to look at it carefully, since locating the issue in running code can be a nightmare.
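A couple of classic illustrations (my own, not from comp.lang.c):

Code:
#include <stdio.h>

int main(void) {
    int i, x, total = 0;

    for (i = 0; i < 10; i++);   /* stray ';' -- the loop body is empty */
        total += i;             /* runs ONCE, after the loop, with i == 10 */

    x = (1, 2);                 /* comma operator: the 1 is evaluated and
                                   discarded, x gets 2 */

    printf("%d %d\n", total, x);    /* prints "10 2", not "45 2" */
    return 0;
}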

Also, something not mentioned is "coding standards". Have some, so that code you write 30 years from now is still recognizable as yours and readable. I use as my own the CDC CPD coding standards of the 1970s, mutatis mutandis. Comments are critical. Say what you're going to do, say what you're doing, and then say what you've done. The thousands of lines of uncommented code in Linux drive me nuts.

I'm not a fan of artificially strict standards; I think that Microsoft's "Hungarian" style guide is a disaster.
 