
TINY memory model C, compiler recommendation?

Another thing to bear in mind is that many C compilers include some safety checks by default. For example, MSC by default issues a stack-checking call at the entry of every routine. (This is where you get the "STACK OVERFLOW" error message).

However, this and other things can easily be disabled (e.g., the /Gs compile option).
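For what it's worth, MSC can also turn the check off per-function rather than globally -- a minimal sketch, assuming the check_stack pragma works the way I remember it (check your compiler's docs):

/* Disabling MSC's per-routine stack probe.
   Globally:     cl /Gs prog.c
   Selectively:  the check_stack pragma, as below. */

#pragma check_stack(off)     /* no stack-check call for functions below */

void inner_loop(void)
{
    /* time-critical code with a known, small stack footprint */
}

#pragma check_stack(on)      /* restore checking for everything else */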
 
Some pointers:
Some good ones there. Thanks.

1) Careful: it uses unsigned chars by default. On x86 it is more common to use signed chars, but there's a compiler switch for that. So if you use code that worked with MSC or BC++ or such, and it doesn't work in OpenWatcom, see if it's your chars.
I rarely if ever have use for sign on a byte-width element, and the defaulting to signed never made a lick of sense to me; it was one of the things that pissed me off about C and made me favor Pascal, with its distinction between byte, shortint, and char. If yer gonna call it char, it should be unsigned... of course Joe forbid anyone use the word "byte" in C. You'd almost think they were worried about still supporting 6-bit systems or something.
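The classic way the signedness difference bites you, as a minimal sketch (the -j switch for flipping OpenWatcom's default to signed is from memory, so double-check it):

/* Plain char signedness is implementation-defined: OpenWatcom
   defaults to unsigned, MSC and BC++ to signed. */
#include <stdio.h>

int main(void)
{
    char c = (char)0x80;    /* bit pattern 1000 0000 */

    if (c < 0)
        printf("char is signed here (c = %d)\n", c);
    else
        printf("char is unsigned here (c = %d)\n", c);
    return 0;
}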

This of course is the same thing that led to the C99 standard types that remove all the ambiguity, which I usually include defines for in compilers that don't support them (see the sketch below).

I'll take uint8_t, int8_t, int16_t, int32_t, etc., etc., over the original C standard types any day. Particularly with the asshattery that you never know what the *** the compiler is going to give you when you say just plain "int". I HATE ambiguity in programming languages.

That was one of those long overdue changes.
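For compilers without <stdint.h>, those fallback defines are a one-time five-minute job -- a minimal sketch, assuming the usual 16-bit int / 32-bit long of MSC, BC++, and OpenWatcom (the guard name is mine):

/* my_stdint.h -- C99-style fixed-width types for pre-C99 DOS compilers */
#ifndef MY_STDINT_H
#define MY_STDINT_H

typedef signed char    int8_t;
typedef unsigned char  uint8_t;
typedef signed int     int16_t;    /* int is 16 bits on these targets */
typedef unsigned int   uint16_t;
typedef signed long    int32_t;    /* long is 32 bits */
typedef unsigned long  uint32_t;

#endif /* MY_STDINT_H */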

But that's like float and double. Makes me feel like ordering a soda at a fast food place where the ONLY sizes are medium, large, and extra-large... or worse, walking into some hipster joint like Starbucks and dealing with their artsy-fartsy names and ending up storming out like a second-rate Denis Leary just because you asked for coffee-flavored coffee...

"Whatever the hell that is"

Which is starting to happen to me IRL -- went into a Dunkins a few months ago and asked for a dozen... lip-pierced, nose-pierced, twig-tattoo-on-her-face-that-made-her-look-like-she-needed-a-shave skank, reeking of either patchouli or the worst BO ever (there's a difference?), behind the counter says "A dozen what?"

... and get off my damned lawn! :p

2) I've had some inconsistent results when using floating point. Couldn't quite put my finger on it, but sometimes it seems the floating point precision/rounding is very bad. I think there are some bugs in its x87-backend. I had some routines that initialized things like sin-tables, sqrt, 1/x-related stuff... And my code broke in OpenWatcom, while it worked in BC++ and MSC. I ended up just precalcing the tables on another system, and including them in the binaries, as a workaround.
I'm assuming this is emulated, since there would be no reason for there to be a difference with hardware unless the actual routines use a different approach. Usually I EXPECT certain routines -- trigonometric functions in particular -- to NOT be compatible across different compilers, just due to implementation differences.

Thankfully in Paku Paku -- or really any game I would be writing -- the only need I have for that is distance checking, and I'm using a lookup table for that, since the range (coordinates divided by 3, one unit per 3x3 cell) fits in an 8x8 table. (Well, I made it 8x8 to simplify the math; the actual max distance is 7.) To speed that up I of course do a box check first before diving for the table -- see the sketch below.
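Roughly this shape, in C terms (a hypothetical sketch -- names and details are illustrative, not the actual Paku Paku source):

/* distance check via box test + 8x8 lookup table */
#include <stdlib.h>            /* abs() */

#define MAX_DIST 7

static unsigned char dist_table[MAX_DIST + 1][MAX_DIST + 1];

/* fill once at startup: dist_table[dy][dx] ~= sqrt(dx*dx + dy*dy),
   rounded up, via a dumb integer square root */
void init_dist_table(void)
{
    int dx, dy, d;
    for (dy = 0; dy <= MAX_DIST; dy++)
        for (dx = 0; dx <= MAX_DIST; dx++) {
            for (d = 0; d * d < dx * dx + dy * dy; d++)
                ;
            dist_table[dy][dx] = (unsigned char)d;
        }
}

/* cheap box reject first; the table is only consulted when a hit is possible */
int within_range(int x1, int y1, int x2, int y2)
{
    int dx = abs(x1 - x2) / 3;    /* pixels -> 3x3 tiles */
    int dy = abs(y1 - y2) / 3;

    if (dx > MAX_DIST || dy > MAX_DIST)
        return 0;
    return dist_table[dy][dx] <= MAX_DIST;
}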

Blaine (aka Clyde from the real game -- last ghost out of the pen) has some funky logic: if he gets within 7 tiles (a tile is 8x8 in the real Pac-Man, 3x3 in mine) while in "chase" mode, he switches back to "return to corner" behavior until he's farther away than that, at which point he switches back to chase. This makes it hard to get him close enough to catch after eating a power pellet unless it's the one in his corner (bottom left).
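In C terms the mode flip is about this much (again a hypothetical sketch, all names mine):

/* Blaine's keep-away behavior: within 7 tiles in chase mode he heads
   back to his corner, and only resumes chasing once he's farther away */
enum ghost_mode { MODE_CHASE, MODE_CORNER };

struct ghost {
    enum ghost_mode mode;
    /* position, direction, etc. */
};

void update_blaine_mode(struct ghost *g, int dist_in_tiles)
{
    if (g->mode == MODE_CHASE && dist_in_tiles <= 7)
        g->mode = MODE_CORNER;      /* too close: back off */
    else if (g->mode == MODE_CORNER && dist_in_tiles > 7)
        g->mode = MODE_CHASE;       /* far enough: resume the chase */
}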

3) There is a bug in the FPU detection routine which locks up an 8088 or 8086 system when no FPU is installed. I have patched the libc to fix this issue (it doesn't impact machines with an FPU, or 186 and higher, so you can use these libs for any 16-bit target).
Is this an issue if you have ZERO plans to use any floating point code and are not declaring any variables that would use it, or is the compiler too stupid to recognize that and set up for it anyways?

I'll have to test that one.
 
Is this an issue if you have ZERO plans to use any floating point code and are not declaring any variables that would use it, or is the compiler too stupid to recognize that and set up for it anyways?

I'll have to test that one.

I think the compiler/linker are smart enough to leave the FPU routines out of it if you don't use any float code at all. But as soon as you put 'float test' in your code somewhere (or include some library or header that does), the code will be pulled in.
So, if you're very careful, you could get away with the original ones...
But otherwise, if you just use OpenWatcom 1.9 like I did, you could just use my libraries. I just modified them and rebuilt them, so they're not binary patch-jobs, and therefore not bug-prone.
I can give you the modified sources if you want.
 
Thank ye most kindly.

-- edit --

Nevermind the rest... damn, I've been away from even touching 8087 code for WAY too many decades.
 
Just an update... **** C and the source it code in on.

I'm a Wirth language guy; C's needlessly and pointlessly cryptic syntax has always been a turn-off for me, but the sloppy memory management, convoluted library and include system, and just plain generally being unsuited for programming on 16-bit targets has sent me running and screaming like an idiot back into coding raw x86 assembly's arms.

I've said it for years, I'd sooner hand-assemble 256 bytes of RCA 1802 machine language and enter it one bit at a time on toggle switches, than deal with trying to debug 100 lines of C code.

OpenWatcom was the best of what I tried... and by the time I was 10% into the project I was just "screw this" over the stupid agonizing way the language works.

How the hell do people make actually useful software with this crap? The language is just so intentionally obtuse and addle-minded it's damned near DESIGNED to force bad coding habits, create memory leaks, and result in programs that are about as stable as a bi-polar epileptic off their meds!

Again, I REALLY don't think this is a joke:
https://www.gnu.org/fun/jokes/unix-hoax.html

It's like the unholy trio were TRYING to take programming -- a relatively simple thing -- and make it as needlessly and pointlessly difficult as possible. I often wonder if they lost sight of the concept that one of a high-level language's jobs is supposed to be to make things easier.

So... big boy pants time, gone back to raw x86 assembler using NASM... and in ONE DAY of doing so I'm already further along than I was in nearly two weeks of dicking around with C.

Of course, since I'm using a TINY memory model (made my own SECTION declarations for NASM rather than using ORG), a lot of the overhead has up and disappeared. LOVE not having to dick around with the stack and thereby having BP free to do whatever the hell I want, to say nothing of not dealing with segment overrides for anything except reading the BDA and writing to video.
 
How the hell do people make actually useful software with this crap? The language is just so intentionally obtuse and addle-minded it's damned near DESIGNED to force bad coding habits, create memory leaks, and result in programs that are about as stable as a bi-polar epileptic off their meds!

It's not the language; it's the coders. It's perfectly possible to code clear-as-glass C, APL, FORTRAN, Ada, COBOL, PL/I, 7700 Autocoder, FLOW-MATIC, RPG, what-have-you.

By the same token, any of the above can be made obtuse in the extreme.

Really. Find some good code and study it.
 
I'm surprised that he hasn't designed his own perfect language, compiler, run-time libraries, operating system, etc. Nothing ever seems to be good enough ...

Seriously, grow up. C and C++ are used all over for a reason. If it's not your cup of tea, fine .. but the rants are nutty.
 
the sloppy memory management, convoluted library and include system, and just plain generally being unsuited for programming on 16-bit targets has sent me running and screaming like an idiot back into coding raw x86 assembly's arms.

If C's (lack of) automatic memory management is your problem, I'm not sure how moving to assembly is going to help. Likewise with includes/libraries: in both C and assembly you've got an include system which is just textual insertion, and the linker for using separately-compiled libraries. As for "generally being unsuited for programming on 16-bit targets" -- C was originally developed on and for the PDP-11, which is a 16-bit machine. If there's a specific problem you're having with using C for 16-bit x86, then perhaps one of us can suggest something which could help -- these compilers have some language extensions (e.g., __far pointers) which might relieve some of the pain.
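For example, this is about all it takes for a small-model program to poke CGA text memory directly -- a minimal sketch in OpenWatcom spelling (MSC and Borland spell the keyword far/_far, and Borland keeps MK_FP in <dos.h>):

/* write a character + attribute straight into CGA text memory */
#include <i86.h>                 /* MK_FP() in OpenWatcom */

void poke_screen(void)
{
    char __far *scr = (char __far *)MK_FP(0xB800, 0);

    scr[0] = 'A';                /* character at row 0, column 0 */
    scr[1] = 0x1F;               /* attribute: white on blue */
}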

I don't want to put you off from writing pure assembler if that's the course you're set on - if you have the patience to write everything in assembler then your programs will run better for it, but there is a big cost in time and sanity.

How the hell do people make actually useful software with this crap? The language is just so intentionally obtuse and addle-minded it's damned near DESIGNED to force bad coding habits, create memory leaks, and result in programs that are about as stable as a bi-polar epileptic off their meds!

C has some warts to be sure (such as separate compilation, NULL pointers, function pointer syntax and the parsing of "char * p;") but those of us who have grown up with it tend to find it neater and more regular than Pascal (though perhaps this is because we've learned to think in C rather than a property of the language itself). Still, if there's a specific problem you have we might be able to suggest some patterns or ways of thinking about it which could help.
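For instance, two of those warts in a nutshell (a quick sketch):

/* 1) The * binds to the declarator, not the type:
      here p is a char pointer but c is a plain char. */
char *p, c;

/* 2) Raw function pointer syntax is unreadable; a typedef tames it. */
typedef int (*compare_fn)(const void *, const void *);

compare_fn cmp;   /* vs:  int (*cmp)(const void *, const void *); */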
 
I wrote Pascal for years and had no trouble switching to C. I can't remember any struggle at all. As has been said already, bad code is a programmer issue, not a language issue. But I could do without C's handling of ** variables (pointers to pointers). The rest is fine with me. Pascal and other structured languages too. As a replacement for C I don't see many options. Can't stand C++. D seems to be a good alternative, but I have way too little experience with it to be certain.
 
For some applications, C++ can be wonderfully clear--and surprisingly compact. It's a tool and works very well if used properly, like any other tool.
 
Apologies folks, I think I was just having a week or two of EVERYTHING pissing me off.

I dunno, maybe it's just because I'm working in DOS... on '80s and early '90s microcomputer targets C always feels bloated and unnatural, as if it's DESIGNED to make you screw up. I mean, I use C on AVR and ARM targets regularly, and use the various languages that are little more than C in a frilly dress, like PHP, JavaScript, etc., daily... I just don't have the problems there that I do trying to work with C in DOS. C syntax has always annoyed the hell out of me with some of its mannerisms, and just trying to get an executable out of C still feels like banging my head against a brick wall -- but nowhere does that seem worse than when trying to work with DOS.

It's still not a language I would use by CHOICE, it's just industry-wide the choice has often been made FOR YOU.

In any case I've seen significantly more progress since I dumped all that and went straight to the assembler, the whole assembler, and nothing but the assembler. No more struggling with ambiguities, bizarro-land interpretations of memory management, the overhead of garbage collection I'm not actually using, etc., etc...

I'm actually a little surprised, as I didn't think C or Pascal were introducing THAT much overhead, but I'm already caught up to where I was two years ago with version 2.0 in Pascal (before I took time off due to health concerns), at a third less code footprint... DESPITE finally having integrated the extra sound card support (PS/1, Innovation SSI-2001) into it (it was a standalone test before). I mean, I know there's a lot of overhead in dumping stuff on the stack and preserving registers, but I wasn't making THAT many function calls.

Big test is going to be if I'm actually squeezing out that extra bit of speed needed to run on an unexpanded, unmodified 128k Junior at full speed... though since I'm going to be WELL below that 64k target size, I'm wondering if I could make a cartridge for it. The way it works, the cart would have to copy itself to RAM before running, but it would be an interesting thing to try and make.

Though again, people use terms like "clear", "compact", or "properly" and I can't help but think I have a different definition of these words... I see it daily in web development, where people slop together multiple 100k libraries of scriptardery and some rubbish 100k CSS "framework" to then write their own 500k of JavaScript and 100k of CSS, which needs 60 to 200k of markup, resulting in a page that spans four to ten dozen separate files and needs some complex back-end pre-processing asshattery to even be built into a deployable result -- all to do the job of 48k COMBINED (HTML+CSS+SCRIPTS) in three files... and then they have the unmitigated gall to call that process "easier"... because having to learn four separate frameworks to then write ten to forty times the code was "easier". OBVIOUSLY...

I think my disgust to the point of nausea with the sleazeball scam artists who use garbage like bootcrap or jquery boiled over into my attempt to find something better than TP7 for writing for a small DOS target. I was already in a "mood" before I took a break from webdev to look back at this project... only to be confronted by the same type of what looks to me like utter and complete nonsense: developers trying to sell working harder as working smarter, and intentionally trying to make the simplest of tasks as difficult as possible.

First blasted lesson I learned in programming: the less code you use, the less there is to break. A lesson that always seemed lost back in the day on the "Big Iron" *nix crowd with their charging by the K-LoC (there's a reason I cleared a year's income in one summer as a pre-teen with a simple script that just deleted comments and made simple optimizations in interpreted languages, speeding up programs), and even more lost on today's developers with their idiotic "just slap together other people's code and blindly hope it's secure and works even though I don't understand a line of it" mentality.

Emphasis on the mental.

Bottom line, I think from now on for DOS development I'm just gonna stick to assembly. ESPECIALLY when my result is going to fit the TINY memory model of CS=DS=SS. It seems that trying to use high level languages to "simplify" the process was having the exact opposite result.
 
Your poison. I've written my share of assembly and found that the overhead of a good optimizing C is not onerous--and occasionally the code was better than what I could do.

So good luck with that.
 
I've written my share of assembly and found that the overhead of a good optimizing C is not onerous
Unfortunately when 8088/8086 is your target, a "good optimizing C" seems to be more myth than fact... but let's be honest, before about 1990 nobody working on microcomputers seriously gave a flying purple fish about C. It was a novelty from the mainframe world, not something you used to write software.

There's a reason Windows to this day still uses Pascal calling conventions and length delimited strings.

Though I have always found C and its syntax -- with its needlessly and pointlessly cryptic nature and lack of program flow that properly corresponds to how systems (other than big iron) work -- to be generally just a pain in the ass. There's a reason I stayed with Pascal for as many years as I did, and a reason I still curse every time I'm stuck using derivative languages like PHP and JavaScript. The very syntax of it seems carefully crafted to promote making errors that any decent language wouldn't even let get past the compilation stage.

and occasionally the code was better than what I could do
I've NEVER had that on x86 -- EVER. ARM and AVR? Sure. Again, I suspect it's the RISC/CISC divide and that old joke that's really not a joke: CISC is for people who write programs, RISC is for people who write compilers... and it really shows in the resultant code output by compilers.

I look at what compilers vomit up on x86 (particularly anything less than 32 bit targets) and have to suppress the urge to start screaming at the top of my lungs at the... dare I use the word? "Deplorable" code.
 
Unfortunately when 8088/8086 is your target, a "good optimizing C" seems to be more myth than fact... but let's be honest, before about 1990 nobody working on microcomputers seriously gave a flying purple fish about C. It was a novelty from the mainframe world, not something you used to write software.

Don't be silly. I was using Lattice C to write x86 DOS programs for the 5150. And you've obviously never fooled with the Unix/Xenix microcomputer-based systems, oh, say, Sun? x80 C, not so much, but once the 16-bitters made the scene, C suddenly became practical.

There's a reason Windows to this day still uses Pascal calling conventions and length delimited strings.

Though I have always found C and its syntax -- with its needlessly and pointlessly cryptic nature and lack of program flow that properly corresponds to how systems (other than big iron) work -- to be generally just a pain in the ass. There's a reason I stayed with Pascal for as many years as I did, and a reason I still curse every time I'm stuck using derivative languages like PHP and JavaScript. The very syntax of it seems carefully crafted to promote making errors that any decent language wouldn't even let get past the compilation stage.

I don't think it was because of Pascal that Windows did what it did. If you'd've asked BillG what he thought the best programming language was then, he'd probably have responded "BASIC".


I've NEVER had that on x86 -- EVER. ARM and AVR? Sure. Again, I suspect it's the RISC/CISC divide and that old joke that's really not a joke: CISC is for people who write programs, RISC is for people who write compilers... and it really shows in the resultant code output by compilers.

I spent years on the CDC 6000 series mainframes--many consider them to be the first RISC machines, although I don't know if Seymour, rest his soul, would have agreed. And ARM is a RISC through and through, as is MIPS. I've used a supercomputer that was far beyond any CISC machine that you've ever seen--and coding well for it was difficult. Too many ways to do the same thing leads to paralysis.

I look at what compilers vomit up on x86 (particularly anything less than 32 bit targets) and have to suppress the urge to start screaming at the top of my lungs at the... dare I use the word? "Deplorable" code.

Again, it depends on what you're doing. If you have a relatively complex task that needs doing, then the choice of languages doesn't matter as much. C is perfectly capable of generating very good code. So is FORTRAN--just ask David Kuck. The problem is that automatic optimization isn't taught well in most compiler courses.
 
C and C++ are not 'managed' languages, and have no garbage collection.
Aside from that, you can use C/C++ as a 'portable assembler', and write everything from scratch, like in assembly language. You don't have to use any libraries, and you don't have to use any startup code. Since you can easily call DOS/BIOS int handlers from C, it's not too difficult to write working programs this way. You can use DOS for file/console IO and memory management.
For more advanced stuff, such as graphics and sound, you can easily bang IO ports directly from C, so that is very similar to asm as well.
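For example, both techniques in one go -- a minimal sketch using the Watcom-style helpers in <i86.h> and <conio.h> (MSC and BC++ have near-identical equivalents):

/* DOS call + direct port I/O from C, no library fluff needed */
#include <i86.h>      /* union REGS, int86() */
#include <conio.h>    /* outp() */

int main(void)
{
    union REGS r;

    /* DOS int 21h, AH=02h: write one character to stdout */
    r.h.ah = 0x02;
    r.h.dl = '!';
    int86(0x21, &r, &r);

    /* bang an I/O port directly: CGA color select (3D9h), blue border */
    outp(0x3D9, 1);

    return 0;
}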

Using the tiny memory model is actually the easiest use-case. Since everything is in a single 64K segment, you don't have to worry about segmented memory models and special types of pointers and such. Everything is a near pointer, so effectively you're using a 16-bit linear addressing model.

Of course, you can still use far pointers to address memory outside of that region, such as video memory. But really, I don't see why any of that would be difficult at all in C. You have to pay attention to near/far pointers and segments in asm just as much, if not more.
And if you do need asm for that extra bit of performance in certain areas, you can easily use inline assembly or link in some object files or libraries built in assembly.

So I am not really sure what the specific problem(s) is/are.
 