
New to ASM, assistance setting up

A few people have stated that machine code which is directly executed on the CPU is not visible to the user.
I am not sure if this is true, but are early computers an exception to this?

I am not too clear on this area; things like .bin files have viewable machine code, right (in hex)?
Is this different from what is executed on the machine?
If, theoretically, you were able to write in machine code, would you be able to pump it into the processor without a compiler/assembler?

You mean visible like Blinkenlights? :)

With x86 (and other newer processors), people won't normally program the computer directly in machine code, and users won't normally see machine code unless they run a debugger or look at a hex dump of an executable file. But if you want, you can still program in machine code, either by creating a .COM file with your favorite hex editor or just with DOS DEBUG. It can be a good thing to try, especially if you want to understand how x86 instructions are encoded (see pages 29-36 here). But it is probably not very useful for any serious programming. x86 has a somewhat complex instruction format, and with an assembler you can do exactly the same thing (the same granularity of low-level CPU control) with much less effort.
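To make that concrete, here is roughly what a minimal "Hello" .COM program looks like at the byte level. The bytes on the left are the actual machine code; you could key them in with DEBUG's a (assemble) or e (enter) command, then save the file with n hello.com, rcx (set the length to F), and w:

0100  B4 09              MOV  AH,09     ; DOS function 09h: print $-terminated string
0102  BA 09 01           MOV  DX,0109   ; DS:DX points at the string below
0105  CD 21              INT  21        ; call DOS
0107  CD 20              INT  20        ; terminate the program
0109  48 65 6C 6C 6F 24  DB   'Hello$'  ; the string data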
 
Also, guessing at what you're asking: you might be talking about the registers that keep track of what the processor's output is. They're viewable, and using something like debug.com lets you walk through your code and watch the registers and the output, which is exactly why a debugger is useful. It helps you find the error in your machine language or assembly code.
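For example, a session with the tiny hello.com from above might look something like this (the segment value 0FD8 is just illustrative; r shows the registers, and t executes one instruction and shows them again):

C:\>debug hello.com
-r
AX=0000  BX=0000  CX=000F  DX=0000  SP=FFFE  BP=0000  SI=0000  DI=0000
DS=0FD8  ES=0FD8  SS=0FD8  CS=0FD8  IP=0100   NV UP EI PL NZ NA PO NC
0FD8:0100 B409          MOV     AH,09
-t
AX=0900  BX=0000  CX=000F  DX=0000  SP=FFFE  BP=0000  SI=0000  DI=0000
DS=0FD8  ES=0FD8  SS=0FD8  CS=0FD8  IP=0102   NV UP EI PL NZ NA PO NC
0FD8:0102 BA0901        MOV     DX,0109
-q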

I'm not a competent assembly programmer, so I did always wonder what overhead comes from showing the register stack without modifying the registers. From what I know, you'd have to pop (read) the entries, dump the data, then push them back when you're done so you don't disturb what was previously there?
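Something like this sketch is what I have in mind (the dump_regs name and the display step are just placeholders):

dump_regs:
        push ax            ; save everything we're about to clobber
        push bx
        push cx
        push dx
        pushf              ; the flags too, since printing changes them
        ; ... code that displays the saved register values goes here (e.g. via INT 21h) ...
        popf               ; restore in reverse order
        pop  dx
        pop  cx
        pop  bx
        pop  ax
        ret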
 
I was reading that you are not able to view the machine code being processed in modern CPUs because it would compromise their security, enable reverse engineering, and such.

There are some instances of this, but it's far from the rule. I can, for example, think of a few gaming consoles that store their data in encrypted form, but for regular production systems, such as those running Windows or various flavors of *nix, debugging tools are common. That, in fact, is the primary use of good old DEBUG--to step through and examine the machine code of a running program.
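For instance, pointing DEBUG at a program on disk and unassembling it shows the machine code bytes alongside the mnemonics (using the tiny hello.com from earlier; the segment value is again just illustrative):

C:\>debug hello.com
-u 100 108
0FD8:0100 B409          MOV     AH,09
0FD8:0102 BA0901        MOV     DX,0109
0FD8:0105 CD21          INT     21
0FD8:0107 CD20          INT     20
-q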
 
When computing devices were first invented (by the British), they were programmed by hard-wiring; after some time things developed to the point of loading program instructions through switches, or later punch-cards. The programmer had to manually (and I guess painstakingly) code out each instruction in order, but the end result was still worth it because of the speed at which it could execute, compared to performing the calculations by hand. So machine code is literally a sequence of codes telling the execution unit to do what you intend (hopefully!).

After some time that clearly became very tiresome, and with computers able to do more work, the assembler was created. It takes human-readable (well, to some anyway) assembly-language text and generates from it the machine code, which can then be executed. Obviously this makes things much easier to develop and maintain.
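As a sketch of that mnemonic-to-machine-code mapping, here is the same "Hello" program written as assembly source (NASM syntax is assumed here; nasm -f bin hello.asm -o hello.com should produce the byte sequence shown earlier):

        org 100h            ; .COM programs load at offset 100h
start:  mov ah, 9           ; assembles to B4 09
        mov dx, msg         ; assembles to BA 09 01
        int 21h             ; CD 21
        int 20h             ; CD 20
msg:    db 'Hello$'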

But with computers then becoming more powerful, there was the opportunity to write programming languages, presumably initially themselves written in assembler, that would generate more complex constructs than a simple 1:1 mapping of mnemonic to executable code. So, for example, a Pascal compiler will understand "for i := 1 to 10 do writeln('Hello World');" and generate the machine code to have a variable in memory, perform a loop, and write to the screen buffer as you intend.
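As a rough sketch (not the output of any particular compiler), the machine code generated for that loop might amount to something like this:

        mov  word [i], 1        ; i := 1
next:
        ; ... call the runtime's writeln routine with 'Hello World' ...
        inc  word [i]           ; i := i + 1
        cmp  word [i], 10
        jbe  next               ; repeat while i <= 10

i       dw   0                  ; the variable i, kept in memory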

With respect to modern code, one program can't just go randomly poking around in system memory to see other code any more, due to protection mechanisms introduced in the i386. But the code on disk - if you have permission - in DLLs, EXEs, etc., as Chuck says, can of course be loaded into a suitable tool for disassembly. Of course, modern programs are generally so massively complicated that doing so would be very, very difficult to understand.

HTH!
 

Wow, thank you for the explanation.
That is what I wanted to hear.
This is a very nice forum in comparison to many others;
this community is very kind.
Thanks!
 
At the expense of rubbing some the wrong way--aw, what the heck, I love rubbing people the wrong way :) --you may want to consider this:

The x86 architecture is a hack and, to my mind, has only gotten more so with time. In the evolution of today's CPUs, things have only been added, not removed. And the 8086 itself is an adaptation of the 8080 to handle 16-bit operands and larger memory spaces. The 8080 is an improvement over the 8008 (you would see an immediate resemblance to the 8008 in the instruction set and registers). So, basically, you're dealing with a high-performance racing car that still has a feed bag and buggy whip. In addition to that, you've got software which has had ages to devolve into absolute cruftiness.

If you want something a bit cleaner and perhaps more immediately gratifying, you may want to consider one of today's low-end microcontrollers. There are numerous development kits and tools out there. If you have a program bug, you don't have a lot on the line. And they're pretty cheap. There are perhaps thousands of cool applications for the Atmel AVR-based Arduino boards; TI offers several very inexpensive kits for their MSP430 line (one looks just like a USB flash drive but actually contains a tiny microcontroller board). All of the tools are PC-based and free. On the very small microcontrollers, assembly is the only supported language because of the very limited resources.

Just a suggestion that you may want to mull over.
 
It's a good point - or how about ARM; obviously I'm thinking RaspberryPi. Is there some dev kit for that which has no need for Linux?
 
I agree with the point about x86. Looking back on my assembly programming days, I probably liked the 6502 best, at least among 8-bit microprocessors. (And there were great books to learn from! I only needed 'Programming the 6502' by Rodnay Zaks.) The 6502 had that combination of being very easy and at the same time having limitations that gave you a challenge to work around: a limited number of registers, and some limits to the addressing modes. And then figuring out how to pass parameters to your subroutines (say, a string of text to a print function, to take a common one). It's more satisfying if there are some problems to solve, instead of just easy sailing.
A limited instruction set processor can in many ways be an easier starting point than a processor with tons of instructions and addressing modes. I'm sure some of those modern tiny ones would fit into that description, although I'm not really familiar with many of them.

-Tor
 
It's a good point - or how about ARM; obviously I'm thinking RaspberryPi. Is there some dev kit for that which has no need for Linux?

There are Windows-based tools for ARM, but I'm not a big fan of assembly language for that platform. AVR is easy, PIC is simple in its own way, but both are Harvard architecture. The MSP430 is von Neumann and there's even a version of the chip with FRAM instead of flash, so you don't lose the contents of memory when power goes off. There are some very inexpensive experimenter boards available. A much-overlooked MCU.
 