
16-bit, 32-bit, and 64-bit Assembly :: Future

kuphryn · Posts: 266 · Member
Hi.

I began learning 16-bit ASM three weeks ago and have been practicing it under DOS with MASM 6.x. I am very impressed with the control and simplicity ASM brings to programming. 16-bit ASM is so simple, and it gives programmers the most flexibility in terms of program design. I enjoy ASM programming and will definitely continue learning and practicing 16-bit ASM.

As I mentioned, I am learning 16-bit ASM, not 32-bit ASM. The author of the ASM book I am studying from emphasizes that 32-bit ASM is not as important or as practical as 16-bit ASM. One key reason is the presence of high-level languages such as C/C++, which make 32-bit Windows and Linux programming much easier and quicker. I definitely agree. Yes, 32-bit ASM is powerful and gives you unparalleled control, but at the cost of design, implementation, and debugging time.

Software technology changes rapidly. Consider Microsoft's C#. Companies are racing to design ever simpler and easier programming languages, at the cost of low-level control. I would like to know: what is the future of assembly language and of 16-bit, 32-bit, and soon 64-bit ASM programming? I am most interested in 16-bit ASM because, again, 32-bit and 64-bit ASM programming in Windows and Linux is really not worth the time and effort except for specific purposes such as hardware control.

Thanks,
Kuphryn

Comments

  • Darius · Posts: 1,666 · Member

    While I'm probably (okay, there's hardly a "probably" about it) not qualified to make this sort of evaluation, in my opinion 16-bit x86 assembly is dead for practical code. 32-bit won't be going anywhere anytime soon, and 64-bit will come (or already has), but it's not really a desktop solution at this point. (Obviously, if you are targeting high-end servers then this statement is pointless; however, if you are targeting high-end servers then hopefully you know what you are doing and can make your own judgement.)

    Part of your argument doesn't make sense. There are C/C++ compilers for 16-bit architectures, and there have been high-level languages since the '50s (and not inadequate ones either: FORTRAN, the first, and LISP, a very early HLL, are alive and well), certainly back when people were using the 8088 (a 16-bit processor with an 8-bit external bus). I don't really see how you can consider 32-bit/64-bit Windows/Linux 'not worth the time and effort' while seeing 16-bit as worth it.

    For application code there is very little reason for any of it to be in assembly. About the only time you'd want assembly in application-level code is when you are tightening a hotspot; a rough sketch of that follows.
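
    A minimal sketch of what that usually looks like, assuming a 32-bit flat memory model and MASM-style syntax; the routine name and register conventions here are made up for illustration. You profile first, find the hot inner loop, and hand-tune only that routine while everything around it stays in C/C++:

        ; sum_dwords: returns in EAX the sum of ECX dwords starting at ESI
        ; (hypothetical hot routine, called from high-level code)
        sum_dwords PROC
            xor  eax, eax          ; running total = 0
            test ecx, ecx          ; zero-length array?
            jz   done
        next:
            add  eax, [esi]        ; accumulate one element
            add  esi, 4            ; advance to the next dword
            dec  ecx
            jnz  next
        done:
            ret
        sum_dwords ENDP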

    "We can't do nothing and think someone else will make it right."
    -Kyoto Now, Bad Religion

  • smaffy · Posts: 20 · Member
    16-bit = small. CISC. Almost no operating system (DOS... heh?) that watches over your code so that you don't screw things up.

    32-bit = nowadays. RISC and CISC. The newer processors (Pentium and up?) look like CISCs on the outside, but internally they really work like RISCs.

    64-bit = wonderful. No one would dare build a pure CISC model.. a 16 giga-gigabyte (2^64-byte) memory space (map every file in the system into memory, pretend they are always loaded, and kill the evil file ideology??)

    ...the more bits, the faster the arithmetic (see the sketch below). I think that moving from CISC to RISC (and not just emulating CISC with microcode as "we" do today) is a bigger issue than how many bits we've got!
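
    A small sketch of the "more bits" point, assuming MASM-style x86: adding two 32-bit numbers on a 16-bit 8086 takes a register pair and two instructions, while a 386 or later does the same thing in one.

        ; 32-bit addition on a 16-bit 8086: DX:AX = DX:AX + BX:CX
        add ax, cx        ; add the low words
        adc dx, bx        ; add the high words plus the carry
        ; the same addition on a 386 or later: one instruction
        ; add eax, ecx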

    Too bad that we are stuck with these stupid i386-compatible chips.. I want to use pure RISCs!!

    ...why am I writing this? You guys and girls already know all this.. oh, well.
  • kuphryn · Posts: 266 · Member
    Okay.

    You made a very good point about how recent versions of Windows make little use of 16-bit ASM. I believe 16-bit ASM is good for use in pure DOS and for communicating directly with hardware in pure DOS mode. In that case, where does 16-bit ASM come into play? Can you use 32-bit or 64-bit ASM to communicate with hardware?

    Kuphryn
  • Darius · Posts: 1,666 · Member
    16-bit x86 assembly is a subset of 32-bit x86 assembly with a different memory model. Do you think you need 16-bit assembly to do an 'in al, dx', or to load and store to a memory address? What keeps you from having direct access to the hardware isn't the available instruction set, it's the operating system.

    The x86 has four rings of protection (though typically only two are used). Ring 0 (the most privileged ring) is what the OS and (typically) device drivers run at; at that privilege level everything is allowed. Application-level code runs at ring 3 (the least privileged level). At this level direct port I/O, processor data structures, and virtual memory addresses may be protected. If a user (ring 3) process tries to access or modify these resources, a fault is signalled and the OS decides what happens from there: it may allow it and perform the operation on behalf of the process, or it may disallow it and signal an error to, or terminate, the process. A small sketch follows.
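
    A minimal sketch, assuming MASM-style syntax and a plain real-mode DOS program: the IN instruction is the same in 16-bit and 32-bit code; whether it is allowed depends on the privilege level the OS gives you, not on the instruction set.

        ; poll the keyboard controller directly (ports 60h/64h)
        ; under DOS / ring 0 this just works; in a ring-3 Windows process
        ; the same instructions raise a protection fault instead
        mov  dx, 64h       ; keyboard controller status port
        in   al, dx        ; al = status byte
        test al, 1         ; bit 0 set -> output buffer full
        jz   no_data
        mov  dx, 60h       ; keyboard controller data port
        in   al, dx        ; read the waiting scancode/data byte
        no_data: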

    DOS is good for playing around with hardware at a low level because you can write normal applications and still have access to the hardware. However, only device drivers can and should access hardware at a low level, so for real low-level hardware manipulation the code should be in a device driver (though presumably that was your ultimate intent anyway).

    "We can't do nothing and think someone else will make it right."
    -Kyoto Now, Bad Religion

  • kuphryn · Posts: 266 · Member
    The 16-bit ASM syntax is irrelevant. I understand the protection Windows enforces on applications in user mode. I understand the use of device drivers and the privilege needed to communicate with hardware. However, I hear that under NT and newer versions of Windows such as XP, Windows controls everything. Even device drivers have to go through Windows to communicate with hardware devices.

    Okay, so let's say device drivers can communicate with hardware devices in Windows. Do we have to develop our own hardware device driver every time we want to communicate with one? Surely creating our own driver for, say, a GeForce chipset is not easy.

    Here is my point: I want to communicate with hardware devices directly. I do not care whether I am in Windows user mode, safe mode, or pure DOS.

    Kuphryn


  • Darius · Posts: 1,666 · Member
    : The 16-bit ASM syntax is irrelevant. I understand the protection Windows enforces on applications in user mode. I understand the use of device drivers and the privilege needed to communicate with hardware. However, I hear that under NT and newer versions of Windows such as XP, Windows controls everything. Even device drivers have to go through Windows to communicate with hardware devices.

    Assuming that Windows DOES mediate access to the hardware with device drivers, it probably does so transparently. That is, the device driver would be written as if it were running at ring 0. I don't have the WinXP or NT DDK in my hands, though.

    : Okay, so let's say device drivers can communicate with hardware devices in Windows.

    There is no "let's say" about it. They can. That's all they do, they would be pointless if they couldn't.

    : Do we have to develop our own hardware device driver every time we want to communicate with one?

    You don't, someone does. That's what device drivers are for.

    : Surely creating our own driver for, say, a GeForce chipset is not easy.

    As most of the difficulty of writing a device driver would come from the hardware you are using, what is your point?

    : Here is my point: I want to communicate with hardware devices directly. I do not care whether I am in Windows user mode, safe mode, or pure DOS.

    If you don't care, then use whichever you can or prefer. As I said before, if you are just PLAYING with the hardware then it doesn't really matter, and DOS would probably be preferable because you can write a normal application and still have access to the hardware; writing a device driver would require writing interface code, which, while not difficult, is annoying and irrelevant here. However, if you are trying to make something that will actually be used, then obviously writing for an obsolete system and architecture isn't the way to do it. Processor mode has hardly anything to do with it: you have the same amount of hardware access in real mode as in protected mode (actually you may have more in protected mode; the SVGA framebuffer, for example, is a lot easier to deal with in protected mode). A small sketch of the DOS case follows.
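
    A minimal sketch, assuming a real-mode DOS program and MASM-style syntax: poking the VGA text-mode framebuffer at segment B800h directly, the kind of hardware "playing" that works from an ordinary DOS application but is blocked (or must go through a driver/API) for a ring-3 Windows process.

        ; write 'A' (light grey on black) to the top-left corner of the screen
        mov  ax, 0B800h    ; VGA colour text-mode framebuffer segment
        mov  es, ax
        xor  di, di        ; offset 0 = row 0, column 0
        mov  ax, 0741h     ; 41h = 'A', 07h = light grey on black attribute
        mov  es:[di], ax   ; store the character + attribute word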

    "We can't do nothing and think someone else will make it right."
    -Kyoto Now, Bad Religion

  • kuphryn · Posts: 266 · Member
    16-bit ASM definitely has uses in pure DOS, and DOS is not an obsolete OS. 32-bit ASM is useful in 32-bit OSes such as Windows and Linux. However, C++ is easier simply because it requires less coding.

    Again, my goal is to learn 16-bit ASM because I can communicate with hardware devices directly in pure DOS mode.

    In the end, ASM, be it 16-bit, 32-bit, or 64-bit, is important and useful in many situations. Consider disassembly, for example: without knowing ASM, you cannot disassemble a program and optimize its performance. A rough sketch of what that looks like follows.
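
    A minimal sketch, assuming 32-bit x86: the kind of listing a disassembler might show for a simple counting loop, which you can only read, and tune, if you know the instruction set (the labels here are made up for illustration).

        ; count the bytes in a zero-terminated string at ESI
        xor  eax, eax          ; length = 0
        scan:
        cmp  byte ptr [esi], 0 ; end of string?
        je   found
        inc  esi               ; next character
        inc  eax               ; length++
        jmp  scan
        found: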

    Kuphryn
  • blip · Posts: 756 · Member
    I've got to add something: RISC sucks and it's annoying to program compared to CISC! CISC does more work per instruction, and if you had done your homework you would have noticed that RISC code takes up more space than the equally functional CISC code (just think about it, and see the sketch below...!).
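
    A minimal sketch of the code-density point, assuming 32-bit x86 for the CISC side; the RISC equivalent is shown only in comments, since a load/store machine cannot operate on memory directly.

        ; CISC (x86): one instruction adds a register into a memory word
        ; (counter is assumed to be a dword variable in the data segment)
        add  dword ptr [counter], eax
        ; a typical RISC load/store machine needs roughly three
        ; instructions for the same effect:
        ;   load   r1, [counter]
        ;   add    r1, r1, r2
        ;   store  r1, [counter]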