The .NET agenda...

[b][red]This message was edited by shaolin007 at 2005-1-11 9:56:47[/red][/b][hr]
[b][red]This message was edited by shaolin007 at 2005-1-11 6:23:10[/red][/b][hr]
Reading my book on C# .NET for my 'Real Time Programming' class, I came across this:

[blue]
".NET applications do not access the OS or computer hardware directly. Instead, they use services of the .NET Framework, which in turn access the OS and Hardware."
[/blue]

That right there proves that C# cannot do real time, as some of you[red] also[/red] say on this board. Also, I see that M$ is moving away from having the programmer access anything directly, probably due to all the viruses they got spanked by. I still don't get why this guy is teaching 'real time programming' with C#. I guess I'll have to have a talk with the teacher. By the way, the book I have is "Murach's C#".





Comments

  • : Reading my book on C# .NET for my 'Real Time Programming' class, I came across this:
    :
    : [blue]
    : ".NET applications do not access the OS or computer hardware directly. Instead, they use services of the .NET Framework, which in turn access the OS and Hardware."
    : [/blue]
    :
    : That right there proves that C# cannot do real time, as some of you on this board say. Also, I see that M$ is moving away from having the programmer access anything directly, probably due to all the viruses they got spanked by. I still don't get why this guy is teaching 'real time programming' with C#. I guess I'll have to have a talk with the teacher. By the way, the book I have is "Murach's C#".
    :


    That is not just a .NET issue; it applies to every program, regardless of the programming language. Even *nix has the same "feature". Under MS-Windows, the reason programs don't have direct hardware access is that the CPU is in "protected mode", which forbids such things unless the code is running in "Ring 0", which is part of the operating system. Device drivers do run there, and you can get a free Driver Development Kit (DDK) from Microsoft.
  • [b][red]This message was edited by shaolin007 at 2005-1-11 18:38:12[/red][/b][hr]
    : : Reading my book on C# .NET for my 'Real Time Programming' class, I came across this:
    : :
    : : [blue]
    : : ".NET applications do not access the OS or computer hardware directly. Instead, they use services of the .NET Framework, which in turn access the OS and Hardware."
    : : [/blue]
    : :
    : : That right there proves that C# cannot do real time, as some of you on this board say. Also, I see that M$ is moving away from having the programmer access anything directly, probably due to all the viruses they got spanked by. I still don't get why this guy is teaching 'real time programming' with C#. I guess I'll have to have a talk with the teacher. By the way, the book I have is "Murach's C#".
    : :
    :
    :
    : That is not just a .NET issue; it applies to every program, regardless of the programming language. Even *nix has the same "feature". Under MS-Windows, the reason programs don't have direct hardware access is that the CPU is in "protected mode", which forbids such things unless the code is running in "Ring 0", which is part of the operating system. Device drivers do run there, and you can get a free Driver Development Kit (DDK) from Microsoft.
    :

    [green]
    I know what protected mode is, but I was referring to the fact that C# is not even compiled to machine code; it is compiled to a 'script', so to speak, that is read by a CLR. So in turn the CLR is like the CPU in some respects. That way I figure M$ can keep tabs on how calls are made to its OS, because you have to use their code, and can restrict access even further to the point where you're not even coding anymore, you're just calling their routines. Am I wrong to think this?
    [/green]



  • : [green]
    : I know what protected mode is, but I was referring to the fact that C# is not even compiled to machine code; it is compiled to a 'script', so to speak, that is read by a CLR. So in turn the CLR is like the CPU in some respects. That way I figure M$ can't keep tabs on how calls are made to its OS, because you have to use their code, and can restrict access even further to the point where you're not even coding anymore, you're just calling their routines. Am I wrong to think this?
    : [/green]
    :
    :

    .NET is Microsoft's answer to Java: it can theoretically run on any computer or operating system that supports the .NET Framework, not just MS-Windows. .NET is not intended to be a real-time system, so you'll have to look elsewhere for that.
    : : : Reading my book on C# .NET for my 'Real Time Programming' class, I came across this:
    : : :
    : : : [blue]
    : : : ".NET applications do not access the OS or computer hardware directly. Instead, they use services of the .NET Framework, which in turn access the OS and Hardware."
    : : : [/blue]
    : : :
    : : : That right there proves that C# cannot do real time, as some of you on this board say. Also, I see that M$ is moving away from having the programmer access anything directly, probably due to all the viruses they got spanked by. I still don't get why this guy is teaching 'real time programming' with C#. I guess I'll have to have a talk with the teacher. By the way, the book I have is "Murach's C#".
    : : :
    : :
    : :
    : : That is not just a .NET issue; it applies to every program, regardless of the programming language. Even *nix has the same "feature". Under MS-Windows, the reason programs don't have direct hardware access is that the CPU is in "protected mode", which forbids such things unless the code is running in "Ring 0", which is part of the operating system. Device drivers do run there, and you can get a free Driver Development Kit (DDK) from Microsoft.
    : :
    :
    : [green]
    : I know what protected mode is, but I was referring to the fact that C# is not even compiled to machine code; it is compiled to a 'script', so to speak, that is read by a CLR. So in turn the CLR is like the CPU in some respects. That way I figure M$ can't keep tabs on how calls are made to its OS, because you have to use their code, and can restrict access even further to the point where you're not even coding anymore, you're just calling their routines. Am I wrong to think this?
    : [/green]
    :
    :

    Real time programming with C# in Windows? That's about as far from real time as it gets :-)


    The security problems in Windows haven't got much to do with direct access to hardware anymore. Virus makers are more likely to exploit weaknesses in the Windows API. And since you can still call API functions from any .NET language, any security problems in the API will still be there. And if MS built any such thing as you describe into their compiler, how would it stop people who use other compilers from making evil programs?

    I don't know how .NET generates executables, or why it is done the way it is, but I think it has little to do with security. I think the main reason was that they wanted all their languages to run with the same IDE, linker, RAD tool, etc.

    An aside: I think Windows has an undeservedly bad reputation when it comes to security, compared to, for example, Unix. It is probably more than ten times more complex, and there are probably a thousand times more people trying to exploit it.
  • : : [green]
    : : I know what protected mode is, but I was referring to the fact that C# is not even compiled to machine code; it is compiled to a 'script', so to speak, that is read by a CLR. So in turn the CLR is like the CPU in some respects. That way I figure M$ can't keep tabs on how calls are made to its OS, because you have to use their code, and can restrict access even further to the point where you're not even coding anymore, you're just calling their routines. Am I wrong to think this?
    : : [/green]
    : :
    : :
    :
    : .NET is Microsoft's answer to Java: it can theoretically run on any computer or operating system that supports the .NET Framework, not just MS-Windows. .NET is not intended to be a real-time system, so you'll have to look elsewhere for that.
    :
    [RED]
    .NET is not intended to be a real-time system, so you'll have to look elsewhere for that.
    [/RED]

    [green]
    Yeah, I know. It looks like I might drop the class, but I'm going to talk to my teacher first and see what he says. I'll give him his say.
    [/green]


  • : Real time programming with C# in Windows? That's about as far from real time as it gets :-)

    [green] Tell me about it. I've been programming in assembly for 2 years, and it seems corny to me to use such a language for real-time tasks.
    [/green]


    : The security problems in Windows haven't got much to do with direct access to hardware anymore. Virus makers are more likely to exploit weaknesses in the Windows API. And since you can still call API functions from any .NET language, any security problems in the API will still be there. And if MS built any such thing as you describe into their compiler, how would it stop people who use other compilers from making evil programs?
    :
    : I don't know how .NET generates executables, or why it is done the way it is, but I think it has little to do with security. I think the main reason was that they wanted all their languages to run with the same IDE, linker, RAD tool, etc.

    [green]
    From what I understand, it doesn't generate an exe but a file called an 'assembly', which is then run by a 'Common Language Runtime', but I could be wrong. The CLR is in charge of enforcing security, executing the instructions, etc. This is why I think it's heading that way. :-)
    [/green]

    : An aside: I think Windows has an undeservedly bad reputation when it comes to security, compared to, for example, Unix. It is probably more than ten times more complex, and there are probably a thousand times more people trying to exploit it.
    :
    [green]
    Yeah, you're right. Learning assembly language showed me how complex even an old OS like MS-DOS can be. I can only imagine the complexity of the Windows code.
    [/green]

  • : : Real time programming with C# in Windows? That's about as far from real time as it gets :-)
    :
    : [green] Tell me about it. I've been programming in assembly for 2 years, and it seems corny to me to use such a language for real-time tasks.
    : [/green]
    :
    :
    : : The security problems in Windows haven't got much to do with direct access to hardware anymore. Virus makers are more likely to exploit weaknesses in the Windows API. And since you can still call API functions from any .NET language, any security problems in the API will still be there. And if MS built any such thing as you describe into their compiler, how would it stop people who use other compilers from making evil programs?
    : :
    : : I don't know how .NET generates executables, or why it is done the way it is, but I think it has little to do with security. I think the main reason was that they wanted all their languages to run with the same IDE, linker, RAD tool, etc.
    :
    : [green]
    : From what I understand, it doesn't generate an exe but a file called an 'assembly', which is then run by a 'Common Language Runtime', but I could be wrong. The CLR is in charge of enforcing security, executing the instructions, etc. This is why I think it's heading that way. :-)
    : [/green]
    :
    : : An aside: I think Windows has an undeservedly bad reputation when it comes to security, compared to, for example, Unix. It is probably more than ten times more complex, and there are probably a thousand times more people trying to exploit it.
    : :
    : [green]
    : Yeah, you're right. Learning assembly language showed me how complex even an old OS like MS-DOS can be. I can only imagine the complexity of the Windows code.
    : [/green]
    :
    :
    [blue].NET is faster than Java because they take different approaches to bytecode. .NET has a huge amount of ready-to-use classes - basically, almost every possible Windows operation is done through these classes, and the CLR calls the highly optimized methods on them - these methods are real machine code on your platform. Java has to interpret more bytecode to do the same thing as .NET.

    I wonder what comes after .NET?
    [/blue]
  • I don't know what the advantage of a scripted language would be for real-time embedded apps, but from experience I can tell you that manufacturers of embedded devices tend to want to offer their wares to the less technically inclined end user. I assume this would be a step in that direction, making the programming of these devices more accessible to more people.

    One point I would like to make is that I/O in a real-time system does not always need to be accessed at the hardware level. I used an Opto-22 board in a recent project, and the I/O on that unit was read and written via an Ethernet connection. In that case the hardware access was done by the controller on the Opto-22 card, not directly by my program (16-bit DOS, written in C).
  • : [blue].NET is faster than Java because they take different approaches to bytecode. .NET has a huge amount of ready-to-use classes - basically, almost every possible Windows operation is done through these classes, and the CLR calls the highly optimized methods on them - these methods are real machine code on your platform. Java has to interpret more bytecode to do the same thing as .NET.
    :
    : I wonder what comes after .NET?
    : [/blue]
    :
    [green]
    I think it's eventually going to reach the point where control of the OS is taken entirely, or nearly so, out of the application programmer's hands. It seems to be heading that way with .NET, since the CLR can determine whether there is a security issue with your code - hence the term 'managed applications', which is the term used in my book. Anyway, I'm dropping the class after speaking to the instructor. I told him they need to rename the course to simply 'C#' and drop the 'real time' jargon, which I said was very confusing. To me, a real-time system would be a controller in an automobile that controls the fuel injection, or an embedded system in a PC.
    [/green]

    : I don't know what the advantage of a scripted language would be for real-time embedded apps, but from experience I can tell you that manufacturers of embedded devices tend to want to offer their wares to the less technically inclined end user. I assume this would be a step in that direction, making the programming of these devices more accessible to more people.
    :
    : One point I would like to make is that I/O in a real-time system does not always need to be accessed at the hardware level. I used an Opto-22 board in a recent project, and the I/O on that unit was read and written via an Ethernet connection. In that case the hardware access was done by the controller on the Opto-22 card, not directly by my program (16-bit DOS, written in C).
    :

    [green]
    I wish I had 1/10th of your experience, Dennis! ;-)
    [/green]

  • : One point I would like to make is that it is not
    : always the case that I/O in a real time system needs
    : to be accessed at the hardware level.

    Right. It doesn't really matter how many layers there are between your application code and the hardware -- it can still be real time. What matters is that you can guarantee [italic]deterministic[/italic] runtime.

    A garbage collected language running on a preemptive multitasking operating system is about as far from that as it gets.
    : : I don't know what the advantage of a scripted language would be for real-time embedded apps, but from experience I can tell you that manufacturers of embedded devices tend to want to offer their wares to the less technically inclined end user. I assume this would be a step in that direction, making the programming of these devices more accessible to more people.
    : :
    : : One point I would like to make is that I/O in a real-time system does not always need to be accessed at the hardware level. I used an Opto-22 board in a recent project, and the I/O on that unit was read and written via an Ethernet connection. In that case the hardware access was done by the controller on the Opto-22 card, not directly by my program (16-bit DOS, written in C).
    : :
    :
    : [green]
    : I wish I had 1/10th of your experience, Dennis! ;-)
    : [/green]
    :
    :

    That is a very kind remark, thank you.

    There is a fairly inexpensive way to go about teaching yourself machine control. An Ethernet-enabled PC-104 based computer (I use Kontron) running sockets and DOS from Datalight, along with an Ethernet-based I/O unit (Grayhill, Opto-22), should run you less than about $825.

    This will give you a highly versatile platform that you are already familiar with programming, using your skills as an assembly language and C programmer. You use the same tools you are used to. Then you would have a system you can hook up to switches, lights, valves, motors, etc. Add motion control for another $800 or so and you have a servo system, which is the world of robotics.

    Compared with some of the tuition costs I have seen, it seems a cheap way to go, plus you get to keep the goodies. You are also unlikely to find another way to become proficient with the things you will need to learn. Examples: the IEEE 1394 protocol, DOS sockets programming, interfacing little-endian components to big-endian ones (like the Opto-22 to PC interface), and interfacing DOS components to proprietary hardware (like the Galil to PC interface). I am of the firm belief that most people who post here regularly are far more qualified to figure this stuff out on their own than they could at school. Like any other form of programming, it's just a big puzzle that gets figured out.


  • : [blue].NET is faster than Java because they take different approaches to bytecode. .NET has a huge amount of ready-to-use classes - basically, almost every possible Windows operation is done through these classes, and the CLR calls the highly optimized methods on them - these methods are real machine code on your platform. Java has to interpret more bytecode to do the same thing as .NET.
    : :
    : : I wonder what comes after .NET?
    : : [/blue]
    : :
    : [green]
    : I think it's eventually going to reach the point where control of the OS is taken entirely, or nearly so, out of the application programmer's hands. It seems to be heading that way with .NET, since the CLR can determine whether there is a security issue with your code - hence the term 'managed applications', which is the term used in my book. Anyway, I'm dropping the class after speaking to the instructor. I told him they need to rename the course to simply 'C#' and drop the 'real time' jargon, which I said was very confusing. To me, a real-time system would be a controller in an automobile that controls the fuel injection, or an embedded system in a PC.
    : [/green]
    :
    :
    [blue]Your predictions may become a reality. There are always two sides to anything - simple philosophy. Managed code is much more stable and secure, but slower than native code. With PCs reaching beyond 3GHz and 2GB of memory, that is of little concern. I must say that I have seen apps which strain even these CPUs, but those applications were simply badly designed. On the other side, if you give too much control, the 'bad' programmers (read: virus writers) will definitely use it. So we have to live with it.[/blue]

  • : That is a very kind remark, thank you.
    :
    : There is a fairly inexpensive way to go about teaching yourself machine control. An Ethernet-enabled PC-104 based computer (I use Kontron) running sockets and DOS from Datalight, along with an Ethernet-based I/O unit (Grayhill, Opto-22), should run you less than about $825.
    :
    : This will give you a highly versatile platform that you are already familiar with programming, using your skills as an assembly language and C programmer. You use the same tools you are used to. Then you would have a system you can hook up to switches, lights, valves, motors, etc. Add motion control for another $800 or so and you have a servo system, which is the world of robotics.
    :
    : Compared with some of the tuition costs I have seen, it seems a cheap way to go, plus you get to keep the goodies. You are also unlikely to find another way to become proficient with the things you will need to learn. Examples: the IEEE 1394 protocol, DOS sockets programming, interfacing little-endian components to big-endian ones (like the Opto-22 to PC interface), and interfacing DOS components to proprietary hardware (like the Galil to PC interface). I am of the firm belief that most people who post here regularly are far more qualified to figure this stuff out on their own than they could at school. Like any other form of programming, it's just a big puzzle that gets figured out.
    :

    [green]
    If it wasn't for guys like you, stober, eric tetz, asmguru, jeffleyda, ashley4, crow, and some others, I would be nearly clueless about some of this stuff. I'm glad a site like this is readily available to provide that opportunity.

    I'll have to give your suggestion some serious consideration. That's pretty much the direction I'm looking for, though I'll have to do some research on it first. Thanks for the tip! :-) I don't want to sound like I'm tooting my own horn, but when I spoke with the instructor and rattled off some hardware info on the DMA, PIC, etc., he seemed a little baffled by it. I sent him an email on those items and never got a response, though all my other emails were answered very promptly. Hmmm? ;-)
    [/green]


    : [blue]Your predictions may become a reality. There are always two sides to anything - simple philosophy. Managed code is much more stable and secure, but slower than native code. With PCs reaching beyond 3GHz and 2GB of memory, that is of little concern. I must say that I have seen apps which strain even these CPUs, but those applications were simply badly designed. On the other side, if you give too much control, the 'bad' programmers (read: virus writers) will definitely use it. So we have to live with it.[/blue]
    :
    [green]
    I'm still a big believer in making your code as small and fast as possible - and deterministic on deadlines, of course. I have used so many programs that were horribly slow even on 1 GHz systems due to poor coding practices. I know it can't always be done, but in my opinion it should be a goal of a good programmer.
    [/green]
