I'm not quite sure where to post this question, but since it relates to measuring the efficiency of algorithms, I figured this might be the place.
I'm currently working on an implementation of a cryptographic algorithm, and I'm comparing its running time with that of an existing algorithm. At the moment I compare them by actual running time, but I see that people often refer to the "number of cycles" when they talk about how much work is needed to do something. So I'm not quite sure what's meant by these cycles. Are these actual operations at the lowest level, like in the gates at the hardware level?
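For context, here is roughly how I time them at the moment (a minimal sketch; `run_algorithm` is just a placeholder for my actual routine):

```cpp
#include <chrono>
#include <cstdint>
#include <iostream>

// Hypothetical placeholder for the routine being compared.
static volatile std::uint64_t sink;
static void run_algorithm() {
    std::uint64_t acc = 0;
    for (int i = 0; i < 100000; ++i) acc += static_cast<std::uint64_t>(i) * i;
    sink = acc;  // keep the loop from being optimized away
}

int main() {
    auto t0 = std::chrono::steady_clock::now();
    run_algorithm();
    auto t1 = std::chrono::steady_clock::now();
    auto us = std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0);
    std::cout << "wall-clock time: " << us.count() << " us\n";
}
```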
I've been doing some assembly programming lately, and I've seen authors give cycle counts for individual assembly instructions as well (without further explanation). So, is that how I measure the number of cycles for my algorithm: by checking the assembly code generated when I compile my program (written in C++), and then adding up the cycle counts of the instructions? Or is there perhaps a better way? Or am I just way off with everything I've been saying here?
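The closest alternative I've found so far is reading the CPU's timestamp counter around the code I want to measure (a rough sketch, assuming GCC or Clang on x86; again, `run_algorithm` is just a placeholder), though I'm unsure whether this is the right way to get a cycle count:

```cpp
#include <cstdint>
#include <iostream>
#include <x86intrin.h>  // __rdtsc(); GCC/Clang on x86 (assumption)

// Hypothetical placeholder for the routine being measured.
static volatile std::uint64_t sink;
static void run_algorithm() {
    std::uint64_t acc = 0;
    for (int i = 0; i < 100000; ++i) acc += static_cast<std::uint64_t>(i) * i;
    sink = acc;  // keep the loop from being optimized away
}

int main() {
    std::uint64_t start = __rdtsc();  // timestamp counter before the work
    run_algorithm();
    std::uint64_t end = __rdtsc();    // ... and after
    std::cout << "elapsed cycles (approx): " << (end - start) << "\n";
}
```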
Any answers will be gladly accepted! And if this is the wrong message board for this question and you have a better idea, please let me know.