Problems with a time algorithm.

Hey there. I have a little problem that is probably not hard to figure out, but I have been programming for a week straight and my brain is a little cheesed out :P

I have a number of datapoints that each need a timestamp. The time between each datapoint is 0.25 seconds, and I need an algorithm to convert the index of a datapoint to a timestamp. If I only needed whole seconds the problem would be nonexistent, just Seconds = Datapoints Mod 60 and so forth, but now I need to put in hundredths of a second too.

So, the datapoints are added 4 per second, and the time format should be something like HH:MM:SS.%%.
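In case it helps anyone reading along, here is a minimal sketch of the idea in Python. The function name `index_to_timestamp` and the `samples_per_second` parameter are my own inventions; the only assumptions taken from the question are that samples arrive 4 per second (0.25 s apart) and that the output should be HH:MM:SS.%%.

```python
def index_to_timestamp(index, samples_per_second=4):
    """Convert a datapoint index to an HH:MM:SS.%% timestamp.

    With 4 samples per second, index // 4 gives whole elapsed
    seconds and (index % 4) * 25 gives hundredths of a second.
    """
    hundredths = (index % samples_per_second) * (100 // samples_per_second)
    total_seconds = index // samples_per_second
    seconds = total_seconds % 60
    minutes = (total_seconds // 60) % 60
    hours = total_seconds // 3600
    return f"{hours:02d}:{minutes:02d}:{seconds:02d}.{hundredths:02d}"

print(index_to_timestamp(0))      # first datapoint
print(index_to_timestamp(5))      # 5th index = 1.25 s in
print(index_to_timestamp(14405))  # just past the one-hour mark
```

The trick is to do all the Mod/integer-division arithmetic on the index first, then peel off hundredths, seconds, minutes, and hours in turn.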

I would really appreciate any help given.
