Timing Accuracy Issues
#1
Posted 13 July 2012 - 03:54 AM
#2
Posted 13 July 2012 - 04:34 AM
#3
Posted 13 July 2012 - 05:01 AM
Hi Nobby,
Is the elapsed time from timers/Stopwatch not matching the increase in DateTime.Now?
Losing 10 seconds in 15 minutes is a lot (~1.1%).
Can you post a quick code snippet showing how you're calculating and writing out the time? We should be able to run it on our Netduinos to reproduce...
Finally...just curious: if you reflash the mainboard with Netduino firmware (instead of Netduino Plus) do you get the same results?
Chris
Hey Chris,
Due to the nature of my project I can't share its code, but I have created a new project dedicated to the task of timing using Ticks.
The project has three threads. The main program thread, aka Main(), sits in a for(;;) loop with a Thread.Sleep(500) call. It does this after initialising a class that manages the time measurements.
The second thread is fed a time object and reduces its time until it reaches zero or less. The thread is executed with default priority through a lambda expression.
private static void timerFunc(Clock clock)
{
    if (clock == null)
        return;

    long startTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
    long endTicks, delta;

    while (clock.time > 0)
    {
        endTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
        delta = (endTicks - startTicks) / m_tick; // where m_tick is System.TimeSpan.TicksPerMillisecond
        clock.time -= delta;
        startTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
        Thread.Sleep(50);
    }
}
The third thread is one that runs in the clock class. It sleeps for 100ms and then uses COM1 to write, in ASCII text, the value of the time of the format "mm:ss.f".
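For illustration, formatting a millisecond count into that "mm:ss.f" layout can be sketched like this (plain desktop C#; Format is a made-up helper name, and the assumption that clock.time counts milliseconds is mine):

```csharp
using System;

class ClockFormat
{
    // Format a millisecond count as "mm:ss.f" (minutes, seconds, tenths of a second).
    public static string Format(long ms)
    {
        long minutes = ms / 60000;
        long seconds = (ms / 1000) % 60;
        long tenths = (ms / 100) % 10;
        return (minutes < 10 ? "0" : "") + minutes + ":" +
               (seconds < 10 ? "0" : "") + seconds + "." + tenths;
    }

    static void Main()
    {
        Console.WriteLine(Format(83456)); // 83.456 s -> "01:23.4"
    }
}
```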
I just started a 15-minute test against my sports wristwatch and PC application. Ten minutes into the test, clock.time is eight seconds higher than the timer on my watch.
I haven't tried flashing my Netduino Plus with the Netduino firmware. I have a regular Netduino to use and a handful of Netduino Plus devices I can downgrade if necessary.
Currently I'm stuck with the newest firmware for my commercial project, purely because the Socket runtime code footprint provides a much-needed 10K.
Thanks for having a look at this
#4
Posted 13 July 2012 - 05:19 AM
#5
Posted 13 July 2012 - 05:21 AM
Thank you for the additional information. Let's boil it down a bit more, to isolate any potential code or threading issues...
Can you please try creating an app which simply does the following:
- Create and open an instance of the SerialPort class
- In a loop: write the current time to the serial port and then sleep 500ms
And then share that code along with your results? If that basic case is losing 1+ seconds per minute...or if it's not...then we'll have a good basis for diagnostics.
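Something along these lines, as a sketch (COM1 at 9600-8-N-1 is a placeholder setting; match whatever your PC logger expects):

```csharp
using System.IO.Ports;
using System.Text;
using System.Threading;
using Microsoft.SPOT.Hardware;

public class SerialTimeTest
{
    public static void Main()
    {
        // COM1 and 9600-8-N-1 are assumptions, not required settings.
        SerialPort port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One);
        port.Open();

        for (;;)
        {
            // Send the current machine time so the PC side can compare it to its own clock.
            byte[] line = Encoding.UTF8.GetBytes(Utility.GetMachineTime().ToString() + "\r\n");
            port.Write(line, 0, line.Length);
            Thread.Sleep(500);
        }
    }
}
```

This runs only on Netduino hardware, so it can't be exercised on a desktop.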
Thank you, Nobby.
Chris
#6
Posted 13 July 2012 - 05:31 AM
Getting onto that right now.
#7
Posted 13 July 2012 - 07:50 AM
- mainThread.png shows your suggested experiment holds up: over four minutes, there is no clock drift between the PC and Netduino DateTime.Now.
- threadedMachineTime.png shows the same experiment, except data transmission is done every 100 ms on one thread and the current machine time is stored in clock.time on another thread every 50 ms. There is no drift apart from occasional 100 ms synchronisation conflicts between threads, which return to zero anyway.
- stopwatch.png: the System.Diagnostics.Stopwatch-style approach shows a drift of about 200 ms every two minutes. clock.time is incremented by stop_ticks - start_ticks after a Thread.Sleep(50). The iterative code is shown below. It should essentially always match the Netduino machine time; there is only one line of code overhead separating the two GetMachineTime() calls.
for (;;)
{
    endTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
    delta = endTicks - startTicks;
    startTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
    clock.time += delta;
    //clock.time = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks; // this was used in the second experiment
    Thread.Sleep(50);
}
- internalDrift.png shows how clock.time drifts away from the Netduino system time. The Netduino experiment code was modified to transmit only the drift between Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks and clock.time (shown in the Current column for the Netduino). The results show that the Netduino clock doesn't drift relative to the PC clock; rather, clock.time drifts behind the Netduino clock. As you can see from the code above, though, there's really nothing to it, and certainly no reason for such a ridiculous lack of accuracy. Even if I change the code so that start_ticks = end_ticks after calculating delta, it doesn't make much of a difference. I did notice that reducing or increasing the sleep time from 50 ms down to 25 ms or up to 100 ms affected the timing accuracy, but only minimally.
What can you tell me about operations and overhead with large data types such as long (System.Int64)? Could it be significant?
I will restructure this experiment to perform the stopwatch-based approach in the main thread and see what results I get.
Attached Files
#8
Posted 13 July 2012 - 12:41 PM
long startTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks, endTicks = 0, delta = 0;
int sleepTime = 50, choice = 1;
clock.start = startTicks;

for (;;)
{
    if (choice == 0)
    {
        Thread.Sleep(sleepTime);                        // Sleep to regulate timing intervals. 50 ms here.
        endTicks = Utility.GetMachineTime().Ticks;      // Get the current Netduino time
        delta = endTicks - startTicks;                  // Calculate the time difference
        startTicks = endTicks;                          // Assume significant overhead from the previous line of code
        clock.time += delta;                            // Increment instance member time value
    }
    else if (choice == 1)
    {
        Thread.Sleep(sleepTime);
        endTicks = Microsoft.SPOT.Hardware.Utility.GetMachineTime().Ticks;
        delta = endTicks - startTicks;
        startTicks = Utility.GetMachineTime().Ticks;    // Assume no overhead from the previous line of code and this line as well
        clock.time += delta;
    }
    else if (choice == 2)
    {
        // Choices 2 & 3 just reorder the operations but have the same logic as 0 & 1
        endTicks = Utility.GetMachineTime().Ticks;
        delta = endTicks - startTicks;
        startTicks = endTicks;                          // Assume overhead from the previous line of code
        clock.time += delta;
        Thread.Sleep(sleepTime);
    }
    else if (choice == 3)
    {
        endTicks = Utility.GetMachineTime().Ticks;
        delta = endTicks - startTicks;
        startTicks = Utility.GetMachineTime().Ticks;    // Assume no overhead from the previous line of code and this line
        clock.time += delta;
        Thread.Sleep(sleepTime);
    }
}
I performed some optimisations on my original timing code, which was not optimised at the time. There was still drift. The results were fairly conclusive, though: choices 0 & 2 provided identical and ideal results. There was no slip between clock.time, the Netduino time and the PC time.
Choices 1 & 3 resulted in a slip of 100 ms every 30 seconds. The only difference in code was using startTicks = Utility.GetMachineTime().Ticks vs startTicks = endTicks. The previous line of code, delta = endTicks - startTicks, and the alternative code in choices 1 & 3 appear to introduce significant overhead. Based on the results and a loop interval of roughly 50 ms, it works out to roughly 166 µs of slip per iteration.
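A quick way to sanity-check that arithmetic is to simulate the choice 1/3 loop with a fake clock where every read costs a fixed amount. This is plain desktop C#, and the 166 µs per-read cost below is an assumed value chosen to match the observed slip, not a measurement:

```csharp
using System;

class DriftDemo
{
    // Simulate the choice-1/3 pattern: startTicks is refreshed with a *second*
    // clock read, so the time spent between the two reads is never counted.
    public static long Simulate(long readCostUs, long sleepUs, int iterations)
    {
        long now = 0;                                   // simulated machine time, microseconds
        Func<long> read = () => { now += readCostUs; return now; };

        long accumulated = 0;
        long startTicks = read();
        for (int i = 0; i < iterations; i++)
        {
            now += sleepUs;                             // stand-in for Thread.Sleep(50)
            long endTicks = read();
            long delta = endTicks - startTicks;
            accumulated += delta;
            startTicks = read();                        // the gap since endTicks is dropped here
        }
        return now - accumulated;                       // how far clock.time trails the machine clock
    }

    static void Main()
    {
        // ~166 us of uncounted time per 50 ms iteration, over 600 iterations (~30 s),
        // comes out to roughly 100 ms of slip, matching the observed drift.
        Console.WriteLine(Simulate(166, 50000, 600) + " us of slip");
    }
}
```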
Not being a huge fan of digging into the inner workings of the Micro Framework, I'm just going to accept that I can't be lazy in design with respect to these things. Barring longer-duration tests for choices 0 & 2, there's no evidence of any chronic issue with timing accuracy.
#9
Posted 14 July 2012 - 08:44 AM