|
Your test code is wrong.
It is not possible to measure durations below the Windows time slice duration using DateTime.UtcNow!
Please look at this MSDN page: http://msdn.microsoft.com/en-us/library/system.datetime.utcnow.aspx
Under Remarks it is clearly stated that this property depends on the system timer and that the resolution is approximately 10 ms.
Please see this MSDN page: http://msdn.microsoft.com/en-us/library/8kb3ddd4.aspx
See the explanation of the "fffffff" custom specifier: "The 'fffffff' custom format specifier represents the seven most significant digits of the seconds fraction; that is, it represents the ten millionths of a second in a date and time value. Although it is possible to display the ten millionths of a second component of a time value, that value may not be meaningful. The precision of date and time values depends on the resolution of the system clock. On the Windows NT 3.5 (and later) and Windows Vista operating systems, the clock's resolution is approximately 10-15 milliseconds."
Please run the following test function:
static void TestDateTimeUtcNow()
{
    long mindelta = long.MaxValue;
    long maxdelta = long.MinValue;
    for (int i = 0; i < 1000; i++)
    {
        DateTime d1 = DateTime.UtcNow;
        DateTime d2 = d1;
        int sameval = 0;
        // Spin until UtcNow returns a new value, counting how often it repeats.
        while ((d2 = DateTime.UtcNow) == d1) sameval++;
        long delta = d2.Ticks - d1.Ticks;
        mindelta = Math.Min(delta, mindelta);
        maxdelta = Math.Max(delta, maxdelta);
        Console.WriteLine("{3:D3} DateTime.UtcNow: {0:yyyy-MM-dd,HH:mm:ss.fffffff}, delta: {1}, {2} *", d2, delta, sameval, i);
    }
    Console.WriteLine("MinDelta = {0}, MaxDelta = {1}.", mindelta, maxdelta);
    Console.ReadKey();
}
The output will be similar to:
995 DateTime.UtcNow: 2013-04-09,18:01:49.4843750, delta: 156250, 272263 *
996 DateTime.UtcNow: 2013-04-09,18:01:49.5000000, delta: 156250, 270741 *
997 DateTime.UtcNow: 2013-04-09,18:01:49.5156250, delta: 156250, 268597 *
998 DateTime.UtcNow: 2013-04-09,18:01:49.5312500, delta: 156250, 269661 *
999 DateTime.UtcNow: 2013-04-09,18:01:49.5468750, delta: 156250, 274958 *
MinDelta = 156250, MaxDelta = 156250.
The inner while loop counts how many times DateTime.UtcNow returned the same value.
When the returned value changes, it calculates the difference (delta).
This difference is ALWAYS the duration of the Windows scheduler time slice (by default 15.625 ms).
This test function shows that measuring durations with two consecutive calls to DateTime.UtcNow will result in approximately a quarter million too-small time spans and one too-large time span.
Therefore, calculating anything based on those values is completely meaningless.
To measure durations below the time slice duration, you need to use System.Diagnostics.Stopwatch.
If you change your test code to measure using Stopwatch, you will see that your Sleep function either doesn't sleep at all (for parameters smaller than 100 µs) or sleeps one time slice (for parameters larger than 100 µs).
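As a sketch of that Stopwatch-based measurement (Thread.Sleep here is only a stand-in for the sleep function under test, since that implementation is not shown in this post):

```csharp
using System;
using System.Diagnostics;

class StopwatchMeasurement
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        System.Threading.Thread.Sleep(1); // stand-in for the sleep function under test
        sw.Stop();
        // Elapsed.Ticks are fixed 100 ns units, so dividing by 10 yields microseconds,
        // independent of the scheduler's 15.625 ms time slice.
        Console.WriteLine("Slept {0} µs", sw.Elapsed.Ticks / 10);
    }
}
```

On a default Windows configuration this typically prints somewhere between about 1,000 and 16,000 µs, because Thread.Sleep(1) itself is still bound by the time slice; the point is that Stopwatch can resolve the actual duration either way.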
|
|
|
|
|
I don't know about meaningless, but it's just not updated as fast as Environment.TickCount.
#region Cross Platform μTimer
public sealed class μTimer : IDisposable
{
#region Not Applicable for the MicroFramework
#if(!MF)
#region Unnecessary Interop (Left for Comparison)
#if MONO
using System.Runtime.InteropServices;
[System.Runtime.InteropServices.DllImport("libc.so")]
static extern int usleep (uint amount);
void uSleep(int waitTime) { usleep(waitTime); }
#else
[System.Runtime.InteropServices.DllImport("Kernel32.dll")]
static extern bool QueryPerformanceCounter(out long lpPerformanceCount);
[System.Runtime.InteropServices.DllImport("Kernel32.dll")]
static extern bool QueryPerformanceFrequency(out long lpFrequency);
public static void uSleep(TimeSpan amount) { μTimer.uSleep(((int)(amount.TotalMilliseconds * 1000))); }
public static void uSleep(int waitTime)
{
long time1 = 0, time2 = 0, freq = 0;
QueryPerformanceCounter(out time1);
QueryPerformanceFrequency(out freq);
do
{
QueryPerformanceCounter(out time2);
} while ((time2 - time1) < waitTime);
}
#endif
#endregion
#endif
#endregion
#region Statics
const ushort Port = 7777;
public const long TicksPerMicrosecond = 10;
public const long Divider = 1000;
static bool m_Disposed;
static Socket m_Socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
static SocketAsyncEventArgs m_SocketMemory = new SocketAsyncEventArgs();
public static DateTime LocalTime { get { return new DateTime(Environment.TickCount); } }
public static DateTime UniversalTime { get { return LocalTime.ToUniversalTime(); } }
static μTimer()
{
try
{
m_Socket.Bind(new System.Net.IPEndPoint(System.Net.IPAddress.Loopback, Port));
m_Socket.Listen(1);
m_SocketMemory.Completed += BeginProcess;
#if(!MF)
if (!m_Socket.AcceptAsync(m_SocketMemory))
{
BeginProcess(typeof(μTimer), m_SocketMemory);
}
#else
new Thread(()=> BeginProcess(this, null)).Start();
#endif
}
catch
{
throw;
}
}
#if(!MF)
static void BeginProcess(object sender, SocketAsyncEventArgs e){
#else
static void BeginProcess(object sender, object args e){
while(!m_Disposed){ try{ Socket dontCare = m_Socket.Accept(); dontCare.Dispose(); } catch { throw new System.InvalidProgramException("A Connection to the system was made by a unauthorized means."); } }
#endif
if (!m_Disposed && e.LastOperation == SocketAsyncOperation.Connect)
{
if (!m_Socket.AcceptAsync(e))
{
throw new System.InvalidProgramException("A Connection to the system was made by a unauthorized means.");
}
}
Socket dontCare = e.AcceptSocket;
if (dontCare != null) dontCare.Dispose();
}
public static void μSleep(TimeSpan amount)
{
DateTime now = μTimer.UniversalTime, then = μTimer.UniversalTime;
TimeSpan waited = now - then;
if (waited > amount) return;
else System.Threading.Thread.Sleep(amount - waited);
waited = now - then;
if (waited > amount) return;
else unchecked
{
if (m_Socket.WaitRead(((int)((amount.Ticks - waited.Ticks / TicksPerMicrosecond) / Divider))))
{
then = μTimer.UniversalTime;
amount -= waited;
waited = now - then;
if (waited > amount) return;
else System.Threading.Thread.Sleep(amount - waited);
}
}
}
public static void μSleep(int amount) { μTimer.μSleep(TimeSpan.FromMilliseconds(amount * 1000)); }
#endregion
void IDisposable.Dispose() { m_Disposed = true; }
}
#endregion
Tested with this
public static void TestTimer()
{
TimeSpan delay = TimeSpan.FromTicks(TimeSpan.TicksPerMillisecond / 1000);
System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();
TimeSpan watch = TimeSpan.Zero;
sw.Start();
sw.Stop();
watch = sw.Elapsed;
DateTime now = Media.Common.μTimer.UniversalTime;
DateTime shouldBe = now + delay;
DateTime then = Media.Common.μTimer.UniversalTime;
TimeSpan pActually = then - now;
now = Media.Common.μTimer.UniversalTime;
shouldBe = now + delay;
then = Media.Common.μTimer.UniversalTime;
TimeSpan μActually = then - now;
TimeSpan totalU = μActually - delay;
TimeSpan totalP = μActually - pActually;
TimeSpan totalS = μActually - watch;
if (μActually > delay)
{
writeNotice(new Exception("μTimer Precision is not Correct!"), ConsoleColor.Red, false);
}
else if (pActually > delay)
{
writeNotice(new Exception("Performance Counter Precision is not Correct!"), ConsoleColor.Red, false);
}
else if (watch > delay)
{
writeNotice(new Exception("StopWatch Counter Precision is not Correct!"), ConsoleColor.Red, false);
}
else if (pActually < μActually)
{
writeNotice(new Exception("Performance Counter beat μTimer!"), ConsoleColor.Red, false);
}
else if (watch < μActually)
{
writeNotice(new Exception("StopWatch Counter beat μTimer!"), ConsoleColor.Red, false);
}
else
{
writeNotice(new Exception(("Timer Took: " + μActually + " PerformanceCounter Took: " + pActually)), ConsoleColor.DarkGreen, false);
}
}
I am still right and this still works as I outlined...
Maybe what you need is the full signaling implementation to believe it...
|
|
|
|
|
Hi Julius,
I don't believe - I measure ..
Your measurement code is still wrong and therefore leads to wrong results. Let me explain:
In your latest test code, you measure the duration by calling Media.Common.µTimer.UniversalTime, which is implemented as:
public static DateTime LocalTime { get { return new DateTime(Environment.TickCount); } }
public static DateTime UniversalTime { get { return LocalTime.ToUniversalTime(); } }
So the new measurement is based on Environment.TickCount, which returns the number of milliseconds since the last start of Windows.
I think you assume that Environment.TickCount returns increasing values with a step width of 1 (millisecond).
But that is not what is really happening ..
Let's see what values Environment.TickCount actually returns:
static void TestEnvironmentTickCount()
{
    int mindelta = int.MaxValue;
    int maxdelta = int.MinValue;
    long sumdelta = 0;
    long numdelta = 0;
    for (int i = 0; i < 1000; i++)
    {
        int d1 = Environment.TickCount;
        int d2 = d1;
        int sameval = 0;
        // Spin until TickCount returns a new value, counting how often it repeats.
        while ((d2 = Environment.TickCount) == d1) sameval++;
        int delta = d2 - d1;
        mindelta = Math.Min(delta, mindelta);
        maxdelta = Math.Max(delta, maxdelta);
        sumdelta += delta;
        numdelta++;
        Console.WriteLine("{3:D3} Environment.TickCount: {0}, delta: {1}, {2} *", d2, delta, sameval, i);
    }
    double avgdelta = ((double) sumdelta) / ((double) numdelta);
    Console.WriteLine("Min Delta: {0}, Max Delta: {1}.", mindelta, maxdelta);
    Console.WriteLine("Sum Delta: {0} / Num Delta: {1} = Average Delta = {2:F9}", sumdelta, numdelta, avgdelta);
    Console.ReadKey();
}
For statistical purposes, we have a min and max delta value and a sum and counter of delta values.
Inside the outer for loop, the test works like this:
We get a time stamp d1 from Environment.TickCount.
In the inner while loop, we get another time stamp d2 from Environment.TickCount.
If this second time stamp d2 is equal to the first d1, we stay in the while loop and increase sameval by one, because we got the same return value from Environment.TickCount as before. The difference d2 - d1 would be 0, since d2 is equal to d1.
As soon as Environment.TickCount returns a value d2 different from d1, we fall out of the while loop.
Now we calculate the difference 'delta' between the two "clock readings" d2 and d1.
Since we fell out of the loop, d2 must be not equal to d1, the difference delta must be not 0.
In mindelta and maxdelta we keep track of the minimum and maximum difference we found.
To calculate an average later, we sum and count the difference 'delta'.
Next we print the "clock reading" from Environment.TickCount and the difference 'delta' and how many times we got the same return value.
So the counter 'sameval' tells us how many times we "looked at the clock" without the "clocks watch hand moving".
The difference 'delta' tells us how far the "clocks watch hand moved" when it finally moved.
We repeat the inner test a thousand times.
Then we calculate the average delta by dividing the sum by the counter.
At last we print min and max delta, sum and counter and average.
So what values do we expect?
Since Environment.TickCount is a property of type int, returning milliseconds, it should return increasing values with a difference of exactly 1.
The test code should then print a minimum delta of 1 ms, a maximum delta of 1 ms, a sum of 1000 ms, a counter value of 1000, and an average delta of 1 ms.
But what values do we actually get?
995 Environment.TickCount: 52388718, delta: 15, 1613995 *
996 Environment.TickCount: 52388734, delta: 16, 1608911 *
997 Environment.TickCount: 52388750, delta: 16, 1590424 *
998 Environment.TickCount: 52388765, delta: 15, 1603644 *
999 Environment.TickCount: 52388781, delta: 16, 1603553 *
Min Delta: 15, Max Delta: 16.
Sum Delta: 15625 / Num Delta: 1000 = Average Delta = 15.625000000
So in the 998th run, d1 was 52388750 before the while loop. In the while loop we "looked at the clock" 1603644 times without the "clock moving". When it finally moved, it jumped from 52388750 to 52388765. The difference 'delta' is 15 ms.
But Environment.TickCount never returned the values between 52388750 and 52388765. In theory it could, but in reality it does not, because Windows unfortunately updates Environment.TickCount just once per time slice.
For 1000 tests, the average is 15.625 ms, a frequency of 64 Hz.
Why is that a problem?
If the accuracy (= step width) of the clock function used to measure something is 15.625 ms, your clock readings will always be an integral multiple of those 15.625 ms. Any difference between two clock readings will always be an integral multiple of those 15.625 ms, too. That means you can't measure a single duration smaller than 15.625 ms with a clock (function) ticking at 15.625 ms, i.e. 64 Hz.
Assume a time slice has just begun and Environment.TickCount just returned a different value than before; let's say it returned 1,000,000 (ms) = 1,000,000,000 µs. Now you call your high resolution sleep function. The CPU executes all the instructions of your sleep function and the Socket.Poll call itself. If Poll itself had to sleep for e.g. 10 µs, all this might take maybe 12 µs. Now the time would be 1,000,000,012 µs, 12 µs later. To measure the duration, you call Environment.TickCount and it will return 1,000,000 ms again. When you calculate the difference, it will be 0. It will look like sleep didn't sleep at all.
Now assume your sleep call happens 5 µs before the end of a time slice, e.g. at Environment.TickCount = 2,000,000 ms (= 2,000,000,000 µs). When Sleep comes back 12 µs later, we're already 7 µs (= 12 - 5) into the next time slice, so Environment.TickCount will return 2,000,015 ms, because Windows increased TickCount by 15.625 ms when it switched from the previous time slice to the current one. Now you measure a difference of 15 ms = 15,625 µs. It will look like your sleep function took a thousand times longer than it should.
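The two failure cases above reduce to simple integer arithmetic. This sketch models a TickCount-style clock that only advances at 15,625 µs boundaries (the numbers are the made-up ones from the text):

```csharp
using System;

public class QuantizationDemo
{
    const long SliceUs = 15625; // TickCount update period in µs (64 Hz)

    // A clock that only advances once per time slice reports the last
    // slice boundary at or below the true time.
    public static long Quantize(long trueTimeUs) { return (trueTimeUs / SliceUs) * SliceUs; }

    static void Main()
    {
        // Case 1: sleep starts right at a tick boundary; 12 µs really elapse,
        // but both readings fall in the same slice, so the measured delta is 0.
        Console.WriteLine(Quantize(1000000012L) - Quantize(1000000000L)); // 0

        // Case 2: sleep starts 5 µs before a boundary; 12 µs really elapse,
        // but the readings straddle a boundary, so the measured delta is a full slice.
        Console.WriteLine(Quantize(2000000007L) - Quantize(1999999995L)); // 15625
    }
}
```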
That means using Environment.TickCount is the wrong tool to measure, because it jumps in steps of 15.625 ms.
You simply cannot reliably measure a small duration (e.g. 12 µs) with large clock ticks (e.g. 15,625 µs), just like you cannot measure a length of 12 millimeters with a ruler that has tick marks only every 15 meters.
In your test code, you read the clock before (now) and after (then) the call to your sleep function.
now = Media.Common.μTimer.UniversalTime;
Media.Common.μTimer.μSleep(delay);
then = Media.Common.μTimer.UniversalTime;
TimeSpan μActually = then - now;
Then you try to calculate the duration 'µActually' as the difference 'then' - 'now'. As I have shown above, 'then' will either be exactly the same value as 'now' (because both clock readings happen within the same time slice), or 'then' will be 15 ms larger than 'now' (if 'now' was in the previous time slice and 'then' in the next). So your calculated difference 'µActually' will be either too small (= 0 µs) or way too large (= 15,625 µs).
I hope you can see and understand the problem with your measurement method now ..
And then there are a few minor problems with your new measurement methods:
public static DateTime LocalTime { get { return new DateTime(Environment.TickCount); } }
public static DateTime UniversalTime { get { return LocalTime.ToUniversalTime(); } }
Environment.TickCount returns ticks of 1 millisecond, but the DateTime constructor expects ticks of 100 nanoseconds. You need to multiply by 10,000:
public static DateTime LocalTime { get { return new DateTime(10000 * Environment.TickCount); } }
The origin (zero point) of Environment.TickCount is the last boot, which is different from DateTime's origin, so an offset would be needed. It would be better to use TimeSpan instead; then you don't need to worry about time zones, either.
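A sketch of what that TimeSpan-based approach could look like; it backs the clock with a single process-wide Stopwatch, which sidesteps both the boot-time origin and the time-zone question (the class and property names here are my own, not from the article):

```csharp
using System;
using System.Diagnostics;

public static class MonotonicClock
{
    // One Stopwatch started at type initialization; Elapsed is a monotonic
    // TimeSpan with 100 ns tick precision and no calendar origin to adjust for.
    static readonly Stopwatch s_clock = Stopwatch.StartNew();

    public static TimeSpan Now { get { return s_clock.Elapsed; } }
}

class Demo
{
    static void Main()
    {
        TimeSpan t1 = MonotonicClock.Now;
        TimeSpan t2 = MonotonicClock.Now;
        // Differences are never negative and are not quantized to the time slice.
        Console.WriteLine((t2 - t1).Ticks >= 0);
    }
}
```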
How can the real problem (Environment.TickCount too coarse) be solved?
Environment.TickCount is bad for two reasons:
- its precision (smallest unit) is only milliseconds
- it jumps in 15,625 µs increments
DateTime.Ticks is only slightly better:
- its precision (smallest unit) is ticks of 100 nanoseconds (= 0.1 µs)
- it jumps in 15,625 µs increments
System.Diagnostics.Stopwatch.Elapsed.Ticks is much better:
- its precision is ticks of 100 nanoseconds (= 0.1 µs)
- it increments independently of the scheduler's time slice
The smallest timespan I could measure on my 3.2 GHz machine is 0.5 µs.
To measure times reliably, you need to use Stopwatch.
However, don't mix up Stopwatch.Elapsed.Ticks and Stopwatch.ElapsedTicks!
- Stopwatch.Elapsed.Ticks (the TimeSpan.Ticks property of the Stopwatch.Elapsed property) uses 0.1 µs ticks.
- Stopwatch.ElapsedTicks uses a hardware-dependent unit; convert it via Stopwatch.Frequency.
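A small sketch of the difference between the two tick units; it converts the raw counter through Stopwatch.Frequency and checks that both paths describe the same interval:

```csharp
using System;
using System.Diagnostics;

class TickUnits
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        System.Threading.Thread.Sleep(10);
        sw.Stop();

        // Elapsed.Ticks: fixed 100 ns TimeSpan ticks -> divide by 10 for µs.
        long usFromTimeSpan = sw.Elapsed.Ticks / 10;

        // ElapsedTicks: hardware-dependent raw counts -> scale by Frequency for µs.
        long usFromRaw = sw.ElapsedTicks * 1000000L / Stopwatch.Frequency;

        // Both conversions measure the same interval and should closely agree.
        Console.WriteLine("{0} µs vs {1} µs", usFromTimeSpan, usFromRaw);
    }
}
```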
When you change your test code to use Stopwatch, you will measure that your sleep function takes a duration of about 100 µs for parameters below 100 µs, and a duration of 15,625 µs for parameters above 100 µs.
The reason, again, is the Windows scheduler's time slice.
This time slice can be decreased from 15,625 µs (= 64 Hz) to 976 µs (= 1024 Hz) by calling timeBeginPeriod(1). See my previous replies.
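A hedged interop sketch of that timeBeginPeriod(1) call (the winmm.dll multimedia timer API; Windows only, so the call is guarded and skipped elsewhere):

```csharp
using System;
using System.Runtime.InteropServices;

class TimerResolutionDemo
{
    // Multimedia timer API in winmm.dll: raises the global timer interrupt
    // rate for the whole machine while the period is held.
    [DllImport("winmm.dll")] static extern uint timeBeginPeriod(uint uMilliseconds);
    [DllImport("winmm.dll")] static extern uint timeEndPeriod(uint uMilliseconds);

    static void Main()
    {
        if (Environment.OSVersion.Platform != PlatformID.Win32NT)
        {
            Console.WriteLine("timeBeginPeriod is Windows-only; skipping.");
            return;
        }
        timeBeginPeriod(1); // request ~1 ms scheduler granularity
        try
        {
            // Thread.Sleep(1) now wakes after roughly 1-2 ms instead of ~15.6 ms.
            System.Threading.Thread.Sleep(1);
        }
        finally
        {
            timeEndPeriod(1); // always restore the default resolution
        }
    }
}
```

Note that this changes the timer interrupt rate system-wide, which costs some power; it should be held only for as long as the higher resolution is actually needed.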
|
|
|
|
|
If I take your code and replace a single line
"while ((d2 = Environment.TickCount) == d1) sameval++;"
With this
"while ((d2 = Environment.TickCount) == d1) Media.Common.μTimer.μSleep(1);"
002 Environment.TickCount: 32824154, delta: 1014, 0 *
003 Environment.TickCount: 32825168, delta: 1014, 0 *
004 Environment.TickCount: 32826167, delta: 999, 0 *
005 Environment.TickCount: 32827181, delta: 1014, 0 *
006 Environment.TickCount: 32828195, delta: 1014, 0 *
007 Environment.TickCount: 32829209, delta: 1014, 0 *
Enough said?
See my comment above on WHY this works and how it uses interrupts and kernel-mode drivers by calling Poll...
|
|
|
|
|
Enough said? Nope, sorry.
The clock reading in line 002 is 32824154 milliseconds.
The clock reading in line 003 is 32825168 milliseconds.
The delta is 32825168 ms - 32824154 ms = 1014 milliseconds = 1.014 seconds = 1,014,000 µs (microseconds).
You try to sleep for a duration of 1 microsecond, but your own measurement shows a duration of about 1 second, which is a million times bigger.
How does a duration of one whole second prove that your function sleeps for 1 µs?
|
|
|
|
|
Sorry, there was a bug in my code from the quick updates...
Use this class
#region Cross Platform μTimer
public sealed class μTimer : IDisposable
{
#region Not Applicable for the MicroFramework
#if(!MF)
#region Unnecessary Interop (Left for Comparison)
#if MONO
using System.Runtime.InteropServices;
[System.Runtime.InteropServices.DllImport("libc.so")]
static extern int usleep (uint amount);
void uSleep(int waitTime) { usleep(waitTime); }
#else
[System.Runtime.InteropServices.DllImport("Kernel32.dll")]
static extern bool QueryPerformanceCounter(out long lpPerformanceCount);
[System.Runtime.InteropServices.DllImport("Kernel32.dll")]
static extern bool QueryPerformanceFrequency(out long lpFrequency);
public static void uSleep(TimeSpan amount) { μTimer.uSleep(((int)(amount.TotalMilliseconds * 1000))); }
public static void uSleep(int waitTime)
{
long time1 = 0, time2 = 0, freq = 0;
QueryPerformanceCounter(out time1);
QueryPerformanceFrequency(out freq);
do
{
QueryPerformanceCounter(out time2);
} while ((time2 - time1) < waitTime);
}
#endif
#endregion
#endif
#endregion
#region Statics
const ushort Port = 7777;
public const long TicksPerMicrosecond = 10;
public const long Divider = 1000;
static bool m_Disposed;
static Socket m_Socket = new Socket(AddressFamily.InterNetwork, SocketType.Stream, ProtocolType.Tcp);
static SocketAsyncEventArgs m_SocketMemory = new SocketAsyncEventArgs();
public static DateTime LocalTime { get { return new DateTime(Environment.TickCount * TimeSpan.TicksPerMillisecond); } }
public static DateTime UniversalTime { get { return LocalTime.ToUniversalTime(); } }
static μTimer()
{
try
{
m_Socket.Bind(new System.Net.IPEndPoint(System.Net.IPAddress.Loopback, Port));
m_Socket.Listen(1);
m_SocketMemory.Completed += BeginProcess;
#if(!MF)
if (!m_Socket.AcceptAsync(m_SocketMemory))
{
BeginProcess(typeof(μTimer), m_SocketMemory);
}
#else
new Thread(()=> BeginProcess(this, null)).Start();
#endif
}
catch
{
throw;
}
}
#if(!MF)
static void BeginProcess(object sender, SocketAsyncEventArgs e)
{
#else
static void BeginProcess(object sender, object args e)
{
while(!m_Disposed)
{
try
{
Socket dontCare = m_Socket.Accept(); dontCare.Dispose();
throw new System.InvalidProgramException("A Connection to the system was made by a unauthorized means.");
}
catch { throw; }
}
#endif
if (!m_Disposed && e.LastOperation == SocketAsyncOperation.Connect)
{
try
{
throw new System.InvalidProgramException("A Connection to the system was made by a unauthorized means.");
}
finally
{
if (e.AcceptSocket != null)e.AcceptSocket.Dispose();
}
}
}
public static void μSleep(TimeSpan amount)
{
DateTime now = μTimer.UniversalTime, then = μTimer.UniversalTime;
TimeSpan waited = now - then;
if (waited > amount) return;
else System.Threading.Thread.Sleep(amount - waited);
waited = now - then;
if (waited > amount) return;
else unchecked
{
if (m_Socket.WaitRead(((int)((amount.Ticks - waited.Ticks / TicksPerMicrosecond) / Divider))))
{
then = μTimer.UniversalTime;
amount -= waited;
waited = now - then;
if (waited > amount) return;
else System.Threading.Thread.Sleep(amount - waited);
}
}
}
public static void μSleep(int amount) { μTimer.μSleep(TimeSpan.FromMilliseconds(amount * TimeSpan.TicksPerMillisecond)); }
#endregion
void IDisposable.Dispose()
{
m_Disposed = true;
if (m_Socket != null)
{
m_Socket.Dispose();
m_Socket = null;
}
}
}
#endregion
The problem is that your code needs to look like this to observe that I am correct...
static void TestEnvironmentTickCount()
{
int mindelta = int.MaxValue;
int maxdelta = int.MinValue;
long sumdelta = 0;
long numdelta = 0;
for (int i = 0; i < 1000; i++)
{
int d1 = Environment.TickCount;
int d2 = d1;
int sameval = 0;
while ((d2 = Environment.TickCount) == d1)
{
Media.Common.μTimer.μSleep(TimeSpan.FromTicks(TimeSpan.TicksPerMillisecond / Media.Common.μTimer.TicksPerMicrosecond));
System.Threading.Interlocked.Increment(ref sameval);
System.Threading.Thread.Sleep(0);
Console.WriteLine(Environment.TickCount);
}
int delta = d2 - d1;
mindelta = Math.Min(delta, mindelta);
maxdelta = Math.Max(delta, maxdelta);
sumdelta += delta;
numdelta++;
Console.WriteLine("{3:D3} Environment.TickCount: {0}, delta: {1}, {2} *", d2, delta, sameval, i);
}
double avgdelta = ((double)sumdelta) / ((double)numdelta);
Console.WriteLine("Min Delta: {0}, Max Delta: {1}.", mindelta, maxdelta);
Console.WriteLine("Sum Delta: {0} / Num Delta: {1} = Average Delta = {2:F9}", sumdelta, numdelta, avgdelta);
}
Enough yet?
|
|
|
|
|
Nope, sorry.
With your version of TestEnvironmentTickCount, delta is still one whole time slice (15.625 ms = 15,625 µs).
Why can't you measure the duration of your sleep function using Stopwatch, like this:
Stopwatch w = new Stopwatch();
w.Reset();
w.Start();
// call your sleep function here, e.g. Media.Common.μTimer.μSleep(...)
w.Stop();
Console.WriteLine("Sleep slept {0} µs.", w.Elapsed.Ticks / 10);
|
|
|
|
|
Here you go man..
this is just for you
static void TestEnvironmentTickCount()
{
int mindelta = int.MaxValue;
int maxdelta = int.MinValue;
long sumdelta = 0;
long numdelta = 0;
System.Diagnostics.Stopwatch w = new System.Diagnostics.Stopwatch();
for (int i = 0; i < 1000; i++)
{
int d1 = Environment.TickCount;
int d2 = d1;
int sameval = 0;
DateTime now = DateTime.UtcNow;
uint ops = 0;
while ((d2 = Environment.TickCount) == d1)
{
ops = 0;
w.Reset();
w.Start();
DateTime then = DateTime.UtcNow;
Media.Common.μTimer.μSleep(TimeSpan.FromTicks((TimeSpan.TicksPerMillisecond / Media.Common.μTimer.TicksPerMicrosecond / 1000)));
w.Stop();
++sameval;
++ops;
System.Threading.Thread.Sleep(0);
Console.WriteLine(Environment.TickCount + " " + (then - now));
Console.WriteLine("Sleep slept {0} µs. Sleep slept {1} Ticks. {2} Ops", (w.Elapsed.Ticks / ( TimeSpan.TicksPerMillisecond / Media.Common.μTimer.TicksPerMicrosecond)), w.Elapsed.Ticks, ops);
}
int delta = d2 - d1;
mindelta = Math.Min(delta, mindelta);
maxdelta = Math.Max(delta, maxdelta);
sumdelta += delta;
numdelta++;
Console.WriteLine("{3:D3} Environment.TickCount: {0}, delta: {1}, {2} *", d2, delta, sameval, i);
}
double avgdelta = ((double)sumdelta) / ((double)numdelta);
Console.WriteLine("Min Delta: {0}, Max Delta: {1}.", mindelta, maxdelta);
Console.WriteLine("Sum Delta: {0} / Num Delta: {1} = Average Delta = {2:F9}", sumdelta, numdelta, avgdelta);
}
Sleep slept 0 µs. Sleep slept 43 Ticks. 1 Ops
46533553 00:00:00.0150008
Sleep slept 0 µs. Sleep slept 46 Ticks. 1 Ops
46533553 00:00:00.0150008
Sleep slept 0 µs. Sleep slept 46 Ticks. 1 Ops
46533553 00:00:00.0150008
Sleep slept 0 µs. Sleep slept 43 Ticks. 1 Ops
46533553 00:00:00.0150008
Sleep slept 0 µs. Sleep slept 46 Ticks. 1 Ops
999 Environment.TickCount: 46533569, delta: 16, 181 *
Min Delta: 15, Max Delta: 47.
Sum Delta: 15803 / Num Delta: 1000 = Average Delta = 15.803000000
I think the problem is that you were calculating the time wrong... You are showing the average time slice information, I assume... it's your calculation...
Is that enough, or do you still need to see a signaling implementation?
I am not sure what else you need...
And this time please make sure you are asking the µSleep timer to sleep for the correct amount of time with
Media.Common.μTimer.μSleep(TimeSpan.FromTicks((TimeSpan.TicksPerMillisecond / Media.Common.μTimer.TicksPerMicrosecond / Media.Common.μTimer.Divider)));
where
Media.Common.μTimer.TicksPerMicrosecond = 10; Media.Common.μTimer.Divider = TimeSpan.TicksPerMillisecond / TicksPerMicrosecond;
I hope you agree now, and hopefully this shows you that there is nothing scientifically wrong with the notion in the first place... My example was fine in the beginning; the hard part seemed to be getting you to understand and fixing your code to be correct. But I do appreciate your rebuttal; it just made me even more sure I was correct!
Thanks for all of your effort, and let me know if there is anything else you need proof-wise!
Have fun offloading to the NIC!
Regards,
v//
modified 11-Apr-13 21:54pm.
|
|
|
|
|
I would avoid using the micro symbol in your namespace and function names. It forces a developer who doesn't know the key-code sequence by heart to look it up in Charmap or on a character chart. If they are not using the standard fonts, the extended character may render differently from the micro character, and opening the file in a text editor that doesn't support it may cause compilation errors later on.
I would be frustrated every time I wanted to use your library if I had to open another program or look up key codes just to add your namespace in the code editor. When typing code, I'm typically faster than IntelliSense in getting the first couple of letters out, so don't make the user rely on IntelliSense (it may not be available in their editing package, or it may force them to search through a list of function names or namespaces to find yours).
|
|
|
|
|
The reason for this is specific: Timer is already taken, and uSleep is already a well-known name.
This method is intentionally named to prevent conflicts and to ensure explicit use.
If you don't like the name, change it; it's as simple as that (with a using alias or otherwise).
The point of the article was that I have found a reliable way to achieve the desired precision.
Thanks for your "Advice"!
modified 8-Apr-13 0:28am.
|
|
|
|
|
In 2013, Unicode should be the 'standard'...
|
|
|
|
|
|
|
In case you didn't see it, that method uses the performance counters, which my method already beats!
Thanks though!
|
|
|
|
|