
Elapsed Time with Resolution < 10 milliseconds ???
Quote:
>> Hi,
>> I can calculate how long an event takes using DateTime, but it
>> appears to have a resolution of 10 milliseconds. Is there some
>> other way to calculate elapsed time that would have 1 millisecond
>> resolution?
>Why not time how long 10 of those events take, then take the average?
The process being timed includes sending a packet to a server and
getting a response. Network and server load can cause huge
deviations from the average time. Since those deviations are one of
the things it's meant to monitor, the user will need to be able to
see each individual time as well as the average.
The millisecond timings are only going to be needed when going over
a local LAN or high-speed link. If they're watching response times
over dial-up, the 10 millisecond resolution will be fine.
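For illustration, here's a sketch of the idea in Python rather than .NET
(the principle is the same): use a high-resolution monotonic clock to time
each round trip individually, then report every sample alongside the
average. The `fake_round_trip` function below is a hypothetical stand-in
for the real request/response exchange.

```python
import time

def timed_call(fn):
    """Time a single call; perf_counter gives sub-millisecond resolution."""
    start = time.perf_counter()  # high-resolution monotonic clock
    fn()
    return (time.perf_counter() - start) * 1000.0  # elapsed milliseconds

def fake_round_trip():
    # Hypothetical stand-in for sending a packet and waiting for the reply.
    time.sleep(0.005)  # ~5 ms of simulated network/server latency

# Record each individual sample so outliers stay visible,
# rather than averaging them away.
samples = [timed_call(fake_round_trip) for _ in range(10)]
for i, ms in enumerate(samples, 1):
    print(f"sample {i}: {ms:.3f} ms")
print(f"average: {sum(samples) / len(samples):.3f} ms")
```

On .NET specifically, `System.Diagnostics.Stopwatch` (where the runtime
provides it) serves the same purpose, wrapping the platform's
high-resolution performance counter instead of the coarse `DateTime` tick.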
-- David