Why measuring time differences using the system clock is a bad thing...
A lot of times when we write programs, we want to show the user how long something took, or how long they have been waiting. The first thought is to take the current time, do the thing, then take the current time again and compute the difference. However, a timestamp can go backwards if the system time is synchronized from an outside clock (for example over NTP). Also, the precision of a timestamp may be limited to whole seconds, so anything that takes less than a second shows up as taking no time at all.
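As an illustration, here is a minimal C sketch of the naive timestamp approach; the work being measured is left as a placeholder:

```c
#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t start = time(NULL);   /* wall-clock timestamp, whole seconds */

    /* ... the work being measured ... */

    time_t end = time(NULL);

    /* Anything that finishes within a second prints 0, and if the
       system clock is adjusted backwards in between, the result can
       even be negative. */
    printf("elapsed: %.0f seconds\n", difftime(end, start));
    return 0;
}
```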
To measure a time difference correctly, we need performance counters. A performance counter measures time relative to the moment the counter is initialized, and it is incremented at a constant rate... The precision of a high-performance counter can be 1/10,000,000th of a second (100 nanoseconds), which means even very short time differences can be measured.
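How the counter is read depends on the platform; on Windows, for example, it is exposed through QueryPerformanceCounter and QueryPerformanceFrequency. A minimal sketch:

```c
/* Minimal sketch of reading the Windows high-performance counter.
   QueryPerformanceFrequency reports how many ticks the counter
   advances per second; on many systems this is 10,000,000, i.e.
   one tick per 100 nanoseconds. */
#include <stdio.h>
#include <windows.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;

    QueryPerformanceFrequency(&freq);  /* ticks per second */
    QueryPerformanceCounter(&start);

    /* ... the work being measured ... */

    QueryPerformanceCounter(&end);

    printf("ticks elapsed: %lld (at %lld ticks/second)\n",
           end.QuadPart - start.QuadPart, freq.QuadPart);
    return 0;
}
```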
To calculate a time difference using a performance counter, it is important to know the order in which the precision is adjusted... If a developer needs 1 millisecond accuracy, the tick difference must be multiplied by 1000 before dividing by the frequency (ticks per second); otherwise the result would be truncated to whole seconds, because performance counter arithmetic is done with integers.
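Here is a small self-contained sketch of that arithmetic, assuming a hypothetical 10 MHz counter (so one tick is 100 nanoseconds):

```c
#include <stdio.h>

int main(void)
{
    long long freq  = 10000000LL;  /* hypothetical 10 MHz counter */
    long long ticks = 2500000LL;   /* a 250 ms run: 2,500,000 ticks */

    /* Correct: scale to milliseconds first, then divide. */
    printf("multiply first: %lld ms\n", ticks * 1000 / freq);  /* 250 */

    /* Wrong: integer division first truncates to whole seconds,
       so everything below one second is lost. */
    printf("divide first:   %lld ms\n", ticks / freq * 1000);  /* 0 */
    return 0;
}
```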