
__rdtscp calibration unstable under Linux on Intel Xeon X5550

I’m trying to use the __rdtscp intrinsic to measure time intervals. The target platform is Linux x64, CPU Intel Xeon X5550. Although the constant_tsc flag is set for this processor, calibrating __rdtscp gives very different results from run to run:


As we can see, the calibration result can differ by up to a factor of three between runs of the program (125–360). Such instability makes the value unusable for any measurement.

Here is the code (gcc 4.9.3, running on Oracle Linux 6.6, kernel 3.8.13-55.1.2.el6uek.x86_64):


When I run a very similar program under Windows 7 (i7-4470, VS2015), the calibration result is quite stable, differing only in the last digit.

So the question is: what causes this? Is it a CPU issue, a Linux issue, or an issue in my code?


Answer

It was definitely an issue with my code (or rather, with how gcc optimized it): the compiler optimized the loop away, replacing it with s = 1000000.

To prevent gcc from optimizing it away, the calibration loop should be changed like this:


Or, a simpler and more correct way (thanks to Hal):

User contributions licensed under: CC BY-SA