Why does a timerfd periodic Linux timer expire a little earlier than expected?

I am using a periodic Linux timer, specifically a timerfd, which I set to expire periodically, for instance every 200 ms.

I noticed, however, that the timer sometimes seems to expire slightly earlier than the timeout I set.

In particular, I’m performing a simple test in C: I arm a periodic timerfd, clear each expiration with read(), and measure the interval between consecutive expirations with gettimeofday().

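A minimal sketch of such a test (assuming a 200 ms period on CLOCK_MONOTONIC; the iteration count, variable names, and error handling are illustrative):

```c
/*
 * Sketch of the test: arm a periodic timerfd with a 200 ms interval,
 * block on read() until each expiration, and measure the elapsed time
 * between consecutive expirations with gettimeofday().
 */
#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>
#include <unistd.h>
#include <time.h>
#include <sys/time.h>
#include <sys/timerfd.h>

int main(void)
{
    /* 200 ms for both the first expiration and the period. */
    struct itimerspec ts = {
        .it_value    = { .tv_sec = 0, .tv_nsec = 200000000L },
        .it_interval = { .tv_sec = 0, .tv_nsec = 200000000L }
    };
    struct timeval tv_prev, tv_curr;
    uint64_t expirations;

    int fd = timerfd_create(CLOCK_MONOTONIC, 0);
    if (fd < 0 || timerfd_settime(fd, 0, &ts, NULL) < 0) {
        perror("timerfd");
        return EXIT_FAILURE;
    }

    gettimeofday(&tv_prev, NULL);
    for (int i = 0; i < 50; i++) {
        /* Block until the timer expires, then clear the event. */
        if (read(fd, &expirations, sizeof(expirations)) != (ssize_t)sizeof(expirations)) {
            perror("read");
            return EXIT_FAILURE;
        }
        gettimeofday(&tv_curr, NULL);
        double elapsed_ms = (tv_curr.tv_sec  - tv_prev.tv_sec)  * 1000.0 +
                            (tv_curr.tv_usec - tv_prev.tv_usec) / 1000.0;
        printf("Elapsed: %f ms\n", elapsed_ms);
        tv_prev = tv_curr;
    }

    close(fd);
    return EXIT_SUCCESS;
}
```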

I then compile it with gcc and run it.
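For example (assuming the sketch above is saved as timerfd_test.c; file names are illustrative):

```sh
gcc -o timerfd_test timerfd_test.c && ./timerfd_test
```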

Looking at the printed intervals, I would expect the real time difference, estimated with gettimeofday(), to never be less than 200 ms (if anything slightly more, due to the time needed to clear the event with read()), but some values are a bit less than 200 ms, such as 199.942000 ms.

Do you know why I’m observing this behavior?

Could it be due to the fact that I’m using gettimeofday() and, sometimes, tv_prev is taken a bit late (because of some varying delay in calling read() or gettimeofday() itself) while tv_curr, in the next iteration, is not, producing an estimated time of less than 200 ms even though the timer is actually precise in expiring every 200 ms?

Thank you very much in advance.


Answer

This is related to process scheduling. The timer is indeed quite precise and signals the timeout every 200 ms, but your program does not register the expiration until it actually gets control of the CPU back. That means the timestamp you obtain from gettimeofday() can correspond to a somewhat later moment than the expiration itself. When you subtract two timestamps that are delayed by different amounts, the result can come out either bigger or smaller than 200 ms: if the previous timestamp was taken late but the current one was not, the measured interval shrinks below 200 ms.

How long can the delay between the actual timer expiration and your call to gettimeofday() be? It is related to the scheduling quantum of time given to each process. This quantum has a default value set by RR_TIMESLICE in include/linux/sched/rt.h, and you can check it on your system.

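For example (an assumption on my part: reading the sched_rr_timeslice_ms sysctl, which reports the round-robin timeslice in milliseconds):

```sh
# Round-robin scheduling timeslice, in milliseconds
cat /proc/sys/kernel/sched_rr_timeslice_ms
```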


So your program may have to wait for another process’s scheduling quantum to finish before it regains control and can read the current time. On my system this leads to a deviation of approximately ±4 ms from the expected 200 ms delay. After almost 7000 iterations I get the following distribution of the registered waiting times:

[Figure: distribution of the registered waiting times around the expected 200 ms]

As you can see, most of the measured times lie within ±2 ms of the expected 200 ms. The minimum and maximum over all iterations were 189.992 ms and 210.227 ms, respectively.

Deviations larger than 4 ms are caused by the rare cases in which the program had to wait for several quanta, not just one.
