How is delay variation (jitter) measured?
Written by Максим
Tuesday, 11 January 2011 13:50
Delay variation (jitter) is calculated during tests using the method defined in RFC 3550:

J(i) = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16

where
- J(i) is the jitter estimate after the i-th test packet;
- D(i-1,i) is the difference in transit times of two consecutive test packets:

D(i-1,i) = (R(i) - S(i)) - (R(i-1) - S(i-1))

Here S is the time the test packet is generated (sent) and R is the time it is received. Because D enters the formula as an absolute value, the estimator does not require the sender's and receiver's clocks to be synchronized; only a constant clock offset cancels out in the difference of transit times.
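The running estimate above can be sketched in a few lines of Python. This is an illustrative implementation of the RFC 3550 formula, not code from the IQM agent; the function name and the timestamp lists are hypothetical.

```python
def rfc3550_jitter(send_times, recv_times):
    """Return the running jitter estimates J(i) for a packet stream.

    send_times: S(i), generation timestamps of the test packets (seconds).
    recv_times: R(i), arrival timestamps of the same packets (seconds).
    """
    jitter = 0.0
    estimates = []
    prev_transit = None
    for s, r in zip(send_times, recv_times):
        transit = r - s                       # transit time R(i) - S(i)
        if prev_transit is not None:
            d = transit - prev_transit        # D(i-1,i)
            jitter += (abs(d) - jitter) / 16  # J(i) = J(i-1) + (|D| - J(i-1))/16
            estimates.append(jitter)
        prev_transit = transit
    return estimates
```

The division by 16 makes this an exponentially weighted moving average, so a single delayed packet nudges the estimate rather than dominating it.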
During a U7-type (UDP echo) test, the variation of the round-trip time is measured using the local clock of the originating IQM agent. During a U0-type test, one-way delay variations are measured in both directions: the originating agent's local clock supplies the test packet's send timestamp, and the receiving agent's local clock supplies the packet's arrival time.
Last Updated on Tuesday, 11 January 2011 14:35