Setting the TCP RTX Timeout
TCP combines two running estimators, a smoothed mean RTT and its mean deviation D, updated on every RTT sample:
- RTT = (1 - g)*RTT + g*sample
- D = (1 - h)*D + h*|sample - RTT|
- g = 0.125 (2^-3), h = 0.25 (2^-2)
- both updates are implemented efficiently in fixed-point arithmetic, since g and h reduce to bit shifts (see the sketch below)
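The fixed-point trick is to keep the estimators pre-scaled so that multiplying by g and h becomes a right shift. A minimal sketch in C, in the style of the integer code in the appendix of Jacobson's 1988 paper (the struct, names, and millisecond units here are illustrative, not any particular kernel's API):

```c
#include <stdio.h>

/* Keep the estimators pre-scaled: srtt8 holds 8*RTT and mdev4 holds 4*D,
 * so the weights g = 1/8 and h = 1/4 become right shifts -- no floats. */
struct rtt_est {
    int srtt8;  /* smoothed RTT, scaled by 8 (ms) */
    int mdev4;  /* mean deviation D, scaled by 4 (ms) */
};

/* Fold one RTT sample into the estimators:
 *   RTT += g*(sample - RTT);  D += h*(|sample - RTT| - D) */
static void rtt_update(struct rtt_est *e, int sample)
{
    int err = sample - (e->srtt8 >> 3);   /* sample - RTT */
    e->srtt8 += err;                      /* 8*RTT changes by 8*g*err = err */
    if (err < 0)
        err = -err;                       /* |sample - RTT| */
    e->mdev4 += err - (e->mdev4 >> 2);    /* 4*D changes by 4*h*(|err| - D) */
}

/* Timeout a few deviations above the mean: RTT + 4*D (as in RFC 6298). */
static int rtt_rto(const struct rtt_est *e)
{
    return (e->srtt8 >> 3) + e->mdev4;    /* mdev4 is already 4*D */
}

int main(void)
{
    struct rtt_est e = { .srtt8 = 100 << 3, .mdev4 = 0 }; /* seed RTT = 100 ms */
    int samples[] = { 100, 120, 90, 200, 110 };

    for (int i = 0; i < 5; i++) {
        rtt_update(&e, samples[i]);
        printf("sample=%3d  RTT=%3d  D=%3d  RTO=%3d\n",
               samples[i], e.srtt8 >> 3, e.mdev4 >> 2, rtt_rto(&e));
    }
    return 0;
}
```

Note how a spike to 200 ms inflates D, and hence the timeout, much faster than it moves the smoothed RTT; that is the point of tracking the deviation separately.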
So, about 95% of the time we would expect:
- RTT - 2D < (actual RTT) < RTT + 2D, if samples are normally distributed (taking D as roughly one standard deviation)
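This interval motivates the timeout itself: a timer set at RTT + 2D would still fire spuriously on roughly the top 2-3% of healthy samples under the normal assumption, so practical implementations (standardized in RFC 6298) widen the margin to four deviations:

```latex
\mathrm{RTO} = \mathrm{RTT} + 4D
```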