Discontiguous Exponential Averaging (1998)
- #algorithm
- #statistics
- #exponential-smoothing
- Exponential smoothing is a statistical technique used, among other applications, to adapt network protocols to changing conditions.
- Traditional exponential smoothing has flaws, including startup transients and a failure to account for gaps in the data.
- The algorithm's simplicity comes from shrinking all past weights with a single multiplication per update, but this shortcut introduces biases (see the first sketch after this list).
- Initializing the average with the first data point biases the average toward that point.
- The weight on each point depends on the number of updates rather than on elapsed time, which causes problems when data arrive at irregular intervals.
- Corrected algorithms scale each decay step by the elapsed time and limit the impact of large data gaps (see the time-aware sketch below).
- A parameter, `maxDt`, defines how large a gap can be before the data is treated as missing.
- The final algorithms also compute a standard deviation and a 'completeness fraction' that indicates how much data actually backs the average (see the statistics sketch below).
- Exponential smoothing can be extended to more complex regression analyses (see the regression sketch below), though memory usage grows with the complexity of the fit.
- The corrected algorithms preserve the simplicity and memory efficiency of traditional exponential smoothing.
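A minimal sketch of the traditional algorithm the summary describes, in Python; the class and parameter names are illustrative, not the article's. Both biases are visible: the first point seeds the average outright, and the decay is applied once per update no matter how much time has elapsed.

```python
class NaiveAverage:
    """Traditional exponential smoothing: one multiplication per update."""

    def __init__(self, weight_ratio=0.9):
        # weight_ratio is the factor by which all past weights shrink
        # on each update (often written as 1 - alpha).
        self.weight_ratio = weight_ratio
        self.average = None

    def update(self, value):
        if self.average is None:
            # Seeding with the first data point biases the average
            # toward that point.
            self.average = value
        else:
            # All prior weight is reduced by a single multiplication,
            # regardless of the time elapsed since the last update.
            self.average = (self.weight_ratio * self.average
                            + (1.0 - self.weight_ratio) * value)
        return self.average
```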
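A time-aware sketch of the corrected approach, assuming the fix works by decaying an accumulated weighted sum and total weight by `weight_ratio ** dt` and clamping `dt` at `max_dt` (standing in for the article's `maxDt`); this is one plausible reconstruction, not the article's exact code. Dividing by the accumulated weight also removes the first-point bias, since the first sample carries a weight of exactly 1 instead of seeding the whole average.

```python
class DiscontiguousAverage:
    """Time-aware exponential average: decay depends on elapsed time,
    and overly long gaps are capped so they read as missing data."""

    def __init__(self, weight_ratio=0.9, max_dt=5.0):
        self.weight_ratio = weight_ratio  # per-unit-time decay factor
        self.max_dt = max_dt              # largest gap credited as decay
        self.sum = 0.0     # exponentially weighted sum of values
        self.weight = 0.0  # exponentially weighted count of values
        self.last_t = None

    def update(self, t, value):
        if self.last_t is not None:
            # Decay by elapsed time, not by update count; clamping at
            # max_dt keeps one huge gap from erasing all history.
            dt = min(t - self.last_t, self.max_dt)
            decay = self.weight_ratio ** dt
            self.sum *= decay
            self.weight *= decay
        self.sum += value
        self.weight += 1.0
        self.last_t = t
        return self.sum / self.weight
```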
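The standard deviation and completeness fraction can ride on the same running sums. This sketch assumes the spread comes from an exponentially weighted sum of squares, and that completeness compares the accumulated weight to the steady-state weight of one sample per unit time, 1/(1 - weight_ratio); both are assumptions about the article's method rather than quotations from it.

```python
import math

class DiscontiguousStats(DiscontiguousAverage):
    """Adds a spread estimate and a completeness fraction."""

    def __init__(self, weight_ratio=0.9, max_dt=5.0):
        super().__init__(weight_ratio, max_dt)
        self.sum_sq = 0.0  # exponentially weighted sum of squared values

    def update(self, t, value):
        if self.last_t is not None:
            # Mirror the decay the base class is about to apply.
            dt = min(t - self.last_t, self.max_dt)
            self.sum_sq *= self.weight_ratio ** dt
        self.sum_sq += value * value
        return super().update(t, value)

    def stddev(self):
        # Weighted E[x^2] - E[x]^2, clamped against rounding error.
        mean = self.sum / self.weight
        return math.sqrt(max(0.0, self.sum_sq / self.weight - mean * mean))

    def completeness(self):
        # Fraction of the steady-state weight actually accumulated:
        # near 0 right after startup or a long gap, near 1 once data
        # has been arriving steadily (about one sample per unit time).
        return min(1.0, self.weight * (1.0 - self.weight_ratio))
```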
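For the regression extension, the same decay trick applies to every running sum a least-squares fit needs. This hypothetical line fit keeps five decayed sums where the plain average kept two, which illustrates how memory grows with the complexity of the analysis.

```python
class DecayedLineFit:
    """Exponentially weighted least-squares fit of y against time t."""

    def __init__(self, weight_ratio=0.9, max_dt=5.0):
        self.weight_ratio = weight_ratio
        self.max_dt = max_dt
        self.last_t = None
        # Decayed sums: weight, t, y, t*y, t*t.
        self.n = 0.0
        self.st = 0.0
        self.sy = 0.0
        self.sty = 0.0
        self.stt = 0.0

    def update(self, t, y):
        if self.last_t is not None:
            decay = self.weight_ratio ** min(t - self.last_t, self.max_dt)
            for name in ("n", "st", "sy", "sty", "stt"):
                setattr(self, name, getattr(self, name) * decay)
        self.n += 1.0
        self.st += t
        self.sy += y
        self.sty += t * y
        self.stt += t * t
        self.last_t = t

    def slope(self):
        # Standard weighted least-squares slope; needs at least two
        # distinct time values or the denominator is zero.
        return ((self.n * self.sty - self.st * self.sy)
                / (self.n * self.stt - self.st ** 2))
```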