How does the automated outlier detection and correction work in DemandCaster?
The detection and correction algorithm works as follows:
- The specified forecasting model is fit to the time series; the residuals (fitted errors) are generated, and their standard deviation is calculated.
- If the magnitude of the largest error exceeds the outlier threshold, that point is flagged as an outlier and the historical value for the period is replaced with the fitted value.
- The procedure is then repeated on the corrected history until either no outliers are detected or the specified maximum number of iterations is reached (see the first sketch after this list).
- In a multi-level problem, detection is performed only on the end items (i.e., the non-group level). If the correction option has been selected, then after all end items are corrected, the group-level totals are re-aggregated to reflect the corrected values (see the second sketch below).
- You can adjust the Sensitivity setting to tighten or loosen the outlier threshold.
- Sensitivity (standard deviations) sets the sensitivity of the outlier detection algorithm. If a given fitted error exceeds this threshold and is the largest error detected during the current iteration, it is flagged as an outlier. Our default setting is 1 standard deviation.
- Maximum iterations sets the maximum number of iterations permitted during outlier detection for a given item. Because at most one outlier is flagged per iteration, this also defines the maximum number of outliers that can be detected for a given item. This value is user defined, and the residual standard deviation (and hence the threshold) is recalculated on each iteration.
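
DemandCaster's internal implementation is not published, so the following is a minimal Python sketch of the loop described above. The names `detect_and_correct_outliers`, `fit_model`, and `mean_model` are hypothetical, and the "model" here is a trivial mean fit used only to keep the example self-contained.

```python
import numpy as np

def detect_and_correct_outliers(history, fit_model, sensitivity=1.0, max_iterations=5):
    """Iteratively flag and correct outliers (hypothetical sketch)."""
    corrected = np.asarray(history, dtype=float).copy()  # don't mutate caller's data
    flagged = []

    for _ in range(max_iterations):
        fitted = fit_model(corrected)              # fit model to (corrected) history
        residuals = corrected - fitted             # fitted errors
        threshold = sensitivity * residuals.std()  # recalculated every iteration

        worst = int(np.argmax(np.abs(residuals)))  # largest error this pass
        if abs(residuals[worst]) <= threshold:
            break                                  # largest error within threshold: stop

        flagged.append(worst)                      # flag the point as an outlier...
        corrected[worst] = fitted[worst]           # ...and replace it with the fitted value

    return corrected, flagged

def mean_model(h):
    """Trivial stand-in model: fit the series mean everywhere."""
    return np.full_like(h, h.mean())

series = [100, 102, 98, 101, 250, 99, 103, 100]
corrected, outliers = detect_and_correct_outliers(series, mean_model, sensitivity=2.0)
print(outliers)  # the 250 spike at index 4 is flagged (twice here, as the fit settles)
```

Raising `sensitivity` makes the threshold harder to exceed, so fewer points are flagged; lowering `max_iterations` caps how many corrections can be applied to a single item.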
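
The multi-level behavior can be sketched along the same lines, reusing the hypothetical functions above; `end_item_histories` and the SKU names are likewise illustrative.

```python
# Detection/correction runs on end items only; group totals are then rebuilt.
end_item_histories = {
    "SKU-A": [100, 102, 98, 101, 250, 99, 103, 100],
    "SKU-B": [50, 51, 49, 52, 50, 48, 51, 50],
}

corrected_items = {
    item: detect_and_correct_outliers(hist, mean_model, sensitivity=2.0)[0]
    for item, hist in end_item_histories.items()
}

# Re-aggregate the group-level total from the corrected end-item histories.
group_total = np.sum(list(corrected_items.values()), axis=0)
```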