Contribution: Research-in-Progress Paper
03 - Decision Support
Improving Forecast Accuracy by Guided Manual Overwrite in Forecast Debiasing
We present ongoing work on a model-driven decision support system (DSS) aimed at guiding forecasters in reflecting on and adjusting their judgmental forecasts. We consider judgmental forecasts of cash flows generated by local experts in numerous subsidiaries of an international corporation. The forecasts are produced in a decentralized, non-standardized fashion; corporate managers and controllers then aggregate them into consolidated, corporate-wide plans for managing liquidity and foreign-exchange risk. However, judgmental predictions are well known to be biased, and statistical debiasing techniques can be applied to improve forecast accuracy. Yet even though debiasing improves average forecast accuracy, many originally appropriate forecasts may be automatically "corrected" in the wrong direction, for instance when a forecaster has incorporated knowledge of future events that cannot be derived statistically from the past time series. To prevent such high-impact erroneous corrections, we propose to prompt a forecaster for action whenever a submitted forecast falls outside the confidence bounds of a benchmark forecast. The benchmark forecast is derived from a statistical debiasing model that accounts for the forecaster's past error patterns; the bounds correspond to percentiles of the error distribution of the debiased forecast. We discuss the determination of the confidence bounds and the selection of suspicious judgmental forecasts, the types of (statistical) feedback given to forecasters, and the incorporation of forecasters' reactions (comments, revisions) into future debiasing strategies.
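To make the flagging mechanism concrete, the following minimal sketch illustrates one possible instantiation. It is not the paper's actual model: we assume, for illustration only, a simple Theil-style linear regression of actuals on past forecasts as the debiasing model, and empirical percentiles of its residuals as the confidence bounds. All function and variable names are hypothetical.

```python
import numpy as np

def flag_forecast(past_forecasts, past_actuals, new_forecast,
                  lower_pct=5, upper_pct=95):
    """Check a submitted judgmental forecast against percentile bounds
    around a debiased benchmark forecast (illustrative sketch only)."""
    f = np.asarray(past_forecasts, dtype=float)
    a = np.asarray(past_actuals, dtype=float)
    # Assumed debiasing model: linear correction actual ~ beta0 + beta1 * forecast,
    # fitted to the forecaster's historical error pattern.
    beta1, beta0 = np.polyfit(f, a, 1)
    benchmark = beta0 + beta1 * new_forecast
    # Residual errors of the debiased forecasts on the historical data.
    residuals = a - (beta0 + beta1 * f)
    # Bounds are percentiles of the debiased forecast's error distribution.
    lower = benchmark + np.percentile(residuals, lower_pct)
    upper = benchmark + np.percentile(residuals, upper_pct)
    # Prompt the forecaster only if the submission falls outside the bounds.
    suspicious = not (lower <= new_forecast <= upper)
    return benchmark, (lower, upper), suspicious
```

For a forecaster who systematically over-forecasts, the fitted correction pulls the benchmark down, and a new forecast repeating that bias lands outside the bounds and triggers a prompt; a forecast already close to the benchmark passes silently, preserving judgmental input that the statistics cannot reproduce.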