Surveys of current forecasting practices allow firms to compare their procedures with those of other firms and to decide which practices they may wish to change. The surveys also show the rate at which new methods diffuse, which should interest researchers. In their survey of sales forecasting in 500 large U.S. firms, Sanders and Manrodt received usable replies from 96 firms. While this 19% response rate is low, it is approximately the same as the 16% rate for Dalrymple's (1987) survey.
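The response-rate figure can be checked directly from the numbers reported above (96 usable replies out of 500 firms surveyed):

```python
# Figures from the Sanders and Manrodt survey discussed above.
replies, mailed = 96, 500
rate = replies / mailed
print(f"response rate: {rate:.0%}")  # prints "response rate: 19%"
```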
By comparing their survey with Mentzer and Cox's (1984) survey of managers at 500 U.S. corporations, the authors drew conclusions about changes over roughly a decade. They concluded that managers are now more familiar with quantitative methods but, surprisingly, that usage of such methods has not increased; practitioners continue to rely heavily on judgmental methods. Note that the response rate for Mentzer and Cox was 32%. Because those who are more interested in forecasting are more likely both to respond to mail surveys and to be up to date on the latest methods, one would expect the reported usage rate to be higher when the response rate is lower. This makes the Sanders and Manrodt findings even more discouraging: their smaller, presumably more interested group of respondents did not report a higher level of usage.
What would it take to get practitioners to use quantitative methods? Most important, according to the respondents, the methods need to be easier to use; next, practitioners need guidelines for using the available techniques; third, they want more accurate techniques. Expert systems can provide these benefits. Collopy and Armstrong (1992) showed that expert systems can improve accuracy, but programs currently advertised as expert systems do not go far in providing these benefits. My prediction is that little progress will be made in implementing better forecasting methods until expert forecasting systems are widely available and incorporate the best methods.
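To make the expert-systems idea concrete: rule-based forecasting in the spirit of Collopy and Armstrong combines simple extrapolations using rules conditioned on features of the series and on domain knowledge. The sketch below is my own toy simplification with a single rule and arbitrary weights, not their actual rule base:

```python
def rule_based_forecast(series, expected_direction):
    """Toy illustration of rule-based forecasting: blend a naive (no-change)
    forecast with a linear-trend forecast, damping the trend when it
    contradicts domain knowledge about the expected direction of change.
    The 0.7/0.3 weights are arbitrary illustrative choices."""
    n = len(series)
    last = series[-1]
    # Least-squares slope over the series as the trend estimate.
    xbar = (n - 1) / 2
    ybar = sum(series) / n
    slope = (sum((i - xbar) * (y - ybar) for i, y in enumerate(series))
             / sum((i - xbar) ** 2 for i in range(n)))
    trend_forecast = last + slope
    naive_forecast = last
    # Rule: if the statistical trend agrees with the expected direction,
    # weight the trend heavily; otherwise damp it toward no change.
    weight_trend = 0.7 if (slope > 0) == (expected_direction > 0) else 0.3
    return weight_trend * trend_forecast + (1 - weight_trend) * naive_forecast
```

For a series [10, 12, 14, 16] with an expected upward direction, the slope is 2 and the blended forecast is 0.7 * 18 + 0.3 * 16 = 17.4; with an expected downward direction the trend is damped and the forecast falls to 16.6.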
Sanders and Manrodt also provided information about issues not addressed by Mentzer and Cox. Of particular interest, they examined judgmental revisions of forecasts. Only 9% of the respondents said that they never made judgmental revisions, while 45% said that they always adjusted the forecasts. Respondents said the revisions let them incorporate their knowledge of the environment (39%), their product knowledge (30%), or their experience (26%). I believe it is better to use such subjective information as an input to models than to revise the models' forecasts.
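The distinction above can be sketched simply: rather than letting a manager overwrite the model output ad hoc, treat the judgmental estimate as one more forecast and combine the two mechanically. The equal-weight default here is my illustrative choice, a common starting point in the forecast-combining literature, not a recommendation from the survey:

```python
def combine_forecasts(model_forecast, judgmental_forecast, weight_model=0.5):
    """Mechanically combine a statistical forecast with a judgmental one.
    Using judgment as a structured input like this, rather than as a
    post hoc revision of the model output, keeps the adjustment explicit
    and auditable. The equal weighting is an illustrative default."""
    return (weight_model * model_forecast
            + (1 - weight_model) * judgmental_forecast)
```

For example, a model forecast of 100 units combined with a judgmental forecast of 120 units yields 110 under equal weights.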
One intent of subjective adjustments is to introduce bias deliberately. Most of these respondents said they wanted to underforecast (70%), while 10% preferred to overforecast. Like the Mentzer and Cox (1984) and Dalrymple (1987) surveys, this paper by Sanders and Manrodt should be widely cited.