Watch this space for recent news of interest to Researchers:
SAS grants announced for 2011 (29 November, 2010)
The famous M-competition data is now available in Excel format (10 March, 2008)
Sources of Knowledge
- NEP-FOR is a weekly email report on new working papers in forecasting, edited by Professor Rob Hyndman. It covers any papers that are part of RePEc (www.repec.org) and concern forecasting, including time series forecasting, judgmental forecasting, and the application of forecasting to all areas. It does not cover time series and econometric papers that do not involve forecasting. To subscribe, please go to http://lists.repec.org/mailman/listinfo/nep-for/
The State of Knowledge in 2001 (summarized as principles)
A description of forecasting principles, along with supporting evidence, as they appeared in Principles of Forecasting: A Handbook for Researchers and Practitioners (2001).
What is Known to Date (summarized as principles)
The current state of forecasting knowledge, including changes and additions to principles as new evidence has emerged, is incorporated into the Forecasting Audit.
Research Needs
- Research Needs for Forecasting (PDF) describes principles of forecasting that are in need of research. Of the 139 principles, 33 are based on common sense and, thus, do not need research. Of the remaining 106, there is a weak need for research on 41 principles, typically because adequate research has already been done; 42 principles are in moderate need of research. Finally, and of key importance, 23 principles strongly need research. For a summary of prior research on these principles, see Principles of Forecasting (2001).
- Systematic Approach to Research on Time Series - proposes a common set of time-series features to aid in identifying research conditions.
Research Funding Sources
- SAS Grants for Research on Principles of Forecasting - Application deadline: September 30, 2011.
- Research project on design and use of forecasting support systems in supply chain management (England)
New Forecasting Principles
- A new forecasting principle has been proposed by Magne Jørgensen for the estimation of prediction intervals. This principle stands in marked contrast to current practice in many fields. Comments are solicited; in particular, is there additional research to add, and are there other conditions under which the principle applies?
14.14 Ask for a judgmental likelihood that a forecast will fall within a pre-defined minimum-maximum interval (not by asking people to set upper and lower confidence levels).
Traditionally, people are asked to provide minimum-maximum intervals to indicate the uncertainty of their estimates. This traditional request leads to over-optimistic views about the level of uncertainty. Jørgensen (2004) proposed instead that a person other than the estimator should identify the minimum and maximum values, and that the expert then assesses how likely it is that the actual value will fall inside that interval. Evidence was obtained from a previously reported experiment and from field studies in two software companies, where information was obtained from 47 projects applying the traditional framing and 23 projects applying the pre-defined framing. The latter led to better calibrated (less overconfident) prediction intervals. Arguments, replications, and supporting studies are also found in Teigen and Jørgensen (2005), Jørgensen, Teigen & Moløkken-Østvold (2004), Jørgensen and Teigen (2002), and Winman, Hansson & Juslin (2004).
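For illustration, the calibration comparison behind this principle can be computed once elicited intervals and realized values are available. The following is a minimal, hypothetical Python sketch; the records, column meanings, and numbers are invented for illustration and are not data from the studies cited below.

    # Hypothetical illustration: calibration of elicited prediction intervals.
    # Each record holds an elicited minimum-maximum interval, the likelihood the
    # expert attached to it, and the value that was actually realized.
    records = [
        # (minimum, maximum, stated_likelihood, actual)
        (100, 150, 0.90, 170),  # traditional framing: expert set both bounds
        (80,  200, 0.70, 170),  # pre-defined framing: expert judged likelihood only
        (120, 160, 0.90, 155),
        (90,  210, 0.75, 185),
    ]

    def calibration(records):
        """Compare the average stated likelihood with the observed hit rate."""
        hits = sum(lo <= actual <= hi for lo, hi, _, actual in records)
        hit_rate = hits / len(records)
        mean_likelihood = sum(conf for _, _, conf, _ in records) / len(records)
        return mean_likelihood, hit_rate

    stated, observed = calibration(records)
    # Overconfidence shows up as stated likelihoods well above the observed hit rate.
    print(f"stated likelihood {stated:.0%}, observed hit rate {observed:.0%}")

In the studies cited below, the pre-defined framing produced hit rates closer to the stated likelihoods (i.e., better calibration) than the traditional framing did.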
References:
Jørgensen, M. (2004), "Increasing Realism in Effort Estimation Uncertainty Assessments: It Matters How You Ask," IEEE Transactions on Software Engineering, 30(4), 209-217 - full text.
Jørgensen, M. & K. H. Teigen (2002), "Uncertainty Intervals versus Interval Uncertainty: An Alternative Method for Eliciting Effort Prediction Intervals in Software Development Projects," Proceedings of the International Conference on Project Management (ProMAC), Singapore, 343-352 - full text.
Jørgensen, M., K. H. Teigen & K. J. Moløkken-Østvold (2004), "Better sure than safe? Overconfidence in judgment based software development effort prediction intervals," Journal of Systems and Software, 70(1-2), 79-93 - full text.
Teigen, K. H. & M. Jørgensen (2005), "When 90% confidence intervals are only 50% certain: On the credibility of credible intervals," Applied Cognitive Psychology, 19, 455-475 - full text.
Winman, A., P. Hansson & P. Juslin (2004), "Subjective probability intervals: How to reduce overconfidence by interval evaluation," Journal of Experimental Psychology: Learning, Memory, and Cognition, 30(6), 1167-1175.
Papers with New Evidence on Principles
This section lists papers that contain new evidence relevant to forecasting principles.
[To list your paper here, see instructions.]
- Wright, M. and MacRae, M. (2007). "Bias and variability in purchase intention scales", Journal of the Academy of Marketing Science, 35, 617-624. - Full Text
- J. Scott Armstrong, Kesten C. Green, Randall J. Jones, and Malcolm Wright (2008). "Predicting Elections From Politicians’ Faces" - Full Text
- Green, K. C. and Armstrong, J. S. (2007). "The Ombudsman: Value of Expertise for Forecasting Decisions in Conflicts", Interfaces, 37, 287-299. [Appears with an introduction by Paul Goodwin (pp. 285-286) and commentary by Shelly A. Kirkpatrick, Jonathon J. Koehler, and Philip E. Tetlock.] Description
- In a paper titled "The Impact of Institutional Change on Forecast Accuracy: A Case Study of Budget Forecasting in Washington State" (Full Text) (expanded summary), Elaine Deschamps (2004) explored the relationship between organizational change and forecast accuracy by analyzing the budget forecasting process in the state of Washington. Principles were tested on 180 budget forecasts produced before and after the creation of an independent forecasting agency. Deschamps found that forecast error decreased by 22% (from a MAPE of 6.8% to 5.3%) after the independent agency was established.
- Goodwin, P. (2002) "Integrating management judgment with statistical methods to improve short-term forecasts," Omega, 30, 127-135. Description
Working Papers with Evidence on Principles
Working papers are posted at this site, first, to establish a claim and, second, to obtain peer review. Please contact the authors directly with suggestions, for example to inform them of a relevant paper that was overlooked. Contact J. Scott Armstrong if you would like your working paper to be considered.
- J. Scott Armstrong and Kesten C. Green (2011), "Demand Forecasting: Evidence-based Methods," Working Paper Full Text
- Alfred G. Cuzán, J. Scott Armstrong, and Randall J. Jones, "Combining Methods to Forecast the 2004 Presidential Election: The Pollyvote" - Full Text
- Kesten C. Green (2003), "Forecasting Decisions in Conflicts: Analogy, Game Theory, Unaided Judgement, and Simulation compared," PhD Thesis at Victoria University of Wellington. - Full Text
- Thomas H. Tessier, "Conditional Lodging Sales Forecasts through 1980," MBA Thesis at the Wharton School, March 1974. - Full Text
Literature
- Aids for Searching
- Bibliographies and Indexes of Books and Journals
Techniques
- Tests for Comparing 3 or More Models [PDF 39kb] - provides tables for use when several comparisons are being made, e.g., when testing 4 methods (an illustrative sketch follows this list).
- Rule-based Forecasting [an outside site] - complete and corrected rule base to integrate domain knowledge and extrapolation methods.
- Damped Seasonal Factors [PDF 12kb] - a proposal for damped seasonal factors with results from a small test.
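As one generic illustration of comparing several methods at once, the sketch below applies a Friedman test to per-series errors. The choice of test, the method names, and the error values are assumptions made for illustration; they are not taken from the documents linked above.

    # Hypothetical illustration: comparing 4 forecasting methods across 6 series.
    # Each list holds one method's absolute percentage errors, one entry per series.
    from scipy.stats import friedmanchisquare

    errors_by_method = {
        "naive":        [12.1, 8.4, 15.0, 9.9, 11.3, 14.2],
        "holt":         [11.0, 8.8, 14.1, 9.5, 11.0, 13.5],
        "damped_trend": [10.5, 7.9, 13.2, 9.1, 10.8, 12.6],
        "combination":  [10.2, 7.5, 12.9, 8.8, 10.4, 12.1],
    }

    # The Friedman test ranks the methods within each series and asks whether the
    # average ranks differ by more than chance variation would allow.
    stat, p_value = friedmanchisquare(*errors_by_method.values())
    print(f"Friedman chi-square = {stat:.2f}, p = {p_value:.3f}")

If the test rejects, pairwise follow-up comparisons can then be used to identify which methods differ.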
Data Used in Published Papers
- M-competition Data contains monthly, quarterly, and annual series. These have been used as "wind-tunnel" data for testing extrapolation methods. The files include:
- the 1001 series used in the 1982 M-competition, data in excel format,
- the 29 series - series ending in 1987, series ending in 1988, series ending in 1989 (includes the validation data) used in the 1993 M2 competition, data in excel format, and
- the 3003 series used in the 1999 M3-competition (yearly series, quarterly series, monthly series and other series), data in excel format.
- M-competition results (for how the MAPEs below are computed, see the sketch after this list):
- M-competition MAPEs
- Makridakis, S. & Hibon, M. (2000). “The M3-competition: Results, conclusions and implications,” International Journal of Forecasting, 16, 451-476. Full text
- M3-competition MAPEs, all data by forecasting horizon
- M3-competition MAPEs, by periodicity by forecasting horizon
- Weatherhead II data (MS Excel, 196 KB), collected in 1995 by Monica Adya for her thesis Critical Issues in the Extension of Rule-Based Forecasting Systems at the Weatherhead School of Management at Case Western Reserve. It includes 489 annual time series from U.S. Statistical Abstracts, ValueLine, the INFORUM website, and The World Watch Institute. For details, contact Monica Adya.
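For readers new to the accuracy measure used in the results files above: MAPE is the mean absolute percentage error, averaged across series at each forecasting horizon. The sketch below is a generic illustration with made-up numbers; it is not the competition data or its official scoring code.

    # Hypothetical illustration: MAPE by forecasting horizon.
    # actuals[s][h] and forecasts[s][h] are the actual and forecast values for
    # series s at horizon h (invented numbers, not M-competition data).
    actuals   = [[100, 104, 110], [50, 48, 53]]
    forecasts = [[ 98, 107, 103], [52, 47, 50]]

    def mape_by_horizon(actuals, forecasts):
        """MAPE at horizon h = mean over series of |actual - forecast| / |actual| * 100."""
        horizons = len(actuals[0])
        mapes = []
        for h in range(horizons):
            errors = [abs(a[h] - f[h]) / abs(a[h]) * 100
                      for a, f in zip(actuals, forecasts)]
            mapes.append(sum(errors) / len(errors))
        return mapes

    print(mape_by_horizon(actuals, forecasts))  # one MAPE per horizon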
Peer Review
Seeking feedback on working papers? One objective of the site is to allow for rapid and open peer review of work on forecasting principles. Provide your working paper to gain comments.
IIF Discussion Group: a listserv that allows researchers to get advice from members of the International Institute of Forecasters.
Researchers Who Do Consulting
If you are also consulting, and if you are a member of the International Institute of Forecasters, you can add yourself to the list of consultants on forecasting (which appears on the Practitioners page).
Requests for Proposals
Funders can list RFPs here.