Can you help us improve the site by making a donation towards developing the functionality of the site's free Delphi forecasting software?

You may have noticed the new "Donate" tab in the banner at the top of the site. The Forecasting Principles site is provided as a public service for forecasters, researchers, teachers, students, and the consumers of forecasts. To provide this service, including the Special Interest Groups, we depend on sponsorship and advertising. Sponsorship was initially provided by The Wharton School. In more recent years, the International Institute of Forecasters has been and continues to be a generous sponsor and the site's key supporter.

The site's current revenue is not, however, sufficient to cover the cost of improving, extending, and developing forecasting support software tools such as the Delphi software. For example, users of the Delphi software have asked for the ability to modify questions, change reporting, and obtain more of the collected data in different formats.

Please check out the Donate page. Even a $10 donation would help us to keep improving the site for you.

Scott Armstrong
Kesten Green

The 2009 INFORMS Data Mining Contest is the second installment of a data mining contest that started last year in conjunction with the INFORMS conference. The contest again involves predictive problems in health care quality. The two tasks for this year's contest are: 1) modeling a guideline for transferring patients with a severe medical condition from a community hospital setting to a tertiary hospital provider, and 2) assessing the severity of a patient's condition and the risk of death.

Predictions must be submitted by September 25. You may want to consider giving a talk at the Contest session at the INFORMS conference; session acceptance will be based on successful submissions to the Contest. Contestants are encouraged to prepare a paper describing their methods and insights for consideration for publication in a Special Issue of a data mining journal (to be determined). See the Contest site for more information.

A new working paper by Scott Armstrong and Andreas Graefe describes findings on the use of an index-method model, based on 49 biographical variables, to predict the outcomes of 28 U.S. presidential elections. The model correctly predicted 25 of the 28 outcomes, and its out-of-sample forecasts were more accurate than forecasts from 12 benchmark models. The authors welcome peer review.

Wright and MacRae used a large-scale meta-analysis to identify bias and variability in the forecasts from such scales. They found that converting purchase intentions to linear probability scales or proportions resulted in unbiased forecasts. The same was true of 11-point probability scales, and these had lower dispersion of forecast errors. This result gives strong support to Principle 11.4, as scale-point adjustments were not required to obtain unbiased forecasts. It also supports Principle 8.4, as the use of the longer 11-point scale reduced forecast error, and Principle 8.7, as there was much greater variability in forecast errors for studies with small samples. The meta-analysis was restricted to existing products and services, and did not investigate accuracy for new products.

Does the common advice to "stand in the other person's shoes" lead to better forecasts? In a new paper, Kesten Green and Scott Armstrong describe evidence that it does not. The abstract and a link to the full text of the paper are available on the Papers page, in the section titled "Working papers by authors seeking reviews and advice". We encourage authors to take advantage of this opportunity to get early feedback on papers that are relevant to forecasting principles, and we encourage the wider forecasting community to keep up to date with the latest research.