Principles for Examining Predictive Validity: The Case of Information Systems Spending Forecasts

Penn collection
Marketing Papers
Author
Collopy, Fred
Adya, Monica
Armstrong, J. Scott
Abstract

Research over two decades has advanced the knowledge of how to assess predictive validity. We believe this has value to information systems (IS) researchers. To demonstrate, we used a widely cited study of IS spending. In that study, price-adjusted diffusion models were proposed to explain and to forecast aggregate U.S. information systems spending. That study concluded that such models would produce more accurate forecasts than would simple linear trend extrapolation. However, one can argue that the validation procedure provided an advantage to the diffusion models. We reexamined the results using an alternative validation procedure based on three principles extracted from forecasting research: (1) use ex ante (out-of-sample) performance rather than the fit to the historical data, (2) use well-accepted models as a basis for comparison, and (3) use an adequate sample of forecasts. Validation using this alternative procedure did confirm the importance of the price adjustment, but simple trend extrapolations were found to be more accurate than the price-adjusted diffusion models.
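As an illustration of the three principles named in the abstract, the sketch below scores a simple linear trend extrapolation against a logistic diffusion curve using rolling-origin, ex ante forecasts. The spending series, the logistic functional form, the forecast horizon, and all function names are illustrative assumptions for this sketch only; they are not the data or model specifications used in the paper.

import numpy as np
from scipy.optimize import curve_fit

def linear_trend_forecast(history, horizon):
    # Benchmark model (principle 2): extrapolate a straight line fitted to the
    # calibration data.
    t = np.arange(len(history))
    slope, intercept = np.polyfit(t, history, 1)
    future_t = np.arange(len(history), len(history) + horizon)
    return intercept + slope * future_t

def diffusion_forecast(history, horizon):
    # Fit a logistic (S-shaped) diffusion curve and extrapolate it.
    def logistic(t, m, k, t0):
        return m / (1.0 + np.exp(-k * (t - t0)))
    t = np.arange(len(history))
    p0 = [history.max() * 2.0, 0.3, len(history) / 2.0]  # rough starting values
    params, _ = curve_fit(logistic, t, history, p0=p0, maxfev=10000)
    future_t = np.arange(len(history), len(history) + horizon)
    return logistic(future_t, *params)

def rolling_origin_mape(series, forecaster, min_obs=10, horizon=3):
    # Principles 1 and 3: score ex ante (out-of-sample) forecasts made from
    # many successive origins, so accuracy is judged on an adequate sample of
    # forecasts rather than on fit to the calibration data.
    pct_errors = []
    for origin in range(min_obs, len(series) - horizon + 1):
        history = series[:origin]
        actual = series[origin:origin + horizon]
        forecast = forecaster(history, horizon)
        pct_errors.extend(np.abs((actual - forecast) / actual) * 100.0)
    return float(np.mean(pct_errors))

# Hypothetical price-adjusted spending index (illustrative values only).
spending = np.array([3.1, 3.8, 4.7, 5.9, 7.2, 8.8, 10.4, 12.1,
                     13.9, 15.6, 17.1, 18.4, 19.5, 20.3, 21.0, 21.5])

print("Linear trend MAPE:    %.1f%%" % rolling_origin_mape(spending, linear_trend_forecast))
print("Diffusion model MAPE: %.1f%%" % rolling_origin_mape(spending, diffusion_forecast))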

Publication date
1994-06-01
Journal title
Information Systems Research
Volume number
5
Issue number
2
Comments
Postprint version. Published in Information Systems Research, Volume 5, Issue 2, June 1994, pages 170-179. Publisher URL: http://www.informs.org/site/ISR
The authors assert their right to include this material in ScholarlyCommons@Penn.
Recommended citation
Collopy, F., Adya, M., & Armstrong, J. S. (1994). Principles for Examining Predictive Validity: The Case of Information Systems Spending Forecasts. Information Systems Research, 5(2), 170-179.