Item Response Models of Probability Judgments: Application to a Geopolitical Forecasting Tournament
Discipline
Management Sciences and Quantitative Methods
Subject
probability judgment
item response theory
scoring rules
continuous response model
Abstract
In this article, we develop and study methods for evaluating forecasters and forecasting questions in dynamic environments. These methods, based on item response models, are useful in situations where items vary in difficulty, and we wish to evaluate forecasters based on the difficulty of the items that they forecasted correctly. In addition, the methods are useful in situations where we need to compare forecasters who make predictions at different points in time or for different items. We first extend traditional models to handle subjective probabilities, and we then apply a specific model to geopolitical forecasts. We evaluate the model’s ability to accommodate the data, compare the model’s estimates of forecaster ability to estimates of forecaster ability based on scoring rules, and externally validate the model’s item estimates. We also highlight some shortcomings of the traditional models and discuss some further extensions. The analyses illustrate the models’ potential for widespread use in forecasting and subjective probability evaluation.
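The abstract contrasts ability estimates from item response models with estimates based on scoring rules. As a point of reference only, the sketch below (not drawn from the article itself) computes the quadratic (Brier) score, a standard scoring rule for probability forecasts, and the item-characteristic curve of a conventional two-parameter logistic IRT model, one of the "traditional" dichotomous models that the article extends to subjective probabilities. The function names and all numbers are illustrative assumptions, not values from the paper.

```python
import numpy as np

def brier_score(forecasts, outcomes):
    """Mean quadratic (Brier) score for binary events; lower is better, 0 is perfect."""
    forecasts = np.asarray(forecasts, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    return float(np.mean((forecasts - outcomes) ** 2))

def two_pl_prob(theta, a, b):
    """Two-parameter logistic IRT model: probability that a forecaster with
    ability theta answers an item of discrimination a and difficulty b correctly."""
    return 1.0 / (1.0 + np.exp(-a * (theta - b)))

# Hypothetical numbers for illustration only.
p = [0.9, 0.6, 0.2, 0.75]   # judged probabilities that each event occurs
y = [1, 1, 0, 0]            # realized outcomes
print(brier_score(p, y))                      # 0.193125
print(two_pl_prob(theta=1.0, a=1.2, b=0.5))   # approx. 0.646
```

In scoring-rule evaluation, every item contributes equally to a forecaster's average score; the IRT perspective instead lets correct, confident forecasts on harder items (larger b) carry more evidence about ability, which is the contrast the article develops.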