Item Response Models of Probability Judgments: Application to a Geopolitical Forecasting Tournament

Penn collection
Marketing Papers
Subject
forecasting
probability judgment
item response theory
scoring rules
continuous response model
Applied Statistics
Business
Business Intelligence
Marketing
Author
Merkle, Edgar C
Steyvers, Mark
Mellers, Barbara A
Tetlock, Philip E
Abstract

In this article, we develop and study methods for evaluating forecasters and forecasting questions in dynamic environments. These methods, based on item response models, are useful in situations where items vary in difficulty, and we wish to evaluate forecasters based on the difficulty of the items that they forecasted correctly. In addition, the methods are useful in situations where we need to compare forecasters who make predictions at different points in time or for different items. We first extend traditional models to handle subjective probabilities, and we then apply a specific model to geopolitical forecasts. We evaluate the model’s ability to accommodate the data, compare the model’s estimates of forecaster ability to estimates of forecaster ability based on scoring rules, and externally validate the model’s item estimates. We also highlight some shortcomings of the traditional models and discuss some further extensions. The analyses illustrate the models’ potential for widespread use in forecasting and subjective probability evaluation.
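As a rough, hedged illustration of the two evaluation approaches contrasted in the abstract (not the paper's exact specification): a proper scoring rule such as the Brier score judges a probability forecast $p$ against a binary outcome $y \in \{0,1\}$ item by item, while an item-response-style model for probability judgments introduces item parameters so that forecaster ability is estimated relative to item difficulty. One generic form for logit-transformed judgments is

$$\text{Brier}(p, y) = (p - y)^2, \qquad \operatorname{logit}(p_{ij}) \sim \mathcal{N}\!\big(a_j(\theta_i - b_j),\ \sigma^2\big),$$

where $\theta_i$ is forecaster $i$'s ability and $b_j$ and $a_j$ are item $j$'s difficulty and discrimination. The second expression is only a sketch of a continuous-response-style item response model; the model developed in the article may differ in its exact parameterization.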

Publication date
2016-01-01