Aggregating Large Sets of Probabilistic Forecasts by Weighted Coherent Adjustment

Article ID: iaor20116423
Volume: 8
Issue: 2
Start Page Number: 128
End Page Number: 144
Publication Date: Jun 2011
Journal: Decision Analysis
Authors: , , ,
Keywords: aggregation, combining forecasts, ordered weighted averaging
Abstract:

Probability forecasts in complex environments can benefit from combining the estimates of large groups of forecasters (‘judges’). But aggregating multiple opinions raises several challenges. First, human judges are notoriously incoherent when their forecasts involve logically complex events. Second, individual judges may have specialized knowledge, so different judges may produce forecasts for different events. Third, the credibility of individual judges may vary, and one would like to pay greater attention to more trustworthy forecasts. These considerations limit the value of simple aggregation methods such as unweighted linear averaging. In this paper, a new algorithm is proposed for combining probabilistic assessments from a large pool of judges, with the goal of efficiently implementing the coherent approximation principle (CAP) while weighting judges by their credibility. Two measures of a judge's likely credibility are introduced and used in the algorithm to determine the judge's weight in the aggregation. As a test of efficiency, the algorithm was applied to a data set of nearly half a million probability estimates of events related to the 2008 U.S. presidential election (∼16,000 judges). Compared with unweighted scalable CAP algorithms, the proposed weighting schemes significantly improved stochastic accuracy at comparable run time, demonstrating the efficiency and effectiveness of the weighting methods for aggregating large numbers and varieties of forecasts.
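The coherent approximation principle mentioned in the abstract amounts to projecting the judges' (possibly incoherent) reports onto the set of probability assignments that satisfy the probability axioms, with each report's influence scaled by that judge's credibility weight. The Python sketch below is a hedged illustration of this weighted projection on a toy event space, not the authors' algorithm or data: the events, credibility weights, and forecasts are hypothetical, and a general-purpose constrained optimizer stands in for the paper's scalable procedure.

```python
# Minimal sketch of credibility-weighted coherent adjustment (CAP-style),
# assuming a toy event space {A, B, A-and-B} and made-up judge reports.
import numpy as np
from scipy.optimize import minimize

EVENTS = ["A", "B", "A_and_B"]  # indices 0, 1, 2

# Each entry: (credibility weight, {event: reported probability}).
# Judges may report on only a subset of events; data are illustrative.
forecasts = [
    (2.0, {"A": 0.70, "A_and_B": 0.65}),   # incoherent-leaning report
    (1.0, {"A": 0.40, "B": 0.50}),
    (0.5, {"B": 0.90, "A_and_B": 0.10}),
]

def weighted_loss(p):
    """Credibility-weighted squared distance from p to the judges' reports."""
    loss = 0.0
    for weight, report in forecasts:
        for event, prob in report.items():
            loss += weight * (p[EVENTS.index(event)] - prob) ** 2
    return loss

# Coherence constraints (probability axioms) for this event space:
#   P(A & B) <= P(A),  P(A & B) <= P(B),  P(A) + P(B) - P(A & B) <= 1
constraints = [
    {"type": "ineq", "fun": lambda p: p[0] - p[2]},
    {"type": "ineq", "fun": lambda p: p[1] - p[2]},
    {"type": "ineq", "fun": lambda p: 1.0 - (p[0] + p[1] - p[2])},
]
bounds = [(0.0, 1.0)] * len(EVENTS)

result = minimize(weighted_loss, x0=np.full(len(EVENTS), 0.5),
                  method="SLSQP", bounds=bounds, constraints=constraints)
print(dict(zip(EVENTS, np.round(result.x, 3))))
```

The coherent aggregate produced this way pulls hardest toward the reports of highly weighted judges while still respecting the probability axioms; the paper's contribution is doing this at the scale of hundreds of thousands of estimates and choosing the weights from measures of each judge's likely credibility.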
