The Good Judgment Project began in July 2011 in collaboration with the Aggregative Contingent Estimation (ACE) Program at IARPA (IARPA-ACE).[10] The first contest began in September 2011.[11] GJP was one of many entrants in the IARPA-ACE tournament, which posed roughly 100 to 150 questions each year on geopolitical events. The GJP research team recruited a large number of talented amateurs (rather than geopolitical subject-matter experts), gave them basic tutorials on forecasting best practices and on overcoming cognitive biases, and developed an aggregation algorithm to combine the forecasters' individual predictions.[5][12] GJP won both seasons of the contest and was 35% to 72% more accurate than any other research team.[13] From the summer of 2013 onward, GJP was the only research team IARPA-ACE was still funding, and GJP participants had access to the Integrated Conflict Early Warning System.[8]
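The passage above says only that GJP developed an aggregation algorithm; published GJP research describes one family of such methods as taking a (possibly weighted) mean of the individual probability forecasts and then "extremizing" it in odds space, to offset the way simple averaging pulls the aggregate toward 0.5. The sketch below is a minimal illustration of that idea, not the project's actual implementation; the function name, the uniform default weights, and the exponent `a = 2.5` are illustrative assumptions.

```python
# Minimal sketch of a GJP-style "extremized mean" aggregation:
# average the individual probabilities, then exponentiate the
# odds of that average to push it away from 0.5.
def extremized_mean(probs, weights=None, a=2.5):
    """Aggregate individual probabilities into one forecast.
    `a` is an assumed extremizing exponent, not a tuned value."""
    if weights is None:
        weights = [1.0] * len(probs)
    p_bar = sum(w * p for w, p in zip(weights, probs)) / sum(weights)
    # Clamp to avoid division by zero at p_bar = 0 or 1.
    p_bar = min(max(p_bar, 1e-6), 1 - 1e-6)
    odds = p_bar / (1.0 - p_bar)
    extremized_odds = odds ** a
    return extremized_odds / (1.0 + extremized_odds)

# Five forecasters all lean the same way: the raw mean is 0.70,
# but the extremized aggregate is noticeably more confident (~0.89).
print(extremized_mean([0.65, 0.70, 0.72, 0.68, 0.75]))
```

The intuition behind extremizing is that independent forecasters each hold only part of the available evidence, so when many of them lean the same way, the pooled evidence justifies a more extreme probability than their simple average.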
People
The co-leaders of the GJP include Philip Tetlock, Barbara Mellers, and Don Moore.[1] The website lists a total of about 30 team members, including the co-leaders as well as David Budescu, Lyle Ungar, Jonathan Baron, and prediction-markets entrepreneur Emile Servan-Schreiber.[14] The advisory board included Daniel Kahneman, Robert Jervis, J. Scott Armstrong, Michael Mauboussin, Carl Spetzler, and Justin Wolfers.[15] The study enlisted several thousand volunteer forecasters.[12] Using personality-trait tests, training methods, and team strategies, the GJP researchers were able to select participants who exhibited less cognitive bias than the average person; as the forecasting contest continued, they further winnowed these individuals into groups of so-called superforecasters. The last season of the GJP enlisted a total of 260 superforecasters.[citation needed]
Research
A significant amount of research has been conducted on the basis of the Good Judgment Project by the people involved with it.[16] The results show that combining statistical aggregation, psychological screening, forecaster training, and varying levels of interaction among individual forecasters consistently produced the most accurate forecasts for several years in a row.[12]
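Accuracy comparisons of this kind, including the 35% to 72% margin cited above, were reported in GJP research in terms of Brier scores: the mean squared difference between probability forecasts and binary outcomes, where lower is better. The snippet below is a minimal sketch of that metric; the example forecasts are invented for illustration.

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts and binary
    outcomes (0 = didn't happen, 1 = happened).
    0.0 is perfect; an always-50% forecaster scores 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# A confident, well-calibrated forecaster beats a hedging one.
print(brier_score([0.9, 0.8, 0.1], [1, 1, 0]))  # 0.02
print(brier_score([0.5, 0.5, 0.5], [1, 1, 0]))  # 0.25
```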
Good Judgment Inc.
A commercial spin-off of the Good Judgment Project began operating on the web in July 2015 under the name Good Judgment Inc. Its services include forecasts on questions of general interest, custom forecasts, and training in Good Judgment's forecasting techniques.[17] Since September 2015, Good Judgment Inc has run a public forecasting tournament at the Good Judgment Open site. Like the Good Judgment Project, Good Judgment Open poses questions about geopolitical and financial events, although it also includes questions about US politics, entertainment, and sports.[18][19]
Media coverage
GJP has repeatedly been discussed in The Economist.[11][20][21][22] It has also been covered in The New York Times,[3] The Washington Post,[5][23][24] and Co.Exist.[25] NPR aired a segment on the Good Judgment Project titled "So You Think You're Smarter Than a CIA Agent" on April 2, 2014.[9] The Financial Times published an article on the GJP on September 5, 2014,[26] and Washingtonian published an article mentioning the GJP on January 8, 2015.[27] The BBC and The Washington Post published further articles on the GJP on January 20, 21, and 29, 2015.[28][29][30]
^ Sunstein, Cass R.; Hastie, Reid (December 23, 2014). Wiser: Getting Beyond Groupthink to Make Groups Smarter. Harvard Business Review Press. ISBN 978-1-4221-2299-0.
^ Zweig, Jason. "Can You See the Future? Probably Better Than Professional Forecasters". The Wall Street Journal. Retrieved September 25, 2015. "I think Philip Tetlock's 'Superforecasting: The Art and Science of Prediction,' ..., is the most important book on decision making since Daniel Kahneman's 'Thinking, Fast and Slow.'"
^ Frick, Walter. "Question Certainty". Harvard Business Review. Retrieved September 26, 2015.