Finance:Superforecaster

From HandWiki

A superforecaster is a person who makes forecasts that can be shown by statistical means to have been consistently more accurate than the general public or experts. Superforecasters sometimes use modern analytical and statistical methodologies to augment estimates of base rates of events; research finds that such forecasters are typically more accurate than experts in the field who do not use analytical and statistical techniques,[1] though this has been overstated in some sources.[2] The term "superforecaster" is a trademark of Good Judgment Inc.[3]

Etymology

The term is a combination of the prefix super, meaning "over and above"[4] or "of high grade or quality",[4] and forecaster, meaning one who predicts future outcomes.

History

The term is attributed to Philip E. Tetlock, following results from The Good Judgment Project and his subsequent book with Dan Gardner, Superforecasting: The Art and Science of Prediction.[5]

In December 2019 a Central Intelligence Agency analyst writing under the pseudonym "Bobby W." suggested that the intelligence community should study superforecaster research into how individuals with "particular traits" make better forecasts, and how such individuals could be leveraged.[6]

In February 2020 Dominic Cummings, agreeing with Tetlock and others, implied that studying superforecasting was more effective than listening to political pundits.[7]

Superforecasters

Science

Superforecasters estimate the probability of an occurrence and revise the estimate when the circumstances underlying it change. Their estimates draw on personal impressions, public data, and input from other superforecasters, while attempting to remove bias.[8] In The Good Judgment Project, one set of forecasters was given training on how to translate their understanding into a probabilistic forecast, summarised in the acronym "CHAMP": Comparisons, Historical trends, Average opinions, Mathematical models, and Predictable biases.[9]
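As an illustrative sketch (not taken from the source), combining several forecasters' probability estimates can be done by averaging in log-odds space; "extremizing" the pooled result, to compensate for the information forecasters share, is one aggregation technique studied in the Good Judgment Project literature. The function names and the extremizing factor below are hypothetical choices for the example:

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def inv_logit(x):
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

def pool_forecasts(probs, extremize=1.0):
    """Average individual probability forecasts in log-odds space.

    extremize > 1 pushes the pooled forecast away from 0.5,
    compensating for correlated information among forecasters.
    """
    mean_logit = sum(logit(p) for p in probs) / len(probs)
    return inv_logit(extremize * mean_logit)

# Three forecasters' estimates for the same event:
pooled = pool_forecasts([0.6, 0.7, 0.65])
extremized = pool_forecasts([0.6, 0.7, 0.65], extremize=2.0)
print(pooled, extremized)
```

Because all three inputs lean the same way, the extremized aggregate is pushed further from 0.5 than the plain log-odds average.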

A study published in 2021 used a Bias, Information, Noise (BIN) model to study the underlying processes enabling accuracy among superforecasters. The conclusion was that superforecasters' ability to filter out "noise" played a more significant role in improving accuracy than bias reduction or the efficient extraction of information.[10]

Effectiveness

In the Good Judgment Project, "the top forecasters... performed about 30 percent better than the average for intelligence community analysts who could read intercepts and other secret data".[11][12]

Training forecasters with specialised techniques may increase forecaster accuracy: in the Good Judgment Project, one group was given training in the "CHAMP" methodology, which appeared to increase forecasting accuracy.[9]

Due to their focus on probabilities rather than certainties, superforecasters are often misunderstood as having made a forecasting error when an event they assigned a less than 50% probability ends up happening. For example, the BBC notes that they "did not accurately predict Brexit", having given a 23% probability to a leave vote in the month of the June 2016 Brexit referendum, but goes on to say that they accurately predicted Donald Trump's success in the 2016 Republican Party primaries.[13]
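Probabilistic forecasts of this kind are typically evaluated with a proper scoring rule such as the Brier score, which was used in the Good Judgment Project. A minimal sketch (the function name is ours, for illustration) shows why a single 23% forecast for an event that happened is penalised but is not simply "wrong":

```python
def brier_score(forecasts):
    """Mean Brier score over (probability, outcome) pairs.

    outcome is 1 if the event happened, 0 otherwise;
    lower scores indicate more accurate forecasts.
    """
    return sum((p - o) ** 2 for p, o in forecasts) / len(forecasts)

# A 23% forecast for an event that happened scores (0.23 - 1)^2 = 0.5929:
# a penalty, but smaller than a confident 5% forecast would have received,
# and skill is judged over many questions, not a single miss.
print(brier_score([(0.23, 1)]))
```

Over many questions, consistently lower mean Brier scores are what distinguish superforecasters from other forecasters, not getting every individual event "right".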

Superforecasters also made a number of accurate and important forecasts about the coronavirus pandemic, which "businesses, governments and other institutions" have drawn upon. In addition, they have made "accurate predictions about world events like the approval of the United Kingdom’s Brexit vote in 2020, Saudi Arabia’s decision to partially take its national gas company public in 2019, and the status of Russia’s food embargo against some European countries also in 2019".[14]

Aid agencies are also using superforecasting to determine the probability of droughts becoming famines,[1] while the Center for a New American Security has described how superforecasters aided them in predicting future Colombian government policy.[15] Goldman Sachs drew upon superforecasters' vaccine forecasts during the coronavirus pandemic to inform their analyses.[16]

The Economist notes that in October 2021, Superforecasters accurately predicted events that occurred in 2022, including "election results in France and Brazil; the lack of a Winter Olympics boycott; the outcome of America's midterm elections, and that global Covid-19 vaccinations would reach 12bn doses in mid-2022". However, they did not forecast the emergence of the Omicron variant.[17] The following year, The Economist wrote that all eight of the Superforecasters’ predictions for 2023 were correct, including on global GDP growth, Chinese GDP growth, and election results in Nigeria and Turkey.[18]

In February 2023, Superforecasters made better forecasts than readers of the Financial Times on eight out of nine questions that were resolved at the end of the year.[19] In July 2024, the Financial Times reported that Superforecasters "have consistently outperformed financial markets in predicting the Fed's next move".[20] In February 2025, the Financial Times reported that "superforecasters continue to have the edge over the futures market in anticipating what the FOMC [Federal Open Market Committee] will do."[21]

Traits

One of Tetlock's findings from the Good Judgment Project was that cognitive and personality traits mattered more than specialised knowledge when it came to predicting the outcome of various world events, typically more accurately than intelligence agencies.[22] In particular, a 2015 study found that key predictors of forecasting accuracy were "cognitive ability [IQ], political knowledge, and open-mindedness".[23] Superforecasters "were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness". In the Good Judgment Project, the superforecasters "scored higher on both intelligence and political knowledge than the already well-above-average group of forecasters" taking part in the tournament.[24]

People

  • Regina Joseph, Good Judgment Project superforecaster,[25][26] technologist and founding Editor-in-Chief of Blender Magazine,[27][28][29] former Futures Division leader and Defence/Security Senior Research Fellow at Clingendael Institute,[30] forecasting science researcher[31][32] and inventor[33][34]
  • Elaine Rich, a superforecaster who participated in the Good Judgment Project.[35]
  • Andrew Sabisky, who resigned from his position as advisor to the United Kingdom government at Downing Street, with chief advisor Dominic Cummings telling journalists "read Philip Tetlock's Superforecasters, instead of political pundits who don't know what they're talking about".[7]
  • Nate Silver, a superforecaster on baseball, basketball, and elections.[36]
  • Nick Hare, former head of futures and analytical methods at the Ministry of Defence (MoD).[22]
  • Reed Roberts, a former PhD student in Chemistry.[22]
  • Jonathon Kitson[37]
  • Jean-Pierre Beugoms[38]
  • Dan Mayland[38]
  • Kjirste Morrell[38]
  • Dominic Smith[38]

Criticism

The concept of superforecasting has been criticised from multiple angles. Nassim Nicholas Taleb has been a particularly strong critic, arguing among other claims that forecasting is not useful to decision makers and that the lack of financial gain accrued by superforecasters is a sign that their actual predictive powers are lacking.[39] Counter-terrorism expert Suzanne Raine criticises it for placing too much emphasis on "what is going to happen" rather than "what is happening" and "how can the future be changed".[40]

References

  1. 1.0 1.1 Adonis (2020).
  2. "Can Policymakers Trust Forecasters?". https://ifp.org/can-policymakers-trust-forecasters/#asking-the-experts. 
  3. "Trademark Electronic Search System (TESS)". https://tmsearch.uspto.gov/bin/gate.exe?f=doc&state=4806:vyqtyx.9.1. 
  4. 4.0 4.1 "Super Definition & Meaning". https://www.merriam-webster.com/dictionary/super. 
  5. Tetlock & Gardner (2015).
  6. Bobby W. (2019), p. 14.
  7. 7.0 7.1 BBC News (2020).
  8. BBC News (2020), What is the science behind it?.
  9. 9.0 9.1 Harford (2014), How to be a superforecaster.
  10. Satopää, Ville A.; Salikhov, Marat; Tetlock, Philip E.; Mellers, Barbara (2021). "Bias, Information, Noise: The BIN Model of Forecasting". Management Science 67 (12): 7599–7618. doi:10.1287/mnsc.2020.3882. https://pubsonline.informs.org/doi/abs/10.1287/mnsc.2020.3882. 
  11. David Ignatius. "More chatter than needed". The Washington Post. 1 November 2013.
  12. Horowitz MC, Ciocca J, Kahn L, Ruhl C. "Keeping Score: A New Approach to Geopolitical Forecasting" (PDF). Perry World House, University of Pennsylvania. 2021, p.9.
  13. BBC News (2020), How successful is it?.
  14. Tara Law. "'Superforecasters' Are Making Eerily Accurate Predictions About COVID-19. Our Leaders Could Learn From Their Approach." TIME. 11 June 2020.
  15. Cochran KM, Tozzi G. "Getting it Righter, Faster: The Role of Prediction in Agile Government Decisionmaking". Center for a New American Security. 2017.
  16. Hatzius J, Struyven D, Bhushan S, Milo D. "V(accine)-Shaped Recovery". Goldman Sachs Economics Research. 7 November 2020.
  17. What the “superforecasters” predict for major events in 2023. The Economist. 18 November 2022
  18. What the “superforecasters” predict for major events in 2024. The Economist. 13 November 2023
  19. The art of superforecasting: how FT readers fared against the experts in 2023. Financial Times. 26 December 2023
  20. Alternative data: Can superforecasters beat the market?. Financial Times. 18 July 2024
  21. Superforecasters continue to beat the market. Financial Times. 20 February 2025.
  22. 22.0 22.1 22.2 Burton (2015).
  23. Mellers B, Stone E, Atanasov P, Rohrbaugh N, Metz SE, Ungar L, et al. "The psychology of intelligence analysis: drivers of prediction accuracy in world politics" (PDF). Journal of Experimental Psychology: Applied. 2015;21(1):1-14.
  24. Mellers B, Stone E, Atanasov P, Rohrbaugh N, Metz SE, Ungar L, et al. "The psychology of intelligence analysis: drivers of prediction accuracy in world politics" (PDF). Journal of Experimental Psychology: Applied. 2015;21(1):1-14.
  25. (in en) Superforecasting: The Art and Science of Prediction. Crown. 2015. ISBN 9780804136693. 
  26. VICE News (2017-05-19). Chechnya Abuse & The FBI Firing: VICE News Tonight Full Episode (HBO). Retrieved 2024-08-27 – via YouTube.
  27. LLC, New York Media (1995-11-13) (in en). New York Magazine. New York Media, LLC. https://books.google.com/books?id=beQCAAAAMBAJ&q=High+Tech+Boom+Town. 
  28. WHO KNEW (2021-04-27). WHO KNEW The Smartest People In The Room - Regina Joseph & David Hughes. Retrieved 2024-08-27 – via YouTube.
  29. "Blender (magazine)" (in en), Wikipedia, 2024-08-27, https://en.m.wikipedia.org/wiki/Blender_(magazine), retrieved 2024-08-27 
  30. Joseph, Regina. "Clingendael Futures". https://www.clingendael.org/sites/default/files/2016-02/Transnational%20Crime%20and%20Southern%20Europe.pdf. 
  31. International Institute of Forecasters (2022-08-15). Forecasting Practices and Processes 5. Retrieved 2024-08-27 – via YouTube.
  32. NSF PREPARE (2022-01-18). RP2 Day 2 Lightning Round 7: Social, Behavioral, Economic & Governance. Retrieved 2024-08-27 – via YouTube.
  33. USPTO.report. "Systems and Methods for Bias-Sensitive Crowd-Sourced Analytics Patent Application" (in en). https://uspto.report/patent/app/20170309193. 
  34. USPTO.report. "Systems and Methods for Multi-Source Reference Class Identification, Base Rate Calculation, and Prediction Patent Application" (in en). https://uspto.report/patent/app/20210034651. 
  35. Nilaya (2015), Guests.
  36. Stieb, Matt (May 19, 2023). "A Brutal Wonk Swap at FiveThirtyEight" (in en-us). https://nymag.com/intelligencer/2023/05/fivethirtyeight-hires-g-elliott-morris-loses-nate-silver.html. 
  37. "Superforecasting: The Future's Chequered Past and Present" (in en). 8 February 2021. https://whynow.co.uk/read/superforecasting-the-futures-chequered-past-and-present/. 
  38. 38.0 38.1 38.2 38.3 "Superforecaster Profiles" (in en-US). https://goodjudgment.com/about/our-team/superforecaster-profiles/. 
  39. Taleb, Nassim Nicholas; Richman, Ronald; Carreira, Marcos; Sharpe, James (1 April 2023). "The probability conflation: A reply to Tetlock et al.". International Journal of Forecasting 39 (2): 1026–1029. doi:10.1016/j.ijforecast.2023.01.005. 
  40. Raine, Suzanne. "Superforecasting will not save us". https://engelsbergideas.com/essays/superforecasting-will-not-save-us/. 

Further reading

  • Tetlock, Philip E.; Gardner, Dan (2015). Superforecasting: The Art and Science of Prediction. New York: Crown Publishers. ISBN 9780804136693. OCLC 898909721.