Update link to the functions that show how stars are calculated
- Now links to the folder where the files for the individual platforms are located and explains how to find the code. I haven't checked that the link works, so it may need to be a direct link to https://github.com/quantified-uncertainty/metaforecast/tree/master/src/backend/platforms
parent 40c6f57c11
commit 5c65cdf2f7
@@ -63,7 +63,7 @@ In general, if you want to integrate metaforecast into your service, we want to

 ## What are "stars" and how are they computed

-Star ratings—e.g. ★★★☆☆—are an indicator of the quality of an aggregate forecast for a question. These ratings currently try to reflect my own best judgment and the best judgment of forecasting experts I've asked, based on our collective experience forecasting on these platforms. Thus, stars have a strong subjective component which could be formalized and refined in the future. You can see the code used to decide how many stars to assign [here](./src/backend/utils/stars.js).
+Star ratings—e.g. ★★★☆☆—are an indicator of the quality of an aggregate forecast for a question. These ratings currently try to reflect my own best judgment and the best judgment of forecasting experts I've asked, based on our collective experience forecasting on these platforms. Thus, stars have a strong subjective component which could be formalized and refined in the future. You can see the code used to decide how many stars a forecast should get by looking at the function `calculateStars()` in the files for every platform [here](./src/backend/platforms).

 With regards the quality, I am most uncertain about Smarkets, Hypermind, Ladbrokes and WilliamHill, as I haven't used them as much. Also note that, whatever other redeeming features they might have, prediction markets rarely go above 95% or below 5%.
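For illustration, a per-platform `calculateStars()` function along these lines would encode that judgment in code. This is only a hypothetical sketch: the interface name, fields, and thresholds below are assumptions for the example, not the actual metaforecast implementation in `src/backend/platforms`.

```typescript
// Hypothetical sketch of a per-platform star calculation.
// The real functions live in the platform files under src/backend/platforms
// and encode platform-specific, partly subjective quality judgments.

interface StarInputs {
  numForecasts: number; // how many individual forecasts back the aggregate
  liquidity?: number; // for prediction markets: money at stake, if known
}

function calculateStars(inputs: StarInputs): number {
  let stars = 2; // assumed baseline for a typical platform
  if (inputs.numForecasts >= 100) stars += 1; // more forecasters, more signal
  if ((inputs.liquidity ?? 0) >= 1000) stars += 1; // deeper markets tend to be sharper
  return Math.min(stars, 5); // ratings are capped at five stars
}
```

Because each platform exports its own version of this function, the subjective quality judgments described above can differ per platform while the rest of the pipeline stays uniform.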