Model Calibration

Are Our Predictions Actually Accurate?

Perfect calibration means a 70% confidence pick wins 70% of the time. See how we measure up across 531 settled predictions.

Settled Predictions: 531
Brier Score (lower is better): 0.2500
Confidence Buckets: 6

Predicted vs Actual Win Rate

[Chart: actual win rate plotted against predicted win rate, 0-100% on both axes, with a diagonal "Perfect" line marking perfect calibration]

Calibration by Confidence Band

Confidence Band | Predicted % | Actual % | Delta  | Sample Size
50-54%          | 52.0%       | 42.7%    | -9.3%  | 164
55-59%          | 57.0%       | 65.0%    | +8.0%  | 160
60-64%          | 62.0%       | 58.5%    | -3.5%  | 41
65-69%          | 67.0%       | 50.0%    | -17.0% | 30
70-74%          | 72.0%       | 61.0%    | -11.0% | 41
75-79%          | 77.0%       | 60.0%    | -17.0% | 95

What is calibration? A well-calibrated model's stated confidence matches reality. If we say 80% confidence, those picks should win about 80% of the time.
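As a rough illustration, the per-band numbers in the table above could be produced by grouping settled predictions into 5-point confidence bands and comparing the stated confidence with the observed win rate. This is a minimal sketch, not the site's actual pipeline: the (confidence, won) data shape is assumed, and the predicted value here is the mean stated confidence within each band, which is one common choice.

```python
from collections import defaultdict

def calibration_by_band(predictions):
    """Group (confidence, won) pairs into 5-point bands and compare
    the mean stated confidence with the observed win rate."""
    bands = defaultdict(list)
    for confidence, won in predictions:
        lower = (round(confidence * 100) // 5) * 5   # e.g. 0.72 -> 70 (the 70-74% band)
        bands[lower].append((confidence, won))

    rows = []
    for lower in sorted(bands):
        group = bands[lower]
        predicted = sum(c for c, _ in group) / len(group)  # mean stated confidence
        actual = sum(w for _, w in group) / len(group)     # observed win rate
        rows.append({
            "band": f"{lower}-{lower + 4}%",
            "predicted_pct": round(predicted * 100, 1),
            "actual_pct": round(actual * 100, 1),
            "delta_pct": round((actual - predicted) * 100, 1),
            "sample_size": len(group),
        })
    return rows

# Toy example: three picks in the 70-74% band, one of which lost.
print(calibration_by_band([(0.72, 1), (0.70, 0), (0.73, 1)]))
```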

What is the Brier Score? It measures the accuracy of probabilistic predictions: the mean squared difference between the stated probability and the outcome. A score of 0 is perfect; always predicting 50% (a coin flip) scores 0.25. Lower is better.
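A quick sketch of that calculation for binary outcomes (1 = win, 0 = loss); the function name and data shape are illustrative, not taken from the site:

```python
def brier_score(predictions):
    """Mean squared error between predicted probability and outcome (1 = win, 0 = loss)."""
    return sum((p - outcome) ** 2 for p, outcome in predictions) / len(predictions)

# Perfectly confident and correct picks score 0.0.
print(brier_score([(1.0, 1), (0.0, 0)]))             # 0.0
# Always saying 50% scores 0.25 regardless of outcomes -- the coin-flip baseline.
print(brier_score([(0.5, 1), (0.5, 0), (0.5, 1)]))   # 0.25
```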

The diagonal line on the chart represents perfect calibration. Points above the line mean we're underconfident (actual wins exceed predicted); points below mean we're overconfident.
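The same rule applies to the table's Delta column (actual minus predicted): a positive delta puts a band above the diagonal, a negative one below it. A toy illustration using two bands from the table above; the helper name is hypothetical:

```python
def confidence_label(predicted_pct, actual_pct):
    """Positive delta = underconfident (above the diagonal), negative = overconfident."""
    delta = actual_pct - predicted_pct
    if delta > 0:
        return "underconfident"
    if delta < 0:
        return "overconfident"
    return "well calibrated"

print(confidence_label(57.0, 65.0))  # underconfident (the 55-59% band)
print(confidence_label(77.0, 60.0))  # overconfident (the 75-79% band)
```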