Calibration: the alignment between your confidence and your accuracy, measured by the correspondence between stated confidence levels and actual outcome frequencies across multiple predictions
Why This Is a Definition
This definition precisely establishes the semantic boundary of 'calibration' by identifying its genus (alignment between confidence and accuracy) and differentia (measured through correspondence between stated confidence and actual outcomes). It distinguishes calibration from mere confidence or accuracy alone, and provides the operational definition needed for the lesson's practical exercises. The definition is consistent with the lesson's emphasis on quantified predictions, outcome tracking, and aggregate analysis.
Source Lessons
Calibration requires feedback
You cannot improve the alignment between your confidence and your accuracy without external data that reveals the gap between what you believed and what actually happened. Calibration without feedback is guesswork about guesswork.
Track your predictions
Recording what you expect to happen and comparing to what actually happens is the only reliable method for calibrating judgment. Without a written record, hindsight bias rewrites your memory of what you believed, making genuine learning from experience impossible.
Record your calibration over time
A log of predictions and outcomes shows you exactly where your stated confidence diverges from your actual hit rate.
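The aggregate analysis such a log enables can be sketched in a few lines: group logged predictions by stated confidence level, then compare each level to the frequency with which those predictions actually came true. This is a minimal sketch; the `log` data and all names are hypothetical, for illustration only.

```python
from collections import defaultdict

# Hypothetical prediction log: (stated confidence, whether it came true).
log = [
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
    (0.7, True), (0.7, False), (0.7, True), (0.7, False), (0.7, True),
    (0.5, True), (0.5, False), (0.5, False), (0.5, True),
]

# Group outcomes by stated confidence level.
buckets = defaultdict(list)
for confidence, came_true in log:
    buckets[confidence].append(came_true)

# Compare stated confidence with the actual frequency of correct outcomes.
for confidence in sorted(buckets):
    outcomes = buckets[confidence]
    hit_rate = sum(outcomes) / len(outcomes)
    gap = hit_rate - confidence
    print(f"stated {confidence:.0%} -> actual {hit_rate:.0%} (gap {gap:+.0%})")
```

A gap near zero at every confidence level means well-calibrated judgment; consistently negative gaps at high confidence levels reveal overconfidence, which a written record makes impossible for hindsight bias to hide.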
Humility is accurate calibration
True humility is not thinking less of yourself but having an accurate model of your capabilities.