When you evaluate before you finish observing, your brain replaces incoming data with expected data. You stop seeing what is there and start seeing what you already believe.
You never perceive raw reality — your beliefs, expectations, and mood always color perception.
You unconsciously seek and emphasize evidence that confirms your existing beliefs.
Facts are observable events — stories are the narratives you construct around them.
When the same structure appears three or more times, treat it as a pattern worth naming — not a coincidence to dismiss.
Two things happening together does not mean one causes the other.
What you perceive is a construction, not a recording. Your brain generates a model of reality shaped by expectation, culture, and attention — and it feels like truth precisely because the construction is invisible to you.
Your brain does not fail randomly. It fails in a specific, measurable, predictable direction: too much confidence. Across decades of research, in every population tested, the dominant calibration error is overconfidence — believing you know more than you do, that your estimates are more precise than they are, and that your performance exceeds what it actually achieves.
Recording what you expect to happen and comparing to what actually happens is the only reliable method for calibrating judgment. Without a written record, hindsight bias rewrites your memory of what you believed, making genuine learning from experience impossible.
Your emotions do not add random noise to perception — they warp it in predictable, measurable directions. Anxiety inflates threats. Euphoria shrinks risks. Anger manufactures certainty. Once you know the direction of the distortion, you can correct for it.
Basic physiological states measurably alter what you perceive and how you evaluate it.
You overestimate the likelihood of events you can easily recall examples of. The availability heuristic substitutes the question "how frequent is this?" with the question "how easily can I think of an example?" — and the substitution happens below conscious awareness, which means you feel like you are reasoning about probability when you are actually reasoning about the vividness of your memory.
Recent events disproportionately influence your perception of what is normal or likely.
Statistical base rates predict outcomes better than compelling individual stories. Your brain will fight this truth every time a vivid narrative competes with a dry statistic — and your brain will be wrong.
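The pull of the vivid story over the base rate shows up clearly in a standard worked example. A sketch with illustrative numbers (the 1% base rate and the test's error rates are assumptions chosen for the arithmetic, not claims from the original):

```python
# Base-rate arithmetic: a "90% accurate" test for a condition
# that only 1% of people actually have.
base_rate = 0.01        # P(condition)
sensitivity = 0.90      # P(positive | condition)
false_positive = 0.09   # P(positive | no condition)

# Total probability of testing positive, with or without the condition.
p_positive = base_rate * sensitivity + (1 - base_rate) * false_positive

# Bayes' rule: how likely is the condition given a positive test?
p_condition_given_positive = base_rate * sensitivity / p_positive
print(f"{p_condition_given_positive:.1%}")  # roughly 9%, despite the "90% accurate" test
```

The vivid narrative ("the test came back positive") suggests near-certainty; the dry base rate keeps the true probability under one in ten.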
Update the strength of your beliefs proportionally to the strength of new evidence.
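"Proportional to the strength of the evidence" has a precise form in odds notation: posterior odds equal prior odds times the likelihood ratio. A minimal sketch of that rule (the function name and the example ratios are illustrative):

```python
# Belief updating in odds form: posterior odds = prior odds * likelihood ratio.
# The size of the update is exactly the strength of the evidence (the ratio).
def update(prior: float, likelihood_ratio: float) -> float:
    """Posterior probability after evidence with the given likelihood ratio."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * likelihood_ratio
    return posterior_odds / (1 + posterior_odds)

# Weak evidence (ratio 2) nudges a 50% belief to ~67%;
# strong evidence (ratio 20) drives it to ~95%.
print(round(update(0.5, 2), 3), round(update(0.5, 20), 3))
```

Weak evidence should move you a little, strong evidence a lot; the formula makes the proportion explicit instead of leaving it to gut feel.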
Everyone has specific recurring distortions — identify yours. Generic bias literacy is not enough. You need a personal bias profile: the particular set of systematic errors your brain commits most frequently, in the specific domains where those errors cost you the most.
What was true in one time period may not be true in another — always note the when.
Who you are with when you process information influences what you conclude.
The schemas you apply automatically without thinking are the hardest to examine.
Established schemas persist even when contradicted by evidence.
Writing down how two ideas relate prevents assuming a connection that does not exist.
Having trusted people review your mental models catches errors you miss.
Finding out your schema is wrong teaches you more than confirming it is right.
Every choice to do X is a choice not to do Y — consider what you give up.
Direct results and other people's reactions are both valuable but different types of feedback.
Reviewing key conditions before starting a task catches errors before they propagate.
Evolution built in a tendency to defer to authority — recognize when it activates.