Frequently asked questions about thinking, epistemology, and cognitive tools. 193 answers
Treating calibration as a belief rather than as infrastructure. You read about superforecasters, you agree that overconfidence is a problem, you nod at Bayesian updating — and then you walk into Monday's meeting and make intuitive judgments without tracking, without base rates, without feedback.
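One way to make that infrastructure concrete is a prediction log you score later. A minimal sketch in Python, assuming a JSONL file and Brier scoring; the file name and fields are illustrative choices, not anything prescribed above:

```python
import json
from datetime import date

LOG = "predictions.jsonl"  # hypothetical file name

def record(question: str, p: float) -> None:
    """Log a forecast before the outcome is known; resolve it later by filling in 'outcome'."""
    entry = {"when": date.today().isoformat(), "question": question,
             "p": p, "outcome": None}  # outcome: 1 if it happened, 0 if not
    with open(LOG, "a") as f:
        f.write(json.dumps(entry) + "\n")

def brier_score() -> float:
    """Mean squared error over resolved forecasts: 0.0 is perfect calibration."""
    with open(LOG) as f:
        entries = [json.loads(line) for line in f]
    resolved = [e for e in entries if e["outcome"] is not None]
    if not resolved:
        raise ValueError("no resolved forecasts yet")
    return sum((e["p"] - e["outcome"]) ** 2 for e in resolved) / len(resolved)
```

Answering 0.5 on everything scores 0.25, so a score above 0.25 means your stated probabilities are doing worse than coin-flipping.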
Assuming meaning is inherent in information rather than constructed by context. This is the context-blind default: you read a number, hear a statement, or receive data, and you immediately assign meaning as if the meaning lives inside the information itself. It does not. The meaning lives in the context.
Learning about cultural differences as trivia — 'the Japanese bow, Indians eat with their hands' — without ever examining your own cultural operating system. The lesson isn't about cataloging other cultures. It's about seeing that you have a culture, that it shapes what you perceive as normal, and that it is invisible to you from the inside.
The most dangerous failure mode is not recognizing outdated information — it is treating all information as either timeless or expired, with no middle ground. Some people overcorrect by dismissing anything older than a year as irrelevant. Others never update at all and operate on knowledge from a previous era.
Assuming your reader shares your context by default. You'll know you're in this failure mode when someone responds to your message with unexpected hostility or confusion and your first thought is 'but it was obvious what I meant.' It was obvious to you. You had the context. They didn't.
Agreeing that 'systems matter' while still blaming individuals when something goes wrong in your own organization. The test isn't whether you can cite Deming in a meeting. It's whether, when a colleague underperforms, your first question is 'What about this system made this outcome likely?' rather than 'What is wrong with this person?'
Believing you're immune to social influence because you're 'independent-minded.' Asch's data is clear: 75% of people conform at least once, and the remaining 25% aren't immune — they just have higher thresholds. The most dangerous form of social conformity is the kind you can't see because it feels like your own independent judgment.
Knowing the history intellectually without encoding it into your decision-making infrastructure. Reading post-mortems without changing processes. Saying 'we learned from that' while preserving the exact conditions that caused it. Historical context only prevents repetition when it is embedded in the decision-making process itself, not just in memory.
Believing that because something is obvious to you, it must be obvious to your reader. This is the curse of knowledge operating in real time. You will catch yourself doing it most when you are busy, stressed, or communicating with people you know well — precisely the conditions where you are most prone to it.
Believing you can serve multiple contexts simultaneously without degradation. You will know this is happening when you feel productive — attending to many things at once — but the output in each context is shallow, reactive, and error-prone. The sensation of busyness is not the same as the reality of progress.
Evaluating past decisions using information you only acquired after the outcome. You'll know you're in this failure mode when your judgment of a decision changes based on what happened next rather than what was knowable at the time. The phrase 'I should have known' is almost always a signal that you are judging with information you didn't have at the time.
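A decision journal makes 'what was knowable at the time' checkable, because the inputs are frozen before the outcome arrives. A minimal sketch, assuming an append-only JSONL file; the field names are illustrative:

```python
import json
from datetime import datetime

JOURNAL = "decisions.jsonl"  # hypothetical file name

def log_decision(decision: str, known_facts: list[str],
                 expected: str, confidence: float) -> None:
    """Freeze what was knowable at decision time, before any outcome exists."""
    entry = {"at": datetime.now().isoformat(), "decision": decision,
             "known_facts": known_facts, "expected": expected,
             "confidence": confidence}
    with open(JOURNAL, "a") as f:
        f.write(json.dumps(entry) + "\n")

# Review rule: judge the decision against entry["known_facts"],
# never against what you learned after the fact.
```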
Performing externalization as transcription rather than construction. The most common failure is writing down what you already believe in polished form rather than actually constructing the chain step by step and discovering its structure as you write. Transcription produces a document that merely restates your conclusion; construction is what exposes the weak links in the chain.
Treating the act of writing the goal as the achievement itself. Writing 'lose 20 pounds' in a beautifully designed journal and never looking at it again is decoration, not externalization. The written goal must connect to a review loop — you revisit it, update it, and evaluate progress against it.
Listing only the assumptions you are already aware of — the safe, obvious ones. The assumptions that destroy plans are the ones so deeply embedded you mistake them for facts. If your assumption list feels comfortable, you haven't gone deep enough. The real practice is surfacing the assumptions you don't know you're making.
Treating the written list as a one-time exercise instead of a living document. You write your priorities once, feel the clarity, and never update them. Within two weeks the list is stale, your actual behavior has drifted, and you are back to reacting. The list only works if you revisit it — weekly at a minimum.
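The weekly cadence can be enforced rather than remembered. A tiny sketch, assuming you record the date of the last review; the one-week threshold comes straight from the item above:

```python
from datetime import date, timedelta

def is_stale(last_reviewed: date, max_age: timedelta = timedelta(days=7)) -> bool:
    """True if the priorities list has gone more than a week without review."""
    return date.today() - last_reviewed > max_age
```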
Describing your mental model in prose instead of drawing it. Writing a paragraph about what you believe preserves the linear, narrative structure that hides gaps. A paragraph can skip over missing relationships because language flows forward and the reader — including you — fills in the blanks. A diagram cannot: every relationship must either be drawn or be visibly absent.
Waiting until the blocker is fully understood before writing it down. This is the most common failure — the belief that you need to diagnose the problem before you can name it. But diagnosis follows naming, not the other way around. If you wait until you understand the blocker completely, you will never write it down at all.
Tracking only when you feel bad — which creates a dataset that confirms you always feel bad. Or tracking for two days, seeing no pattern, and concluding the practice doesn't work. Energy and mood patterns only emerge across a minimum of seven days. Anything shorter is noise you're mistaking for signal.
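The seven-day floor can be built into the tool itself, so a two-day log simply refuses to report a pattern. A sketch, assuming daily rows of (date, energy on a 1-5 scale); the scale is an assumption, the cutoff is the rule stated above:

```python
from datetime import date

MIN_DAYS = 7  # the stated minimum window; shorter runs are treated as noise

def weekday_pattern(rows: list[tuple[date, int]]) -> dict[str, float] | None:
    """Average energy by weekday, but only once enough distinct days are logged."""
    if len({d for d, _ in rows}) < MIN_DAYS:
        return None  # refuse to call two days of data a pattern
    by_day: dict[str, list[int]] = {}
    for d, energy in rows:
        by_day.setdefault(d.strftime("%A"), []).append(energy)
    return {day: sum(vals) / len(vals) for day, vals in by_day.items()}
```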
Confusing capture with learning. The most common failure mode is treating externalization as transcription — copying quotes, saving bookmarks, highlighting passages, filing articles into folders. This produces an archive of other people's thinking, not a record of your own learning. If your notes contain nothing in your own words (no questions, no objections, no connections), you are capturing, not learning.
Filtering feedback before you record it. You hear criticism, decide it was 'unfair' or 'they don't understand the context,' and don't write it down. Three months later, when someone else raises the same point, you treat it as new information instead of a confirmed pattern. The filter isn't protecting you; it's erasing the dataset before a pattern can appear.
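Recording before filtering is what makes repeats detectable. A minimal sketch, assuming each item gets a rough theme tag when logged; the tagging is an illustrative choice, not part of the practice as stated:

```python
from collections import Counter

feedback_log: list[dict] = []  # append everything, including the 'unfair' items

def record_feedback(source: str, text: str, theme: str) -> None:
    """Write it down first; judge it later."""
    feedback_log.append({"source": source, "text": text, "theme": theme})

def confirmed_patterns(min_sources: int = 2) -> list[str]:
    """Themes raised independently by multiple people are patterns, not noise."""
    counts = Counter()
    seen: set[tuple[str, str]] = set()
    for item in feedback_log:
        key = (item["theme"], item["source"])
        if key not in seen:  # count each source once per theme
            seen.add(key)
            counts[item["theme"]] += 1
    return [theme for theme, n in counts.items() if n >= min_sources]
```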
Turning failure analysis into self-punishment. The goal is not to catalogue everything wrong with you — it's to extract usable signal from an outcome that didn't work. If your failure log reads like a list of personal deficiencies rather than a set of causal observations, you've replaced analysis with self-punishment.
Building an elaborate tracking system that becomes its own project. The overhead of maintaining the tracker exceeds the value of the insight it produces. You stop updating it after a week, which feels like a failure, which makes you less likely to try again. The antidote is radical simplicity — a plain text file and one line per entry.
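Taken literally, that simplicity can be a single append per entry, with nothing else to maintain. A sketch; the file name and format are assumptions:

```python
from datetime import date

def track(note: str, path: str = "tracker.txt") -> None:
    """One line per entry: date, tab, note. Nothing else to maintain."""
    with open(path, "a") as f:
        f.write(f"{date.today().isoformat()}\t{note}\n")

track("slept badly, skipped deep work")
```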
Treating workspace design as aesthetics rather than cognition. You reorganize your desk to look clean, buy matching containers, post motivational quotes — and mistake the visual satisfaction for cognitive improvement. The test is not whether your environment looks good. The test is whether it reduces the cognitive load of the work you actually do in it.
Documenting your tools without documenting your processes. You write 'I use Obsidian for notes and Todoist for tasks' and call it system documentation. But tools are not systems. The system is the set of decisions, triggers, cadences, and rules that determine how information flows through those tools.
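One way to document the process rather than the tools is to write the rules down as data, independent of whichever apps implement them. A sketch, assuming a made-up structure; the fields and cadences are illustrative:

```python
# A process documented as data: triggers, cadences, and rules.
# The structure and field names are illustrative, not a standard.
system = {
    "capture": {"trigger": "any new task or idea arrives",
                "rule": "it goes into the inbox within a minute, whatever the tool"},
    "triage":  {"cadence": "daily",
                "rule": "empty the inbox: do it, schedule it, or delete it"},
    "review":  {"cadence": "weekly",
                "rule": "reconcile the priorities list with what actually happened"},
}
```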