When credentialed experts contradict each other, treat it as a map of genuine uncertainty — not a problem requiring you to pick a winner
When two credentialed experts contradict each other on the same question, treat their disagreement as a map of genuine uncertainty in the evidence base rather than as a problem requiring you to pick a winner.
Why This Is a Rule
Expert disagreement is diagnostic information about the state of evidence, not a deficiency in one of the experts. When two credentialed researchers reach opposite conclusions from the same evidence base, the disagreement reveals genuine ambiguity in the data — different but defensible interpretations of incomplete or complex evidence. Philip Tetlock's research on expert political judgment found that even top experts perform poorly on contested questions precisely because these questions involve genuine uncertainty that credentials alone cannot resolve.
The naive response — picking the expert who sounds more confident or whose conclusion you prefer — destroys the information contained in the disagreement. If you select one expert and ignore the other, you've converted genuine uncertainty into false certainty. Your beliefs now reflect a confidence level that the evidence doesn't support.
The sophisticated response is to treat the disagreement as a calibration signal: "The evidence on this question is genuinely uncertain. Both positions have merit. My confidence in either conclusion should be modest." This is uncomfortable but epistemically honest — and it protects you from overcommitting to a position that may be wrong.
When This Fires
- When reading competing meta-analyses or reviews that reach opposite conclusions
- When two advisors with genuine expertise give contradictory recommendations
- When following a scientific debate where qualified researchers disagree
- When making decisions under expert disagreement and feeling pressure to "just pick one"
Common Failure Mode
Selecting the expert who confirms your existing beliefs (confirmation bias applied to expert selection). When two experts disagree, your brain instinctively rates the confirming expert as "more credible" and the disconfirming one as "probably biased." This is authority-laundered confirmation bias — using the expert's credentials to justify a conclusion you'd already reached.
The Protocol
1. When two credentialed experts contradict each other, pause the urge to pick a winner.
2. Ask: "What does their disagreement tell me about the state of evidence?" Usually it means the evidence is genuinely ambiguous.
3. Map the disagreement: where exactly do they diverge? Is it on facts, interpretation, methodology, or values?
4. Calibrate your confidence downward: if experts disagree, your confidence in either position should be lower than if they agreed.
5. Make decisions appropriate to the uncertainty level: reversible decisions, hedged strategies, or explicit probability estimates rather than binary commitments.
6. Track the disagreement over time; expert consensus shifts as evidence accumulates, and the resolution often reveals which interpretive framework was correct.
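Steps (4) and (5) can be made concrete with a simple opinion pool. This is a minimal sketch, not part of the original protocol: the two probabilities, the equal weights, and the linear pooling rule are all illustrative assumptions, and the spread is only a crude proxy for how loosely to hold the pooled estimate.

```python
# Sketch: aggregating two conflicting expert probability estimates.
# Numbers and pooling rule are illustrative assumptions, not prescriptions.

def linear_pool(p_experts, weights=None):
    """Linear opinion pool: weighted average of expert probabilities."""
    if weights is None:
        weights = [1 / len(p_experts)] * len(p_experts)
    return sum(w * p for w, p in zip(weights, p_experts))

def disagreement_spread(p_experts):
    """Gap between the most extreme experts; a crude ambiguity signal."""
    return max(p_experts) - min(p_experts)

# Two credentialed experts answer the same question in opposite directions.
p_a, p_b = 0.85, 0.25  # A: "likely true"; B: "likely false" (hypothetical)

pooled = linear_pool([p_a, p_b])          # 0.55: near maximal uncertainty
spread = disagreement_spread([p_a, p_b])  # 0.60: large, so hold the pooled view loosely

print(f"pooled estimate: {pooled:.2f}, spread: {spread:.2f}")
```

A pooled estimate near 0.5 with a large spread is the quantitative version of "both positions have merit, so my confidence should be modest": act as if the question is close to a coin flip, and prefer reversible commitments until the spread narrows.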