Test workflow shareability by having an unfamiliar person execute it without your help: every stumble reveals tacit knowledge you failed to externalize.
Why This Is a Rule
The curse of knowledge makes self-assessment of documentation quality unreliable. You wrote the workflow, so you understand it — your brain automatically fills in every unstated assumption, implicit step, and undefined term. Self-review feels like testing but actually confirms nothing because the tester (you) shares all the tacit knowledge of the author (also you). The only reliable test for shareability is execution by someone who doesn't share your knowledge.
Every stumble, question, pause, wrong turn, and confused expression during an unfamiliar person's execution is a data point about tacit knowledge that hasn't been externalized. "Where do I find the template?" = the location wasn't specified. "What does 'review' mean here?" = the review criteria weren't defined. "Do I do this in the browser or the desktop app?" = the tool context wasn't clarified. Each stumble is a documentation gap that would have been invisible in self-review.
This is essentially usability testing applied to workflow documentation. Just as software usability testing requires watching actual users (not designers) attempt tasks, workflow shareability testing requires watching actual executors (not the author) attempt the workflow. The methodology is the same: observe silently, don't help, record every stumble, fix the documentation, re-test.
When This Fires
- After documenting a workflow you intend to share with others
- When shared workflows produce frequent questions from executors
- When your "clear" documentation keeps requiring verbal explanations to supplement it
- Complements the competent-stranger test for workflow steps ("could someone complete this step with zero clarifying questions? If not, it is not yet atomic") by replacing the thought experiment with an actual testing protocol
Common Failure Mode
Helping during the test: the unfamiliar person stumbles, and you immediately explain. This feels helpful but defeats the test: you have just transferred the tacit knowledge verbally instead of documenting it. The documentation remains insufficient, and the next executor will stumble at the same point without you there to explain.
The Protocol
1. Find someone unfamiliar with the workflow. They should have general competence in the domain but no specific knowledge of your process.
2. Ask them to execute the workflow using only the documentation. Critically: do not help. Do not explain, clarify, or supplement. Watch silently.
3. Record every stumble: where did they pause? What questions did they ask (even if you didn't answer)? Where did they go wrong? Where did they succeed but take a different path than intended?
4. After the test, review the stumble list. Each stumble is a documentation gap. Update the workflow to address every single one.
5. Re-test with a different unfamiliar person. Repeat until someone can execute the entire workflow without stumbling. Two clean runs = shareable.
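The re-test loop and its exit criterion can be sketched in code. This is a minimal illustration with hypothetical names (`TestRun`, `shareable` are not from the source); it assumes "two clean runs" means two consecutive stumble-free runs by different testers.

```python
from dataclasses import dataclass, field

@dataclass
class TestRun:
    """One shareability test session; each stumble is a documentation gap."""
    tester: str
    stumbles: list[str] = field(default_factory=list)

    @property
    def clean(self) -> bool:
        return not self.stumbles

def shareable(runs: list[TestRun]) -> bool:
    """Two consecutive clean runs by different testers mark the workflow shareable."""
    if len(runs) < 2:
        return False
    prev, last = runs[-2], runs[-1]
    return prev.clean and last.clean and prev.tester != last.tester

runs = [
    TestRun("alice", ["Where do I find the template?"]),  # gap: location unspecified
    TestRun("bob"),    # clean run after the documentation was fixed
    TestRun("carol"),  # second consecutive clean run
]
print(shareable(runs))  # → True
```

The point of modeling it this way is that a run with any stumble resets the count: fixing the documentation and re-testing is mandatory, not optional.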