Core Primitive
Documenting workflows well enough to share them multiplies their value. A workflow that lives only in your head dies with your attention. A workflow shared becomes a reusable asset — for your team, your community, and your future self.
Your best workflow is worthless if it lives only in your head
In The workflow review, you learned to step back and review your entire portfolio of workflows — to retire the ones that no longer serve you, to improve the ones that do, and to notice the gaps where a workflow should exist but doesn't. That review produced a clearer, leaner set of personal processes. But there is a limitation built into every lesson in this phase so far, and it is time to name it: everything you have built is yours alone. Your workflows live in your notes, in your habits, in the muscle memory of your hands and the pattern recognition of your trained attention. They serve you well. They serve no one else at all.
This lesson is about changing that. Not because altruism demands it — though there is a gift dimension we will explore — but because the act of making a workflow shareable transforms the workflow itself. Sharing is not the final step after building. It is an additional design constraint that forces clarity, eliminates hidden assumptions, and invites improvement from perspectives you do not have. A workflow you keep to yourself can afford to be vague in places, because your tacit knowledge fills the gaps. A workflow you share cannot afford vagueness anywhere, because the recipient does not have your tacit knowledge. The discipline of closing those gaps — of converting every implicit assumption into an explicit instruction — is one of the most powerful refinement tools available to you.
The progression across this phase tells the story. You recognized that repeatable activities are workflows (A workflow is a repeatable sequence of steps). You documented them (Document your workflows). You identified their triggers, steps, checkpoints, inputs, outputs, and handoff points. You learned to measure, iterate, and compose them. You built context-dependent variants and organized them into a library. You reviewed the whole portfolio. Now you take the final step before the capstone: you make your workflows legible to someone who is not you.
The knowledge that cannot be spoken — until it can
Ikujiro Nonaka and Hirotaka Takeuchi, in "The Knowledge-Creating Company" (1995), drew a distinction that is foundational to understanding why sharing workflows is both difficult and valuable. They identified two types of knowledge that organizations rely on. Explicit knowledge is knowledge that can be codified, written down, and transmitted through language: formulas, procedures, specifications, documented policies. Tacit knowledge is knowledge that resists codification — the intuitions, skills, judgment calls, and embodied expertise that people possess but cannot easily articulate. A master carpenter knows how the wood should feel under the plane. A veteran salesperson knows when a prospect is about to say yes. A seasoned engineer knows that this particular codebase tends to break in a specific way when that particular subsystem is modified.
Nonaka and Takeuchi argued that organizational innovation depends on the continuous conversion between tacit and explicit knowledge. They described four modes of knowledge conversion: socialization (tacit to tacit — learning by watching and doing alongside an expert), externalization (tacit to explicit — articulating what you know into shareable form), combination (explicit to explicit — assembling documented knowledge into new frameworks), and internalization (explicit to tacit — learning from documentation until it becomes intuitive skill).
Workflow sharing is, fundamentally, an act of externalization. Your refined workflow contains enormous amounts of tacit knowledge: the judgment calls you make at decision points, the shortcuts you take because you know which steps can be safely compressed, the failure patterns you watch for because you have encountered them before, the environmental conditions you unconsciously ensure before starting. When you share the workflow, you must externalize all of this. You must take the knowledge that exists as felt sense, as trained instinct, as embodied competence, and render it in language precise enough that someone without your history can follow it.
This is hard. It is supposed to be hard. The difficulty is not a sign that you are doing it wrong. It is the mechanism by which the workflow improves. Every piece of tacit knowledge you successfully externalize becomes a permanent, transmissible asset. Every piece you fail to externalize reveals a gap in your own understanding — a place where you are operating on autopilot without fully knowing why. The act of sharing holds a mirror to your own expertise and shows you both what you know and what you merely do.
The bus factor: why unshared knowledge is organizational debt
Software teams use a grim metric called the bus factor: the number of people who would need to be hit by a bus before the project becomes unrecoverable. A project where only one person understands the deployment process, the database schema, and the monitoring infrastructure has a bus factor of one. If that person leaves — for any reason, not just the morbid scenario — the project is in crisis.
The bus factor is not limited to software teams. It applies to any system where critical knowledge is concentrated in a single person. A family where only one parent knows how the household finances work. A small business where only the founder knows the key client relationships. A research lab where only one postdoc knows how to calibrate the primary instrument. In each case, the knowledge is real, the processes work, and the system functions — as long as that one person is present and available. The moment they are not, the system reveals its fragility.
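The bus-factor idea is concrete enough to sketch in code. The following is an illustrative toy model, not a real metric implementation: the team data and area names are hypothetical, and the proxy simply treats an area known by one person as a single point of failure.

```python
# Illustrative sketch: estimate a bus-factor proxy from a (hypothetical)
# map of critical knowledge areas to the people who can cover them.
# An area's resilience is the number of people who understand it;
# the system's bus factor is the minimum across all critical areas.

def bus_factor(knowledge_map):
    """Return the smallest number of knowers across all critical areas."""
    return min(len(knowers) for knowers in knowledge_map.values())

def single_points_of_failure(knowledge_map):
    """Return the areas that only one person understands."""
    return [area for area, knowers in knowledge_map.items()
            if len(knowers) == 1]

team = {
    "deployment process": {"alice"},
    "database schema": {"alice", "bob"},
    "monitoring setup": {"bob", "carol"},
}

print(bus_factor(team))                # 1: losing alice stalls deployment
print(single_points_of_failure(team))  # ['deployment process']
```

Sharing a workflow is, in this model, simply adding names to the set of knowers for an area until no area has a count of one.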
Unshared workflows are organizational debt. They function today, but they carry a hidden liability that compounds over time. The longer a workflow remains unshared, the more tacit knowledge accumulates around it, the harder it becomes to externalize, and the more catastrophic the loss when the sole practitioner becomes unavailable. Sharing your workflows is not just generous. It is structurally responsible. It converts concentrated, fragile knowledge into distributed, resilient knowledge.
Institutions that survive leadership transitions understand this intuitively. The military's obsession with Standard Operating Procedures — the SOPs that govern everything from equipment maintenance to combat operations — is not bureaucratic compulsion. It is institutional memory. An SOP ensures that when a platoon sergeant rotates out, the incoming sergeant inherits not just a title but a complete, documented set of workflows that the unit depends on. The SOP does not replace expertise. It preserves enough operational knowledge that expertise can be rebuilt rather than reinvented from scratch.
Your personal and professional workflows deserve the same treatment. Not because you are running a military unit, but because the principle is identical: documented, shareable workflows survive the inevitable transitions — job changes, team restructuring, shifting responsibilities, periods of absence — that undocumented workflows do not.
Teaching is learning: the protege effect
There is a selfish reason to share your workflows, and it is one of the best-documented phenomena in educational research. When you teach something to someone else, you learn it better yourself. This is not folk wisdom. It is an empirical finding with a name: the protege effect.
Chase, Chin, Oppezzo, and Schwartz (2009) demonstrated that students who prepared to teach material to others — and especially those who actually taught it — showed deeper understanding and better retention than students who prepared only for their own test. The mechanism is straightforward but powerful. When you prepare to teach, you engage in what the researchers call generative processing: you must organize the material, identify the key concepts, anticipate questions, and fill gaps in your own understanding that you might otherwise never notice. Teaching requires you to see the material from someone else's perspective, which forces you to examine your own knowledge structure for coherence and completeness.
This applies directly to workflow sharing. When you document a workflow for yourself, you can be sloppy. You know what you mean. You can leave steps implicit because your own experience fills them in. But when you document a workflow for someone else — when you prepare to teach it — you discover the places where your understanding is actually incomplete. You find the step that you described as "check the results" but that actually involves a complex judgment call you have never articulated. You find the decision point where you wrote "use your judgment" but where a newcomer would have no basis for judgment because you never explained the criteria. You find the assumption you embedded so deeply that you forgot it was an assumption — like the fact that the workflow only works if you run it on Monday mornings because that is when the data refreshes, a detail so obvious to you that you never wrote it down.
The protege effect means that sharing your workflow makes you better at your workflow. The externalization process forces a level of self-examination that solo practice, no matter how reflective, cannot match. You are not just giving your knowledge away. You are refining it through the act of transmission.
What makes a workflow shareable
Not all documentation is equally shareable. You can write a workflow document that makes perfect sense to you and is utterly useless to anyone else. The difference between personal documentation and shareable documentation lies in three properties that must be deliberately designed in.
Context-independence. A shareable workflow does not assume your specific tools, your specific environment, or your specific organizational structure. It either uses tools that are widely available or explicitly names the tools it requires and suggests alternatives. "Open the spreadsheet" is not context-independent. "Open a spreadsheet application (Google Sheets, Excel, or equivalent)" is. "Run the deployment script" assumes the reader has the script, knows where it lives, and understands its prerequisites. A shareable version names the script, states its location, lists its dependencies, and explains what it does at each stage so that someone using a different toolchain can adapt.
This does not mean stripping out all specificity. Your workflow uses specific tools for good reasons, and those reasons are valuable to share. The goal is to separate the essential logic of the workflow — the sequence of steps, the decision points, the quality criteria — from the incidental implementation details that are specific to your setup. A reader should be able to follow the logic even if they swap out every tool you named.
Clarity for newcomers. A shareable workflow can be followed by someone who has never performed this task before. This is the hardest criterion to meet because it requires you to overcome the curse of knowledge — the cognitive bias, documented by Camerer, Loewenstein, and Weber (1989), in which people who know something find it nearly impossible to imagine what it is like to not know it. You know that step three requires a specific mental model of how the system works, so you skip the explanation. The newcomer arrives at step three with no mental model and no way to build one from what you have written.
The antidote is user testing. Have someone unfamiliar with the workflow attempt to follow your documentation. Every place they pause, ask a question, or make a wrong turn is a place where the curse of knowledge produced a gap. Close the gap. Repeat until a newcomer can execute the workflow successfully without asking you for help. This is the gold standard of shareable documentation, and very few people achieve it on the first draft.
Adaptability through marked customization points. A shareable workflow explicitly identifies which steps are fixed and which are customizable. The fixed steps are the ones where deviation would break the workflow or compromise its output. The customizable steps are the ones where the reader can substitute their own tools, preferences, or context without affecting the result. Marking these explicitly — "This step uses Notion, but any outlining tool works" or "The fifteen-minute timer is calibrated for a 1,500-word document; adjust proportionally for different lengths" — turns a rigid procedure into an adaptable framework. The reader does not have to guess which parts are load-bearing and which are cosmetic.
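The three properties can be made mechanical by building them into the document's structure. As a sketch (the step names, tools, and format here are hypothetical examples, not a prescribed schema), a workflow can mark each step as fixed or customizable, so a reader can query the load-bearing parts directly:

```python
# Illustrative sketch: a workflow representation that marks each step
# as fixed (load-bearing) or customizable, with notes on substitutes.
# Step names and tools below are hypothetical examples.

from dataclasses import dataclass

@dataclass
class Step:
    action: str
    fixed: bool       # True: deviation breaks the workflow or its output
    note: str = ""    # for customizable steps: known alternatives

workflow = [
    Step("Draft the outline in Notion", fixed=False,
         note="Any outlining tool works (plain text is fine)."),
    Step("Run a fifteen-minute timed writing sprint", fixed=False,
         note="Calibrated for ~1,500 words; scale proportionally."),
    Step("Read the full draft aloud before editing", fixed=True),
]

def customization_points(steps):
    """List the steps a reader may safely adapt to their own context."""
    return [(s.action, s.note) for s in steps if not s.fixed]

for action, note in customization_points(workflow):
    print(f"{action} -- {note}")
```

The same distinction works in plain prose: the point is not the data structure but the habit of stating, for every step, whether it is load-bearing or cosmetic.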
The sharing spectrum: from private to published
Sharing is not binary. There is a spectrum of exposure, and each level on the spectrum serves a different purpose and demands a different level of polish.
At the near end is the private note — a workflow documented well enough for your future self to follow. This is where Document your workflows started. The audience is you, six months from now, when you have forgotten the details. The standard is low but non-trivial: future-you is a different person with different context, and if your notes assume the context you have today, future-you will find them insufficient.
Next is team documentation — a workflow shared with a small group of people who share your context, your tools, and your organizational knowledge. The standard rises. You need to externalize assumptions that your team does not share, but you can rely on shared vocabulary, shared tool access, and shared understanding of the problem domain. Internal wikis, runbooks, and team playbooks live at this level.
Beyond the team is the public template — a workflow shared with strangers who share your general domain but not your specific context. A "content creation workflow" published on a blog. A "morning routine template" shared in a community forum. A "hiring process checklist" posted on a professional network. The standard is higher still. You must explain domain terminology that insiders take for granted. You must offer tool alternatives. You must mark customization points. The workflow must stand alone without your presence to clarify it.
At the far end is the published system — a workflow that has been refined, tested by multiple users, documented with the rigor of a user manual, and packaged for broad distribution. David Allen's Getting Things Done is a published system. Marie Kondo's tidying method is a published system. The Scrum framework, originally documented by Ken Schwaber and Jeff Sutherland in a seventeen-page guide, is a published system. Each started as a personal workflow, was shared with a small group, was refined through feedback, and eventually became a reusable system that operates independently of its creator.
You do not need to aim for the published-system level for every workflow you share. But understanding the spectrum helps you calibrate. Most people default to the private-note level and never advance. They have valuable workflows that could help others, but the documentation is so personal, so context-dependent, so full of implicit assumptions that sharing it would require as much explanation as the documentation itself. Moving one level up on the spectrum — from private note to team documentation, from team documentation to public template — is usually where the highest-leverage improvement lives.
Open source as existence proof
The open source software movement is, at its core, a massive experiment in workflow sharing. Eric Raymond, in "The Cathedral and the Bazaar" (1999), described the difference between two models of software development. The cathedral model concentrates development among a small group of experts who release polished, finished products. The bazaar model opens the development process to anyone who wants to contribute, with the source code visible, forkable, and improvable by the community.
Raymond's key observation was that the bazaar model, despite appearing chaotic, produced better software faster. His formulation — "given enough eyeballs, all bugs are shallow" — captured the central mechanism: when a process is shared openly, the diversity of perspectives applied to it exceeds what any single team can generate. Problems that are invisible to the creator are obvious to someone with a different background, different tools, or different use cases.
This principle applies directly to workflow sharing. A deployment workflow that you have used a hundred times has blind spots — failure modes you have never encountered because your environment does not trigger them, inefficiencies you do not notice because you have habituated to them, improvements you cannot see because your perspective is fixed. When you share the workflow, you expose it to perspectives that can see what you cannot. The colleague who asks "why do you do step four before step five?" may reveal an ordering assumption that is not actually necessary. The newcomer who struggles with step seven may reveal that step seven is more complex than you realized. The person in a different department who adapts your workflow to their context may discover a simplification that works in yours too.
The network effects are real. A workflow used by one person has one user's worth of testing and improvement. A workflow shared with ten people has ten users' worth. A workflow published as a template has potentially thousands. Each user who encounters a problem, adapts a step, or discovers an improvement feeds information back into the system — if the sharing infrastructure supports feedback. The most valuable shared workflows are not static documents but living artifacts that evolve through collective use.
Sovereignty as gift, revisited
In Sovereignty as a gift to others, you learned that sovereignty — the practice of self-direction — is not merely a personal achievement but a gift to others. When you live according to your own examined values, you create a permission structure that makes it easier for others to do the same. The mechanism is modeling: people calibrate their sense of what is possible by observing what others actually do.
Workflow sharing operates through the same mechanism at the operational level. When you share a refined workflow, you are not just handing someone a procedure. You are demonstrating that this kind of operational clarity is achievable. You are showing that it is possible to take a messy, ad hoc process and transform it into a documented, repeatable, improvable system. For someone who has never seen a well-designed personal workflow, your shared document is proof of concept — evidence that the approach works, that the effort is worthwhile, that operational sovereignty at the personal level is real and attainable.
This is why the quality of your sharing matters. A hastily shared, poorly documented workflow teaches people that workflow documentation is not worth the effort. A carefully shared, well-documented workflow teaches people that operational rigor produces operational freedom. The artifact itself is a model of the practice that created it.
The third brain: AI as sharing accelerator
Large language models are exceptionally well-suited to the specific challenge that makes workflow sharing difficult: converting tacit knowledge into explicit documentation. The gap between what you know and what you have written down is precisely the kind of gap that a structured AI interaction can help you close.
The protocol works like this. Start by writing your workflow as you normally would — the version that makes sense to you, with all its shortcuts and assumptions. Then prompt an AI: "I am going to share this workflow with someone who has never done this task. Read it and identify every place where I have made an assumption, used jargon without defining it, skipped a step that a newcomer would need, or failed to explain the reasoning behind a decision." The AI will return a list of gaps. Many will be obvious in retrospect. Some will reveal tacit knowledge you did not realize you were relying on. Use the list to revise.
A second pass can address adaptability. Prompt: "In this workflow, identify every tool-specific or context-specific element. For each one, suggest a more generic description and at least one alternative tool or approach." This converts a workflow that only works in your environment into one that can be adapted to others.
A third pass can address clarity. Paste the revised workflow and prompt: "Explain this workflow back to me as if you were a newcomer trying to follow it. Where would you get confused? Where would you need more information to proceed?" The AI's attempt to simulate a newcomer's perspective will not perfectly match a real newcomer's experience — there is no substitute for actual user testing — but it catches a substantial portion of the curse-of-knowledge gaps before you ask a real person to struggle through them.
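The three passes can be organized as a small script so the review is repeatable rather than ad hoc. This is a sketch under stated assumptions: `ask_llm` is a placeholder stub, not a real vendor API, and the prompts condense the ones described above. Wire the stub to whichever model interface you actually use.

```python
# Illustrative sketch of the three review passes described above.
# ask_llm is a placeholder stub: replace it with a real model call.

REVIEW_PASSES = {
    "assumptions": (
        "I am going to share this workflow with someone who has never "
        "done this task. Identify every assumption, undefined jargon "
        "term, skipped step, and unexplained decision."
    ),
    "adaptability": (
        "Identify every tool-specific or context-specific element. For "
        "each, suggest a generic description and at least one alternative."
    ),
    "clarity": (
        "Explain this workflow back to me as a newcomer trying to follow "
        "it. Where would you get confused or need more information?"
    ),
}

def ask_llm(prompt: str) -> str:
    """Placeholder: substitute a call to your model API of choice."""
    return f"[model feedback for: {prompt[:40]}...]"

def review_workflow(document: str) -> dict:
    """Run each pass over the document; return feedback keyed by pass."""
    return {
        name: ask_llm(f"{instruction}\n\n---\n{document}")
        for name, instruction in REVIEW_PASSES.items()
    }

feedback = review_workflow("1. Open the spreadsheet. 2. Check results.")
print(sorted(feedback))  # ['adaptability', 'assumptions', 'clarity']
```

Running the passes in this order mirrors the revision sequence in the text: close the gaps first, generalize the tools second, and simulate the newcomer last, once the document is otherwise stable.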
The deeper application is generative. If you have a workflow that exists only as embodied practice — something you do well but have never documented at all — you can describe the process conversationally to an AI and ask it to produce a first draft of documentation. You talk through what you do, why you do it, what decisions you make along the way, what you watch out for, what signals tell you to adjust. The AI organizes your verbal stream into a structured document. You then revise for accuracy, because the AI will have introduced assumptions or missed nuances. But the hardest part of documentation — staring at a blank page trying to articulate what comes naturally — is bypassed. You start with a draft that is sixty percent right rather than zero percent written.
This use of AI respects the boundary between assistance and replacement. The AI does not know your workflow. You do. The AI cannot verify that the documentation matches reality. You can. What the AI provides is structural scaffolding — the organization, the format, the completeness check — that makes your tacit knowledge easier to externalize. The knowledge remains yours. The expression of it becomes a collaboration.
From sharing to capstone
You have now traveled the full arc of workflow design as a personal discipline. You learned to see repeatable activities as designable processes. You documented them, identified their components, measured their performance, iterated on their design, adapted them to context, composed them into larger systems, organized them into a library, reviewed the portfolio, and — in this lesson — learned to share them in a way that multiplies their value beyond yourself.
The act of sharing closes a loop that began with documentation in Document your workflows. Documentation made your workflows visible to yourself. Sharing makes them visible to others. And visibility — whether to yourself, your team, or the wider world — is the precondition for every form of improvement. You cannot improve what you cannot see. You cannot benefit from others' perspectives on what they cannot access. Sharing is not an afterthought to workflow design. It is the mechanism by which a personal process becomes a collective asset, by which individual knowledge becomes organizational memory, and by which the effort you invested in designing your workflows compounds beyond the boundaries of your own practice.
The phase capstone, Workflow design is process engineering for your life, draws all of this together. The twenty lessons of Phase 41 are not a miscellaneous collection of workflow tips. They are, taken as a whole, a case for treating your recurring activities as engineered processes — processes that can be documented, measured, iterated, composed, shared, and continuously improved. Workflow design, properly understood, is process engineering applied to the most complex and consequential system you will ever operate: your own life.
Sources:
- Nonaka, I., & Takeuchi, H. (1995). The Knowledge-Creating Company: How Japanese Companies Create the Dynamics of Innovation. Oxford University Press. (Tacit vs. explicit knowledge; SECI model of knowledge conversion.)
- Raymond, E. S. (1999). The Cathedral and the Bazaar: Musings on Linux and Open Source by an Accidental Revolutionary. O'Reilly Media. (Open source as shared process design; network effects of transparent workflows.)
- Chase, C. C., Chin, D. B., Oppezzo, M. A., & Schwartz, D. L. (2009). Teachable agents and the protege effect: Increasing the effort towards learning. Journal of Science Education and Technology, 18(4), 334-352. (The protege effect: teaching deepens the teacher's understanding.)
- Camerer, C., Loewenstein, G., & Weber, M. (1989). The curse of knowledge in economic settings: An experimental analysis. Journal of Political Economy, 97(5), 1232-1254. (Curse of knowledge and its effect on communication.)
- Schwaber, K., & Sutherland, J. (2020). The Scrum Guide. (Standard operating procedures as shareable frameworks; published systems.)
- Allen, D. (2001). Getting Things Done: The Art of Stress-Free Productivity. Penguin. (Example of personal workflow evolved into published system.)
- Deming, W. E. (1986). Out of the Crisis. MIT Center for Advanced Engineering Study. (Institutional knowledge, operational definitions, system documentation.)