The irreducible epistemic atoms underlying the curriculum: 4,828 atoms across 8 types and 2 molecules.
Evaluate tool interoperability as a primary selection criterion rather than feature richness alone, as disconnected best-in-class tools require you to be the integration layer.
Consolidate functions into existing tools at 80% effectiveness rather than adding specialized tools at 100% effectiveness when the switching cost exceeds the capability gap.
Set defined trial periods with explicit success criteria for new tools, including a scheduled exit date, to prevent permanent accumulation of trial tools.
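One way to make the exit date non-optional is to encode the trial as data, with a verdict that defaults to removal unless every success criterion was met. This is an illustrative sketch; the class, field names, and example tool are hypothetical, not part of the curriculum.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

@dataclass
class ToolTrial:
    """A time-boxed tool trial with explicit success criteria and a hard exit date."""
    name: str
    start: date
    duration_days: int
    criteria: dict[str, bool] = field(default_factory=dict)  # criterion -> met?

    @property
    def exit_date(self) -> date:
        return self.start + timedelta(days=self.duration_days)

    def verdict(self, today: date) -> str:
        if today < self.exit_date:
            return "trial running"
        # Past the exit date the default is removal, unless every criterion was met.
        return "adopt" if self.criteria and all(self.criteria.values()) else "remove"

# Hypothetical trial: 30 days, one criterion unmet at the exit date.
trial = ToolTrial("new-notes-app", date(2024, 1, 8), 30,
                  {"replaced old notes app": True, "used daily": False})
print(trial.verdict(date(2024, 2, 10)))  # past exit date, a criterion unmet -> remove
```

The default-to-remove verdict is the point: a trial that ends in silence ends in removal, so trial tools cannot accumulate by inertia.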
Build custom tools when usage frequency is high, requirements are highly specific to your workflow, and your skills permit maintenance you will not resent.
Buy commercial tools when requirements are standard across thousands of users, even if they serve your needs imperfectly, unless the imperfection creates friction in high-frequency workflows.
Treat the act of building a tool as a form of externalization that reveals implicit steps in your workflow, making the tool itself secondary to the understanding gained.
Document tool configurations immediately upon creation with both the specific setting and the reasoning behind it, as a single artifact, to prevent drift between implementation and explanation.
Store configuration documentation as version-controlled files that serve as both the implementation and the documentation, making drift structurally impossible.
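A minimal sketch of a config file that is simultaneously the implementation and the documentation: each setting carries its reasoning as an inline comment, and the whole file lives in version control. All names and values here are illustrative, not any real tool's API.

```python
# editor_config.py -- the config file IS the documentation; version-control it.
# Because the setting and its rationale live in one artifact, drift between
# implementation and explanation is structurally impossible.

CONFIG = {
    # 88, not 80: matches the formatter's default, so editor and formatter
    # never disagree about where to wrap.
    "line_length": 88,
    # Autosave on focus loss rather than on a timer: the timer variant
    # synced half-written lines mid-thought.
    "autosave": "on_focus_loss",
    # Dark theme chosen to match evening, low-light work blocks.
    "theme": "dark",
}

if __name__ == "__main__":
    for key, value in CONFIG.items():
        print(f"{key} = {value!r}")
```

The tool loads this file directly; there is no separate settings document to fall out of date.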
Build cognitive infrastructure on local-first tools where the canonical data copy exists on your device and network connectivity is optional for core functionality.
Design daily schedules around your finite deep work capacity rather than attempting to maximize total work hours or distribute effort uniformly across the day.
Design AI-augmented workflows with graceful degradation where AI accelerates but is not required, allowing core work to continue offline at reduced speed rather than stopping entirely.
Implement automated backup systems that run independently of human memory and intention, removing yourself from the critical path of data protection.
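A backup that runs independently of human memory is just a script plus a scheduler (cron, a systemd timer, Task Scheduler). Below is a minimal sketch under assumed, placeholder paths; the function name and retention policy are illustrative.

```python
# backup.py -- unattended backup sketch: a scheduler invokes this, so no
# human memory sits in the critical path of data protection.
import shutil
import time
from pathlib import Path

def backup(source: Path, dest_root: Path, keep: int = 7) -> Path:
    """Archive `source` into a timestamped tarball under `dest_root`,
    pruning all but the newest `keep` archives."""
    dest_root.mkdir(parents=True, exist_ok=True)
    stamp = time.strftime("%Y%m%d-%H%M%S")
    archive = shutil.make_archive(str(dest_root / f"backup-{stamp}"), "gztar", source)
    # Prune oldest archives so the destination disk cannot silently fill.
    for old in sorted(dest_root.glob("backup-*.tar.gz"))[:-keep]:
        old.unlink()
    return Path(archive)
```

Scheduled, say, hourly, this also satisfies the offline principle above: the archives are local files, requiring no network to create or restore.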
Run tool evaluations with bounded scope and a parallel baseline, creating controlled comparisons rather than full-stack migrations.
Audit your entire tool stack quarterly to identify redundancy, unused subscriptions, and unmaintained dependencies before they compound into cognitive overhead.
Review actual output produced before reviewing tool usage to maintain tools-in-service-of-goals rather than goals-in-service-of-tools orientation.
Establish single sources of truth for each data type to prevent information duplication and the entropy of maintaining multiple conflicting versions.
Maintain offline capability for critical workflows to ensure cognitive infrastructure remains functional when network connectivity fails.
Treat cognitive infrastructure as a coupled system where capability emerges from the quality of integration between brain, tools, and methods rather than from any component in isolation.
Position tools and resources based on usage frequency rather than categorical logic, placing high-frequency items within arm's reach to minimize retrieval friction and preserve flow states.
Create distinguishable contextual markers (orientation, lighting, objects, time blocks) to enable functional separation within a single physical space when dedicated rooms are unavailable.
Use concrete sensory descriptions instead of evaluative labels to prevent defensive reactions and enable collaborative problem-solving.
Question each workspace object with 'did I use this for my current function in my last three work sessions?' rather than 'could this be useful?' to overcome the endowment effect.
Create three workspace zones—active, near, and archive—with objects moving between them based solely on usage frequency relative to current function.
Match your lighting to circadian phase: bright, blue-rich light during analytical work hours; warm, dim light in the evening to protect sleep quality.