Give AI your operational definitions before asking for analysis
Feed your operational definitions to AI systems as explicit context before asking them to generate analysis or recommendations. Treat your personal glossary as the translation layer between the model's probability-weighted semantics and your specific conceptual framework.
Why This Is a Rule
Language models generate text based on probability-weighted semantics — the most statistically common meaning of each word across their training data. Your operational definitions are specific to your context. When you ask an AI to analyze "productivity" without defining it, the model uses the internet's average meaning (output per unit time), not yours (sustainable output quality at target cognitive load). The analysis reads correctly but answers the wrong question.
This rule treats your personal glossary as a translation layer. Before the AI generates anything substantive, load your definitions of the key terms. This converts a generic language model into one that reasons within your conceptual framework.
When This Fires
- Asking AI to analyze a decision where you've defined key terms precisely (e.g., "quality," "done," "risk")
- Using AI to review writing that uses domain-specific vocabulary
- Requesting recommendations in areas where you've developed non-standard definitions through experience
- Any AI interaction where you'd correct a colleague who used your key terms loosely
Common Failure Mode
Assuming the AI "knows what you mean" because its output uses the same words you do. LLMs are fluent at producing text that sounds aligned while operating on subtly different definitions. You ask about "deep work" meaning Cal Newport's specific protocol; the AI responds about "focused concentration" — close enough to feel right, different enough to be wrong. The semantic gap is invisible because both sides use the same vocabulary.
The Protocol
Maintain a working glossary of 10-20 terms that carry specific meaning in your thinking. Before any substantive AI prompt, include the relevant definitions as explicit context: "In this conversation, 'signal' means [your definition], 'quality' means [your definition]." This takes 30 seconds and eliminates hours of misaligned output.
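The protocol can be automated. A minimal sketch, assuming you keep your glossary as a simple term-to-definition mapping; the glossary entries, the `with_definitions` helper, and the sample prompt are all invented for illustration:

```python
# Sketch: prepend a personal glossary to any substantive prompt.
# GLOSSARY contents below are illustrative placeholders, not real definitions.

GLOSSARY = {
    "signal": "an observation that changes a decision, not merely attention",
    "quality": "meets the stated acceptance criteria on the first review pass",
    "deep work": "Cal Newport's protocol: scheduled, distraction-free blocks "
                 "on a single cognitively demanding task",
}

def with_definitions(prompt: str, terms: list[str]) -> str:
    """Prefix the prompt with explicit definitions for the listed terms."""
    defined = [t for t in terms if t in GLOSSARY]
    if not defined:
        return prompt  # no known terms: send the prompt unchanged
    lines = "\n".join(f"- '{t}' means: {GLOSSARY[t]}" for t in defined)
    preamble = "In this conversation, use these definitions:\n" + lines
    return f"{preamble}\n\n{prompt}"

print(with_definitions(
    "Assess whether this week produced real signal.", ["signal"]))
```

Calling the helper before every substantive prompt makes the 30-second step a default rather than a discipline, and keeping the glossary in one mapping means a definition you refine propagates to every future prompt automatically.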