Keep schema evolution logs minimal (4 fields) and review weekly — the dataset reveals cognitive patterns
Maintain a schema evolution log with minimum viable entries of four fields (date, schema affected, what changed, and what prompted the change), reviewed weekly, to generate a dataset about cognitive patterns that introspection alone cannot produce.
Why This Is a Rule
The schema evolution log's primary value isn't in individual entries — it's in the dataset. After 20+ entries reviewed weekly, patterns emerge that no amount of introspection could produce: which domains update most frequently (where your learning is active), what types of evidence trigger changes (what you're responsive to), which schemas remain static (possibly stale or possibly foundational), and how your revision rate correlates with life events.
The minimum viable entry (four fields, under two minutes) prevents the log from becoming burdensome. Elaborate entries produce rich individual records but die within weeks because the effort exceeds the practice's sustainability threshold (see the bottleneck journal rule: six fields, under two minutes; sustainability beats thoroughness). Four fields is the sweet spot: enough structure for pattern detection, little enough effort for indefinite maintenance.
Weekly review converts accumulated entries into meta-learning: "This week I updated two schemas, both in the interpersonal domain, both triggered by feedback from the same person." That meta-pattern — you're doing most of your learning through one feedback relationship — is invisible from individual entries but obvious from the weekly review of accumulated data.
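As a concrete sketch of the minimum viable entry, one option is a single JSON line appended to a log file. The file name, field names, and helper are illustrative assumptions, not prescribed by the rule:

```python
import json
from datetime import date

LOG_PATH = "schema_log.jsonl"  # hypothetical location for the log

def log_schema_change(schema: str, change: str, trigger: str) -> dict:
    """Append a minimum viable entry: four fields, well under two minutes."""
    entry = {
        "date": date.today().isoformat(),
        "schema": schema,    # which schema was affected
        "change": change,    # what changed
        "trigger": trigger,  # what prompted the change
    }
    with open(LOG_PATH, "a") as f:
        f.write(json.dumps(entry) + "\n")
    return entry

log_schema_change(
    schema="interpersonal/feedback",
    change="Now treat silence as neutral, not disapproval",
    trigger="Direct feedback from a colleague",
)
```

One line per entry keeps the format append-only and trivially parseable for the weekly review.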
When This Fires
- Setting up a deliberate schema evolution practice
- When you want to understand how your thinking changes over time
- During epistemic practice design alongside anomaly logs (Keep a separate anomaly log with three fields: expected, actual, and which schema predicted wrong) and decision records (Record five elements at the moment of every significant decision — before hindsight rewrites it)
- When introspection about "how I've grown" produces vague generalities rather than specific patterns
Common Failure Mode
Making entries too detailed: spending 10 minutes on a single log entry. This produces good individual records but kills sustainability. Two minutes maximum per entry. The dataset value comes from consistent, long-term logging — not from elaborate individual entries.
The Protocol
When a schema changes: (1) Log four fields in under two minutes: date, which schema, what changed, what prompted the change. (2) Weekly: scan the last 5-7 entries and ask: which domains are updating? What types of evidence trigger changes? Have any schemas been absent from the log for months? (3) Monthly: review the full log for meta-patterns in your cognitive evolution. (4) The log becomes more valuable over time: 50+ entries reveal patterns that 5 entries cannot.
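The weekly and monthly scans can be sketched as small aggregations over the accumulated entries. This sketch assumes a four-field entry format and a naming convention where schemas are prefixed with a domain (e.g. "interpersonal/feedback"); both are illustrative choices, not part of the rule:

```python
import json
from collections import Counter
from datetime import date, timedelta

def weekly_review(entries: list, today: date) -> dict:
    """Summarize the last week's entries: count of updates, domains, triggers."""
    recent = [e for e in entries
              if date.fromisoformat(e["date"]) >= today - timedelta(days=7)]
    # Assumed convention: schemas are named "domain/name"; the prefix is the pattern unit.
    domains = Counter(e["schema"].split("/")[0] for e in recent)
    triggers = Counter(e["trigger"] for e in recent)
    return {"count": len(recent), "domains": domains, "triggers": triggers}

def stale_schemas(entries: list, today: date, months: int = 3) -> set:
    """Schemas with no entry in the last N months: possibly stale, possibly foundational."""
    cutoff = today - timedelta(days=30 * months)
    last_seen = {}
    for e in entries:
        d = date.fromisoformat(e["date"])
        last_seen[e["schema"]] = max(last_seen.get(e["schema"], d), d)
    return {s for s, d in last_seen.items() if d < cutoff}
```

A week where `domains` shows two interpersonal updates and `triggers` shows the same source twice surfaces exactly the meta-pattern described above: most of your learning is flowing through one feedback relationship.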