Definition (v1)
Natural frequency: a representation format for statistical information that expresses probabilities as counts or frequencies of events within a population rather than as abstract percentages or decimals
Why This Is a Definition
This definition establishes natural frequency as a specific format for representing statistical information, distinguishing it from probability formats such as percentages and decimals. It captures the key feature that makes Bayesian reasoning more intuitive in this format: reasoning over counts of events in a concrete reference population rather than over normalized probabilities.
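The cognitive advantage can be made concrete with a worked example. The sketch below computes the same Bayesian posterior two ways: once with Bayes' theorem over probabilities, and once as counts in a population of 1,000 people. The prevalence, sensitivity, and false-positive rate are illustrative numbers invented for this example, not figures from the source lessons.

```python
# Illustrative sketch: one Bayesian update expressed in two formats.
# All rates below are made-up example values, not from the source note.

# --- Probability format: P(disease | positive) via Bayes' theorem ---
prevalence = 0.01        # P(disease)
sensitivity = 0.80       # P(positive | disease)
false_positive = 0.096   # P(positive | no disease)

posterior = (sensitivity * prevalence) / (
    sensitivity * prevalence + false_positive * (1 - prevalence)
)

# --- Natural frequency format: the same numbers as counts ---
population = 1000
sick = round(population * prevalence)                 # 10 people have the disease
sick_and_positive = round(sick * sensitivity)         # 8 of them test positive
healthy_and_positive = round(                         # 95 healthy people also
    (population - sick) * false_positive              # test positive
)

# "Of the 103 people who test positive, 8 actually have the disease."
frequency_posterior = sick_and_positive / (
    sick_and_positive + healthy_and_positive
)

print(f"probability format:       {posterior:.3f}")
print(f"natural frequency format: {frequency_posterior:.3f}")
```

Both calculations give roughly 0.078, but the second version lets a reader answer "8 out of 103" by simple counting, without ever applying Bayes' theorem explicitly, which is the advantage the definition points to.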
Source Lessons
Connections
Defines (58)
- Axiom: Extended Cognition Thesis
- Axiom: Automatic Narrative Generation Precedes Conscious Evaluation
- Axiom: Directed Attention as Depletable Resource
- Axiom: Perception as Predictive Construction
- Axiom: Expertise Transforms Perceptual Chunking
- Axiom: Automatic Fusion of Observation and Interpretation
- Axiom: Dual Coding Theory: Verbal and Visual Channels
- Axiom: Neural Plasticity Enables Lifelong Automatic Learning
- Axiom: Emotional Hijacking of Judgment
- Axiom: Perceptual Plasticity Through Training
- Axiom: Systematic Overconfidence Taxonomy
- Axiom: Emotion as Systematic Cognitive Modulator
- Axiom: Natural Frequency Format Advantage
- Axiom: Bias Blind Spot Asymmetry
- Axiom: Brain as Hierarchical Prediction Machine
- Axiom: Cognition Operates Through Dual Processing Systems
- Axiom: Mental States Are Cognitively Imputable
- Axiom: Egocentric Anchoring in Perspective-Taking
- Axiom: Cognitive and Affective Empathy Are Distinct
- Axiom: Looping Effects of Human Classification
- Axiom: Automatic Pattern Perception
- Axiom: Dunbar's Number Limits Stable Relationships
- Axiom: Piagetian Equilibration Through Schema Dynamics
- Axiom: Simple decision rules using less information can outperform
- Axiom: People interpret failure as either evidence about their
- Axiom: You necessarily trust your own cognitive faculties as a
- Axiom: When estimating future task duration, people naturally adopt
- Axiom: Reference class forecasting (using base rates from similar
- Principle: Track the evolution of your beliefs over time rather than
- Principle: Apply the same tags to notes from different domains when
- Principle: Use version history to identify beliefs that have revised
- Principle: Instrument systems with the minimum number of metrics that
- Principle: Design cognitive systems for your actual operating
- Principle: When large language models express 90%+ linguistic
- Principle: Use the 'five whys' technique on any significant energy
- Principle: Ask 'How will I feel about this in a year?' to shift from
- Principle: Track which environmental changes produce reliable emotional
- Principle: When emotional intensity increases, increase scrutiny
- Principle: Embed learning capacity into the system itself rather than
- Principle: Compare your self-concept against your actual behavioral
- Principle: When making decisions under cognitive load, time pressure,
- Principle: Periodically surface process schemas by extracting embedded
- Principle: Track decision patterns rather than decision outcomes to
- Principle: Use satisficing decision rules (define 'good enough'
- Principle: Aggregate predictions by confidence level and compare stated
- Principle: Track both your predictions and your emotional state at
- Principle: When identifying overconfidence in retrospective prediction
- Principle: Maintain decision logs that record the domain, your
- Principle: When vivid individual narratives compete with statistical
- Principle: Track not just prediction accuracy but also the frequency
- Principle: Before group discussions on important topics, require
- Principle: Before proposing the removal of any inherited system,
- Principle: Document not only what tools you use but the complete
- Principle: Use AI as a schema inspection tool by articulating your
- Principle: Calculate your actual prediction accuracy across documented
- Principle: Delayed schema updates impose compounding costs because the
- Principle: Direct change effort toward what you control (judgments,
- Principle: When delegating to tools, verify outputs more aggressively