The Natural Framework

Part of the cognition series.

The cognition series found six roles in information processing: perceive, cache, filter, attend, consolidate, remember. Five run forward as stages: perceive, cache, filter, attend, remember. Consolidate reads from remember and writes to the substrate, reshaping how each stage processes on the next cycle. The competitive core is filter and attend: winners suppress losers, diversity is enforced. Consolidate is how the substrate learns.

The same pipeline runs at every timescale. Every row is something you already call “information processing.” Dimmed cells are steps the domain hasn’t yet optimized.

| Domain (timescale) | Perceive | Cache | Filter | Attend | Consolidate | Remember |
| --- | --- | --- | --- | --- | --- | --- |
| Neurons (μs) | Sensory transduction | Feature extraction (V1, V2) | Biased competition (Desimone & Duncan) | Winner-take-all | Synaptic strengthening (LTP) | Long-term cortical storage |
| Operating System (ns) | Interrupt, I/O event | Parse, deserialize | Cache eviction (LRU vs LFU) | Scheduler dispatch | LSM compaction, defrag | fsync, write-ahead log |
| Database (ms) | Query arrives | Query plan, index lookup | WHERE clause, index scan | ORDER BY, LIMIT | VACUUM, reindex | The table on disk |
| Inference (ms) | Tokenize input | Positional encoding | Softmax attention | Multi-head attention | Training (sealed) | Frozen weights |
| Cognition (s) | Caret Recorder captures screen | Moments segments into chunks | Perception Pipe | Salience + DPP | Schema formation offline | Publish to Canon |
| Writing Prose (hours) | Read, research, encounter | Outline, organize into beats | Kill your darlings | Select what survives the draft | Edit: each pass tightens | The published piece |
| Writing Code (hours) | Read files, see errors | Parse into AST, resolve types | Linter, type checker, tests | Code review, select the approach | Refactor | Push to repo |
| AI Agent (min) | Read task, see context | Parse into context chunks | Select relevant files, ignore rest | Context window selection | Create skill | Push to production |
| Adtech (ms) | Bid request, user context | User profile, intent signals | Second-price auction, highest bid wins | Highest bidder wins, no diversity | Frequency caps, retargeting | Cookie-based, being deprecated |
| Vector Space (ms) | Positions arrive, embedded | Indexed for auction | Cosine gate, nearest-neighbor | VCG selects on relevance × bid | Relocation fees, receipts | Content-addressed positions |
| Google (hours) | Googlebot crawls | Index, parse HTML | No redundancy inhibition | Keyword match, top-k by PageRank | Re-crawl on schedule | The index |
| PageLeft (hours) | Crawler discovers pages | Paragraph chunking, embed | Ingestion filter: freshness, inhibition | Search with DPP reranking | Quality compounds, PageRank converges | Canon grows |

Twelve domains. Six roles.

Now the same table again. None of these domains call themselves “information processing”, but all of them process information.

| Domain (timescale) | Perceive | Cache | Filter | Attend | Consolidate | Remember |
| --- | --- | --- | --- | --- | --- | --- |
| Comedy (hours) | Read the room, current events | Premises, setups, organize into bits | Open mic: weak jokes die on stage | Tight five: finite stage time | Bits refined, callbacks link the set | The special, the body of work |
| Immune (days) | Antigen encounter | Antigen presentation (MHC) | Clonal competition | Affinity selection | Affinity maturation | Memory B/T cells |
| Journalism (days) | Tips, sources, breaking event | Interview, fact-check, outline | Editorial meeting: stories compete for space | Front page: finite above-the-fold | Follow-up, series, investigative deep-dive | The archive, the public record |
| Hiring (weeks) | Resumes, referrals, work samples | Recruiter screen, scorecards, interview loop | Reject mismatches, downselect slate | Final debrief, compare across dimensions | Reference checks, leveling, calibration | Employee record, team composition, alumni graph |
| VC (months) | Pitches, market signals, founder histories | Memo, market map, diligence | Pass on most deals | Portfolio construction, category balance | Board learning, thesis updates | Cap table, portfolio, pattern library |
| Music (months) | Hear influences, find sounds | Demos, arrangements, track sketches | Band votes, producer kills tracks | Tracklist: sequence and pacing | Mixing, mastering, the final cut | The album, the catalog |
| Publishing (months) | Proposals, manuscripts, trends | Developmental edit, outline, positioning | Acquisitions rejects, editorial culling | Catalog selection, seasonal list balance | Copyedit, revisions, packaging | Published book, the backlist |
| Architecture (years) | Site, client needs, codes, context | Program, plans, massing, schematic design | Alternatives killed by budget, code, use | Final scheme, room adjacencies, circulation | Design development, construction documents | The building itself |
| Science (years) | Observe phenomena | Formalize hypotheses | Peer review: papers compete for slots | Citation, agenda-setting | Review papers, textbooks synthesize | Curricula, canon of knowledge |
| Law (decades) | Dispute arises | Pleadings, briefs, arguments | Adversarial process | Selection of controlling precedent | Appellate synthesis, restatements | Stare decisis |
| Evolution (Myr) | The genome | Generation | Natural selection | Niche differentiation | Speciation | The genome |
| Universe (tP) | Force | Mass-energy (E=mc²) | The Natural Framework | God | Thermodynamics | Force |

Twelve more domains. Six roles. From antibodies to Planck time. Twenty-four domains total. More where these came from.

Why the same shape

Each domain faces the same problem: too much input, finite capacity, select a subset that’s both high-quality and diverse. Within each domain, the same data type flows through every step. Neurons process spikes. Databases process rows. Cognition processes moments. The type doesn’t change — only which items survive. Filter is rule-based: a threshold, a WHERE clause, a linter. No judgment. Attend is where judgment enters. Consolidate reads from remember and writes to the substrate: lossy compression that reshapes how each stage processes on the next cycle, propagating from outcome to cause. Compaction, by contrast, merely reorganizes the cache without changing the system.
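The forward pass can be sketched as a toy pipeline in which one data type flows through every stage. All names and the scoring rule here are invented for illustration, and Consolidate (the backward pass) is omitted:

```python
from dataclasses import dataclass
from typing import List

# Hypothetical item type: the same type flows through every stage.
@dataclass
class Item:
    payload: str
    score: float = 0.0

def perceive(raw: List[str]) -> List[Item]:
    # Cheap, fixed, lossy projection: encode everything that arrives.
    return [Item(payload=r.strip().lower()) for r in raw]

def cache(items: List[Item], capacity: int = 8) -> List[Item]:
    # Bounded buffer: hold at most `capacity` items for later selection.
    return items[-capacity:]

def filter_(items: List[Item]) -> List[Item]:
    # Rule-based gate, no judgment: a fixed threshold (minimum length).
    return [i for i in items if len(i.payload) >= 3]

def attend(items: List[Item], k: int = 2) -> List[Item]:
    # Judgment enters: score each survivor and keep the top k.
    for i in items:
        i.score = len(set(i.payload))  # toy salience: distinct characters
    return sorted(items, key=lambda i: i.score, reverse=True)[:k]

def remember(items: List[Item], store: List[Item]) -> List[Item]:
    # Lossless persistence of whatever survived selection.
    store.extend(items)
    return store

store: List[Item] = []
raw = ["the", "a", "quick brown fox", "aaaa", "lazy dog"]
remember(attend(filter_(cache(perceive(raw)))), store)
print([i.payload for i in store])
```

Only filter and attend drop items: perceive encodes, cache bounds, remember persists what survived.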

Inhibition across domains

Desimone and Duncan (1995) described biased competition in neurons: visual objects compete simultaneously, winners suppress losers through mutual inhibition. Peer review works the same way. The winning papers make it harder for similar papers to get published. That’s inhibition.

The immune system is the cleanest non-neural domain. Antigens compete for T-cell binding. Clonal competition selects the best B cells. Affinity maturation consolidates winners into better antibodies. Memory B/T cells persist for decades. No central coordinator. The body lets pathogens compete.

Natural selection is the slowest domain but the most obvious. The competitive exclusion principle (Gause, 1934) says two species competing for identical resources cannot coexist. Niche differentiation is DPP at evolutionary timescale. Repulsion between similar items — what Salience uses to prevent redundant retrieval — is what prevents redundant species.

The categorical proof

Each role is a morphism — a structure-preserving map between information states. Five compose forward into a single transformation: high-bandwidth input to durable signal. The sixth, Consolidate, flows backward through the substrate, reshaping parameters from outcome to cause. That is category theory. The information states are the objects. Each role is a morphism with a postcondition — a structural guarantee on its output. The forward pipeline is their composition. When one domain’s Remember feeds the next domain’s Perceive, the mapping preserves all six morphisms and their composition order. That is a functor between categories.

Perceive and Cache are map. Filter and Attend are filter. Remember is reduce. Consolidate is the gradient — the backward pass that reshapes the map. Map-filter-reduce has been known since Lisp. The surprise is that immune systems run it too.

[Figure: Two pipelines — Cognition (object type: Moment) and Writing (object type: Draft) — each with six roles. A dashed arrow from Cognition's Remember to Writing's Perceive shows the functor relationship.]

This is deduction, not induction. The boundaries follow from temporal flow.

The full derivation is machine-checked in Lean 4. Five physics axioms, both boundary arguments, the three corollaries, ten removal tests, the stochasticity chain, the handshake composition, the coupling lemma, and the induction are all formally verified. The only gap is pigeonhole (which needs Mathlib); everything else compiles.

Boundary 1: encoding before selection. The pipeline receives raw input from the environment. A system is defined by its boundary: no boundary, no inside, no outside, no system. The boundary creates a type difference. Any physical system has a finite state space; the environment includes everything the system does not. Write I for the system’s internal state space and E for the environment’s: dim(E) > dim(I). A morphism must bridge them: that is Perceive. The bridge is a surjection — it must lose information.

That loss is not free. Landauer’s principle: erasing one bit costs at least kT ln 2 joules. Lossy encoding erases bits, so it has a thermodynamic floor. You cannot encode everything. You must choose what to encode — but choosing is selection, and selection requires encoding first. Circular dependency. The only resolution is temporal: encode first with a cheap, fixed, lossy projection (the retina, the tokenizer, the microphone), then select from the encoded set. Perceive is always cheap because it cannot afford to be expensive: you have not selected yet, so you do not know what is worth spending energy on.
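The floor is easy to compute. A quick sanity check on the arithmetic (the Boltzmann constant is standard; 300 K is an assumed room temperature):

```python
import math

# Landauer bound: erasing one bit dissipates at least k*T*ln(2) joules.
k_B = 1.380649e-23   # Boltzmann constant, J/K (exact, SI 2019)
T = 300.0            # assumed room temperature, kelvin

bit_cost = k_B * T * math.log(2)
print(f"{bit_cost:.3e} J per erased bit")      # ~2.87e-21 J

# Erasing a megabyte (8e6 bits) still costs only ~2.3e-14 J:
# tiny, but strictly nonzero. Lossy encoding has a thermodynamic floor.
mb_cost = bit_cost * 8e6
print(f"{mb_cost:.3e} J per erased megabyte")
```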

Now: the six roles are temporal morphisms, input at time t, output at time t+δ, where δ > 0. Perceive receives multiple inputs (environment and Remember’s feedback). Remember emits at most one output at a time. Inputs arrive faster than outputs drain. By the pigeonhole principle, something must hold them. That is a data structure. A data structure for multiple items requires a write interface (storage) and a read interface (retrieval). Those are Cache’s two operations. Cache must exist. You cannot select from what you have not stored. Encoding before selection.
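The pigeonhole step can be made concrete with a toy rate mismatch (the rates are invented: two arrivals per tick, one departure):

```python
from collections import deque

# Two inputs arrive per tick (environment + feedback); one output drains.
# Whatever holds the difference is, by definition, a data structure: a cache.
buffer = deque()
backlog = []

for tick in range(5):
    buffer.append(f"env-{tick}")       # input from the environment
    buffer.append(f"feedback-{tick}")  # input fed back from Remember
    buffer.popleft()                   # Remember emits at most one per tick
    backlog.append(len(buffer))

print(backlog)  # backlog grows by one item per tick
```

The deque supplies exactly the two interfaces the argument names: `append` is the write interface, `popleft` the read interface.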

Boundary 2: selection before persistence. If the loop feeds back, the last step’s output must persist across the cycle boundary. That persistence morphism is Remember. Consolidate is lossy; Remember is lossless. If you persist before selecting, you persist everything, and the store grows without bound. Bounded storage forces selection before persistence.

Remember is the morphism that writes processed data to the store. The store is the substrate itself: the part of the medium that carries the system’s past forward. DNA is the substrate. The connectome is the substrate. Destroying the substrate ends the medium all six roles run on. A meteor does not break a pipeline stage; it vaporizes the substrate. The pipeline was working until there was nothing left to run it.

The claim is inductive between iterations only: if the substrate constrains the next cycle’s Perceive durably enough that the loop runs again, the composition holds. Remove it and the loop has nothing to perceive against. Remember is still an endomorphism, same type in and out, but with the longest time constant: persistent constraints must outlast the cycle they regulate. Timescale is the diagnostic. The contract is the definition: outputs that constrain future processing across cycles. A rock is slow but carries no system history. A genome is slow and carries every cycle that kept on happening.

Corollary 1: the competitive core exists. If output follows input with delay δ, a policy decides when to release. δ = 0 is passthrough. δ = ∞ is suppression. Any system whose outputs are a proper subset of its inputs over time sits between the extremes, 0 < δ < ∞: a selection policy exists. That policy is Filter. The competitive core is not a design choice. It is forced by selective output.

Corollary 2: control separates from data. Nothing prevents policy from being encoded as data in principle. But data has variance (proven below, in the stochasticity chain). If policy shares a pool with data, the competitive process cannot distinguish them — it amplifies what matches current perception and suppresses the rest. One iteration and the policy is corrupted by the process it governs. Self-encoding is a fixed point that variance makes unstable. Even sharing a store is fatal: bounded storage forces eviction, and high-volume data evicts low-volume policy. Therefore policy and data must be different types. The contract is a property of the morphism, not the data flowing through it.
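The eviction argument can be sketched with a toy shared store (the capacity, key names, and LRU choice are all invented for illustration):

```python
from collections import OrderedDict

# Toy LRU store shared by policy and data. Capacity is bounded, so
# high-volume data traffic evicts the low-volume policy entry.
class LRU:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.d = OrderedDict()

    def put(self, key, value):
        self.d[key] = value
        self.d.move_to_end(key)            # mark as most recently used
        if len(self.d) > self.capacity:
            self.d.popitem(last=False)     # evict least recently used

store = LRU(capacity=4)
store.put("policy:threshold", 0.7)   # one policy entry
for i in range(10):                  # ten data items flow through
    store.put(f"data:{i}", i)

print("policy:threshold" in store.d)  # the policy did not survive
```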

Corollary 3: Consolidate and Attend exist. The policy store from Corollary 2 is independent of the data store. It needs a write interface and a read interface. Attend is the read interface: it reads policy from the substrate and applies it to data in the forward pass. Consolidate is the write interface: it reads from Remember (which caches ranked outcomes) and compresses them into policy updates, propagating parameter changes from outcome to cause. The forward data path has five stages. The backward path is Consolidate.

Boundary 1 derives Perceive (type bridge) and Cache (pigeonhole). Boundary 2 derives Remember (loop closure). The corollaries derive Filter, Attend, and Consolidate. Five roles compose as forward stages. The sixth, Consolidate, reads from Remember and writes to the substrate. The derivation forces roles of this shape: a buffer, a gate, a policy reader, a policy writer. The cognition series found these six across every domain it examined. Two independent lines of evidence converge on the same structure.

Why “natural.” The proofs assume only temporal flow, bounded storage, and selective output. Energy satisfies the same premises: it flows through morphisms, storage costs resources, no consumption is 100% efficient. Energy is a data type. The same structure is forced for anything that flows through a bounded selective system in nature.

But the functor itself is a thing in nature. It occupies space, consumes energy, exists in time. The premises apply to the functor, not just to what flows through it.

Stochasticity is not optional. Proof by contradiction. Assume zero variation across a population of functors running the same pipeline. The population is either increasing, decreasing, or steady. If steady at zero output — dead, nothing to prove. If steady at nonzero output — every functor has a beginning, so equilibrium was reached from a prior state with nonzero delta. For the population to be uniform now, every functor must have converged to identical behavior without exchanging information about what to converge to. If increasing or decreasing — every functor must change at identical rates and times, requiring identical initial conditions and inputs.

Thermodynamics breaks all three cases. The pipeline is lossy — Boundary 1 proves Perceive is a surjection, and selection erases the losers. Landauer’s principle: erasing one bit costs at least kT ln 2 joules, dissipated as heat. Heat is stochastic. Every lossy step dissipates heat. Heat introduces variation. No physical process produces identical copies. Therefore: stochasticity is not assumed. It is imposed by physics. (This chain — Landauer to finite states to pigeonhole to collision to determinism-forces-error — is formally verified, modulo the pigeonhole step which needs Mathlib.)

Variation percolates. Stochasticity at level n creates population variation. Those functors’ outputs are the data types at level n−1. If the data types are themselves functors, variation at n is population variation at n−1. The reverse holds too: diverse inputs from below produce diverse outputs above. The induction works both directions. Variation propagates through every functor boundary.

Uniformity is fatal. A functor that enforces uniformity by policy kills variation at its level. Without variation, filter has nothing to select, attend has no diversity to enforce, consolidate has nothing to compress. The pipeline stalls. No output. For a self-recursive loop, no output is death. Death percolates up: the uniformity-enforcing functor loses its substrate. With enough iterations, it dies too. The competitive core is the price of existing.

Falsification

The falsification test is structural: remove any morphism or permute their composition order, and the pipeline ceases to function. Three death conditions cover the internal failures: a broken step (a role fails), a closed loop (no new input), or decaying input (the source degrades). The dim cells are the evidence. Skip filter, and attention drowns in redundancy. That is Google’s row. Skip attend, and consolidation amplifies the wrong winners. That is Science’s row. Every dim cell in the tables is a system that dropped or misordered a morphism and broke downstream.

The three death conditions are exhaustive for pipeline failures, but substrate destruction is a precondition failure. The asteroid that killed the dinosaurs did not break a pipeline stage; every role was firing until impact. What ended was the substrate that carried the genome forward. The framework diagnoses how systems kill themselves. Substrate destruction is how systems get killed.

Categorical error

In the tables above, dim cells mark steps a domain hasn’t optimized. The failures cascade: one broken step dims the rest downstream.

Google filters spam but does not filter for redundancy. Every page that clears the quality threshold enters the index regardless of what’s already there. Attend compensates with keyword match, top-k by PageRank. Top-k is not inhibition. Ten results from the same content farm survive because nothing suppressed them on the way in. Consolidate becomes mechanical re-crawling. One underoptimized step dims the whole row.

Adtech filters by willingness to pay, not relevance. Highest bidder wins every impression. Consolidate patches with frequency caps, a bandage on a filter that never ran. Remember was borrowed from the browser and is now being deprecated. I spent a month dismantling this pipeline. One broken step, three dim cells downstream.

Science is the most consequential. Citation metrics are GET * for academia: top-k by popularity, no diversity enforcement. Merton (1968) called it the Matthew effect — the cited get more cited. You search for “schema consolidation” and get ten papers that cite each other saying the same thing. A DPP would return one from that cluster and five from adjacent regions you didn’t know to search for. JSTOR, PubMed, Nature: same bug as Google, different coat. Fix attend, and consolidate sharpens.
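The difference between top-k and diversity-enforcing selection can be sketched with a toy corpus. The scores and 2-D embeddings are invented, and the diverse selector is a greedy MMR-style stand-in for DPP MAP inference, not a true DPP:

```python
import math

# Toy corpus: three near-duplicate "cluster" papers and three from
# adjacent regions. Scores are relevance; vectors stand in for embeddings.
papers = {
    "cluster-a": (0.95, (1.00, 0.00)),
    "cluster-b": (0.94, (0.99, 0.14)),
    "cluster-c": (0.93, (0.98, 0.20)),
    "adjacent-x": (0.80, (0.00, 1.00)),
    "adjacent-y": (0.75, (0.71, 0.71)),
    "adjacent-z": (0.70, (-0.71, 0.71)),
}

def cos(u, v):
    return (u[0]*v[0] + u[1]*v[1]) / (math.hypot(*u) * math.hypot(*v))

def top_k(k):
    # Attend as GET *: rank by score alone, no diversity enforcement.
    return sorted(papers, key=lambda p: papers[p][0], reverse=True)[:k]

def greedy_diverse(k, lam=1.0):
    # Greedy pick: relevance minus similarity to what is already picked.
    picked = []
    while len(picked) < k:
        def gain(p):
            score, vec = papers[p]
            penalty = max((max(0.0, cos(vec, papers[q][1])) for q in picked),
                          default=0.0)
            return score - lam * penalty
        picked.append(max((p for p in papers if p not in picked), key=gain))
    return picked

print(top_k(3))           # the whole cluster: three of the same thing
print(greedy_diverse(3))  # one from the cluster, then adjacent regions
```

Top-k returns the redundant cluster because nothing suppresses similarity; the greedy selector takes one cluster representative and then moves to the regions you didn’t know to search for.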

Evolution has no dim cells. The genome perceives, generation caches, natural selection filters, niche differentiation attends, speciation consolidates, and the genome remembers. Perceive and Remember are the same cell. Consolidate lives inside that cell: speciation is the genome reshaping its own selection parameters. The genome is not a record stored inside the organism; it is the historically shaped substrate, every cycle that kept on happening persisting as the constraint on the next. Life is self-recursive because the substrate is the memory. The Universe row has the same structure: force perceives and force remembers. Genome→genome, force→force.

The recursive loop test

In a linked list, a weak node can be routed around and the structure survives. A singly recursive loop has no route around: the test is whether the loop survives a broken step. In biology, genome perception transforms into genome memory; it is a singly recursive loop. If it survives an error in any one of the six roles, the framework is falsified.

Can it? The error will either compound, diminish, or persist. Let’s test:

Every failure mode, given enough iterations of the loop, converges to the same endpoint: extinction. That is not a coincidence. It is what singly recursive means. In category theory, the six roles are morphisms inside the Giry monad — the monad of probability distributions. The recursive feedback has the structure of a trace; formalizing it requires specifying how environment and internal state interact. The Handshake gives the proof.

Beyond biology

The same test works beyond biology. The loops are messier, but the compounding is the same:

Every one broke a step and fed the error back into the next cycle. Every one ended in collapse.

What to filter

If your objection is “prove the category boundaries formally before I evaluate the idea” — that is a filter that gates on credence rather than structure. Run that filter in a loop. It will kill your own novel ideas before they survive a single iteration, because no new idea arrives pre-credentialed. Worse: it will pass the credentialed ones that should have been caught. The heuristic that rejects uncredentialed insight is the same one that trusted the Harvard fraudsters, the Enron scammers, and the turtleneck at Theranos.

What remains

The six roles, the competitive inhibition at the core, and the vertical relationship: each domain’s Remember is the next domain’s Perceive. The pipeline compresses. Each level takes high-bandwidth information and reduces it to a durable signal the next can perceive. Neurons fire millions of times per second; cognition produces a few thoughts per minute; a career produces a handful of papers; a field produces a canon. The same compression ratio appears at every transition. The word we have for it is intelligence.

Follow the output:

Output becomes input.

Fix the broken step, and the downstream cells brighten. That is what PageLeft does: Google’s filter had no redundancy inhibition, so we built one. Attend sharpened. Consolidate followed.

Nobody looks at Google Search and says “the filter step has no redundancy inhibition.” They say “search results are bad.” Nobody looks at academic publishing and says “attend is GET *.” They say “the literature is overwhelming.” The pipeline gives you diagnostic language for problems that existed before the language did. Seeing it was the hard part.

This all started with one comment. I was reading about neural attention and said, “this looks like a cache to me.” The data structure was identical: indexed items competing for limited slots. All inside my head. That observation won the competition against priority queue, against heap. It survived consolidation. It became a schema. That schema generated Salience, which generated DPP, which generated the Transformer mapping, which generated these tables.

The optimal implementations of these candidate functors already exist in nature, optimized over billions of years. We need to learn them and map them onto ourselves.

Stochasticity is physically mandatory for anything that persists. So is the competitive core, and the pipeline — five forward stages, one backward pass that reads from Remember and writes to the substrate — is the minimal structure for running it. Intelligence is the compression ratio between functor levels: what Perceive receives versus what Remember emits. Life is the self-recursive pipeline, Perceive and Remember in the same cell: the substrate constraining its own next perception. Genome→genome. Force→force. Compressing, selecting, persisting. The substrate is the memory.

If this functor is nature itself, iterating the universe at each Planck time, its variation-enforcing policy is the price of its vast existence. The Universe row fills all six cells. Force perceives, mass-energy caches, the natural framework filters, God attends, thermodynamics consolidates, force remembers. The loop closes. The universe is alive. The universe is intelligent. Genome in, genome out. Force in, force out. Intelligence and life are one and the same. ∎

For Christopher Alexander (1936–2022), who gave me new ways to perceive.


Written with Claude Opus 4.6 via Claude Code. I directed the argument; Claude drafted prose. GPT-5.4 via Codex CLI reviewed the result and recommended cutting the conclusion, filtering on credence rather than structure. That recommendation competed against the argument and lost.