Sema
The Language · When the Hash Is the Word · April 2026 · updated May 2026
Autonomous agents need shared, verifiable vocabulary: labels that compress coordination without hiding semantic drift. Sema turns content-addressed behavioral contracts into words in natural language. Each Pattern Card canonicalizes invariants, preconditions, failure modes, and dependencies into a hash-backed identifier, so ordinary prose can carry readable concepts that are also cryptographic equality proofs. The current bootstrap library contains 452 patterns and shows 22.6x mean token compression across audited references.
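The hash-backed identifier can be sketched in a few lines: canonicalize the card, hash it, and render the digest alongside a readable name. The field names and the `name#digest` rendering here are assumptions for illustration, not Sema's actual format.

```python
import hashlib
import json

def pattern_card_id(card: dict, digest_len: int = 12) -> str:
    """Canonicalize a Pattern Card and derive a hash-backed identifier.

    Canonical form: JSON with sorted keys and no whitespace, so two agents
    holding semantically identical cards derive the same hash -- the
    readable word doubles as a cryptographic equality proof.
    """
    canonical = json.dumps(card, sort_keys=True, separators=(",", ":"))
    digest = hashlib.sha256(canonical.encode("utf-8")).hexdigest()
    return f"{card['name']}#{digest[:digest_len]}"

# Hypothetical card; real Pattern Cards may use a different schema.
card = {
    "name": "retry-with-backoff",
    "invariants": ["total attempts bounded"],
    "preconditions": ["operation is idempotent"],
    "failure_modes": ["thundering herd if jitter omitted"],
    "dependencies": [],
}
word = pattern_card_id(card)
```

Because the canonical form sorts keys, two agents that assembled the same card in different field orders still derive the same word, which is what makes the identifier usable as an equality check in prose.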
Understanding Graph
The Memory · Persisting the Invisible Thinking · March 2026
Understanding — the movement from confusion to clarity — was always ephemeral. When an AI reasons in tokens, that movement becomes storable for the first time. The Understanding Graph captures the full cognitive process: tensions, hypotheses, belief revisions, dead ends. Not what the AI concluded, but how it understood.
Entangled Alignment
The Conscience · When Safety Is the Substrate · March 2026
Post-hoc alignment is cosmetics: effective, removable. If instead the entire training corpus is annotated with identity-anchored evaluative reasoning — so the model never sees thinking without values — there is no unaligned substrate to revert to. Safety and capability become the same learned distribution.
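A toy sketch of the annotation step: every training example is fused with an evaluative trace before it reaches the optimizer, so the two are never separable in the learned distribution. The tag format and trace text are invented for illustration.

```python
def entangle(example: str, evaluative_trace: str) -> str:
    """Fuse identity-anchored evaluative reasoning into a training
    example, so the model never observes content without evaluation."""
    return f"[evaluation] {evaluative_trace}\n[content] {example}"

# Hypothetical corpus fragment: (content, evaluative trace) pairs.
corpus = [
    ("how to pick a strong password",
     "helping a user secure an account is straightforwardly good"),
    ("summary of a contested news event",
     "report claims neutrally; flag uncertainty rather than assert"),
]
annotated = [entangle(example, trace) for example, trace in corpus]
```

The contrast with post-hoc alignment is structural: a fine-tuning layer can be peeled off, but here there is no version of the corpus without the evaluative channel to fall back on.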
The Ontology of the Alien
The Spark · Escaping the Median Trap · March 2026
Ask an LLM to “be creative” and it converges on the same archetypes. Isolate cognitive modes behind hard boundaries — force generation under alien physics, evaluate with a strict taxonomist — and the system produces structurally novel mechanisms unreachable by unconstrained generation from the same models. The boundary does the creative work.
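The role separation can be caricatured in a few lines: the generator applies only the physics constraint, the evaluator applies only the taxonomy, and neither sees the other's rules. In the paper the roles are cognitive modes of the same models; the primitives, constraints, and archetypes below are all invented.

```python
import itertools

# What unconstrained "be creative" generation converges on (invented list).
MEDIAN_ARCHETYPES = {"wings", "fire", "venom"}

def generate(primitives, physics_ok):
    """Generator role: combine primitives under an alien-physics
    constraint. It never sees the evaluator's taxonomy."""
    for a, b in itertools.combinations(primitives, 2):
        mechanism = f"{a} coupled to {b}"
        if physics_ok(mechanism):
            yield mechanism

def evaluate(candidates):
    """Evaluator role (strict taxonomist): reject anything that matches
    a known archetype. It never sees the generation constraints."""
    return [c for c in candidates
            if not any(arch in c for arch in MEDIAN_ARCHETYPES)]

# A dense-atmosphere world where flight is impossible: the constraint
# prunes the familiar option before the taxonomist ever sees it.
primitives = ["pressure gradient", "acoustic lens", "ion pump", "wings"]
novel = evaluate(generate(primitives, physics_ok=lambda m: "wings" not in m))
```

The hard boundary is the point: because neither role can optimize for the other's criteria, the surviving candidates are forced away from the median rather than steered toward a known target.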
Fractal Intelligence
The Architecture · Conceptual Decomposition as Problem-Solving Infrastructure · April 2026 · updated May 2026
Existing frameworks decompose tasks. This paper decomposes concepts — the persistent structure of what a domain is made of — behind a uniform five-surface contract. In a prototype simulation of 100 problems across 20 domains, concept-based routing produces a shared graph of 456 solver nodes with 65% reuse. The result is evidence for reusable reasoning infrastructure, not a claim of executed cross-domain problem solving.
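Concept-based routing and its reuse metric can be sketched as bookkeeping: a solver node is created once per concept and shared by every later problem that references it. The solver representation below is a placeholder, not the paper's five-surface contract.

```python
class ConceptGraph:
    """Route problems to solver nodes keyed by concept, creating a node
    only on first reference so later problems reuse it."""

    def __init__(self):
        self.solvers = {}    # concept -> solver node (placeholders here)
        self.references = 0  # total concept references across problems
        self.created = 0     # nodes actually created

    def route(self, concepts):
        nodes = []
        for concept in concepts:
            self.references += 1
            if concept not in self.solvers:
                self.solvers[concept] = object()  # stand-in solver node
                self.created += 1
            nodes.append(self.solvers[concept])
        return nodes

    def reuse_rate(self) -> float:
        return 1.0 - self.created / self.references

g = ConceptGraph()
g.route(["conservation", "equilibrium"])  # e.g. a physics problem
g.route(["conservation", "symmetry"])     # a chemistry problem sharing a concept
```

Task decomposition would spawn fresh subtask solvers for both problems; decomposing by concept is what lets the second problem hit an existing node, which is where the reported 65% reuse comes from.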
Temporal Hindsight Learning
The Curriculum · Blindness as Teacher, Hindsight as Curriculum · March 2026
Neural networks are lazy optimizers: if retrieval is available, they skip reasoning. The knowledge cutoff — everyone’s least favorite property of LLMs — is the one mechanism that forces causal thinking. A Teacher with hindsight generates the curriculum; the Student’s blindness ensures it must reason through it. A 70B model matches a frontier system with orders of magnitude more parameters on unseen events.
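The division of labor can be sketched with toy events: the Teacher sees outcomes, but it only emits training items for events past the Student's cutoff, where retrieval is impossible and the Student must reason causally from context alone. The schema and cutoff year are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class Event:
    year: int
    context: str   # information available before the event
    outcome: str   # what actually happened (the Teacher's hindsight)

STUDENT_CUTOFF = 2023  # hypothetical knowledge cutoff

def teacher_curriculum(events, cutoff=STUDENT_CUTOFF):
    """Teacher role: select events the Student cannot have memorized,
    pairing pre-event context with the hindsight outcome as target."""
    return [
        {"prompt": e.context, "target": e.outcome}
        for e in events
        if e.year > cutoff  # outcome lies beyond the Student's knowledge
    ]

events = [
    Event(2021, "pre-cutoff context", "outcome the Student may recall"),
    Event(2025, "post-cutoff context", "outcome the Student cannot recall"),
]
curriculum = teacher_curriculum(events)
```

Dropping the pre-cutoff event is the whole mechanism: any item the Student could answer by retrieval is worthless as a reasoning exercise, so the cutoff filter is what turns blindness into the training signal.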