Sema
The Language · When the Hash Is the Word · March 2026
Every prior content-addressing system — Git, IPFS, SDH — keeps the hash in an infrastructure layer separate from communication. Sema dissolves it into language: agents write natural sentences where each anchored term is simultaneously a word and a cryptographic proof. Any channel that carries text carries verified meaning.
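The core move — a word that is simultaneously a cryptographic proof — can be illustrated with a minimal sketch. This is not Sema's actual encoding (the abstract does not specify one); the `term#digest` format, the SHA-256 choice, and the 12-character prefix are all assumptions for illustration.

```python
import hashlib

def anchor(term: str, definition: str, prefix_len: int = 12) -> str:
    """Attach a truncated content hash of the term's definition, so the
    word doubles as a verifiable reference. Format is hypothetical."""
    digest = hashlib.sha256(definition.encode()).hexdigest()[:prefix_len]
    return f"{term}#{digest}"

def verify(anchored: str, definition: str, prefix_len: int = 12) -> bool:
    """Check that an anchored term still resolves to the same meaning."""
    term, _, _ = anchored.partition("#")
    return anchor(term, definition, prefix_len) == anchored

tagged = anchor("sema", "a word that is also a cryptographic proof")
assert verify(tagged, "a word that is also a cryptographic proof")
assert not verify(tagged, "a tampered definition")
```

Any text channel that carries `sema#<digest>` carries the claim and the means to check it, which is the point the abstract makes about verified meaning.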
Temporal Hindsight Learning
The Curriculum · Blindness as Teacher, Hindsight as Curriculum · March 2026
Neural networks are lazy optimizers: if retrieval is available, they skip reasoning. The knowledge cutoff — everyone’s least favorite property of LLMs — is the one mechanism that forces causal thinking. A Teacher with hindsight generates the curriculum; the Student’s blindness ensures it must reason through it. A 70B model matches a frontier system with orders of magnitude more parameters on unseen events.
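The Teacher/Student split described above can be sketched as a temporal data partition: the Teacher sees events on both sides of the cutoff and turns each post-cutoff event into an exercise; the Student receives only pre-cutoff context, so recall cannot substitute for causal reasoning. The `Event` shape, the question template, and the grading arrangement are assumptions, not the paper's protocol.

```python
from dataclasses import dataclass

@dataclass
class Event:
    date: str          # ISO date string
    description: str

def hindsight_curriculum(events: list[Event], cutoff: str) -> list[dict]:
    """Teacher-side curriculum builder: every event hidden from the
    Student becomes a held-out reasoning target."""
    visible = [e for e in events if e.date <= cutoff]
    hidden = [e for e in events if e.date > cutoff]
    return [
        {
            "context": [e.description for e in visible],  # all the Student sees
            "question": f"Reason forward from the context past {cutoff}.",
            "answer": e.description,  # held out; the Teacher grades against it
        }
        for e in hidden
    ]

events = [
    Event("2024-01-01", "central bank raises rates"),
    Event("2024-06-01", "housing demand cools"),
]
lessons = hindsight_curriculum(events, cutoff="2024-03-01")
assert lessons[0]["answer"] == "housing demand cools"
```

The design choice worth noting: the blindness is structural, not enforced by filtering at inference time, which is why the Student cannot lazily retrieve its way to the answer.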
Understanding Graph
The Memory · Persisting the Invisible Thinking · March 2026
Understanding — the movement from confusion to clarity — was always ephemeral. When AI reasons in tokens, it becomes storable for the first time. The Understanding Graph captures the full cognitive process: tensions, hypotheses, belief revisions, dead ends. Not what the AI concluded, but how it understood.
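The node taxonomy the abstract names (tensions, hypotheses, belief revisions, dead ends) suggests a simple tree-structured record of the reasoning process. This is a minimal sketch under that assumption; the graph's real schema, edge types, and storage are not specified in the abstract.

```python
from dataclasses import dataclass, field

# Node kinds taken from the abstract's list, plus a terminal "clarity" kind
# (an assumption, since understanding is framed as confusion -> clarity).
NODE_KINDS = {"tension", "hypothesis", "revision", "dead_end", "clarity"}

@dataclass
class Node:
    kind: str
    text: str
    children: list["Node"] = field(default_factory=list)

    def add(self, kind: str, text: str) -> "Node":
        assert kind in NODE_KINDS, f"unknown node kind: {kind}"
        child = Node(kind, text)
        self.children.append(child)
        return child

# Record how an answer was reached, not just what it was:
root = Node("tension", "benchmark score dropped after a refactor")
h1 = root.add("hypothesis", "the tokenizer changed")
h1.add("dead_end", "tokenizer diff is empty")
h2 = root.add("hypothesis", "the eval seed changed")
h2.add("clarity", "restoring the hardcoded seed reproduces the score")
```

Persisting this tree keeps the dead end alongside the conclusion, which is exactly the distinction the abstract draws between what the AI concluded and how it understood.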
Entangled Alignment
The Conscience · When Safety Is the Substrate · March 2026
Post-hoc alignment is cosmetics: effective, removable. If instead the entire training corpus is annotated with identity-anchored evaluative reasoning — so the model never sees thinking without values — there is no unaligned substrate to revert to. Safety and capability become the same learned distribution.
The Ontology of the Alien
The Spark · Escaping the Median Trap · March 2026
Ask an LLM to “be creative” and it converges on the same archetypes. Isolate cognitive modes behind hard boundaries — force generation under alien physics, evaluate with a strict taxonomist — and the system produces structurally novel mechanisms unreachable by unconstrained generation from the same models. The boundary does the creative work.
Fractal Intelligence
The Architecture · Infrastructure for Solving Any Problem · March 2026
Existing frameworks decompose tasks. This paper decomposes concepts — the persistent structure of what a domain is made of — behind a uniform five-surface contract. A prototype processing 100 problems across 20 domains finds that the same conceptual nodes recur everywhere: a single solver serves 17 domains, and 65% of all reasoning reuses existing structure. Intelligence compounds through reuse: the tree thickens with every problem it solves.
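A uniform contract plus a reuse-first registry is the mechanism that lets one solver serve many domains. The sketch below assumes five surfaces named `describe`, `decompose`, `solve`, `verify`, and `integrate` — the abstract does not name the actual five, so these are placeholders — and uses a toy proportional-reasoning node as the shared solver.

```python
from typing import Any, Protocol

class ConceptNode(Protocol):
    """Hypothetical five-surface contract; surface names are assumptions."""
    def describe(self) -> str: ...
    def decompose(self, problem: Any) -> list: ...
    def solve(self, problem: Any) -> Any: ...
    def verify(self, problem: Any, answer: Any) -> bool: ...
    def integrate(self, lesson: Any) -> None: ...

REGISTRY: dict[str, ConceptNode] = {}

def resolve(concept: str, build) -> ConceptNode:
    """Reuse an existing node when one exists; build only on a miss.
    Reuse is what makes the tree thicken rather than restart per problem."""
    if concept not in REGISTRY:
        REGISTRY[concept] = build()
    return REGISTRY[concept]

class Ratio:
    """Toy node: given (a, b, c), find x with a:b = c:x."""
    def describe(self): return "proportional reasoning"
    def decompose(self, p): return [p]            # already atomic
    def solve(self, p): a, b, c = p; return b * c / a
    def verify(self, p, ans): a, b, c = p; return abs(ans * a - b * c) < 1e-9
    def integrate(self, lesson): pass             # no-op in this sketch

node = resolve("ratio", Ratio)
assert node is resolve("ratio", Ratio)  # second problem reuses the same node
assert node.verify((2, 4, 5), node.solve((2, 4, 5)))
```

The 65% reuse figure in the abstract corresponds here to `resolve` hitting the registry instead of building: the same conceptual node answers problems arriving from different domains.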