Kakashi Ventures Accelerator · Turin, 2026

Organisations that
know what they know

A three-layer infrastructure for organisational epistemology — from open storage primitives to a commercial AI-native workspace.

Product Layer · L4–L5

If an AI agent had to operate autonomously in our studio — where would it find everything it needs?

The answer is The Room. Not one more tool in the stack — the environment where work itself happens. Every venture, every challenge, every decision and every piece of knowledge lives here, designed from the first line of code to be as legible to a language model as it is to the humans who built it.

Challenges · Diary · Wiki

Complete operational context. One coherent picture.

The Room is structured around three layers for every active venture. Challenges tracks execution — every task, owner, and dependency in real time. Diary is the episodic memory — timestamped, authored, traceable. Wiki is the semantic layer — decisions taken, frameworks adopted, market contexts mapped. Together they make a venture fully navigable by an agent without any human intermediary.

Knowledge Flywheel

Work that becomes capital — automatically.

Every action the team takes produces a signal. That signal enters the system as a typed Knowledge Object. The engine scores it, ages it, and links it to related claims. Contradictions surface. Stale signals decay. Urgent open questions escalate automatically. The Room turns daily execution into permanent, computable epistemic capital — without asking anyone to maintain it.

AI-Native by Design

Agents are primary users — not features.

The Room is the first workspace built on the premise that AI agents are first-class consumers of the system. Every widget exposes a stable context object. The Widget Contract is the protocol that makes every component AI-compatible by definition. The same operational picture a team lead reads at 9am is the picture an agent navigates at 3am. One interface, two kinds of intelligence.
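As a sketch, a widget honouring the Widget Contract might expose a small typed context object. The names below (`WidgetContext`, `describe`, `ChallengeBoard`) are illustrative assumptions, not the real protocol:

```typescript
// Hypothetical sketch of the Widget Contract: every widget exposes a
// stable, typed context object an agent can read without touching the UI.
interface WidgetContext {
  widgetId: string;
  ventureId: string;
  kind: "challenges" | "diary" | "wiki";
  summary: string;           // human-readable state, stable across renders
  entities: string[];        // Knowledge Object ids this widget surfaces
  updatedAt: string;         // ISO timestamp of the last state change
}

interface AIReadableWidget {
  // the same picture a team lead reads at 9am and an agent navigates at 3am
  describe(): WidgetContext;
}

class ChallengeBoard implements AIReadableWidget {
  constructor(private ventureId: string, private open: string[]) {}
  describe(): WidgetContext {
    return {
      widgetId: `challenges:${this.ventureId}`,
      ventureId: this.ventureId,
      kind: "challenges",
      summary: `${this.open.length} open challenges`,
      entities: this.open,
      updatedAt: new Date().toISOString(),
    };
  }
}

const board = new ChallengeBoard("v-042", ["ko-17", "ko-23"]);
const ctx = board.describe();
```

The design point is stability: agents consume `describe()`, not the rendered component, so the visual layer can change without breaking agent workflows.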

Epistemic Engine · L2–L3

From ancient Greek οἶδα — "I know because I have seen."

AI agents cannot distinguish what is known from what was once believed. Without epistemic structure, a verified strategic decision and an abandoned hypothesis from two years ago are treated as equivalent — because nothing in the substrate said otherwise. Better retrieval cannot fix this. The problem is structural. OIDA fixes it at the substrate level.

Nine Epistemic Classes

Not categories — commitments.

Every piece of knowledge entering OIDA is assigned one of nine epistemic classes before storage: DECISION, CONSTRAINT, EVIDENCE, NARRATIVE, PLAN, EVALUATION, OBSERVATION, HYPOTHESIS, QUESTION. Each carries a specific seed value, a decay profile derived from justification logic, and a seven-axis coordinate (KOC) that allows structural similarity to be computed in O(1) — no database lookup required.

The class determines how fast a Knowledge Object ages, how much importance it starts with, and what kind of agent reasoning it triggers. A DECISION is valid until explicitly superseded. A HYPOTHESIS has a 50-day half-life. A QUESTION becomes more urgent the longer it stays unanswered.

Knowledge Gravity Engine

Importance that emerges from real signals — not editorial assignment.

The KGE computes a K-score for every active Knowledge Object every six hours from four forces: usage (how recently and frequently a KO was retrieved), evidence (new inbound SUPPORTS edges in a 14-day window), gravity (importance propagated through the signed edge graph), and contradiction penalty (active CONTRADICTS edges). The update rule converges to a provable fixed point. No language model sets a score. No human refreshes a weight.
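One KGE cycle can be sketched as a damped update over the four forces. The force weights (0.4/0.3/0.3) and the damping factor below are invented for illustration; the real update rule and its convergence proof are OIDA internals. The signed edge coefficient uses the −0.6 CONTRADICTS value from the text:

```typescript
// Minimal sketch of one KGE cycle over four forces, with made-up weights.
interface Edge { from: number; to: number; coeff: number } // SUPPORTS > 0, CONTRADICTS < 0

function kgeCycle(
  k: number[],          // current K-scores
  usage: number[],      // recency/frequency of retrieval, normalised to [0,1]
  evidence: number[],   // inbound SUPPORTS edges in the 14-day window, normalised
  edges: Edge[],
  damping = 0.5,
): number[] {
  // local forces: usage and fresh evidence
  const next = k.map((_, i) => 0.4 * usage[i] + 0.3 * evidence[i]);
  // gravity propagated through the signed edge graph (contradiction included)
  for (const e of edges) next[e.to] += 0.3 * e.coeff * k[e.from];
  // damping < 1 makes the iteration a contraction, so it converges to a fixed point
  return k.map((v, i) => (1 - damping) * v + damping * Math.max(0, next[i]));
}

// iterate until the scores stabilise
let k = [0.5, 0.5];
const usage = [0.9, 0.2], evidence = [0.6, 0.1];
const edges: Edge[] = [{ from: 0, to: 1, coeff: -0.6 }]; // node 0 CONTRADICTS node 1
for (let i = 0; i < 50; i++) k = kgeCycle(k, usage, evidence, edges);
```

After a few cycles the contradicted node's score settles well below the contradicting node's, with no language model or human in the loop: the ranking emerges from the signals alone.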

Contradiction as First-Class Signal

Negative gravity is not a bug. It is the mechanism.

A CONTRADICTS edge carries a semantic coefficient of −0.6, generating negative gravity that actively suppresses the K-score of the contradicted node cycle by cycle. Agents receive a ranked epistemic map — what is settled, what is contested, and what is quietly losing ground to newer evidence.

Organisational Belief System

The same knowledge, scored differently for different organisations.

OBS is a parametric layer that captures how a specific organisation evaluates knowledge along three continuous axes: reasoning formality, representation discreteness, and validation orientation. KVA operates three distinct OBS profiles: Technical (analytical, strict), Venture (adaptive, evidence-driven), Consulting (contextual, narrative-weighted). The same observation can produce different K-score trajectories under each profile. Epistemic culture becomes explicit and auditable.
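One way the three axes could parameterise scoring is sketched below. The axis values and the modulation rule are invented for illustration; only the profile names and axes come from the text:

```typescript
// Illustrative OBS profiles: three continuous axes per organisation.
// Axis values and the modulation formula are assumptions, not KVA's real ones.
interface OBSProfile {
  formality: number;     // reasoning formality, 0..1
  discreteness: number;  // representation discreteness, 0..1
  validation: number;    // validation orientation: evidence- vs narrative-weighted
}

const PROFILES: Record<string, OBSProfile> = {
  Technical:  { formality: 0.9, discreteness: 0.8, validation: 0.9 },
  Venture:    { formality: 0.5, discreteness: 0.4, validation: 0.7 },
  Consulting: { formality: 0.3, discreteness: 0.2, validation: 0.3 },
};

// One possible modulation: evidence-oriented cultures amplify well-supported
// KOs; narrative-weighted ones discount the evidence term instead.
function modulate(rawK: number, evidenceShare: number, p: OBSProfile): number {
  return rawK * (1 + p.validation * (evidenceShare - 0.5));
}

const raw = 0.6, evidenceShare = 0.9; // a strongly evidence-backed observation
const underTechnical = modulate(raw, evidenceShare, PROFILES.Technical);
const underConsulting = modulate(raw, evidenceShare, PROFILES.Consulting);
```

The same observation scores higher under the Technical profile than under Consulting, which is exactly the auditability claim: the divergence is traceable to explicit parameters, not to opaque model behaviour.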

Storage Layer · L1 · MIT Licence

SQL stores events. Graphs store connections. Vectors find similarity. None of them knows what the organisation believes.

The standard architecture — PostgreSQL + Neo4j + pgvector — creates three separate consistency models to maintain, asynchronous sync bridges that drift, and application code that simulates behaviours which should be database infrastructure. An AI agent querying this stack gets a frozen snapshot: a 2022 GDPR directive and yesterday's clinical update carry identical weight. EpistemicDB adds the missing fourth dimension: epistemic state, natively, without application code maintaining it.

Four Schema Blocks

Every epistemic primitive has its place.

Knowledge Core (KnowledgeObject, KOEdge, KOSource) is the irreplaceable epistemic nucleus. Anchors (KnowledgeAnchor, AnchorEdge) are the cognitive navigation nodes and gravity sources. Retrieval Index (embeddings via pgvector, plus KOScoreSnapshot) powers hybrid retrieval. Temporal System (KODecayConfig, KGCycleLog, KGEvictionLog) is the immutable audit trail.

The chunk is canonical. The graph is a derived projection. Never the reverse.
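The four blocks can be sketched as TypeScript types. Only the entity names come from the text; every field beyond them is an assumption about what such a schema would plausibly carry:

```typescript
// Type-level sketch of the four schema blocks. Entity names are from the
// text; all field names and types are illustrative assumptions.

// Knowledge Core: the canonical chunk and its typed relations
interface KnowledgeObject { id: string; class: string; text: string; createdAt: string }
interface KOEdge { from: string; to: string; kind: "SUPPORTS" | "CONTRADICTS"; coeff: number }
interface KOSource { koId: string; uri: string }

// Anchors: cognitive navigation nodes and gravity sources
interface KnowledgeAnchor { id: string; label: string }
interface AnchorEdge { anchorId: string; koId: string }

// Retrieval Index: derived and rebuildable from the canonical chunks
// (embeddings live in pgvector alongside the score snapshots)
interface KOScoreSnapshot { koId: string; k: number; cycleAt: string }

// Temporal System: append-only audit trail
interface KODecayConfig { class: string; halfLifeDays: number | null }
interface KGCycleLog { cycleAt: string; updated: number }
interface KGEvictionLog { koId: string; evictedAt: string; reason: string }

const ko: KnowledgeObject = {
  id: "ko-1",
  class: "DECISION",
  text: "Adopt usage-based pricing for Q3.",
  createdAt: "2026-01-01T00:00:00Z",
};
```

Note how the layering enforces the canonical-chunk rule: the Retrieval Index and graph projections reference KnowledgeObject ids and can always be rebuilt from the core, never the reverse.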

Epistemic Query Language

EQL has the same architectural dignity as SQL.

A TypeScript template-literal string is not a language. A language has a formal grammar, standalone .eql files, a CLI, syntax highlighting, and a Language Server. EQL has all of these — built on a nearley.js grammar with a full lexer, AST types, a nine-rule type checker, a canonical idempotent formatter, and error reporting with source locations. eql query · eql validate · eql format · eql repl.

Example: FETCH KO WHERE entity = "pricing" AND decay_score > 0.35 ORDER BY K DESC — returns living knowledge ranked by emergent importance, not a flat list of documents ranked by string similarity.
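What that query computes can be sketched as a plain TypeScript filter over KO records (EQL itself is parsed from a nearley.js grammar, which this sketch does not attempt to reproduce):

```typescript
// Semantics of the example query, sketched as a filter + sort.
// Field names mirror the query; the record shape is an assumption.
interface KO { id: string; entity: string; decay_score: number; k: number }

// FETCH KO WHERE entity = "pricing" AND decay_score > 0.35 ORDER BY K DESC
function fetchPricing(kos: KO[]): KO[] {
  return kos
    .filter(ko => ko.entity === "pricing" && ko.decay_score > 0.35)
    .sort((a, b) => b.k - a.k);
}

const result = fetchPricing([
  { id: "ko-1", entity: "pricing", decay_score: 0.9, k: 0.4 },
  { id: "ko-2", entity: "pricing", decay_score: 0.2, k: 0.9 }, // stale: filtered out
  { id: "ko-3", entity: "pricing", decay_score: 0.8, k: 0.7 },
  { id: "ko-4", entity: "hiring",  decay_score: 0.9, k: 0.9 }, // wrong entity
]);
```

The decisive difference from document search: ko-2 has the highest K but is excluded because it has decayed, and the survivors are ranked by emergent importance rather than string similarity.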

Open Source · Independent

The scientific framework survives any evolution of the product.

EpistemicDB is available under the MIT licence and is designed to be adopted, forked, and extended independently of OIDA or The Room. The three-level separation — storage, engine, product — ensures that the epistemic model, the gravity engine, and the retrieval architecture are publishable and citable on their own terms. Building on EpistemicDB does not require licensing the engine or subscribing to the product. The moat is the protocol: KO schema, decay logic, hybrid retrieval scoring, EQL as language.

Architecture · Three Levels of Independence

One stack.
Three clean separations.

L4–L5 — Product
The Room

Commercial workspace · Brook AI orchestration · Capital & Venture OS · KVA instance · three OS configurations

L2–L3 — Engine
OIDA

Publishable framework · KGE + OBS modulation · hybrid retrieval · memory zones · citable independently

L1 — Storage
EpistemicDB

MIT open source · PostgreSQL + pgvector · standalone schema · EQL query language · no product dependency

Get involved

Request early access to The Room.

The Room is in active deployment. If you run a venture studio, investment fund, or knowledge-intensive team — we'd like to build it with you.