Founding Document

Quadratic Intelligence: The Birth of a Field

In 1948, Claude Shannon founded Information Theory with a single paper. In 2025, a new field emerges: the study of how intelligence scales in distributed systems.

By Christopher Thomas Trevethan · January 6, 2026

"The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point."

— Claude Shannon, "A Mathematical Theory of Communication" (1948)

With that sentence, Claude Shannon didn't just write a paper. He founded a field. Information Theory emerged fully formed from a single publication—complete with mathematical foundations, measurable quantities, and provable limits. Before Shannon, "information" was a vague concept. After Shannon, it was a precise mathematical object that could be quantified, transmitted, and optimized.

I believe we're at a similar moment. Not for information, but for intelligence.

The Central Question

Shannon asked: How do we measure and transmit information?

Quadratic Intelligence asks: How does intelligence scale in distributed systems?

This isn't a metaphor or an analogy. It's a mathematical question with a mathematical answer. When N agents in a distributed network can share patterns and synthesize insight with semantically similar peers, how does the collective intelligence of the system grow?

The answer is quadratic. And that answer defines a new field.

Formal Definition

Quadratic Intelligence

Quadratic Intelligence is the emergent property of distributed systems where N agents share outcome packets—compact representations of insight, not raw data, not coordination signals, but distilled patterns that preserve meaning—and route directly to semantically similar peers.

This creates N(N-1)/2 unique synthesis opportunities. Intelligence scales as Θ(N²) while each agent's communication cost remains O(log N).
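The counting behind these two claims can be checked in a few lines. This is an illustrative sketch: `synthesis_opportunities` is the pairs formula from the definition, and `routing_hops` simply models a log-structured lookup (the O(log N) claim), not any particular implementation:

```python
import math

def synthesis_opportunities(n: int) -> int:
    """Unique pairs among n agents: n(n-1)/2, which grows as Theta(n^2)."""
    return n * (n - 1) // 2

def routing_hops(n: int) -> int:
    """Illustrative per-agent lookup cost for log-structured routing: O(log n)."""
    return max(1, math.ceil(math.log2(n)))

for n in (10, 100, 1_000, 10_000):
    print(f"N={n:>6}: {synthesis_opportunities(n):>12,} pairs, ~{routing_hops(n)} hops")
```

Per-agent cost grows from roughly 4 hops at N=10 to roughly 14 at N=10,000, while the pair count grows from 45 to nearly 50 million: that gap between the two curves is the whole argument.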

The field of Quadratic Intelligence studies how this scaling law operates across different architectures, domains, and applications—and how to harness it for collective benefit.

The Shannon Parallel

The parallel to Information Theory isn't accidental. Both fields emerge from recognizing that something previously treated as vague or qualitative can be made mathematically precise.

Information Theory

Claude Shannon, 1948

Before: "Information" was a vague concept—content, meaning, news.

After: Information became a measurable quantity with precise units (bits), transmission limits (channel capacity), and optimization methods (coding theory).

Key insight: Information can be separated from meaning and treated as a mathematical object.

Quadratic Intelligence

Christopher Trevethan, 2025

Before: "Collective intelligence" was a vague concept—wisdom of crowds, emergent behavior, network effects.

After: Distributed intelligence becomes a measurable quantity with precise scaling laws (N²), communication bounds (log N), and optimization methods (semantic routing).

Key insight: Intelligence can scale quadratically when agents share insight rather than data.

Shannon's breakthrough was recognizing that information could be quantified independent of its meaning. My breakthrough was recognizing that intelligence could be scaled independent of centralized coordination—and that the scaling law is quadratic.

The Central Questions of the Field

Every field is defined by the questions it asks. Quadratic Intelligence asks:

Foundational Questions

How does intelligence scale in distributed systems?
Answer: Quadratically, when agents share insight payloads (outcome packets) and route by semantic similarity. N agents create N(N-1)/2 synthesis opportunities.
What is the communication cost of distributed intelligence?
Answer: Logarithmic per agent (O(log N)). Any mechanism that routes by semantic similarity will do—a DHT, a vector database, or a registry—so long as it finds similar neighbors and returns outcome packets at logarithmic communication cost. Network intelligence grows quadratically while each agent's burden stays manageable.
What must agents share to achieve collective intelligence?
Answer: Outcome packets—compact representations of patterns and results—not raw data, not model parameters, not coordination signals.
How do agents find each other in distributed networks?
Answer: Semantic routing via any similarity-based mechanism—DHT, vector database, registry, or other architectures. Agents create semantic fingerprints of their patterns; similar fingerprints route to each other in O(log N) hops.
Is quadratic intelligence architecture-agnostic?
Answer: Yes. The scaling law holds across P2P, hybrid, and even centralized implementations. The architecture determines efficiency; the math determines scaling.
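The semantic-routing answer above can be sketched with plain cosine similarity. The agent names and fingerprint vectors here are hypothetical, and a brute-force scan stands in for the O(log N) index (DHT or vector database) a real deployment would use:

```python
import math

def cosine(a, b):
    """Cosine similarity between two fingerprint vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical agents with semantic fingerprints (embedding vectors).
agents = {
    "oncology-a": [0.9, 0.1, 0.0],
    "oncology-b": [0.8, 0.2, 0.1],
    "agronomy-c": [0.0, 0.1, 0.9],
}

def route(fingerprint, peers, k=2):
    """Return the k peers most semantically similar to an outcome packet."""
    ranked = sorted(peers, key=lambda name: cosine(fingerprint, peers[name]),
                    reverse=True)
    return ranked[:k]

# An oncology-flavored packet routes to the two oncology agents.
print(route([0.85, 0.15, 0.05], agents))
```

A real index would answer the same nearest-neighbor query without scanning every peer; the architecture changes the cost of `route`, not what it returns—which is the architecture-agnosticism claim above.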

Priority Claim

I'm staking a claim to the discovery of Quadratic Intelligence as a field. For the historical record, here is the timeline:

Discovery Timeline

April 2025
Started Verve AI, a multi-agent business intelligence system. Deep work on agent coordination and vector databases.
June 2025
Mother-in-law diagnosed with cancer. Pivoted to building Compass, a cancer navigation AI agent.
June 16, 2025
The epiphany. Visualized the complete architecture: distributed agents sharing outcome packets, semantic routing to matched peers, N(N-1)/2 synthesis opportunities—collective intelligence that scales quadratically.
June-Oct 2025
Filed 39 provisional patents covering healthcare, agriculture, autonomous vehicles, industrial IoT, and the core protocol.
July 10, 2025
MIT's NANDA paper appeared on arXiv—24 days after my epiphany. They built agent coordination. I built coordination AND quadratic scaling.
September 2025
Rob van Kranenburg (IoT Council founder) described QIS as "a perfect underlying system for when we have full coverage of self driving cars."
December 2025
This document: formal establishment of Quadratic Intelligence as a field of study.

Applications Across Domains

Like Information Theory, Quadratic Intelligence has applications across every domain where distributed systems process knowledge:

Healthcare

10,000 cancer patients create ~50 million pattern-synthesis opportunities. Treatment insights propagate in real time—enabling treatment optimization, drug monitoring, and early-warning diagnostic systems.
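As a quick check of the ~50 million figure, applying the pairs formula to 10,000 patients:

```python
patients = 10_000
pairs = patients * (patients - 1) // 2  # unique patient pairs: n(n-1)/2
print(f"{pairs:,} synthesis opportunities")  # 49,995,000, i.e. ~50 million
```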

Agriculture

Millions of farms synthesize crop patterns across soil types, weather conditions, and pest pressures. A breakthrough in Argentina reaches a farmer in Kenya instantly.

Autonomous Systems

Self-driving vehicles share near-miss patterns and edge cases. Every car learns from every other car's experience. Safety scales quadratically with fleet size.

Scientific Research

Distributed instruments synthesize observations. A pattern detected in one dataset is immediately available to every researcher working on similar problems.

The field is domain-agnostic. If you have distributed or distributable data sources and can define semantic similarity, Quadratic Intelligence applies.

The Work Ahead

Open Questions for the Field

Quadratic Intelligence is a new field. Like Information Theory in 1948, the foundational mathematics are established, but enormous work remains.

I've addressed the field's open questions in the QIS Protocol documentation—but I'm only one person. These solutions will evolve and improve as more minds join the effort. We need everyone working on this, not just me. The possibilities are limitless, and they will only grow as networks compete to deliver the sharpest insights and save the most lives across every domain.

An Invitation

Shannon didn't build the internet. He established the mathematical foundations that made it possible. The engineers and scientists who followed built on those foundations for decades.

I've established the mathematical foundations for Quadratic Intelligence. The QIS Protocol is one implementation—the first implementation—but it won't be the last. The scaling law is real. The applications are vast. The work is just beginning.

To mathematicians: verify the proofs. To distributed systems engineers: optimize the routing. To healthcare researchers: deploy the first clinical implementations. To entrepreneurs: build the companies that will take this everywhere.

The field exists. The math works. Now we build.

In 1948, Shannon founded Information Theory and laid the groundwork for the digital age.

In 2025, I'm establishing Quadratic Intelligence—the study of how distributed systems scale collective knowledge.

The math is public. The protocol is documented. The patents protect implementation while ensuring the science remains open.

Check the math. Prove me wrong or help me build it.

Christopher Thomas Trevethan
Discoverer, Quadratic Intelligence
Inventor, QIS Protocol
Founder, Yonder Zenith LLC
