
The Thesis

Defining the Question

TL;DR

Evolution produces complexity; intelligence requires abstraction. Are these the same phenomenon at different scales, or fundamentally different? The answer has implications for AGI, origin-of-life research, and what "understanding" means.

Sections
Definitions · Chollet's View · Kauffman's View · The Gap · Stakes

Key Definitions

Before asking whether evolution produces intelligence, we must define our terms precisely.

Emergence

Definition

Emergence: Macro-level properties that arise from micro-level interactions but cannot be predicted from (or reduced to) knowledge of the components alone.

Self-replication in BFF is emergent: nothing in the instruction set "wants" to copy itself. Replication arises from the dynamics of program interaction.

Strong vs Weak Emergence

Weak emergence is surprising but derivable in principle: simulate the micro-dynamics and the macro-property appears (BFF replicators are weakly emergent in this sense). Strong emergence posits macro-properties irreducible even in principle; whether any genuine case exists remains contested.

Complexity

Definition (Adami)

Physical complexity: The amount of information a system stores about its environment. Not Shannon entropy (which peaks for random systems), but functional information—structure that matters for survival.

\[ C = H_{max} - H_{observed} \]

Where \(H_{max}\) is maximum possible entropy and \(H_{observed}\) is actual entropy. Complexity is constrained randomness—structure that deviates from maximum disorder.
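Adami's measure can be estimated from an aligned population of sequences by summing per-site information. A minimal sketch, assuming a population of equal-length strings (the function name `physical_complexity` and the toy population are illustrative, not from Adami's work):

```python
import math
from collections import Counter

def physical_complexity(population):
    """Estimate C = H_max - H_observed, summed per site, for an
    aligned population of equal-length sequences. H_max is
    log2(alphabet size) per site; a fully conserved site contributes
    H_max bits, a fully random site contributes ~0."""
    length = len(population[0])
    alphabet = {ch for seq in population for ch in seq}
    h_max = math.log2(len(alphabet))
    complexity = 0.0
    for i in range(length):
        counts = Counter(seq[i] for seq in population)
        total = sum(counts.values())
        h_obs = -sum((c / total) * math.log2(c / total)
                     for c in counts.values())
        complexity += h_max - h_obs
    return complexity

# First three sites conserved (2 bits each), last site random (~0 bits):
pop = ["ACGT", "ACGA", "ACGC", "ACGG"]
print(physical_complexity(pop))  # → 6.0
```

Note how the measure rewards constrained sites, not information-dense random ones, matching the "constrained randomness" reading above.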

Abstraction

Definition (Chollet)

Abstraction: The process of extracting invariant structure from specific instances. Mapping particulars to general patterns that apply across contexts.

Example: Seeing the pattern "all items move right" across different input grids requires abstracting over:

  • Grid size and shape
  • The number, color, and shape of the items
  • Each item's absolute position
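A toy sketch of this kind of abstraction: one rule, applied unchanged across grids of different shapes and contents (the function name `shift_right` and the grid encoding are illustrative assumptions, not ARC's actual format):

```python
def shift_right(grid):
    """Apply the abstract rule 'every item moves one cell right',
    regardless of grid size or contents (0 = empty cell)."""
    return [[0] + row[:-1] for row in grid]

# The same abstraction transfers across grids of different shapes:
small = [[1, 0, 0]]
large = [[0, 2, 0, 0], [3, 0, 0, 0]]
print(shift_right(small))  # → [[0, 1, 0]]
print(shift_right(large))  # → [[0, 0, 2, 0], [0, 3, 0, 0]]
```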

Intelligence

Definition (Chollet, 2019)

Intelligence: The efficiency of skill-acquisition on previously unknown tasks. Not how much you know, but how quickly you learn new things.

\[ \text{Intelligence} = \frac{\text{Generalization}}{\text{Training Experience} + \text{Prior Knowledge}} \]

This is why GPT-4 scoring high on bar exams doesn't prove intelligence: it was trained on essentially the entire internet. The question is: can it solve novel problems with minimal examples?
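As a toy numeric illustration of the ratio (the numbers and the helper `intelligence_score` are made up for illustration; Chollet's 2019 formalism is considerably richer):

```python
def intelligence_score(generalization, experience, priors):
    """Toy version of Chollet's ratio: skill gained on unseen tasks,
    discounted by the experience and priors consumed to get it."""
    return generalization / (experience + priors)

# Same generalization, wildly different information budgets (arbitrary units):
few_shot_learner = intelligence_score(generalization=0.8, experience=5, priors=10)
web_scale_model  = intelligence_score(generalization=0.8, experience=1e6, priors=1e5)
print(few_shot_learner > web_scale_model)  # → True
```

The point of the denominator: matching performance after consuming orders of magnitude more data counts as *less* intelligence, not more.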

Chollet's Thesis

Core Claim

Intelligence is skill-acquisition efficiency over unknown tasks. Current AI optimizes for task performance, not for learning efficiency. This is why scaling fails.

The ARC Benchmark

"On the Measure of Intelligence" (2019) introduced ARC to test:

  1. Skill acquisition from a handful of demonstration examples per task
  2. Generalization to tasks deliberately absent from any training set
  3. Reasoning that relies only on core knowledge priors, not acquired expertise

Core Knowledge (Elizabeth Spelke)

ARC assumes humans are born with certain cognitive building blocks (Chollet's list, following Spelke):

  1. Objectness: the world parses into cohesive objects obeying elementary physics
  2. Agentness: some objects are agents with goals and intentions
  3. Number: elementary counting and arithmetic
  4. Geometry: basic notions of distance, orientation, and topology

Why LLMs Fail

Despite trillions of parameters, LLMs perform poorly on ARC because:

  1. Memorization: they retrieve patterns similar to their training data, but ARC tasks are designed to be absent from it
  2. Interpolation: they blend known patterns rather than extrapolating to genuinely new ones
  3. No search: they generate tokens sequentially and cannot backtrack or test hypotheses

The 2025 ARC winner was a 7-million parameter recursive model that beat 671-billion parameter systems. Architecture matters more than scale.

Kauffman's Thesis

Core Claim

Life is self-organized criticality at the edge of chaos. Order emerges spontaneously when systems reach sufficient complexity. Evolution exploits, not creates, this order.

The Origins of Order

Stuart Kauffman's work suggests that:

  1. Order arises "for free" in sufficiently interconnected networks, before selection acts
  2. Autocatalytic sets form spontaneously once molecular diversity crosses a threshold
  3. Selection refines this spontaneous order rather than generating it from scratch

RAF Theory

A Reflexively Autocatalytic Food-generated (RAF) set is a collection of reactions where:

  1. Reflexively autocatalytic: every reaction is catalyzed by at least one molecule produced within the set (or supplied by the food set)
  2. Food-generated: every reactant can be built up from a basic "food" set of freely available molecules using only reactions in the set

RAF sets emerge with probability approaching 1 as chemical diversity increases. Life's origin may have been inevitable.
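The standard test for RAF-ness, Hordijk and Steel's reduction algorithm, can be sketched in a few lines. A minimal sketch, assuming a simple tuple encoding of reactions (the encoding and names here are illustrative, not a fixed format from the literature):

```python
def max_raf(reactions, food):
    """Hordijk & Steel-style reduction: repeatedly drop any reaction
    whose reactants aren't reachable from the food set, or that has no
    catalyst in the reachable set. What survives is the maximal RAF
    (empty if none exists). Each reaction: (reactants, products, catalysts)."""
    active = list(reactions)
    while True:
        # Closure: all molecules producible from food via active reactions.
        reachable = set(food)
        changed = True
        while changed:
            changed = False
            for reactants, products, _ in active:
                if reactants <= reachable and not products <= reachable:
                    reachable |= products
                    changed = True
        kept = [r for r in active
                if r[0] <= reachable and r[2] & reachable]
        if len(kept) == len(active):
            return kept
        active = kept

# Two reactions that catalyze each other, fed by food molecules a and b:
rxns = [({"a"}, {"x"}, {"y"}),   # a -> x, catalyzed by y
        ({"b"}, {"y"}, {"x"})]   # b -> y, catalyzed by x
print(len(max_raf(rxns, {"a", "b"})))  # → 2: the pair forms a RAF set
```

Dropping either catalyst from the reachable set collapses the whole structure, which is why RAF sets behave as all-or-nothing units of chemical organization.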

Edge of Chaos

Ordered ←——————— Edge of Chaos ———————→ Chaotic
(frozen)          (complex)            (random)
     |                 |                   |
crystals         life, minds           gas, noise
  

Systems at the edge of chaos exhibit maximal computational capacity. Too ordered: can't adapt. Too chaotic: can't maintain information. Life operates at the boundary.
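Kauffman's classic demonstration of these regimes uses random Boolean networks: N nodes, each updated by a random Boolean function of K randomly chosen inputs. A minimal sketch (parameters, seed, and function name are arbitrary choices for illustration) that measures the attractor cycle length as K varies:

```python
import random

def rbn_cycle_length(n, k, steps=2000, seed=0):
    """Simulate a Kauffman random Boolean network: n nodes, each
    driven by a random Boolean function of k random inputs. Returns
    the length of the attractor cycle the dynamics fall into, or
    None if no repeat is seen within `steps` updates."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    state = tuple(rng.randint(0, 1) for _ in range(n))
    seen = {}
    for t in range(steps):
        if state in seen:
            return t - seen[state]  # attractor cycle length
        seen[state] = t
        state = tuple(
            tables[i][sum(state[j] << b for b, j in enumerate(inputs[i]))]
            for i in range(n)
        )
    return None

# Kauffman's result: K=1 networks freeze into short cycles, K=2 sits
# near the critical boundary, and K>=3 wanders chaotically with cycle
# lengths that explode as n grows.
for k in (1, 2, 4):
    print(k, rbn_cycle_length(n=14, k=k))
```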

The Gap

Here's the problem: Kauffman explains why complexity emerges. Chollet measures intelligence. Are these the same thing?

Kauffman: Emergence

  • Order arises spontaneously
  • Autocatalysis from random chemistry
  • Edge of chaos = maximal complexity
  • Evolution channels order

Chollet: Intelligence

  • Skill-acquisition efficiency
  • Transfer to novel problems
  • Abstraction from examples
  • Core knowledge priors

The Central Question

Does "edge of chaos" complexity eventually become abstraction? Or is abstraction a fundamentally different phenomenon requiring explicit structure (like brains)?

Possible answers:

  1. Continuity: Intelligence is just complexity at scale. Given enough time and selection pressure, abstraction emerges from replication.
  2. Discontinuity: Intelligence requires specific architectural innovations (cortex, attention, memory) that evolution produces but doesn't guarantee.
  3. Duality: Complexity creates the substrate for intelligence, but intelligence requires additional environmental coupling.

Why This Matters

For AGI

If intelligence emerges from complexity:

  • Scaling plus open-ended, evolution-like training could eventually produce abstraction
  • Artificial-life substrates like BFF sit on the same continuum as minds

If intelligence requires specific structure:

  • Scaling alone plateaus; the bottleneck is architectural innovation
  • Results like a 7-million parameter model beating 671-billion parameter systems become the expected pattern, not an anomaly

For Origin of Life

If abstraction is continuous with replication:

  • Any self-replicating system, given time and selection pressure, trends toward intelligence
  • Origin-of-life research and AGI research are studying one phenomenon

If abstraction is discontinuous:

  • Replication may be cosmically common while minds stay rare
  • The origin of life and the origin of intelligence need separate explanations

For Philosophy

The question touches on:

  • Reductionism: whether macro-level understanding can be recovered from micro-level dynamics
  • What "understanding" means, beyond behavioral performance
  • Whether mind is substrate-independent or tied to a particular physical organization

Next Steps

Continue your journey: