Reading List

Curated Sources, Organized by Theme

TL;DR

Organized reading paths through artificial life, cognitive science, and complexity theory. Start with overviews, go deep with primary sources. Each section has a "Start Here" recommendation.

Sections
Foundations · Emergence · ALife · Intelligence · Program Synthesis · Evo-Learning · Open Questions

Foundational Papers

⭐ Start Here

On the Measure of Intelligence — Chollet, 2019

The paper that defines intelligence as skill-acquisition efficiency and introduces ARC. Essential reading for understanding the abstraction benchmark.

Computational Life: How Well-formed, Self-replicating Programs Emerge from Simple Interaction — Agüera y Arcas et al., 2024

The BFF paper. Random programs in a primordial soup spontaneously evolve self-replicators.

A Mathematical Theory of Communication — Claude Shannon, 1948

The foundation of information theory. Defines entropy, channel capacity, and the mathematical basis for measuring information.
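Shannon's entropy is small enough to compute in a few lines. A minimal sketch (the `entropy` helper is mine, for illustration, not from the paper):

```python
import math
from collections import Counter

def entropy(symbols):
    """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies, in bits."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(entropy("aabb"))              # → 1.0 (uniform two-symbol source: 1 bit/symbol)
print(entropy("aaaabbbbccccdddd"))  # → 2.0 (uniform four-symbol source: 2 bits/symbol)
```

Entropy is maximal when all symbols are equally likely and zero for a deterministic source, which is what makes it a measure of information rather than of meaning.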

Theory of Self-Reproducing Automata — John von Neumann, 1966 (posthumous)

The theoretical foundation for artificial life. Proves that self-reproduction is possible in a cellular automaton.

Emergence & Complexity

⭐ Start Here

The Origins of Order — Stuart Kauffman, 1993

Self-organization and selection in evolution. Introduces autocatalytic sets, NK landscapes, and the "edge of chaos" concept.
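The NK model itself is easy to sketch: N loci, each contributing a fitness value that depends on its own state plus K neighbours, so larger K yields more rugged landscapes. A toy version (function names are mine, not Kauffman's):

```python
import itertools
import random

def make_nk_tables(n, k, seed=0):
    """One lookup table per locus: a random contribution for each of the
    2^(K+1) joint states of the locus and its K neighbours."""
    rng = random.Random(seed)
    return [{bits: rng.random() for bits in itertools.product((0, 1), repeat=k + 1)}
            for _ in range(n)]

def nk_fitness(genome, tables, k):
    """Fitness = mean contribution; locus i depends on loci i..i+K (wrapping)."""
    n = len(genome)
    return sum(tables[i][tuple(genome[(i + j) % n] for j in range(k + 1))]
               for i in range(n)) / n

tables = make_nk_tables(n=8, k=2)
print(nk_fitness((1, 0, 1, 1, 0, 0, 1, 0), tables, k=2))  # a value in [0, 1]
```

With K = 0 the landscape is smooth and single-peaked; as K approaches N - 1, contributions conflict and local optima proliferate, the ruggedness Kauffman connects to the edge of chaos.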

What Is Complexity? — Christoph Adami, 2002

Defines physical complexity as information an organism stores about its environment. Key for understanding complexity metrics.

Evolution of Biological Complexity — Adami, Ofria, Collier, 2000

Using Avida to show that complexity must increase under selection in fixed environments. Genomes as "Maxwell's Demon."

Hordijk & Steel, 2019

RAF (reflexively autocatalytic and food-generated) sets: a mathematical framework for how self-sustaining chemical reaction networks emerge.

Logical Depth and Physical Complexity — Charles Bennett, 1988

Logical depth measures the computational content of an object: the time required to generate it from its shortest description. Distinguishes organized complexity from randomness.

Artificial Life

⭐ Start Here

Avida: A Software Platform for Research in Computational Evolutionary Biology — Ofria & Wilke, 2004

The most important artificial life platform. Self-replicating digital organisms that evolve complex behaviors.

The Evolutionary Origin of Complex Features — Lenski, Ofria, Pennock, Adami, 2003 (Nature)

Using Avida to show that complex features evolve through sequences of simpler stepping stones. Landmark empirical result.

Coevolution Drives the Emergence of Complex Traits and Promotes Evolvability — Zaman et al., 2014

Host-parasite coevolution dramatically increases complexity. Arms races force organisms to develop sophisticated behaviors.

An Approach to the Synthesis of Life — Tom Ray, 1990

Tierra, the first major artificial life system: self-replicating assembly programs competing in shared memory. Observed parasites, hyperparasites, and immunity.

Open Problems in Artificial Life — Bedau et al., 2000

Defines the major open problems in ALife. Problem #1: "Generate unbounded evolutionary dynamics." Still unsolved in 2025.

ALife Encyclopedia

Overview of the quest for systems that continue producing novelty indefinitely. Defines hallmarks of open-endedness.

Intelligence & Learning

⭐ Start Here

ARC Prize 2024: Technical Report — ARC Prize Foundation

State of the art on ARC. Test-time training and program synthesis emerge as key techniques.

ARC Prize Blog

Practical guide to current best approaches. Combines neural networks with symbolic search.

The Free-Energy Principle: A Unified Brain Theory? — Karl Friston, 2010

All adaptive systems minimize variational free energy. Unifies perception, action, and learning under one principle.

How Can Evolution Learn? — Watson & Szathmáry, 2016

Argues evolution can exhibit learning-like behavior under specific conditions: lifetime plasticity, Hebbian processes, developmental systems.

Abandoning Objectives: Evolution Through the Search for Novelty Alone — Lehman & Stanley, 2011

Novelty search outperforms objective-based search on deceptive problems. Foundation for quality-diversity algorithms.
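The core idea fits in a few lines: score individuals by behavioral novelty (distance to the nearest behaviors seen so far) rather than by the objective. A toy sketch where genomes are scalars and behavior is the genome itself (all names and parameters are mine, not from the paper):

```python
import random

def novelty(b, archive, k=5):
    """Mean distance to the k nearest archived behaviors; unseen = maximally novel."""
    if not archive:
        return float("inf")
    nearest = sorted(abs(b - other) for other in archive)[:k]
    return sum(nearest) / len(nearest)

def novelty_search(pop_size=20, generations=30, seed=0):
    rng = random.Random(seed)
    pop = [rng.uniform(-1, 1) for _ in range(pop_size)]
    archive = []
    for _ in range(generations):
        # Rank by novelty alone: no objective function anywhere.
        pop.sort(key=lambda g: novelty(g, archive), reverse=True)
        archive.extend(pop[:2])               # remember the most novel behaviors
        parents = pop[: pop_size // 2]
        pop = [rng.choice(parents) + rng.gauss(0, 0.1) for _ in range(pop_size)]
    return archive

print(len(novelty_search()))  # → 60 archived behaviors (2 per generation)
```

On deceptive problems, rewarding "be different" like this reaches places that rewarding "be better" never does, because stepping stones rarely resemble the goal.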

Paired Open-Ended Trailblazer (POET) — Wang, Lehman, Clune, Stanley, 2019

Co-evolution of environments and agents. Creates its own curriculum. Stepping stones transfer between problems.

Program Synthesis & Evolution

⭐ Start Here

SOAR: Self-Improving Language Models for Evolutionary Program Synthesis

Combines LLMs with evolutionary search. Achieves 52% on ARC through self-improvement loops.

Fifty Years of Research on Self-Replication: An Overview — Sipper, 1998

Overview of self-replication research from von Neumann to modern approaches. Covers cellular automata, Tierra, and theoretical foundations.

Evolution of Digital Organisms at High Mutation Rates Leads to Survival of the Flattest — Wilke et al., 2001 (Nature)

At high mutation rates, evolution favors populations on "flatter" fitness landscapes. Robustness vs evolvability tradeoffs.

Quality Diversity: A New Frontier for Evolutionary Computation — Pugh, Soros, Stanley, 2016

Frames quality diversity as a unified family of algorithms (MAP-Elites, novelty search with local competition) that find diverse, high-quality solutions.
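MAP-Elites in miniature: discretize behavior space into bins and keep the fittest solution (the "elite") found in each bin. A toy version for scalar genomes in [-1, 1] (a hypothetical sketch, not the paper's code):

```python
import random

def map_elites(fitness, behavior, bins=10, iters=2000, seed=0):
    """Illuminate a 1-D behavior space: one elite per behavior bin."""
    rng = random.Random(seed)
    archive = {}  # bin index -> (fitness, genome)
    for _ in range(iters):
        if archive:
            parent = rng.choice(list(archive.values()))[1]
            x = max(-1.0, min(1.0, parent + rng.gauss(0, 0.2)))  # mutate an elite
        else:
            x = rng.uniform(-1, 1)                               # bootstrap randomly
        b = min(bins - 1, int((behavior(x) + 1) / 2 * bins))     # behavior in [-1, 1]
        f = fitness(x)
        if b not in archive or f > archive[b][0]:
            archive[b] = (f, x)                                  # new elite for this bin
    return archive

elites = map_elites(fitness=lambda x: -x * x, behavior=lambda x: x)
print(len(elites))  # number of behavior bins illuminated (at most 10)
```

Unlike a plain optimizer, the result is not one champion but a map: the best solution found in every region of behavior space.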

Evolutionary Learning Theory

⭐ Start Here

Deep Research: Can Evolution Produce Learning?

Curated synthesis of 15+ foundational papers on the evolution-learning interface. Includes full citations, key insights, and connections to the Bridge research question.

How Learning Can Guide Evolution — Hinton & Nowlan, 1987

First computational demonstration of the Baldwin Effect. Learning smooths the fitness landscape, allowing evolution to find complex adaptations inaccessible through random mutation alone.
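Their setup is tiny and worth internalizing: genomes have alleles in {0, 1, ?}, where '?' loci are re-guessed at random during a lifetime of learning trials, and fitness rewards hitting the all-1s target early. A sketch of that fitness function (constants mirror the paper's 20 loci and 1000 trials; the code itself is my reconstruction):

```python
import random

LOCI, TRIALS = 20, 1000

def fitness(genome, rng):
    """Hinton & Nowlan-style fitness: a hard-wired 0 is fatal to the search;
    '?' loci are guessed afresh each trial; earlier success scores higher."""
    if 0 in genome:
        return 1.0  # one wrong innate allele: no amount of learning helps
    for trial in range(TRIALS):
        if all(g == 1 or rng.random() < 0.5 for g in genome):
            return 1.0 + 19.0 * (TRIALS - trial) / TRIALS
    return 1.0

rng = random.Random(0)
print(fitness((1,) * LOCI, rng))                 # → 20.0 (fully innate: max fitness)
print(fitness((1,) * 15 + ("?",) * 5, rng))      # partial learners usually score > 1.0
```

Without learning, fitness is a needle in a haystack (only the exact all-1s genome scores); with '?' alleles, partially correct genomes also score, which is the smoothing that lets selection climb toward the needle.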

The Major Transitions in Evolution — Maynard Smith & Szathmáry, 1995

Eight major transitions, from replicating molecules to language. Each involves a change in how information is stored and transmitted. Language is the latest transition, in which cultural learning becomes a new inheritance system.

Canalization of Development and the Inheritance of Acquired Characters — Waddington, 1942

Environmentally induced phenotypes can become genetically fixed. The phenotype precedes and guides the genotype. Foundation for understanding how learned behaviors become innate.

Developmental Plasticity and Evolution — West-Eberhard, 2003

"Genes are followers, not leaders, in evolutionary change." 800-page synthesis demonstrating that environmental induction may be more important than random mutation in generating variation.

Gene-Culture Coevolution in the Age of Genomics — Richerson, Boyd, Henrich, 2010

Dual inheritance theory: genetic and cultural evolution interact. Culture evolves faster than genes, creating novel selective environments. Lactose tolerance as existence proof.

Algorithms, Games, and Evolution — Chastain, Livnat, Papadimitriou, Vazirani, 2014

Sexual evolution under weak selection is mathematically equivalent to the Multiplicative Weights Update Algorithm. Evolution is literally a learning algorithm, not merely analogous to one.
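The correspondence is direct at the level of a single generation: selection multiplies each allele frequency by its fitness and renormalizes, which is exactly a multiplicative weights update. A one-function sketch (my illustration of the shared update rule, not the paper's full model):

```python
def selection_step(freqs, fitnesses):
    """One generation of selection = multiplicative weights update:
    w_i <- w_i * fitness_i, then renormalize to a probability distribution."""
    weighted = [p * f for p, f in zip(freqs, fitnesses)]
    total = sum(weighted)
    return [w / total for w in weighted]

print(selection_step([0.5, 0.5], [1.1, 0.9]))  # → [0.55, 0.45]
```

In the MWUA literature the multiplier is written (1 + ε·gain); under weak selection, fitness plays that role, which is what grounds the claim that evolution runs a learning algorithm rather than merely resembling one.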

Evolving Neural Networks through Augmenting Topologies (NEAT) — Stanley & Miikkulainen, 2002

Outstanding Paper of the Decade award. Evolves both topology and weights. Key innovations: historical markings, speciation, minimal starting points. Shows evolution can produce neural architectures.

AutoML-Zero: Evolving Machine Learning Algorithms From Scratch — Real, Liang, So, Le, 2020

Evolves complete ML algorithms from basic math operations. Rediscovers backpropagation, dropout-like regularization, and weight averaging. Direct evidence that evolution can produce learning algorithms.

AI-GAs: AI-Generating Algorithms — Clune, 2019

Proposes three pillars for evolving intelligence: meta-learning architectures, meta-learning learning algorithms, generating effective environments. Open-ended evolution as path to general AI.

The Extended Evolutionary Synthesis: Its Structure, Assumptions and Predictions — Laland et al., 2015

Framework incorporating developmental bias, multiple inheritance channels, niche construction. Predicts phenotypic change can precede genotypic change ("plasticity-first evolution").

Open Questions & Recent Work

⭐ The Big Questions
ARC Prize Foundation, 2025

Latest results. Refinement loops emerge as the defining theme. Top score reaches 24% on ARC-AGI-2.

ARC-AGI-2: A New Challenge for Frontier AI Reasoning Systems — Chollet et al., 2025

Next-generation benchmark. Greater task complexity. Tests exploration, planning, memory, goal acquisition.

Experimental Validation of the Free-Energy Principle with In Vitro Neural Networks — Isomura et al., 2023 (Nature Communications)

First experimental validation of Friston's free energy principle in biological neural networks.

Reading Paths

Choose your path:

Next Steps

The Thesis: frame the question precisely · 🔬 Experiments: build and test