Curated Sources, Organized by Theme
Organized reading paths through artificial life, cognitive science, and complexity theory. Start with overviews, go deep with primary sources. Each section has a "Start Here" recommendation.
On the Measure of Intelligence — Chollet, 2019
The paper that defines intelligence as skill-acquisition efficiency and introduces ARC, the Abstraction and Reasoning Corpus. Essential reading for understanding the benchmark.
Computational Life: How Well-formed, Self-replicating Programs Emerge from Simple Interaction — Agüera y Arcas et al., 2024
The BFF paper. Random programs in a primordial soup spontaneously evolve self-replicators.
A Mathematical Theory of Communication — Shannon, 1948
The foundation of information theory. Defines entropy, channel capacity, and the mathematical basis for measuring information.
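The two central definitions, in modern notation: the entropy of a discrete source, and channel capacity as maximal mutual information,

    H(X) = -\sum_{x} p(x) \log_2 p(x), \qquad C = \max_{p(x)} I(X;Y),

where I(X;Y) = H(Y) - H(Y \mid X) is the information the channel output carries about the input.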
Theory of Self-Reproducing Automata — von Neumann, 1966
The theoretical foundation for artificial life. Proves that self-reproduction is possible in a cellular automaton.
The Origins of Order — Stuart Kauffman, 1993
Self-organization and selection in evolution. Introduces autocatalytic sets, NK landscapes, and the "edge of chaos" concept.
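A minimal sketch of the NK model (the data layout and names here are mine, not Kauffman's): each of N loci draws its fitness contribution from a random lookup table indexed by its own allele and those of its K neighbors, and genotype fitness is the mean contribution.

```python
import random

def make_nk_landscape(n, k, seed=0):
    """Random NK landscape: locus i's contribution depends on the
    (k+1)-bit pattern formed by itself and its k right-hand neighbors."""
    rng = random.Random(seed)
    tables = [[rng.random() for _ in range(2 ** (k + 1))] for _ in range(n)]

    def fitness(genotype):  # genotype: sequence of 0/1 of length n
        total = 0.0
        for i in range(n):
            pattern = 0
            for j in range(k + 1):  # cyclic neighborhood
                pattern = (pattern << 1) | genotype[(i + j) % n]
            total += tables[i][pattern]
        return total / n

    return fitness

f = make_nk_landscape(n=10, k=2)
print(f((1, 0, 1, 1, 0, 0, 1, 0, 1, 1)))
```

K is the ruggedness knob: K = 0 gives a smooth single-peaked landscape, K = N - 1 an uncorrelated random one, with the interesting regimes in between.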
What is Complexity? — Adami, 2002
Defines physical complexity as the information an organism's genome stores about its environment. Key for understanding complexity metrics.
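In sketch form (notation mine): for genomes G evolving in environment E, physical complexity is roughly

    C \approx H_{\max} - H(G \mid E),

the maximal genome entropy minus the entropy that remains once the environment is given; it counts the bits of the sequence that are about the environment rather than random.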
Evolution of Biological Complexity — Adami, Ofria & Collier, 2000
Uses Avida to show that complexity must increase under selection in fixed environments. Casts the genome as a "Maxwell's Demon" that ratchets information about the environment into the sequence.
Detecting Autocatalytic, Self-sustaining Sets in Chemical Reaction Systems — Hordijk & Steel, 2004
RAF (Reflexively Autocatalytic and Food-generated) sets: a mathematical framework for how self-sustaining chemical networks emerge.
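A sketch of the standard maxRAF reduction algorithm from this line of work (the data layout is mine): alternately compute the closure of the food set under the current reactions, then discard reactions that lack reachable reactants or a reachable catalyst, until nothing changes.

```python
def max_raf(food, reactions):
    """Each reaction is (reactants, products, catalysts), all sets.
    Returns the maximal reflexively autocatalytic, food-generated
    subset of `reactions` (possibly empty)."""
    active = list(reactions)
    while True:
        # Closure: everything producible from the food set using the
        # active reactions (catalysis is ignored at this step).
        closure = set(food)
        changed = True
        while changed:
            changed = False
            for reactants, products, _ in active:
                if reactants <= closure and not products <= closure:
                    closure |= products
                    changed = True
        # Keep reactions whose reactants are reachable and that have
        # at least one catalyst in the closure.
        kept = [r for r in active if r[0] <= closure and r[2] & closure]
        if len(kept) == len(active):
            return kept
        active = kept

# Toy example: a + b -> c, catalyzed by its own product c.
print(max_raf({"a", "b"}, [({"a", "b"}, {"c"}, {"c"})]))
```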
Logical Depth and Physical Complexity — Bennett, 1988
Defines logical depth: the computation time required to generate an object from its shortest description. Distinguishes organized complexity from mere randomness.
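In sketch form (my notation): the logical depth of a string x at significance level s is

    D_s(x) = \min \{ T(p) : U(p) = x,\ |p| \le K(x) + s \},

the fastest running time among programs at most s bits longer than x's shortest program (of length K(x)) on a universal machine U. Random strings come out shallow, since they are their own quickest description; only objects whose short descriptions take long to unfold are deep.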
Avida: A Software Platform for Research in Computational Evolutionary Biology — Ofria & Wilke, 2004
The most important artificial life platform. Self-replicating digital organisms that evolve complex behaviors.
The Evolutionary Origin of Complex Features — Lenski, Ofria, Pennock & Adami, 2003
Uses Avida to show that complex features evolve through sequences of simpler stepping stones. A landmark empirical result.
Coevolution Drives the Emergence of Complex Traits and Promotes Evolvability — Zaman et al., 2014
Host-parasite coevolution dramatically increases complexity. Arms races force organisms to develop sophisticated behaviors.
An Approach to the Synthesis of Life — Ray, 1991
The first major artificial life system, Tierra: self-replicating assembly programs in shared memory. Observed parasites, hyperparasites, and immunity.
Defines the major open problems in ALife. Problem #1: "Generate unbounded evolutionary dynamics." Still unsolved in 2025.
Open-Endedness: The Last Grand Challenge You've Never Heard Of — Stanley, Lehman & Soros, 2017
Overview of the quest for systems that continue producing novelty indefinitely. Defines the hallmarks of open-endedness.
ARC Prize 2024: Technical Report — ARC Prize Foundation
State of the art on ARC. Test-time training and program synthesis emerge as key techniques.
Practical guide to current best approaches. Combines neural networks with symbolic search.
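The symbolic half is easy to caricature: enumerate compositions of domain-specific grid primitives and keep any program consistent with all training pairs; the neural half's job is to rank or prune that search. A toy sketch (this DSL is hypothetical and far smaller than real solvers'):

```python
from itertools import product

def rot90(g):    return [list(r) for r in zip(*g[::-1])]
def flip_h(g):   return [r[::-1] for r in g]
def identity(g): return g

PRIMITIVES = [identity, rot90, flip_h]

def synthesize(train_pairs, max_depth=3):
    """Return the first composition of primitives that maps every
    training input grid to its output grid."""
    for depth in range(1, max_depth + 1):
        for prog in product(PRIMITIVES, repeat=depth):
            def run(g, prog=prog):
                for op in prog:
                    g = op(g)
                return g
            if all(run(x) == y for x, y in train_pairs):
                return prog
    return None

pairs = [([[1, 2], [3, 4]], [[3, 1], [4, 2]])]  # one 90-degree rotation
print([f.__name__ for f in synthesize(pairs)])
```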
The Free-Energy Principle: A Unified Brain Theory? — Friston, 2010
All adaptive systems minimize variational free energy. Unifies perception, action, and learning under one principle.
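In standard variational notation: with observations o, hidden states s, generative model p(o, s), and approximate posterior q(s),

    F = \mathbb{E}_{q(s)}[\ln q(s) - \ln p(o, s)] = D_{KL}(q(s) \,\|\, p(s \mid o)) - \ln p(o) \ge -\ln p(o),

so free energy upper-bounds surprise. Minimizing F over q is perception (approximate Bayesian inference); minimizing it through actions that change o keeps the organism in unsurprising states.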
How Can Evolution Learn? — Watson & Szathmáry, 2016
Argues evolution can exhibit learning-like behavior under specific conditions: lifetime plasticity, Hebbian processes, developmental systems.
Abandoning Objectives: Evolution Through the Search for Novelty Alone — Lehman & Stanley, 2011
Novelty search outperforms objective-based search on deceptive problems. Foundation for quality-diversity algorithms.
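The paper's selection signal is sparseness in behavior space rather than task fitness: a candidate's novelty is its mean distance to the k nearest behavior descriptors seen so far. A minimal sketch (descriptor format and names are mine):

```python
import numpy as np

def novelty(candidate, archive, k=15):
    """Mean distance from `candidate` to its k nearest neighbors among
    previously seen behavior descriptors (population plus archive)."""
    if not archive:
        return float("inf")
    c = np.asarray(candidate, dtype=float)
    dists = sorted(np.linalg.norm(c - np.asarray(b, dtype=float))
                   for b in archive)
    return float(np.mean(dists[:k]))

print(novelty([0.0, 0.0], [[1.0, 0.0], [0.0, 2.0], [3.0, 3.0]], k=2))
```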
Paired Open-Ended Trailblazer (POET) — Wang et al., 2019
Co-evolution of environments and agents. Creates its own curriculum. Stepping stones transfer between problems.
SOAR: Self-Improving Language Models for Evolutionary Program Synthesis
Combines LLMs with evolutionary search. Achieves 52% on ARC through self-improvement loops.
Fifty Years of Research on Self-Replication: An Overview — Sipper, 1998
Overview of self-replication research from von Neumann to modern approaches. Covers cellular automata, Tierra, and theoretical foundations.
Evolution of Digital Organisms at High Mutation Rates Leads to Survival of the Flattest — Wilke et al., 2001
At high mutation rates, evolution favors populations occupying "flatter" regions of the fitness landscape. Robustness-versus-evolvability tradeoffs.
Quality Diversity: A New Frontier for Evolutionary Computation — Pugh, Soros & Stanley, 2016
Introduces quality-diversity algorithms: MAP-Elites, novelty search with local competition. The goal is diverse, high-quality solutions rather than a single optimum.
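A minimal MAP-Elites loop as a sketch (the three callables are placeholders you supply): the archive keeps the best genome found in each behavior-space cell, so selection drives quality within cells while the grid itself enforces diversity.

```python
import random

def map_elites(evaluate, mutate, random_genome, n_iters=10_000):
    """evaluate(g) -> (fitness, cell); the archive maps each behavior
    cell to the fittest genome that has landed in it so far."""
    archive = {}
    for _ in range(n_iters):
        if archive and random.random() < 0.9:
            _, parent = random.choice(list(archive.values()))
            genome = mutate(parent)
        else:
            genome = random_genome()
        fitness, cell = evaluate(genome)
        if cell not in archive or fitness > archive[cell][0]:
            archive[cell] = (fitness, genome)
    return archive
```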
Deep Research: Can Evolution Produce Learning?
Curated synthesis of 15+ foundational papers on the evolution-learning interface. Includes full citations, key insights, and connections to the Bridge research question.
How Learning Can Guide Evolution — Hinton & Nowlan, 1987
First computational demonstration of the Baldwin Effect. Learning smooths the fitness landscape, allowing evolution to find complex adaptations inaccessible through random mutation alone.
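The model is small enough to restate in code (constants are the paper's; layout is mine): 20 alleles that are correct (1), wrong (0), or plastic ("?"); learning is up to 1,000 random settings of the plastic positions, and finding the all-ones target early earns up to a 20-fold fitness advantage.

```python
import random

TRIALS = 1000
rng = random.Random(0)

def fitness(genome):
    """Hinton & Nowlan's fitness: 1 + 19 * (remaining trials) / TRIALS
    if the all-ones target is found during learning, else 1."""
    if 0 in genome:                    # a hard-wired wrong bit: hopeless
        return 1.0
    p_hit = 0.5 ** genome.count("?")   # one trial guesses every '?'
    for n in range(TRIALS):
        if rng.random() < p_hit:
            return 1.0 + 19.0 * (TRIALS - n) / TRIALS
    return 1.0

print(fitness([1] * 15 + ["?"] * 5))   # mostly innate, quick to learn
```

Selection then favors genomes with more hard-wired correct alleles because they learn faster: learned behavior becomes innate without any Lamarckian inheritance.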
The Major Transitions in Evolution — Maynard Smith & Szathmáry, 1995
Eight major transitions, from replicating molecules to language. Each involves a change in how information is stored and transmitted; language is the transition in which cultural learning becomes a new inheritance system.
Genetic Assimilation of an Acquired Character — Waddington, 1953
Environmentally induced phenotypes can become genetically fixed: the phenotype precedes and guides the genotype. Foundation for understanding how learned behaviors become innate.
"Genes are followers, not leaders, in evolutionary change." 800-page synthesis demonstrating that environmental induction may be more important than random mutation in generating variation.
Culture and the Evolutionary Process — Boyd & Richerson, 1985
Dual inheritance theory: genetic and cultural evolution interact. Culture evolves faster than genes, creating novel selective environments. Lactose tolerance as an existence proof.
Algorithms, Games, and Evolution — Chastain, Livnat, Papadimitriou & Vazirani, 2014
Sexual evolution under weak selection is mathematically equivalent to the Multiplicative Weights Update Algorithm. Evolution is literally a learning algorithm, not merely analogous to one.
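The core identity, in sketch form: selection updates the frequency x_i of allele i as

    x_i(t+1) = x_i(t) \, F_i(t) / \bar{F}(t),

where F_i is the mean fitness of carriers of i and \bar{F} the population mean. Writing F_i = 1 + \varepsilon m_i for weak selection, this matches, to first order in \varepsilon, the multiplicative weights update w_i \leftarrow w_i (1 + \varepsilon)^{m_i} from online learning, with alleles as experts and selection strength as the learning rate.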
Evolving Neural Networks through Augmenting Topologies — Stanley & Miikkulainen, 2002
Outstanding Paper of the Decade award. Evolves both topology and weights. Key innovations: historical markings, speciation, and minimal starting structure. Shows evolution can produce neural architectures.
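The speciation machinery rests on one compatibility measure from the paper:

    \delta = \frac{c_1 E}{N} + \frac{c_2 D}{N} + c_3 \bar{W},

where E and D count excess and disjoint genes (mismatched historical markings), \bar{W} is the average weight difference of matching genes, N normalizes by the larger genome's size, and the c_i are tunable coefficients. Genomes with \delta below a threshold share a species, and fitness sharing within species protects new structure long enough for it to be optimized.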
AutoML-Zero: Evolving Machine Learning Algorithms From Scratch — Real et al., 2020
Evolves complete ML algorithms from basic math operations. Rediscovers backpropagation, dropout-like regularization, and weight averaging. Direct evidence that evolution can produce learning algorithms.
AI-GAs: AI-Generating Algorithms — Clune, 2019
Proposes three pillars for evolving intelligence: meta-learning architectures, meta-learning the learning algorithms themselves, and generating effective learning environments. Open-ended evolution as a path to general AI.
The Extended Evolutionary Synthesis: Its Structure, Assumptions and Predictions — Laland et al., 2015
Framework incorporating developmental bias, multiple inheritance channels, and niche construction. Predicts that phenotypic change can precede genotypic change ("plasticity-first evolution").
The latest ARC results: refinement loops emerge as the defining theme, and the top score reaches 24% on ARC-AGI-2.
ARC-AGI-3 — ARC Prize Foundation
Next-generation interactive benchmark with greater task complexity. Tests exploration, planning, memory, and goal acquisition.
Experimental Validation of the Free-Energy Principle with In Vitro Neural Networks — Isomura, Kotani, Jimbo & Friston, 2023
First experimental validation of Friston's free energy principle in biological neural networks.