The Developing Brain

I have a notebook from my time at TIFR. On one page, drawn in blue pen during a class I was auditing, are rough sketches of a brain in cross-section at different stages of development. Neocortex, six-layered. Hippocampus. Choroid plexus. Ganglionic eminence. At the bottom right, in my own handwriting: intelligence comes from circuitry and connectivity.

The class was taught by Prof. Shubha Tole, one of the finest developmental neurobiologists working today, and at the time also my PI. I was a molecular biologist sitting in on a neurodevelopment lecture, drawing structures that scratched my itch for art and science simultaneously, trying to get a hold of this enormously complex developmental system.

My note said intelligence comes from circuitry and connectivity. This is true as far as it goes, but it raises an immediate question: what kind of mathematical object is a circuit?

The naive answer is a graph. Neurons are nodes, synapses are edges. This is useful but incomplete. It tells you the topology but not the dynamics. Two circuits with identical connectivity can behave completely differently depending on the strengths of their connections and the timescales of their signaling. Like any model, though, its usefulness depends on the question you're asking.

The richer answer comes from statistical mechanics. Think of the brain not as a fixed circuit but as a physical system with an enormous number of degrees of freedom, each neuron capable of being active or silent, each pattern of activity a microstate. The question is not which microstate the brain occupies at any moment but what the statistical structure of those microstates looks like. Which patterns are common, which are rare, which are forbidden by the underlying connectivity.

This is exactly the question statistical mechanics was built to answer. Given constraints on average activity and pairwise correlations between neurons, the maximum entropy distribution is the model that makes the fewest additional assumptions. For binary neurons it takes the form of an Ising model from physics, and the coupling matrix in its energy function encodes the effective interactions between neurons, the statistical fingerprint of the circuit. Bill Bialek and colleagues showed in the mid-2000s that this framework, fitted to retinal ganglion cell data, predicted neural activity patterns with remarkable accuracy. The circuit was doing something close to what the maximum entropy principle would predict.
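To make this concrete, here is a minimal sketch of fitting a pairwise maximum entropy model to a small population of binary neurons. A random synthetic distribution stands in for recorded spike data (an assumption for illustration, not the retinal dataset); with only five neurons we can enumerate all 2^5 activity patterns exactly, and gradient ascent on the log-likelihood nudges the fields and couplings until the model's means and pairwise correlations match the data's.

```python
import numpy as np

rng = np.random.default_rng(0)

# Pairwise maximum entropy (Ising) model over N binary neurons:
# P(s) ∝ exp( sum_i h_i s_i + sum_{i<j} J_ij s_i s_j ),  s_i in {0, 1}.
N = 5
states = np.array([[(k >> i) & 1 for i in range(N)] for k in range(2**N)])

def model_stats(h, J):
    """Exact means and pairwise moments by enumerating all 2^N microstates."""
    energy = states @ h + np.einsum("ki,ij,kj->k", states, np.triu(J, 1), states)
    p = np.exp(energy)
    p /= p.sum()
    mean = p @ states                         # <s_i>
    corr = states.T @ (p[:, None] * states)   # <s_i s_j>
    return p, mean, corr

# Target statistics from a synthetic "dataset": a random distribution over
# activity patterns stands in for recorded spike trains.
p_data = rng.dirichlet(np.ones(2**N))
mean_data = p_data @ states
corr_data = states.T @ (p_data[:, None] * states)

# Gradient ascent on the log-likelihood: the gradient is simply the gap
# between data moments and model moments, so fitting = moment matching.
h = np.zeros(N)
J = np.zeros((N, N))
for _ in range(5000):
    _, mean, corr = model_stats(h, J)
    h += 0.5 * (mean_data - mean)
    J += 0.5 * np.triu(corr_data - corr, 1)

_, mean_fit, corr_fit = model_stats(h, J)
print(np.max(np.abs(mean_fit - mean_data)))  # residual moment mismatch
```

The fitted coupling matrix J is the "statistical fingerprint" mentioned above: it captures effective pairwise interactions, which need not coincide with anatomical synapses. For realistic population sizes the exact enumeration is intractable, and the real analyses rely on Monte Carlo or mean-field approximations instead.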

Intelligence comes from circuitry and connectivity. But the circuitry comes from somewhere. It comes from a regulatory program written in the genome, unfolding in developmental time, in a system that is never at equilibrium and never stops computing.
