History’s Top Brain Computation Insights: Day 5

Drawing of a Purkinje cell by Ramon y Cajal

5) Neurons are fundamental units of brain computation (Ramon y Cajal – 1889)

Golgi, a prominent 19th-century biologist, argued that the brain is one unified reticulum (or web) of neural tissue, much like the circulatory system. Ramon y Cajal, however, came to a very different conclusion using Golgi's own silver chromate staining technique: he argued that this web was composed of separate cells. Later studies using the electron microscope showed that Ramon y Cajal was correct.

Some have argued that Golgi was partially correct, since electrical synapses (gap junctions) exist in small numbers in the brain. However, even at gap junctions the cells' plasma membranes separate the two sides of the synapse, which Golgi's theory did not predict.

Looking carefully at his stained cells, Ramon y Cajal postulated that nerve signals travel in one direction, from dendrite to axon. The dendrite (top), cell body (dark central spot), and axon (bottom) can be clearly distinguished in the included drawing by Ramon y Cajal. He was unable to test this prediction himself, but he turned out to be correct once again.

Implication: The mind is implemented in an electric organ with distributed and modular function consisting of neural units.

[This post is part of a series chronicling history's top brain computation insights (see the first of the series for a detailed description)]


Demystifying the Brain

Most neuroscience writing touts statements like 'the human brain is the most complex object in the universe'. This serves only to paint the brain as a mysterious, seemingly unknowable structure.

This may be comforting to some, but not to me. I want to understand this thing!

Here are some facts to demystify the brain (most points courtesy of Dave Touretzky and Christopher Cherniak):

  • The brain is very complex, but not infinitely so

  • A few spoonfuls of yogurt contain 10¹¹ lactobacillus bacteria: ten times the number of cortical neurons in a human brain
  • There are roughly 10¹³ synapses in cortex. Assume each stores one bit of information: that’s 1.25 terabytes.
    • For comparison, the Library of Congress (80 million volumes, averaging 300 typed pages each) contains about 48 terabytes of data.

  • Volume of the human brain: about 1.4 liters.
  • Number of neurons in a human brain: 10¹². Number of neurons in a rat brain: 10¹⁰.
  • Number of neurons in the human spinal cord: 10⁹ [source]
  • Average loss of neocortical neurons: ~1 per second; 85,000 per day; ~31 million (31×10⁶) per year [source]
  • No neuron fires faster than 1 kHz
  • The total energy consumption of the brain is about 25 watts [source]
  • Average number of glial cells in brain = 10–50 times the number of neurons [source]
  • "For all our neurocomputational sophistication and processing power, we can barely attend to more than one object at a time, and we can hardly perform two tasks at once" [source]
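The storage comparisons above are easy to verify for yourself. Here is a minimal back-of-envelope sketch: the one-bit-per-synapse figure and the volume/page counts come from the list above, while the ~2 KB of text per typed page is an assumption chosen to illustrate how the ~48 TB figure can be reached.

```python
# Back-of-envelope check of the storage figures quoted above.

SYNAPSES = 10**13        # rough synapse count in cortex
BITS_PER_SYNAPSE = 1     # assume each synapse stores one bit

# 10^13 bits -> bytes -> terabytes (10^12 bytes)
synapse_tb = SYNAPSES * BITS_PER_SYNAPSE / 8 / 10**12
print(f"Cortical synapses at 1 bit each: {synapse_tb} TB")

VOLUMES = 80_000_000     # Library of Congress volumes
PAGES_PER_VOLUME = 300   # average typed pages per volume
BYTES_PER_PAGE = 2000    # ~2 KB of text per page (assumption)

loc_tb = VOLUMES * PAGES_PER_VOLUME * BYTES_PER_PAGE / 10**12
print(f"Library of Congress: {loc_tb} TB")
```

Run as written, this reproduces the 1.25 TB and ~48 TB figures, which is the point of the comparison: a brain-scale quantity reduced to an ordinary engineering number.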

What to take from all this? Simply that the brain is a real, physical object with real, physical limitations. As such, we really do have a chance to understand it.

Of course, if the brain is so limited, how can we expect our brains to be up to the task of understanding it?

We can, and we will. How can I be so sure? Because (as illustrated above) we have already built, and thus understand, computational devices that are rivaling and in some cases surpassing (e.g., in memory capacity) the computational powers of the brain.

This shows that we have what it takes to understand ourselves in the deepest way possible: by learning the neural mechanisms that make us who and what we are.