Error #1: Complexity
This is what the book claims:
[Neural tissue is] the most complex material we have discovered in the universe.

It's hard to argue against statements like this, because it isn't clear what definition of "complex" is being used.
First, let's assume Eagleman means to use information as the measure of complexity. There is more quantum information per unit volume in a black hole than any other object, including the human brain.
Maybe he's talking about digital storage? If you say each synapse stores one bit, the human brain has a digital information density of 10^13 bits per cubic centimeter. That's about 2.5 times as dense as the theoretical upper limit of 4 gigabits per cubic millimeter for digital, holographic storage. Maybe this is where he gets the claim, but he's still wrong, because synapses do not store digital information! Neurological processes are far too messy and unreliable for digital computing.
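The density comparison above can be checked in a few lines. This is just a sketch of the arithmetic, using the figures quoted in the text (one bit per synapse giving ~10^13 bits/cm^3, and a 4 Gbit/mm^3 holographic limit):

```python
# Assumed figures from the text, not measured values.
brain_bits_per_cm3 = 1e13                 # ~1 bit per synapse, ~10^13 synapses per cm^3
holographic_bits_per_mm3 = 4e9            # theoretical holographic storage limit
holographic_bits_per_cm3 = holographic_bits_per_mm3 * 1000  # 1 cm^3 = 1000 mm^3

ratio = brain_bits_per_cm3 / holographic_bits_per_cm3
print(f"brain density / holographic limit = {ratio:.1f}x")  # -> 2.5x
```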
Maybe he's talking about branching complexity? The network of neurons in the human brain contains hundreds of trillions of linkages, after all. What else could be that large without falling apart?
Hyphae are the microscopic, hairlike roots of a fungus. The interconnected hyphae of the Armillaria solidipes cover 3.4 square miles of forest! This single fungal organism is thousands of years old, and I suspect a network diagram of its hyphae would be far more complex than a network diagram of a human brain.
Also, consider the Huge-LQG, the largest known structure in the universe. It has a mass of 6 * 10^18 solar masses! The average mass of a star in our galaxy is:
(mass of milky way) / (number of stars)
= (1,250 billion) / (300 billion)
≈ 4.17 solar masses per star
So, the number of stars in Huge-LQG is of the order 10^18. Each one of those stars influences its neighbors by gravitation and radiation. Therefore, I claim that this system has a "complexity" on an order (much) greater than 10^18. That's a thousand times more "complex" than the human brain, which has on the order of 10^15 connections.
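The star-count estimate works out like this (a sketch using the mass figures quoted above; the Milky Way mass and star count are the text's round numbers, not precise measurements):

```python
# Round figures from the text, not precise astronomical values.
milky_way_mass = 1.25e12      # solar masses (1,250 billion)
milky_way_stars = 3e11        # 300 billion stars

avg_star_mass = milky_way_mass / milky_way_stars   # ~4.17 solar masses per star

huge_lqg_mass = 6e18          # solar masses
huge_lqg_stars = huge_lqg_mass / avg_star_mass     # ~1.4e18 stars, order 10^18

print(f"average star: {avg_star_mass:.2f} solar masses")
print(f"Huge-LQG stars: ~{huge_lqg_stars:.1e}")
```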
No matter how you define complexity, Eagleman's claim is wrong.
Error #2: Photons
This is what the book claims:
If you represented these trillions and trillions of pulses in your brain, by a single photon of light, the combined output would be blinding.

Really? Let's do the math. First, Eagleman estimates there to be 100 billion neurons in the human brain. He also states that there are on the order of 100 firings per second. So, we can calculate the number of photons per second as:
N = (100 billion) * (100 per second) = 10^13 s^-1
Let's be generous and assume this light comes from a red laser:
E = h * c / λ
  = (6.626 * 10^-34) * (3 * 10^8) / (635 * 10^-9)
  = 3.13 * 10^-19 J
The power output of the brain-equivalent laser is:
E * N = 3.13 * 10^-6 watts
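The whole photon calculation fits in a few lines. This sketch uses the figures above (Eagleman's neuron count and firing rate, plus a 635 nm red laser):

```python
# Physical constants and assumed figures from the text.
h = 6.626e-34         # Planck's constant, J*s
c = 3e8               # speed of light, m/s
wavelength = 635e-9   # red laser, m

E = h * c / wavelength        # energy per photon, ~3.13e-19 J
N = 100e9 * 100               # 100 billion neurons * 100 firings/s = 1e13 photons/s
P = E * N                     # total power, ~3.13e-6 W

print(f"power of the 'brain laser': {P:.2e} W")  # a few microwatts
```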
That's about three microwatts! Only around 0.1% the brightness of a typical red laser pointer. Far from being "blinding", the human eye can hardly see it.
You see, Planck's constant is very small, even smaller than the human brain is big. Eagleman is trying to use big numbers to fascinate his readers. Unfortunately for him, physics has neuroscience beat on that front.
I don't expect Eagleman to know about things like holographic storage, hyphae, superclusters, or Planck's constant. I do expect him to stick to his subject, and avoid grandiose claims outside his area of expertise, in the name of selling books.