The Physics Police

Wednesday, April 17, 2013

Don't Eat Soil

The anti-GMO blogosphere is alight with the laziest case of scientific fraud I've ever seen.

This started in March when Zen Honeycutt at Moms Across America made this blog post. She claimed to have a "stunning report" demonstrating a nutritional difference between GMO and non-GMO corn, and that:
These are exactly the deficiencies in a human being that lead to susceptibility to sickness, disorders and cancer.
This bizarre claim is backed up by the (false) explanation that:
Glyphosate draws out the vital nutrients of living things.
The post also claims that:
Formaldehyde showed to be toxic in ingestion to animals. This corn has 200X that!
Scary, right? Well, it turns out this "stunning report" was published in an advertisement produced by ProfitPro® to peddle their ManurePlus™ product. They were using false advertising to scare farmers into signing up for their program.

How do I know the data in this ad is false?
  1. It has nothing to do with the nutritional contents of corn! The analysis compares soil samples from two adjacent fields. See footnote 1 on the original document, posted by Zen Honeycutt.
     
  2. Not convinced by #1? The percent "Organic matter" for both GMO and non-GMO corn is under 3%. Whatever they are measuring, it isn't the body of a plant, which is, by definition, 100% organic matter! Clearly, this table is taken from a soil report.
     
  3. The GMO numbers have been tampered with by blindly lowering all of them. The (unintentional) result is that some quantities, which represent percentages of something called Base Saturation, no longer add up to 100%. This is undeniable proof of forgery (see the sketch after this list).
     
  4. The report measures "Anaerobic biology" in parts per million (ppm), which makes no sense. Anaerobic organisms are identified by growing them in liquid culture. There is no method for measuring the mass fraction of living anaerobic organisms in a soil sample. This line was obviously added by someone who did not pass high school chemistry (or biology).
     
  5. The Cation Exchange Capacity has clearly been lowered to make the GMO soil appear impoverished. The fraudster was so lazy, they used the value 3 meq per 100 g, a value typical of sand. Farmers don't grow corn in sand.
     
  6. The report boasts an impossible formaldehyde concentration of 200 ppm in the GMO soil. Formaldehyde does not accumulate in alkaline soil, because it is broken down by the Cannizzaro reaction. The report shows the GMO pH to be 7.5 (alkaline), which is incompatible with such a huge amount of formaldehyde.
     
  7. The lazy hoaxer was not feeling particularly creative when faking the "Available Energy" of the GMO soil at 100 µS, which is precisely the recommended minimum value.
     
  8. Only one of the numbers has three significant digits. There are no error bars. Real science doesn't look like this. It's a fake!
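
You can check claim #3 yourself with a few lines of Python. This is a minimal sketch using made-up placeholder percentages, not the actual values from the ProfitPro ad:

```python
# Sanity check: base saturation percentages must sum to ~100%.
# The values below are hypothetical placeholders, NOT the numbers
# from the ad itself.

def check_base_saturation(percents, tolerance=1.0):
    """Return (plausible?, total) for a dict of base saturation percentages."""
    total = sum(percents.values())
    return abs(total - 100.0) <= tolerance, total

# An internally consistent soil report:
honest = {"Ca": 68.0, "Mg": 12.0, "K": 3.0, "Na": 1.0, "H": 16.0}

# The same report after someone "blindly lowers" a few numbers:
tampered = {"Ca": 50.0, "Mg": 8.0, "K": 2.0, "Na": 1.0, "H": 16.0}

for name, report in [("honest", honest), ("tampered", tampered)]:
    ok, total = check_base_saturation(report)
    print(f"{name}: sum = {total:.1f}% -> {'plausible' if ok else 'FORGED?'}")
```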


GMO corn has the same nutritional value as non-GMO corn. Glyphosate does not draw out vital nutrients. There is no formaldehyde in GMO corn.

Zen Honeycutt was fooled. ProfitPro sells cow shit with bullshit. GMO corn is not going to hurt you.

But eating farm soil might hurt you. Don't eat soil.

Monday, April 15, 2013

The Universe in Half a Tweet

Neil Turok is the director of the Perimeter Institute for Theoretical Physics, and the author of a book I stumbled upon recently, titled The Universe Within. He's also a damn fool liar! I will justify this, but first I have to take a deep breath...

Okay, so, I was checking him out to help me decide if I should read his book. That's when I found this YouTube interview, where he talks about a lecture he presented. This lecture is on the same topic as the contents of his book, so I watched the video. After a few minutes, he said something which triggered my bullshit alarm. I searched online, and found this news article wherein he repeats the claim:
It turns out that if you take just 300 electron spins and couple them together into a quantum computer, then the amount of information it would be capable of handling is the same as that in the position of every particle in the whole universe. There are about 10 to the power of 90 particles in the universe and if you use each one of their three-dimensional positions and imagine recording all of those positions, that's about the same as what you would get from just 300 electrons in a quantum computer.
There are so many lies in this paragraph that I needed help just to suss out what he was even talking about! People much smarter than me managed to reverse engineer his mathematics, and explain how he arrived at a count of 300 electrons in his hypothetical quantum computer.

First, he started with an estimate for the number of particles in the universe:

N = 10^90

Then, he used the convenient approximation:

2^10 ≈ 10^3

To put N in terms of a power of 2:

N ≈ 2^300
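
The arithmetic behind that figure is easy to check with a one-line Python sketch:

```python
import math

# Bits needed to index 10^90 particles: log2(10^90) = 90 * log2(10)
print(90 * math.log2(10))  # ~298.97, which the 2^10 ~ 10^3 shortcut rounds to 300
```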

Then he stopped. He had what he wanted. A big, scary number to intimidate and impress people. He didn't care what it meant. He had a narrative to present, that quantum computers are awesome, and this number fit into that narrative. All he had to do was lie.

After all, who's going to call him out? Tyler Harbottle, the author of the news article? Allan Gregg, the guy who interviewed him on YouTube? Apparently not. Well, look out Neil, the Physics Police are coming for you...

Let me explain the true meaning of the number 300 in the above equation. It tells us how many bits of memory you need to keep track of one large number. Just one single number! It can be as large as N, the number of particles in the universe. So, if you were to store, say, the three-dimensional position of each of these N particles in computer memory, you'd need a 300-bit number to refer to just one of them.

Notice that this feat is impossible, because each bit of computer memory is built from more than one atom, meaning the computer would take more than N particles to build. But N is the number of particles in the universe, so we would exhaust the material content of the universe before completing construction of the computer memory.

Even children know that counting every grain of sand is not the same as creating a map of the beach. 300 bits can store 37 typed characters, not even enough to contain this sentence. That's less than half a Tweet! Surely, the universe contains more information than that.
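
The character arithmetic is simple enough to sketch, assuming 8-bit ASCII characters and the classic 140-character Tweet:

```python
bits = 300
chars = bits // 8          # 8 bits per ASCII character
print(chars)               # 37 characters
print(chars / 140)         # ~0.26: barely a quarter of a 140-character Tweet
```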

But I digress. Let's recall what Turok actually claimed. He claimed that the amount of information stored in a quantum computer built from 300 electrons is equal to the classical information represented by the position of every particle in the universe. Anyone who has taken high-school physics knows this cannot be true.

In classical physics, one can measure the position of a particle to an arbitrary degree of certainty. There is no classical limit on how much you can improve your measuring apparatus. You can describe the position of a particle with infinite accuracy. Laplace's demon did it.

Anyway, I want to calculate just how fabulously wrong this ponce has got it. After all, that's the fun part! First, I'll start with some simple and generous assumptions. Let's assume the universe is a nice, orderly cube, with each of the N particles sitting along a nice, neat lattice. This is the ideal situation from the point of view of memory consumption, in which each particle has a unique position.

The cube's length, in arbitrary integer units, is equal to the number of particles on a side, or the cube root of the number of particles:

L = N^(1/3) = 10^30 ≈ 2^100

For each of the N particles, we need to store three different 100-bit numbers: the position of the particle along each of the three dimensions. So, that's 300 bits per particle. Notice that this number 300 is a coincidence, having to do with the number of spatial dimensions, whereas the 300 electrons came from converting a power of ten into a power of two. Now we can calculate the memory required to record the position of every particle in our model:

H = (300 bits) * N = 3 * 10^92 bits

As we saw before, the hypothetical quantum computer would be "capable of handling" only 300 bits. So, how far off was his claim that this storage capacity is equal to H?

H / (300 bits) = 10^90
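
Here is the whole lattice model as a short Python sketch, under the same assumptions (N = 10^90 particles on a cubic lattice, one coordinate per axis per particle):

```python
import math

N = 10**90                  # particles in the universe (Turok's estimate)
L = 10**30                  # lattice sites per side, since L = N**(1/3)

bits_per_axis = math.log2(L)             # ~99.7 bits, call it 100
bits_per_particle = 3 * bits_per_axis    # ~300 bits for an (x, y, z) triple

H = bits_per_particle * N                # total classical storage required
print(f"{H:.1e} bits")                   # ~3e92 bits
print(f"off by a factor of {H / 300:.0e}")  # ~1e90, the particle count itself
```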

So, Turok was off in his calculation by a factor equal to the number of particles in the universe. This is a prediction so gloriously inaccurate that he deserves an Ig Nobel Prize.

But I can't end there. The worst of it is, he's not only wrong, he contradicts himself, too. Earlier on, he claims that a qubit can store an infinite amount of information:
And that's what a quantum bit has -- an infinite amount of information from the whole sky. The quantum bit is something carried by a single electron, the spin of a single electron -- the most elementary constituent of matter that we know would carry a whole sky's worth of information.
So why use 300 electrons? Why not just one?


Sunday, April 7, 2013

Homeopathy

Because he hates me, a friend sent me this video about Homeopathy with Dr. Werner. In the first minute of speaking, she claims:
The whole universal mass can be consolidated down into the size of a bowling ball.
The diameter of a bowling ball is 21.8 cm. The most compact configuration of matter is a black hole. The Schwarzschild radius defines the size of the black hole. Using some basic algebra, we can find the mass of a bowling-ball sized black hole:

m = (radius) * (c^2 / 2G)
  = (21.8 cm / 2) * (6.73 * 10^26 kg/m)
  = 7.34 * 10^25 kilograms
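
The same algebra as a Python sketch, using standard values for c and G, and Uranus's mass of about 8.68 * 10^25 kg for the comparison:

```python
c = 2.998e8     # speed of light, m/s
G = 6.674e-11   # gravitational constant, m^3 / (kg * s^2)

radius = 0.218 / 2                # half the bowling ball diameter, meters
m = radius * c**2 / (2 * G)       # invert the Schwarzschild radius r = 2*G*m/c^2

print(f"{m:.2e} kg")                        # ~7.3e25 kg
print(f"{m / 8.68e25:.2f} Uranus masses")   # a bit less than one Uranus
```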

That's just a little bit less than the mass of the planet Uranus. Last time I checked, the universe is a whole lot bigger than Uranus! (Get it? Bigger than your anus?) It's not worth addressing any of the other nonsense in the video, but I found that calculation interesting, and worth sharing.

Incognito

Recently, I finished the book Incognito by David Eagleman. Overall, it's a decent neuroscience book. I can honestly say it changed how I think about the subconscious mind. It's very well written, except for the introduction, which reads more like a television advertisement for neuroscience than a book introduction. Worse, this introduction contains two glaring factual errors. This, I cannot abide. You know what's coming next!

Error #1: Complexity


This is what the book claims:
[Neural tissue is] the most complex material we have discovered in the universe.
It's hard to argue against statements like this, because it isn't clear what definition of "complex" is being used.

First, let's assume Eagleman means to use information as the measure of complexity. There is more quantum information per unit volume in a black hole than any other object, including the human brain.

Maybe he's talking about digital storage? If you say each synapse stores one bit, the human brain has a digital information density of 10^13 bits per cubic centimeter. That's about twice as dense as the theoretical upper limit of 4 gigabits per cubic millimeter for digital, holographic storage. Maybe this is where he gets the claim, but he's still wrong, because synapses do not store digital information! Neurological processes are messy and unreliable for digital computing, obviously.
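
The density comparison is quick to reproduce, assuming one bit per synapse and roughly 10^13 synapses per cubic centimeter:

```python
brain = 1e13             # bits per cm^3, at one bit per synapse
holographic = 4e9 * 1e3  # 4 gigabits per mm^3, times 1000 mm^3 per cm^3

print(brain / holographic)  # ~2.5: the brain "beats" the theoretical limit,
                            # which tells you synapses aren't digital bits
```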

Maybe he's talking about branching complexity? The network of neurons in the human brain contains hundreds of trillions of linkages, after all. What could be so huge as that, without falling apart?

Hyphae are the microscopic, hairlike filaments of a fungus. The interconnected hyphae of the Armillaria solidipes cover 3.4 square miles of forest! This single fungal organism is thousands of years old, and I suspect a network diagram of its hyphae would be far more complex than a network diagram of a human brain.

Also, consider the Huge-LQG, the largest known structure in the universe. It has a mass of 6 * 10^18 solar masses! The average mass of a star in our galaxy is:

(mass of Milky Way) / (number of stars)
= (1,250 billion solar masses) / (300 billion stars)
≈ 4.17 solar masses per star

So, the number of stars in Huge-LQG is of the order 10^18. Each one of those stars influences its neighbors by gravitation and radiation. Therefore, I claim that this system has a "complexity" on an order (much) greater than 10^18. That's a thousand times more "complex" than the human brain, which has on the order of 10^15 connections.
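
Running those numbers as a sketch:

```python
milky_way_mass = 1.25e12    # solar masses
milky_way_stars = 3e11
avg_star_mass = milky_way_mass / milky_way_stars  # ~4.17 solar masses

huge_lqg_mass = 6e18        # solar masses
huge_lqg_stars = huge_lqg_mass / avg_star_mass

print(f"{huge_lqg_stars:.1e} stars")              # ~1.4e18
print(f"{huge_lqg_stars / 1e15:.0f}x the brain's ~1e15 connections")
```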

No matter how you define complexity, Eagleman's claim is wrong.

Error #2: Photons


This is what the book claims:
If you represented these trillions and trillions of pulses in your brain, by a single photon of light, the combined output would be blinding.
Really? Let's do the math. First, Eagleman estimates there to be 100 billion neurons in the human brain. He also states that there are on the order of 100 firings per second. So, we can calculate the number of photons per second as:

N = (100 billion) * (100 per second) = 10^13 s^-1

Let's be generous and assume this light comes from a red laser:

E = h * c / λ
  = (6.626 * 10^-34 J·s) * (3 * 10^8 m/s) / (635 * 10^-9 m)
  = 3.13 * 10^-19 J

The power output of the brain-equivalent laser is:

E * N = 3.13 * 10^-6 watts
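
The full back-of-envelope in Python, assuming one 635 nm photon per neural firing:

```python
h = 6.626e-34   # Planck's constant, J*s
c = 3e8         # speed of light, m/s
wavelength = 635e-9             # red laser light, meters

neurons = 100e9                 # Eagleman's estimate
firings = 100                   # firings per neuron per second
N = neurons * firings           # photons per second, ~1e13

E = h * c / wavelength          # energy per photon, ~3.13e-19 J
print(f"{E * N:.2e} W")         # ~3.1e-6 W: microwatts, hardly "blinding"
```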

That's a few thousandths of a milliwatt! Only about 0.1% the brightness of a typical red laser pointer. Far from being "blinding", the human eye can hardly see it.

You see, Planck's constant is very small, even smaller than the human brain is big. Eagleman is trying to use big numbers to fascinate his readers. Unfortunately for him, physics has neuroscience beat on that front.

I don't expect Eagleman to know about things like holographic storage, hyphae, superclusters, or Planck's constant. I do expect him to stick to his subject, and avoid grandiose claims outside his area of expertise, in the name of selling books.