book reviews

speculative pop science books

reviewed by T. Nelson

Score: +1

The Shrinking Brain

by Michael A. Crawford and David E. Marsh
Authorize, 2023, 442 pages
reviewed by T. Nelson

It's always a bad sign when the authors can't decide on what to call their book. Is it The Shrinking Brain and the Global Health Crisis as it says on the cover, or The Shrinking Brain and the Role of Environment in its Evolution and Future as the title page says? Who knows? Let's just call it The Shrinking Brain.

The idea is that the average IQ, the average cranial capacity, and the mental health of humans are all decreasing and that the cause is a lack of docosahexaenoic acid (DHA), an omega-3 fatty acid, in the brain.

So far, so good. I've always said people need to eat more fat, including DHA and cholesterol. People think that eating fat makes you fat, but it's simply not so. Fat triggers the satiation response. Without that, people keep eating more and more carbohydrates. And it's true, as the authors say, that there are high concentrations of DHA in the brain and retina. It has many functions in the brain. I have a lot of unpublished research showing how it protects against Alzheimer's disease, though NIH wasn't interested in funding it, so I was forced to drop the project.

But DHA and arachidonic acid (AA) also seem to be magnets for exaggerated claims in the popular literature. Many papers in the field by big-name scientists have been retracted over the years. Science magazine recently cast doubt on the entire neuroprotectin / resolvin story that the authors mention many times. And there are now over 2000 papers on DHA and cancer, some claiming it causes cancer and some claiming it prevents it.

The authors hypothesize that DHA and AA, as well as common nutrients like calcium and phosphorus, cause heritable epigenetic changes and that these epigenetic changes, not mutations, are the principal drivers of evolution.

‘Epigenetics’ is the term for changes in gene expression caused by chemical modification of histones, methylation of DNA, and small regulatory RNAs. Epigenetics doesn't change the primary sequence of DNA, but the distinction isn't absolute. For instance, DNA methylation can sometimes promote a base mutation from C to T: 5-methylcytosine deaminates to thymine. In general, though, the difference between epigenetics and DNA mutation is that mutations create new things, while epigenetics regulates existing things up and down. Sometimes that regulation can be passed on to offspring, but contrary to what the authors say, it's not some magical new thing that overthrows the theory of evolution. The only thing that's new is that regulation of gene expression isn't just feedback inhibition, as we once thought.

The authors repeat a number of arguments, popular with creationists, that “random mutation” could not have created complex organisms. They write [p.103]:

The rules of chemistry are not at all those of a blind watchmaker. The blind watchmaker model can only have been postulated by people who lacked a grounding in chemistry and physics. . . . Chance can be replaced by the inevitability of chemistry and physics.

What on Earth does this mean? Chemistry and physics have minds of their own? Or that epigenetic forces magically create whatever is needed? No one ever claimed that mutations don't act in accordance with the rules of chemistry and physics. The authors want to say that epigenetic forces, not mutations, drive evolution. But if so, how did the epigenetic forces come into being? The true answer, of course, is random mutations coupled to natural selection.

To illustrate their idea, the authors claim that the fins and vestigial legs in dolphins are not caused by evolution of DNA, but by heritable epigenetic changes caused by high levels of phosphate in the seawater. As evidence, they claim to have found a tribe in Kenya that has bent legs. And wouldn't you know, they eat a lot of fish.

Maybe they do, but it doesn't prove anything. For one thing, those children aren't born with bent legs, which rules out their theory altogether. And for another, many other ethnic groups eat lots of fish, and so far none of them have turned into dolphins. (Or if they have, the medical establishment and Big Fish have done a good job of covering it up.)

They also claim that the immune system creates new antibodies by direct interaction of B cells with antigens, which sort of molds them into the correct shape. This is simply incorrect. There is a huge body of research showing that antibody diversity is generated in B cells by V(D)J recombination, in which gene segments are shuffled, followed by random somatic hypermutation. (T cells use the same recombination machinery to build their receptors, but T cells don't make antibodies.)

Yet another claim is that when the dinosaurs were wiped out, a solar flare came along and created a mutation in plants that allowed the creation of arachidonic acid from linoleic acid. This, they say, is what allowed the mammals to flourish. They also say that the air is only 3 percent oxygen. I'd put that down to a typo (in fact, the air was closer to 35% oxygen at its peak 300 million years ago but declined to almost the modern level of 20.9476% by the time the dinosaurs went extinct), except that they also write CO2 and O2 wrong, suggesting that they slept through Chem 101, where they pound this stuff into your head.

But their most bizarre claim is about the brain. Nutrition, they say, caused the average cranial capacity to increase from 340 cm3 (in chimpanzees, which they say are our ancestors) to 1600 cm3. Then in modern times, our bad diet caused it to decrease back to 1336 cm3 (for males) and 1198 cm3 (for females). This is not so. The 1600 figure, which they repeat many times, actually refers to Neanderthal brain size, which exceeded that of modern humans. Then the authors say chimpanzee DNA differs from human DNA by only 1.5%, so there must be some other factor to explain the difference, and that factor is a lack of adequate dietary DHA.

It's true that, yes, we have no bananas that contain DHA, so chimps might have a deficiency. It would be easy to test. Get some chimps, feed them canned anchovies for a few generations, stand well back so as not to smell their breath, and see if the cranial capacity of their descendants gets bigger. Better yet, use mice. If the authors are right, in a few generations they'll be beating us at chess. And for anyone who slept through Biology 101, chimps are not our ancestors. Chimpanzees (and bonobos) split off from hominins about seven million years ago. The 1.5% difference in DNA between chimps and humans might seem small, but it's huge. Even a single DNA base change can produce a marked difference, or even death, for an individual.
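To put that 1.5% in perspective, a back-of-the-envelope calculation helps. (The ~3.1-billion-base-pair genome size below is my own round-number assumption, not a figure from the book.)

```python
# How many bases differ between chimp and human if the genomes
# differ by 1.5%? Assumes a haploid genome of ~3.1 billion bp.
genome_size_bp = 3.1e9
divergence = 0.015

differing_bases = genome_size_bp * divergence
print(f"~{differing_bases:,.0f} base-pair differences")  # ~46,500,000
```

Tens of millions of differing bases is plenty of raw material for large phenotypic differences, however small the percentage sounds.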

Eating fish might not make you popular in the lunchroom, but it is certainly good nutrition. But contrary to what these authors (and some papers in the literature) say, DNA mutations are far from rare. Indeed, harmful somatic mutations caused by chemicals, radiation, and other factors drive cancer, the second leading cause of death in humans. Between 1.42 million and 10 million DNA single nucleotide polymorphisms (SNPs), a fancy term for inherited DNA mutations, have been found in the human genome so far. That's one mutation for every 100–1900 base pairs. These SNPs, propagating through the population, are evolution.

On page 148 they say cats evolved a rapid pupillary contraction in response to light because humans shine flashlights on them. (I read that paragraph several times, thinking I must have misinterpreted it.) Then they conflate this with developmental changes in the eye as if that proves their theory. But they noticed a problem: cats can't synthesize vitamin A, which they need for night vision. So, why don't cats just invent it? If the authors' theory is correct, that should be easy. The authors say the reason they can't is that the cat genome is “full” like a full hard drive. This is, frankly, nonsense. The real reason is that their theory of teleological epigenetics doesn't hold water. Only genetic DNA mutations (and transpositions, duplications, recombinations, and similar weird DNA tricks) can ‘invent’ new things—new enzymes, new organs, and new pathways. DNA mutations are ‘random,’ by which scientists mean unpredictable.

Epigenetic changes are well known to be important in gene expression, phenotypic variation, and adaptation. They act as volume controls, helping the organism to adapt. But they can't create anything new.

So far, there is no evidence that deficiencies in dietary DHA can cause inherited changes in human cranial capacity or IQ, or that they cause mental illness in subsequent generations. Another problem for the authors' theory is the research showing that DHA supplements have little or no effect on the brain except in animals experimentally deprived of it, because the brain holds onto its supply of DHA tenaciously.

By tenaciously I mean DHA doesn't just float around in the cell. The vast majority of it is esterified into membrane phospholipids. Yes, DHA is important, but its function and the conditions under which it is released are not fully understood.

I should add that if the NIH was interested in this, we'd know many of the answers by now (which is to say, you'd know; I already know), but that's another topic. Basic research seems to be a low priority at NIH these days.

So, what's going on here? What these guys are doing is called motte-and-bailey. They make an outlandish claim (such as that humans are aquatic and newborn babies naturally know how to swim, or that we evolved from chimps), then rephrase it later so it says something plausible, such as that humans evolved near the shore to take advantage of a DHA-rich environment. Then they become indignant when their original claim is challenged.

Incidentally, newborn babies cannot swim. This is dangerous misinformation.

The last three chapters are just alarmism with little scientific basis. Alarmism might get you into the Daily Mail and sell books, but it will convince few scientists. For one thing, mental illness is not just a single disorder. There is NO overpopulation crisis. And the decline in IQ could be caused by any number of things, from Covid to useless teaching fads to evolution-based population changes. Where is the scientific evidence that low dietary DHA, and not something else, is responsible?

“We can't wait any longer,” say the authors. It's a crisis “even greater than climate change.” So eat your cod liver oil now before it's too late! What we have, in effect, is four hundred pages of Lysenkoist and Lysenko-adjacent claims all for the sake of giving us some mundane dietary advice.

nov 11, 2023

Score: −1

The Age of Prediction

by Igor Tulchinsky and Christopher E. Mason
MIT Press, 2023, 218 pages
reviewed by T. Nelson

Being light and upbeat may be a virtue, but there's a risk: if you spend too much time raving about how wonderful everything is, people might confuse you with a vacuum cleaner salesman.

This book follows a familiar style: invent a catchy new word, like “Quantasaurus,” spin a narrative that superficially seems to support it, and claim that you've discovered an important new trend.

The trend here is that biomedicine and artificial intelligence are bringing about a glorious new age of knowledge science. This will lead to the “Age of Prediction” in which everyone is healthy, crimes are all easily solved, and the authors' quantitative investment firm soars to the top:

The relentless emergence of prediction in all areas of life, science, and finance will broadly reduce many forms of risk . . . . The same predictive algorithms that help us live longer and build better financial models could help us get to another world, like Mars, or to another solar system to live in a different sun's light, perhaps chauffeured by an AI-driven brain.

The two pillars of this strange new world are genetics and artificial intelligence:

[L]ike the plethora of algorithms that Igor uses in the markets, medicine is rapidly building an armamentarium of increasingly predictive tools and models, from genomic analysis to sophisticated epidemiological models and data sets to machine learning, expanding far beyond rapid diagnosis, and now aiding drug discovery and continual monitoring for the first sign of any cancer.

Personalized medicine

What they're talking about here is personalized medicine, which was a hot topic about fifteen years ago. I was once asked in a job interview what I thought of personalized medicine. I told them it was mostly a fad, which I guess was the wrong thing to say, because they didn't hire me. The idea was that each patient would get an optimized, bespoke treatment based on their specific genetics, thereby eliminating harmful side effects and treating cancer, which can arise from a dizzying variety of errors in a cell's DNA.

Unfortunately, it turns out that over 10,000 different mutations can cause cancer. Individual cancers can have as many as 20 mutated genes that play causal roles in the disease. Many researchers now think that it is genomic instability, rather than mutations per se, that plays the central role in tumorigenesis. This realization that the disease is more complex than once thought has put a damper on earlier hopes: the odds of finding a panel of drugs that will work, getting them all approved, and avoiding drug interactions make personalized medicine mainly a hope for the future. It now seems that many, if not most, of the molecular targets in these diseases are non-druggable. The approach today is to return to pro-apoptotic therapies and to search for common markers amenable to immunotherapy.

The trend, therefore, is away from precision medicine and toward corporatized medicine. As medical costs continue to skyrocket—just getting a ride in an ambulance can cost $8,000, and diagnosing a disease is even more costly—administrators seek cost efficiency at the expense of individualized treatment. Likewise, AI in drug discovery is still very much an unproven fad.

Another challenge is that when people get genetically tested, they panic. They get unnecessary operations, thinking they're doomed to get cancer. Or they become depressed, thinking their life is over because they have a risk factor for dementia.

About those financial models . . .

Then there are the authors' claims about stock market prediction, as practiced by their company. On page 35 they say an algorithm “may be built around the predictive notion that a tech stock that goes up three days in a row is likely to go up in the fourth.” They say that 100 such rules or ‘models’ would produce 10 times better predictions than just one. This is called technical analysis of stocks; its practitioners are called chartists or technicians. Computers may be faster now, but the method is not new. Simply put, you plot the stock price on a graph and look for patterns. Maybe the authors have stumbled on some unsuspected numerological truth, but the general opinion of financial economists seems to be that technical analysis is a great way of losing money for your clients. Economist Frederic S. Mishkin says it's a “waste of time,” and investment expert Burton G. Malkiel says it “must share a pedestal with alchemy.”
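The three-up-days rule is easy to sanity-check. Below is a minimal sketch (my own illustration, not the authors' code) that simulates a price series as a pure random walk and counts how often a fourth up day follows three consecutive up days; on a random walk the hit rate should hover around 50%, which is the chartist's problem in a nutshell.

```python
import random

# Simulate a price series as a multiplicative random walk
# (an assumption for illustration; real prices may behave differently).
random.seed(42)
prices = [100.0]
for _ in range(100_000):
    prices.append(prices[-1] * (1 + random.gauss(0, 0.01)))

# Apply the rule: after three consecutive up days, bet on a fourth.
ups = [prices[i + 1] > prices[i] for i in range(len(prices) - 1)]
signals = hits = 0
for i in range(3, len(ups)):
    if ups[i - 3] and ups[i - 2] and ups[i - 1]:
        signals += 1
        hits += ups[i]  # True counts as 1

hit_rate = hits / signals
print(f"signals: {signals}, hit rate: {hit_rate:.3f}")
```

Because the simulated steps are independent, the conditional probability of a fourth up day is exactly one-half; any real edge the rule has would have to come from serial correlation in actual prices, which is precisely what the authors would need to demonstrate.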

As for AI, the authors seem not to know much, so they give us a bunch of hand-waving:

In contrast to the early, limited machines, AI in the Age of Prediction is a vast, diverse, and swiftly moving field with a rich interplay between computer science and many other fields.

The biggest problem with this glorious new Age of Prediction, aside from the fact that the world is often more complicated than your model, is that predictions are always probabilistic, so they require careful evaluation by a human. That makes them more expensive, even if genetic testing and computer predictions are cheap. In the end, everything comes down to money.

Laymen might learn a bit about the hopes we once had for personalized medicine. The rest of it is just fluff. If the authors wanted to convince us that this glorious new age of prediction was real, they'd have to crunch some numbers and deal with hard questions. Instead we get a pep talk that tells people what they want to hear.

What was it about this book that caused Nature mag to put it on their recommended reading list? Maybe they like fluffy things.

nov 28, 2023. updated dec 02, 2023