Providing useful dietary and exercise suggestions to the public is vital to me. I constantly check the research to ensure I am making the best possible suggestions for the various goals sought by those who read my material. Body IO has even hired a dedicated researcher to help keep our information consistent, up-to-date and, most importantly, accurate.
This is one reason I value balanced, well-researched critiques of my dietary and athletic recommendations—although shockingly few of the myriad critiques out there fit this description. These critiques hold me and the Body IO team to exceedingly high standards, sometimes forcing us to retract certain statements or modify advice—something I’m sure you’ve noticed.
Because we do our job well, the foundation of the information provided here remains accurate and constantly verified. This is why I work closely with Dr. Rocky Patel to ensure scientific hypotheses translate into a healthy physical reality. As an example, I published The Carb Nite® Solution almost ten years ago, and to date, clinical use and emerging research continue to support the methodology.
Advancement Happens When We Question Our Most Basic Beliefs
Over the past few years, Paleo forced me to question some of my recommendations. The Paleo codex clearly conflicts with components of my plans, making them difficult for Paleo adherents to swallow and even harder for them to attempt. How much further afield of Paleo can one go than recommending a donut as part of a healthy diet? Carb Nite does just that.
Because of this conflict and the respect I have for some of the most prominent Paleo experts, I decided to determine the validity of the Paleo prescription and if it really did offer consistent, defensible insights. If it did, particularly in matters of food choice, I would account for it in future works.
As you might guess from Part 1 of this series, I found the opposite of justification. What I did learn, however, is that the history of food exemplifies our ability as a species to create, adapt and thrive in almost any food environment.
What we need to question, as a culture, is the recurring fear of food—and the stories we create to justify it—that’s accompanied each sweeping dietary change for the last 150 years. All of these fears have turned out to be unfounded, including the ones used to defend Paleo food selection.
FALSE: Humans did not evolve eating significant amounts of grass-based grains such as wheat and barley.
The history of wheat is one of triumph and villainy. When Norman Borlaug created the first disease-resistant, high-yield dwarf wheat hybrid, he laid the groundwork to feed billions of people worldwide, some of whom had never known plenty. Borlaug’s contribution to alleviating world hunger earned him the 1970 Nobel Peace Prize. Now, however, an uncountable number of tomes warn of the dire consequences of Borlaug’s Green Revolution. Is Borlaug the savior of billions or the cause of all Western disease?
Paleo tells us that grass-based grains like wheat and corn either didn’t exist or were not accessible. It is because of this dietary and evolutionary mismatch, the argument goes, that obesity and metabolic disease escalate in industrialized nations.
Prominent Paleo authors posit the absence of grains in the Paleolithic diet based on the lack of archeological remains and on arguments about the human digestive tract. Plant remains do not survive environmental decay as well as bones and stone tools do, which accounts for the lack of evidence at various ancestral human sites. Previous authors relied on the appearance of stone tools used for processing grains to estimate when grains entered the human diet.
Based on new methods of analysis, we can recreate the dietary landscape of early Australopithecus, our ancient forebears from nearly 2 million years ago. I find this instructive because evidence suggests that the beginning of our broadening food choices—both in food type and seasonality—started with the consumption of wild grasses and animals[56-61]. This ability to eat very early grasses marked the first step in the evolution of dietary flexibility. Even our earliest ancestors developed an ability to digest wild, seed-bearing grasses, making the argument that humans are not equipped to adequately digest grass-based grains (such as barley and wheat) somewhat dubious.
In the area where some of the earliest modern humans lived toward the end of the Middle Paleolithic and beginning of the Upper Paleolithic Era, about 90,000 to 53,000 years ago, conditions existed for the seasonal growth of wild wheat and barley varieties and several small-grained grasses—like foxtails. As early Australopithecines developed the capacity and propensity for foraging grasses, we could hypothesize that early Paleolithic humans did the same.
Even if this hypothesis is more logical than thinking that human ancestors ate grasses and grains for millions of years, then suddenly stopped for 200,000 years with the arrival of Homo sapiens, it does not constitute a ‘smoking gun’.
To examine more recent human diets from the Paleolithic age, archeologists would need a windfall of well-preserved plant matter from a Paleolithic dig site. Serendipitously, such a find was made about ten years ago: a site dating back 23,000 years, located in the region that served as the leaping-off point for the migration of humans north into Europe.
Researchers found significant amounts of wild wheat and barley, as well as large quantities of small-grained grasses, like the foxtails mentioned above and brome grass (foxtail seeds contain gluten residues identical to wheat’s, and brome grasses produce gluten components and are the progenitors of several modern grain cultivars[65-66]). This new data stands in stark contrast to previous estimates, particularly the belief that these wheat varieties did not reach the ancient world until roughly 8000 years ago.
Discovering the grains establishes that our hunter-gatherer forbears collected the material for some reason. We must still determine if these items represented construction material—for example, were the grasses used for bedding—or food.
As researchers probed the ancient huts, several clues led them to believe the grains were used as supplemental food items:
The large number and variety of grains;
The hunter-gatherers harvested fully matured grains, meaning the greatest nutritional content;
The type and quantity of grains aligns with grain collection and types used by modern hunter-gatherers and primitive farmers[69-73];
The ancient inhabitants stored the grains near an ancient grinding stone.
Humans may have a million-year-old relationship with seed-bearing grasses, and we can confirm that grains have been a part of the human diet for at least 23,000 years. Even if this defined our total evolutionary window of exposure, given our accelerating genomic changes, this would have been plenty of time to adequately adapt to these food sources—at least for those of us of European or Arabic descent (I qualify this statement because we can’t say, based on this data, that the Indian or Asian continents had such early exposure).
Any problems humans in industrialized nations suffer today are likely a problem of quantity (too many dietary carbohydrates) rather than an evolutionary intolerance to wheat and other gluten-containing grains.
Bottom Line: Humans evolved eating grains for at least 23,000 years and likely longer, and are well adapted to consuming them.
FALSE: Dairy farming is too recent for humans to have properly adapted.
I eat a lot of cheese and butter and drink my fair share of heavy whipping cream. I don’t drink milk; I can’t remember the last time I did. But of all the subjects I researched in writing this paper, the history of milk in the human diet amazed me. The complete story is a bit of a mystery, but after piecing together all the evidence, the history of milk consumption in humans exposes the astonishing rate at which humans can evolve in response to changing dietary landscapes.
Of all the Paleo recommendations, the mandate to avoid dairy must be the most difficult to justify. If they side with the scholars who estimate humans did not begin consuming dairy until 5000 years ago, they can continue to argue that human exposure to milk is recent; however, they’re then faced with a discrepancy. Since we’re supposedly Stone Age genetic throwbacks, Paleo would need to explain the rapid genetic changes needed to consume milk as an adult, a trait that 40% of modern humans possess (it just so happens that this 40% who can drink milk appear to be the Paleo Diet’s main demographic).
If, on the other hand, they decide to preserve the idea that human evolution moves at a snail’s pace, they would be forced to use other estimates that humans began dairying—the technical term for dairy farming in the literature—as early as 15,000 years ago, placing it in the Upper Paleolithic Era.
Mark Sisson seems to have devised a sensible solution to this conundrum: if you can eat dairy, then eat dairy. This wisdom failed to enter the official Paleo codex.
In the case of dairy consumption, researchers as yet lack a windfall discovery like the one demonstrating the long commingled history between humans and grains. For the advent of dairying, we can examine three separate lines of evidence: estimates of the introduction of animal husbandry (maintaining livestock), genomic studies on lactase persistence in humans (the ability to digest raw milk), and residues of organic materials from well-preserved pottery and tools (the storage and processing of milk).
With astute reasoning, Paleo recognizes that modern hunter-gatherers don’t drink milk, an observation they use to defend dairy abstinence. Hunter-gatherers do not raise cattle—they hunt and they gather—this makes fresh milk difficult to obtain. Though I’m no expert on the subject, I imagine the risk to reward matrix of milking a wild boar or rhinoceros is unfavorable, thus discouraging the practice.
Because of this profound Paleo perception, I want to start with estimates of when humans first started maintaining dairy-producing livestock, particularly cattle. We need cattle to make acquiring fresh milk in large quantities feasible. Clever use of bovine mitochondrial DNA and differentiation of recurring gene patterns—haplotypes—gives us estimates of when early humans began domesticating cattle.
Two recent and thorough studies of Neolithic cattle mitochondrial DNA show that humans tended domesticated herds of cattle at least 11,000 years ago[76-77]. By this time, our ancestors had also domesticated goats and sheep. That is the latest date. These studies cannot tell us how early domestication started, but they can suggest likely starting timeframes, which could reach as far back as 30,000 years ago. Our Paleolithic ancestors may have started the agricultural revolution by herding and domesticating livestock.
That we may have begun domesticating cattle as early as 30,000 years ago is consistent with the divergence of haplotypes from ancient cattle. It also aligns with evidence suggesting that humans brought domesticated livestock with them as they moved north into Europe[77, 79-81].
We definitely had access to a significant supply of dairy 11,000 years ago, but could we digest it?
This is a question of genetics and also one of technology. Let’s discuss the genetic component. To consume raw milk, we need the ability to digest lactose. Most animals start life able to digest lactose by creating an enzyme called lactase. These animals lose the ability as adults, just like 60% of modern humans. Those 40% of us who continue to enjoy milk throughout adulthood possess a genetic mutation known as lactase persistence.
Tracing the origin of the first mutation is difficult, but ancient remains do show that early Neolithic humans carried a gene variant for lactase persistence; it existed in humans 10,000 years ago, but it was uncommon[82-84]. Another 1000 to 3000 years passed before lactase persistence became pervasive. This timeframe for the appearance of lactase persistence across Europe, Africa, the Middle East and India seems surprisingly consistent[86-88].
Looking at the data thus far, a discrepancy exists. Early humans had cattle around 11,000 years ago but lacked the ability to digest raw milk for possibly another 3000 years thereafter. This leaves a 3000-year window (or longer) when we had access to dairy but possibly spurned it. A clue to this missing-milk-time emerges as we explore our third topic: ancient tools and pottery.
Broken clay jars and rudimentary colanders also tell part of the story. If the artifacts once stored fats or fatty foods, we can extract the tale of their ancient contents through isotope-detection techniques. From various sites across Europe, the earliest direct evidence of milk storage from sheep, goats or cattle dates to about 9000 years ago.
The 9000-year age is significant because it represents the very first appearance of pottery in the ancient world, i.e. humans poured milk into the world’s first pottery. As with the domestication of livestock, humans may have been dairying earlier, but since the current set of tested fragments represent the oldest known human pottery, we can’t use these methods to probe deeper into our milky past. We do know that from that point on, dairying continued throughout the Neolithic period[90-94].
Our picture now looks like this: we have cattle; 1000 or 2000 years later, we invented pottery and immediately filled it with milk; finally, 1000 years after that, we developed the ability to drink milk. It feels like something is missing. That missing piece: cheese. The existing archeological record shows that cheese making is at least as old as pottery making.
After the domestication of cattle and before 9000 years ago, humans developed a technique for making cheese, and they appear to have been quite adept at it by this point, judging by their design, construction and use of sieves in the process. Humans and cheese, it appears, have coexisted longer than we can witness from the archeological record.
Developing cheese probably gave our earliest agricultural forebears a massive advantage over the native hunter-gatherer populations. Cheese could last through times when food was less plentiful, providing valuable fat and protein. Lacking the ability to digest lactose didn’t matter, because cheese is what’s left over once curds form and the lactose and whey are washed away through the sieve.
Neolithic humans used crude sieves for this extraction process. The residue that remained on those strainers provided enough raw material to determine the last time someone used them to make cheese, even though that was nearly 9000 years ago.
During the millennia-long overlap between a predominantly lactose-intolerant people and access to dairy and cheese making, what once acted as a benign mutation—lactase persistence—became a positively selected trait. With a continued influx of primitive farmers from other regions and positive selection pressure, lactase persistence spread to 70 to 80% of Europeans in only 1000 years.
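To give a feel for how fast positive selection can work, here is a minimal sketch of a textbook one-locus selection model. This is my own illustration, not a calculation from the article or its references: the starting allele frequency, selection coefficient and generation time are all assumed values chosen for demonstration. The point is simply that a modest fitness advantage can carry a rare dominant allele, like the one behind lactase persistence, to majority status in a few dozen generations.

```python
def next_freq(p, s):
    """One generation of selection favoring carriers of a dominant allele A.

    Genotype fitnesses: AA = 1, Aa = 1, aa = 1 - s (non-persistent adults
    assumed to benefit less from fresh milk). Returns the new frequency of A.
    """
    q = 1.0 - p
    w_bar = p * p + 2 * p * q + q * q * (1.0 - s)  # mean population fitness
    return (p * p + p * q) / w_bar                 # frequency of A after selection

p = 0.05           # assumed rare starting allele frequency
s = 0.10           # assumed selection coefficient (hypothetical)
generations = 40   # roughly 1000 years at ~25 years per generation

for _ in range(generations):
    p = next_freq(p, s)

# Persistence is dominant, so carriers = 1 - (freq of aa homozygotes)
carriers = 1.0 - (1.0 - p) ** 2
print(f"allele frequency: {p:.2f}, milk-digesting phenotype: {carriers:.2f}")
```

Under these assumed numbers, the milk-digesting phenotype climbs from a small minority to a solid majority of the population within the 1000-year span described above, which is why a "benign mutation turned positively selected trait" is a plausible mechanism rather than an evolutionary miracle.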
Despite the lengthy discussion, exploring the history of dairy in the human diet demonstrates Paleo’s multiple fallacies. Humans do, in fact, have a long and beneficial history with dairy; the prevailing environment did not shape the human diet, but humans created a new food environment by keeping livestock and utilizing dairy; after forming this new food niche, the local human genome rapidly evolved allowing them to thrive more easily; by evolving greater fitness in the new niche, humans formed divergent eating patterns throughout the region and thus created differing ancestral contexts for diet.
Every one of these early human successes contradicts Paleo’s most cherished tenets.
Bottom Line: Dairy—from goats, sheep and cattle—has been in the human diet (particularly of Europeans) for at least 9000 years and we rapidly adapted to dairy consumption, an adaptation that continues to spread even today.
FALSE: Yams and sweet potatoes are biologically appropriate ancestral foods (and maybe white potatoes too).
A couple of years ago, I remember a Paleo authority educating me—actually, correcting me—when I discussed the inappropriate comparison of the Okinawan diet—high in rice—to the problem of the Western world’s obesity epidemic. I argued, and still do, that it is inappropriate and ignorant to use the traditional Okinawan diet to justify the claim that a high-carbohydrate diet must be healthy for everyone.
No one disagreed with my sentiment, but the correction came when I was told that Okinawans didn’t actually eat rice. Their Paleolithic ancestral food is the sweet potato. If I have learned anything in life, it is this: when I’m ignorant on a subject, I listen, assess and when time allows, study. I did just that.
This event occurred around the advent of the nebulous Paleo 2.0, when carbohydrate-rich sources such as potatoes, sweet potatoes, yams and even rice suddenly became biologically appropriate ancestral foods. At the time of writing this article, confusion trumps consensus. Some Paleo websites say sweet potatoes are ancestral, but white potatoes are not. Another Paleo website says all tuberous vegetables are ancestral.
Each website I visited seemed to adopt a ‘best guess’ approach to classification. The agreement across the myriad websites I consulted is this: sweet potatoes and yams existed in the human diet throughout the Paleolithic Era, whereas white potatoes constitute a modern, industrialized food.
Since I established that humans do not derive from a single appropriate context, the answer to whether potatoes are Paleo depends on ancestry, although barely. As for the ancestry of potatoes—all types—we know they originated in Central and South America[101-104]. On all other continents where potatoes now grow, humans carried them there.
To answer the question of the potato’s Paleo-authenticity, we only need to know when first exposure occurred in different populations. If you trace your genetic heritage back to Central or South America, then one could argue that potatoes count as an ancestral food.
The oldest human remains in the Americas date to approximately 11,000 to 14,500 years ago and are located in North America[105-106]. As potatoes originated in the Americas, humans may have reached the potato while technically still in the Paleolithic Era. Even in this case, those with prehistoric American heritage would be less adapted to potatoes than those of European descent are to wheat.
According to the Paleo codex, therefore, potatoes cannot be appropriate for even the first human ever to taste one. But let us continue to explore the history of humans and the potato.
The next population to acquire sweet potatoes was the Polynesians, who probably dropped several plant species in India and China around 900 years ago[107-108]. This implies that Okinawans did not have access to sweet potatoes until around 1100 A.D. Paleo cannot, unfortunately, showcase the Okinawan diet of sweet potatoes to defend the idea of a Paleo potato. Okinawans, according to Paleo theory, should have become metabolically deranged after several hundred years of eating sweet potatoes, a non-ancestral food.
Europeans enjoyed potatoes last, starting roughly 500 years ago when explorers returned from their Atlantic crossings bringing potatoes and an assortment of edible vegetation[106, 109-110]. Even yams did not make it to the European continent until roughly 500 years ago, brought back from Southern Africa by explorers.
Paleo should argue that potatoes and yams—sweet, white or otherwise—are biologically inappropriate for all humans. Only one human culture might have come into contact with potatoes during the Paleolithic period.
While we’re exploring Paleo’s accepted list of ancestral foods, we can identify other common Paleo foods with a similar history to potatoes. Tomatoes entered Europe with potatoes, as did all types of squash (including zucchini) and peppers—total European exposure time: 500 years. Our European ancestors ate wheat pasta thousands of years before adding tomato sauce.
Carrots, also a recommended Paleo food, originated in the mountains of Afghanistan and entered the human diet roughly 1000 years ago. (The carrot will prove to be another thorn in Paleo’s side, as I will discuss later in the series.)
Actually, most vegetables recommended by Paleo fail their own litmus test of appropriateness. Paleo-approved vegetables—turnips, leeks, lettuce, cabbage, beets, celery, asparagus, etc.—were originally considered weeds among ancient farmers and did not enter the human diet until roughly 1000 years ago. To date, the most extensive plant remains found at prehistoric human sites that represent foodstuffs include acorns, pistachios, almonds, olives, figs, raspberries, grapes and large amounts of grains.
Scrutinizing Paleo food suggestions, from grains to dairy to potatoes and beyond, demonstrates that there is no Paleo template. Food selections and suggestions come quite literally from the whimsy and personal desires of those making recommendations. Several sites defending potatoes as Paleolithic foods do so because, to paraphrase, “if you’re athletic, you need carbohydrates”.
In essence, most experts explain Paleo vegetables as those with a low glycemic index. If this is the definition of a Paleo food, then Paleo is a modern diet using modern definitions of measure—ones that are arguably invalid in terms of health—to construct food lists.
Humans have consumed grains, particularly wheat, for over 20,000 years and dairy for over 10,000 years, yet Paleo considers them biologically inappropriate because adaptation in such a limited timeframe is impossible. Paleo, however, deems foods entering the human diet in the last 500 to 1000 years as part of the Paleolithic human diet, and therefore, recommended. Using the term “evolutionarily informed” is erroneous.
Bottom Line: Potatoes, yams and a host of other Paleo recommended foods entered the human diet in the last 500 to 1000 years and were not a part of the prehistoric human diet. These lists constitute a whimsical selection of vegetables apparently chosen to create a palatable, modern diet that bears no resemblance to our Paleolithic or even Neolithic heritage.
To Be Continued…
References (Continued from Part 1)