Early members of our species found nourishment through scavenging, and then hunting, animals such as gazelles, turtles, birds and fish, and consuming the edible parts of plants, including fruit and roots, as well as mushrooms, nuts and seeds. Hungry humans instinctively regarded all of these as food. About 10,000 years ago, during a period of increasing temperature and dryness in the Fertile Crescent, humans observed the ibex and aurochs grazing on einkorn, the ancient predecessor of modern wheat. Our hungry, omnivorous ancestors asked, ‘Can we eat that, too?’ They did, and surely got sick: vomiting, cramps and diarrhoea. At the very least, the plants simply passed through undigested, since humans lack the ruminant digestive apparatus. Grass plants in their intact form are unquestionably unappetizing. We somehow figured out that for humans, the only edible part of the einkorn plant was the seed – not the roots, not the stem, not the leaves, not the entire seed head – just the seed, and even that was edible only after the outer husk was removed and the seed was chewed or crushed with rocks, then heated in crude pottery over fire. Only then could we consume the seeds of this grass as porridge, a practice that served us well in times of desperation when ibex meat, bird eggs and figs were in short supply.
Similar grass-consuming adventures occurred with teosinte and maize (the ancestors of modern corn) in the Americas; rice from the swamps of Asia; and sorghum and millet in sub-Saharan Africa, all requiring similar manipulations to allow the edible part – the seed – to be consumed by humans. Some grasses posed additional obstacles; sorghum, for instance, contains poisons (such as hydrocyanic acid, or cyanide) that can cause sudden death when the plant is consumed before maturity. Natural evolution led to wheat strains such as emmer, spelt and kamut as wheat acquired genes from other wild grasses, while humans selected strains of corn with larger seeds and seed heads (cobs).
What happened to those first humans, hungry and desperate, who figured out how to make this one component of grasses – the seed – edible? Incredibly, anthropologists have known this for years. The first humans to consume the grassy food of the ibex and aurochs experienced explosive tooth decay; shrinkage of the maxillary bone and mandible, resulting in tooth crowding; iron deficiency; and scurvy. They also experienced a reduction in bone diameter and length, resulting in a loss of as much as 5 inches in height for men and 3 inches for women.1
The deterioration of dental health is especially interesting, as dental decay was uncommon prior to the consumption of the seeds of grasses, affecting less than 1 per cent of all teeth recovered – despite the absence of toothbrushes, toothpaste, fluoridated water, dental floss and dentists. Even though pre-agricultural humans had no notion of dental hygiene (aside from perhaps using a twig to pick fibres of wild boar from between their teeth), tooth decay simply did not beset many members of our species. The notion of toothless savages is all wrong; they enjoyed sturdy, intact teeth for their entire lives. It was only after humans began to resort to the seeds of grasses for calories that mouths of rotten and crooked teeth began to appear in children and adults. From that point on, decay was evident in 16 to 49 per cent of all teeth recovered, along with tooth loss and abscesses, making tooth decay as commonplace as bad hair among humans of the agricultural Neolithic Age.2
In short, when we started consuming the seeds of grasses 10,000 years ago, this food source may have allowed us to survive another day, week or month during times when foods we had instinctively consumed during the preceding 2.5 million years fell into short supply. But this expedient represents a dietary pattern spanning only 10,000 of our roughly 2.5 million years on earth – 0.4 per cent, less than one-half of 1 per cent, of our time as a species. This change in dietary fortunes came at a substantial price. From the standpoint of oral health, humans remained in the Dental Dark Ages from their first taste of porridge all the way up until recent times. History is rich with descriptions of toothaches, oral abscesses, and stumbling and painful efforts to extract decayed teeth. Remember George Washington and his mouthful of false teeth, popularly (if inaccurately) remembered as wooden? It wasn’t until the 20th century that modern dental hygiene was born and we finally managed to keep most of our teeth through adulthood.
Fast-forward to the 21st century: modern wheat now accounts for 20 per cent of all calories consumed by humans; the seeds of wheat, corn and rice combined make up 50 per cent.3 Yes, the seeds of grasses provide half of all human calories. We have become a grass seed-consuming species, a development enthusiastically applauded by agencies such as the USDA, which advises us that increasing our consumption to 60 per cent of calories or higher is a laudable dietary goal. It’s also a situation celebrated by those who trade grain on an international scale, since the seeds of grasses have a prolonged shelf life (months to years) that allows transoceanic shipment, they’re easy to store, they don’t require refrigeration and they’re in demand worldwide – all traits desirable in a commoditized food. Transforming a foodstuff into a commodity tradeable on a global scale opens the door to financial manipulations – buying and selling futures, hedges and complex derivative instruments, the tools of mega-commerce. You can’t do that with organic blueberries or Atlantic salmon.
Examine the anatomy of a member of the species Homo sapiens and you cannot escape the conclusion that you are not a ruminant, have none of the adaptive digestive traits of such creatures and can consume the seeds of grasses – the food of desperation – only by accepting a decline in your health. But the seeds of grasses can be used to feed the masses cheaply, quickly and on a massive scale, all while generating huge profits for those who control the flow of these commoditized foods.
Mutant Ninja Grasses
The seeds of grasses, known to us more familiarly as ‘grains’ or ‘cereals’, have always been a problem for us nonruminant creatures. But then busy geneticists and agribusiness got into the act. That’s when grains went from bad to worse.
Readers of the original Wheat Belly know that modern wheat is no longer the 4½-foot-tall traditional plant we all remember; it is now an 18-inch-tall plant with a short, thick stalk; long seed head; and larger seeds. It has a much greater yield per acre than its traditional predecessors. This high-yield strain of wheat, now the darling of agribusiness, was created not through genetic modification but through repetitive hybridizations, mating wheat with nonwheat grasses to introduce new genes (wheat is a grass, after all), and through mutagenesis, the use of high-dose x-rays, gamma rays and chemicals to induce mutations. Yes: modern wheat is, to a considerable degree, a grass that contains an array of mutations, some of which have been mapped and identified, many of which have not. Such uncertainties never faze agribusiness, however. Unique mutated proteins? No problem. The USDA and US Food and Drug Administration (FDA) say they’re okay, too – perfectly fine for public consumption.
Over the years, there have been many efforts to genetically modify wheat, such as by using gene-splicing technology to insert or delete a gene. However, public resistance has dampened efforts to bring genetically modified (GM) wheat to market, so no wheat currently sold is, in the terminology of genetics, ‘genetically modified’. (Recent industry rumblings, however, suggest that true GM wheat is likely to reach the market in the near future.) All of the changes introduced into modern wheat are the results of methods that pre-date the technology used to create GM foods. This does not mean that the methods used to change wheat were benign; in fact, crude and imprecise techniques such as chemical mutagenesis have the potential to be worse than genetic modification, yielding a greater number of unanticipated changes in genetic code than the handful introduced through gene-splicing.4
Corn and rice, on the other hand, have been genetically modified, in addition to undergoing other changes. For instance, scientists introduced genes to make corn resistant to the herbicide glyphosate