Regardless of the source, nutrition news should always pass what you might call The Reasonableness Test. In other words, if a story or report or study or theory sounds ridiculous, as in the earlier examples, it probably is.
Questions to ask about any study
You open your morning newspaper or turn on the evening news and read or hear that a group of researchers at an impeccably prestigious scientific organization has published a study showing that yet another thing you’ve always taken for granted is hazardous to your health. So you throw out the offending food or drink or rearrange your daily routine to avoid the once-acceptable, now-dangerous food, beverage, or additive. And then what happens? Two weeks, two months, or two years down the road, a second, equally prestigious group of scientists publishes a study conclusively proving that the first group got it wrong.
Consider the saga of dietary fiber and colon cancer. In the early 1990s, based on a respectably large number of studies including a 1992 meta-analysis of 13 case-control efforts in nine different nations, all kinds of health experts urged everyone to increase his or her consumption of high-fiber foods to reduce the risk of colon cancer. Then in 1999, data from the long-running Nurses’ Health Study at the Harvard School of Public Health showed absolutely no difference in the risk of colon and rectal cancer between women who ate lots of high-fiber foods and those who didn’t.
Imagine the confusion. Imagine the number of boxes of high-fiber cereal tossed in favor of a breakfast of scrambled eggs, once considered a cholesterol risk but now regarded as perfectly healthful. Imagine the reaction to a report in the Journal of the National Cancer Institute two years later saying that while cereal high in dietary fiber may not be protective, people whose diets are low in fruit and vegetables have the greatest risk of colorectal cancer. What to do? Toss the cereal? Keep the banana?
Nobody seems to know. That leaves you, a layperson, on your own to come up with the answer. Never fear — you may not be a nutritionist, but that doesn’t mean you can’t ask five common-sense questions about any study to arrive at a sensible conclusion that says, “Yes, this may be true,” or “No, this may not be.”
Does this study include human beings?
True, animal studies can alert researchers to potential problems, but working with animals alone can’t give you conclusive proof of the effect in human beings because different species react differently to various foods and chemicals and diseases. Cows and horses can digest grass and hay; humans can’t. Mouse and rat embryos suffer no ill effects when their mothers are given thalidomide, the sedative that’s known to cause deformed fetal limbs when given to pregnant monkeys — and human beings — at the point in pregnancy when limbs are developing.
Are there enough people in this study?
No, a researcher’s saying, “Well, I did give this to a couple of people,” is not enough. To provide a reliable conclusion, a study must include sufficient numbers of people to establish a pattern. Otherwise, there’s always the possibility that an effect occurred by chance.
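To see why small numbers are so treacherous, consider a purely hypothetical, do-nothing treatment in which every participant simply has a 50/50 chance of "improving" on their own. The short Python sketch below (the trial sizes and the 80 percent threshold are made-up numbers, not from any real study) shows how often a tiny trial produces impressive-looking results by chance alone, while a larger one almost never does.

```python
import random

random.seed(42)  # fixed seed so the illustration is repeatable

def lucky_looking_trials(n_people, runs=10_000):
    """Fraction of simulated trials in which at least 80% of participants
    'improve', even though each person has only a 50/50 chance of improving
    regardless of any treatment."""
    hits = 0
    for _ in range(runs):
        improved = sum(random.random() < 0.5 for _ in range(n_people))
        if improved >= 0.8 * n_people:
            hits += 1
    return hits / runs

print(f"  5 people: {lucky_looking_trials(5):.1%} of do-nothing trials look impressive")
print(f"500 people: {lucky_looking_trials(500):.2%} of do-nothing trials look impressive")
```

With only five participants, roughly one do-nothing trial in five looks like a success through luck alone; with 500 participants, essentially none do. That, in a nutshell, is why size matters.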
Equally important, the study needs people of different ages, races, ethnicities, and, yes, genders. Without them, the results may not apply across the board. One good example: the original studies linking high blood cholesterol levels to an increased risk of heart disease, and those linking small doses of aspirin to a reduced risk of a second heart attack, involved only men. It wasn’t until follow-up studies were conducted with women that researchers were able to say with any certainty that high cholesterol may be hazardous for men and women and that aspirin is protective for women as well as men — but not in quite the same way. As cardiovascular researchers eventually learned, men taking low-dose aspirin tend to lower their risk of heart attack. For women, the aspirin reduces the risk of stroke. Vive la différence!
Is there anything in the design or method of this study that may affect the accuracy of its conclusions?
Some testing methods are more likely than others to lead to biased or inaccurate conclusions. A retrospective study (which asks people to tell what they did in the past) is generally considered less accurate than a prospective study (one that follows people while they’re actually doing what the researchers are studying), because memory isn’t always reliable. People tend to forget details or, without meaning to, alter them to fit the researchers’ questions.
Was this study reviewed by the author’s peers?
Serious researchers subject their studies to peer review, which means they have others working in the same field read the data and approve the conclusions. All reliable scientific journals require peer review before publishing a study.
Are the study’s conclusions reasonable?
If you find a study’s conclusions illogical, chances are the researchers feel the same way. In 1990, the Nurses’ Health Study reported that a high-fat diet raised the risk of colon cancer. But the data showed a link only to diets high in beef. No link was found to diets high in dairy fat. In short, this study was begging for a second study to confirm (or deny) its results, and in 2005, a large study of more than 60,000 Swedish women, reported in the American Journal of Clinical Nutrition, showed that eating lots of high-fat dairy foods actually reduced the risk of colorectal cancer.
EXTREME NUTRITION: CANNIBALISM
Cannibalism (from Canibales, the name early Spanish explorers pinned on a tribe in the West Indies) is one of civilized mankind’s strongest taboos. Yet anthropologists know that men and women have been tossing their friends and neighbors and relatives and defeated enemies onto the fire or into the stew pot ever since there was a written or drawn record of human activity.
The heyday of cannibalism reports was the Age of Exploration, when stories of man-eating savages went along with virtually every voyage to the New World. Many of the terrifying tales were true, but the cannibal label was also used to belittle or demonize unfamiliar or resistant peoples.
In fact, cannibalism has crept into virtually every society, civilized and not, driven sometimes by religious or cultural ritual, such as the idea that devouring the heart of a brave man confers bravery upon the diner, but more commonly by the simple necessity of survival during famine. For example, George Percy, an original member of the Jamestown Colony in Virginia, described the famine that struck the colony in 1609: “now famin beginneinge to Looke gastely and pale in every face, thatt notheinge was Spared to mainteyne Lyfe and to doe those things which seame incredible, as to digge upp deade corpes outt of graves and to eate them. And some have Licked upp the Bloode which hathe fallen from their weake fellowes.”
Although they did not reach into graves, members of the Donner Party, caught in winter storms and starving as they tried to cross the Sierra Nevada (1846–1847), were also driven to cannibalism, as were those caught in the dreadful 872-day Siege of Leningrad (1941–1944), when more than 800,000 people starved to death; those in China during the Great Leap Forward (1958–1961); and the young athletes stranded high in the Andes after the crash of Uruguayan Air Force Flight 571 (1972).
But this is Nutrition For Dummies, not History For Dummies, so what you want to know is this: How nutritious is human flesh? According to James Cole, a senior lecturer in archaeology at the University of Brighton in England who studies human origins: Very.
Human bodies, like other animal carcasses, are red meat, fat, and offal. Based on data from four (dead) male adults, Cole estimates that a whole, cooked human body serves up about 82,000 calories. At a recommended 2,500 calories a day for an average adult man and 2,000 for an average adult woman, that works out to roughly 33 days’ worth of sustenance for a man, or 41 days for a woman.
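The arithmetic behind those figures is simple enough to check for yourself. Here is a minimal Python sketch that just divides Cole’s whole-body estimate by the recommended daily intakes quoted above; nothing in it goes beyond the numbers already in the text.

```python
# Figures quoted above: Cole's estimate for a whole cooked human body and the
# recommended daily calorie intakes for an average adult man and woman.
WHOLE_BODY_CALORIES = 82_000
DAILY_CALORIES_MAN = 2_500
DAILY_CALORIES_WOMAN = 2_000

days_for_man = WHOLE_BODY_CALORIES / DAILY_CALORIES_MAN      # about 33 days
days_for_woman = WHOLE_BODY_CALORIES / DAILY_CALORIES_WOMAN  # 41 days

print(f"Days of sustenance for a man:   {days_for_man:.0f}")
print(f"Days of sustenance for a woman: {days_for_woman:.0f}")
```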