Chapter 2

In the last chapter, we explored the lifestyle of our ancient ancestors in our East African homeland and dubbed it the “Savanna Model.” Remarkably, our bodies have not changed significantly since then and, ideally, we would still live and feed ourselves the same way, even today. However, as history will show, things changed. We will now continue our human story by exploring how and why human lifestyles drifted away from this ideal. Just before the dawn of recorded history, so-called advancements took place that set most of humanity on a path that led firmly away from our naturally adapted lifestyle. This process repeated itself throughout the world. As we will see, most humans were no longer nourishing their bodies to the best advantage.

This chapter puts the spotlight on the major departures from the ideal diet, departures which happened, for the most part, quite unwittingly. As the centuries rolled by, new techniques and new foods came along that led us ever further from our ancestral feeding patterns. From the 19th century onward, it became obvious that our food supply was not always nourishing populations properly. It also became apparent that the food supply was increasingly vulnerable to dubious practices. We review how government agencies tried to correct both these matters and why we cannot rely on them to protect the public interest. This will give us a perspective on how we ended up with the food supply that we have.

THE MAJOR UPSET IN HUMAN NUTRITION

Then, a group of foragers stumbled on a solution to the problem. They discovered how to feed themselves on a much smaller area of land. Instead of wandering their territory in search of their next meal, they took control of their food supply. They had figured out a way to survive on 4 square miles instead of 200 square miles. This was a huge innovation for the human race: for the first time, humans stayed in one place and planted. It would not only change the social behavior of these bands forever, it would also cause a dramatic shift in the human diet, not always with positive results.

The place where it all started was the grasslands of what is now Kurdistan in northern Iraq. Historians call this lifestyle upheaval the “Farming Revolution.” It was important enough to mark the end of the Paleolithic Age and the beginning of the “Neolithic” or New Stone Age; the start of agriculture defines this period. It would take man’s use of metals, several thousand years later, to usher in the next important period, the Bronze Age. However, for us, it is this agricultural revolution that still influences our life and health today.

The New Stone Age ushered in unforeseen consequences. Because those pioneer farmers were not able to cultivate their usual diet of foraged plants, they were compelled to grow whatever it was possible to grow. For this reason, completely new food groups entered the diet. For the first time in the history of the human race, people started to eat grass seeds. Put plainly like that, it sounds faintly ridiculous, but today we eat grass seeds on a huge scale. Of course, we know them by another name: grains. Those first farmers took the grasses that surrounded their living space and learned to plant, grow, harvest, winnow, and mill their seeds. These grasses were the ancestors of today’s wheat and barley. This dietary change marked a massive upheaval in human nutrition. It was the first step—but a major one—away from our ancestral diet.
We are only now beginning to understand the severity of the consequences.

Cereal and Legume Defined

Cereal is just another word for grain; cereals are all varieties of grasses. Legume, also known as “pulse,” is the collective term for lentils, chickpeas (garbanzo beans), peas, soybeans, peanuts, and similar seeds. Some authorities, such as the United States Department of Agriculture (USDA), use the term dry beans. Legumes are part of a very large family called Fabaceae, the pea family.

About the same time, the first farmers also discovered how to grow
lentils and garbanzo beans (chickpeas). In this way, 11,000 years ago,
and for the first time, we humans started to consume two completely new
food categories: grains and legumes. It was the first demonstration that
humans could harness nature. However,
as we shall see, nature can play tricks on us. Just because we can
consume something without an immediate negative reaction, it does not
always mean that we should consume it. Indeed, humans can train themselves to eat almost
anything. It is one of the lessons that we shall learn: contrary to what
most of us fondly imagine, we cannot trust our instincts to tell us what
to eat.

A TECHNOLOGICAL REVOLUTION AND A SHIFT IN SOCIETY

All these pressures made for a huge change in human activity.
Humans had exchanged the mobile, instinctive, and day-to-day existence
of the forager for the responsibilities of the structured, disciplined,
and productive life of one who farms and processes food. A remarkable
adjustment had to be made: evolution had equipped humans with a
mentality for survival in the savanna environment. Fortunately, some of
the same qualities, such as ingenuity, fortitude, and persistence, could
be pressed into service to make a success of this new existence. On the
other hand, humans are not by nature tidy or given to planning for the
future or to organizing large groups. These first farmers had to learn,
the hard way, the skills to manage themselves on a larger scale and to
make provision for the future. Farming fixed people in one place, so they created the first
permanent villages. This marked one of man’s most important shifts away from animal-like clan life. The density of their populations increased
vastly. As foragers, an individual would rarely see a group of more than
50 people; in a whole lifetime, he or she might encounter no more than
400 strangers. Today, we come across just as many on a single visit to
the shopping mall. There was a second, quite unexpected consequence that changed
forever the way human society is organized. As foragers, humans lived
day-to-day and hand-to-mouth. They gathered what they needed for the day
and consumed it, then they repeated the process the next day. Everybody
took part in the procurement of food—if not, they starved. With the
advent of farming came a radical change: farmers had to produce food in
advance of requirements and store it. This enabled the production of
food surpluses. In a very short time, these surpluses were used to
support artisans whose skills only indirectly helped food production.
Here was the start of the “division of labor,” where individuals
specialized in just one activity, such as the making of tools, baskets,
or bricks. The farming life was in many ways more insecure than the foraging
one. The stores required protection from pilferers and bandits, so
warrior castes arose. The total dependence on a successful harvest
required the gods to be placated, so priestly castes came into being. As
intermediaries between the people and the gods, the priesthoods in turn
developed ever more complex rituals, sacrifices, taboos, and
superstitions.

In forager societies, the barter system is well
developed. Humans are very good at keeping a record in their heads of
who owes what and to whom: they keep score and make sure that they leave
no obligation unreturned and that nobody cheats or gets a free ride. It
is easier to police this in a forager society. They know everybody with
whom they are dealing and many of them are direct relatives. There is a
high degree of trust. However, in these new, densely populated farming
societies, this delicate balance breaks down. With the division of
labor, the number of transactions multiplies. Farmers, tradesmen,
artisans, and all the different occupations have to make deals with each
other all the time—just to stay fed, get raw materials, and trade finished products. Furthermore, they are less likely to know each other or to have mutual kinship ties.

There was an urgent need to keep records of who did what—and owed what—to whom. This led to another revolution in human society:
the invention of writing and numbering. In this way, an intellectual
class of scribes and bookkeepers came into being. It is from this time
that we have the first written records, or “history.” (Everything
that happened before this time is known as “prehistory.”) Barter of
goods is an unwieldy and inflexible way of trading on this scale, so
money was invented. With that came special classes of financiers, money
lenders, and accountants. With the multiplication of transactions
between people who did not know each other, lawyers and judges were
needed to draw up contracts and resolve conflicts. To manage these
complex societies, a class of bureaucrats came into being. All these
specialized groups were fed from the food surpluses. The food reserves were a fabulous source of wealth quite unknown to
hunter-gatherers. With concentration of knowledge in just a few hands,
some people were able to commandeer an unequal share of these resources
for themselves and their relatives. In this way, potentates, priests,
and merchants accumulated vast amounts of power and wealth.

Dating the Earliest Farming Sites

The dates we give for the start of farming are the
best estimates available. Exploration of ancient sites is ongoing and
new sites could yet be discovered that push the earliest dates back in
time. In addition, it is possible that improved dating technology will
cause minor revisions to currently accepted dates. However, for our
purposes, we are only interested in seeing the general pattern of when
farming came about. It does not really matter exactly when it happened
or, indeed, which farmers were the first. The essential points are
that, in the grand sweep of human history, it is a recent occurrence
and that farming started in certain localities and not in others.

In due course, many villages grew into cities; some cities became
the centers of great empires such as Sumer, Babylon, and Egypt. With
this evolution came a completely new way of organizing society. By
necessity, and in a great many ways, the new society was at variance
with our naturally adapted Savanna Model society. This dislocation
affects us even more today, leading to all kinds of unwarranted stress
and psychological disturbances. The people who adopted farming had, unknowingly, grasped a tiger by
the tail. Their population densities had grown well beyond the point
where they could return to a simple forager existence. So, all peoples
who depend on farming (and that includes us) have to put up with its
inherent drawbacks. However, until now, we had not even realized the
extent to which our lives are affected by this lack of harmony with our
savanna-bred natures.

THE SPREAD OF FARMING

The farming techniques developed in Kurdistan spread rapidly to
neighboring areas. Within 1,000 years, farming was practiced in the
plains between the Tigris and Euphrates (in present-day Iraq) and
eastwards to western Persia (Iran). Farming spread westward into areas
of present-day Syria, Turkey, Lebanon, Jordan, Israel, and finally the
Nile valley of Egypt. This region was the scene of the struggles and migrations
of some of the earliest known peoples, including Sumerians, Assyrians,
Akkadians, Semites, Babylonians, and Phoenicians. On a map, the area
traced out looks rather like a French croissant (a crescent shape). In
1916, an American Egyptologist at the University of Chicago, James Henry
Breasted, coined the term the Fertile Crescent, which became the byword for the
cradle of farming (see map on pp. 10-11). Farming was also quickly taken
up even further to the west, in Cyprus, Crete, and Greece, and in India
to the east. As time went by, other groups of people, quite independently,
discovered how to farm grains using whatever resources were locally
available. The Chinese began with millet about 7500 B.C. and moved on to rice about 1,000 years later. From there, rice
cultivation spread to Burma, Indo-China, and India. Rye, which grows
well in cold climates, was first harvested 3,000 years ago when
agriculture spread to northern Europe. Oats came along only 1,000 years
ago, also in northern Europe.

Recent Origins of Breakfast Cereals

The modern commercial concept of corn as a breakfast
food originated in the vegetarian beliefs of the American Seventh-Day
Adventists. In 1906, a Seventh-Day Adventist named Will Kellogg
founded a company to make “Corn Flakes” for this niche market.
Then, in the late 1950s, came a remarkable example of how smart
advertising can dramatically change a nation’s eating habits. A new
marketing campaign promoted “breakfast cereals” so persuasively
that consumption skyrocketed. In just a generation, they became the breakfast food of choice for an entire nation. Progressively,
governments have required the cereals to be fortified (or, as the
cereal companies prefer, “enriched”) with an ever-lengthening list
of vitamins and minerals. The Indians of Mexico were the first to cultivate corn (maize)
7,000 years ago. By ingenious selection of the best varieties, they
gradually bred it from a normal grass seed into the much larger and plumper cob that we know today. Columbus
brought corn back to Spain and it spread to similar climates in the Old
World. In the United States, the main communities cultivating corn were
those living close to their Mexican counterparts in the Southwest. It
was not until 200 A.D. that corn spread out from that area and then only to the Indians
on the eastern seaboard, such as the Iroquois. Even so, it was regarded
as a minor crop. Most of the other Indians of the United States—the
Apache, Comanche, Sioux, Cheyenne, the Cahuilla in the south, and the
Chinooks in the north—were hunter-gatherers. After the arrival of
European farmers in America, wheat, not corn, was the main crop planted
for human consumption. It may come as a surprise to learn that in the
United States corn did not become a big item of human consumption until
the 1950s. Until then, Americans only consumed corn in a minor way in
the form of popcorn, corn on the cob, and hominy; corn’s main use was
to fatten cows and hogs. We have so far focused on grains because they were the storm
troopers of the farming revolution. As the centuries rolled by, many
more foods were brought into production (and others abandoned). In the
next chapter, we will look at how these new foods were introduced and
the consequences (for better or for worse) of human consumption. In the
meantime, let us note that it took a long time for farmed products to
become common around the world. The peoples of ancient Kurdistan (northern Iraq) happened to be the
first to develop farming, but as we have seen, later and quite
independently, cereal farming was invented in China and Mexico. However,
not all farming started with cereals: the Incas of Peru began with
potatoes (5,500 years ago) and moved on to a grain-like seed called “quinoa” only later. The Indians of the eastern United States first cultivated the sunflower for its seeds 4,500 years
ago. The tuber of a closely related sunflower species (we know it as the Jerusalem artichoke) was also eaten.

THE INDUSTRIALIZATION OF FARMING

Sometimes, farming percolated outwards from these centers, often by
conquest, to neighboring territories, but the process was not always
rapid. For example, the Celts, Germans, Anglo-Saxons, and Scandinavians
did not farm until 2,500 years ago, a mere 100 generations past. Indeed,
up to the present day, there are still a few non-farming populations:
isolated forager bands of San Bushmen (Southern Africa), Aborigines
(Australia), Hadza (Tanzania), the fierce Sentinelese (Andaman Islands), and Aché (Paraguay) have escaped efforts to corral them into fixed hamlets
and farms. Farming always began with plants. However, where suitable animals
existed, their domestication quickly followed. In the Fertile Crescent,
sheep and goats were soon farmed. The same happened in China (pigs),
Mexico (turkeys), and Peru (llamas and guinea pigs). The types of plants
cultivated and breeds of animals raised were specific to the locality.
But the plants and animals of the Fertile Crescent are the ones that
spread to Europe and came to dominate the Western food supply until the
late Middle Ages (around 1300 to 1500). Farmers learned that spreading farmyard manure on the land improved
the quality of the crop. Farmers were great naturalists: they watched
out for the best-growing plants and selected their seeds for the next
planting. In this way, they developed varieties that possessed more
desirable qualities: for example, they resisted disease better, had
better yields, or were easier to harvest.

The earliest farmers used hoes to till the ground. But as soon as
they had domesticated cattle, oxen were available as a source of power.
So, some ingenious person invented the first plows. They were already in
use 5,000 years ago in present-day southern Iraq. The technique quickly
spread to everywhere in the Fertile Crescent, including Israel and
Egypt. The earliest known use in China is more recent, about 2,500 years
ago. This basic plowing technique hardly changed for several thousand
years although there were gradual improvements: more efficient plows
were devised and draft animals became bigger and more powerful. Most
farming centers followed this pattern, but in the Americas, no suitable
animals were available, so the Aztecs and Incas continued to cultivate
by hand. The first farmers had to grind their cereal grains into flour. They
did this with a device called a quern, which consisted of a flat stone
with a rounded stone on top. A few grains were put between the two
stones and someone pushed the rounded stone backwards and forwards to
pulverize the grains into coarse flour. By Roman times, the quern had
become a much bigger, rotary device operated by slaves or donkeys. About
this time, there was an important advance: nature, in the form of
flowing water, was harnessed to turn the millstones.
These early “watermills” were built entirely of wood, including all the mechanism. In some areas where free-flowing water was not readily
available (for example, Holland), the watermill technology was adapted
to harness wind power; thus the windmill was born. The technology
improved steadily over the ensuing centuries. Not until the Industrial Revolution of the 19th century did steam power replace these mills.

The late Middle Ages in northern Europe saw two big leaps in
farming practices. In England and Germany, it was discovered how to get
three crops every two years instead of just two crops. This is known as
the three-field system: one-third was planted in the fall for harvesting in early summer, one-third in the spring for harvesting in late summer, and
one-third remained fallow. This increased production by 50%.
Mediterranean countries like Italy, Greece, and Spain could not benefit
from this innovation: unlike northern Europe, they do not have summer
rain, which is essential for the system to work. Secondly, the problem of feeding livestock during the hard winters
held back northern Europe. The practice was to slaughter a large part of
the herd in autumn and start again in the spring. The three-field system
generated a surplus of fodder that farmers could feed to the beasts
through the winter. But this could only work if there was a good way of
preserving the fodder for several months. This led to a second major
development—silage, a way of conserving fodder in deep pits and
allowing it to ferment. This stops it from going rotten and preserves
its nutrients. These two developments underpinned the rise in economic power
of northern Europe during the Middle Ages to the detriment of the
countries of southern Europe.

So, farming techniques improved, at least in the sense that farmers
obtained higher production for the same effort. Farming had evolved in a
slow and steady way from its early roots and most of the basic
principles would have been familiar to a Sumerian from 5,000 years
earlier. During all this time, no one knew what was happening to the
nutrients in the plants and animals, but no one really thought about it
either. Farming kept people alive in an uncertain world, and survival was the goal.

Plant Chemicals

Research on poison gas in Germany during World War II led to the
discovery of another group of yet more powerful insecticides—the most
common being a compound called parathion. Some of these compounds were
“systemic”—that is, the plant absorbed them into its tissues and
became itself toxic to insects. Though low in cost, these compounds were
toxic to humans and other warm-blooded animals. These chemicals were designed to kill insects. However, there are
other nuisances that harm crop yields: fungi, weeds, worms, and
viruses. Attention was turned to developing fungicides, herbicides (to
kill weeds), and vermicides (to kill worms), with almost equal success.
Viruses cannot be attacked by chemicals, but they are transmitted from
plant to plant by insects, worms, and other bugs; by killing the bugs,
virus damage was controlled too. It seemed that almost any pest,
disease, or weed problem could be mastered by suitable chemical
treatment. Farmers foresaw a pest-free millennium. Crop losses were cut
sharply, locust attack was reduced to a manageable problem, and the new
chemicals, by dramatically improving food production, saved the lives of
millions of people. But problems began surfacing in the early 1950s. In many crops,
standard doses of DDT, parathion, and similar pesticides proved ineffective and had to be doubled or trebled: resistant strains of insects had developed. In addition, the powerful insecticides often destroyed helpful insects too. Resistant survivors soon produced worse
outbreaks of pests than there had been before the treatment. Soon, concern was expressed about pesticide residues. It was found
that many birds and wild mammals retained considerable quantities of DDT
in their bodies. Rachel Carson, in her 1962 book Silent Spring, sounded the alarm. Since that
time, agriculturalists have tried to find a middle way between the
well-tried traditional methods and the use of chemicals. Even so,
chemicals have become ever more sophisticated and widespread, and they
are not just restricted to controlling pests either. Fruit trees are
sprayed to heighten the color of the fruit; they are even treated with
hormones to get all the fruit ready for harvesting on a programmed day.
Residues in foods are strictly controlled, but there are always some
left on our plates. No one really knows the consequences of consuming
them over a lifetime, or their combined effects.

Plant Fertilizers

It took a while for anyone to work out why these materials had
their effect. Then, the brilliant English chemist Sir Humphry Davy, in
an 1820 treatise, explained just what these fertilizers were doing. They
were adding three bulk elements essential for plant growth: nitrogen,
potassium, and phosphorus. Deposits of phosphorus and potassium were
discovered in many parts of the world and their availability, even up to
the present day, is not a problem. Sources of nitrogen (as in saltpeter)
were scarce and its supply was not assured until, in 1909, the German
chemist Fritz Haber discovered how to make nitrogen fertilizer from the
nitrogen in the air. These three elements—nitrogen, potassium, and phosphorus—still form the basis of all bulk fertilizers.

Plants grow in soil that contains a vast range of chemicals and
they absorb them, even if they don’t need them. Over the years,
scientists have identified those other elements that are essential to a
healthy plant. They are needed in much smaller quantities (so they are
known as “trace elements”) and there are only about 14 of them. They
include elements like copper, zinc, manganese, and sulfur. With this
discovery, it was possible to grow plants without soil altogether, just
dangling their roots in nutrient-rich water. This system is known
variously as “hydroponics,” “nutriculture,” and “soil-less
culture.” A variant is used extensively in desert areas where plants
can be grown, under suitable cover, with their roots in gravel or sand.
Beautiful vegetables and fruit can be grown this way by just using these
basic nutrients. However, what makes a plant grow is not always
sufficient for animals and humans. We need those other trace elements that plants normally
absorb when they grow in soil, such as iron, chromium, and selenium,
even if the plants themselves do not need them.

Animal Husbandry

But cattlemen have gone one step further: corn is plentiful and
easily made into a concentrated feed, and it fattens cows fast. But corn
is not normal cow food at all—cows cannot digest it properly, and it
disrupts the working of their intestines. Their colons become overgrown
with bacteria, which in turn produce nasty toxins that get into the
carcass. Cattlemen even have a name for this phenomenon: “bloody
gut.” Ever cheaper sources of fodder were sought, however outlandish.
Even the last swillings from the slaughterhouse floor were collected,
dried, and pressed into cake as animal feed. In this way, we were
treated to the ultimate spectacle of dead cows being fed to live cows.
This practice allowed the incurable sickness bovine spongiform
encephalopathy (BSE; familiarly known as “mad cow disease”) to
spread in British herds and to fatally sicken many humans who ate the
beef.

But that is only the start. Chickens would normally lay only about
170 eggs per year. With clever feeding, suitable lighting, and other
stimulation, they now average 240 eggs per year. The ambition is to
increase this to 700 eggs per year by the addition of sex hormones to
speed up the chicken’s egg-production cycle. They are fed dyes to make
their yolks bright yellow, they are dusted with insecticides against
parasites, and they are fed antibiotics to stop them from getting sick in the
crowded conditions. Since 1993, dairy cows have been injected with the hormone known as
rbST to increase milk production by up to 25%. Antibiotics have
routinely been added to animal feed since the 1950s to increase growth
rate. All these measures are sanctioned by government authorities, chief
among them the U.S. Food and Drug Administration (FDA). But even this is
not enough for some: the competitive pressures to produce cheap meat are
so great that unscrupulous cattlemen inject their herds with illegal
substances, such as muscle-building steroids.

Mechanization

Through all these changes, the nature of the plants themselves was altered by selective breeding.
Combine harvesters, tomato reapers, or cotton pickers need plants that
grow in specific ways to work efficiently, so the plants were bred to be
more suitable for mechanical harvesting. In this way, mechanization
drove a trend to change plants for convenient handling. Many plants do
not lend themselves to mechanized production, so they were no longer
farmed. “A chicken in every pot and a car in every garage”—that was
the slogan used by Herbert Hoover in his 1928 presidential campaign. It
is hard to imagine that, for the average American in those times, it was
as rare to eat chicken as it was to own a car. Mechanization changed all
that for both chickens and cars. Animals such as hogs and chickens could
be kept in large sheds and reared in much more densely packed
conditions. Their products became much cheaper. By the 1930s, mechanization had brought about a major change: agriculture
flipped from being a labor-intensive industry to one that used few
farmhands but invested heavily in machines.

Plant Genetics

This kind of plant breeding has a long history, but that does not automatically make it an acceptable thing to do. The whole point is that plants have been changed for a variety of reasons, but none of them has to do with
nutritiousness. We just do not know what has been lost or gained in the
process. However, with plant breeding, at least scientists were working
with combinations of genes that could have occurred in nature.

Since the 1970s, scientists have been artificially manipulating
plant genes to achieve desired characteristics. Sometimes genes from a
quite different species, or even an animal, are introduced to modify the
plant genes. Their goals have been to make farming easier and cheaper by
improving yields, and by producing crops resistant to pests, drought,
salt, and weed-killers. A second objective is to make foods that
transport well, are easily packaged, and have a long shelf life. It is
an incredibly powerful technique that has few boundaries. A Supreme
Court decision in 1980 made genetically modified organisms (GMOs)
patentable, so there is a strong incentive for agri-business to focus on
GMO plants and to ignore conventional breeds. The momentum is so great
that it is like a runaway train hurtling into the darkness. No one knows
what will come of it, but one thing is clear—the train is rushing us
on an enforced journey away from our human origins.

FOOD PROCESSING, TRANSPORT, AND STORAGE

Wheat quickly goes rancid when made into flour. For this reason,
since time immemorial, bakers only milled their flour when they were
ready to use it. However, ingenious industrialists found that the
problem lay in the wheat germ. By the simple expedient of removing the
wheat germ as the grain was milled, flour would keep almost
indefinitely. Mechanization was brought to the traditional process of
grinding cereal grains into flour. For 10,000 years, this had been
achieved by grinding the grains between two stones. In the 19th century, that process changed.
Steel had arrived, and the temperamental millstones were replaced by banks of
steel cylinders rotating at high speed. These progressively ground the
grain down to ever finer particle sizes [4]. At every stage, there were sieves to separate the bits of outer
husk (bran) from the flour itself. The whole lot was driven by
steam-powered machinery—it was a tremendous advance in productivity.
This procedure continues right to the present day.

The Importance of Micronutrients

For example, there is the family of carotenoids, of
which there are over 600. They give the color to carrots, oranges,
tomatoes, and melons. There is the phenol family with over 5,000
members. They too are present in all fruits and vegetables, and
strongly present in tea, coffee, and wine. And there are the 7,000
terpene compounds, which are omnipresent in all plant foods,
particularly in spices and aromatic herbs. We must not forget the
thousands of bioflavonoids, yet another vast range of compounds that
are essential to health. We know that all these micronutrients, both classic
and background, are important to optimum health. We can’t define
exactly how all these compounds work, but we ignore their importance
at our peril.

The industrialization of milling and baking also changed the nature
of bread. Bakers liked the new “refined” flour. It was uniform in
size and free of bran and wheat germ, so bread-making became much more
predictable. It did not need human skill to ensure that the bread baked
properly every time, which meant that bread could be made on a
production line too. But both the bran and the wheat germ had been
stripped out of the bread. It was not until much later that scientists discovered that wheat germ is a powerhouse of important nutrients, including omega-3 oil, vitamin E, and choline (a nutrient usually grouped with the B vitamins). It was the
precious and fragile omega-3 oil that went rancid so quickly. In one stroke,
this processing deprived city populations of vital nutrients. As we
shall see, this had unexpected negative consequences for
the consumer. The first patents for canning food were issued as early as 1810 in
England; the United States soon followed. The technique involves sealing
the food in the can and then heating it to over 200°F. Most animal
foods can be preserved this way and a good many plant foods as well.
The pressure is always on to select variations of the food that
withstand this treatment best. Some foods, like milk and fruit juices,
are “pasteurized”—the food is heated very briefly to an elevated
temperature and then sealed into bottles. No one thought particularly
hard about what was happening. Heat, it was known, killed the harmful
bacteria that cause food to rot, so that was good. It was less
understood that heating also destroyed natural enzymes and many other
micronutrients. Systems of food transport became quicker and more reliable, so many
more food products were grown for export to the burgeoning cities. Thus,
plant varieties were chosen that survived transportation well. Bulk
storage systems improved with the development of refrigeration in the
1920s and of scientific techniques of “conditioning,” which sought
to slow or prevent spoilage by careful control of moisture and gases in
the silo. Plant varieties that stored well were favored.

THE FAST-FOOD INDUSTRY

An industry had sprung up to fulfill a need. Americans were earning
more money but they had less free time, and more and more women were
working. This “fast-food” industry, as it came to be known, provided
attractive, tasty, and cheap food—and you did not even have to stop
the car engine while collecting your order at the “drive-thru”
window. It is hard to imagine that in 1950, McDonald’s had just one
outlet (in San Bernardino, California). Today, they have over 11,000.
They were quickly followed by imitators such as Burger King (now over
6,000 outlets) and Wendy’s (now over 3,500 outlets). These
establishments specialized in a new type of food, the hamburger.
However, other enterprising food suppliers introduced different but new
foods to the scene: pizza and tacos, for example. Others reworked
traditional ideas: fried chicken and sub-sandwiches.

Almost always, the accompaniment was french fries and a soft drink. The soft drink industry had gotten going earlier, partly encouraged by
Prohibition in the 1920s. Even so, in 1940, the average American
consumed only about 6 ounces of soft drinks per week. By 2000, that had jumped more than 20-fold to a gallon a week (128 ounces).

These changes are radical. At the 1950s family meal, hamburger,
pizza, and tacos were either unknown or rarely served. Potatoes were not
often served in the form of french fries; soft drinks were absent. All
these changes have occurred just in living memory and we will later look
at the consequences of this dramatic shift in feeding habits.

FOOD SAFETY

As the food supply was industrialized during the 19th century, more and more foods
were processed and packaged for sale. Unscrupulous merchants adulterated
their products with cheaper and sometimes harmful substances, labels
were deliberately misleading, inferior food was fancied up with dyes and
artificial flavors, and food was preserved with noxious chemicals. In
the United States, Abraham Lincoln set up the U.S. Department of
Agriculture (USDA) with a division called the Bureau of Chemistry to
look into such matters. Following his investigation of food adulteration, in 1880 the chief
chemist Peter Collier recommended a national food and drug law. The bill
was defeated, and this was a portent of battles ahead. Why would Congress refuse to legislate for food purity? Only because powerful forces opposed such measures: the nation’s food supply is the subject of a titanic battle between the food industry and the governments that try to regulate it for the public good.

In 1883, Dr. Harvey Wiley became chief chemist and took up the
battle. He expanded the Bureau of Chemistry’s food adulteration
studies and campaigned for a federal law. He was so vigorous and
forthright that he was called the “Crusading Chemist.” Finally, a law
was passed in 1906. The pure food regulations were scientifically sound,
thorough, and gave strong powers to the Bureau of Chemistry to enforce
them. Had these regulations been allowed to remain in place, they would
have made America one of the healthiest places in the world to eat, but
trouble was already on the horizon.

On signing the regulations, the Secretary of the Treasury
complained that they were too severe on the food industries. The
fishermen of Massachusetts wanted to keep borax; the dried fruit
industry of California wanted to use sulfur dioxide; ketchup interests
begged for benzoic acid. Very quickly, industry forces set about
undermining the Bureau of Chemistry. The Secretary of Agriculture, James Wilson, was persuaded to set up a board, under
chairman Ira Remsen, to protect the manufacturers. This “Remsen
Board” started making its own recommendations to Wilson, who often
upheld them, over the head of the Bureau of Chemistry. The Crusading
Chemist’s success was short-lived. In Dr. Wiley’s words, “The food
and drugs law became a hopeless paralytic.” In spite of the restrictions and difficulties, Dr. Wiley battled
on. He tried to stop the bleaching of flour, which often uses chlorine
dioxide, a chemical akin to household bleach. In a test case, the Bureau
of Chemistry sued the Lexington Mill and Elevator Company for
contaminating flour with nitrogen peroxide, another bleaching chemical.
The case took almost 10 years to complete as it went all the way to the
United States Supreme Court. The Supreme Court finally ruled against
bleached flour in 1919. But mysteriously, the USDA wrote the application
guidelines in such a way as to make the ruling easily circumvented.
Bleached flour has never been removed from commerce in the U.S. to this
day. In contrast, bleached flour has been banned in many European countries.

Dr. Wiley took on the Coca-Cola company for dispensing its wares
without disclosing the ingredients. The two sides fought to a
standstill. Coca-Cola made a concession: it removed cocaine from the
formula. However, it retained the right to keep some of the
ingredients secret from the public. In 1912, Dr. Wiley resigned in
disgust and wrote: “The makers of unfit foods have taken possession of
Food and Drug enforcement, and have reversed the effect of the law,
protecting the criminals that adulterate food, instead of protecting the
public health” [6].

The Bureau of Chemistry nevertheless continued to chip away at its task, and Congress passed a series of piecemeal laws: for example, requiring that food packages be “plainly and
conspicuously marked on the outside of the package in terms of weight,
measure, or numerical count” and banning labels that “may mislead or
deceive.” In 1930, the Bureau of Chemistry was restructured into the
Food and Drug Administration (FDA). Little by little, the law was
clarified on a number of fronts. In 1958, manufacturers of new food
additives were required to establish safety, and in 1960 the
manufacturers of new color additives were required to do likewise. Even
these gains are not quite what Dr. Wiley had in mind—he wanted food to
be free of additives altogether. That is the situation today. Manufacturers can make up a confection
of anything they like, so long as no one ingredient is “harmful.” In
all these matters, there is a further weakness: the FDA relies on the
manufacturer’s own laboratory tests to prove safety. The confection
can be totally without food value. Indeed, food can now officially be
adulterated so long as it is declared on the label! In this way, hot
chocolate, for example, has a poor cocoa content but a high level of
cheap fillers, artificial colors, and flavors. Still, manufacturers have
to be careful about health claims and they must declare somewhere on a
label what is in the food.

Both in its original incarnation as the Bureau of Chemistry and under its
present banner, the FDA is supposed to be a guardian of the public
interest. However, in many respects it gives the impression of being a
watchdog that is conspiring with the burglars. This may be a somewhat
unfair characterization, but the reality is that the FDA has to work in
a highly political environment. The general public cannot therefore rely
on the FDA’s protection from many of the dubious practices carried out
by the food industry. We have to do that for ourselves.

GOVERNMENT EATING GUIDELINES

Scurvy
Scurvy is a disease that has been known from ancient
times, although it was rare. It became common among early Europeans
who had to endure long winters in places like central Canada and among
sailors on long sea voyages. Scurvy’s symptoms are swollen and
bleeding gums, loosened teeth, sore joints, bleeding under the skin,
slow wound healing, and anemia. If not treated, it results in death. We now know that scurvy is caused by a deficiency of vitamin C.

Also in the 18th century, it was found that rickets, a bone disease common in poor
parts of cities, could be cured by the consumption of cod liver oil. We
now know that rickets is caused by a deficiency of vitamin D.

Pellagra
is a disease that used to be common in the southern states of the U.S.
where poorer people lived almost entirely on corn. In 1937, it was
discovered that pellagra is caused by a deficiency of tryptophan, an essential amino acid that is scarce in corn. It can easily be cured by
eating small amounts of protein-rich foods.

The Japanese Navy used to lose 50% of its seamen to beriberi. They
were eating a diet of polished white rice and not much else. In the
1870s, the Japanese reported that they could cure beriberi by feeding
their sailors with some extra rations of vegetables and fish. We now
know that beriberi is a disease caused by a deficiency of vitamin B1 (thiamine).

The list goes on, but the message is simple: for the last 250
years, more and more diseases have been linked to nutritional
deficiencies. Governmental authorities have learned the lesson from this
and, in a bid to improve the health of their populations, started to advise them how to eat. One can
imagine the early messages: “eat citrus fruit to avoid scurvy” and
“eat beans (which are protein-rich) with your corn to avoid
pellagra.” There were early attempts to smuggle essential nutrients
into the food supply by “enriching” some foods: vitamin D was added
to butter and calcium was added to flour.

Magic Bullet Mirage

An unfortunate side effect of these discoveries was
the encouragement of the notion of a “magic bullet”—that is, one
simple cure for one simple disease. As we shall see, this is too
simplistic. Most of our modern diseases are due to a complex
interaction of many factors that are going wrong at the same time.

For over a century, the U.S. government has been interested in
helping Americans to choose a healthy diet. The agency charged with this
responsibility is the USDA. As early as 1894, the USDA developed the
first food composition tables and dietary recommendations. However, they
found quite quickly that, to communicate the ideas to ordinary folk,
they needed to group the various foods into categories. Then, they could
give recommendations for each category. This gave rise to the concept of “food groups.” These food
groups have become entrenched, in various forms, in the way we think
about our diets. For this reason, and because we will be using this
concept as we move through the book, we will look at the story of food
groups and how to interpret them.

The History of Food Groups

It might be thought that this was all very clear. However, in 1942, the
USDA issued a new food guide that reduced the number of food groups to
what they called the “Basic Seven.” These were: green and yellow
vegetables; oranges, tomatoes, and grapefruit; potatoes and other
vegetables and fruit; milk and milk products; meat, poultry, fish, eggs,
and dried peas and beans; bread, flour, and cereals; and butter and
fortified margarine.

It is interesting to see what has changed. “Potatoes and sweet
potatoes” have been lumped in with “other vegetables and fruit.”
“Eggs” and “dry beans, peas, and nuts” are lumped in with
“meat, poultry, and fish.” The “butter” group has been expanded
to “butter and fortified margarine.” The word lean has been dropped from the
category “lean meat.” The “other fats” group and the
“sugars” group have disappeared entirely. This does not look like a move in the right direction, but worse is
to come. In 1956, the “Basic Seven” groups were condensed to the
“Basic Four”: milk and milk products; meat, fish, poultry, eggs, dry
beans, and nuts; fruits and vegetables; and grains. This time the
“green and yellow vegetables” group has disappeared. “Butter and fortified margarine” has been dropped. “Oranges,
tomatoes, and grapefruit” are lumped into the catch-all category
“fruits and vegetables.” This took the simplification too far. In
1979, the USDA issued the “Hassle Free Guide to a Better Diet.” This
added a fifth group—“fats, sweets, and alcohol”—to the Basic
Four. The guide recommended moderation in the use of the fifth group and
also mentioned calories and dietary fiber for the first time. Finally, in 1980, the USDA released the first Dietary Guidelines
for Americans. The only change was to split the “fruit and
vegetable” group into separate groups. So, now we are up to six
groups. The USDA, for ease of reference, condensed their designations
to: the Grains group, the Vegetables group, the Fruit group, the Milk
(Dairy) group, the Meat and Beans group, and the Fats, Oils, and Sweets
group.

Six Food Groups (USDA Categorization of 1980–2004):
Grains; Vegetables; Fruit; Milk (Dairy); Meat and Beans; Fats, Oils, and Sweets.

Since that time, the USDA has issued revisions to its Dietary Guidelines every five years, but the food group classification has remained broadly the same. More recently, the USDA has also singled out “salt and sodium” and recommended moderation. Finally, in 2005, the USDA issued the following redefinition of the food groups:

Six Food Groups (USDA Categorization 2005):
Grains; Vegetables; Fruit; Milk (Dairy); Meat and Beans; Oils.

The only food group that the USDA changed is the “Fats, Oils, and
Sweets” group—it is now just the “Oils” group. Where have fat
and sugar gone? The USDA has created a new concept: that of optional
treats. If your daily intake of calories from the conventional food groups leaves a shortfall, you can top up with sugars and fats. By removing
sugar and fat from food groups altogether, the USDA is placating the
sugar, snack-food, soft drink, and confectionary lobbies, and it is also
an attempt to feed consumers’ weakness for pleasurable and comfort foods.

In 1992, to give a pictorial presentation to the Dietary
Guidelines, the USDA introduced the Food Guide Pyramid. This is a neat
way of showing the priority to be given to each group as well as depicting the food groups
themselves. However, as we shall see later, there are serious flaws both
in the groupings and the priorities. So, there is nothing special about the way our food supply is
categorized today. Other categorizations have been used in the past and
every few years the USDA reviews and makes changes to them. Most
Americans will be familiar from their school days with the idea of food
groups. However, depending on just what year they went to school, the
food groups were different. No wonder people are confused. Why are the contents of the food groups shuffled around so much?
One of the reasons has to do with pressure groups. The sugar lobby did
not like being singled out, so they got sugar dropped entirely in 1942.
Only in 1980, and against bitter opposition, did the USDA get sugar
mentioned again, but only as an afterthought in the “Fats, Oils, and
Sweets” group. Likewise, butter and margarine were quietly merged into
the same group. For similar reasons, the potato lobby got their product
dropped as a food group in 1942; the potato and its french fry variant
have remained submerged in the “Vegetable” group ever since. For this reason, in the minds of most Americans, a french
fry has just the same value as a tomato. As for “lean meat” and
“green and yellow vegetables,” they were leveled down and airbrushed
out of special mention.

The Bond Effect Food Groups

By the term vegetable, the USDA means any plant food that is not a fruit, grain, nut, or
legume. Even after excluding these categories of plant food, the term vegetable covers a wide range of plant
types. For reasons that will become clear later, we will divide the
“Vegetable” group in two: “Vegetables (Starchy)” and
“Vegetables (Non-Starchy).”

The foods in one USDA group, “Meat, Fish, Poultry, Dry Beans, Nuts and Eggs,” seem to have been lumped together because they are, on the
whole, protein-rich foods. However, not all protein-rich foods (for
example, cheese) are included and some protein-poor foods (for example,
chestnuts) make the list. This USDA food group is just too incoherent
for our purposes. There are significant differences among these items,
so we will break down this group into three major classes. One is
protein-rich foods of animal origin: “Meat, Poultry, Eggs, and
Fish.” The other two are protein-rich foods of plant origin: “Dry
Beans and Peas (or Legumes)” and “Nuts.” By making these adjustments to the familiar USDA food groups, we
will be able to highlight in a more precise way how foods conform to, or
diverge from, the Savanna Model. As we proceed through the book, it
will be necessary to make even more subtle distinctions, but for now,
this breakdown will serve our purposes. Our modified groupings then are:
Grains; Vegetables (Starchy); Vegetables (Non-Starchy); Fruits; Milk and
Dairy; Meat, Fish, Eggs, and Poultry; Dry Beans; Nuts; Fats and Oils;
Sugars and Sweeteners; Salt and Sodium; and Beverages. Table 2.1 shows how the modified groups compare to the current USDA groups.

Dietary Guidelines for Americans

The categorization of the
nation’s food supply into food groups is the first of a two-stage
process. The second, and more important, stage is recommending to Americans how many servings of each food group they should be consuming every day. These recommendations are embodied in the
impressive-sounding “Dietary Guidelines for Americans.” First
instituted in 1980, they are revised every five years, on the decade and
on the half decade.

Table 2.1 Comparison of Food Groups
Every five years, we are treated to the spectacle of a new round of
negotiations for an agreed text to put in the Dietary Guidelines. It is
not edifying: each interest group brings the maximum of financial and
political pressure to bear. Regrettably, in the mêlée, the
scientists’ impartial advice is mostly watered down or abandoned. In
other words, the USDA’s Dietary Guidelines are not a gold
standard—on the contrary, they are a weak and deceitful compromise
between all the competing interests. This is a major cause for concern. In spite of their debased
nature, these recommendations are then taught to our children in schools
and used to design meals in hospitals, schools, prisons, and retirement
homes. Worse, these recommendations become the dogma in which
professional dieticians and nutritionists are trained. The conventional
platitudes for healthy eating have become as sincere as a harlot’s
kiss. Integrity has abandoned the field, leaving it wide open to all kinds of alternative dietary nostrums. Most of them are
questionable, some are plausible, but none of them gets to the
fundamentals. They cannot, for they do not know the truth about our
human heritage. The whole point of this book is to provide those
fundamentals and to do so in an honest and coherent way.

ADRIFT FROM THE IDEAL

In the 1940s and 1950s, the Americans led the way in intensifying
agriculture. Now, we have found scientific ways of altering foods for
all kinds of reasons. When the supply of untouched fertile soils runs
out, we find ways of pressing exhausted soils back into service using
fertilizers, pesticides, and other chemicals. We are capable of
producing what look like real plants but which simply do not contain the
same nutrients. We can mass-produce animals on a production line system
and their flesh finds its way onto our plates—injected with hormones,
fed with antibiotics, and dusted with insecticide. Foods are processed
and refined in ever more sophisticated ways. Artificial dyes, fillers,
preservatives, colorings, flavors, and odors are routinely used in
manufactured foods. Foods are routinely adulterated with cheap,
nutrition-free fillers and extenders. We have seen how government
agencies try to hold the line, but their efforts are subverted by
political and financial pressures. They cannot be relied upon to protect
the public interest. All this sounds alarming, but which factors are of primary
importance and which of secondary importance? In this chapter, we have
reviewed familiar territory concerning the intensification of
agriculture; but the main theme has focused on the idea that many of
these foods are new to the human diet anyway. Perhaps these foods pose problems in themselves: just maybe, no matter how pure or sullied they are, they need to be treated with caution. In the next
chapter, we will look more closely at the origins of the foods commonly
available today and examine the consequences of consuming them.