OVERFED, YET STARVING

A human being should be able to change a diaper, plan an
invasion, butcher a hog, conn a ship, design a building,
write a sonnet, balance accounts, build a wall, set a bone,
comfort the dying, take orders, give orders, cooperate, act
alone, solve equations, analyze a new problem, pitch
manure, program a computer, cook a tasty meal, fight
efficiently, die gallantly. Specialization is for insects.
–ROBERT A. HEINLEIN

Let us think back to a time before food delivery apps and
diet gurus, when “Trader Joe” was the guy guarding the
only salt lick in a hundred-mile radius and “biohacking”
was something you did to a fresh kill with a sharpened
stone. Government diet recommendations (or governments,
for that matter) wouldn’t arrive on the scene for millennia,
so you’d have to make do, as your ancestors did, with
intuition and availability. As a forager, your diet would
consist of a diverse array of land animals, fish, vegetables,
and wild fruits. The chief calorie contributor would by and
large be fat, followed by protein.1 You might consume a
modest amount of starch in the form of fiber-rich tubers,
nuts, and seeds, but concentrated sources of digestible
carbohydrate would be scarce, if you had access to them at all.
Wild fruits, the only sweet food available to ancestral
you, looked and tasted much different from the
domesticated fruits that would line supermarket shelves eons
later. You likely wouldn’t even recognize them when placed
next to their contemporary counterparts, a contrast almost as
stark as a Maltese lapdog standing next to its original
ancestor, the gray wolf. These early fruits would be small,
taste a fraction as sweet, and be available only seasonally.
Then, approximately ten thousand years ago, a hairpin
turn in human evolution occurred. In the blink of an eye,
you went from a roaming tribal forager subject to the whims
of season to a settler with planted crops and farmed animals.
The invention of agriculture brought to your family—and
the rest of humanity—what was a previously inconceivable
notion: the ability to produce a surplus of food beyond the
immediate needs of daily subsistence. This was one of the
major “singularities” of human existence—a paradigm shift
marking a point-of-no-return entry into a new reality. And in
that new reality, though we procured quantities of foods that
would feed many people cheaply and fuel global population
growth, individual health took a downward turn.

For hundreds of thousands of years prior, the human diet
was rich in an array of nutrients spanning diverse climes,
but this micronutrient and geographic diversity disappeared
when every meal became based on the handful of plant and
animal species that we were able to cultivate. Starvation was
less of an immediate threat, but we became slaves to single
crops, making nutrient deficiencies more prevalent. The
dramatic increase in the availability of starch and sugar
(from wheat and corn, for example) brought tooth decay,
obesity, a loss of height, and a decrease in bone density. By
domesticating animals and crops, we inadvertently
domesticated ourselves.
The advent of agriculture fed a vicious spiral of
behavioral demands that changed the very nature of our
brains. A hunter-gatherer had to be self-sufficient, but the
post-agriculture world favored specialization: someone to
plant the wheat, someone to pick it, someone to mill it,
someone to cook it, someone to sell it. While this process of
hyper-specialization eventually led to the Industrial
Revolution and all its conveniences like iPhones, Costco, and
the Internet, these modern trappings came with a flip side.
Fitting an ancient brain into a modern environment may be
like fitting a square peg into a round hole, as evidenced by
the millions of Americans on antidepressants, stimulants,
and drugs of abuse. A person with ADHD, whose brain
thrives on novelty and exploration, may have been the
ultimate hunter-gatherer—but today this person struggles
with a job that requires repetition and routine (the authors
can—ahem—relate).

The confluence of this dietary shift and the offloading of
our cognitive duties onto an increasingly specialized society
caused our brains to lose the volumetric equivalent of a
tennis ball in a mere ten thousand years. Our
ancestors from five hundred generations ago would have
lamented our restrictive existences, and then apologized to
us for engineering our cognitive demise. Forget about
leaving the next generation with lower standards of living,
student debt, or environmental destruction—our ancestors were so successful that they left us with smaller brains.
We didn’t know it at the time, but in one fell swoop we
had turned our backs on the diet and lifestyle that created
the human brain, and adopted one that shrank it.

Energy Dense, Nutrient Poor

Given the obesity epidemic and the amount of food
Americans and others around the globe routinely throw
away (even slightly misshapen fresh vegetables get tossed
out so your supermarket-going experience is as aesthetically
pleasing as possible), it may surprise you to know that our
bodies are still somehow . . . starving.
Have you ever wondered why so many packaged goods
now have to be “fortified” with vitamins? There are more
than fifty thousand edible plant species around the world—
plants that provide a bevy of unique and beneficial nutrients
that we consumed as foragers. And yet today, our diets are
dominated by three crops: wheat, rice, and corn, which
together account for 60 percent of the world’s calorie intake.
These grains provide a source of cheap energy, but are
relatively low in nutrition. Adding in a few cents’ worth of
(usually synthetic) vitamins is the dietary equivalent of
putting lipstick on a pig.

MICRONUTRIENTS GONE MIA

Potassium: supports healthy blood pressure and nerve signals
B vitamins: support gene expression and nerve insulation
Vitamin E: protects fatty structures (like brain cells) against inflammation
Vitamin K2: keeps calcium out of soft tissues like skin and arteries, and in bones and teeth
Magnesium: creates energy and facilitates DNA repair
Vitamin D: anti-inflammatory; supports a healthy immune system
Selenium: creates thyroid hormones and prevents mercury toxicity

The above list covers only some of the essential nutrients
lost to the modern diet. In total, there are roughly forty
minerals, vitamins, and other chemicals that have been
identified as essential to our physiology and are readily contained in the whole foods we’re not eating.2 As a result,
90 percent of Americans now fall short of obtaining
adequate amounts of at least one vitamin or mineral.3
To complicate matters, nutrient intake guidelines are set
only to avert population deficiencies. This means that even
when we check all the institutionally recommended boxes,
we may still be handicapping our bodies in serious ways.
The recommended dietary allowance (RDA) of vitamin D, for
example, is meant only to prevent rickets. But vitamin D
(generated when our skin is exposed to the sun’s UVB rays)
is a steroid hormone that affects the functioning of nearly
one thousand genes in the body, many involved in
inflammation, aging, and cognitive function.

In fact, a recent University of Edinburgh analysis found low vitamin
D to be a top driver of dementia incidence among
environmental risk factors.4 (Some researchers have argued
that the RDA for vitamin D should be at least ten times
higher than it currently is for optimal health.)5
When our bodies sense low nutrient availability, what’s
available will generally be used in processes that ensure our
short-term survival, while long-term health takes a back
seat. That’s the theory initially proposed by noted aging
researcher Bruce Ames. Dubbed the “triage theory” of
aging, it’s sort of like how a government may choose to
ration food and fuel during wartime.
In such cases, more
immediate needs such as food and shelter might take
priority, whereas public education would become a casualty.
In the case of our bodies, loftier repair projects can become
an afterthought to basic survival, all while pro-inflammatory
processes run amok.
The downstream effects of magnesium deficiency may
be the perfect example of such reprioritization. This is
because magnesium is a mineral required by more than
three hundred enzymatic reactions in the body with duties
ranging from energy creation to DNA repair. If it is
constantly shuttled into short-term needs, DNA repair takes
a back seat. This effect is almost certainly magnified when
we consider that nearly 50 percent of the population doesn't
consume adequate amounts of magnesium, a shortfall second
in prevalence only to vitamin D deficiency. And yet magnesium
is easily found at the center of chlorophyll, the
light-harvesting molecule that gives dark leafy greens their color.6
Research has confirmed that the inflammation wrought by
nutrient scarcity is strongly linked with accelerated brain
aging and impaired cognitive function.7 Robert Sapolsky,
author of Why Zebras Don’t Get Ulcers, may have said it
best when describing the similar reshuffling of priorities that
occurs during stress: the body holds off on long-term
projects until it knows that there will be a long term. After
all, the major consequences of damaged DNA—a tumor, for
example, or dementia—won’t get in your way for years,
decades even . . . but we need energy today.
