Wednesday, May 21, 2008
As I look at the increasingly bizarre world around us, two new words seem appropriate: nutricide, the killing of people by serving them junk foods; and pharmacide, the killing of people by over-prescribing pharmaceutical drugs.
Read the label of almost any supermarket food sold in a box, can, jar, bottle, or bag, and you’ll find far too many ingredients that serve large-scale processing but are slowly lethal to the consumers ingesting them. I’m referring to various sugars, trans fats, interesterified fats, excess salt, and soybean oil as the top offenders. The fast-food companies are no better, serving up their breaded, deep-fried, hydrogenated goodies.
The folks at McDonald’s, KFC, Coca-Cola, Kraft, Campbell, and so many other makers of processed food are killing people by way of obesity, diabetes, and heart disease. They’re guilty of nutricide.
Unfortunately, some so-called health food and natural food products aren’t much better. I recently attended the Natural Foods Expo show in Anaheim, California, where more than 3,000 companies showed their products to over 50,000 visitors. Many of the products were great – meats from organically raised animals, organically grown produce, and even relatively healthy snack foods.
But there were also too many attempts at feel-good knockoffs of conventional junk foods: “health food” soft drinks with as much sugar as a Pepsi, energy bars with as much sugar as a Snickers, chocolate soy milk with more sugar and calories than regular chocolate milk, ad nauseam (a term that seems particularly appropriate). And people wonder why two-thirds of Americans are overweight.
Meanwhile, the pharmaceutical industry has cooked up its own solution to the ills caused by all these unhealthy foods: heavily advertised drug panaceas that, in most cases, have side effects worse than the diseases themselves. Each year, more than 700,000 people are hospitalized because of adverse reactions to drugs, and more than 100,000 people in hospitals die from their medications – all of which are approved for use by the Food and Drug Administration. They’re all guilty of pharmacide.
Abram Hoffer, MD, PhD, a pioneer in nutritional medicine, recently pointed to the debate over whether antidepressant drugs are really better than placebos. “This is a phony debate, almost like trying to figure out how many angels are dancing on the head of a pin,” he told me. “Even if the drugs are 10 percent better, they are so much more toxic than any placebo that a placebo should be preferred.”
Thursday, May 8, 2008
Redefining the Meaning of Nutritional Deficiencies
Many of us were taught that vitamin deficiencies were horrible diseases such as scurvy, beriberi, and pellagra – each often characterized by the body literally falling apart. Relatively common 100 years ago, these diseases are now considered rare.
But it is a mistake to consider these "classic" deficiency diseases the first sign of compromised nutrition. Rather, these diseases consist of the final burst of symptoms – what some people in medicine refer to as "total system failure" – before death.
It is equally foolish to believe that nutritional deficiencies are rare today. The signs of marginal nutritional intake or early deficiency are often difficult to recognize, in large part because the symptoms can be vague and because health-care professionals simply don't bother investigating them.
For example, a vitamin C-deprivation study found that the first signs of deficiency were not those of scurvy, but rather irritability and fatigue – two extremely common symptoms. That should not come as a surprise because 30 to 48 percent of Americans do not consume the "recommended" amounts of vitamin C, indicating that their nutritional status is marginal at best.
Studies show similar patterns with other nutrients. A study of magnesium intake in the elderly found that one-fourth of subjects did not consume the officially recommended daily amounts. Similarly, 93 percent of Americans do not consume the recommended levels of vitamin E. In another study, researchers reported that 98 percent of patients hospitalized for hip fractures were either deficient or had marginal blood levels of vitamin D.
Vitamins and minerals play direct or indirect roles in the thousands of biochemical reactions that occur in our bodies every second of the day. Without them, these chemical processes become sluggish or cease altogether. The situation is analogous to baking bread: without yeast, the dough simply does not rise.
Yet the average person, subsisting on fast foods and convenience foods (instead of fresh wholesome foods), most likely has a marginal intake of many vitamins and minerals. As chemical reactions slow down, any number of symptoms are likely to emerge. The situation is further complicated by the use of pharmaceutical medications, nearly all of which interfere with nutrient absorption or utilization.
It makes no sense to wait until the symptoms of nutritional deficiencies become fulminant. It's far more fascinating and exciting to discover how nutritional deficiencies and imbalances can cause a wide variety of otherwise inexplicable symptoms.