Thursday 31 July 2014

The Tokelau Island Migrant Study: The Final Word

Over the course of the last month, I've outlined some of the major findings of the Tokelau Island Migrant study. It's one of the most comprehensive studies I've found of a traditional culture transitioning to a modern diet and lifestyle. It traces the health of the inhabitants of the Pacific island Tokelau over time, as well as the health of Tokelauan migrants to New Zealand.

Unfortunately, the study began after the introduction of modern foods. We will never know for sure what Tokelauan health was like when their diet was completely traditional. To get some idea, we have to look at other traditional Pacific islanders such as the Kitavans.

What we can say is that an increase in the consumption of modern foods on Tokelau, chiefly white wheat flour and refined sugar, correlated with an increase in several non-communicable disorders, including overweight, diabetes and severe tooth decay. Further modernization as Tokelauans migrated to New Zealand corresponded with an increase in nearly every disorder measured, including heart disease, weight gain, diabetes, asthma and gout. These are all "diseases of civilization", which are not observed in hunter-gatherers and certain non-industrial populations throughout the world.

One of the most interesting things about Tokelauans is their extreme saturated fat intake, 40-50% of calories. That's more than any other population I'm aware of. Yet Tokelauans appear to have a low incidence of heart attacks, lower than their New Zealand-dwelling relatives who eat half as much saturated fat. This should not be buried in the scientific literature; it should be common knowledge.

Overall, I believe the Tokelau Island Migrant study (among others) shows us that partially replacing nourishing traditional foods with modern foods such as processed wheat and sugar is enough to cause a broad range of disorders not seen in hunter-gatherers but typical of modern societies. Changes in lifestyle between Tokelau and New Zealand may have also played a role.
The Tokelau Island Migrant Study: Background and Overview
The Tokelau Island Migrant Study: Dental Health
The Tokelau Island Migrant Study: Cholesterol and Cardiovascular Health
The Tokelau Island Migrant Study: Weight Gain
The Tokelau Island Migrant Study: Diabetes
The Tokelau Island Migrant Study: Asthma

Vitamin K2 and Cranial Development

One of the things Dr. Weston Price noticed about healthy traditional cultures worldwide is their characteristically broad faces, broad dental arches and wide nostrils. Due to the breadth of their dental arches, they invariably had straight teeth and enough room for wisdom teeth. As soon as these same groups adopted white flour and sugar, the next generation to be born grew up with narrow faces, narrow dental arches, crowded teeth, pinched nostrils and a characteristic underdevelopment of the middle third of the face.

Here's an excerpt from Nutrition and Physical Degeneration, about traditional and modernized Swiss groups. Keep in mind these are Europeans we're talking about (although he found the same thing in all the races he studied):

The reader will scarcely believe it possible that such marked differences in facial form, in the shape of the dental arches, and in the health condition of the teeth as are to be noted when passing from the highly modernized lower valleys and plains country in Switzerland to the isolated high valleys can exist. Fig. 3 shows four girls with typically broad dental arches and regular arrangement of the teeth. They have been born and raised in the Loetschental Valley or other isolated valleys of Switzerland which provide the excellent nutrition that we have been reviewing.
Another change that is seen in passing from the isolated groups with their more nearly normal facial developments, to the groups of the lower valleys, is the marked irregularity of the teeth with narrowing of the arches and other facial features... While in the isolated groups not a single case of a typical mouth breather was found, many were seen among the children of the lower-plains group. The children studied were from ten to sixteen years of age.
Price attributed this physical change to a lack of minerals and the fat-soluble vitamins necessary to make good use of them: vitamin A, vitamin D and what he called "activator X"-- now known to be vitamin K2 MK-4. The healthy cultures he studied all had an adequate source of vitamin K2, but many ate very little K1 (which comes mostly from vegetables). Inhabitants of the Loetschental valley ate green vegetables only in summer, due to the valley's harsh climate. The rest of the year, the diet was limited chiefly to whole grain sourdough rye bread and pastured dairy products.

The dietary transitions Price observed were typically from mineral- and vitamin-rich whole foods to refined modern foods, predominantly white flour and sugar. The villagers of the Loetschental valley obtained their fat-soluble vitamins from pastured dairy, which is particularly rich in vitamin K2 MK-4.

In a modern society like the U.S., most people exhibit signs of poor cranial development. How many people do you know with perfectly straight teeth who never required braces? How many people do you know whose wisdom teeth erupted normally?

The archaeological record shows that our hunter-gatherer ancestors generally didn't have crooked teeth. Humans evolved to have dental arches in proportion to their tooth size, like all animals. Take a look at these chompers. That skull is from an archaeological site in the Sahara desert that predates agriculture in the region. Those beautiful teeth are typical of paleolithic humans and modern hunter-gatherers. Crooked teeth and impacted wisdom teeth are only as old as agriculture. However, Price found that with care, certain traditional cultures were able to build well-formed skulls on an agricultural diet.

So was Price on to something, or was he just cherry picking individuals that supported his hypothesis? It turns out there's a developmental syndrome in the literature that might shed some light on this. It's called Binder's syndrome. Here's a description from a review paper about Binder's syndrome (emphasis mine):

The essential features of maxillo-nasal dysplasia were initially described by Noyes in 1939, although it was Binder who first defined it as a distinct clinical syndrome. He reported on three cases and recorded six specific characteristics:5
  • Arhinoid face.
  • Abnormal position of nasal bones.
  • Inter-maxillary hypoplasia with associated malocclusion.
  • Reduced or absent anterior nasal spine.
  • Atrophy of nasal mucosa.
  • Absence of frontal sinus (not obligatory).
Individuals with Binder's syndrome have a characteristic appearance that is easily recognizable.6 The mid-face profile is hypoplastic, the nose is flattened, the upper lip is convex with a broad philtrum, the nostrils are typically crescent or semi-lunar in shape due to the short columella, and a deep fold or fossa occurs between the upper lip and the nose, resulting in an acute nasolabial angle.
Allow me to translate: in Binder's patients, the middle third of the face is underdeveloped, they have narrow dental arches and crowded teeth, small nostrils and abnormally small sinuses (sometimes resulting in mouth breathing). Sound familiar? So what causes Binder's syndrome? I'll give you a hint: it can be caused by prenatal exposure to warfarin (coumadin).

Warfarin is rat poison. It kills rats by causing them to lose their ability to form blood clots, resulting in massive hemorrhage. It does this by depleting vitamin K, which is necessary for the proper functioning of blood clotting factors. It's used (in small doses) in humans to thin the blood as a treatment for abnormal blood clots. As it turns out, Binder's syndrome can be caused by
a number of things that interfere with vitamin K metabolism. The sensitive period for humans is the first trimester. I think we're getting warmer...

Another name for Binder's syndrome is "warfarin embryopathy". There happens to be
a rat model of it. Dr. Bill Webster's group at the University of Sydney injected rats daily with warfarin for up to 12 weeks, beginning on the day they were born (rats have a different developmental timeline than humans). They also administered large doses of vitamin K1 along with it, to ensure the rats continued to clot normally rather than hemorrhaging. Another notable property of warfarin that I've mentioned before is its ability to inhibit the conversion of vitamin K1 to vitamin K2 MK-4. Here's what they had to say about the rats:

The warfarin-treated rats developed a marked maxillonasal hypoplasia associated with an 11-13% reduction in the length of the nasal bones compared with controls... It is proposed that (1) the facial features of the human warfarin embryopathy are caused by reduced growth of the embryonic nasal septum, and (2) the septal growth retardation occurs because the warfarin-induced extrahepatic vitamin K deficiency prevents the normal formation of the vitamin K-dependent matrix gla protein in the embryo.
"Maxillonasal hypoplasia" means underdevelopment of the jaws and nasal region. Proper development of this region requires fully active matrix gla protein (MGP), which I've written about before in the context of vascular calcification. MGP requires vitamin K to activate it, and it seems to prefer K2 MK-4 to K1, at least in the vasculature. Administering K2 MK-4 along with warfarin prevents warfarin's ability to cause arterial calcification (thought to be an MGP-dependent mechanism), whereas administering K1 does not.
Here are a few quotes from a review paper by Dr. Webster's group. I have to post the whole abstract because it's a gem:

The normal vitamin K status of the human embryo appears to be close to deficiency [I would argue in most cases the embryo is actually deficient, as are most adults in industrial societies]. Maternal dietary deficiency or use of a number of therapeutic drugs during pregnancy, may result in frank vitamin K deficiency in the embryo. First trimester deficiency results in maxillonasal hypoplasia in the neonate with subsequent facial and orthodontic implications. A rat model of the vitamin K deficiency embryopathy shows that the facial dysmorphology is preceded by uncontrolled calcification in the normally uncalcified nasal septal cartilage, and decreased longitudinal growth of the cartilage, resulting in maxillonasal hypoplasia. The developing septal cartilage is normally rich in the vitamin K-dependent protein matrix gla protein (MGP). It is proposed that functional MGP is necessary to maintain growing cartilage in a non-calcified state. Developing teeth contain both MGP and a second vitamin K-dependent protein, bone gla protein (BGP). It has been postulated that these proteins have a functional role in tooth mineralization. As yet this function has not been established and abnormalities in tooth formation have not been observed under conditions where BGP and MGP should be formed in a non-functional form.
Could vitamin K insufficiency be related to underdeveloped facial structure in industrialized cultures?  Price felt that to ensure the proper development of their children, mothers should eat a diet rich in fat-soluble vitamins both before and during pregnancy. This makes sense in light of what we now know. There is a pool of vitamin K2 MK-4 in the organs that turns over very slowly, in addition to a pool in the blood that turns over rapidly. Entering pregnancy with a full store means a greater chance of having enough of the vitamin for the growing fetus. Healthy traditional cultures often fed special foods rich in fat-soluble vitamins to women of childbearing age and expectant mothers, thus ensuring beautiful and robust progeny.

Paleolithic Diet Clinical Trials Part III

I'm happy to say, it's time for a new installment of the "Paleolithic Diet Clinical Trials" series. The latest study was recently published in the European Journal of Clinical Nutrition by Dr. Anthony Sebastian's group. Dr. Sebastian has collaborated with Drs. Loren Cordain and Boyd Eaton in the past.

This new trial has some major problems, but I believe it nevertheless adds to the weight of the evidence on "paleolithic"-type diets. The first problem is the lack of a control group. Participants were compared to themselves, before eating a paleolithic diet and after having eaten it for 10 days. Ideally, the paleolithic group would be compared to another group eating their typical diet during the same time period. This would control for effects due to getting poked and prodded in the hospital, weather, etc. The second major problem is the small sample size, only 9 participants. I suspect the investigators had a hard time finding enough funding to conduct a larger study, since the paleolithic approach is still on the fringe of nutrition science.

I think this study is best viewed as something intermediate between a clinical trial and 9 individual anecdotes.

Here's the study design: they recruited 9 sedentary, non-obese people with no known health problems. They were 6 males and 3 females, and they represented people of African, European and Asian descent. Participants ate their typical diets for three days while investigators collected baseline data. Then, they were put on a seven-day "ramp-up" diet higher in potassium and fiber, to prepare their digestive systems for the final phase. In the "paleolithic" phase, participants ate a diet of:
Meat, fish, poultry, eggs, fruits, vegetables, tree nuts, canola oil, mayonnaise, and honey... We excluded dairy products, legumes, cereals, grains, potatoes and products containing potassium chloride...
Mmm yes, canola oil and mayo were universally relished by hunter-gatherers. They liked to feed their animal fat and organs to the vultures, and slather mayo onto their lean muscle meats. Anyway, the paleo diet was higher in calories, protein and polyunsaturated fat (I assume with a better n-6 : n-3 ratio) than the participants' normal diet. It contained about the same amount of carbohydrate and less saturated fat.

There are a couple of twists to this study that make it more interesting. One is that the diets were completely controlled. The only food participants ate came from the experimental kitchen, so investigators knew the exact calorie intake and nutrient composition of what everyone was eating.

The other twist is that the investigators wanted to take weight loss out of the picture. They wanted to know if a paleolithic-style diet is capable of improving health independent of weight loss. So they adjusted participants' calorie intake to make sure they didn't lose weight. This is an interesting point. Investigators had to increase the participants' calorie intake by an average of 329 calories a day just to get them to maintain their weight on the paleo diet. Their bodies naturally wanted to shed fat on the new diet, so they had to be overfed to maintain weight.

On to the results. Participants, on average, saw large improvements in nearly every meaningful measure of health in just 10 days on the "paleolithic" diet. Remember, these people were supposedly healthy to begin with. Total cholesterol and LDL dropped. Triglycerides decreased by 35%. Fasting insulin plummeted by 68%. HOMA-IR, a measure of insulin resistance, decreased by 72%. Blood pressure decreased and blood vessel distensibility (a measure of vessel elasticity) increased. It's interesting to note that measures of glucose metabolism improved dramatically despite no change in carbohydrate intake. Some of these results were statistically significant, but not all of them. However, the authors note that:
In all these measured variables, either eight or all nine participants had identical directional responses when switched to paleolithic type diet, that is, near consistently improved status of circulatory, carbohydrate and lipid metabolism/physiology.
Translation: everyone improved. That's a very meaningful point, because even if the average improves, in many studies a certain percentage of people get worse. This study adds to the evidence that no matter what your gender or genetic background, a diet roughly consistent with our evolutionary past can bring major health benefits. Here's another way to say it: ditching certain modern foods can be immensely beneficial to health, even in people who already appear healthy. This is true regardless of whether or not one loses weight.

There's one last critical point I'll make about this study. In figure 2, the investigators graphed baseline insulin resistance vs. the change in insulin resistance during the course of the study for each participant. Participants who started with the most insulin resistance saw the largest improvements, while those with little insulin resistance to begin with changed less. There was a linear relationship between baseline IR and the change in IR, with a correlation of R = 0.98 (p < 0.0001). In other words, to a highly significant degree, participants who needed the most improvement saw the most improvement. Every participant with insulin resistance at the beginning of the study ended up with basically normal insulin sensitivity after 10 days. At the end of the study, all participants had a similar degree of insulin sensitivity. This is best illustrated by the standard deviation of the fasting insulin measurement, which decreased 9-fold over the course of the experiment.
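To make that relationship concrete, here's a minimal sketch of how a baseline-vs-change correlation like the one in figure 2 is computed. The values are invented for illustration; they are not the trial's data.

```python
# Illustrative sketch only: the numbers below are made up, not the trial's data.
# It shows how a strong baseline-vs-change correlation (like the paper's R = 0.98)
# arises when everyone converges toward a similar endpoint.
import numpy as np
from scipy import stats

# Hypothetical HOMA-IR for 9 participants before the diet
baseline_ir = np.array([1.2, 1.5, 1.8, 2.3, 2.9, 3.4, 4.0, 4.6, 5.2])
# Hypothetical HOMA-IR after 10 days
final_ir = np.array([1.1, 1.3, 1.4, 1.5, 1.6, 1.5, 1.7, 1.6, 1.8])

change = final_ir - baseline_ir  # negative values mean improvement

slope, intercept, r, p, stderr = stats.linregress(baseline_ir, change)
print(f"R = {r:.2f}, p = {p:.2g}")  # strongly negative: worse baseline, bigger drop

# The "everyone ends up similar" point: the spread shrinks after the intervention
print("SD before:", baseline_ir.std(ddof=1).round(2),
      "SD after:", final_ir.std(ddof=1).round(2))
```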

Here's what this suggests: different people have different degrees of susceptibility to the damaging effects of the modern Western diet. This depends on genetic background, age, activity level and many other factors. When you remove damaging foods, peoples' metabolisms normalize, and most of the differences in health that were apparent under adverse conditions disappear. I believe our genetic differences apply more to how we react to adverse conditions than how we function optimally. The fundamental workings of our metabolisms are very similar, having been forged mostly in hunter-gatherer times. We're all the same species after all.

This study adds to the evidence that modern industrial food is behind our poor health, and that a return to time-honored foodways can have immense benefits for nearly anyone. A paleolithic-style diet may be an effective way to claim your genetic birthright to good health. 

Paleolithic Diet Clinical Trials
Paleolithic Diet Clinical Trials Part II
One Last Thought

How to Eat Grains

Our story begins in East Africa in 1935, with two Bantu tribes called the Kikuyu and the Wakamba. Their traditional diets were mostly vegetarian and consisted of sweet potatoes, corn, beans, plantains, millet, sorghum, wild mushrooms and small amounts of dairy, small animals and insects. Their food was agricultural, high in carbohydrate and low in fat.

Dr. Weston Price found them in good health, with well-formed faces and dental arches, and a dental cavity rate of roughly 6% of teeth. Although they were not as robust or as resistant to tooth decay as their more carnivorous neighbors, the "diseases of civilization" such as cardiovascular disease and obesity were nevertheless rare among them. South African Bantu eating a similar diet have a low prevalence of atherosclerosis, and a measurable but low incidence of death from coronary heart disease, even in old age.

How do we reconcile this with the archaeological data showing a general decline in human health upon the adoption of agriculture? Humans did not evolve to tolerate the toxins, anti-nutrients and large amounts of fiber in grains and legumes. Our digestive system is designed to handle a high-quality omnivorous diet. By high-quality, I mean one that has a high ratio of calories to indigestible material (fiber). Our species is very good at skimming off the highest quality food in nearly any ecological niche. Animals that are accustomed to high-fiber diets, such as cows and gorillas, have much larger, more robust and more fermentative digestive systems.

One factor that reconciles the Bantu data with the archaeological data is that much of the Kikuyu and Wakamba diet came from non-grain sources. Sweet potatoes and plantains are similar to the starchy wild plants our ancestors have been eating for nearly two million years, since the invention of fire (the time frame is debated but I think everyone agrees it's been a long time). Root vegetables and starchy fruit tend to have a higher nutrient bioavailability than grains and legumes due to their lower content of anti-nutrients.

The second factor that's often overlooked is food preparation techniques. These tribes did not eat their grains and legumes haphazardly! This is a factor that was overlooked by Dr. Price himself, but has been emphasized by Sally Fallon. Healthy grain-based African cultures often soaked, ground and fermented their grains before cooking, creating a porridge that's nutritionally superior to unfermented grains. The bran was removed from corn and millet during processing, if possible. Legumes were always soaked prior to cooking.

These traditional food processing techniques have a very important effect on grains and legumes that brings them closer in line with the "paleolithic" foods our bodies are designed to digest. They reduce or eliminate toxins such as lectins and tannins, greatly reduce anti-nutrients such as phytic acid and protease inhibitors, and improve vitamin content and amino acid profile. Fermentation is particularly effective in this regard. One has to wonder how long it took the first agriculturalists to discover fermentation, and whether poor food preparation techniques or the exclusion of animal foods could account for their poor health.

I recently discovered a paper that illustrates these principles: "Influence of Germination and Fermentation on Bioaccessibility of Zinc and Iron from Food Grains". It's published by Indian researchers who wanted to study the nutritional qualities of traditional fermented foods. One of the foods they studied was idli, a South Indian steamed "muffin" made from rice and beans. 

The amount of minerals your digestive system can extract from a food depends in part on the food's phytic acid content. Phytic acid is a molecule that traps certain minerals (iron, zinc, magnesium, calcium), preventing their absorption. Raw grains and legumes contain a lot of it, meaning you can only absorb a fraction of the minerals present in them.

In this study, soaking had a modest effect on the phytic acid content of the grains and legumes examined. Fermentation, on the other hand, completely broke down the phytic acid in the idli batter, resulting in 71% more bioavailable zinc and 277% more bioavailable iron. It's safe to assume that fermentation also increased the bioavailability of magnesium, calcium and other phytic acid-bound minerals.

Fermenting the idli batter also completely eliminated its tannin content. Tannins are a class of molecules found in many plants that are sometimes toxins and anti-nutrients. In sufficient quantity, they reduce feed efficiency and growth rate in a variety of species.

Lectins are another toxin that's frequently mentioned in the paleolithic diet community. They are blamed for everything from digestive problems to autoimmune disease. One of the things people like to overlook in this community is that traditional processing techniques such as soaking, sprouting, fermentation and cooking, greatly reduce or eliminate lectins from grains and legumes. One notable exception is gluten, which survives all but the longest fermentation and is not broken down by cooking.

Soaking, sprouting, fermenting, grinding and cooking are the techniques by which traditional cultures have been making the most of grain and legume-based diets for thousands of years. We ignore these time-honored traditions at our own peril.

Wednesday 30 July 2014

A few thoughts on Minerals, Milling, Grains and Tubers

One of the things I've been noticing in my readings on grain processing and mineral bioavailability is that it's difficult to make whole grains into a good source of minerals. Whole grains naturally contain more minerals than milled grains, where the bran and germ are removed, but most of the minerals are bound up in ways that prevent their absorption.

The phytic acid content of whole grains is the main reason for their low mineral bioavailability. Brown rice, simply cooked, provides very little iron and essentially no zinc due to its high concentration of phytic acid. Milling brown rice, which turns it into white rice, removes most of the minerals but also most of the phytic acid, leaving mineral bioavailability similar to or perhaps even better than brown rice (the ratio of phytic acid to iron and zinc actually decreases after milling rice). If you're going to throw rice into the rice cooker without preparing it first, white rice may actually deliver an overall higher level of certain minerals than brown rice, though brown rice may have other advantages such as a higher feeling of fullness per calorie. Either way, the mineral availability of rice is low. Here's how Dr. Robert Hamer's group put it when they evaluated the mineral content of 56 varieties of Chinese rice:
This study shows that the mineral bio-availability of Chinese rice varieties will be [less than] 4%. Despite the variation in mineral contents, in all cases the [phytic acid] present is expected to render most mineral present unavailable. We conclude that there is scope for optimisation of mineral contents of rice by matching suitable varieties and growing regions, and that rice products require processing that retains minerals but results in thorough dephytinisation.
It's important to note that milling removes most of the vitamin content of the brown rice, and most of the fiber, both of which could be disadvantageous depending on what your overall diet looks like.

Potatoes and other tubers contain much less phytic acid than whole grains, which may be one reason why they're a common feature of extremely healthy cultures such as the Kitavans. I went on NutritionData to see if potatoes have a better mineral-to-phytic acid ratio than grains. They do have a better ratio than whole grains, although whole grains contain more total minerals.

Soaking grains reduces their phytic acid content, but the extent depends on the grain. Gluten grain flours digest their own phytic acid very quickly when soaked, due to the presence of the enzyme phytase. Because of this, bread is fairly low in phytic acid, although whole grain yeast breads contain more than sourdough breads. Buckwheat flour also has a high phytase activity. The more intact the grain, the slower it breaks down its own phytic acid upon soaking. Some grains, like rice, don't have much phytase activity so they degrade phytic acid slowly. Other grains, like oats and kasha, are toasted before you buy them, which kills the phytase.

Whole grains generally contain so much phytic acid that modest reductions don't free up much of the mineral content for absorption. Many of the studies I've read, including this one, show that soaking brown rice doesn't really free up its zinc or iron content. But I like brown rice, so I want to find a way to prepare it well. It's actually quite rich in vitamins and minerals if you can absorb them.

One of the things many of these studies overlook is the effect of pH on phytic acid degradation. Grain phytase is maximally active around pH 4.5-5.5. That's slightly acidic. Most of the studies I've read soaked rice in water with a neutral pH, including the one above. Adding a tablespoon of whey, yogurt, vinegar or lemon juice per cup of grains to your soaking medium will lower the pH and increase phytase activity. Temperature is also an important factor, with approximately 50 C (122 F) being the optimum. I like to put my soaking grains and beans on the heating vent in my kitchen.

I don't know exactly how much adding acid and soaking at a warm temperature will increase the mineral availability of brown rice (if at all), because I haven't found it in the literature. The bacteria present if you soak it in whey, unfiltered vinegar or yogurt could potentially aid the digestion of phytic acid. Another strategy is to add the flour of a high-phytase grain like buckwheat to the soaking medium. This works when soaking flours; perhaps it would help with whole grains as well.

So now we come to the next problem. Phytic acid is a medium-sized molecule. If you break it down and it lets go of the minerals it's chelating, the minerals are more likely to diffuse out of the grain into your soaking medium, which you then discard because it also contains the tannins, saponins and other anti-nutrients that you want to get rid of. That seems to be exactly what happens, at least in the case of brown rice.

So what's the best solution for maximal mineral and vitamin content? Do what traditional cultures have been doing for millennia: soak, grind and ferment whole grains. This eliminates nearly all the phytic acid, dramatically increasing mineral bioavailability. Fermenting batter doesn't lose minerals because there's nowhere for them to go. In the West, we use this process to make bread. In Africa, they do it to make ogi, injera, and a number of other fermented grain dishes. In India, they grind rice and beans to make idli and dosas. In the Philippines, they ferment ground rice to make puto. Fermenting ground whole grains is the most reliable way to improve their mineral bioavailability and nutritional value in general.

But isn't having a rice cooker full of steaming brown rice so nice? I'm still working on finding a reliable way to increase its nutritional value.

Dietary Fiber and Mineral Availability

Health authorities tell us to eat more fiber for health, particularly whole grains, fruit and vegetables. Yet the Diet and Reinfarction Trial, which determined the effect of eating a high-fiber diet on overall risk of death, came up with this graph:



Oops!  At two years, the group that doubled its fiber intake had a 27% greater chance of dying and a 23% greater chance of having a heart attack. The extra fiber was coming from whole grains. The difference wasn't statistically significant, so we can't make too much out of this. But at the very least, it doesn't support the idea that increasing grain fiber will extend your life. 

Why might fiber be problematic? I read a paper recently that gave a pretty convincing answer to that question: "Dietary Fibre and Mineral Bioavailability", by Dr. Barbara F. Harland. By definition, fiber is indigestible. We can divide it into two categories: soluble and insoluble. Insoluble fiber is mostly cellulose and it's relatively inert, besides getting fermented a bit by the gut flora. Soluble fiber is anything that can be dissolved in water but not digested by the human digestive tract. It includes a variety of molecules, some of which are quite effective at keeping you from absorbing minerals. Chief among these is phytic acid, with smaller contributions from tannins (polyphenols) and oxalates. The paper makes a strong case that phytic acid is the main reason fiber prevents mineral absorption, rather than the insoluble fiber fraction. This notion was confirmed here.

Whole grains would be a good source of minerals, if it weren't for their very high phytic acid content. Even though whole grains are full of minerals, replacing refined grains with whole grains in the diet (and especially adding extra bran) actually reduces the overall absorption of a number of minerals (free text, check out table 4). This has been confirmed repeatedly for iron, zinc, calcium, magnesium and phosphorus. 

Refining grains gets rid of the vitamins and minerals, but at least refined grains don't prevent you from absorbing the minerals in the rest of your food. Here's a comparison of a few of the nutrients in one cup of cooked brown vs. unenriched white rice (218 vs. 242 calories):

Brown rice would be quite nutritious if we could absorb all those minerals. There are a few ways to increase mineral absorption from whole grains. One way is to soak them in slightly acidic, warm water, which allows their own phytase enzyme to break down phytic acid. This doesn't seem to do much for brown rice, which doesn't contain much phytase.

A more effective method is to grind grains and soak them before cooking, which helps the phytase function more effectively, especially in gluten grains and buckwheat. The most effective method by far, and the method of choice among healthy traditional cultures around the world, is to soak, grind and ferment whole grains. This breaks down nearly all the phytic acid, making whole grains a good source of both minerals and vitamins.

The paper "Dietary Fibre and Mineral Bioavailability" listed another method of increasing mineral absorption from whole grains. Certain foods can increase the absorption of minerals from whole grains high in phytic acid. These include: foods rich in vitamin C such as fruit or potatoes; meat including fish; and dairy.

Another point the paper made was that the phytic acid content of vegetarian diets is often very high, potentially leading to mineral deficiencies. The typical modern vegetarian diet containing brown rice and unfermented soy products is very high in phytic acid, and therefore it may make sense to ensure plentiful sources of easily absorbed minerals in the diet, such as dairy. The more your diet depends on plant sources for minerals, the more careful you have to be about how you prepare your food.

Statistics

Ricardo just sent me a link to the British Heart Foundation statistics website. It's a goldmine. They have data on just about every aspect of health and lifestyle in the U.K. I find it very empowering to have access to this kind of information on the internet.

I've just started sifting through it, but something caught my eye. The U.K. is experiencing an obesity epidemic similar to the U.S.:
Here's where it gets interesting. This should look familiar:

Hmm, those trends look remarkably similar. Just like in the U.S., the British are exercising more and getting fatter with each passing year. In fact, maybe exercise causes obesity. Let's see if there's any correlation between the two. I'm going to plot obesity on the X-axis and exercise on the Y-axis to see if there's a correlation. The data points only overlap on three years: 1998, 2003 and 2006. Let's take a look:
By golly, we've proven that exercise causes obesity! Clearly, the more people exercise, the fatter they get. The R-value is a measure of how closely the points fall on the best-fit line. 0.82 isn't bad for this type of data. If only we could get all British citizens to become couch potatoes, obesity would be a thing of the past! OK, I'm kidding. The obesity is obviously caused by something else. I'm illustrating the point that correlations can sometimes be misleading. Even if an association conforms to our preconceived notions of how the world works, that does not necessarily justify saying one factor causes another.  Controlled experiments can often help us strengthen a claim of causality.
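For anyone who wants to play along at home, here's a minimal sketch of that kind of scatter-plot-and-correlation exercise. The three data points are placeholders standing in for the U.K. figures, not the actual BHF numbers.

```python
# Sketch of the plot described above. The values are placeholders, not BHF data.
import matplotlib.pyplot as plt
from scipy import stats

years = [1998, 2003, 2006]
obesity_pct = [19.0, 23.0, 24.0]     # hypothetical % of adults obese
exercise_pct = [32.0, 35.0, 40.0]    # hypothetical % meeting activity guidelines

slope, intercept, r, p, _ = stats.linregress(obesity_pct, exercise_pct)
print(f"R = {r:.2f}")  # a high R here still says nothing about causation

plt.scatter(obesity_pct, exercise_pct)
plt.plot(obesity_pct, [slope * x + intercept for x in obesity_pct])
plt.xlabel("Obesity prevalence (%)")
plt.ylabel("Regular exercise (%)")
plt.title("Correlation is not causation")
plt.show()
```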

What Can Evolution Teach us About the Human Diet?

Vegetarians deserve our respect. They're usually thoughtful, conscientious people who make sacrifices for environmental and ethical reasons. I was vegetarian for a while myself, and I have no regrets about it.

Vegetarianism and especially veganism can get pretty ideological sometimes. People who have strong beliefs like to think that their belief system is best for all aspects of their lives and the world, not just some aspects of it. Many vegetarians believe their way of eating is healthier than omnivory. One of the classic arguments for vegetarianism goes something like this: our closest living relatives, chimpanzees and bonobos, are mostly vegetarian, therefore that's the diet to which we're adapted as well. Here's the problem with that argument:

Where are chimps (Pan troglodytes) on this chart? They aren't on it, for two related reasons: they aren't in the genus Homo, and they diverged from us 5-7 million years ago. Homo erectus diverged from our lineage about 1.5 million years ago. I don't know if you've ever seen a Homo erectus skull, but 1.5 million years is clearly enough time to do some evolving. Homo erectus  ate animals as a significant portion of its diet.

If you look at the chart above, Homo rhodesiensis (often considered a variant of Homo heidelbergensis) is our closest ancestor, and our point of divergence with neanderthals (Homo neanderthalensis). Some archaeologists believe H. heidelbergensis was the same species as modern Homo sapiens. I haven't been able to find any direct evidence of the diet of H. heidelbergensis from bone isotope ratios, but the indirect evidence indicates that they were capable hunters who probably got a substantial proportion of their calories from meat. In Europe, they hunted now-extinct megafauna such as woolly rhinos. These things make modern cows look like chicken nuggets.

H. heidelbergensis was a skilled hunter and very athletic. They were top predators in their ecosystems, judged by the fact that they took their time with carcasses, butchering them thoroughly and extracting marrow from bones. No predator or scavenger was capable of driving them away from a kill.

Our closest recent relative was Homo neanderthalensis, the neanderthal. They died out around 30,000 years ago. There have been several good studies on the isotope ratios of neanderthal bones, all indicating that neanderthals obtained most of their protein from meat. They relied both on land and marine animals, depending on what was available. Needless to say, neanderthals are much more closely related to humans than chimpanzees, having diverged from us less than 500,000 years ago. That's less than one-tenth the time between humans and chimpanzees.

I don't think this means humans are built to be carnivores, particularly since there is accumulating evidence of diverse plant consumption by neanderthals, but it certainly blows away the argument that we're built to be vegetarians. Historical human hunter-gatherers had very diverse diets, but on average were meat-heavy omnivores. 

Tuesday 29 July 2014

Latest Study on Vitamin K and Coronary Heart Disease

A Dutch group led by Dr. Yvonne T. van der Schouw recently published a paper examining the link between vitamin K intake and heart attack (thanks Robert). They followed 16,057 women ages 49-70 years for an average of 8.1 years, collecting data on their diet and incidence of heart attack.

They found no relationship between K1 intake and heart attack incidence. K1 is the form found in leafy greens and other plant foods. They found that each 10 microgram increase in daily vitamin K2 consumption was associated with a 9% lower incidence of heart attack. Participants consumed an average of 29 micrograms K2 per day, with a range of 0.9 to 128 micrograms. That means that participants with the highest intake had a substantially lower incidence of heart attack on average. Vitamin K2 comes from animal foods (especially organs and pastured dairy) and fermented foods such as cheese, sauerkraut, miso and natto. Vitamin K is fat-soluble, so low-fat animal foods contain less of it. Animal foods contain the MK-4 subtype while fermentation produces longer menaquinones, MK-5 through MK-14.
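To see why the high end of that intake range matters, here's a rough back-of-envelope sketch. It assumes the reported 9%-per-10-microgram association compounds multiplicatively (like a hazard ratio) across the whole range, which is a strong assumption the study may not support; treat it as illustration only.

```python
# Back-of-envelope sketch, assuming the reported "9% lower per 10 micrograms"
# compounds multiplicatively across the whole intake range. Illustration only.
per_10ug_ratio = 0.91                  # 9% lower incidence per 10 microgram step
low_intake, high_intake = 0.9, 128.0   # micrograms/day, range reported in the paper

steps = (high_intake - low_intake) / 10.0      # about 12.7 ten-microgram steps
relative_incidence = per_10ug_ratio ** steps   # roughly 0.30

print(f"{steps:.1f} steps of 10 ug -> relative incidence ~ {relative_incidence:.2f}")
# Under this (strong) assumption, the highest consumers would be predicted to have
# roughly 70% fewer heart attacks than the lowest consumers.
```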

There's quite a bit of evidence to support the idea that vitamin K2 inhibits and possibly reverses arterial calcification, which is possibly the best overall measure of heart attack risk. It began with the observations of Dr. Weston Price, who noticed an inverse relationship between the K2 MK-4 content of butter and deaths from coronary heart disease and pneumonia in several regions of the U.S. You can find those graphs in Nutrition and Physical Degeneration.

The 25% of participants eating the most vitamin K2 (and with the lowest heart attack risk) also had the highest saturated fat, cholesterol, protein and calcium intake. They were much less likely to have elevated cholesterol, but were more likely to be diabetic.

Here's where the paper gets strange. They analyzed the different K2 subtypes individually (MK-4 through MK-9). MK-7 and MK-6 had the strongest association with reduced heart attack risk per microgram consumed, while MK-4 had no significant relationship. MK-8 and MK-9 had a weak but significant protective relationship.

There are a few things that make me skeptical about this result. First of all, the studies showing prevention/reversal of arterial calcification in rats were done with MK-4. MK-4 inhibits vascular calcification in rats whereas I don't believe the longer menaquinones have been tested. Furthermore, they attribute a protective effect to MK-7 in this study, but the average daily intake was only 0.4 micrograms! You could get that amount of K2 if a Japanese person who had eaten natto last week sneezed on your food. I can't imagine that amount of MK-7 is biologically significant. That, among other things, makes me skeptical of what they're really observing.

I'm not convinced of their ability to parse the effect into the different K2 subtypes. They mentioned in the methods section that their diet survey wasn't very accurate at estimating the individual K2 subtypes. Combine that with the fact that the K2 content of foods varies quite a bit by animal husbandry practice and type of cheese, and you have a lot of variability in your data. Add to that the well-recognized variability inherent in these food questionnaires, and you have even more variability.

I'm open to the idea that longer menaquinones (K2 MK-5 and longer, including MK-7) play a role in preventing cardiovascular disease, but I don't find the evidence sufficient yet. MK-4 is the form of K2 that's made by animals, for animals. Mammals produce it in their breast milk and other animals produce it in eggs all the way down to invertebrates. I think we can assume they make MK-4, and not the longer menaquinones, for a reason.

MK-4 is able to play all the roles of vitamin K in the body, including activating blood clotting factors, a role traditionally assigned to vitamin K1. This is obvious because K2 MK-4 is the only significant source of vitamin K in the diet of infants before weaning. No one knows whether the longer menaquinones are able to perform all the functions of MK-4; it hasn't been tested and I don't know how you could ever be sure. MK-7 is capable of performing at least some of these functions, such as activating osteocalcin and clotting factors.

I do think it's worth noting that the livers of certain animals contain longer menaquinones, including MK-7. So it is possible that we're adapted to eating some of the longer menaquinones. Many cultures also have a tradition of fermented food (probably a relatively recent addition to the human diet), which could further increase the intake of longer menaquinones. The true "optimum", if there is one, may be to eat a combination of forms of K2, including MK-4 and the longer forms. But babies and healthy traditional cultures such as the Masai seem to do quite well on a diet heavily weighted toward MK-4, so the longer forms probably aren't strictly necessary.

Well if you've made it this far, you're a hero (or a nerd)! Now for some humor. From the paper:

The concept of proposing beneficial effects to vitamin K2 seems to have different basis as for vitamin K1. Vitamin K1 has been associated with a heart-healthy dietary pattern in the earlier work in the USA and this attenuated their associations with CHD. Vitamin K2 has different sources and relate to different dietary patterns than vitamin K1. This suggests that the risk reduction with vitamin K2 is not driven by dietary patterns, but through biological effects.
They seem confused by the fact that people who ate foods high in saturated fat and cholesterol had less CHD, yet people consuming green vegetables didn't.  Here's more:
Thus, although our findings may have important practical implications on CVD prevention, it is important to mention that in order to increase the intake of vitamin K2, increasing the portion vitamin K2 rich foods in daily life might not be a good idea. Vitamin K2 might be, for instance more relevant in the form of a supplement or in low-fat dairy. More research into this is necessary.
Translation: "People who ate the most cheese, milk and meat had the lowest heart attack rate, but be careful not to eat those things because they might give you a heart attack. Get your K2 from low-fat dairy (barely contains any) and supplements."

Margarine and Phytosterolemia

Margarine is one of my favorite foods. To rip on. It's just so easy!

The body has a number of ways of keeping bad things out while taking good things in. Among the things it likes to keep out are plant sterols and stanols (phytosterols), cholesterol-like molecules found in plants. The human body even has two transport proteins dedicated to pumping phytosterols back into the gut as they try to diffuse across the intestinal lining: the sterolins. These transporters actively block phytosterols from passing into the body, but allow cholesterol to enter. Still, a little bit gets through, proportional to the amount in the diet.

As a matter of fact, the body tries to keep most things out except for the essential nutrients and a few other useful molecules. Phytosterols, plant "antioxidants" like polyphenols, and just about anything else that isn't body building material gets actively excluded from circulation or rapidly broken down by the liver. And almost none of it gets past the blood-brain barrier, which protects one of our most delicate organs. It's not surprising once you understand that many of these substances are bioactive: they have drug-like effects that interfere with enzyme activity and signaling pathways. For example, the soy isoflavone genistein abnormally activates estrogen receptors. Your body does not like to hand over the steering wheel to plant chemicals, so it actively defends itself.

A number of trials have shown that large amounts of phytosterols in the diet lower total cholesterol and LDL. This has led to the (still untested) hypothesis that phytosterols lower heart attack risk. The main problem with this hypothesis is that although statin drugs do lower LDL and heart attack risk, not all interventions that lower LDL lower risk. LDL plays an important role in heart attack risk, but it's not the only factor. Statins have a number of biological effects besides lowering LDL, and some of these probably play a role in their ability to protect against heart attacks.

Lowering total cholesterol and LDL through diet and drugs other than statins does not reliably reduce mortality in controlled trials. Decades of controlled diet trials showed overall that replacing saturated fat with polyunsaturated vegetable oil lowers cholesterol, lowers LDL, but doesn't reliably reduce the risk of cardiovascular disease. Soy contains a lot of phytosterols, which is one of the reasons it's heavily promoted as a health food.

All right, let's put on our entrepreneur hats. We know phytosterols lower cholesterol. We know soy is being promoted as a healthier alternative to meat. We know butter is considered a source of artery-clogging saturated fat. I have an idea. Let's make a margarine that contains a massive dose of phytosterols and market it as heart-healthy. We'll call it Benecol, and we'll have doctors recommend it to cardiac patients.

Here are the ingredients:

Liquid Canola Oil, Water, Partially Hydrogenated Soybean Oil, Plant Stanol Esters, Salt, Emulsifiers, (Vegetable Mono- and Diglycerides, Soy Lecithin), Hydrogenated Soybean Oil, Potassium Sorbate, Citric Acid and Calcium Disodium EDTA to Preserve Freshness, Artificial Flavor, DL-alpha-Tocopheryl Acetate, Vitamin A Palmitate, Colored with Beta Carotene.
Nice.

And I haven't even gotten to the best part yet. There's a little disorder called phytosterolemia that may be relevant here. These patients have a mutation in one of their sterolin genes that allows phytosterols (including stanols) to pass into their circulation more easily. They end up with 10-25 times more phytosterols in their circulation than a normal individual. What kind of health benefits do these people see? Premature atherosclerosis, an early death from heart attacks, abnormal accumulation of sterols and stanols in the tendons, and liver damage.

Despite the snappy-looking tub, margarine is just another industrial food-like substance that I am highly suspicious of. In the U.S., manufacturers can put the statement "no trans fat" on a product's label, and "0 g trans fat" on the nutrition label, if it contains less than 0.5 grams of trans fat per serving. A serving of Benecol is 14 grams. That means it could be up to 3.5 percent trans fat and still labeled "no trans fat". This stuff is being recommended to cardiac patients.

When deciding whether or not a food is healthy, the precautionary principle is in order. Margarine is a food that has not withstood the test of time. Show me a single healthy culture on this planet that eats margarine regularly. Cow juice may not be as flashy as the latest designer food, but it has sustained healthy cultures for generations. The U.S. used to belong to those ranks, when coronary heart disease was rare.

Paleopathology at the Origins of Agriculture

In April of 1982, archaeologists from around the globe converged on Plattsburgh, New York for a research symposium. Their goal:
...[to use] data from human skeletal analysis and paleopathology [the study of ancient diseases] to measure the impact on human health of the Neolithic Revolution and antecedent changes in prehistoric hunter-gatherer food economies. The symposium developed out of our perception that many widely debated theories about the origins of agriculture had testable but untested implications concerning human health and nutrition and our belief that recent advances in techniques of skeletal analysis, and the recent explosive increase in data available in this field, permitted valid tests of many of these propositions.
In other words, they got together to see what happened to human health as populations adopted agriculture. They were kind enough to publish the data presented at the symposium in the book Paleopathology at the Origins of Agriculture, edited by the erudite Drs. Mark Nathan Cohen and George J. Armelagos. It appears to be out of print, but luckily I have access to an excellent university library.

There are some major limitations to studying human health by looking at bones. The most obvious is that any soft tissue pathology will have been erased by time. Nevertheless, you can learn a lot from a skeleton. Here are the main health indicators discussed in the book:
  • Mortality. Archaeologists are able to judge a person's approximate age at death, and if the number of skeletons is large enough, they can paint a rough picture of the life expectancy and infant mortality of a population.
  • General growth. Total height, bone thickness, dental crowding, and pelvic and skull shape are all indicators of relative nutrition and health. This is particularly true in a genetically stable population. Pelvic depth is sensitive to nutrition and determines the size of the birth canal in women.
  • Episodic stress. Bones and teeth carry markers of temporary "stress", most often due to starvation or malnutrition. Enamel hypoplasia, horizontal bands of thinned enamel on the teeth, is probably the most reliable marker. Harris lines, bands of increased density in long bones that may be caused by temporary growth arrest, are another type.
  • Porotic hyperostosis and cribra orbitalia. These are both skull deformities that are caused by iron deficiency anemia, and are rather creepy to look at. They're typically caused by malnutrition, but can also result from parasites.
  • Periosteal reactions. These are bone lesions resulting from infections.
  • Physical trauma, such as fractures.
  • Degenerative bone conditions, such as arthritis.
  • Isotopes and trace elements. These can sometimes yield information about the nutritional status, diet composition and diet quality of populations.
  • Dental pathology. My favorite! This category includes cavities, periodontal disease, missing teeth, abscesses, tooth wear, and excessive dental plaque.
The book presents data from 19 regions of the globe, representing Africa, Asia, the Middle East, Europe, South America, with a particular focus on North America. I'll kick things off with a fairly representative description of health in the upper Paleolithic in the Eastern Mediterranean. The term "Paleolithic" refers to the period from the invention of stone tools by hominids 2.5 million years ago, to the invention of agriculture roughly 10,000 years ago. The upper Paleolithic lasted from about 40,000 to 10,000 years ago. From page 59:
In Upper Paleolithic times nutritional health was excellent. The evidence consists of extremely tall stature from plentiful calories and protein (and some microevolutionary selection?); maximum skull base height from plentiful protein, vitamin D, and sunlight in early childhood; and very good teeth and large pelvic depth from adequate protein and vitamins in later childhood and adolescence...
Adult longevity, at 35 years for males and 30 years for females, implies fair to good general health...
There is no clear evidence for any endemic disease.
The level of skeletal (including cranial and pelvic) development Paleolithic groups exhibited has remained unmatched throughout the history of agriculture. There may be exceptions but the trend is clear. Cranial capacity was 11% higher in the upper Paleolithic. You can see the pelvic data in this table taken from Paleopathology at the Origins of Agriculture.

There's so much information in this book, the best I can do is quote pieces of the editor's summary and add a few remarks of my own. One of the most interesting things I learned from the book is that the diet of many hunter-gatherer groups changed at the end of the upper Paleolithic, foreshadowing the shift to agriculture. From pages 566-568:
During the upper Paleolithic stage, subsistence seems focused on relatively easily available foods of high nutritional value, such as large herd animals and migratory fish. Some plant foods seem to have been eaten, but they appear not to have been quantitatively important in the diet. Storage of foods appears early in many sequences, even during the Paleolithic, apparently to save seasonal surpluses for consumption during seasons of low productivity.

As hunting and gathering economies evolve during the Mesolithic [period of transition between hunting/gathering and agriculture], subsistence is expanded by exploitation of increasing numbers of species and by increasingly heavy exploitation of the more abundant and productive plant species. The inclusion of significant amounts of plant food in prehistoric diets seems to correlate with increased use of food processing tools, apparently to improve their taste and digestibility. As [Dr. Mark Nathan] Cohen suggests, there is an increasing focus through time on a few starchy plants of high productivity and storability. This process of subsistence intensification occurs even in regions where native agriculture never developed. In California, for example, as hunting-gathering populations grew, subsistence changed from an early pattern of reliance on game and varied plant resources to one with increasing emphasis on collection of a few species of starchy seeds and nuts.

...As [Dr. Cohen] predicts, evolutionary change in prehistoric subsistence has moved in the direction of higher carrying capacity foods, not toward foods of higher-quality nutrition or greater reliability. Early nonagricultural diets appear to have been high in minerals, protein, vitamins, and trace nutrients, but relatively low in starch. In the development toward agriculture there is a growing emphasis on starchy, highly caloric food of high productivity and storability, changes that are not favorable to nutritional quality but that would have acted to increase carrying capacity, as Cohen's theory suggests.
Very interesting.

One of the interesting things I learned from the book is that Mesolithic populations, groups that were halfway between farming and hunting-gathering, were generally as healthy as hunter-gatherers:
...it seems clear that seasonal and periodic physiological stress regularly affected most prehistoric hunting-gathering populations, as evidenced by the presence of enamel hypoplasias and Harris lines. What also seems clear is that severe and chronic stress, with high frequency of hypoplasias, infectious disease lesions, pathologies related to iron-deficiency anemia, and high mortality rates, is not characteristic of these early populations. There is no evidence of frequent, severe malnutrition, so the diet must have been adequate in calories and other nutrients most of the time. During the Mesolithic, the proportion of starch in the diet rose, to judge from the increased occurrence of certain dental diseases [with exceptions to be noted later], but not enough to create an impoverished diet... There is a possible slight tendency for Paleolithic people to be healthier and taller than Mesolithic people, but there is no apparent trend toward increasing physiological stress during the mesolithic.
Cultures that adopted intensive agriculture typically showed a marked decline in health indicators. This is particularly true of dental health, which usually became quite poor.
Stress, however, does not seem to have become common and widespread until after the development of high degrees of sedentism, population density, and reliance on intensive agriculture. At this stage in all regions the incidence of physiological stress increases greatly, and average mortality rates increase appreciably. Most of these agricultural populations have high frequencies of porotic hyperostosis and cribra orbitalia, and there is a substantial increase in the number and severity of enamel hypoplasias and pathologies associated with infectious disease. Stature in many populations appears to have been considerably lower than would be expected if genetically-determined maxima had been reached, which suggests that the growth arrests documented by pathologies were causing stunting... Incidence of carbohydrate-related tooth disease increases, apparently because subsistence by this time is characterized by a heavy emphasis on a few starchy food crops.
Infectious disease increased upon agricultural intensification:
Most [studies] conclude that infection was a more common and more serious problem for farmers than for their hunting and gathering forebears; and most suggest that this resulted from some combination of increasing sedentism, larger population aggregates, and the well-established synergism between infection and malnutrition.
There are some apparent exceptions to the trend of declining health with the adoption of intensive agriculture. In my observation, they fall into two general categories. In the first, health improves upon the transition to agriculture because the hunter-gatherer population was unhealthy to begin with, typically because it lived in a marginal environment or ate a diet with a high proportion of wild plant seeds. In the second category, the culture adopted rice. Rice is associated with a smaller decline in health than other grains such as wheat and corn, and in some cases with an overall improvement. In chapter 21 of the book Ancient Health: Bioarchaeological Interpretations of the Human Past, Drs. Michelle T Douglas and Michael Pietrusewsky state that "rice appears to be less cariogenic [cavity-promoting] than other grains such as maize [corn]."

One pathology that seems to have decreased with the adoption of agriculture is arthritis. The authors speculate that the hunter-gatherers' higher arthritis rate may have had more to do with strenuous physical activity than with other aspects of their lifestyle, such as diet. Another interpretation is that the hunter-gatherers appeared to have a higher arthritis rate simply because they lived longer:
The arthritis data are also complicated by the fact that the hunter-gatherers discussed commonly displayed higher average ages at death than did the farming populations from the same region. The hunter-gatherers would therefore be expected to display more arthritis as a function of age even if their workloads were comparable [to farmers].
In any case, it appears arthritis is normal for human beings and not a modern degenerative disease.

And the final word:
Taken as a whole, these indicators fairly clearly suggest an overall decline in the quality-- and probably in the length-- of human life associated with the adoption of agriculture.

The Glycemic Index: A Critical Evaluation

The glycemic index (GI) is a measure of how much an individual food elevates blood sugar when it's eaten. To measure it, investigators feed a person a food that contains a fixed amount of carbohydrate and track their blood glucose response over time. Then they determine the area under the glucose curve and compare it to the area produced by a standard food such as white bread or pure glucose.

Each food must contain the same total amount of carbohydrate, so you might have to eat a big plate of carrots to compare with a slice of bread. You end up with a number that reflects the food's ability to elevate glucose when eaten in isolation. It depends in large part on how quickly the carbohydrate is digested/absorbed, with higher numbers usually resulting from faster absorption.
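To make that concrete, here's a minimal sketch (in Python) of how a GI value could be computed from a series of glucose readings, using a standard incremental area-under-the-curve comparison against pure glucose. The measurement times, glucose values and function names are my own inventions for illustration; in practice investigators average the responses of several subjects and usually repeat the reference test.

    # Minimal sketch of a glycemic index calculation: the incremental area
    # under the blood glucose curve (iAUC) for a test food, divided by the
    # iAUC for a reference food (here, pure glucose), times 100.
    # All times and glucose values below are invented for illustration.

    def incremental_auc(times_min, glucose_mg_dl):
        """Trapezoidal area under the glucose curve above the baseline (t=0) value.
        Dips below baseline are ignored, as in the usual iAUC method."""
        baseline = glucose_mg_dl[0]
        area = 0.0
        for i in range(1, len(times_min)):
            dt = times_min[i] - times_min[i - 1]
            h1 = max(glucose_mg_dl[i - 1] - baseline, 0)
            h2 = max(glucose_mg_dl[i] - baseline, 0)
            area += dt * (h1 + h2) / 2
        return area

    times = [0, 30, 60, 90, 120]              # minutes after eating
    reference = [85, 160, 140, 110, 90]       # 50 g pure glucose (hypothetical readings)
    test_food = [85, 120, 130, 115, 95]       # 50 g carbohydrate from the test food

    gi = 100 * incremental_auc(times, test_food) / incremental_auc(times, reference)
    print(f"Glycemic index of test food: {gi:.0f}")   # roughly 73 with these made-up numbers

The slower the glucose rises and falls relative to the reference, the smaller the area and the lower the GI.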

The GI is a standby of modern nutritional advice. It's easy to believe in because processed foods tend to have a higher glycemic index than minimally processed foods, high blood sugar is bad, and chronically high insulin is bad. Yet many people have criticized the concept.  Why?

Blood sugar responses to carbohydrate-containing foods vary greatly from person to person. For example, I can eat a medium potato and a big slice of white bread (roughly 60 g carbohydrate) with nothing else and see only a modest spike in my blood sugar. I barely break 100 mg/dL and I'm back at fasting glucose levels within an hour and a half. You can see a graph of this experiment here. That's what happens when you have a well-functioning pancreas and insulin-sensitive tissues: your body shunts glucose into the tissues almost as rapidly as it enters the bloodstream. Someone with impaired glucose tolerance might have spiked to 170 mg/dL and stayed elevated for two and a half hours on the same meal.

The other factor is that foods aren't eaten in isolation. Fat, protein, acidity and other factors slow carbohydrate absorption in the context of a normal meal, to the point where the GI differences between individual foods become much less pronounced.

Researchers have conducted a number of controlled trials comparing low-GI diets to high-GI diets. I've done an informal literature review to see what the overall findings are. I'm only interested in long-term studies-- 10 weeks or longer-- and I've excluded studies using subjects with metabolic disorders such as diabetes.  

The question I'm asking with this review is, what are the health effects of a low-glycemic index diet on a healthy normal-weight or overweight person? I found a total of seven studies on PubMed in which investigators varied GI while keeping total carbohydrate about the same, for 10 weeks or longer. I'll present them out of chronological order because they flow better that way.  

One issue with this literature that I want to highlight before we proceed is that most of these studies weren't properly controlled to isolate the effects of GI independent of other factors.  Low GI foods are often whole foods with more fiber, more nutrients, and a higher satiety value per calorie than high GI foods.

Study #1. Investigators put overweight women on a 12-week diet of either high-GI or low-GI foods with an equal amount of total carbohydrate. Both were unrestricted in calories. Body composition and total food intake were the same on both diets. Despite the diet advice aimed at changing GI, the investigators found that both groups' glucose and insulin curves were the same!

Study #2. Investigators divided 129 overweight young adults into four different diet groups for 12 weeks. Diet #1: high-GI, high-carbohydrate (60%). Diet #2: low-GI, high-carbohydrate. Diet #3: high-GI, high-protein (28%). Diet #4: low-GI, high-protein. The high-protein diets were also a bit higher in fat. Although the differences were small and mostly not statistically significant, participants on diet #3 improved the most overall in my opinion. They lost the most weight, and had the greatest decrease in fasting insulin and calculated insulin resistance. Diet #2 came out modestly ahead of diet #1 on fat loss and fasting insulin.

Study #3. At 18 months, this is by far the longest trial. Investigators assigned 203 healthy Brazilian women to either a low-GI or high-GI energy-restricted diet. The difference in GI between the two diets was substantial; the GI of the high-GI diet was supposed to be double that of the low-GI diet. This was accomplished by a number of differences between the diets, including different types of rice and higher bean consumption in the low-GI group. Weight loss was a meager 1/3 pound greater in the low-GI group, a difference that was not statistically significant at 18 months. Changes in estimated insulin sensitivity were not statistically significant.

Study #4. The FUNGENUT study. In this 12-week intervention, investigators divided 47 subjects with the metabolic syndrome into two diet groups. One was a high-glycemic, high-wheat group; the other was a low-glycemic, high-rye group. After 12 weeks, there was an improvement in the insulinogenic index (a marker of early insulin secretion in response to carbohydrate) in the rye group but not the wheat group. Glucose tolerance was essentially the same in both groups.

What makes this study unique is they went on to look at changes in gene expression in subcutaneous fat tissue before and after the diets. They found a decrease in the expression of stress and inflammation-related genes in the rye group, and an increase in stress and inflammation genes in the wheat group. They interpreted this as being the result of the different GIs of the two diets.

Further research will have to determine whether the result they observed is due to the glycemic differences of the two diets or something else.

Study #5. Investigators divided 18 subjects with elevated cardiovascular disease risk markers into two diets differing in their GI, for 12 weeks. The low-glycemic group lost 4 kg (statistically significant), while the high-glycemic group lost 1.5 kg (not statistically significant). In addition, the low-GI group ended up with lower 24-hour blood glucose measurements. This study was a bit strange because the high-GI group started off 14 kg heavier than the low-GI group, and the way the data are reported is difficult to understand. Perhaps these limitations, along with the study's incongruence with other controlled trials, are what inspired the authors to describe it as a pilot study.

Study #6. 45 overweight females were divided between high-GI and low-GI diets for 10 weeks. The low-GI group lost slightly more fat than the high-GI group, but the difference wasn't statistically significant. The low-GI group also had a 10% drop in LDL cholesterol.

Study #7. This was the second-longest trial, at 4 months. 34 subjects with impaired glucose tolerance were divided into three diet groups. Diet #1: high-carbohydrate (60%), high-GI. Diet #2: high-carbohydrate, low-GI. Diet #3: "low-carbohydrate" (49%), "high-fat" (monounsaturated from olive and canola oil). The diet #1 group lost the most weight, followed by diet #2, while diet #3 gained weight. The differences were small but statistically significant. The insulin and triglyceride response to a test meal improved in diet group #1 but not #2. The insulin response also improved in group #3. The high-GI group came out looking pretty good. 

[Update 10/2011-- please see this post for a recent example of a 6 month controlled trial including 720 participants that tested the effect of glycemic index modification on body fatness and health markers-- it is consistent with the conclusion below]

Overall, these studies do not support the idea that lowering the glycemic index of carbohydrate foods is useful for weight loss, insulin or glucose control, or anything else besides complicating your life.  I'll keep my finger on the pulse of this research as it expands, but for the time being I don't see the glycemic index per se as a significant way to combat fat gain or metabolic disease.

More Thoughts on the Glycemic Index

In the last post, I reviewed the controlled trials on the effect of the glycemic index (GI) of carbohydrate foods on health. I concluded that there is not much evidence that a low GI diet is better for health than a high GI diet.

It is true that for the "average" individual, the GI of carbohydrate foods can affect the glucose and insulin response somewhat, even in the context of an actual meal. If you compare two meals of very different GI, the low-GI meal will cause less insulin secretion and produce a lower total blood glucose exposure over the course of the day (although the difference in blood glucose may not be large in all individuals).

But is that biologically significant? In other words, do those differences matter when it comes to health? I would argue probably not, and here's why: there's a difference between post-meal glucose and insulin surges within the normal range, and those that occur in pathological conditions such as diabetes and insulin resistance. Chronically elevated insulin is a marker of metabolic dysfunction, while post-meal insulin surges are not (although glucose surges in excess of 140 mg/dL indicate glucose intolerance). Despite what you may hear from some sectors of the low-carbohydrate community, insulin surges do not necessarily lead to insulin resistance. Just ask a Kitavan. They get 69% of their 2,200 calories per day from high-glycemic starchy tubers and fruit (380 g carbohydrate), with not much fat to slow down digestion. Yet they have low fasting insulin, very little body fat and an undetectable incidence of diabetes, heart attack and stroke. That's despite a significant elderly population on the island.

Furthermore, in the 4-month GI intervention trial I mentioned last time, the investigators measured something called glycated hemoglobin (HbA1c). HbA1c is a measure of the amount of blood glucose that has "stuck to" hemoglobin molecules in red blood cells. It's used to estimate a person's average blood glucose concentration over the preceding two to three months. The higher your HbA1c, the poorer your blood glucose control, the higher your likelihood of having diabetes, and the higher your cardiovascular risk. The low-GI group had a statistically significant drop in their HbA1c value compared to the high-GI group. But the difference was only 0.06%, a change that is biologically meaningless.
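To put that 0.06% in perspective, here's a rough back-of-the-envelope calculation. It assumes the commonly cited ADAG linear conversion between HbA1c and estimated average glucose (eAG ≈ 28.7 × A1c − 46.7 mg/dL), which is itself only an approximation:

    # Rough back-of-the-envelope: how much average blood glucose does a
    # 0.06 percentage-point difference in HbA1c represent?
    # Assumes the ADAG linear conversion:
    #   estimated average glucose (mg/dL) ~= 28.7 * HbA1c(%) - 46.7
    # For a *difference* in HbA1c, the -46.7 intercept cancels out.

    ADAG_SLOPE_MG_DL_PER_PERCENT = 28.7

    delta_hba1c = 0.06   # between-group difference reported in the trial
    delta_glucose = ADAG_SLOPE_MG_DL_PER_PERCENT * delta_hba1c

    print(f"A {delta_hba1c}% HbA1c difference corresponds to roughly "
          f"{delta_glucose:.1f} mg/dL of average blood glucose.")   # about 1.7 mg/dL

A difference of less than 2 mg/dL in average blood glucose is far smaller than normal day-to-day fluctuation, which is why I don't consider it meaningful.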

OK, let's take a step back. The goal of thinking about all this is to understand what's healthy, right? Let's take a look at how carbohydrate foods are consumed by cultures that rarely suffer from obesity or metabolic disease. Cultures that rely heavily on carbohydrate generally fall into three categories: they eat cooked starchy tubers, they grind and cook their grains, or they rely on grains that become very soft when cooked. In the first category, we have Africans, South Americans, Polynesians and Melanesians (including the Kitavans). In the second, we have various Africans, Europeans (including the villagers of the Loetschental Valley), Middle Easterners and South Americans. In the third category, we have Asians, Europeans (the oat-eating residents of the Outer Hebrides) and South Americans (quinoa-eating Peruvians).

The pattern here is one of maximizing GI, not minimizing it. That's not because high-GI foods are inherently superior, but because traditional processing techniques that maximize the digestibility of carbohydrate foods also tend to increase their GI. I believe healthy cultures around the world didn't care about the glycemic index of foods; they cared about digestibility and nutritional value.

The reason we grind grains is simple. Ground grains are digested more easily and completely (hence the higher GI).  Furthermore, ground grains are more effective than intact grains at breaking down their own phytic acid when soaked, particularly if they're allowed to ferment. This further increases their nutritional value.

The human digestive system is delicate. Cows can eat whole grass seeds and digest them using their giant four-compartment stomach that acts as a fermentation tank. Humans that eat intact grains end up donating them to the waste treatment plant. We just don't have the hardware to efficiently extract the nutrients from cooked whole rye berries, unless you're willing to chew each bite 47 times. Oats, quinoa, rice, beans and certain other starchy seeds are exceptions because they're softened sufficiently by cooking.

Grain consumption and grinding implements appear simultaneously in the archaeological record. Grinding has always been used to increase the digestibility of tough grains, even before the invention of agriculture when hunter-gatherers were gathering wild grains in the fertile crescent. Some archaeologists consider grinding implements one of the diagnostic features of a grain-based culture. Carbohydrate-based cultures have always prioritized digestibility and nutritional value over GI.

Finally, I'd like to emphasize that some people don't have a good relationship with carbohydrate. Diabetics and others with glucose intolerance should be very cautious with carbohydrate foods. The best way to know how you deal with carbohydrate is to get a blood glucose meter and use it after meals. For $70 or less, you can get a cheap meter and 50 test strips that will give you a very good idea of your glucose response to typical meals (as opposed to a glucose bomb at the doctor's office). Jenny Ruhl has a tutorial that explains the process. It's also useful to pay attention to how you feel and look with different amounts of carbohydrate in your diet.