September 29, 2009

Why Weight Loss Stops on Long Term Low Carb Diets

The enthusiasm for the low carb diet as a weight loss diet arises in the first few weeks and months, when most people experience dramatic weight loss.

What rarely gets mentioned--especially in the miracle weight loss books--is that very few low carb dieters ever get to their weight loss goal, especially those who start out with a lot of weight to lose.

I am enthusiastic about the power of carb restriction to lower blood sugar to normal or near normal levels. I am not as enthusiastic about low carbohydrate dieting as the solution to tough weight loss problems.

Because even the online low carb community tends to believe that people who stall out are "not doing the diet right," and responds to stall posts with that assumption, most people who do stall out long term leave the discussion boards, leaving only those who have succeeded to greet the newbies.

But as someone who stalled out for years on my own weight loss, and someone who has read the boards for years, I am convinced that permanent stalls are the norm and the people who get down to goal the exception--especially among those older than 45.

In this post I'm going to discuss a few reasons why this happens.

1. Thyroid slowing. Long term low carbing causes changes in T3 hormone levels which are often hard to diagnose. It can cause something called "euthyroid sick syndrome." I learned about this from Lyle McDonald's book, The Ketogenic Diet, which cites the relevant research. Getting help for this problem is very hard, as your TSH will be fine and standard thyroid testing may not pick it up.

Physiologically, what seems to happen is that your body responds to months of ketogenic dieting by assuming you are starving--people who are starving are in ketosis all the time too. So it turns down the thermostat to conserve your body mass so you don't die. If this happens to you, you'll know it. You'll feel exhausted and dragged out all the time, and the burst of energy most people feel when they start out low carbing will be a distant memory.

Dr. Bernstein reports that many of his patients develop thyroid problems months after starting the LC diet, but he insists this is because they have developed autoimmune thyroid disease. I have to question this. Too many of us with no markers for any kind of autoimmune disease experience this metabolic slowdown on long term low carb diets. Whatever the explanation, once your thyroid slows, your weight loss will slow dramatically.

My take on this now is that it is a good idea to raise your carbohydrate intake over the ketogenic boundary from time to time. Where that boundary is varies from person to person. It's the point where, after adding a few more grams of carbohydrate to your intake, you suddenly gain back the 3-8 lbs you lost the first three days you were on the diet. That instant weight gain is not fat. It is the weight of the glycogen you've just restored to your liver and muscles.

Watch your calories closely when you raise your carbs this way and you shouldn't gain any weight. In fact if you watch calories and keep carbs just over the boundary while lowering your fat intake you might lose a pound or two.

Note: If you can't keep your blood sugar normal at an intake high enough to get you out of a ketogenic state, it might be time to talk to an endocrinologist about safe drugs that can help. I personally maintain now at an intake that varies from 70-110 grams a day (my ketogenic boundary is at about 65 grams a day). If I stay lower than 50 for six weeks I always develop that half dead feeling again. I need insulin at some of my meals to eat at this level, but I feel a lot better when I do. Metformin along with the insulin keeps me from gaining weight. In fact, I have been losing slowly and steadily over the past six months with the combination of lower fat/higher carbs, insulin and metformin.

2. Fat-induced insulin resistance. There is some interesting research, discussed on the Whole Health Source Blog, about how, and more importantly why, palmitic acid, a saturated fat, might raise insulin resistance in rodents. There are a lot of other studies over the years that have demonstrated that high saturated fat intakes of all kinds increase insulin resistance too.

While I don't believe that high sat fat intake worsens heart disease or cholesterol, I think it is very possible that, for the reasons Stephan Guyenet hypothesizes in the Whole Health Source post, long term high saturated fat intake does increase insulin resistance, and that after many months of eating very high fat/low carb diets this increase in IR can become a huge problem, especially when people experience "carb creep."

"Carb creep" is very common. Over time most of us end up eating more carbs than we think we do. A bit more here, a bit there, or perhaps we are eating larger portions of lower carb foods than we realize, so that a 4 g intake is really 8 g. Do that five times a day and you are eating a lot more carbs than you realize.

The cure for this is to weigh your portions for a while and get accurate carb counts. If you are eating over 60 grams a day, cut back on the saturated fat and see if that helps. I am starting to think the very high fat low carb diet is only appropriate with extremely low carb intake levels.

For those of us eating low carb to control blood sugar, a higher carb intake may be necessary to keep ourselves from experiencing diet burn out. If your blood sugar is under control at a higher carb intake, your health is fine. You may have to compromise on weight loss, though. Or perhaps cut back on the cheese, butter, nuts, meat fat and cream and see if cutting out some of that saturated fat helps.

3. Stalling Is Built Into the System: The 10% Factor. High quality research, which I've blogged about elsewhere, suggests that when people have lost between 10 and 20% of their starting body weight they will experience metabolic slowdown no matter what diet they use.

When I polled the diabetes community last year about their own diet experiences, the single most often repeated report was that most people who cut carbs could lose and maintain the loss of 20% of their starting weight, but after that, forget it. This is better than Dr. Leibel's results with a mixed diet, where slowdown kicked in at 10%, but it's far less weight loss than most people who embark on a low carb diet hope to lose.

My belief is that if you stall the smartest thing you can do is declare victory and maintain your weight for a few months without attempting to lose more. Make your body feel safe, so your thyroid ramps up a bit and stops worrying about the next famine. If you can't maintain at your partial weight loss, you are not going to be able to maintain if you lose more.

In fact, the sanest goal would be to find the weight level at which you can maintain happily, without feelings of deprivation, and stay at that weight. Then you might just not become one of the many, many low carb dieters who lose 100 lbs and gain back 120. There are far too many of them, most of them blaming themselves for weakness. My guess is it isn't weakness; it's the revenge of a metabolism that has been pushed too far and is now 100% dedicated to preparing you for the next massive famine.

4. Calories Do Count. There is a glorious period when people start their first low carb diet where they do seem to trick the body into dropping crazy amounts of real fat--despite eating relatively high calorie intakes. This passes. Oh, how it passes.

Once you have dropped that initial 10% or so the magic of low carb dieting wears off and the only way most of us (not all, but most) lose any more weight is by cutting back on our food intake.

To lose weight you do have to cut calories below some level which for many of us with metabolic problems or who are older is MUCH lower than what dietitians tell you or what you get when you use online calculators.

After years of thinking I couldn't lose weight cutting calories, I learned I can--thanks to a week-long attack of stomach poisoning. It turns out all I have to do is cut my calories down to about 800 a day and I will lose. The calculators tell me I should lose on 1350. The nutritionist told me I should lose on even more than that. Twice this year I've been sick for a week, unable to eat, and both times I have lost a pound or so eating 800 calories--and most significantly, kept it off later on.

Do I want to keep eating like that to lose more? No. Because if there is one thing I've learned, it is that you maintain your weight loss on a diet only a few hundred calories higher than the one you ate while losing. I have no wish to have to eat 1,000 calories a day for the rest of my life.

Normal Blood Sugar Is The Best Goal To Chase. Most of us started obsessing about weight loss because doctors told us if we lost a lot of weight we'd stop being diabetic. This is absolute hogwash. Go look at my Type 2 Poll if you still believe this. It isn't the weight loss that controls blood sugar; it is cutting out the carbs. No matter how thin we get, most of us will see diabetic blood sugars if we eat carbs.

If you understand this, but also understand that maintaining normal blood sugars no matter what you weigh will eliminate diabetic complications including heart disease, you should be able to accept whatever weight loss you can achieve and relax about the rest.

Carb restriction is a powerful tool for achieving normal health. It's a useful approach to weight loss, but like ALL diet approaches, it only goes so far. Yes, there are people who have huge low carb weight loss successes, but for every one of those there are hundreds who stall out at that 20% lost from starting weight. If you can get to even 15% lost, give yourself a big pat on the back, realize you are normal if you stall, and get to work on maintaining that impressive weight loss for life.

 

September 27, 2009

Great News: Life Expectancy of People with Type 2 Diabetes Same as General Population

A brand new Dutch study published in PLoS ONE came up with a dramatic finding that should reassure everyone diagnosed with Type 2 diabetes that their condition is not, necessarily, a death sentence.

Since this study did not tout the usefulness of any particular drug, it got no play in the health media. But it may be the most important study published in the past couple of months.

Here's the full text of the study:

Life Expectancy in a Large Cohort of Type 2 Diabetes Patients Treated in Primary Care (ZODIAC-10) Helen L. Lutgers et al. PLoS ONE 4(8): e6817. doi:10.1371/journal.pone.0006817

The authors set out to test the idea that the development of effective drug treatments for cardiovascular disease (i.e. ACE inhibitors, statins, etc.), which have greatly improved survival for the non-diabetic population over the past decade, might also have improved the survival of people with Type 2 Diabetes.

What they concluded is that "This study shows a normal life expectancy in a cohort of subjects with type 2 diabetes patients in primary care when compared to the general population."

Analyzing factors that predicted the likelihood of death, the study isolated only two: "a history of cardiovascular disease: hazard ratio (HR) 1.71 (95% confidence interval (CI) 1.23–2.37), and HR 2.59 (95% CI 1.56–4.28); and albuminuria: HR 1.72 (95% CI 1.26–2.35), and HR 1.83 (95% CI 1.17–2.89)." Albuminuria means "protein in the urine" and is a marker of damage to the kidney's filtration units.

Interestingly, smoking, HbA1c, systolic blood pressure and diabetes duration did not predict the likelihood of death within the group of people with Type 2 Diabetes.

Here is the most salient data from the study. I urge you to click on the link and read it in full:

Table 1

The participants in this study mostly had A1cs around 7%. The researchers note that only 7% of their subjects had an A1c over 9%. This fact makes it unlikely that the overall result would apply to the US population with diabetes, where the average A1c is higher--closer to 9% than 7%.

But the linking of increased death risk with a major microvascular complication, kidney damage, suggests that those of us who strive to achieve extremely tight control--the so-called "5% Club"--should rest easy.

An eleven year study of over 1800 people with diabetes found a straight line relationship between the risk of developing chronic kidney disease and the A1c. The risk began to increase significantly when the A1c rose over 6.0%.

Poor Glycemic Control in Diabetes and the Risk of Incident Chronic Kidney Disease Even in the Absence of Albuminuria and Retinopathy: Atherosclerosis Risk in Communities (ARIC) Study. Lori D. Bash et al. Arch Intern Med. Vol. 168 No. 22, Dec 8/22, 2008.

Preexisting heart disease was also a significant predictor of mortality in this study. This was defined as one of:
ischemic heart disease (IHD), International Classification of Diseases ninth revision (ICD-9), codes 410–414 and/or a history of coronary artery bypass surgery or percutaneous coronary intervention, cerebrovascular accidents [i.e. Stroke] including transient ischemic attacks (CVAs/TIAs) and/or peripheral vascular disease (PVD). PVD was defined as surgical intervention, history of claudication and/or absent pulsations of ankle or foot arteries (absence of pulsations of the dorsalis pedis arteries bilaterally was not scored as PVD when tibial posterior artery pulsations were present).
There is also a straight line relationship between A1c and heart disease in the non-diabetic population. You can read the details HERE. The salient feature is that the significant difference in heart attack risk takes place not when A1c is over 7% but when it moves from 4.7% to over 6%.

All in all, this is exceedingly good news. If even people with diabetes who maintain the abnormally high blood sugars that produce the 7% A1c have the same risk of death as people without diabetes, the fear that our diagnosis is a death sentence can be relaxed.

This also lays to rest the factoid repeated endlessly by doctors that a diabetes diagnosis is, health-wise, the same as having already had a heart attack. This study disproves that idea completely. Instead, this study suggests that having had a heart attack raises the likelihood of death but by the same amount in the diabetic and non-diabetic population, since the mortality in the two groups was the same.

Though the study found that within the diabetic population the A1c did not predict likelihood of death, we have to remind ourselves that the factors that did predict death in the diabetic population were those that increase in the general population as A1c goes over 6%.

A 7% A1c translates into an average blood sugar of 155 mg/dl. Since we know that keeping blood sugar under 140 mg/dl is the magic number for preventing neuropathy--which also appears to have been more frequent in the group who died--we can feel safe in believing that those of us who lower blood sugar to the 5% range, without using drugs whose side effects include a higher risk of heart attack (viz. Avandia, Actos, Glyburide, Amaryl, Prandin and Starlix), have done all we can to ensure we live long, happy, complication-free lives.
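The 155 mg/dl figure comes from the standard ADAG linear conversion between A1c and estimated average glucose: eAG (mg/dl) = 28.7 × A1c − 46.7. A minimal sketch of the arithmetic (the function name is mine):

```python
def a1c_to_eag_mgdl(a1c_percent: float) -> float:
    """Estimated average glucose in mg/dl from an A1c percentage,
    using the ADAG study's linear regression: eAG = 28.7 * A1c - 46.7."""
    return 28.7 * a1c_percent - 46.7

# A 7% A1c works out to roughly 155 mg/dl average glucose,
# while an A1c in the 5% range corresponds to near-normal levels.
print(round(a1c_to_eag_mgdl(7.0)))  # ~154
print(round(a1c_to_eag_mgdl(5.0)))  # ~97
```

The formula is a population average; individual glycation rates vary, so any single person's true average glucose can sit somewhat above or below the estimate.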

Before you leave this study, note the lack of relationship between cholesterol levels and mortality in this group. On average, all measures of cholesterol, expressed as mean and standard deviation, were "better" in those who died. It is a shame that the study did not include the measurement of C-Reactive Protein, a measure of inflammation, as it would have been interesting to see how predictive it was in this group.

 

September 25, 2009

Fish Oil, Yes--But NOT From Fish

There is quite a lot of evidence suggesting that fish oil, a mixture of eicosapentaenoic acid (EPA) and docosahexaenoic acid (DHA), has strong anti-inflammatory properties and may also improve lipid profiles and heart attack risk.

You can read an excellent summary of a review that looked at this data here:

Omega-3 Polyunsaturated Fatty Acids and Cardiovascular Disease Protection (American College of Cardiology CME)

One small area of concern about the research cited is that quite a lot of it was done with a prescription (i.e. expensive, proprietary) form of fish oil called Lovaza made by drug kingpin GlaxoSmithKline. You can read the official Prescribing Information about Lovaza HERE.

Lovaza costs around $100 for sixty 1 g capsules, and in the studies cited, 4 gram daily doses were used. The fish oil you buy at the pharmacy is about $15 for 200 capsules, and you can often find "buy one get one free" offers.
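The price gap is easier to see as cost per day at the 4-capsule dose used in the studies. A rough sketch, assuming the over-the-counter capsules are also 1 g each (the sizes and prices here are just the approximate retail figures quoted above):

```python
# Rough daily-cost comparison at a 4-capsule daily dose.
# Prices are the approximate retail figures from the text;
# the 1 g size for the OTC capsules is an assumption.
lovaza_per_capsule = 100.0 / 60  # ~$1.67 per 1 g capsule
otc_per_capsule = 15.0 / 200     # ~$0.075 per capsule

daily_dose_capsules = 4
print(round(lovaza_per_capsule * daily_dose_capsules, 2))  # ~6.67 dollars/day
print(round(otc_per_capsule * daily_dose_capsules, 2))     # ~0.30 dollars/day
```

Even granting that a prescription product may deliver more EPA and DHA per capsule than a typical drugstore brand, at these prices the per-day cost difference is more than twenty-fold.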

This issue is important because the above review notes, regarding a major Lovaza study, that "Other studies, including a recent underpowered OMEGA trial, have not demonstrated such benefits." The explanation could be that Lovaza is worth the obscene price, or that GSK used the usual big pharma study design dodges and statistical tricks to skew the results. My guess is the latter.

Subtract the drug company studies, though, and there is still a lot of evidence pointing to fish oil as being helpful in inflammatory contexts. You can read an older full-text review that cites many observational studies HERE.

A very recent study examined the impact of fish oil on gene expression, comparing the effects on genes in human subjects of six month supplementation with fish oil against supplementation with sunflower oil.

Fish-oil supplementation induces antiinflammatory gene expression profiles in human blood mononuclear cells. Bouwens M et al. Am J Clin Nutr 2009 Aug;90(2):415-24. Epub 2009 Jun 10.

The finding of the gene study was that
EPA+DHA intake resulted in a decreased expression of genes involved in inflammatory- and atherogenic-related pathways, such as nuclear transcription factor kappaB signaling, eicosanoid synthesis, scavenger receptor activity, adipogenesis, and hypoxia signaling.
The dose in the above study was 1.8 g a day of fish oil. It is not mentioned if the study used a branded version.

This suggests that fish oil would be a worthwhile supplement for people with diabetes because heart disease is now known to be an inflammatory condition. Some people with Type 2 also turn out to have autoimmune factors at work in attacking their beta cells.

But as good as fish oil might be for you, there is one huge caveat. Thanks to the heedless pollution of our environment by coal-burning plants releasing mercury into the air, fish has become an extremely toxic way to get your fish oil.

The data commonly cited about the amount of mercury in fish is between twenty and thirty years old, and what little evidence we have suggests that current levels of mercury in fish are far, far higher.

Here is the FDA listing of mercury levels in fish:

FDA Mercury Levels in Commercial Fish and Seafood

Note that the date of most of the mercury concentration data cited for many of the fish is "1990-1994" and for some, like shrimp and mackerel, it is 1978.

I personally know two people who consumed fish several times a week believing in its health benefits who were diagnosed with toxic blood mercury levels by mainstream doctors and given chelation therapy.

In a recent book, Experimental Man--one that is otherwise not worth reading--author David Ewing Duncan ate a fish and then went to the lab and had his blood analyzed for mercury. The reading reported was far higher than what the FDA listing of mercury in that particular fish would suggest.

So this suggests that eating fish is not a good way to get the benefits of fish oil.

Capsules are better; though they may, in fact, contain very small amounts of mercury, the amounts are dramatically lower than those found in fish.

Here is a study that gives you some idea of how much mercury might really be in fish oil capsules. Many are free of it; some do have small amounts:

Measurement of Mercury Levels in Concentrated Over-the-Counter Fish Oil Preparations: Is Fish Oil Healthier Than Fish? Stacy E. Foran et al. Archives of Pathology and Laboratory Medicine, Vol. 127, No. 12, pp. 1603–1605.

One VERY important warning: over the years that I have been buying fish oil capsules, I have found the ingredients of the capsules changing from bottle to bottle within the same brand. So does the origin of the fish oil, which may be listed as "sardines" on one bottle and not mentioned on another, suggesting it comes from larger, more mercury-polluted fish. I found this true for several different brands. So when a manufacturer cites test values, those values may have been obtained months before and may be entirely different from results you'd see if the tests were performed now.

Because mercury levels in fish roughly correspond to the size of the fish, you'd do best with fish oil made from very small fish, like sardines. However, small fish taken from highly polluted waters may be more toxic than you'd expect. I also see soy oil increasingly making its way into capsules, a concern for those who have problems with soy proteins. So when you buy your fish oil, examine the label carefully each time.

Another very important point: Buy "Enteric coated" fish oil. Unlike the plain kind, it will not cause you to burp up disgusting bits of fish oil all day.

 

September 22, 2009

Let's Not Twist History To Support Our Beliefs

It's very common for people to invent an idealized idea of the past to support a point relevant to their current beliefs. For example, conservatives often refer to a mythical American past in which extended patriarchal families lived together in harmony. Demographers who have studied US census records found that the only time significant numbers of Americans lived in three-generation families was during the depression of the 1930s, when they were forced into that situation by poverty and moved out as soon as times improved. The typical pattern in the US has always been for children to leave home as soon as they could and to live in pair bonds--or in mother-headed families, which were common thanks to death or desertion.

There is a similar phenomenon among the low carb community which disturbs me, since I've been academically trained in both Anthropology (bachelor's degree, University of Chicago) and History (master's degree, University of Massachusetts, studying under Stephen B. Oates). There is enough data here and now to support the utility of cutting the carbs out of our diet that we don't have to invent an idyllic past to support it.

The two Idyllic Fantasies that bother me the most are these: a) the Paleolithic Perfection scenario and b) the idea that heart disease was unknown until very recently. Let's have a brief look at why neither of these arguments is accurate or relevant to us today.

"Paleolithic" refers to all hominids--Neanderthals and modern humans--living before the discovery that plants could be domesticated, which took place around 12,000 BC. In the fantasy, Paleolithic people lived on game, ate low carb diets, and were far healthier than everyone ever since.

The fallacy here is that we know very little about real Paleolithic life. All we have are a very limited collection of burials and stone tools, with the occasional find of traces of fiber, fossilized seeds, and non-human bones which may or may not be the remains of hominid meals, since animals often deposited kills in the caves where hominids dwelled at some other time. Most of our beliefs about "paleo" peoples derive from the study of isolated groups of early 20th century hunter/gatherers. How similar their habits were to those of paleo people of 20,000 years ago is impossible to know.

The argument is made that skeletons of agricultural people from the next, Neolithic, period are shorter than those of the Paleolithic period and show signs of disease not found in the earlier bones. Attributing this to diet makes us feel good about our low carb eating, but it ignores a few key facts. The paleolithic lifestyle winnowed out all but the most healthy and robust very quickly. Human population in the paleolithic period, though it lasted almost 2 million years, stayed nearly constant--hominids just barely managed to replace themselves even though females were probably having a baby every 3 years. That's because most babies--all but the most robust--died.

The reason for this is obvious to anyone who has studied the experience of modern hunter/gatherer peoples--periodic famine ravaged the population on a regular cyclical basis. The very few hunter/gatherers extant in the early 20th century, when they were discovered by anthropologists, were those who lived either in extremely isolated environments where agriculture was impossible, or in unusually rich biospheres that made hunting/gathering attractive enough that they did not, like humans in 90% of the rest of the planet, welcome agriculture with wild enthusiasm as soon as they learned of it.

The reason most humans took to agriculture with enthusiasm was precisely because it raised the likelihood of survival. Storing food gets you through famine and allows your children to survive the three months with nothing to eat that otherwise would kill them. It is only after the adoption of agriculture that human population starts to soar.

Those robust skeletons hide another ugly truth. Older individuals, or those who have anything wrong with them, quickly fall by the wayside when living a migrant paleolithic lifestyle. There is no way a hunting/gathering band can carry along people who can't walk many miles every day. Any severe illness or disability lasting for more than a few weeks will mean that the individual must be abandoned, because the group must keep moving to find food. Game isn't stupid, and it avoids hunters who stay in one place. Gatherers (i.e. the women who supply most of the calories in contemporary hunting/gathering societies) must keep moving because edibles are scarce, seasonal, and an area gets picked over pretty fast. So people don't develop the diseases of age, like arthritis, because they don't live long enough to do so.

So the apparent "health" of the paleolithic remains is mostly due to the fact that most of the paleo people never made it to adulthood and when they did, they died relatively young--that and the fact that we have very few remains from that period on which to draw our conclusions. Hundreds rather than the tens of thousands from the later period.

The Inuit, so beloved of Paleo fantasists, lived a life of such extreme deprivation that it is hard to understand why anyone would make them the poster child of any diet. While most Native American peoples developed agriculture thousands of years before, the Inuit lived isolated in an environment where it is impossible to grow anything. They did what they could to survive, but their numbers were small, and their health and that of their children not anything you would envy. Their real diet included the stomach contents of their prey--semi-digested vegetable matter made digestible because the cellulose had been broken down--and entirely raw meat and fat. No Inuit survived to age 7 who was not metabolically able to cope with that diet. Natural selection works that way.

I urge anyone who romanticizes the Inuit to read The Last Gentleman Adventurer: Coming of Age in the Arctic, a fascinating book written by a young man who spent a winter with Inuit living a traditional hunting lifestyle back in the 1920s.

The other fantasy I'm seeing more and more appearing on the health blogs is the idea that heart disease didn't exist until the early 20th century. A good example of this kind of thinking occurs towards the end of this otherwise reasonable blog post: Fat Head: Margarine and Mother Nature. The graph looks impressive, but the reality behind the figures on the graph drains it of the meaning the blogger would like to assign to it.

There are a lot of reasons why heart attack deaths rise. Primary is this: the single strongest epidemiological risk factor for heart attack is age. This is something you don't hear, because there is a huge industry devoted to terrifying you, while you are still young, about your likelihood of getting a heart attack so it can sell you expensive products and medical services.

But heart attacks have always been relatively rare in people younger than 55, and when they do occur they are usually due to specific genetic conditions or side effects of other serious disease processes. Because until the second half of the 20th century a much smaller number of people lived into the decades where heart attacks occur, it should not surprise us that heart attacks were rare.

The reason that Social Security kicks in at 65 is that back when pensions were first instituted (in Germany in the 1880s), not all that many people lived to be much older than 65, so a program that paid pensions to the few rugged survivors was quite economically sustainable. To achieve the equivalent demographic effect today, we would have to raise the age for old age pensions to at least 80, perhaps higher.

People a hundred years ago did not live to die of heart attacks because injury and bacterial infections took a huge toll. An infected pimple could become deadly--my mother remembered children she went to school with dying of just that kind of infection.

Another reason why people did not live long enough to develop heart disease was that until the 1950s there were no effective treatments for high blood pressure so people who developed cardiovascular disease who nowadays might die of a heart attack were much more likely to die of stroke or kidney failure caused by high blood pressure first. When Franklin Roosevelt's blood pressure in 1944 was measured at 210/110 his doctors had no way of lowering it.

The physical labor so idolized by today's gym culture did not lead to a longer life; it wore people out. There is a huge difference between spending an hour or two at the gym and putting in 12 hours six days a week at the steel mill, coal mine, or farm. Physical labor takes a very big toll on the body over time. Add the tuberculosis bacillus to the mix--a disease that preys on overworked, crowded populations--or syphilis, one of the most frequent killers in the 19th century, though rarely one that appeared on death certificates, stir in a pinch of cholera, typhoid, and other water-borne killers, and you have a very good explanation for the lack of heart disease deaths. Most people didn't live long enough to die of heart attacks.

Heart attacks probably did occur among those lucky enough to make it into their 60s, but in the 19th and early 20th centuries differentiating a heart attack from a fatal stroke might have been very difficult. When the death occurred at home, as was usually the case, an autopsy would rarely have been performed. When the death involved a person in their 60s or older, the diagnosis was likely to be "old age."

That people did in fact have nonfatal heart attacks in those times is evidenced by the frequency with which historical sources refer to "dropsy," which sounds a lot like heart failure--the condition that often follows a nonfatal heart attack. The many older people with "dropsy" probably had serious heart disease. But there was no way of knowing this, and when they died of "dropsy," or after a "fit," that was what went on their death certificate.

Finally we should not forget that a major reason for the growth of heart disease in the 20th century was the spread of cigarette smoking which took off after WWI when cigarette manufacturers ensured the troops were given cigarettes along with their rations.

Wikipedia summarizes the epidemiology thus:
The widespread smoking of cigarettes in the Western world is largely a 20th Century phenomenon – at the start of the century the per capita annual consumption in the USA was 54 cigarettes (equivalent to less than 0.5% of the population smoking more than 100 cigarettes per year), and consumption there peaked at 4,259 per capita in 1965. At that time about 50% of men and 33% of women smoked (defined as smoking more than 100 cigarettes per year). By 2000, consumption had fallen to 2,092 per capita, corresponding to about 30% of men and 22% of women smoking more than 100 cigarettes per year, and by 2006 per capita consumption had declined to 1,691; implying that about 21% of the population smoked 100 cigarettes or more per year.
In fact, with all the horrors of the modern diet, heart attack death rates in the age group most prone to heart disease have been dropping steadily over the past two decades. This is largely due to anti-smoking efforts and the advent of effective blood pressure medication.

There is no question that cutting down on carbohydrates makes a huge difference in the health of people with diabetes and pre-diabetes. There is no question either that the frankenfood industry is selling products full of highly unhealthy lab created chemicals to an unwary public, and that some of these foods, like the margarine described in the blog post cited above, do worsen our health.

But let's not fall into the trap of imagining that overall health was better in the past and that the reason for this was dietary. For most of human (and hominid) history, food was scarce and hard to come by. Starvation was always a possibility. For anyone dwelling in the latitudes where winter brings snow, six months of the year were the "starving time," and agriculture was all that made it possible for large populations to colonize those areas successfully.

Living long enough to develop the "diseases of civilization" was, in many ways, a triumph. You have only to look at the curve of population over the past twenty thousand years to see, in evolutionary terms, which diet and lifestyle made people the most successful in terms of reproductive success.

Diet is an important tool for controlling diabetes, but it is not a panacea, and there is a huge, understandable human tendency to hope that by controlling what we can control--the food we put into our mouths--we can somehow live forever. Those of us with diabetes actually do have a condition where what we eat makes a huge difference in our health outcome, but most other human ailments do not fall into this pattern.

Let's do what we can to improve what we eat. Let's avoid foods that obviously harm us, but let's not resort to fantasy and nostalgia to back up our dietary arguments. There is plenty of good data in the present to do so.

Making inflated arguments, or imposing "truths" that are true only for those of us with disrupted glucose metabolisms on everyone blows up in our face. The main reason so many studies of the effectiveness of the low carbohydrate diet come up with tepid (though positive) results is that they involve groups of people that do not have blood sugar problems, for whom the low carb diet is not any more effective than other diets. Were researchers to evaluate the low carb diet as a treatment only for people with abnormal glucose, not as a panacea for all humans including the majority whose glucose metabolism is normal, we'd end up with much more compelling data.

 

September 18, 2009

High Fructose Intake Raises Liver Fat

As I blogged last week, it is liver fat, not visceral fat as we have been told for years, that appears to be responsible for the metabolic disorders associated with obesity.

Intrahepatic fat, not visceral fat, is linked with metabolic complications of obesity. Elisa Fabbrini et al. PNAS. Published online before print August 24, 2009, doi: 10.1073/pnas.0904944106

The abstract of the above study is pretty tough going. Fortunately, you can read a clearer explanation of what this study found in this report in Diabetes in Control:

Diabetes in Control:Liver Fat Has Greater Impact on Health than Abdominal Fat

A new study reported last week gives important information about a key factor that causes fats to be deposited in the liver: fructose.

Fructose overconsumption causes dyslipidemia. Lê KA, et al. Am J Clin Nutr. 2009 Jun;89(6):1760-5. Epub 2009 Apr 29.

In this study twenty-four human males--8 normal subjects who served as controls and 16 "healthy offspring of patients with type 2 diabetes"--were fed two different diets, each for a week. The first was a control diet (its makeup not specified in the study abstract, but undoubtedly the usual high carbohydrate diet favored by nutritionists). Then all the subjects were fed a diet that contained 3.5 g of fructose per kilogram of the subject's fat free mass and 35% more calories than the control diet.

This is a lot of fructose. Back of the envelope calculations suggest it would be 145 grams of fructose for a 140 lb woman at the high end of normal BMI and 255 grams for a 200 lb man with 20% body fat.
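For readers who want to check the back-of-the-envelope numbers, here is a minimal sketch; the body fat percentages used are illustrative assumptions on my part, not figures from the study:

```python
# Back-of-the-envelope check of the study's fructose dose:
# 3.5 g per kilogram of fat-free mass per day.

LB_PER_KG = 2.20462  # pounds per kilogram

def fructose_dose_g(weight_lb, body_fat_fraction, dose_per_kg_ffm=3.5):
    """Daily fructose dose in grams for a given weight and body fat fraction."""
    weight_kg = weight_lb / LB_PER_KG
    fat_free_mass_kg = weight_kg * (1 - body_fat_fraction)
    return dose_per_kg_ffm * fat_free_mass_kg

# A 140 lb woman, assuming roughly 35% body fat:
print(round(fructose_dose_g(140, 0.35)))  # ~144 g
# A 200 lb man at 20% body fat:
print(round(fructose_dose_g(200, 0.20)))  # ~254 g
```

Small differences from the figures in the text come down to rounding and the body fat percentage assumed.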

Liver and muscle tissue were studied with 1-H magnetic resonance spectroscopy and insulin resistance was also measured with a two step insulin clamp.

The finding here was that the high fructose/high calorie diet raised intrahepatic cellular lipids--i.e. fats in the liver cells--by 76% in normal people and 79% in those with relatives with Type 2 diabetes.

The high fructose/high calorie diet raised intramuscular cellular lipids much more in controls than people with diabetic relatives--47% in the normals, 24% in those with diabetic relatives. This probably reflects the insulin resistance of the people with diabetic heritages. The high fructose/high calorie diet raised VLDL-triacylglycerols, the lipid fraction most closely linked with heart disease, +51% in controls and 110% in the people with diabetic relatives.

The high fructose/high calorie diet raised fasting hepatic glucose output by 4% in controls and 5% in those with diabetic relatives. The researchers also concluded that it decreased the subjects' liver insulin sensitivity.

With this in mind, you'd want to ask yourself what kind of diet would provide 3.5 grams of fructose per kilogram of fat free mass each day. Research using data from the 1970s found that
For most sex/age groups nonalcoholic beverages (eg, soft drinks and fruit-flavored drinks) and grain products (eg, sweet bakery products) were the major sources of fructose.
There is only a tiny amount of fructose in wheat, so the fructose attributed to baked goods here came from the sugar and corn syrup they contained. Table sugar, sucrose, is made of glucose bonded to fructose, and is one half fructose. High fructose corn syrup may be anywhere from 55% fructose to 90% fructose. There is no way of knowing the actual percentage of fructose in any food you buy that lists high fructose corn syrup on the label.

The 1970s data showed that the average person's intake of fructose from all sources averaged 37 grams a day, with young males eating an average of 54 grams a day. Since the 1970s, our food supply has been invaded by high fructose corn syrup which you will find in everything from bread to soup to beans to salad dressing. With 41 grams of high fructose corn syrup in every 12 ounce can of Pepsi, it's likely that the average daily intake is at least double that common in the 1970s. And then there are all those milkshakes-sold-as-coffee-and-tea like the Starbucks Apple Chai Infusion with its 74 g of sugars. . . .
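To put the per-can numbers in perspective, here is a quick sketch; the 55% fructose share is an assumption, since, as noted above, the actual blend in any given product is not disclosed:

```python
# Rough estimate of the fructose in one 12 oz can of Pepsi, assuming the
# common 55%-fructose HFCS blend (an assumption; actual blends vary).
hfcs_g_per_can = 41
fructose_fraction = 0.55

fructose_per_can = hfcs_g_per_can * fructose_fraction
print(fructose_per_can)  # ~22.5 g of fructose from a single can

# Against the 1970s average of 37 g/day from all sources, two cans alone
# come close to exceeding that entire daily figure.
print(2 * fructose_per_can)  # ~45 g
```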

So while the average person might not be eating as much fructose as the amount used in the study that showed high fructose intake causing liver fat build up, they may be eating up to 2/3rds that amount. Given the massive impact the high fructose diet had on liver fat, this latest study adds convincing evidence that fructose does, in fact, play a big part in the so-called "obesity epidemic" and in the rising incidence of Type 2 diabetes.

Another study also reported last week linked second hand smoke to liver fat. This is a mouse study, so it has to be taken with a grain of salt. Still, we know that people who smoke have much higher rates of heart disease than those who don't, so it is worth considering.

Science Daily: Second-hand smoking results in liver disease study finds

The exact mechanism occurring in the mouse study is reported thus:
They found that second-hand smoke exposure inhibits AMPK activity, which, in turn, causes an increase in activity of SREBP. When SREBP is more active, more fatty acids get synthesized. The result is NAFLD [non-alcoholic fatty liver disease] induced by second-hand smoke.
As you can see when you visit the web page where I discuss what research knows about the real causes of Type 2 diabetes, there is no one cause. Type 2 Diabetes appears to require the presence of an underlying genetic flaw, but it gets triggered by a long list of environmental factors ranging from pesticides, to pharmaceutical drugs, to industrial pollutants, and now, apparently, fructose and cigarette smoke.

This makes any real solution to the "Diabetes epidemic" unlikely. Heavy industry and big agriculture don't plan to stop polluting our water and air. In fact, they have become very skilled at evading existing regulations. Two thirds of the foods available in our supermarkets and fast food outlets are filled with fructose and transfat, with thyro-toxic, autoimmune-provoking soy, and with a laundry list of chemical ingredients the average consumer cannot begin to identify.

The less money people have the more likely they are to eat the cheap fructose and soy filled foods available. When you have only two bucks for lunch and that will buy you a soy and fructose taco for $.69 and a soda, you buy it.

There's a movement to replace some of the corn syrup in our food with sucrose--which is still half fructose. Unfortunately, there is no movement to remove sugars from things like chicken dinners and soups.

Since most of us were eating the fructose-filled foods for decades before diagnosis, we'd all like to know what it takes to reverse the fatty build up in our livers. There isn't any clear answer on this. Lowering your carbohydrate intake and cutting down on fructose may not budge the fat that is already in your liver. Most of the people who are severely insulin resistant who eat low carbohydrate diets for long periods of time appear to stay insulin resistant.

There are a host of fraudulent "liver cleanse" products out there which do nothing but cleanse your wallet. Don't fall for them. Lemon juice does not magically remove "toxins" from your liver. The liver removes toxins from the blood since that is a big part of its job. But it doesn't seem to be able to remove the fats that clog its own tissue. Metformin may or may not help. Beyond that, who knows?

I'd love to hear from research savvy readers about any treatment you have seen where there is solid data, not paid for by a supplement manufacturer, that links a treatment with the reduction of liver fat and improvement in non-alcoholic fatty liver disease.


September 15, 2009

Shameful Research: Poorly Conceived Metastudy May Cost You Your Feet and Kidneys

What do you do when you are too poor or lazy to do real research to answer the most important question raised this year about the treatment of Type 2 Diabetes but want to see your name on a journal paper? The answer is obvious: A metastudy.

I have blogged several times in the past about the flaws in several studies that are being used to argue that it is dangerous to lower A1c. You can read those posts HERE, HERE, and HERE.

Now the Annals of Internal Medicine has published a study that will reinforce the message doctors have already taken to heart that there is no point in lowering A1c.

Systematic Review: Glucose Control and Cardiovascular Disease in Type 2 Diabetes. Tanika N. Kelly et al. Ann Int Med 15 September 2009, Volume 151 Issue 6, Pages 394-403

This study throws together the summary data compiled from the three most recent large studies of what happens when you lower A1c. By using only summary data, they ignore all the critical details that were essential to understanding those studies' outcomes--like how severe the patients' heart disease was before they started treatment or what drugs they used to control blood sugar.

With this dubious methodology the researchers conclude:
Intensive glucose control reduced the risk for some cardiovascular disease outcomes (such as nonfatal myocardial infarction), did not reduce the risk for cardiovascular death or all-cause mortality, and increased the risk for severe hypoglycemia.
The irresponsibility of publishing this study in a high impact journal leaves me gasping.

What's so wrong with this study that it should never have been published?

1. Its conclusion that tight control doesn't prevent cardiac death even though it lowers the incidence of heart disease ignored the hugely important fact, reported elsewhere, that cardiac deaths among people with diabetes have dropped dramatically--to the point where some researchers are having trouble getting statistically significant results from their cardiac death research because not enough people are dying of the expected heart attacks.

Since this study did not differentiate outcome by age, the 85 year old person with Type 2 diabetes who dies of a heart attack is not distinguished from the 50 year old person who dies of a heart attack, and the conclusion that heart disease is decreased but not completely eliminated loses all meaning.

2. It ignores that these same studies all found that lowering A1c below 6.5% greatly reduced the incidence of neuropathy and kidney disease. I don't know about you, but while I'm waiting for my fatal heart attack (at age 90) I'd like to do it with both limbs intact and a set of functioning kidneys.

3. This metastudy completely ignores the demographics of who was treated and what their condition was before blood sugar lowering treatment was initiated. For example, it claims that lowering A1c causes more instances of severe hypoglycemia. It doesn't, however, explain that the hypos were most likely to occur in older people whose diabetes was poorly controlled for decades before they attempted tight control.

Other research has found that older people are more prone to hypo even when they are taking no glucose-lowering medications--this is a byproduct of autonomic neuropathy, a long-term complication of poorly controlled diabetes.

How do you avoid developing autonomic neuropathy? By keeping your blood sugar under 140 mg/dl at all times, which produces, among other things, a 5% A1c--and a greatly reduced incidence of neuropathy of all kinds.

It also fails to note that the studies that found slightly poorer heart disease outcomes in the tight control group enrolled older, sicker patients who had long histories of exposure to extremely high blood sugars--people whose heart disease may have become irreversible long before tight control was attempted (but who still showed better kidney and nerve outcomes). Then these studies gave them drugs linked to heart problems: sulfonylurea drugs or Avandia and Actos.

4. There is no differentiation in this study between those who lower A1c with drug cocktails including sulfonylureas--known to raise heart attack risk--and Avandia and Actos--known to increase the incidence of heart failure--and those who lower A1c by cutting out the carbohydrates that raise blood sugar. Cutting out carbohydrates will not cause serious hypos, and there is not a scintilla of evidence that it raises heart attack risk, even though the enemies of low carb dieting have invested a fortune in trying to find evidence that it does. You can read more about the safety and efficacy of the low carbohydrate diet HERE.

It is not paranoia on my part to fear that this study will cause even more amputation and kidney failure. I hear weekly from patients who have just returned from visiting their physicians 20 lbs lighter, with A1cs in the 5% range, people who have achieved these dramatic results simply by cutting down on carbohydrates, whose doctors respond to their achievements with alarm, and urge them to raise their A1cs because lowered A1cs "cause heart attacks."

If I had only heard one or two reports along these lines, I'd assume they came from those doctors who graduated in the bottom of their medical school class or from elderly doctors whose minds are going, but this is not the case. I'm hearing too many of these stories.

Busy family doctors get their medical news from newsletters that reduce the already tiny amount of information you read in an abstract to a single sentence or two. The sentence this latest study turns into is "Lowering A1c does not benefit patients with Type 2 Diabetes."

Since insurers are desperate for rationales to use to limit what treatments they will pay for, a study like this is a huge gift to those who want to prevent anyone with an A1c under 8% from receiving any diabetic drug.

Meanwhile, tens of thousands of people with Type 2 Diabetes are being told by their doctors to keep their A1cs in what they erroneously believe is a safe range--7-8% and to avoid the "danger" of lower A1cs.

We'll see the impact of this in a decade when the explosion in the incidence of amputations and kidney failure shows what really happens when you teach patients that A1cs over 7% are preferable to those under 6%.

If your doctor advises you that lowering A1cs is dangerous, tell them it is time they read the actual studies being used to draw this conclusion and pay attention to the details. Point out that for younger people recently diagnosed with Type 2 diabetes who do not have a history of years of poor control there is no evidence that tight control poses any problem. Note that negative outcomes from "tight control" were prevalent among people treated by Veterans hospitals, who were old enough that they had survived decades of very poor treatment and very high A1cs before they were put on the drug cocktails made up of drugs known to cause heart problems. Point out that even in these studies nerves and kidneys benefited from tight control. Finally, note that none of these studies track the cardiac outcomes for people with Type 2 diabetes who lower their carbohydrate intake and avoid Sulfonylureas, Actos and Avandia.

Then find a new doctor, preferably one who knows something about how to treat diabetes. They are out there. I do hear from people whose doctors support their progress enthusiastically and even from people whose doctors suggest they cut way back on their carbs.

September 14, 2009

Insulin vs Expensive Orals: Higher Satisfaction, Less Weight Gain

Doctors have been so brainwashed by the drug companies as to the superiority of their oral drugs to insulin that it is rare to find anyone newly diagnosed with Type 2 diabetes who was put on insulin at diagnosis except for those whose diagnosis occurred during a hospital admission where blood sugars were found to be over 500 mg/dl.

This is true, even though there is compelling evidence that putting people with Type 2 on insulin immediately after diagnosis provides long term benefits, even if the patients stop the insulin a month or two after they start it.

There are three reasons why doctors fight against putting the newly diagnosed person with Type 2 diabetes on insulin: Fear that the patient will not comply with the treatment because it involves shots, fear that insulin will cause dangerous hypos, and fear that the patient, already battling a weight problem, will gain more weight.

It does not take a new study to debunk all these fears. The patients who supposedly refused needles rushed to adopt Byetta as soon as they were told it would cause weight loss or rejuvenate beta cells--claims that in many cases were untrue. Patients taking the sulfonylurea drugs Amaryl or Glyburide routinely experience dangerous hypos if they neglect to eat high carb meals throughout the day. Actos and Avandia cause dramatic and permanent weight gain in many of those who take them, because the drugs' mechanism of action lowers blood sugar by creating new fat cells and clearing glucose by transforming it to fat and storing it in the new cells, but doctors prescribe them anyway.

Nonetheless, for those of us who consider insulin the best of all the diabetes drugs, a new study published in Diabetes Care confirms our belief that insulin should be the first, not the last, drug prescribed to people with Type 2 who cannot normalize their blood sugar with diet alone. Insulin, rather than the obscenely expensive oral drug cocktails doctors prefer.

Here's the study:

Insulin-Based versus Triple Oral Therapy for Newly-Diagnosed Type 2 Diabetes: Which is Better? Ildiko Lingvay, et al. Diabetes Care July 10, 2009, doi: 10.2337/dc09-0653

This was a three year study. One group of patients were put on insulin plus metformin. The abstract does not specify whether the insulin was basal insulin (Lantus or Levemir) or a combination of basal and fast acting insulin. Given what I hear from my correspondents, my guess is it was basal insulin alone.

The other group of people with Type 2 diabetes in this study were put on a typical drug cocktail regimen consisting of metformin, glyburide, and Actos.

Blood sugars as measured by A1c were very similar in both groups; however, there was less weight gain in the insulin/metformin group than in the drug cocktail group. Not only that, but as reported in Diabetes in Control the researchers said,
"... while the weight gain persisted over time in the group treated with oral hypoglycemic agents, the weight gain in the insulin-treated group leveled off after 18 months and even regressed towards baseline."
Fewer people dropped out of the insulin arm of the study than the oral drug arm, too, suggesting that it was easier to comply with.

There is a reason why early insulin might be so helpful for people with Type 2 diabetes. It is that, when prescribed properly, insulin is much more likely to lower blood sugar below the level that produces secondary insulin resistance.

Secondary insulin resistance is a phenomenon doctors are often unaware of. It turns out that when blood sugars rise to a certain point--somewhere around 180 mg/dl--this rise causes an increase in insulin resistance, so that once blood sugars are high they stay high, because it takes more insulin to lower blood sugar than it does when blood sugars are below the threshold. This secondary insulin resistance also makes weight gain more likely.

This is why so many people with Type 2 see a dramatic drop in their blood sugar when they cut carbs out of their diets. Their insulin sensitivity improves as soon as the blood sugar drops under that threshold and their remaining insulin is far more effective. In addition, the drop in secondary insulin resistance often makes it easier to lose or maintain weight on the same calorie intake.

Giving people insulin in doses that drop blood sugar below that same threshold has a similar effect and makes it much easier to control blood sugar, too.

Injecting insulin does something else--it gives the battered beta cell a chance to recover. Alternative drugs like Byetta, Januvia, Prandin, Starlix, Glyburide and Amaryl all force the exhausted beta cells to secrete insulin, stressing the already stressed cells. Injected insulin replaces beta cell insulin, giving the beta cells a chance to rest.

Many people with newly diagnosed Type 2 diabetes don't need any drug at all. I hear daily from people who have adopted the strategy described HERE and lowered A1cs from well over 8% to the 5% range that prevents complications.

But if like me, you have something going on in your own, unique metabolism that means that you can't get normal blood sugars eating under 100 grams of carbohydrate a day, don't be shy about asking your doctor for insulin.

Insulin works. Combined with a low carb diet it won't pack on the pounds--especially if you use Levemir rather than Lantus. Fears that use of insulin might increase cancer appear to be unfounded, at least for people who combine insulin with metformin. You can read more about the insulin/cancer scare HERE.

In contrast to the unsupported fears about insulin, the known side effects of Actos are ugly, and getting uglier by the month. The sulfonylurea drugs, Amaryl and Glyburide, are sold with an FDA warning that they may increase the risk of heart attack. Januvia and Onglyza lower blood sugar using a mechanism that turns off a cancer suppressor gene.

If you worry about the confusing data about Lantus and cancer (which is far from conclusive) ask your doctor about using R insulin which is identical to the insulin your own beta cells produce. Many people find that R insulin is the ideal insulin to use along with a low carb diet as it is slower in action than the much more expensive analog insulins and matches the digestion curve of a higher protein/higher fat meal far better.

September 10, 2009

Wheat May Be Sparking Autoimmune Type 1 Thanks to Soy in Our Diets

It has long been known that people with autoimmune diabetes (Type 1 and LADA) are very likely to also develop gluten allergies. Now a recent study suggests that T-cells in people with Type 1 diabetes may be overly sensitive to wheat and that this sensitivity may be related to the development of the Type 1 diabetes.

Type 1 Diabetes Linked To Immune Response To Wheat

Many people in the low carb community welcome this kind of finding in that it reinforces the "carbs are evil and bread is the evilest of carbs" thinking that often permeates that community.

But before we blame wheat for causing Type 1 diabetes it is worth looking a little deeper. The fact is humans living in temperate climates in the Northern Hemisphere have been eating wheat-based diets for some five thousand years. But the sudden explosion of Type 1 diagnoses is about 30 years old.

Back in my wheat-saturated childhood in the 1950s Type 1 was extremely rare. I did not meet a single person who had it until I was 23 years old and I interacted with literally thousands of young people during that time period. There was no one with Type 1 in my college dorm. There was no one with Type 1 at work.

Epidemiological studies back up this anecdotal finding. The rate of development of Type 1 in children appears to have taken an upward turn in the 1970s, and then it really got going.

This study sheds some interesting light on why.

The Rising Incidence of Type 1 Diabetes Is Accounted for by Cases With Lower-Risk Human Leukocyte Antigen Genotypes Spiros Fourlanos, et al. Diabetes Care. 2008 August; 31(8): 1546–1549. doi: 10.2337/dc08-0239.

After explaining that "...the incidence of childhood-onset type 1 diabetes in Australia has doubled in the last 20 years, from 11.3 cases per 100,000 person-years in 1985 to 23.2 in 2002," the study concludes:
"The rising incidence and decreasing age at diagnosis of type 1 diabetes is accounted for by the impact of environment on children with lower-risk HLA class II genes, who previously would not have developed type 1 diabetes in childhood."
In short, something is happening now so that a large group of children carrying genes that used to produce Type 1 only occasionally are now getting it--children (and adults, too) who would not have developed it pre-1980.

The researchers attribute this to an environmental cause and offer a few very unconvincing suggestions as to what this environmental cause might be, including increased exposure to rotavirus among babies in hospital nurseries.

The gluten/Type 1 and wheat/Type 1 links should, if anyone was paying attention, have set some bright red signals flashing as to what the real environmental change producing this rise in Type 1 might be. The obvious question to ask is this: What is new in our environment since the late 1970s? Is it wheat?

No. Of course not. But what is new might be something that changes the way our bodies digest wheat--in particular, something that changes gut permeability so that the wheat proteins that for millennia made their way through the gut without getting into the bloodstream, where they could provoke antibody attack, suddenly started leaking out of the gut and into the blood.

And when you ask what change in our food supply happened since the late 1970s that had the ability to change gut permeability you find one huge possibility: Soy.


No one ate soy in the 1950s except when they went to a Chinese restaurant. Even then, they avoided bizarrely exotic foods like tofu and black beans in favor of bland dishes flavored with a bit of salty American-made soy sauce.

But in the 1970s this changed dramatically. Soy began to be used as a protein extender in fast foods and packaged foods to the point where it is now virtually impossible to find a packaged food that does not have soy in it somewhere.

The public was sold the line that soy was health food--based on claims that were never substantiated, and many of which were proven false.

What the public does not know, and I learned only from reading Kaayla Daniel's book, The Whole Soy Story: The Dark Side of America's Favorite Health Food, is that soy contains compounds called "saponins" which, as their name suggests, have a soapy quality. As you can read on page 240 of Daniel's book, soy saponins damage the gut mucosa and have been found in numerous studies to change the permeability of the gut. Since these studies were not funded by the huge food processors who inundate us with soy, you have never heard about them. You have now.

Any of you who have more than a passing interest in how the foods we eat affect our bodies owe it to yourselves to read the Daniel book. There is something eye-opening on every page, and Dr. Daniel documents each of her claims with rigorous citations to peer reviewed research.

Here is just one research study backing up the finding about the relationship of soy saponins to gut damage. There are many more.

Influence of Saponins on Gut Permeability and Active Nutrient Transport In Vitro. I. T. Johnson, et al. The Journal of Nutrition 116: 2270-2277, 1986.

Here is an interesting page from a book about saponins describing how soy saponins and lectins combine to damage the gut:

Plant lectins. By A. Pusztai p. 159-160.

If you have an inherited genetic tendency to autoimmune disease, and wish to preserve yourself and your children from developing autoimmune diabetes, this latest news about wheat might impel you to do two things.

1. Completely eliminate wheat and gluten from your diet.

2. Completely eliminate soy from your diet.

Because it is most likely the leakiness of the gut that allows large proteins into the bloodstream where they provoke immune attack, my guess is that eliminating wheat will merely allow some other large protein into your blood which will provoke immune attack. Perhaps, since there is something about wheat that seems to provoke attack on the insulin-producing beta cells, eliminating wheat might lower your risk of diabetes. But who knows what cells other proteins leaking through might resemble? Avoiding diabetes by upping your chances of developing Rheumatoid Arthritis, Multiple Sclerosis, or Lupus is not a course I would recommend.

Instead, concentrate on keeping your gut membranes doing the job they were hired to do. Don't eat foods that will harm the cells on the barrier membranes that keep your food proteins where they were intended to go. In short, don't eat soy.

 

September 7, 2009

High Fat Diet Makes Us Lazy and Stupid?

I've been eating a high fat/lowered carb diet for the past 11 years. During this time I have published three nonfiction books, two of which made it to the tops of their Amazon category bestseller list and one of which, Blood Sugar 101, is currently showing up, on and off, in the top 100 Amazon bestseller list for all disease books.

Over the same period, I've written the PHP-based Phlaunt.com software generator that creates easily updated web sites for computer-phobic craftspeople. That software is also the platform I use for the Blood Sugar 101 site since it makes it extremely easy to update a web page, and I update Blood Sugar 101 frequently. Over this period, in my spare time, I've finished two novels, one of which was recently bought by Avon Books, a major publisher, despite the repeated claim in the writers' media that it is almost impossible to break into fiction publishing in these hard economic times.

Given how well my brain has performed on a high fat/low carb diet, you can imagine my feelings when I read this latest health headline:

Do High-fat Diets Make Us Stupid And Lazy? To which the editorial answer is a loud "yes."

The use of the word "Us" in this headline is very odd indeed, since the research cited was a rat study. Last I looked, I was not covered with fur, nor did I have a tail, or feast on garbage in alleyways. Have rodents taken over the editing of diet research news?

One begins to wonder. There has certainly been a torrent of publicity over the past few months for rodent studies that purport to prove that high fat/low carb diets will make us stupid, flabby, and/or dead. These studies are funded by organizations like the British Heart Foundation, sponsor of the "stupid" study above, that have a huge stake in keeping the public from learning that the dietary advice they've been pushing for decades--the advice to eat low fat diets filled with "healthy whole grains"--is not only worthless, but actually harmful.

Since all the human data confirms the uselessness of low fat diets and, furthermore, shows that high fat/low carbohydrate diets provide the best outcomes for people with diabetes, the stakeholders in the low fat theory have been driven to desperate measures. These measures increasingly involve feeding very artificial diets to rodents whose furry little bodies are adapted to diets extremely different from that of the human omnivore.

To understand just how deceptive the rodent research is that is being marshaled to support the low fat "bitter-enders," I'd urge you to read this blog post:

Whole Health Source: Animal Models of Atherosclerosis: LDL

As Dr. Stephan Guyenet points out, the diets researchers give their little furry friends are obscenely distorted, because that is what it takes to produce the effects they desire. The high protein diet in one recent study that purported to be an "Atkins" diet, fed mice a diet so high in protein that the equivalent for a human would include a daily portion of 17.5 pounds of beef steak, 3.8 pounds of beef liver, or 22.5 eggs.

As only the abstract is publicly available, one can only speculate what the high fat diet was that was fed to rats in this latest study. In other rodent "high fat" studies where I have been able to see the details of the diets, the fats used have often been trans-fat filled shortening. Many of these "high fat" diets also supply the carbohydrates that fill out the diet in the form of sugar water or pure fructose.

But whatever the rats actually ate, this latest study has huge flaws, starting with the idea that we can measure the "cognitive performance" of a rat in a way that has any implications for human cognitive function and moving on to the fact that the measurements were taken after a very brief exposure to a high fat diet--slightly more than a week.

We know that in humans who are put on a low carbohydrate/high fat diet, there is a period, often lasting up to two weeks, in which the intense metabolic changes that occur, including dropping blood sugars, can lead to temporary mental fuzziness. However, as anyone who has eaten a low carb diet for more than two weeks knows, that short term mental fuzziness is far different from permanent cognitive decline, and once the body stabilizes on the new diet, the result is enhanced mental clarity.

For a human to achieve permanent cognitive decline via diet, they have to eat a high carbohydrate/low fat diet for an extended period. That is because it is the high carbohydrate, low fat diet that, over years of exposure, raises blood sugar high enough to damage the vascular system in the brain. A persistent low fat intake also deprives the body of the fats that are vital for the production of the hormones essential to cognition, like estrogen, and for repair of the tissues in the brain for which cholesterol is essential.

This latest rat study also claims that the high fat diet destroys the rats' ability to exercise (hence the headline reference to high fat intake making "us" "lazy"). To understand how silly the claim is that a high fat intake is damaging to human fitness, you need only read the training manual sent home nowadays with college football players. I saw one when my son was on a college football team. The dietary recommendations? Eat high protein/low carbohydrate foods (which implies a higher fat intake) except before games. Trainers who are graded on their trainees' performance have learned that low fat/high carbohydrate diets are the worst regimen on which to build lean body mass and increase fitness.

But pointing out human results to these low fat fanatics is as effective as telling those Japanese troops who fought on into the 1950s in jungle island enclaves that WWII was over. Having committed their professional lives to this theory, they will die fighting for it, no matter how much evidence accumulates that their theory is fatally wrong.

Fortunately, human cognition is such that you don't have to take the word of anyone about the impact of a high fat/low carbohydrate diet on your own brain. Eat a low carb diet for three weeks and see what happens to your thinking. If you are like most people with diabetes, it will get clearer, your moods will get rosier, and your creative output will rise.

To counter this latest burst of media disinformation, I'd love it if some of my readers who eat high fat/lowered carbohydrate diets would use the comment section that follows this post to list some of their recent intellectual accomplishments.

 

September 3, 2009

Two New Studies Support Utility of Metformin

UPDATE: Sept 16, 2009: Mouse study finds metformin blocks breast cancer stem cells. This is a mouse study, which means it may be irrelevant to humans, but added to the other data linking metformin to reduced cancer incidence, it's worth noting.

Science Daily: Diabetes Drug Kills Cancer Stem Cells in Combination Treatment in Mice.

ORIGINAL POST:

It's well known that people with diabetes have a higher risk of cancer. In some cases, undetected cancer may be causing the diabetes--diabetes is, rarely, an early symptom of pancreatic cancer. In others the many years of exposure to high blood sugar may feed baby cancer cells which do not apparently need insulin to be able to take in and metabolize glucose.

Until now, all we could advise people with family histories of diabetes to do, to prevent or slow cancer, was to strive for early diabetes or pre-diabetes diagnosis and to lower blood sugars aggressively once abnormal blood sugars were found.

Now new data from a British epidemiological study suggests that metformin may exert an anti-cancer effect. The study is:

New Users of Metformin Are at Low Risk of Incident Cancer: A cohort study among people with type 2 diabetes. Gillian Libby et al. Diabetes Care September 2009 vol. 32 no. 9 1620-1625. doi: 10.2337/dc08-2175

The study made use of medical records collected in Tayside, Scotland, UK. The researchers compared 4,085 people with type 2 diabetes who were new users of metformin in 1994–2003 to a group of people with diabetes diagnosed the same year who were not given metformin.

The result was that
Cancer was diagnosed among 7.3% of 4,085 metformin users compared with 11.6% of 4,085 comparators, with median times to cancer of 3.5 and 2.6 years, respectively (P < 0.001). The unadjusted hazard ratio (95% CI) for cancer was 0.46 (0.40–0.53).
This association held up even when adjusted for "sex, age, BMI, A1C, deprivation, smoking, and other drug use."
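The headline numbers are worth a quick sanity check. A crude risk ratio computed straight from the reported percentages (illustrative arithmetic only, not the study's own method) comes out higher than the published hazard ratio, which is expected: a hazard ratio also accounts for how long each patient was followed before diagnosis.

```python
# Crude comparison from the reported Tayside figures (illustrative only).
n = 4085                               # patients in each group
risk_metformin  = 0.073                # 7.3% of metformin users diagnosed with cancer
risk_comparator = 0.116                # 11.6% of comparators diagnosed with cancer

risk_ratio = risk_metformin / risk_comparator
excess_cases = round((risk_comparator - risk_metformin) * n)

print(round(risk_ratio, 2))            # ≈ 0.63
print(excess_cases)                    # ≈ 176 extra cancers in the comparator group
```

The study's unadjusted hazard ratio of 0.46 is lower than this crude 0.63 precisely because it comes from a time-to-event analysis rather than a simple comparison of end-of-study percentages.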

This result is intriguing, though it makes me want to ask, "What kept the doctors from prescribing metformin to the second group, the one that had so much more cancer?" Until we know the answer to that question, it is impossible to screen out the possibility that the same factor that kept doctors from prescribing the usual first line treatment for Type 2 diabetes might have also promoted cancer.

Doctors often avoid prescribing metformin to newly diagnosed patients whose A1cs are between 7-8%--a level that correlates with very damaging post-meal blood sugars--high enough, perhaps, to actively promote tumor growth. That is because doctors erroneously believe an A1c near 8% to be close enough to "good control" (which they define as an A1c of 7%) that it can be managed with "diet and exercise." The diet prescribed in the UK, however, is a low fat diet full of carbohydrates which tends to push blood sugars up. So patients managed only on diet and exercise are likely to have blood sugars spiking into the very high range after every meal.

The results in this study were supposedly analyzed to adjust for A1c--but we know that the identical A1c can represent very different peak blood sugars and very different lengths of time of exposure to those peak blood sugars.
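A toy calculation makes the point. Here are two hypothetical 24-hour glucose profiles (the numbers are invented for illustration) that average out to the same value, and therefore to roughly the same A1c under the ADAG estimated-average-glucose formula, even though one of them spikes far into the damaging range after every meal.

```python
# Two hypothetical 24-hour glucose profiles, one reading per hour (mg/dl).
flat  = [140] * 24                     # steady all day, never spikes
spiky = [120] * 21 + [280] * 3         # lower baseline, three post-meal spikes

mean_flat  = sum(flat) / len(flat)     # 140 mg/dl
mean_spiky = sum(spiky) / len(spiky)   # also 140 mg/dl

# ADAG formula: eAG = 28.7 * A1c - 46.7, so A1c = (eAG + 46.7) / 28.7
a1c = (mean_flat + 46.7) / 28.7        # ≈ 6.5% for BOTH profiles
print(mean_flat, mean_spiky, round(a1c, 1))
```

Both patients would show an A1c near 6.5%, yet the second one spends three hours a day at 280 mg/dl, a level well into the range that damages organs and, plausibly, feeds tumors.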

This was demonstrated very clearly in the Kumamoto study, where patients with the same 7% A1c as the patients in the UKPDS study had far fewer diabetic complications than their UKPDS peers because they used fast-acting insulin to cap their post-meal blood sugar excursions at 180 mg/dl. In the UKPDS, post-meal sugars were not monitored, and though the patients achieved the same 7% A1c as the Kumamoto study participants, they did so with much higher post-meal numbers. (Citations to these studies can be found HERE.)

Because of this, it is not possible to adequately adjust for blood sugar status simply by adjusting for A1c. This means it is impossible to rule out the possibility that the metformin group in the cancer study may have been prescribed metformin because their initially higher blood sugars impelled doctors to give them more aggressive treatment, which meant that, long term, they had less exposure to very high blood sugars and hence less glucose feeding their baby cancers.

It is also possible the people in the no-metformin group were sicker. Doctors won't prescribe metformin to people who show evidence of significant kidney damage or liver disease. In that case, their higher rate of cancer might be related to their other health problems. Or the kidney damage that made metformin inappropriate might have been caused by years of undiagnosed diabetes and hence much more opportunity to feed glucose to baby tumors.

But we should not dismiss the possibility that metformin really does exert a protective effect.

Whatever the answer, this study adds one more bit of data supporting the idea that, if you can take it, metformin is the very best diabetes drug to take.

Another piece of research that also supports the use of metformin is the finding, also reported this month, that liver fat--not, as we have been told for years, visceral fat--appears to be the culprit in the metabolic disorders associated with obesity.

Intrahepatic fat, not visceral fat, is linked with metabolic complications of obesity. Elisa Fabbrini et al. PNAS Published online before print August 24, 2009, doi: 10.1073/pnas.0904944106

The abstract of the above study is pretty tough going. Fortunately, you can read a clearer explanation of what this study found in this report in Diabetes in Control:

Diabetes in Control:Liver Fat Has Greater Impact on Health than Abdominal Fat

The lead researcher in that study was quoted as saying, "We have found that excess fat in the liver, not visceral fat, is a key marker of metabolic dysfunction." The metabolic dysfunction in question was insulin resistance and the secretion rate of very low density lipoprotein (VLDL) triglycerides--the lipoprotein fraction most associated with clogged arteries.

Why, you might ask, am I linking this second study with metformin? Because there is some interesting animal research that suggests that metformin may reduce liver fat. This research has not yet been repeated in humans, though it has been found that metformin improves liver enzymes.

One recent six month long Norwegian study failed to find evidence that metformin reduced liver fat. However, I can't help but wonder if the problem in repeating the animal result might have something to do with the stress of the very high carbohydrate diets eaten by the human subjects here. We know the medical establishment still believes, in the face of a lot of data, that a low fat diet will lower body fat levels--even though we know to the contrary that high carbohydrate intakes raise triglycerides, the form in which fat circulates in the blood. Metformin lowers triglycerides, and did so in this study, but it is possible it can't lower them enough in the face of an onslaught of 300 grams a day of carbohydrate.

This makes me wonder whether combining metformin with a low carb diet in a manner that produces a normal level of triglycerides--one well under 150 mg/dl--might produce the same effect in humans as has been seen in animal studies where metformin did reduce the liver fat burden.

It's worth a try, especially as the Norwegian study found nothing negative about giving metformin to people with high levels of liver fat. It lowered their triglycerides, blood sugar, and body fat significantly, and there is no evidence that the metformin made the condition worse.

If you decide to try metformin, take it in its cheap, generic form. Many pharmacies will sell it to you for $4-8 a month. Don't take your metformin mixed into a $4-a-day combination pill that also includes a newer drug whose safety profile is either a) already known to be dangerous (Actos or Avandia) or b) unknown but already showing areas of concern (Januvia or Onglyza).

The extended release form (Metformin ER or XR) is easier on the digestive tract.

If you have been diagnosed with Non-alcoholic Fatty Liver Disease (NAFLD) and have been taking metformin in association with a lowered carbohydrate intake, let me hear from you about your experience.

 

September 1, 2009

New JAMA Study Adds To Picture of How Researchers Skew Results to Get Favorable Outcomes

Following hot on the heels of my previous post about researcher malfeasance comes a new study just published in JAMA which illuminates the persistence of the discredited technique whereby researchers redefine study endpoints to come up with results that satisfy the research's sponsors.

The study is Comparison of Registered and Published Primary Outcomes in Randomized Controlled Trials. Sylvain Mathieu et al. JAMA. 2009;302(9):977-984.

Its goal was to see whether requiring researchers to register their study designs publicly made a significant difference in how the results of the research were reported.

The reason for requiring registration was the tendency of drug company researchers to change the "endpoints" they reported on if the originally selected endpoints did not come up with data favorable to their company's drugs.

To give you an example of how this worked: a drug company would commission a big expensive study to see if Drug A prevented heart attacks. The results four years later would show that Drug A had no impact at all on heart attacks. At this point the researchers would hunt through their data until they found evidence that Drug A decreased the incidence of hangnails in female study participants aged 45-50. The final publication of the study would be a research paper titled "Drug A Effective Against Hangnails." The more important finding--that the drug was useless for the purpose the company had tested it for--would never be published.

To counter this abuse, it was proposed to require that researchers register their studies when they began, making it clear what their primary endpoints were supposed to be. In our example, the primary endpoint would be heart attack reduction.
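A little arithmetic shows why registration matters. If a failed trial's data are mined for k unrelated secondary endpoints, each tested at the conventional p < 0.05 threshold, the chance of turning up at least one "significant" result by luck alone grows quickly. This is the standard multiple-comparisons calculation, under the simplifying assumption that the endpoints are independent:

```python
# Probability of at least one false-positive "finding" when k independent
# endpoints are each tested at significance level alpha = 0.05.
alpha = 0.05
for k in (1, 5, 10, 20):
    p_false_hit = 1 - (1 - alpha) ** k
    print(k, round(p_false_hit, 2))
# With 20 endpoints to fish through, the odds of a chance "result" are ~64%.
```

In other words, a sponsor willing to hunt through twenty secondary endpoints is more likely than not to find something publishable, even for a drug that does nothing at all.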

This latest study finds that only 45.5% of the studies published after the supposed adoption by the mainstream research community of the requirement to register studies were adequately registered.

Of those, 31% "showed some evidence of discrepancies between the outcomes registered and the outcomes published." In addition, "The influence of these discrepancies could be assessed in only half of them and in these statistically significant results were favored in 82.6% (19 of 23)."

The researchers in this study conclude, "Comparison of the primary outcomes of RCTs [random controlled trials] registered with their subsequent publication indicated that selective outcome reporting is prevalent."

In short, researchers are ignoring the registration requirement and journals continue to publish studies that ignore or obfuscate the study's original primary endpoints.

And, even more importantly, exactly how badly the results are being massaged and tweaked is impossible to tell from the way these studies are published.

When will peer reviewers start getting serious about reviewing these studies and demanding that researchers live up to the ethical standards that are, supposedly, shared by the research community--for example, by divulging to peer reviewers the original reason the study was undertaken and publishing the results, whatever they might be?