
Live slow, die old: Mounting evidence for caloric restriction in humans


Let’s say you wanted to hear a first-hand story of ordinary life 100 years ago. Where might you go to find a storyteller? Maybe you want to hear as many stories as possible. Where might you go then?

Undoubtedly you would choose the subtropical Japanese island of Okinawa, where you would find centenarians at three times the rate of already long-lived mainland Japan. Once an independent kingdom, Okinawa is a proud maritime culture, home to the endangered Uchinaaguchi dialect, a sizeable and little-loved American military presence dating back to World War II, and the largest natural experiment in human longevity on record.

Some elderly Okinawans might attribute their longevity to habushu, a traditional rice liqueur with a pit viper coiled at the bottom of the bottle. The habu snake does in fact possess the curious ability to survive long spans of time, up to a year, without food, and perhaps this is where Okinawans took their dietary cues. In 1972, the Japan National Nutrition Survey found that they consumed 17% less energy than the Japanese average, much of it from plant sources like the Japanese sweet potato, their staple food, in contrast with the rice-heavy cuisine of the mainland. Partly as a result of this diet, Okinawans died of heart disease, cancer, and cerebrovascular disease at rates 60-70% of the Japanese average, and lived on average a year and a half longer.

Okinawans were far from the first to benefit from such a lifestyle, though. Anecdata on the lifespan and health benefits of restricting caloric intake stretch back to Luigi Cornaro’s 16th-century Discorsi della vita sobria. This four-part guide to living long and well detailed his own regimen of 350g of food per day, plus 400mL of wine, a habit he credited with his 102 years of life. But prior to the scientific era, all sorts of odd practices were sworn by as preventative treatments for aging, and there was little to distinguish Cornaro’s account from the dubious claims that abounded on the topic.

Animal models offer support for longevity

Then in the 1930s, researchers at Cornell’s Animal Nutrition Laboratory were able to increase rat lifespans by up to 50% by restricting the animals’ caloric intake, and in the following decades it became clear that the method also worked in flies, worms, dogs, and other short-lived animals. Data on long-lived animals were sparse, though, and there was disagreement as to whether the pathways through which caloric restriction worked were conserved across all species. Primate studies were especially important: since clinical trials in humans would be nearly impossible to conduct, given the length of our lives and the infeasibility of enforcing dietary constraints over that span, primate results would likely either confirm caloric restriction’s promise in humans or be its death knell.

So the scientific community went from excitement in 2009, when the University of Wisconsin (UW) published positive results for CR’s longevity benefits in rhesus monkeys, to confusion in 2012, when the National Institute on Aging (NIA) published findings contradicting those results. As is often the case, the devil was in the details: methodology varied substantially between the two studies, from the ages of the monkeys used, to the types of food in their diets, to the amount of food the control animals ate. NIA monkeys were fed a diet rich in whole foods, while UW monkeys’ diets contained more refined sugar and fat, and the gap in caloric intake between the NIA’s test and control groups was noticeably smaller than UW’s, meaning that all of the NIA monkeys ate a somewhat restricted diet.

Even more telling was the age difference between the test groups. The NIA began its trial with juvenile and elderly monkeys, while the UW monkeys were all adults. Previous animal studies had indicated that the later the intervention occurred, the less improvement was seen, with the best results in rodents achieved when CR began immediately after weaning, so it was unsurprising that the NIA’s elderly group failed to show substantial benefit. Unlike in rats and mice, however, youth didn’t predict better outcomes: young CR monkeys in the NIA study actually died at a slightly higher rate than controls. Given that human development more closely resembles that of other primates than that of rodents, this likely indicates that we too would be better served by beginning CR only in adulthood.

In all, the UW study that had originally reported positively on CR in adult monkeys seemed to have the advantage. The research community was tentatively optimistic once again, but for clinicians, evidence from animal models alone would never be enough to wholeheartedly recommend an intervention.

Biomarkers present a shortcut

By the early 2000s, a wide array of data on caloric restriction in humans was available, much of it from observational studies of self-selected individuals who practiced CR as a lifestyle. There was also growing interest in using biomarkers to predict diseases of aging and longevity, particularly in overweight populations. Early biomarker studies of CR in humans were almost universally small and short, but in 2015, Duke’s CALERIE trial reported results from a large, 2-year randomized controlled study tracking measures associated with diseases of aging and longevity. The trial focused on metabolic activity and inflammation, and measures in both categories improved, including a reduction in C-reactive protein (a marker of inflammation), better glucose control, and lower blood pressure. However, not all markers showed a clear and consistent response to CR treatment.

The apparently incomplete response may have been due to the study’s failure to meet its goal of reducing caloric intake to 25% below what subjects would normally have eaten; participants averaged only a 12% reduction over the 2 years. This was despite their receiving help from a “very intensive behavioral intervention” for the duration of the study, highlighting one of the most daunting challenges of translating CR from laboratory models to humans: compliance.

Convenient possibilities

It would seem that few are eager to sacrifice a quarter of their diet, even for the promise of longer life. Dr. Roy Walford and Brian M. Delaney cofounded the Caloric Restriction Society in 1994 to advocate the practice for humans, and it has collected roughly 7,000 members to date who are committed to maintaining a 20-30% calorie reduction over their entire lives. Before his death in 2004, Walford proved himself a dedicated champion of CR: he wrote numerous books and articles, made television appearances, and, in a risky gambit during his tenure as a crewmember of Biosphere 2, persuaded the rest of the crew to adopt a meager near-starvation diet. But despite the campaigning of its devotees, dietary restriction for longevity has scarcely built momentum in the general public.

Instead, many scientists have turned their attention toward CR mimetics: small-molecule drugs that target pathways similar to those affected by CR. These agents would ideally provide the benefits of CR without the side effects sometimes seen in long-term practitioners, such as loss of libido and persistent hunger. Numerous candidates acting on metabolic mechanisms have been proposed, including the antioxidant resveratrol, famously present in wine, but few have lived up to the hype surrounding CR mimetics. One of the more promising medications is metformin, a diabetes drug currently in clinical trials for preventing multiple diseases of aging.

How could a diabetes drug prevent cancer?

Long-term caloric restriction appears to produce a variety of metabolic changes, making the exact source of its benefits difficult to tease apart. However, insulin signaling surely plays an important role, both directly, in the case of an age-linked metabolic disease like type II diabetes, and indirectly, by affecting the risk of developing diseases like cancer. Metformin improves insulin sensitivity and decreases the activity of insulin-like growth factor 1 (IGF-1), a protein that promotes growth and can drive tumor formation. Scientists think that IGF-1 was useful in our ancestral environment, when food was harder to come by and people endured periods of famine interspersed with times of relative plenty; now that our cells no longer need to rush to grow and divide while energy conditions are favorable, high IGF-1 activity likely does more harm than good.

Faced with the modern metabolic curse of overabundant food, one growing contingent may be having their cake and eating it too: intermittent fasters are becoming more prevalent among the health-conscious, many finding the regimen easier to adhere to than stringent limits on overall intake. Intermittent fasting may involve reducing caloric intake, but more often it simply compacts an individual’s normal intake into feeding periods bracketed by fasts, with schedules ranging from a single 24-hour fast each week to a daily 16-hour fasting window. This could simulate an energy balance more similar to ancestral conditions, and it seems to improve insulin function and glucose regulation. Though it may not produce the full spectrum of benefits that proponents of true CR lay claim to, its combination of health benefits and relative convenience could make it a formidable contender as the next widely adopted longevity practice.

Tegan McCaslin

Tegan is Geroscience's lead editor, and writes on a variety of topics (mainly science, medicine, and humans) here and elsewhere on the web.