I can often appear a cocky git, especially when discussing nutritional controversies; however, I try to entertain doubts as vehemently as I herald certainties. To keep myself honest, I must air these doubts, these areas where I cannot proclaim sure knowledge, whether through a fault in my own wisdom, a personal lack of understanding, or a true ambiguity in the field.
So I present here my current “Unsurance Policy” against Dunning-Kruger Effectitis:
In metabolically enlightened circles, all the coolest kids finger insulin resistance (IR) as the root of the so-called diseases of civilisation. I have become increasingly convinced that we find not IR, per se, but hyperinsulinaemia itself, at the root of the problem. Don’t blame the body for drawing up the drawbridges after a flood of insulin deluges each cell. Blame the deluge.
Whilst hyperglycaemia causes substantial harm, excess insulin causes many problems all on its own: the inflammatory responses, the anabolic state that encourages potentially mutagenic cell proliferation, as well as the general kick in the nuts to endocrine homoeostasis.
But I still don’t enjoy a complete mechanistic understanding of exactly how having so much insulin in the body lies precisely at the feet of so many maladies. I would love someone to write a pop-sci book, using helpful analogies, beautiful illustrations and, if necessary, scratch-and-sniff stickers, that explains how excessive insulin really plays its part as the bad actor. At a cellular molecular level, what naughtiness lurks? And if, in fact, insulin lies upstream from that which actually does the in-situ damage, then I want to know all about it.
The pendulum of opinion about protein swings predictably – even tediously. A few years ago, we found Jimmy Moore suggesting that chomping down on protein had little to differentiate it from gorging on chocolate cake! More seriously, researchers like Dr Ron Rosedale started reading the Rosetta Stone of the ancient mTOR pathway and not liking the tale that emerged: excess protein, it suggested, encourages life’s candle to burn more quickly – the soma detects structural superfluity and energy abundance, so nature suggests one just breed and then bugger off. The way to longevity, then, requires us to eschew the whey.
While the sugary Carbaliers battled the fatty Roundheads, they both batted about the “third” macronutrient, protein, as a pawn in the primary battle.
And, yes, the pendulum has swung again. Low Carb doctors like Ted Naiman have adopted a kind of post-modern CICO (however much they protest that accusation). They proclaim that eating more protein and stopping your fat gluttony will help you lose weight and become generally healthier. mTOR? Meh-tor, more like.
Beyond the swinging pendulum lie perennial questions, never conclusively answered, about how protein does or does not interfere with ketosis, and whether this matters, and how it does, or does not, promote additional gluconeogenesis to replace that ketotic deficit. For every person who claims additional protein as a silver bullet to satiety, another retorts that a protein glut shoots one in the fat-adapted foot, leading to ravenous annoyance.
Honestly, I don’t really know what to believe about this one. As a lover of biltong, I’d welcome it if ad-libitum protein had the blessings of my favourite gurus. But anyone who’s attended one of Rosedale’s talks emerges certain that, at the very least, protein has a case to answer. Perhaps a powerful one.
Does anyone have any convincing data here, one way or the other? Let me know!
LCHF works. We have the data. Anyone who argues against it now as an effective strategy against hyperinsulinaemia thus makes an extraordinary claim, for which I demand extraordinary evidence to repudiate such an obviously successful methodology. Of course, because the body relies largely on insulin signalling to process digestible carbohydrates, and because science agrees that those carbs have no essential role for humans, limiting their ingestion makes logical sense.
We bolster that logic further when we realise our species went through a number of evolutionary funnels which demanded we thrive without any carbs.
Nevertheless, we do seem to find a few populations that survive, and perhaps even do well, on a higher carb diet. The Blessed Kitavans among them, perhaps. Clearly, they have many factors at play. Modern civilisation provides many more wonky paving-stones we can trip over than just a glut of exogenous glucose: air pollution, cocked-up circadian rhythms, even the stressful alienation from our evolved mores of socialisation. And, of course, the dreaded seed oils.
So do the happy-yet-carby civilisations thrive because of the carbs, or despite them? Have they inadvertently tipped the scales of health so profoundly that even a pile of insulin-spiking carbs can’t quite tip it askew the other way? Or do their diets actively promote their healthful normo-insulinaemia?
A few years ago, Denise Minger wrote a thought-provoking post about what she termed Carbosis. If the magical realm of ketosis happens on a high-fat, low carb diet, then Carbosis reflects its mirror image on a low-fat, high carb diet. Whilst I disagree with some of her conclusions (it seems, for example, we might, fairly, call Kempner a bit of a crook), I still find much to intrigue me in her argument.
I do believe that hyperinsulinaemia provides the key to unlocking much disease and disorder; so perhaps we can find not one, but two ways, to avoid turning that key:
1) LCHF. Eat food that doesn’t require much insulin intervention at all: namely, high fat food. Avoid glycolytic foods that spike insulin in this regimen, because high fat foods reduce the cellular response to insulin (what some have termed “glucose sparing adaptation” or “benign physiological insulin resistance”). As such, any exogenous glucose causes a hyperglycaemic emergency in this context.
2) HCLF: Eat food that forces your body to use insulin efficiently: namely, high carb foods. Avoid foods that induce glucose-sparing adaptation in this regimen (namely fats), to ensure that every cell listens carefully to insulin’s signal. The body needs to get the message that you provide glucose alone as its exogenous fuel. No mixed messages!
Both regimens, it seems, would result in a relatively low, and pulsatile, insulin pattern. I know which one I prefer, but perhaps a highly-restrictive HCLF doesn’t simply tell a just-so story, but has some bearing on reality. If it does, then it still leaves unanswered, for me, some of the other problems one experiences when relying entirely on glycolysis: the AGE damage, the satiety deficit, the potential lack of ketogenic benefits. But we need convincing answers to these carby black cygnets that paddle about on our otherwise calm LCHF pond.
Yes, the simple Ancel Keys lipid hypothesis has died. Only anti-science organisations like the American Heart Association, or the British Dietetic Association, still try to animate the rotting corpse.
The wonderful work by Ivor Cummins reveals how only a profoundly stupid – or myopic – person could rely on LDL-C or even LDL-P as a reliable causal marker of anything in particular. That said, I retain some tinges of concern. At some threshold, can the LDL particles themselves cause mischief all on their own, however much you’ve got the insulin beast under control? If so, who should worry about this? And how must they respond?
Since most data show that diets high in saturated fats do generally promote LDL traffic, as they should (see Dave Feldman’s amazing experiments), should a few of us revert to a mild Keysian paranoia, and keep a little remnant of the lipid hypothesis close to our ApoE4-cursed hearts? Do scaremongers like Gundry have a modicum of horrible validity when they preach at some of us to down olive oil and little else? I hope not, because we’ve worked so hard to awaken ourselves from last century’s lipophobic nightmare that it’d annoy me to have to revivify a stubborn rump of that bad dream. And yet. And yet…
Statins: the mycotoxic medication everyone likes to mull over. People perceive them as either an angelic saviour of all our arteries, or a nearly-useless, expensive paean to a failed hypothesis, bringing debilitation of muscle and mind, and slyly inducing diabetes.
I tend more to the “expensive, useless and mildly debilitating” wing, but I cannot conclude that statins have no use in anybody. Of course, I don’t believe in their putative mechanism: lowering LDL to assuage the gods of the Lipid Hypothesis. So I have only two further questions:
- Do statins provide any actual pleiotropic benefits at all?
- Why do they not exhibit even more harm than they appear to do?
A Strong Statin-Apologist, of course, assumes the pills have a near miraculous effect on primary and secondary cohorts, thanks to their LDL-lowering mechanism, and waves away tales of debilitating side-effects as hysterical anecdotes.
The typical Weak Statin-Apologist, largely free from lipid-hypothesis bias, nevertheless concedes some mild pleiotropic benefit. The weak-apologist usually concocts some story about reducing inflammation, limiting some aggravating protein prenylation by the spanner it throws in the mevalonate pathway, or an effect the statin has on blood-clotting factors. The apologist acknowledges side-effects, but accedes to statins’ utility, nevertheless, in high-risk cohorts.
I wonder whether even a Weak Statin-Apologist bends over backwards in too much fealty to utterly corrupt data. The only studies showing that statins have any effect at all, in any population, come from pharmaceutical sources, and their well-funded, well-briefed researchers. We know that pharmaceutical companies and their pets have substantially cherry-picked and manipulated which data we get to see and verify. Sources less prone to industry bias, surprise surprise, show statins working no better than placebo, for pretty much anyone.
Interestingly, whatever source you use, statins seem to have little to no effect on all-cause mortality. So even their pleiotropic benefits, if any actually survive scrutiny of the compromised data, may have some debilitating counterbalance.
Even if we assume statins have no substantive benefit at all, something still puzzles me. Having studied the mevalonate pathway, and what statins do to stymie it, I do not understand why statins do not have an unambiguous and uncompromisingly catastrophic effect on us. Perhaps their dosage screws up a sufficiently small proportion of HMG-CoA reductase that enough unaffected cells get to go about their business unmolested?
I’d love someone to explain why statins don’t mess us up even worse than they appear to. Or do they, in fact, mess us up as badly as their spanner-in-the-works mechanism predicts, and will the dam of legal ramifications soon break, washing us all in a horrible torrent of truth?
We know that nobody enjoys suffering from anaemia. But those with high blood iron have problems too. Researchers have long associated high blood iron with mortality, probably via increased oxidative damage. Iron-filled folk basically become rust-buckets. But I have some questions. As with LDL, what context lies behind the epidemiology? If you have low insulin, and low systemic inflammation, does having a high ferritin count matter that much? And, in any case, should we necessarily heed high ferritin itself as the marker of concern? After all, ferritin does not measure free iron.
And if we find we have high iron, does this provide us with an annoyingly legitimate reason suddenly to find red meat scary all over again? And, if someone does have a high ferritin count, how frequently should they donate blood? Lots of people have lots of opinions. But no research provides clear-cut answers here.
Nobody but the most rabid vegan jihadi now denies that meat played a crucial role in our evolution; but researchers still debate about how often, and to what degree, we acted as exclusive carnivores. This matters, because evolutionary science tells us important stories about the viability of modern eating strategies – or otherwise; and whether modern carnivores recapitulate a grand tradition of our species, or find themselves treading utterly novel ground. These nuances matter. So long as our history maintains even the smallest green sprig in its continuous narrative, veggiementalists can argue for that sprig’s essentiality. But should that sprig wither and disappear for any meaningful duration of our species’ successful propagation, then that last gasp of obligate omnivory disappears.
People like Nora Gedgaudas point to the isotopic evidence that suggests we often enjoyed ancestral carnivory beyond that of foxes and wolves. Others, like Wrangham, have decided we controlled fire a million or more years before most think we did, and that we munched on tubers with alacrity. I don’t find his arguments particularly convincing – it seems more of a desperate retcon than proper science; but still, we must at least entertain any plausible hypothesis until we can falsify it conclusively.
Whether you believe we lived as tater-heads much of the time or not, I note that too many people ignore evolutionary bottlenecks when discussing our nutritional adaptation. Even if you’ve convinced yourself that our steady-state included tubers, fruit, honey or whatever else, we have substantial evidence that our species passed through a number of evolutionary funnels between any steady states. Some large-scale disasters brought our numbers down to mere thousands, and we had to endure pretty harsh, necessarily carnivorous, conditions as we replenished our populations. These conditions quickly and radically tested our metabolic mettle. Whether the Toba catastrophe itself happened or not, we know that our species faced similar events and survived to tell the tale. And the more icy and dusty of these catastrophes filtered out any individual unable to thrive and reproduce well on a meat-only diet, passing on the hardy, meaty genes we enjoy today.
Even with all this fleshy reasoning, I still muse on the gemüse: does a side-salad give us an ancient edge, or perhaps even a novel one? No argument has convinced me yet, but perhaps you have one.
Halophobia has at least as murky a history as the fear of saturated fats. Indeed, Gary Taubes wrote one of his earliest nutritive investigations into debunking the pseudo-science lurking behind the fear of salt.
I came from a household that held to that fear of salt, whilst nevertheless succumbing to its savoury siren call. The salt shaker might as well have contained strychnine for all the hysteria that attended its use. As with many childhood imprints, I cannot quite reach for the Maldon flakes without a pinch of guilty doubt, even today.
So when James DiNicolantonio wrote The Salt Fix, which purported to blast out of the briny water once and for all any suspicion against our favourite edible rock, you would have expected nothing but cathartic elation from me. Sadly, my personal experience in conversing with Dr DiNicolantonio meant that I had to take his work, if you’ll forgive me, with a pinch of salt. I had caught him making obviously exaggerated or patently false statements too often for me to take blithely any theory he proposed thereafter (the last straw came in a lively discussion I had with him on Twitter about pigs in Okinawa, if you must ask. Dear reader, he blocked me). So, ironically, his paean to salt made me want to question the tasty crystals again.
I love salty and sour flavours. Even when I used to mainline carbs, salt and vinegar crisps lay as my drug of choice rather more than anything particularly sweet. And my palate adjusts itself to increasing amounts of salt with remarkable ease. And I have seen enough data from Taubes and others to reassure me that salt’s bad rep lies largely with its hyperinsulinaemia-inducing fellow travellers, whatever suspicions I may have with DiNicolantonio’s rigour.
That said, the ongoing debates about how much salt we really evolved to expect to obtain leave me in a tizzy. On the one hand, folk like DiNicolantonio suggest we basically fooded our salt, rather than the other way round; and the more abstemious fathers of Paleo suggested only the luckiest of our ancestors managed to ingest a grain or two in a week’s foraging. Does it matter if either proves finally true? If we have functioning kidneys, does excess salt do anything more egregious than get urinated out? I hope not. But I don’t quite know.
You can burn sugar. Or you can burn fat. More accurately, you can convert either into ATP. (You can also convert some scrap material, like lactate, into ATP.) When you burn fat, your liver turns some of the partial combustion products into ketones, from which cells can extract yet further energy and which, unlike triglycerides, can traverse the blood-brain barrier.
So far so neat.
But in the last several years, some (including my brother, in his AHS 2017 talk) have suggested that certain well fat-adapted populations nevertheless show little sign of ketogenesis. Some Inuit people with specific genetic markers lie as the inevitable darlings of this puzzling hypothesis.
What does it all mean? Who knows. Nobody. Whatever they’d like to suggest. And some do suggest they know rather more than they can justify, like Tom Naughton and Richard Nikoley, who use sketchy facts about the Inuit somehow to justify their tater addiction. Others, like Petro Dobromylskyj, who writes the complex but astonishing Hyperlipid blog, muse more thoughtfully on what Inuit adaptations might imply. He concluded his piece about the Inuit and their ketogenic status thus:
Ultimately, point scoring on the internet about what the Inuit did or didn’t eat shouldn’t destroy people’s chances of health. Destroying a circular argument about Inuit diets may make the destructor feel good. Destroying the feet, eyes and kidneys of a person with type 2 diabetes, who needs a ketogenic diet, as a spin-off from that victory must be difficult to live with. I don’t know how anyone can do this.
Beautifully put. But I still wonder: what does it mean to burn fat without creating ketone bodies in the liver? Does that actually even happen, or did the folk who purported to measure this in the Inuit so long ago simply not then have tools sophisticated enough to measure the ketones that actually did lurk in this population?
This whole ambiguous area represents another controversy where having a strong opinion about it marks you out as a zealot of some sort, I think. And it will do until we have better data, and better interpretations of that data.
Lots of animals can produce ketones and use them for fuel. And, controversy about Inuit aside, humans can too. For most animals, ketogenesis only really begins when an animal starves, or has had insufficient protein in its diet, or both. It clearly acts as an emergency fuelling system, which the liver switches on in times of need.
That animals – even carnivorous mammals – don’t spend much time in ketosis comes as a surprise to some people. Your cat converts much of its protein into glucose, which it then burns. In a weird kind of inverted irony, a cow gets much of its carbs (as grass or grains) converted to fat! When a cat or a cow doesn’t get enough to eat then, eventually, ketosis kicks in. Farmers don’t like their cows to exhibit ketosis, because it suggests they haven’t fed them properly.
Humans do things differently, though. We can produce ketones at the drop of a hat compared with other animals. Not only do we manufacture them when starving, but we can produce them when we have eaten enough protein and enough calories, when we cut the carbs. Admittedly, if we eat loads of protein, we do not produce as many ketone bodies as when we limit the protein; however, compared with other animals, we produce orders of magnitude more even then.
Let’s stop and think about the implications of this for a moment. A species with a ludicrously energetically expensive, big brain has evolved to extract as much brain-traversing energy from dietary fat as possible, except when exogenous glucose runs unusually high. I think that tells us a lot about us, and our evolved expectations. Indeed, I think it suggests that we have a superpower.
But does “replete ketosis” represent a unique superpower, or can you find another species that does it just the way we do? I know researchers like L Amber O’Hearn want to know. Perhaps you can help?
For many of us, learning about fructose’s naughtiness acted as a gateway to general nutritive enlightenment. Along with Taubes’s “Good Calories, Bad Calories”, Robert Lustig’s YouTube video, “Sugar, The Bitter Truth”, led to a collapse of a lot of tired assumptions and stale narratives. I remember watching Lustig’s video and thinking “I can no longer eat sugar without thinking of it as a kind of poison”. And, indeed, Lustig makes the case for fructose in particular as alcohol without the buzz. Just as liver-toxic. Just as addictive. Just as deadly when abused.
Even back then, though, I had a feeling Lustig overstated his case. I also think that in focussing on fructose, he understates the general problem with excessive glucose. He also has to invent fibre fairy-stories to justify the eating of high-fructose fruit. Then, and now, advising people not to eat that much fruit gets you branded an unredeemable heretic. But, these objections aside, I took as read that we should avoid fructose as a metabolic plague.
But as I began to understand what happens when you burn fat as your primary fuel, and when you produce ketones, I also began to wonder again about fructose. Lustig agreed that fructose itself does not provoke insulin production, but can make one “insulin resistant”. For someone who lives on carbs, this indeed suggests a substantial problem. When your body needs to sponge up all the excess glucose, having fructose to frustrate the sponge can cause metabolic mayhem.
But what if you don’t rely on a glucose sponge? And what if, because you don’t gorge on carbs, your liver doesn’t overflow with glycogen? As a first port of call, fructose fills up the liver’s glycogen tanks – no bad thing. Only then does it do most of the other stuff we’d prefer it not do, like get converted to liver-clogging fat. And even if some of it does convert to liver fat, why should a habitual fat burner worry that much about a little extra fuel on the fire? Unlike hyperinsulinaemic folk always in storage mode, we should burn it off pretty quickly. Perhaps just a quick run or a day’s intermittent fasting should do it.
I know that fructose has some other undesirable metabolic by-products, and it shouldn’t suddenly become a keto staple; but could an occasional bolus of non-insulinaemic fructose for a keto-adapted, relatively active person actually prove more benign than the equivalent pile of insulin-spiking glucose? I really don’t know for sure, but find the thought intriguing! Perhaps Dave Feldman could try this in one of his experiments.
Forget the teacup pug: a gut bacterium of your very own represents the must-have pet of the season. Or several billion of them. The focus on gut flora has led some to believe, in a neo-classical revival, that the gut lies at the root of all maladies. If you don’t carefully – perhaps even neurotically – tend your biome, your “old friends” will wither and die, your intestines will rot, become cancerous and end up voting ‘yes’ in an ill-judged referendum.
Some suggest piling your gut full of rotting or rotted material to give your old friends a constant snack. Others have suggested a forced-migration policy of introducing yet more old friends into the kingdom. Either through the front door or the back.
Those backed into a corner by vegetables’ increasingly apparent uselessness have found refuge in postulating the most useless thing in vegetables – the stuff you cannot digest and which carries no nutritive value – as their most valuable component. Fibre ftw!
So, as someone who eats lower carb, does one need to go out of one’s way to get extra fibre to avoid a Holodomor of one’s old friends? Or does the need for fibre really stem from the observation that when people eat nothing (which, essentially, describes fibre) rather than glycocarbs, their health improves? What if, instead, we cut out the glycocarbs almost entirely? When this gets tested, fibre’s magic disappears in a puff of smoke. Furthermore, the gut-energy benefits provided by fibre to those who never experience ketosis perhaps represent but an ersatz version of what your gut would enjoy in ketosis.
Moreover, we know that fibre can actually cause substantial problems for those with adhesions, Crohn’s disease, IBS and other maladies of the gut. A relative’s doctor put her on a low-residue diet precisely because of this. She continues to enjoy rude health.
If “gut health” lies as the sine qua non of general health, and “gut health” requires a specifically tended zoology, well supplemented by plenty of fibrous and fermented fertiliser, whence the health of zero-carb adherents? Certainly, they get some “animal fibre”, but nowhere near the amount conventionally thought necessary to run the zoo.
Hucksters and shysters (both figurative and literal) plague the area of gut health and the biome. As ever, if anyone with a strong opinion about it sidles up to you, watch your wallet.
I have at hand my go-to aphorism for those tizzying about their gut biome:
We define a generally healthy gut biome as any biome we find in the gut of a generally healthy person.
That said, like the colon, I find this area dark and mysterious, with many strange kinks and crannies. Some research on germ-free mice truly stopped me in my tracks. Researchers bred and kept these mice to have no gut biome at all. So, how did they cope without their “oldest friends”? Apparently, they lived happier and longer lives!
“I eat green vegetables because my mother taught me to eat them, and I like them.” So said Gary Taubes. Behind the quip lies an interesting truth: despite their universal health halo, vegetables have a surprisingly sparse evidence-base to recommend them. I discussed this with some luminaries on a podcast.
Vegetables provide no unique vitamins or minerals that animal products cannot provide. Indeed, animal products usually provide them in higher concentrations and with better bio-availability. Some have thus postulated that special substances unique to vegetables may confer useful benefits to us. They have even presumptuously named these substances “phytonutrients”, so suffixed as if they have already proven their unique nutritious properties!
In reality, no in-vivo study has shown any proper benefit to humans of these phytochemicals, especially not in the concentrations made available to us through normal ingestion. In higher concentrations, some chemicals could have some medicinal benefit (in which case, as with all refined and concentrated plant extracts, we would name it “medicine”).
More often than not, though, such extraction and concentration to metabolically-significant levels reveals these phytochemicals as highly toxic. Plants do manufacture many of these substances, after all, as endogenous pesticides! Until we have at least some persuasive data beyond speculative test-tube ramblings, or epidemiological wishful thinking, we cannot allow the propaganda term “phytonutrient” to go unchallenged.
Some try to turn the deficit into a benefit, arguing that the toxins and carcinogens in plants provide benefits through bracing “hormesis”. Too often, though, the hormetic hypothesis has the patina of a desperate last Hail Mary, when even the fibre fable has failed.
We certainly know that one can live and thrive without ingesting any vegetables. We have plenty of examples of this both now and historically. The neurotic ingestion of leaves, in fact, began relatively recently. We do not know, however, whether those who can process vegetation without the problems that some clearly have, might gain in that ingestion a finessed buffing of their diet, or whether they basically waste their chewing.
I do eat vegetables. As with Taubes, I enjoy them. Indeed, their status as probable “junk food” has enhanced my enjoyment of them! I know not whether, in sum, they provide me with anything of use beyond aesthetics. I’d like to understand, beyond all the Appeal to Nature, Garden of Eden nonsense, their true, unsentimental value. Assuming they have one.
Finally, even if we allow, for sake of argument, that vegetables do provide some unique benefit to those who can tolerate them, I have a question that no dietitian has answered when I have asked it: assuming you accept vegetables as healthful or even necessary, what benefit does also ingesting fruit bring you to counterbalance its added sugar load? Answers, I have received none.
Gurus enjoy hectoring people to get more sleep almost as vehemently as they preach the mysterious religion of the gut biome. We all know that feeling crabby and annoyed often requires little more than a good night’s sleep. Some of us acknowledge we’d like a lie-in more often than we get one. But what about forcing oneself to get an arbitrary number of hours of sleep per night, or else?
No matter how hard I try, my body doesn’t seem to let me have more than six hours of sleep a night. If I attempt to go to bed earlier, my brain usually won’t allow me to go to sleep until later in any case; and if I do fall asleep earlier, I’ll just wake up that much earlier too. Thus, when I hear the great and the good go on about getting more sleep, as if one can bully oneself into greater periods of unconsciousness, I get a little puzzled.
We have low red and orange lights on after dark. We don’t go to bed ridiculously late. I don’t usually have problems falling asleep. Waking up on a cold winter’s morning doesn’t feel the horrendous wrench it might. So, what do I miss in not somehow cudgelling my body into an extra hour or two of sleep? And if I do miss something, how might I remedy this? I note that my more hyperinsulinaemic friends need more restorative sleep every night than I do. Perhaps my body gets everything it needs done in six hours, and I can chuck the guilt about not getting more shut-eye in my night-stand drawer?
To me, supplementation recommendations have now clashed and merged into nothing more than white noise. What, if any, supplements one should take seems almost impossible to discern rationally. One can argue that well-formulated LCHF diets require no supplementation. One nevertheless retains that niggling feeling that a little pill here or a powder there might optimise things.

In particular, I wonder about taking in additional vitamin K2. We usually obtain it from fermented soy beans (natto), but I doubt that represents its ancestral source. More likely, we’d have obtained it from organs and fermented dairy products. It supposedly helps vitamin D to sequester calcium properly into the bones, and not into the arteries. And what about vitamin D, now that I mention it? Should one supplement? Or do we merely cargo-cult the myriad other benefits from sun exposure when we down a D3 capsule?
Other minerals of interest include magnesium and potassium. I occasionally have a teaspoon of magnesium citrate in some water before bedtime. Do I do something useful thereby? Who knows!
We do know for sure that the usual recommended daily allowances of most vitamins and minerals take into account the needs of sugar-burners. For example, a sugar burner demands much, much more vitamin C than someone who burns fat. I still wonder, though, whether I should perhaps take a small amount of ascorbate every day. Linus Pauling certainly thought so. The reality seems somewhat more complex!
My brother took one of the 23andMe genetic tests, which found evidence of an MTHFR gene mutation. I have not taken such a test myself, but it would make my having this mutation relatively likely! That said, since 30 to 50% of people carry this mutation, one wonders whether pathologising it actually makes any sense at all!
It relates to reduced production of a methylation enzyme. Operatic sorts have proclaimed it a veritable death sentence, insisting that anyone “suffering” from it must take immediate action. I can’t react with quite that panic: for one thing, a mutation prevalent at that level in a population could not sustain itself if it so drastically affected health – specifically reproductive health. Clearly, it must confer some benefit to make up for any deficit. And perhaps its deficits largely make themselves known only in those who eat standard modern western diets? I cannot say that the alt-site recommendations to take exotic supplements and to “detox” fill me with confidence that anyone really knows how appropriately to react – if at all – to this hardly-rare genetic circumstance!
Should I purchase expensive B vitamins to “fix” me? Or should I continue as normal and ignore the latest hysteria?
This feels a bit like airing dirty laundry. Those who adopt a low carb, high fat diet sometimes have their fasting glucose levels creep up. This causes panic in some, who feel stung by a diet which promised to bring down blood glucose, yet appears to do something of the opposite.
The reality of this adaptation makes sense when you consider its likely root: as resting insulin levels drop, the glucose produced by gluconeogenesis circulates in slightly higher concentrations than one might initially expect. Make no mistake – I interpret any indication of a lack of hyperinsulinaemia as a good news story. People under this regimen who report slightly higher fasting glucose numbers nevertheless tend to report good HbA1c values. This suggests that the slightly higher blood glucose, in this context, doesn’t do the spiky damage we might fear in others.
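To see why a decent HbA1c reassures despite a creeping fasting number, one can convert HbA1c back into an estimated average glucose (eAG) using the linear regression from the ADAG study (Nathan et al., 2008), which the American Diabetes Association uses for its eAG calculator. A minimal sketch – the numbers below serve purely as a hypothetical illustration, not as anyone’s clinical data:

```python
# Estimated average glucose (eAG) from HbA1c, per the ADAG regression:
#   eAG (mg/dL) = 28.7 * HbA1c(%) - 46.7
# Purely illustrative; the example values are hypothetical.

def eag_mg_dl(hba1c_percent: float) -> float:
    """Convert HbA1c (%) to estimated average glucose in mg/dL."""
    return 28.7 * hba1c_percent - 46.7

def eag_mmol_l(hba1c_percent: float) -> float:
    """Same conversion, expressed in mmol/L (18 mg/dL per mmol/L)."""
    return eag_mg_dl(hba1c_percent) / 18.0

# A low-carber with a slightly elevated fasting glucose but an HbA1c
# of 5.0% still averages under 100 mg/dL over the red-cell lifespan,
# comfortably inside the non-diabetic range.
print(round(eag_mg_dl(5.0), 1))   # mg/dL
print(round(eag_mmol_l(5.0), 1))  # mmol/L
```

In other words, a mildly higher fasting reading paired with a low HbA1c implies the *average* exposure over weeks stays benign – which supports the interpretation above.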
Even with this perfectly rational explanation for the glucose creep, I must admit some worry. After all, glucose on its own glycates. Ok, perhaps not as much as in other contexts, as the HbA1c demonstrates, but still – does one want to tell one’s liver to chill out a bit with the gluconeogenesis? A bit of metformin, perhaps? Or just leave well enough alone? Again, one feels that if this represents our evolved state, then fine; but perhaps not fine, for evolution doesn’t really care if I die a few years earlier than I otherwise might, whereas I do!
I think the general advice not to consume sugar goes down surprisingly easily; as easy as, well, a spoonful of sugar. Whether vegan or keto, whether the BDA or the AHA, we can all sit around the campfire and sing sugar-free Kumbaya. Why? Because we enjoy feeling the frisson of ascetic virtue. Sugar tastes sweet, so we need punishment for enjoying it!
Loving kale rewards one with a feeling of virtue, because it tastes vile. Avoiding sugar rewards one with a feeling of virtue, because it tastes good.
When we get addicted to such virtue, we forget that Karma does not actually exist. The universe does not particularly care whether we suffer, or whether we bliss ourselves in hedonic relish.
We tacitly assume that any pleasure we get from anything that tastes sweet must, by definition, punish us for the crime of that pleasure; or, at the very least, demand a price. If you stop to think about this for a minute, you’ll realise it amounts to utter bunkum. Nobody has written that Karmic Contract for you and, just because some pleasure can become problematic in some contexts, you have no reason to generalise: pleasures, relieved of orthogonal side effects, may, shock horror, just exist as pleasures, not sirens calling one to any rocks!
Of course, because we do deem the sensation of sweetness far too suspiciously nice ever to come without negative consequence, we fight passionately against anyone who claims that we can, indeed, experience sweet flavour without needing to fear the gods’ wrathful judgement: artificial sweeteners, polyols and other non-nutritive options.
The usual reaction to any such suggestion entails conspiracy theory and dark ruminations on the metabolic and even moral ruin that such an attempt to cheat the fates might provoke. Saccharin and aspartame will kill you. Sucralose will rot your gut. Erythritol and xylitol’s names even sound evil! Of course, if you try to find any convincing data on the true perfidy of any of these, you will come away empty-handed. Plenty of online hysterics. A pile of crappy epidemiology suggesting that fat people, gasp, tend to try non-nutritive sweeteners and, therefore, non-nutritive sweeteners make you fat.
More serious speculation involves wondering about whether the body gets “tricked” by the non-calorific sweetness; or whether insulin nevertheless spikes into the void (presumably balanced by glucagon, lest we fall into a coma with every Diet Coke?). Most studies, especially of sweeteners like erythritol, show no metabolic activity, no insulin spike, no glucose spike – nothing. And yet still, we scowl. Too good! I wrote a piece about the polyols, and how to make a good custard from them, where I already discussed much of this.
More sophisticated, recent attempts to resurrect the stern gods have involved taste-buds in our guts, and a suggestion that consuming non-nutritive sweeteners spikes our insulin more the next time we consume glucose.
Well then, fine, but what if we make a habit of not having many “next times” in which to consume glucose. “But you’ll crave more sweet things and revert to sugary junk”. Well, ok, but what if I instead revert to erythritolly goodness? So what? Unless you introduce me to one of these finger-wagging gods, I don’t feel I have the obligation to believe in them!
All this sweet bravado aside, I still have a quiet voice that reminds me that eschewing too much sweetness has its aesthetic benefits: a less dead palate, for one. But sharpening hedonism has a marked difference from wishing to blunt it. I also have another, quieter, voice which says, despite the bluster of these paragraphs… what if the ascetics have it right after all, and sweetness sins inherently?
The ascetic mindset approves of exercise, so long as one doesn’t make the mistake of finding it too fun. As with “eat your veggies”, exercise represents an almost universal recommendation. Unlike “eat your veggies”, exercise does come with some compelling evidence. The body has an evolutionary expectation of a certain amount of physical exertion properly to manage its hormonal, muscular and cardio-vascular status. Even just keeping the lymph system moving properly suggests the necessity of a brisk walk or two a day.
Beyond the bland prescription, though, I hope for some more specific, carefully evidenced arguments. People claim to have such arguments. The paleo-bro contingent has long demonised what they term “chronic cardio” as either useless, or outright damaging. The bros’ recommendations, though, lie at odds with the notion that long distance running derives from our species’ persistence-hunting expertise, so interestingly studied by people like Dan Lieberman.
Clearly, exercise causes a stress response. It raises blood glucose. It damages, acutely. People bicker constantly about the degree to which this acute damage provides hormetic chronic benefits. They also cannot agree at what point too much becomes too much, and at what threshold we should consider too little as too little. How little, how much, how frequent, what rest, what movements: as with other recommendations discussed here, it all dissolves into white noise.
It gets worse. I find so dull the perennial arguments between those who believe Olympians could do fine on a few macadamia nuts, vs those who think that a 2km fun-run requires 2 kilos of sugary gel. Perhaps a top-flight sprinty sportsperson may indeed require some glucose to push to the front, whatever Phinney and Volek might say. That such a sportsperson might need the assistance of yet another dodgy external chemical to help them win their race would not shock me – and it certainly doesn’t interest me! My interest lies in strategies to promote healthful longevity, not in hacking a few freak specimens to achieve the ghoulish limit of their mechanical capabilities, at the probable cost of that healthful longevity.
I hedge my bets: running at medium pace, daily, with occasional sprints. Weekly weight-bearing gymnastics. And hoping for the best.
If you want to get into a firestorm, try to discuss the potential of ketogenic diets as a preventative strategy against cancer, or as an adjuvant to its treatment. Click to submit such a tweet, and before your finger has left the mouse, angry astroturfers and patients with Stockholm Syndrome will have already prepared the pyre and tied you to the stake.
Anyone who dares suggest that diet has anything to do with cancer, or might mediate its initiation, spread and metastasis, gets this treatment. Unless, of course, they suggest that evil meat could have something to do with it. In that case, tumour-shame away!
Despite the angry straw-man accusations, almost nobody suggests that a ketogenic diet provides guaranteed prevention against cancer; and certainly, nobody suggests one should ever rely on it as one’s only attempted cure! And, yes, much research lies in its early stages, so we cannot make anything but tentative, hopeful statements. People like Andrew Scarborough present powerful anecdotes, but we need more than anecdotes to counter pharmaceutical bulwarks.
Furthermore, anyone who states that cancer thrives on sugar, and therefore cutting back exogenous sugar marks the end of cancer’s story, commits a gross over-simplification of the tale. The body can produce plenty of glucose via gluconeogenesis, as well as other substrates upon which cancer might thrive.
That said, whether we believe in the conventional interpretation of cancer’s aetiology, or in a more Warburg-slanted metabolic story, only the most cretinous would suggest that keeping a patient (or someone who may well become a patient) in a hyperinsulinaemic state presents anything but rich manure in which cancer can sprout and bloom. Cancer cells and their damaged mitochondria absolutely do present opportunities for dietary differentiation. Only a pseudoscientist could possibly suggest otherwise.
However much one might fear a heart-attack or a stroke, or even Alzheimer’s, cancer remains the terrible health bogeyman of our time. Whatever Pollyanna tales the drug-company-funded shills in the cancer-charity industry might tell us, conventional therapy has a horribly dismal record here. This lack of progress provides fertile ground for nasty woo-merchants and hucksters. One hopes that properly understood applications of the ketogenic diet might legitimately rise above such accusations soon, in all but the most bigoted of minds. I await the time when I can uncross my fingers. Do let me know when!
So there ends my Unsurance Policy. For now. This in no way exhausts my metabolic uncertainties or nutritional knowledge-gaps! It does lay out enough of the threadbare tapestry of ambiguity that some clever folk can darn it for me.