This article is interesting: Is Sunscreen the New Margarine?
If there were one supplement that seemed sure to survive the rigorous tests, it was vitamin D. People with low levels of vitamin D in their blood have significantly higher rates of virtually every disease and disorder you can think of: cancer, diabetes, obesity, osteoporosis, heart attack, stroke, depression, cognitive impairment, autoimmune conditions, and more. The vitamin is required for calcium absorption and is thus essential for bone health, but as evidence mounted that lower levels of vitamin D were associated with so many diseases, health experts began suspecting that it was involved in many other biological processes as well.
And they believed that most of us weren’t getting enough of it. This made sense. Vitamin D is a hormone manufactured by the skin with the help of sunlight. It’s difficult to obtain in sufficient quantities through diet. When our ancestors lived outdoors in tropical regions and ran around half naked, this wasn’t a problem. We produced all the vitamin D we needed from the sun.
But today most of us have indoor jobs, and when we do go outside, we’ve been taught to protect ourselves from dangerous UV rays, which can cause skin cancer. Sunscreen also blocks our skin from making vitamin D, but that’s OK, says the American Academy of Dermatology, which takes a zero-tolerance stance on sun exposure: “You need to protect your skin from the sun every day, even when it’s cloudy,” it advises on its website. Better to slather on sunblock, we’ve all been told, and compensate with vitamin D pills.
Yet vitamin D supplementation has failed spectacularly in clinical trials. Five years ago, researchers were already warning that it showed zero benefit, and the evidence has only grown stronger. In November, one of the largest and most rigorous trials of the vitamin ever conducted—in which 25,871 participants received high doses for five years—found no impact on cancer, heart disease, or stroke.
How did we get it so wrong? How could people with low vitamin D levels clearly suffer higher rates of so many diseases and yet not be helped by supplementation?
As it turns out, a rogue band of researchers has had an explanation all along. And if they’re right, it means that once again we have been epically misled.
These rebels argue that what made the people with high vitamin D levels so healthy was not the vitamin itself. That was just a marker. Their vitamin D levels were high because they were getting plenty of exposure to the thing that was really responsible for their good health—that big orange ball shining down from above.
Am I willing to entertain the notion that current guidelines are inadvertently advocating a lifestyle that is killing us?
I am, because it’s happened before.
In the 1970s, as nutritionists began to see signs that people whose diets were high in saturated fat and cholesterol also had high rates of cardiovascular disease, they told us to avoid butter and choose margarine, which is made by bubbling hydrogen gas through vegetable oils, a process that solidifies the oils and creates trans fats.
From its inception in the mid-1800s, margarine had always been considered creepy, a freakish substitute for people who couldn't afford real butter. By the late 1800s, several midwestern dairy states had banned it outright, while others, including Vermont and New Hampshire, passed laws requiring that it be dyed pink so it could never pass itself off as butter. Yet somehow margarine became the thing we spread on toast for decades, a reminder that even the weirdest product can become mainstream with enough industry muscle.
Eventually, better science revealed that the trans fats created by the hydrogenation process were far worse for our arteries than the natural fats in butter. In 1994, Harvard researchers estimated that 30,000 people per year were dying unnecessarily thanks to trans fats. Yet they weren’t banned in the U.S. until 2015.
Might the same dynamic be playing out with sunscreen, which was also remarkably sketchy in its early days?