Vitamin D, known for its role in bone and teeth health, has recently been linked to a host of other wellness benefits. So how much vitamin D do you need, and what are the best ways to get it?
During these short winter days it’s not unusual to hear friends bemoan, “I need more vitamin D.” In recent years vitamin D has emerged as a “super vitamin” in media and medical reports. Yet for all the new attention, the benefits of vitamin D are nothing new. Discovered back in the 1920s, it’s commonly known for its role in helping the body process calcium, which in turn strengthens bones and teeth. What’s novel are the escalating reports of widespread American deficiency.
In the Beginning…
“In the early part of the century, with rickets – the softening of children’s bones often resulting in bowed legs – rampant in northern Europe and the northern United States, the idea came about that a dietary deficiency might be the cause,” says Dr. Schlesinger. “This was confirmed in 1922, and soon after, scientists discovered that vitamin D could be produced in foods by irradiating them – that is, exposing them to ultraviolet light. Such fortification, in place since the 1930s, is largely responsible for the virtual elimination of rickets.”
The medical community breaks vitamin D down into two types. The first, vitamin D3, is formed when ultraviolet B rays interact with exposed skin (also called cutaneous vitamin D); it also occurs naturally in some foods, such as oily fish. The second, vitamin D2, comes solely from the diet, found in a few plant-based foods and in enriched foods. All vitamin D heads to the liver and kidneys – cutaneous vitamin D via the bloodstream and ingested vitamin D via the gastrointestinal tract – where it is converted into hormones that boost calcium absorption in the intestines, transform calcium and phosphate in the blood into the building blocks of bone and regulate some aspects of cell growth.
Because calcium depends on vitamin D to fulfill its role in bone development, having the former but being severely deficient in the latter can lead to softening of the skeletal system, called osteomalacia. In children and babies, that shows up as rickets. In adults, it appears in the form of aching, weak bones and muscles, pelvic deformities and potentially a waddling gait. If both calcium and vitamin D are below ideal levels in adults, osteoporosis can also occur.
“Studies have also linked vitamin D deficiency to increased falls in the elderly and to a higher risk of some cancers, including breast, colon and prostate,” says Dr. Schlesinger, adding that the same is true for developing diabetes and overall risk of early death. “And the effect of vitamin D on the immune system is just coming to light. Scientists at the University of Copenhagen have recently discovered that vitamin D plays a crucial role in activating immune system cells, rendering them more capable of fighting infection or other harmful threats. Without the vitamin, these cells would remain dormant and unresponsive to the threats.”
Considering its proven and postulated potential, achieving and maintaining a steady intake of vitamin D sounds like a no-brainer. So why are deficiencies showing up instead, along with those relatively isolated cases of rickets? To get to the bottom of both questions, take a look at what’s changed since the vitamin was first introduced into milk and other calcium-rich foods.
For starters, Americans are living longer. At the turn of the 20th century, world life expectancy was about 30 years; today, Americans average 79.5 years. Thus in the past, those who mainly absorbed vitamin D via the sun likely died before the harmful effects of ultraviolet B radiation – such as melanoma – presented and/or were realized. Further, when was the last time you heard of someone downing a teaspoon of cod liver oil (one of the richest natural sources of vitamin D)? Compound those points with the variety of drinks – sodas, sports drinks, powdered drinks – that have taken the place of old faithfuls like fortified milk and orange juice, and you can see where vitamin D consumption and creation faced a decline.
In fact, the National Institutes of Health (NIH) says most people naturally ingest only 100 International Units (IU) daily via their regular diet. But that amount falls short of the recommended daily dosages for any age group (anywhere from 200 to 1,000 IU). To hit the ideal allowances each day – the calcium-synthesizing catalyst formed by vitamin D only lasts five days, so there needs to be a steady feed of it to maintain healthy levels – you’ve got to mix and match your sources. “It’s been suggested by some researchers that approximately five to 30 minutes of unprotected sun exposure to the face, arms, legs or back at least twice a week will produce enough vitamin D to satisfy daily requirements,” says Dr. Schlesinger. “But ultraviolet (UV) radiation causes cancer and is responsible for most of the estimated 1.5 million skin cancers and 8,000-plus deaths from melanoma in the United States each year. So the American Academy of Dermatology (AAD) contends that there is no safe level of UV exposure from the sun that allows for maximal vitamin D synthesis without increasing skin cancer risk. Instead, they suggest individuals make up the difference through dietary or supplementary sources.”
A few groups have a tougher time than most getting vitamin D through the sun, supplements or fortified foods. For example, it takes longer for UVB rays to create vitamin D in dark-skinned people (like African-Americans or Latinos) because their higher melanin levels slow down production. Those who suffer from ailments like Crohn’s disease are also more prone to deficiencies, as gastrointestinal issues limit fat absorption during digestion. “Vitamin D is a fat-soluble vitamin, so people with this condition require supplements,” explains Dr. Schlesinger, adding that obese people may experience similar troubles: excess fat absorbs the vitamin, leaving their bodies without sufficient vitamin D in circulation.
Also at risk for deficiency: breast-fed babies (breast milk isn’t rich in vitamin D and infants should have limited exposure to sunshine) and the elderly; according to Dr. Schlesinger, a recent study revealed that 60 percent of female residents of a U.S. nursing home were so deficient that they developed secondary hormone abnormalities.
The solution? Anyone who falls into these at-risk groups should take supplements; the kind and amount are best determined by a doctor.
Let the Sun Shine In?
But, what about sunscreen’s role in all of this? Does it share in the blame for today’s vitamin D decline? “Using sunscreen or sunblock does reduce your skin’s ability to make vitamin D,” Dr. Schlesinger concedes. “But, if you are following the nutritional guidelines, you should be getting enough through your diet.”
“Many experts think that the amount of sun exposure needed to get enough vitamin D is minimal since the skin is very efficient at producing it,” he adds. “So I advise wearing sunscreen and going about your day, because you should get enough cutaneous vitamin D from daily routines – like walking to and from your car.”
Dr. Schlesinger and the AAD both make a pretty persuasive argument for doing so. Bottom line: getting your vitamin D via the sun involves intentional exposure to a carcinogen, while taking supplements does not, so the risks of sun overexposure – cancer from tanning or burning – outweigh its benefits. Put on your sunscreen year-round and take your supplements. Compared to melanoma, a plain old multivitamin and a big glass of enriched milk sound like a good combination.