This is a list of the vitamin and mineral supplements that nutritional science and medicine generally agree the general population should take (and would benefit from) daily:
Surprise, there aren’t any. The best candidate might be Vitamin D, simply because 40% of the American public (a higher share among people with more melanin in their skin) are deficient in the vitamin by even the lowest standards of bare sufficiency. Vitamin D, cholecalciferol, is created in the skin (it’s a vitamin humans can actually synthesize themselves), using ultraviolet rays from sunlight to facilitate the process, and with increased time indoors and overuse of sunscreen, Vitamin D deficiency has been rising in recent decades even among Caucasians. Even so, the jury is still out on the effectiveness of oral Vitamin D supplementation at raising blood levels of the compound above 50 nmol/L, which is largely agreed to be the lowest level at which basic skeletal health can still be well-maintained (actual ideal levels may be considerably higher, but there is considerable disagreement in the scientific literature about what the range of healthy levels may be). For any other vitamin and many minerals, there is generally some combination of genetic disorder, lifestyle, or disease that means certain subpopulations benefit from supplementing certain nutrients. Post-menopausal women should take a calcium-magnesium supplement. People on strict vegan or vegetarian diets should take Vitamin B12 supplements (because this vitamin is synthesized by bacteria and found almost entirely in meat, particularly pork and beef). Some people have genetic issues absorbing certain nutrients and thus need to consume much larger quantities than the RDAs for the general public, and those cases too can require supplementation. But daily multivitamins? Vitamin C and Zinc pills? Vitamin E? There is no clear consensus to recommend anyone spend money on such supplements, because there is very little scientific evidence of benefit to justify them. In the name of fairness, though, I will outline both the pro-supplementation and the anti-supplementation arguments.
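One practical wrinkle with that 50 nmol/L figure: US lab reports often give serum 25(OH)D in ng/mL while the research literature uses nmol/L, so the same reading can look very different depending on the units. A minimal sketch of the conversion and the commonly cited cutoff bands follows; the 30/50 nmol/L boundaries are the contested thresholds mentioned above, and none of this is clinical advice.

```python
# Sketch only: unit conversion and the contested 25(OH)D cutoffs discussed
# above. 1 ng/mL of serum 25(OH)D is roughly 2.5 nmol/L.

NMOL_PER_NGML = 2.5  # approximate conversion factor

def ng_ml_to_nmol_l(ng_ml: float) -> float:
    """Convert a serum 25(OH)D reading from ng/mL to nmol/L."""
    return ng_ml * NMOL_PER_NGML

def classify_25ohd(nmol_l: float) -> str:
    """Bucket a reading against the minimal skeletal-health cutoff."""
    if nmol_l < 30:
        return "deficient"
    if nmol_l < 50:
        return "insufficient"
    return "sufficient"

print(classify_25ohd(ng_ml_to_nmol_l(18)))  # 18 ng/mL = 45 nmol/L -> insufficient
```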
While I have just said there is no consensus, this does not mean there is no scientific evidence to support supplementation, though on the whole the evidence in favor is spottier, more inconsistent, and often disappears when different cohort studies are pooled together. Dr. Bruce Ames, now known for his work pioneering Triage Theory in nutrition (more on that later), has made some fairly simple criticisms of the clinical trials that, for the most part, don’t show clear benefit to supplementation even where, theoretically speaking, there should be benefit. First, these studies do not screen participants to determine nutrient adequacy or inadequacy, which is, honestly, a gaping hole for potential error, given that over 80% of a study population may already be adequate in the nutrient under study. This creates a lot of noise and makes potential benefits less obvious in the resulting data. In tandem with this, studies of supplementation only rarely (I have only seen this done with Vitamin D) test biometric measures to find out whether subjects are actually reaching adequate levels of the nutrient, which would determine whether the dose was sufficient and whether the supplement being used has adequate bioavailability. The fact is, however, that for many nutrients there isn’t a clear and easy way to determine whether the body has sufficient levels or not; even for commonly tested deficiencies like iron, determining deficiency requires testing a number of different biometrics. The development of biometric screening that can fully capture nutritional adequacy vis-à-vis biological levels within the body is still in its infancy, and that always limits the level of detail that clinical testing can account for.
Dr. Ames and most of the pro-supplement minority in the medical field would argue that there are few tangible risks definitively associated with supplementation, and that our biological models and understanding of nutrition basically require benefits to exist; otherwise these models, which work well elsewhere, are completely wrong. Somewhat like with the Higgs Boson, what we know says that for deficient people, supplements will have benefits in maintaining health and reducing risk vectors for diseases like diabetes, cancer, arteriosclerosis, fatty liver disease, and so on; there just haven’t been appropriately designed trials to test this, as trials so far have largely been far too broad and too lax in their controls. I will admit that while I personally do not find this argument totally convincing, it is a legitimate argument and makes some good points. Especially in the context of contemporary American diets and empty calories from processed foods, a large number of people are deficient in various nutrients. The pro-supplement argument even adds that NHANES data likely understate the commonness of deficiency, given that they define deficiency based on the EAR (Estimated Average Requirement), which is set two standard deviations below the RDA (Recommended Dietary Allowance). While ideally people would make lifestyle changes and consume all these nutrients from diet, supplementation works as a low-impact intervention that is easier to implement, and if it were really risk-free, any benefits would be easy profit in terms of public health.
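The statistical relationship between the EAR and the RDA can be sketched numerically: the RDA is set two standard deviations above the EAR so that it covers roughly 97.5% of individual requirements, and a 10% coefficient of variation is the conventional assumption when the true variability of requirements is unknown. The vitamin C figures below are the published adult-male values; the rest is purely illustrative.

```python
# Sketch of the EAR/RDA relationship described above. With an assumed 10%
# coefficient of variation (CV), RDA = EAR + 2*SD = 1.2 * EAR, covering
# roughly 97.5% of the population's individual requirements.

def rda_from_ear(ear_mg: float, cv: float = 0.10) -> float:
    sd = cv * ear_mg        # assumed standard deviation of requirements
    return ear_mg + 2 * sd  # two SDs above the average requirement

# Example: vitamin C for adult men, EAR = 75 mg/day
print(rda_from_ear(75.0))  # 90.0 mg/day, the published RDA
```

This also makes the pro-supplement point concrete: anyone between the EAR and the RDA is, by definition, not counted as deficient in NHANES-style tallies, even though a portion of that group may fall short of their individual requirement.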
The second part of Dr. Ames’ argument is that EARs themselves are probably too low and may mask the level of deficiency common in contemporary diets. It is impossible to understand this argument without briefly explaining Dr. Ames’ Triage Theory. Triage Theory approaches nutrition from a unique perspective: scarcity and evolution. The human body needs about 30 minerals and vitamins to function; severe deficiency in any one can be fatal or have deleterious effects on health. These vitamins and minerals serve as enzyme cofactors in thousands of biological processes, form components of critical structures in cells, or act as antioxidants. Vitamin C, for instance, is famously involved in the production of collagen, which holds together scar tissue and forms a base for teeth, which is why scurvy is such a horrific disease: victims’ old wounds literally reopen and their teeth fall out, among other extraordinary suffering. Iron, the most common nutritional deficiency in the world, is used in hemoglobin and makes our blood red; it is what enables the body to transport oxygen to cells (and it has other biological functions as well). Triage Theory suggests that scarcity is a common ecological pressure, and that evolution would have selected for traits that ration certain nutrients effectively in the absence of adequate dietary intake. Dr. Ames’ epiphany occurred while pondering what nutrient rationing would look like. Just as Type O blood seems to be largely an adaptation to childhood mortality from malaria (though that theory is still under debate), the evolutionary pressure under nutrient scarcity is to live long enough to reproduce and care for offspring until they are independent. That is, short-term survival should, theoretically speaking, always be the human body’s priority.
Each vitamin and mineral is used in a number of biological processes, but Triage Theory predicts that they are prioritized for critical processes that affect short-term survival, and that under even mild long-term deficiency, other processes, ones critical to healthy aging and observable in centenarians and people who age well, are triaged by the body, so to speak. Simply put, the main biological pressure driving evolution is producing viable offspring (which in the case of humans means early childhood care as well), not living to an advanced age.
The theory’s most notable finding has been that Vitamin K is indeed triaged under deficient conditions. Vitamin K is a fat-soluble compound required for the synthesis of prothrombin, the precursor of thrombin, one of the main enzymes involved in blood clotting, which is also why people taking the blood thinner warfarin are advised not to take supplemental Vitamin K. Without Vitamin K (one of the vitamins the general public either doesn’t know exists or has no clue what biological process it supports), you would bleed out and die. That’s what Triage Theory calls a survival function. However, Vitamin K is also involved in regulating blood levels of calcium and maintaining bone health, and other proteins that use the vitamin are important for preventing the arteriosclerotic plaques that cause heart disease and stroke. In Japan, high consumption of Vitamin K2 (a form of the vitamin synthesized by bacteria and found in huge quantities in fermented soybeans, known as natto) correlates strongly with reduced risk of bone fractures in elderly women and with lower ischemic heart disease risk and overall cardiovascular mortality (Japan has among the world’s lowest rates of both). These are “longevity” functions, and they are not a priority for the body; being healthy at 80 is simply not a core selective pressure, but surviving long enough to produce viable offspring that can care for themselves is.
In tandem with this theory is the argument that many EARs don’t adequately account for the levels of a nutrient needed to fully support longevity; scientists calculate EARs based on detectable deficiencies and “essential” biological functions, which may not represent the full range of biological functions of a given nutrient. It would then logically follow that longevity and long-term health may require greater intake, which would also explain why observational studies of people with nutrient-dense diets so consistently find lower mortality risk and lower rates of cardiovascular disease and cancer. This second half of the argument, that supplementation may be more beneficial than we suspect or than current studies are designed to test for, is also compelling, but I am capital-S Skeptical of it. However plausible and reasonable the argument may be, there is still a long history of quacks and snake-oil salespersons, even Nobel Laureates like Linus Pauling, recommending huge doses of whatever vitamin miracle cure they are attached to (and making money advocating for). Vitamin C and zinc, for instance, play prominent roles in regulating the immune system, but enormous doses, even sustained for weeks and months before a cold, have shown no ability to prevent colds, and the largest effect detected in a blind trial was an approximately 5% reduction in average reported duration of illness. The only other impact of Vitamin C in double-blind trials has been a slight moderation of symptoms, possibly through a process similar to antihistamines.
Any sagacious theorizing that even attempts to open the door to recommending big doses of vitamins as having special health-improving functions must clear numerous and immensely high hurdles to gain credibility, because this is basically the Linus Pauling orthomolecular therapy argument, which is nothing more than a crackpot assortment of vitamin treatments based on the ideas that dietary needs for nutrients vary radically by individual, that vitamins can cure cancer and other common illnesses (somehow), and that RDAs are grossly insufficient (without any evidence to support this).
Likewise, even at the age of 92, Dr. Ames has continued publishing on this issue, with one of the more recent expansions of Triage Theory having to do with “longevity vitamins.” As I said, the core argument is generally sound and plausible: we have found 30 vitamins and minerals that are necessary for survival, “essential” nutrients needed in addition to water, fat, carbohydrates, and protein, but there are also likely nutrients we can live without whose consumption supports a variety of biological processes in healthy aging. There are a lot of bioactive compounds we absorb from diet, and some, particularly several carotenoids and ergothioneine, have shown correlations with reduced dementia risk and better eye health, and have been shown to be retained by the body in certain tissues exposed to high levels of oxidative stress. On the other hand, there are a million unscrupulous companies and bizarre naturopathic fanatics pitching random foods and nutrients as the cure for aging or the cure for everything from cancer to Irritable Bowel Syndrome. I think this is a very interesting subject for researchers, but the science of determining which bioactive compounds in diet, if any, play a role in supporting healthy aging or in upregulating genes responsible for healthy aging simply is not there yet, and science will need at least a decade to sort out and vet ongoing discoveries, separating the wheat from the chaff along the way. Anyone who says otherwise is trying to sell something, and even those who don’t say otherwise are probably trying to sell something, just more cleverly.
The argument against supplementation is that, well, there hasn’t been much benefit in aggregate. Sometimes a study finds a benefit, such as a decrease in mortality from prostate cancer with Vitamin E, but only among male smokers, a subpopulation with a highly elevated risk profile to start with. A lot of studies of Vitamin E in particular suggest that the second half of Triage Theory may be valid for high-risk groups: RDAs may understate what adequacy means for older adults or adults with risk factors like smoking, drug use, diabetes, or a history of heart attack, stroke, hypertension, and other chronic diseases. This flows into my opening point that certain supplements likely have net benefits for some subpopulations, for differing reasons and to differing extents, but that is insufficient, I think, to argue that general-population supplementation is worth the cost or substantially impacts health in a positive manner. A 2008 study pooling 67 randomized clinical trials of Vitamin A, Vitamin E, beta-carotene, Vitamin C, and selenium, covering 232,550 participants, found that Vitamin A, beta-carotene, and Vitamin E supplements significantly increased the risk of mortality compared to the placebo groups in those studies, that Vitamin C showed no effect, and that selenium offered a very modest decrease in risk, with the asterisk that the data were insufficient. I think the main body of scientific evidence very clearly shows no benefits to supplementation and a variety of risks, with Vitamin E supplementation in particular increasing the risk of stroke in several studies.
Listening to Bruce Ames and reading some other popular health and nutrition accounts, I feel they definitely underestimate the potential for supplementation to have a negative impact on health. Many studies, including the SELECT study on Vitamin E and selenium, as well as other studies on folate, found an increase in cancer risk. Many of these nutrients are as essential for cancers as for normally functioning cells, with the difference being that cancers grow much faster and need even larger amounts of the nutrients (being malnourished also inhibits cancer growth). Diets higher in folate (largely from cruciferous greens and other vegetables) have correlated very strongly in research with lower colorectal cancer risk and lower mortality from colorectal cancer and breast cancer, but a randomized clinical trial on colorectal adenomas found that people at higher risk (a history of polyps) received no protective benefit from 1,000 micrograms of daily folic acid supplementation and, on the contrary, had an increased risk of developing severe polyps. While all nutritionists exercise due caution with, say, iron supplementation or Vitamin A supplementation (preformed retinol, which can be toxic; for that matter, Vitamin A supplementation was also found to substantially increase cancer risk among smokers), I think a lot of the more insidious risks of long-term supplementation that have popped up in various studies are given less weight than is appropriate by the pro-supplementation side.
The ideal is to obtain nutrients from diet, unless supplementation is otherwise needed due to preexisting medical conditions. That is the official position of actual, scientifically trained nutritionists and medical researchers (the field is full of chiropractic schools and homeopathic digital lecture courses sending out mail-order diplomas recognizing people as “nutritionists”). Supplements have problematic bioavailability, and the ratios between nutrients are also vital to both the metabolism and the absorption of those nutrients. The equilibrium of calcium and magnesium, for instance, is very important to the absorption of calcium and the regulation of calcium levels in the blood, while Vitamin E absorption depends on Vitamin C and selenium-based proteins. With calcium, an additional issue is that high phosphorus intake (common in meat- and dairy-heavy Western diets) may interfere with calcium absorption, though the Ca-to-P ratio is still poorly understood. This would explain why hip and bone fractures are so rare among certain populations in East Asia and Africa (Japan in particular) that consume less than half the RDA of calcium, but do so largely from plant-based sources. Phosphorus additives, common in processed and fast foods, are also nearly completely absorbed by the body and are even a risk factor for kidney stones, whereas plant-based phosphorus is less readily absorbed. Elsewhere, there is the potassium-sodium equilibrium, the imbalance of which is a major cause of hypertension (not just high salt intake alone), and, conservatively speaking, over 90% of the American public consume less potassium than recommended; the mineral is concentrated in fruits and vegetables.
Stepping off into the risky ground of non-professional personal conjecture, I suspect that the really high concentrations of nutrients in supplements may actually lead to poor absorption and interfere with biological processes in the gastrointestinal system, including the microbiota that play a complex role in human digestion and metabolism. Such concentrated doses of vitamins and minerals are unlikely to be found in nature, and thus I feel it is reasonable to assume that humans may not be equipped to process them, especially not on a regular basis. With any compound, “the dose maketh the poison,” and daily supplementation makes it pretty easy to, for instance, consume twenty times the RDA of Vitamin E in a day, which is very difficult to do through diet; and even in that case, the vitamin would be dispersed throughout the day across the vitamin’s eight naturally occurring forms, as opposed to arriving in one punch, generally 100% in the form of esterified alpha-tocopherol only. There is even evidence that certain ratios of the different Vitamin E compounds have a synergistic effect and that all are needed, particularly alpha-tocopherol and gamma-tocopherol, which function differently in the body, which would explain why large doses of Vitamin E, a vitamin frequently deficient in American diets (80% of Americans consume less than the EAR from diet), have not generated positive effects in trials. I think the evidence suggests long-term overconsumption of vitamins and minerals comes with its share of increased health risks: too much calcium is also a risk factor for arteriosclerosis, and too much Vitamin B6 can cause nerve damage even though it is water-soluble (and thus accumulates more slowly in the body).
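The “twenty times the RDA” point is easy to check with the standard IU-to-milligram conversions for vitamin E (0.67 mg per IU for natural alpha-tocopherol, 0.45 mg per IU for synthetic, against the 15 mg adult RDA); the 400 IU capsule used below is just a common commercial dose, not a recommendation.

```python
# Back-of-the-envelope check on the "many times the RDA" claim above, using
# the standard IU-to-mg conversion factors for vitamin E and the adult RDA
# of 15 mg/day of alpha-tocopherol. 400 IU is a common commercial capsule.

RDA_MG_PER_DAY = 15.0
MG_PER_IU = {
    "natural (RRR-alpha-tocopherol)": 0.67,
    "synthetic (all-rac-alpha-tocopherol)": 0.45,
}

def rda_multiple(dose_iu: float, form: str) -> float:
    """How many times the RDA a given IU dose of one form supplies."""
    return dose_iu * MG_PER_IU[form] / RDA_MG_PER_DAY

for form in MG_PER_IU:
    print(f"400 IU {form}: {rda_multiple(400, form):.1f}x the RDA")
```

Even a single everyday capsule lands at roughly 12 to 18 times the RDA, which is the scale of dose the trials above were testing.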
So in that sense, I find it curious that pro-supplementation arguments treat overconsumption as a non-concern and, without firm evidence, continue the orthomolecular tradition of suggesting that RDAs in general are somehow insufficient, without laying out clear arguments for how much additional consumption is needed, who benefits from it, and how.
The evidence is simply far clearer and simpler with diet: diets heavy in fresh vegetables, fruits, whole grains, and nuts, with regular consumption of fish, are correlated with lower mortality risk, lower overall cancer risk, and vastly lower risks of heart disease and stroke. Add the evidence that a good diet and the regular consumption of mushrooms have been correlated with reduced risk of developing dementia, as well as with reduced mortality risk, and it is easy to understand why most nutritionists and doctors have focused on recommending dietary changes in recent years. While terms like “the Mediterranean Diet” or “the Scandinavian Diet” are just branding, and neither is a serious reproduction of current or historical eating patterns in its region, both are serious attempts to reimagine diet in its more historical and literal sense: a long-term pattern of food and cooking practices (not a short-term exercise in self-flagellation until you lose the weight and get that beach body). Both focus on being sustainable, and the Scandinavian Diet on being environmentally conscious as well (by maximizing local consumption).
My Take on the Issue
Pills aren’t the answer. Health literacy is. More classroom time for students, starting younger, on nutrition, teaching responsible curricula based on the latest scientific consensus. In Japan, kids have cooking classes from elementary school through junior high, and each day before school lunch, kids read a short letter from the nutritionist in charge of the menu, which outlines the main ingredients (changing seasonally), what nutrients are found in them, and, in simple terms, what those nutrients do for the body. It’s a phenomenal system, as is the requirement that, while kids can put back things they don’t like, they absolutely have to at least eat a little of everything. A fundamental respect for food and for the people involved in its production is heavily emphasized as part of classroom instruction, as is the basic maturity to at least eat a little bit of something even if you don’t like it (no ten-year-old should be having a meltdown because they are being asked to eat a single slice of cooked and seasoned eggplant). It’s no surprise that by 20, most Japanese people are far less picky and accustomed to eating a wider range of foods than Americans are. I have family members who more or less ate only French fries, baked processed chicken tenders (the processed part was a requirement; this person would not eat fresh, homemade chicken tenders), and ham and mayonnaise sandwiches (only with white bread) all the way through their senior year of high school.
Lest it seem I am suggesting I was a “good eater”: I was not, and my family would guffaw and gag on their own spit if I ever suggested I was in their presence. I once sat at a table for four and a half hours at a great-aunt’s house because I refused to eat three spoonfuls of mixed vegetables, and I ate nearly 100 packs of instant ramen noodles a year throughout my childhood. There was even a Thanksgiving when I made myself ramen and sat eating a bowl of beef-flavored noodles, plus bread rolls and cheesecake, and nothing else, with roasted turkey, stuffing, and a dozen different casseroles and sautés on the table. I was a picky little shit, a product of a system of food production and a culture surrounding food that is designed to create picky little shits over-dependent on big food conglomerates producing highly processed food at a considerable surcharge. I would hardly eat anything; pictures of me from before chronic depression and ADHD/antidepressant medications caused weight gain show a skeletally skinny kid. I ate sweet breakfast cereal, fast-food chicken nuggets and French fries, white bread (Europeans continually remark to me, apropos of nothing, how weirdly sweet normal American white bread is), ramen noodles, and cheddar cheese, and little else. I struggled and forced myself, through sheer will, to overcome all the stupid and meaningless aversions I had developed to food; as a child I refused to eat my aunt’s homemade pizza because of the 0.05 in. minced pieces of onion in the sauce, and I couldn’t handle a chicken casserole without first picking out every tiny bit of mushroom from the Campbell’s soup mix. I started forcing myself to eat broccoli (which is a perfectly good food, flexible enough to use in a lot of dishes and not that obtrusive) for the first time at 21.
I was, at one point, drinking three or more colas a day through my first two years of cafeteria dorm life in college. In other words, I spent over two decades with a cripplingly awful lifestyle, wouldn’t try anything new from any other culture’s food (curry freaked me out the first time I saw it), and ate far too much salt and sugar and far too little of many essential nutrients. And the thing is, I wasn’t even an outlier; my lifestyle was pretty much in line with tens of millions of other Americans, particularly middle-class white Americans from all regions.
Food allergies are one thing, but Americans are simply juvenile and petulant when it comes to food, and the problem is really bad among my fellow millennials (I was the biggest offender until I worked to build my palate and familiarize myself with more tastes and textures), as I find a large chunk of millennials (and Gen X) are insanely picky compared to European and Japanese friends of mine of the same age. A lot of Japanese people are game to eat pretty much anything, without comment or complaint. I often only found out a Japanese coworker or friend didn’t like something after we had completely finished eating it; the attitude is “Meh, I’m not a fan, but I’m also not a petulant brat, and I won’t let it go to waste or make a big deal out of it,” which may also be due to the collective social nature of eating out over drinks in Japan, where groups evenly divide up each dish they order or where set course menus are the norm. A big part of the issue begins and ends with American school lunches, which are extremely processed and rarely overseen by trained nutritionists or cooked fresh, as is the standard in Japan; at home, as well, American kids are likely to live almost entirely off processed frozen and canned foods. Exposure to this kind of diet from a young age leads to adults incapable of tolerating any range of texture in food: the kind of person who sends back a margherita pizza at a 5-star Italian café because it has actual sliced cherry tomatoes baked onto it (this isn’t moralizing; this would have literally been me, and half my cousins and siblings).
Japan rightfully views eating habits as part of education, along with morality and ethics, in addition to radical things like actual P.E. (learning to swim and to play various sports such as table tennis, basketball, soccer, badminton, gymnastics, and so on), actual art classes that include instruction in art theory starting in elementary school, cooking lessons, and learning social interaction and responsibility through extensive delegation of classroom activities (including responsibility for decorating and cleaning your own classroom) and of planning for school events. This works, and it raises the average level of knowledge and awareness about nutrition. The trick is finding a way to drive down food prices and increase demand while also improving distribution networks, especially to urban centers that are often critically underserved (especially lower-income districts). Every solution requires government and non-governmental elements, socio-cultural and economic reforms, to really address pernicious problems. Issues like those Americans face with health and nutrition create growing burdens on health care systems, avoidable suffering, loss of life, and, perhaps worst of all, loss of quality of life (healthy life expectancy), all of which are most concentrated in the most disadvantaged segments of society. Systemic racism and social inequality also look like poor education systems, a lack of knowledge or educational outreach on nutrition, a lack of affordable distribution networks for fruits, nuts, and vegetables, and inaccessible systems of medical care for groups already put at elevated risk.
Nutritional deficiency, particularly in Vitamin D, but also in Vitamin E, magnesium, and calcium, to name some examples, affects more Hispanic and Black Americans than Caucasians, and lower-income urban areas have higher levels of insufficiency in a number of essential nutrients, as well as imbalances in dietary intake, with too much sugar and saturated fat and not enough protein and unsaturated fats.
Economic justice and social equality come in part from addressing these kinds of nutritional deficiencies and educational gaps, along with income inequality, out-of-control rent, unwalkable urban and suburban areas, air and water pollution, and the pricing of tens of millions of low-income Americans out of the medical system. All are important and play a role; the person who is obsessed with Vitamin D levels in Black Americans but doesn’t care about incarceration rates, police killings, affordable housing, or food deserts in major urban areas is really just looking for a topic on which to lecture the poor and a social reform that doesn’t threaten their status in any way, similar to wagging the finger at opiate abuse in rural America without talking about social isolation and economic stagnation. I just think nutrition is a deeply under-written topic on the Left (and as such often relegated to conservatives and liberal elites), but one that intersects with my personal interests in a way that makes it easier to write about than, say, housing policy, health insurance policy, or American zoning laws. Thus I write to educate and formulate goals on this issue, while not ignoring the potency of (or the need to tackle in my own future writing) other issues that intersect with public health and also generate inequality, unnecessary suffering, and other social ills.