As obesity rates skyrocket, Dietary Guidelines go from being pro-vegetable to anti-meat. How'd that happen?
Watch carefully and you can see the sleight of hand built into the creation of the Dietary Guidelines for Americans (courtesy of the Harvard School of Public Health)
Happy (almost) New Year!
This coming year, if all goes as planned, we’ll be treated to the 10th edition of the Dietary Guidelines for Americans (DGAs), courtesy of the Departments of Agriculture and of Health and Human Services.
The DGAs are the government’s “science-based advice on what to eat and drink.”
Their purpose, from the very first edition published in 1980, has always been to “promote health and prevent chronic disease.”
In honor of the coming edition, let’s ask the obvious question:
How’s that going?
Have the Dietary Guidelines for Americans promoted health? Have they prevented chronic disease?
Before I answer (spoiler alert: no), let me provide the context in the form of a brief history lesson.
The driving force behind the idea that the USDA owed it to Americans to tell them how to eat healthy was a consumer advocate named Carol Tucker Foreman. In March 1977, newly inaugurated President Jimmy Carter appointed Foreman, then executive director of the Consumer Federation of America, to be an assistant secretary of agriculture.
At the time, some 13 to 14% of American adults were obese. Two percent of the population had been diagnosed with diabetes.
Foreman believed (as she told me in an interview 20-odd years later) that Americans “were getting sick and dying because we ate too much.” And she believed it was incumbent on the USDA to fix that problem (dare I say it, to make Americans healthy again?) and that nutrition researchers had an obligation to make their best guess about the diet-disease relationship. Then the public could decide whether to take it or not.
“Tell us what you know and tell us it’s not the final answer,” Foreman would tell nutrition researchers. “I have to eat three times a day and feed my children three times a day and I want you to tell me what your best sense of the data is right now.”
And so Foreman put the USDA to work on the first edition of the guidelines, apparently assuming that if the science changed—if clinical trials, for instance, designed to test the efficacy of the guidelines failed to detect any (as they would)—the nutritionists’ “best sense of the data” would change, and the guidelines would as well.
The early editions of the guidelines—up through the 6th—only purported to give advice to “healthy Americans.” The 7th edition, though, in 2010, expanded the focus of the advice to include Americans “at risk of chronic disease.”
By then, regrettably, the USDA had no choice. More than half of all Americans fit into the “at risk” category or worse—they were overweight or obese, pre-diabetic or diabetic—so the USDA had to accept this reality and (supposedly) deal with it.
Now, after 9 editions of the DGAs with the 10th on the horizon, over 40% of American adults are obese, and 8% have been diagnosed with diabetes. Since that bright and shining moment of hope in March 1977, the prevalence of obesity has increased threefold, of diabetes, fourfold. Direct medical costs of obesity and diabetes together amount to some $1.4 billion every day, roughly half a trillion dollars a year.
So the only possible answer to the how’s-that-going question: Not well, I’m afraid, not well at all.
The DGA paradox?
Hence, the DGA paradox, which is a 2-parter. First: nearly half a century of USDA dietary guidelines associates with half a century of Americans getting ever less healthy and at ever greater risk of chronic disease.
Coincidence? Perhaps.
For all we know, the situation might be considerably worse without the purportedly science-based advice that the DGAs have provided.
But maybe it would be better. Maybe this association contains a cause (the DGAs) and an effect (the ob/db epidemics).
Here’s the second part of the paradox: As Americans have become ever less healthy, the DGA advice has become only and ever more of the same.
What started as the advice to eat a variety of foods, including plenty of fruits and vegetables, has become ever more plant-based.
From 1980 onward, this plant-based philosophy emerged organically (pun acknowledged) from the very questionable assumption (see my last post) that saturated fats from animal-sourced foods are not only the dietary cause of heart disease but of untimely death, as well. If that is true, then consumption of these animal-sourced foods should be limited. (Heart disease, after all, is “the nation’s number 1 killer,” as the American Heart Association likes to phrase it.) At the very least, the meat we eat should be lean and the dairy low-fat.
Once fat calories are limited, though, the guidelines de facto become carbohydrate-rich. (Cap fat at 30 percent of calories, as the guidelines then did, assume protein stays near its typical 15 percent, and carbohydrates must supply the remaining 55 percent.) Hence, “[m]ost of the calories in your diet should come from grain products, vegetables, and fruits,” quoting the 1995 guidelines, the 4th edition. And because animal fats are particularly rich in saturated fats, we were told first to “use fats and oils sparingly” and then, as DGA editions rolled along, to prioritize the use of vegetable oils when we did use them.
This past December, the USDA and the Department of Health and Human Services (which also oversees the DGA process) released the scientific report for the coming, 10th edition of the DGAs.1 That report recommends further amplifying the mostly-plant guidance: not just prioritizing plant-based sources of fat over animal sources (seed oils over butter, lard, tallow, etc.) but also prioritizing plant-based protein (beans, peas, and lentils) over the more traditional animal sources (meat, poultry, eggs).
So, back to the DGA paradox: as the nation has gotten ever fatter and more diabetic, arguably a population more beset by chronic disease than ever in its history, the advice to eat mostly plants is morphing slowly into advice to eat only plants.
What if the advice itself is the problem? What if even more of the same is the last thing the USDA should be advising the American public to do?
If you give advice with the goal of promoting health and preventing disease and your target audience gets ever more unhealthy, aren’t you morally obligated to explore the possibility that this regrettable situation is a result of your advice? If you are a doctor giving advice or prescribing an intervention—pharmaceutical or dietary—you’d certainly do it. Why not a public health institution?
In an ideal world (or at least my ideal world), each 5-year cycle of producing the DGAs would include a mechanism to ask the question—how are we doing?—and if the answer is as it very regrettably is, to investigate without bias the possibility that the advice is the problem. Just as new presidents are elected every 4 years and can assess and rethink (for better or worse) the policies of the government, it would be reasonable to expect that the 5-year cycle of the DGAs would allow the same kind of critical retrospection.
But it doesn’t.
Worse, the DGA process has evolved to assure that the fundamental principles underlying the advice will never change. How the USDA does this—the circular logic, and the sleight-of-hand built into the methodology—is what I’m going to discuss. It’s a 4-part process. (The first part, which is the most obvious, may sound familiar to those who read my initial post, “In Defense of Bad Health Journalism.”)
1. Questioning assumptions is not the job
The guidelines themselves are ultimately drafted by USDA and HHS employees, but they are “science-based,” in theory, because of the work of a Dietary Guidelines Advisory Committee (DGAC). This committee is composed of leading nutrition authorities—20 of them this year—and a new committee is assembled for each new edition of the guidelines.
The recently-released scientific report that I mentioned above is the end product of the current DGAC. That report represented, so we’re told, 22 months of “extensive Committee deliberations… rigorous reviews of data and scientific literature” and close collaboration, during which the committee members gained “valuable insights from one another.”
The committee is by no means blind to the paradox problem, but it does not see it as a paradox. Rather it describes the dire numbers of the obesity and diabetes epidemics as the “backdrop” to its work. The committee members do not consider the explosion of diabetes and obesity in America to be an unprecedented public health failure—as I do, for instance—but rather “major public health challenges.”
Challenges can be overcome. Failures have to be investigated and understood so that they are not repeated. But this kind of thinking was not, apparently, among the valuable insights that the DGAC members gleaned from one another. If nothing else, this kind of critical assessment is not what they are empowered to do.
The DGA’s advisory committee is expected to “build” on the work of earlier committees, not to question them. It updates the thinking based on systematic reviews done by the USDA itself (which play a critical role, as I’ll discuss) and the latest research, which typically (although not always) means the research published in the previous five years, i.e., since the last edition of the guidelines or the last DGAC report.
In the absence of any major clinical trials to demonstrate definitively that the advice in the guidelines is harmful—the kind of enormous randomized controlled trials that can last a decade and cost tens to hundreds of millions of dollars—the committee will not, and perhaps cannot, conclude that its advice has led it, and so us, astray.
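To get a feel for why such trials must be so enormous, here is a back-of-the-envelope sample-size calculation, a sketch in Python with assumed, purely illustrative event rates (not figures from any actual trial, and the helper function is mine):

```python
# Back-of-envelope sample size for a two-arm trial on a rare outcome.
# All event rates below are assumptions for illustration only.
from math import ceil
from statistics import NormalDist

def per_arm_n(p1: float, p2: float, alpha: float = 0.05, power: float = 0.90) -> int:
    """Subjects per arm to detect event rates p1 vs. p2, two-sided test."""
    z_a = NormalDist().inv_cdf(1 - alpha / 2)
    z_b = NormalDist().inv_cdf(power)
    p_bar = (p1 + p2) / 2
    numerator = (z_a * (2 * p_bar * (1 - p_bar)) ** 0.5
                 + z_b * (p1 * (1 - p1) + p2 * (1 - p2)) ** 0.5) ** 2
    return ceil(numerator / (p1 - p2) ** 2)

# Suppose 10-year heart disease incidence is 4% on the usual diet, and the
# guideline diet truly cuts it by a fifth, to 3.2%:
print(per_arm_n(0.04, 0.032))  # ~11,400 per arm, i.e., ~23,000 subjects
```

Keep twenty-odd thousand people fed, followed, and adherent for a decade, and the nine-figure price tags follow directly.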
On the rare occasions when such trials have been done, as I’ve often discussed, they have invariably failed to confirm that the DGA guidance has any beneficial effect. This was the case, for instance, with the $115 million Multiple Risk Factor Intervention Trial (published in 1982, just 2 years after the first guidelines), the half-billion-dollar dietary arm of the Women’s Health Initiative (2006), and the $200 million Look AHEAD trial (2013). But the nutritional authorities (on or off the DGAC) have found it far easier to assume that these trials did a poor job of testing their advice than that their advice did a poor job of being right.
As a result, at the very best, DGACs will only recommend minor variations on the themes put forth by their predecessors. Before they even do that, though, they will conclude that the DGA paradox is due not to the nature of the advice being given but to the failure of Americans to take it. We are not paying sufficient attention.
2. The adherence gap (i.e., it’s not us; it’s you…)
Just as the nutritionists serving on the DGAC are not blind to the DGA paradox, the DGA authors at the USDA are not blind to the (circular) logic they employ that assures this conclusion. They may be unaware of the implications, but they describe the logic (almost) as clear as day.
Here it is in the very first paragraph of the introductory chapter to the current guidelines, the 2020-2025 version. The heading is “Setting the Stage.”
The foods and beverages that people consume have a profound impact on their health. The scientific connection between food and health has been well documented for many decades, with substantial evidence showing that healthy dietary patterns can help people achieve and maintain good health and reduce the risk of chronic diseases throughout all stages of the lifespan. Yet, Federal data show that from the first edition of the Dietary Guidelines for Americans in 1980 through today, Americans have fallen far short of meeting its recommendations, and diet-related chronic disease rates have risen to pervasive levels and continue to be a major public health concern.
Let me break that down.
Foods and beverages have a profound impact on our health.
We know what that impact is and why: i.e., “The scientific connection between food and health has been well documented for many decades.”
We know what “healthy dietary patterns” are (as we’ve been preaching since 1980).
Americans are unhealthy: i.e., “diet-related chronic disease rates have risen to pervasive levels….”
Americans are not taking our advice: “Americans have fallen far short….” (We have been preaching, it appears, only to the choir.)
What’s the solution?
Give more of the same advice. At the very least, figure out a way to present the advice in a more compelling manner—how about a food pyramid or MyPlate?—hoping that by doing so Americans will get the message and eat as they’re told to eat (if nothing else, more Americans will join the choir).
And what’s the evidence that Americans have “fallen far short” of meeting the recommendations?
Since perfect adherence to the recommendations will never happen, the authors of the guidelines can always blame the national failure to maintain our health on the gap between perfection and, well, reality.
No matter how small or large the gap, the solution never requires fixing the advice. It can always be more of the same, on the assumption that if we can just shrink the gap, we can solve the problem.
3. Dietary patterns & the sleight of hand (keep your eyes on the sugar)
Let’s go back to the history to understand the nature of the sleight of hand that’s now embedded in the DGA process. Like a magic trick, you have to watch closely if you want to understand how it’s done.
In the early versions of the guidelines, the advice focused on specific types of macronutrients that we should avoid eating “too much” of (fat, saturated fat, and sugar), all assumed to be dietary evils. These guidelines came with suggestions about foods to eat that satisfied these criteria (fruits, fiber-rich vegetables, whole grains, lean proteins) and foods to avoid to help satisfy these criteria (eggs, organ meats, butter, desserts, sodas, etc.), but that’s all they did.
Through the early 2000s, the guidelines made no mention of dietary patterns. But nutrition research itself was changing. As the government and health associations had begun disseminating diet advice, and influential health journalists were broadcasting it far and wide (most famously here), nutritional epidemiologists began to worry that maybe their research was being confounded by this notion of a healthy diet. This is a critical issue and I will return to it in later posts.
Once these institutions had taken to disseminating this notion of a healthy diet, that would become the diet that health-conscious people would eat.
But here’s the catch: people who are health-conscious, by definition, are people who both know enough and can afford to prioritize their health over life’s less healthy pleasures (cigarettes, for instance). It’s not only a very good bet that they will be healthier than people who either can’t afford to be health-conscious or don’t know enough or care enough to be, but that they have a host of other advantages (better doctors, higher education, higher socio-economic status) and engage in a whole host of other health-conscious behaviors that also work to keep them healthier.
Eating what you believe to be a healthy diet, in other words, is a marker, a sign, of being health-conscious. It’s part and parcel of being health-conscious, but it associates with all these other behaviors and advantages that might also work to make you healthy.
In short, a universe of these health-related factors cluster together, assuring that the people who choose to eat the diet that we are being told is healthiest (by the government, journalists, health associations and healthcare providers) are also the people who live longer and healthier lives. Whether the diet is helping or not.
While the nutritionists assumed (hoped?) that at least some of the “favorable health outcomes” that associated with these dietary choices were actually due to the dietary choices themselves, they wouldn’t even know what aspect of the diet it was. Why not? Because supposedly healthy dietary choices also cluster together: health-conscious people don’t just make one isolated change to their diet. They make multiple related dietary choices that they think of as healthy eating—an entire healthy eating pattern.
In 1993, National Cancer Institute epidemiologists raised this issue in regard to the surveys that the government used to establish what Americans were eating and how that associated with health status (and that revealed, in 1998, the existence of the obesity epidemic). Their paper was called (with my italics) “Dietary patterns associated with a low-fat diet in the national health examination follow-up study: identification of potential confounders for epidemiologic analyses.”
The NCI researchers found that people who ate low-fat diets, just as the DGAs and health associations and the media were telling them to do, also tended to eat and not eat a lot of other things that the same institutions were also telling them to do:
Intakes of vitamin C and percentages of calories from carbohydrates, dietary fiber, poultry, low-fat dairy products, fruits, vegetables, cereals, and whole grains were markedly higher, while intakes of protein, total fat, saturated fat, oleic and linoleic acids, cholesterol, sodium, all red meats, high-fat dairy products, eggs, nuts, white bread, fried potatoes, desserts, fats, and oils were much lower….
The NCI researchers were looking for an association between high-fat diets and cancer. But now they realized that even if low-fat diets associated with lower cancer risk (they wouldn’t), it could be for all these other non-fat related dietary reasons in this healthy eating pattern—maybe eating more fruits and vegetables (so more fiber or vitamin C), or less meat, or fewer desserts (less sugar), etc.
The NCI epidemiologists were right to worry about these issues. This remains a critical problem with nutritional epidemiology. The reason medical science depends on randomized controlled trials to establish reliable knowledge is that the randomization controls for just these kinds of confounders. That’s what it does.
Observational studies of the kind these nutritional epidemiologists had taken to doing cannot control for these factors. People who eat what they think of as a healthy diet are health conscious. They do a lot of things that they think of as healthy. They differ profoundly from those who don’t eat a healthy diet in many respects, not just in terms of diet. If you’ll excuse the nutrition-related pun, they are as different as apples and oranges.
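To see the logic in miniature, here is a toy simulation (entirely invented numbers, not real survey data) in which the advised diet has no causal effect whatsoever, yet the observational comparison makes it look protective; randomizing the same population gives the right answer:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical confounder: "health consciousness." It drives BOTH the
# choice of the advised diet AND better outcomes through other channels
# (not smoking, better doctors, higher socioeconomic status, etc.).
health_conscious = rng.random(n) < 0.5

# Health-conscious people are far more likely to adopt the advised diet...
eats_advised_diet = rng.random(n) < np.where(health_conscious, 0.8, 0.2)

# ...and have lower disease risk for reasons unrelated to the diet itself.
# The diet has NO causal effect here: risk depends only on the confounder.
risk = np.where(health_conscious, 0.05, 0.15)
disease = rng.random(n) < risk

# Observational comparison: the diet looks protective, purely by confounding.
obs_rr = disease[eats_advised_diet].mean() / disease[~eats_advised_diet].mean()
print(f"observational relative risk: {obs_rr:.2f}")  # ~0.55, well below 1.0

# Randomized comparison: assignment is independent of health consciousness,
# so the spurious "benefit" vanishes.
assigned = rng.random(n) < 0.5
rct_rr = disease[assigned].mean() / disease[~assigned].mean()
print(f"randomized relative risk:    {rct_rr:.2f}")  # ~1.0
```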
By the end of the 1990s, though, the nutritional epidemiologists at the Harvard School of Public Health, led by Walter Willett and Frank Hu, got involved and decided maybe they could use these dietary patterns in another way. Ignoring the potential confounder possibility, they looked in one of their huge cohort studies—the Health Professionals Follow-Up Study, with its roughly 45,000 male health professionals—to determine whether particular dietary patterns associated with health. Unsurprisingly, they did. Willett, Hu, et al. published their results in 1999 and 2000.
Here’s how they described their motivation:
Distinct eating patterns reflect different dietary traditions worldwide, and they may be related to rates of coronary heart disease (CHD) in different countries. Mediterranean and Asian populations have very low rates of CHD compared with Western populations. These low rates are attributed to high intakes of vegetables, fruit, whole-grain products, and fish and low intakes of red meat, high-fat dairy products, and other animal products in traditional Mediterranean and Asian diets.
But now the Harvard epidemiologists allowed their preconceptions to bias their observations.
Yes, Mediterranean and Asian populations had low rates of CHD compared with Western populations, rates attributed to their high intakes of vegetables, whole grains, and fish and their low intakes of red meat, high-fat dairy, and the like. But some Western nations also had low rates of CHD (France and Switzerland, for instance) despite consuming traditional diets with plenty of red meat and high-fat dairy. Other traditional populations like the cattle herders of Kenya (the Masai), the reindeer herders of Siberia, the Inuit, or the Native Americans of the Great Plains also ate meat- and fat-rich diets and were, until their diets were westernized, conspicuously healthy.
This is why the British researchers in the 1960s and 1970s who studied these nutrition transitions—what happens when traditional diets become westernized—focused on the presence or absence of refined grains (white flour) and sugar as the fundamental change. This was true regardless of the meat and high-fat dairy intake of the populations.
But when Willett, Hu, and their Harvard colleagues went looking in their Health Professionals cohort and found the same clustering of dietary choices that the NCI researchers had identified, they redefined the competing diets.
This was the beginning of the sleight of hand.
Rather than taking advantage of the decades of British observations on these diet-disease associations worldwide, the Harvard epidemiologists narrowed their focus only to American diets (what U.S. health professionals ate, a group that should be particularly health-conscious) and divided those into a “prudent” pattern and a “Western” pattern.
The prudent pattern was characterized “by higher intake of vegetables, fruit, legumes, whole grains, fish, and poultry;” the Western pattern, by “higher intake of red meat, processed meat, refined grains, sweets and dessert, French fries, and high-fat dairy products…”2
Not surprisingly, the more prudent the subjects’ diets, the healthier they were; the more closely they followed the Western pattern, the less healthy:
During 8 y of follow-up, we found that as prudent pattern score increased, the risk of CHD decreased, even after adjustment for potential beneficial nutrients such as folate and cereal fiber. In contrast, as Western pattern score increased, the risk of CHD increased, even after adjustment for potential deleterious nutrients such as saturated fat, trans fat, and cholesterol. These data suggest that the 2 major dietary patterns derived from the FFQ predict the risk of CHD, independent of the effects of several known beneficial or deleterious nutrients.
One obvious way to interpret the association Willett, Hu, et al. were reporting is made clear by the choice of the term “prudent” to describe the pattern that associated with better health. In this context, prudent is a synonym for health-conscious. The prudent dietary pattern included, by definition, all the foods that we had come to think of as characterizing a healthy diet by the 1990s. The Western dietary pattern included all the foods that the nutritional authorities had convinced us were unhealthy indulgences.
By judging their subjects on the basis of whether they chose to eat fish rather than red meat (or bacon!), fruits and vegetables rather than sweets and desserts, whole grains over processed grains (white rice and flour), the Harvard epidemiologists had assured that they were selecting people who were health-conscious (prudent) and comparing their health to those who weren’t (Western!). If they thought about the confounding of their nutritional analyses by this simple fact, as the NCI epidemiologists had, the Harvard epidemiologists did not consider it worthy of discussion, let alone suggest the kind of analysis or experiment that might rule it out.
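For readers curious about the mechanics: pattern scores of this kind are typically derived by factor analysis (principal components) of the food-group intakes reported on the FFQ, with each subject then scored on each pattern. Here is a minimal sketch with invented food groups and invented data; the point it illustrates is that when the “prudent” foods cluster together in a population, the resulting score is, in effect, an index of health-conscious eating:

```python
# Deriving dietary-pattern scores from food-frequency data, in miniature.
# Food groups and intakes are invented for illustration.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
foods = ["vegetables", "fruit", "whole_grains", "fish",
         "red_meat", "processed_meat", "refined_grains", "sweets"]

# Simulate FFQ servings/day: health-conscious eaters load on the first
# four food groups, everyone else on the last four. That clustering of
# choices, not any single food, is what the factor analysis picks up.
n = 5_000
conscious = rng.random(n) < 0.5
servings = rng.lognormal(0, 0.3, size=(n, len(foods)))
servings[conscious, :4] *= 2.5    # "prudent" foods cluster together...
servings[~conscious, 4:] *= 2.5   # ...and so do "Western" foods

X = StandardScaler().fit_transform(servings)
pca = PCA(n_components=2).fit(X)
scores = pca.transform(X)  # each subject gets a score on each pattern

# The first component's loadings split the two food clusters, so a high
# score on it marks a subject as a health-conscious ("prudent") eater.
for food, loading in zip(foods, pca.components_[0]):
    print(f"{food:>15}: {loading:+.2f}")
```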
But now let’s get back to the sleight of hand and how it ends. Were you watching carefully enough to see it?
The “traditional” dietary pattern that the British had defined was characterized by the absence of industrialized, processed carbohydrates—sugars and processed grains. The Harvard epidemiologists had swapped the traditional diet idea for the “prudent” dietary pattern. And that pattern, of course, not only included minimal consumption of these processed carbohydrates but the avoidance of red and processed meat and high-fat dairy.
By substituting the prudent pattern for the traditional pattern, Willett, Hu, and their colleagues could put red and processed meats and high-fat dairy in the same Western pattern as sugar and processed grains. They could put foods that are consumed by populations eating their traditional diets and so living off the land—red and even processed meat (bacon) and high-fat dairy—into the same pattern as industrialized foods that came with westernization.
This subtle manipulation of the dietary patterns—the sleight of hand—assured that the health outcomes associated with eating the protein and fat from meat and dairy would be confounded by the health outcomes associated with eating the industrially processed carbohydrates—sugar and white flour.3
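Here is that confounding in a toy version (invented numbers again): in this simulated world only sugar causes harm and red meat is causally inert, but because health-conscious eaters limit both, red meat tracks the harmful exposure and the bundled “Western” score implicates them together:

```python
# Bundling a harmless food with a harmful one in a single pattern score.
# All quantities are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

# Health-conscious eaters limit BOTH sugar and red meat (that being the
# advice), so the two intakes are correlated across the population.
conscious = rng.random(n) < 0.5
sugar = rng.gamma(2, np.where(conscious, 1.0, 3.0))     # servings/day
red_meat = rng.gamma(2, np.where(conscious, 0.5, 1.5))  # servings/day

# Assumed truth in this toy world: ONLY sugar raises disease risk.
disease = rng.random(n) < np.clip(0.02 + 0.01 * sugar, 0, 1)

def tertile_rates(exposure):
    """Disease rate in the bottom vs. top tertile of an exposure."""
    lo, hi = np.quantile(exposure, [1 / 3, 2 / 3])
    return disease[exposure < lo].mean(), disease[exposure >= hi].mean()

western_score = sugar + red_meat  # the bundled "Western" pattern
for name, x in [("sugar", sugar), ("red meat", red_meat),
                ("Western score", western_score)]:
    low, high = tertile_rates(x)
    print(f"{name:>13}: bottom tertile {low:.3f}, top tertile {high:.3f}")

# Red meat, though causally inert here, shows a higher disease rate in
# its top tertile purely through its association with sugar intake.
```

Keep the two exposures separate and sugar’s effect is at least separable from meat’s; fold them into one composite pattern and it never is.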
As nutritionists in general took to playing up dietary patterns as another way to communicate their beliefs about a healthy diet, they continued to use the Harvard prudent and Western dietary patterns as their models.
In the early 2010s, the USDA embraced the dietary pattern concept for the DGAs and did the same, providing yet another way to package the DGAs in a manner that more Americans might actually follow. Now the DGAs offered up entire healthy dietary patterns—a vegetarian pattern, a Mediterranean pattern, a “Healthy U.S.-Style Dietary Pattern”—all with “the same core elements.”
Those “core elements,” of course, would be high intakes of fruits, vegetables, whole grains, and legumes, because those were foods prudent, health-conscious people prioritized, and low intakes of sugar, refined grains, red and processed meats, and high-fat dairy, because these were foods that prudent, health-conscious people knew enough to limit. Why? Because that’s what they were being told by health authorities, and they were prudent and health-conscious.
The problem is that prudent, health-conscious people would never know if Harvard’s prudent dietary patterns or the USDA’s healthy dietary patterns made them healthier, any more than Willett, Hu and their Harvard colleagues could know that.
But that’s what they would eat.
Meanwhile the DGAs could advise ever more plants and ever less meat, eggs and dairy in the guidelines because health-conscious people who avoided those foods were healthier. Yet they might have been healthier still had they eaten more meat and fewer plants. So long as that proposition was never tested in randomized controlled trials (or so long as the relevance of such tests was rejected when they came up negative), the truth would remain unknown.
4. Ask the wrong questions
Beginning in 2017, a new stage was added at the start of the DGA process. Rather than beginning by appointing a Dietary Guidelines Advisory Committee and letting it do as it pleased, the employees at the USDA and HHS who oversaw the process would first identify “topics and supporting questions” that they thought would be “of greatest importance and relevance to Federal nutrition programs, policies, and consumer education priorities.”
Those topics and questions would be posted on a USDA website and the public could comment. The questions would then be readjusted, if the USDA and HHS employees thought it necessary. The USDA would gather the data needed to answer the questions, and agency employees would do “Nutrition Evidence Systematic Reviews.” These reviews would use “rigorous, protocol-driven methodology” to support the DGACs in deciding how to modify the DGAs themselves if such modification was necessary. These systematic reviews would be the science base for the next guidelines.
And now we’re back to the circular logic, or the self-fulfilling prophecy of the DGAs.
As I’ve written many times, the process of science depends as much on the questions that are asked as on the observations or experiments done to answer them. Ask the wrong question, and it doesn’t matter what answer you get. If you think a particular dietary pattern is the healthiest way to eat and you want to know if that is true—the question—then you have to do an experiment to answer it. Specifically, a randomized controlled trial.
You randomly assign subjects to eat the dietary pattern you think is healthiest or other dietary patterns that may also be healthy, and keep the subjects eating that way, following one pattern or the others, long enough to determine whether you’re right.
Ideally, the USDA would ask whether the healthy dietary patterns promoted by the DGAs cause us to be healthier than other dietary patterns that we might otherwise think are healthy.
For instance, does eating a mostly-plant diet with limited refined grains and sugars cause us to be healthier than eating a mostly-animal-sourced diet with limited refined grains and sugars? Does eating a diet in which the fat sources are plant-based (seed oils) make us healthier than eating a diet in which the fat sources are from animals (butter, lard, tallow, etc.)?
These are critically important questions. Should I prioritize plant foods or animal-based foods if I want to maximize my health (and my children’s health)?
The problem, though, is that researchers have asked these kinds of questions—as a collaboration of evidence-based medicine researchers did in 2019, in a series of four systematic analyses published in Annals of Internal Medicine—and they could find only “low to very-low quality” evidence supporting this mostly-plant advice. (I wrote about these analyses for Unsettled Science here.)
But these are not the questions that the USDA asks. All the questions that guide the DGAC work, all the questions asked and answered by the systematic reviews, take the form of “what is the relationship” between dietary patterns and risk of chronic disease, just as the Harvard nutritional epidemiologists asked 25 years ago.
The list of questions in the recent DGAC report proceeds faithfully along these lines.
By asking what is the relationship between dietary patterns consumed and chronic diseases, they can answer correctly that the healthy dietary patterns (i.e., what prudent, health-conscious people eat because that’s how they’ve been told to eat) associate with “favorable health outcomes.” They do.
More importantly, that’s the answer even in the absence of supporting evidence from randomized controlled trials. RCTs are not needed to answer those questions. The USDA analysts can rely on observational studies alone (as they mostly do) because this is a question that can be answered with observations alone.
It is not a cause and effect question requiring experiments to answer. The USDA analysts are not asking whether the supposedly healthy dietary patterns cause people to be healthier than other dietary patterns—which might then be healthy, too, or even healthier—they ask “what is the relationship” between these diets and health.
And they come to the same conclusion that the NCI researchers did in 1993 and Willett, Hu, and their fellow Harvard epidemiologists did in 1999 and 2000, because it’s built into the question: people who eat this healthy dietary pattern, who are by definition prudent, health-conscious people, are healthier than those who don’t. All these healthy dietary patterns “associate with favorable health outcomes” because they were based on what prudent, health-conscious, and so relatively healthy, people eat.
Even when the USDA reviewers found (a very few) clinical trials that supposedly tested the safety and efficacy of the dietary pattern, the trials only tested the healthy dietary pattern against whatever people were eating normally—as this trial did, for instance, finding precious little benefit, or the MIND Diet trial did, as I discussed here, finding no benefit—not against potential competitors for such a pattern or different variations on the healthy pattern.
Once the nutritional epidemiologists from Harvard ignored the confounding problem and slipped dietary indulgences like sweets, desserts, sugary beverages and white bread into the same dietary pattern as meats, processed meats and high-fat dairy, they created the self-fulfilling prophecy of mostly plants.
With this circular logic, the USDA methodology assured that the DGAs would never conclude that meats, processed meats and high-fat dairy might be beneficial. With this logic, the DGAs will give us variations on mostly plants, shading into all plants, so long as there are DGAs.
Nina Teicholz has an excellent review of the evidence (or lack thereof) in the new DGAC report on her substack, Unsettled Science.
Vegetable oils and seed oils are also what health-conscious people were told to eat in this era. This is why relatively high intake of seed oils was part of the prudent diet. This categorization then confounds the seed oil story. That seed oil consumption associates with health consciousness, and so with favorable health outcomes, explains why the nutritional epidemiologists (particularly those from Harvard) and the USDA promote seed oils. Whether consuming them causes favorable health outcomes, of course, or does the opposite, requires RCTs to determine. This issue is at the heart of the seed oil debate.