The Confusing Role of Fat in the American Diet (Part 1)

Chosen Team

Continuing our look at how the American diet has evolved over time, particularly in relation to modern-day food processing, we'll now turn to the ever-confusing role of fats in our diet.

A Brief History of Fats

For most of history, particularly prior to the Industrial Revolution (the late 1700s and early 1800s), fat was a natural part of the human diet. It was not the "bad word" many consider it today, and liberal amounts of fat were commonplace in everyday meals. Farmers and self-sufficient homesteaders alike kept cattle for milk, cream, and home-churned butter. Butchers were expected to sell all parts of livestock for food without trimming away the fat; meat was not standardized to "98% lean" as we see today. Rich nuts and seeds were consumed without concern for their fat content, and even in ancient times they were pressed for oils used both topically and internally.

It was not long afterward that scientific advancements got in the way of keeping healthy fats in the American diet. The establishment of nationwide organizations like the United States Department of Agriculture (USDA, formed in 1862) and the Food and Drug Administration (FDA, formed in 1906) marked key turning points in our nation's health. Although striving to educate the public and promote more balanced diets, both organizations have endorsed health recommendations out of line with how people had eaten for centuries prior. And because independent knowledge and research were so scarce, many people were greatly influenced by their recommendations, and still are to this day.

One significant area where public health has been led astray by government and media attention is the recommendations for fat and the sources of fat added to popular foods. Naturally occurring, healthful fats were gradually replaced with chemically altered ones, the most problematic being trans-fatty acids, also known as hydrogenated or partially hydrogenated oils. According to the American Heart Association (AHA), the hydrogenation process was discovered in the 1890s as an effective food preservative that also made foods taste more pleasant. In the early 1900s, Crisco shortening hit the shelves and became an American pantry staple. It was cheaper to produce than high-quality butter, and during the Great Depression, when livestock was scarce, it gained even greater popularity. Just after World War II, margarine was developed and began to replace butter in bread and baked goods.

The Low-Fat Movement

By 1957, the American Heart Association was touting the claim that reducing saturated dietary fats (from sources like butter and beef) would reduce the risk of heart disease, and the popularity of margarine continued to grow (1). Similar claims were made by like-minded organizations, and the fats people had previously consumed were replaced or cut out as much as possible. Advocacy groups petitioned fast food restaurants and other public establishments to stop cooking and frying with saturated fats, leading to even more widespread use of trans fats.

Studies published in the late 1980s and early 1990s began to bring awareness to the dangers of consuming trans fatty acids. Conditions like coronary heart disease (2) and high LDL cholesterol levels (3) were attributed to trans fat consumption. This led to FDA-enforced labeling of trans fats on products, and the food industry began to shift away from their use. Unfortunately, vegetable oils like corn and canola were already gaining popularity and began to replace trans fats (4) in a "swapping one bad for another" type of situation.

Vegetable oils (a misnomer, as no common vegetable oils actually come from vegetables) were developed in the early 1900s via a chemical process of extracting triglycerides from plants. They gained steady popularity as a more affordable way to derive fats than from meat and dairy sources. Common examples still used today are canola, sunflower, peanut, cottonseed, and soybean oils (oils labeled simply "Vegetable Oil" are frequently made from soybeans). These oils are created using a highly technical refining process, which includes the use of sulfur dioxide (a pungent, irritating gas that can increase respiratory infections), acids, and high-heat extraction (5).

While the AHA now warns against the dangers of trans fats (6), it remains very far off base in recommending canola and corn oil as primary dietary fats (7). Additionally, the USDA's Choose My Plate nutrition program replaced the popular Food Pyramid (long considered the standard for a healthy diet) in 2011, and the latest plate depiction it has published doesn't even picture fat as part of a healthy plate (8). This only continues to fuel the idea that fat is bad and that Americans should limit their intake, creating a continually confusing scene for our essential friend, fat. In Part 2, we'll cover why Choose My Plate is lacking and debunk many of the myths surrounding the fats we should be eating.


References

1. A History of Trans Fats, American Heart Association, 2014
2. Intake of Trans Fatty Acids and Risk of Coronary Heart Disease…, The Lancet, 1993
3. Effect of Dietary Trans Fatty Acids on High-Density and Low-Density Lipoprotein Cholesterol Levels…, The New England Journal of Medicine, 1990
4. Corn Refining, Corn Refiners Association of America, 2014
5. The Corn Refining Process, Corn Refiners Association of America, 2014
6. Trans Fats, American Heart Association, 2014
7. Use Olive, Canola, Corn, or Safflower Oil as Your Main Kitchen Fats, American Heart Association, 2014
8. Choose My Plate Diagram, USDA, 2014