Throughout history, our understanding of nutrition and health has shifted dramatically. What people once considered miraculous cures or beneficial foods often turned out to be dangerous or downright toxic. The fascinating intersection of marketing, limited scientific knowledge, and cultural beliefs created a perfect storm for some truly bizarre dietary choices.
These historical food trends remind us how malleable our perceptions of health can be. Each generation believed they had discovered the secret to wellness, only to learn harsh lessons about the consequences. From radioactive drinks to metallic additives in bread, our ancestors consumed substances that would horrify modern consumers.
Radioactive Water as an Energy Drink

The early twentieth century witnessed an extraordinary craze for radioactive products as health tonics, and in the 1920s drinking radium- and thorium-infused water was widely advertised as healthy. The most notorious product was Radithor, a patent medicine marketed as an energy drink: distilled water containing at least 1 microcurie (37 kBq) each of the radium-226 and radium-228 isotopes, sold in 1⁄2 US fl oz (15 mL) bottles. Radithor was promoted by doctors, backed by “scientific” publications, and emblazoned across newspaper and billboard advertisements. In 1932, the illness and death of business magnate Eben Byers was unambiguously linked to his fervent use of Radithor, leading to the collapse of the radium fad and the strengthening of regulatory control over pharmaceutical and radioactive products in the United States.
Renowned doctors touted the benefits of this “elixir of life” and its healing effect on their patients. Radium expert Dr. Luther S.H. Gable of the Detroit Institute of Technology told an audience at a 1931 lecture that a radium-infused beverage was the cornerstone of his health regimen: he regularly drank a radium “highball,” fruit juice laced with radium emanations, to maintain peak physical condition. Eben Byers, a wealthy American socialite, athlete, industrialist, and Yale College graduate, drank some 1,400 bottles of Radithor beginning in 1927 and died in 1932 of various cancers as a result; before his death, his jaw had to be removed. Byers was buried in a lead-lined coffin, and when his remains were exhumed for study in 1965 they were still highly radioactive, containing about 100,000 times more radium than normal. His death became a cautionary tale, memorialized in an August 1990 Wall Street Journal article titled “The Radium Water Worked Fine Until His Jaw Came Off.”
Cocaine Toothache Drops for Children

Before the 20th century, regulations did not require manufacturers of ready-made remedies to list the ingredients in their medicines. Lloyd Manufacturing, however, openly advertised the cocaine in its toothache drops, perhaps taking advantage of the drug’s popularity; the promised instantaneous relief was likely provided by cocaine’s anesthetic properties. A cheery vintage advertisement depicting children playing in a yard promotes “cocaine toothache drops” sold at an Albany, New York, pharmacy in March 1885; the ad touts its 15-cent drops and promises an “instantaneous cure.” In the late 1800s, cocaine was being seriously explored as a local anesthetic.
Marketed mostly to children, these toothache drops contained small amounts of cocaine meant to ease tooth-related pain, and parents at the time had no idea what they were really giving their children. The medical community embraced cocaine after Austrian ophthalmologist Carl Koller discovered its anesthetic properties in 1884. He found that a few drops of cocaine solution applied to a patient’s cornea acted as a topical anesthetic, significantly reducing pain and making eye surgery far easier and less risky. Addiction, however, was on the rise: by 1902 there were an estimated 200,000 cocaine addicts in the United States, and by 1907 U.S. coca leaf imports had tripled from their 1900 levels.
Victorian Bread Laced with Lead and Chalk

You could highlight an array of foods in Victorian England that would fail to pass muster under any modern food safety law, from the lead chromate found in mustard to the arsenic compounds used to color confectionery. Given its ubiquity in households of the era, however, the most egregious example may well be bread. Seeking to produce thick loaves of an appealing white hue, Victorian bakers mixed in ingredients such as ground-up bones, chalk, and plaster of Paris. The drive for profitable bread production led to dangerous shortcuts that prioritized appearance over safety. Another common additive was alum, an aluminum-based compound that inhibited digestion and contributed to the malnutrition rampant among the poorer population. Although the dangers of adulterated food were known among the more educated members of the public, little stopped producers and distributors from ignoring these health risks in favor of profit.
These additives weren’t just cosmetic choices but represented a systematic contamination of the food supply. The white color of bread became a status symbol, driving bakers to use increasingly toxic substances to achieve the desired appearance. The widespread consumption of contaminated bread affected entire populations, particularly the working classes who relied on bread as a dietary staple.
Rhubarb Leaves During World War I

Known for its reddish stalks and tart flavor, rhubarb in the hands of a capable cook can be used to create delicious pies, sauces, and jams. That is, the stalks can be turned into such kitchen delights; the thick green leaves are loaded with toxic oxalic acid and are not meant for ingestion. Unfortunately, this fact was not well known a century ago, and rhubarb leaves were recommended as a substitute vegetable during the food shortages of World War I. The desperation of wartime rationing led authorities to promote dangerous alternatives without proper research.
Consumed in small but regular doses, the leaves’ oxalic acid binds dietary calcium and promotes the buildup of calcium oxalate, leading to kidney stones and other serious health complications. The irony was particularly cruel: rhubarb stalks were already a familiar food, but the distinction between edible stalks and toxic leaves was not widely understood among the general population. This dangerous recommendation shows how crisis situations can override basic food safety knowledge.
Tansy Plant as Medieval Easter Medicine

Medieval Europeans incorporated tansy into their Easter celebrations as both a symbolic and practical food: the bitter herb was believed to represent the suffering of Christ while cleansing the body after weeks of limited diet. Coming on the heels of Lent, tansy not only provided a welcome change from the strict regimen of lentils and fish consumed over the previous 40 days, but was also said to provide relief from the gas-inducing Lenten meals. Despite its purported medicinal qualities, the plant is actually mildly toxic, its thujone-rich oil dangerous to humans in high doses. Although the poison didn’t hinder tansy’s long-standing popularity on dinner tables, people are generally dissuaded from eating the plant today.
The plant’s yellow flowers and strong aroma made it an attractive addition to springtime meals. Traditional recipes included tansy puddings, tansy cakes, and tansy wine. The herb’s popularity persisted for centuries, with households cultivating tansy in their kitchen gardens specifically for culinary and medicinal purposes. Only modern toxicology research revealed the true dangers lurking in this seemingly innocent seasonal tradition.
Margarine as Heart-Healthy Butter Substitute

The origin of margarine, which is made from vegetable fat, dates back to the mid-1800s. Since that time, margarine has replaced butter as the fat spread of choice in most developed countries. This switch was driven by the lower price of margarine compared with butter as well as recommendations from health professionals to eat less saturated fat in order to prevent coronary heart disease (CHD). For decades, margarine was aggressively marketed as the healthier alternative to butter, supported by medical professionals and government dietary guidelines.
While the switch away from saturated fats was associated with reduced CHD incidence in the population, researchers also identified an independent link between CHD and trans fat, a fat produced when vegetable oils are partially hydrogenated to make margarine. Once this link was confirmed by multiple studies, regulatory agencies around the world sought to eliminate trans fats from the diet, and the food industry responded quickly, having produced trans fat-free margarine for years now. The irony is considerable: Crisco, introduced in 1911 as partially hydrogenated cottonseed oil, was the first major source of industrial trans fat, and margarine relied on the same hydrogenation process. Both were recommended for decades as replacements for saturated fats, a position reversed only recently as partially hydrogenated oils are phased out of the food supply and trans fats are now understood to be worse for heart health than the saturated fats they replaced.
The Dangerous Diet Culture of Extreme Starvation

Written by Italian nobleman Luigi Cornaro in 1558, La Vita Sobria (The Sober Life) advocated extreme starvation as a way to lose weight and boost health. The author himself lost weight and reportedly lived to the age of 102 on just 12 oz (about 340 g) of food and 14 fl oz (about 400 ml) of wine per day. Evidence suggests mice live longer when fed near-starvation rations, but this has not been proven in humans. This early diet book established a dangerous precedent for extreme caloric restriction that would influence diet culture for centuries.
The idea of “ideal body types” arose in the mid-1800s, and beauty became tied to the look of a person’s body. The thin ideal and form-fitting clothing of the mid-19th century were so prevalent that the first “diet influencer” emerged: Lord Byron, considered by the Victorians to be the most beautiful man in the world. Everybody wanted to look like him. He recounted a regimen of starving himself and then binge eating, after which he would try to sweat off any gained weight under many layers of clothing. He also popularized the vinegar diet, a practice that survives today in the habit of taking a tablespoon of apple cider vinegar before meals, despite lacking scientific support.
Conclusion

These historical examples serve as powerful reminders that our current understanding of nutrition and health isn’t the final word. Each generation believed they had discovered the ultimate path to wellness, only to learn that their “miracle foods” were often harmful or even deadly. The radioactive water that promised energy, the cocaine drops that offered pain relief, and the contaminated bread that symbolized prosperity all seemed logical within their historical context.
What makes these stories particularly sobering is recognizing how marketing, limited scientific knowledge, and cultural beliefs combined to create widespread acceptance of dangerous practices. The pattern repeats throughout history: promising new discoveries, enthusiastic adoption by medical professionals, aggressive marketing to consumers, and eventually the discovery of devastating health consequences.
These cautionary tales should make us more thoughtful about today’s health trends and supposed superfoods. What seems revolutionary and beneficial now might be tomorrow’s cautionary tale. Would you have guessed that our ancestors consumed such dangerous substances in the name of health?


