When Numbers Don’t Tell the Full Story

Here’s a number that might surprise you: up to 20% of people believe they have a food allergy, but only about 5% actually do. That’s a staggering gap between perception and reality, and it’s particularly pronounced when it comes to nut allergies. Recent studies show that while self-reported food allergy prevalence in Europe ranges from 13.1% to 19.9%, only 0.8% of people actually have confirmed food allergies when tested through oral food challenges. This disconnect isn’t just a statistical curiosity—it’s reshaping how we understand and treat allergic reactions. Peanut allergy affects approximately 1 out of 50 children, making it the second-most-common food allergy among children. But here’s where things get interesting: many of these “allergies” might not be real at all. Instead, experts point the finger at overdiagnosis and overreporting of food allergies, fueled by inaccurate testing methods and misconceptions about what a food allergy really is. The question isn’t whether nut allergies exist—they absolutely do and can be life-threatening—but whether we’re seeing a genuine epidemic or a case of medical mistaken identity.
The Self-Reporting Problem That’s Skewing the Numbers

Currently, the majority of available data is based on self-reporting, which generally overestimates food allergy prevalence by a factor of three to four. Think about it this way: if you had an upset stomach after eating peanuts once, you might tell people you’re allergic to them. But that single incident could have been food poisoning, stress, or even an unrelated stomach bug. Much of the gap comes from patients and parents mistaking other adverse reactions to food (e.g., food poisoning, enzyme deficiencies, contact dermatitis, or food aversions) for food allergy. The problem is so widespread that researchers have documented dramatic differences within the same country. For example, the prevalence of parent-reported food reactions was 14.5 percent among siblings of 1,570 German infants enrolled in the EuroPrevall study, whereas a separate study found food allergy confirmed by oral food challenge (OFC) in only 31 of 739 German children (4.2 percent). In one meta-analysis, the rate of self-reported food allergy was 12 percent for children and 13 percent for adults, compared with 3 percent when confirmatory testing was applied.
Testing Troubles: When Medical Tests Mislead

Part of the overdiagnosis problem stems from misunderstanding what common allergy tests actually measure. The skin prick test is reliable perhaps only 60 percent of the time, and many clinicians add serologic (blood) testing to increase the sensitivity and specificity of food allergy testing. However, true allergy is defined as sensitization plus clinical symptoms related to exposure; too often, allergy is simply assumed on the basis of a diagnostic test alone. It’s like assuming someone is afraid of heights because their heart rate increases when they look down from a tall building—the physical reaction doesn’t necessarily mean there’s a real problem. One of the major issues in food allergy is the common misconception that a “positive test,” whether a blood test or an allergy skin prick test, is equivalent to a clinical food allergy. For example, 2005-2006 National Health and Nutrition Examination Survey data showed a 7.6 percent rate of positive serum IgE tests to peanut, clearly higher than the prevalence of clinical peanut allergy. The concern was reinforced by a recent study in which patients who had been diagnosed with tree nut allergy on the basis of skin prick tests had no reaction when they actually ate tree nuts. The gold standard for diagnosis—the oral food challenge—is rarely used because it’s time-consuming, expensive, and carries risks.
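To see why a “positive” result so often overstates true allergy, it helps to run the base-rate arithmetic. The short sketch below is purely illustrative: the prevalence, sensitivity, and specificity figures are assumptions chosen for the example, not values taken from the studies cited above.

```python
# Illustrative back-of-the-envelope calculation (hypothetical numbers, not study data):
# even a reasonably accurate test produces mostly false positives when the
# condition it screens for is rare.

def positive_predictive_value(prevalence, sensitivity, specificity):
    """Probability that a positive test reflects a true allergy."""
    true_positives = prevalence * sensitivity
    false_positives = (1 - prevalence) * (1 - specificity)
    return true_positives / (true_positives + false_positives)

# Assume true peanut allergy affects ~2% of children, and a screening test
# detects 95% of real allergies (sensitivity) but also comes back "positive"
# in 10% of non-allergic children (90% specificity).
ppv = positive_predictive_value(prevalence=0.02, sensitivity=0.95, specificity=0.90)
print(f"Chance a positive result means a real allergy: {ppv:.0%}")  # about 16%
```

Under those assumed numbers, roughly five out of six positives would be false alarms, which is why a positive skin prick or IgE result is a starting point for investigation rather than a diagnosis.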
The Hygiene Hypothesis: Are We Too Clean for Our Own Good?

The immune system is a learning device, and at birth it resembles a computer with hardware and software but almost no data. Additional data must be supplied during the first years of life, through contact with microorganisms from other humans and the natural environment. If these inputs are inadequate or inappropriate, the regulatory mechanisms of the immune system can fail. As a result, the system attacks not only harmful organisms that cause infections but also innocuous targets such as pollen, house dust and food allergens. This theory, called the hygiene hypothesis, suggests that our modern obsession with cleanliness might be backfiring. In its usual form, the hypothesis holds that “clean living,” with less farm living and the use of medications to prevent and quickly treat infections, leaves our immune system in a state that is more prone to attack harmless proteins like those in foods, pollens, and animal dander. Recent research, however, complicates this simple narrative: the authors of one study note that while their findings do not directly contradict the hygiene hypothesis, they believe theirs is the first proof-of-concept study showing that diverse microbial exposures and infections are neither the sole nor the primary factors driving the dramatic rise of allergic diseases.
The Microbiome Connection: It’s Not About Dirt

The hygiene hypothesis has evolved beyond simple cleanliness concerns. The term “hygiene hypothesis” has been described as a misnomer because people incorrectly interpret it as referring to their own cleanliness: having worse personal hygiene, such as not washing hands before eating, only increases the risk of infection without affecting the risk of allergies or immune disorders. Modern research focuses instead on the microbiome—the trillions of beneficial bacteria, known as commensals, that live in and on our bodies. Our bodies actually contain more bacterial cells than human cells. What we’ve learned over the years is that the association with family life and the environment probably has more to do with the microbiome than with infections themselves. The increase in allergy rates is now primarily attributed to diet and reduced microbiome diversity, although the mechanistic reasons remain unclear. Think of your gut bacteria as a diverse ecosystem: when that ecosystem is disrupted by antibiotics, processed foods, or other factors, the immune system can become confused and overreactive. The use of antibiotics in the first year of life has been linked to asthma and other allergic diseases, and increased asthma rates are also associated with birth by Caesarean section.
Real Increases vs. Better Recognition

So are nut allergies actually increasing, or are we just better at identifying them? The answer is probably both. In one survey, 1.4 percent of children were reported to have peanut allergy in 2008, as opposed to just 0.4 percent in 1997, and the prevalence of combined peanut or tree nut allergy rose from 0.6 percent to 2.1 percent over the same period. These numbers suggest a real increase, but they’re based on self-reporting, which we know is problematic. A UK analysis of Clinical Practice Research Datalink (CPRD) records suggests that although the prevalence of physician-diagnosed food allergy has increased over the past decade, the incidence might have plateaued; because the CPRD estimates in children were consistent with clinical trial data from England, its authors argue that this kind of database analysis can provide a reliable method to monitor changes in the epidemiology of food allergy. What’s clear is that awareness has dramatically increased, leading to more diagnoses—but not all of them are accurate. Studies have indicated that food allergies are being overdiagnosed and overreported, and that many people may be needlessly avoiding certain foods.
The Doctor Dilemma: When Physicians Get It Wrong

Even healthcare professionals struggle with food allergy diagnosis. Compounding the problem, many physicians lack an understanding of how to apply common diagnostic tests and interpret the results. In a survey of 407 primary care physicians, fewer than 30 percent of the participants reported that they were comfortable interpreting laboratory tests to diagnose food allergy, and 38 percent indicated incorrectly that skin or blood tests were sufficient for a diagnosis. This creates a dangerous cycle: worried parents bring children to doctors who aren’t fully trained in allergy diagnosis, leading to more false positives. Many people are overdiagnosed when IgE tests are applied broadly or used as screening tools. People should be tested for food allergies only when their history is suggestive: the right symptoms, the right timing of onset after eating a food, and the right duration of symptoms. If someone is eating a food without any symptoms, they are not allergic to that food. And a “positive” IgE test does not necessarily mean someone is allergic; no one should be told to remove a food from their diet on the basis of testing alone. The result? Families avoiding foods unnecessarily, children missing out on important nutrients, and healthcare resources being wasted on phantom allergies.
The Anxiety Factor: When Fear Becomes Self-Fulfilling

There’s another layer to this story that’s often overlooked: the psychological component. Studies have shown that a large proportion of food-allergic patients, whether their reactions are severe or mild, never visit a doctor, and that physician-confirmed food allergies are associated with lower psychosocial burdens than self-reported cases. When parents become convinced their child has a food allergy, they often become hypervigilant about symptoms, potentially misinterpreting normal reactions as allergic ones. This anxiety can spread through families and communities, creating what researchers call “allergy clusters” in schools and neighborhoods. The stress of managing a perceived food allergy can actually trigger symptoms that mimic allergic reactions—stomach upset, skin rashes, or breathing difficulties. One study also found that the health-related quality of life (HRQL) of people who had been diagnosed by a doctor was significantly lower than that of those who had not. It’s a perfect storm of genuine concern, medical uncertainty, and the power of suggestion. The irony is that the very fear of allergic reactions can create symptoms that reinforce the belief in the allergy.
Modern Life’s Role: Beyond Cleanliness

While the hygiene hypothesis provides one explanation for rising allergy rates, researchers are discovering that modern life affects our immune systems in multiple ways. It is well established that the prevalence of allergic conditions tends to follow economic development and urbanization in many countries and regions; in developed countries, one in three children suffers from at least one allergic disorder. In line with the epidemiology of asthma, food allergy is also much less common in rural areas, and a clearer understanding of why urban and rural populations differ would pave the way for effective primary prevention. Urban environments expose us to different pollutants, we spend more time indoors, and our diets have changed dramatically: we eat a lot of processed food that lacks the components a healthy microbiome needs, such as fiber. The healthy bacteria in our gut need that fiber to maintain themselves; they are not only important for our immune system but absolutely critical to our ability to derive calories and nutrients from food. Even factors like birth method matter: babies born via cesarean section have different microbiomes than those born vaginally, potentially affecting allergy development.
The Solution: Smarter Testing and Targeted Hygiene

Addressing the overdiagnosis problem requires a multi-pronged approach. Some experts now speak of “targeted hygiene”—eliminating the spread of pathogens while promoting steps to restore a diverse microbiome. For example, teach children to wash their hands after handling raw chicken, but also encourage them to play outside in the dirt; as one expert put it, a child who comes in from the garden with slightly grubby hands can happily eat a sandwich without washing first. This approach recognizes that not all bacteria are bad and that some exposure to microbes is actually beneficial. To help avoid overdiagnosis of food allergies, we should consider moving away from skin prick and blood tests and toward greater use of oral food challenges. Any intervention will likely need to happen by 3 or 4 years of age, by which time a child’s microbiome is established and the immune system has completed much of its training. The window for immune system education is narrow, making early intervention crucial.
The nut allergy debate reveals something profound about modern medicine: sometimes the cure creates its own disease. While genuine nut allergies are serious and life-threatening conditions that deserve our attention and resources, the combination of anxious parents, inadequate testing, and well-meaning but undertrained physicians has created an epidemic of overdiagnosis. The real challenge isn’t just identifying who has allergies—it’s figuring out who doesn’t. As we move forward, the focus should shift from fear-based avoidance to evidence-based understanding, ensuring that children who truly need protection get it, while those who don’t aren’t unnecessarily restricted from enjoying a peanut butter sandwich.