More Tools, More Problems? Food Allergies Since 1960

This guest post was written by Theresa MacPhail—assistant professor in the Science, Technology, and Society Program at Stevens Institute of Technology. 

Last December, I wrote a blog post about the early history of food allergies from the 1800s through the 1960s and 1970s. In this installment, we’ll examine more recent food allergy chronicles, current treatments, and diagnosis debates. Despite advances in our understanding of the immune system and promising developments in allergy-related technologies (like the Allergy Amulet), we still lack a cure and effective treatments for food allergies.

The Discovery of IgE

Immunotherapy treatments were first tested in animals, and then cautiously applied in clinical settings to treat both respiratory allergies and food allergies beginning in 1911. The risk of an accidental anaphylactic response was, and is, ever present. Much of this early allergy testing and treatment remained unchanged until the mid-1960s, when two separate research teams discovered immunoglobulin E, or IgE, an antibody that circulates naturally in human blood.

IgE’s discovery led to a greater understanding of the inflammatory response that follows allergen exposure, sparking more research into the causes of allergic reactions. By 1975, the first reliable commercial blood test for IgE was available for clinical use. IgE testing quickly became a significant aid in allergy diagnosis, since elevated IgE levels in the blood often indicate a food allergy.

IgE has played an enormous role in subsequent allergy research, diagnosis, and treatment. However, while IgE tests indicate the likelihood of a food allergy, 50-60% of IgE blood tests yield a “false positive” result, creating a great deal of uncertainty in diagnosis. As an allergy biomarker, IgE is accordingly far from perfect.

Food Allergies: A Rising Prevalence?

If you follow the news or social media, or have a young child in the school system, it certainly seems that food allergies are on the rise. Although food allergy awareness has increased over the last decade and has become a more popular topic of conversation, the food allergy prevalence rate has been difficult to measure with confidence.

Figures on the national and global food allergy population are unsettled, largely because the numbers rely on multiple data sets collected with different methods by different research groups. Official estimates place the figure at around 15 million people in the US alone. Adding to the confusion is the difficulty of confirming the presence of an allergy with current diagnostic tools (often IgE testing, discussed above). The majority of food allergy and food intolerance cases depend on self-reporting and sometimes self-diagnosis, and those numbers fluctuate greatly. A recent paper looking at multiple allergy studies found that “[s]elf-reported prevalence of food allergy varied from 1.2% to 17% for milk, 0.2% to 7% for egg, 0% to 2% for peanuts and fish, 0% to 10% for shellfish, and 3% to 35% for [other foods].” A 2013 paper, which based its findings on “self-report, skin prick test (SPT), serum-specific IgE (sIgE), and oral food challenges (OFC),” further suggested that “at least 1%–2% and up to 10% of the US population suffers from food allergies.” These reports show that food allergy estimates vary with allergy type, reported severity, geographic region, study design, and testing method.

In short, with no easy and standardized way to diagnose food allergy cases, it is difficult to confirm and measure the perceived rise in the food allergy population.

The LEAP Study and the Future of Oral Immunotherapy

Perhaps the most significant study on food allergy in the last 50 years is the Learning Early About Peanut Allergy (LEAP) study by the Immune Tolerance Network. In this study, infants at a higher risk of developing a severe peanut allergy were randomly assigned to one of two groups: one that would avoid peanut-containing foods until age 5, and one that would consume at least 6 grams of peanut protein per week, spread across three or more meals, until age 5. Of the children who avoided peanut, 17% developed a peanut allergy, compared to only 3% of the children who regularly ate peanut. In a press release for the study, one of the researchers noted that for decades allergists had recommended that infants avoid consuming allergenic foods, and that this study “suggests that this advice was incorrect and may have contributed to the rise in peanut and other food allergies.” Indeed, the LEAP study overturned decades of prior advice and shook the allergy research community. It also gave credence to one of the oldest forms of allergy treatment: immunotherapy.

After a decade of research, oral immunotherapy is becoming more widely accepted as effective for the most common food allergies (e.g., peanut), but little is known about its long-term effectiveness. If you’re not familiar, oral immunotherapy (OIT) is a method of food desensitization in which the allergenic food is reintroduced to the immune system in gradually increasing amounts over time, with the goal of eventual tolerance. Although researchers are optimistic about its potential, it is not without its drawbacks. You can learn more about OIT in Allergy Amulet’s blog post here.

The Promise and Peril of Epinephrine

Epinephrine (the hormone adrenaline) was first discovered in 1900 and marketed to treat asthma attacks and surgical shock. By 1906, with the development of a synthetic version, the drug was in common use by clinicians to treat severe asthma attacks. Immunologists and allergists experimented with dosages in the decades following, standardizing treatment protocols.

In 1975, a biomechanical engineer developed the first auto-injector syringe for the military, which was then adapted for use with epinephrine. It wasn’t until 1987, however, that the FDA approved the first epinephrine auto-injector for the general public. Epinephrine auto-injectors proved so effective, and the dosage they delivered so consistent, that they became the standard prescription for anyone suffering from a severe allergy. By the 1990s, food allergy patients were advised to carry one at all times for their safety.

In 2016, the mother of a child with a severe food allergy began a campaign against the dramatic rise in price of one of the most popular epinephrine auto-injector brands: the EpiPen. Its price surged between 2004 and 2016, rising from roughly $100 to over $600. With few competitors on the market, Mylan Pharmaceuticals, the manufacturer of the EpiPen, felt little pressure to lower its prices. The story went viral and sparked debate about pharmaceutical industry pricing practices and access to affordable healthcare. Since the scandal broke, there have been calls to develop alternative, less expensive epinephrine auto-injectors.

The EpiPen story, and this post, highlight the urgent need for greater investment in allergy research and innovation. Let’s hope that with new advancements in the coming years, food allergy itself will be history.


Fact, Fad, or Fiction? A Brief History of Early Allergy Science

This guest post was written by Theresa MacPhail—assistant professor in the Science, Technology, and Society Program at Stevens Institute of Technology. 

“Many physicians think that idiosyncrasies to foods are imaginary.” – Albert Rowe, MD (1951)

Two years ago, my 63-year-old aunt developed hives. Large red wheals covered her entire body, and the slightest pressure to her skin—including wearing clothes—caused her pain. Over the course of her life, she had coped with eczema and the occasional rash, but this was new. This was different.

Her doctor sent her to a dermatologist, who—dumbfounded—sent her back to her doctor. After many medical appointments, blood tests, and rounds of steroids, an allergy specialist asked her to undertake an elimination diet, cutting out several foods. My aunt’s hives immediately cleared, and it was only after she introduced wheat back into her diet that the hives resurfaced. Her diagnosis: a wheat allergy.

My aunt’s experience is an all-too-common tale of food allergy diagnosis: routine misdiagnosis, persistent misconceptions, and a general lack of understanding within the broader medical community. What is it about food allergies that makes this story so familiar? Why are food allergies and intolerances so difficult to diagnose and treat? It turns out that our troubles with allergy diagnosis have a long and complicated history.

Rose Colds & Sea Anemones: Early Allergy Science

We begin in 1819, when the physician John Bostock presented the first clinical description of hay fever—or summer catarrh—to the medical community. By the mid-1800s, doctors had begun diagnosing patients with “summer” or “rose” colds (which we now call hay fever or seasonal allergies). In 1902, immunologists discovered they could produce an anaphylactic response in animals by injecting toxin from sea anemones into dogs, and they began experimenting with allergic reactions in the laboratory. These anaphylactic responses were not yet considered allergic reactions or “allergies”; that link would be made later.

Hay fever and seasonal allergies were relatively easy for clinicians to diagnose with skin tests and to treat with desensitization techniques. Desensitization—or allergen immunotherapy—in its early form involved converting allergens into a serum or vaccine and injecting it into the patient. Leonard Noon and John Freeman pioneered allergen immunotherapy in 1911, and the technique is still used to treat seasonal allergies today.

Until the early 20th century, food allergy remained something of a nebulous concept: it was widely recognized, but hadn’t yet been proven. In 1912, the American pediatrician Oscar Menderson Schloss lent legitimacy to food allergy diagnosis and proved its existence, developing a skin scratch test with which he correctly diagnosed an egg sensitivity. While this was seen as a breakthrough in allergy detection, skin scratch tests did not produce consistent results, as many patients with obvious clinical allergies didn’t react to them.

A leading difficulty with allergy diagnosis (food and seasonal)—both past and present—has been distinguishing allergy symptoms from the bevy of other ailments they mimic. Food allergy reactions are also highly idiosyncratic—meaning that no two patients with an egg or wheat sensitivity will necessarily react to the same degree or in the same fashion. Famed allergy specialist Warren T. Vaughan argued that the greatest difficulty in understanding and studying food allergy was the inconsistency of individual responses to the same allergen exposure. By 1931, after years of practice, Vaughan still couldn’t find logical patterns in his patients’ allergy symptoms. He had no explanation for why two patients reacted differently to equal doses of an allergen, concluding that “allergy to food is always an individual affair.”

By the late 1930s, physicians began realizing that chronic food allergies were far more prevalent among the general population than previously imagined. In some cases, food allergies were considered responsible for patient migraines, hives, intestinal troubles, bladder pain, and asthma. Guy Laroche and Charles Richet—two prominent French allergists at the time—argued that older physicians had failed to properly label food allergies as “alimentary anaphylaxis,” instead classifying these events as medical anomalies. For Laroche and Richet, rigorous tracking of patient diet and symptoms proved their hypothesis: physicians were failing to recognize anaphylactic episodes triggered by food as the result of an allergic response. This was a breakthrough.

A Fad Is Born & Modern Trends

Because allergy diagnoses relied heavily on patient input, and allergies themselves were poorly understood, many doctors dismissed them as a response to emotional stress or neurosis. Doctors believed that these patients—the majority of whom were women—overplayed their symptoms to garner attention or sympathy. Allergy became a “grab bag” diagnosis, especially in the hands of general practitioners. As diagnoses surged, Samuel Fineberg warned that the glut of allergy research—then only a few decades old—had led clinicians to dismiss allergies as just a trend. One prominent allergist observed that older generations regarded food allergy “as a passing fad.” Many today still view food allergies and intolerances as fads, although this is changing.

And while perceptions are evolving, allergy treatments have mostly remained stagnant. Between the confirmation of the first food allergy in 1912 and the late 1960s, avoidance was the only prescription for food allergy patients. In 1935, food allergy specialist Dr. Albert Rowe argued that mild allergies couldn’t be diagnosed with skin tests alone, and insisted that elimination diets were superior to skin testing. He created a guide for physicians and patients that remained widely used among allergists from the late 1930s until as late as the 1980s. Rowe counseled that food allergy should not be dismissed as “mere fancy” but taken as medical fact, and he helped shift the perception of food allergies in the medical community.

As this history shows, food allergy treatments haven’t changed much. Desensitization for seasonal allergies has been around since the early 1900s, and food allergy desensitization (oral immunotherapy), while relatively recent, builds on the same concept. With oral immunotherapy, the patient ingests the allergenic food in small but gradually increasing amounts. It’s not widely practiced at present and is offered by only a select number of allergists nationwide.

We can still see the echoes of this history when we look at current debates over food allergy versus food sensitivity designations. Take gluten, for example. While wheat allergy and the autoimmune disorder celiac disease are accepted medical conditions, gluten sensitivity is still debated by researchers and the public alike.

There is still much we don’t understand about food allergies and intolerances, but increasing research in this space holds promise for solving these medical mysteries. Fact, fad, or fiction? As history has shown, only through scientific advancements and research will facts eclipse fad and fiction.   

Part Two: Food Allergies Today

Stay tuned for part two of this story as we discuss the modern world of food allergy—epinephrine auto-injectors entering the market, the staggering increase in food allergy diagnosis, the LEAP study, and oral immunotherapy.
