Get Your Geek On: The Science Behind Food Allergy Testing

Food testing is serious business. It’s also a large and growing one: the market for food testing kits was valued at $1.58 billion in 2016. That figure is expected to climb to $2.38 billion by 2022.

Since the enactment of the Food Safety Modernization Act (FSMA) of 2011, food manufacturers are increasingly implementing comprehensive food testing procedures. Allergen testing has accordingly taken on a more prominent role in food safety plans. Traditionally, food allergen testing has been confined to the lab; but as new technologies emerge, and old technologies evolve, that’s starting to change.

In this post, we break down the most common food allergen detection technologies. We also discuss emerging technologies and approaches (including ours!) and why changes in food allergen detection are on the horizon. Spoiler alert: prepare for some major geeking out!

Liquid Chromatography-Mass Spectrometry (LC-MS)

As its name implies, liquid chromatography-mass spectrometry (LC-MS) is a two-phase test. During the liquid chromatography phase, a food sample is dissolved in a liquid and funneled through a highly-pressurized chromatography column, which separates molecules based on size and structure.

The mass spectrometer measures the mass of each molecule, as well as the masses of any molecular fragments. A molecule’s mass and fragmentation pattern provide identifying information about the molecule.
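To make the matching idea concrete, here is a toy sketch of how an observed set of fragment masses might be scored against a library of known fingerprints. The fragment masses and the one-molecule-per-entry library below are illustrative placeholders, not real reference spectra:

```python
# Toy fingerprint library: molecule -> characteristic fragment masses (m/z).
# Values are illustrative placeholders, not real reference spectra.
LIBRARY = {
    "caffeine":    {194, 138, 110},
    "theobromine": {180, 163, 137},
}

def identify(observed_masses):
    """Return the library molecule whose fragment masses best match the observation."""
    best_name, best_score = None, 0.0
    for name, fingerprint in LIBRARY.items():
        # Fraction of the known fragment masses actually seen in the sample.
        score = len(fingerprint & observed_masses) / len(fingerprint)
        if score > best_score:
            best_name, best_score = name, score
    return best_name, best_score

name, score = identify({194, 138, 110, 55})
print(name, score)  # the candidate sharing all of its fragments wins
```

Real instruments compare full intensity-weighted spectra with mass tolerances, but the principle is the same: the fragmentation pattern acts as a lookup key.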

Caffeine: Mass Spectrum

Mass spectrum fingerprint of caffeine.

Although LC-MS is a highly selective tool for molecular identification, LC-MS instruments are expensive and large. Even modest instruments can cost tens of thousands of dollars and stand as high and wide as a microwave. Higher-end instruments can be as large as a car! Test times are also relatively long, ranging from 10 to 30 minutes per food sample. Accordingly, these tests are generally confined to lab environments at present.

Ultraviolet, Visible Light, Infrared, and Raman Spectroscopy

These spectroscopic methods rely on light absorption. A molecule’s chemical structure determines which light wavelengths may be absorbed and the degree of absorption. Spectrometers shine a range of wavelengths at a food sample, and a molecule’s relative absorption of those different wavelengths generates an identifying “fingerprint” for that molecule. You can think of spectroscopy as the enLIGHTened approach to molecular detection 😉.

Caffeine: Infrared Spectrum

Infrared spectral fingerprint for caffeine. Peaks and dips signify degree of molecular light absorption.

Spectral fingerprints are ideal for identifying molecules in samples containing only a few ingredients. Spectra can be generated in a span of seconds, with high-resolution versions taking only one to two minutes. However, identifying molecules in complex mixtures like food samples can present serious challenges for spectroscopic methods, as spectral fingerprints are likely to overlap, making individual molecules difficult or impossible to identify—especially in low quantities. Accordingly, spectroscopy does not currently lend itself to allergen detection in food samples. Moreover, any spectrometer that could potentially afford sufficient selectivity for allergen detection would be large and costly.
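The overlap problem can be illustrated with a toy calculation. Treating each spectrum as a vector of absorption values and comparing vectors by cosine similarity is one common matching approach; the absorption numbers below are made up for illustration. A pure sample matches its reference cleanly, but a mixture of two ingredients with similar fingerprints matches both references almost equally well:

```python
import math

def cosine_similarity(a, b):
    # Treat each spectrum as a vector of absorption values, one per wavelength bin.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy "fingerprints": absorption at five wavelength bins (made-up values).
caffeine = [0.9, 0.1, 0.7, 0.2, 0.4]
sugar    = [0.8, 0.2, 0.6, 0.3, 0.5]

# A pure sample matches its reference almost perfectly...
pure = [0.9, 0.1, 0.7, 0.2, 0.4]
# ...but a mixture's spectrum is a blend of its ingredients' spectra.
mixture = [(c + s) / 2 for c, s in zip(caffeine, sugar)]

print(round(cosine_similarity(pure, caffeine), 3))    # identical spectra -> 1.0
print(round(cosine_similarity(mixture, caffeine), 3)) # still very high
print(round(cosine_similarity(mixture, sugar), 3))    # also very high -> ambiguous
```

When both candidate scores are near 1.0, the spectrometer cannot say which ingredient (or how much of it) is present — exactly the problem complex food samples pose.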

Immunoassays & ELISA

Immunoassay tests rely on antibodies. Antibodies are naturally occurring proteins in the body’s immune system designed to recognize and fight potentially harmful foreign materials. Each antibody is formed to recognize a specific target—usually a protein or protein fragment. Since the 1950s, scientists have cultivated antibodies to function outside of the body. These antibodies led to tests known as immunoassays. There are many variants of immunoassays, including ELISA (enzyme-linked immunosorbent assay) tests, which many food manufacturers use to test for allergens during the manufacturing process.

In a typical immunoassay, a liquid sample suspected of containing a particular allergenic protein is exposed to a test strip containing antibodies, which are formulated to recognize that specific protein. If the target protein is present, the protein will stick to the antibodies on the test strip and a secondary reaction will stain the bound protein, causing the test strip to change color.

Immunoassays are highly selective, portable, and can produce results in as little as a few minutes. However, culturing and harvesting specific antibodies can be expensive. Moreover, antibodies—like most proteins—are sensitive to harsh conditions like high temperatures or extreme pH levels. The integrity of these tests, therefore, depends on adequate storage conditions. Antibodies are also known to have relatively short shelf lives and typically degrade within one year.

PCR and Molecular Beacons

Another technology in the allergen detection field involves identifying DNA sequences from an allergenic ingredient using a combination of the polymerase chain reaction (PCR) and molecular beacons. Don’t worry, it’s not as complicated as it sounds.

One way to test for an allergenic ingredient is to detect DNA segments unique to that ingredient. DNA is made of two complementary strands, and when one strand finds its complement, they bind. Simple enough. PCR uses the complementary nature of DNA to identify and exponentially replicate target DNA strands. This replication makes the DNA strands easier to detect using what are called molecular beacons: specialized molecular tags that turn fluorescent upon binding to a target DNA strand. These illuminated beacons can then be measured with a fluorescence spectrometer. While PCR-based assays are sensitive and selective, these tests are generally better suited for laboratory environments because they require automated laboratory equipment.
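The exponential replication is simple arithmetic: each PCR cycle (at best) doubles the number of target strands. A minimal sketch, with an `efficiency` parameter added for illustration since real reactions fall short of perfect doubling:

```python
def pcr_copies(initial_copies, cycles, efficiency=1.0):
    """Expected copy count after PCR amplification.

    efficiency is the fraction of strands duplicated per cycle
    (1.0 = ideal doubling; real reactions are somewhat lower).
    """
    return initial_copies * (1 + efficiency) ** cycles

# Even a handful of target DNA strands becomes billions after ~30 cycles,
# which is what makes the fluorescent beacon signal measurable.
print(pcr_copies(10, 30))  # ideal case: 10 * 2**30 copies
```

This exponential growth is why PCR-based tests can detect vanishingly small traces of an allergenic ingredient's DNA.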

Historically, molecular beacons have been used to detect nucleotide chains like DNA; more recently, molecular beacons are being used to bind and stain proteins—including allergens—instead of DNA sequences. In this approach, PCR is not necessary, as the molecular beacons attach directly to the protein. Notably, molecular beacon tags require a fluorescence spectrometer to measure the target allergenic protein or nucleotide sequences.

Molecularly Imprinted Polymers (Allergy Amulet’s Technology!)

Molecularly imprinted polymer (MIP) sensors are an exciting emerging technology. MIPs are highly-specialized plastic films molded to recognize a single target molecule, such as an allergenic protein or a chemical tracer for an allergenic ingredient. Historically, molecularly imprinted polymers have been used for drug separation and delivery. Only recently have MIPs been adapted for use as molecular recognition elements in electronic sensing devices.

Building an MIP is similar in concept to creating a lock for which the target molecule is the key. Our polymer films contain hundreds of trillions of cavities (locks), which recognize a specific target molecule (key) by size, shape, and complementary electron charge distribution. The molding procedure means that MIPs can be designed to recognize a wide variety of molecules. Our Scientific Advisor, Dr. Joseph BelBruno, was the first to develop electronic MIP sensors for detecting nicotine and marijuana. Allergy Amulet is the first to develop MIP sensors for detecting allergenic ingredients.

Imprinted cavity molded to bind to a specific target molecule.

Because the core ingredient in a MIP-based sensor is a specialized plastic, MIP films are highly durable and affordable to produce. The high specificity of target binding, coupled with a straightforward electrochemical resistance measurement, allows for rapid and portable testing.

That’s it! Now you know the science behind allergen detection methodologies. We hope you enjoyed geeking out with us for a short while. Until next time!

- The Allergy Amulet Science Team


These scientific explanations have been simplified to accommodate our nontechnical readership. 



Fact, Fad, or Fiction? A Brief History of Early Allergy Science

This guest post was written by Theresa MacPhail—assistant professor in the Science, Technology, and Society Program at Stevens Institute of Technology. 

“Many physicians think that idiosyncrasies to foods are imaginary.” – Albert Rowe, MD (1951)

Two years ago, my 63-year-old aunt developed hives. Large red wheals covered her entire body, and the slightest pressure to her skin—including wearing clothes—caused her pain. Over the course of her life, she had coped with eczema and the occasional rash, but this was new. This was different.

Her doctor sent her to a dermatologist, who—dumbfounded—sent her back to her doctor. After many medical appointments, blood tests, and rounds of steroids, an allergy specialist asked her to undertake an elimination diet, cutting out several foods. My aunt’s hives immediately cleared, and it was only after she introduced wheat back into her diet that the hives resurfaced. Her diagnosis: a wheat allergy.

My aunt’s experience is an all-too-common tale of food allergy classification: routine misdiagnosis, common misconception, and a general lack of understanding within the broader medical community. What is it about food allergies that make this story so familiar? Why are food allergies and intolerances so difficult to diagnose and treat? It turns out that our troubles with allergy diagnosis have a long and complicated history.

Rose Colds & Sea Anemones: Early Allergy Science

We begin in 1819, when the physician John Bostock presented the first clinical description of hay fever—or summer catarrh—to the medical community. By the mid-1800s, doctors had begun diagnosing patients with “summer” or “rose” colds (which we now call hay fever or seasonal allergies). In 1905, immunologists discovered they could produce an anaphylactic response in animals (injecting toxin from sea anemones into dogs) and began experimenting with allergic reactions in the laboratory. These anaphylactic responses to sea anemones were not considered allergic reactions or “allergies.” That link would be discovered later.

Hay fever and seasonal allergies were relatively easy for clinicians to diagnose with skin tests and to treat with desensitization techniques. Desensitization—or allergen immunotherapy—in its early form involved converting allergens into a serum or vaccine and injecting it into a patient. Leonard Noon and John Freeman discovered allergen immunotherapy in 1911, and the technique is still used for treating seasonal allergies today.

Until the early 20th century, food allergy remained a somewhat nebulous concept: widely recognized, but not yet proven. In 1912, Oscar Menderson Schloss breathed legitimacy into food allergy diagnosis and proved its existence. An American pediatrician, Schloss developed a skin scratch test with which he correctly diagnosed egg sensitivity. While this was seen as a breakthrough in allergy detection, skin scratch tests did not produce consistent results, as many patients with obvious clinical allergies didn’t react to these tests.

A leading difficulty with allergy diagnosis (food and seasonal)—both past and present—has been distinguishing allergy symptoms from the bevy of other ailments they mimic. Food allergy reactions are also highly idiosyncratic—meaning that no two patients with an egg or wheat sensitivity will necessarily react to the same degree or in the same fashion. Famed allergy specialist Warren T. Vaughan argued that the greatest difficulty in understanding and studying food allergy is the inconsistency of responses to different exposure levels among individuals. By 1931, after years of practice, Vaughan still couldn’t find logical patterns in the allergy symptoms of his patients. He had no explanation for why two patients reacted differently to equal doses of an allergen, concluding that “allergy to food is always an individual affair.”

By the late 1930s, physicians began realizing that chronic food allergies were far more prevalent among the general population than previously imagined. In some cases, food allergies were considered responsible for patient migraines, hives, intestinal troubles, bladder pain, and asthma. Guy Laroche and Charles Richet—two prominent French allergists at the time—argued that older physicians had failed to properly label food allergies as “alimentary anaphylaxis,” instead classifying these events as medical anomalies. For Laroche and Richet, the vigorous tracking of patient diet and symptoms proved their hypothesis: physicians were failing to recognize anaphylactic episodes to food as the result of an allergic response. This was a breakthrough.

A Fad is Born & Modern Trends

Because allergy diagnoses relied heavily on patient input and were poorly understood, many doctors dismissed allergies as a response to emotional stress or neurosis. Doctors believed that these patients—the majority of whom were women—overplayed symptoms to garner attention or sympathy. Allergy became a “grab bag” diagnosis, especially in the hands of general practitioners. As diagnoses surged, Samuel Fineberg warned that the glut of allergy research—only a few decades old—had led clinicians to dismiss allergies as just a trend. One prominent allergist observed that older generations regarded food allergy “as a passing fad.” Many today still view food allergies and intolerances as fads, although this is changing.

And while perceptions are evolving, allergy treatments have mostly remained stagnant. Between confirmation of the first food allergy in 1912 and the late 1960s, avoidance was the only prescription for food allergy patients. In 1935, food allergy specialist Dr. Albert Rowe argued that mild allergies couldn’t be diagnosed with skin tests alone, and insisted that elimination diets were a superior remedy to skin testing. He created a guide for physicians and patients, which became widely used among allergists from the late 1930s to as late as the 1980s. Rowe counseled that food allergy should not be dismissed as “mere fancy” but taken as medical fact, and helped shift the perception of food allergies in the medical community.

As evidenced in this history, food allergy treatments haven’t changed much. Desensitization for seasonal allergies has been around since the early 1900s; food allergy desensitization (oral immunotherapy), while more recent, builds on the same concept. With oral immunotherapy, the patient ingests small, gradually increasing amounts of the allergenic food. It’s not widely practiced at present and is only offered by select allergists nationwide.

We can still see the echoes of this history when we look at current debates over food allergy versus food sensitivity designations. Take gluten, for example. While wheat allergy and the autoimmune disorder Celiac Disease are accepted medical conditions, gluten sensitivity is still debated by researchers and the public alike.

There is still much we don’t understand about food allergies and intolerances, but increasing research in this space holds promise for solving these medical mysteries. Fact, fad, or fiction? As history has shown, only through scientific advancements and research will facts eclipse fad and fiction.   

Part Two: Food Allergies Today

Stay tuned for part two of this story as we discuss the modern world of food allergy—epinephrine auto-injectors entering the market, the staggering increase in food allergy diagnosis, the LEAP study, and oral immunotherapy.