OIT—Is It For Me?

Feeding peanuts to your peanut-allergic child is not easy for a mother—I would know, I do it every day. Your instincts as a parent are to keep your child as far out of harm’s way as possible. But in today’s world, peanuts may be the best management tool we have for my daughter’s peanut allergy.

Let me explain.

My daughter was born with a severe allergy to peanuts and tree nuts. For the first three years of her life, we strictly avoided these foods. She’s now four. Last April, we agreed to an oral food challenge at her allergist’s office to find out if she was still allergic. Her peanut blood test numbers had dropped considerably—this blood test measures levels of Immunoglobulin E (IgE) to individual allergens in the body. IgE is the antibody that triggers food allergy symptoms. Plus, she hadn’t been exposed to peanut since she was a baby. Unfortunately, the outcome of the oral food challenge wasn’t what we had hoped: after ingesting ¼ of a peanut, split into three gradually increasing doses over a 45-minute period, she experienced an anaphylactic event and we had to administer epinephrine. It was an emotional day, to say the least.

After discovering that she was still severely allergic to peanuts, we decided to explore oral immunotherapy: a method of food desensitization that involves re-introducing the immune system to the allergenic food in gradually increasing amounts over time, with the goal of eventual tolerance.

For our family, the results have been life changing. The same little girl who reacted to ¼ of a peanut now eats 12 peanuts daily with zero symptoms. But OIT is not necessarily for everyone, so I’d like to share our family’s journey and offer some insights into the process so that you can determine whether it’s a good fit for you or your child.

If your allergist doesn’t have a clear picture of your allergy severity, treatment may start with an oral food challenge. Once a patient has been identified as an OIT candidate, they are typically given a juice-like beverage containing tiny amounts of the allergen, which is consumed during the same two-hour window every day. Depending on how quickly the patient builds up tolerance, the allergist may recommend coming in every week or two for an “updose”—an increase in the amount of allergen consumed. As the immune system grows more tolerant, the patient eventually moves to a powder form (typically sprinkled onto food), and finally to solids (e.g., whole nuts).

Importantly, OIT requires a considerable time commitment. Although updosing typically occurs every week or two, the allergen must be consumed every day to build and maintain tolerance. OIT also places constraints on physical activity. During OIT, the patient can only engage in calm, quiet activity for the half hour before dosing and for at least two hours afterward (the observation period). This ensures that the immune system doesn’t get “revved up” unnecessarily and trigger an allergic reaction.

Is OIT perfect? Not quite. For the foreseeable future, my daughter must eat 12 peanuts, followed by a two-hour observation period, every day. However, we can now choose the time frame each day, and we expect the observation period to shorten over time. There’s also a measure of unpredictability. On two occasions, our daughter developed a couple of hives after her prescribed dose, and we had to give her antihistamines. Other times, we had to lower her dose because she was sick, which can compromise the immune system. It is these situations, and the risk of a more serious adverse outcome, that discourage many allergists from taking up the practice. Indeed, OIT is still relatively controversial. Additionally, OIT treatments are still in their nascent stages and are not widely practiced, so there is less data and information available.

Importantly, not every food-allergic child or adult is a good candidate for OIT. For example, if a patient has severe environmental allergies, acute asthma, or eosinophilic esophagitis, they likely will not qualify for OIT. Additionally, OIT treatment is not available for all allergens—desensitization to peanuts, for example, is far more commonly practiced than desensitization to, say, shellfish.

If you think OIT may be of interest to your family, I’d encourage you to talk to your allergist and seek out additional information and guidance. You can also reach out to me at mnohe@allergyamulet.com for more on the parent perspective—I’m always up for a good food allergy chat!

- Meg, Director of Strategic Development

B the Change: The Future of Business

 

Occasionally, our blog content may depart from its typical focus on food allergies to discuss topics related to business and entrepreneurship. We are a start-up, after all. In this post, we examine the role of corporate social responsibility in today’s world. This piece also coincides with a guide I wrote for entrepreneurs on the same topic, which Yale and Patagonia jointly published today.

In a recent column in The Week, William Falk discusses the loss of civility in America. He starts with a story of his pregnant co-worker standing in a crowded NYC subway car, waiting for someone to offer her a seat (spoiler alert: no one did), and segues into what he sees as a “me-first” mentality overtaking common decency in America. I was reminded of a recent experience at an airport, where I witnessed two disabled men and a member of the military waiting until the platinum, gold, and first-class passengers boarded before they were invited on the plane. Capitalism at its finest.

Falk attributes this culture shift partly to our “brutally Darwinian” workplace culture, in which overtime is encouraged and vacation is a luxury few can afford. Falk submits that this workhorse mentality fuels an economic struggle of survival that leads to competition and hostility, both inside and outside the workplace. I think few would dispute the considerable amount of economically motivated resentment in our country right now.

So how did we get here? Why the Darwinian corporate culture? I think the problem started, in part, with one man; and the solution, I believe, lies partly in the guide I mentioned at the start of this piece.

Let me explain.

In 1970, Nobel Prize-winning economist Milton Friedman, a champion of free market economics, famously stated: “There is one and only one social responsibility of business[:] to increase its profits.” In other words, companies must value profits above all else and are not bound by a commitment to the communities and people they employ and serve. This ideology came to be known as the doctrine of shareholder primacy. But the problem with this theory is that profits often come at the expense of worker well-being, community health, and the environment.

Now, say your business decided to give employee well-being equal weight with shareholder profits in corporate decision-making. What then?

In 2006, the nonprofit organization B Lab launched a certification process aimed at supporting just those kinds of stakeholder-driven business decisions. The objective was to separate the truly “good” companies from those that merely had good marketing departments by implementing a standard vetting process. B certification created a system that measures social and environmental impact and ensures reporting compliance through more stringent measures of accountability and corporate transparency. In 2010, states began implementing statutes under which companies could incorporate as Benefit Corporations: an alternative to a C-Corp or LLC that builds the same values of Certified B Corporations into the company’s charter and articles of incorporation. The same principle informs both certification and incorporation: placing stakeholder interests (shareholders, employees, community, and environment) on equal footing in corporate decision-making. The guide I mentioned offers a detailed roadmap for businesses interested in securing either B certification or incorporation status.

If all of this B stuff is new to you, you’re not alone—though you’ve probably heard of Patagonia, Etsy, or Warby Parker, all of which are either a Certified B Corporation or a Benefit Corporation. To date, there are nearly 4,000 Benefit Corporations and 2,000 Certified B Corporations in existence, and these entities are just the tip of the iceberg. According to a recent report, “social impact has evolved from a pure PR play to an important part of corporate strategy to protect and create value.” JP Morgan estimates that the market for socially responsible investing stands somewhere between $400 billion and $1 trillion.

We live in an increasingly interconnected world, one in which we are (ironically) becoming more and more disconnected from the communities and people around us. We often don’t know the people who sew our clothes, the farmers who grow our produce, and the manufacturers who assemble our electronics. Globalization and interconnectedness have increased trade and communication between countries, but they have also fueled income inequality and transferred millions of jobs overseas. 

Our nation’s businesses and economy are only as healthy and as strong as the communities, environment, and employees they serve and on which they depend. Milton Friedman was wrong. The rise in popularity of Certified B Corporations and Benefit Corporations is largely a response to that realization. Now, more than ever, businesses must build social values into their bottom line. It’s not just good for society—it’s good for business.

-Abi, CEO & Co-Founder

 

Allergy Amulet is neither a Certified B Corporation nor a Benefit Corporation. Our company is currently pre-revenue and does not yet have a product on the market. Because of this, it is too early for certification and too costly to reincorporate. The company plans to pursue both designations at the appropriate time.


Fact, Fad, or Fiction? A Brief History of Early Allergy Science

This guest post was written by Theresa MacPhail—assistant professor in the Science, Technology, and Society Program at Stevens Institute of Technology. 

“Many physicians think that idiosyncrasies to foods are imaginary.” – Albert Rowe, MD (1951)

Two years ago, my 63-year-old aunt developed hives. Large red wheals covered her entire body, and the slightest pressure to her skin—including wearing clothes—caused her pain. Over the course of her life, she had coped with eczema and the occasional rash, but this was new. This was different.

Her doctor sent her to a dermatologist, who—dumbfounded—sent her back to her doctor. After many medical appointments, blood tests, and rounds of steroids, an allergy specialist asked her to undertake an elimination diet, cutting out several foods. My aunt’s hives immediately cleared, and it was only after she introduced wheat back into her diet that the hives resurfaced. Her diagnosis: a wheat allergy.

My aunt’s experience is an all-too-common tale of food allergy classification: routine misdiagnosis, common misconceptions, and a general lack of understanding within the broader medical community. What is it about food allergies that makes this story so familiar? Why are food allergies and intolerances so difficult to diagnose and treat? It turns out that our troubles with allergy diagnosis have a long and complicated history.

Rose Colds & Sea Anemones: Early Allergy Science

We begin in 1819, when the physician John Bostock presented the first clinical description of hay fever—or summer catarrh—to the medical community. By the mid-1800s, doctors had begun diagnosing patients with “summer” or “rose” colds (which we now call hay fever or seasonal allergies). In 1905, immunologists discovered they could produce an anaphylactic response in animals (injecting toxin from sea anemones into dogs) and began experimenting with allergic reactions in the laboratory. These anaphylactic responses to sea anemones were not considered allergic reactions or “allergies.” That link would be discovered later.

Hay fever and seasonal allergies were relatively easy for clinicians to diagnose with skin tests and to treat with desensitization techniques. Desensitization—or allergen immunotherapy—in its early form involved converting allergens into a serum or vaccine and injecting it into the patient. Leonard Noon and John Freeman pioneered allergen immunotherapy in 1911, and the technique is still used to treat seasonal allergies today.

Until the early 20th century, food allergy remained something of a nebulous concept. It was widely recognized, but it hadn’t yet been proven. In 1912, the American pediatrician Oscar Menderson Schloss breathed legitimacy into food allergy diagnosis and proved its existence by developing a skin scratch test with which he correctly diagnosed an egg sensitivity. While this was seen as a breakthrough in allergy detection, skin scratch tests did not produce consistent results, as many patients with obvious clinical allergies didn’t react to them.

A leading difficulty with allergy diagnosis (food and seasonal)—both past and present—has been distinguishing allergy symptoms from the bevy of other ailments they mimic. Food allergy reactions are also highly idiosyncratic—meaning that no two patients with an egg or wheat sensitivity will necessarily react to the same degree or in the same fashion. Famed allergy specialist Warren T. Vaughan argued that the greatest difficulty in understanding and studying food allergy is the inconsistency of responses to different exposure levels among individuals. By 1931, after years of practice, Vaughan still couldn’t find logical patterns in the allergy symptoms of his patients. He had no explanation for why two patients reacted differently to equal doses of an allergen, concluding that “allergy to food is always an individual affair.”

By the late 1930s, physicians began realizing that chronic food allergies were far more prevalent among the general population than previously imagined. In some cases, food allergies were considered responsible for patient migraines, hives, intestinal troubles, bladder pain, and asthma. Guy Laroche and Charles Richet—two prominent French allergists at the time—argued that older physicians had failed to properly label food allergies as “alimentary anaphylaxis,” instead classifying these events as medical anomalies. For Laroche and Richet, the vigorous tracking of patient diet and symptoms proved their hypothesis: physicians were failing to recognize anaphylactic episodes to food as the result of an allergic response. This was a breakthrough.

A Fad is Born & Modern Trends

Because allergy diagnosis relied heavily on patient input, and because allergies themselves were poorly understood, many doctors dismissed them as a response to emotional stress or neurosis. Doctors believed that these patients—the majority of whom were women—overplayed symptoms to garner attention or sympathy. Allergy became a “grab bag” diagnosis, especially in the hands of general practitioners. As diagnoses surged, Samuel Fineberg warned that the glut of allergy research—only a few decades old—had led clinicians to dismiss allergies as just a trend. One prominent allergist observed that older generations regarded food allergy “as a passing fad.” Many today still view food allergies and intolerances as fads, although this is changing.

And while perceptions are evolving, allergy treatments have mostly remained stagnant. Between confirmation of the first food allergy in 1912 and the late 1960s, avoidance was the only prescription for food allergy patients. In 1935, food allergy specialist Dr. Albert Rowe argued that mild allergies couldn’t be diagnosed with skin tests alone, and insisted that elimination diets were superior to skin testing. He created a guide for physicians and patients, which remained widely used among allergists from the late 1930s through the 1980s. Rowe counseled that food allergy should not be dismissed as “mere fancy” but taken as medical fact, and he helped shift the perception of food allergies in the medical community.

As evidenced in this history, food allergy treatments haven’t changed much. Desensitization for seasonal allergies has been around since the early 1900s; food allergy desensitization (oral immunotherapy), while relatively more recent, builds on the same concept. With oral immunotherapy, the patient ingests the allergenic food in gradually increasing amounts. It’s not widely practiced at present, and is only offered by select allergists nationwide.

We can still see the echoes of this history when we look at current debates over food allergy versus food sensitivity designations. Take gluten, for example. While wheat allergy and the autoimmune disorder Celiac Disease are accepted medical conditions, gluten sensitivity is still debated by researchers and the public alike.

There is still much we don’t understand about food allergies and intolerances, but increasing research in this space holds promise for solving these medical mysteries. Fact, fad, or fiction? As history has shown, only through scientific advancements and research will facts eclipse fad and fiction.   

Part Two: Food Allergies Today

Stay tuned for part two of this story as we discuss the modern world of food allergy—epinephrine auto-injectors entering the market, the staggering increase in food allergy diagnosis, the LEAP study, and oral immunotherapy.
