Herd Mentality Among Pediatricians Caused Current Peanut Allergy Epidemic

A headline, “How Pediatricians Caused the Peanut Allergy Epidemic,” got me to click the other day. The article makes some important points, I think.

Having a peanut allergy is a serious health concern, for adults and children alike. For a sensitized person, exposure to peanut-containing products can be fatal if an EpiPen or an emergency room is not available for an epinephrine injection. Since this is an economics blog, I’ll note that a 2012 survey estimated the economic cost of food allergies in US children at $24.8 billion annually, or $4,184 per allergic child. This includes direct medical costs as well as indirect costs, including opportunity costs, for children and their caregivers.

Out of an abundance of caution, pediatricians in the 1990s began recommending that parents keep peanuts away from their infants and children. Instead of protecting children, however, this policy did just the opposite. The incidence of peanut allergies has soared, with some 2.5% of the pediatric population now showing peanut allergies:

Around the year 2000 peanut allergies began to skyrocket. Sales of EpiPens, used in cases of peanut-induced anaphylactic shock, became a major expense for parents and a growing profit center for the manufacturer. … So, what changed? How did peanuts go from cheap, nutritious food source to become the little death pills that we think of them today? The answer is not what you would expect: pediatricians created the peanut allergy epidemic.

Meanwhile, the more that health officials implored parents to follow the recommendation, the worse peanut allergies got. From 2005 to 2014, the number of children going to the emergency department because of peanut allergies tripled in the U.S. By 2019, a report estimated that 1 in every 18 American children had a peanut allergy. 

It did not have to go this way. Poking around the web, I found another article, titled “The Medical Establishment Closes Ranks, and Patients Feel the Effects,” which framed matters in terms of physician behavior:

Peanut allergies in American children more than tripled between 1997 and 2008, after doctors told pregnant and lactating women to avoid eating peanuts and parents to avoid feeding them to children under 3. This was based on guidance issued by the American Academy of Pediatrics in 2000.

You probably also know that this guidance, following similar guidance in Britain, turned out to be entirely wrong and, in fact, avoiding peanuts caused many of those allergies in the first place.

That should not have been surprising, because the advice violated a basic principle of immunology: Early exposure to foreign molecules builds resistance. In Israel, where babies are regularly fed peanuts, peanut allergies are rare. Moreover, at least one of the studies on which the British advice was based showed the opposite of what the guidance specified.

As early as 1998, Gideon Lack, a British pediatric allergist and immunologist, challenged the guidelines, saying they were “not evidence-based.” But for years, many doctors dismissed Dr. Lack’s findings, even calling his studies that introduced peanut butter early to babies unethical.

When I first reported on peanut allergies in 2006, doctors expressed a wide range of theories, at the same time that the “hygiene hypothesis,” which holds that overly sterile environments can trigger allergic responses, was gaining traction. Still, the guidance I got from my pediatrician when my second child was born that same year was firmly “no peanuts.”

It wasn’t until 2008, when Lack and his colleagues published a study showing that babies who ate peanuts were less likely to have allergies, that the A.A.P. issued a report, acknowledging there was a “lack of evidence” for its advice regarding pregnant women. But it stopped short of telling parents to feed babies peanuts as a means of prevention. Finally, in 2017, following yet another definitive study by Lack, the A.A.P. fully reversed its early position, now telling parents to feed their children peanuts early.

But by then, thousands of parents who conscientiously did what medical authorities told them to do had effectively given their children peanut allergies.

This avoidable tragedy is one of several episodes of medical authorities sticking to erroneous positions despite countervailing evidence that Marty Makary, a surgeon and professor at Johns Hopkins School of Medicine, examines in his new book, “Blind Spots: When Medicine Gets It Wrong, and What It Means for our Health.”

Rather than remaining open to dissent, Makary writes, the medical profession frequently closes ranks, leaning toward established practice, consensus and groupthink.

This article describes further instances of poorly founded medical advice. For many years, women were scared away from helpful estrogen hormone replacement therapy by unfounded fears of breast cancer. Blood donor institutions suppressed concerns about AIDS in donated blood in order not to rock the boat:

In 1983, near the beginning of the AIDS crisis, the American Red Cross, the American Association of Blood Banks and the Council of Community Blood Centers rejected a recommendation by a high-ranking C.D.C. expert to restrict donations from people at high risk for AIDS. Instead, they issued a joint statement insisting that “there is no absolute evidence that AIDS is transmitted by blood or blood products.” The overriding concern was that Americans would not trust the blood supply, or donate blood, if people questioned its safety.

As with the advice on peanuts, a reversal came about far later than it should have. It took years for the blood banking industry to begin screening donors and it wasn’t until 1988 that the F.D.A. required all blood banks to test for H.I.V. antibodies. In the interim, half of American hemophiliacs, and many others, were infected with H.I.V. by blood transfusions, leading to more than 4,000 deaths.

That is poignant for me, since a good friend of mine died from AIDS that he contracted through a blood transfusion in that timeframe.

Well, what to do now about peanuts? An obvious action is to expose infants to peanuts at 4-6 months, along with other solid foods – perhaps with the caveat to start with small doses and, preferably, to stay within driving distance of an emergency room should one be needed. As for children who already manifest peanut allergies, there is some hope of desensitizing them if you start young enough, preferably no later than age three.

The Power is still out

We’re on day 4 without electricity, so this will be a brief post. Things I’ve learned or had reinforced:

  1. Prepping for the apocalypse is silly, but prepping for a disaster is not. This time has been inefficient and uncomfortable, but not especially problematic. Compared to Asheville, we got off quite easy, a fact made all the clearer by our good fortune to maintain a fairly normal life thanks to the most modest of preparations: a couple of charged power banks, LED lamps, batteries, a propane tank and grill, and coolers pre-filled with ice.
  2. Price controls during a disaster, formal and informal, remain problematic. More than a few people saw their esteem for Clemson drop as fans descended on the region for the football game and grabbed up every bag of ice they laid eyes on to supply their tailgating, a problem that probably could have been averted by simply letting the price of ice quadruple.
  3. Public goods matter, and government remains a superior way of providing and coordinating large swaths of them. Not to get all Nozick and Rawls on you, but think of it this way: disaster response and coordination require scale. Any institution that emerges as superior in providing such responses will have the scale of a government and will be a de facto government, regardless of whether you call it one.
  4. Power lines. Bury the damn power lines. God how I miss living where the bulk of power lines were underground. I never knew how good I had it.
  5. Hank was right. Propane and propane accessories are where it is at.

Stay safe everyone.

Probability Theory for the Minecraft Generation

If you are teaching statistics to 20-year-olds (or maybe even if you are not), you might be interested in ways to make probability theory more engaging. I watched a student’s eyes light up when I showed this in class, so it feels worth sharing.

The Law of Large Numbers is a standard part of statistics or business analytics classes. Something that goes along with it conceptually is “The Law of Truly Large Numbers,” sometimes also called The Infinite Monkey Theorem. The idea is that if you put monkeys in front of typewriters, perhaps infinite monkeys with infinite typewriters and with infinite time, they will eventually write a Shakespeare play.

To illustrate this feature of probability theory for the video gamers, a fun and well-produced video is “Can Mobs Beat Minecraft?” by Wifies.

There is nothing inappropriate for students. The video is 13 minutes, which is too long to show during a class session. I recommend watching the first minute and a half and then explaining that the middle is a lot of gaming details to prove that it is technically possible that a randomly acting “mob” could eventually beat the entire Minecraft game, given enough time.

At the 10-minute mark, the math begins. You could watch about another minute and a half to see how he tries to calculate the infinitesimally small and yet positive probability that this could happen. Given enough time, just about anything that is possible will happen.
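The math here is easy to reproduce for students. As a minimal sketch (my own illustration; the one-in-a-million probability is an assumed value, not the video’s number), the chance that an event with tiny per-trial probability p happens at least once in n independent trials is 1 − (1 − p)^n, which creeps toward 1 as n grows:

```python
# Law of Truly Large Numbers: an event with a tiny per-trial probability p
# becomes near-certain to happen at least once given enough trials n.
# P(at least one occurrence in n trials) = 1 - (1 - p)^n

def prob_at_least_once(p: float, n: int) -> float:
    return 1 - (1 - p) ** n

p = 1e-6  # a one-in-a-million event (illustrative value)
for n in (1_000, 1_000_000, 10_000_000):
    print(f"n = {n:>10,}: {prob_at_least_once(p, n):.4f}")
```

With a million trials, the one-in-a-million event happens with probability of about 0.63; with ten million trials it is all but certain – the same logic behind a randomly acting mob eventually beating the game.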

Another possibility for a teacher is not to show the video in class but to offer it as an optional or extra credit assignment, so that a student who loves Minecraft could really have fun with it and other students can skip.

For me, this pairs with Chapter 5 on Probability in the textbook Applied Statistics in Business and Economics.

Another teaching tip: if you ever need to print out paper rulers, you might Google “printable rulers” and see a bunch of scams as the top results. THIS link works: https://www.brightonk12.com/cms/lib/MI02209968/Centricity/Domain/517/Ruler_6-inch_by_16.pdf

Federalism in Action: The Case of Alcohol and Local Autonomy

Where would you expect federalism to occur? In other words, where would you expect a government to devolve authority to a lower government? Importantly, this is different from freedom vs. authoritarianism. The lower government might choose to be more or less free. For example, right now in Florida there is a statewide constitutional amendment on the ballot that would enshrine each individual’s right to hunt and fish. Ignoring the particulars of what that means, it’s clearly a step toward centralizing policy rather than decentralizing it. Central governments can be strong and protect citizens, or they can strip us of rights. Either way, as small players far removed from the center, it’s difficult for us to affect policy decisions.

That concern is philosophical, however. Maybe my opinion shouldn’t matter (one could easily argue). Even as a matter of prudence, one-size-fits-all sets a standard, but the standard may not be a good fit for every locality and circumstance. There is a trade-off between the ease of navigating a uniform policy across the land and customized policies that reflect local priorities. Given that Americans can vote, is there a way for us to think about when a policy will be (or should be) centralized vs. decentralized?

There is a great case study by Strumpf & Oberholzer-Gee* on the matter of alcohol policy after the end of national prohibition. The US has a dizzying array of liquor laws across the country and even across states. Some states have a central policy of dry or wet, while others devolve the authority to lower governments. How should we think about that policy? What determines the policy of central versus devolved authority?

Continue reading

Rockonomics Highlights

I missed Alan Krueger’s 2019 book on the economics of popular music when it first came out, but picked it up recently when preparing for a talk on Taylor Swift. It turns out to be a well-written mix of economic theory, data, and interviews with well-known musicians, by an author who clearly loves music. Some highlights:

[Music] is a surprisingly small industry, one that would go nearly unnoticed if music were not special in other respects…. less than $1 of every $1,000 in the U.S. economy is spent on music…. musicians represented only 0.13 percent of all employees [in 2016]; musicians’ share of the workforce has hovered around that same level since 1970.

there has been essentially no change in the two-to-one ratio of male to female musicians since the 1970s

The gig economy started with music…. musicians are almost five times more likely to report that they are self-employed than non-musicians

30 percent of musicians currently work for a religious organization as their main gig. There are a lot of church choirs and organists. A great many singers got their start performing in church, including Aretha Franklin, Whitney Houston, John Legend, Katy Perry, Faith Hill, Justin Timberlake, Janelle Monae, Usher, and many others

Continue reading

Consumer Expenditures in 2023

Today the BLS released the annual update to the Consumer Expenditure Survey, which is exactly what it sounds like: a survey of US consumers about their spending. The sample size is “20,000 independent interview surveys and 11,000 independent diary surveys,” so it’s a pretty big sample. And it is a really great data source, because versions of it go back over 100 years (though the current annual survey, with a lot of detail, starts in 1984).

What does this new data tell us? One area that has received a lot of attention lately is food spending (including a lot of attention on this blog), especially the cost of groceries. According to the CPI food-at-home index, grocery prices are up almost 26 percent since the beginning of 2020. That’s a lot! But incomes are up too, so how does this affect spending patterns?

Here’s what food and grocery spending for middle-quintile households looks like:

Compared to the pre-pandemic 2019 levels, consumers are spending slightly less of their income on food (12.7% vs. 13.2%), though a slightly larger share of their income is being spent on groceries (8.1% vs. 7.8%). Those changes are noticeable, though this isn’t the radical realignment of spending patterns you might expect from such a big change in food prices. The reason is clear: while grocery prices are up about 26%, middle-quintile incomes are up a similar 25% since 2019. That’s falling behind a little bit, but incomes have roughly kept pace with rising food prices. And from 2022 to 2023, both of these percentages decreased slightly, by about 0.3 percentage points.
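A quick back-of-the-envelope check, using only the approximate figures quoted above, shows why the share barely moved: if households kept buying roughly the same grocery basket, spending scales with prices while the denominator of the share scales with income.

```python
# Back-of-the-envelope: how a budget share moves when prices and incomes
# both rise. Figures are the approximate ones quoted in the text.
share_2019 = 0.078     # grocery share of income, middle quintile, 2019
price_growth = 1.26    # CPI food-at-home index, roughly +26% since 2019
income_growth = 1.25   # middle-quintile income, roughly +25% since 2019

# Holding the grocery basket fixed, grocery spending rises with prices
# while income (the denominator of the share) rises with income growth.
implied_share = share_2019 * price_growth / income_growth
print(f"implied 2023 grocery share: {implied_share:.1%}")
```

The implied 7.9% share is close to the 8.1% actually observed; the small gap reflects modest changes in what households buy, not a dramatic realignment of budgets.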

Continue reading

Did Sherlock Holmes Really Wear a Deerstalker Hat?

Quick, who is the guy on the right in the illustration below?

Image: YouTube

Here he is again:

We’d know him anywhere, thanks to that deerstalker cap. This was a practical hat used by hunters and other outdoorsmen in England at the time, popular with women as well as men. The front and back brims warded off rain and sun. The ear flaps tied under the chin in cold weather or wind; when not in use, they were tied together at the top. Holmes’s hat was apparently in a houndstooth tweed pattern of water-shedding wool.

How often did Arthur Conan Doyle feature his detective character wearing this headgear? Actually, he didn’t at all. The stories never once mention Holmes in a deerstalker cap (or an Inverness cape, another Sherlock Holmes trope), although such a hat is not implausible.

When the first sets of Sherlock Holmes stories appeared serialized in the Strand magazine in the early 1890s, they were illustrated by the artist Sidney Paget. Paget is responsible for the deerstalker cap image. Here are the detective and his sidekick on the way to investigate the Boscombe Valley mystery:

Image: Wikipedia, The Boscombe Valley Mystery

It would seem that Sherlock Holmes lived and died by his deerstalker, as evidenced by Paget’s illustration of the detective’s struggle to the death with the arch-villain Professor Moriarty above the Reichenbach Falls, in The Final Problem:

Image: Wikipedia

(Doyle wrote The Final Problem to kill off his detective character, so the author could move on to more dignified pursuits than writing Sherlock Holmes stories. He did not anticipate the public outcry at the demise of the popular character. Men in London wore black armbands, and subscriptions to the Strand magazine were cancelled in protest. Eventually Doyle brought Holmes back in a further series of stories, with the literary device that Holmes had faked his own death in order to hide out from a criminal syndicate.)

Even Paget did not keep Holmes in this hat all the time. When the great detective was not sleuthing in the outdoors, he was properly dressed for English society. It was unthinkable for a gentleman to appear in public without some kind of hat. For instance, here are two illustrations from The Adventure of Silver Blaze. Holmes is depicted below in his deerstalker when confronting a bad guy at the gate of a neighboring farm, after tracking a horse across the moor:

In the same story, however, Holmes is drawn by Paget at a horse race event wearing a formal top hat like the other gentlemen:

Image: Wikipedia. Holmes with Silver Blaze (forehead dyed), 1892 illustration by Sidney Paget

If all this leaves you itching for your own deerstalker cap, there are several versions available on Amazon, e.g. here and here

Bonus: if you yearn to identify with a more contemporary hero, see here for info on Indiana Jones fedoras.

Bad service is a sign of a better world

I’ve been hearing more grumbling about bad service in restaurants than usual, bundled with a growing nostalgia for when service was “better”. This could, of course, be simply a sign that my cohort and I continue to rise in age, but let’s put aside healthy skepticism for a moment and accept this observation at face value. What if service in restaurants, hospitality, etc is, in fact, lower in quality than it was one or two decades ago? I would like to suggest that this is a good sign of improving times.

In 1930, 1 in 20 households had servants in their home. “If the poorest households are excluded from the statistics, the percentage of homes with servants increases dramatically, as indicated by 1930–1931 studies of urban, college-educated homemakers, or middle-class families, from 20 to 25 percent of which had a servant” (Palmer 2010). By 1950 these numbers had been cut in half, and they have plummeted since. Imagine an elderly couple who raised their children with full-time, possibly live-in, servants and have since watched those children marry and have children of their own. They go out to enjoy a family meal in 1975, doting over their grandchildren while oh-so-subtly critiquing the parenting techniques of their sons- and daughters-in-law. When you see them in your mind’s eye, are they happy with the restaurant’s service? Is there anything a server or manager can do that can possibly compete with the level of service they enjoyed in their parenting and prime earning years?

I suspect you are envisioning something similar to what I am: a Karen, indefatigable in her complaining, a gray-haired husband encouraged to leave an outrageously low tip. They enjoyed service at the level of employer and boarder, in a social construct that we would today frame as a remnant of an outdated class system. You may be annoyed that no one has refilled your water glass in 10 minutes, that the menu is a QR code, that you are expected to exceed 20% in your tip. Your disappointment, however, is positively quaint compared to the dropoff relative to what a significant portion of the population was wholly accustomed to even two generations ago.

Those entitled complainers you absolutely cannot empathize with? The mechanism behind their contemptible behavior is the same one that leads you to tip 18% before leaving the Cheesecake Factory in a huff. The world has moved on, gotten better, and brought Baumol’s inescapable cost disease with it. The time and attention of humans are more expensive than ever. The pandemic brought a shock to the hospitality labor market that is still rippling today. A lot of people learned about the market value of their labor, and those who got out first have reported that life is often better on the other side: the pay was better than expected, and the work involved immeasurably fewer misogynistic sad dads and spiraling white-wine Karens. Wages have of course adjusted, but so has employment. I don’t have the data in front of me, but anecdotally I’m seeing fewer hosts and table bussers, more tops per server, more lunch shifts stretched across an assistant-manager-and-server duo. That means less service on average, with higher variance in quality.

Which is fantastic. The world is getting better, and people’s time and energy are more valuable for it. Should restaurants find that profit margins increase faster with quality of food rather than service, all the better. Temporary parasocial relationships are right up there with big houses and fast cars for me: overrated traps that siphon away household resources from the things that actually matter. The ribeye served with a smile over clean linen is fine, but it’s got nothing on tacos unceremoniously dropped on a plastic table you can afford to share with someone you love.

Paper on Finance and Economics Women Club

I am one of several founders of a club with the abbreviation F.E.W. for Finance and Economics Women. This is a student organization that we have at Samford and that Dr. Darwyyn Deyo runs at San Jose State University.

Read our report here: The Finance and Economics Women’s Network (FEW): Encouraging and Engaging Women in Undergraduate Programs published in the Journal of Economics and Finance Education

Our short paper is mostly a how-to guide including a draft of a club charter document. We describe our institutions and how we use this group to engage and encourage students. Please read it for more details on how to start a club.

Like most student groups, the FEW model relies on student leaders who take initiative. Having done this for more than six years, we have a growing network of alumni and local business partners who connect with current students through FEW events. Personally, I am lucky that three faculty members in total support the club at my school.

Women are often minorities in upper-division economics and finance classes. Women also face some unique challenges when it comes to choosing career paths and navigating the workplace. These events (e.g., bringing in a manager from a local bank to talk with students over lunch) create a space for students to ask questions they might not normally ask in a classroom setting or in a standard networking environment.

We report the results of a small survey in our paper. We can’t infer causality, nor did we run any experiments. However, we did find that women were more likely to report that a role model in their chosen profession influenced their choice of major. Part of the purpose of the FEW model is to expose students to a variety of role models who they might not otherwise connect with.

Here’s a news article with a picture of the founding group at Samford. I have great appreciation and respect for our student leaders who keep it going, and I am grateful to the graduates who stay in contact with us.

Suggested citation: Buchanan, Joy, and Darwyyn Deyo, “Finance and Economics Women’s (FEW) Network: Encouraging and Engaging Women in Undergraduate Programs” (2023) Journal of Economics and Finance Education, 22: 1, 1-14.

Interpreting New DIDs

If you didn’t know already, the past five years have been a whirlwind of new methods in the staggered difference-in-differences (DID) literature – a popular method for trying to tease out causal effects statistically. This post restates practical advice from Jonathan Roth.

The prior standard was to use two-way fixed effects (TWFE). This controls for a lot of unobserved variation across individuals or groups and over time. Fancier TWFE specifications interact treatment with time relative to treatment, allowing event studies and estimates of dynamic effects.

Continue reading