The Arithmetic of Family Punctuality

My children are getting more capable, and with that capability comes independence and the responsibility that goes with it. Specifically, when getting ready in the morning, they like to leave so that they arrive at school just barely on time. Except, when something comes up, they are rushed, flustered, short-tempered, and tardy. They lament that “if only the unforeseeable event X hadn’t happened, I would have been on time”.

It doesn’t matter what X is. Maybe they forgot to pack a lunch, or set out their clothes, or they have a flat tire on their bikes, or… whatever. The specific time-consuming event is unforeseeable. But, that *any* time-consuming event will occur is very foreseeable. What’s a Bayesian to do?

Before we even start the analysis, let’s acknowledge that being perfectly on time for some event usually involves stress and a lack of preparedness. Yes, you were ‘on time’, but given the probability of heavier traffic, difficulty finding a parking spot, or whatever, we know that tardiness is just one unforeseen event away.

Individual Punctuality

How long does it take to get somewhere? It takes both travel time and time preparing to depart. Let’s just generally call this ‘preparation’ time. Let’s also assume that you complete your full routine: you aren’t forgoing a shower or breakfast or whatever lower-priority item you might otherwise cut in order to arrive at some obligation punctually.

Random events can occur either as you travel to work or as you prepare to depart, but let’s place the random travel events to the side and focus on what one can do to get out of the house ‘on time’. In my personal case, my children have a 30min interval during which they can arrive at school. They almost never arrive in the first 15min of that interval. That’s more of a policy choice than an accident. They don’t want to sit in a cold gymnasium for 20min if it’s avoidable. So, their planned arrival time has an effective 15min window.

Here is the problem. A time-consuming random event, X, is a right-skewed random variable. Discretely, the modal day includes X=0min, but days with some delay collectively outnumber the delay-free days. See the distribution below. A 0min random event occurs 35% of the time, which means that a time-consuming event happens 65% of the time. So, if you try to arrive exactly on time to your obligation, then you will be punctual 35% of the time and tardy 65% of the time. That’s not a good look and not a good reputation to build – and that’s apart from building a habit of imprudence and the material consequence of not being ready for the task at hand.

Someone with just enough insight to be dangerous might say ‘Ah! Instead, leave with enough time to accommodate the expected unforeseen event’. Mathematically, that’s the weighted average. In this case, that’s six minutes. So, if you plan to arrive 6min early, then you will be punctual – on average. But even that’s not really what we’re after. We’d like to be on time for a preponderance of the days. Building in a 6-minute buffer does two things. 1) Every time that there is a 0min or 5min unforeseen event, you get to your destination 6min or 1min early. That’s good for your nerves, performance, and reputation. 2) But it also means that you’re late whenever there is a 10min, 15min, or 20min unforeseen event – and those occur 35% of the time!
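A few lines of code make the buffer trade-off concrete. The delay distribution below is a hypothetical one of my own, chosen only to be consistent with the numbers above (a 0min event 35% of the time and a 6-minute weighted average):

```python
# Hypothetical delay distribution (minutes -> probability), chosen to match
# the figures in the text: a 0-minute event 35% of the time and a mean of 6.
delays = {0: 0.35, 5: 0.30, 10: 0.20, 15: 0.10, 20: 0.05}

# The 'expected unforeseen event' is the weighted average of the delays.
expected_delay = sum(d * p for d, p in delays.items())

def p_late(buffer_minutes):
    """Probability of arriving late given a planned arrival buffer."""
    return sum(p for d, p in delays.items() if d > buffer_minutes)

print(round(expected_delay, 2))  # 6.0 -> the six-minute weighted average
print(round(p_late(0), 2))       # 0.65 -> aim for 'exactly on time', be late 65% of days
print(round(p_late(6), 2))       # 0.35 -> a 6-minute buffer still leaves you late 35% of days
```

Pushing the buffer to 10 minutes drops the lateness probability to 15% under this assumed distribution; the price is more idle time on the good days.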

Continue reading

Price Level: Noise vs Signal

My university recently hosted a guest speaker. Their presentation included some nominal macroeconomic values from pre-2020, back in the era when inflation was very low – roughly the years 2012-2019. Strictly speaking, inflation stayed below 2% through February of 2021, but I think that we can all agree that the economy was different in a few ways beginning in 2020.

I asked the speaker why they did not express the nominal values in real terms. They were emphatic that the low rates of inflation at the time implied that the signal-to-noise ratio was too low. Therefore, the ‘real’, inflation-adjusted values would not be more precise, because excessive noise would be introduced into the series during a period when not much deflating was necessary in the first place.

My answer to this is a firm ‘maybe’. It makes sense and it’s plausible (Jeremy has written about error and revisions in the past). We can think about the noise in price indices in a few ways.

1) It may be that information is incomplete and becomes more complete as time passes. This sort of noise only exists in the short run and is resolved as more information becomes available later. Revisions tend to happen each month for prior months, as well as each year for prior years. There are also big revisions after methodological, consumption-weight, and data-source changes.

2) Another type of noise is due to incomplete information that is never resolved. After all, the government statisticians can’t see literally all of the transactions. Those unobserved transactions will never make it into the official inflation measures and we’ll never get a perfect picture.  

3) Methodological artifacts may also include known biases. This type of noise doesn’t get corrected except after major changes to the series. If those changes never happen, then we just sort of live with imprecision. Luckily, so long as the bias is consistent, the percent change in the price indices will approximate the percent change in the underlying true levels. However, if there are non-random biases in the percent change, then it can cause some trouble.

One way to get an idea for the amount of noise in the data is to observe the magnitude of revisions. Of course, this only helps us with the first type of noise above that eventually gets resolved with more information. It’s much harder to get a handle on the imprecision that is not identifiable. The Philadelphia Federal Reserve Bank provides an easy-to-use database that puts all of the archival and revised numbers for many macro series in a single place: the Real-Time Data Set (RTDS). It includes every historical PCE price index value for each publication month. Let’s limit our sample to the 21st century.
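As a sketch of how one might measure that first type of noise, compare each month’s first-published estimate against its latest vintage. The structure below mimics the spirit of the RTDS (one row per publication vintage, one column per reference month), but the values are made up for illustration, not actual data:

```python
# Illustrative "vintage" table: each key is a publication month, each inner
# dict maps a reference month to the PCE price index level as published then.
# (Values are placeholders, not actual RTDS numbers.)
vintages = {
    "2024-03": {"2024-01": 121.40, "2024-02": 121.75},
    "2024-04": {"2024-01": 121.45, "2024-02": 121.70, "2024-03": 121.98},
    "2024-05": {"2024-01": 121.45, "2024-02": 121.72, "2024-03": 122.01},
}

def first_and_latest(ref_month):
    """Return the (first-published, latest-vintage) estimates for a month."""
    # Relies on dicts preserving insertion order (Python 3.7+).
    estimates = [v[ref_month] for v in vintages.values() if ref_month in v]
    return estimates[0], estimates[-1]

for m in ["2024-01", "2024-02", "2024-03"]:
    first, latest = first_and_latest(m)
    print(m, round(latest - first, 2))  # magnitude of cumulative revision
```

The distribution of those revision magnitudes over many months is one rough gauge of how much type-1 noise the first prints carry.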

Continue reading

Take Your Kids to the Movies

Economists talk about the ‘Covid Shock’ in 2020 because it was a mostly unpredictable event that had big, measurable effects. People spent a lot less time in close quarters. Movie theaters and other event spaces were hit especially hard.

In the several years prior to Covid, “Recreational Service” industry sales had been chugging along, growing at a healthy annual rate of 3.4% (inflation adjusted). This category of services includes clubs, sports centers, theaters, and museums. In the blink of an eye, the Covid shock reduced spending in that category by more than 60%. See the graph below.

Unfortunately, we don’t have disaggregated series for the components of “Recreational Services”. But we do know that movie theaters were already well past their heyday. Theaters had been closing and consolidating for more than a decade and ticket sales were down. Many credit the popularity of streaming video services and other digital media alternatives. Covid added insult to injury.

Now, going to a movie theater is exceptional. As a teenager in the early aughts, I’d go to the theater easily half a dozen times per year. Now, I don’t think that I’ve gone six times in the last five years. The entire recreational services category has grown in real terms at an anemic 1.8% annually since 2019. It’s not dead, but that’s also the total industry. I’ve heard the news stories of sports events making a big comeback. I’ve heard nothing like that for movie theaters.
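For reference, annualized figures like the 3.4% and 1.8% above come from a standard compound-growth calculation. The index levels below are illustrative, not the actual series:

```python
# Compound annual growth rate (CAGR) from two index levels.
def cagr(start_level, end_level, years):
    return (end_level / start_level) ** (1 / years) - 1

# e.g., an illustrative real-sales index rising from 100 to 109.3 over 5 years:
print(round(cagr(100, 109.3, 5) * 100, 2))  # 1.79 -> roughly the anemic 1.8%/yr
```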

I went to the movies recently and it is not what you remember.

Continue reading

A Rant about Long Run Problems and Passé Solutions

If you listen to or read major economists discussing what they think are big-picture problems, then their list usually includes three topics: Fertility, Culture, & Fiscal Health. On the wonkier side, you’ll also hear that housing scarcity and affordability are a problem, but let’s stick with the first three.

Fertility

People are deciding to have fewer children for a variety of reasons. In no particular order, the reasons include greater access to financial institutions, more widespread female education, higher female wages, lower infant mortality, and falling religiosity. Some also speculate that housing affordability, safety regulations, and social safety nets contribute too.

What’s wrong with lower fertility? In an objective sense, there is nothing wrong. But, in the sense that people value similar things, we are in somewhat uncharted territory. Realized fertility is dropping across the globe. We know that economies of scale increase productivity and real wages. We also know that technological innovation comes from having more minds engaged with economic problems. It’s possible that labor productivity rises faster than the productivity that we lose with smaller scale, but it’s an open question. What happens to the liberal societies and polities when the liberals fail to persist? These are big geopolitical concerns.

Culture

People seem to be more fragmented religiously and culturally. Social scientists used to discuss Judeo-Christian norms more often. Sometimes you’d hear about English or Roman legal tradition or enlightenment values. But now, there seems to be very little in terms of common social cohesion. In the USA, the general common culture seems to be ‘smile and be nice’. That’s not the worst common rule, but it’s not enough to hang our hat on for a capable liberal state.

The lack of cultural cohesion isn’t my own particular concern – public intellectuals in economics and elsewhere feel that there is a problem. There is a mix of reasoning behind the concern. Some people are worried about transmitting values to the next generation, some are worried about how people behave when no one’s watching, and still others are worried about simply lacking a Schelling point that coordinates large-scale economic cooperation.

Fiscal Health

Continue reading

An Expensive Easter

Americans like their food. Holidays are often known by the dishes that we serve. Thanksgiving is unusual in that most of us converge on turkey, though diversity obviously exists. What about Easter? There’s not really the same focus on a single food like there is for Thanksgiving. My impression is that people eat daytime or lunch foods that include ham, lamb, or just about anything. My family tends to make tacos.

What am I saying?! We eat candy! Solid or hollow chocolate bunnies, jellybeans, peeps, and on and on. We fill Easter eggs and keep candy around the office. We literally have baskets full of candy.

A Chocolate Bunny? In this economy?

Have you seen the price of chocolate? Yeesh! The latest figures are from February and the prices for chocolate and cocoa bean products are down 11.7% year-over-year. That’s nice, you may think, our budgets can fit a bit more chocolate into our consumer – I mean Easter – baskets. Great news. The news seems a little less great when you realize that February’s price of chocolate was 90% higher than it was four years earlier in 2022. 90% higher is a lot like 100%, and 100% is double! In fact, the price had peaked at 142% higher by September of 2025, and now prices are quickly falling. See the chocolate-colored line in the graph below.
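To keep the percentages straight, it helps to put everything on a common index with the 2022 price as the base. The arithmetic below just restates the figures above:

```python
# Index arithmetic behind the chocolate numbers (base: the 2022 price = 100).
base = 100.0
current = base * 1.90   # "90% higher" than 2022
peak = base * 2.42      # "142% higher" at the September peak

# How far below the peak is the current price?
print(round(current / peak - 1, 3))      # -0.215 -> about 21.5% below the peak

# An 11.7% year-over-year decline implies the year-ago level:
print(round(current / (1 - 0.117), 1))   # 215.2 -> still ~115% above 2022 a year ago
```

So the ‘good news’ of an 11.7% decline is a retreat from an even more extreme level, not a return to the old prices.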

Continue reading

Cournot & Stackelberg Math

This post solves for the equilibrium quantity of production with quadratic total cost under Cournot and Stackelberg competition.

Say that there are two firms. They produce the exact same quality and type of goods and sell them at the same price. Let’s also assume that the market clears at one price. Finally, let’s assume increasing marginal costs.

Let’s say that they face the following demand curve:

The firms have a total cost of:

The marginal cost is the derivative with respect to the choice variable for each firm, or their respective quantities produced:

The total revenue is just the price times the quantity sold.

This is all standard fare for economic modeling. You’re free to make different assumptions. You can even adopt different slopes in the demand curve to reflect goods with different characteristics.
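For concreteness, here is one standard parameterization consistent with the assumptions above. The specific functional forms are stand-ins of my own (linear inverse demand, quadratic total cost), not necessarily the post’s exact equations:

```latex
% Assumed functional forms for firms i = 1, 2 (j denotes the other firm):
P = a - b\,(q_1 + q_2)            % inverse demand; one market-clearing price
C_i = c\,q_i^2                    % quadratic total cost
MC_i = \frac{dC_i}{dq_i} = 2c\,q_i   % increasing marginal cost
TR_i = P\,q_i = \bigl(a - b\,(q_1 + q_2)\bigr)\,q_i   % total revenue
```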

Cournot Competition

If you imagine a lengthy production process, or that the firms physically bring their output to the same market at the same time, then it’s reasonable to assume that they don’t know one another’s choice of quantity produced.

We know how firms maximize profit: They produce the quantity at which the marginal revenue equals the marginal cost. But, what is marginal revenue? The derivative of total revenue with respect to the choice variable:

Now we can set the marginal revenue equal to marginal cost and solve for the optimal level of output:

Notice that the optimal level of output depends on the production decision of the other firm. These are called best-response functions. If we solve for the quantities at which they intersect, then we are solving for where both firms are producing the best response to one another. This is known as a Pure Strategy Nash Equilibrium (PSNE).

Luckily, in many applications, one or more of the above terms are zero, which makes things much simpler.

The general process for solving for the Cournot equilibrium is:

  1. Set MR=MC to find the response functions.
  2. Find where the response functions intersect.
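Those two steps can be checked numerically. Under an assumed linear-demand, quadratic-cost parameterization (my own stand-in: P = a - b(q1 + q2) and Ci = c*qi^2, so setting MR = MC gives the best response qi = (a - b*qj) / (2b + 2c)), iterating the response functions converges to the PSNE:

```python
# Numeric check of the Cournot PSNE under assumed functional forms:
# inverse demand P = a - b(q1 + q2), total cost C_i = c * q_i**2.
# From MR_i = MC_i, firm i's best response is q_i = (a - b*q_j) / (2b + 2c).
a, b, c = 100.0, 1.0, 0.5

def best_response(q_other):
    return (a - b * q_other) / (2 * b + 2 * c)

# Step 2: find where the response functions intersect, by iterating
# simultaneous best responses to a fixed point.
q1 = q2 = 0.0
for _ in range(200):
    q1, q2 = best_response(q2), best_response(q1)

print(round(q1, 4), round(q2, 4))  # 25.0 25.0
```

With these parameters the closed-form symmetric equilibrium is a / (3b + 2c) = 100 / 4 = 25, which the iteration recovers.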

Stackelberg Competition

Continue reading

What is an AI Skill?

If you’ve been on LinkedIn recently, then you may have seen the chatter about teaching your artificial intelligence to have various skills. I saw one post by a guy who claimed to have created several skills, each representing a tech billionaire.

At first, I thought “I am behind the 8-ball. What is this new thing?”. Obviously I know what the word “skill” means and how people use it, but I had not encountered its use in the context of an AI having one. What does it mean for an AI to have a skill? I somewhat dreaded the work of learning the new skill of teaching my AI skills.

Then I had lunch with a computer scientist and I learned that skills are nothing new.

Continue reading

The Economic Story of Mike Mulligan and His Steam Shovel

Mike Mulligan and His Steam Shovel, by Virginia Lee Burton, is a classic 1939 children’s book about a man, Mike Mulligan, and his beloved steam shovel, Mary Anne, who are replaced by modern machinery. They get one last chance to demonstrate their worth by digging the cellar for a new town hall in a single day.

This book is more than just a nostalgic children’s story with a happy ending. This is a tale about economic history, comparative advantage, non-pecuniary benefits, labor and capital heterogeneity, and, of course, transaction costs.

Here’s some background. Historically, excavating or earth-moving equipment was powered by steam. Much like a steam locomotive, a steam shovel burns coal to heat water in a boiler, creating steam that drives the pistons that operate the mechanics. The result is machinery that can move a greater volume of soil at a faster speed than humans with simple hand shovels. Advancements in oil extraction, refining, and internal combustion made the steam methods obsolete. Diesel or gasoline made earth movers safer, faster, and larger, all because there was no need to build high pressures from boiling water. Building steam pressure in the field takes a lot of time and is dangerous.

Here is how the story goes. Mike enjoys his earth-moving work with his steam-shovel and is proud to be more productive than hand-shovels. One day, diesel, electric, and gasoline-powered shovels arrive. They’re bigger and better than Mary Anne. She is now obsolete. It’s unclear whether Mike’s skills are transferable to the newer equipment, but he implicitly prefers working with Mary Anne.  Together, they can’t compete in the urban areas where the value placed on quick excavation is high. So, they flee to the countryside.

The text doesn’t say why the newer shovels aren’t in the countryside. Let’s address that first. The new shovels haven’t spread to the rural areas because the opportunity cost is too high. Diesel shovels are expensive, and the owners/operators need revenue from many jobs in order to pay for their equipment in a reasonable amount of time and earn a positive return. Rural areas don’t have the same willingness to pay for as many projects, so the specialized capital is limited by the smaller extent of the market. Clearly, a higher cost of capital – the cost of the loan that pays for the diesel shovels or the alternative uses of the resources – accentuates the necessity for project volume.

Continue reading

Ricardian Equivalence: Reasonable Assumption #2

There are several requirements for Ricardian Equivalence:

  1. Individuals or their families act as infinitely lived agents.
  2. All governments and agents can borrow and lend at a single rate.
  3. The path of government expenditures is independent of financing choices.

Assumption 2) appears patently absurd on its face. I certainly cannot borrow at the same interest rate that the US Treasury can. QED. Do not pass go, do not collect $200. The yield on 1-year US treasuries is 3.58%. I can’t borrow at that rate… Or can I?

Let’s do some casuistry.

What is a loan?

It’s a contract that:

  • Provides the borrower with access to spending
  • with or without collateral
  • with a promise to repay the lender at defined times, usually with interest.

So, when you borrow $5 from a friend and pay it back on the same day, it’s a loan. The contract is verbal, there is no collateral, the repayment time is ‘soon’ with flexibility, and the interest rate is zero.

A mortgage is a collateralized loan. You borrow from a bank, make monthly payments for the term of the loan, and accrue interest on the principal. The contract is written, the house or a portion of its value is the collateral, and the interest rate is positive.
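For illustration, the ‘accrue interest on the principal’ part of a mortgage follows the standard fixed-rate amortization formula. The loan terms below are hypothetical, not from the post:

```python
# Standard fixed-rate amortization: the monthly payment that exactly repays
# principal plus accrued interest over the term of the loan.
def monthly_payment(principal, annual_rate, years):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

# e.g., a hypothetical $300,000 loan at 6% for 30 years:
print(round(monthly_payment(300_000, 0.06, 30), 2))  # 1798.65
```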

What about a pawnshop loan? Most of us are probably unfamiliar with these. In this circumstance, a person has a valuable non-money asset and the pawnshop has money. They engage in a contractual asset swap. The borrower lends the non-money asset to the pawnshop as collateral and borrows money from the pawnshop. The pawnshop borrows the non-money asset and lends the money to the borrower. The borrower can use the money as they please, but the pawnshop cannot use the non-money asset – it can simply hold it. The pawnshop collects interest in order to cover its opportunity costs.

One outcome is that the borrower repays the loan and interest by the maturity date and reclaims their non-money asset. The other outcome arises because the borrower retains the option to default without any further obligation, forfeiting the right to reclaim their property under the repayment terms. If the borrower exercises the option to default, then the pawnshop acquires full rights to the non-money asset. The pawnshop often resells the asset at a profit. The profit is relatively reliable because the illiquidity of the non-money asset allows the pawnshop to lend much less than its retail value. That illiquidity is also why the borrower is willing to accept the terms.

If we accept that the pawnshop contract is a loan – just a collateralized loan with a mostly standard default option – then get ready for this.

Continue reading

Regulatory Burden By Presidential Administration

During President Trump’s first term in office, he made a bunch of waves (as he’s wont to do). His more educated supporters said that he engaged in substantial deregulation of telecommunications, which got a lot of press. There was also a contingent of educated voters who were relatively quietly supportive of Trump’s regulatory policy, even if they found his character indefensible or his other policies less desirable.

But was Trump a great deregulator? Or was it one of those cases where we say that he regulated *less* than his fellow executives? The George Washington University Regulatory Studies Center can help shed some light with their data. Specifically, they have calculated the number of ‘economically significant’ regulations issued during each month of each presidency going back through Ronald Reagan’s term. What counts as ‘economically significant’? The definition has changed over time. But, generally, ‘economically significant’ regulations:

  1. “Have an annual [adverse] effect on the economy of $100 million or more
  2. Or, adversely affect in a material way the economy, a sector of the economy, productivity, competition, jobs, the environment, public health or safety, or State, local, or tribal governments or communities.”

The only exception to this is between April 6, 2023 and January 20, 2025 when the threshold was raised to $200 million.

The Data

The graph below-left shows the number of economically significant regulations for each president since the start of his term, through July of 2025. It’s reproduced from the link above, except that I appended Trump’s second term onto his first term. What does the graph tell us? There doesn’t seem to be much of a difference between Republicans and Democrats. Rather, it seems that, generally, the number of economically significant regulations increases over time. Importantly, the lines below are cumulative by president. Each new regulation costs at least $100m annually, on top of the existing ones already in place. So, regulatory costs generally rise, with the caveat that we don’t see the relief provided by rescinded regulations (for that matter, we don’t see small regulatory burdens here either). Something else that the graph tells us is that presidents tend to accelerate their economically significant regulations prior to leaving office. Reagan was the only exception to this pattern: he *slowed* the number of regulations as the end of his term approached.

Below-right is the same data, but the x-axis is months until leaving office. Every president since Bush-41 has accelerated their burdensome regulations during their final months in office. The timing of the acceleration corresponds to how close the preceding election was and whether the incumbent president lost. Whereas all presidents regulate more in their last 2-3 months in office, the presidents who were less likely to win re-election began regulating more around eight months prior to leaving office. Of course, they wouldn’t say that they expected to lose, but they sure regulated like there was no tomorrow.

What about Trump? Trump’s smaller count is explained by his single term. He definitely still added to the regulatory burden (among economically significant regulations, anyway). While Trump started his presidency with the fewest additional regulations since Reagan, and Biden started his with the most initial regulations ever, together they earn the top prizes for the most regulations added in a first term.

What if we append these regulations end-to-end? That’s what the below chart does. We do have to be careful because the series is a measure of gross economically significant regulations, not net economically significant regulations. So, it’s possible that some rescissions dampened the below values, but this is the data that I have for the moment. While each presidential administration increases regulation more than the prior one, the good news is that the rate of change is not exponential. The line of best fit is quadratic. We’re experiencing growing regulation, but at least it’s not compound growth.

The Cost

We can estimate the costs of these economically significant regulations. It’s a rough cut, and definitely a lower bound since rescission is rare and $100 million is itself a lower bound, but we can multiply the number of regulations by $100m to get a minimum annual cost. Like I said, the criterion changed from April 2023 through January 20, 2025, so regulations from that window get counted at $200 million instead. The higher threshold also means that the regulation counts underestimate the number of late-term Biden regulations relative to the other presidencies.
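A minimal sketch of that lower-bound calculation, using placeholder counts rather than the actual GWU data:

```python
# Lower-bound annual cost: count of economically significant regulations times
# the threshold in force when each was issued. Counts are hypothetical
# placeholders, not the GWU Regulatory Studies Center numbers.
regs = [
    ("2022-06", 12, 100e6),   # (month, count, threshold per regulation)
    ("2023-06", 10, 200e6),   # threshold raised April 6, 2023
    ("2024-06", 15, 200e6),
]

min_annual_cost = sum(count * threshold for _, count, threshold in regs)
print(f"${min_annual_cost / 1e9:.1f} billion per year (lower bound)")  # $6.2 billion
```

Because rescissions and sub-threshold rules are invisible here, the true net burden could differ in either direction, but mostly upward.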

Continue reading