WW II Key Initiatives 2: “Thach Weave” Tactic to Counter More-Agile Japanese Fighter Planes

This is the second of a series of occasional posts on observations of how some individual initiatives made strategic impacts on World War II operations and outcome.  While there were innumerable acts of initiative and heroism that occurred during this conflict, I will focus on actions that shifted the entire capabilities of their side.

It’s the summer of 1941. The war in Europe, mainly between Germany and Britain, had been grinding on for around two years, with Hitler in control of nearly all of continental Europe. The Germans then attacked the Soviet Union and quickly conquered enormous stretches of territory. It looked like the Nazis were winning. Relations with Japan, which aimed to dominate the western Pacific region, were uneasy. The Japanese had already conquered Korea and coastal China, and were eyeing the resource-rich lands of Southeast Asia and Indonesia. It was a tense time.

The Japanese military had been building up for decades, preparing for a war with the United States for control of the western Pacific. They developed cutting-edge military hardware, including the world’s biggest battleships, superior torpedoes, and a large, well-trained aircraft carrier force. They also produced a new fighter plane, dubbed the “Zero” by Western observers.

Intelligence reports started to trickle in that the Zero was incredibly agile: it could outrun, out-climb, and out-turn anything the U.S. could put in the air, and it packed a wallop with twin 20 mm cannons. Its designers achieved this performance with a modestly powered engine by making the airframe supremely light.

As I understand it, the U.S. military establishment’s response to this intel was fairly anemic. The news was so awful that they seemingly buried their heads in the sand and just hoped it wasn’t true. Why was this so disastrous? Well, since the days of the Red Baron in World War I, the way you shot down your opponent in a dogfight was to turn in a narrower circle than he did, or climb faster and roll, to get behind him. Get him in your gunsights, fire a burst of incendiary machine-gun bullets to ignite his gasoline fuel tanks, and down he goes. If the Zero really was that agile, it could shoot down any U.S. plane with impunity. Even if you started to line up behind a Zero for a shot, he could execute a tight turning maneuver and end up on your tail, every time. Ouch.

A U.S. Navy aviator named John Thach, from Pine Bluff, Arkansas, did take these reports on the Zero seriously. He racked his brains trying to figure out a way for the clunky American Wildcat fighters to take on the Zeros. He knew the American pilots were well trained and were good shots, if only they could get the crucial few-second windows of time needed to line up on the enemy planes.

So, he spent night after night that summer, using matchsticks on his kitchen table, trying to invent tactics that would neutralize the advantages of the Japanese fighters. He found that the standard three-plane section (one leader, two wingmen) was too clumsy for rapid maneuvering, and settled on two sections of two planes each. The two sections would fly parallel, several hundred yards apart. If one section was attacked, the two sections would immediately make sharp turns toward each other and cross paths. The planes of the non-attacked section could then take a head-on shot at the enemy plane(s) tailing the attacked section.

Here is a diagram of how this works:

Source: U. S. Naval Institute

The blue planes are the good guys, with a section on the left and one on the right. At the bottom of the diagram, an enemy plane (green) gets on the tail of a blue plane on the right. The left and right blue sections then make sudden 90-degree turns toward one another. The green plane follows his target around the turn, whereupon he is suddenly face-to-face with a plane from the other section, which (rat-a-tat-tat) shoots him down. In a head-to-head shootout, the Wildcat was likely to prevail, since it was more substantial than the flimsy Zero. Afterwards, the two sections resume flying parallel, ready to repeat the maneuver if attacked again. And of course, they don’t just fly along hoping to be attacked; they can make offensive runs at enemy planes as well, as a unified formation. This technique was later dubbed the “Thach Weave.”

Thach faced opposition to his unorthodox tactics from the legendary inertia of the pre-war U.S. military establishment. Finally, he and his trained team submitted to a test: their four-plane formation went into mock combat against another four planes (all Wildcats), but with his planes’ throttles restricted to half power. Normally that would have made them toast, but in fact, with their weaving, they frustrated every attempt by the other planes to line up on them. This demonstration won over many of the actual pilots in the carrier air force, though the brass on the whole did not endorse it.

By some measures the most pivotal battle in the Pacific was the Battle of Midway in June 1942. The Japanese planned to wipe out the American carrier force by luring it into battle with a huge fleet assembled to invade the American-held island of Midway. Had they succeeded, WWII would have been much harder for the U.S. and its allies to win.

As that battle unfolded, the U.S. carriers launched their torpedo planes well before their dive bombers. The Japanese probably feared the torpedo planes most, and so they focused their Zeros on them. Thach and two of his fellow Wildcat pilots were effectively the only American fighter protection for the slow, poorly armored torpedo bombers by the time they reached their targets. Using his weave maneuver for the first time in combat, Thach managed to shoot down three Zeros while not getting shot down himself. This vigorous, unexpectedly effective defense by a handful of Wildcats helped divert the Japanese fighters and keep them at low altitude, just in time for the American dive bombers to arrive and attack unmolested from high altitude.

In the end, four Japanese fleet carriers were sunk by the dive bombers at Midway, at a cost of one U.S. carrier. That victory helped the U.S. hang on in the Pacific until its new carriers started arriving in 1943. Thach’s tactic made a material difference in that battle and was quickly promulgated throughout the rest of the U.S. carrier force. It was not a complete panacea, of course, since once the enemy knew what you were about to do, they might be able to counter it. However, it gave U.S. fighters a crucial tool for confronting a more-agile opponent at a critical time in the war. Thach went on to train other pilots, and eventually became an admiral in the U.S. Navy.

Source: Wikipedia

The Middle of the 20th Century was a Weird Time for Marriage

Yesterday on Twitter I shared a chart showing the age at first marriage for white men and women in the US, with data going back to 1880. I pointed out an interesting fact: at least for men, the age was essentially the same in 1890 and 1990 (27), though for women it was a bit higher in 1990 than in 1890 (by about 1 year).

This Tweet generated quite a bit of interest (over 800,000 impressions so far), and (of course!) a lot of skeptical responses. One skeptical response is that I cut off the data in 1990, when trends since then have shown continuously rising ages at first marriage, and by 2024 the comparable figures were much higher than in 1890 (by about 4 years for men and 6.5 years for women). In one sense, guilty as charged, though I only came across this data when looking through the Historical Statistics of the US, Millennial Edition, and that was the most current data available when it was printed. Here is a more updated chart from Census:

But there is another interesting fact in that data: the massive decline in the age at first marriage over the first half of the 20th century. Between 1890 and 1960, the median age at first marriage fell by about 3 years for men and 2 years for women. For men, most of the decline (about 2 years) had already happened by 1940. Thus, if we start from the low point of the 1950s and 1960s (as many charts do, such as this one), it appears that marriage has been steadily getting less common throughout US history, while the fuller picture shows a U-shaped pattern.

This same pattern shows up in another measure of marriage data: the percentage of people that never get married. If we look at White, Non-Hispanic Americans in their late 40s, the picture looks something like this (keen observers will note that the Hispanic distinction is a modern one dating from the 1970s, but Census IPUMS has conveniently imputed this classification back in time based on other demographic characteristics):

Looking at people in their late 40s is useful because, at least for women, they are past their childbearing years. And using, say, the late 50s age group doesn’t alter the picture much: even though some people get married for the first time in their 50s, it’s always been a small number.

Here we can see an even more dramatic pattern. 100 years ago, it was not super rare for people to never marry: over 1/10 of the population didn’t! But by 1980 (thus, for people born in the early 1930s), it was much rarer: less than 4% of women were never married (among White, Non-Hispanics). In fact, the 1920 peak of 10% never-married women wasn’t surpassed again until 2013! And it’s not substantially higher today than in 1920 for women, especially considering the full swing downward in between. Men are quite a bit higher today, though the 1920 peak of 13% wasn’t surpassed again until 2006.

For a measure that peaks in 1920, we might wonder if new immigrants are skewing the data in some way, given that this is right at the end of about 4 decades of mass immigration. But just the opposite: if we focus on native-born women, the 1920 level was even higher at 11.1%, which wasn’t surpassed until 2022, and even in the latest figures it is less than 1 percentage point higher than 1920.

Precisely why we observed this U-shaped pattern in marriage (both in age at first marriage and in the share ever married) is debated among scholars, though my sense is that the general public doesn’t think about it much. Most people (from my casual observation) seem to assume that in the past people always married younger and more universally, and that modern times are the outlier. But in reality, the middle part of the 20th century seems to be the outlier. The “Baby Boom” of roughly 1935-1965 is possibly better understood as a “Marriage Boom,” with more babies naturally following from more and younger marriages.

WW II Key Initiatives 1: FDR Prodded the Navy To Convert Cruisers to Carriers, Just in Time

This is the first of a series of occasional posts on observations of how some individual initiatives made strategic impacts on World War II.  Most major decisions were made by teams of qualified engineers or military staff or whatever. But there were cases where one person’s visionary action made a material difference. There were, of course, many thousands of individual acts of initiative and heroism that went into the outcome of any given battle. However, I will focus on actions that shifted the entire capabilities of their side.

In this regard, I recently read how the intervention of President Roosevelt helped to give the U.S. nine additional aircraft carriers in the Pacific at a time when they were critically needed. As of U.S. entry into WWII in December 1941, America had a total of 7 carriers, while Japan had 11.

It had been clear for a while that the U.S. needed more carriers, but (pre-Pearl Harbor) the Navy was more focused on building battleships; for centuries, big ships carrying big cannons had been the vessels that ruled the seas. Navy brass had run studies of carrier sizing and decided they would rather have fewer, larger carriers, due to operational efficiencies. A problem was that these large carriers took years to construct.

Thus, as of 1940 the projections were that the U.S. Navy would receive no new carriers before 1944. As a naval war with Japan looked more and more likely, the President grew concerned. FDR had been Assistant Secretary of the Navy during World War I and maintained an interest in naval affairs, so he had informed judgment here. In October 1940, Roosevelt sent a letter to the Chief of Naval Operations, expressing interest in converting merchant ships into carriers for secondary duties such as convoy escort, antisubmarine warfare, aircraft transport, and air support of landing beaches. The Navy’s response was lukewarm. In 1941, FDR proposed that some of the many cruisers under construction could be converted to small carriers. The Navy considered this, and on 13 October 1941, the General Board of the United States Navy replied that such a conversion involved too many compromises to be effective: such carriers would be less stable platforms than the big carriers and would carry less than half as many planes per ship.

I think most presidents would have given up at this point, but not FDR. He immediately ordered another study (I assume with the implicit message, “…and this time give the boss the answer he wants”). Lo and behold, on 25 October 1941, the Navy’s Bureau of Ships reported that aircraft carriers could in fact be converted from cruiser hulls. They would be of lesser capability, but fast enough for fleet action, and available much sooner than large carriers.

The December 7, 1941 attack on Pearl Harbor changed everything. That ninety-minute raid showed that aircraft carriers were by far the most critical warships. A carrier could reach out over a hundred miles and sink any battleship with torpedo bombers, as Japan showed on that “day of infamy” and further demonstrated by sinking British battleships near Singapore and chasing the Royal Navy largely out of both the western Pacific and the eastern Indian Ocean. (If the brass had been paying attention, they would have noticed that the British had already used carrier-based torpedo bombers to cripple battleships in the Taranto raid and in the sinking of the Bismarck, well before Pearl Harbor.)

The U.S. did end up converting some (slow) merchant ships to carriers, and built a huge number of small, slow, fragile “escort” carriers for transporting planes and for shore bombardment. But there was still an immediate need for better-protected small “fleet” carriers which were fast enough to keep up with the big carriers and which could survive being hit by a bomb. Japanese leaders knew they could not prevail in a long drawn-out war, so their strategy was to inflict so much damage on American military and territorial assets in the first year of conflict that the U.S. would sue for peace under Japanese terms. Japan, like Germany, was very successful at first. The Japanese overran nearly all of Southeast Asia, including the Philippines (an American possession), the Dutch East Indies (a source of rubber, petroleum, and minerals) and the British stronghold at Singapore. They came perilously close to invading Australia. So the first year or so was critical: the Allies needed to survive the onslaught from a better-prepared opponent until American mobilization took full effect.

The Navy settled on repurposing a suite of nine Cleveland-class light cruisers that were under construction. These new “light carriers” could carry about 30 planes apiece, compared to a complement of around 60 on the full-sized ships. The smaller carriers carried fewer spares, rolled more in heavy seas, and had smaller flight decks, which led to more accidents. Nevertheless, they provided a boost to U.S. naval air power at a critical time.

The U.S. entered the war with seven fleet carriers, of which six were assigned to the Pacific. In the course of 1942, four of those six were sunk, and the other two were severely damaged by bombs and torpedoes. Thus, there was a time in October 1942 when the U.S. had not a single operational carrier in the Pacific, while Japan was fielding around six. That was dire.

No new U.S. carriers were commissioned until the last day of 1942 (U.S.S. Essex). That was a long dry spell. Finally, in the first six months of 1943, eight fleet carriers were commissioned. Of these, three were full-sized ships, while five were the cruiser-based light carriers. That finally gave the U.S. some breathing room, allowing it to defend its assets and pursue offensive operations. These Independence-class light carriers fought in many battles, sometimes providing around a quarter of the fleet’s airpower.

Thereafter, the astonishing mid-century American industrial capacity took over. From mid-1943 through mid-1945 another 17 fleet carriers (including four more Independence-class light carriers in the second half of 1943) poured out of U.S. shipyards, along with some 60 “escort” carriers. By late 1944, this gigantic fleet had utterly overwhelmed Japan’s navy.

But it was largely Roosevelt’s vision, and his repeated prodding of the stodgy Navy staff, that produced the first batch of light carriers, which helped tip the balance of forces during the critical first eighteen months of the war.

Children Don’t Die Like They Used To

Academics generally agree on the changing patterns of mortality over time. Centuries ago, people died of many things. Most of those deaths were among children, and they were often related to water-borne illness. Much of that was resolved with sanitation infrastructure and water treatment. Communicable diseases were next: vaccines, mostly introduced in the first half of the 20th century, prevented a lot of deaths.

Similarly, foodborne illness killed a lot of people before refrigeration was widespread. The milkman would deliver milk to a hatch on the side of your house and swap out the empty glass bottles for full ones. For clarity, it was not a refrigerated cavity; it was just a hole in the wall with a door on both the inside and outside of the house. A lot of babies died from drinking spoiled milk.

Now, in higher-income countries, we die of the things that kill old people. These include cancer, falls that lead to infections, and the various diseases related to obesity. We’re able to die of these things because we won the battles against the big threats to children.

What prompts such a dreary topic?

I was perusing the 1870 Census schedules and stumbled upon some “Schedule 2s.” Most of us are familiar with Schedule 1, which asks for details about the residents living in a household. But Schedule 2 asked about the deaths in the household over the past year. Below is a scan from St. Paul, Minnesota.

Continue reading

Purchasing Power in 1868: Guinness Edition

When reading an old novel or watching a period drama movie or TV show, it is almost inevitable that some historical currency amounts will be mentioned. This is especially true when the work deals with money and wealth; for example, the series “The Gilded Age” is about rich people in late-19th-century America, so money comes up a lot. I wrote a post a few weeks ago trying to contextualize a figure of $300,000 from 1883 for that show.

A new Netflix series “The House of Guinness” is another period piece that spends a lot of time focusing on rich people (the family that produces the famous beer), as well as their interactions with poorer folks. So of course, there are plenty of historical currency values mentioned, this time denominated in British pounds (the series is primarily set in Ireland, where the pound was in use). On this series, though, they have taken the interesting approach of giving the viewers some idea of what historical currency values are worth today, by overlaying text on the screen (the same way they translate the Gaelic language into English).

For example, in Episode 4 of the first season, one of the Guinness brothers is attempting to negotiate his annual payment from the family fortune. He asks for 4,000 pounds per year. On the screen the text flashes “Six Hundred Thousand Today.”

The creators of the show are to be commended for giving viewers some context, rather than leaving them baffled or pausing the show to Google it. But is 600,000 pounds today a good estimate? Where did they get this number? As with the “Gilded Age” estimate, it’s complicated, but it is probably more than you think.
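The arithmetic behind the overlay is worth making explicit. A minimal sketch, using only the two figures from the episode; the alternative deflator multipliers at the end are purely illustrative placeholders, not actual published series:

```python
# The show's overlay implies a single conversion factor from 1868 pounds
# to today's pounds. From the figures quoted in the episode:
historical = 4_000      # pounds per year, 1868
shown_today = 600_000   # pounds, per the on-screen text

multiplier = shown_today / historical
print(multiplier)  # 150.0

# Different deflators give very different answers. These multipliers are
# hypothetical stand-ins to show how much the choice of index matters:
deflators = {"consumer prices": 100, "average earnings": 500, "GDP per capita": 900}
for name, m in deflators.items():
    print(f"{name}: {historical * m:,.0f} pounds today")
```

The point of the sketch: a single on-screen number hides the choice of index, and indices tied to wages or output typically grow much faster over 150+ years than indices tied to consumer prices.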

Continue reading

Now Published: Prohibition and Percolation

My new article, “Prohibition and Percolation: The Roaring Success of Coffee During US Alcohol Prohibition”, is now published in Southern Economic Journal. It’s the first statistical analysis of coffee imports and salience during Prohibition. Other authors had speculated that coffee substituted for alcohol after the 18th Amendment, but I did the work of running the stats, creating indices, and checking for robustness.

My contributions include:

  • National and state indices of coffee and coffee-shop mentions from major and local newspapers.
  • A textual index of the same from book mentions.
  • Evidence that Prohibition is when modern coffee shops became popular.
  • Evidence that the surge in coffee imports was likely not related to trade policy or the end of World War I.
  • Evidence that both demand for coffee and supply increased as part of an intentional industry effort to replace alcohol and saloons.
  • An easy-to-follow application of time-series structural break tests.
  • An easy-to-follow application of a modern difference-in-differences method for state dry laws and coffee newspaper mentions.
  • Evidence from a variety of sources, including patents, newspapers, trade data, Ngrams, naval conflicts, and wholesale prices.
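
To give a flavor of the structural break idea mentioned above, here is a minimal Chow-style break test on synthetic data. This is a generic sketch, not the paper’s actual specification, data, or test choice:

```python
import numpy as np

def chow_test(y, x, break_idx):
    """Chow test for a structural break at break_idx.
    Fits OLS (intercept + slope) on the full sample and on each subsample,
    then compares residual sums of squares with an F statistic."""
    def rss(y_sub, x_sub):
        X = np.column_stack([np.ones_like(x_sub), x_sub])
        beta, *_ = np.linalg.lstsq(X, y_sub, rcond=None)
        resid = y_sub - X @ beta
        return resid @ resid

    k = 2  # parameters per regression (intercept, slope)
    rss_pooled = rss(y, x)
    rss_split = rss(y[:break_idx], x[:break_idx]) + rss(y[break_idx:], x[break_idx:])
    n = len(y)
    return ((rss_pooled - rss_split) / k) / (rss_split / (n - 2 * k))

# Synthetic trending series with a level shift mid-sample
rng = np.random.default_rng(0)
t = np.arange(40, dtype=float)
y = 1.0 + 0.02 * t + rng.normal(0, 0.05, 40)
y[20:] += 1.0  # impose a structural break at index 20

f = chow_test(y, t, 20)
print(f"F = {f:.1f}")  # a large F relative to F(2, 36) critical values signals a break
```

In practice one would use a library implementation with proper critical values and unknown-break-date variants, but the sketch shows the core logic: a break makes the pooled fit much worse than the two subsample fits.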

Overall, the empirical evidence and the main theory are straightforward. I learned several new empirical methods for this paper, and the economic logic in the robustness section was a blast to puzzle out. Finally, it was an easy article to be excited about, since people are generally passionate about their coffee.


Bartsch, Zachary. 2025. “Prohibition and Percolation: The Roaring Success of Coffee During US Alcohol Prohibition.” Southern Economic Journal, ahead of print, September 22. https://doi.org/10.1002/soej.12794.

Housing is More Expensive Today, But Not Because the US Left the Gold Standard

Housing is certainly more expensive than in the past. I have written about this several times, including a post from last year showing that between about 2017 and 2022 housing started to get really expensive almost everywhere in the US, not just on the West Coast and Northeast (as had previously been the case). I don’t think the housing affordability crisis is in serious doubt anymore, and it can’t be explained over the past few years by increasing size and amenities, since those haven’t changed much since 2017 (though it is relevant when comparing housing prices to the 1970s).

But why did this happen? Knowing why is crucial, not merely to assign blame, but because the policy solution is almost certainly related to the causes. I and many others have argued that supply-side restrictions, such as zoning laws, are the primary culprit; the policy solution is to reduce those restrictions. But in a recent op-ed titled “Why your parents could afford a house on one salary – but you can’t on two,” the authors place the blame for housing prices (as well as the stagnation of living standards generally) on a different factor: Nixon’s 1971 “severing the dollar’s link to gold.” The authors have a book on this topic too, which I have not yet read, but they provide most of the relevant data in this short op-ed.

Does their explanation make sense? I am skeptical. Here’s why.

Continue reading

Can the President Fire a Member of the Federal Reserve Board of Governors?

That’s exactly what President Trump tried to do this past Monday. He announced on social media that Lisa Cook, appointed by Biden in 2022, is now fired. Things are about to get awkward.

First, Trump can’t simply fire Fed governors willy-nilly. Remember when DOGE was involved in all of those federal workforce layoffs earlier in the year? I know, it seems like forever ago. The US Supreme Court ruled on the legality of those firings, including some at government corporations and “independent agencies.” The idea behind such entities is that they are supposed to be politically insulated and less bound by the typical red tape of government. But Trump’s administration argued that their separation from the rest of the executive branch is a fiction and that there is no one in charge of them if not the president. The Supreme Court agreed with the administration, with one exception.

Continue reading

The American Middle Class Has Shrunk Because Families Have Been Moving Up

In 1967, about 56 percent of families in the US had incomes between $50,000 and $150,000, stated in 2023 inflation-adjusted dollars. In 2023, that number was down to 47 percent. So the American middle class shrank, but why? (Note: you can do this analysis with different income thresholds for middle class, but the trends don’t change much.)

The data comes from the Census Bureau, specifically Table F-23 in the Historical Income Tables.

As you can see in the chart, the proportion of families in the high-income group, those with over $150,000 of annual income in 2023 dollars, grew from about 5 percent in 1967 to well over 30 percent in recent years. The proportion that were lower income shrank dramatically, almost halving, and, perhaps surprisingly, there are now more high-income families than low-income families (using these thresholds; this has been true since 2017). The change is even more striking in absolute terms: in 1967 there were only about 2.4 million high-income families, while in 2023 there were 11 times as many, over 26 million.

Is this increase in family income caused by the rise of two-income households? To some extent, yes. Women have been gradually shifting their working hours from home production to market work, which will increase measured family income. However, this can’t fully explain the changes. For example, the female employment-population ratio peaked around 1999, then dropped, and now is back to about 1999 levels. Similarly, the proportion of women ages 25-54 working full-time was about 64 percent in 1999, almost exactly the same as 2023 (this chart uses the CPS ASEC, and the years are 1963-2023).

But since the late 1990s, the “moving up” trend has continued, with the proportion of high-income families rising by another 10 percentage points. Both the low-income and middle-income groups fell by about 5 percentage points. Certainly some of the trend in rising family income from the 1960s to the 1990s is due to increasing family participation in the paid workforce, but it can’t explain much since then. Instead, it is rising real incomes and wages for a large part of the workforce.

We Don’t Have Mass Starvations Like We Used To

Two ideas coalesced to contribute to this post. First, for years in my Principles of Macroeconomics course I’ve taught that we no longer have mass starvation events due to A) Flexible prices & B) Access to international trade. Second, my thinking and taxonomy here has been refined by the work of Michael Munger on capitalism as a distinct concept from other pre-requisite social institutions.

Munger distinguishes between trade, markets, and capitalism. Trade could be barter or include other narrow sets of familiar trading partners, such as neighbors and bloodlines.  Markets additionally include impersonal trade. That is, a set of norms and even legal institutions emerge concerning commercial transactions that permit dependably buying and selling with strangers. Finally, capitalism includes both of these prerequisites in addition to the ability to raise funds by selling partial stakes in firms – or shares.

This last feature matters because debt or bond financing can’t fund very large and innovative endeavors: the upside to lenders is too small. That is, bonds are best for capital-intensive projects that have dependable rates of return that, hopefully, exceed the cost of borrowing. Selling shares of ownership in a company lets a diverse set of smaller stakeholders enjoy the upside of a speculative project. Importantly, speculative projects are innovative. They’re not always successful, but they are innovative in a way that bond and debt financing can’t satisfy. Selling equity shares opens untapped capital markets.
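The capped-upside logic can be made concrete with a toy payoff calculation. All the numbers here are hypothetical, chosen only to illustrate why a lender and a shareholder value the same speculative project differently:

```python
# Hypothetical project: usually fails, occasionally a breakthrough.
outcomes = {"failure": 0.0, "modest": 1.2e6, "breakthrough": 50e6}  # firm value
probs = {"failure": 0.6, "modest": 0.3, "breakthrough": 0.1}
invest = 1e6  # capital the firm needs to raise

# Bondholder: lends `invest` at 10%; repaid at most principal + interest.
bond_payoff = {s: min(v, invest * 1.10) for s, v in outcomes.items()}
# Shareholder: buys 50% of the firm for `invest`; gets half of firm value.
equity_payoff = {s: 0.5 * v for s, v in outcomes.items()}

ev = lambda payoff: sum(probs[s] * payoff[s] for s in probs)
print(f"bond EV:   {ev(bond_payoff):,.0f}")    # upside capped at 1.1 million
print(f"equity EV: {ev(equity_payoff):,.0f}")  # shares in the breakthrough
```

With these numbers the lender’s expected repayment falls short of the amount lent, so the loan never happens, while the shareholder’s expected payoff exceeds the investment: the breakthrough outcome is worthless to the bondholder but decisive for the equity holder.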

With this refined taxonomy, I can better specify that it’s not access to international trade that is necessary to consistently prevent mass starvation. It’s access to international markets. For clarity, below is a 2×2 matrix that identifies which features characterize the presence of either flexible prices or access to international markets.

Continue reading