Tyler Cowen, Talent Curator

Everyone else at EWED has been too classy (or earnest?) to post it, since it would implicitly be bragging.

But I’m home with a quarantined kid today and need the win. So here is biotech founder Tony Kulesa’s article on how Tyler Cowen is the Best Curator of Talent in the World.

Highlights:

Tyler has identified talent either earlier than or missed by top undergraduate programs, the best biotech startups, and the best biotech investors, all without any insider knowledge of biotech. In comparison, Forbes 30U30, MIT Tech Review TR35, or Stat Wunderkind, and other industry awards that highlight talent are lagging indicators of success. It’s hard to find an awardee of these programs that was not already widely recognized for their achievements among insiders in their field. The winners of Emergent Ventures are truly emergent. 

What explains Tyler’s ability to do this?

1. Distribution: Tyler promotes the opportunity in such a way that the talent level of the application pool is extraordinarily high and the people who apply are uniquely earnest.

2. Application: Emergent Ventures’ application is laser focused on the quality of the applicant’s ideas, and boils out the noise of credentials, references, and test scores. 

3. Selection: Tyler has relentlessly trained his taste for decades, the way a world-class athlete trains for the Olympics.

4. Inspiration: Tyler personally encourages winners to be bolder, creating an ambition flywheel as they in turn inspire future applicants.

This seems right as far as it goes, and there is more depth in the article, but there has to be more to the story than we can see from the outside. Luckily Tyler has said he is writing a book on identifying talent.

The Half-Life of Policy Rationales

Bryan Caplan recently wrote about public goods theory, how we teach it, and the unrealistic nature of how we classify goods as either/or rather than on a continuum. I explored similar themes in a blog post that I wrote back in January, but Caplan brings up another important point about public goods theory that I had forgotten.

In a short 2002 paper, and then in a 2003 book with the same title, Foldvary and Klein proposed the idea of “the half-life of policy rationales.” In brief, the justification for many market failure arguments is contingent on the current state of technology. They apply this to concepts such as natural monopoly and information asymmetries, but for public goods theory the most important application is to the concept of excludability.

Here’s the basic idea: it is costly to exclude non-payers from using some goods. If exclusion is so costly that producing the good would not be profitable for a private enterprise, it won’t be produced privately. But it still may be efficient for government to produce the good, if the benefit from the good exceeds the cost of raising the revenue to pay for it (likely out of general revenue, since we have already admitted it is infeasible to charge the users directly).
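To make that trade-off concrete, here is a toy calculation in R (every number is invented, and the 1.3 multiplier is just a placeholder for the deadweight loss of raising taxes):

benefit <- 100        # total benefit users get from the good, say a road
cost <- 70            # cost of producing the good
mcf <- 1.3            # assumed cost of raising $1 of tax revenue
exclusion_cost <- 40  # cost of charging users directly under old technology

benefit - cost - exclusion_cost  # private provision: -10, unprofitable even capturing the full benefit
benefit - cost * mcf             # public provision: 9, passes the cost-benefit test
benefit - cost - 10              # if technology cuts exclusion costs to 10, private provision nets 20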

But here’s the Foldvary and Klein point: all of the above paragraph is dependent on the current state of technology! Take roads, for example. When you had to pay someone to physically take a few coins for a toll road, and force every motorist to come to a complete stop to pay, it was probably cost prohibitive to operate limited-access private toll roads. But technology changes. We now have electronic tolling done at highway speed (and even coin buckets were slightly faster than handing some dude your change). The argument for government provision of highways, which was strong under the old technology, is significantly weakened under the modern one.

(There may be lots of other reasons you think that roads should be publicly provided, such as equity, but these are separate questions and distinct from the argument made in standard public goods theory.)

Foldvary and Klein go through many more examples in their book, but we can already see the key insight. And I think this is extremely important for teaching public goods to undergraduates. It’s normal for us to say that goods are either excludable (in which case private provision is best) or non-excludable (in which case there is a strong case for some government intervention). But this either/or framing is wrong (a continuum is a better way to think about it), and crucially a good’s position on that continuum can change over time with technology. Excludability is not some inherent feature of a good or service; it is a function of the state of technology.

Europe Natural Gas Shortage: Factories Shut, Maybe Worse to Come

Shut down your old reliable coal and nuclear power plants. Replace them with wind turbines. Count on natural gas fueled power plants to fill in when the breeze stops blowing. Curtail drilling for your own natural gas, and so become dependent on gas supplied by pipeline from Russia or by tankers chugging thousands of miles from the Middle East. What could possibly go wrong?

That is what Europe is discovering now as natural gas prices have quintupled, taking electricity prices up with them. Europe is having a hard time finding enough gas supply to fill up storage facilities to get through the winter. If the winter is cold and consumers are prioritized, widespread industry shutdowns are possible. Prices for many things will ratchet up, with implications for inflation and in turn for central banks’ response to it. (The Fed’s Powell has been talking down the current inflation as merely transitory.)

In the UK, energy companies are going bankrupt because the wholesale price at which they purchase gas is higher than the government-mandated cap on the price they can charge consumers. Plants that use natural gas as a feedstock, such as fertilizer plants, are shutting down, which hurts farmers. Carbon dioxide is a byproduct of some of these operations, and the resulting shortage of CO2 is affecting meat-packers who use it in their operations. Indeed, a food producer has warned that Christmas dinner could be “cancelled.” That’s just how bad it is. The Brits are even delaying the shutdown of the country’s largest remaining coal-burning power plant.

Jason Bordoff of the Columbia Climate School and the Center on Global Energy Policy just published a long article giving his perspective on all this. He identifies several contributing factors:

( 1 ) Both cold and hot weather boosted gas consumption this year. Winter in much of the Northern Hemisphere was unusually cold, which raised gas demand for heating, and then a hot summer consumed more gas to make electricity for air conditioning.

( 2 ) Other sources of electricity have been hampered. “Wind generation in Europe has been far below average this year due to long periods of less windy weather. …Demand for fossil fuels is set to spike further as Germany takes another three nuclear reactors off the grid this year as part of its nuclear shutdown. Meanwhile, drought conditions in China and South America have led to reduced hydropower output, drawing supplies of globally traded gas into those markets instead.”

( 3 ) The post-COVID economic recovery has boosted industrial demand.

( 4 ) Russia has restricted gas deliveries to Europe through the existing pipeline that runs through Ukraine. (Many observers see this as a pressure tactic to get Europe to switch over to a northern pipeline route, which would remove the importance of Ukraine for Russian gas marketing and give Russia a freer hand to resume military harassment of that country.) Also, European countries have restricted their own gas production. The Dutch are curtailing the production rate at their big Groningen gas field because local residents fear earthquakes from ground subsidence, and the Brits have restricted fracking of promising gas fields due to public protests.

As might be expected in our interconnected world, the European supply crunch has affected U.S. prices, which are at their highest level in five years. America exports gas via liquified natural gas (LNG) tankers, but U.S. gas supplies so far have not responded much to the price increase. The hostility of the Biden administration and pressure from green-leaning investors have discouraged petroleum companies from expanding drilling.

Meanwhile, California is running its own experiment in green energy adoption:

California, for example, is having trouble keeping the lights on as it rapidly scales the use of intermittent solar and wind power. It recently requested an emergency order from the U.S. federal government to maintain system reliability by, among other actions, allowing the state to require certain fossil fuel plants scheduled to retire to stay online and by loosening pollution restrictions. California is also proposing to build several temporary natural gas plants to avoid blackouts, even as the state shuts down the Diablo Canyon nuclear power plant, which produces more zero-carbon electricity than all the state’s wind turbines combined.

Professor Bordoff notes that “Many projections for how quickly and how much clean energy can be scaled are based on stylized models of what is technically and economically possible”, and unsurprisingly calls for policies which mitigate volatility, e.g., “…regulatory and infrastructure policies can facilitate more integration, flexibility, and interconnectedness in the energy system—from power grids to pipelines—so there are more options to pull energy supplies into a market when needed.”

Oh, and this restatement of the obvious:  

Uncertainty about the pace of transition may lead to periodic shortfalls in supply if climate action shutters traditional fossil fuel infrastructure before alternatives can pick up the slack—as may be starting to happen in some places now. And if fossil fuel supply is curbed faster than the pace at which fossil fuel demand falls, shortfalls can result in market crunches that cause prices to spike and exacerbate existing geopolitical risks. In fact, this is what the International Energy Agency just warned is happening in oil markets—a striking contrast to what it said only a few months ago, when it warned that new fossil fuel supplies would not be needed if nations were on track to achieve net-zero emissions by 2050.

Me? After working through all this material, I’m going to go buy me some shares of ExxonMobil, the largest natural gas producer in the U.S.

Evolutionary Science and Gresham’s Law of Ideas

So there’s a book that said something really dumb.

And by cursory inspection of excerpts and reviews, it is chock full of all kinds of silly ideas that experienced what I can only imagine to be a frictionless path from the authors’ minds to publication. I don’t really care about this book or the specific ways in which it is bad. And I don’t really care about the authors, who appear to be mediocre self-styled evolutionary scientists whose major claims to fame appear to be favoring ivermectin over vaccines and supporting themselves financially with a lawsuit against Evergreen State College.

What I care about is evolutionary biology and psychology as subfields. The core idea is that the evolutionary framework of persistent adoption and adaptation of traits under unrelenting selective pressures can be a useful modeling framework for generating theories of social, economic, biological, and psychological phenomena. Evolutionary selection is a good idea, one of the most powerful in intellectual history! But to me, an outsider economist with a long-ago acquired undergraduate degree in biology, the subfields seem to be suffocating under the weight of ad hoc theories generated in volume by marginal practitioners and non-scientists. Why? What’s wrong with evolutionary sciences? Here are a couple of thoughts.

1) There’s nothing wrong. Saying something is wrong with the subfields is like watching The Shining and thinking “There’s something wrong with axes”. This is just a bad book with bad ideas thought up by authors with minimal right to claim the mantle of evolutionary science.

That’s a totally reasonable response but I’m in no mood to leave well enough alone.

2) There’s a perverse selective pressure within evolutionary sciences where the worst ideas rise to the level of public dissemination. The culling forces of the popular press select along dimensions that are not merely orthogonal to good science; they actively select against it. Put in the language of my own field, publishing bad ideas seems to be more profitable than publishing good ones.

That’s a pretty big claim, and one for which I have no real proof, just tacit intuition and a small number of anecdotes. Sorting through the reviews of the Heying & Weinstein book, I thought of the brief phenomenon that was “Sex at Dawn” a decade ago. It, similarly, sold a breathless explanation of human behavior, specifically promiscuity. Emphasis on the word sold. “Sex at Dawn” proved that you could be scientifically hollow and still sell a boatload of copies. For those who are curious, here’s a review by an evolutionary psychologist that doesn’t hold nearly the grudge that I do. He politely sifts through the major claims, weaving through the silliness to find the handful of specific claims, and proceeds to debunk them. Other reviewers were considerably less kind (including those at Oxford Press, who rejected it for publication).

So why are these and similar books so successful?

I’ve long suspected that there is a Gresham’s Law of Popular Science at work. Simply put, bad ideas are less costly to generate than good ones, so they are more plentiful. For the non-expert consumer of popular science, this raises the cost of search: the probability that a randomly encountered book is bunk goes up. What I believe to be more problematic, though, is that bad ideas are often less costly to consume. Spoon-fed as common sense writ magnificent and powerful, pseudoscientific books get a foothold in our minds first through the scarcity of our time and attention, only to then grow roots in our egos. Easily consumed during rare moments of relaxed reading, they show us ideas that give us explanatory access to life, the universe, and everything. Why struggle through caveated niche explorations when someone else has distilled the complexity of a modern life well-lived to something that is as flexible in its flattery as a horoscope, and often conveniently enumerated?

Does this happen within economics? Of course it does. It happens in every scientific field. But that is why scientific fields evolve intellectual immune systems, and often very aggressive ones at that. The entire field of “Statistics” essentially exists as the custodian of the scientific method. But there are little details that matter, too.

Take, for example, the core concept of “maximization” in economics. Sure, it gets abused, but at the end of the day it’s pretty tough to get very far in economics with an ad hoc utility/profit/wealth maximizing model that produces useful predictions. Why is that? Well, a big reason is that we’ve left out a very important word. Economists deal almost exclusively in constrained maximization. Absent constraints, nearly every maximizing model amounts to little more than a tautology. It’s the requirement of maximization under constraints, both components transparently introduced, that gives a model its power. When I observe meritless pop evolutionary science books, mostly what I’m seeing is unconstrained just-so stories that work backwards from a conclusion the authors believe there is a book-purchasing audience for. There are selective pressures, but where are the resource constraints? There are groups, but where are the rivals they are competing with? There is this evolutionary path, but why not the other paths?
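For the uninitiated, the canonical form looks something like this (a generic textbook consumer problem, not anything from the books above):

\[
\max_{x,\,y} \; U(x, y) \quad \text{subject to} \quad p_x x + p_y y \le m
\]

Drop the budget constraint and “maximize utility” collapses into “more is better,” which predicts nothing. It is the constraint that generates refutable claims about how behavior shifts when prices or income change.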

So what should evolutionary sciences do? Well, first of all, I don’t know. But if I had to guess, the answer is nothing. Nothing but do the thing a proper science always does. Do the work, push the good ideas, kill the bad ones, and trust that the custodians of the scientific method will do their jobs. And so will the editors. And the hiring committees. And the critics. Sure, a couple of folks will pay a couple of years’ mortgage, but a bit of financial and status injustice is a small price to pay while we keep the scientific mission moving forward. At least until we’re all crabs.

Clemens and Strain on Large and Small Minimum Wage Changes

In my Labor Economics class, I do a lecture on empirical work and the minimum wage, starting with Card & Krueger (1993). I’m going to quickly tack on the new working paper by Clemens & Strain, “The Heterogeneous Effects of Large and Small Minimum Wage Changes: Evidence over the Short and Medium Run Using a Pre-Analysis Plan”.

The results, as summarized in the second half of their abstract, are:

relatively large minimum wage increases reduced employment rates among low-skilled individuals by just over 2.5 percentage points. Our estimates of the effects of relatively small minimum wage increases vary across data sets and specifications but are, on average, both economically and statistically indistinguishable from zero. We estimate that medium-run effects exceed short-run effects and that the elasticity of employment with respect to the minimum wage is substantially more negative for large minimum wage increases than for small increases.
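For reference, the elasticity in that last sentence is the usual ratio (my notation, not the paper’s):

\[
\varepsilon = \frac{\%\Delta \, \text{employment}}{\%\Delta \, \text{minimum wage}}
\]

As purely illustrative arithmetic, a 10 percent minimum wage increase paired with a 2 percent decline in low-skill employment gives \(\varepsilon = -0.2\); “substantially more negative for large increases” means large hikes cost proportionally more jobs per percentage point of increase.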

The variation in the data comes from choices by states to raise the minimum wage.

A number of states legislated and began to enact minimum wage changes that varied substantially in their magnitude. … The past decade thus provided a suitable opportunity to study the medium-run effects of both moderate minimum wage changes and historically large minimum wage changes.

We divide states into four groups designed to track several plausibly relevant differences in their minimum wage regimes. The first group consists of states that enacted no minimum wage changes between January 2013 and the later years of our sample. The second group consists of states that enacted minimum wage changes due to prior legislation that calls for indexing the minimum wage for inflation. The third and fourth groups consist of states that have enacted minimum wage changes through relatively recent legislation. We divide the latter set of states into two groups based on the size of their minimum wage changes and based on how early in our sample they passed the underlying legislation.

The “large” increase group includes states that enacted considerable change. New York and California “have legislated pathways to a $15 minimum wage, the full increase to which firms are responding exceed 60 log points in total.” Data comes from the American Community Survey (ACS) and the Current Population Survey (CPS).


Human Capital and Filepaths

Someone wrote a story about my life. It’s a report from The Verge called “File Not Found: A generation that grew up with Google is forcing professors to rethink their lesson plans”.

When I started teaching an advanced data analytics class to undergraduates in 2017, I noticed that some of them did not know how to locate files on a PC. Something that is unavoidable in data analytics is getting software to access data from a storage device. It’s not “programming” nor is it “predictive analytics”, but you can’t get far without it. You need to know what directory to point the software to, meaning that you need to know what directory contains the data file.
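Here is the kind of thing I mean, in R (the folder and filename are invented for illustration):

setwd("C:/Users/student/Downloads")  # point the software at the directory holding the data
grades <- read.csv("grades.csv")     # fails unless grades.csv actually lives there
# Equivalently, skip setwd() and spell out the full path:
grades <- read.csv("C:/Users/student/Downloads/grades.csv")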

As the article says:

the concept of file folders and directories, essential to previous generations’ understanding of computers, is gibberish to many modern students. It’s the idea that a modern computer doesn’t just save a file in an infinite expanse; it saves it in the “Downloads” folder, the “Desktop” folder, or the “Documents” folder, all of which live within “This PC,” and each of which might have folders nested within them, too. It’s an idea that’s likely intuitive to any computer user who remembers the floppy disk.

I am a long-time PC user. Navigating File Explorer is about as instinctive as drinking a glass of water for me. The so-called digital natives of Gen Z have been glued to mobile device screens that shield them from learning anything about computers.

Not everyone needs to know how computers work. I myself only know the layer that I was forced to learn.

My Dad, to whom I owe so much, kept a Commodore 64 in a closet in our house. About once a year, he would try to entice me into learning how to use it. I remember screwing up my 9-year-old eyes and trying to care. Care, I could not. It’s hard to force yourself to do extra work without a clear goal. The Verge article explains:

But it may also be that in an age where every conceivable user interface includes a search function, young people have never needed folders or directories for the tasks they do. The first internet search engines were used around 1990, but features like Windows Search and Spotlight on macOS are both products of the early 2000s. Most of 2017’s college freshmen were born in the very late ‘90s. They were in elementary school when the iPhone debuted; they’re around the same age as Google. While many of today’s professors grew up without search functions on their phones and computers, today’s students increasingly don’t remember a world without them.

One area in which I do minimum archiving is my email. I rely heavily on the search function. I could spend time creating email folders, but I’m not going to put in the time unless I’m forced to.

Here’s where the “problem” lies:

The primary issue is that the code researchers write, run at the command line, needs to be told exactly how to access the files it’s working with — it can’t search for those files on its own. Some programming languages have search functions, but they’re difficult to implement and not commonly used. It’s in the programming lessons where STEM professors, across fields, are encountering problems.
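R, for instance, can technically hunt for a file, but as the quote suggests, it is slow and almost never done in practice (the filename here is hypothetical):

# Recursively search the home directory for an exactly named file:
list.files(path = "~", pattern = "^grades\\.csv$", recursive = TRUE, full.names = TRUE)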

Regardless of source, the consequence is clear. STEM educators are increasingly taking on dual roles: those of instructors not only in their field of expertise but in computer fundamentals as well.

Personally, I don’t mind taking on that dual role. I didn’t learn to program until I really wanted to. The only reason I wanted to was that I had discovered economics. I wanted to be able to participate in social science research. Let these STEM or business courses be the motivation for students to learn to use computers as tools instead of just for entertainment.

Allen Downey wrote a great blog on this topic back in 2018 that is more practical for teachers than the Verge report. He argues that learning to program will be harder for the 20-year-olds of today than it was for “us” (old people as defined by entering college before 2016). He recommends a few practical strategies, while acknowledging that there is “pain” somewhere along the process. He thinks it is sometimes appropriate to delay that pain by using browser-based programming interfaces in the beginning.

I gave my students a break from pain this week with a little in-browser game that you can play at https://www.brainpop.com/games/blocklymaze/. They got 10 minutes to forget about file paths, and then it was back to the hard work.

I have found that a lot of students need individual attention for this step – finding a file on their hard drive. I only have to do that once per student. Students pick the system up quickly. File Explorer is a pretty user-friendly mechanism. Everyone just has to have a first time. Sometimes, Zoomers just need a real person who cares about them to come along and say, “The file you downloaded exists on this machine.”

One way around this problem is to reference data that lives on the internet instead of in a local machine. If you are working through the examples in Scott Cunningham’s new book Causal Inference, here’s a piece of the code he provides to import data from his public repository into R.

library(haven)  # provides read_dta() for Stata files

full_path <- paste("https://raw.github.com/scunning1975/mixtape/master/", df, sep = "")
df <- read_dta(full_path)  # df starts as the data file's name, then becomes the loaded data
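For context, the book wraps that snippet in a small helper function, roughly like this (the dataset name at the end is just an example; check the repository for actual filenames):

read_data <- function(df) {
  full_path <- paste("https://raw.github.com/scunning1975/mixtape/master/", df, sep = "")
  read_dta(full_path)
}
yule <- read_data("yule.dta")  # example: fetch one dataset by name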

The nice thing about referencing data that is freely available online is that the same line of code will work on every machine as long as the student is connected to the internet.

As more and more of life moves into the cloud, technologists might increasingly be pointing programs to a web address instead of the /Downloads folder on their local machine. Nevertheless, the kids need to have a better sense of where files are stored. Those who understand file architecture are going to get paid a lot more than peers who only know how to poke and scroll on a smartphone.

There is a future scenario in which AI does most of the programming for us. When AI can fetch files for us, then File Explorer may seem obsolete. But I worry about a world in which fewer and fewer humans know where their information is stored.

Avoiding Intertemporal Idiosyncratic Risk

Hopefully by this time we all know about index funds. The idea is that by investing in a large, diversified portfolio, one can enjoy the average return across many assets and avoid their individual risk. Because assets are imperfectly correlated, they don’t always go up and down at the same time or by the same magnitude. The result is that one can avoid idiosyncratic risk – the risk that is specific to individual assets. It’s almost like a free lunch. A major caveat is that there is no way to diversify away the systematic risk – the risk that is common across all assets in the portfolio.
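A quick simulation shows the near-free lunch at work (every parameter below is invented for illustration):

# Each asset's return = a common (systematic) shock + its own idiosyncratic shock.
set.seed(42)
n_assets  <- 100
n_periods <- 1000
sigma <- 0.20  # assumed per-asset volatility
rho   <- 0.30  # assumed pairwise correlation between assets

common  <- rnorm(n_periods, 0, sigma * sqrt(rho))
idio    <- matrix(rnorm(n_periods * n_assets, 0, sigma * sqrt(1 - rho)), nrow = n_periods)
returns <- common + idio  # column j is asset j's return series

sd(returns[, 1])       # a single asset: about 0.20
sd(rowMeans(returns))  # the 100-asset portfolio: about 0.11

No matter how many assets are added, the portfolio’s volatility cannot fall below roughly sigma * sqrt(rho) – the systematic floor that cannot be diversified away.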

We can avoid idiosyncratic risk among assets. But we can also avoid idiosyncratic risk across time. Each moment has its own peculiar risks. Many people think of investing as a matter of timing the market. However, people who try to time the market are actively adopting the specific risks associated with the instant of their transaction. This idea seems obvious now that I’m writing it down. But I had a real-world investing experience that – though embarrassing in hindsight – taught me a heuristic for avoiding overconfidence and drilled into my head the idea of diversifying across time.

I invested a lot into my preferred index fund this past year. I’d get a chunk of money, then I’d turn around and plow it into the fund. What with the Covid rebound, it was an exciting time. I started paying more attention to the fund’s performance, identifying patterns in the variance and in the magnitude of the larger, irregularly timed changes. In short, by paying attention and looking for patterns, I was fooling myself into believing that I understood the behavior of the fund price.

And it’s *so* embarrassing in hindsight. I’d see the value rise by $10 and then subsequently fall to a net increase of $5. I noticed it happening several times. I acted on it. I transferred funds to my broker, then waited for the seemingly regular decline. Cha-ching! Man, those premium returns felt good. Success!

Silly me. I thought that I understood something. I got another chunk of change that was destined for investing. I saw the $10 rise of my favorite fund and I placed a limit order, ensuring that I’d be ready when the $5 fall arrived. And I waited. A couple weeks passed. “NBD, cycles are irregular”, I told myself. A month passed. And like a guy waiting at the wrong bus stop, my bus never arrived. All the while, the fund price was ultimately going up. I was wrong about the behavior of the fund. Not only did I fail to enjoy the premium of the extra $5 per share. I also missed what turned out to be a $10 per share gain that I would have had if I had simply thrown in my money in the first place, inattentive to the fund’s performance.

Reevaluation

I hate making bad decisions. I can live with myself when I make the right decision and it doesn’t pan out. But if I set myself up for failure through my own discretion, then it hurts me at a deep level. What was my error? Overconfidence is the answer. But why did it hurt me?


Covid, Cars, China, Crypto, Corruption

We generally do long “effort posts” on specific topics here, but today I’m mixing things up with 5 quick updates.

  1. Covid My daughter got sent home with a cough Tuesday, which meant I cancelled classes Wednesday to hang out with her until we get a Covid-negative PCR. Last Thursday my son’s public school was closed for Yom Kippur, and I got so focused on hanging out with him I forgot to post here.
  2. Cars My wife bought a new used car last week. We’ve covered here how car prices have jumped up while inventories fell this summer, and the latest numbers show that used car prices are now falling slightly from very high levels while new car prices continue to rise. While actually buying a car, the low inventories stood out even more than the high prices. Several times we saw a promising car online, only to call or visit the dealer and find out it had sold the day before. The new Nissan Leaf sounds like an excellent value at its sticker price, but none were available in Rhode Island, and no blue ones anywhere in New England.
  3. China Scott covered the collapsing Chinese real estate market on Tuesday. I’ll just pass along the takes I’ve seen from Western economists and China-watchers Michael Pettis and Christopher Balding, which is that this is a big deal that will slow Chinese growth for years but is unlikely to precipitate a 2007-style financial crisis. I find Balding’s argument that financial contagion will be limited to be convincing partly because of his actual arguments about quasi-bailouts, and partly because he almost always argues that “things in China are worse than you think”, so if he says “Evergrande isn’t Lehman Brothers” I listen.
  4. Crypto Tuesday I met the co-founder of a new crypto-based prediction market, Melange, which sounds promising. The prediction market space is growing rapidly with PolyMarket and Kalshi joining the older PredictIt.
  5. Corruption Last week the World Bank announced it is discontinuing the Doing Business report/ranking due to apparent corruption; top Bank officials in the middle of raising money from countries including China pushed to raise the rankings of those countries beyond what the data justified. I hope another organization steps up to continue the good parts of the Doing Business report in a more trustworthy way.

Selectivity and Selection Bias: Are Selective Colleges Better?

If you have ever been through the process of applying to colleges, you have almost certainly heard the term “selective colleges.” If you haven’t, the basic idea is that some colleges are harder to get into, for example as measured by what percentage of applicants are accepted to the school. The assumption of both applicants and schools is that a more selective college is “better” in some sense than a less selective college. But is it?

In a new working paper, Mountjoy and Hickman explore this question in great detail. The short version of their answer: selective colleges don’t seem to matter much, as measured by either completion rates or earnings in the labor market. That’s an interesting result in itself, but understanding how they get to this result is also interesting and an excellent example of how to do social science correctly.

Here’s the problem: when you just look at outcomes such as graduation rates or earnings, selective colleges seem to do better. But most college freshmen could immediately identify the problem with this result: that’s correlation, not causation (and importantly, they probably knew this before stepping onto a college campus). Students who go to more selective colleges have higher abilities, whether measured by SAT scores or by other traits such as perseverance. It’s a classic selection bias problem. How much value is the college really adding?

Here’s how this paper addresses the problem: by looking only at students who applied to and were accepted by colleges with different selectivity levels, some of whom chose to attend the less selective colleges. What if we only compare these students (and, of course, control for measurable differences in ability)?

Now this approach is not a perfect experiment. Students are not randomly assigned to different colleges. There is still some choice going on. But are the students who choose to attend a less selective college different in some way? The authors try to convince us in a number of ways that they are not really that different. Here’s one thing they point out: “nearly half of the students in our identifying sample choose a less selective college than their most selective option, suggesting this identifying variation is not merely an odd choice confined to a small fraction of quirky students.”

Perhaps that alone doesn’t convince you, but let’s proceed for now to the results. This chart on post-college earnings nicely summarizes the results (see Figure 3 in the paper, which also has a very similar chart for completion rates).


Likely Collapse of Chinese Real Estate Conglomerate Evergrande Roils World Markets

Nearly a year ago on this blog, we described the sequence of events that led to the Great Recession (or “Global Financial Crisis”) of 2008-2009. The underlying problem was real estate-related debt: as inflated housing prices collapsed, many people couldn’t (or wouldn’t) pay their mortgages. Various financial dominos fell, but the one that gets singled out as the single most critical event was the collapse of the Lehman Brothers investment bank on September 15, 2008. The Dow Jones Industrial Average fell 504 points that day, and loss of confidence in the financial markets led to a freeze-up in credit, which was/is the lifeblood of business.

The likely bankruptcy of the gigantic Chinese real estate conglomerate Evergrande is being discussed as another possible “Lehman Moment”. It is hard to comprehend just how big this outfit is. It owns more than 1,300 real estate projects across China, directly employs 200,000 people, and indirectly sustains some 3.8 million jobs. It got that big by borrowing (including selling bonds) and spending enormous amounts of money. The problem now is that it seems it cannot service its $300 billion debt. Once things like this start to go bad, they often get much worse, quickly. Other parties stop wanting to do business with you, and it all goes downhill. (A famous reply in Ernest Hemingway’s The Sun Also Rises to the question, “How do you go bankrupt?” was “Gradually, and then suddenly”.) The market prices on Evergrande’s bonds indicate that the market expects bankruptcy, with bondholders getting only about 25 cents on the dollar.

If this collapse materializes fully, a lot of investors will lose a lot of money, a lot of suppliers of building materials to Evergrande will not get paid and may go broke, and a lot of real estate development in China will freeze up for the time being. Goldman Sachs estimates a 1-4% hit to China’s GDP, which is huge, and would reverberate across the whole world.

Wall Street seems to have been ignoring this drama, until yesterday (Monday). Blam, stocks fell around 2%, and were still headed south at the end of trading. Is this the start of The Big One? Well, that makes for dramatic commentary, but most observers seem to take a more nuanced approach. First, the all-powerful Chinese government could order the People’s Bank of China to “fix this”. We all now know that central banks have magical powers to create as much money as needed to, e.g., buy all outstanding Evergrande bonds at near-par. On the other hand, the Chinese government lately has been clamping down on speculation. So there may be some sort of compromise, a semi-orderly unwinding, with bondholders feeling some pain, but actual real estate operations being sold off and continuing under some other names.

Wall Street may be more worried about whether the Fed will announce on Wednesday that it is taking away the punch bowl by tapering its bond purchases. The last time the Fed did that, in 2018, stocks took a long and hard tumble. Again, a range of outcomes is possible here.

Ironically, all these concerns, as long as they don’t really turn into something serious, may be a bullish indicator for stocks. Stocks are said to “climb a wall of worry”; it is when everyone is totally complacent that the stage is set for a crash. Time will tell whether the Evergrande difficulties end up being part of a bullish wall or a bearish cliff.