I could do better

My favorite soccer team has been badly coached for 2 years and I am regularly convinced I could do better.

These are not the thoughts of a rational man and it's causing me no small amount of consternation, bordering on intellectual crisis. Which is, of course, a lie, but adding a touch of intellectual melodrama never hurts when you're trying your damnedest to write something new every week.

It is a puzzle, to be sure. There have been two coaches in the last two years, the second having only been there a week. The first was experienced, accomplished, and internationally famous. I’m quite confident he was wrong in the majority of decisions he made, but I at least had a model for why he was so often wrong.

When an ostensible expert appears to be failing at their job far worse than a hypothetically cheaper replacement, I always look for the rational reason why someone might be choosing to fail. In this case, we were observing an individual who could achieve mediocrity without effort. His past accomplishments gave him credibility with the players and his stock of knowledge as of 2011 was sufficient to carry him to large paychecks. To achieve mediocrity required near minimal effort. Could he update his tactics, both within the structure of the game and his management of personnel? Of course. But doing so would require enormous amounts of effort. His salary had peaked, his future managerial prospects dimmed by age and recent results, and as such the returns to effort were dwarfed by the returns to leisure. Allow me to enter ego into the calculus. What sounds more cognitively costly: acquiescing to the reality that your human capital has been rendered obsolete and rebuilding your modus operandi from scratch with the full knowledge that you may spend your wealth-laden golden years failing in public? Or denying it fully, shifting all blame for failure onto the personnel, and bemoaning that it is not your human capital that is obsolete, but rather that the labor pool available to you is fundamentally flawed? To me it's a no-brainer, and it's why I am fully of the belief that there actually are bloggers in their mom's basement who could have better managed a team.

The new manager is a temp. He’s never managed a team before. Then again, neither have I. He has, however, played professional soccer at the highest level. He has been placed on the management training track by a world-class organization. He has none of the maladapted human capital or rational-addiction-adjacent reasons to fail at his job. He has all of the local and tacit knowledge from being on the training pitch and in the locker room that I don’t.

I’m still confident I could have done a better job than he did today. Why is that?

I can construct a model to rationalize my beliefs, but that model gets awfully "just-so" very quickly. It relies on assumptions I can't justify and broad generalizations that, if evenly applied, would hurt my case as the superior candidate even more than they hurt the current job holder's. Of course, I can invent a narrative where I am the superior sports team manager, but that narrative would have to rewrite my entire personal history, going back so far as to render me a completely different human, and one who would no doubt have just as many (and possibly the same) blind spots.

I guess what I'm saying is that I know I shouldn't be the manager. Every rational bone in my body knows that is a silly idea and I would fail miserably. But I think there is a case to be made that sometimes we can look at the person making decisions for our favorite team, look at their track record, and confidently say "They would be making better decisions if they talked it over with me." When the armchair quarterback says "the coach is an idiot," they're not saying they want to be the coach. They're saying they want to be in the room. They want a voice because they think they could contribute.

Someone tell Tottenham Hotspur that I’m available. I’m not free, but I can be had.

Don’t arbitrage time with friends

As you may have already heard, the US suicide rate dropped 6% last year. During a pandemic. During a lockdown. During a time when rates of depression have reportedly increased. This is all quite surprising to many people, myself included. I don’t have a convincing explanation, only a single relevant thought.

I think we’ve rediscovered regular long-distance communication with people that have drifted out of our lives and many of us are better for it. I know I am.

While I think that loneliness and isolation are a major force behind a lot of social ills, I also know that the "loneliness epidemic" was always a poorly constructed metaphor at best, and possibly only weakly observed at worst. But I also suspect that loneliness and isolation are phenomena in the tails of the distribution. Isolation doesn't happen to people with average or even below-average social networks. It happens to people entirely without them, for whom their strongest connections with other humans have dissolved. Such things do not always reveal themselves statistically, at least not without looking really hard.

We have observed the emergence, and now dominance, of asynchronous communication. We text, email, tweet. We post on Instagram, Facebook, Snapchat, or TikTok. These are all means of communication, but (with the possible exception of texting) these forms of communication exist outside of real-time. They don't command chunks of contiguous time – they arbitrage the fractions of time that previously existed between activities and went uncommitted to a narrowly defined task.

I don't know what causes loneliness, but I do know that it's much easier to not feel lonely when you are spending fully committed time, and not arbitraged fractions, communicating with another human being. If you're under the age of 40, it's almost socially illegal to voice call a friend to talk. For many it would be viewed as an act of emotional aggression, an imposition of social need, if not anxiety, on another. The irony of this millennial norm I've unfairly placed on them through nothing but my own unreliable observations is that it strikes me as an accelerated path to friendless boomer sad-dad suburban isolation.

The pandemic hit and many of us had to start Zooming in to work. And we had to explain Zoom and Google Meet to our parents so we could talk to them. But I think a lot of us started catching up with old school buddies, too. Folks you sent Christmas cards to or caught up with at a cookout the Sunday after Thanksgiving. It became completely normal to schedule a call in advance – to put it on a calendar and reserve that time. And I think a lot of people who moved for work or relationships, who after 10 years changed to a new office where they didn't know anyone, who maybe had simply fallen out of step with friends after the first four years of trying to keep triplets fed — I think a lot of people really enjoyed the pandemic-driven need for reserving time for contiguous social interaction in a manner entirely unconstrained by geography. And maybe they ended up feeling less lonely for it.

Keep Zooming your friends far away. Keep putting it on the calendar. Do it forever.

The Problem is the Science

The University has been the engine of basic science in the US and abroad for a long time. Any hand-wringing in recent years over its imminent obsolescence was borne of advances in remote learning and newfound capacities to exponentially scale single instructors to reach tens, if not hundreds, of thousands of students across the globe. How, in this brave-ish new world, would matriculant tuition accruing to a handful of instructional specialists/celebrities continue to subsidize the scientific mission?

If the arrival of YouTube and Khan Academy gave credence to the academic apocalypse theory, then the coronavirus pandemic and the global adoption of Zoom instruction would surely make a reality of it. I will admit, for the first time in my career, I’m seeing the cracks in the edifice of the academy. And, yes, it was the pandemic that made them more prominent to me.

But it's not on the educational side of our dual mission. It's the science.

Dr. Katalin Kariko is very likely to win a Nobel prize for her immense contributions to our understanding of messenger RNA (mRNA) and how it can be manipulated to create an entirely new class of vaccines that, it is not hyperbole to say, stand to offer a global shift in health. The prospect is there for not just an HIV vaccine, or a broad-spectrum influenza vaccine, or a malaria vaccine, but the broad mitigation of viruses as a burden on humanity.

Dr. Kariko has been pursuing her scientific mission with a single-mindedness that jumps off the page in everything that has been written about her. What also jumps off the page, at least to those of us who have been trying to make a career in academic research, is the university system that has worked diligently for decades to push Dr. Kariko, and her scientific mission, out of the academy. At every stage of the hiring, retention, and grant application process, Dr. Kariko's research has been met not so much with criticism or doubt as with what seems more like horrifying indifference. Grant reviewers saw little value, her colleagues noted that she lacked finesse in writing grant applications, and the academic institutions that employed her saw little value in employing someone, even for less than $60k a year in salary, who was unable to consistently bring in large grants (sidenote: her husband often estimated her effective wage to be roughly a dollar an hour: from the university's point of view, it wasn't the expense she represented on the balance sheet, it was the opportunity cost of the grants she wasn't winning that someone else in her slot would).

This is a problem.

To be clear, this indifference is far more damning than any sort of broad disagreement would have been. The nature of science is such that most advances are incremental, but every now and then there are the rare revolutionary upheavals, where something we thought we absolutely knew for sure turns out to be completely wrong. That scientific mavericks who push such theories, most of which are completely wrong, meet resistance is natural (and probably optimal). But indifference is a problem, because indifference does more to reveal the underlying incentives propelling researchers. Universities were indifferent to her research because it wasn't generating grant money, and that is the job she was hired to do.

Patents are great. Prestigious awards are welcome. Published papers are not entirely a waste of your time. But make no mistake, if you don't successfully apply for grants, your days in academic science are numbered. I spent three years as an oddly appointed economist in arguably the greatest medical school of the last decade. I got to hang around brilliant physicians who spent a lot of their time every week actually (not figuratively or indirectly) saving lives. I also witnessed dedicated researchers break down into tears upon receiving the news that their grant application had been denied, which meant their contract with the university would not be renewed and their research career effectively terminated. I saw how little grant application aptitude correlated with talent or passion. I saw people thrive in the system while others failed, with little in the way of scientific aptitude to distinguish them.

The most practical advice I was privy to was this: work in someone's lab, pursue your project in parallel with their resources. Once you have an advance that would be worthy of a grant application, write up the application for a project you've already completed. List your previous PI as a collaborator, promise exactly the results you already have, describe your budget, schedule, and proposed outputs in shocking detail, and then radically oversell the importance of the discovery. Once you win the grant, use that money to pursue your next project while writing up the outcome of your previous one. Once you have results, apply for yet another retrospective funding grant, and continue to daisy-chain that until you win a massive grant, a coveted NIH R01 perhaps, within which you can bundle a series of projects, hiring as many post-docs and early researchers as you can. You will then manage this team, who will execute your research while hopefully starting their own retrospective grant application daisy chains. Is this a common strategy? I don't know – it seems odd that the dates of human subjects testing could be obscured. But the point was made to me – this isn't about science, this is a career life-or-death game where only 20% of applicants are funded.

To be honest, I don’t care that people are gaming funding institutions. And, to be clear, “playing the game” is part of any career, no matter how idealistic you want to be. Academic research science is in deep, deep trouble, however, if grant application gamesmanship dominates scientific ingenuity in the talent acquisition and retention strategies of major universities. It means we’re no longer scientists, we’re rent-seekers. We’re the person in the village best at memorizing Mao’s Little Red Book: smart, talented, but in the end wasted. Or, much worse, we’re just poseurs.

Piecing together what I've read in articles and her Wikipedia entry, after Penn demoted her to adjunct status, Dr. Kariko found a home at BioNTech in 2013, where they and other biotech firms saw tremendous value in her work, yada yada yada, her research with Drew Weissman saved millions of lives going forward and maybe just the whole damn world.

Two takeaways:

  1. If Penn, after demoting her to being an adjunct, tries to claim her and her work as their own we riot.
  2. What is the marginal value of university research if all we’re producing is grant applications?

Part of the blame, of course, has to be placed at the door of the NIH and NSF grant application review process. But how much longer are they going to matter either?

  1. 2021 NSF Budget: $8.5 Billion
  2. 2021 NIH Budget: $43 Billion
  3. Tesla Market Cap: $650 Billion
  4. Elon Musk net worth: $167 Billion

The whole point of the NSF, NIH, and the academic research project is the production of the public good that is basic science. Absent private profit incentives, they should be able to pursue the big-picture projects that are too broad in application for private companies and the high-risk, high-reward projects that are too risky even for venture capital.

The advantages of government agencies, however, are limited if they are overwhelmingly surpassed in scale by private market science. Even if 99% of firms can't overcome the public goods problem, the 1% of firms that stand to profit from advancing basic science (ironically, what public economics would call "privileged groups") have the scale to execute such ambitions. More importantly, however, they may also have better incentives. Yes, they are greedily trying to make a profit off of their innovations, but at least the innovation remains their goal.

I'm not worried about the value of university professors as educators. It turns out that education doesn't scale as well as we thought. That there is tremendous value to being in a room together when you're trying to pass on explicit, complex, and tacit knowledge. Nor am I worried in the slightest about capital-S Science. There is a bright future for any and every institution producing science, even the most basic, broadest science that no private company or patent strategy could ever exclude others from benefiting from. But, I'm afraid, there is no future for the production of grant applications or the institutions that pursue them at the expense of brilliant minds trying to solve our most important puzzles.

Created By

There is an old adage, I don't know who to attribute it to (probably Norman Lear), that theater is for actors, movies for directors, and television for producers. The logic behind it is fairly straightforward and compelling.

No matter how much the director works to make their vision come to life on the stage, when the curtain rises the production succeeds or fails based on the choices the actors make that night, in that moment. They have all the power. Cinema is a different animal, granting considerably more influence to the director. They place the camera, and therefore the audience, wherever they want. They can demand take after take until they fill the frame with the vision they hold in their mind. They can lean over the shoulder of the editor at every step, telling the story they want to tell. The director does not, by any means, hold unchecked power, but they are the high-leverage determinant of a project’s success or failure.

Television as a producer's medium is, in my opinion, slightly out of date. When people spoke of the power of producers within television, they were speaking of network television: a landscape with limited channels where few would ever be so foolish as to dismiss the power of the median voter theorem. Producers thrived because they made the high-leverage decision: what gets to be on television. The actors, the writing, the (ha!) cinematography – those were all second-order concerns, trivial concerns really, that lived in the shadow of the one decision that truly mattered: did you get to be on television?

Whole lines of economic research and theory center on the economies of scale and network effects. If you've ever wondered why books about old Hollywood have some of the craziest stories you've ever heard, it all comes back to the simple, but rich, economics of a marketplace with massive network effects for consumers (you want to watch what everyone else is watching), enormous fixed costs for setting up a network that absolutely trivialize the marginal costs of producing a show, the nearly zero marginal costs of broadcasting, and the enormous barriers to entry for potential rival networks. Coupled with the enormous status associated with "being on television", you arrive at an outcome where the artistic quality of content is almost irrelevant to market success, labor is willing to work for peanuts, and your capital inputs are almost exclusively fixed costs. Who's the high-leverage determinant of outcomes? The person who gets to decide what gets to be put on television.

That world is gone and I am grateful for it. Television is now the medium for writers.

We live in an endless wonderland of channels and content. The median viewer is still well served by a multitude of outlets, but it is into the microbiomes of this new ecology of entertainment that most of us are lured. If the defining attribute of the supply of entertainment has become its specificity, then the defining attribute of our demand is its depth. We demand 32-film superstructures with fully fleshed out worlds within worlds within worlds. We demand 6 seasons and a movie exploring the relationships between a community college study group and their metacommentary on film and television and how it has come to define how we view relationships. We demand 10th-season callbacks to a sight gag from season 2 that was originally an homage to something Truffaut did (which was itself an homage to Hitchcock). We want the story to keep going and going and going, and if it has to end, it sure as hell had better not all have been a dream.

Showrunners, who are typically the final typewriter that most scripts go through, and their teams of writers are producing the content that we voraciously binge. I don't (want to) know how many hours of television I watch a year, but I have no doubt that I'm consuming thousands of man-hours of writing, which makes it hard to complain about the price of HBO Max when I'm effectively paying pennies per hour for good writing. Maybe good writing has always been in short supply, but for the first time it is the high-leverage determinant of the success and failure of outcomes – good writing is the short side of the market. So if you want to make it as a writer, keep writing! But if you want to make a career and pay a mortgage as a writer, I suggest you bone up on your television story structure.

NB: for the couple dozen or so of you who read this, be advised that I am Mike Makowsky, the economics professor, not Mike Makowsky the talented screenwriter. Please do not blame him for my opinions, though I do encourage you to watch his movie Bad Education, which is excellent.

On Cylindrical Revolutions

The three technological innovations new to my life in the last year with the greatest impact are:

  1. Pfizer mRNA vaccines (price = $19.50, input costs: no less than $2 Billion, probably more)
  2. Amazon Basics Foam Roller (price $18.99, input costs: $4.44 per ounce of styrofoam)
  3. Zoom teleconferencing (price: $no idea what my school pays for it, input costs: $146 Million in venture funding)

The vaccine, the first dose of which I am scheduled to receive tomorrow, will allow me to (sort-of) return to my pre-pandemic life. The introduction and regular use of a cylinder of high-density styrofoam has given me a better functioning left leg than I've enjoyed in 5 years. Zoom has arguably done the most to maintain the short-term integrity of my income (i.e., it's allowed me to teach online effectively).

That is a very oddly shaped distribution of investments in high-utility yield innovations.

Biotechnology and medicine as a high investment, high risk, big payoff innovation game is well understood. Less known was whether or not a rapid “innovation on demand” vaccine project was an achievable outcome, no matter how much money was thrown at it. Turns out it was, and we’re left with what might be the most impressive feat of willed innovation since the moon landing. High-resolution teleconferencing technology, on the other hand, is exactly the kind of product we’ve grown accustomed to modern tech firms producing– the supply of such innovative products via the private capital-entrepreneurship pipeline is almost always less in question than the eventual demand it may or may not find in the marketplace.

But what of treating your muscles like sugar cookie dough? This is neither a sophisticated new composition of materials nor, at face value, a particularly complex theory of musculature. But, to my knowledge, this is not something even professional athletes were doing 7 years ago, yet now is both the bleeding edge of physical maintenance and such common knowledge that everyone who’s strained a muscle in the last 6 months currently has one of these cylinders leaning against a wall in their home. And, while I don’t mean to oversell it, the introduction of foam rolling has massively improved the quality of my life, not just when I try to play any sort of sport, but when I walk down a flight of stairs. It’s not crazy to suggest it may buy me an extra decade of easy use of my preferred mode of transportation, and while using my natural knees at that.

Investment in innovation is an interesting thing – there appear to be significant returns to scale at the micro, meso, and macro levels. Firms flush with capital can focus teams on single problems, fill them with talent, and grant them the keys to every piece of equipment deemed to hold even the slightest possibility of aid en route to an end product. There are simply innovative outcomes on the horizon for the Pfizers of the world that will never be available to scrappy new start-ups. At the same time, we can see the network-driven returns to scale in markets, a la Silicon Valley or Hollywood, that only begin to appear when a critical mass of agents all find themselves drawn to the bubbling creative soups that appear in the diners, salons, and coffee shops of whatever place has become the place.

But there are scale returns at the most macro of macro levels as well, and that is where we get miraculous cylinders of foam, as well as wheels on suitcases and the polymerase chain reaction. People are many things. Occupiers of space. Emitters of carbon dioxide. Consumers of fried dough. Sometimes while doing all three they also come up with ideas.

Humans as idea machines lies at the core of Michael Kremer’s theory of economic growth, and it is perhaps my favorite idea within economics in the last 40 years. Simply put, more people leads to more ideas. Population growth is not just a product of innovation, it is a source of it. Every individual is a lottery ticket that we hope pays off with a world changing eureka moment that the rest of us can benefit from and build on for all time going forward. More people, more lottery tickets.
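A toy sketch of the lottery-ticket framing (my illustration, not Kremer's formal model; the probability `p` and population figures are invented for the example): if each person is an independent draw with some tiny chance of a world-changing idea, expected breakthroughs scale linearly with population.

```python
import random

def expected_ideas(population: int, p: float) -> float:
    # Each person is an independent "lottery ticket" with
    # breakthrough probability p, so expectation is linear in people.
    return population * p

def simulate_ideas(population: int, p: float, seed: int = 42) -> int:
    # One Monte Carlo draw: count how many tickets actually pay off.
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(population))

# Doubling the population doubles the expected idea count.
print(expected_ideas(1_000_000, 1e-4))  # 100.0
print(expected_ideas(2_000_000, 1e-4))  # 200.0
print(simulate_ideas(1_000_000, 1e-4))  # a single draw near 100
```

The linearity is the whole point: population growth is not just a product of innovation, it is an input to it.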

Those organic globules of cognitive betting slips coalesce into the long tail of innovation return on investment. We take the brightest minds, throwing them and piles of cash at our biggest problems, hoping for the closest thing to an assured payoff. But it's within the billions of people, and their billions of bad ideas that sometimes aren't, that we get countless miracles that change our lives for the better bit by bit, one smoothened middle-aged stride at a time.

Berkson’s Paradox nay Bias and Spring Break Blogging

You may be tempted to observe a negative correlation between the length of my blog posts and the fraction of the previous 7 days that can be accounted for as "Spring Break", but I submit that you may simply be omitting from the sample all of the short blog posts I could hypothetically be writing in crisp fall months.
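For the curious, Berkson's bias is easy to reproduce: draw two independent quantities, keep only the cases where their sum clears a bar, and a negative correlation appears out of nowhere. A minimal sketch (the variable names are illustrative, not data about my actual posting habits):

```python
import random

def pearson(xs, ys):
    # Plain-vanilla Pearson correlation coefficient.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

rng = random.Random(0)
# Post length and free time that week: independent by construction.
length = [rng.random() for _ in range(100_000)]
free_time = [rng.random() for _ in range(100_000)]

# Selection: a post only gets published when length + free time
# clears a threshold, so short posts in busy weeks never appear.
published = [(l, f) for l, f in zip(length, free_time) if l + f > 1.0]
pub_len, pub_free = zip(*published)

print(round(pearson(length, free_time), 2))  # roughly 0
print(round(pearson(pub_len, pub_free), 2))  # clearly negative
```

The conditioning step is doing all the work: nothing about writing got negatively correlated, only about what you get to observe.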

Do read Lionel’s whole thread though. It’s good.

Consumption as Inflation Hedge

The emerging market in digital art as nonfungible tokens is the strongest signal of expected inflation I’ve seen to date.

Let’s back up.

Digital art is being sold as nonfungible tokens (NFTs). Is this a bubble? Don't know. Is this art? Don't care. Is a piece of digital art as an NFT easier or harder to duplicate? I imagine it is easier for the artist, but they have an incentive not to issue duplicates, because doing so erodes the market value of all future digital art NFTs the producer might issue. Is a piece of digital art as an NFT harder to duplicate for a forger? I imagine so. The NFT as both art and artist's signature is certainly harder to duplicate than traditional media and penmanship. Which is to say we have little reason to worry about the value of a piece of art being inflated away by the artist or criminal forgers.

Now that’s interesting.

The general rule of thumb is that the more consumption value a good offers, the worse it will perform, on average, as an investment. Art, baseball cards, comic books, vinyl records, memorabilia, homes – these are all generally inferior to equities as investments. It stands to reason, though I certainly haven't checked, that the same logic applies to hedging against inflation as well. Precious metals, while less fun, should offer a superior hedge against inflation than art, particularly in relation to art by living artists, where the supply is anything but fixed.

In this regard, however, NFTs are a bit of a game changer. The supply of any given Beeple NFT is fixed forever at one, and there is as yet no reason to believe otherwise. Storage and security costs approach zero, which is something that can't be said about a 20-foot tall metallic balloon dog. The consumption value is subjective and I'll leave it to market auctions to suss that out. The inflationary hedge value, however, is where NFTs may be a game-changing innovation for prominent living artists, allowing them to capture rents from the value they create – rents that have previously eluded them prior to shedding their mortal coil.

The bond market isn’t giving unambiguous signals of inflationary pressures yet, but signs are creeping in, and among those signs I include seemingly rabid excitement for mixing cultural-status consumption with cryptocurrency-enabled hedges against the prospect of what would be the first real wave of inflation we’ve seen in 40 years.

Which is a long winded way of saying I’m not rooting for inflation, but I’d also be happy to sell my mint-condition complete set of 1987 Fleer baseball cards if you’re looking to hedge your portfolio.

Political Poverty is a Choice

Political drama was about to happen and then it didn’t. Across the country, deep and insightful thinkpieces were left unfinished, relegated to the folder for things writers hope will become future brilliance but definitely never will.

The Big Covid Stimulus Bill was about to fall short by a single vote, with Senator Joe Manchin (D-West Virginia) threatening to break against party lines. It was a disaster… until it wasn’t. A political catastrophe, evidence that the Democrats were a failed coalition once again humbled by their ruthless coordinated opposition… until it wasn’t.

So what was the source of this unforeseeable political miracle? Joe Biden’s long-running political strategy of asking people what they wanted, keeping promises, and not being a d*ck.

As much as I want to roll with three paragraphs of clever wordplay referencing stratagems and gambits, the obvious point to be made is that Biden has decades of political capital that the entire Democratic party is currently able to leverage. In contrast, the Republican Party is currently fronted by a Senator who has broken every political norm for short/medium run political gain, while bearing the brand of a career grifter who spent decades opting not to pay his contractors, employees, or lenders.

I’m not much for making forecasts or predictions, so here’s my predictive forecast for the Republican party: they don’t matter and won’t for years.

Make no mistake: their politics still matter a great deal. White ethno-nationalism has a real foothold in chunks of the electorate all over the country, evangelical Christians remain one of the most influential voting blocs, and the US system remains weighted towards the preferences of rural voters. Rather, what I mean is that the institution of the Republican Party no longer brings much to the political bargaining table. The party has spent down decades of political capital and has no recourse to trust in its reputation to solve collective action problems. The bill has finally come due for their spendthrift and short-sighted culture. As much as it may hurt our sympathetic sensibilities, we owe it to them to let them learn from this experience and, after a few decades of trustworthy behavior and political saving, they should be able to pull their party up by their bootstraps.

In four years, two with control of the House, Senate, and presidency, the GOP was never able to pass legislation as impactful on the US landscape as what the Democrats pulled off this week. The Republican party remains an efficient fundraising organization and cultural brand for running a campaign, no doubt. There's not going to be a third-party usurping of their status as one of the two dominant parties, at least not any time particularly soon. But as far as the legislative marketplace is concerned, the Republican party is dead broke.

Dolly Parton and the Danger of Doing What You Love

Let’s get through the easy parts quick. This Vox feature does its best to argue for, without ever explicitly stating or committing to, the thesis that Dolly Parton should be canceled because she has never said or done anything controversial, let alone anything justifying cancelation.

It is not good and is largely unworthy of comment. As much as some of you crave dunking on the proponents of cancel culture with an intensity that sometimes feels a lot like, well…cancel culture, I'm bored with the whole family of skirmishes, vendettas, and public identity burnings.

I want to talk about sh*t jobs.

I don’t mean unpleasant, dangerous, or low status jobs. There are positive compensating wage differentials for such things. No, I submit to you that the new sh*t jobs of the modern developed economy are relatively pleasant, safe, and within the appropriate social circles, quite high status. And therein lies the trap. Let me set the scene.

You’re at a top 100 undergraduate university. English is your first language, you’re accustomed to receiving high grades, and you are sufficiently socially adept that attending college parties is at least moderately enjoyable. In choosing your major, you are persuaded that you should choose the subject within which you experience the greatest pleasure executing your assignments and participating in class. While math is by no means beyond your capacity, studying it brings you little pleasure, and there is no similar mechanism for you to earn approbation from your professor or impress your classmates. You don’t get excited about telling your friends you are planning to become an engineer or chemist, and, perhaps most importantly, imagining your future self as an employee in sensible work slacks fills you with an almost crippling amount of ennui.

So you start on the path to becoming a writer. You know that fiction writing is a brutally competitive field, dominated by a handful of (what you imagine to be) supernovas of talent. You’re practical, you tell yourself, and imagine a career in journalism or journalism-adjacent publications where you research beat stories and features, allowing yourself to get excited about climbing the ladder and eventually writing a regular column where you blow our collective minds with your insight and pith. It takes only six months into your first gig to realize the problem. The really big problem.

Every other English major in the country had exactly the same idea. A lot of sociology, history, critical theory, and field studies majors, too. The field is flooded. But it gets worse. It’s also filled with engineers, economists, psychologists, biologists – people with specialized knowledge, often with advanced degrees, all competing with you for space in a brutally selective ecosystem where every ounce of attention and influence is measured to the last eyeball.

But it gets so much worse.

Thousands of those people are willing to do the job – your job – for free. For nothing. Hell, some of them are willing to pay the publisher for the opportunity to do what you considered the vocation that would pay your rent. How can you compete with that? It’s beyond our fears of being underbid by people willing to take less pay, of our job being outsourced to workers in another country with a lower standard of living and weaker labor laws. Nobody’s worried about the execs at their company discovering a sweatshop in Vietnam full of employees willing to pay your boss for the right to do your job.

But that is exactly what’s happened to everyone who wanted to write about sports, music, partisan politics, or, for that matter, any subculture where being a tastemaker or cultural curator is catnip for the teenage soul. There’s been a revival of unionization in the digital print business and it’s easy to see why: they need to close shop. Everyone who’s gotten their foot in the door and ridden the elevator up to their new 6th floor cubicle has been greeted with the same horrifying sight. Teeming masses, as far as the eye can see, all desperate for their job, for their identity, as a writer. So desperate they’ll do it for free. Some want a chance to prove themselves, but many of them just want a hobby. They competently teach 7th grade band during the day for a pay package that includes health and dental insurance, all while wearing a very sensible skort from Costco. But by night they write fiery, in-depth, shockingly well-informed features about their favorite North London soccer team, Icelandic DJ subculture, or how to get the most bang for your buck shopping at Costco. The research, the writing, the promotion – they do all of it for free.

Which means every assignment could be your last. Which means you need to get attention, no matter what. Most days it’s not that big a risk. You churn out 1-2 posts per day, mostly just recapping news or taking a few shots at someone who wrote something you don’t like or, at least, you think other people won’t like. But every now and then you shoot the moon on a big feature, going through old sources, putting together a collage of links that you think will jar readers into not only reading your work, but responding to it and, most importantly, sharing it with others on Facebook, Twitter, or even Instagram. A viral hit is the kind of thing that your overlords will remember the next time your writing hits a fallow period.

But if your feature doesn’t pan out, that could be a problem. If your subject is a beloved country singer with a reputation for virtuoso talent, kindness, and an often overwhelming generosity that actually makes the world a measurably better place, well, you don’t have the luxury of letting three weeks of work go to waste. If you can’t find evidence that she’s a bad person, well, you’ll have to go with your gut. And your gut tells you that everyone who is successful is a bad person. Lack of evidence of their secret depravity is itself only evidence of how much they have invested in hiding said depravity.

That’s the problem with trying to make what you love at 19 into a career. You’re a kid; you don’t know that much about what you’re going to like in 10 years. All you know is what is fun and what is hard. And, in rough approximation, the same things are fun and hard for all of us. Only studying the things that are fun, dodging whenever possible the things that are hard, will leave you with nothing to rely on but your talent. And, ample as that talent may be, it is unaugmented by the skills and tools that are harder and less fun to come by, tools which would differentiate you from the teeming oceans of talent sloshing against the sides of that cubicle, all desperate to do your job. For free.

Academia as tax shelter

A very brief story:

My advisor was Laurence Iannaccone, student of Gary Becker, seminal and in many ways founding contributor to the economic study of religion, now of Chapman University. His observation, that much of an academic's compensation is non-pecuniary, is a common one in academia, a point of pride for some even, though that varies greatly by discipline, as do academics' market options outside of the academy. And, yes, flexible work schedules, post-tenure job security, and sometimes picturesque campuses all should be counted towards the total compensation of those fortunate enough to secure a faculty appointment. But the power of the observation goes far beyond proper labor market accounting.

As I find so often to be the case, there is good sociology to be done, but the best first step in doing so is a little bit of economics. To wit:

The academy is, on average, considerably to the left of the population at large. Now this difference, mind you, is grossly exaggerated by your typical right-wing windbag who seems to think that universities begin and end in the English department, but the difference remains. So why would your typical economics, chemistry, or architecture professor tend to be left of the popular center? Well, if the median self-identified lefty got to choose the federal and state tax rates, what would they be? Ok, and how much of that will I have to pay out of my non-pecuniary income? Until they figure out how to tax the thrill of pursuing my own self-determined research agenda, not very much. Taxes are cheap when half of your compensation is non-pecuniary.

The academy is a club.

Scratch that.

The academy is a hierarchy of nested clubs. Which means that we often suffer from exclusionary FOMO akin to fourth-tier English gentry trying to marry off five daughters in the early 19th century. Membership in those clubs – the famed research groups, donor-named centers, or even (god forbid) schools of thought – becomes more than just a source of funding, workshop critique, and coauthor match-making sock hops. These clubs become the wellsprings from which ever-increasing portions of our non-pecuniary income flow. They become our social networks, our friends, and even, with a handful of co-authors you’ve gone into scientific battle alongside, a second family. The next time you see someone dig in their heels, seemingly denying the mounting evidence that they were on the wrong side of a scientific argument, don’t just blindly assume they are too stubborn and arrogant to acknowledge they might have been wrong. Consider how unfunded or, more importantly, how lonely they stand to be if they’re the first to give up the fight.

It’s why we covet tenure so much. Don’t get me wrong, everyone wants job security. But for most of us, the prospect of being laid off doesn’t necessarily include the possibility of being jettisoned from what you’ve slowly constructed as a separate parallel universe within which you have carefully curated the technical, educational, and social capital necessary to produce your career and life. If you get laid off from programming for Netflix, the next few weeks or months will be unpleasant, scary even. You may begin to doubt your ability or life choices. But that next job will come, and you will as often as not find yourself with a nearly identical life on the other side.

There are those in the academy, though, for whom this is all they’ve ever known. Bachelor’s, doctorate, tenure-track academic placement. Throw in a post-doc and that’s 20 years, and your entire adult life, in and around universities. Even if they’re from a field fortunate enough to have robust private sector options, how much will doubling your salary really soften the blow for such a person?

I say all of this now not as a critique of academia, or even to lead to prescriptions or advice. You want my advice? Fine, here: don’t go straight to grad school. Dip your toe in the real world, see how you like it. Come back in a few years with a little experience and distaste for office life. It’ll serve you well when your dissertation hits one of its many inevitable nadirs.

Rather, I invite you to consider this: what does the world start to look like when our utility comes less from the goods that we buy and the experiences we have, and more from the clubs we are members of? What does it look like when those clubs find newer and better ways to monitor our behavior and our expressed beliefs? What does it look like when the purging of membership rolls becomes a part of the culture of those clubs?