Free speech in the digital age

I finally broke down and wrote a cancel-culture-adjacent piece. It originally appeared at Merion West, an online magazine. Since this essay contains themes I’ve been mulling over but have struggled to articulate for a while, I thought I’d reprint the piece here, with some commentary in the footnotes. Enjoy!


If I had to pick one thing America does better than any other nation, I’d have to go with free speech. The American commitment to free speech is legendary, codified by the First Amendment, which guarantees all Americans the right to worship, peacefully assemble, and otherwise express themselves without fear of government censorship.

As legal protections for freedom of expression go, the First Amendment remains the gold standard worldwide. We often take this for granted, forgetting that most people don’t live under the same conditions. Contrast the American stance on freedom of speech with that of Iran or Saudi Arabia, where blasphemy is punishable by death, or China, where one to three million members of an ethno-religious sect are packed into concentration camps for crimes as spurious as abstaining from alcohol.*

If picking on theocracies and dictatorships strikes you as low-hanging fruit, recall that Europeans also live with less freedom of expression. A U.K. man was arrested and fined for posting a YouTube video that showed his girlfriend’s pug performing Nazi salutes, for example. By comparison, the American Civil Liberties Union has used the First Amendment to defend the rights of neo-Nazis and civil rights protestors alike to assemble.

Our commitment to the rights of others to express themselves, even if they hold heinous beliefs, is something uniquely American, perhaps the finest piece of our cultural heritage. Unfortunately, it’s a commitment we seem to be turning our backs on—and the First Amendment is often used as a moral license to do so.

The First Amendment guarantees freedom from government censorship; it doesn’t establish a positive right to speech. This is as it should be, as anything more would require compelling others to either hear or facilitate one’s speech. However, this allows people to take a narrow view of freedom of speech as merely freedom from government censorship. We might call this the “showing you the door” strain of free speech thought. Such a view, while legally coherent, ignores that free speech has a cultural component as well—one that needs constant maintenance if it’s not to fall into disrepair.

That component might be described as a willingness to err on the side of permissiveness when it comes to public discourse—or perhaps an understanding that we generally tend to benefit from living in a culture where people can push boundaries without intolerable social and economic risk.** Its bedrock values are charity, humility, and tolerance. 

When I speak of a threat to free speech culture, I’m talking about the newly enabled impulse to defenestrate and defame people, often for trivial transgressions, sometimes years after the offense—“cancel culture,” if you must. It is distinct from free speech culture in that it doesn’t seek to confront opposing views but rather to erase them, often in ways that are financially or personally ruinous for the offending party. It’s the self-righteous, vindictive justice of the mob.

Because the internet

Any observer of humanity can tell you this is not new behavior. On the contrary, it’s been more the rule than the exception. But it does seem exacerbated and facilitated by modern life, especially the internet.

As more of life moved online, it became easily searchable, permanent, and largely public. This migration—the result of social encouragement to live in full view of your friends, casual acquaintances, and advertisers—has spawned a panopticonic social archive that can easily be turned against you.*** These are conditions unique to life in the 21st century that many adults, let alone children, seem understandably ill-equipped to navigate.

When paired with the rapid mutation of norms surrounding acceptable speech (also aided by the internet) and the mob mentality incentivized by social media and click-hungry outlets, this creates an environment ripe for reflexive, post-hoc defamation, to which even—or, more accurately, especially—powerful liberal institutions (the very same tasked with guarding free inquiry) are showing little resistance.

In such a hostile environment, the obvious choice becomes to abstain not only from speech that is controversial but also from speech that might someday become controversial. (The exception being those who are financially immune to cancellation and can thus afford public free thought.) This is clearly at odds with a culture of free speech, in which ideas can be freely debated and people can change their minds over time.

We’re already seeing the consequences: authors pulling their own books from publication for excruciatingly trivial offenses; professionals being fired for sharing objective research that supports unhappy findings. But the future consequences will be unseen: the important medical studies that aren’t conducted; the bold art that isn’t created; the policy failures that can’t be named, much less halted. From this vantage point, the future looks bleak, the province of the anodyne and ineffectual.

Censorship has been outsourced to private actors

Much like the social surveillance system under which we live (voluntarily, it must be said), the modern thought police regime is not a product of the state. Censorship has been outsourced to private companies and zealous volunteers, who are themselves often exercising their free speech rights in the course of policing others’ speech. From a legal standpoint, this is of course distinct from government censorship, and therefore not a First Amendment issue. No one has a right to a subreddit or a Twitter handle or a New York Times op-ed.

Yet it would be a mistake to say that these companies and individuals don’t or can’t pose a threat to free speech in the broader, cultural sense. To do so, you would have to ignore the market power of the relatively few actors that control the channels of speech in the modern era. The collapse of local media and the consolidation of firms within the industry, for example, have endowed the remaining actors with the power to filter the coverage of events and viewpoints that millions of Americans are exposed to. Do you trust them not to use it?

Over half of Americans get their news through Facebook, which is known to have manipulated users’ feeds to alter their emotions. Some 80% of search engine traffic flows through Google, home to famously opinionated, activist employees. About a quarter of journalists turn to Twitter—the use of which at least one study has shown to affect journalists’ judgments of newsworthiness—as their primary news source. The case of social media platforms and search engines is particularly illustrative: while they are private actors that users engage with of their own volition, network effects are built into their business models, meaning that once established, they’re not as vulnerable to competition as other businesses and products are.

These companies are well within their legal rights to create their own policies and remove content that violates them, to algorithmically promote or suppress content on their properties, or to flag content as misinformation if they deem it so. But to deny that in doing so they might chill, stifle, or otherwise impact free expression is fanciful.

There are no easy fixes

Part of the irony of this problem is that addressing it in the most straightforward way (through policy or regulation) would actually represent a huge step in the wrong direction. Do I worry about the market power of companies that control the modern channels of speech? Yes, especially given the political power dynamics at play in many of our most powerful institutions. Do I think media polarization is dangerous and bad? You bet. But maintaining the independence of private actors, and thus the core of the First Amendment, is more important than the pursuit of an elusive unbiased public sphere.

That’s fine, because this isn’t a policy problem. It’s a cultural problem, and it requires a cultural solution: a revival of free speech culture and the virtues upon which it rests. We need to check our instincts to banish things we don’t like, and we need to voice our skepticism of those who over-rely on the power of censorship.**** (It would probably also be a good idea for individuals to rethink how they use the internet.)

I know this is a lot to ask, especially under the conditions of the digital age. But I have hope. Cultural free speech is a core American value and a key component of life in a pluralistic society. If anyone is going to defend it, it will be us.

Notes

* When I started writing this piece (about a week ago), Uyghur oppression was the most relevant example of Chinese human rights violations. By the time it was published, that had changed.

** This is one of those American ideals that has certainly never been implemented or enjoyed uniformly. As sociology professor Rod Graham points out, for a long time, you could risk losing your job and destroying your personal life by coming out as gay, for example. So while the tone of this piece is somewhat pessimistic about the state of modern free speech, I think it’s important to note that in a lot of ways, things have improved.

***  I should have also brought up that sometimes, as in many of the “Karen” videos going around, this social surveillance system is quite literally weaponized. There are incentives in place to do so—mainly the promise of money and virality for the poster.

****  There’s always going to be an Overton window; I don’t mean to suggest it could be any other way. That’s just part of living in a society.

Thoughts on Marc Andreessen’s IT’S TIME TO BUILD

Way way back in April of 2020, a venture capitalist named Marc Andreessen wrote an all-caps exhortation to western (particularly American) institutions and individuals: IT’S TIME TO BUILD. It’s a quick read, so I do recommend it. If that’s out of the question, you can get the gist from the opening paragraphs:

Every Western institution was unprepared for the coronavirus pandemic, despite many prior warnings. This monumental failure of institutional effectiveness will reverberate for the rest of the decade, but it’s not too early to ask why, and what we need to do about it.

Many of us would like to pin the cause on one political party or another, on one government or another. But the harsh reality is that it all failed — no Western country, or state, or city was prepared — and despite hard work and often extraordinary sacrifice by many people within these institutions. So the problem runs deeper than your favorite political opponent or your home nation.

Part of the problem is clearly foresight, a failure of imagination. But the other part of the problem is what we didn’t *do* in advance, and what we’re failing to do now. And that is a failure of action, and specifically our widespread inability to *build*.

We see this today with the things we urgently need but don’t have. We don’t have enough coronavirus tests, or test materials — including, amazingly, cotton swabs and common reagents. We don’t have enough ventilators, negative pressure rooms, and ICU beds. And we don’t have enough surgical masks, eye shields, and medical gowns — as I write this, New York City has put out a desperate call for rain ponchos to be used as medical gowns. Rain ponchos! In 2020! In America!

Marc Andreessen, “IT’S TIME TO BUILD”

Andreessen’s blog post is very good, even if it’s mostly an extended rallying cry. I think it was also very timely, as it alludes to a few subtextual themes I’m seeing come up more and more in politics:

  1. The US economy is increasingly concerned with rent extraction and distribution as opposed to genuinely productive economic activity, the latter having been off-shored to a great extent. The dollars-and-cents economic benefits of doing so aren’t really up for debate, but in social and political terms, the trade-off is looking less appealing these days. Prediction: interest in industrial policy is going to (continue to) increase among the right and possibly the left.
  2. Proceeding from a default assumption of capital scarcity is maybe not a smart way to make policy anymore. We are awash in money and not averse to printing more or deficit spending when the mood strikes. Obviously there’s a limit to how long you can get away with stuff like that, but if we can fight endless wars perhaps we can also fix a few roads.
  3. Maybe democracy is the problem? Others responded to Andreessen’s blog post by pointing out that there are political impediments to building as aggressively as Andreessen would like. Vox’s editor in chief, Ezra Klein, writes that American institutions, public and private, have become “vetocracies,” meaning that they’re biased against action instead of in its favor. Similarly, Steven Buss notes in Exponents Magazine that entrenched interests have captured regulators, making building, in many cases, illegal. Homeowners, for example, are hostile to development and form a powerful local political constituency.

    The thing is… isn’t this basically just policymakers being tuned into the desires of their constituents—or at least those inclined to make their voices heard? The only people who care enough to show up at a zoning meeting are the homeowners who don’t want the high-rise going in across the street. Professions lobby to be licensed so as to increase their income and limit competition, but members of the public generally don’t care enough to show up at the state house with a pitchfork.

    This is just the way it’s going to be, so maybe the answer is a system that doesn’t particularly care what its constituents have to say—or at least cares less in areas prone to regulatory capture.
  4. Finally, America’s ailments extend beyond the realms of economics and technocratic governance. Ours is a crisis of imagination, spirit, and mythology, exacerbated by the collapse of social capital across much of the nation. Consider the following anecdote¹:

    In 1869, a businessman named George Atwater set out to install a network of rails throughout the city of Springfield, MA—from where I write presently—on which horses would pull carriages, a pre-electric trolley system. It seemed like such a ridiculous idea that the board of aldermen laughed as they gave him permission and mocked him with an “initial investment” of eleven cents.

    Atwater built it anyway, and it turned out to be a huge success, expanding throughout the city and surpassing an annual ridership of 1 million by 1883. In 1890, less than a decade after the first electric power stations were built, the Springfield rail system began electrifying routes. By the next summer, all lines had been converted from horse to electric power. By 1904, ridership was 19 million; by 1916 it was 44 million.

    All of this—bold, successful investment in infrastructure, the rapid adoption of new technology, reliable and profitable public transportation—is technically possible today, yet this story could never take place in 2020. The aldermen would have dragged their feet, insisted on handouts to favored constituencies, and requested a handful of impact studies. Atwater would have stuck his investment in the stock market. The story would not have taken place here, because Springfield, like many former manufacturing cities, is in many ways a husk of its formerly productive self. Atwater would have lived in San Francisco, Boston, or New York.

Andreessen is right. It’s time to build. But let’s go broader than that: It’s time for a general return to alacrity in the public and private spheres, particularly for those of us who don’t live in one of the nexuses of the new economy. It’s time to rebuild social capital. It’s time to turn off autopilot.

Let’s fucking go.

###

  1. I came across this story in Lost Springfield, a local history book by Derek Strahan, who blogs at lostnewengland.com. I really enjoyed the book, so if you’re interested in the region’s history, I’d check out Strahan’s work.

Armchair Psych: Why Elizabeth Warren’s Loss Inspires “Fury and Grief”

You may have noticed that Senator Elizabeth Warren has suspended her campaign for president after a disappointing showing in the Democratic primary. You may have also noticed that some people are very upset.

I do believe Warren’s loss is particularly painful for her supporters — and not just those in the media. I’m going to bend my rule about not discussing electoral politics on this blog so I can offer an armchair psychologist take on why Elizabeth Warren’s defeat has inspired such “fury and grief.” The usual disclaimers apply, probably more than usual.

#Goals

The first piece of the puzzle is that Warren’s supporters strongly identify with and admire her. Unlike, say, Bernie Sanders or Joe Biden, who are both personally wealthy and powerful but enjoy substantial support from the middle and lower classes, Elizabeth Warren actually has a lot in common with her supporters: white, highly educated professionals.

She is like them, only more so: a veteran of prestigious institutions from Boston to D.C., impeccably credentialed and accomplished, with grandchildren and a $12 million net worth to boot. She is “having it all” made flesh, an avatar of success. This encourages supporters to project themselves onto Warren. Their parasocial relationship makes her loss harder to deal with because it feels like a personal rejection, and in some ways, it is.

Technocracy and its true believers

Understanding Warren and her supporters as ideological technocrats is essential to making sense of their dismay at her poor performance. A technocrat’s authority is legitimized by displaying expertise, of which Warren did plenty. Her frequent allusions to her competence and preparedness — she has “a plan for that!” — are a straightforward appeal to technocratic ethos.

But raw displays of expertise are not the only route toward technocratic legitimacy, and indeed, few will have the occasion to put forth arcane “plans” to remake society and be taken seriously (though that is the dream). Expertise and the authority it grants can also be obtained through association with prestigious institutions.

Within these places, advancement, evaluation, and remuneration of personnel are typically formulaic matters. (For an example, check out the salary schedule for foreign service officers. Another is how public school teachers’ salaries are calculated.) This is a superficial gesture to the ideals of fairness and objectivity. The impersonality and aversion to qualitative data it necessitates are regarded as features, not bugs, of bureaucracy.

The reason this is important isn’t because Elizabeth Warren spent her career in such places. It’s because her supporters have too. These ideas are not only intuitive to them, they are fundamental, ethical truths. Elizabeth Warren deserves the job. She spent a lifetime earning it.

Whether or not the world should work this way is an open question. But to convince yourself this is the only morally valid way it could work, as Warren supporters seem to have, is an error in judgment. In practical terms, it’s a really poor model for understanding how actual voters make decisions about political leadership in a democracy. Presidential hopefuls wouldn’t be subjecting themselves to the Iowa State Fair if the election could be decided by resume-scanning software.

The technocratic path and its costs

The technocratic path to power is not merely a career plan; it’s a full-blown ideology with ideas about what’s valuable, what constitutes a good life, and who deserves what. But underneath it lies the universal, deeply held human desire for esteem.

Recall that Elizabeth Warren is an aspirational figure for the upper-middle class, which defines itself by its intellect and is preoccupied with the markings thereof. Her path is their path, the Gramscian march that culminates in power and respect. Her decisive failure to obtain their platonic form, then, calls into question the legitimacy of the rules they’ve been playing by and the immense sacrifices doing so requires.

If giving years of your life and hundreds of thousands of dollars to the machine doesn’t buy you unquestioned esteem, what’s the point? You could have relaxed more, could have taken a job that actually paid and bought a house. You could have just pulled a Scott Alexander and exorcised your passions in a blog! But time only goes forward, so the present and future have to justify the past.

Status quo bias and sexism

Last item of note. The portrayal of Warren’s sound defeat as sexism is as predictable as it is unfalsifiable. But for the purposes of this blog post, we’re not really interested in whether it’s true so much as in the idea that her supporters want it to be true.

As I see it, sexism provides the least challenging explanation for her failure, not intellectually — it requires some serious mental gymnastics to fit her third-place finish in Massachusetts (among Democratic women!) into that narrative — but personally and philosophically.

If someone is rejected for their immutable qualities, those doing the rejecting can be safely dismissed as bigots, and their opinions need not be taken too seriously. It’s not me; it’s you. The rejection of an ethos is different, because it’s not a repudiation of what you happen to be but rather what you choose to be. It’s harder for Warren supporters to swallow because they share her convictions, part of which is that they possess The Truth (which is why they should be in charge of policy and journalism and academia and human resources and…). This is like the ultimate public repudiation of that.

But most of all, the cure for sexism — no doubt some combination of activism; advocacy; TED Talks; and a ubiquitous, memetic media campaign — requires no change on their part. They’re already doing these things; in fact, there are entire industries and departments, staffed by the Warren demographic, devoted to these endeavors. Insofar as their daily lives are concerned, doubling down on sexism being the problem is activism against change.

“IRL Impressions”

I have, perhaps belatedly, entered the point in life at which I no longer have standing weekend plans to drink with friends. Not coincidentally, I’ve been doing more in the way of the contemplative and outdoorsy. A few weekends ago, my girlfriend and I went for a hike on Monument Mountain in Great Barrington, Massachusetts. (Credit where it’s due: we got the idea from MassLive’s list of the best hikes in Massachusetts. We’ve tepidly declared our intention to hit them all this spring and summer.)

We opted for the mountain’s most direct route, looking for a challenge. But despite being advertised as strenuous, the trail was mostly tame, its steepest segments eased by the installation of stone steps. We had a pleasant, if not effortless, ascent, punctuated by this or that detour to examine our surroundings.

After an hour or so, we reached the summit. Monument Mountain isn’t very tall. At a little over 500 meters, it’s about half the height of Massachusetts’ tallest peak, Mount Greylock. But that bit of relativity is less salient when you’re looking down on soaring hawks and the slow-motion lives of the humans below.

It was a beautiful day, and we weren’t the only ones out. In the background, two young women were taking turns photographing each other for the ‘Gram, the perfect receipt of an afternoon well spent. “I’m gonna do some sunglasses on, then do some sunglasses off,” one said. I immediately wrote down the quote in a text to Megan, who laughed.

Every now and again, something like this makes me think about the growing importance of online media — in business, culture, love, politics, and other areas of life. Social media is a mixed bag, but the advantages of scale it offers are pretty much uncontested. I wonder if we’ll reach a point at which the significance of online life — where 10 million can attend a concert and content can be redistributed in perpetuity at low marginal costs — eclipses that of our offline lives. If awareness is a necessary component of significance, it’s hard to see how it wouldn’t.

A few months ago, my company hired a consultant to help us attract sponsorship for an event. As part of their information-gathering, the consultant asked us what the “IRL impressions” of the event would be — a term that mimics social media analytics and that both parties eventually decided probably meant “attendance.” This struck me as at once amusing and depressing: the internet’s centrality is such that it must now be specified when we’re talking about the real — no, physical — world.

What Colleges Sell (continued)

I’m obviously not one to prioritize quantity when it comes to writing. Counting this one, I’ve written four blog posts this year — not great for a guy whose New Year’s resolution set the pace at two per month. Even less so when you consider that half of them have now been follow-up posts.

However, there was some interesting Facebook discussion on my last post that I felt merited some elucidation here, where those who don’t follow me on social media can digest it. (I won’t ask anyone to follow on social, but to those of you who are here via social media, you should subscribe to get these posts by email.) I’m also working on something else that’s a bit involved, and I thought this would be a good stopgap.

As loyal readers are aware, my last post touched on the college-admissions scandal and the cultural legwork being done by our vision of education as a transformative asset.

Elite colleges sell these ideas back to us by marketing education as a transformative experience, an extrinsic asset to be wielded. In an unequal society, this is a particularly comforting message, because it implies:

  1. The world works on meritocracy. High-status individuals not only are better than most, they became so through efforts the rest of us can replicate.
  2. We can achieve equality of outcomes with sufficient resources. This has the added bonus of perpetuating the demand for high-end education.

An observation I couldn’t figure out how to work in is that getting into elite colleges seems by far the hardest part of graduating from them. Admissions is, after all, the part of the process the accused parents were cheating, and to my knowledge, none of the students involved were in danger of failing out, despite having been let in under false pretenses.

The low bar for good grades at elite colleges, the “Harvard A,”¹ is so widely acknowledged that to call it an open secret would be misleading.² Stuart Rojstaczer, the author of gradeinflation.com, documents two distinct periods of grade inflation in the last 50 years: the Vietnam War era, in which men who flunked out would likely be sent off to fight an unpopular war, and the “Student as a Consumer” era of today.

The transition to the latter has meant a change in teaching philosophy and an increased centrality of the admissions process. On his website, Mr Rojstaczer quotes a former University of Wisconsin Chancellor as saying, “Today, our attitude is we do our screening of students at the time of admission. Once students have been admitted, we have said to them, ‘You have what it takes to succeed.’ Then it’s our job to help them succeed.” (Emphasis mine.)

This is consistent with my not-so-between-the-lines theorizing that the later-in-life achievements of elite college grads are mostly attributable to selection effects, not education. It turns out this was studied by Alan Krueger and Stacy Dale, who found salary differences between elite college graduates and those who applied to elite schools but didn’t attend were “generally indistinguishable from zero.”

Of course, this is kind of depressing, because if good schools don’t make “winners,” but rather attract and rebrand them, then it’s a lot easier to attribute their graduates’ success to factors that are not only beyond their control but for which there are likely no or few policy levers — genetics, culture, family structure, and others.

I think this is an unwelcome conclusion to the point that even incontrovertible evidence — whatever that would look like — would be ignored or stigmatized by polite society. Most people probably agree that public policy should keep away from these areas of life.³

Regardless, I think we should be more honest with ourselves about our obsession with elite schools and our expectations of education more generally.

*

Footnotes:

  1. In case you don’t feel like clicking the link: In 2013, Harvard’s dean revealed the median grade awarded at the school to be an A-, while the most common grade given was a straight A.
  2. Though apparently to a lesser degree, this has been the case at four-year colleges across the board, not just top-tier private ones.
  3. Then again, maybe they don’t. A recent survey of over 400 US adults found “nontrivial” levels of support for eugenic policies among the public, increasing with the belief that various traits — intelligence, poverty, and criminality — are heritable and also associated with attitudes held by the respondent about the group in question. The questions in the study were framed as support for policies that would encourage or discourage people with particular traits to have more or fewer children. (If you have 10 minutes, read the study, freely accessible at slatestarcodex. Also good: Scott Alexander’s piece on social censorship, in which the aforementioned paper is linked.)

What Colleges Sell

The recent college-admissions scandal has me, and probably many of you, thinking about the institutional power of elite colleges. It’s remarkable that even those we would consider society’s “winners” aren’t immune to their pull. Take for example Olivia Giannulli, who is from a wealthy family; has nearly 2 million YouTube followers; owns a successful cosmetics line (pre-scandal, anyway); and whose parents, Lori Loughlin and Mossimo Giannulli, allegedly paid $500,000 to get her and her sister accepted to USC.

Why?

The standard line is that the point of college is to learn. Getting into a better school avails one of better information, which translates into more marketable skills—human capital accrual, in economics jargon. The many deficiencies of this view have birthed the somewhat-cynical “signaling theory”: the idea that college degrees serve mainly as signals to employers of positive, pre-existing characteristics like intelligence or attention to detail.

Signaling theory is powerfully convincing, but it doesn’t fully explain the insanity endemic to the elite college scene. There’s more going on at the individual, familial, and societal levels.

First, the individual. If human capital isn’t the point, social capital could be. The student bodies of elite schools are well curated for networking among the intelligent, the wealthy, and what we might call the “legacy crowd”—non-mutually exclusive groups that mutually benefit from this four-year mixer. Who you sit next to in class might matter more than what’s being taught.

Colleges, particularly those of renown, provide a sense of unabashed community that is in short supply elsewhere in American life. If you read universities’ marketing or speak with admissions staff, this is often a selling point. The idea that former classmates and fraternity brothers become a nepotistic social network post-graduation is intuitive, and probably a very compelling reason to attend a particular school.¹

What’s true for the individual is true for the family. Parents want the best for their children, and they know the kinds of doors attending the right school will open. But for parents, there are added elements at stake: self- and peer-appraisal.² That is, as educational attainment has become accepted not only as a means to but validation of social mobility, parents have come to define their success by the institutions their children attend. YouGov polling found that thirty-four percent of parents would pay a college prep organization to take a college admittance test on their child’s behalf. One in four would pay college officials to get their child into a good school.

I’d bet this is an understatement caused by social-desirability bias.³

Last up, and most interesting, is society at large. Even though most of us won’t attend a very prestigious university, if we attend one at all, the legitimacy of those institutions still rests on our perception. For us to be bought in, we need a culturally acceptable premise for the power enjoyed by Harvard, Yale, and the like—a role that can’t be filled by the networking and status-driven benefits I’ve described so far. This brings us full circle, back to the idea of higher education as a method of information conveyance.

Though the human capital accrual theory of education is probably bunk, most people’s belief in it feels sincere. In my view, this is the confluence of three phenomena: observed correlations between educational attainment and positive outcomes, our cultural commitments to self-sufficiency and equal opportunity, and a mostly unstated but potent desire to manufacture equality of outcomes.

Elite colleges sell these ideas back to us by marketing education as a transformative experience, an extrinsic asset to be wielded. In an unequal society, this is a particularly comforting message, because it implies:

  1. The world works on meritocracy. High-status individuals not only are better than most, they became so through efforts the rest of us can replicate.
  2. We can achieve equality of outcomes with sufficient resources. This has the added bonus of perpetuating the demand for high-end education.

The meritocratic, knowledge-driven higher education model is a product we’re all happy to buy because we like what it says about us. Its violation is disillusioning on a societal level, hence the disproportionate outrage created by a scandal involving some 50 students.

Perhaps this is an opportunity to reexamine our relationship with and expectations of the upper echelons of higher education. If we find signaling theory compelling, and I personally do, shouldn’t a society committed to equality of opportunity and social mobility seek to marginalize, rather than fetishize, the institutional power of these universities?

Somewhat more darkly, we should ask ourselves if our belief in the transformative power of education might not be the product of our collective willing ignorance—a noble lie we tell ourselves to avoid confronting problems to which we have few or no solutions. If pre-existing traits—innate intelligence, social connections, wealth, and others—most accurately explain one’s success, what of the increasingly selective institutions that facilitate their convergence?

*

Footnotes:

  1. Though I’ve heard plenty of anecdotal claims to this effect (including from an admissions officer during a grad school interview), I don’t have any hard proof. If one of you knows of such a study, point me in the right direction.
  2. I just wanted to note that this feels very in line with the general trend of wealthier people having fewer children but spending an enormous amount of resources to give them even very marginal advantages.
  3. This is when people respond to polls in the ways they think are more likely to be viewed favorably by others. Basically, people under-report bad behavior (maybe using drugs or committing crimes) and over-report good behavior (like voting).

Business Is Getting Political—and Personal

As anyone reading this blog is undoubtedly aware, Sarah Huckabee Sanders, the current White House Press Secretary, was asked last month to leave a restaurant by its owner, who explained that she and her staff felt a moral imperative to refuse service to a member of the Trump administration. The incident, and the ensuing turmoil, highlights the extent to which business has become another political battleground—a concept that makes many anxious.

Whether or not businesses should take on political and social responsibilities is a fraught question—but not a new one. Writing for the New York Times in 1970, Milton Friedman famously argued that businesses should avoid the temptation to go out of their way to be socially responsible and instead focus on maximizing profits within the legal and ethical framework erected by government and society. To act otherwise at the expense of profitability, he reasoned, is to spend other people’s money—that of shareholders, employees, or customers—robbing them of their agency.

Though nearing fifty years of age, much of Milton Friedman’s windily and aptly titled essay, “The Social Responsibility of Business Is to Increase Its Profits,” feels like it could have been written today. Many of the hypotheticals he cites of corporate social responsibility—“providing employment, eliminating discrimination, avoiding pollution”—are charmingly relevant in the era of automation anxiety, BDS, and one-star campaigns. His solution, that businesses sidestep the whole mess, focus on what they do best, and play by the rules set forth by the public, is elegant and simple—and increasingly untenable.

One reason for this is that businesses and the governments Friedman imagined would rein them in have grown much closer, even as the latter have grown comparatively weaker. In sharp contrast to the get-government-out-of-business attitude that prevailed in the boardrooms of the 1970s, modern industry groups collectively spend hundreds of millions to get the ears of lawmakers, hoping to obtain favorable legislation or stave off laws that would hurt them. Corporate (and other) lobbyists are known to write and edit bills, sometimes word for word.

You could convincingly argue that this is done in pursuit of profit: Boeing, for example, spent $17 million lobbying federal politicians in 2016 and received $20 million in federal subsidies the same year. As of a 2014 report by Good Jobs First, an organization that tracks corporate subsidies, Boeing had received over $13 billion of subsidies and loans from various levels of government. Nevertheless, this is wildly divergent from Friedman’s idea of business as an adherent to, not architect of, policy.

As business has influenced policy, so too have politics made their mark on business. Far more so than in the past, today’s customers expect brands to take stands on social and political issues. A report by Edelman, a global communications firm, finds a whopping 60% of American Millennials (and 30% of consumers worldwide) are “belief-driven” buyers.

This, the report states, is the new normal for businesses—like it or not. Brands that refrain from speaking out on social and political issues now increasingly risk consumer indifference, which, I am assured by the finest minds in marketing, is not good. In an age of growing polarization, every purchase is becoming a political act. Of course, when you take a stand on a controversial issue, you also risk alienating people who think you’re wrong: 57% of consumers now say they will buy or boycott a brand based on its position on an issue.

This isn’t limited to merely how corporations talk. Firms are under increasing social pressure to hire diversity officers, change where they do business, and reduce their environmental impact, among other things. According to a 2017 KPMG survey on corporate social responsibility, 90% of the world’s largest companies now publish reports on their non-business responsibilities. This reporting rate, the survey says, is being driven by pressure from investors and government regulators alike.

It turns out that a well marketed stance on social responsibility can be a powerful recruiting tool. A 2003 study by the Stanford Graduate School of Business found 90% of graduating MBAs in the United States and Europe prioritize working for organizations committed to social responsibility. Often, these social objectives can be met in ways that employees enjoy: for example, cutting a company’s carbon footprint by letting employees work from home.

In light of all this, the choice between social and political responsibility and profitability seems something of a false dichotomy. The stakes are too high now for corporations to sit on the sidelines of policy, politics, and society, and businesses increasingly find themselves taking on such responsibilities in pursuit of profitability. Whether that’s good or bad is up for debate. But as businesses have grown more powerful and felt the need to transcend their formerly transactional relationships with consumers, it seems to be the new way of things.

Is College Worth It?

It’s a query that would have been unthinkable a generation or two ago. College was once – and in fairness, to a large extent, still is – viewed as a path to the middle class and a cultural rite of passage. But those assumptions are, on many fronts, being challenged. Radical changes on the cost and benefit sides of the equation have thrown the once axiomatic value of higher education into question.

Let’s talk about money first. It’s no secret that the price of a degree has climbed rapidly in recent decades. Between 1985 and 2015, the average cost of attending a four-year institution increased by 120 percent, according to data compiled by the National Center for Education Statistics, putting it in the neighborhood of $25,000 per year – a figure pushing 40 percent of the median income.

That increase has left students taking more and bigger loans to pay for their educations. According to ValuePenguin, a company that helps consumers understand financial decisions, between 2004 and 2014, the number of student loan borrowers and their average balance increased by 90 percent and 80 percent, respectively. Among the under-thirty crowd, 53 percent of those with a bachelor’s degree or higher now report carrying student debt.

Then there’s time to consider. Optimistically, a bachelor’s degree can be obtained after four years of study. For the minority of students who manage this increasingly rare feat, that’s still a hefty investment: time spent on campus can’t be spent doing other things, like work, travel, or even just enjoying the twilight of youth.

And for all the money and time students are sinking into their post-secondary educations, it’s not exactly clear they’re getting a good deal – whether gauged by future earnings or the measurable acquisition of knowledge. Consider the former: While there is a well acknowledged “college wage premium,” the forces powering it are up for debate. A Pew Research Center report from 2014 shows the growing disparity to be less a product of the rising value of a college diploma than the cratering value of a high school diploma. The same report notes that while the percentage of degree-holders aged 25-32 has soared since the Silent Generation, median earnings for full-time workers of that cohort have more or less stagnated across the same time period.

Meanwhile, some economists contend that to whatever extent the wage premium exists, it’s impossible to attribute to college education itself. Since the people most likely to be successful are also the most likely to go to college, we can’t know to what extent a diploma is a cause or consequence of what made them successful.

In fact, some believe the real purpose of formal education isn’t so much to learn as to display to employers that a degree-holder possesses the attributes that correlate with success, a process known as signaling. As George Mason Professor of Economics (and noted higher-ed skeptic) Bryan Caplan has pointed out, much of what students learn, when they learn anything, isn’t relevant to the real world. Professor Caplan thinks students are wise to the true value of a degree, which could explain why almost no student ever audits a class, why students spend about 14 hours a week studying, and why two-thirds of students fail to leave university proficient in reading.

Having spent the last 550-ish words bashing graduates and calling into question the legitimacy of the financial returns on a degree, I should probably answer the obvious question: am I saying college really isn’t worth your time and money? While I’d love to end it here and now with a hot take like that, the truth is it’s a really complicated, personal question, and I can’t give a definitive answer. What I can offer are some prompts that might help someone considering college to make that choice for themself, based on things I wish I’d known before heading off to school.

  • College graduates fare better on average by many metrics. Even if costs of attendance are rising, they still have to be weighed against the potential benefits. Income, unemployment, retirement benefits, and health care: those with a degree really do fare better. Even if we can’t be sure of the direction or extent to which this relationship is causal, one could reasonably conclude the benefits are worth the uncertainty.
  • Credentialism might not be fair, but it’s real. Plenty of employers use education level as a proxy for job performance. If the signaling theory really is accurate, the students who pursue a degree without bogging themselves down with pointless knowledge are acting rationally. As Professor Caplan points out in what seems to be a protracted, nerdy online feud with Bloomberg View’s Noah Smith, the decision to attend school isn’t made in a cultural vacuum. Sometimes, there are real benefits to conformity – in this case, getting a prospective employer to give you a shot at an interview. Despite my having never worked as a sociologist (alas!), my degree has probably opened more than a few doors for me.
  • What and where you study are important. Some degrees have markedly higher returns than others, and if money is part of the consideration (and I hope it would be), students owe it to themselves to research this stuff beforehand.
  • For the love of god, if you’re taking loans, know how compound interest works. A younger, more ignorant version of myself once thought I could pay my loans off in a few years. How did I reach this improbable conclusion? I conveniently ignored the fact that interest on my loans would compound. Debt can be a real bummer. It can keep you tethered to things you might prefer to change, say a job or location, and it makes saving a challenge. (There’s a short sketch of the math just after this list.)
  • Relatedly, be familiar with the economic concept of opportunity cost. In short, this just means that time and money spent on one thing can’t be spent on another. To calculate the “economic cost” of college, students have to include the money they could have made by working for those four years. If we conservatively put this number at $25,000 per year, that means they should add $100,000 in lost wages to the other costs of attending college (less if they work during the school year and summer).
  • Alternatives to the traditional four-year path are emerging. Online classes, some of which are offering credentials of their own, are gaining popularity. If they’re able to gain enough repute among employers and other institutions, they might be able to provide a cheaper alternative for credentialing the masses. Community colleges are also presenting themselves as a viable option for those looking to save money, an option increasingly popular among middle class families.
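
Since I brought up compound interest, here’s a minimal sketch of the math, in Python, with made-up numbers. It assumes a balance that compounds monthly against a fixed payment; real student loans vary (many US federal loans accrue simple daily interest that only compounds when it capitalizes), so treat it as an illustration rather than a loan calculator.

```python
# Minimal sketch: loan payoff under monthly compounding.
# All figures here are hypothetical; real loan terms differ.

def months_to_payoff(principal, annual_rate, payment):
    """Count the months until the balance reaches zero."""
    monthly_rate = annual_rate / 12
    if payment <= principal * monthly_rate:
        raise ValueError("payment never outpaces the interest")
    balance, months = principal, 0
    while balance > 0:
        # Interest accrues on the remaining balance, then the payment lands.
        balance = balance * (1 + monthly_rate) - payment
        months += 1
    return months

# Hypothetical: $30,000 at 6% APR, paying $350/month -> 113 months, about 9.4 years.
print(months_to_payoff(30_000, 0.06, 350))
```

The younger me who planned to be debt-free in “a few years” was, in effect, assuming that interest term away.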

There’s certainly more to consider, but I think the most important thing is that prospective students take time to consider the decision and not simply take it on faith that higher education is the right move for everyone. After all, we’re talking about a huge investment of time and money.

A different version of this article was published on Merion West.

Ben Carson’s Tragically Mundane Scandal

Whatever else it might accomplish, President Donald Trump’s administration has surely earned its place in history for laying to rest the myth of Republican fiscal prudence. Be they the tax dollars of today’s citizens or tomorrow’s, high ranking officials within Mr. Trump’s White House seem to have no qualms about spending them.

The latest in a long series of questionable expenses is, of course, none other than Department of Housing and Urban Development Secretary Ben Carson’s now infamous $31,000 dining set, first reported on by the New York Times.¹ Since the Times broke the story, Mr. Carson has attempted to cancel the order, having come under public scrutiny for what many understandably deem to be an overly lavish expenditure on the public dime.

At first blush, Mr. Carson’s act is egregious. As the head of HUD, he has a proposed $41 billion of taxpayer money at his disposal. Such frivolous and seemingly self-aggrandizing spending undermines public trust in his ability to use taxpayer funds wisely and invites accusations of corruption. It certainly doesn’t help the narrative that, as some liberals have noted with derision, this scandal coincides with the proposal of significant cuts to the department’s budget.

But the more I think about it, the more I’m puzzled as to why people are so worked up about this.

Let me be clear: this certainly isn’t a good look for the Secretary of an anti-poverty department with a shrinking budget, and it’s justifiable that people are irritated. At a little more than half the median annual wage, most of us would consider $31,000 an absurd sum to spend on dining room furniture. The money that pays for it does indeed come from private citizens who would probably have chosen not to buy Mr. Carson a new dining set with it.

And yet, in the realm of government waste, that amount is practically nothing.

Government has a long, and occasionally humorous, history of odd and inefficient spending.

Sometimes, it can fly under the radar simply by virtue of being bizarre. Last year, for example, the federal government spent $30,000 in the form of a National Endowment for the Arts grant to recreate William Shakespeare’s play “Hamlet” – with a cast of dogs. Other times, the purchase at hand is too unfamiliar to the public to spark outrage. In 2016, the federal government spent $1.04 billion expanding trolley service a grand total of 10.92 miles in San Diego: an average cost of roughly $95 million per mile.

Both of those put Mr. Carson’s $31,000 dining set in a bit of perspective. It is neither as ridiculous as the play nor as great in magnitude as the trolley. So why didn’t either of those incidents receive the kind of public ire he is contending with now?

The mundanity of Mr. Carson’s purchase probably hurts him in this regard. Not many of us feel informed enough to opine on the kind of money one should spend building ten miles of trolley track, but most of us have bought a chair or table. That reference point puts things in perspective and allows room for an emotional response. It’s also likely this outrage is more than a little tied to the President’s unpopularity.

Ironically, the relatively small amount of money spent might also contribute to this effect. When amounts get large enough, like a billion dollars, we tend to lose perspective – what’s a couple million here or there? But $31,000 is an amount we can conceptualize.

So it’s possible that we’re blowing this a little out of proportion for reasons that are more emotional than logical. But I still think the issue is a legitimate one that deserves more public attention than it usually gets, and it would be interesting if the public were able to apply this kind of pressure to other instances of goofy spending. Here’s hoping, anyway.

A version of this article originally appeared on Merion West.

1. I wrote this article the day before word broke that Secretary of the Interior Ryan Zinke had spent $139,000 upgrading the department’s doors.