Demographics Aren’t Destiny

Last June, the Census Bureau reported that the white-alone population declined by 0.2% in absolute terms between July 2016 and July 2017. Though it may seem trivial, this factoid carries immense significance for those on opposing sides of the culture wars, both of whom have taken it to herald the decline of white political significance and the rise of a more diverse, and therefore more liberal, electorate.

Frankly, there’s too much there to talk about in one blog post. Instead, I’d like to address an issue that’s bugged me for a long time:

In projection after projection showing a minority-white America, Hispanic members of each racial category are separated out and lumped into their own group, despite the racial diversity of Latin America. This matters because the rising tide of American diversity is mainly the result of a four-decade wave of immigration from Latin America and the high fertility rate of those immigrants’ descendants (though both forces have recently calmed). In 1960, 3.5% of the country identified as Hispanic or Latino. Nearly 60 years later, that figure has risen to 18%, with expectations that a quarter of the country will identify as Hispanic by 2065.

But disaggregating Hispanics from racial categories is inconsistent, not just with official Census convention—which designates Hispanic/Latino an ethnicity, a variable independent of race—but also with evidence suggesting that many Hispanics are assimilating more wholly into the white population.

This isn’t (just) a pedantic rant about Census data. I think there’s a solid argument to be made that we’re actually in the middle of an expansion, rather than contraction, of American whiteness.

Take, for starters, that a slim majority of American Hispanics already identify as white, at least when asked about their race on the Census. Nor does this seem to be a vestige of a more racially fraught time: per the 2010 Census, 53% of US Hispanics describe themselves as “white alone,” up from 48% in 2000.

US Hisp Race
Between 2000 and 2010, the share of US Hispanics identifying as white alone increased, while the proportion selecting “some other race” when asked to identify decreased.

Secondly, Hispanic identity seems to fade the further removed from immigration one is. According to the Pew Research Center’s 2015 National Survey of Latinos, all but 3% of foreign-born Americans with Hispanic ancestry identify as Hispanic or Latino. Among the second generation, the share who don’t so identify rises only slightly, to 8%. But by the third and fourth generations, it climbs rapidly, to 23% and 50%, respectively. The fade is more pronounced among younger cohorts.

All told, 11% of US adults with Hispanic ancestry do not identify as such. Because immigration has been replaced by native births as the main driver of US Hispanic population growth in the last few decades, it’s not unreasonable to expect this fraction of “non-Hispanics” to grow.

Hisp growth
Source: Based on Pew Research Center tabulations, Pew Research Center historical projections (Passel and Cohn, 2008).

Also worth considering are Hispanics’ growing geographical dispersion and high rate of intermarriage, especially among younger generations. Twenty-eight percent of 18- to 35-year-old US Hispanics are married to non-Hispanics. Again, this trend grows stronger the longer one’s family has been in the United States: nearly 60% of third-generation US Hispanics ages 18 to 35 are married to someone who isn’t Hispanic. Dispersing geographically and marrying outside one’s ethnicity seem, admittedly conjecturally, indicative of cultural assimilation.

It seems like Hispanics are following the arc of other (European and Levantine) immigrant groups who were once, and in some cases still are, considered outside the bounds of conventional whiteness. All of this is to say, I’m skeptical that the way Hispanics view themselves in 2018 is the way they will in 2050—especially as they become more enmeshed in mainstream American society.

Of course, this is just a prediction I’m making in my living room. I don’t have a crystal ball or any special insight into the minds of the American public. I’m going to end with some reasons things might not go as I imagine:

The textbook reason would probably be “politics,” with which race seems to have a bicausal relationship in America. It’s not hard to imagine the Republican Party alienating Hispanics with nativism while selling itself, intentionally or otherwise, as the party of White America. Similarly, Democrats’ ability to court Hispanics relies to some degree on the extent to which Hispanics feel shut out of the cultural and political mainstream. Both dynamics could push Hispanics to think of themselves as non-white more often.

Relatedly, the Office of Management and Budget could affect Hispanics’ racial identities through bureaucratic means. A few years ago, there was talk of combining the race and ethnicity questions, with “Hispanic” offered as a choice alongside Asian, black, white, etc. As I noted at the time, this might bring the Census questions more in line with the way Americans think about race today—but it would also be putting a thumb on the scales. There’s really no neutral position for the Census to take in this matter.

Anecdotally and finally, it also seems like the psychic benefits of whiteness have waned a lot over the last few decades—especially as regards low-status whites. Part of this owes to good news: cultural progress on matters of race, which has begun to erode the relatively elevated status enjoyed by whites at the expense of minorities. Other explanations are more sinister and reflect anomic decay in the white population: rising rates of suicide, drug overdoses, and voluntary unemployment. For one reason or another, whiteness no longer feels as enviable a club as it probably did in the 20th century when Italians, Jews, and other so-called “white ethnics” made the conscious effort to join its ranks.

Walking It Back

My last post was surprisingly popular—and not just among people who know me personally. I even managed to pick up a few new followers, who I’m afraid will be a bit put off when they discover that travel writing isn’t this blog’s usual fare (but hopefully not!).

Anyway, as you may or may not recall, the last post incorporated a graph of the distance I’d walked the days before, during, and after various legs of my trip through Italy:

miles walked

In the graph’s caption, I glibly blamed my apparent sedentarism on my office job and commute. I like to think of myself as a decently fit person, you see. Surely, I reasoned, my desk job must be impeding an otherwise active lifestyle. I mean, I have a standing desk—clearly I’m a man who values his physical fitness.

It occurred to me a few days later that my hypothesis was actually pretty testable: if work and commuting were really to blame, my weekends should be significantly more active (as measured by distance walked or run) than average. Apple has, for some reason, elected to make exporting health data from iPhones an incredibly difficult process. So, with the zeal of an intern, I manually entered 242 days’ worth of mileage to test my claim.

Looking back, my naiveté was almost cute. In the era of “binge-watching,” I really believed myself exceptional.

The raw data is pretty depressing. The mean daily distance walked is 1.54 miles. But the data is right-skewed, meaning outliers on the upper end of the distribution are pulling the mean higher. (The median daily distance over this period is a shockingly low 0.985 miles.) It’s also telling that the distribution isn’t bimodal, which would indicate two distinct populations—in the case of my hypothesis, weekdays and weekends.

Miles Walked (histogram and box plot)
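For the statistically curious, the skew diagnosis boils down to comparing the mean and the median. A minimal sketch, using made-up mileage values rather than my actual log:

```python
import statistics

# Hypothetical daily mileage: mostly short days plus a couple of long
# ones, mimicking the right skew described above (not the real data).
miles = [0.8, 0.9, 1.0, 1.0, 1.1, 0.7, 0.9, 1.2, 5.5, 7.8]

mean = statistics.mean(miles)
median = statistics.median(miles)

# In a right-skewed distribution, the mean exceeds the median.
print(f"mean={mean:.2f}, median={median:.2f}, right-skewed={mean > median}")
```

Just two long-walk days out of ten are enough to drag the mean to roughly double the median.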

I could have quit here, but I’ve touched on the importance of publishing negative results before and therefore had a cross to bear. To make the distribution more normal, I removed outliers (in this case, all values greater than 3.73 miles) and applied a square-root transformation:

Square Root Miles Walked, no outliers
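In code, that cleanup step looks something like this (again with placeholder values rather than my actual log; the 3.73-mile cutoff is the one used above):

```python
import math

# Placeholder daily mileage values (not the real data set).
miles = [0.8, 0.9, 1.0, 1.1, 1.2, 2.5, 3.5, 4.2, 6.0]

# Drop outliers above the 3.73-mile cutoff, then take square roots
# to pull in the long right tail.
trimmed = [m for m in miles if m <= 3.73]
transformed = [math.sqrt(m) for m in trimmed]

print(len(trimmed), [round(t, 2) for t in transformed])
```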

The means of our new, outlier-free population and the “weekend” sample (n=61) are, respectively, 1.023² miles and 1.046² miles, and the population standard deviation is 0.377² miles. At the 95% confidence level, the sample mean would have to be about 1.106² miles to be statistically significantly higher than the population average.
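For the curious, the arithmetic behind that threshold works out roughly as follows (a sketch assuming a one-tailed z-test, which is my reading of the setup; small rounding differences explain why it lands near, rather than exactly on, 1.106):

```python
import math

mu = 1.023     # population mean (in square-root miles)
sigma = 0.377  # population standard deviation (square-root miles)
n = 61         # "weekend" sample size
z = 1.645      # one-tailed critical z at the 95% confidence level

# Smallest sample mean that would be significantly above the population mean.
critical_mean = mu + z * sigma / math.sqrt(n)
print(round(critical_mean, 3))  # ~1.102
```

The observed weekend mean of 1.046 falls well short of that threshold.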

It is with great shame that I fail to reject the null hypothesis. And I do hereby humbly apologize to office life for blaming it for what is clearly a personal shortcoming.

A few caveats, in case my health insurance provider is reading:

  • I do exercise most days before work. But mostly pull-ups, lunges, and other anaerobic stuff. I only run sporadically—and when I do, I don’t always bring my phone with me.
  • I can’t vouch for the accuracy of the iPhone’s pedometer. Anecdotally, I’ve heard it isn’t great, and light research confirms it has trouble measuring steps under some common conditions, like being held or kept in a backpack.
  • The combination of the above suggests iPhone health data offers a convenient but incomplete metric to assess one’s activity. For example, July 31, a day my phone credits me with walking 4.7 miles, also happens to be a day I went for a 30-mile bike ride.
  • Including Fridays in the “weekend” sample raises the mean distance slightly, to 1.08 miles, but still not enough to achieve statistical significance.
  • Uh, I will try to do better.

Summer Vacation to Italy

We’re going to try something a little different with today’s post. Instead of a research piece, I’m just going to tell you about my and my girlfriend’s trip to Italy.

This idea was partly born of my growing distaste for social media—the blog post, that is, not the trip to Italy. Standard operating procedure when someone my age takes a trip is to upload photos to Facebook or its increasingly popular appendage, Instagram (filters and ironic captions appreciated but not required). High on endorphins, dehydrated, and possibly a little drunk somewhere in southern Italy, I hatched this quixotic act of rebellion: to post—no, upload—my photos to my own site, thereby subverting one of the day’s most powerful and opaque companies.

It’s stupid, but I’m sticking to it.

It also so happens that a few people have asked me for more detail about the trip than I care to provide in a comment or photo description. I’ll try my best to cover it all without going overboard.

*

Our trip began in Rome. We spent three days in the capital doing the obligatory sightseeing: we went to the Vatican Museums, the Sistine Chapel, the Roman Forum, the Colosseum, Palatine Hill, the Spanish Steps (twice), the Capuchin Crypt, the Da Vinci Experience, and some lesser attractions.

This involved lots of waiting in line, fending off potential tour guides (both legitimate and otherwise, I suspect), and, above all, walking. All told, we walked almost 25 miles in just over three days.

miles walked
Data courtesy of my iPhone. Sedentarism courtesy of my office job and commute.

That kind of tourism, seeing major attractions and waiting in lines, really isn’t my style. But I have to admit, it was worth it. It really wouldn’t have been right to go to Rome and do otherwise. Plus, I kind of have a thing for architecture.

Volumes have been written about the beauties of ancient Rome, so I won’t wax poetic about the Colosseum or Michelangelo’s greatest works. I will, however, say that the Capuchin Crypt, intricately decorated with the bones of thousands, including some children, is bananas. You should drop by if you’re in Rome. (They also don’t allow photos, but Megan was able to snag some while the attendant wasn’t looking.)

Beyond visiting the usual tourist attractions, we mostly ate and drank while in Rome. It’s a beautiful, humid city.

The next leg of our trip took us to Puglia. We used Lecce, a city of just under 100,000 people known for its baroque architecture, as a base from which we took day trips to towns on the Pugliese coast.

Mostly, though, we spent our time in the south on the beach. We went to Otranto first, a white city with a beautiful port and precise geometric architecture. We ate raw fish (actually, we did that just about everywhere we went), drank wine, and tempted fate by falling asleep on the beach without sunscreen.

Our next stop took us to Castrignano del Capo, a small town on the very edge of the Italian “heel”. The view of the turquoise Adriatic Sea is the kind of beautiful scene you hope for when you buy a plane ticket. Italians, we noticed, have an interesting take on what constitutes a beach. We followed the crowds to a porous, jagged slab of what I believe was volcanic rock. As I gingerly climbed across the “beach”—on all fours after my flip-flops broke—the Italians were pretty much treating it like sand, some lying directly on it. America has made us soft.

Heading north toward Bari, from which we would fly back to Boston, we stopped in Alberobello, a town tucked away in the mountains of Puglia. The town is famous for being full of trulli, medieval stone huts built collapsible and without mortar to aid in tax evasion. (Many of the ones we saw looked pretty permanent, though.)

Bari was the second-largest city we visited, after Rome. Unlike the capital, it doesn’t feel touristy—which I don’t mean in a particularly nice way. It was definitely safer and prettier than I was led to believe—neither the internet nor Italians from the region hold it in much regard—but not quite memorable.

The nearby town of Polignano a Mare was a different story. Of particular interest to me were the swimming spots, though we regrettably forgot to bring our suits. The town center is a beautiful labyrinth of whitewashed buildings, buzzing with amateur photographers, day-drinkers, and crowded gelato shops.

Oh, and while we were there, we ate dinner in a cave. It was a little nicer than it sounds—and yes, that is my (vastly inferior) reprisal of Tom Haverford’s espresso shot.

*

Other scattered thoughts from my trip:

  • Language: If you know other Romance languages, as I happen to, Italian is really easy to get the hang of. I decided to learn some a few weeks before we took off, and it was actually pretty helpful, especially in the provincial south where multilingualism is rarer. More important, I think, is that it seemed appreciated by most Italians. For you travelers out there: I highly recommend learning at least the basics before you head somewhere. Counterpoint: if you’re not obviously foreign and get good enough at the beginnings of conversations (which are often rote), people will ask you for directions.
  • Getting away from American media was an unexpected blessing. My job is politics-adjacent, and that I managed to get away during the Manafort convictions is wonderful beyond description. A self-imposed Facebook moratorium, aided by a lack of overseas data, was key to this.
  • Trap music has definitely made it to Italy, as has Brazilian funk, oddly. I was also surprised to see—or hear, I guess—how much Italians seem to like reggaeton.
  • Driving in Italy is a rush. The roads, built centuries ago, are far narrower, and there’s basically no delineation between what areas belong to pedestrians and drivers.
  • Food: super expensive, super good.

Environmentalism Could Use Some Ideological Diversity

Before we get into it: If you’re here by virtue of a Facebook ad and you like what you read, consider following this blog directly. Go ahead. Hit that little red box over there and stick it to Zuckerberg. ⇒

*

Environmentalism, once a point of mutual agreement between liberals and conservatives, is flagging under the demand to meet the highest standards of left-wing activism. For an example, look no further than the latest craze in pop environmentalism: local straw bans.

Unless you’ve been living under a rock, you know that straw bans—local laws prohibiting restaurants from giving straws and other single-use plastics to customers—are springing up across America. All told, 28 US cities, joined by a growing list of businesses that includes Starbucks, have so far banned or limited the use of plastic straws or are considering doing so.

While these bans are moving at a steady clip in progressive enclaves, they haven’t been without detractors. As one might expect, the plastics and restaurant industries aren’t thrilled, and the bans have exasperated others who point out that plastic straws make up about 0.03% of the plastic that enters our oceans each year.

Those objections were never destined to blunt the enthusiasm of the celebrity-fueled campaign to #stopsucking. But another argument, made from within the ranks, has proven far more effective: Several disability advocates have pointed out that plastic straws, considered by most an item of convenience, are in fact essential to the dining experience for people with mobility issues. By outright banning them, they argue, these cities and businesses are forgetting about and harming disabled people.

The traction of this counter—the same argument has appeared in the Washington Post, Vox, Time, the Guardian, NPR, Teen Vogue, and many more outlets—says a lot about how far left environmentalism has moved. The appeal on behalf of the disabled is effective because it, like other forms of progressive activism, invokes the moral touchstones of protecting victims and promoting equality. Environmentalism, though practically a poster child for non-partisanship, is often pitched the same way: save the rain forest, protect the victims of climate change. This has no doubt added to its polarization, as such messaging is less effective with moderates and conservatives.

The progressive desires to protect victims and strive for equality have unquestionably been the impetus for much positive change in American society. But they can also be a weakness: left unchecked, progressive movements can auto-cannibalize as these motivations are pursued at the expense of all else—including their original goals.

A few notorious examples: The now-defunct Cape Wind project, which could have farmed enough wind energy to power 200,000 homes, met with tremendous resistance from Massachusetts residents who cited concerns about the effects on local fish and bird populations (as well as some less noble complaints about the view). The Ivanpah solar tower faced constant legal and political resistance from California environmentalists, despite estimates that it would prevent the emission of 500,000 metric tons of carbon per year. Dams and other forms of hydroelectric power, responsible for close to half of all the renewable energy generation in the United States, are also known to provoke the ire of green activists.

In each case, progressives who otherwise champion the worthwhile goal of cleaner energy are letting the perfect be the enemy of the good. The desire to avoid harming anyone vulnerable at any cost can lead to paralysis. In environmentalism, where big changes are difficult and marginal actions more tempting, the costs are more likely to be borne by those for whom a seeming inconvenience can be a prohibitive obstacle. That can be uncomfortable under normal circumstances, but when advocates are overwhelmingly hyper-sensitive to the “losers” side of the equation, as progressives often are, it can be downright intolerable.

Yet if we mean to make an omelette, eggs must be broken. While I personally think the straw ban craze is more performative than functional, there will no doubt be times when our society will have to make trade-offs to protect our natural world. If you think “the rich” alone will bear these costs, you’re kidding yourself. I’m not saying that environmentalists should be okay with making the lives of disabled people harder (and in this case, the workaround suggested by disability advocates—that restaurants simply stop offering straws by default and keep some on hand for those who ask—is entirely reasonable). What I am saying is that there will be tough choices on the road ahead, and environmentalism must decide whether it will pursue left-wing purity or practicality.

Should it choose the latter, it will need to diversify its support. There won’t always be a happy marriage between impactful environmentalism and progressive values. For environmental groups to weather the political storm, they’ll have to be able to tap sources of support from outside the left.

Of course, this isn’t a one-way street; people outside the political left will have to start caring about these issues in much more visible ways and be willing to push their representatives. As it is, they’re giving up their chance to shape a movement and be part of the conversation. Unfortunately, for the moment, conservative political environmentalism remains somewhat niche. That’s a shame because the long-term viability of life on Earth is perhaps the greatest and most complicated matter concerning humanity. It would be foolish to think any political faction could tackle it alone.

This post originally appeared on Merion West

A Glance at College Attainment in NYC

I’m taking a couple days off from work this week. Unfortunately, it’s raining, so I’m inside playing around with some county-level education data in R, and I thought I’d throw up a quick blog post. The data set, which goes back to the 1970s, comes courtesy of the USDA and can be found here.

A quirk of New York City is that each of the five boroughs is also its own county. I took the opportunity to make some graphs illustrating how educational attainment has changed in the city and its boroughs over the past 50 years. There are a few interesting insights to be had here:

New York City’s college attainment rate closely mimics the country’s. Since 1970, the percentage of residents over the age of 25 in both groups who have attended at least some college has risen from just over 20% to just under 60%.

US NYC

But between the boroughs, there’s quite a bit of diversity. Manhattan is the only borough to have ever had a higher-than-city-average rate of college attainment. What’s more, the gap between Manhattan and the New York City average has only grown over time. That said, because it started out with a higher rate of college-educated residents, Manhattan has experienced the lowest rate of growth in this area. Here’s each borough compared with the city’s average over the last 50 years:

Boroughs Some College
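The borough-versus-average comparison behind that graph is simple to compute. A minimal sketch, with invented attainment rates standing in for the real USDA figures (and an unweighted average, where the real calculation should weight by each borough’s adult population):

```python
# Invented some-college attainment rates for a single year, expressed
# as fractions. These are placeholders, not the actual USDA values.
rates = {
    "Manhattan": 0.70,
    "Brooklyn": 0.55,
    "Queens": 0.50,
    "Bronx": 0.40,
    "Staten Island": 0.52,
}

# Unweighted city average across the five boroughs/counties.
city_avg = sum(rates.values()) / len(rates)

# Percentage-point gap between each borough and the city average.
gaps = {b: round((r - city_avg) * 100, 1) for b, r in rates.items()}
print(gaps)
```

Repeating this for each year in the data set yields the per-borough deviation lines plotted above.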

Perhaps unsurprisingly, given its reputation as a hotbed of gentrification, Brooklyn leads the way in terms of educational growth among its residents. In 1970, Brooklyn and the Bronx had similar rates of residents with at least some college (12 and 13 percent, respectively), compared with a US average of 21.3%. By 2016, however, the gap between the two had widened to 10.5 percentage points. Brooklyn has surpassed Queens and is closing in on the NYC average.

A more detailed look at the change in educational attainment in Brooklyn, which now has slightly more residents with than without some college experience:

Rise of Educated Brooklyn


Business Is Getting Political—and Personal

As anyone reading this blog is undoubtedly aware, Sarah Huckabee Sanders, the current White House Press Secretary, was asked last month by the owner of a restaurant to leave the establishment on the basis that she and her staff felt a moral imperative to refuse service to a member of the Trump administration. The incident, and the ensuing turmoil, highlights the extent to which business has become another political battleground—a concept that makes many anxious.

Whether or not businesses should take on political and social responsibilities is a fraught question—but not a new one. Writing for the New York Times in 1970, Milton Friedman famously argued that businesses should avoid the temptation to go out of their way to be socially responsible and instead focus on maximizing profits within the legal and ethical framework erected by government and society. To act otherwise at the expense of profitability, he reasoned, is to spend other people’s money—that of shareholders, employees, or customers—robbing them of their agency.

Though nearing fifty years of age, much of Milton Friedman’s windily and aptly titled essay, The Social Responsibility of Business Is to Increase Profits, feels like it could have been written today. Many of the hypotheticals he cites of corporate social responsibility—“providing employment, eliminating discrimination, avoiding pollution”—are charmingly relevant in the era of automation anxiety, BDS, and one-star campaigns. His solution, that businesses sidestep the whole mess, focus on what they do best, and play by the rules set forth by the public, is elegant and simple—and increasingly untenable.

One reason for this is that businesses and the governments Friedman imagined would rein them in have grown much closer, even as the latter have grown comparatively weaker. In sharp contrast to the get-government-out-of-business attitude that prevailed in the boardrooms of the 1970s, modern industry groups collectively spend hundreds of millions to get the ears of lawmakers, hoping to obtain favorable legislation or stave off laws that would hurt them. Corporate (and other) lobbyists are known to write and edit bills, sometimes word for word.

You could convincingly argue that this is done in pursuit of profit: Boeing, for example, spent $17 million lobbying federal politicians in 2016 and received $20 million in federal subsidies the same year. As of a 2014 report by Good Jobs First, an organization that tracks corporate subsidies, Boeing had received over $13 billion of subsidies and loans from various levels of government. Nevertheless, this is wildly divergent from Friedman’s idea of business as an adherent to, not architect of, policy.

As business has influenced policy, so too have politics made their mark on business. Far more so than in the past, today’s customers expect brands to take stands on social and political issues. A report by Edelman, a global communications firm, finds a whopping 60% of American Millennials (and 30% of consumers worldwide) are “belief-driven” buyers.

This, the report states, is the new normal for businesses—like it or not. Brands that refrain from speaking out on social and political issues now increasingly risk consumer indifference, which, I am assured by the finest minds in marketing, is not good. In an age of growing polarization, every purchase is becoming a political act. Of course, when you take a stand on a controversial issue, you also risk alienating people who think you’re wrong: 57% of consumers now say they will buy or boycott a brand based on its position on an issue.

This isn’t limited to merely how corporations talk. Firms are under increasing social pressure to hire diversity officers, change where they do business, and reduce their environmental impact, among other things. According to a 2017 KPMG survey on corporate social responsibility, 90% of the world’s largest companies now publish reports on their non-business responsibilities. This reporting rate, the survey says, is being driven by pressure from investors and government regulators alike.

It turns out that a well marketed stance on social responsibility can be a powerful recruiting tool. A 2003 study by the Stanford Graduate School of Business found 90% of graduating MBAs in the United States and Europe prioritize working for organizations committed to social responsibility. Often, these social objectives can be met in ways that employees enjoy: for example, cutting a company’s carbon footprint by letting employees work from home.

In light of all this, the choice between social and political responsibility and profitability seems something of a false dichotomy. The stakes are too high now for corporations to sit on the sidelines of policy, politics, and society, and businesses increasingly find themselves taking on such responsibilities in pursuit of profitability. Whether that’s good or bad is up for debate. But as businesses have grown more powerful and felt the need to transcend their formerly transactional relationships with consumers, it seems to be the new way of things.

Occupational Licensing Versus the American Dream

Imagine: You’re one of the 6.1 million unemployed Americans. Try as you might, you can’t find a job. But you’ve always been great at something—cutting hair, giving manicures, or maybe hanging drywall—so great, in fact, that you reckon you could actually make some real money doing it. What’s the first thing you do?

If your answer was something other than, “Find out how to obtain the state’s permission,” you’re in for a surprise.

A shocking number of occupations require workers to seek permission from the government before they can legally practice. This includes not just the obvious, like doctors and lawyers, whose services, if rendered inadequately, might do consumers life-threatening harm, but also barbers, auctioneers, locksmiths, and interior designers.

This phenomenon is known as occupational licensing. State governments set up barriers to entry for certain occupations, ostensibly to the benefit and protection of consumers. They range from the onerous—years of education and thousands of dollars in fees—to trivialities like registering in a government database. At their most extreme, such regulations make work without a permit illegal.

As the United States transitioned from a manufacturing to a service-based economy, occupational licensing filled the “rules void” left by the ebb of labor unions. In the past six decades, the share of jobs requiring some form of license has soared, going from five percent in the 1950s to around 30 percent today. Put another way: over a quarter of today’s workforce requires government permission to earn a living.

There’s little proof that licensing does what it’s supposed to. For one, the potential impact to public safety seems wholly incidental to the burden of compliance for a given job. In most states, it takes 12 times as long to become a licensed barber as an EMT. In a 2015 Brookings Institution paper, University of Minnesota Professor Morris Kleiner, who has written extensively on the subject, states: “…economic studies have demonstrated far more cases where occupational licensing has reduced employment and increased prices and wages of licensed workers than where it has improved the quality and safety of services.”

Ironically, the presence of strict licensing regulations also seems to encourage consumers to seek lower-quality services—sometimes at great personal risk. When prices are high or labor is scarce, consumers take a DIY approach or forego services entirely. A 1981 study on the effects of occupational licensing found evidence for this in the form of a negative correlation between electricians per capita and accidental electrocutions.

A less morbid, but perhaps more salient, observation is that licensing often creates burdens that are unequally borne. Licensing requirements make it difficult for immigrants to work. In many states, anyone with a criminal conviction can be outright denied one, regardless of the conviction’s relevance to their aspirations. These policies, coupled with the potential costs of money and time, can make it harder for poorer people, in particular, to find work.

But surely, you might say, there must be some benefit to licensing. And technically, you’d be right.

Excessive licensing requirements are a huge boon to licensed workers. They restrict the supply of available labor in an occupation, limiting competition and in some cases raising wages. There’s little doubt that occupational licensing, often the result of industry lobbying, functions mainly as a form of protectionism. A 1975 Department of Labor study found a positive correlation between the rates of unemployment and failures on licensing exams.

Yet even licensed workers can’t escape the insanity unscathed. Because licenses don’t transfer from state to state, workers whose livelihoods depend on having a license face limited mobility, which ultimately hurts their earning potential.

Though licensure reform is typically thought of as a libertarian fascination—the libertarian-leaning law firm Institute for Justice literally owns occupationallicensing.com—it also has the attention of more mainstream political thinkers. The Obama Administration released a report in 2015 outlining suggestions on how the states might ease the burden of occupational licensing, and in January of this year, Labor Secretary Alexander Acosta made a similar call for reform.

Thankfully, there seems to be some real momentum on this issue. According to the Institute for Justice, 15 states have reformed licensing laws to “make it easier for ex-offenders to work in state-licensed fields” since 2015. Louisiana and Nebraska both made some big changes this year as well. That’s a great start, but there’s still much work to be done.

This article originally appeared on Merion West.

Is College Worth It?

It’s a query that would have been unthinkable a generation or two ago. College was once – and in fairness, to a large extent, still is – viewed as a path to the middle class and a cultural rite of passage. But those assumptions are, on many fronts, being challenged. Radical changes on the cost and benefit sides of the equation have thrown the once axiomatic value of higher education into question.

Let’s talk about money first. It’s no secret that the price of a degree has climbed rapidly in recent decades. Between 1985 and 2015, the average cost of attending a four-year institution increased by 120 percent, according to data compiled by the National Center for Education Statistics, putting it in the neighborhood of $25,000 per year – a figure pushing 40 percent of the median income.

That increase has left students taking more and bigger loans to pay for their educations. According to ValuePenguin, a company that helps consumers understand financial decisions, between 2004 and 2014, the number of student loan borrowers and their average balance size increased by 90 percent and 80 percent, respectively. Among the under-thirty crowd, 53 percent with a bachelor’s degree or higher now report carrying student debt.

Then there’s time to consider. Optimistically, a bachelor’s degree can be obtained after four years of study. For the minority of students who manage this increasingly rare feat, that’s still a hefty investment: time spent on campus can’t be spent doing other things, like work, travel, or even just enjoying the twilight of youth.

And for all the money and time students are sinking into their post-secondary educations, it’s not exactly clear they’re getting a good deal – whether gauged by future earnings or the measurable acquisition of knowledge. Consider the former: While there is a widely acknowledged “college wage premium,” the forces powering it are up for debate. A Pew Research Center report from 2014 shows the growing disparity to be less a product of the rising value of a college diploma than the cratering value of a high school diploma. The same report notes that while the percentage of degree-holders aged 25-32 has soared since the Silent Generation, median earnings for full-time workers of that cohort have more or less stagnated across the same time period.

Meanwhile, some economists contend that to whatever extent the wage premium exists, it’s impossible to attribute to college education itself. Since the people most likely to be successful are also the most likely to go to college, we can’t know to what extent a diploma is a cause or consequence of what made them successful.

In fact, some believe the real purpose of formal education isn’t so much to learn as to display to employers that a degree-holder possesses the attributes that correlate with success, a process known as signalling. As George Mason Professor of Economics (and noted higher-ed skeptic) Bryan Caplan has pointed out, much of what students learn, when they learn anything, isn’t relevant to the real world. Professor Caplan thinks students are wise to the true value of a degree, which could explain why almost no student ever audits a class, why students spend about 14 hours a week studying, and why two-thirds of students fail to leave university proficient in reading.

After 550-ish words bashing higher education and calling into question the legitimacy of the financial returns on a degree, you might fairly ask if I’m saying college really isn’t worth your time and money. While I’d love to end it here and now with a hot take like that, the truth is it’s a really complicated, personal question, and I can’t give a definitive answer. What I can offer are some prompts that might help someone considering college to make that choice for themselves, based on things I wish I’d known before heading off to school.

  • College graduates fare better on average by many metrics. Even if costs of attendance are rising, they still have to be weighed against the potential benefits. Income, unemployment, retirement benefits, and health care: those with a degree really do fare better. Even if we can’t be sure to what extent this relationship is causal, one could reasonably conclude the benefits are worth the uncertainty.
  • Credentialism might not be fair, but it’s real. Plenty of employers use education level as a proxy for job performance. If the signalling theory really is accurate, the students who pursue a degree without bogging themselves down with pointless knowledge are acting rationally. As Professor Caplan points out in what seems a protracted, nerdy online feud with Bloomberg View’s Noah Smith, the decision to attend school isn’t made in a cultural vacuum. Sometimes, there are real benefits to conformity – in this case, getting a prospective employer to give you a shot at an interview. Despite my having never worked as a sociologist (alas!), my degree has probably opened more than a few doors for me.
  • What and where you study are important. Some degrees have markedly higher returns than others, and if money is part of the consideration (and I hope it would be), students owe it to themselves to research this stuff beforehand.
  • For the love of god, if you’re taking loans, know how compound interest works. A younger, more ignorant version of myself once thought I could pay my loans off in a few years. How did I reach this improbable conclusion? I conveniently ignored the fact that interest on my loans would compound. Debt can be a real bummer. It can keep you tethered to things you might prefer to change, say a job or location, and it makes saving a challenge.
  • Relatedly, be familiar with the economic concept of opportunity cost. In short, this just means that time and money spent doing one thing can’t be spent doing something else. To calculate the “economic cost” of college, students have to include the money they could have made by working for those four years. If we conservatively put this number at $25,000 per year, that means they should add $100,000 in lost wages to the other costs of attending college (less if they work during the school year and summer).
  • Alternatives to the traditional four-year path are emerging. Online classes, some of which are offering credentials of their own, are gaining popularity. If they’re able to gain enough repute among employers and other institutions, they might be able to provide a cheaper alternative for credentialing the masses. Community colleges are also presenting themselves as a viable option for those looking to save money, an option increasingly popular among middle class families.
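The compound-interest and opportunity-cost points above can be made concrete with a quick sketch. All figures here are hypothetical for illustration; the only number taken from the article is the $25,000-per-year wage assumption:

```python
def compound_balance(principal, annual_rate, years, periods_per_year=12):
    """Loan balance if interest compounds each period and nothing is paid down."""
    rate = annual_rate / periods_per_year
    periods = periods_per_year * years
    return principal * (1 + rate) ** periods

# Hypothetical loan: $30,000 at 6% APR, untouched during 4 years of school.
balance = compound_balance(30_000, 0.06, 4)
print(f"Balance at graduation: ${balance:,.0f}")  # interest adds roughly $8,000

# Opportunity cost: wages forgone while studying, at $25,000/year for 4 years.
forgone_wages = 25_000 * 4
print(f"Forgone wages: ${forgone_wages:,}")
```

The point of the sketch is that deferred interest quietly grows the principal before repayment even begins, and that forgone wages can dwarf the interest itself.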

There’s certainly more to consider, but I think the most important thing is that prospective students weigh the decision carefully rather than simply take it on faith that higher education is the right move for everyone. After all, we’re talking about a huge investment of time and money.

A different version of this article was published on Merion West.

Ben Carson’s Tragically Mundane Scandal

Whatever else it might accomplish, President Donald Trump’s administration has surely earned its place in history for laying to rest the myth of Republican fiscal prudence. Be they the tax dollars of today’s citizens or tomorrow’s, high ranking officials within Mr. Trump’s White House seem to have no qualms about spending them.

The latest in a long series of questionable expenses is, of course, none other than Department of Housing and Urban Development Secretary Ben Carson’s now infamous $31,000 dining set, first reported on by the New York Times.¹ Since the Times broke the story, Mr. Carson has attempted to cancel the order, having come under public scrutiny for what many understandably deem to be an overly lavish expenditure on the public dime.

At first blush, Secretary Carson’s act is egregious. As the head of HUD, he has a proposed $41 billion of taxpayer money at his disposal. Such frivolous and seemingly self-aggrandizing spending undermines public trust in his ability to use taxpayer funds wisely and invites accusations of corruption. It certainly doesn’t help the narrative that, as some liberals have noted with derision, this scandal coincides with the proposal of significant cuts to the department’s budget.

But the more I think about it, the more I’m puzzled as to why people are so worked up about this.

Let me be clear: this certainly isn’t a good look for the Secretary of an anti-poverty department with a shrinking budget, and it’s justifiable that people are irritated. At a little more than half the median annual wage, most of us would consider $31,000 an absurd sum to spend on dining room furniture. The money that pays for it does indeed come from private citizens who would probably have chosen not to buy Mr. Carson a new dining room with it.

And yet, in the realm of government waste, that amount is practically nothing. Government has a long, and occasionally humorous, history of odd and inefficient spending.

Sometimes, it can fly under the radar simply by virtue of being bizarre. Last year, for example, the federal government spent $30,000 in the form of a National Endowment for the Arts grant to recreate William Shakespeare’s play “Hamlet” – with a cast of dogs. Other times, the purchase at hand is too unfamiliar to the public to spark outrage. In 2016, the federal government spent $1.04 billion expanding trolley service a grand total of 10.92 miles in San Diego: an average cost of nearly $100 million per mile.

Both of those put Mr. Carson’s $31,000 dining set in a bit of perspective. It is neither as ridiculous as the play nor as great in magnitude as the trolley. So why didn’t either of those incidents receive the kind of public ire he is contending with now?

The mundanity of Mr. Carson’s purchase probably hurts him in this regard. Not many of us feel informed enough to opine on the kind of money one should spend building ten miles of trolley track, but most of us have bought a chair or table. That reference point puts things in perspective and allows room for an emotional response. It’s also likely this outrage is more than a little tied to the President’s unpopularity.

Ironically, the relatively small amount of money spent might also contribute to this effect. When amounts get large enough, like a billion dollars, we tend to lose perspective – what’s a couple million here or there? But $31,000 is an amount we can conceptualize.

So it’s possible that we’re blowing this a little out of proportion for reasons more emotional than logical. But I still think the issue is a legitimate one that deserves more public attention than it usually gets, and it would be interesting if the public were able to apply this kind of pressure to other instances of goofy spending. Here’s hoping, anyway.

A version of this article originally appeared on Merion West.

1. I wrote this article the day before word broke that Secretary of the Interior Ryan Zinke had spent $139,000 upgrading the department’s doors.

A Political Future for Libertarians? Not Likely.

When it was suggested I do a piece about the future of the Libertarian Party, I had to laugh. Though I’ve been voting Libertarian since before Gary Johnson could find Aleppo on a map, I’ve never really had an interest in Libertarian Party politics.

Sure, the idea is appealing on a lot of levels. Being of the libertarian persuasion often leaves you feeling frustrated with politics, especially politicians. It’s tempting to watch the approval ratings of Democrats and Republicans trend downward and convince yourself the revolution is nigh.

But if I had to guess, the party will remain on the periphery of American political life, despite a relatively strong showing in the 2016 Presidential election. A large part of this – no fault of the Libertarian Party – is due to anti-competitive behavior and regulation in the industry of politics. But a substantial amount of blame can be attributed to the simple and sobering fact that the type of government and society envisioned by hardcore Libertarians – the type who join the party – is truly unappealing to most of America.

Unless public opinion radically shifts, it feels like the Libertarian Party will mainly continue to offer voters a symbolic choice. Don’t get me wrong: I’m happy to have that choice, and it really would be nice to see people who genuinely value individual freedom elected to public office. But political realities being what they are, I’m going to hold off on exchanging my dollars for gold and continue paying my income taxes.

So that’s the bad news for libertarians. Here’s the good news: the cause of advancing human liberty isn’t dependent on a niche political party. The goal of libertarianism as a philosophy – the preservation and expansion of individual liberties – has no partisan allegiance. Victory for the Libertarian Party is (thankfully) not requisite for libertarians to get more of what they want.

Advancing their agenda has, for libertarians, proved to be more a question of winning minds than elections. While “capital-L” Libertarians remain on the political margins, aspects of libertarian thought appeal to people of all political persuasions and often win broad support. Although no Libertarian Party member has ever held a seat in Congress, moved into a governor’s mansion, or garnered more than four percent of the national vote, many long-held libertarian ideas have enjoyed incredible success, and others are still gaining momentum.

Same-sex marriage is now the law of the land, as interracial marriage has been for decades. Support for legalizing marijuana is at an all-time high (pun intended), and ending the larger ‘war on drugs’ is an idea gaining currency, not only in the US but worldwide. The draft is a thing of the past; the public is growing wary and weary of interventionist foreign policy. A plan to liberalize our immigration system, though stuck in legislative limbo, remains a priority for most Americans, and the United States remains a low-tax nation by developed-world standards, especially when it comes to individual tax rates.

And not all the good news comes from the realm of politics. Americans have maintained and expanded on a culture of philanthropy, per-capita giving having tripled since the mid-1950s. The rise of social media and the internet has made it easier than ever for people to exchange ideas. Technology of all sorts has lowered prices for consumers and helped people live more productive lives. Even space exploration – until recently exclusively the purview of governments – is now within private reach.

None of this was, or will be, enacted through legislation written or signed into law by a Libertarian politician. But that’s not what really matters. What really matters is that people are freer today to live the kinds of lives they want, peacefully, and without fear of persecution. Yes, there is still much that might be improved, at home and certainly abroad. But in a lot of ways, libertarians can rest happily knowing that their ideas are winning, even if their candidates are not.

This article originally appeared on Merion West.