I’ve noticed media outlets reporting, with an air of incredulity, that young adults make up a significant share of coronavirus cases.
My local paper posted on Facebook that “More than 50% of coronavirus cases in Massachusetts are people under the age of 50.” Very similarly, the Pittsburgh Post Gazette writes that “more than half of Pennsylvania’s confirmed COVID-19 patients are under 50 years old.” The New York Times, for its part, reports that “nearly 40 percent of patients sick enough to be hospitalized were age 20 to 54.” *
I can’t decide if this is a psyop to get young people to take the epidemic more seriously (as numerous spring-break photos show they should) or genuine surprise. If it’s the latter, I’m not sure if that’s warranted.
In each case, the age ranges in question are massive and not very meaningful without comparison to the age distribution of the general population. For example, in Massachusetts, about 63% of the population is under the age of 50. So if the incidence of coronavirus were age-independent, we might actually expect more cases among people under 50.
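To make that baseline concrete, here’s a toy calculation (using the Massachusetts figure cited above, which is illustrative rather than authoritative) showing what an age-independent distribution of cases would predict:

```python
# Sanity check: under age-independent incidence, the share of cases in an
# age band should simply match that band's share of the population.
# The population share is the Massachusetts figure cited in the post.

pop_share_under_50 = 0.63   # share of MA residents under age 50
reported_case_share = 0.50  # "more than 50% of cases" per the headline

# Expected share of cases among under-50s if risk were uniform across ages:
expected_case_share = pop_share_under_50

print(f"Expected under-50 case share (age-independent): {expected_case_share:.0%}")
print(f"Reported under-50 case share: more than {reported_case_share:.0%}")
# If anything, the reported share is LOWER than the age-independent baseline.
```

In other words, “more than 50%” is, if anything, an undercount relative to what pure chance would predict.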
I think the issue, then, is that people seem to be assuming the prevalence of the virus should be age-dependent to a higher degree than we’re observing. Maybe there’s a good Bayesian case to be made for this null hypothesis; I don’t know. But I feel like laypeople — local papers included — ought to be proceeding from the assumption of age-independence, especially because we still don’t have much information.
Also, what’s going on with the under-20 crowd, which makes up 23% of the population but only 2.2% of MA coronavirus cases? Is Gen Z+ holding out on us?
* This isn’t as egregious as the other two examples. It’s still a huge age range: about 48% of America is between the ages of 20 and 54. But since we’re talking about the severity of symptoms and hospitalizations, it seems much more noteworthy.
This visualization didn’t make the final cut, but it’s nonetheless cool. It demonstrates that smoking rates among Hispanics are far less responsive to income than those of other ethnic groups (though even for Hispanics, the relationship between income and smoking rates is statistically significant). I was surprised to find this relationship, but apparently it’s a known facet of the phenomenon called “the Hispanic Paradox” (alternatively known as the “Latino Paradox”).
The paradox is that, on average, American Hispanics live longer than their non-Hispanic white counterparts, even though the former tend to have lower incomes and less education. The causes aren’t entirely understood, but Hispanics’ low smoking rates are thought to be a major contributor.
Some of the difference in smoking rates can be explained by immigration. Latin American countries tend to have lower smoking rates than the United States. Among those born in the United States, only Mexican-Americans seem to retain lower smoking rates and the attendant mortality advantage over non-Hispanic whites. It will be interesting to see if the Paradox ebbs as native-born Hispanics begin to account for more of the Hispanic population.
The Hispanic Paradox illustrates the capricious power of cultural influence on real-world outcomes — and conversely forces us to confront our limited ability to re-engineer the world.
We tend to think of (the physical, policy, social, or economic) environment and choice as the chief determinants of human behavior and outcomes. But we are just as much a product of the commingling of genetics and culture. The paths before us are well-worn by our predecessors, and we would be arrogant to think we can wholly resist their inclinations.
A report from the Pew Center is the latest to document America’s rapidly declining religiosity. Pew’s numbers show a 12-point decrease in the percentage of Americans calling themselves Christians between 2009 and 2019, while those describing themselves as atheists, agnostics, or “nothing in particular” have risen from 17% to 26%.
These findings are mirrored in the General Social Survey, which has been asking respondents about their religious affiliation since 1972.
The demographic surge of the religiously unaffiliated is a story of treatment effects triumphing over selection effects. A little context will help explain.
Natural selection seems to decidedly favor the religious. As the social psychologist Ara Norenzayan details in Big Gods (an excellent if at times slow book), religious societies, specifically those that follow omnipotent, moralizing “Big Gods,” have historically been able to outcompete others. To summarize Norenzayan’s findings, this is due to three factors, the first two of which are increased trust and greater social stability, both made possible by supernatural monitors (gods) that have allowed societies to scale by “building moral communities of strangers.”
In recent history, some societies (think Scandinavia and Japan) have been able to “kick away the ladder” of religion, replicating the monitoring effects of Big Gods through trusted civic institutions—police, courts, and others that allow for anonymous actors to cooperate in the same way religion used to. But there’s one other important advantage of religious societies that secular societies haven’t been able to engineer: above-replacement fertility rates.
The religiously unaffiliated reproduce at notably lower rates—so much so that, even as I’m writing a blog post about their astronomical demographic growth in America, the Pew Center projects they will decline as a share of the global population, from 16% to 13% between 2010 and 2050.
In a personal email quoted by Norenzayan in Big Gods, a colleague confides that despite reviewing all available data and case studies back to early Greece and India, he was unable to find a single example of a secular society maintaining a birth rate higher than two children per woman for even a century. France, Germany, Japan, and quite a few other countries are trying—and failing—to address this problem with a variety of subsidies. Thus far, there is no secular substitute for religion’s fertility premium.
So with forces of nature solidly in support of religion, why is it rapidly losing ground across the rich world? It turns out there are countervailing secularizing forces that, it feels safe to say, have grown powerful enough to chip away at the natural demographic advantage of the religious. Unlike the selection effects that propel religiosity, these are treatment effects, meaning they’re driven by exposure to certain conditions. In addition to the aforementioned creation of secular civic institutions, those conditions include education, rising incomes, and the general removal of existential threat (of the “where is my food coming from” variety, anyway).
Notably, somewhere around 78% of all “religious nones” are “converts,” if you will, meaning they were born into a religion and ceased to identify with it over time.
While there is some observable increase in religious disaffiliation within generations, the flight from religiosity is largely driven by generational replacement. In other words, it’s not like longtime worshipers have suddenly lost faith en masse—it’s that their grandchildren aren’t interested, and older generations are losing ground demographically. This fits the pattern of other paradigmatic shifts in public opinion, and to me it suggests there’s an element of timing involved—that secularization may be contingent upon one’s formative environment.
What remains to be seen is whether the secularizing rich world can support itself. Our economies, infrastructure, and social welfare systems are reliant on people, and a large population that doesn’t reproduce will age with dramatic consequences (see Baby Boomers). In the long run, this is probably one of the most consequential political issues out there.
Last Tuesday, abortion-rights advocates around the country held rallies in response to restrictive abortion laws or bills passed or introduced in several states. At the events, legislators and protesters decried the bills as an attack on women’s rights, an attempt by men to control women’s bodies.
This refrain, that abortion is an issue that divides the sexes, is a common narrative — at least in my social and professional circles. But it’s discordant with data that shows that men and women within a given society, including the United States, often have very similar views on abortion. A graphic from the Pew Center illustrates this nicely:
Broadly speaking, with respect to the above graph, the differences between nations are much greater than the sex-based differences within them. This suggests to me that cultural factors play a larger role than sex in determining one’s position on abortion and that men and women seem roughly equally sensitive to these forces.
I wanted to check if a similar phenomenon could be observed within the United States’ population. To get a sense of this, I pulled data from the 2018 General Social Survey, which asks respondents “whether or not [they] think it should be possible for a pregnant woman to obtain a legal abortion if the woman wants it for any reason” — a question similar but not identical to the one the Pew Center asks above.
Unfortunately, the GSS doesn’t release data about which state respondents live in unless you pay for it, which I’m not going to do (since I have certainly not made any money with this blog). So that means we can’t examine the opinions of residents where legislators have moved toward a more restrictive stance on abortion. We can, however, get data at the regional level, which seems like an OK proxy.
The regions used by the GSS aren’t entirely conventional — for example, Montana and New Mexico are both in the “Mountain” region — so here’s a chart for reference:
Keeping with the format, if not pleasing aesthetics, of the Pew graph, here are the results by region, arranged by the percentage of women who think a pregnant woman should be able to get an abortion for any reason.
Our regional chart resembles Pew’s international chart in that it shows larger variances between regions than within them. Notably, the four regions below the national average — South Atlantic, West South Central, West North Central, and East South Central — contain states where restrictive abortion bills have been introduced. Even more notably, women in three of those four regions are less likely than men to have responded affirmatively.
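For anyone curious how a tabulation like this comes together, here’s a minimal, dependency-free Python sketch (the variable names mirror GSS conventions, but the records below are toy data, not the actual survey file):

```python
from collections import defaultdict

# Toy stand-ins for GSS microdata: (region, sex, response to the
# "abortion for any reason" question). Real analysis would read the
# survey file and apply its sampling weights.
records = [
    ("New England", "female", "yes"), ("New England", "male", "yes"),
    ("New England", "female", "no"),  ("New England", "male", "yes"),
    ("E.S. Central", "female", "no"), ("E.S. Central", "male", "yes"),
    ("E.S. Central", "female", "no"), ("E.S. Central", "male", "no"),
]

# Tally "yes" responses and totals per (region, sex) cell
yes = defaultdict(int)
total = defaultdict(int)
for region, sex, answer in records:
    total[(region, sex)] += 1
    yes[(region, sex)] += answer == "yes"

pct_yes = {cell: yes[cell] / total[cell] for cell in total}

# Sort regions by the female "yes" share, mirroring the chart's ordering
regions = sorted({r for r, _ in pct_yes},
                 key=lambda r: pct_yes[(r, "female")], reverse=True)
for r in regions:
    print(f"{r}: women {pct_yes[(r, 'female')]:.0%}, men {pct_yes[(r, 'male')]:.0%}")
```

The real charts add survey weights and the full set of nine GSS regions, but the shape of the computation is the same.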
Let’s take a look at another question from the GSS that asks respondents for their views on the morality of abortion. The question is, “Leaving aside whether or not you think abortion should be legal, are you morally opposed to abortion or not, or would you say it depends?”
Again, results are sorted by the percentage of female respondents — this time those stating a moral opposition to abortion.
In nearly every region in the United States, the percentage of women morally opposed to abortion is greater than the share of men reporting the same. I was so surprised by these results I checked my code three times, but there you have it. It may be explained in part by greater religiosity among women. Nonetheless, it’s out of sync with the narrative that the push for a more restrictive stance on abortion is a manifestation of men’s desires to enforce their beliefs on women, since by and large the women in question share these beliefs. (That said, the bills and laws, from what I can tell, are wildly out of step with the way Americans broadly think about abortion.)
I’ll wrap up by saying that I don’t believe the people decrying a war on women’s rights are being disingenuous. I know enough people, men and women, who hold this belief to know that’s sincerely the way they see things. I don’t know very many, possibly any, strictly anti-abortion people — New Englander here — but I think it’s necessary to take their claims on good faith, too. Presumably the roughly 30% of women in the Southeast who are morally opposed to abortion are not hostile to women’s rights as they see them.
My post on the relationship between ideology and fertility rates generated some great feedback and critiques (albeit mostly on a Facebook thread). Sadly, none of this was related to the awesome pun in the title of the piece. (Seriously, no love for “The Kids Are All Right”?)
Well, life goes on.
In light of the interest in the subject, I’ve decided to do a quick follow-up piece to address some readers’ questions and add a bit of information, particularly as it relates to this graph from the original post:
1. Are there more people on the political left?
A couple of people asked about the ideological composition of the nation and the sample I used. This is an important question, because if the political right makes up a small enough minority of the population or sample, then my graph, which shows the average number of children per respondent of different ideologies but doesn’t convey sample sizes, is a bit misleading—or at least less compelling. So that’s my fault for not going into it in the first place.
Per the most recent polling by Gallup, the American electorate identifies as roughly 26% liberal, 35% moderate, and 35% conservative. This is after two decades of a slow, steady increase in the percentage of Americans calling themselves “liberal.” More on that later.
The sample I pulled from the General Social Survey (GSS) reflects Gallup’s national numbers pretty well: out of the total 8,539 respondents sampled, 2,346 (27.47%) identified as some degree of liberal, 3,285 (38.47%) as moderate, and 2,908 (34.06%) as some degree of conservative.
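Those shares are easy to double-check with the raw counts:

```python
# Recompute the sample shares from the respondent counts quoted above.
counts = {"liberal": 2346, "moderate": 3285, "conservative": 2908}
n = sum(counts.values())  # total respondents: 8,539
shares = {k: v / n for k, v in counts.items()}
for k, s in shares.items():
    print(f"{k}: {s:.2%}")
```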
2. Are there more women on the political left?
I believe this question is getting at the same idea: if the majority of women are left of moderate, then the higher fertility of women on the right is less consequential for the electorate. According to Gallup’s national numbers, 30% of women identify as liberals—the same percentage as call themselves conservatives. For men, those numbers are notably different: 40% and 21%, respectively.
The sample I used showed more gender parity in ideologies, but it’s not hugely off. At any rate, the important thing is that the elevated fertility rates of conservative women can’t be written off as the effect of a small sample size.
3. But the population has been getting more liberal. Doesn’t that kind of throw a wrench in this narrative?
Only time will tell, I suppose! To be clear, this is how many people read the tea leaves, and the story I’m telling is a bit of heterodoxy. While I can’t offer a firm answer to this question now, I have a few remarks:
The past is no guarantee of the future. (Ask GE shareholders, amirite?) Just because the electorate has been getting more liberal doesn’t mean it will continue to.
I suspect the secular trend toward liberalization is as influenced by macroeconomic and sociological factors as it is by individual characteristics and experiences. The question is, what will be the effects of today’s macroeconomic and sociological upheaval on future voters—or their children?
Relatedly, I think time horizon matters a great deal when evaluating whether or not the future looks liberal or conservative. This is theory on my part, but maybe populations naturally move to the right over the long term (because conservatives reproduce more) unless cultural forces pulling leftward—economic globalization?—are sufficiently strong and sustained.
Finally, I think it’s worth noting that America’s liberal ranks are mostly swelling at the expense of its moderate contingent, perhaps due to increasing political polarization.
There were a few questions brought up to which I don’t have good answers at the moment. Without promising when, if at all, to address them—I’ve learned not to make firm commitments relating to this fine blog—here are two excellent threads I should follow:
Does completing various life milestones (having children, buying a house, getting married) make people more conservative? There must be some longitudinal studies on this somewhere…
To what extent do children’s political views match their parents’, and is there symmetry between liberals and conservatives in this regard? (I linked to one study by Gallup in the last post that suggested a 70% match between parents and their children, but there’s probably a lot more work on this out there.)
A notion that’s become somewhat common among the left in recent years is that American conservatives are demographically doomed. Very often, this is discussed in terms of race and age—an allusion to the declining share of the non-Hispanic white American population and the accompanying erosion of its political sway, as well as the fact that conservatives tend to be older.
On its face, this seems like a tidy theory. But under the surface, I think it’s a great deal more complicated than the (sometimes wishful) theorizing of liberal pundits allows.
First, as I wrote a couple of weeks ago, the American concept of race, whiteness in particular, is a moving target. I think there’s a solid chance it will look quite a bit different in a couple of decades. But a more interesting foil, I think, to the liberals-own-tomorrow theory manifests itself in fertility rates.
People are having fewer children the rich world over, causing consternation among governments of countries whose economic futures depend on population growth. The political implications of this alone are fascinating, but the general trend obscures another interesting story: the intranational ideological disparity in fertility.
Liberals, it appears, are having fewer children than their conservative counterparts. Combing through General Social Survey (GSS) results from the last 10 years provides a clear view of this phenomenon. Among heterosexual men between the ages of 35 and 50, those who identify as “extremely liberal” had on average 1.79 children, whereas those describing themselves as “extremely conservative” had 2.43. For women, the difference was even starker: “extremely liberal” women had 1.69 on average to “extremely conservative” women’s 2.63.
Many factors contribute to this disparity—most of which concern liberal women’s increased preferences for family planning. Anecdotally, liberals are later to marry, more likely to pursue advanced degrees, less religious, and, among women, more likely to pursue careers—the confluence of which makes for fewer babies. This trend seems to have gotten stronger over the past few decades.
All this is significant because there’s at least some evidence that, most of the time, children end up inheriting their parents’ political views. This makes sense whether you view political ideology as a product of nature—differences in brain structure that give some a proclivity for novelty and others an aversion to risk—or nurture. Either way, if liberals are bearing and raising fewer children, it could mean fewer liberal adults down the line.
There are a few signs this might already be happening. One study found that after decades of decline, high school students’ support for traditional gender roles in the family has been rising steadily since 1994. Goldman Sachs pegs the ascendant Generation Z as especially fiscally conservative. Finally, a survey by the Hispanic Heritage Foundation of 50,000 14- to 18-year-olds found—shockingly, in my view—that the majority identified as Republicans and would support Donald Trump in the 2016 election. (An important caveat to this survey was that nearly a third of those polled would have declined to vote, had they been able.)
The policy preferences of an increasingly conservative nation are one thing—and obviously ideology colors one’s assessment of how good or bad that would be. But what really worries me is the thought of an even more politically segmented society: one in which an increasingly liberal minority of elites maintains control of the nation’s cultural power centers while an increasingly conservative majority grows frustrated with its obsolescence in the new economy, which, intentionally or not, places a premium on educational attainment, city living, and delayed entry into family life.
Last June, the Census reported the white-alone population to have declined by 0.2% in absolute terms between July 2016 and July 2017. Though it may seem trivial, this factoid has immense significance to those on opposing sides of the culture wars, both of which have taken it to herald the decline of white political significance and the rise of a more diverse, and therefore liberal, electorate.
Frankly, there’s too much there to talk about in one blog post. Instead, I’d like to address an issue that’s bugged me for a long time:
In projection after projection showing a minority-white America, Hispanic members of each racial category are separated and lumped into their own group, despite the racial diversity of Latin America. This is significant because the rising tide of American diversity is mainly the result of a four-decade wave of immigration from Latin America and the high fertility rate of those immigrants’ descendants (though both forces have recently calmed). In 1960, 3.5% of the country identified as Hispanic or Latino. Nearly 60 years later, that figure has risen to 18%, with expectations that a quarter of the country will identify as Hispanic by 2065.
But disaggregating Hispanics from racial categories is inconsistent, not just with official Census convention—which designates Hispanic/Latino an ethnicity, a variable independent of race—but also with evidence that suggests many Hispanics are beginning to assimilate more wholly into the white population.
This isn’t a (just) pedantic rant about Census data. I think there’s a solid argument to be made that we’re actually in the middle of an expansion, rather than contraction, of American whiteness.
Take, for starters, that a slim majority of American Hispanics already identify as white, at least when asked about their race on the Census. This doesn’t seem like a vestige of a time of greater racial animus. Per the 2010 Census, 53% of US Hispanics describe themselves as “white alone,” up from 48% in 2000.
Secondly, Hispanic identity seems to fade the further removed from immigration one is. According to the Pew Center’s 2015 National Survey of Latinos, all but 3% of foreign-born Americans with Hispanic ancestry identify as Hispanic or Latino. In the second generation, the share that doesn’t so identify rises only slightly, to 8%. But by the third and fourth generations, it climbs rapidly, to 23% and 50%, respectively. The pattern is stronger among younger cohorts.
All told, 11% of US adults with Hispanic ancestry do not identify as such. Because immigration has been replaced by native births as the main driver of US Hispanic population growth in the last few decades, it’s not unreasonable to expect this fraction of “non-Hispanics” to grow.
Also worth considering are Hispanics’ growing geographical dispersion and high rate of intermarriage, especially among younger generations. Twenty-eight percent of 18- to 35-year-old US Hispanics are married to non-Hispanics. Again, this trend grows stronger the longer one’s family has been in the United States: nearly 60% of third-generation US Hispanics ages 18 to 35 are married to someone who isn’t Hispanic. Moving out of the city and marrying extra-ethnically seem, admittedly conjecturally, indicative of cultural assimilation.
It seems like Hispanics are following the arc of other (European and Levantine) immigrant groups who were once, and in some cases still are, considered outside the bounds of conventional whiteness. All of this is to say, I’m skeptical that the way Hispanics view themselves in 2018 is the way they will in 2050—especially as they become more enmeshed in mainstream American society.
Of course, this is just a prediction I’m making in my living room. I don’t have a crystal ball or any special insight into the minds of the American public. I’m going to end with some reasons things might not go as I imagine:
The 101 reason would probably be “politics,” with which race seems to have a bicausal relationship in America. It’s not hard to imagine the Republican party alienating Hispanics with nativism while selling themselves, intentionally or otherwise, as the party of White America. Similarly, Democrats’ ability to court Hispanics relies to some degree on the extent to which they feel shut out from the cultural and political mainstream. Both could push Hispanics to think of themselves as non-white more frequently.
Relatedly, the Office of Management and Budget could affect Hispanics’ racial identities through bureaucratic means. A few years ago, there was talk of combining the race and ethnicity questions, with “Hispanic” offered as a choice alongside Asian, black, white, etc. As I noted at the time, this might bring the Census questions more in line with the way Americans think about race today—but it would also be putting a thumb on the scales. There’s really no neutral position for the Census to take in this matter.
Anecdotally and finally, it also seems like the psychic benefits of whiteness have waned a lot over the last few decades—especially as regards low-status whites. Part of this owes to good news: cultural progress on matters of race, which has begun to erode the relatively elevated status enjoyed by whites at the expense of minorities. Other explanations are more sinister and reflect anomic decay in the white population: rising rates of suicide, drug overdoses, and voluntary unemployment. For one reason or another, whiteness no longer feels as enviable a club as it probably did in the 20th century when Italians, Jews, and other so-called “white ethnics” made the conscious effort to join its ranks.
I’m taking a couple days off from work this week. Unfortunately, it’s raining, so I’m inside playing around with some county-level education data in R, and I thought I’d throw up a quick blog post. The data set, which goes back to the 1970s, comes courtesy of the USDA and can be found here.
A quirk of New York City is that each of the five boroughs is also its own county. I took the opportunity to make some graphs illustrating how educational attainment has changed in the city and its boroughs over the past 50 years. There are a few interesting insights to be had here:
New York City’s college attainment rate closely mimics the country’s. Since 1970, the percentage of residents over the age of 25 in both groups who have attended at least some college has risen from just over 20% to just under 60%.
But between the boroughs, there’s quite a bit of diversity. Manhattan is the only borough to have ever had a higher-than-city-average rate of college attainment. What’s more, the gap between Manhattan and the New York City average has only grown over time. That said, because it started out with a higher rate of college-educated residents, Manhattan has experienced the lowest rate of growth in this area. Here’s each borough compared with the city’s average over the last 50 years:
Perhaps unsurprisingly, given its reputation as a hotbed of gentrification, Brooklyn leads the way in terms of educational growth among its residents. In 1970, Brooklyn and the Bronx had similar rates of residents with at least some college (12 and 13 percent, respectively), compared with a US average of 21.3%. By 2016, however, the gap between the two had widened to 10.5 percentage points. Brooklyn has surpassed Queens and is closing in on the NYC average.
A more detailed look at the change in educational attainment in Brooklyn, which now has slightly more residents with than without some college experience:
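For the curious, here’s a rough sketch of the underlying computation: the share of adults 25 and over with at least some college, per county and year. I did the original analysis in R; this Python version uses hypothetical column names and toy records standing in for the USDA file, which has its own headers.

```python
# Compute the share of adults 25+ with at least some college for a given
# county and year. Column names ("county", "year", "some_college_plus",
# "adults_25_plus") are illustrative stand-ins, not the USDA file's headers.
def some_college_share(rows, county, year):
    for row in rows:
        if row["county"] == county and int(row["year"]) == year:
            return int(row["some_college_plus"]) / int(row["adults_25_plus"])
    return None  # county/year not found

# Toy records standing in for the USDA county-level CSV
rows = [
    {"county": "Kings County, NY", "year": "1970",
     "some_college_plus": "120", "adults_25_plus": "1000"},
    {"county": "Kings County, NY", "year": "2016",
     "some_college_plus": "520", "adults_25_plus": "1000"},
]

print(some_college_share(rows, "Kings County, NY", 2016))
```

The charts above are just this ratio computed for every borough-county and year, plotted against the citywide and national averages.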
Race, however nebulous a concept, is typically thought of as static for a given individual. And yet, millions of Americans may wake up on Census Day 2020 as members of a different race–at least, on paper.
The proposed changes are meant to help align census definitions with the way Americans think about race. It’s an incredibly difficult, if not quixotic, task, due in no small part to the census’ historical ambiguity on the issue.
A little background
Since its inception in 1790, the United States Census has tracked racial data. The federal government uses this data to track health and environmental outcomes across populations, to promote equal employment opportunity, to redistrict, and to inform federal policy with regard to civil rights.
Because the decennial census data are used for redistricting purposes and inform race-related social policy, you can count on a lot of advocacy and politics influencing every step of the process.
The Census Bureau collects data in accordance with the guidelines set for it by the Office of Management and Budget. This data is self-reported (though it has only been this way since 1960) and is acknowledged to adhere to social, rather than scientific, definitions.
Here’s what’s happening
A proposal by the Office of Management and Budget suggests creating a new racial category, MENA (Middle East and North Africa), and combining the ethnicity and race questions. Together, both changes could affect the way over 60 million Americans racially identify.
Under current guidelines, people with “origins” in the Middle East and North Africa are considered white. Critics, notably the Arab American Institute, claim this categorization is an inaccurate vestige of anti-Asian immigration law from the 19th century that led Middle Eastern immigrants to advocate for white status. Creating a separate racial category will allow for better data collection and confer upon “MENA-Americans” the same legal protections and privileges granted to other minority groups, they argue.
Hispanics, on the other hand, are not currently considered a race by the census, but rather an ethnicity. In fact, the only two options given to Americans for ethnicity are “Hispanic or Latino” or “Not Hispanic or Latino.” This has meant that Hispanics have been free to self-identify with any race they feel accurately describes them–a choice that has produced some confusion. Given that freedom, a slight but increasing majority of Hispanics have chosen to describe themselves as white (53% in 2010, up from 47.9% in 2000).
But by collapsing the ethnicity and race questions into one general question, the next census may change that.
Having Hispanic, Latino, or Spanish origin appear next to other race options may encourage Hispanics who had previously considered themselves white to simply identify as “Hispanic”–after all, Spain is notably absent from the countries listed under “white.” As Mike Gonzalez writes for National Review:
The proposed census form defines “white” as “German, Irish, English, Italian, Polish, French, etc.” For “Hispanic, Latino or Spanish,” the definition is “Mexican American, Puerto Rican, Cuban, Salvadoran, Dominican, Colombian, etc.”
Now, if you’re a Mexican American who has always considered yourself white because of your Spanish ancestry, you have one choice. You would never check a box designated for persons of German, Irish, or other origins north of the Pyrenees, because that doesn’t describe you. So the only choice you have is Hispanic.
Social definitions of race are neither static nor universal…nor immune to bureaucracy
This is far from the first time bureaucratic lines around race and ethnicity have been redrawn. Different federal policies and amendments thereto have led people to alter their racial and ethnic identities for hundreds of years in America.
The most obvious example would be the term “Hispanic,” which was first officially used in the 1970s. It’s more of a political and bureaucratic convenience than a valid anthropological grouping, and is rarely used outside of the United States. Yet today many Americans celebrate Hispanic Heritage Month, listen to “Hispanic music”, and identify as Hispanics.
Another example would be Indian Americans’ historical flirtation with different racial categories. In the early 1900s, there were several court cases in which individual Indian Americans were determined to be white and non-white, depending on the case. In 1930 and 1940, the census listed “Hindu” as a race, but in 1970 Indians were instructed to self-report as “white.”
By 1980, that had changed again and Indian Americans were grouped under “Asian.” However, in 1990, 10% of respondents with Indian origins self-identified as “white” and 5% self-reported as “black,” despite being specifically instructed to check “Asian” by the census. Non-Asian identification rose in generations removed from immigration: among US-born South Asians, the portion identifying as white reached 25% in 1990.
Allowing greater leeway in racial reporting has also yielded significant demographic changes.
A change to census policy beginning in 1960 allowed respondents to self-report race, rather than requiring verification from a local enumerator. As a result, the Native American population has since exploded in a way that cannot be explained by birth rates or immigration. A 2000 change that allowed respondents to select multiple races furthered this trend.
Tilting at windmills
The best lesson is that race in America has nothing to do with biology and is informed by politics as much as it informs them. Racial identities are more idiosyncratic and plastic than we tend to think. The census doesn’t–and cannot–tell us about the actual genealogical diversity of America; what it actually measures is our collective perception thereof. That perception and the metrics by which we assess it are constantly changing.
The great irony here is that the OMB and the Census Bureau are both causing and reacting to changes in perceptions of racial identity in America! In order to ask questions the way they think respondents want to hear them, they inadvertently place their thumbs on the scale. Even though racial and ethnic data are self-reported, they will always be influenced by the definitions put forth by the OMB.