We click one more day, it’s March, and we’re inching away from Old Man Winter toward the new days of Spring.
Photo by Mark Tegethoff on Unsplash
Universities are medieval institutions in origin, though for most modern universities the religious purpose that they once served is no longer fundamental. It may linger on in Latin or Hebrew mottos, but religious piety and theological inquiry are not at the core of what most professors and students now wish to do. Medieval elements persist in the architecture of many universities: in the “quad” that can be found at the physical center of many college campuses and in the collegiate gothic, a style that was especially popular in the United States in the 1920s, leaving its mark on Yale, Duke and many other universities. At graduation, gestures toward the medieval pre-history of the modern university proliferate, from the granting of degrees to the wearing of robes.
Whatever medieval vestiges remain, the medieval university’s notion of the good life is no longer operative. It cannot be said that the twenty-first-century university is divinely inspired, that its students enter in – as students – in pursuit of revelation, of God or of the religious practices that might substantiate revelation and bring students closer to God. The medieval university’s notion of the good is profoundly foreign to modern higher education. The Enlightenment university, which persists, had as its highest ideal inquiry itself, and the post-Enlightenment university, which is still being born, often has politics as its highest ideal, whether that means social justice or one or another kind of political activism. The vita activa has gained on the vita contemplativa, or in universities the vita contemplativa must justify itself as the vehicle of the vita activa. One studies in order to act.
When considering its commitment to the good life, a life of moral and intellectual excellence, the modern university would do well to remember its medieval past. Medieval universities were monastic institutions designed and structured for prayer, and prayer, like many other forms of religious devotion, is quiet: typically done in silence, it requires silence. The university “quad” harkens back to the monastery cloister. The cloister’s open spaces and walkways left space for quiet. They kept the city at bay; they kept the market at bay; they kept the commotion of quotidian life at bay. This the modern university is still able to do. A once-sacred function – the provision of quiet, the gift of quiet – can be absorbed into the modern university. If so, it can help lead students toward the good life.
O my, ending a sentence with a preposition or not?
An authority on the English language has set us free from the tethers of what many have long regarded as a grammatical no-no. Or has it?
The answer depends on whether you side with a declaration from Merriam-Webster:
"It is permissible in English for a preposition to be what you end a sentence with," the dictionary publisher said in a post shared on Instagram last week. "The idea that it should be avoided came from writers who were trying to align the language with Latin, but there is no reason to suggest ending a sentence with a preposition is wrong."
Merriam-Webster had touched on a stubborn taboo — the practice of ending sentences with prepositions such as to, with, about, upon, for or of — that was drilled into many of us in grade school. The post ignited an emphatic debate in the comment section.
Many were adamant that a concluding preposition is lazy, or just sounds plain weird.
"Maybe so, but it doesn't sound expressive and at times sounds like someone isn't intelligent enough to articulate themselves," one user replied to Merriam-Webster.
Others heartily welcomed the permission granted.
"Thank you. How many times have I made an awkward sentence to avoid a preposition at the end?!?!" another person wrote.
The emotionally charged response to the post doesn't surprise Ellen Jovin, who travels the country with her "grammar table" fielding questions about Oxford commas, apostrophes and other hot-button linguistic topics.
"I spend a lot of time dealing with the Concluding Preposition Opposition Party," she said. "I know that any day that I want to start a fight, all I have to do is say something about this in public."
Jovin sees concluding preposition opponents as caught in a sort of sunk cost fallacy. People have invested a lot of time in finding ways to not end clauses and sentences with prepositions. So, when someone comes along and tells you there’s no such rule, it’s human nature to cling tighter to something that cost so much time and energy.
"I also think that because not ending with prepositions is associated with a more formal style — maybe some of the anger comes from a kind of pricked pomposity," she said. "Maybe sometimes they feel that someone is criticizing a larger style decision that they've made."
As for Jovin, "I end with prepositions and I'm perfectly happy with my life," she said.
SMcK: and what about splitting an infinitive, Ellen?
Yes, Christian Nationalism has to be defined, and John Hawthorne has a good entry point:
I’ve been following this conversation ever since Andrew Whitehead and Sam Perry wrote Taking America Back for God: Christian Nationalism in the United States in 2020. Using data from the Baylor Religion Survey, they explored four broad approaches to Christian Nationalism. They differentiated between Ambassadors, Accommodators, Resisters, and Rejectors. They then examined the political, policy, and background correlates of the four groups.
Today, the Public Religion Research Institute released results of a detailed survey on Christian Nationalism. Their survey follows logic similar to that of Andrew and Sam. They have five critical questions:
The U.S. government should declare America a Christian nation.
Laws should be based on Christian values.
If the U.S. moves away from our Christian foundations, we will not have a country anymore.
Being Christian is an important part of being truly American.
God has called Christians to exercise dominion over all areas of American society.
They measure agreement with these five issues to develop four categories. They label these Adherents, Sympathizers, Resisters, and Rejectors. The distribution of those four groups looks like this:
Anne Helen Petersen and body research’s major gap: women.
This is a big one, but you address it head-on, and I think it’s important for people to read it here: why haven’t we studied women’s bodies (and why, for the most part, do we continue to exclude them)?
When I asked researchers this question, they told me the same thing: women are complicated.
Scientists tend to favor simplicity, especially when the goal is to understand complex physiological, molecular, and chemical behavior. They want to eliminate as many extraneous factors as possible to reduce the “noise” in the data. That way, when they look at the data, they can more easily pinpoint—ah, this is the thing that’s influencing the outcome or that’s making the difference.
That’s why scientists tend to study homogenous strains of male rats of the same age to minimize variability. That’s why scientists recruit a standardized cohort of participants, one that’s relatively controlled and with few external variables. That’s why scientists default to men as participants.
Female bodies throw a whole wrench in the system because of menstrual cycles. The hormonal environment in female bodies is constantly changing, not just during a given month but throughout the lifespan. These fluctuating hormones influence a wide range of physiological factors aside from reproduction and fertility. They add noise to the data that scientists have to control for and that can take time and money. It’s just easier to study male bodies because they don’t experience the same hormonal fluctuations. (But their hormones do fluctuate!)
But I think that’s somewhat of a simplistic answer. It’s easy to say “scientists don’t care about women” or fault them for these biases, but it’s the systems that influence how they act and the decisions they make, which partly explains why women continue to be under-represented in biomedical research. I think we have to think about the historical perceptions about female bodies and how that planted the seeds of gender bias in biomedical research. Because when women aren’t included from the beginning, they continue to be overlooked because there’s no precedent to include them in the first place. It creates a blind spot that enables institutions to routinely exclude women and position them as an exception to the rule rather than part of the norm.
So, if you think about it, the idea that women are the weaker sex has been deeply ingrained in Western cultures since antiquity. Women were seen as the foil to men: the delicate, feebleminded, and feminine counterparts to male virility, wisdom, and masculinity. Female bodies were the inferior version of male bodies and therefore weren’t considered worthy of studying.
The Harvard Fatigue Lab is widely considered the birthplace of exercise science. Some of the foremost leaders in the field got their start there. When scientists wanted to study exercise, they turned to men—typically young, college-age, white men—because they were the ones who were allowed to compete and play sports, and they were the ones at the universities where these studies took place. That became the standard methodology. When the lab closed in 1947, its former students, staff, and fellows dispersed and went on to establish 17 new labs across the country, bringing the legacy and methodologies of their mentors with them. They then trained the next generation of exercise scientists, who went on to train the next, and those methodologies were passed down. You might not think to question it because that’s what your professor taught you.
Eventually, a 70-kg man came to represent the average person in biomedical research. It wasn’t until 1993 that the National Institutes of Health (NIH) Revitalization Act mandated the inclusion of women and minorities in NIH-funded clinical research. It’s only been since 2016 that scientists have been required to account for sex as a biological variable in preclinical research and human studies.
What that means is that funding has historically flowed more easily to studies investigating male cells, animals, and humans. Since everyone studies male specimens, journal editors and peer reviewers are more familiar with this context and may look upon those studies more favorably for publication. Meanwhile, researchers have to clear countless hurdles to prove a study involving female specimens or participants is worthwhile. They may be met with skepticism because the methodology and research questions are novel. Or maybe the study question has already been investigated in men so the study is deemed redundant and not worthy of funding or publication.
All these factors create a huge gap in the research. Between 2014 and 2020, only 6% of sports science research focused exclusively on women and women made up only 34% of research participants.
Rosemary Salomone and the English Language:
Leap forward to 2001, when an article in Bloomberg Businessweek put a twenty-first-century turn on Condorcet’s cautionary words. The title of the article, “The Great Divide: In Europe, Speaking the Lingua Franca Separates the Haves and the Have-Nots,” is as evocative now as it was in 2001, and as it would have been back in 1794. The accompanying illustration is even more so. It depicts three men in business suits. Two are large, broad-shouldered, and powerful-looking figures, smiling at each other as they forcefully stride ahead, apparently on a cloud. One, with his arm on the back of the other, asks, “Speak English?” The other responds, “Of course!” The third man, a small figure (about a third of their size), is frantically running after them on the ground. He holds an open book in his right hand and clutches another closed book under his left arm. Other books are falling from his grasp. The books have words, pictures, or titles that represent learning English. In the background is a church with a steeple, representing a European town or city. The message is clear. English is the sine qua non of happiness and power (“Of course!”). It gives you entrée to a world of colleagues with a similar state of mind and professional stature. Without it, you’re left behind. You’re insignificant. You’re desperately trying to catch up.
With a string of examples, the article goes on to explain that English had become “firmly entrenched nearly everywhere as the international language of business, finance, and technology.” Even more so, it was becoming the “binding agent for Europe.” English had already become “Europe’s language.” It was an “imperative.” Though British and American managers working in Europe would be wise to develop bilingual skills, “new forces,” including the internet, were “pushing Europe toward a common language.” The article warned that while speaking English was bringing Europe together in some ways, it also was dividing the continent into “haves” and “have-nots.” Only 29 percent of Europeans were able to carry on a conversation in English. That was two decades ago. By 2012, the last date of an official European language survey, a majority of EU citizens (56 percent) spoke English as a first or second language. Setting aside the loss of first-language speakers since the United Kingdom’s departure from the European Union, and not considering levels of fluency, the figure on second-language speakers, especially among young people, is presumably higher today, but it is certainly nowhere near universal.
In the intervening years, English has become not just the “language of Europe”; it has become the dominant lingua franca of the world. It is an official language of the United Nations, the World Trade Organization, the International Criminal Court, and NATO.
Noah Millman ponders the tragedy of Aaron Bushnell:
In November 1970, the great Japanese novelist and ultra-nationalist Yukio Mishima made a somewhat farcical attempt at a coup, seizing control of the commandant’s office on a military base with the help of a handful of followers, and demanding the restoration of the emperor to supreme power. When, predictably, these actions had no effect, he ritually disemboweled himself and instructed his followers to cut off his head.
I bring Mishima’s political suicide up to reflect on Aaron Bushnell, who set himself on fire to protest Israel’s war in Gaza and America’s support thereof. I could have begun this post with some other instance of political suicide: Mohammed Bouazizi, the Tunisian street vendor whose self-immolation sparked the Arab Spring—or, even more apropos, Norman Morrison, who set himself on fire in front of the Pentagon (and his one-year-old daughter) to protest the Vietnam War. I chose Mishima because I feel reasonably confident none of my readers will sympathize with the cause for which he killed himself, which, I hope, will make it easier to understand the meaning of suicide as a political gesture. That meaning, it seems to me, is fundamentally rooted in despair.
To be clear: I’m not saying it is rooted in mental illness. I’m sure that some people who kill themselves for political reasons are first and foremost suffering from suicidal ideation, and some of them may be suffering from psychotic delusions. I don’t think most of the people who have killed themselves for political reasons can be so easily dismissed, however, and anyway I am wary of pathologizing extreme emotional states without some justification apart from the emotional states themselves. But I do think despair—the absence of hope—is the right way to describe the emotional state that leads to suicide, and that in the case of political suicide what we’re talking about is political despair, the belief that political change is absolutely necessary, but also nigh impossible.
Thank you, Scott, for your Saturday meanderings.
About ending a sentence with a preposition: There is a story I read somewhere about Winston Churchill during WWII. His speeches had to be reviewed by a censor, to be sure no military secrets were inadvertently revealed. One censor had the temerity to correct his grammar so that a sentence did not end with a preposition. Churchill is reported to have said to Parliament, “This is interference up with which I will not put!” (I don’t know if it’s true, but it’s a good story.)