14.11.18

CONFRONTING INCOHERENCE?

The Chronicle of Higher Education interviews Harvard historian Jill Lepore.  "The Academy Is Largely Itself Responsible for Its Own Peril."

By all means, go, read, and understand the article in full.  I will quote liberally from it, since who knows how soon the House Organ for Business as Usual in Higher Education will put it behind the paywall.

Her new book, These Truths, sounds like something to put on the Christmas list.
The American Revolution, Lepore shows, was also an epistemological revolution. The country was built on truths that are self-evident and empirical, not sacred and God-given. "Let facts be submitted to a candid world," Thomas Jefferson wrote in the Declaration of Independence. Now, it seems, our faith in facts has been shaken. These Truths traces how we got here.
That alone might be worth the price of admission, but read on.
This kind of book written by a single author for a general readership is an unusual effort. It hasn’t been done often, though it used to be a routine capstone endeavor of a certain sort of notable historian. It became untenable when the historical profession became bigger and broader. In the 1960s, women and people of color got Ph.D.s and revolutionized the study of the past. They incorporated those whose experiences and especially whose politics had been left out. You can think of that, and I certainly do, as an incredible explosion of historical research that was profoundly important and urgently necessary. But you can also think of it as shattering an older story of America.
She goes on to note that summaries of events adapted to a younger audience (junior high and high school history books, if you will) might thus become unwieldy, or subject to political maneuvering by school boards and state departments of public instruction, and that the distinguished senior professors of history gave way to "journalists" as authors (often potted authors at that).

Why the academic historians couldn't work through that "incredible explosion" in the peer-reviewed journals, let the leading scholars curate the discussion, and still write the textbooks is a question for another day.  It's certainly matter for a different post, if I take it up at all, as the conversation takes a most instructive turn.
I call the book These Truths to invoke those truths in the Declaration of Independence that Jefferson describes, with the revision provided by Franklin, as "self-evident" — political equality, natural rights, and the sovereignty of the people. But I’m also talking about an unstated fourth truth, which is inquiry itself. Anyone who has spent time with the founding documents and the political and intellectual history in which they were written understands that the United States was founded quite explicitly as a political experiment, an experiment in the science of politics. It was always going to be subject to scrutiny. That scrutiny is done not from above by some commission, but by the citizenry itself.
That scrutiny is always done by humans themselves. Governments that perform poorly? Pitchforks and torches.  Universities that perform poorly?  People can find work-arounds for them too.

Professor Lepore doesn't quite say "bet on emergence" and yet it's in there.
There’s an incredibly rich scholarship on the history of evidence, which traces its rise in the Middle Ages in the world of law, its migration into historical writing, and then finally into the realm that we’re most familiar with, journalism. That’s a centuries-long migration of an idea that begins in a very particular time and place, basically the rise of trial by jury starting in 1215. We have a much better vantage on the tenuousness of our own grasp of facts when we understand where facts come from.

The larger epistemological shift is how the elemental unit of knowledge has changed. Facts have been devalued for a long time. The rise of the fact was centuries ago. Facts were replaced by numbers in the 18th and 19th centuries as the higher-status unit of knowledge. That’s the moment at which the United States is founded as a demographic democracy. Now what’s considered to be most prestigious is data. The bigger the data, the better.
That doesn't sound completely correct; perhaps I should hold judgment until I've read the book. On the other hand, if you say systematic evidence is stronger than anecdotal evidence, perhaps you can understand how that status hierarchy emerged.  An aside: political discourse (journalism, if you will) is still more about the compelling stories, the stuff immediately seen, with the stuff that goes unseen left unmentioned.

Might one of the greatest accomplishments of human inquiry be the recognition that the consequences beyond the immediate matter?

That's not the direction the interview takes.
That transformation, from facts to numbers to data, traces something else: the shifting prestige placed on different ways of knowing. Facts come from the realm of the humanities, numbers represent the social sciences, and data the natural sciences. When people talk about the decline of the humanities, they are actually talking about the rise and fall of the fact, as well as other factors. When people try to re-establish the prestige of the humanities with the digital humanities and large data sets, that is no longer the humanities. What humanists do comes from a different epistemological scale of a unit of knowledge.
That "different ways of knowing" is a tell.  It comes from a discourse practice that at best might be healthy skepticism and at worst puts implicit scare quotes around any mention of truth in speech or writing.  Stop denying the possibility of coherent beliefs, humanities types, and your lives might get better.  Not to mention that third-year graduate students in statistics and economics have probably forgotten more about inference than most literary types ever learned.

The good news is the direction the conversation next takes.
The academy is largely itself responsible for its own peril. The retreat of humanists from public life has had enormous consequences for the prestige of humanistic ways of knowing and understanding the world.

Universities have also been complicit in letting sources of federal government funding set the intellectual agenda. The size and growth of majors follows the size of budgets, and unsurprisingly so. After World War II, the demands of the national security state greatly influenced the exciting fields of study. Federal-government funding is still crucial, but now there’s a lot of corporate money. Whole realms of knowing are being brought to the university through commerce.
That train probably left the station when President Lincoln signed the Morrill Act establishing the land-grant colleges.  There has long been a tension between practical and foundational learning, and foundational knowledge, as Professor Lepore implicitly notes, is too important for the congregation to entrust solely to the clerisy.  I'm less sympathetic to the "humanists."  Once they started dressing up simple ideas with word-noise, or denying self-evident truths, they marginalized themselves.

That goes for economists, too, although they dress up their simple ideas with topology.

That noted, is the professor calling out the Fatal Conceit?  "I don’t expect the university to be a pure place, but there are questions that need to be asked. If we have a public culture that suffers for lack of ability to comprehend other human beings, we shouldn’t be surprised. The resources of institutions of higher learning have gone to teaching students how to engineer problems rather than speak to people."  Yup, that applies doubly to the deanlets and deanlings of student affairs.

The interview concludes with intriguing perspectives on innovation and disruption.  In politics, we have the Constitution, and metaphors about the Senate as a cooling saucer, for a reason.
Innovation as an idea in America is historically a negative thing. Innovation in politics is what is to be condemned: To experiment recklessly with a political arrangement is fatal to our domestic tranquillity. So there’s a lot of anti-innovation language around the founding, especially because Republicanism — Jeffersonianism — is considered excessively innovative. Innovation doesn’t assume its modern sense until the 1930s, and then only in a specialized literature.
That's probably heresy to the surviving fans of Franklin Roosevelt and the Brain Trust, whether in History, at the Kennedy School, or among whatever applied economists still study public finance.

In commerce, well, if you understand competition as discovery, you'll grasp instantly why disruption for its own sake is dumb.
Disruption has a totally different history. It’s a way to avoid the word "progress," which, even when it’s secularized, still implies some kind of moral progress. Disruption emerges in the 1990s as progress without any obligation to notions of goodness. And so "disruptive innovation," which became the buzzword of change in every realm in the first years of the 21st century, including higher education, is basically destroying things because we can and because there can be money made doing so. Before the 1990s, something that was disruptive was like the kid in the class throwing chalk. And that’s what disruptive innovation turned out to really mean. A little less disruptive innovation is called for.
In economics, "disruptive innovation" takes on a more subtle meaning: think of something like a diesel locomotive on a coal-fueled railroad, or a smaller hard drive that lacks the capacity to serve a mainframe.  There's not always money to be made in being disruptive for its own sake; I really should elaborate on why that is, and I'll probably rely on Professor Lepore's writing in The New Yorker on that score.  I also shouldn't rest until people become more careful about labeling changes, particularly changes that they like, as "progress."

That's enough for today.  Go read the interview.
