Writing in a recent New York Times Sunday review column, Neal Gabler takes note of The Atlantic’s list of the “14 Biggest Ideas of the Year”:
Take a deep breath. The ideas include “The Players Own the Game” (No. 12), “Wall Street: Same as it Ever Was” (No. 6), “Nothing Stays Secret” (No. 2), and the very biggest idea of the year, “The Rise of the Middle Class—Just Not Ours,” which refers to growing economies in Brazil, Russia, India and China.
Now exhale. It may strike you that none of these ideas seem particularly breathtaking. In fact, none of them are ideas. They are more on the order of observations.
Gabler goes on to observe that we no longer seem to live in a time when big ideas “ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world.” Compared to the time when thinkers like Albert Einstein, Betty Friedan and Carl Sagan held the public’s attention, our age is impoverished.
But why are we living in this “post-idea” world? Part of Gabler’s explanation is a broader cultural trend:
It is no secret, especially here in America, that we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock—to have gone backward intellectually from advanced modes of thinking into old modes of belief.
On this, we could not agree more. At The Undercurrent we’ve long been critics of the increasing popularity of Evangelical Christianity, the looming threat of political Islam, and the bland indifference to both by allegedly secular critics. Science and reason are under assault, whether by right-wing religionists who would arrest the advance of stem cell research, or by left-wing multiculturalists, feminists, and environmentalists who see science as a form of Western patriarchal imperialism.
But why has America abandoned the bold Enlightenment pursuit of big ideas and instead “turned back the epochal clock” to an increasing reliance upon superstition and faith? Somewhat surprisingly, Gabler blames the Information Age: he claims that we are inundated with so much information from websites and social networks that we cannot process it into something meaningful. “Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show.”
But does technology really mold our minds? To be sure, it is a tool that extends the reach of our hands and of our senses. As a tool, it can be used poorly or used well. People who watch a lot of television can become illiterate couch potatoes. But they can also become media critics. Twitter can be used to share meaningless gossip about celebrities, or it can be used to foment political revolutions. Even media critics and revolutions are not guaranteed to deliver anything true or meaningful. If we sometimes don’t like what they deliver, isn’t it obvious that we shouldn’t blame the tools—but the choices made by the tool users?
And if America has entered a post-Enlightenment period that distrusts rationality in favor of superstition, should we blame technology? Or should we blame the philosophers who, many decades before the embrace of superstition Gabler notes, cast doubt on the efficacy of human reason, thereby creating a vacuum into which faith and superstition would enter? One way these philosophers undercut the mind, curiously, was by arguing that it is a helpless pawn of material forces—not unlike Gabler’s own form of technological determinism.
Consider an alternate big idea: the manmade world we see around us is the product of the choices of individual human minds. If we don’t like the world, we should rethink the choices that produced it.
Creative Commons-licensed picture from Flickr user Rosaura Ochoa.