Danny Duncan Collum, a Sojourners contributing writer, teaches writing at Kentucky State University in Frankfort, Kentucky. He is the author of the novel White Boy.
Do We Need Universal Basic Income?
WHEN I TOLD my oldest son I was writing about universal basic income (UBI), he said, “All I know is that the Silicon Valley guys are pushing it, so it must be bad.” And he had a point. UBI has entered U.S. political debate most prominently as Silicon Valley’s favorite solution to a problem mostly of its own creation—massive permanent job loss due to artificial intelligence and robotics.
Under a universal basic income policy, all U.S. citizens would receive from the government a regular, permanent payment of, say, $1,000 per month, regardless of their other income or employment status. It wouldn’t get rid of the grotesque income inequality in the U.S. In fact, it wouldn’t even guarantee each person a decent standard of living. But it would get everyone up to the official poverty level.
Tech industry UBI proponents include Facebook CEO Mark Zuckerberg, Tesla founder Elon Musk, and Amazon kingpin Jeff Bezos. But the idea is most identified with former Silicon Valley entrepreneur Andrew Yang, who made it the defining issue of his long-shot campaign for the Democratic presidential nomination.
Still, UBI is an idea much older and bigger than any of its shadier supporters. While the term “universal basic income” is of fairly recent coinage, the idea that every human deserves some share of the earth’s bounty is an old one. In 1797, one of America’s founding philosophes, Thomas Paine, wrote that “the earth, in its natural uncultivated state was, and ever would have continued to be, the common property of the human race.” But, Paine continued, “the system of landed property ... has absorbed the property of all those whom it dispossessed, without providing, as ought to have been done, an indemnification for that loss.”
Paine proposed a single payment at the attainment of adulthood as compensation for the loss of our natural right to the earth. Paine was echoing the ideas of some of the earliest Christian teachers, including St. Ambrose (340-397 C.E.), who wrote: “God has ordered all things ... so that there should be food in common to all, and that the earth should be the common possession of all. Nature, therefore, has produced a common right for all, but greed has made it a right for a few.”
So universal basic income is not just the latest Silicon Valley fad. It’s rooted in an understanding of the origins of wealth and of our obligations to each other that is consistent with both our democratic and religious traditions.
But that still leaves plenty of room for debate about whether UBI is the right solution for America’s most pressing social and economic woes.
How Would UBI Work?
ECONOMIC DEBATE OVER the past 50 years has offered a variety of UBI-type proposals, from Richard Nixon’s negative income tax to the social wealth dividend proposed by some contemporary democratic socialists. The best-known and most-debated current UBI plan is the one proposed by the Yang campaign. This version of UBI rests on three pillars:
First, it is “universal.” Everyone gets it, without conditions—from Warren Buffett down to the apparently able-bodied guy with the “Please Help” sign at the exit ramp. That, of course, raises the first blizzard of objections. Why give money to rich people who don’t need it or purportedly irresponsible people who might waste it?
Paying for UBI would almost certainly involve new taxes on the wealthy, so Warren Buffett wouldn’t be keeping his $1,000 per month. As to the fear of aiding the “undeserving poor,” it’s true that historically most of the meager social benefits offered in the U.S. are means-tested (for those with the very lowest incomes) and conditional upon some form of good behavior (hours worked, clean drug tests, etc.). This has helped create a culture that stigmatizes public benefits as “welfare” and brands beneficiaries as, if not sinful, at least defective.
Social Media Reaps What It Sows
EVERY U.S. PRESIDENT since Richard Nixon has complained about his news coverage. But the man who lives in the White House now is doing something about it.
In August, Politico reported that the Trump administration is drafting an executive order to counter “liberal” bias in story selection and search results on the platforms Facebook, Twitter, and Google (owner of YouTube). According to this report, both the Federal Communications Commission and the Federal Trade Commission may be tasked with enforcing the neutrality of the digital platforms and the algorithms that prioritize stories and topics.
“Social media bias” must work well as a Republican fundraising pitch, because the administration and its allies in Congress have been harping on it for the past year. In September 2018, Twitter chief Jack Dorsey was hauled before the (then-Republican-controlled) House Energy and Commerce Committee and roasted over arcane and unproven claims of his company’s anti-conservative bias. The next day, Jeff Sessions (then still U.S. attorney general) called a meeting of his state-level counterparts to discuss possible actions against the alleged bias.
Lost Causes and Slivers of Hope
APPROPRIATELY ENOUGH, The Saint of Lost Causes—the new Justin Townes Earle album—has an Orthodox icon of St. Jude on the cover. In Catholic lore, St. Jude is the patron saint of lost causes and desperate cases. But if a person can still turn to a saint for intercession, the cause isn’t entirely lost. The desperate act of prayer implies at least a sliver of hope for grace and mercy, and that’s mostly where the people in Earle’s new batch of songs are: down to their last desperate prayer but still hoping.
At the beginning of the album, in the title track, Earle lays it out, singing: “Now it’s a cruel world / But it ain’t hard to understand / You got your sheep, got your shepherds / Got your wolves amongst men.” Over the course of the next 11 songs, we see the world mostly from the point of view of the sheep. We hear from some fracked-out citizens in “Don’t Drink the Water” who are growing increasingly restless as some oil company hack keeps claiming that their poisoned land and water, and the occasional earthquake, are all an “act of God.” Later, in “Flint City Shake It,” a streetwise Michigander fills us in on how General Motors assassinated his still-resilient hometown. Then there’s the junkie desperado of “Appalachian Nightmare” who hopes God can forgive him at the moment of his death.
World Wide Death
THIS YEAR MARKS the 50th anniversary of the invention of the internet. One day in October 1969, scientists successfully transmitted data from a campus computer at UCLA to a computer at Stanford. Twenty years later, the infrastructure for the World Wide Web went into operation, and the creation of our whole digital universe quickly followed.
Lately, there have been plenty of days that have convinced me that the invention of the internet is one of the worst things that has happened since our first human parents decided that a little bit of “knowledge of good and evil” couldn’t possibly hurt anything.
The Worth of Work
THE AGE OF THE ROBOTS is here. If you didn’t notice, it’s because we’re calling them artificial intelligence (AI) and they don’t look like we expected. They’re the touchscreen kiosk that has replaced the cashier at Panera, the mechanical arms and claws flipping burgers at fast food joints, the drone that may someday deliver your Amazon order. They’re the software that can turn a baseball box score or corporate earnings report into a wire service news story.
According to a recent report from the Brookings Institution, about 38 percent of the adult population could be put out of work by smart machines in the next generation. The choices we are making about our AI future depend upon our answer to the question Wendell Berry posed 30 years ago with his book What Are People For? Up to now, at least in the U.S., the answer has been that people exist to generate corporate profits.
Andrew Yang, a Silicon Valley entrepreneur running for the Democratic presidential nomination, argues that Donald Trump is president because automation eliminated 4 million manufacturing jobs in the Rust Belt states Trump narrowly won. Yang expects that blue collar alienation will multiply soon, when driverless vehicles replace 3.5 million truck drivers.
Learning How to Live Life (on Life’s Terms)
A FEW YEARS AGO, I was sitting in a McDonald’s, getting some work done during my son’s orchestra practice, when I looked up and saw an ambulance parked on the sidewalk outside, a team of EMTs at work on a man at one of the patio tables. I pulled out my earbuds just in time to hear the McDonald’s worker behind the counter say, in an exasperated tone, “I don’t know why they have to come here to shoot their dope.”
The man was being revived from a heroin overdose, but life was going on around him as though his situation was a routine occurrence. That’s because here in Kentucky, it is.
It’s even more routine in Huntington, W.Va., a city known as the overdose capital of America, with a rate twice the national average. In 2017, Huntington was the setting for a prize-winning Netflix documentary, Heroin(e), by West Virginia filmmaker Elaine McMillion Sheldon, which profiled three women—a fire chief, a drug court judge, and a Christian volunteer—who had their fingers in the dike, struggling to hold back the overdose deluge.
There’s No Stopping Populism
A SPECTER IS haunting the neoliberal establishments of Europe and the Americas: populism. And the intelligentsia beholden to those establishments is pitching a hissy fit in response.
You can see it happening via publications such as The Atlantic—with headlines such as “What Populists Do to Democracies” and “How to Be a Populist”—and The Guardian, which has devoted an inordinate amount of its cyberspace to “Team Populism,” a transnational network of academics studying the rise of populist movements and leaders. A search of my university library database shows 1,259 books with the word “populism” in the title published just since 2016. The Guardian even offers a “How Populist Are You?” quiz.
The current populist moment gives the international commentariat a lot to chew on. For starters, there is so much disagreement about what “populism” even means. It’s hard to see how a word regularly applied to Donald Trump and Bernie Sanders can mean much of anything at all. In their work, the Team Populism people try to sort out this left-right mishmash by detaching the phenomenon of populism from its associations with socialism and ethno-nationalism. They consider populism not an ideology for governing but a strategy for attaining and keeping power. According to their June 2018 policy paper: “[Scholars] call something populist if it expresses the belief that politics embodies a struggle between the forces of good, understood as the will of the common people, and the forces of evil, associated with a conspiring elite.”
The New Digital Divide
TWENTY YEARS AGO, when we talked about the “digital divide” we meant things like low-income people’s access to computers and the internet. But according to a recent study from Common Sense Media, that turn-of-the-century gap has largely closed. Seventy percent of families with an annual household income below $30,000 now have a computer at home, and 75 percent have high-speed internet access. In addition, low-income families are near the national average for access to mobile devices such as smart phones and tablets.
But another digital divide is emerging that could have more dangerous long-term consequences.
Researchers have discovered a lot about how brain development and personality formation happen, and their lessons keep coming back to the importance of real-world experiences and face-to-face human interactions, especially in the childhood years. To avoid passivity and mental laziness in their children, many high-income parents are starting to limit their children’s time on digital devices. Common Sense Media found that the children of upper-income families spent half as much time in front of screens as did children of low-income families.
Several private schools are even dialing back their reliance on digital technology. Meanwhile, in many public schools, students are being issued Chromebooks or iPads and shunted into online learning programs. According to Education Week, American schools spend $3 billion per year on digital content, as well as $8 billion-plus yearly on hardware and software, with little to show for it so far in the way of improved learning.
Graceless in ‘Graceland’
ON SEPT. 22, 2018, Paul Simon took to an outdoor stage in his native borough of Queens, N.Y., for the last show of his aptly named “Homeward Bound” farewell tour. After 52 years near the top of the music business, Simon was finally ready to get off the bus for good. Simon’s not taking a vow of musical silence, but he does say that he has no plans for further work.
So, let’s take the man at his word and assume that this could be a good time to assess what the singer-songwriter has meant to his audiences, his country, and, in the end, the great march of human culture.
That may sound a little grandiose for a guy who started as a hack songwriter in the Brill Building pop factory and made his first record (with Art Garfunkel, of course) under the name “Tom and Jerry.” But after the whole “The Sound of Silence” folk-rock thing passed, Simon went on a long, long run in which he often elevated the American pop song to the level of high art. And, from “America,” dropped into the maelstrom of 1968, to the Nixon era’s “American Tune,” to “Wristband” in the age of Trump, he occasionally even captured the spirit of his age in a memorable, hummable verse, chorus, and bridge.
In brief, the guy’s a genius. And, though he started in the era when singer-songwriters were supposed to be the new poets, his real genius turned out to be musical: those infectious tunes and, from the mid-’70s on, those propulsive rhythmic arrangements.
When Zeal Turns Tragic
ON THE MORNING OF June 23, 2014, a 79-year-old retired Methodist minister, Charles Moore, parked his Volkswagen hatchback in a strip mall parking lot in his old hometown of Grand Saline, Texas. For most of the day he stood in the lot, watching the cars go by on U.S. Highway 80. Sometime after 5:30 p.m., Moore set a small foam cushion on the parking lot asphalt, knelt on the cushion, poured gasoline over himself, and set himself on fire.
That act of public suicide provides the starting point for the PBS Independent Lens documentary Man on Fire, available for viewing online starting Dec. 17. The hourlong film is a sustained reflection both on Moore’s life as an especially stubborn perennial dissident and on the life of the town where his journey began and ended.
When Moore died, he left behind a neatly typed testimony tucked under a windshield wiper of his car, which portrayed his suicide as an act of solidarity with the untold numbers of African Americans lynched and brutalized in a town that was still largely unrepentant. On the dashboard of Moore’s car was a copy of his high school yearbook, presumably to prove to Grand Saline authorities that he was in fact one of their own, although he hadn’t really lived there for decades.
On its face, Moore’s suicide sounds like the tragic act of a man mired in depression and possibly even delusions. But the story becomes more complex when you read Michael Hall’s long article, also titled “Man on Fire,” from the December 2014 issue of Texas Monthly.
What Are These 'Facts' You Speak Of?
FOR 20 YEARS, Alex Jones, a radio show host and founder of the Infowars website, has been spreading one off-the-wall conspiracy theory after another, and, for the past decade, social media have amplified his voice and his reach to a level his predecessors on the “paranoid Right” could never have imagined. In early August, Facebook and Google-owned YouTube finally took measures to effectively ban Jones from their platforms. But the way they did it raises more questions than it answers about the possibility of restoring respect for truth to public life in the United States.
Way back in the dying days of the 20th century, Alex Jones started his career ranting about the old conspiracy standbys, such as fluoride in our drinking water. But then 9/11 happened, and Jones took his act to a whole new level, claiming that the attacks on the World Trade Center and the Pentagon were really “inside jobs” unleashed by the secret government to launch a global war and suspend civil liberties.
In days gone by, such a theory would have been passed around on mimeographed fliers, and mainstream journalism, shackled by considerations of fact, wouldn’t have touched it. But the social media era has freed us from all that. Now anybody can say anything, and everybody can hear it. Suddenly Alex Jones had an audience of millions for his Facebook pages, his YouTube channel, and his website; this success seemed to egg him on to ever more outrageous pronouncements. Finally, he hit rock bottom with the claim that the Sandy Hook school shooting was faked (to provide a pretext for seizing Americans’ guns) and all those grieving parents were only acting.
After the Battle
TWO OF THIS year’s most compelling music releases so far give a deeply personal voice to the moral, emotional, and psychological struggles of the men and women who have waged our long-term “war on terror.” Healing Tide by The War and Treaty is the first full-length recording from this husband-and-wife team; the husband, Michael Trotter Jr., became a musician during his time in Iraq and has since struggled with post-traumatic stress disorder. Meanwhile, Mary Gauthier’s Rifles and Rosary Beads features 11 songs co-written with Iraq and Afghanistan vets through the SongwritingWith:Soldiers (SW:S) project.
Trotter, the songwriter and keyboardist in The War and Treaty, enlisted in the Army in 2003 simply to provide health insurance and a steady paycheck for his daughter and first wife. Within six months he was in Iraq, stationed in the remains of one of Saddam Hussein’s palaces. One day, a captain who had taken Trotter under his wing and knew that he could sing took him into a rubble-strewn room that held a piano. Trotter had never played in his life, but in his downtime he taught himself and started writing songs.
Then that captain was killed in action. Trotter wrote a song and performed it at the captain’s memorial. The song was “Dear Martha,” which the band still performs, and it made such an impression on all the soldiers at the memorial that Trotter’s colonel tasked him to write and perform a song for the memorial services of soldiers who died in action.
I JUST STUMBLED onto the whole Rob Bell thing in the past few weeks. Before that, I knew the name, and I vaguely associated it with some headlines about the founder and pastor of the Mars Hill Bible Church in Michigan becoming evangelical non grata for writing a book, Love Wins, in which he said some thought-provoking things about the afterlife.
That was it. Then, while researching something else, I watched the new documentary The Heretic (directed by Andrew Morgan, available on Amazon and iTunes). I was astounded. A 40-something guy was on stage, alone, dressed in what seemed like an ill-fitting hipster costume: cropped pants, shoes with no socks, and a weirdly undersized jacket. He held just a wireless mike and talked, to a theater filled with 500 or so paying customers, about Jesus and the Bible and what it all really means. This apparently happens all over the country, and all over the English-speaking world. This was a revelation to me.
Old Crows, New Tricks
TWENTY YEARS AGO, Old Crow Medicine Show, the 21st century old-time string band, began as a gigantic all-or-nothing bet on the viability of American traditions long left for dead. The kind of bet that only foolish young people could make. In 1998, fiddling frontman Ketch Secor, freshly rejected by his high school girlfriend, gathered a band of like-minded pickers and took off from upstate New York on an epic transcontinental busk-a-thon. One guy in the van was Critter Fuqua, Secor’s best friend from their school days in Harrisonburg, Va. The rest were neo-folk enthusiasts from the rural Northeast.
For the next few months, the newly named Old Crow Medicine Show pulled into towns that were barely on the map, stood in front of a centrally located store, and turned loose a blaze of ancient American music, fueled by punk-rock energy and abandon. The people came and cheered, and enough money fell into the banjo case to keep the gas tank full. The latter-day pioneers never went to sleep hungry, and they came back to the East convinced that they were onto something real and life-changing.
Fake News and Real Lies
FORMER FOX NEWS chair Roger Ailes is the single individual most responsible for the toxically divisive and fact-challenged nature of America’s current political culture. So it would be nice to think that Ailes’ disgraced departure from the cable news channel he created might mark the end of an era. Nice, but probably delusional. For one thing, at this writing, day-to-day control of Fox News remains in the hands of Ailes acolytes, and Ailes himself may be back in the political consulting game as Donald Trump’s debate coach. The Ailes era has been a very long one, and the changes he helped make are now deeply embedded in the way we do politics, and even the way many people live their daily lives.
The scope and magnitude of Ailes’ accomplishments are truly staggering. Forty-eight years ago he helped Richard Nixon become president by devising a media strategy that allowed the candidate to almost entirely avoid dealing with actual journalists. Instead, Ailes staged a series of “town hall” meetings that were designed to look like open forums, with the candidate answering questions from “real people.” But the audiences were carefully selected, the questions were scripted, and the sessions were edited for national broadcast as paid advertisements.
This strategy of disguising propaganda as “real” events became a keystone of Ailes’ career. In the 1970s, he ran a short-lived operation called Television News Inc. (TVN), funded by right-wing brewing tycoon Joseph Coors. TVN aimed to supply local TV news programs with professional, prepackaged “news” stories, reported by real journalists, that were actually thinly veiled right-wing messages. This turned out to be a world-changing idea whose time had not yet come. The TVN motto, by the way, was “Fair and Balanced.”
Tech and Consequences
TOWARD THE END of August this year, more than 100 million potential U.S. voters were exposed to a fake story about the presidential election that was disguised as hard news. The story, which claimed that Fox News anchor Megyn Kelly had endorsed Hillary Clinton, began on an ultra-Right website called endingthefed.com, but a link to it quickly appeared in the “Trending” box at the top of the Facebook screen. Not only did the fraudulent link slip through Facebook’s legendary screening software, but it stayed there for a full eight hours.
A couple of weeks later, the opposite problem struck when the Facebook robo-censor kicked out posts containing the Pulitzer Prize-winning 1972 photograph of a young naked Vietnamese girl fleeing a U.S. napalm attack. The Facebook Machine didn’t see a gut-wrenching statement about the cruelty of war. It only saw a naked little girl. After an entire day of protests, Facebook finally announced that it would reprogram the software to allow that photo of a naked girl.
Facebook has been cajoled and scolded over the past year by various German officials about the company’s failure to preemptively remove racist material, as German law requires. But Facebook CEO Mark Zuckerberg insists Facebook is “a tech company, not a media company.” We build “the tools,” he said, “we do not produce any content.”
The through line in all of these controversies is a persistent question about the role of human decisions versus that of computer algorithms in determining what material appears on Facebook or other digital media intermediaries, including the Google News search engine. Are we just going to see the stories that are generating the most statistically measurable buzz? Or will trained professionals take a hand in guaranteeing that what we see is actually true? The answer has enormous legal consequences for companies such as Facebook. If their human staffs are making choices about the veracity and relative importance of news stories, then digital media platforms may be liable to lawsuits over the content of those stories. But the stakes are even higher for the future of journalism and the functioning of democracy.
A Nobel Prize for the Masses?
ANY REASONABLE person should admit that Bob Dylan’s 54 years as a great American artist deserve some kind of monumental recognition, maybe even a real monument somewhere. But the monumental recognition Dylan received in October from the Nobel Prize committee for literature has generated plenty of argument, much of it among reasonable people. Scottish novelist Irvine Welsh had the best one-liner. “This,” he said, “is an ill-conceived nostalgia award wrenched from the rancid prostates of senile, gibbering hippies.”
But, generational animosities aside, the most cogent complaint about the Dylan Nobel goes like this: “Sure, most of his music is great. But is it literature?”
And of course it’s not. At least not if literature is limited to its dictionary definition as the stuff composed to be read from a page (or, today, a screen). However, in announcing Dylan’s prize, the Nobel committee dodged that whole question. They didn’t call him a “poet.” Instead, they honored his “new poetic expressions within the great American song tradition.”
I’m not sure exactly what the Nobel committee meant by that cryptic utterance, but it hits pretty close to the heart of Dylan’s achievement. At his best Dylan has brought the sensibility, philosophical stance, and rough-hewn sound of what Greil Marcus calls “the old, weird America” into our postmodern era not as archaeological artifact, but as a living tradition.
The voice of the old, weird America, echoing through Dylan’s songs, is the voice of the medicine-show snake oil peddlers and the Appalachian snake-handlers. It’s the voice of the slave, or his recent descendant, for whom the rising waters of the Mississippi were a metaphor for his entire life. It is the dirt farmer driven mad by the wails of his hungry children. The Southern poor white committing racist violence as a pawn in the rich man’s game. It’s the Sunday morning believer and the Saturday night cynic. The oral culture of Dylan’s America was raw, unmediated, life on life’s terms. And that’s the voice we can still hear in the best of his songs.
Down the Breitbart Rabbit Hole
AS HAS BEEN widely noted, when Donald Trump named Steve Bannon to head his presidential campaign, he brought into the U.S. political mainstream a set of ideas that have, for at least 75 years, been relegated to a disreputable fringe. Bannon has bounced through a number of incarnations in the past three decades—naval officer, investment banker, and film producer—before joining the ultraconservative “news” website Breitbart.com, first as a board member, then, after founder Andrew Breitbart’s sudden death in 2012, as executive chair. In that role, he took an outlet that was already at the far right edge of American politics down the rabbit hole and into the underground world of race-based nationalist theories and the politics of white resentment.
Breitbart founded his site in 2007, and it came to prominence in 2009 when the site promoted the deceptively edited hidden-camera videos that led to the demise of the ACORN community organizing network. A little later, Breitbart was the first outlet to post the again deceptively edited videos that led to the firing of African-American U.S. Department of Agriculture official Shirley Sherrod. In 2011, Breitbart broke the story of liberal Democratic representative Anthony Weiner’s penchant for obscene self-portraits.
Then Bannon took over in 2012, and the website began to exhibit a new interest in the far right nationalist movements rising in Europe. This, coupled with a pre-existing obsession with the imagined dangers of illegal immigration, helped make the site, as Bannon later boasted, “the platform for the alt-right.” The term “alt-right,” as we now all know, refers to a loose, mostly online network of white activists gathered around the general notion that the “white race” and its European-derived culture is slated for obliteration by the forces of globalism and multiculturalism.
What Trump Got Right
BY THE TIME you read this, all of the important appointments in the new Trump administration will have been made, and the shape of the disaster that awaits us will be clear. Maybe the new president never did, as New Yorker satirist Andy Borowitz suggested, appoint cartel kingpin Joaquin “El Chapo” Guzman as head of the Drug Enforcement Administration. But with the appointment of fast-food mogul Andrew Puzder as secretary of labor, vulture capitalist Wilbur Ross as secretary of commerce, and Wall Street vampire Steven Mnuchin as secretary of the treasury, Trump certainly spit in the face of the low-income white voters who put him over the top in the industrial Midwest.
Which brings us back to the recurring question: Why did so many blue-collar white people vote for a greedy, self-dealing billionaire in the first place? One answer is that Trump very effectively pushed the buttons of racial resentment (mostly about immigrants and Muslims) that are especially sensitive in less-educated, white areas. There is certainly something to that theory. But it doesn’t account for the fact that, as New York Times polling whiz Nate Cohn has noted, “Clinton suffered her biggest losses in the places where Obama was strongest among white voters.”
I would argue instead that Trump won primarily because he finally named the shadow that has hung, unacknowledged, over American life for at least the past 25 years: globalism. On June 28, 2016, during one of candidate Trump’s rare attempts to stay on message and give a serious public policy statement, he said, “Today, we import nearly $800 billion more in goods than we export. This is not some natural disaster. ... It is the consequence of a leadership class that worships globalism over Americanism.”
The Niebuhr We Need
NEARLY 46 YEARS after his death, Protestant theologian Reinhold Niebuhr is never very far from the public eye. He’s already immortal as the originator of the world-famous Serenity Prayer (“God grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.”). And just last fall, an article in Harper’s took up the eternal question: “Where is our Reinhold Niebuhr?” President Obama once called him his favorite philosopher, and Niebuhr is regularly “proof-texted” by polemicists across the political spectrum, especially on questions of war and peace.
In April, PBS will air a documentary, An American Conscience: The Reinhold Niebuhr Story, directed by Martin Doblmeier. It will give an even broader public the chance to reflect on Niebuhr’s significance, in the company of such notables as Cornel West, Stanley Hauerwas, President Jimmy Carter, and New York Times columnist David Brooks.
I was eager to see the film. I’ve always felt that Reinhold Niebuhr was somewhere in my family tree. As a student at a Baptist-related college in the 1970s, I got heavy doses of his book Moral Man and Immoral Society. Later, I had the opportunity to interview Myles Horton, founder of the mother church of Southern radicalism, the Highlander Center, and learned that Horton had studied with Niebuhr at Union Theological Seminary. In fact, Niebuhr helped fund Highlander. And about a dozen years after that, I gave one of our sons the middle name Myles, in honor of Horton. So how many degrees of separation is that?