Danny Duncan Collum, a Sojourners contributing writer, teaches writing at Kentucky State University in Frankfort, Kentucky. He is the author of the novel White Boy.
Posts By This Author
News We Could Lose
IN MY YEARS of writing this column, the politics and culture of U.S. public broadcasting have been a topic in regular rotation. During Democratic administrations, I’ve tended to bash both the Public Broadcasting Service and National Public Radio for elitism, timidity, and pro-corporate bias.
But during Republican administrations it’s always seemed necessary to defend the very existence of a nonprofit, public-interest alternative in the vast, depressing, and sometimes dangerous strip mall that is U.S. commercial media.
These days the timidity of U.S. public broadcasting is still in evidence. For instance, NPR has steadfastly refused to join other prestigious media outlets in calling Donald Trump’s patent, deliberate falsehoods by the appropriate four-letter Anglo-Saxon word: “lies.” And as for elitism, take Victoria ... please!
But let’s put all that aside for now. The guard has changed again, and a new president has issued a budget blueprint that would eliminate any federal spending to support public broadcasting. So it’s time again to restate the obvious reasons why public media matter.
Picked Clean to the Bone
IN THE 2015 speech announcing his candidacy for president, Donald Trump declared, “The American dream is dead.” The people of Lancaster, Ohio, a small town at the edge of Appalachia, heard him loud and clear and later gave him 60 percent of their votes. Glass House: The 1% Economy and the Shattering of the All-American Town, by Lancaster native Brian Alexander, shows in fine-grained detail how the American dream of opportunity and fairness died in Lancaster and in similar towns all across the middle of the country.
Lancaster should have been the last place you would look for evidence of American decline. In 1947, a Forbes magazine cover story depicted it as “the All-American town.” It had a thriving manufacturing economy, a burgeoning middle class, and enlightened civic leadership. For reasons of history and geography, Lancaster also had a reputation as “the whitest town in America,” but that didn’t bother Forbes too much back then.
The Lancaster of Alexander’s childhood and youth sounds a lot like Bedford Falls in the movie It’s a Wonderful Life, but as the 20th century wore on, the town turned into Pottersville. When Alexander went back to write this book, he found that the glass factory where his father had worked was demolished. Most people had to drive an hour or more to Columbus for a job, civic life was deteriorating, and opioid addiction was rampant.
The main foundation of Lancaster’s All-American past was Anchor Hocking, a Fortune 500 glass manufacturer. According to Alexander, the industrialists who built Anchor Hocking in the early 20th century were real George Bailey types. Sure, they wanted to make a buck, but they were suckers for fuzzy-headed notions about the common good that led them to subsidize various public amenities for the town and cooperate with the unions that delivered a family wage to generations of Lancastrians. In those days, we learn, executives and managers might live on the same block with machine operators and share beers at the same local tavern.
Fake Populism at the FCC
WITH EACH PASSING week of his administration, the epic scale of the deception Donald Trump pulled off last November becomes more evident.
In his last TV ad of the presidential campaign, Trump decried “a global power structure that is responsible for the economic decisions that have robbed our working class, stripped our country of its wealth and put that money into the pockets of a handful of large corporations and political entities.” Two weeks earlier, when the AT&T-Time Warner merger was announced, Trump said: “As an example of the power structure I’m fighting, AT&T is buying Time Warner and thus CNN, a deal we will not approve in my administration because it’s too much concentration of power in the hands of too few.” Later he added, “Deals like this destroy democracy.”
Since then, of course, the great champion of the people has given us a Treasury secretary (Steven Mnuchin) who, as a hedge fund manager and banker, made a specialty not only of “robb[ing] our working class” but of foreclosing on their homes to boot. And now the candidate who condemned the AT&T-Time Warner merger as oligarchic and anti-democratic has become a president whose most recent comment on the merger was simply, “I haven’t seen any of the facts, yet.” Worse still, Trump has appointed a Federal Communications Commission chair (Ajit Pai) who has promised to undo the Obama-era net neutrality regulations, and who never met a media merger he didn’t like. For example, Pai, who has worked as a lawyer for Verizon, said he would have approved the Comcast-Time Warner Cable merger that the Obama FCC blocked in 2015.
The Niebuhr We Need
NEARLY 46 YEARS after his death, Protestant theologian Reinhold Niebuhr is never very far from the public eye. He’s already immortal as the originator of the world-famous Serenity Prayer (“God grant me the serenity to accept the things I cannot change, the courage to change the things I can, and the wisdom to know the difference.”). And just last fall, an article in Harper’s took up the eternal question: “Where is our Reinhold Niebuhr?” President Obama once called him his favorite philosopher, and Niebuhr is regularly “proof-texted” by polemicists across the political spectrum, especially on questions of war and peace.
In April, PBS will air a documentary, An American Conscience: The Reinhold Niebuhr Story, directed by Martin Doblmeier. It will give an even broader public the chance to reflect on Niebuhr’s significance, in the company of such notables as Cornel West, Stanley Hauerwas, President Jimmy Carter, and New York Times columnist David Brooks.
I was eager to see the film. I’ve always felt that Reinhold Niebuhr was somewhere in my family tree. As a student at a Baptist-related college in the 1970s, I got heavy doses of his book Moral Man and Immoral Society. Later, I had the opportunity to interview Myles Horton, founder of the mother church of Southern radicalism, the Highlander Center, and learned that Horton had studied with Niebuhr at Union Theological Seminary. In fact, Niebuhr helped fund Highlander. And about a dozen years after that, I gave one of our sons the middle name Myles, in honor of Horton. So how many degrees of separation is that?
What Trump Got Right
BY THE TIME you read this, all of the important appointments in the new Trump administration will have been made, and the shape of the disaster that awaits us will be clear. Maybe the new president never did, as New Yorker satirist Andy Borowitz suggested, appoint cartel kingpin Joaquin “El Chapo” Guzman as head of the Drug Enforcement Administration. But with the appointment of fast-food mogul Andrew Puzder as secretary of labor, vulture capitalist Wilbur Ross as secretary of commerce, and Wall Street vampire Steven Mnuchin as secretary of the treasury, Trump certainly spit in the face of the low-income white voters who put him over the top in the industrial Midwest.
Which brings us back to the recurring question: Why did so many blue-collar white people vote for a greedy, self-dealing billionaire in the first place? One answer is that Trump very effectively pushed the buttons of racial resentment (mostly about immigrants and Muslims) that are especially sensitive in less-educated, white areas. There is certainly something to that theory. But it doesn’t account for the fact that, as New York Times polling whiz Nate Cohn has noted, “Clinton suffered her biggest losses in the places where Obama was strongest among white voters.”
I would argue instead that Trump won primarily because he finally named the shadow that has hung, unacknowledged, over American life for at least the past 25 years: globalism. On June 28, 2016, during one of candidate Trump’s rare attempts to stay on message and give a serious public policy statement, he said, “Today, we import nearly $800 billion more in goods than we export. This is not some natural disaster. ... It is the consequence of a leadership class that worships globalism over Americanism.”
Down the Breitbart Rabbit Hole
AS HAS BEEN widely noted, when Donald Trump named Steve Bannon to head his presidential campaign, he brought into the U.S. political mainstream a set of ideas that have, for at least 75 years, been relegated to a disreputable fringe. Over the previous three decades, Bannon had bounced through a number of incarnations—naval officer, investment banker, and film producer—before joining the ultraconservative “news” website Breitbart.com, first as a board member, then, after founder Andrew Breitbart’s sudden death in 2012, as executive chair. In that role, he took an outlet that was already at the far-right edge of American politics down the rabbit hole and into the underground world of race-based nationalist theories and the politics of white resentment.
Breitbart founded his site in 2007, and it came to prominence in 2009 when it promoted the deceptively edited hidden-camera videos that led to the demise of the ACORN community organizing network. A little later, Breitbart was the first outlet to post the similarly deceptive videos that led to the firing of African-American U.S. Department of Agriculture official Shirley Sherrod. In 2011, Breitbart broke the story of liberal Democratic representative Anthony Weiner’s penchant for obscene self-portraits.
Then Bannon took over in 2012, and the website began to exhibit a new interest in the far-right nationalist movements rising in Europe. This, coupled with a pre-existing obsession with the imagined dangers of illegal immigration, helped make the site, as Bannon later boasted, “the platform for the alt-right.” The term “alt-right,” as we now all know, refers to a loose, mostly online network of white activists gathered around the general notion that the “white race” and its European-derived culture are slated for obliteration by the forces of globalism and multiculturalism.
A Nobel Prize for the Masses?
ANY REASONABLE person should admit that Bob Dylan’s 54 years as a great American artist deserve some kind of monumental recognition, maybe even a real monument somewhere. But the monumental recognition Dylan received in October, the Nobel Prize in Literature, has generated plenty of argument, much of it among reasonable people. Scottish novelist Irvine Welsh had the best one-liner. “This,” he said, “is an ill-conceived nostalgia award wrenched from the rancid prostates of senile, gibbering hippies.”
But, generational animosities aside, the most cogent complaint about the Dylan Nobel goes like this: “Sure, most of his music is great. But is it literature?”
And of course it’s not. At least not if literature is limited to its dictionary definition as the stuff composed to be read from a page (or, today, a screen). However, in announcing Dylan’s prize, the Nobel committee dodged that whole question. They didn’t call him a “poet.” Instead, they honored his “new poetic expressions within the great American song tradition.”
I’m not sure exactly what the Nobel committee meant by that cryptic utterance, but it hits pretty close to the heart of Dylan’s achievement. At his best Dylan has brought the sensibility, philosophical stance, and rough-hewn sound of what Greil Marcus calls “the old, weird America” into our postmodern era not as archaeological artifact, but as a living tradition.
The voice of the old, weird America, echoing through Dylan’s songs, is the voice of the medicine-show snake-oil peddlers and the Appalachian snake-handlers. It’s the voice of the slave, or his recent descendant, for whom the rising waters of the Mississippi were a metaphor for his entire life. It’s the dirt farmer driven mad by the wails of his hungry children, the Southern poor white committing racist violence as a pawn in the rich man’s game, the Sunday morning believer and the Saturday night cynic. The oral culture of Dylan’s America was raw and unmediated, life on life’s terms. And that’s the voice we can still hear in the best of his songs.
Tech and Consequences
TOWARD THE END of August this year, more than 100 million potential U.S. voters were exposed to a fake story about the presidential election that was disguised as hard news. The story, which claimed that Fox News anchor Megyn Kelly had endorsed Hillary Clinton, began on an ultra-right website called endingthefed.com, but a link to it quickly appeared in the “Trending” box at the top of the Facebook screen. Not only did the fraudulent link slip through Facebook’s legendary screening software, but it stayed there for a full eight hours.
A couple of weeks later, the opposite problem struck when the Facebook robo-censor kicked out posts containing the Pulitzer Prize-winning 1972 photograph of a young naked Vietnamese girl fleeing a U.S. napalm attack. The Facebook Machine didn’t see a gut-wrenching statement about the cruelty of war. It only saw a naked little girl. After an entire day of protests, Facebook finally announced that it would reprogram the software to allow that photo of a naked girl.
Facebook has been cajoled and scolded over the past year by various German officials about the company’s failure to preemptively remove racist material, as German law requires. But Facebook founder Mark Zuckerberg insists Facebook is “a tech company, not a media company.” “We build the tools,” he said. “We do not produce any content.”
The through line in all of these controversies is a persistent question about the role of human decisions versus that of computer algorithms in determining what material appears on Facebook or other digital media intermediaries, including the Google News search engine. Are we just going to see the stories that are generating the most statistically measurable buzz? Or will trained professionals take a hand in guaranteeing that what we see is actually true? The answer has enormous legal consequences for companies such as Facebook. If their human staffs are making choices about the veracity and relative importance of news stories, then digital media platforms may be liable to lawsuits over the content of those stories. But the stakes are even higher for the future of journalism and the functioning of democracy.
Fake News and Real Lies
FORMER FOX NEWS chair Roger Ailes is the single individual most responsible for the toxically divisive and fact-challenged nature of America’s current political culture. So it would be nice to think that Ailes’ disgraced departure from the cable news channel he created might mark the end of an era. Nice, but probably delusional. For one thing, at this writing, day-to-day control of Fox News remains in the hands of Ailes acolytes, and Ailes himself may be back in the political consulting game as Donald Trump’s debate coach. The Ailes era has been a very long one, and the changes he helped make are now deeply embedded in the way we do politics, and even the way many people live their daily lives.
The scope and magnitude of Ailes’ accomplishments are truly staggering. Forty-eight years ago he helped Richard Nixon become president by devising a media strategy that allowed the candidate to almost entirely avoid dealing with actual journalists. Instead, Ailes staged a series of “town hall” meetings that were designed to look like open forums, with the candidate answering questions from “real people.” But the audiences were carefully selected, the questions were scripted, and the sessions were edited for national broadcast as paid advertisements.
This strategy of disguising propaganda as “real” events became a keystone of Ailes’ career. In the 1970s, he ran a short-lived operation called Television News Inc. (TVN), funded by right-wing brewing tycoon Joseph Coors. TVN aimed to supply local TV news programs with professional, prepackaged “news” stories, reported by real journalists, that were actually thinly veiled right-wing messages. This turned out to be a world-changing idea whose time had not yet come. The TVN motto, by the way, was “Fair and Balanced.”
When Journalism Jumped the Shark
IN THE FIRST half of 2016, O.J. Simpson, who still resides in a Nevada prison for his bungled robbery of sports memorabilia, seemed to be everywhere. First, there was the FX series “The People v. O.J. Simpson: American Crime Story,” a high-quality, behind-the-scenes dramatization of Simpson’s 1995 murder trial. Then came “O.J.: Made in America,” a seven-and-a-half-hour epic ESPN documentary in which director Ezra Edelman finally gives the Simpson story its due as a landmark event in the history of U.S. attitudes toward race, celebrity, and domestic violence, and in the evolution of our mass media culture.
Among other things, the O.J. Simpson murder trial marked the end of an era in which professional journalists observed events, then summarized and framed them into a coherent narrative for public consumption. This legacy of the print age persisted well into the broadcast era. Until the late 20th century, live, real-time TV coverage was limited to things like sporting events, inaugurations, moonshots, and national disasters. Otherwise, the world was presented to TV viewers in one neat, 30-minute daily package at 6 p.m.
Like everything else in American culture, this started to change as cable replaced over-the-air broadcasting and specialty channels proliferated. In 1979, C-SPAN started running gavel-to-gavel coverage of the generally somnolent proceedings of the U.S. Congress. But CNN came along the next year to turn events such as a toddler falling down a well in Texas into national melodrama. True, CNN also went wall-to-wall on the Iran-Contra investigations and the first Iraq war, but in the months between legitimate big events it also whipped up essentially local stories, such as child disappearances or shark attacks, into manufactured national crises.