Danny Duncan Collum, author of the novel White Boy, teaches writing at Kentucky State University in Frankfort.
Posts By This Author
The Worth of Work
THE AGE OF THE ROBOTS is here. If you didn’t notice, it’s because we’re calling them artificial intelligence (AI) and they don’t look like we expected. They’re the touchscreen kiosk that has replaced the cashier at Panera, the mechanical arms and claws flipping burgers at fast food joints, the drone that may someday deliver your Amazon order. They’re the software that can turn a baseball box score or corporate earnings report into a wire service news story.
According to a recent report from the Brookings Institution, about 38 percent of the adult population could be put out of work by smart machines in the next generation. The choices we are making about our AI future depend upon our answer to the question Wendell Berry posed 30 years ago with his book What Are People For? Up to now, at least in the U.S., the answer has been that people exist to generate corporate profits.
Andrew Yang, a Silicon Valley entrepreneur running for the Democratic presidential nomination, argues that Donald Trump is president because automation eliminated 4 million manufacturing jobs in the Rust Belt states Trump narrowly won. Yang expects that blue-collar alienation will multiply soon, when driverless vehicles replace 3.5 million truck drivers.
Learning How to Live Life (on Life’s Terms)
A FEW YEARS AGO, I was sitting in a McDonald’s, getting some work done during my son’s orchestra practice, when I looked up and saw an ambulance parked on the sidewalk outside, a team of EMTs at work on a man at one of the patio tables. I pulled out my earbuds just in time to hear the McDonald’s worker behind the counter say, in an exasperated tone, “I don’t know why they have to come here to shoot their dope.”
The man was being revived from a heroin overdose, but life was going on around him as though his situation were a routine occurrence. That’s because here in Kentucky, it is.
It’s even more routine in Huntington, W.Va., a city known as the overdose capital of America, with a rate twice the national average. In 2017, Huntington was the setting for a prize-winning Netflix documentary, Heroin(e), by West Virginia filmmaker Elaine McMillion Sheldon, which profiled three women—a fire chief, a drug court judge, and a Christian volunteer—who had their fingers in the dike, struggling to hold back the overdose deluge.
There’s No Stopping Populism
A SPECTER IS haunting the neoliberal establishments of Europe and the Americas: populism. And the intelligentsia beholden to those establishments is pitching a hissy fit in response.
You can see it happening via publications such as The Atlantic—with headlines such as “What Populists Do to Democracies” and “How to Be a Populist”—and The Guardian, which has devoted an inordinate amount of its cyberspace to “Team Populism,” a transnational network of academics studying the rise of populist movements and leaders. A search of my university library database shows 1,259 books with the word “populism” in the title published just since 2016. The Guardian even offers a “How Populist Are You?” quiz.
The current populist moment gives the international commentariat a lot to chew on. For starters, there is so much disagreement about what “populism” even means. It’s hard to see how a word regularly applied to Donald Trump and Bernie Sanders can mean much of anything at all. In their work, the Team Populism people try to sort out this left-right mishmash by detaching the phenomenon of populism from its associations with socialism and ethno-nationalism. They consider populism not an ideology for governing but a strategy for attaining and keeping power. According to their June 2018 policy paper: “[Scholars] call something populist if it expresses the belief that politics embodies a struggle between the forces of good, understood as the will of the common people, and the forces of evil, associated with a conspiring elite.”
The New Digital Divide
TWENTY YEARS AGO, when we talked about the “digital divide,” we meant things like low-income people’s access to computers and the internet. But according to a recent study from Common Sense Media, that turn-of-the-century gap has largely closed. Seventy percent of families with an annual household income below $30,000 now have a computer at home, and 75 percent have high-speed internet access. In addition, low-income families are near the national average for access to mobile devices such as smartphones and tablets.
But another digital divide is emerging that could have more dangerous long-term consequences.
Researchers have discovered a lot about how brain development and personality formation happen, and their lessons keep coming back to the importance of real-world experiences and face-to-face human interactions, especially in the childhood years. To avoid passivity and mental laziness, many high-income parents are starting to limit their children’s time on digital devices. Common Sense Media found that the children of upper-income families spent half as much time in front of screens as did children of low-income families.
Several private schools are even dialing back their reliance on digital technology. Meanwhile, in many public schools, students are being issued Chromebooks or iPads and shunted into online learning programs. According to Education Week, American schools spend $3 billion per year on digital content, as well as $8 billion-plus yearly on hardware and software, with little to show for it so far in the way of improved learning.
Graceless in ‘Graceland’
ON SEPT. 22, 2018, Paul Simon took to an outdoor stage in his native borough of Queens, N.Y., for the last show of his aptly named “Homeward Bound” farewell tour. After 52 years near the top of the music business, Simon was finally ready to get off the bus for good. Simon’s not taking a vow of musical silence, but he does say that he has no plans for further work.
So, let’s take the man at his word and assume that this could be a good time to assess what the singer-songwriter has meant to his audiences, his country, and, in the end, the great march of human culture.
That may sound a little grandiose for a guy who started as a hack songwriter in the Brill Building pop factory and made his first record (with Art Garfunkel, of course) under the name “Tom and Jerry.” But after the whole “The Sound of Silence” folk-rock thing passed, Simon went on a long, long run in which he often elevated the American pop song to the level of high art. And, from “America,” dropped into the maelstrom of 1968, to the Nixon era’s “American Tune,” to “Wristband” in the age of Trump, he occasionally even captured the spirit of his age in a memorable, hummable verse, chorus, and bridge.
In brief, the guy’s a genius. And, though he started in the era when singer-songwriters were supposed to be the new poets, his real genius turned out to be musical: those infectious tunes and, from the mid-’70s on, those propulsive rhythmic arrangements.
When Zeal Turns Tragic
ON THE MORNING OF June 23, 2014, a 79-year-old retired Methodist minister, Charles Moore, parked his Volkswagen hatchback in a strip mall parking lot in his old hometown of Grand Saline, Texas. For most of the day he stood in the lot, watching the cars go by on U.S. Highway 80. Sometime after 5:30 p.m., Moore set a small foam cushion on the parking lot asphalt, knelt on the cushion, poured gasoline over himself, and set himself on fire.
That act of public suicide provides the starting point for the PBS Independent Lens documentary Man on Fire, available for viewing online starting Dec. 17. The hourlong film is a sustained reflection both on Moore’s life as an especially stubborn perennial dissident and on the life of the town where his journey began and ended.
When Moore died, he left behind a neatly typed testimony, tucked under a windshield wiper of his car, that portrayed his suicide as an act of solidarity with the untold numbers of African Americans lynched and brutalized in a town that was still largely unrepentant. On the dashboard of Moore’s car was a copy of his high school yearbook, presumably to prove to Grand Saline authorities that he was in fact one of their own, although he hadn’t really lived there for decades.
On its face, Moore’s suicide sounds like the tragic act of a man mired in depression and possibly even delusions. But the story becomes more complex when you read Michael Hall’s long article, also titled “Man on Fire,” from the December 2014 issue of Texas Monthly.
What Are These 'Facts' You Speak Of?
FOR 20 YEARS, Alex Jones, a radio show host and founder of the Infowars website, has been spreading one off-the-wall conspiracy theory after another, and, for the past decade, social media have amplified his voice and his reach to a level his predecessors on the “paranoid Right” could never have imagined. In early August, Facebook and Google-owned YouTube finally took measures to effectively ban Jones from their platforms. But the way they did it raises more questions than it answers about the possibility of restoring respect for truth to public life in the United States.
Way back in the dying days of the 20th century, Alex Jones started his career ranting about the old conspiracy standbys, such as fluoride in our drinking water. But then 9/11 happened, and Jones took his act to a whole new level, claiming that the attacks on the World Trade Center and the Pentagon were really “inside jobs” unleashed by the secret government to launch a global war and suspend civil liberties.
In days gone by, such a theory would have been passed around on mimeographed fliers, and mainstream journalism, shackled by considerations of fact, wouldn’t have touched it. But the social media era has freed us from all that. Now anybody can say anything, and everybody can hear it. Suddenly Alex Jones had an audience of millions for his Facebook pages, his YouTube channel, and his website; this success seemed to egg him on to ever more outrageous pronouncements. Finally, he hit rock bottom with the claim that the Sandy Hook school shooting was faked (to provide a pretext for seizing Americans’ guns) and all those grieving parents were only acting.
After the Battle
TWO OF THIS year’s most compelling music releases so far give a deeply personal voice to the moral, emotional, and psychological struggles of the men and women who have waged our long-term “war on terror.” Healing Tide by The War and Treaty is the first full-length recording from this husband-and-wife team; the husband, Michael Trotter Jr., became a musician during his time in Iraq and has since struggled with post-traumatic stress disorder. Meanwhile, Mary Gauthier’s Rifles and Rosary Beads features 11 songs co-written with Iraq and Afghanistan vets through the SongwritingWith:Soldiers (SW:S) project.
Trotter, the songwriter and keyboardist in The War and Treaty, enlisted in the Army in 2003 simply to provide health insurance and a steady paycheck for his daughter and first wife. Within six months he was in Iraq, stationed in the remains of one of Saddam Hussein’s palaces. One day, a captain who had taken Trotter under his wing and knew that he could sing took him into a rubble-strewn room that held a piano. Trotter had never played in his life, but in his downtime he taught himself and started writing songs.
Then that captain was killed in action. Trotter wrote a song and performed it at the captain’s memorial. The song was “Dear Martha,” which the band still performs, and it made such an impression on all the soldiers at the memorial that Trotter’s colonel tasked him with writing and performing a song for the memorial services of soldiers who died in action.
Individualism Wins
I JUST STUMBLED onto the whole Rob Bell thing in the past few weeks. Before that, I knew the name, and I vaguely associated it with some headlines about the founder and pastor of the Mars Hill Bible Church in Michigan becoming evangelical non grata for writing a book, Love Wins, in which he said some thought-provoking things about the afterlife.
That was it. Then, while researching something else, I watched the new documentary The Heretic (directed by Andrew Morgan, available on Amazon and iTunes). I was astounded. A 40-something guy was on stage, alone, dressed in what seemed like an ill-fitting hipster costume: cropped pants, shoes with no socks, and a weirdly undersized jacket. He held just a wireless mike and talked, to a theater filled with 500 or so paying customers, about Jesus and the Bible and what it all really means. This apparently happens all over the country, and all over the English-speaking world. This was a revelation to me.
Old Crows, New Tricks
TWENTY YEARS ago, Old Crow Medicine Show, the 21st-century old-time string band, began as a gigantic all-or-nothing bet on the viability of American traditions long left for dead. The kind of bet that only foolish young people could make. In 1998, fiddling frontman Ketch Secor, freshly rejected by his high school girlfriend, gathered a band of like-minded pickers and took off from upstate New York on an epic transcontinental busk-a-thon. One guy in the van was Critter Fuqua, Secor’s best friend from their school days in Harrisonburg, Va. The rest were neo-folk enthusiasts from the rural Northeast.
For the next few months, the newly named Old Crow Medicine Show pulled into towns that were barely on the map, stood in front of a centrally located store, and turned loose a blaze of ancient American music, fueled by punk-rock energy and abandon. The people came and cheered, and enough money fell into the banjo case to keep the gas tank full. The latter-day pioneers never went to sleep hungry, and they came back to the East convinced that they were onto something real and life-changing.
Down the Breitbart Rabbit Hole
AS HAS BEEN widely noted, when Donald Trump named Steve Bannon to head his presidential campaign, he brought into the U.S. political mainstream a set of ideas that had, for at least 75 years, been relegated to a disreputable fringe. Bannon bounced through a number of incarnations over the past three decades—naval officer, investment banker, and film producer—before joining the ultraconservative “news” website Breitbart.com, first as a board member, then, after founder Andrew Breitbart’s sudden death in 2012, as executive chair. In that role, he took an outlet that was already at the far-right edge of American politics down the rabbit hole and into the underground world of race-based nationalist theories and the politics of white resentment.
Breitbart founded his site in 2007, and it came to prominence in 2009, when it promoted the deceptively edited hidden-camera videos that led to the demise of the ACORN community organizing network. A little later, Breitbart was the first outlet to post another set of deceptively edited videos, which led to the firing of African-American U.S. Department of Agriculture official Shirley Sherrod. In 2011, Breitbart broke the story of liberal Democratic representative Anthony Weiner’s penchant for obscene self-portraits.
Then Bannon took over in 2012, and the website began to exhibit a new interest in the far-right nationalist movements rising in Europe. This, coupled with a pre-existing obsession with the imagined dangers of illegal immigration, helped make the site, as Bannon later boasted, “the platform for the alt-right.” The term “alt-right,” as we now all know, refers to a loose, mostly online network of white-nationalist activists gathered around the general notion that the “white race” and its European-derived culture are slated for obliteration by the forces of globalism and multiculturalism.
Message Control
AMONG THE THINGS the Trump administration has successfully disrupted is the media hierarchy within the White House press corps. These days the Christian Broadcasting Network gets called on at presidential press conferences and CNN gets ignored.
One of the biggest beneficiaries of this shift has been a chain of local TV stations called the Sinclair Broadcast Group, which currently reaches 38 percent of U.S. households with a blend of local news and right-wing messaging. Sinclair is already a big power on the U.S. media landscape, and it’s about to get a lot bigger. The group owns 173 stations and is poised to take advantage of a Trump administration change in media ownership rules to buy the 42 stations owned by Tribune Media, including outlets in New York and Los Angeles and the Chicago-based WGN America cable channel.
Tech and Consequences
TOWARD THE END of August this year, more than 100 million potential U.S. voters were exposed to a fake story about the presidential election that was disguised as hard news. The story, which claimed that Fox News anchor Megyn Kelly had endorsed Hillary Clinton, began on an ultra-right website called endingthefed.com, but a link to it quickly appeared in the “Trending” box at the top of the Facebook screen. Not only did the fraudulent link slip through Facebook’s legendary screening software, but it stayed there for a full eight hours.
A couple of weeks later, the opposite problem struck when the Facebook robo-censor kicked out posts containing the Pulitzer Prize-winning 1972 photograph of a young naked Vietnamese girl fleeing a U.S. napalm attack. The Facebook Machine didn’t see a gut-wrenching statement about the cruelty of war. It only saw a naked little girl. After an entire day of protests, Facebook finally announced that it would reprogram the software to allow that photo of a naked girl.
Facebook has been cajoled and scolded over the past year by various German officials for the company’s failure to preemptively remove racist material, as German law requires. But Mark Zuckerberg insists Facebook is “a tech company, not a media company.” “We build the tools,” he said. “We do not produce any content.”
The through line in all of these controversies is a persistent question about the role of human decisions versus that of computer algorithms in determining what material appears on Facebook or other digital media intermediaries, including the Google News search engine. Are we just going to see the stories that are generating the most statistically measurable buzz? Or will trained professionals take a hand in guaranteeing that what we see is actually true? The answer has enormous legal consequences for companies such as Facebook. If their human staffs are making choices about the veracity and relative importance of news stories, then digital media platforms may be liable to lawsuits over the content of those stories. But the stakes are even higher for the future of journalism and the functioning of democracy.
News We Could Lose
IN MY YEARS of writing this column, the politics and culture of U.S. public broadcasting have been a topic in regular rotation. During Democratic administrations, I’ve tended to bash both the Public Broadcasting Service and National Public Radio for elitism, timidity, and pro-corporate bias.
But during Republican administrations it’s always seemed necessary to defend the very existence of a nonprofit, public-interest alternative in the vast, depressing, and sometimes dangerous strip mall that is U.S. commercial media.
These days the timidity of U.S. public broadcasting is still in evidence. For instance, NPR has steadfastly refused to join other prestigious media outlets in calling Donald Trump’s patent, deliberate falsehoods by the appropriate four-letter Anglo-Saxon word: “lies.” And as for elitism, take Victoria ... please!
But let’s put all that aside for now. The guard has changed again, and a new president has issued a budget blueprint that would eliminate any federal spending to support public broadcasting. So it’s time again to restate the obvious reasons why public media matter.
Fake Populism at the FCC
WITH EACH PASSING week of his administration, the epic scale of the deception Donald Trump pulled off last November becomes more evident.
In his last TV ad of the presidential campaign, Trump decried “a global power structure that is responsible for the economic decisions that have robbed our working class, stripped our country of its wealth and put that money into the pockets of a handful of large corporations and political entities.” Two weeks earlier, when the AT&T-Time Warner merger was announced, Trump said: “As an example of the power structure I’m fighting, AT&T is buying Time Warner and thus CNN, a deal we will not approve in my administration because it’s too much concentration of power in the hands of too few.” Later he added, “Deals like this destroy democracy.”
Since then, of course, the great champion of the people has given us a Treasury secretary (Steven Mnuchin) who, as a hedge fund manager and banker, made a specialty not only of “robb[ing] our working class,” but foreclosing on their homes to boot. And now the candidate who condemned the AT&T-Time Warner merger as oligarchic and anti-democratic has become a president whose most recent comment on the merger was simply, “I haven’t seen any of the facts, yet.” Worse still, Trump has appointed a Federal Communications Commission chair (Ajit Pai) who has promised to undo the Obama-era net neutrality regulations, and who never met a media merger he didn’t like. For example, Pai, who has worked as a lawyer for Verizon, said he would have approved the Comcast-Time Warner Cable merger that the Obama FCC blocked in 2015.
What Trump Got Right
BY THE TIME you read this, all of the important appointments in the new Trump administration will have been made, and the shape of the disaster that awaits us will be clear. Maybe the new president never did, as New Yorker satirist Andy Borowitz suggested, appoint cartel kingpin Joaquin “El Chapo” Guzman as head of the Drug Enforcement Administration. But with the appointment of fast-food mogul Andrew Puzder as secretary of labor, vulture capitalist Wilbur Ross as secretary of commerce, and Wall Street vampire Steven Mnuchin as secretary of the treasury, Trump certainly spit in the face of the low-income white voters who put him over the top in the industrial Midwest.
Which brings us back to the recurring question: Why did so many blue-collar white people vote for a greedy, self-dealing billionaire in the first place? One answer is that Trump very effectively pushed the buttons of racial resentment (mostly about immigrants and Muslims) that are especially sensitive in less-educated, white areas. There is certainly something to that theory. But it doesn’t account for the fact that, as New York Times polling whiz Nate Cohn has noted, “Clinton suffered her biggest losses in the places where Obama was strongest among white voters.”
I would argue instead that Trump won primarily because he finally named the shadow that has hung, unacknowledged, over American life for at least the past 25 years: globalism. On June 28, 2016, during one of candidate Trump’s rare attempts to stay on message and give a serious public policy statement, he said, “Today, we import nearly $800 billion more in goods than we export. This is not some natural disaster. ... It is the consequence of a leadership class that worships globalism over Americanism.”
A Nobel Prize for the Masses?
ANY REASONABLE person should admit that Bob Dylan’s 54 years as a great American artist deserve some kind of monumental recognition, maybe even a real monument somewhere. But the monumental recognition Dylan received in October from the Nobel Prize committee for literature has generated plenty of argument, much of it among reasonable people. Scottish novelist Irvine Welsh had the best one-liner. “This,” he said, “is an ill-conceived nostalgia award wrenched from the rancid prostates of senile, gibbering hippies.”
But, generational animosities aside, the most cogent complaint about the Dylan Nobel goes like this: “Sure, most of his music is great. But is it literature?”
And of course it’s not. At least not if literature is limited to its dictionary definition as the stuff composed to be read from a page (or, today, a screen). However, in announcing Dylan’s prize, the Nobel committee dodged that whole question. They didn’t call him a “poet.” Instead, they honored his “new poetic expressions within the great American song tradition.”
I’m not sure exactly what the Nobel committee meant by that cryptic utterance, but it hits pretty close to the heart of Dylan’s achievement. At his best Dylan has brought the sensibility, philosophical stance, and rough-hewn sound of what Greil Marcus calls “the old, weird America” into our postmodern era not as archaeological artifact, but as a living tradition.
The voice of the old, weird America, echoing through Dylan’s songs, is the voice of the medicine-show snake oil peddlers and the Appalachian snake-handlers. It’s the voice of the slave, or his recent descendant, for whom the rising waters of the Mississippi were a metaphor for his entire life. It is the dirt farmer driven mad by the wails of his hungry children. The Southern poor white committing racist violence as a pawn in the rich man’s game. It’s the Sunday morning believer and the Saturday night cynic. The oral culture of Dylan’s America was raw, unmediated, life on life’s terms. And that’s the voice we can still hear in the best of his songs.
Strange and Beautiful Psalms
AT THIS POINT, it’s almost a tradition that aging roots music icons find a third, fourth, or fifth act in partnership with some latter-day guru of cool. Think Rick Rubin and Johnny Cash, Jack White and Loretta Lynn, Joe Henry and almost everyone else.
But the latest such pairing is, on the surface at least, the most incongruous yet. Jessi Colter, a soulful country singer most famous for being the widow of Waylon Jennings, has made an album (The Psalms) with Lenny Kaye, the rock historian, producer, and guitarist most famous for his lifetime membership in the Patti Smith Group.
Unlike all those other musical odd couplings, this one is not cross-generational. Colter is only three years older than Kaye, but it was always a long way from CBGB to the Grand Ole Opry. Yet here they are, collaborating on an album of Bible verses set to music, no less. But when you look a little below the surface, this pairing makes all the sense in the world.
The origins of this album go all the way back to 1995, when Kaye, who has always kept up his career as a music journalist, was in Nashville helping Waylon Jennings write his autobiography. One morning, he walked into the living room and beheld Colter at the piano, her Bible open before her, laying down chords and improvising melodies as she sang from the King James Version of the Psalms. It was, Kaye has written, “one of the most beautiful expressions of belief I had ever witnessed.”
Fake News and Real Lies
FORMER FOX NEWS chair Roger Ailes is the single individual most responsible for the toxically divisive and fact-challenged nature of America’s current political culture. So it would be nice to think that Ailes’ disgraced departure from the cable news channel he created might mark the end of an era. Nice, but probably delusional. For one thing, at this writing, day-to-day control of Fox News remains in the hands of Ailes acolytes, and Ailes himself may be back in the political consulting game as Donald Trump’s debate coach. The Ailes era has been a very long one, and the changes he helped make are now deeply embedded in the way we do politics, and even the way many people live their daily lives.
The scope and magnitude of Ailes’ accomplishments are truly staggering. Forty-eight years ago he helped Richard Nixon become president by devising a media strategy that allowed the candidate to almost entirely avoid dealing with actual journalists. Instead, Ailes staged a series of “town hall” meetings that were designed to look like open forums, with the candidate answering questions from “real people.” But the audiences were carefully selected, the questions were scripted, and the sessions were edited for national broadcast as paid advertisements.
This strategy of disguising propaganda as “real” events became a keystone of Ailes’ career. In the 1970s, he ran a short-lived operation called Television News Inc. (TVN), funded by right-wing brewing tycoon Joseph Coors. TVN aimed to supply local TV news programs with professional, prepackaged “news” stories, reported by real journalists, that were actually thinly veiled right-wing messages. This turned out to be a world-changing idea whose time had not yet come. The TVN motto, by the way, was “Fair and Balanced.”
Picked Clean to the Bone
IN THE 2015 speech announcing his candidacy for president, Donald Trump declared, “The American dream is dead.” The people of Lancaster, Ohio, a small town at the edge of Appalachia, heard him loud and clear and later gave him 60 percent of their votes. Glass House: The 1% Economy and the Shattering of the All-American Town, by Lancaster native Brian Alexander, shows in fine-grained detail how the American dream of opportunity and fairness died in Lancaster and in similar towns all across the middle of the country.
Lancaster should have been the last place you would look for evidence of American decline. In 1947, a Forbes magazine cover story depicted it as “the All-American town.” It had a thriving manufacturing economy, a burgeoning middle class, and enlightened civic leadership. For reasons of history and geography, Lancaster also had a reputation as “the whitest town in America,” but that didn’t bother Forbes too much back then.
The Lancaster of Alexander’s childhood and youth sounds a lot like Bedford Falls in the movie It’s a Wonderful Life, but as the 20th century wore on, the town turned into Pottersville. When Alexander went back to write this book, he found that the glass factory where his father had worked had been demolished. Most people had to drive an hour or more to Columbus for a job, civic life was deteriorating, and opioid addiction was rampant.
The main foundation of Lancaster’s All-American past was Anchor Hocking, a Fortune 500 glass manufacturer. According to Alexander, the industrialists who built Anchor Hocking in the early 20th century were real George Bailey types. Sure, they wanted to make a buck, but they were suckers for fuzzy-headed notions about the common good that led them to subsidize various public amenities for the town and cooperate with the unions that delivered a family wage to generations of Lancastrians. In those days, we learn, executives and managers might live on the same block with machine operators and share beers at the same local tavern.