Danny Duncan Collum, a Sojourners contributing writer, teaches writing at Kentucky State University in Frankfort, Kentucky. He is the author of the novel White Boy.
The Birth of America's Class Divisions
IN AUGUST 2019, The New York Times published a special edition of its magazine, with an accompanying podcast, to note the 400th anniversary of the arrival of the first Africans in the Virginia colony. They called the total work “The 1619 Project.” As a Times blurb for the project put it, “American slavery began 400 years ago this month. This is referred to as the country’s original sin, but it is more than that: It is the country’s true origin.”
Almost a year later, “The 1619 Project” became a school history curriculum, and in the waning days of his presidential administration Donald Trump pushed back with plans for a “1776 Commission” to promote “patriotic education” and counter the claim that “America is a wicked and racist nation.”
It’s not surprising that a nation in which everyone has a right to their own facts may end up with two foundings. However, while those who emphasize the centrality of African enslavement in the American story are certainly closer to the truth, both the champions of 1619 and 1776 are missing something crucial. For all the things it got right, “The 1619 Project” over-simplified the origins of the U.S. slave system. As the eminent African American historian Nell Irvin Painter wrote in The Guardian, “People were not enslaved in Virginia in 1619, they were indentured. The [first] Africans were sold and bought as ‘servants’ for a term of years, and they joined a population consisting largely of European indentured servants, mainly poor people from the British Isles.”
Selling Personal Data Should Be Banned
THE PAST DECADE has seen an endless trickle of negative stories about social media—data breaches, Russian bots, cyberbullying, digital radicalization, etc.—so by now almost everyone knows that the amusement and convenience those platforms offer come with a downside. But now a new Netflix documentary, The Social Dilemma, is here to tell us one big thing: It’s worse than we thought. In fact, it’s worse than we could have possibly imagined.
In the film this alarm is raised by many of the very people who helped create the systems they now decry. We’re talking about the guy who invented the “Like” button for Facebook, the guy who designed the recommendation engine for YouTube, the fellow who invented the infinite scroll. One after another these mostly white, mostly male characters come on camera to tell us how badly their proudest accomplishments have gone awry.
The big problem these folks warn us about is that our smartphones constantly collect data (what we buy, what music we play, where we are, who we talk to, etc., ad nauseam) and that data is used to fuel a system of targeted alerts, notifications, and recommendations designed to keep us on a site for as long as possible and deliver us to advertisers who also have that data about us.
Poorer Than Our Parents?
OK, I ADMIT IT. I haven’t read Thomas Piketty’s 700-page Capital in the Twenty-First Century, the most talked about economics book of recent decades. There are too many novels in the world, and economics is hard. But not to worry: Even for numerophobes like me, documentary filmmaker Justin Pemberton has come to the rescue with a quick and clever 103-minute movie of the same name that lots of people who claim to have read Piketty’s book say is, if not a sufficient replacement, at least an effective companion.
Despite the title, the bulk of the film covers the 18th, 19th, and 20th centuries: A rotating cast of talking heads (including Piketty’s own) narrates the story of wealth in Europe and North America—from the palace of Versailles (“Royals” by Lorde on the soundtrack) to the slave markets of New Orleans and the happy suburbs of mid-20th century America. All this is illustrated by a montage of clips from movies—including Les Misérables (both the old black-and-white version and the musical), Pride and Prejudice, The Grapes of Wrath, and many more—and by Depression-era newsreel footage of striking workers battling police and seizing factories.
Working From Home With No High-Speed Internet
FOR OVER A month now, like everyone else, I have been isolated at home. Here with me are my wife, Polly, and our two youngest sons, both college students. All of us are continuing our work and studies online, and our rural home has become a sort of cyber-monastery. We meet in the morning for daily Mass on YouTube, then peel off to our separate hotspots to toil through the day. We don’t have compline, but we do often reconvene to watch The West Wing.
This isn’t the life any of us would have chosen, but it’s probably good for the soul to surrender some of our precious, almighty power of choice, and it helps that almost everyone is going through this with us. But there’s one aspect of our locked-down life that we can’t attribute to the vagaries of a random virus, and it isn’t shared with most of our fellow citizens. Instead, it’s one of the many rank inequalities the COVID-19 crisis has exposed in American life.
Unlike most of you, we don’t have access to high-speed broadband internet. Two months ago, that might have seemed like a trivial complaint, but not now. And we’re far from alone. The best estimate says that there are about 42 million of us, about 13 percent of the U.S. population, mostly in rural America. Then there are all the people who could have access to broadband, but don’t, mostly because they can’t afford it. All told, about 30 percent of us are out here in the digital cold.
The Catholic Church According to Netflix
ACCORDING TO ARISTOTLE'S Poetics, art is supposed to imitate life. However, Oscar Wilde claimed that life more often imitates art. In the case of the recent Netflix movie The Two Popes and warring camps within the Catholic Church, it may be hard to tell which is which.
The Two Popes—which depicts an imagined relationship between Pope Emeritus Benedict XVI and his successor, Pope Francis—was bound to inflame tensions between those who believe that Francis wants to toss out historic church teachings on marriage and sexuality and those who suspect that anyone with a soft spot for the Latin Mass wants to bring back the Inquisition. Then, within weeks of the movie’s release, we had the spectacle of Benedict appearing as co-author on a book about priestly celibacy that seemed like a timed rebuke to the limited openness to ordaining married men expressed at the Amazon Synod that was called by Francis. Benedict later asked that his name be removed from the book.
Time to Delete Your Church’s Facebook Page?
BY NOW THE sins of Facebook, as a social media platform and megacorporation, are well-known. You’ve got invasions of privacy, data breaches, viral falsehoods, livestreamed rapes and murders, and the list goes on.
Well, a few months ago, the volunteer technology committee at the Catholic parish where my wife, Polly, works as social responsibility minister did something about it. They asked their parish council to consider taking the congregation off Facebook entirely and no longer using the platform as a medium of communication. When Polly told me about this, I was a little surprised. Maybe I missed something, but, amid the sporadic calls to “Delete Facebook” in the wake of the company’s various scandals, I hadn’t heard of a religious community actually implementing a boycott.
Once you think about it, the arguments for boycotting Facebook are pretty obvious. When we lend our eyeballs to that platform, we bring it advertising dollars, helping to fund its corrupt and dangerous practices. And what’s worse, the company’s business model makes every person or organization with a Facebook page a recruiter for the company and turns every posted detail of our lives into a product (consumer data) that the company can sell to commercial and political advertisers. When a congregation encourages parishioners to log onto a church Facebook page and share what they find there with interested friends, the church places its members and friends at risk of having personal information exposed to bad actors.
What’s the Matter with Our White Working Class?
DONALD TRUMP'S VICTORY came mostly from non-college-educated whites in the Appalachian parts of Pennsylvania and Ohio and the deindustrialized Rust Belt regions of Michigan and Wisconsin, including many areas that had voted twice for Barack Obama. As this realization dawned, many affluent, educated, bicoastal liberals began to ask: What’s the matter with our white working class? J.D. Vance, who wrote Hillbilly Elegy: A Memoir of a Family and a Culture in Crisis (2016) and grew up in the Rust Belt, in a family still moored to Appalachian Kentucky, turned out to be just the guy to tell the neoliberal elite what it wanted to hear.
Sure, America’s industrial economy went to hell in the past four decades, he acknowledged. But Vance said his people haven’t pulled out of that slump because of what he called “hillbilly culture”—which, in his telling, seems to consist mostly of drug and alcohol abuse, hair-trigger violence, and a debilitating tendency to blame others (e.g., the government, coal or steel companies, Obama) for one’s problems. This, of course, is in stark contrast to what Vance did with his own impoverished circumstances: He joined the Marines, went to college and law school, and became a Silicon Valley venture capitalist.
Now, Trump is campaigning again, and Vance is back, too, with the pending release of a Hillbilly Elegy movie directed by Ron Howard. In the interim, a steady stream of other books have appeared, offering more systematic reflections on how some in the white working class became angry enough to give us Trump.
White Working Class: Overcoming Class Cluelessness in America, by Joan C. Williams (2017), was one of the first and, given its limitations, best of the books. Williams confesses to her membership in what she calls the Professional-Managerial Elite (PME). But she’s married to a man from working-class origins, a “class migrant,” she calls him, and that’s helped her see the “cluelessness” of her peers. Williams’ message is simple: “When you leave the two-thirds of Americans without college degrees out of your vision of the good life, they notice.”
A Sane Country Would Welcome Them All
IT WAS APRIL 2017, just a couple of months into the Trump era, and our family was at our parish’s Easter vigil—a three-hour-plus Saturday night service that begins with a bonfire and includes the baptism and confirmation of those who’ve spent the last year preparing to enter the church. Our parish has one of the largest Hispanic communities in the area, so our Easter vigils are always bilingual.
By the time we distributed communion, it was around 11 p.m., and as I watched the procession of my Catholic neighbors go by, I was struck by the sight of the brown-skinned men, husbands and fathers in their 20s and 30s, coming down the aisle with sleeping babies cradled tenderly in their arms. They were contradictions to the president’s words: “When Mexico sends its people, they’re not sending their best.”
The recent Netflix documentary series Living Undocumented follows eight families through all nine circles of U.S. immigration hell. The immigrants in the series are from Honduras, Mexico, Colombia, Laos, Mauritania, and Israel. But all of them, even the Laotian guy who picked up a drug felony in his troubled youth, are people any sane country would welcome. And our government is doing everything it can to send them away.
Do We Need Universal Basic Income?
WHEN I TOLD my oldest son I was writing about universal basic income (UBI), he said, “All I know is that the Silicon Valley guys are pushing it, so it must be bad.” And he had a point. UBI has entered U.S. political debate most prominently as Silicon Valley’s favorite solution to a problem mostly of its own creation—massive permanent job loss due to artificial intelligence and robotics.
Under a universal basic income policy, all U.S. citizens would receive from the government a regular, permanent payment of, say, $1,000 per month, regardless of their other income or employment status. It wouldn’t get rid of the grotesque income inequality in the U.S. In fact, it wouldn’t even guarantee each person a decent standard of living. But it would get everyone up to the official poverty level.
Tech industry UBI proponents include Facebook CEO Mark Zuckerberg, Tesla founder Elon Musk, and Amazon kingpin Jeff Bezos. But the idea is most identified with former Silicon Valley entrepreneur Andrew Yang, who made it the defining issue of his long-shot campaign for the Democratic presidential nomination.
Still, UBI is an idea much older and bigger than any of its shadier supporters. While the term “universal basic income” is of fairly recent coinage, the idea that every human deserves some share of the earth’s bounty is an old one. In 1797, one of America’s founding philosophes, Thomas Paine, wrote that “the earth, in its natural uncultivated state was, and ever would have continued to be, the common property of the human race.” But, Paine continued, “the system of landed property ... has absorbed the property of all those whom it dispossessed, without providing, as ought to have been done, an indemnification for that loss.”
Paine proposed a single payment at the attainment of adulthood as compensation for the loss of our natural right to the earth. Paine was echoing the ideas of some of the earliest Christian teachers, including St. Ambrose (340-397 C.E.), who wrote: “God has ordered all things ... so that there should be food in common to all, and that the earth should be the common possession of all. Nature, therefore, has produced a common right for all, but greed has made it a right for a few.”
So universal basic income is not just the latest Silicon Valley fad. It’s rooted in an understanding of the origins of wealth and of our obligations to each other that is consistent with both our democratic and religious traditions.
But that still leaves plenty of room for debate about whether UBI is the right solution for America’s most pressing social and economic woes.
How Would UBI Work?
ECONOMIC DEBATE OVER the past 50 years has offered a variety of UBI-type proposals, from Richard Nixon’s negative income tax to the social wealth dividend proposed by some contemporary democratic socialists. The best-known and most-debated current UBI plan is the one proposed by the Yang campaign. This version of UBI rests on three pillars:
First, it is “universal.” Everyone gets it, without conditions—from Warren Buffett down to the apparently able-bodied guy with the “Please Help” sign at the exit ramp. That, of course, raises the first blizzard of objections. Why give money to rich people who don’t need it or purportedly irresponsible people who might waste it?
Paying for UBI would almost certainly involve new taxes on the wealthy, so Warren Buffett wouldn’t be keeping his $1,000 per month. As to the fear of aiding the “undeserving poor,” it’s true that historically most of the meager social benefits offered in the U.S. are means-tested (for those with the very lowest incomes) and conditional upon some form of good behavior (hours worked, clean drug tests, etc.). This has helped create a culture that stigmatizes public benefits as “welfare” and brands beneficiaries as, if not sinful, at least defective.
Social Media Reaps What It Sows
EVERY U.S. PRESIDENT since Richard Nixon has complained about his news coverage. But the man who lives in the White House now is doing something about it.
In August, Politico reported that the Trump administration is drafting an executive order to counter “liberal” bias in story selection and search results on the platforms Facebook, Twitter, and Google (owner of YouTube). According to this report, both the Federal Communications Commission and the Federal Trade Commission may be tasked with enforcing the neutrality of the digital platforms and the algorithms that prioritize stories and topics.
“Social media bias” must work well as a Republican fundraising pitch, because the administration and its allies in Congress have been harping on it for the past year. In September 2018, Twitter chief Jack Dorsey was hauled before the (then-Republican-controlled) House Energy and Commerce Committee and roasted over arcane and unproven claims of his company’s anti-conservative bias. The next day, Jeff Sessions (then still U.S. attorney general) called a meeting of his state-level counterparts to discuss possible actions against the alleged bias.
Lost Causes and Slivers of Hope
APPROPRIATELY ENOUGH, The Saint of Lost Causes—the new Justin Townes Earle album—has an Orthodox icon of St. Jude on the cover. In Catholic lore, St. Jude is the patron saint of lost causes and desperate cases. But if a person can still turn to a saint for intercession, the cause isn’t entirely lost. The desperate act of prayer implies at least a sliver of hope for grace and mercy, and that’s mostly where the people in Earle’s new batch of songs are: down to their last desperate prayer but still hoping.
At the beginning of the album, in the title track, Earle lays it out, singing: “Now it’s a cruel world / But it ain’t hard to understand / You got your sheep, got your shepherds / Got your wolves amongst men.” Over the course of the next 11 songs, we see the world mostly from the point of view of the sheep. We hear from some fracked-out citizens in “Don’t Drink the Water” who are growing increasingly restless as some oil company hack keeps claiming that their poisoned land and water, and the occasional earthquake, are all an “act of God.” Later, in “Flint City Shake It,” a streetwise Michigander fills us in on how General Motors assassinated his still-resilient hometown. Then there’s the junkie desperado of “Appalachian Nightmare” who hopes God can forgive him at the moment of his death.
World Wide Death
THIS YEAR MARKS the 50th anniversary of the invention of the internet. One day in October 1969, scientists successfully transmitted data from a campus computer at UCLA to a computer at Stanford. Twenty years later, the infrastructure for the World Wide Web went into operation, and the creation of our whole digital universe quickly followed.
Lately, there have been plenty of days that have convinced me that the invention of the internet is one of the worst things that has happened since our first human parents decided that a little bit of “knowledge of good and evil” couldn’t possibly hurt anything.
The Worth of Work
THE AGE OF THE ROBOTS is here. If you didn’t notice, it’s because we’re calling them artificial intelligence (AI) and they don’t look like we expected. They’re the touchscreen kiosk that has replaced the cashier at Panera, the mechanical arms and claws flipping burgers at fast food joints, the drone that may someday deliver your Amazon order. They’re the software that can turn a baseball box score or corporate earnings report into a wire service news story.
According to a recent report from the Brookings Institution, about 38 percent of the adult population could be put out of work by smart machines in the next generation. The choices we are making about our AI future depend upon our answer to the question Wendell Berry posed 30 years ago with his book What Are People For? Up to now, at least in the U.S., the answer has been that people exist to generate corporate profits.
Andrew Yang, a Silicon Valley entrepreneur running for the Democratic presidential nomination, argues that Donald Trump is president because automation eliminated 4 million manufacturing jobs in the Rust Belt states Trump narrowly won. Yang expects that blue collar alienation will multiply soon, when driverless vehicles replace 3.5 million truck drivers.
Learning How to Live Life (on Life’s Terms)
A FEW YEARS AGO, I was sitting in a McDonald’s, getting some work done during my son’s orchestra practice, when I looked up and saw an ambulance parked on the sidewalk outside, a team of EMTs at work on a man at one of the patio tables. I pulled out my earbuds just in time to hear the McDonald’s worker behind the counter say, in an exasperated tone, “I don’t know why they have to come here to shoot their dope.”
The man was being revived from a heroin overdose, but life was going on around him as though his situation was a routine occurrence. That’s because here in Kentucky, it is.
It’s even more routine in Huntington, W.Va., a city known as the overdose capital of America, with a rate twice the national average. In 2017, Huntington was the setting for a prize-winning Netflix documentary, Heroin(e), by West Virginia filmmaker Elaine McMillion Sheldon, which profiled three women—a fire chief, a drug court judge, and a Christian volunteer—who had their fingers in the dike, struggling to hold back the overdose deluge.
There’s No Stopping Populism
A SPECTER IS haunting the neoliberal establishments of Europe and the Americas: populism. And the intelligentsia beholden to those establishments is pitching a hissy fit in response.
You can see it happening via publications such as The Atlantic—with headlines such as “What Populists Do to Democracies” and “How to Be a Populist”—and The Guardian, which has devoted an inordinate amount of its cyberspace to “Team Populism,” a transnational network of academics studying the rise of populist movements and leaders. A search of my university library database shows 1,259 books with the word “populism” in the title published just since 2016. The Guardian even offers a “How Populist Are You?” quiz.
The current populist moment gives the international commentariat a lot to chew on. For starters, there is so much disagreement about what “populism” even means. It’s hard to see how a word regularly applied to Donald Trump and Bernie Sanders can mean much of anything at all. In their work, the Team Populism people try to sort out this left-right mishmash by detaching the phenomenon of populism from its associations with socialism and ethno-nationalism. They consider populism not an ideology for governing but a strategy for attaining and keeping power. According to their June 2018 policy paper: “[Scholars] call something populist if it expresses the belief that politics embodies a struggle between the forces of good, understood as the will of the common people, and the forces of evil, associated with a conspiring elite.”
The New Digital Divide
TWENTY YEARS AGO, when we talked about the “digital divide” we meant things like low-income people’s access to computers and the internet. But according to a recent study from Common Sense Media, that turn-of-the-century gap has largely closed. Seventy percent of families with an annual household income below $30,000 now have a computer at home, and 75 percent have high-speed internet access. In addition, low-income families are near the national average for access to mobile devices such as smartphones and tablets.
But another digital divide is emerging that could have more dangerous long-term consequences.
Researchers have discovered a lot about how brain development and personality formation happen, and their lessons keep coming back to the importance of real-world experiences and face-to-face human interactions, especially in the childhood years. To avoid passivity and mental laziness in their children, many high-income parents are starting to limit their children’s time on digital devices. Common Sense Media found that the children of upper-income families spent half as much time in front of screens as did children of low-income families.
Several private schools are even dialing back their reliance on digital technology. Meanwhile, in many public schools, students are being issued Chromebooks or iPads and shunted into online learning programs. According to Education Week, American schools spend $3 billion per year on digital content, as well as $8 billion-plus yearly on hardware and software, with little to show for it so far in the way of improved learning.
Graceless in ‘Graceland’
ON SEPT. 22, 2018, Paul Simon took to an outdoor stage in his native borough of Queens, N.Y., for the last show of his aptly named “Homeward Bound” farewell tour. After 52 years near the top of the music business, Simon was finally ready to get off the bus for good. Simon’s not taking a vow of musical silence, but he does say that he has no plans for further work.
So, let’s take the man at his word and assume that this could be a good time to assess what the singer-songwriter has meant to his audiences, his country, and, in the end, the great march of human culture.
That may sound a little grandiose for a guy who started as a hack songwriter in the Brill Building pop factory and made his first record (with Art Garfunkel, of course) under the name “Tom and Jerry.” But after the whole “The Sound of Silence” folk-rock thing passed, Simon went on a long, long run in which he often elevated the American pop song to the level of high art. And, from “America,” dropped into the maelstrom of 1968, to the Nixon era’s “American Tune,” to “Wristband” in the age of Trump, he occasionally even captured the spirit of his age in a memorable, hummable verse, chorus, and bridge.
In brief, the guy’s a genius. And, though he started in the era when singer-songwriters were supposed to be the new poets, his real genius turned out to be musical: those infectious tunes and, from the mid-’70s on, those propulsive rhythmic arrangements.
When Zeal Turns Tragic
ON THE MORNING OF June 23, 2014, a 79-year-old retired Methodist minister, Charles Moore, parked his Volkswagen hatchback in a strip mall parking lot in his old hometown of Grand Saline, Texas. For most of the day he stood in the lot, watching the cars go by on U.S. Highway 80. Sometime after 5:30 p.m., Moore set a small foam cushion on the parking lot asphalt, knelt on the cushion, poured gasoline over himself, and set himself on fire.
That act of public suicide provides the starting point for the PBS Independent Lens documentary Man on Fire, available for viewing online starting Dec. 17. The hourlong film is a sustained reflection both on Moore’s life as an especially stubborn perennial dissident and on the life of the town where his journey began and ended.
When Moore died, he left behind a neatly typed testimony tucked under a windshield wiper of his car, which portrayed his suicide as an act of solidarity with the untold numbers of African Americans lynched and brutalized in a town that was still largely unrepentant. On the dashboard of Moore’s car was a copy of his high school yearbook, presumably to prove to Grand Saline authorities that he was in fact one of their own, although he hadn’t really lived there for decades.
On its face, Moore’s suicide sounds like the tragic act of a man mired in depression and possibly even delusions. But the story becomes more complex when you read Michael Hall’s long article, also titled “Man on Fire,” from the December 2014 issue of Texas Monthly.
What Are These 'Facts' You Speak Of?
FOR 20 YEARS, Alex Jones, a radio show host and founder of the Infowars website, has been spreading one off-the-wall conspiracy theory after another, and, for the past decade, social media have amplified his voice and his reach to a level his predecessors on the “paranoid Right” could never have imagined. In early August, Facebook and Google-owned YouTube finally took measures to effectively ban Jones from their platforms. But the way they did it raises more questions than it answers about the possibility of restoring respect for truth to public life in the United States.
Way back in the dying days of the 20th century, Alex Jones started his career ranting about the old conspiracy standbys, such as fluoride in our drinking water. But then 9/11 happened, and Jones took his act to a whole new level, claiming that the attacks on the World Trade Center and the Pentagon were really “inside jobs” unleashed by the secret government to launch a global war and suspend civil liberties.
In days gone by, such a theory would have been passed around on mimeographed fliers, and mainstream journalism, shackled by considerations of fact, wouldn’t have touched it. But the social media era has freed us from all that. Now anybody can say anything, and everybody can hear it. Suddenly Alex Jones had an audience of millions for his Facebook pages, his YouTube channel, and his website; this success seemed to egg him on to ever more outrageous pronouncements. Finally, he hit rock bottom with the claim that the Sandy Hook school shooting was faked (to provide a pretext for seizing Americans’ guns) and all those grieving parents were only acting.
After the Battle
TWO OF THIS year’s most compelling music releases so far give a deeply personal voice to the moral, emotional, and psychological struggles of the men and women who have waged our long-term “war on terror.” Healing Tide by The War and Treaty is the first full-length recording from this husband-and-wife team; the husband, Michael Trotter Jr., became a musician during his time in Iraq and has since struggled with post-traumatic stress disorder. Meanwhile, Mary Gauthier’s Rifles and Rosary Beads features 11 songs co-written with Iraq and Afghanistan vets through the SongwritingWith:Soldiers (SW:S) project.
Trotter, the songwriter and keyboardist in The War and Treaty, enlisted in the Army in 2003 simply to provide health insurance and a steady paycheck for his daughter and first wife. Within six months he was in Iraq, stationed in the remains of one of Saddam Hussein’s palaces. One day, a captain who had taken Trotter under his wing and knew that he could sing took him into a rubble-strewn room that held a piano. Trotter had never played in his life, but in his downtime he taught himself and started writing songs.
Then that captain was killed in action. Trotter wrote a song and performed it at the captain’s memorial. The song was “Dear Martha,” which the band still performs, and it made such an impression on all the soldiers at the memorial that Trotter’s colonel tasked him to write and perform a song for the memorial services of soldiers who died in action.