The new Netflix documentary The Social Dilemma doesn’t really dwell on the dilemma part. It makes only passing acknowledgement that the internet and social media have been and can be rich with human connection, pro-democratic action and vast seas of information. Mostly it’s pretty dire, even overreaching, prompting one critic to liken it to Reefer Madness. Between very effective interviews with dissident and repentant tech engineers and executives, the film builds a weaker scripted drama of a family being poisoned by algorithms, which we see as devious personas manipulating our protagonists’ psychological vulnerabilities in real time. It’s kind of cringey at times, but I do see the filmmakers trying hard to make the issue accessible and get beyond just talking heads and tedious tech b-roll. So I’d say watch it, and if you have kids, definitely watch it with them. It might introduce you and them to the concepts of “attention extraction” and algorithm-fueled radicalization. In its clunky way, it hits hard on two key social harms of the AI world: the mental health of young people, and political disinformation and polarization. But don’t stop there. Please watch PBS’s Frontline special Generation Like and its more recent episode on artificial intelligence. And please dig into the extensive prophetic literature on the subject, including books by Tim Wu, Roger McNamee, Jaron Lanier, Jonathan Taplin, Astra Taylor and Cathy O’Neil, some of whom speak in The Social Dilemma.
Some quasi-ironically refer to the big five tech companies Facebook, Amazon, Apple, Netflix and Google with the fearsome acronym FAANG (or FANG, before Apple joined the list), and as I sit here using or referencing or recommending services provided by all five behemoths, I can’t tell you they’re evil monstrosities. Every one of those companies has brought value and capability to my life that I couldn’t have imagined 20 years ago. They’re here to stay, and it’s up to us as citizens to use them wisely and demand social accountability from them. But I feel Facebook needs to be held up for particular scrutiny and reform. As evidenced by this new journal and newsletter, I’ve withdrawn from Facebook as a forum for my personal writing and reaching out to friends and readers. In just two weeks, I feel my activity on phone and email with loved ones on the rise and my focus on curating important news improving. I wanted one less social network in my life, and I have profound concerns with the ethics and socio-political insights of Facebook’s monolithic boss Mark Zuckerberg. His commentary in town halls with employees, in blog posts and in testimony before Congress just doesn’t suggest a mind nimble and broad enough, or historically informed enough, to reform the behemoth he’s made. He’s an assembly line of platitudes, but he won’t unplug the algorithms that are amplifying disinformation and white nationalist and Q-Anon groups.
The chorus of voices calling out Facebook’s apathy and self-protectiveness is building to a crescendo. I’ve pulled together some of the recent landmarks. In July, engineer Max Wang resigned from the company with a searing commentary. “We are failing,” he said, to correct the polarizing and radicalizing aspects of the product. “And what's worse, we have enshrined that failure in our policies.” His verdict was that Facebook is “hurting people at scale.” Just last week, BuzzFeed reported on the parting memo left by fired data analyst Sophie Zhang, which focused on the platform’s impotence and apathy as governments in numerous countries used Facebook to foment violence and election corruption. A week prior, The Washington Post reported on the parting warning of software engineer Ashok Chandwaney:
“I’m quitting because I can no longer stomach contributing to an organization that is profiting off hate in the US and globally,” Chandwaney wrote in a letter posted on Facebook’s internal employee network shortly after 8 a.m. Pacific time. The nearly 1,300-word document was detailed, bristling with links to bolster its claims and scathing in its conclusions.
Vanity Fair documented that despite wailing from conservatives that Facebook is biased against them and their ideas, the content that generates shares and virality is almost uniformly from right-wing sources. Axios reports here on how Facebook’s new policies on combating climate change disinformation don’t really grasp the nature of the problem.
This enlightened cascade of concern has spawned the organization Stop Hate for Profit, a partnership of civil rights organizations, which launched with a June/July boycott of Facebook advertising. NPR reported:
For Rashad Robinson, this moment was a long time coming. "Facebook has given [advertisers] no other option because of their failure, time and time again, to address the very real and the very visible problems on their platform," Robinson, president of the civil rights group Color of Change, told NPR.
More than 1,200 businesses, including some high-profile brands, suspended Facebook advertising for a month. It’s a drop in the company’s revenue ocean, but it took a stand. A new post from the group on Friday sounded the alarm again: “We are quickly approaching one of the most consequential elections in American history. Facebook’s unchecked and vague ‘changes’ are falling dangerously short of what is necessary to protect our democracy.”
These critiques and The Social Dilemma don’t even mention one of the other massive societal costs of Facebook: the chokehold it and Google have on most of the advertising revenue in our economy. Newspapers and news organizations relied on consumer advertising to fulfill their public service mission for more than a century, until these network super-servers interposed themselves between advertisers and content creators. The chilling effect of that is incalculable, an existential threat to the American free press. No understanding of Facebook and Google is complete without knowing that. They are mega advertising agencies at the end of the day, with the most sophisticated tools for mind control ever conceived by humans at their disposal, all willingly submitted to by hundreds of millions of people.
These companies are so large and powerful that meaningful, radical regulation doesn’t seem possible in the near term. Congress lags far behind the curve on these complex issues, and most members fear crossing these multi-billion-dollar American employers. But not every problem has a federal solution. Public pressure and education can effect change and, more importantly, spread the ethos of the media literacy that’s so critical to every individual trying to navigate this unprecedented intervention in the information ecosystem. Basic civics and digital citizenship need to be taught and shared and advocated for. Algorithms that amplify misinformation for profit should definitely be regulated, but I believe an army of media activists can get the companies to make those changes before Washington can catch up. Possibly The Social Dilemma, for all its weak spots, represents the point at which this red-pill-style awareness of the problems jumps from the info-elite to the mainstream.
Finally, here’s a really weird but perhaps cathartic bit of art/activism using deepfake technology to invent a chastened Mark Zuckerberg, holding himself to account. It feels like fantasy because it is.