Big Tech and Populism: The Regulatory Breaking Point?

Is government on the brink of regulating your Facebook feed? As outlandish as that idea seems, it gathered steam at this year’s SXSW festival, the annual gathering of tech and media innovators and influencers in Austin, Texas. Indeed, at a Vox Media party, people shouted, “break them up!” during a panel discussion about antitrust policy in tech.

It has been three years since the term “fake news” became a household phrase and one year since the Cambridge Analytica scandal revealed the extent to which media misinformation and the systematic harvesting, parceling and dissemination of tech users’ personal data are intertwined. Both of these trends spurred calls for big tech (generally conceived as being led by Facebook, Google and Amazon, but usually including companies like Twitter) to self-regulate or be regulated. These are complex issues, as the post-Cambridge Analytica hearings demonstrated with embarrassing force when senators struggled to understand the basic workings of Facebook. But we do know — and have known for a long time — how corrosive these trends can be.

The polarization of media consumption is driving a dangerous brand of populism. Americans have less trust in institutions than ever before. Gallup research tells us only 10 percent of citizens have a great deal of trust in government and only 45 percent trust mass media (up from 32 percent in 2016). More importantly, Americans don’t trust each other for one simple reason: they are increasingly limited to opinions or outlooks that are similar to their own. The trend is omnipresent, as pointed out during Clyde Group’s SXSW panel on populism and tech. “The term echo chamber exists for a reason,” said Daniel Lippman, co-author of Politico Playbook and a reporter for Politico. “More and more Americans are listening and talking to those with similar outlooks.”

Does media bear responsibility for its role in the rise of undemocratic populism? Samantha Dravis, a conservative public affairs expert, places most of the responsibility on individual users to fully research issues and account for different views. But she also points out the extent to which artificial intelligence will make this harder. “As AI becomes more enhanced, it will likely continue to optimize a user’s ability to block out alternative views. In some cases, this isn’t necessarily a bad thing. If I’m a Kansas City Chiefs fan, I don’t necessarily want to see ads for the New York Jets,” she pointed out during the panel.

With 2020 looming, both how America consumes its news and what that news contains matter more than ever. It’s why Robby Mook, the manager of Hillary Clinton’s 2016 presidential campaign and a fellow at the Harvard Kennedy School, is concerned about the profits derived from clicks and ratings: they create a market incentive for news outlets to cater to partisans instead of trying to appeal to the general electorate. In some cases, “news” outlets are making lots of money on exaggerations or outright lies. We may need to rethink what we label as “news” versus “entertainment,” just as Cheez Whiz isn’t allowed to be called “cheese.” Mook pointed out that as long as news coverage is profit-driven rather than grounded in editorial independence, the impartiality of the information is at risk.

All of this affirms the chorus heard throughout the streets of Austin: media and tech need to bring us solutions, not leave us guessing at their motivations. Indeed, just hours before SXSW kicked off, Elizabeth Warren demonstrated how seriously many are taking these issues with her call to break up tech giants Facebook, Amazon and Google. Accelerating innovation and increasingly sophisticated technology change only the scale of the problem, not the simple truth of the equation: regulate yourself or be regulated. The choice remains with big tech. But it’s unclear how long that will be true.
