
What is the Social Responsibility of Social Media?

Words by John Howell
Data & Technology

Do digital companies have a duty to curate content that some find alarming? That’s the dilemma now facing Facebook, Twitter, YouTube, Instagram, Spotify, and other social media platforms, which have defined themselves as “agnostic” distributors—that is, neutral carriers not responsible for the content published through their channels and networks. These iconic brands, avatars for the exercise of freedom of speech on a scale previously unimaginable in the history of communications, now find themselves targets of increasing criticism for purveying content that their global community of users perceives as damaging, even dangerous. What is the social responsibility of social media?

What’s at stake is the ongoing conflict between the First Amendment guarantee of freedom of speech, a right often under attack throughout its long history, and a contemporary culture that permits—even rewards—extreme, inflammatory commentary by measures of views and click-throughs, and amplifies it to an unprecedentedly large audience through the Internet.

Critics argue that, like publishers, these “distributors” are responsible for the content they carry and present, even if they don’t create it. The companies argue that it is not their right or duty to censor third-party content.

Of course, like all companies, these businesses have standards that can be referred to. And these “community standards” are belatedly being invoked more and more often as the limits of free expression stretch to the breaking point, amid growing pushback against tech companies that once seemingly could do no wrong in the eyes of their enthusiastic users.

Item: Twitter finally joined its tech industry peers in deleting some content produced by Infowars, a site specializing in conspiracy theories, hate speech, and harassment of its perceived enemies. YouTube, Facebook, Apple, and Spotify all invoked their “community standards” as the reason for deleting that site’s content.

Item: Amazon deleted products on its site that featured Nazi and white supremacist symbolism after their presence was pointed out to the online retailer by activist groups. The company said it removed products that “violated [its] policy against product listings that promote hatred, violence or discrimination.”

The U.S. is unique in having a First Amendment. Read broadly, it calls for no “abridgement” of speech. Historically, a common-sense rule has been applied to speech that leads to prohibited or even criminal action, best expressed in the maxim that yelling “fire” in a crowded theater leads to public disorder and life-threatening behavior. Many European countries, with the experience of fascism and genocide in their modern histories, regulate speech to prevent subsequent social unrest. For example, Google operates in Europe under different codes of privacy than in the U.S. (What to make of Germany’s recent relaxation of the general ban on Nazi symbols in video games, now grouped with other exceptions such as academic studies, films, theater plays, and historical exhibits? The use of generally forbidden imagery will be determined on a case-by-case basis, says the German government. Is this a sophisticated upgrade to “deal responsibly with difficult subjects,” as Felix Falk, head of the gaming industry lobby, says? Or an invitation to the global alt-right to bait the general public with anti-Semitic views?)

What’s also at stake here is the brand reputations of these tech companies. Questions about data privacy, the commercialization of private data, revenge porn, blatant disinformation, and outright threats have spurred a variety of rules and laws to curb unlimited freedom of expression through digital distribution channels, and have cast a cloud of concern over the once golden-haloed firms. The answers to these questions have real consequences: in share prices, employee activism, and future workforce recruitment, as well as lawsuits, unwanted government regulation, and user unsubscriptions.

I’m looking forward to the case studies that will be written to analyze how the big tech brands navigate the increasingly stormy waters of taking stands by defining standards, and maintaining their integrity and reputation while doing so.

John Howell

John Howell, Chief of Thought Leadership and Editorial Director, is a co-founder of 3BL Media, the parent company of Triple Pundit, begun in 2009. Howell oversees original editorial content procurement and creation. He is also the author of the weekly Brands Taking Stands Newsletter. He has written and edited for Elle, Artforum, High Times, the New York Times Magazine, and the LA Times. Howell is based in Wonalancet, NH.
