Social Media and the Future of Democracy

Does social media promote positive democratic debate, sow hatred, or both? Students explore whether social media has contributed to echo chambers and hate speech, and consider options for improving public discussion on social media.

To The Teacher

When social media platforms were first created in the late 1990s and early 2000s, their creators and supporters promised a democratic revolution, arguing that these apps would foster robust public discussion and enrich civic life. But more recently, concern has grown about how social media can contribute to political fragmentation, echo chambers, and hate speech.

Does social media promote positive democratic debate, or does it encourage hate speech? And how have our online interactions changed our in-person politics?

This lesson consists of two readings. In the first reading, students explore concerns that social media platforms are creating echo chambers and stoking hate. In the second, students consider how we might improve public discussion on social media.
 

Note: This lesson is Part 2 of a series of lessons on social media.

 


Reading One:
Social Media and Democratic Debate
 

When social media platforms were first created in the late 1990s and early 2000s, their creators and supporters promised a democratic revolution. As millions of users around the planet joined platforms such as Facebook, Twitter, and Snapchat, early proponents of social media argued that these apps would transform political debate—fostering robust public discussion and allowing people to create a rich civic life by engaging with their friends and neighbors online. But in more recent years, perspectives on social media and democracy have grown more negative, focusing on stories of political fragmentation, echo chambers, and hate speech.

Today, we might ask: Does social media promote positive democratic debate, or does it encourage hate speech? And how have our online interactions changed our in-person politics?

In the past, advocates of social media touted its capacity to put all parties in a political discussion on a level playing field. Some journalists referred to the Arab Spring uprisings in Tunisia and Egypt in 2011 as “Twitter Revolutions.” In a 2012 article on Mic.com, writer Saleem Kassim voiced this type of techno-optimism, writing:

Being capable of sharing an immense amount of uncensored and accurate information throughout social networking sites has contributed to the cause of many Arab Spring activists. Through social networking sites, Arab Spring activists have not only gained the power to overthrow powerful dictatorship, but also helped Arab civilians become aware of the underground communities that exist and are made up of their brothers, and others willing to listen to their stories.

In countries like Egypt, Tunisia, and Yemen, rising action plans such as protests made up of thousands have been organized through social media such as Facebook and Twitter. “We use Facebook to schedule the protests,” an Arab Spring activist from Egypt announced, “and [we use] Twitter to coordinate, and YouTube to tell the world.” The role that technology has taken in allowing the distribution of public information such as the kinds stated by the aforementioned activist, had been essential in establishing the democratic movement that has helped guide abused civilians to overthrow their oppressor.

Social networks have broken the psychological barrier of fear by helping many to connect and share information. It has given most people in the Arab world the knowledge that they are not alone, that there are others experiencing just as much brutality, just as much hardships, just as much lack of justice. Social networks "for the first time provided activists with an opportunity to quickly disseminate information while bypassing government restrictions," Hussein Amin, professor of mass communications at the American University in Cairo, said.

https://www.mic.com/articles/10642/twitter-revolution-how-the-arab-spring-was-helped-by-social-media


For protestors looking to share information in repressive societies, social media can be a powerful tool. However, in recent years, many political observers have pointed out that social media platforms are structured in ways that can also contribute to trends that are dangerous to democracy.

For example: Social media platforms try to predict our preferences as part of their business model. This can mean that they show us mostly posts and stories from viewpoints similar to our own. The result can be a social media “bubble,” an echo chamber in which one’s preconceived views are reinforced rather than challenged by a pluralistic discussion in which many different views are represented. Critics fear that this dynamic could be contributing to greater political polarization in our country.

In extreme cases, social media companies have allowed groups promoting white supremacy, anti-Semitism, and sexism to proliferate. In an April 2019 report, Council on Foreign Relations editor Zachary Laub summarized recent research on social media and hate speech. He wrote:

As more and more people have moved online, experts say, individuals inclined toward racism, misogyny, or homophobia have found niches that can reinforce their views and goad them to violence. Social media platforms also offer violent actors the opportunity to publicize their acts.

Social scientists and others have observed how social media posts, and other online speech, can inspire acts of violence:

In Germany a correlation was found between anti-refugee Facebook posts by the far-right Alternative for Germany party and attacks on refugees. Scholars Karsten Muller and Carlo Schwarz observed that upticks in attacks, such as arson and assault, followed spikes in hate-mongering posts.

In the United States, perpetrators of recent white supremacist attacks have circulated among racist communities online, and also embraced social media to publicize their acts. Prosecutors said the Charleston church shooter, who killed nine black clergy and worshippers in June 2015, engaged in a “self-learning process” online that led him to believe that the goal of white supremacy required violent action.

The 2018 Pittsburgh synagogue shooter was a participant in the social media network Gab, whose lax rules have attracted extremists banned by larger platforms. There, he espoused the conspiracy that Jews sought to bring immigrants into the United States, and render whites a minority, before killing eleven worshippers at a refugee-themed Shabbat service. This “great replacement” trope, which was heard at the white supremacist rally in Charlottesville, Virginia, a year prior and originates with the French far right, expresses demographic anxieties about nonwhite immigration and birth rates….

The same technology that allows social media to galvanize democracy activists can be used by hate groups seeking to organize and recruit. It also allows fringe sites, including peddlers of conspiracies, to reach audiences far broader than their core readership. Online platforms’ business models depend on maximizing reading or viewing times. Since Facebook and similar platforms make their money by enabling advertisers to target audiences with extreme precision, it is in their interests to let people find the communities where they will spend the most time.

https://www.cfr.org/backgrounder/hate-speech-social-media-global-comparisons


While the founders of social media platforms envisioned a robust and healthy political debate, the outcomes after two decades are decidedly more mixed. As millions of people rely on these technologies for news, community-building, and communicating with friends and family, the stakes for making social media into a force that can be consistent with our democratic ideals continue to rise.
 

For Discussion

  1. How much of the material in this reading was new to you, and how much was already familiar? Do you have any questions about what you read?
  2. What are some arguments in favor of the idea that social media promotes democracy? How can social media platforms foster democratic engagement? Can you think of examples of how social media has fostered positive civic engagement on the platforms you use?
  3. According to the reading, why might social media contribute to social polarization or the creation of political “bubbles”? Can you think of ways the social media platforms you use have seemed to encourage polarization or bubbles?
  4. Some analysts believe that social media is not to blame for the rise in hate speech, arguing instead that white supremacists and other extremist movements have always used whatever technology is available to spread their message, and that the online era is no different. What do you think of this argument? Are the risks and dangers posed by current technologies different from those of the past?
  5. What do you think? On the whole, is social media helping our democracy or hurting it? Explain your position.


 


 

Reading Two:
What Can Be Done To Change Social Media?
 

“Social media doesn’t work the way we think it should.” This was the conclusion of MIT Associate Professor of the Practice in Media Arts and Sciences Ethan Zuckerman, who has worked for and studied social media companies over the past three decades.

Since the 2016 U.S. presidential election and the Brexit vote in Britain, many people have thought the same thing, expressing concern that online platforms contribute to a rise in political polarization and hate speech.

But if there are worries about how social media is affecting our public life, what can be done to change these platforms for the better? Various advocates have explored possibilities for action by government, by tech corporations, or by social media users themselves.

One option is government regulation. In the last three years, there has been a flurry of policy proposals in the United States and other countries. In an April 8, 2019 article for NPR, news desk reporter Matthew Schwartz described a broad set of prospective government regulations of social media companies in the United Kingdom (U.K.). He wrote:

[T]he U.K. plans to require social media companies to be much more active in removing and responding to harmful material on their platforms. The sweeping 102-page white paper, released Monday [by two British government departments], envisions requirements for everything from ensuring an accurate news environment, to combating hate speech, to stamping out cyberbullying.

The proposal is the latest in a series of increased efforts by governments around the world to respond to harmful content online. Last year Germany imposed fines of up to $60 million if social media companies don't delete illegal content posted to their services. After the massacre in Christchurch, New Zealand, Australian lawmakers passed legislation subjecting social media executives to imprisonment if they don't quickly remove violent content. The U.K. proposal goes further than most by proposing the creation of a new regulatory body to monitor a broad array of harms and ensure companies comply.

"I'm deeply concerned that social media firms are still not doing enough to protect users from harmful content," Prime Minister Theresa May said. "So today, we're putting a legal duty of care on these companies to keep users safe. And if they fail to do so, tough punishments will be imposed. The era of social media firms regulating themselves is over."

The new duty of care has yet to be fleshed out, but U.K. officials offered plenty of suggestions for what they expect a new regulator to include in a code of practice. Officials expect companies to do what they can to "counter illegal content and activity." That could include requirements to actively "scan or monitor content for tightly defined categories of illegal content" such as threats to national security or material that sexually exploits children.

https://www.npr.org/2019/04/08/711091689/u-k-regulators-propose-broad-social-media-regulations-to-counter-online-harms
 

Some skeptics worry, however, that more government involvement in regulating social media could result in censorship or state repression of free speech. Striking the right balance between allowing free expression, on the one hand, and curtailing hate speech that actively promotes violence, on the other, is difficult.

Advocates such as David Kaye, the U.N. Special Rapporteur on Freedom of Opinion and Expression, have proposed an alternative approach, focusing on ways that companies themselves can allow greater community moderation. In a June 2018 article for Reuters, Kaye laid out three strategies companies could use to create environments more conducive to healthy democratic debate online. He wrote:

First, internet companies need to involve local communities in governing their platforms. The corporations as they are currently configured cannot rule public space everywhere. They must find ways to devolve authority to local actors – not to governments, but to their users. Hiring teams of experts alone simply doesn’t cut it. Steps like diversifying leadership, enabling greater local content moderation not outsourced to contractors, and engaging deeply with the communities where they operate are essential….

Second, the companies must disclose radically more information about the nature of their rulemaking and enforcement concerning expression on their platforms. Greater disclosure means individual empowerment, giving people an opportunity to provide genuine critiques of how those rules apply, and how the companies get it wrong, in specific countries….

Finally, the companies make claims to global roles, so they should adopt global standards – not the First Amendment, and not terms of service allowing them complete discretion. They should apply human rights law, which provides global standards protecting everyone’s right to “seek, receive and impart information and ideas of all kinds, regardless of frontiers.” …. Those rules would provide better grounding for company operations and allow real capacity to push back against governments seeking to interfere with freedom of expression....

Opaque forces, corporate and governmental, are shaping the ability of individuals worldwide to exercise their freedom of expression. With looming governmental intervention, the companies need to change in order to meet the threats they pose in this digital age.

https://www.reuters.com/article/us-kaye-media-commentary/commentary-how-to-fix-social-media-without-censorship-idUSKBN1JF34H


While government and corporations can have roles in shaping democratic engagement on social media, everyday internet users themselves do not have to wait for these institutions to take action. Instead, say advocates, there are a variety of steps that we can take ourselves. In a 2018 blog post entitled “Six or Seven Things Social Media Can Do for Democracy,” MIT Professor Ethan Zuckerman highlighted strategies social media users are already deploying to improve democratic debate.

Users in social networks like Twitter and Facebook have little control over how those networks are governed, despite the great value they collectively create for platform owners. This disparity has led Rebecca MacKinnon to call for platform owners to seek Consent of the Networked, and Trebor Scholz to call us to recognize participation in social networks as Digital Labor. But some platforms have done more than others to engage their communities in governance.

Reddit is the fourth most popular site on the U.S. internet and sixth most popular site worldwide, as measured by Alexa Internet, and is a daily destination for at least 250 million users. The site is organized into thousands of “subreddits,” each managed by a team of uncompensated, volunteer moderators, who determine what content is allowable in each community….

Some Reddit communities have begun working with scholars to examine scientifically how they could govern their communities more effectively. /r/science, a community of 18 million subscribers and over a thousand volunteer moderators, has worked with communications scholar Nathan Matias to experiment with ways of enforcing their rules to maximize positive discussions and throw out fewer rulebreakers. The ability to experiment with different rules in different parts of a site and to study what rulesets best enable what kinds of conversations could have benefits for supporters of participatory democracy offline as well as online.

http://www.ethanzuckerman.com/blog/2018/05/30/six-or-seven-things-social-media-can-do-for-democracy/
 

Whether governmental, corporate, or community action presents the most promising avenue for creating change, a wide variety of reformers are envisioning ways to reclaim the democratic promise of social media—or at least help preserve democracy in an era of social media disruption.
 

For Discussion

  1. How much of the material in this reading was new to you, and how much was already familiar? Do you have any questions about what you read?
  2. According to the reading, what are some proposals for how governments can change how social media companies operate?
  3. According to the reading, what are some potential changes that corporations or users could enact themselves to improve political debate on social media platforms? Which ideas do you think are the most promising?
  4. Can you think of any of your own proposals for how social media can be better used to promote democracy? Are there any measures that you think should be pursued?
  5. In a 2017 article in The Washington Post, scholars Joshua Tucker, Yannis Theocharis, Margaret E. Roberts, and Pablo Barberá argued that “social media itself is neither inherently democratic or non-democratic, but yet another arena in which political actors contest for power.” What do you think they mean by this argument? Do you agree or disagree? Explain your position.
  6. The current major social media platforms are run by corporations, which have an imperative to increase revenues and return profits. Can you imagine a different kind of social media platform where civic engagement, rather than maximizing profits, was the goal? How would such a platform work? Who would pay for it?

 

Research assistance provided by John Bergen.