The internet ecosystem provides immense economic, social, and cultural benefits by enabling people everywhere to connect and share ideas. But the internet as we know it cannot exist without strong legal protections for the interactive computer services, including platforms, that make the internet an accessible, diverse, and functional place. Policymakers must recognize that intermediary liability protections, such as Section 230 of the Communications Decency Act of 1996, do not grant bad actors “total immunity” for their wrongdoing. Instead, these protections merely enable organizations of all sizes, from the smallest startups to the world’s largest companies, to provide interactive computer services through which users connect and share ideas. Innovators rely on intermediary liability protections to develop new and better methods of communication.
Intermediary liability protections are based on two bedrock principles: Free speech is an important and fundamental right, and wrongdoers should be held accountable for their own actions. Without strong intermediary liability protections, interactive computer services would have no choice but to censor controversial opinions on social media, turn off user reviews on product pages, require bloggers to obtain approval before publishing their articles, screen and fact-check users’ emails before they are sent, and shut down the search engines that connect people with useful content and websites. One thing is certain: Making interactive computer services liable for the content their users generate will force those providers to protect themselves by taking control of content on the internet, which is bad for users and stifles innovation.
As policymakers consider reforms to the way that the internet functions, they should reflect on the following principles:
Intermediary Liability Generally
- The First Amendment cannot exist in the 21st century without protections for the intermediaries that provide opportunities for user-generated speech.
- Policymakers should support an internet ecosystem that holds bad actors who misuse digital services responsible for their own actions.
- Intermediary liability protections make the internet a better, safer, and more useful place. They allow market forces to incentivize new and innovative ways of connecting users while limiting the impact of harmful content, and they let each interactive computer service set the content moderation rules that best fit its own platform and users.
- Policymakers should not disadvantage interactive computer services compared with their offline counterparts.
- Algorithms are not publishing decisions and do not endorse content or speech. Rather, algorithms are automated methods of organizing data that make systems more useful by connecting users with the content they need, whether that content is a product on an online marketplace, an instructional cooking video, or a better route home in heavy traffic.
- Policymakers should support efforts to require platforms to have reasonable processes and systems in place, based on industry best practices, to manage the prevalence and risk of illegal content.
Child Safety Online
- Intermediary liability protections empower users to select the interactive computer services that best fit their circumstances and empower platforms to develop standards for age-appropriate user experiences.
- Intermediary liability protections were created to protect interactive computer services that choose to remove problematic or harmful content, and they have spurred proactive, voluntary innovations that make the internet safer for people of all ages. Thanks to intermediary liability protections, social media platforms, email providers, and search engines are now the largest removers and reporters of suspected child sexual abuse material online.
- Policymakers should empower platforms to remove and disable harmful content, such as child sexual abuse material or non-consensual intimate imagery. Policymakers should avoid top-down, government micromanagement of interactive computer services that stifles innovation and instead promote industry collaboration to enable the development of better methods of protecting users.
- Policymakers should support robust funding and other resources that make it easier for law enforcement to investigate and prosecute predators and abusers who victimize children online. Policymakers should also support public-private partnerships and community-based efforts to prevent child victimization.