Overview
On July 28, Senator Ron Wyden objected to the Senate’s passage of the Intelligence Authorization Bill for Fiscal Year 2016. He objected not because he opposes the funding decisions in the bill, but because of just 29 lines of text, among the 41 pages of proposed legislation, that have nothing to do with intelligence spending. Those 29 lines, found in Section 603 of S. 1705, would require Internet companies to report “terrorist activity” on their platforms to the Attorney General (or her designee). In support of this idea, proponents point to the use of the Internet by terrorist organizations such as ISIS to promote terrorism and recruit new members. Such concerns are, of course, appropriate, but the proposed legislation creates too much collateral damage. Our client, the Internet Association, has raised concerns with Section 603. The views here, however, are my own.
The Supreme Court, among others, has noted, “[C]ontent on the Internet is as diverse as human thought.” This means that along with supercharged innovation, economic development, and democratic discourse, the Internet also carries the views of the intolerant, the hateful, and, yes, even criminal elements around the globe.
In the US, the First Amendment protects the rights of individuals to express intolerant and hateful ideas. We are often criticized for this, to which we respond that the best means of combating such speech is to ensure the ability of others to respond. In this dynamic, we believe that the marketplace of ideas is the best referee. Certainly, we can agree, it is a better referee than a bureaucrat in a government agency deciding what should be censored. Put another way, the dangers of government-controlled speech far outweigh concerns over the promotion of speech we find objectionable.
Yet the First Amendment does not protect organizations from laws prohibiting them from conspiring to commit violent acts or raising money to fund criminal activities. Nor does it protect speech that is directed to inciting imminent lawless action and is likely to produce such action.
When use of the Internet crosses the line from protected speech to criminal activity, law enforcement can and should intervene. In such cases, Internet companies can and do cooperate with lawful requests to assist efforts to investigate and prosecute criminal behavior.
A key problem with Section 603, however, is that the trigger for the reporting mandate is based on the vague and undefined term “terrorist activity.” This term is not a term of art in the US criminal code and arguably goes well beyond criminal activity to speech that is protected under the First Amendment.
Proponents of the provision compare the reporting obligation to the existing reporting obligation for child pornography images in 18 U.S.C. §2258A. That law requires an intermediary that obtains actual knowledge of any facts or circumstances indicating an apparent violation of federal child exploitation crimes involving child pornography to file a report with the National Center for Missing and Exploited Children (NCMEC).
The NCMEC reporting obligations, however, relate to images that are per se unlawful and are never protected speech under the US Constitution. A government mandate that an Internet company report facts and circumstances connected to the vague and overbroad term “terrorist activity” would inevitably result in reporting to the government of speech that is protected under the First Amendment.
More troubling, if adopted, the provision would serve as a global template for other countries to impose reporting requirements for activities those jurisdictions deem unlawful. This would be particularly problematic with countries that regulate speech, including political speech, and with authoritarian regimes that would demand that Internet companies police their citizens’ activities.
Section 603 also creates a practical compliance problem. Because “terrorist activity” is left undefined, how does one counsel a client to establish a compliance protocol under the proposal?
Any company that did not report “terrorist activity” would risk liability if a subsequent event resulted in loss of life, limb, or property. The likely response would be to design a protocol that over-reports anything that could conceivably be considered “terrorist activity.” Given the massive scale of content created and shared on the Internet daily, this would flood law enforcement with items of no material concern to public safety and create a “needle in the haystack” problem. This serves no one’s purposes and adds privacy concerns to the First Amendment concerns noted above.
This creates a perverse incentive for a company to avoid obtaining knowledge of any activity that would trigger the reporting requirement: the exact opposite of what the proponents of the legislation want. Yet designing such an avoidance protocol is nearly impossible. If even one low-level employee receives an over-the-transom email about a “terrorist activity,” knowledge of the activity can be imputed to the entire company, exacerbating its potential liability.
Section 603 has other problems. The range of Internet platforms covered by the proposal is enormous. The reporting mandate applies to an “electronic communication service” (ECS) and a “remote computing service” (RCS). An ECS is arguably any service that provides a person with the ability to communicate with others electronically. An RCS is defined as “the provision to the public of computer storage or processing services by means of an electronic communications system.” These terms create a huge universe of entities subject to the mandate, including but certainly not limited to social media companies, search engines, Internet service providers, blogs, community bulletin boards, universities, advocacy organizations, and religious institutions.
Further, the proposal would not limit the reporting requirement to publicly viewable sites. It would require a cloud storage provider to police a third party’s internal, stored communications to avoid potential liability under the provision.
For all of the reasons above, Senator Wyden was right to object to the reporting mandate.
And the Senate Select Committee on Intelligence is right to raise concerns about the use of the Internet by terrorist organizations. Confronting such use, however, must not be done at the expense of the First Amendment or by requiring Internet companies to police and report on their users’ activities.