In the News: The Am Law Litigation Daily

Litigators of the Week: Kirkland Duo Wins One for Facebook—and the Internet

Our Litigators of the Week are Kirkland & Ellis partners Craig Primis and K. Winn Allen, who successfully defended Facebook against a high-profile suit by victims of Hamas terrorist attacks in Israel. The plaintiffs alleged that the social networking giant provided material support to terrorist activities.

On Wednesday, the U.S. Court of Appeals for the Second Circuit held that the claims were barred by the Communications Decency Act. The majority rejected the plaintiffs’ argument that because Facebook uses algorithms to suggest content or connections to other users, it was not shielded by the law, which immunizes online publishers.

Primis and Allen discussed the case with Lit Daily.

Lit Daily: What was at stake for Facebook—and other online publishers—in this case?  

Craig Primis: This was an important [Communications Decency Act] case for Facebook and other social media companies. In many ways, the CDA led to the explosive growth of the internet by allowing platforms to serve as forums for speech and the exchange of ideas. While the enactment of the CDA preceded the creation of social media companies, the statute has been instrumental in allowing those companies to provide a platform for engagement and information sharing among billions of people.

The plaintiffs in this case argued that, by implementing algorithms to organize, present, and suggest information posted by its users, Facebook somehow lost CDA protection. If plaintiffs had succeeded, it could have significantly limited the ability of social media companies and other online information aggregators like Google and Yelp to organize and present the massive quantities of information on their platforms to users looking for information posted by others.  

What were the plaintiffs’ main allegations against Facebook?

Winn Allen: Plaintiffs alleged that Facebook provided material support and resources to Hamas by failing to prevent Hamas members and sympathizers from spreading terrorist messages on Facebook and praising terrorist attacks in Israel. The plaintiffs recognized Facebook has policies against, and regularly removes, extremist content, but claimed Facebook was not doing enough. Facebook strongly denied these allegations, both on the facts and the law. The plaintiffs were represented by Robert Tolchin and Meir Katz of The Berkman Law Office in New York.

We argued that plaintiffs’ claims were all barred by the CDA. We also argued that, even apart from the CDA, the complaint failed to state a claim under the Anti-Terrorism Act, as a number of courts have now held in similar cases.

When and how did you become involved in the case and who were the key members of your team?  

Winn Allen: These cases started in 2016. Facebook asked us to represent them after we successfully argued a similar case in the D.C. Circuit brought by Larry Klayman, which resulted in the D.C. Circuit’s first published opinion recognizing CDA protection for social media companies. Other than the two of us, the key member of the team was Mary Miller, an all-star associate who recently left the firm for a clerkship with Judge Leon here in Washington.

Walk us through what happened at the district court level.   

Winn Allen: The case actually started as two separate cases: Cohen and Force. The plaintiffs in Cohen were 20,000 Israeli citizens who alleged they were targeted by terrorist organizations. The district court dismissed the Cohen action, finding that the plaintiffs lacked standing because their harms were too speculative. The plaintiffs didn’t appeal that case.  

The second case, Force, is what the Second Circuit just resolved. Unlike the Cohen plaintiffs, the Force plaintiffs were either victims or relatives of victims of actual terrorist attacks. In May 2017 the district court dismissed their claims under the CDA, and that’s the ruling the Second Circuit just affirmed.

On appeal, was it a hot bench? Any highlights or surprises?

Craig Primis: It was a hot bench at oral argument, and both sides argued well beyond their allotted time. Though the court decided the case on CDA grounds, argument focused not only on the CDA issues but also on numerous issues under the Anti-Terrorism Act. Given the sensitivity of the allegations and the potential scope and impact of the ruling the plaintiffs sought, the court took the case seriously and had hard questions for both sides.

There’s been continued debate about the proper scope of Section 230 of the Communications Decency Act and the immunities it provides social media companies. What makes the decision significant in this regard?  

Craig Primis: This case was one of the first to directly address whether a social media platform’s use of algorithms to suggest content and connections to other users deprived the publisher of CDA immunity.  

As you can imagine, suggestions of content and friends are a key part of the Facebook platform, as they are for other social media companies. Without them, it would be difficult to present all that information in a meaningful way that allows users to find information they are actually interested in. The decision squarely holds that “[m]erely arranging and displaying others’ content” does not deprive Facebook of CDA immunity.

How did the court address the argument that Facebook is an information content provider?

Winn Allen: The Second Circuit agreed with us that a social media company providing a platform for speech does not become an “information content provider” under the CDA just by suggesting content or friends to other users.

Instead, the court said that making information and content more available to users is “an essential part of traditional publishing” and to hold Facebook liable for doing so “is both ungrounded in the text of Section 230 and contrary to its purpose.” The court said that “accepting plaintiffs’ argument would eviscerate” Section 230 of the CDA.

It’s also important to note that, even if the court had not agreed with us on the CDA question, we think plaintiffs’ claims still would have failed under the ATA. A number of court of appeals decisions have recently affirmed dismissal of similar ATA claims against social media companies.

As the panel noted, Facebook “has been criticized recently—and frequently—for not doing enough to take down offensive or illegal content.” What’s your response?  

Craig Primis: Facebook’s position on terrorism and violent extremism is clear—the company bans all praise, support and representation of individuals and organizations that proclaim a violent or hateful mission or are engaged in acts of hate or violence. This is true regardless of ideology or motivation—as soon as Facebook becomes aware of this content, it takes action to remove it.

The company has made major investments to build out the expertise on its dedicated counterterrorism and violent extremism teams, improve the technical tools it uses to proactively detect terrorist content, and strengthen partnerships across industry, civil society, academia and government. Facebook’s investments have been successful: Since the beginning of 2018, for example, Facebook has taken down more than 25 million pieces of terrorist content, 99% of which Facebook proactively identified and removed before anyone reported it.

The victims in the case—including an infant and a teen—seem completely sympathetic. How did you balance being sensitive to their families’ losses with zealously advocating for your client?   

Craig Primis: It is always a challenge defending a case where the facts are so tragic, and we tried to remain sensitive throughout the case to the fact that there are real victims here, and that their concerns about Hamas are valid and real.

In his dissent, Chief Judge Katzmann suggested that Congress should revisit the Communications Decency Act, noting that when it was passed in 1996, “the internet was an afterthought.” What’s your take on that?

Winn Allen: The majority opinion expressly addressed and rejected that argument. As the majority pointed out, Section 230 itself contains “pro-Internet-development policy statements” and “specific findings” about the importance of the internet and the value the internet can have for sharing information and fostering dialogue. There is no doubt that the legislators who wrote this statute had the protection of the internet as a forum for speech foremost on their minds.