From Justice Thomas’s statement respecting denial of certiorari today in Doe v. Facebook, Inc. [UPDATE: just saw that Jonathan beat me to it, but I thought I’d keep this up; the main extra matters in this post are the excerpts from the Texas Supreme Court opinion, which may help explain the background of the litigation]:
In 2012, an adult, male sexual predator used Facebook to lure 15-year-old Jane Doe to a meeting, shortly after which she was repeatedly raped, beaten, and trafficked for sex. Doe eventually escaped and sued Facebook in Texas state court, alleging that Facebook had violated Texas’ anti-sex-trafficking statute and committed various common-law offenses. Facebook petitioned the Texas Supreme Court for a writ of mandamus dismissing Doe’s suit. The court held that a provision of the Communications Decency Act known as § 230 bars Doe’s common-law claims, but not her statutory sex-trafficking claim.
Section 230(c)(1) states that “[n]o provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.” The Texas Supreme Court emphasized that courts have uniformly treated internet platforms as “publisher[s]” under § 230(c)(1), and thus immune, whenever a plaintiff’s claim “‘stem[s] from [the platform’s] publication of information created by third parties.’”
As relevant here, this expansive understanding of publisher immunity requires dismissal of claims against internet companies for failing to warn consumers of product defects or failing to take reasonable steps “to protect their users from the malicious or objectionable activity of other users.” The Texas Supreme Court acknowledged that it is “plausible” to read § 230(c)(1) more narrowly to immunize internet platforms when plaintiffs seek to hold them “strictly liable” for transmitting third-party content, but the court ultimately felt compelled to adopt the consensus approach.
This decision exemplifies how courts have interpreted § 230 “to confer sweeping immunity on some of the largest companies in the world,” particularly by employing a “capacious conception of what it means to treat a website operator as [a] publisher or speaker.” Here, the Texas Supreme Court afforded publisher immunity even though Facebook allegedly “knows its system facilitates human traffickers in identifying and cultivating victims,” but has nonetheless “failed to take any reasonable steps to mitigate the use of Facebook by human traffickers” because doing so would cost the company users—and the advertising revenue those users generate. [Plaintiff’s Complaint]; see also Reply Brief (listing recent disclosures and investigations supporting these allegations). It is hard to see why the protection § 230(c)(1) grants publishers against being held strictly liable for third parties’ content should protect Facebook from liability for its own “acts and omissions.”
At the very least, before we close the door on such serious charges, “we should be certain that is what the law demands.” As I have explained, the arguments in favor of broad immunity under § 230 rest largely on “policy and purpose,” not on the statute’s plain text. Here, the Texas Supreme Court recognized that “[t]he United States Supreme Court—or better yet, Congress—may soon resolve the burgeoning debate about whether the federal courts have thus far correctly interpreted section 230.” Assuming Congress does not step in to clarify § 230’s scope, we should do so in an appropriate case.
Unfortunately, this is not such a case. We have jurisdiction to review only “[f]inal judgments or decrees” of state courts. And finality typically requires “an effective determination of the litigation and not of merely interlocutory or intermediate steps therein.” Because the Texas Supreme Court allowed Doe’s statutory claim to proceed, the litigation is not “final.” Conceding as much, Doe relies on a narrow exception to the finality rule involving cases where “the federal issue, finally decided by the highest court in the State, will survive and require decision regardless of the outcome of future state-court proceedings.” But that exception cannot apply here because the Texas courts have not yet conclusively adjudicated a personal-jurisdiction defense that, if successful, would “effectively moot the federal-law question raised here.”
I therefore concur in the Court’s denial of certiorari. We should, however, address the proper scope of immunity under § 230 in an appropriate case.
Here is the Texas Supreme Court’s summary of the plaintiffs’ common-law claims, which it held were preempted:
The essence of Plaintiffs’ negligence, gross-negligence, negligent-undertaking, and products-liability claims is that, because Plaintiffs were users of Facebook or Instagram, the company owed them a duty to warn them or otherwise protect them against recruitment into sex trafficking by other users. Facebook violated that duty, Plaintiffs contend, by its failures to “implement any safeguards to prevent adults from contacting minors,” “report suspicious messages,” “warn of the dangers posed by sex traffickers,” or “identify sex traffickers on its Platforms.” Under the view of section 230 adopted in every published decision of which we are aware, these claims “treat” Facebook “as the publisher or speaker” of third-party communication and are therefore barred.
Plaintiffs argue that their common-law claims do not treat Facebook as a “publisher” or “speaker” because they “do not seek to hold [it] liable for exercising any sort of editorial function over its users’ communications,” but instead merely for its own “failure to implement any measures to protect them” from “the dangers posed by its products.” Yet this theory of liability, while phrased in terms of Facebook’s omissions, would in reality hold the company liable simply because it passively served as an “intermediar[y] for other parties’ … injurious messages.”
Put differently, “the duty that [Plaintiffs] allege [Facebook] violated” derives from the mere fact that the third-party content that harmed them was transmitted using the company’s platforms, which is to say that it “derives from [Facebook’s] status … as a ‘publisher or speaker’” of that content. These claims seek to impose liability on Facebook for harm caused by malicious users of its platforms solely because Facebook failed to adequately protect the innocent users from the malicious ones. All the actions Plaintiffs allege Facebook should have taken to protect them—warnings, restrictions on eligibility for accounts, removal of postings, etc.—are actions courts have consistently viewed as those of a “publisher” for purposes of section 230. Regardless of whether Plaintiffs’ claims are couched as failure to warn, negligence, or some other tort of omission, any liability would be premised on second-guessing of Facebook’s “decisions relating to the monitoring, screening, and deletion of [third-party] content from its network.” …
And here’s the Texas Supreme Court’s summary of the state statutory claims, and why they aren’t preempted:
Plaintiffs also sued Facebook under a Texas statute creating a civil cause of action against anyone “who intentionally or knowingly benefits from participating in a venture that traffics another person.” According to Plaintiffs, Facebook violated this statute through such “acts and omissions” as “knowingly facilitating the sex trafficking of [Plaintiffs]” and “creat[ing] a breeding ground for sex traffickers to stalk and entrap survivors.” … Liability under [this statute] requires a showing that a defendant acquired a benefit by “participat[ing]” in a human-trafficking “venture.” Such “participation” connotes more than mere passive acquiescence in trafficking conducted by others. This much is evident from the common meaning of “participate,” representative definitions of which include, “[t]o be active or involved in something; take part,” and, “to take part, be or become actively involved, or share (in).” …
Thus, to charge Facebook with “intentionally or knowingly benefit[ting] from participating in a [trafficking] venture” is to charge it with “some affirmative conduct”—that is, “an overt act” beyond “mere negative acquiescence”—”designed to aid in the success of the venture.” It follows that a claim under section 98.002 arises not merely from a website’s failure to take action in response to the injurious communications of others, but instead from the website’s own affirmative acts to facilitate injurious communications.
This distinction—between passive acquiescence in the wrongdoing of others and affirmative acts encouraging the wrongdoing—is evident in Plaintiffs’ allegations, which we construe liberally at the [motion-to-dismiss] stage. While many of Plaintiffs’ allegations accuse Facebook of failing to act as Plaintiffs believe it should have, the section 98.002 claims also allege overt acts by Facebook encouraging the use of its platforms for sex trafficking. For instance, the petitions state that Facebook “creat[ed] a breeding ground for sex traffickers to stalk and entrap survivors”; that “Facebook … knowingly aided, facilitated and assisted sex traffickers, including the sex trafficker[s] who recruited [Plaintiffs] from Facebook” and “knowingly benefitted” from rendering such assistance; that “Facebook has assisted and facilitated the trafficking of [Plaintiffs] and other minors on Facebook”; and that Facebook “uses the detailed information it collects and buys on its users to direct users to persons they likely want to meet” and, “[i]n doing so, … facilitates human trafficking by identifying potential targets, like [Plaintiffs], and connecting traffickers with those individuals.” Read liberally in Plaintiffs’ favor, these statements may be taken as alleging affirmative acts by Facebook to encourage unlawful conduct on its platforms….
The available precedent indicates that Facebook enjoys no CDA immunity from claims founded on such allegations. For instance, the Ninth Circuit has held that defendants lose their CDA immunity if they go beyond acting as “passive transmitter[s] of information provided by others.” A defendant that operates an internet platform “in a manner that contributes to,” or is otherwise “directly involved in,” “the alleged illegality” of third parties’ communication on its platform is “not immune.” Here, Plaintiffs’ statutory cause of action is predicated on allegations of Facebook’s affirmative acts encouraging trafficking on its platforms.
These allegations differ from Plaintiffs’ common-law claims, under which Facebook is accused only of “providing neutral tools to carry out what may be unlawful or illicit” communication by its users. The common-law claims are “based on [Facebook’s] passive acquiescence in the misconduct of its users,” for which the company is “entitled to CDA immunity.” Like the Ninth Circuit, however, we understand the CDA to stop short of immunizing a defendant for its “affirmative acts … contribut[ing] to any alleged unlawfulness” of “user-created content.” Facebook’s alleged violations of [the Texas statute] fall in the latter category. These allegations do not treat Facebook as a publisher who bears responsibility for the words or actions of third-party content providers. Instead, they treat Facebook like any other party who bears responsibility for its own wrongful acts. Other courts have drawn a similar line….
These [statutory] claims may proceed to further litigation, although we express no opinion on their viability at any later stage of these cases….