In re Facebook, Inc. and Facebook, Inc. d/b/a Instagram

Court: Texas Supreme Court
Date filed: 2021-06-25
               IN THE SUPREME COURT OF TEXAS
                                       ══════════
                                         No. 20-0434
                                       ══════════

     IN RE FACEBOOK, INC. AND FACEBOOK, INC. D/B/A INSTAGRAM, RELATORS

            ══════════════════════════════════════════
                      ON PETITION FOR WRIT OF MANDAMUS
            ══════════════════════════════════════════


                                  Argued February 24, 2021


       JUSTICE BLACKLOCK delivered the opinion of the Court.

       JUSTICE BUSBY and JUSTICE HUDDLE did not participate in the decision.


       Facebook seeks writs of mandamus directing the dismissal of three lawsuits pending

against it in district court. The plaintiffs in all three cases allege they were victims of sex

trafficking who became entangled with their abusers through Facebook. They assert claims for

negligence, negligent undertaking, gross negligence, and products liability based on Facebook’s

alleged failure to warn of, or take adequate measures to prevent, sex trafficking on its internet

platforms. They also assert claims under a Texas statute creating a civil cause of action against

those who intentionally or knowingly benefit from participation in a sex-trafficking venture. See

TEX. CIV. PRAC. & REM. CODE § 98.002.

       In all three lawsuits, Facebook moved to dismiss all claims against it as barred by section

230 of the federal “Communications Decency Act” (“CDA”), which provides that “[n]o cause of

action may be brought and no liability may be imposed under any State or local law that is

inconsistent with this section.” 47 U.S.C. § 230(e)(3). Facebook contends that all the plaintiffs’

claims are “inconsistent with” section 230(c)(1), which says that “[n]o provider or user of an

interactive computer service shall be treated as the publisher or speaker of any information

provided by another information content provider.”

       For the reasons explained below, we deny mandamus relief in part and grant it in part. The

plaintiffs’ statutory human-trafficking claims may proceed, but their common-law claims for

negligence, gross negligence, negligent undertaking, and products liability must be dismissed.

       We do not understand section 230 to “create a lawless no-man’s-land on the Internet” in

which states are powerless to impose liability on websites that knowingly or intentionally

participate in the evil of online human trafficking. Fair Hous. Council v. Roommates.Com, LLC,

521 F.3d 1157, 1164 (9th Cir. 2008) (en banc). Holding internet platforms accountable for the

words or actions of their users is one thing, and the federal precedent uniformly dictates that section

230 does not allow it. Holding internet platforms accountable for their own misdeeds is quite

another thing. This is particularly the case for human trafficking. Congress recently amended

section 230 to indicate that civil liability may be imposed on websites that violate state and federal

human-trafficking laws. See Allow States and Victims to Fight Online Sex Trafficking Act

(“FOSTA”), Pub. L. No. 115-164, 132 Stat. 1253 (2018). Section 230, as amended, does not

withdraw from the states the authority to protect their citizens from internet companies whose own

actions—as opposed to those of their users—amount to knowing or intentional participation in

human trafficking.

       Whether the plaintiffs can prove such a claim against Facebook is not at issue in this

mandamus proceeding. At this early stage of these cases, we take the plaintiffs’ allegations as true


and construe them liberally against dismissal. We hold only that the statutory claim for knowingly

or intentionally benefiting from participation in a human-trafficking venture is not barred by

section 230 and may proceed to further litigation.

        As for the plaintiffs’ other claims, section 230 is no model of clarity, and there is ample

room for disagreement about its scope. See generally Malwarebytes, Inc. v. Enigma Software Grp.

USA, LLC, 141 S. Ct. 13 (2020) (statement of Thomas, J., respecting denial of certiorari). Despite

the statutory text’s indeterminacy, the uniform view of federal courts interpreting this federal

statute requires dismissal of claims alleging that interactive websites like Facebook should do more

to protect their users from the malicious or objectionable activity of other users. The plaintiffs’

claims for negligence, negligent undertaking, gross negligence, and products liability all fit this

mold. The United States Supreme Court—or better yet, Congress—may soon resolve the

burgeoning debate about whether the federal courts have thus far correctly interpreted section 230

to bar such claims. Nevertheless, the prevailing judicial interpretation of section 230 has become

deeply imbedded in the expectations of those who operate and use interactive internet services like

Facebook. We are not interpreting section 230 on a clean slate, and we will not put the Texas court

system at odds with the overwhelming federal precedent supporting dismissal of the plaintiffs’

common-law claims.

        Facebook’s petition for mandamus relief is denied in part and conditionally granted in part.

The human-trafficking claims under section 98.002 of the Civil Practice and Remedies Code may

proceed in accordance with this opinion, but the common-law claims must be dismissed.

                                          I. Background

        This proceeding involves three separate lawsuits against Facebook based on similar


allegations, which must be “taken as true” for purposes of deciding Facebook’s motions to dismiss.

TEX. R. CIV. P. 91a. The facts alleged in each plaintiff’s live petition are summarized below.

        Cause No. 2018-69816 (334th Dist. Ct.). Plaintiff was fifteen years old in 2012 when she

was “friended” by another Facebook user with whom she shared several mutual friends. The user’s

profile featured photographs of “scantily-clad young women in sexual positions” with money

stuffed in their mouths, as well as “other deeply troubling content.” The user, who was “well over”

the age of eighteen, contacted Plaintiff using Facebook’s messaging system, which the two began

using to communicate regularly. He told Plaintiff she was “pretty enough to be a model” and

promised to help her pursue a modeling career. After Plaintiff confided in him about an argument

with her mother, he again offered her a modeling job and proposed they meet in person. Shortly

after meeting him, Plaintiff was photographed and her pictures posted to the website Backpage

(which has since been shut down due to its role in human trafficking), advertising her for

prostitution. As a result, Plaintiff was “raped, beaten, and forced into further sex trafficking.”

        Cause No. 2018-82214 (334th Dist. Ct.). Plaintiff was fourteen years old in 2017 and was

a user of both Facebook and Instagram, which Facebook owns. She was contacted via Instagram

by a male user who was “well over” eighteen years of age. Using “false promises of love and a

better future,” he lured Plaintiff “into a life of trafficking through traffickers who had access to her

and sold her through social media.” Her traffickers used Instagram to advertise Plaintiff as a

prostitute and to arrange “‘dates’ (that is, the rape of [Plaintiff] in exchange for money).” As a

result, Plaintiff was raped numerous times. Following Plaintiff’s rescue from the trafficking

scheme, traffickers continued to use her profile to attempt to entrap other minors in the same

manner. Plaintiff’s mother reported these activities to Facebook, which never responded.


       Cause No. 2019-16262 (151st Dist. Ct.). Plaintiff was fourteen years old in 2016 and used

an Instagram account, on which she identified herself as fourteen years old. She was not required

to verify her age or to link her account to that of a parent or guardian. Another Instagram user, a

man of about thirty with whom Plaintiff was not acquainted, “friended” her on Instagram. Between

2016 and 2018, the man and Plaintiff regularly exchanged messages. The correspondence was

part of his alleged efforts to “groom” Plaintiff to ensnare her in a sex-trafficking operation. In

March 2018, he convinced Plaintiff to sneak away from her home and meet him. Upon meeting

her, he took her to a motel, photographed her, and posted the pictures to Backpage. Plaintiff was

then raped repeatedly by men who responded to her traffickers’ posting on the site.

       Litigation Against Facebook. The plaintiffs in all three lawsuits (“Plaintiffs”) brought

essentially identical claims against Facebook. First, Plaintiffs allege Facebook owed them a duty

to exercise reasonable care to protect them from the “dangers of grooming and recruitment on [its

platforms] by sex traffickers.” Plaintiffs argue that Facebook breached this duty by, among other

omissions, its “[f]ailure[s] to warn” of those risks, “implement awareness campaigns” about “sex

traffickers using its website,” “verify the identity and/or age of users,” “implement any safeguards

to prevent adults from contacting minors,” “report suspicious messages between a minor and an

adult user,” “require accounts for minors to be linked to those of adults,” or “deprive known

criminals from having accounts.” “Facebook’s duty could have been satisfied,” Plaintiffs contend,

“through warnings posted on users’ feeds, e-mails to accounts run by users under the age of 18,

and/or through informing authorities of what it knew about red-flag activities and messages

between users.” Plaintiffs also brought gross-negligence and negligent-undertaking claims based

largely on the same allegations. Finally, Plaintiffs brought products-liability claims under the


theory that “[a]s a manufacturer, Facebook is responsible for the defective and unreasonabl[e]

characteristics in its . . . product[s].” Plaintiffs contend that these products, specifically Facebook

and Instagram, were “marketed to children under the age of 18, without providing adequate

warnings and/or instructions regarding the dangers of ‘grooming’ and human trafficking on [either

platform]. These dangerous warning and marketing defects were both the direct and producing

cause of [Plaintiffs’] trafficking.”1

        In addition to the foregoing common-law claims, Plaintiffs also sued under a Texas statute

that creates a civil cause of action against anyone “who intentionally or knowingly benefits from

participating in a venture that traffics another person.” TEX. CIV. PRAC. & REM. CODE § 98.002(a).

Plaintiffs claim “Facebook breached this duty by knowingly facilitating . . . sex trafficking.”

Facebook allegedly did so by “creating a breeding ground for sex traffickers to stalk and entrap

survivors,” “[r]aising advertising fees by extending its ‘user base’ to include sex traffickers,”

“[i]ncreasing profits by not using advertising space for public service announcements regarding

the dangers of . . . sex traffickers,” and “[i]ncreasing profit margins due to lower operation cost by

not implementing safeguards requiring verification of [users’] identit[ies].”

        In all three lawsuits, Facebook moved under Rule 91a to dismiss all claims as barred by

section 230. The motions were denied in relevant part by the district courts. Facebook sought

mandamus relief in the court of appeals. A divided panel denied relief without substantive



        1
           In Cause No. 2019-16262, the district court granted Facebook’s motion to dismiss the products-liability
claim on the ground that Facebook is not a “product” (an issue not raised here). In the other two suits, Facebook’s
briefing in support of its motions to dismiss argued for dismissal of the products-liability claims on section 230
grounds. We agree with Facebook that Plaintiffs’ products-liability claims in Cause Nos. 2018-82214 and 2018-69816
are properly before this Court.



explanation. 607 S.W.3d 839 (Tex. App.—Houston [14th Dist.] 2020). One justice would have

granted relief based on section 230. Id. at 839–40 (Christopher, J., dissenting). Facebook

petitioned this Court for writs of mandamus.2

                                                      II. Analysis

                                              A. Standard of Review

         “Mandamus relief is appropriate” to correct “a clear abuse of discretion” for which a relator

“has no adequate remedy by appeal.” In re Geomet Recycling LLC, 578 S.W.3d 82, 91 (Tex.

2019). In this proceeding, the principal point of contention is whether the district courts abused

their discretion by denying Facebook’s motions to dismiss based on section 230. The answer

depends on the meaning of section 230, a federal statute immunizing “interactive computer

service[s]” from certain liability stemming from content created “by []other information content



         2
            On January 14, 2021, this Court was informed that the district judge who denied Facebook’s Rule-91a
motions in Cause Nos. 2018-69816 and 2018-82214 no longer holds office. Since this “case is an original
proceeding under Rule 52,” we “must abate the proceeding to allow the successor [judge] to reconsider the original
[judge’s] decision.” TEX. R. APP. P. 7.2(b). Accordingly, on January 22nd, we abated this mandamus proceeding in
part “until further order of th[is] Court,” instructing the parties to report back within 60 days. The partial abatement did
not affect Cause No. 2019-16262, and oral argument proceeded as scheduled on February 24th. On March 23rd, the
parties submitted a status report explaining that the plaintiffs in both affected cases filed motions on February 11th
“asking [the successor judge] to adopt [her predecessor’s] order[s],” and that “Facebook filed . . . responses in
opposition” on March 8th. As far as this Court is aware, both motions remain pending.

          Although the judge now presiding over Cause Nos. 2018-69816 and 2018-82214 has not made a ruling of
which we are aware, Rule 7.2(b) does not require indefinite abatement. It requires only that we “abate the proceeding
to allow [the new judge] to reconsider the original [judge’s] decision.” That requirement has been met here. Over
four months have passed since the plaintiffs asked the new judge to adopt the original judge’s decision, and over three
months have passed since Facebook responded. The intervening time has been sufficient to “allow the successor to
reconsider the original [judge]’s decision.” “[M]andamus is a discretionary writ,” the availability of which depends
in part on our equitable judgment as to whether mandamus relief is an “efficient manner of resolving the dispute.” In
re Blevins, 480 S.W.3d 542, 544 (Tex. 2013). Our decision today resolves Facebook’s mandamus petition with respect
to Cause No. 2019-16262, and under these circumstances Rule 7.2(b) does not require us to refrain from doing the
same in the heretofore abated portions of this proceeding. This Court’s January 22nd abatement order regarding Cause
Nos. 2018-69816 and 2018-82214 is lifted, and Facebook’s entire mandamus petition regarding all three trial-court
cases is disposed of as described in this opinion.



provider[s].” 47 U.S.C. § 230(c), (e)(3).3 We review de novo the trial courts’ legal conclusions,

including their interpretations of federal statutes, since an error of law or an erroneous application

of law to facts is always an abuse of discretion. In re Geomet, 578 S.W.3d at 91–92.

         Although mandamus relief is often unavailable to correct the erroneous denial of a motion

to dismiss, it may nevertheless be warranted if a litigant would suffer “impairment or loss” of

“important substantive . . . rights” while awaiting the error’s correction on appeal. In re Prudential

Ins. Co. of Am., 148 S.W.3d 124, 136 (Tex. 2004). Among the rights that can only be vindicated

by dismissal are those conferred by “federal statutes [that] provid[e] covered defendants with

immunity from suit.” In re Academy, Ltd., __ S.W.3d __, __ (Tex. 2021). As we held today in

Academy, for example, the federal “Protection of Lawful Commerce in Arms Act” (“PLCAA”)

created such an entitlement to dismissal because the Act provided that certain actions “may not be

brought” against covered defendants. Id. at __. In that case, mandamus relief was warranted to

correct erroneous denial of a motion to dismiss based on the PLCAA because requiring a defendant

to “proceed[] to trial” and await the error’s correction on appeal “‘would defeat the substantive

right’ granted by the [statute]” to be free from burdensome litigation. Id. at __ (quoting In re

McAllen Med. Ctr., Inc., 275 S.W.3d 458, 465 (Tex. 2008)).

         The same is true here. Just as the PLCAA provides that certain actions “may not be

brought,” section 230 contains a materially identical instruction that “[n]o cause of action may be

brought . . . .” The two provisions are indistinguishable with respect to whether they create a



         3
           The parties agree that Facebook is an “interactive computer service.” The CDA defines “information
content provider” as anyone “responsible, in whole or in part, for the creation or development of information provided
through” an “interactive computer service.” § 230(f)(3). No party disputes that the messages sent to Plaintiffs through
Facebook or Instagram by sex traffickers qualify as “information provided by another information content provider.”

substantive right to be free of litigation, not just a right to be free of liability at the end of litigation.

Moreover, most federal cases interpreting section 230 hold that it confers “immunity from suit

rather than a mere defense to liability,” which is “effectively lost if a case is erroneously permitted

to go to trial.” Nemet Chevrolet, Ltd. v. Consumeraffairs.com, Inc., 591 F.3d 250, 254 (4th Cir.

2009) (quoting Brown v. Gilmore, 278 F.3d 362, 366 n.2 (4th Cir. 2002)). For this reason, federal

courts urge “resol[ution of] the question of § 230 immunity at the earliest possible stage” of

litigation so as to vindicate the provision’s protections against “costly and protracted legal battles.”

Id. at 255 (quoting Roommates.Com, 521 F.3d at 1175). As in Academy, if the denials of

Facebook’s motions to dismiss were erroneous, the company lacks an adequate appellate remedy

because its federal statutory right to avoid litigation of this nature would be impaired if it had to

await relief on appeal. In re Prudential, 148 S.W.3d at 136.

        Facebook moved for dismissal based solely on section 230, not on the ground that

Plaintiffs’ allegations do not state a cognizable claim under section 98.002 or under any of their

other legal theories. As a result, we do not consider whether Plaintiffs’ allegations are sufficient

to support the claims asserted. We consider only whether Plaintiffs’ claims, as pleaded, “treat[]”

Facebook “as the publisher or speaker” of third-party content in conflict with section 230. If they

do, the claims “may [not] be brought” and must be dismissed. 47 U.S.C. § 230(c)(1), (e)(3).

                                        B. Governing Principles

        When interpreting a federal statute, this Court generally follows the decisions of the U.S.

Supreme Court. In re Morgan Stanley & Co., 293 S.W.3d 182, 189 (Tex. 2009). The Supreme

Court has not addressed the scope of section 230. It has, however, often stated principles of

statutory interpretation with which we agree. The first is that “when the statutory language is


plain,” courts “must enforce it according to its terms.” Jimenez v. Quarterman, 555 U.S. 113, 118

(2009). “[U]nless otherwise defined, words will be interpreted as taking their ordinary . . . meaning

. . . . at the time Congress enacted the statute.” Perrin v. United States, 444 U.S. 37, 42 (1979). It

is a “cardinal rule,” as well, “that a statute is to be read as a whole, since the meaning of statutory

language, plain or not, depends on context.” King v. St. Vincent’s Hosp., 502 U.S. 215, 221 (1991).

We therefore “consider not only the bare meaning of [each] word” but also the relevant “statutory

scheme.” Bailey v. United States, 516 U.S. 137, 145 (1995). In addition, the objective meaning

conveyed by text may depend on the “backdrop against which Congress enacted [it].” Stewart v.

Dutra Constr. Co., 543 U.S. 481, 487 (2005). Understanding this backdrop is crucial in construing

“term[s] of art” with “established meaning[s]” in the law. Id. We presume, absent a contrary

indication, that “Congress intends to incorporate the well-settled meaning of [such] common-law

terms.” Neder v. United States, 527 U.S. 1, 23 (1999). Still, “the authoritative statement is the

statutory text.” Exxon Mobil Corp. v. Allapattah Servs., Inc., 545 U.S. 546, 568 (2005).4

         In addition to construing the CDA, we must also “inquir[e] whether the laws” of Texas




         4
             Both parties and amici ask us to rely on legislative history when interpreting federal statutes, an invitation
we decline. The use of legislative history to ascertain congressional “intent” has been quite controversial, and the
Supreme Court observed recently that the practice “is vulnerable to . . . serious criticisms.” Allapattah, 545 U.S. at
568. “First, legislative history is itself often murky, ambiguous, and contradictory.” Id. It “serves as an ever-present
judicial mercenary, embraced when helpful and ignored when not,” and courts’ consultation of it often devolves into
“‘looking over a crowd and picking out your friends.’” Tex. Mut. Ins. Co. v. Ruttiger, 381 S.W.3d 430, 457 n.5 (Tex.
2012) (Willett, J., concurring). “Second, judicial reliance” on legislative history permits “strategic manipulation[]” of
the lawmaking process by unrepresentative factions of legislators or legislative staff in order “to secure results they
were unable to achieve through the statutory text.” Allapattah, 545 U.S. at 568. More fundamentally, judicial reliance
on legislative history risks attaching authoritative weight to statements not subject to “the constitutionally prescribed
process of bicameralism and presentment.” John F. Manning, Textualism as a Nondelegation Doctrine, 97 COLUM.
L. REV. 673, 676 (1997). We therefore agree with the U.S. Supreme Court’s venerable observation that, “[i]n
expounding [a] law,” a “court cannot, in any degree, be influenced by . . . debate which took place on its passage
. . . . The law as it passed is the will of the majority of both houses, and the only mode in which that will is spoken is
in the act itself.” Aldridge v. Williams, 44 U.S. 9, 24 (1845).

“have, in their application to this case, come into collision with [this] act of Congress, and deprived

[Facebook] of a right to which that act entitles [it]. Should this collision exist,” state law “must

yield to the law of Congress.” Gibbons v. Ogden, 22 U.S. 1, 210 (1824) (construing U.S. CONST.

art. VI, cl. 2). Because “[p]re-emption . . . shield[s] the system from conflicting regulation of

conduct,” it is “the conduct being regulated, not the formal description of governing legal

standards, that is the proper focus of concern.” Amalgamated Ass’n of St., Elec. Ry. & Motor

Coach Emps. of Am. v. Lockridge, 403 U.S. 274, 292 (1971). Thus, the determination of whether

state tort suits are preempted by federal law depends not merely on how the claims are labelled,

but also on careful consideration of the substance of the claims. Simply because a “cause[] of

action,” under some circumstances, “might escape . . . preemption” does not mean that the

“particular . . . claims” of a given plaintiff necessarily do. Quest Chem. Corp. v. Elam, 898 S.W.2d

819, 820 (Tex. 1995). We therefore look beyond a cause of action’s name to the underlying

conduct or duty on which it is based. Id. at 821.5

                                            C. The Federal Statute

         Section 230 was enacted in 1996 as part of the “Communications Decency Act” (“CDA”).

See Pub. L. No. 104-104, § 509, 110 Stat. 56. Having determined that the “Internet . . . ha[d]

flourished, to the benefit of all Americans, with a minimum of government regulation,” Congress



         5
           The parties dispute whether the presumption against preemption should guide our analysis of whether the
CDA bars Plaintiffs’ claims. Facebook points to recent Supreme Court cases holding that if a “statute ‘contains an
express pre-emption clause’” (as the CDA does), courts “do not invoke any presumption against pre-emption but
instead ‘focus on the plain wording of the clause, which necessarily contains the best evidence of Congress’ pre-
emptive intent.’” Puerto Rico v. Franklin Cal. Tax-Free Tr., 136 S. Ct. 1938, 1946 (2016) (quoting Chamber of
Commerce of U.S. v. Whiting, 563 U.S. 582, 594 (2011)). Plaintiffs, on the other hand, rely on other cases explaining
that the presumption applies even to statutes with an express preemption clause if “the text of the pre-emption clause
is susceptible of more than one plausible reading,” at least “when Congress has legislated in a field traditionally
occupied by the States.” Altria Grp., Inc. v. Good, 555 U.S. 70, 77 (2008). We will not undertake to resolve this
doctrinal dispute, as the outcome of this case would be the same whether the presumption applies or not.
passed section 230 with the stated purposes of “preserv[ing] [a] vibrant and competitive free

market . . . for the Internet . . . , unfettered by Federal or State regulation”; “remov[ing]

disincentives” for “utiliz[ing] . . . blocking and filtering technologies”; and “ensur[ing] vigorous

enforcement of Federal criminal laws” against “obscenity, stalking, and harassment by means of

computer.” 47 U.S.C. § 230(a)(4), (b). It is widely acknowledged that section 230’s liability

protections were primarily a response to Stratton Oakmont, Inc. v. Prodigy Servs. Co., No.

31063/94, 1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995). In that case, a New York court held

that an online bulletin board could be held strictly liable for third parties’ defamatory posts.

Rejecting the defendant’s argument that it was a mere “distributor” of third-party content, the court

reasoned that the defendant’s occasional screening and editing of posts made it a primary publisher

and therefore responsible for defamation by users on its platform. See id. at *4.

        Section 230 “und[id] the perverse incentives created by this reasoning, which effectively

penalized providers for monitoring content.” Shiamili v. Real Estate Grp. of N.Y., Inc., 952 N.E.2d

1011, 1016 (N.Y. 2011). It did so by mandating that “[n]o provider . . . of an interactive computer

service shall be treated as the publisher or speaker of any information provided by another

information content provider” and that “[n]o cause of action may be brought and no liability may

be imposed . . . that is inconsistent with” this prohibition. 47 U.S.C. § 230(c)(1), (e)(3). In addition

to shielding interactive computer services from liability that “treat[s]” them as “publisher[s]” or

“speaker[s]” of third-party content, section 230 also shields them from liability for good-faith

efforts to “restrict access to or availability of material that the provider or user considers to be

obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable . . . .”

Id. § 230(c)(2)(A). Section 230’s dual protections are commonly understood to operate in tandem,


ensuring that a website is not discouraged by tort law from policing its users’ posts, while at the

same time protecting it from liability if it does not. See Batzel v. Smith, 333 F.3d 1018, 1027–28

(9th Cir. 2003); Roommates.Com, 521 F.3d at 1163.

       The meaning of section 230’s prohibition on treating an interactive computer service as the

“publisher” or “speaker” of third-party content is not entirely clear on the face of the statute.

Neither “publisher” nor “speaker” are defined terms, nor can those words’ common meanings tell

us precisely what it means for a cause of action to “treat[]” a defendant “as a publisher or speaker”

of third-party content. Abundant judicial precedent, however, provides ample guidance, nearly all

of it pointing in the same general direction. Federal and state courts have uniformly held that

section 230 “should be construed broadly in favor of immunity.” Force v. Facebook, Inc., 934

F.3d 53, 64 (2d Cir. 2019). The overwhelming weight of precedent “has resulted in a capacious

conception of what it means to treat a website operator as the publisher or speaker of information

provided by a third party.” Jane Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12, 19 (1st Cir. 2016).

The “national consensus,” Shiamili, 952 N.E.2d at 1017, is that “all claims” against internet

companies “stemming from their publication of information created by third parties” effectively

treat the defendants as publishers and are barred, Doe v. MySpace, Inc., 528 F.3d 413, 418 (5th

Cir. 2008). As it has been interpreted, the provision “establishe[s] a general rule that” web service

providers may not be held “legally responsible for information created . . . by third parties” if such

providers “merely enable[d] that content to be posted online.” Nemet Chevrolet, 591 F.3d at 254.

The cases are equally uniform in holding that a plaintiff in a state tort lawsuit cannot circumvent

section 230 through “artful pleading” if his “allegations are merely another way of claiming that

[a defendant] was liable” for harms occasioned by “third-party-generated content” on its website.


MySpace, 528 F.3d at 420.

         Plaintiffs contend that the courts have systematically misread section 230. They urge us to

adopt the view recently proffered by Justice Thomas, under which “the sweeping immunity courts

have read into” section 230 should be scaled back or at least reconsidered. Malwarebytes, 141 S.

Ct. at 18 (statement respecting denial of certiorari). Justice Thomas suggests that courts have erred

by confusing “publisher” with “distributor” liability: “Traditionally, laws governing illegal content

distinguished between publishers or speakers (like newspapers) and distributors (like newsstands

and libraries). Publishers . . . . could be strictly liable for transmitting illegal content. But

distributors” were “liable only when they knew (or constructively knew) that content was illegal.”

Id. at 14. Under this view, although section 230 “grants immunity only from ‘publisher’ or

‘speaker’ liability,” cases interpreting the provision have incorrectly held “that it eliminates

distributor liability too” by “confer[ring] immunity even when a company distributes content that

it knows is illegal.” Id. at 15.6

         We agree that Justice Thomas’s recent writing lays out a plausible reading of section 230’s

text. The fact remains, however, that this more restrictive view of section 230 was articulated in



         6
            Amici the State of Texas and the Governor of Texas ask us to take the narrower view of section 230(c)(1)
suggested by Justice Thomas. To the extent their interest in this case stems from their disagreement with recent actions
by social media companies like Facebook to block political speech the company deems dangerous or misleading, see
Brief of the Governor of Texas as Amicus Curiae at 1, their complaints seem better directed at section 230(c)(2)(A),
which protects internet companies from liability for censoring content the company deems “objectionable.” That
provision is not at issue here. This case involves the protections of section 230(c)(1), which has been interpreted to
insulate websites from liability for declining to censor dangerous, objectionable, or otherwise injurious content
generated by third-party users. For those who believe Facebook and other such platforms should refuse to censor their
users’ speech, it would seem that dialing back the protections of section 230(c)(1)—and thereby expanding the civil
liability these companies face for failing to censor allegedly objectionable posts—would be counterproductive.
See Zeran v. Am. Online, Inc., 129 F.3d 327, 333 (4th Cir. 1997) (Wilkinson, C.J.) (If “[web] service providers [were]
subject to liability only for the publication of information” but “not for its removal, they would have a natural incentive
simply to remove messages,” resulting in “a chilling effect on the freedom of Internet speech.”).

a statement respecting the denial of certiorari, not in the decision of a case, and every existing

judicial decision interpreting section 230 takes the contrary position. If the more limited view of

section 230 were compelled by a plain reading of the statutory text, we would be compelled,

despite the contrary precedent, to follow what the statute says. After all, it is our “right and duty

. . . to interpret and to follow . . . all [federal] laws” according to our best understanding of their

meaning, subject only “to a litigant’s right of review in [the U.S. Supreme] Court.” Idaho v. Coeur

d’Alene Tribe of Idaho, 521 U.S. 261, 275 (1997) (opinion of Kennedy, J.); accord Allen v.

McCurry, 449 U.S. 90, 105 (1980). But if the more limited view is only one reasonable reading

of the text—and if the broader view is also reasonable—we are hard pressed to cast aside altogether

the universal approach of every court to examine the matter over the twenty-five years of section

230’s existence.

        Justice Thomas’s view focuses on traditional distinctions in defamation law between

“publishers” and “distributors.” Malwarebytes, 141 S. Ct. at 15. But as the Fourth Circuit in Zeran

observed, in some ways “distributor liability . . . is merely a subset . . . of publisher liability” within

defamation law. 129 F.3d at 332. As Justice Thomas acknowledges, many authorities refer to

“publishers” and “distributors,” or equivalent categories, as “primary publishers” and “secondary

publishers,” respectively. Id. Section 230’s text “precludes courts from treating internet service

providers as publishers not just for the purposes of defamation law . . . but in general.” Barnes v.

Yahoo!, Inc., 570 F.3d 1096, 1104 (9th Cir. 2009). Thus, it is not a clear departure from the

statutory text to understand section 230’s use of the word “publisher” to include both “primary”

and “secondary” publishers—that is, to view “publisher” in the broader, generic sense adopted in

Zeran and the many decisions following it. Neither would it be unreasonable to conclude that, in


this context, the “publisher” liability covered by section 230 should be distinguished from

“distributor” liability, which section 230 does not mention. As a textual matter, both the broad

and the narrow senses of the word “publisher” are viable readings of section 230(c)(1).

       Which reading is superior, given the statutory context and the other canons of construction,

is a question the U.S. Supreme Court may soon take up. Justice Thomas may be correct that many

courts interpreting section 230 have “filter[ed] their decisions through . . . policy argument[s]” or

otherwise “emphasized nontextual” considerations. Malwarebytes, 141 S. Ct. at 14, 18. In the

end, however, the construction of the provision at which these courts have arrived is a defensible

reading of its plain language. Imposing a tort duty on a social media platform to warn of or protect

against malicious third-party postings would in some sense “treat” the platform “as a publisher”

of the postings by assigning to the platform editorial or oversight duties commonly associated with

publishers. See Green v. Am. Online (AOL), 318 F.3d 465, 471 (3d Cir. 2003) (explaining that

“decisions relating to the monitoring, screening, and deletion of content from” a platform or

network are “actions quintessentially related to a publisher’s role”).

       It also bears mentioning that Congress, with knowledge of the prevailing judicial

understanding of section 230, has twice expanded its scope. See Lorillard v. Pons, 434 U.S. 575,

581 (1978) (When Congress “adopts a new law incorporating sections of a prior law, [it] normally

can be presumed to have had knowledge of the interpretation given to the incorporated law[.]”).

First, in 2002, Congress extended section 230 protections to an additional class of entities. 47

U.S.C. § 941(e)(1) (promoting a child-friendly segment of the internet and extending section 230’s

protections to many of those involved). Second, in 2010, Congress mandated that U.S. courts

“shall not recognize or enforce” foreign defamation judgments that are inconsistent with section


230, while also making clear that “[n]othing in” the applicable sections of the 2010 statute “shall

be construed to . . . limit the applicability of section 230 . . . to causes of action for defamation.”

28 U.S.C. § 4102(c), (e). Several courts have recognized that these legislative extensions of CDA

immunity, modest as they were, are nonetheless some evidence of Congress’s lack of objection to

the courts’ interpretation of section 230.7

        Finally, the broader view of section 230, which originated with Zeran, has been widely

accepted in both state and federal courts, including by the Fifth Circuit. See MySpace, 528 F.3d

at 418–20. This Court, while bound only by decisions of the U.S. Supreme Court, is generally

hesitant to contradict the “overwhelming weight of authority” from both lower federal courts and

state courts on federal questions. In re Morgan Stanley, 293 S.W.3d at 189. We have been

especially “reluctant to depart from the Fifth Circuit’s” construction of federal statutes, as doing

so “would allow parties in Texas to choose how the statute will be applied merely by choosing a

court system.” W & T Offshore, Inc. v. Fredieu, 610 S.W.3d 884, 892 n.1 (Tex. 2020). As a state

court interpreting a federal statute where all existing precedent adopts one of multiple plausible

readings of the statutory text, our best course is to follow the precedent.

        None of this is to suggest that this Court or any other should “resolve questions such as the

one before us by a show of hands.” CSX Transp., Inc. v. McBride, 564 U.S. 685, 715 (2011)

(Roberts, C.J., dissenting). No “weight of authority” is overwhelming enough to justify departure

from unequivocal statutory text. But when faced with a statute that reasonably lends itself to



        7
           See Barrett v. Rosenthal, 146 P.3d 510, 523 (Cal. 2006); Doe ex rel. Roe v. Backpage.com, LLC, 104 F.
Supp. 3d 149, 155–56 (D. Mass. 2015), aff’d, 817 F.3d 12; Jones v. Dirty World Entm’t Recordings LLC, 755 F.3d
398, 408 (6th Cir. 2014); see also Roommates.Com, 521 F.3d at 1187–88 (McKeown, J., concurring in part and
dissenting in part).

multiple readings, we promote stability and predictability in the law by adopting the position

unanimously taken by other courts if the text permits. Here in particular, we must be mindful that

“across the country, . . . entities and individuals doing business” using the internet “have for many

years relied on” the broad immunity from suit conferred by section 230 as interpreted by the courts,

“negotiating their contracts and structuring their . . . transactions against a backdrop of [that]

immunity.” Michigan v. Bay Mills Indian Cmty., 572 U.S. 782, 798–99 (2014). Plaintiffs’ narrow

view of section 230, while textually plausible, is not so convincing as to compel us to upset the

many settled expectations associated with the prevailing judicial understanding of section 230.8

                                     D. Plaintiffs’ Common-Law Claims

         The essence of Plaintiffs’ negligence, gross-negligence, negligent-undertaking, and

products-liability claims is that, because Plaintiffs were users of Facebook or Instagram, the

company owed them a duty to warn them or otherwise protect them against recruitment into sex

trafficking by other users. Facebook violated that duty, Plaintiffs contend, by its failures to

“implement any safeguards to prevent adults from contacting minors,” “report suspicious

messages,” “warn[] of the dangers posed by sex traffickers,” or “identify[] sex traffickers on its

Platforms.” Under the view of section 230 adopted in every published decision of which we are




         8
             Even if we were to adopt the position that section 230 affords protection from “publisher” but not
“distributor” liability, it is not clear that Plaintiffs’ common-law claims would survive. The common law imposed on
a distributor “no duty to examine the various publications that he offers . . . to ascertain whether they contain any
defamatory items. Unless there are special circumstances” suggesting “a particular publication is defamatory, he is
under no duty to ascertain its . . . character.” RESTATEMENT (SECOND) OF TORTS § 581, cmt. d (1977) (emphasis
added). Proof of Facebook’s actual or constructive knowledge of any particular communication’s wrongful character
is not an element of Plaintiffs’ claims (nor do Plaintiffs allege such specific knowledge on Facebook’s part). As a
result, interpreting section 230 to allow traditional distributor liability to be imposed on Facebook might not save these
claims from dismissal.

aware, these claims “treat[]” Facebook “as the publisher or speaker” of third-party communication

and are therefore barred.

       Plaintiffs argue that their common-law claims do not treat Facebook as a “publisher” or

“speaker” because they “do not seek to hold [it] liable for exercising any sort of editorial function

over its users’ communications,” but instead merely for its own “failure to implement any

measures to protect them” from “the dangers posed by its products.” Yet this theory of liability,

while phrased in terms of Facebook’s omissions, would in reality hold the company liable simply

because it passively served as an “intermediar[y] for other parties’ . . . injurious messages.” Zeran,

129 F.3d at 330–31. Put differently, “the duty that [Plaintiffs] allege[] [Facebook] violated”

derives from the mere fact that the third-party content that harmed them was transmitted using the

company’s platforms, which is to say that it “derives from [Facebook’s] status . . . as a ‘publisher

or speaker’” of that content. Barnes, 570 F.3d at 1102. These claims seek to impose liability on

Facebook for harm caused by malicious users of its platforms solely because Facebook failed to

adequately protect the innocent users from the malicious ones. All the actions Plaintiffs allege

Facebook should have taken to protect them—warnings, restrictions on eligibility for accounts,

removal of postings, etc.—are actions courts have consistently viewed as those of a “publisher”

for purposes of section 230. Regardless of whether Plaintiffs’ claims are couched as failure to

warn, negligence, or some other tort of omission, any liability would be premised on second-

guessing of Facebook’s “decisions relating to the monitoring, screening, and deletion of [third-

party] content from its network.” Green, 318 F.3d at 471.

       This is no less true simply because Facebook’s alleged negligent omissions include failures

to “require accounts for minors to be linked to those of adults” or “deprive known criminals from


having accounts.” These claims may be couched as complaints about Facebook’s “design and

operation” of its platforms “rather than . . . its role as a publisher of third-party content,” but the

company’s “alleged lack of safety features ‘is only relevant to [Plaintiffs’] injur[ies] to the extent

that such features’” would have averted wrongful communication via Facebook’s platforms by

third parties. Herrick v. Grindr LLC, 765 Fed. Appx. 586, 590 (2d Cir. 2019) (quoting Herrick v.

Grindr, LLC, 306 F. Supp. 3d 579, 591 (S.D.N.Y. 2018)). At bottom, these “claims seek to hold”

Facebook “liable for its failure to combat or remove offensive third-party content, and are barred

by § 230.” Id.9

         Plaintiffs’ failure-to-warn theory suffers from the same infirmities. The warnings Plaintiffs

seek would only be necessary because of Facebook’s allegedly inadequate policing of third-party

content transmitted via its platforms. “Although it is indirect, liability under such a theory

nevertheless” ultimately arises from the company’s transmission of the harmful content. Herrick,

306 F. Supp. 3d at 591. Moreover, “a warning about third-party content is a form of editing, just

as much as a disclaimer printed at the top of a page of classified ads in a newspaper would be.”

Id. at 592 n.8; accord McDonald v. LG Elecs. USA, Inc., 219 F. Supp. 3d 533, 538 (D. Md. 2016).

         Plaintiffs’ products-liability claims are likewise premised on the alleged failure by

Facebook to “provid[e] adequate warnings and/or instructions regarding the dangers of ‘grooming’



         9
             Plaintiffs argue that their claims may proceed because they “are not based on Facebook’s decision[s] to
publish or alter certain content.” But it is irrelevant whether the allegedly tortious conduct relates to “certain” third-
party content or to third-party content in general. “If the cause of action . . . would treat the [defendant] as the publisher
of a particular posting, [section 230] immunity applies not only for . . . decisions with respect to that posting, but also
for . . . decisions about how to treat postings generally.” Universal Commc’n Sys., Inc. v. Lycos, Inc., 478 F.3d 413,
422 (1st Cir. 2007). Facebook’s decision not to combat potentially harmful communication “by changing its web site
policies” on warnings, flagging of messages, or who may establish an account on its platforms “was as much an
editorial decision” regarding third-party content “as a decision not to delete a particular posting.” Id.; cf. Miami Herald
Pub. Co. v. Tornillo, 418 U.S. 241, 258 (1974).

and human trafficking” on its platforms. Like Plaintiffs’ other common-law claims, these claims

seek to hold Facebook liable for failing to protect Plaintiffs from third-party users on the site. For

that reason, courts have consistently held that such claims are barred by section 230. This has been

the unanimous view of other courts confronted with claims alleging that defectively designed

internet products allowed for transmission of harmful third-party communications. See Herrick,

765 Fed. Appx. at 590; Inman v. Technicolor USA, Inc., No. 11-cv-666, 2011 WL 5829024, at *8

(W.D. Pa. Nov. 18, 2011); Doe v. MySpace, Inc., 629 F. Supp. 2d 663 (E.D. Tex. 2009).10

        This result aligns with the Fifth Circuit’s 2008 decision in a case involving very similar

facts. There, the plaintiff and a sexual predator “met and exchanged personal information” using

the defendant’s website, “which eventually led to an in-person meeting and the sexual assault of

[the plaintiff].” MySpace, 528 F.3d at 419 (quoting Doe v. MySpace, Inc., 474 F. Supp. 2d 843,

849 (W.D. Tex. 2007)). The court held that the plaintiff’s claims were “barred by the CDA”

because her allegation that the defendant was “liable for its failure to implement measures that

would have prevented [the plaintiff] from communicating with [her attacker]” was “merely another

way of claiming that [the defendant] was liable for publishing the communications.” Id. at 420.

Courts around the country have consistently held that section 230 protects defendants from similar

claims of failure to warn of harmful third-party content or negligent failure to protect users from


        10
            While there have been a few instances in which products-liability claims against websites have been
allowed to proceed despite defendants’ CDA objections, see Bolger v. Amazon.com, LLC, 267 Cal. Rptr. 3d 601, 626–
27 (Ct. App. 2020); Erie Ins. Co. v. Amazon.com, Inc., 925 F.3d 135, 139–40 (4th Cir. 2019); State Farm Fire & Cas.
Co. v. Amazon.com, Inc., 390 F. Supp. 3d 964, 973–74 (W.D. Wis. 2019), plaintiffs in those cases alleged that
defendants provided tangible goods that caused physical injury or property damage. Here, by contrast (as in other
cases in which products-liability claims have been held barred by section 230), the allegedly harmful attribute of
Facebook’s “products” was that they permitted transmission of third-party communication that resulted in harm to
Plaintiffs.



third-party content.11

         Plaintiffs rely heavily on one federal appellate decision that denied CDA immunity to a

website operator sued for failure to warn of the risk that sexual predators would use the site to lure

victims. In that case, however, the plaintiff’s failure-to-warn claim had “nothing to do with [the

defendant’s] efforts, or lack thereof, to edit, monitor, or remove user generated content.” Doe v.

Internet Brands, Inc., 824 F.3d 846, 852, 853 (9th Cir. 2016). Instead, the plaintiff claimed that

the defendant was liable based on its actual knowledge of the particular rape scheme of which she

was a victim. The defendant obtained that knowledge not “from any monitoring of postings on

[its] website,” but instead “from an outside source of information.” Id. at 851, 853. Indeed, the

plaintiff did not “allege that [her rapists] posted anything to the website” or that she was “lured by

any posting that [the defendant] failed to remove.” Id. at 851. Internet Brands is therefore “best

read as holding that the CDA does not immunize an [interactive computer service] from a failure

to warn claim when the alleged duty to warn arises from something other than user-generated

content.” Herrick, 306 F. Supp. 3d at 592. “By contrast,” Plaintiffs’ “proposed warning in this

case would be about user-generated content itself.” Id.

         Plaintiffs further argue that section 230’s prohibition on “treat[ing] an internet company as

a ‘publisher or speaker’” preempts only suits that “allege or implicate defamation,” since this was

the primary “kind of liability Congress had in mind” when it enacted the provision. This proposed


         11
           For cases involving failure to warn, see Herrick, 765 Fed. Appx. 586; Doe v. Kik Interactive, Inc., 482 F.
Supp. 3d 1242 (S.D. Fla. 2020); McMillan v. Amazon.com, Inc., 433 F. Supp. 3d 1034, 1045 (S.D. Tex. 2020);
McDonald, 219 F. Supp. 3d at 538; Hinton v. Amazon.com.dedc, LLC, 72 F. Supp. 3d 685, 687, 692 (S.D. Miss. 2014);
and Oberdorf v. Amazon.com Inc., 930 F.3d 136, 153 (3d Cir. 2019), reh’g en banc granted, opinion vacated, 936
F.3d 182 (3d Cir. 2019). For cases involving negligence or related causes of action, see Zeran, 129 F.3d at 332; Green,
318 F.3d at 471; Daniel v. Armslist, LLC, 926 N.W.2d 710, 725–26 (Wis. 2019); MySpace, 528 F.3d at 420; Barnes,
570 F.3d at 1103; and Klayman v. Zuckerberg, 753 F.3d 1354, 1359 (D.C. Cir. 2014).

limitation on section 230 has been rejected by every court that has considered it. See Force, 934

F.3d at 64 n.18. Given that “Congress enacted the [CDA] in part to respond” to Stratton Oakmont,

the “cause of action most frequently associated with . . . section 230 is defamation.” Barnes, 570

F.3d at 1101. “But statutory prohibitions often go beyond the principal evil . . . . Congress was

concerned with when it enacted [them] . . . . , and it is ultimately the provisions of our laws rather

than the principal concerns of our legislators by which we are governed.” Oncale v. Sundowner

Offshore Servs., Inc., 523 U.S. 75, 79 (1998). Section 230’s text neither mentions defamation nor

“limit[s] its application to defamation cases.” Barnes, 570 F.3d at 1101. Courts have recognized

as much by extending section 230 to “a wide variety” of claims, “including housing

discrimination,” “securities fraud,” and “cyberstalking.” Backpage.com, 817 F.3d at 19.

       In sum, Plaintiffs’ claims for negligence, gross negligence, negligent undertaking, and

products liability—all premised on Facebook’s alleged failures to warn or to adequately protect

Plaintiffs from harm caused by other users—are barred by section 230 and must be dismissed.

                                       E. Statutory Claims

       Plaintiffs also sued Facebook under a Texas statute creating a civil cause of action against

anyone “who intentionally or knowingly benefits from participating in a venture that traffics

another person.” TEX. CIV. PRAC. & REM. CODE § 98.002(a). According to Plaintiffs, Facebook

violated this statute through such “acts and omissions” as “knowingly facilitating the sex

trafficking of [Plaintiffs]” and “creat[ing] a breeding ground for sex traffickers to stalk and entrap

survivors.” As explained below, we conclude that section 230 does not bar these claims.

       The relevant language in section 98.002(a) is borrowed almost verbatim from the Texas

statute criminalizing the same conduct. See TEX. PENAL CODE § 20A.02(a). The text of that law


itself closely resembles a federal statute. See 18 U.S.C. § 1591(a).12 Liability under these statutes

requires a showing that a defendant acquired a benefit by “participat[ing]” in a human-trafficking

“venture.” Such “participation” connotes more than mere passive acquiescence in trafficking

conducted by others. This much is evident from the common meaning of “participate,”

representative definitions of which include, “[t]o be active or involved in something; take part,”

AMERICAN HERITAGE DICTIONARY OF THE ENGLISH LANGUAGE (5th ed. 2016) (emphasis added);

and, “to take part, be or become actively involved, or share (in),” COLLINS ENGLISH DICTIONARY

(12th ed. 2014) (emphasis added). Definitions vary, of course, but a common thread among them

is the understanding that “participation” consists, at a minimum, of some affirmative act.13

         Courts analyzing what it means to “participate” in a “venture” in the criminal context have

consistently required more than passive acquiescence in the wrongdoing of others. “Participation”

typically entails, at a minimum, an overt act in furtherance of the venture. See, e.g., United States

v. Hewitt, 663 F.2d 1381, 1385 (11th Cir. 1981) (“To prove participation, the evidence must show

that the defendant committed an overt act designed to aid in the success of the venture.”).14 A

similar standard is regularly applied in the civil context as well.15 This overt-act conception of


         12
           Section 1591 makes it a crime to “knowingly . . . benefit[], financially or by receiving anything of value,
from participation in a venture which has engaged in” sex trafficking, as defined by the statute. 18 U.S.C. § 1591(a).
         13
             See, e.g., Participation, BLACK’S LAW DICTIONARY (11th ed. 2019) (“The act of taking part in something
. . . .”) (emphasis added); Participation, CAMBRIDGE BUSINESS ENGLISH DICTIONARY (2011) (“the act of taking part
in an event or activity”); Participation, OXFORD ADVANCED AMERICAN DICTIONARY (2011) (“the act of taking part
in an activity or event”).
         14
           Accord United States v. Pope, 739 F.2d 289, 291 (7th Cir. 1984); United States v. Searan, 259 F.3d 434,
444 (6th Cir. 2001); United States v. Longoria, 569 F.2d 422, 425 (5th Cir. 1978); Paredes v. State, 129 S.W.3d 530,
536 (Tex. Crim. App. 2004); Kutzner v. State, 994 S.W.2d 180, 187 (Tex. Crim. App. 1999).
         15
           See Landy v. Fed. Deposit Ins. Corp., 486 F.2d 139, 163–64 (3d Cir. 1973); IIT, an Intern. Inv. Tr. v.
Cornfeld, 619 F.2d 909, 922, 925 (2d Cir. 1980), abrogated on other grounds by Morrison v. Nat’l Austl. Bank Ltd.,
561 U.S. 247 (2010); Zoelsch v. Arthur Andersen & Co., 824 F.2d 27, 35–36 (D.C. Cir. 1987), abrogated on other
grounds by Morrison, 561 U.S. 247; In re Amaranth Nat. Gas Commodities Litig., 730 F.3d 170, 182 (2d Cir. 2013);
SEC v. Quiros, No. 16-CV-21301, 2016 WL 11578637, at *15 (S.D. Fla. Nov. 21, 2016).


This overt-act conception of “participation” is further supported by a recent decision construing 18 U.S.C. § 1591(a), the federal

criminal analog of section 98.002(a) of the Civil Practice and Remedies Code. The Sixth Circuit

held that in order to prove “participat[ion]” in a “venture” for section 1591(a) purposes, “a

defendant [must have] actually . . . commit[ted] some ‘overt act’ that furthers the sex trafficking

aspect of the venture.” United States v. Afyare, 632 Fed. Appx. 272, 286 (6th Cir. 2016). “[T]he

statute d[oes] not criminalize . . . ‘mere negative acquiescence.’” Id. (quoting United States v.

Afyare, No. 3:10-CR-00260, 2013 WL 2643408, at *12 (M.D. Tenn. June 12, 2013)).16

         “In construing” section 98.002(a), “we presume that the Legislature acted with knowledge

of th[is] background law and with reference to it.” City of Round Rock v. Rodriguez, 399 S.W.3d

130, 137 (Tex. 2013). By employing terms such as “participat[ion]” and “venture,” which have

well-established meanings in related legal contexts, the legislature is presumed to have adopted

the prevailing judicial understanding of those words. Thus, to charge Facebook with “intentionally

or knowingly benefit[ting] from participating in a [trafficking] venture” is to charge it with “some

affirmative conduct”—that is, “an overt act” beyond “mere negative acquiescence”—“designed to

aid in the success of the venture.” Longoria, 569 F.2d at 425. It follows that a claim under section

98.002 arises not merely from a website’s failure to take action in response to the injurious

communications of others, but instead from the website’s own affirmative acts to facilitate injurious communications.



         16
            Unlike TEX. CIV. PRAC. & REM. CODE § 98.002(a), section 1591 now expressly defines “participation in a
venture” as “knowingly assisting, supporting, or facilitating a violation” of the statute’s sex-trafficking prohibition.
This definition, however, was added by FOSTA in 2018. See Pub. L. No. 115-164, § 5. Afyare reached its conclusion
about the meaning of “participation” without the benefit of any statutory definition, so the case remains persuasive
authority on the same word’s meaning in section 98.002(a).


         This distinction—between passive acquiescence in the wrongdoing of others and

affirmative acts encouraging the wrongdoing—is evident in Plaintiffs’ allegations, which we

construe liberally at the Rule-91a stage. See City of Dallas v. Sanchez, 494 S.W.3d 722, 725 (Tex.

2016). While many of Plaintiffs’ allegations accuse Facebook of failing to act as Plaintiffs believe

it should have, the section 98.002 claims also allege overt acts by Facebook encouraging the use

of its platforms for sex trafficking. For instance, the petitions state that Facebook “creat[ed] a

breeding ground for sex traffickers to stalk and entrap survivors”; that “Facebook . . . knowingly

aided, facilitated and assisted sex traffickers, including the sex trafficker[s] who recruited

[Plaintiffs] from Facebook” and “knowingly benefitted” from rendering such assistance; that

“Facebook has assisted and facilitated the trafficking of [Plaintiffs] and other minors on

Facebook”; and that Facebook “uses the detailed information it collects and buys on its users to

direct users to persons they likely want to meet” and, “[i]n doing so, . . . facilitates human

trafficking by identifying potential targets, like [Plaintiffs], and connecting traffickers with those

individuals.” Read liberally in Plaintiffs’ favor, these statements may be taken as alleging

affirmative acts by Facebook to encourage unlawful conduct on its platforms.17

         The available precedent indicates that Facebook enjoys no CDA immunity from claims

founded on such allegations. For instance, the Ninth Circuit has held that defendants lose their

CDA immunity if they go beyond acting as “passive transmitter[s] of information provided by others.” Roommates.Com, 521 F.3d at 1166.




         17
           We do not address whether, at the Rule-91a stage, Plaintiffs’ claims have “no basis in law, no basis in fact,
or both” for some reason other than section 230. In this mandamus proceeding, we consider only the section 230
arguments raised in Facebook’s motions to dismiss.

A defendant that operates an internet platform “in a

manner that contributes to,” or is otherwise “directly involved in,” “the alleged illegality” of third

parties’ communication on its platform is “not immune.” Id. at 1169. Here, Plaintiffs’ statutory

cause of action is predicated on allegations of Facebook’s affirmative acts encouraging trafficking

on its platforms. These allegations differ from Plaintiffs’ common-law claims, under which

Facebook is accused only of “providing neutral tools to carry out what may be unlawful or illicit”

communication by its users. Id. The common-law claims are “based on [Facebook’s] passive

acquiescence in the misconduct of its users,” for which the company is “entitled to CDA

immunity.” Id. at 1169 n.24. Like the Ninth Circuit, however, we understand the CDA to stop

short of immunizing a defendant for its “affirmative acts . . . contribut[ing] to any alleged

unlawfulness” of “user-created content.” Id. Facebook’s alleged violations of TEX. CIV. PRAC. &

REM. CODE § 98.002(a) fall in the latter category. These allegations do not treat Facebook as a

publisher who bears responsibility for the words or actions of third-party content providers.

Instead, they treat Facebook like any other party who bears responsibility for its own wrongful

acts. Other courts have drawn a similar line. See, e.g., FTC v. Accusearch Inc., 570 F.3d 1187,

1199–201 (10th Cir. 2009); J.S. v. Vill. Voice Media Holdings, LLC, 359 P.3d 714, 718 (Wash.

2015); Dirty World Entm’t Recordings, 755 F.3d at 413.

       We find it highly unlikely that Congress, by prohibiting treatment of internet companies

“as . . . publisher[s],” sought to immunize those companies from all liability for the way they run

their platforms, even liability for their own knowing or intentional acts as opposed to those of their

users. Section 230 itself declares it “the policy of the United States . . . to ensure vigorous

enforcement of Federal criminal laws” against “obscenity, stalking, and harassment by means of


computer.” 47 U.S.C. § 230(b). Nothing in section 230’s operative text goes so far as to “create

a lawless no-man’s-land on the Internet” in which online platforms like Facebook are free to

actively encourage human trafficking. Roommates.Com, 521 F.3d at 1164.

         This view of section 230 conflicts directly with the First Circuit’s 2016 decision in Jane

Doe No. 1 v. Backpage.com, LLC, 817 F.3d 12.18 Congress, however, responded to the Backpage

decision in 2018 by enacting the “Allow States and Victims to Fight Online Sex Trafficking Act”

(“FOSTA”), Pub. L. No. 115-164, 132 Stat. 1253. FOSTA provides that “[n]othing in [section

230] (other than subsection (c)(2)(A)) shall be construed to impair or limit any claim in a civil

action brought under section 1595 of Title 18, if the conduct underlying the claim constitutes a

violation of section 1591 of that title.” 47 U.S.C. § 230(e)(5)(A).

         Both parties argue that FOSTA’s changes to section 230 support their positions. As

Facebook understands FOSTA, the 2018 amendments carved out particular causes of action from

the scope of what section 230 otherwise covers. These carved-out claims include a civil action

under 18 U.S.C. § 1595 and certain state criminal prosecutions but not civil human-trafficking

claims under state statutes. Although a state-law claim under section 98.002 looks much like the

federal cause of action created by section 1595, the similarity does not transform Plaintiffs’

statutory claims into suits “brought under” section 1595. In Facebook’s view, Congress’s

“meticulous . . . enumeration of exemptions . . . confirms that courts are not authorized to create additional exemptions.” Law v. Siegel, 571 U.S. 415, 424 (2014).19



         18
            In that case, the plaintiffs sued under 18 U.S.C. § 1595, alleging that the defendant website operator
“engaged in a course of conduct designed to facilitate sex traffickers’ efforts to advertise their victims on [its] website,”
which allegedly “led to [the plaintiffs’] victimization.” 817 F.3d at 16. The plaintiffs’ pleadings laid out many ways
in which the defendant had intentionally facilitated illegal activities on its site. Id. at 20. The First Circuit, however,
held that the defendant was nevertheless entitled to section 230 immunity.




         Plaintiffs disagree. They concede that FOSTA does not explicitly except civil human-

trafficking claims under state statutes from section 230’s reach. But FOSTA’s silence in that

regard does not answer whether such claims fell under section 230 to begin with. According to

Plaintiffs, the effect of FOSTA was not, as Facebook assumes, to carve out discrete claims that

would otherwise have been barred by section 230. Instead, FOSTA reflects Congress’s judgment

that such claims were never barred by section 230 in the first place. Under this reading, FOSTA’s

“exception” to section 230 immunity for federal section 1595 claims is not merely an exception.

Instead, it is Congress’s announcement of a rule of construction for section 230(c), under which

human-trafficking claims like those found in section 1595 were never covered by section 230. In

Plaintiffs’ view, by indicating that Backpage was wrong and that section 230 should not be

interpreted to bar federal civil statutory human-trafficking claims, Congress must also have been

indicating that analogous state civil statutory human-trafficking claims likewise are not barred.

After all, there is no conceivable difference between the two categories of claims with respect to

whether they “treat” defendants as “speakers or publishers.”

         For two reasons, we find Plaintiffs’ view of FOSTA’s impact more convincing. First, what

Facebook calls FOSTA’s “exceptions” to section 230 are not introduced with statutory language

denoting carve-outs (such as “notwithstanding” or “except that . . .”). Instead, Congress instructed

that “[n]othing in [section 230] . . . shall be construed to impair” certain claims.


         19
             If section 230(c) did not cover Plaintiffs’ statutory human-trafficking claims prior to FOSTA, it cannot be
the case that FOSTA brought those claims back under section 230 by failing to specifically exclude them. FOSTA
itself disclaims any such result: “Nothing in . . . the[se] amendments . . . shall be construed to . . . preempt any civil
action” filed after their enactment that was not “preempted by section 230” as it was “in effect on the day before” the
amendments’ enactment. Pub. L. No. 115-164, § 7.

The U.S. Supreme Court, in interpreting a materially identical proviso, declined to view it “as establishing an

exception to a prohibition that would otherwise reach the conduct excepted.” Edward J. DeBartolo

Corp. v. Fla. Gulf Coast Bldg. & Constr. Trades Council, 485 U.S. 568, 582 (1988). Rather, the

language in question “ha[d] a different ring to it.” Id. A clause stating that the provision to which

it applies “‘shall not be construed’ to forbid certain [activity],” was, in the Court’s view, better

read as “a clarification of the meaning of [the provision] rather than an exception” to its general

coverage. Id. at 586. The Court agreed with the Eleventh Circuit, which had also understood the

“shall not be construed” clause as “explain[ing] how the [section] should be interpreted rather than

creating an exception” to it. Fla. Gulf Coast Bldg. & Constr. Trades Council v. NLRB, 796 F.2d

1328, 1344 (11th Cir. 1986). Other courts have construed similar statutory language in the same

way.20

          Following this line of reasoning, we do not read FOSTA’s instruction that “[n]othing in

[section 230] . . . shall be construed to impair or limit any . . . civil action brought under [18 U.S.C.

§] 1595” to merely except section 1595 claims from the scope of what section 230 would otherwise

cover. Rather, the FOSTA proviso announces a rule of construction applicable to section 230.

Congress’s mandate that section 230 not “be construed” to bar federal civil statutory human-

trafficking claims necessarily dictates that section 230 must not be construed to bar materially




          20
            See, e.g., Garnett v. Renton Sch. Dist. No. 403, 987 F.2d 641, 645 (9th Cir. 1993) (interpreting statutory
proviso that began, “[n]othing in this [act] shall be construed . . .” as announcing “not exceptions to the Act,” but
rather “rules of construction” that “instruct . . . court[s] how to interpret the Act’s central command[s]”); accord Gov’t
of Guam ex rel. Guam Econ. Dev. Auth. v. United States, 179 F.3d 630, 635 (9th Cir. 1999); see also Hoffman v. Hunt,
126 F.3d 575, 582 (4th Cir. 1997) (interpreting state statutory proviso that began, “[t]his section shall not prohibit
[certain activities]” as “[b]y its terms, . . . prohibit[ing] nothing; rather, it serves as a rule of construction . . . . designed
to assure that the . . . statute is not construed to reach” particular conduct); Hammerman & Gainer, Inc. v. Bullock,
791 S.W.2d 330, 333 (Tex. App.—Austin 1990, no writ) (similar).

The elements of the two claims are very similar. If

liability under federal section 1595 would not treat defendants as “speakers or publishers” within

the meaning of section 230, it is hard to understand how liability under Texas’s section 98.002

could possibly do so.

       Second, another textual indicator favors Plaintiffs’ understanding of FOSTA’s effects. The

“Sense of Congress,” enacted as part of FOSTA’s text, was that “section 230 of the [CDA] was

never intended to provide legal protection to . . . websites that facilitate traffickers in advertising

the sale of unlawful sex acts with sex trafficking victims.” Pub. L. No. 115-164, § 2. If section

230 was “never intended” to immunize defendants against claims brought pursuant to 18 U.S.C.

§ 1595, it stands to reason that the provision also never afforded immunity from analogous state-

law causes of action. The “Sense of Congress” is merely a declaratory rather than an operative

provision. But there is widespread agreement that “[a] preamble, purpose clause, or recital is a

permissible indicator of meaning.” ANTONIN SCALIA & BRYAN A. GARNER, READING LAW: THE

INTERPRETATION OF LEGAL TEXTS 193, 194 (2012). When the text itself is indeterminate, such a

provision “is a key to open the mind of the makers, as to . . . the objects, which are to be

accomplished by . . . [a] statute.” Id. (quoting 1 JOSEPH STORY, COMMENTARIES ON THE

CONSTITUTION OF THE UNITED STATES § 459 (1833)). Facebook is correct that such prefatory

language “cannot enlarge or confer powers, nor control the words of the act, unless they are

doubtful or ambiguous.” Yazoo & M.V.R. Co. v. Thomas, 132 U.S. 174, 188 (1889). Yet as we

noted above, the parties advance two competing understandings of FOSTA’s impact on section

230. If both are plausible, courts are justified in consulting the enacted “Sense of Congress” when

choosing between them.


       FOSTA thus provides additional support for our conclusion that section 230(c) does not

bar Plaintiffs’ claims alleging Facebook’s affirmative acts in violation of section 98.002. These

claims may proceed to further litigation, although we express no opinion on their viability at any

later stage of these cases. Because Plaintiffs’ common-law claims are barred by section 230,

however, we hold that the district courts abused their discretion by failing to grant Facebook’s

motions to dismiss those claims. We further conclude, as explained above, that Facebook has no

other adequate remedy for the district courts’ improper refusals to dismiss those claims.

                                          III. Conclusion

       The internet today looks nothing like it did in 1996, when Congress enacted section 230.

The Constitution, however, entrusts to Congress, not the courts, the responsibility to decide

whether and how to modernize outdated statutes. Perhaps advances in technology now allow

online platforms to more easily police their users’ posts, such that the costs of subjecting platforms

like Facebook to heightened liability for failing to protect users from each other would be

outweighed by the benefits of such a reform. On the other hand, perhaps subjecting online

platforms to greater liability for their users’ injurious activity would reduce freedom of speech on

the internet by encouraging platforms to censor “dangerous” content to avoid lawsuits. Judges are

poorly equipped to make such judgments, and even were it otherwise, “[i]t is for Congress, not

this Court, to amend the statute if it believes” it to be outdated. Dodd v. United States, 545 U.S.

353, 359–60 (2005).

       The petition for writ of mandamus is denied in part and conditionally granted in part. The

district courts are directed to dismiss Plaintiffs’ claims against Facebook for negligence, gross

negligence, negligent undertaking, and products liability. Plaintiffs’ claims under section 98.002


of the Civil Practice and Remedies Code may proceed. We are confident the district courts will

comply, and the writ will issue only if they do not.


                                              __________________________________
                                              James D. Blacklock
                                              Justice


OPINION DELIVERED: June 25, 2021



