United States Court of Appeals
FOR THE DISTRICT OF COLUMBIA CIRCUIT
Argued January 18, 2018 Decided February 23, 2018
No. 17-7106
DAWN BENNETT AND
DJ BENNETT HOLDINGS, LLC,
APPELLANTS
v.
GOOGLE, LLC,
APPELLEE
Appeal from the United States District Court
for the District of Columbia
(No. 1:16-cv-02283)
Harry J. Jordan argued the cause and filed the briefs for
the appellants.
John K. Roche argued the cause and filed the brief for the
appellee.
Before: HENDERSON, ROGERS and KAVANAUGH, Circuit
Judges.
Opinion for the Court filed by Circuit Judge HENDERSON.
KAREN LECRAFT HENDERSON, Circuit Judge: Offended by
a third-party blog post, Plaintiff Dawn Bennett (Bennett) and
her company, DJ Bennett Holdings, LLC (DJ Bennett), sued
Google LLC (Google) for failing to remove the post. They
alleged three state-law causes of action: (1) defamation; (2)
tortious interference with a business relationship; and (3)
intentional infliction of emotional distress. The district court
granted Google’s motion to dismiss, concluding that the
Communications Decency Act (CDA), 47 U.S.C. § 230,
immunized Google from liability for the publication of third-
party content. We affirm.
I.
Bennett owns DJ Bennett, a retailer of high-end sports
apparel.1 Scott Pierson is the founder of The Executive SEO
Agency, which provides search engine optimization and
marketing (SEO) services. In March 2013, DJ Bennett hired
Pierson to provide SEO services, seeking to increase its sales.
After a few months, the parties’ relationship deteriorated and
Pierson agreed to renegotiate his contract and accept slightly
less than $20,000 as full payment for his services.
DJ Bennett paid Pierson in five installments but the fifth
installment was returned by the post office as “undeliverable.”
Thereafter, Pierson called DJ Bennett’s Vice President and
General Merchandise Manager, Anderson McNeill.
According to McNeill, Pierson was “hysterical” and
“emotionally distraught.” Compl. ¶ 10. Pierson threatened
DJ Bennett, declaring “I know things, I can do things, and I will
shut down your website.” Id. In response, McNeill explained
that DJ Bennett had attempted to mail Pierson his final check
but that it had been returned. Pierson then gave McNeill an
alternative address; “the last payment was sent there, and
[Pierson] cashed it.” Id.

1 The relevant facts are drawn from the complaint and are
accepted as accurate for this appeal. Bell Atl. Corp. v. Twombly, 550
U.S. 544, 572 (2007).
After the business relationship fell apart, Pierson wrote a
blog titled “DJ Bennett-think-twice-bad business ethics” and
published it on the internet through Google. Id. ¶ 11. Among
other things, the blog asserted that (1) “DJ Bennett, the luxury
sporting goods company, did not pay its employees or
contractors”; (2) DJ Bennett was “ruthlessly run by Dawn
Bennett who also operated Bennett Group Financial Services”;
(3) Bennett falsely stated that Pierson had agreed to reduce his
hours “as justification for reducing his final invoice by
$3,200”; (4) Pierson’s counsel described Bennett as “judgment
proof”; and (5) “DJ Bennett owes thousands and thousands to
many people.” Id. ¶¶ 11-12. The blog concluded: “I urge
you to think twice before giving your patronage to DJ
Bennett.com . . . . The website is pretty, but the person running
the show is quite contemptible.” Id. ¶ 12.
Through counsel, Bennett attempted to convince Pierson
to remove the post; Pierson refused. Bennett’s counsel also
contacted Google’s general counsel and other senior corporate
officers, “asking them to drop Pierson’s blog because it
violated Google’s Guidelines of what is appropriate material
for inclusion in blogs.” Id. ¶ 13. Notwithstanding Bennett’s
complaints, Google “continue[s] to publish Pierson’s blog.”
Id. Bennett also alleged that “as of May 23, 2016, not a single
comment has been received in two years; Pierson was
artificially maintaining his blog in a favorable position by using
black-hat tactics, a practice universally condemned by the
digital media industry, including Google.” Id.
Google has a “Blogger Content Policy” that regulates,
inter alia, adult content, child safety, hate speech, crude
content, violence, harassment, copyright infringement, and
malware and viruses.2 Joint Appendix (JA) 42-45. Users are
encouraged to “flag[]” policy violations through the website.
JA 45. If Google finds that the blog does violate its content
policies, it may limit access to the blog, delete the blog, disable
the author’s access or report the user to law enforcement. Id.
If the blog does not violate Google’s policies, Google “will not
take any action against the blog or blog owner.” Id.
II.
We review the district court’s dismissal de novo.
Klayman v. Zuckerberg, 753 F.3d 1354, 1357 (D.C. Cir. 2014).
The CDA recognizes that the internet offers “a forum for a true
diversity of political discourse, unique opportunities for
cultural development, and myriad avenues for intellectual
activity.” 3 47 U.S.C. § 230(a)(3). Accordingly, the Act
codifies “the policy of the United States (1) to promote the
continued development of the Internet and other interactive
computer services . . . [and] (2) to preserve the vibrant and
competitive free market that presently exists for the
Internet . . . .” 47 U.S.C. § 230(b). In accordance with that
policy, section 230 of the CDA contains a “Protection for
‘Good Samaritan’ blocking and screening of offensive
material,” which reads: “[n]o provider or user of an interactive
computer service shall be treated as the publisher or speaker of
any information provided by another information content
provider.” Id. § 230(c)(1). It further states: “[n]o provider or
user of an interactive computer service shall be held liable on
account of . . . any action voluntarily taken in good faith to
restrict access to or availability of material that the provider or
user considers to be obscene, lewd, lascivious, filthy,
excessively violent, harassing, or otherwise objectionable.”
Id. § 230(c)(2). To give these provisions teeth, section 230
provides that “[n]o cause of action may be brought and no
liability may be imposed under any State or local law that is
inconsistent with this section.” Id. § 230(e)(3).

2 The “Blogger Content Policy” is not attached to the
complaint or the motion to dismiss but it is included in the Joint
Appendix. Although Google does not challenge its admissibility, it
is unclear if we may take judicial notice of it. See Kaempe v. Myers,
367 F.3d 958, 965 (D.C. Cir. 2004) (taking judicial notice of public
records). Because the Policy does not alter our analysis, however,
we consider it as background only.

3 The Communications Decency Act is something of a
misnomer; the Act does not promote decency so much as it acts as a
bulwark against “intrusive government regulation of speech.”
Zeran v. Am. Online, Inc., 129 F.3d 327, 330 (4th Cir. 1997).
Although unrestrained speech can often be several shades from
decent, see Cohen v. California, 403 U.S. 15 (1971), that is the
tradeoff that the Congress has apparently endorsed by insulating
computer service providers from liability, 47 U.S.C. § 230(c)(1).
The seminal case of Zeran v. America Online, Inc. 4
explained the core functions of the CDA more than two
decades ago:
    The amount of information communicated via
    interactive computer services is . . . staggering.
    The specter of tort liability in an area of such
    prolific speech would have an obvious chilling
    effect. It would be impossible for service
    providers to screen each of their millions of
    postings for possible problems. Faced with
    potential liability for each message republished
    by their services, interactive computer service
    providers might choose to severely restrict the
    number and type of messages posted.
    Congress considered the weight of the speech
    interests implicated and chose to immunize
    service providers to avoid any such restrictive
    effect.
129 F.3d 327, 331 (4th Cir. 1997). The intent of the CDA is
thus to promote rather than chill internet speech. Id. By the
same token, however, the CDA “encourage[s] service
providers to self-regulate the dissemination of offensive
material over their services.” Id. In that respect, the CDA
corrected the trajectory of earlier state court decisions that had
held computer service providers liable when they removed
some—but not all—offensive material from their websites.
Id. (analyzing legislative history and explaining holding of
Stratton Oakmont, Inc. v. Prodigy Servs. Co., No. 31063/94,
1995 WL 323710 (N.Y. Sup. Ct. May 24, 1995)). Put
differently, section 230 incentivized companies to neither
restrict content nor bury their heads in the sand in order to avoid
liability. Id. And in doing so, it paved the way for a robust
new forum for public speech as well as “a trillion-dollar
industry centered around user-generated content.” Eric
Goldman & Jeff Kosseff, Commemorating the 20th
Anniversary of Internet Law’s Most Important Judicial
Decision, THE RECORDER (Nov. 10, 2017), perma.cc/RR2M-UZ2M.

4 On the 20th anniversary of the CDA, Zeran was heralded as
“internet law’s most important judicial decision.” Eric Goldman &
Jeff Kosseff, Commemorating the 20th Anniversary of Internet
Law’s Most Important Judicial Decision, THE RECORDER (Nov. 10,
2017), perma.cc/RR2M-UZ2M.
Like other circuits, we have followed Zeran’s lead and
created a three-part test to determine CDA preemption.
Klayman, 753 F.3d at 1357-59 (citing Zeran and related
precedent from other circuits). Google can establish
immunity by showing that (1) it is a “provider or user of an
interactive computer service”; (2) the relevant blog post
contains “information provided by another information content
provider”; and (3) the complaint seeks to hold Google liable as
the “publisher or speaker” of the blog post. Id. at 1357
(quoting 47 U.S.C. § 230(c)). Thus, there is a dividing line
between “interactive computer service”5 providers—which are
generally eligible for CDA section 230 immunity—and
“information content provider[s],”6 which are not entitled to
immunity. Id. The law, then, distinguishes “service” from
“content.” Id.
In Klayman, we held that “a website does not create or
develop content when it merely provides a neutral means by
which third parties can post information of their own
independent choosing online.” Id. at 1358. We noted that,
although the Facebook website’s “Statement of Rights and
Responsibilities” might create an independent cause of action
for breach of contract, the statement did not change the fact that
the plaintiff was seeking to hold Facebook liable as a
“publisher” of the objectionable material. Id. at 1359.
Accordingly, we affirmed the district court’s dismissal of the
plaintiff’s claims pursuant to section 230 of the CDA. Id.; see
also Zeran, 129 F.3d at 331 (rejecting argument that defendant
was “distributor” rather than “publisher” under CDA because
it acquired “knowledge of the defamatory statements’
existence”).7

5 “The term ‘interactive computer service’ means any
information service, system, or access software provider that
provides or enables computer access by multiple users to a computer
server, including specifically a service or system that provides access
to the Internet and such systems operated or services offered by
libraries or educational institutions.” 47 U.S.C. § 230(f)(2).

6 “The term ‘information content provider’ means any person
or entity that is responsible, in whole or in part, for the creation or
development of information provided through the Internet or any
other interactive computer service.” 47 U.S.C. § 230(f)(3).

7 Bennett places great reliance on the Ninth Circuit’s holding
in Fair Housing Council of San Fernando Valley v. Roommates.com,
LLC, 521 F.3d 1157, 1163 (9th Cir. 2008) (en banc). We of course
are not bound by extra-circuit precedent but we nonetheless take a
moment to distinguish Roommates.com, concluding that it cannot
bear the weight of Bennett’s reliance because it marks an outer limit
of CDA immunity—a limit that this case does not even approach.
In Roommates.com, the court held that a website can simultaneously
be an “interactive computer service” provider and an “information
content provider” (e.g., it can provide both services and content).
Roommates.com, 521 F.3d at 1162. The court concluded that the
defendant had, “at least in part,” helped develop content on its
website by requiring users to select from a “limited set of
pre-populated answers” as part of the registration process. Id. at
1166. For example, when creating a “Roommates.com” profile, the
user had to state his sex and sexual orientation and identify whether
he had children. Id. at 1161-62. Because Roommates.com created
the universe of pre-populated answers, required users to answer its
questions before registering and used those answers in providing
tailored services to its users, the court held that Roommates.com was
a content provider as well as a service provider and that it was not
entitled to CDA immunity for the content that remained on its site.
Id. at 1164. In so holding, the Ninth Circuit emphasized that
“Congress sought to immunize the removal of user-generated
content, not the creation of content.” Id. at 1163 (emphasis in
original).

This case is controlled by the three-part test in Klayman.
First, as many other courts have found, Google qualifies as an
“interactive computer service” provider because it “provides or
enables computer access by multiple users to a computer
server.” 47 U.S.C. § 230(f)(2); see, e.g., Parker v. Google,
Inc., 422 F. Supp. 2d 492, 501 (E.D. Pa. 2006), aff’d 242 F.
App’x 833 (3d Cir. 2007) (“[T]here is no doubt that Google
qualifies as an ‘interactive computer service’ and not an
‘information content provider.’”). Indeed, Bennett concedes
that fact. Appellant’s Br. 6 (“Google provides interactive
computer services, including websites and social media
platforms.”). Second, Bennett alleges that only Pierson—and
not Google—created the offensive content on the blog.
Compl. ¶¶ 11-12.
Third, Bennett seeks to hold Google liable as a publisher
of the content. Bennett argues that by establishing and
enforcing its Blogger Content Policy, Google is influencing—
and thus creating—the content it publishes. This argument
ignores the core of CDA immunity, that is, “the very essence
of publishing is making the decision whether to print or retract
a given piece of content.” Klayman, 753 F.3d at 1359. In
other words, there is a sharp dividing line between input and
output in the CDA context. Id. Here, the input is the content
of Pierson’s negative blog about Bennett’s business; that blog
was created exclusively by Pierson. Google’s role was strictly
one of output control; it had the choice of leaving Pierson’s
post on its website or retracting it. It did not edit Pierson’s
post nor did it dictate what Pierson should write. Because
Google’s choice was limited to a “yes” or “no” decision
whether to remove the post, its action constituted “the very
essence of publishing.” Id.
In sum, the CDA “allows [computer service providers] to
establish standards of decency without risking liability for
doing so.” Green v. Am. Online, Inc., 318 F.3d 465, 472 (3d
Cir. 2003). Although “other types of publishing activities
might shade into creating or developing content,” the decision
to print or retract is fundamentally a publishing decision for
which the CDA provides explicit immunity. Klayman, 753
F.3d at 1359 n.*; see Zeran, 129 F.3d at 332 (“[B]oth the
negligent communication of a defamatory statement and the
failure to remove such a statement when first communicated by
another party . . . constitute publication.”). “None of this
means, of course, that the original culpable party who posts
defamatory messages [will] escape accountability.” Zeran,
129 F.3d at 330. It means only that, if Bennett takes issue with
Pierson’s post, her legal remedy is against Pierson himself as
the content provider, not against Google as the publisher.
For the foregoing reasons, the judgment of dismissal is
affirmed.
So ordered.