Friday, May 29, 2020

"Everything You Need to Know About Section 230", booklet from The Verge; Kosseff's "The Twenty-Six Words that Created the Internet"


Casey Newton has an online booklet on The Verge (a Vox Media subsidiary) called “Upload: Everything You Need to Know About Section 230: The Most Important Law for Online Speech”.

A good place to start is with the text of the statute, 47 U.S.C. § 230, from the Cornell Law School site.  It is a provision of the Communications Decency Act, itself part of the Telecommunications Act of 1996, and was crafted by Sen. Ron Wyden (D-OR) and Rep. Chris Cox (R-CA).  The impetus was a 1995 case, Stratton Oakmont v. Prodigy, where Prodigy was held liable for anonymous defamatory content because Prodigy had moderated the content, creating a “moderator’s paradox”.  That case distinguished Cubby v. CompuServe (1991), which had declined to impose blanket downstream liability on a service that did not moderate.

The most critical provisions are in subsection (c):

“(c) Protection for “Good Samaritan” blocking and screening of offensive material

(1) Treatment of publisher or speaker

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.

(2) Civil liability

No provider or user of an interactive computer service shall be held liable on account of—

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in paragraph (1).”

Note that the wording specifically refers to “Good Samaritan” blocking and screening of offensive material.  The first paragraph says that the provider or user of an interactive computer service is not to be treated as the publisher of material provided by another speaker, while the second paragraph anticipates that the usual remedy is removal or blocking of offensive content.  The wording does not imply that the operator of a platform cannot have its own overriding partisan political bias.

Again, the centering of the language around moderation of content comes from the Prodigy case. Bloggers generally are not responsible for comments posted by others (although I use filters for spam comments and have a few times removed offensive comments aimed at others; Section 230 protects me directly here).

Note also the changes made to the law in 2018 (FOSTA), carving out “exceptions” with “should have known” provisions regarding sex trafficking.  This change has not been very effective at its intended result, and may have led to even more targeting of minority women and of trans persons.  But some sites have taken down “hookup” ads or discussion boards out of caution.

Note also that The Verge maintains that European and British Commonwealth countries have similar liability laws.

Section 230 needs to be appreciated in conjunction with the DMCA Safe Harbor (17 U.S.C. § 512), which applies a comparable liability shield to platforms with respect to copyright infringement (as distinct from most of the other common torts, like defamation).

Together, these two provisions make user-generated content on the Internet (most obviously on social media, but also conventional hosting) possible.  There has been a lot of controversy lately over the Safe Harbor, given a recent report by the Copyright Office, but that is legally a separate discussion not covered by Donald Trump’s executive order signed May 28.

Donald Trump’s executive order aims to discourage political bias against conservatives in the way platforms apply the “Good Samaritan” clauses under Section 230.  There is legal controversy over how the law, as written, would apply, and his order will certainly be challenged in court.

It should be noted that Joe Biden has said he wants to eliminate Section 230 entirely, and both Bernie Sanders and Elizabeth Warren have expressed concerns that user-generated content permitted by the platforms amounts to hate speech (as generally outlawed in the EU) that could radicalize vulnerable people, particularly toward the “alt-right” or white supremacy.


In Europe we see a similar sentiment, plus the fact that, in the copyright area, there is a general impression that user-generated content intended for global distribution should not be presumed a natural right; rather, people should earn some social credibility (personal “social credit”) before they have the right to be heard by the entire world. That is closer to the formal situation today in China.

There is a new book, "The Twenty-Six Words that Created the Internet" by Jeff Kosseff, from Cornell University Press, 328 pages.  Kosseff appeared on Smerconish on CNN on May 30 and suggested that without Section 230 there are three alternatives: (1) platforms vet speakers just the way publishers do (although there would be far more speakers), (2) no moderation at all, or (3) a takedown system like the DMCA Safe Harbor for copyright.  I ordered the book and will review it on Wordpress.

(Great Seal of the U.S., click for wikipedia attribution, in article on 1996 Telecommunications Act, p.d.)
