Guarding the Guardians: Content Moderation by Online Intermediaries and the Rule of Law

Niva Elkin-Koren, Maayan Perel

Research output: Chapter in Book/Report/Conference proceeding › Chapter › peer-review

Abstract

This chapter describes three ways in which content moderation by online intermediaries challenges the rule of law: it blurs the distinction between private interests and public responsibilities; it delegates the power to make social choices about content legitimacy to opaque algorithms; and it circumvents the constitutional safeguard of the separation of powers. The chapter further discusses the barriers to accountability in online content moderation by intermediaries, including the dynamic nature of algorithmic content moderation using machine learning; barriers arising from partial data and data floods; and trade secrecy, which shields the algorithmic decision-making process from scrutiny. Finally, the chapter proposes a strategy to overcome these barriers to the accountability of online intermediaries, namely ‘black box tinkering’: a reverse-engineering methodology that could be used by governmental agencies, as well as social activists, as a check on private content moderation. After describing the benefits of black box tinkering, the chapter explains what regulatory steps should be taken to promote the adoption of this oversight strategy.
Original language: English
Title of host publication: Oxford Handbook of Online Intermediary Liability
Editors: Giancarlo Frosio
Publisher: Oxford University Press
Chapter: 34
Pages: 669-678
ISBN (Print): 9780198837138
DOIs
State: Published - 2020

Publication series

Name: Oxford Handbooks in Law

