Conference Announcement: “Content Moderation & Removal at Scale,” SCU, Feb. 2

I’m pleased to announce “Content Moderation & Removal at Scale,” a conference we’ll be holding on campus on February 2, 2018. I anticipate a full house, so we’ve set a registration cap; once we reach it, subsequent registrations will go on a waitlist. If you’d like to come, I strongly recommend registering early. If the registration fees pose a hardship in any way and you don’t fit into one of the free registration categories, please contact me.

The Backstory: I was disheartened when I first saw the initial draft of the Allow States and Victims to Fight Online Sex Trafficking Act of 2017 (what I call “the Wagner bill”; its Senate companion is SESTA), which would exclude sex trafficking violations from Section 230. The bills reflect unrealistic assumptions about how most online services manage incoming third-party content. I thought that we might find more common ground if policymakers better understood how Internet companies actually handle content moderation and removal operations, so they could know what’s easy, what’s feasible but hard and expensive (and thus more likely to be undertaken only by incumbents, not startups), and what’s not possible at any price.

However, companies rarely discuss their content moderation and removal operations publicly. As a result, we lack many basic facts, such as how many people each company employs, what their job titles are, how they fit into the org chart, and how they are trained. Some reporters and researchers have covered the topic over the years, but often in piecemeal fashion based on secondhand information. Some of these details have been shared at various smaller closed-door events, but the confidentiality cloak has prevented information diffusion. In my initial calls pitching companies on the conference, I often learned an incredible amount in the first 3-5 minutes of our conversations, despite having studied and written about Internet law for 20+ years. In those conversations, it also became clear that these details weren’t trade secrets or even confidential; they just had not been shared publicly.

I hope this conference is the first of many public conversations about the operations of content moderation and removal. These conversations ought to help the industry accelerate the development of best operational practices. They also should help policymakers better understand the tradeoffs in any efforts they undertake to impose greater content moderation or removal obligations.

Because policymakers are a key audience, there will be a sibling conference held in DC with an agenda customized for its audience. That event is scheduled for January 23, 2018.

The Conference:

Most Internet Law conferences focus on the scope and meaning of substantive rules. In contrast, this conference focuses almost exclusively on how substantive rules are operationalized. Whatever the rules say, and wherever they come from (legislators, common law, industry standards, or idiosyncratic “house rules”), how do companies translate them into operational practices? Of course, operationalization might differ based on the consequences for violations (e.g., violating legislative rules might lead to jail time, while violating internal house rules might only be embarrassing). I hope we’ll tease out those nuances over the course of the day.

The main attraction in the morning will be a series of 10-minute presentations by 10 companies about the facts and figures of their content moderation and removal operations. The participating companies are: Automattic, Dropbox, Facebook, Google, Medium, Nextdoor, Pinterest, Reddit, Wikimedia, and Yelp. (Note: Nextdoor isn’t listed on the agenda yet but will be added in the next revision.) As you can see, this roster ranges from industry giants to much smaller organizations, and the companies have a diversity of editorial practices that should highlight how they’ve optimized operations for their “local” conditions. All of them have agreed to publicly “describe their content moderation and removal operations, such as org charts, department names and job titles, headcount, who determines the policies, escalation paths, and ‘best practice’ tips.” I’m very confident everyone in attendance will learn a lot from these presentations.

(I would have loved to diversify the participant list to include companies outside the Bay Area. I approached some companies in other regions without success. We might hear from companies in other regions at the DC event.)

The main attraction in the afternoon will be four panels on topics that should be interesting to anyone in the industry or observing it:

  • Employee/Contractor Hiring, Training and Mental Well-being
  • Humans vs. Machines
  • In-sourcing to Employees vs. Outsourcing to the Community or Vendors
  • Transparency and Appeals

(Note: I have more panelists to add to the afternoon panels in future revisions).

In addition to the morning presentations and afternoon panels, the conference will feature some brief legal primers, a lunchtime discussion about the history and future of content moderation and removals, and more.

All of the conference proceedings will be on-the-record, and we expect reporters will attend and cover the event. We plan to record the proceedings and are considering a live-stream option.

In addition to the day’s proceedings, many of the participants will be writing a short essay on thematic topics. We plan to bundle the essays into a package and publish the package through a not-yet-identified publication venue.

As you can see, this should be an enlightening and important conversation. I hope you can join us.


Source: Eric Goldman Legal

