TL;DR Facebook’s Community Standards aren’t versioned and appear to be engineered to resist the idea of versioning.
Yesterday I attended the Content Moderation at Scale meeting held in Washington DC, which had representatives of large social media platforms such as Facebook, Twitter, Google, Wikipedia, Twitch, Vimeo, and Trip Advisor talking about their processes for content moderation. These companies were joined by representatives from the Department of Homeland Security and Europol. This was an east coast version of a similar event held at Santa Clara University earlier this year called Content Moderation and Removal at Scale, although a look at the speaker list shows it wasn’t exactly getting the band back together. It’s also interesting to note that bringing the event to DC brought in some new sponsors, including the Cato Institute, the Charles Koch Institute, and Craig Newmark Philanthropies. Yes, quite a diverse (and powerful) set of actors are interested in this topic.
As you can imagine a lot happened, some of which you can see reflected in the Twitter stream; the slides and videos are also online for the moment. Unfortunately there was very little opportunity for question and answer with the audience after the presentations from the various companies, and by my recollection none of the participants asked questions of each other. So even though it had a light, breezy feel, this was a tightly controlled event, and in many ways it had to be, to get these people up on stage to talk about this very sensitive topic at all.
As I said, there was very little Q&A with the audience, but there was an online form for submitting questions to the moderator. Perhaps I wasn’t paying close enough attention, but I didn’t hear many questions come from this system, and there was very little time for discussion after the presentations anyway. I submitted one question, which was not asked, and which is the topic of the remainder of this post.
Tal Niv talks about how their community and terms of service documents are maintained by their legal team using GitHub itself to manage changes and gather issues, etc. https://t.co/Hl9B5o3qtc #comoatscale — Ed Summers, May 7, 2018
This means that the policy documents are versioned, with a clear history of how and when they have changed. In addition, anyone with a (free) GitHub account can create issues that ask questions, or fork the policy and send a pull request with a suggested change. Now, it appears that only 7 people have actually contributed to the repository, but it is the principle that matters here. For example, we can see that the repository has largely been shepherded by Joseph Mazzella, who is Product Counsel at GitHub. We can see what he changed, and his comments provide some context about why. I think this level of transparency is important for community building.
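To make this concrete, here is a minimal sketch of how simply keeping a policy document in git produces a changelog for free. The file name and commit messages are illustrative, not GitHub’s actual policy files:

```shell
# Create a toy repository holding a single policy document.
mkdir policy-demo && cd policy-demo
git init -q
git config user.name "demo"
git config user.email "demo@example.com"

# First version of the policy.
echo "Be kind." > community-standards.md
git add community-standards.md
git commit -q -m "Initial community standards"

# A later revision of the same document.
echo "Be kind. No spam." > community-standards.md
git commit -q -am "Clarify spam policy"

# The full history of what changed and when:
git log --oneline -- community-standards.md

# Line-by-line differences between any two versions:
git diff HEAD~1 HEAD -- community-standards.md
```

Anyone who can read the repository can run these same commands, which is exactly the kind of accountability that a web page with no history cannot offer.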
Community was an essential and oft-repeated concept that participants used to describe how their moderation guidelines work. Specifically, Facebook’s recently released Community Standards were held up by Facebook representatives Peter Stern and Kaitlin Sullivan. They described a “mini-legislature” that put these standards together, which really operate as the law of Facebook. Is it possible to think of 2 billion people as a community? Maybe? It’s certainly different from thinking of Vimeo as a community, which has 2 million users and lets you pay for the right to host your content with them, instead of profiting from selling your data to third parties. But I digress.
My question for the transparency panel was: has Facebook considered using a process like GitHub’s to make changes to their community guidelines clear? At the moment there doesn’t appear to be a changelog for the guidelines, so it’s hard to understand how, or whether, they have changed. There doesn’t appear to be a way to ask questions about the document, or to suggest changes. Facebook already has a GitHub account, so this wouldn’t be asking them to adopt something they weren’t using already. This level of transparency (at least for the changes) seems important if Facebook is taking its community guidelines seriously.
I might be reading between the lines a bit too much, but it appears that Facebook’s choice of deployment for the Community Standards greatly inhibits the transparency and accountability to which they can be held. Not only is there no real way to track how these rules are changing, they appear to be engineered in such a way as to prevent them from being easily tracked. This is in stark contrast to organizations like GitHub, Tumblr (thanks Tarleton) and Wikimedia (MediaWiki embraces version history) who want to share detailed information about how their policies are changing.
It seems to me that version histories for these community documents are a required foundation for building online communities. I hope that future Content Moderation at Scale events (and I do hope there will be more) can address this issue. It seems like a natural fit for a room full of lawyers and legal nerds, right? Also, a bit more unsolicited advice: maybe lose the cute session where everyone pretends to get their hands dirty in content moderation, and use the additional time for actual, unmoderated Q&A.