New Papers Accepted for ACM FAccT '24
2024 June 07

Very excited to share two new papers accepted to the seventh annual ACM Conference on Fairness, Accountability, and Transparency (ACM FAccT '24)!
- Misgendered During Moderation: How Transgender Bodies Make Visible Cisnormative Content Moderation Policies and Enforcement in a Meta Oversight Board Case, co-authored by Kendra Albert and Oliver L. Haimson
- Trans-centered moderation: Trans technology creators and centering transness in platform and community governance, first-authored by Hibby Thach, co-authored by Samuel Mayworm (myself), Michaelanne Thomas, and Oliver L. Haimson
"Misgendered During Moderation" Abstract:
"Transgender and nonbinary social media users experience disproportionate content removals on social media platforms, even when content does not violate platforms' guidelines. In 2022, the Oversight Board, which oversees Meta platforms' content moderation decisions, invited public feedback on Instagram's removal of two trans users' posts featuring their bare chests, introducing a unique opportunity to hear trans users' feedback on how nudity and sexual activity policies impacted them.
We conducted a qualitative analysis of 83 comments made public during the Oversight Board's public comment process. Commenters criticized Meta's nudity policies as enforcing a cisnormative view of gender while making it unclear how images of trans users' bodies are moderated, enabling the disproportionate removal of trans content and limiting trans users' ability to use Meta's platforms. Yet there was significant divergence among commenters about how to address cisnormative moderation. Some commenters suggested that Meta clarify nudity guidelines, while others suggested that Meta overhaul them entirely, removing gendered distinctions or fundamentally reconfiguring the platform's relationship to sexual content.
We then discuss how the Oversight Board's public comment process demonstrates the value of incorporating trans people's feedback while developing policies related to gender and nudity, while arguing that Meta must go beyond only revising policy language by reevaluating how cisnormative values are encoded in all aspects of its content moderation systems."
"Trans-Centered Moderation" Abstract:
"Mainstream platforms' content moderation systems typically employ generalized 'one-size-fits-all' approaches, intended to serve both general and marginalized users. Thus, transgender people must often create their own technologies and moderation systems to meet their specific needs.
In our interview study of transgender technology creators (n=115), we found that creators face issues of transphobic abuse and disproportionate content moderation. Trans tech creators address these issues by carefully moderating and vetting their userbases, centering trans contexts in content moderation systems, and employing collective governance and community models.
Based on these findings, we argue that trans tech creators' approaches to moderation offer important insights into how to better design for trans users, and ultimately, marginalized users in the larger platform ecology. We introduce the concept of trans-centered moderation: content moderation that reviews and successfully vets transphobic users, appoints trans moderators to effectively moderate trans contexts, considers the limitations and constraints of technology for addressing social challenges, and employs collective governance and community models. Trans-centered moderation can help to improve platform design for trans users while reducing the harm faced by trans people and marginalized users more broadly."