Extremist groups increasingly use digital tools to organize, disseminate information, and recruit new members. They thus pose a challenge for thinking about democracy and community in digital environments: how can we use digital platforms to create spaces within which marginalized social groups can constitute themselves, while also preventing these groups from becoming violent, extremist, and anti-democratic? While many scholars point to technological affordances or corporate content moderation policies as providing some solutions, in this paper I examine an alternative: a community’s social norms, and the moderation practices required to sustain them, are bottom-up, user-directed practices that can have similar effects, even within the constraints set by platform design or corporate governance decisions.
Using the case of incels (short for “involuntarily celibate”) as an example of a marginalized social group that radicalized into an extremist hate group, I explore whether and how a community’s social norms and governance practices are shaped by its embeddedness within larger social networks. Turning to different Reddit communities, both those closely associated with incels (r/incels, r/braincels, r/incelswithouthate) and those for involuntarily celibate people who do not identify as incels (r/foreveralone, r/deadbedrooms), I trace the differences in their social norms and moderation strategies, looking specifically at how incel communities on Reddit interact with the larger Reddit community and at how these interactions shape the resulting community identity. Though incel communities on Reddit may share the same misogynistic impulses, I argue, the structure of their communities, namely their interaction with Reddit’s site-wide “Reddiquette” and the power of moderators and site administrators to enforce it, can help us understand why some communities (like r/foreveralone) exhibit less extremist behavior than others (like r/incels). Communities that deliberately choose to “play nice” with the larger Reddit community are less likely to exhibit extremist tendencies, even if they share an underlying misogynistic ideology.