Everyone realizes that the Internet’s public squares have a harassment problem. No one seems to know what to do about it. I argue that’s because they don’t know how to think about online harassment and abuse, or even power more generally. I argue that I do. But don’t take my word for it. Take my ideas, and implement them yourselves. Then let the results speak for themselves.
“So, maymay,” I can already hear you asking, “how would you design an online social network that was hostile to abusers?” You’re probably asking this because you either don’t know that I’ve written about it before, or you haven’t been able to understand from what I’ve written how to take the lessons from code I’ve deployed in the Predator Alert Tool project and apply it to your own projects. That’s okay. You’re not alone.
Recently, I received an email from a developer asking for advice about this exact issue. They’ve told me they’d be fine with my sharing our conversation here, in the hopes that it gets other developers thinking about what they can do to proactively “protect people from abusers online,” as they put it. Here is our exchange (slightly edited for anonymity and clarity) so far. The email I received went something like this:
Hello! I’m building a new social network and want to be proactive about protecting people. I wanted to reach out as I have little experience with protecting people from shitty people and abusers online, and the Predator Alert Tools are great. Is there any way I can help contribute to those projects, and/or utilise them somehow with [my project] to help protect people?
Any help you can give would be appreciated.
Thanks,
[Anon Developer]
I wrote back a few days later:
Thanks [for reaching out, Anon Developer].
Yes.
You can contribute to any of the PATs in any way you like. Here’s a short “how to help” page for the project. It talks mostly about Predator Alert Tool for Facebook but it’s relevant to all the tools.
Well, there are a number of themes that run through the entire suite of tools, and those are the only things I can talk about without knowing more about [your specific project]. So for now, let me just point your attention to these two blog posts about the tools.
First, “More on ‘The Match Percentage Fallacy’, or The Influence of Rolequeerness on the Predator Alert Tool project.” This post explicitly uses the language of game theory to talk about protecting people from online predation. An excerpt:
Predator Alert Tool for OkCupid highlights the signals players send when they answer OkCupid’s Match Questions to other players in order to de-silo as much information as possible, thereby hoping to expand the set of possible moves a given player (user of PAT-OKC) is aware of and enabling them to analyze the given situation (the decision tree of their “turn”) with the information they received through the tool. This is a fundamentally different approach than the one OkCupid’s “Match Percentage” interface provides, and this is no coincidence.
The “Match Percentage” interface is designed to account for “the best possible outcome” for OkCupid itself, not the best outcome for the OkCupid user. This makes sense when you realize that OkCupid is a company, and they have their own incentives and have defined the win conditions of this complex game very differently than their users (we) have.
In other words, the single most obvious problem with online “dating” sites (a category which includes “social networking sites,” obviously) is that they are designed from the ground up to focus on filtering data out as opposed to considering related data important. This is precisely the environment in which serial rapists are most protected. If you are serious about building a social networking site that is proactive about maintaining an environment hostile to these kinds of abuses, you need to focus on identifying and surfacing information about signals between users that are negative as well as positive. Again: rather than burying those signals, you need to surface them. Use OkCupid’s “Match Percentage” interface as a perfect example of what not to do.
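To make the contrast concrete, here is a minimal sketch (in TypeScript, with every name invented, since I know nothing about your stack) of the two approaches side by side: collapsing signals into a single aggregate score versus handing the viewer every signal, negative ones included:

```typescript
// A hypothetical signal one user sends another, e.g. an answer to a
// match question. We store the raw content, not a verdict: whether an
// answer is a red flag is the viewer's call, not the platform's.
interface Signal {
  fromUser: string;
  question: string;
  answer: string;
}

// The OkCupid-style approach: collapse everything into one number,
// silently discarding the individual signals a user most needs to see.
function matchPercentage(shared: Signal[], agreements: number): number {
  return shared.length === 0
    ? 0
    : Math.round((agreements / shared.length) * 100);
}

// The PAT-OKC-style approach: surface every signal unfiltered, so the
// viewer can expand their set of known moves and judge for themselves.
function surfaceAllSignals(signals: Signal[]): Signal[] {
  return signals; // no filtering, no weighting, no win condition of ours
}
```

The second function is trivial by design; the real work is in resisting the urge to filter.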
If that’s curious to you and, again, if you’re interested in pursuing this line of questioning further, write back and tell me more about [your project], and yourself, and so on. Let’s have a conversation. Predator Alert Tool’s implementations are different depending on the site for which the specific tool was intended not only because the technology of different sites is different, as you know, but also because the culture of each website is different; users interact with the sites differently based on the messaging, context, and approaches different sites take. So Predator Alert Tool also needs to integrate with a culture, not just a programming language.
For more on that, see this early post by one of my collaborators, “Rape Culture, meet Internet Culture.” An excerpt:
Probably the most well-known recent pushback against rape culture is the Predditors story, in which some Reddit users discovered and published the identities of others who had been posting sexualized pictures of young women. The Predditors tumblr has since been shut down, but its contents are still available in a GoogleDoc here. Sexual abusers have also been outed via YouTube, Facebook, and Twitter. Blogs provide a public square for arguments about rape culture to rage. Twitter users directly critique the media. I’ve heard rumors of a Tumblr hashtag used by survivors to post the names and addresses of their rapists. The FetLife Alleged Abusers Database Engine (recently rolled into the Predator Alert Tools suite as the “Predator Alert Tool for FetLife”) collects anonymous reports of consent violations in the BDSM community and then flags the FetLife profiles of alleged abusers. And I recently helped beta-test a new tool, The Predator Alert Tool for OkCupid, which highlights self-reported sexually violent opinions and behaviors by OkCupid users.
I don’t think any of these tools, or even all of them together, will put the nail in the coffin of rape culture. Like other kinds of abuse, rape culture adapts to new environments quickly. Activists need to stay on our games in order to keep exposing new forms of it as they appear. We need to keep experimenting, trying new things, and being creative with whatever resources we have available. What I find most powerful about these tools is the ways each seems tailored to the specific culture from which it emerged. Predditors addresses rape culture on Reddit by retaliating against its perpetrators using technological savvy, counter-rhetoric about free speech and privacy, and a “troll the trolls” sort of strategy all suited to Reddit’s particular cultural sensibility. FAADE, on the other hand, capitalizes on a mentality strongly espoused by FetLife users that the BDSM community is like a “small town” in which everyone is connected to everyone else by kinship ties. BDSMers often rely on personal references and a player’s public reputation to assess their safety, thus a database allowing FetLife profiles (the site of a player’s public reputation online) to be tagged with negative references from community members has a powerful impact on the sub-cultural consciousness. What would a similar tool look like for Twitter or Facebook?
So again, the question you’re asking is bigger than an email. I’d be interested in having that bigger conversation with you, if you are serious about having it, too.
Thanks again for reaching out.
Cheers,
-maymay
Maymay.net
Cyberbusking.org
I was pleased by the developer’s response:
Thank you so much for all this information.
I often struggle to digest information like this; I’ll be re-reading these articles a few times to try to understand them more fully.
I would like to have the bigger conversation, but […] I need to watch out I don’t bite off more than I can chew. I regard this topic as highly important and a responsibility I now have.
The use of game theory resonates with me, as I’ve used ideas from my basic understanding of game theory to influence the structure of [my project] (only very crudely). So if I can expand those ideas in a way which protects people, all the better.
Am I right in my understanding that one core idea is that negative information is intentionally hidden in most places, in order to benefit the company? So (and this is a contrived example) where [my project] might track how many messages a person receives as a positive, it should also track, process, and weight the associated negative events: messages which go unreplied to, messages reported as abusive, etc.?
Thanks again,
[Anon Developer]
My response tried to elaborate on “negative” signaling:
Of course. That’s fine. Take your time.
It’s good that you consider this a responsibility you have, because you already had this responsibility, even before you were developing [your project]. ;)
You’re almost right about your understanding.
The bigger point being made here is that, from the perspective of users, [your project] is a hostile, not a friendly. You, as the company, are not a passive facilitator of information. You are in a decidedly dominant position over your users, and this means that you have the capacity to be predatory in relation to them, because when it comes to their interactions with or through [your project], you are obscenely more powerful than they are.
So, yes, you should also track, process, and weight negative events. But you should also not presume to necessarily know what events are negative and what events are positive. The minute you think you can determine what negative signaling is for someone else, you become much more likely to fail to empower that other person. It’s not up to you to determine what’s negative or what’s not. You can, of course, do some things to make this more obvious, and the “report abuse” feature is a start. But the problem with “abuse reports” is that those reports are sent to the entity in the [project] ecosystem that already has the most power: [the project/website/company itself]. That’s a recipe for disaster.
One simple way to tweak this system would be to display a tally of all the abuse reports a given profile has received next to that profile. Allow people to click through on that icon to a list of all abuse reports filed against that profile. Don’t hide it. Don’t make excuses for it. Don’t arbitrate it. Don’t moderate it. In a centralized system such as I understand [your project] to be (I signed up for an account today and had a look around), a moderation system is far more likely to end up as a “benevolent” dictatorship than an effective means of countering abusive behavior. You should not appoint yourself as the police.
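A rough sketch of that tweak (TypeScript again, all names hypothetical, with an in-memory array standing in for your actual datastore): the report store is append-only, and both the tally and the click-through list are computed straight from it, so there is no code path where an operator can edit, hide, or arbitrate a report:

```typescript
// A hypothetical append-only store of abuse reports. There is
// deliberately no "approved" flag and no delete operation: the
// operator never gets to arbitrate which reports are visible.
interface AbuseReport {
  reportedProfileId: string;
  reporterText: string; // the reporter's own words, shown verbatim
  createdAt: Date;
}

const reports: AbuseReport[] = [];

function fileReport(reportedProfileId: string, reporterText: string): void {
  reports.push({ reportedProfileId, reporterText, createdAt: new Date() });
}

// The tally shown next to every profile: a straight count, unmoderated.
function reportTally(profileId: string): number {
  return reports.filter((r) => r.reportedProfileId === profileId).length;
}

// The click-through view: every report, verbatim, newest first.
function reportsFor(profileId: string): AbuseReport[] {
  return reports
    .filter((r) => r.reportedProfileId === profileId)
    .sort((a, b) => b.createdAt.getTime() - a.createdAt.getTime());
}
```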
For more on this point, see my blog post, “Revisiting why ‘no moderation’ is a feature, not a bug, in Predator Alert Tool for Facebook.” An excerpt:
“Moderation” is a governance tool that may make sense in the context of online communities with a relatively homogeneous populace, such as multiplayer video games or topically oriented forums. But moderation is inherently in conflict with the goal of dissolving authority and dispersing power amongst a heterogeneous populace already prone to conflict. There is no system of moderation that is not also a system of social control. And in the context of a project explicitly designed to overcome the iniquities introduced to human experience by traditional mechanisms of social control, adding a traditional mechanism of social control is shortsighted at best and active sabotage at worst.
We realize this is difficult to understand at first. After all, there is currently no physical-world social context wherein we are free from the power of authorities we did not choose and also do not agree with. Everyone has a parent, a teacher, or a boss—even the fucking police. As one PAT collaborator wrote:
We’re all so accustomed to having our spaces monitored and moderated and overseen “for our own safety” that sometimes, when we take the well-being of our communities into our own hands, we appear to be doing more harm than good. That’s only because we’re comparing our efforts to the imaginary “safe” world we’ve been told that we live in, not to the dangerous realities that survivors actually face online and off.
Put another way, from the perspective of a vulnerable populace, namely people who are the targets of rape and physical abuse, a system that erodes the power of central authorities (such as website admins, or the cops) is a move towards safety, not away from it.
In other words, the premise of [your project] is to connect people with different characteristics who want to engage positively. This means you have to provide them with the information both to find people they like and to avoid people they don’t like. You can’t do this effectively if you only surface positive signals while hiding negative ones. And to effectively surface negative signals, you have to re-examine your assumptions about what “negative” means because, if you don’t, especially in the context of a diverse user base, you’re going to get it wrong for at least some users. When you get it wrong for them, you create an environment in which it is particularly easy to predate on that specific subsection of your user base.
That’s why most dating sites are a breeding ground for predatory users. Most dating sites are, after all, programmed by men.
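To avoid hard-coding those assumptions, one sketch (TypeScript once more, with event types invented purely for illustration) is to record events neutrally and let each viewer attach their own weights, so the platform never imposes a single definition of “negative” on a diverse user base:

```typescript
// The platform records what happened, never whether it was good or bad.
// These event kinds are invented for illustration.
type EventKind = "message_unreplied" | "message_reported" | "profile_blocked";

interface UserEvent {
  profileId: string;
  kind: EventKind;
}

// Each viewer supplies their own weights; the platform never hard-codes
// a verdict about which events count against a profile, or how much.
type SignalWeights = Record<EventKind, number>;

function scoreForViewer(
  events: UserEvent[],
  profileId: string,
  weights: SignalWeights
): number {
  return events
    .filter((e) => e.profileId === profileId)
    .reduce((sum, e) => sum + weights[e.kind], 0);
}

// Two viewers can read the same events very differently, and both are right:
const cautiousViewer: SignalWeights = {
  message_unreplied: -1,
  message_reported: -10,
  profile_blocked: -5,
};
const lenientViewer: SignalWeights = {
  message_unreplied: 0,
  message_reported: -3,
  profile_blocked: -1,
};
```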
Again, feel free to email me whenever you’re ready for another round. This is basically what I do for “a living.” :P I would strongly encourage you to read the posts tagged with “Predator Alert Tool†on the archives of my various blogs, of course.
My hope in sharing this is to encourage other people to think more critically and creatively about what structural changes are necessary to facilitate anti-abuse action. Recent attempts by Twitter and WAM have been decidedly stupid. And I don’t say that lightly. These are some exceptionally talented people in a number of fields ranging from gender advocacy to technology. And yet most of the actions I see being taken (“moderation superpowers,” to use the most recent buzzword) are downright counterproductive. Obviously.
It’s time we stopped believing that authority or authorities in public spheres are a solution. The longer we wait to face the fact that power corrupts, the more abuse we’ll bring down on ourselves, our communities, and our peers. Heed this warning: do not police.