The Porn Diaries #4: 50 Billion Shades of Gray

When I first started working in the anti-porn industry, the idea of pornography was pretty black and white in my mind, even though it was hard to explain. I, like many others, identified with the phrase, “I know it when I see it.” The trouble is, as I’ve discovered, not everyone sees the same thing. I could see nudity and not think of it as porn, for instance, but there definitely had to be nudity for it to be porn. That last thought has become something of the bane of my existence since it became my job to find porn for our software.

I came into this position after an extensive background working customer service for a different anti-pornography software company, and the small glimpse that job gave me into the lives of the people using this kind of software suggested that over-blocking was better than under-blocking. I would’ve said then, with complete certainty, that the people using that software would rather have the filter be too sensitive than miss any pornography. That is not the case here.

To be fair, I wouldn’t expect anyone to be OK with their cooking shows being blocked, but I was surprised by the number of inquiries surrounding lingerie and swimsuits, which we internally refer to as “gray” images. My previous experience in the field did not prepare me for people being upset about these images being blocked, aside from those who were trying to shop online. Even as one person was upset about a bikini being blocked, another would praise us for the very same thing. “I’m so glad you block bikinis,” they’d say, and I’d quietly ponder the differences in personal preferences.

In my head, I had envisioned a picture taken directly from a swimsuit catalog as the thing inspiring this debate, but as we delved into the process of teaching the software what a gray image was, I realized the category deviated far from our initial “swimsuit and lingerie” definition. Where there was once one color, there was now a spectrum.

The question of decency no doubt varies between people, which adds to the complexity of the gray classification. Even within a small team of generally like-minded people, terms like “modest” mean shockingly different things. Things that were clean to me were gray to others, and things that were gray to me might be dirty to others. If even a handful of people categorized things so differently, the prospect of striking a balance between blocking and not blocking gray images in a way that pleased the masses started to seem bleak.

After months of work, we had culled what seemed like an endless number of images spanning an equally endless spectrum into the perfect categorization of gray images, with the intention that these particular images would be considered clean. The best-laid plans, as they say. It turned out that as we improved our gray image detection, our porn detection suffered, and while we can compromise on what gray is, we cannot risk failing to block pornography.

In the end, our driving purpose is to block pornography. After months of toiling over what gray is, trying to understand the entire spectrum between clean and dirty and the lines that separate clean-gray from gray from dirty-gray, suddenly, it was done. As it is, the software will sometimes block a bikini, and sometimes, it won’t. As long as it’s still blocking pornography, we honestly consider both options a win.