Last night, news broke that YouTube was once again on the hook for allowing illicit material on its platform through inadequate monitoring. In the past, the company came under fire for pornography disguised as children's content lurking within channels specifically designed for kids, but this newest discovery is far worse. This time, YouTube isn't just exposing children to pornography, which is bad enough; it is facilitating and monetizing child sexual exploitation.
As a company that relies on algorithms ourselves, we understand the difficulty of navigating the nuances of machine learning. YouTube, for its part, responded to the earlier scandal by putting in place a strict ban on child pornography and dedicating more resources to monitoring. Unfortunately, implementing a wholesale ban on key phrases that don't necessarily relate to child pornography can have disastrous effects, as YouTube, and the world, have seen.
So what does this mean? Is YouTube now officially persona non grata? Ultimately, that will depend on how it responds to this crisis. Child pornography is never acceptable, and according to its own rules, YouTube agrees. As a platform with an enormous user base, it has both the privilege and the challenge of doing better in light of this discovery. To expect that nobody will ever break the rules is naïve; no user-uploaded content site will ever be entirely free of porn. Suggesting that nothing can be done after a breach, however, would be irresponsible and unacceptable. How YouTube rises to the occasion to combat this and future occurrences of child pornography will show how important the matter is to the company, and we, for one, are watching to see what it does.