The real lesson from Facebook’s most recent big move.
This past week, the New York attorney general’s office announced a partnership with Facebook to help combat child sex trafficking, which includes technical assistance to help law-enforcement officials find perpetrators and rescue victims. As Fortune reports, the plan involves the use of online data and algorithms to analyze Internet ads, searching for patterns in images, text, phone numbers, and other key info that turns up on sites where minors are trafficked, such as Backpage and Craigslist.
Although the initiative appears limited to New York for now, it could potentially serve as a national model for how major online platforms can proactively work to protect the young people who use their services most. Last year, the International Business Times noted that children between the ages of 13 and 17 represent roughly 10 million Facebook users, despite declining use among that demographic. And according to data released in April from the Pew Research Center, Facebook, Instagram, and Snapchat are far and away the most popular social media apps for teens.
But that dedicated teen following, as many law enforcement agencies are well aware, translates to traffickers using these platforms as recruitment tools. Or worse, they’re leveraged to lure unsuspecting young people into situations where they’re trapped, attacked, and forced into sexual slavery and prostitution, with families and loved ones left in the dark.
It’s a potential fate that’s lingered in my memory for quite some time—because I was almost a child sex trafficking victim.
In the early days of the World Wide Web, chatrooms powered by MSN and Yahoo offered some of the first online homes for young people and adults to connect. I was only 11 years old when my parents purchased our first Internet-ready computer—and, in no time, I created my own online identity, learned the ropes, and quickly made “a/s/l” part of my lexicon.
As a bullied, urban-raised young person ostracized to the point of having few friends, I eagerly anticipated coming home after school to spend hours mindlessly chatting, in a place where everybody knew my Internet name. But I wasn’t ready for the adult interlopers; I knew little about their tactics, or about how many of them tried to sexually bait children online.
By the time I turned 14, my friends in the chatrooms and other responsible community managers had already sniffed out dozens of posers—some of whom exposed themselves via email, instant messenger chats, and even webcam-based rooms. I was open to chatting with other queer teens around my age, and naively understood the attention from older men as a compliment rather than as a potential warning sign.
Although I knew to never tell them where I actually lived, or to give them my phone number, I once did something even worse: I lied to my parents about my after-school whereabouts, using public transportation to meet a gay guy who was allegedly close to my age.
I agreed to a location relatively far from home—and upon encountering him, I noticed a face slightly more mature than the one he shared in photos. But I hopped in his car anyway, and quickly realized he was less interested in getting to know me as a person, and more curious about what I could sexually do for him. With every passing minute, I grew more and more afraid, and knew I had to escape somehow.
Fortunately, we were in an area public enough for me to get out of the car, and wait safely for the next regional commuter train. I called my parents to let them know I was en route, never once giving myself away when asked why I stayed out so late.
I clutched my cell phone the entire ride, and never looked back.
During that near-miss with sexual exploitation, anything could have happened, and there was little being done by Internet companies to protect young people like me. Facebook didn’t exist then, nor did any of the apps that rely on the widespread adoption of smartphones. However, in the weeks that preceded my encounter, MSN announced plans to close their free, unsupervised chat rooms in 28 countries, citing concerns about child safety.
“We have seen cases where under-16s have been approached by people pretending to be same age, but who are grown adults trying to solicit young people for abusive contact,” an MSN spokesperson said at the time, later adding, “We regret responsible users will be affected by this move, but we looked at this very carefully and on balance to safeguard customer interests.”
According to the BBC’s report about the controversial move, roughly 20 percent of children ages 9 to 16 regularly used chat rooms, with half of those children engaging in sex chats. One in 10 of them, per a 2003 Cyberspace Research Center study, met someone from a chat room face to face.
But 12 years ago, online platforms such as chat rooms weren’t equipped with the technological savvy, hyperconnected networks, and vast pools of big data that now lie in the hands of companies like Facebook, Instagram, and Snapchat. The best a site like MSN could do was increase community moderation or, as it eventually did, shutter unsupervised rooms altogether. But given that there’s little-to-no supervision required to participate on today’s social apps, children are likely more vulnerable than ever to dangerous solicitations from traffickers and pedophiles alike.
Fortunately, while Facebook appears to be just starting to take proactive measures against traffickers, other companies such as Google are already part of the action. As the Silicon Valley Business Journal highlighted in March 2014, Google partnered with Palantir and Microsoft to implement digital tools similar to the ones being leveraged by Facebook today.
“Data analysis, image recognition and mapping programs are helping anti-trafficking nonprofits not only locate victims in real time, but predict their victimizers’ next moves,” the report noted. “Palantir’s software, which sifts volumes of unrelated data for meaningful connections, is key to many of those efforts, including speedy response to victims who call hotlines. It instantly pulls information from disparate sources such as license plate numbers, online ads and cellphone records to locate trafficking victims and connect them with help.”
There’s no excuse for relative inaction on child sex trafficking from major tech companies and social media apps. If they’re comfortable profiteering from teens’ heavy, frequent usage of their platforms, those resources must be employed to protect the millions of young people vulnerable to the moves of predators. According to the International Labour Organization, of the estimated 4.5 million people victimized worldwide by sex trafficking, 21 percent are children, overwhelmingly female.
Of course, not all of them may have access to smartphone apps, but there’s no question that these mediums have become prime recruiting grounds for abusers. And with more effort from the tech giants now responsible for cultivating online communities, countless young people (and even adults) can be shielded or rescued from traffickers—including unknowing, naive teenagers like the one I once was.
Derrick Clifton is the deputy opinion editor for the Daily Dot and a New York-based journalist and speaker, primarily covering issues of identity, culture, and social justice.
Illustration by Max Fleishman