A New Jersey man is suing the gay hookup app Grindr after meeting a teenage boy through the service and later facing sexual assault charges. He claims the app’s lax age restrictions and poor enforcement are how he wound up in a sexual encounter with a 13-year-old boy. It’s not the first time Grindr has been in the hot seat over a situation like this, nor is Grindr the only app facing the problem. The question now becomes: Is it time to put age verification requirements in place for online and mobile dating?
The first alleged instance of an underage boy using Grindr for hookups came in 2010, when Bren Tynan of Vancouver engaged in a sexual encounter with a 15-year-old he’d met through the service. Delaware Deputy Attorney General Daniel Simmons, 34, used Grindr to meet up with a 16-year-old boy earlier this year and has since been charged with sexual assault. Christopher Lamarche LeBlanc met up with a 14-year-old through Grindr in 2012 and was discovered only after the victim’s father approached the police with concerns about his son’s mobile activity. Another underage victim was bullied at school after a Grindr hookup and developed suicidal thoughts, which is what led police to his assailant.
The media reaction to these stories is typically one of shock and disgust—the horror of seeing children exploited through hookup apps, in some cases compounded by a fear of homosexuality (and the ever-present fascination with gay sexuality). There’s also a note of victim-blaming, however: some comments on news articles ask why underage users are on the service at all, and whether it’s fair to hold adults responsible for sexual encounters with underage people who represented themselves as adults.
Other reactions underscore the need to talk to young adults and children to create a safe environment for them to discuss sexuality—as seen in the comments on this Reddit thread from a horrified gay man who found his cousin on the service. Some might argue that the responsibility could better be laid at the feet of the application itself, as Grindr, Tinder, and similar services have the tools to successfully implement age verification systems that would protect children from exploitation by adults—but this may be the wrong way to go about it.
Grindr claims it is not responsible for information provided by third parties, in accordance with the Communications Decency Act. While it is possible to report underage users on the service, the grievance process can take time, and a profile may remain up for hours or days before the support team can take it down. The terms of service are also very clear about the fact that Grindr is not responsible for information provided by other users, the same cover used by websites across the Internet to shield themselves against complaints from users about objectionable content or misleading material.
The dating app industry isn’t the only one to have faced this problem: So has porn.
In 2013, the adult entertainment industry lost a bitter fight over what are known as “2257 Reporting Requirements,” named for the section of the U.S. Code that requires the industry to maintain detailed age verification records on performers. Litigation over the requirements has gone back and forth, with some courts finding them reasonable while others believe they create an undue burden that may push the boundaries of the Constitution. Under the requirements, adult performers must present identification documents that their employers must retain, and whenever their images are depicted, transmitted, or distributed, information about the location of those records must be provided.
The industry argued that 2257 Reporting Requirements created an undue burden, and some sex workers also pointed out that it posed safety risks for performers. In the event that records were poorly secured or released to unauthorized parties, it would amount to a mass outing of sex workers, complete with personal details.
Designed to address the fact that the industry was profiting from very youthful-looking female performers, the 2257 regulations were intended to protect children, which raises the question of whether similar requirements could, or should, be implemented for dating applications. If users are exchanging information through an application to meet up for sex, should their ages be verified to weed out underage users and protect them from sexual assault?
The gut answer might be “yes,” because socially, many of us share the desire to protect children from obvious sources of harm. However, the problem with children using hookup apps lies not in the fact that their ages aren’t being verified, but in that they’re navigating larger sexual and personal experiences and emotions that aren’t being addressed. Children living in repressive environments, for example, may turn to the Internet for information and, yes, for sexual partners—because humans don’t have a switch that flips at the age of consent.
When it comes to protecting underage users on hookup apps, the solution lies in providing comprehensive sex education in schools to create a safe, non-judgmental environment for children to raise sexual concerns and issues. It also lies in parental outreach and education. Stories like these elicit kneejerk reactions from parents, who fear the possibilities of technology and put their children on a tighter leash in a mistaken attempt to protect them. It’s scary to find a teen on a hookup app, but not because of the app itself: it’s scary because the teen was too afraid to talk openly about sexuality and chose a potentially high-risk behavior instead.
The tendency to lock down access to tech is essentially the opposite of a productive response—a savvy parent will instead reach out to children to create a safe home environment and an open atmosphere for discussion in the first place. Would gay teens sneak out of their homes for sex if their parents were supportive of their sexuality? Perhaps, but it wouldn’t happen as often, especially if they were receiving good sex education at school and their school districts were committed to fighting homophobic bullying and abuse.
Putting the burden on apps might seem like an easy solution, shifting responsibility to an easy target to hate. However, it ignores the right of grown adults to exercise their sexuality as they please, without potentially dangerous restrictions (many adults wouldn’t want dating apps keeping their home address on file, and who can blame them, given the frequency of privacy breaches). Children are curious; they often lack a safe outlet for their sexuality, and they want to explore the world in a way they think is safe. For some, dating apps look appealing, and that’s not the fault of the app designers—it’s the fault of grownups for not making a safe world for children.
Adult users of dating apps should be capable of telling the difference between a 13-year-old boy and someone who is over the age of consent, and those who use such services to prey on underage children of any gender should be the real targets of any initiative to crack down on sexual assault through dating and hookup apps. Even stringent age requirements, which would be logistically unenforceable given the scope of the work involved, wouldn’t keep children off such services—and their predators would be there waiting for them.