Microsoft develops tool to ID pedophiles in online chats

Project Artemis could be a game-changer.

Jan 13, 2020, 2:13 pm


Mikael Thalen


Technology giant Microsoft has announced a new tool designed to catch pedophiles who target children in online chats.

Code-named Project Artemis, the system has long been used by Microsoft to analyze chats on its own platforms, such as Xbox, and is now being offered to other tech companies.

While details are scarce, the tool works by recognizing specific words and speech patterns. Chats that are flagged as suspicious are handed over to human moderators for review. A decision can then be made as to whether the conversation warrants law enforcement attention.
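Microsoft has not published implementation details, but the pipeline described above — scanning messages for suspicious language, scoring the conversation, and escalating flagged chats to human moderators — can be sketched in rough terms. The patterns, scoring logic, threshold, and function names below are entirely hypothetical; the real system is far more sophisticated than simple keyword matching.

```python
import re

# Hypothetical illustration only: toy patterns standing in for whatever
# word and speech-pattern recognition the real system performs.
SUSPICIOUS_PATTERNS = [
    re.compile(r"\bhow old are you\b", re.IGNORECASE),
    re.compile(r"\bdon'?t tell (your )?(mom|dad|parents)\b", re.IGNORECASE),
]

def risk_score(messages):
    """Count pattern hits across a conversation, normalized to [0, 1]."""
    hits = sum(
        1 for msg in messages for pat in SUSPICIOUS_PATTERNS if pat.search(msg)
    )
    return min(hits / max(len(messages), 1), 1.0)

def flag_for_review(messages, threshold=0.2):
    """Escalate a conversation to human moderators if its score exceeds a threshold."""
    return risk_score(messages) >= threshold

suspect_chat = ["hey", "how old are you?", "don't tell your parents we talked"]
benign_chat = ["hey", "how was school today?"]
print(flag_for_review(suspect_chat))  # True
print(flag_for_review(benign_chat))   # False
```

The key design point the article describes survives even in this toy version: the automated scorer only decides whether a conversation reaches a human reviewer, not whether it goes to law enforcement.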

Courtney Gregoire, Microsoft’s chief digital safety officer, stated in a recent blog post that the tool will be free for companies that offer a chat function in any of their products.

“At Microsoft, we embrace a multi-stakeholder model to combat online child exploitation that includes survivors and their advocates, government, tech companies and civil society working together,” Gregoire writes. “Combating online child exploitation should and must be a universal call to action.”

The tool would not work, however, for messaging programs that use end-to-end encryption. And as with any such tool, concerns have been raised about false positives, which could subject innocent users' conversations to monitoring as well.

Given recent controversies over companies secretly monitoring their customers' communications, those who deploy the software would likely have to update their terms of service to obtain consent from users.

Microsoft admits that Project Artemis “is by no means a panacea,” but argues that its tool represents a significant step forward nonetheless.

“Child sexual exploitation and abuse online and the detection of online child grooming are weighty problems,” the company notes. “But we are not deterred by the complexity and intricacy of such issues.”


H/T MIT Technology Review
