One of the central challenges facing Internet companies today is how to identify and remove images of child sexual abuse. Microsoft on Wednesday made that task easier by turning its PhotoDNA service into an easily accessible cloud platform and encouraging companies to deploy it as soon as possible.
Microsoft said in a press release that more than 70 companies, including leading social networks Facebook and Twitter, already used PhotoDNA, but the initial version, which had to be installed on company servers, “required time, money and technical expertise to get it up and running and keep it up-to-date.”
PhotoDNA employs a common technique called hash-matching to spot child abuse pictures. It compares the “hashes,” or numerical identifiers, of possibly illegal images to the hashes of known child sexual-abuse photos. The National Center for Missing & Exploited Children (NCMEC) used a database of known child pornography to build a “hash set” against which new images can be compared.
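The hash-matching idea can be sketched in a few lines of Python. Note this is a simplified illustration, not PhotoDNA itself: PhotoDNA's hash is a proprietary "robust" perceptual hash designed to survive resizing and minor edits, whereas the SHA-256 hash below only matches byte-identical files, and the hash set here is built from placeholder data.

```python
import hashlib

def image_hash(data: bytes) -> str:
    """Compute a hash (a "numerical identifier") for an image's raw bytes."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical hash set standing in for NCMEC's database of known images.
known_hashes = {
    image_hash(b"known-image-1"),
    image_hash(b"known-image-2"),
}

def is_known_image(data: bytes) -> bool:
    """Flag an upload if its hash appears in the known hash set."""
    return image_hash(data) in known_hashes

print(is_known_image(b"known-image-1"))  # True: hash is in the set
print(is_known_image(b"new-upload"))     # False: no match
```

Because each upload reduces to a single set lookup, the check stays cheap no matter how many images a service processes, which is what makes the approach viable at social-network scale.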
Detecting child abuse in photos, a subset of the broader challenge of image recognition, has long bedeviled engineers and victims’ rights groups. A 2014 research paper proposed an algorithm that used facial recognition and skin-tone analysis to estimate age and level of clothing. Google continuously scans Gmail accounts for child pornography and reports its findings to the NCMEC.
“Manually searching for a handful of illegal images among the millions uploaded and curated every day is simply an impossible task,” said Flipboard’s Head of Platform Engineering David Creemer. He called PhotoDNA in its new cloud incarnation “an effective service that scales and works great.”
Illustration by Jason Reed