Microsoft’s cloud photo tool could help websites spot child abuse photos
It’s a much-needed tool, but it remains to be seen how much of a difference it will make.
One of the central challenges facing Internet companies today is how to identify and remove images of child sexual abuse. Microsoft on Wednesday made it easier for companies to tackle this challenge by turning its PhotoDNA service into an easily accessible cloud platform and encouraging companies to deploy it as soon as possible.
Microsoft said in a press release that more than 70 companies, including leading social networks Facebook and Twitter, already used PhotoDNA, but the initial version, which had to be installed on company servers, “required time, money and technical expertise to get it up and running and keep it up-to-date.”
PhotoDNA employs a common technique called hash-matching to spot child abuse images. It compares the “hashes,” or numerical identifiers, of potentially illegal images to the hashes of known child sexual-abuse photos. The National Center for Missing & Exploited Children (NCMEC) used a database of known child pornography to build the “hash set” against which new images are compared.
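The hash-matching workflow described above can be sketched in a few lines. Note that PhotoDNA itself uses a proprietary robust hash designed to survive resizing and recompression; this illustration substitutes an exact cryptographic hash (SHA-256), and the image bytes and hash set are invented placeholders, not real data.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-derived hash set of known images.
# In the real service, these would be PhotoDNA robust hashes, not SHA-256.
KNOWN_HASHES = {
    hashlib.sha256(b"known-flagged-image-bytes").hexdigest(),
}


def image_hash(image_bytes: bytes) -> str:
    """Compute a numerical identifier (a 'hash') for an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()


def matches_known_set(image_bytes: bytes) -> bool:
    """Compare an upload's hash against the known hash set."""
    return image_hash(image_bytes) in KNOWN_HASHES


# A newly uploaded image is flagged only if its hash is already in the set.
print(matches_known_set(b"known-flagged-image-bytes"))  # True
print(matches_known_set(b"ordinary-vacation-photo"))    # False
```

The key design point is that companies never need to store or view the underlying illegal images: comparing fixed-length identifiers is enough, which is what makes the approach practical to run over millions of daily uploads.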
Detecting child abuse in photos, a subset of the broader challenge of image recognition, has long bedeviled engineers and victims’ rights groups. A 2014 research paper proposed an algorithm that used facial recognition and skin-tone analysis to estimate age and level of clothing. Google continuously scans Gmail accounts for child pornography and reports its findings to the NCMEC.
“Manually searching for a handful of illegal images among the millions uploaded and curated every day is simply an impossible task,” said Flipboard’s Head of Platform Engineering David Creemer. He called PhotoDNA in its new cloud incarnation “an effective service that scales and works great.”
Illustration by Jason Reed
Eric Geller is a politics reporter who focuses on cybersecurity, surveillance, encryption, and privacy. A former staff writer at the Daily Dot, Geller joined Politico in June 2016, where he's focused on policymaking at the White House, the Justice Department, the State Department, and the Commerce Department.