European law enforcement just rolled out machine-learning software that hunts P2P networks for child pornography. The AI automated much of the process, freeing investigators to spend more time on suspects. Interpol headquarters tested iCOP (Identifying and Catching Originators in P2P Networks) and found the software ready for use.
Claudia Peersman, the lead author of a study on the software, explained the Interpol trial results to WIRED. The team set iCOP up to identify material uploaded in the Lyon area. It watched for new content and scanned data previously stored in Interpol databases. After the test run, researchers reported low error rates: 7.9 percent for pictures and 4.3 percent for videos.
Peersman said, “We could, for example, have a look at the IP address of who is uploading these images, and check whether new images are connected to older images.”
She added that such an approach required skills from many different disciplines, and the team behind iCOP reflected that: researchers, image analysts, and linguists from organizations and universities around the world combined their expertise to save victims.
The system works similarly to Microsoft’s PhotoDNA, Peersman explained. PhotoDNA, in short, eliminates the manual comparison of two photographs.
According to Microsoft:
It works by converting images into a grayscale format, creating a grid and assigning a numerical value to each tiny square. Those numerical values represent the hash of a picture or its ‘PhotoDNA signature.’ The program protects user privacy in that it does not look at images or scan photos; it simply matches a numerical hash against a database of known illegal images. The technology allows companies to compare millions of photos against a hash set of child sexual abuse images. The hash set was created by NCMEC and derived from the ‘worst of the worst’ child pornography images uploaded to the CyberTipline by electronic service providers.
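The grid-and-hash idea described above can be sketched in a few lines of Python. This is not Microsoft’s actual algorithm, whose details are proprietary; it is a minimal illustration in which an image is assumed to be a plain 2D list of grayscale values, each grid cell is reduced to its average intensity, and matching means comparing hashes, never pixels from the database side:

```python
# Minimal sketch of grid-based image hashing, loosely in the spirit of
# the PhotoDNA description above. NOT the real PhotoDNA algorithm:
# per-cell averaging and exact-tuple matching are simplifications.

def grid_hash(pixels, grid=4):
    """Split a grayscale image (2D list of 0-255 ints) into grid x grid
    cells and return the tuple of per-cell average intensities."""
    h, w = len(pixels), len(pixels[0])
    ch, cw = h // grid, w // grid
    values = []
    for gy in range(grid):
        for gx in range(grid):
            cell = [pixels[y][x]
                    for y in range(gy * ch, (gy + 1) * ch)
                    for x in range(gx * cw, (gx + 1) * cw)]
            values.append(sum(cell) // len(cell))
    return tuple(values)

def matches_known_set(pixels, known_hashes):
    """Check an image against a database of known hashes; only numbers
    are compared, so the database images are never inspected."""
    return grid_hash(pixels) in known_hashes

# Hypothetical usage with a synthetic 8x8 "image":
image = [[(x * y) % 256 for x in range(8)] for y in range(8)]
database = {grid_hash(image)}
print(matches_known_set(image, database))  # prints True
```

In a real system, the hash would be robust to resizing and minor edits, and the database side would hold only signatures, which is the privacy property the Microsoft description emphasizes.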
The project’s developers explained that iCOP differs significantly from Microsoft’s PhotoDNA, though both serve the same broad purpose: fighting child abuse on P2P networks. “iCOP combines text and image analysis, machine learning and AI to detect new files being put on such networks automatically,” Peersman said. Microsoft’s program had already brought some relief to the investigators who saw the pictures and watched the videos themselves.
Investigators receive automated matches from iCOP, sharply cutting down on trawling through hundreds of images or watching video after video.
Awais Rashid, a professor at Lancaster University who aided in iCOP’s development, said:
It [iCOP] significantly reduces the overhead for investigators. Instead of having to trawl through scores of images and videos to identify new child abuse material, researchers are provided with automated matches which are highly accurate. In practice, this means investigators have to look at a small number of images and videos rather than thousands. (Accessed December 4, 2016, via Ubergizmo)
The toolkit is freely available to law enforcement, Peersman said. She explained that while iCOP is a breakthrough for automated crime prevention, development is far from over. The software cannot yet search the darknet, which, she said, is where offenders share child pornography most often. Researchers plan to expand iCOP’s scope beyond the clearnet: a learning machine that automatically scans the darknet while checking faces against international databases is, they hope, the future of iCOP.