For the RCMP investigators at the Canadian Police Centre for Missing and Exploited Children, who face the grim task of sifting through seized pornographic material to identify child victims in need of immediate rescue, mental health and wellness are serious concerns, says Sergeant Arnold Guerin.
“Often you go home wondering if you put your effort on the right material, focused on the right cases,” he says. “It’s that needle in a haystack, trying to find one image in among 9 to 10 million. Somewhere in that seizure is the child that’s looking at the camera and saying ‘please come and rescue me.’ That’s where we need to act.”
With Internet broadband speeds increasing across the country and storage cheaper than ever, police are seizing ever-larger amounts of child sexual exploitation material. Across the U.S. and Canada, 27,301 cases were reported in 2016, almost double the number reported in 2015. A single trove of illicit material seized by the RCMP commonly runs to about 10 million images.
Adding to the problem is a new category of material: self-generated images that adolescents send intentionally. With young children often receiving their own smartphones or tablets, it’s not unusual for them to send ill-advised pictures to a “boyfriend” or “girlfriend,” only to have the material leak out later. It’s the type of behaviour that today, Safer Internet Day, aims to help prevent through education.
“Kids have much wider access to mobile technology than we did just a generation ago,” Guerin says. “Sometimes that puts them in harm’s way.”
AI could help rescue kids
The images Guerin is describing, while a disturbing trend and liable to end up in collections of child sexual exploitation material, don’t depict victims in immediate need of rescue. Nor are they in the databases of known material that police agencies around the world share via Interpol’s International Child Sexual Exploitation image database, which means they can’t be filtered out by existing tools built on known cases. So the RCMP is turning to a new, artificial intelligence (AI) powered solution that could help it find child victims in need of help more quickly.
Helping them accomplish the task are two researchers from the University of Manitoba, and Kelowna, B.C.-based software firm Two Hat Security Ltd., with government-funded non-profit organization Mitacs arranging the connections.
Mitacs’ mandate is to bridge the gap between academic research and business applications. Projects are selected based both on their benefits to the business looking to innovate and on the academic merits of the research opportunities. All projects that earn funding are also peer-reviewed, explains Jennifer Tedman Jones, a business development director at Mitacs. The organization has worked on 18,000 projects across the country since 2003.
Two Hat had already developed a relationship working with Mitacs on its Community Sift risk-based content filter, which helps online communities prevent cyber-bullying. When Jones met with Two Hat CEO Chris Priebe at a tech hub event in the Okanagan, B.C., Priebe brought forward a new idea for a product that could identify child sexual exploitation content using AI-assisted computer vision.
“It’s one of the parts of my job that I really enjoy,” she says.
Training computer vision
The Two Hat project will see a total of $3 million in funding, creating as many as 200 internships over five years. At least 60 of those will be staffed by the University of Manitoba’s Department of Computer Science, headed up by Professor Yang Wang. So far, students Binglin Li and Mehrdad Hosseinzadah have started working on the project, setting up experiments for the RCMP to conduct with their AI algorithm.
“We’re calling it a super convolutional neural network, which is a geeky way to combine a variety of neural networks into one AI system,” says Brad Leitch, head of product development at Two Hat. “No one has really seen it as a business opportunity. Mitacs gives us an opportunity to work with students that are on the leading edge of computer vision and apply them in a meaningful way.”
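Leitch doesn’t detail the architecture, but combining several neural networks into one system typically means ensembling: running the same input through multiple models and merging their scores. A minimal sketch of that general idea, with toy linear models standing in for full convolutional networks (all names, shapes, and data here are hypothetical, not the project’s actual design):

```python
import numpy as np

def ensemble_score(features, weight_vectors):
    """Average the risk scores of several independent models.

    Each toy 'model' is just a weight vector scored with a sigmoid;
    a real system would combine full convolutional networks.
    """
    scores = [1.0 / (1.0 + np.exp(-w @ features)) for w in weight_vectors]
    return float(np.mean(scores))

rng = np.random.default_rng(0)
models = [rng.normal(size=4) for _ in range(3)]  # three toy "networks"
x = np.array([0.5, -1.0, 0.25, 2.0])            # one image's feature vector
print(0.0 < ensemble_score(x, models) < 1.0)    # sigmoid averages stay in (0, 1)
```

Averaging several models’ outputs tends to smooth out the individual errors each one makes, which is one reason ensembles are a common way to squeeze more accuracy out of computer vision systems.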
To train a computer vision system, you’d typically feed it a huge collection of images. First, you’d show it many examples of positive matches to the thing you want it to identify. Then you’d show it examples of negatives, things that are different from what you want to identify. But in the case of dealing with exploitative material of this nature, researchers aren’t allowed to possess it. That’s where working with the RCMP to conduct experiments comes in.
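The recipe described above — positives in, negatives in, weights adjusted until the two separate — can be shown with a toy logistic-regression classifier on made-up feature vectors (a stand-in for the real convolutional training, not the project’s actual code):

```python
import numpy as np

def train_classifier(positives, negatives, epochs=200, lr=0.1):
    """Logistic-regression stand-in for the training loop described above:
    feed the model positive examples (label 1) and negatives (label 0),
    and nudge the weights toward separating the two."""
    X = np.vstack([positives, negatives])
    y = np.array([1] * len(positives) + [0] * len(negatives))
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        preds = 1.0 / (1.0 + np.exp(-X @ w))  # current probability estimates
        w += lr * X.T @ (y - preds) / len(y)  # gradient step on log-likelihood
    return w

# Toy "feature vectors": positives cluster high, negatives low
rng = np.random.default_rng(1)
pos = rng.normal(2.0, 0.5, size=(50, 3))
neg = rng.normal(-2.0, 0.5, size=(50, 3))
w = train_classifier(pos, neg)
score = 1.0 / (1.0 + np.exp(-pos[0] @ w))
print(score > 0.5)  # a positive example should score above 0.5
```

The legal constraint the article mentions is exactly why this loop is hard here: the researchers can build the training machinery, but only the RCMP can run it against real positive examples.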
Using an open source deep learning model, the researchers have been able to estimate the age of a person in a photo to within three years with 84 per cent accuracy. The goal is to identify children aged 0-13 in images so the RCMP can prioritize pre-pubescent victims.
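Once an age estimate exists for each image, the triage step is straightforward. A minimal sketch of how predicted ages could drive a review queue (the field names and cutoff here are hypothetical, not the RCMP’s actual workflow):

```python
def prioritize(images, age_cutoff=13):
    """Sort flagged images so those with the youngest estimated subjects
    are reviewed first. 'estimated_age' would come from the deep
    learning model; both names are illustrative only."""
    urgent = [im for im in images if im["estimated_age"] <= age_cutoff]
    return sorted(urgent, key=lambda im: im["estimated_age"])

queue = [
    {"id": "a", "estimated_age": 16},
    {"id": "b", "estimated_age": 7},
    {"id": "c", "estimated_age": 12},
]
print([im["id"] for im in prioritize(queue)])  # ['b', 'c']
```

Even with a ±3-year error bar, ranking by estimated age would surface likely pre-pubescent victims far earlier than reviewing millions of images in arbitrary order.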
“It’s definitely with a heavy heart when you talk about these issues,” Leitch says. “I have five kids and the things we’ve been working on have forced me to have some hard conversations with them about the world we’re living in.”
Eventually, Two Hat hopes to develop a computer vision system that is so accurate at detecting child sexual exploitation material that social networks like Facebook could use it to prevent the content from ever being uploaded. Even when adolescents show misjudgment and try to send illicit material of their own via social media tools, intervening with a strong warning about possible consequences could make them rethink the action.
Tools like Microsoft’s PhotoDNA cloud service already identify and remove illegal images today, but again, they are based on material that has already been documented and is known to law enforcement.
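PhotoDNA’s algorithm is proprietary, but the general approach it represents — a perceptual hash compared against a database of known images — can be illustrated with a simple average hash (a generic sketch of hash-based matching, not PhotoDNA itself):

```python
import numpy as np

def average_hash(pixels):
    """64-bit average hash: threshold an 8x8 grayscale image at its mean.
    Known-material databases store fingerprints like this, not images."""
    return (pixels > pixels.mean()).astype(np.uint8).ravel()

def hamming(h1, h2):
    """Number of differing bits; a small distance means a likely match."""
    return int(np.sum(h1 != h2))

rng = np.random.default_rng(2)
known = rng.integers(0, 256, size=(8, 8)).astype(float)
brightened = known + 10.0  # a uniform brightness shift, a common re-encode artifact

# The hash thresholds at the image's own mean, so a brightness
# shift leaves every bit unchanged and the copy still matches
print(hamming(average_hash(known), average_hash(brightened)))  # 0
```

This robustness to small edits is what lets hash matching catch recirculated copies of documented images — and also why it can never flag new, never-before-seen material, which is the gap the AI approach aims to fill.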
“That would be the core opportunity, to help social networks filter out that unwanted content,” he says. “The sooner we get this into law enforcement’s hands, the better off kids will be.”
Help for police across Canada
Guerin sees high potential for the tool in his RCMP unit, where it could reduce the time it takes to triage the images that require an investigator’s immediate attention. It could be put to use sifting through databases of material reported by industry, by other police units, and by the public.
“It makes it safer for the child and as well it allows us to accelerate that child’s rescue,” he says.
To help make the experiments run as smoothly as possible, Guerin recruited an officer who had been based in the RCMP’s CIO office and had done previous work involving graphics processing units. It won’t be long before the system is shared across the country with any law enforcement effort facing the same sort of problem.
“We can give them the capability to apply the algorithm to the material and maximize our ability to find and rescue kids and reduce the time it takes to analyze all that material,” he says.
And that might help investigators, when they go home at night, feel like they did everything they could to find and help child victims of a disturbing crime.