BishopAccountability.org
 
 

AI inspired by the film Spotlight could track down child abusers

By Timothy Revell
New Scientist
May 30, 2018

Journalists at The Boston Globe searched for patterns in public records to uncover priests in the Catholic church who had sexually abused children. Now, researchers think artificial intelligence could do the same job faster, more accurately and on a much wider scale.

The Boston Globe investigation, depicted in the film Spotlight, involved looking for clues like priests suddenly going on sick leave or moving around a lot. Joelle Casteix at the Zero Abuse Project, a non-profit that aims to help institutions prevent child abuse, and her team have created an AI that looks for similar patterns in thousands of documents from large organisations.

Casteix unveiled the project at the AI for Good Global Summit in Geneva, Switzerland, last week. “I am a survivor of sexual abuse from a teacher, which was followed by a lot of cover-up,” says Casteix. “This is the first time there is a proactive way to stop the cycle.”

The new initiative, called Project G, can work with digital documents directly or turn paper scans into machine-readable files for the AI to scour. Depending on the organisation, the documents can include records of where people have been based and what roles they have held over time, as well as news clippings in which they are mentioned.
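
The article does not describe Project G’s ingestion pipeline in any detail. As a rough illustration of that step only, the Python sketch below reads digital text files directly and runs page scans through the open-source Tesseract OCR engine (via the pytesseract library); the file names and directory are hypothetical.

# Rough sketch only: Project G's actual ingestion pipeline is not public.
from pathlib import Path
from PIL import Image
import pytesseract

def load_document(path: Path) -> str:
    """Return machine-readable text from a digital file or a scanned page image."""
    if path.suffix.lower() in {".txt", ".csv", ".html"}:
        return path.read_text(encoding="utf-8", errors="ignore")
    # Paper scans (e.g. TIFF or PNG page images) go through OCR first.
    return pytesseract.image_to_string(Image.open(path))

# Hypothetical archive directory holding a mix of digital files and scans.
corpus = [load_document(p) for p in Path("archive").iterdir() if p.is_file()]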

As the system learns what a normal career trajectory looks like from decades’ worth of documents, outliers begin to stick out. Many people take unusual career paths, but people whose patterns seem to match those of an abuser rise to the top of the suspect list. Casteix and her colleagues would then investigate further.
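
New Scientist does not say which model Project G uses, but the behaviour described, learning what a typical career trajectory looks like and surfacing the outliers, is what standard unsupervised anomaly detection does. A minimal sketch in Python, assuming hand-built per-person features such as reassignment counts and unexplained leave; the feature set and numbers here are invented for illustration, not taken from the project.

# Illustrative sketch only: Project G's actual features and model are not public.
import numpy as np
from sklearn.ensemble import IsolationForest

# Hypothetical per-person features extracted from assignment records and clippings:
# [number of reassignments, mean tenure per posting (years),
#  unexplained leave periods, unexplained gaps in the record (years)]
features = np.array([
    [2, 6.0, 0, 0.0],   # typical career
    [3, 5.0, 1, 0.5],
    [9, 1.2, 4, 3.0],   # frequent moves, long gaps -> likely outlier
    [2, 7.5, 0, 0.0],
])

# Fit an unsupervised outlier detector on decades of records so that
# unusual trajectories receive low (more negative) scores.
model = IsolationForest(contamination=0.1, random_state=0).fit(features)
scores = model.score_samples(features)   # lower = more anomalous

# Rank people so the most unusual trajectories rise to the top of the review list.
ranked = np.argsort(scores)
print(ranked, scores[ranked])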

The tool also gives people an association score, highlighting those who may be involved in any cover-up. “You can pull the predator out, but if you don’t figure out the patterns of the cover-up, the pattern happens all over again,” says Casteix.
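
The article does not explain how the association score is calculated. One simple reading, sketched below in Python, is a co-occurrence count: how often a person appears in the same documents as individuals the anomaly model has already flagged. The names, documents and scoring rule are hypothetical.

# Illustrative sketch only: the article does not say how the association score is computed.
from collections import defaultdict
from itertools import combinations

# Hypothetical input: each document lists the people named in it.
documents = [
    {"person_a", "person_b", "person_c"},
    {"person_a", "person_c"},
    {"person_b", "person_d"},
]
flagged = {"person_a"}  # people already ranked highly by the anomaly model

# Count how often each person co-occurs in documents with flagged individuals.
association = defaultdict(int)
for doc in documents:
    for p, q in combinations(doc, 2):
        if p in flagged and q not in flagged:
            association[q] += 1
        elif q in flagged and p not in flagged:
            association[p] += 1

# Higher scores suggest people worth reviewing for possible involvement in a cover-up.
print(sorted(association.items(), key=lambda kv: -kv[1]))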

The software is still under development, but the team has already fed in documents on known child abusers and seen the AI assign them the highest ratings. It has also picked up on some people the team believes to be abusers, based on its investigative work outside of developing the AI, but who have yet to be exposed publicly. The team hopes that large organisations will want to use the tool to help root out abuse.

Of course, a tool like this could lead to innocent people being accused of child abuse. To try to avoid this, Casteix says it will only be used to augment the Zero Abuse Project’s existing work. “It is one strategy amongst many. It needs to go along with training, transparency and working with law enforcement,” she says.

How well Project G will work is unclear, says Lorraine Radford at the University of Central Lancashire, UK. “Offenders are very manipulative people. They don’t only manipulate the victim, they also manipulate the people around them,” she says. This means that abuse can continue under the radar for years.

However, the project may be able to spot some types of offender who leave clearer patterns of suspicious behaviour, says Radford. “Any way it can help would be fantastic,” she says.

Any original material on these pages is copyright © BishopAccountability.org 2004. Reproduce freely with attribution.