
Two University of Iowa professors are bringing their expertise together to develop technology to identify abuse as it happens. (Photo by James Morgan/Getty Images)

Two University of Iowa professors are working to develop an artificial intelligence camera system that could eventually track and catch child abuse as it happens.

Karim Abdel-Malek, UI professor and interim director of the Iowa Technology Institute, and social work Associate Professor Aislinn Conrad are merging their backgrounds in child welfare and human modeling technology to develop cameras that will detect when an instance of violence has occurred. The cameras would send a recording of the incident to the person who set up the system.

Abdel-Malek said they’d start off targeting the software at abuse of children and elderly people, but as the program grows there will be more opportunities for it to help.

“It’s well known that kids and older adults do not speak for themselves, and so we hope that this tech can really help,” Abdel-Malek said.


The technology for the camera system started as a program called the “Virtual Soldier,” which Abdel-Malek said he and others developed over 20 years working with the military. The technology could determine someone’s “kinematics,” or the “position, velocity and acceleration of each limb as it moves,” he said.
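
For readers curious what “kinematics” means in practice, here is a minimal sketch, in Python, of how a limb’s velocity and acceleration could be estimated from a camera’s tracked joint positions using finite differences. The data and the `limb_kinematics` helper are hypothetical illustrations, not the UI team’s actual software.

```python
import numpy as np

def limb_kinematics(positions, dt):
    """Estimate velocity and acceleration of one tracked joint from a
    sequence of (x, y) positions sampled every `dt` seconds.

    `positions` is a made-up array of shape (frames, 2); the real
    system's data format and pipeline are not public.
    """
    positions = np.asarray(positions, dtype=float)
    velocity = np.gradient(positions, dt, axis=0)      # first derivative of position
    acceleration = np.gradient(velocity, dt, axis=0)   # second derivative of position
    return velocity, acceleration

# Example: a wrist keypoint tracked over five frames at 30 frames per second
wrist = [(0.10, 0.50), (0.12, 0.48), (0.20, 0.40), (0.35, 0.25), (0.55, 0.10)]
vel, acc = limb_kinematics(wrist, dt=1 / 30)
print(vel[-1], acc[-1])  # a sudden spike in speed or acceleration is the kind of signal a detector might flag
```
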

From there, he wanted to explore whether AI could use those kinematics to identify when someone is being physically abusive. It took two years to get the patent, he said, which was actually a fairly speedy process.

Abdel-Malek reached out to Conrad to bring her in as an expert in child abuse prevention, with both academic and in-the-field experience. Before coming to the UI to teach in the school of social work, Conrad worked as a child welfare investigator and case manager in the foster care system.

As the two only started working together in September, Conrad is currently focused on applying for grants and other funding to help move the project forward. In the future, Abdel-Malek said the team hopes to bring in actors and develop scenarios in order to “teach” the AI system what it should flag as abuse and what it should overlook.
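
As a rough illustration of what “teaching” such a system could look like, the hypothetical Python sketch below fits an off-the-shelf classifier to a few invented motion features labeled as abusive or benign. The features, labels and model here are assumptions for illustration only; the UI team’s actual training approach has not been made public.

```python
from sklearn.linear_model import LogisticRegression

# Each acted clip is reduced to simple motion features:
# [peak limb speed (m/s), peak limb acceleration (m/s^2)] -- made-up numbers.
features = [
    [0.8, 2.1],   # handing a child a toy (benign)
    [1.1, 3.0],   # picking a child up (benign)
    [4.5, 40.0],  # striking motion (abusive)
    [5.2, 55.0],  # shaking motion (abusive)
]
labels = [0, 0, 1, 1]  # 0 = overlook, 1 = flag

model = LogisticRegression().fit(features, labels)
print(model.predict([[4.8, 47.0]]))  # -> [1], i.e., flag this clip for review
```
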

They’re trying to find its limitations, Abdel-Malek said, as well as where it can be expanded and how it could be commercialized in the future. He’s worked on other programs like this that have been brought to market, like software that can detect spills in a grocery store or detect guns.

One real-world application would be for parents, who could hide the camera somewhere in their home to monitor a caretaker with their child. Abdel-Malek described it as “a little Big Brother,” but said its intended use is to eliminate abuse.

“When I talk to (parents), it’s very much the sigh of relief of wow, what would that be like to actually leave and know that if something did happen, it would be documented,” Conrad said.

Not only would parents or caregivers be notified if abuse occurred, Conrad said, but the program could also be set up to notify authorities.

In terms of where the system could be utilized, Abdel-Malek said they will start by targeting environments with children and the elderly, like babysitting situations, day cares and nursing homes.

However, Conrad said its applications won’t stop there. Eventually, once the product is on the market and proven to work, the system could be pitched to child welfare agencies, other elderly services, schools and other gathering places where violence could occur.

“We do believe this camera will change the way society handles and responds to violence, and it’s really a paradigm shift,” Conrad said. “I mean, it’s one thing to start with children and the elderly, but the applications of this are for any vulnerable person at risk for violence.”

About 1 in 3 people, especially women, is at risk for violence, Conrad said, and 1 in 9 children will experience abuse. Despite the many efforts made over the years, Conrad said these numbers haven’t changed much.

She and Abdel-Malek are hoping that with time and effort, this system will help change how people are educated about and made aware of abuse, and, in turn, how they work to mitigate it.

“I think it’ll take a little bit of time, but I wouldn’t be surprised if in a few years, we start seeing a shift,” Conrad said.

