In a pair of articles published in the September issue of the Journal of Criminal Justice, professor Rachel Lovell detailed how she used a computer to analyze thousands of incident reports written by Cleveland police officers over two decades in response to sexual assault complaints.
Specifically, Lovell programmed the computer to measure officer bias in each report. The algorithms proved so adept at evaluating the language that they could predict which reports led to prosecutions and which ones died.
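The article does not specify Lovell's method, but predicting an outcome from the language of a document is a standard text-classification task. As a purely hypothetical illustration (not her actual system), a naive Bayes classifier can learn which wording is associated with prosecuted versus dropped cases; all data and labels below are invented for the sketch.

```python
# Hypothetical sketch only -- NOT Lovell's actual method or data.
# A tiny naive Bayes text classifier: learn word frequencies per outcome
# label, then score unseen report language against each label.
import math
from collections import Counter, defaultdict


def tokenize(text):
    return text.lower().split()


def train(reports):
    """reports: list of (text, label) pairs, e.g. label in {"prosecuted", "dropped"}."""
    word_counts = defaultdict(Counter)  # per-label word frequencies
    label_counts = Counter()            # how many reports per label
    for text, label in reports:
        label_counts[label] += 1
        word_counts[label].update(tokenize(text))
    return word_counts, label_counts


def predict(model, text):
    word_counts, label_counts = model
    total = sum(label_counts.values())
    vocab = {w for counts in word_counts.values() for w in counts}
    best_label, best_score = None, float("-inf")
    for label in label_counts:
        # log prior + log likelihoods with add-one smoothing
        score = math.log(label_counts[label] / total)
        n_words = sum(word_counts[label].values())
        for w in tokenize(text):
            score += math.log((word_counts[label][w] + 1) / (n_words + len(vocab)))
        if score > best_score:
            best_label, best_score = label, score
    return best_label


# Invented toy examples: detail-rich, subjective wording vs. terse
# "just-the-facts" wording, mirroring the contrast the study describes.
training = [
    ("victim described feeling terrified and trapped", "prosecuted"),
    ("she recounted in detail how he held her down", "prosecuted"),
    ("complainant reported an incident at the location", "dropped"),
    ("report taken no further details provided", "dropped"),
]
model = train(training)
print(predict(model, "victim described in detail how terrified she felt"))
```

On this toy data the detail-rich query scores higher under the "prosecuted" label, because its words appear almost exclusively in that class; a real system would need thousands of labeled reports and far richer features.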
The implications, while uncertain, are still significant: If a computer can detect a form of writing that influences the fate of a rape case, perhaps software engineers can develop a program that helps officers write better reports on the day of a service call.
The results were a shocker to Lovell. She expected the more objective reports — those written with a "just-the-facts" tone — to result in more prosecutions. Instead, she found the opposite: The more subjective reports were the ones that led to success.
Looking back, Lovell says the results make sense. The matter-of-fact reports failed to convey the brutality of rape, while the subjective reports were rich with personal details elicited from the victims.
Full story: Can A.I. solve rape cases? To find out, a Cleveland professor programmed a computer to analyze thousands of police reports – cleveland.com