AI technology could have identified ‘rogue’ police officers like Wayne Couzens before he killed Sarah Everard if there was the ‘political will’ to use it, says top cyber expert
- Professor Karen Yeung, of the University of Birmingham, spoke to peers today
- Said AI exists now that could weed out ‘rogue individuals’, including police
- But said there is no ‘political will to apply them to scrutinise … public authority’
Artificial intelligence technology could have been used to flag Wayne Couzens’ violent tendencies before he raped and murdered Sarah Everard but for a lack of ‘political will’ to use it, peers were told today.
Professor Karen Yeung, of the University of Birmingham, said that AI programmes exist now that could weed out ‘rogue individuals’, including in police forces.
But she told a House of Lords committee that the focus so far was on creating ‘prediction tools about poor people and we are leaving whole swathes of society untouched’ when advances could be put to better use.
Couzens, 48, was jailed for the rest of his life at the start of October for the brutal sexually motivated killing of Ms Everard, 33, in March, after he carried out a fake arrest for a breach of Covid laws.
But it has since emerged that he already had a track record of sex crimes including flashing and molestation.
Discussing the killing at the Lords’ Justice and Home Affairs Committee today, Prof Yeung said: ‘Why are we not collecting data, which is perfectly possible now, about individual police behaviour?
‘We might have tracked down rogue individuals who were prone to committing violence against women. We have the technology.
‘We just don’t have the political will to apply them to scrutinise the exercise of public authority in more systematic ways than the way in which we are towards poor people.’
It came as a high-profile DJ claimed that she was flashed by Couzens in 2008 – but police ‘laughed in her face’ when she reported it.
Magic FM DJ Emma Wilson, known on the radio as Emma B, waived her right to anonymity and said that the former police officer exposed himself to her 13 years ago in Greenwich, south east London.
In an interview with The Telegraph, she said: ‘I really wouldn’t say anything out loud if I wasn’t very sure, because it’s important and it’s serious, but as soon as I saw the pictures of him, I said “That’s him”.’
It comes days after it was claimed former Met officer Couzens molested a drag artist while he was in costume at the New Inn pub in Deal, Kent, in 2018, before propositioning him for sex.
Couzens is known to have committed indecent exposure, driving around naked from the waist down in his car, while he served with the Met in 2015.
He has also been identified as being responsible for carrying out the same offence at a McDonald’s restaurant days before he targeted Ms Everard while she walked home from her friend’s house in Clapham, south London.
Prof Yeung, who researches AI at Birmingham Law School, told peers to think carefully about who is being targeted by crime-fighting algorithms.
‘We’re not building criminal risk assessment tools to identify insider trading or who’s going to commit the next corporate fraud, because we’re not looking for those kinds of crimes and we do not have high volume data,’ she told them.
‘This is really pernicious. What is going on is that we are looking at high volume data, which is mostly about poor people, and we are turning them into prediction tools about poor people and we are leaving whole swathes of society untouched by these tools. So, this is a serious systemic problem and we need to be asking those questions.’
Prof Yeung made her comments at a session of the Justice and Home Affairs Committee on Tuesday examining the use of new technology in law enforcement, during which she called for greater transparency of how algorithms are designed and used in the criminal justice system.
The committee also heard concerns regarding police use of live facial recognition software, which Silkie Carlo, director of Big Brother Watch, described as ‘disproportionate’.
Ms Carlo said the Metropolitan Police had achieved just 11 true positive matches over ‘four or five years’ of testing on the streets of London, along with ‘an awful lot of false positive matches’, after capturing tens if not hundreds of thousands of people’s faces.
Even some of the positive matches, she added, were of people who were not wanted in connection with any crime but appeared on databases of people with mental health problems or protesters.
She said: ‘Their current rate over the entirety of their employment is 93% false positive matches, so I struggle to see a world in which that could be considered proportionate.’
Prof Yeung added that the police did not know how many false negatives the technology had returned because it had only been used in live tests rather than controlled, scientific conditions.
The Metropolitan Police claim they use facial recognition in a lawful and proportionate way.