The Information Commissioner’s Office (ICO) has criticised the Home Office for failing to tell it about historical bias in the facial recognition algorithms used within the Police National Database.
Last week the National Physical Laboratory published a report of its independent testing of facial recognition algorithms, commissioned by the Home Office.
In its report it noted that the police had made efforts to reduce bias by establishing training and guidance.
The ICO’s deputy commissioner, Emily Keaney, has responded to the report, expressing “disappointment” that despite the regulator regularly engaging with the Home Office and police bodies, it had not been informed of this historical issue.
“Last week we were made aware of historical bias in the algorithm used by forces across the UK for retrospective facial recognition within the Police National Database,” said Keaney.
“We acknowledge that measures are being taken to address this bias. However, it is disappointing that we had not previously been told about this, despite regular engagement with the Home Office and police bodies as part of our wider work to hold government and the public sector to account on how data is being used in their services.”
Keaney said that “any perception of bias and discrimination can exacerbate distrust” and has asked the Home Office for “urgent clarity on this matter so we can assess the situation and consider our next steps”.