Incidents keep happening that show we have crossed the line of this trust too many times.
Hilton Simoes Gomez
In Spain’s case, the app is VioGén. Based on the answers to 35 yes-or-no questions, the app concluded whether the risk of a new domestic assault was high or low. Hamid was sent home and her husband was released the next day, even though the attack had involved a piece of wood and had been carried out in front of the couple’s four children, aged 6 to 12.
For Hilton, the case highlights the importance of viewing AI systems as tools, not decision-makers.
Recent Brazilian cases illustrate the harmful effects of the uncritical, automatic use of AI. One was the episode involving 23-year-old personal trainer João Antonio Trindade Bastos. While watching the final of the 2024 Sergipe football championship, he was handcuffed and led away by police in front of thousands of people. The evidence? A facial recognition system had identified him as a fugitive. João was released only after showing his ID.
The government has since suspended the use of the program. Last year, in another unfortunate incident in Sergipe, Thais Santos was stopped by police twice on the same day, likewise misidentified by facial recognition.
These three cases show that government officials trust the technology so much that they put people's physical safety, honor, and mental health at risk.
Hilton Simoes Gomez