A tough week for algorithms
Algorithms have been in the press for all the wrong reasons, mostly racial bias. Instagram’s CEO said the company needs to change its algorithms to stop the mistreatment of Black users. Soon after, TikTok explained how its recommender algorithm works.
Researchers may use racist algorithms
Some AI researchers called out a scientific paper introducing a face recognition system that can supposedly detect whether you’re a criminal. Dubbing it the #TechToPrisonPipeline, the group criticized the work as inherently racist.
Uber and Lyft too
According to a recent study, Uber and Lyft appear to charge higher fares for trips to non-white and low-income neighbourhoods. The researchers suggested this may be down to “social bias”.
It’s not all bad though
So argues this piece on how AI can actually help protect disadvantaged people. Algorithmic bias is real, but how prominent it becomes depends on the people behind the algorithm, on their vigilance and sense of responsibility.
Read here about how we avoid bias in our AI advisor, crystal.