Joy Buolamwini got Jeff Bezos to back down.
In June, Amazon announced that it was imposing a moratorium on police use of its controversial facial recognition software, called Rekognition, which it had sold to law enforcement for years over the objections of privacy advocates. The move marked a remarkable retreat for Amazon’s famously stubborn CEO. And he wasn’t alone. That same week, IBM pledged to stop developing facial recognition entirely, and Microsoft committed to withholding its system from police until federal regulation was in place.
These decisions came amid widespread international protests over systemic racism, sparked by the killing of George Floyd at the hands of Minneapolis police. But the groundwork had been laid four years earlier, when Joy Buolamwini, then a 25-year-old graduate student at MIT’s Media Lab, began investigating the racial, skin-type, and gender disparities embedded in commercially available facial recognition technologies. Her research culminated in two groundbreaking, peer-reviewed studies, published in 2018 and 2019, which revealed that systems from Amazon, IBM, Microsoft, and others could not classify darker-skinned female faces as accurately as those of white men—effectively shattering the myth of machine neutrality.