
Now Amazon employees rebel: 'End police facial-recognition contracts, ICE support'

Amazonians call on CEO Jeff Bezos to take a moral stand against the sale of Amazon's technology to law enforcement.
Written by Liam Tung, Contributing Writer


It seems every tech giant with powerful AI is facing a clash with employees concerned about how government customers are using its technology.

A group of Amazon employees have written a letter to CEO Jeff Bezos asking him to cancel sales of Amazon's Rekognition facial-recognition software to police, and more broadly to take an ethical stand on how its technology is used.

The employees also want Amazon to stop providing AWS cloud services to analytics firm Palantir, which helps Immigration and Customs Enforcement's (ICE) controversial detention and deportation programs.

In the letter, published by The Hill, the employees object to building technology that government and law enforcement can use to create a surveillance state. They urge Bezos not to repeat IBM's mistake of failing to take responsibility for how Nazi Germany used its systems.

The letter by Amazon employees draws attention to the separation of children from their families under the Trump administration's zero-tolerance border policy.


"Along with much of the world we watched in horror recently as US authorities tore children away from their parents," they wrote.

"In the face of this immoral US policy, and the US's increasingly inhumane treatment of refugees and immigrants beyond this specific policy, we are deeply concerned that Amazon is implicated, providing infrastructure and services that enable ICE and DHS.

"We refuse to build the platform that powers ICE, and we refused to contribute to tools that violate human rights," the employees continued.

Shock over the detention and separation of children sparked a similar protest at Microsoft this week, with staff calling on CEO Satya Nadella to end a contract to provide Azure services to ICE, services Microsoft had previously claimed would help the agency accelerate facial recognition and detection.

It marks another instance of technology employees taking a stand against ethically dubious applications of technology they helped develop. Google employees similarly protested the use of its AI for the Pentagon's drone video-analysis system.

The American Civil Liberties Union (ACLU), the non-profit that last month revealed Amazon was heavily promoting Rekognition to US law enforcement, on Monday delivered a petition signed by 150,000 people against Amazon selling surveillance technology to the government.

The ACLU also delivered a letter from 70 organizations demanding Amazon end its involvement in the surveillance business. Amazon shareholders have also called on the company to end facial-recognition sales to police.

Previous and related coverage

Microsoft employees protest: 'Stop work with ICE over cruel border policies'

Concern over harmful applications of AI once again sparks an employee protest -- this time at Microsoft.

Google, Microsoft, Amazon: Are they responsible for government use?

Welcome to the world of politicized capitalism doused with hashtag activism.

Microsoft takes a stand against ICE separating parents from children (CNET)

The company had previously said it was "proud" to supply technology to US Immigration and Customs Enforcement -- and sparked anger online.

Google employee protest: Now Google backs off Pentagon drone AI project

Google won't bid to renew its Project Maven contract with the Pentagon after it expires in 2019.

Google's AI pact with Pentagon sparks resignations, highlighting leadership disconnect (TechRepublic)

Nearly a dozen employees have quit to protest the tech giant's work for the Defense Department's 'Project Maven,' where AI is used to analyze drone footage.

Google says it won't build AI for weapons

In a set of principles laid out to guide its development of artificial intelligence, Google also said it won't build AI for surveillance that violates "internationally accepted norms."
