Protests and a pandemic: Facial recognition gets another close-up
From The Hustle
The encrypted messaging service Signal doesn’t want you to show your face.

Signal has become a favorite among protesters in recent days -- and it just announced a tool that blurs out faces in photos sent over the app.

Some protesters are concerned that police departments may use facial recognition tech to identify them, using images circulating on social media.

Police departments in Seattle, Austin, and Dallas have asked for images of illegal activity that took place during the demonstrations, according to OneZero.

Plenty of other big city departments -- in New York, Chicago, Los Angeles, Miami, and Philadelphia -- keep facial-recognition tech in their arsenals, too.

This is a strange time for facial recognition

Debates over the technology have burned for years. Last week, the ACLU sued Clearview AI -- a startup that has mined 3B+ pictures of faces from across the internet -- accusing Clearview of violating people’s privacy rights.

The pandemic era has changed the backdrop: Companies and governments are floating facial recognition tools as a means to improve contact tracing.

Russia, Poland, and China are already using it as one of several tools to track the spread of COVID-19, and US colleges are preparing to break out facial recognition this fall. A US startup is testing drones in India that recognize faces and enforce social distancing.

The battle is coming to… a head

Since February, California has weighed legislation that would regulate -- and, critics say, expand -- facial recognition tech in the state. On Wednesday, the legislature blocked the bill.

In addition to civil rights concerns, many skeptics point out that facial recognition is often inaccurate and exhibits racial bias.

But that hasn’t stopped startups from flooding in: At least 45 companies now boast that they can track faces in real time, according to OneZero.
