>>6442773
This is the real patriot shit …putting privacy back to the way it was.
“That’s the reason we’re trying to prevent that now,” said Hofer, who has filed suit against the Contra Costa County Sheriff’s Department, the San Jose Police Department and others. He said he was pulled over last year and handcuffed at gunpoint after a license plate reader mistakenly identified the rental car he was driving as stolen. “The genie’s not out of the bottle yet.”
But the ACLU, which also helped draft the ordinances, pointed out that deploying facial recognition would be easy enough.
“The raw materials for face surveillance — data such as mugshots and video feeds from CCTV and body cams — already exist,” said Matt Cagle, technology and civil liberties attorney with the ACLU of Northern California. “With just a few lines of code, existing photo systems can be turned into dangerous dragnet surveillance networks.”
The proposed ordinances come after high-profile examples of the pitfalls of facial recognition, including a report last year that Amazon’s Rekognition software falsely matched the faces of members of Congress with mugshots of people who had been arrested.
The San Francisco Police Department, which said it doesn’t use facial recognition, submitted amendments to the ordinance after talking with other city departments, community groups, neighborhood watch groups and businesses.
“(Our) mission must be judiciously balanced with the need to protect civil rights and civil liberties, including privacy and free expression,” said David Stevenson, spokesman for the San Francisco Police Department. “We welcome safeguards to protect those rights while balancing the needs that protect the residents, visitors and businesses of San Francisco.”
Lee Hepner, legislative aide to Peskin, said the supervisor’s office incorporated some of the SFPD’s requests into the ordinance. If it is approved in committee Monday, the full board will vote May 14.
“Over time, this will build a lot of trust among the community and the police,” he said. “Hopefully in the end it will be a win-win.”
San Francisco Sheriff’s Department spokeswoman Nancy Crowley said her department does not use facial recognition. She added that most of the agency’s work is in non-public spaces, but that if the ordinance is passed “we will comply with the requirements that impact our work.”
The Oakland Police Department did not respond to a request for comment.
Color of Change, a national nonprofit racial justice advocacy group founded in Oakland, supports both ordinances.
“This is an important moment for San Francisco,” said Brandi Collins-Dexter, senior campaign director for the group. She said the city “is positioned to really protect its constituents” and could influence others around the nation.
In a letter urging supervisors to pass the ordinance, Color of Change expressed concern about “high-tech profiling.” The group cited a 2009 incident in which multiple San Francisco police officers pointed their guns at a black woman after a license plate reader mistakenly flagged the car she was driving as stolen. The woman, Denise Green, a former Muni driver, settled her lawsuit against San Francisco in 2015 for $495,000.
Nowadays, Collins-Dexter said, police have access to “technologies the likes of which we’ve never seen.”
AI experts in April urged Amazon to stop selling facial recognition software to law enforcement until safeguards and laws are put into place. (Its technology is now being tested by police in Oregon.) Amazon shareholders are scheduled to vote later this month on a shareholder resolution urging Amazon to stop selling Rekognition.
The companies that make the technologies have also called for limits and regulations: Microsoft late last year called for regulating artificial intelligence, and Amazon followed suit earlier this year.
In addition, the Partnership on AI — whose members include Facebook, Google, Amazon.com, Apple, Microsoft, IBM and academic researchers — last week said law enforcement should not use artificial intelligence algorithms to make decisions about jailing people.
Meanwhile, some Bay Area law enforcement agencies, including the Santa Cruz and UC Berkeley police departments, have been using predictive-policing technology. Others, such as the Mountain View and Palo Alto police departments, tried such technology but decided not to continue using it.