Top Features
The expanded and updated hub—a sort of “Field Guide to Police Surveillance”—has new or updated pages on automated license plate readers, biometric surveillance, body-worn cameras, camera networks, cell-site simulators, drones and robots, face recognition, electronic monitoring, gunshot detection, forensic extraction tools, police access to the Internet of Things, predictive policing, community surveillance apps, real-time location tracking, social media monitoring, and police databases.
If you see a widget, the widget sees you back. Privacy Badger now replaces embedded tweets, video/audio players, and comments sections with “click to activate” placeholders to protect your privacy.
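For readers curious what that pattern looks like in practice, here is a minimal, hypothetical TypeScript sketch of click-to-activate widget replacement. It is not Privacy Badger’s actual code; the widget domain list and DOM handling are illustrative assumptions only.

```typescript
// Hypothetical sketch of the "click to activate" pattern, not Privacy Badger's code.
// Domains whose embedded iframes we treat as tracking widgets (assumed list).
const WIDGET_DOMAINS = ["platform.twitter.com", "www.youtube.com", "disqus.com"];

function replaceWidgets(): void {
  document.querySelectorAll<HTMLIFrameElement>("iframe").forEach((frame) => {
    const hostname = new URL(frame.src, location.href).hostname;
    if (!WIDGET_DOMAINS.some((domain) => hostname.endsWith(domain))) {
      return;
    }

    // Build a placeholder button; the embed only loads after an explicit click.
    const placeholder = document.createElement("button");
    placeholder.textContent = `Click to activate widget from ${hostname}`;
    placeholder.addEventListener("click", () => {
      // Restore the original embed only when the user opts in.
      placeholder.replaceWith(frame);
    });

    frame.replaceWith(placeholder);
  });
}

replaceWidgets();
```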
EFF Updates
Since EFF was formed in 1990, we’ve been working hard to protect digital rights for all. And as each year passes, we’ve come to understand the challenges and opportunities a little better, as well as what we’re not willing to accept. Accordingly, here’s what we’d like to see a lot more of, and a lot less of, in 2024.
Keyword warrants that let police indiscriminately sift through search engine databases are unconstitutional dragnets that target free speech, lack particularity and probable cause, and violate the privacy of countless innocent people, the Electronic Frontier Foundation (EFF) and other organizations argued in a brief filed to the Supreme Court of Pennsylvania. Everyone deserves to search online without police looking over their shoulder, yet millions of innocent Americans’ privacy rights are at risk in Commonwealth v. Kurtz—only the second case of its kind to reach a state’s highest court.
Generative AI lets people produce piles upon piles of images and words very quickly, and it would be nice if there were some way to reliably distinguish AI-generated content from human-generated content. One common proposal is that big companies should incorporate watermarks into the outputs of their AIs; unfortunately, watermarking schemes are unlikely to work. So far most have proven easy to remove, and it’s likely that future schemes will have similar problems.
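To see why removal can be easy, here is a toy TypeScript sketch assuming a naive least-significant-bit watermark (a deliberately simple stand-in, not any real product’s scheme): noise far too small to notice is enough to wipe the mark out.

```typescript
// Toy illustration of a fragile watermark: hide bits in the least significant
// bits of sample values, then show that tiny perturbations erase them.
const WATERMARK = [1, 0, 1, 1, 0, 0, 1, 0]; // hypothetical 8-bit mark

// Embed: force each sample's least significant bit to carry one watermark bit.
function embed(samples: number[], mark: number[]): number[] {
  return samples.map((s, i) => (s & ~1) | mark[i % mark.length]);
}

// Extract: read the least significant bits back out.
function extract(samples: number[], length: number): number[] {
  return samples.slice(0, length).map((s) => s & 1);
}

// "Attack": add -1, 0, or +1 to each sample, well below perceptible levels.
function perturb(samples: number[]): number[] {
  return samples.map((s) => s + Math.floor(Math.random() * 3) - 1);
}

const original = Array.from({ length: 64 }, () => Math.floor(Math.random() * 256));
const marked = embed(original, WATERMARK);

console.log("recovered before attack:", extract(marked, 8).join(""));          // matches WATERMARK
console.log("recovered after attack: ", extract(perturb(marked), 8).join("")); // typically corrupted
```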
As millions of internet users watch videos online for news and entertainment, it is essential to uphold a federal privacy law that protects against the disclosure of everyone’s viewing history, EFF argued in court last month.
Video footage captured by police drones sent in response to 911 calls cannot be kept entirely secret from the public, a California appellate court has ruled. EFF, along with the First Amendment Coalition and the Reporters Committee for Freedom of the Press, had filed a friend-of-the-court brief arguing that categorically excluding all drone footage from public disclosure could have troubling consequences for the public’s ability to understand and oversee the police drone program.
Here’s an audio version of EFFector. We hope you enjoy it!
Announcements
EFF has been awarded a new $200,000 grant from Craig Newmark Philanthropies to strengthen our cybersecurity work in 2024. We are especially grateful this year, as it marks 30 years of donations from Craig Newmark, who joined as an EFF member just three years after our founding and four years before he launched the popular website craigslist.
Thank you and welcome to Grist Labs, maker of an open-source spreadsheet for collaborating on sensitive data, for supporting EFF as a new Leader Organizational Member!
EFF's series of interviews with free-speech thought leaders continues. Jillian York interviewed Dr. Carolina Are, an Innovation Fellow at Northumbria University’s Centre for Digital Citizens. Her research primarily focuses on the intersection between online abuse and censorship. Her current research project investigates Instagram and TikTok’s approach to malicious flagging against ‘grey area’ content, or content that toes the line of compliance with social media’s community guidelines. They discussed the impact of platform censorship on sex workers and activist communities, the need for systemic change around content moderation, and how there’s hope to be found in the younger generations.
Job Openings
MiniLinks
The increasing reach of cameras and the sophistication of algorithms worry EFF’s Jennifer Lynch. “We suddenly seem to have this web of face recognition,” she says. “It’s been building for years, but it now seems to be much easier for the FBI and other police departments to hold onto images for a long time and just run these automated searches whenever they feel like it.”
EFF’s Eva Galperin spoke with KCBS Radio hosts Bret Burkhart and Patti Reising about how the data that modern cars collect and the access they give to their drivers’ lives can be weaponized in abusive relationships.
To EFF’s Corynne McSherry, the panic over A.I. feels like Groundhog Day. “It replicates the anxieties we’ve seen around social media for a long time,” she says. And, so far, it seems like we’re taking the same tack we did around social media regulation. “Someone in Congress hauls a bunch of CEOs to D.C. to testify about how they should be regulated.”
More than a hundred local police departments, sheriff’s offices, and cities have set up an AI-powered camera system, with nearly 200,000 connected cameras belonging to residents and businesses around the country able to provide “direct access” to law enforcement, according to a 404 Media analysis of a set of scraped data. EFF’s Beryl Lipton, who has researched Fusus, said “This dataset is the most detailed record of Fusus-integrated cameras I’ve seen. It makes clear that these systems are using hundreds of thousands of public and private cameras to blanket huge swaths of our cities, particularly those in the Southeast, with the capacity for constant surveillance.”
“With the watermelon (emoji), I think this is actually really the first time where I’ve seen it widely used as a stand-in. And that to me marks a notable uptick in censorship of Palestinian content,” EFF’s Jillian York said about how the symbol is being used to confuse algorithms.