Discussion in 'General Discussion' started by Quentinanon, Jan 4, 2012.
I'm planning on cutting my face off. That'll learn 'em.
U act like 12year old
Where did you find this, Slashdot?
1. Avoid enhancers
They amplify key facial features.
2. Partially obscure the nosebridge area
The region where the nose, eyes, and forehead intersect is a key facial feature.
3. Partially obscure the ocular region
The position and darkness of eyes is a key facial feature.
4. Remain inconspicuous
For camouflage to function, it must not be perceived as a mask or disguise.
don't cut your hair, hippies
check out noisebridge lots of tech
/derail for Bill Nye Science guy
Fake noses are the answer.
Electronic Frontier Foundation Sues FBI For Access to Facial-Recognition Records
June 26, 2013
As the FBI is rushing to build a "bigger, faster and better" biometrics database, it's also dragging its feet in releasing information related to the program's impact on the American public. In response, the Electronic Frontier Foundation (EFF) today filed a lawsuit to compel the FBI to produce records to satisfy three outstanding Freedom of Information Act requests that EFF submitted one year ago to shine light on the program and its face-recognition components.
FBI sued over secretive facial recognition program
June 28, 2013
In a July 18, 2012 assessment, the FBI reported that the program was “on scope, on schedule, on cost and 60 percent deployed.” The program is being put together by contractor Lockheed Martin, which is expected to rake in $1 billion from the government by the time the NGI system is finally up and running.
The FBI previously admitted that it found 7,380 records that were "potentially responsive” to one of the EFF’s requests, but it has yet to deliver actual information pursuant to any of the three FOIA submissions filed, prompting the nonprofit to allege the FBI is “dragging its feet."
"FBI has not explained to the public how NGI or IAFIS's system design would ensure that civil submissions are not 'tainted' by criminal submissions or explained why it is necessary to combine the two types of data," the EFF wrote in the complaint.
FBI Plans to Have 52 Million Photos in its NGI Face Recognition Database by Next Year
By Jennifer Lynch, Electronic Frontier Foundation
New documents released by the FBI show that the Bureau is well on its way toward its goal of a fully operational face recognition database by this summer.
EFF received these records in response to our Freedom of Information Act lawsuit for information on Next Generation Identification (NGI) — the FBI’s massive biometric database that may hold records on as much as one third of the U.S. population. The facial recognition component of this database poses real threats to privacy for all Americans.
Continued at https://www.eff.org/deeplinks/2014/...s-its-ngi-face-recognition-database-next-year
Absolutely frigging ridiculous.
Hey, FBI! How about focusing on unapprehended fugitives!
List of EFF work
Why sunglasses and a hat don't work any more.
CV Dazzle Anon Salon
Ah, but this is pre-crime management. When someone becomes a fugitive, they'll already have their photo and a good idea of where to look.
If only there were some kind of changing mask that people could wear. Herm.
You mean like do their jobs? Are you crazy?
Facial recognition algorithms are of two main types.
One type uses what are called eigenfaces (http://en.wikipedia.org/wiki/Eigenface), which recognize your face by comparing its features to those of a "mean face", a calculated average of the faces stored in an image databank. To fool those, you have to mask any distinctive trait that allows the algorithm to match your face by its differences from the mean face. The most significant ones are around the eyes, nose and mouth.
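The eigenfaces idea above can be sketched in a few lines of NumPy. This is a minimal illustration, not a production recognizer: the `faces` databank is synthetic random data standing in for aligned grayscale photos, and the 20-component cutoff is an arbitrary choice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a face databank: 100 "faces", each a flattened
# 32x32 grayscale image (1024 pixels). A real system uses aligned photos.
faces = rng.normal(size=(100, 1024))

# 1. Compute the mean face and express every face as its deviation from it.
mean_face = faces.mean(axis=0)
centered = faces - mean_face

# 2. PCA via SVD: the strongest right-singular vectors are the eigenfaces.
_, _, vt = np.linalg.svd(centered, full_matrices=False)
eigenfaces = vt[:20]  # keep the 20 strongest components

# 3. Each face is summarized by the weights of its deviation from the
#    mean face, projected onto the eigenface basis.
weights = centered @ eigenfaces.T

def identify(probe):
    """Return the index of the databank face closest to the probe image."""
    w = (probe - mean_face) @ eigenfaces.T
    return int(np.argmin(np.linalg.norm(weights - w, axis=1)))

# A lightly noisy copy of face 42 still lands on face 42. Masking the
# regions that deviate most from the mean (eyes, nose, mouth) is what
# pushes the projection toward other candidates.
probe = faces[42] + rng.normal(scale=0.1, size=1024)
print(identify(probe))
```

Note how the match is driven entirely by the deviation from the mean face, which is why the camouflage advice earlier in the thread targets exactly the high-deviation regions.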
Another type uses a combination of gradient filters and scored descriptors. That's the one shown in the video in post #7. To fool it, you have to "contaminate" your face's gradients, that is, add vertical, horizontal or diagonal lines that will throw off the score and keep your face from even being detected as a face.
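The gradient "contamination" effect can also be shown directly. The sketch below builds a HOG-style histogram of gradient orientations for one image patch (a simplified version of the descriptors such detectors score), then paints a hard-edged diagonal stripe across it, like a stripe of dark makeup or hair. The patch here is random synthetic data, not a real face.

```python
import numpy as np

def orientation_histogram(img, bins=9):
    """HOG-style normalized histogram of gradient orientations for a patch."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.degrees(np.arctan2(gy, gx)) % 180  # unsigned orientation
    hist, _ = np.histogram(ang, bins=bins, range=(0, 180), weights=mag)
    return hist / (hist.sum() + 1e-9)

rng = np.random.default_rng(1)
patch = rng.normal(size=(32, 32)) * 5 + 128  # stand-in for a face patch

before = orientation_histogram(patch)

# "Contaminate" the gradients: a high-contrast diagonal stripe across
# the patch, the kind of line CV Dazzle styling adds to a face.
striped = patch.copy()
for i in range(32):
    striped[i, max(0, i - 2):min(32, i + 2)] = 0

after = orientation_histogram(striped)

# The descriptor is now dominated by the stripe's edges instead of the
# patch's own texture, so a detector scoring it no longer sees a face.
print(np.round(np.abs(before - after).sum(), 3))
```

Because the histogram is magnitude-weighted, a few high-contrast edges swamp the many weak gradients of skin texture, which is why bold lines work better than subtle shading.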
Eigenfaces is the most widely used approach because it is faster and more efficient (which is not negligible, as image processing is very demanding on CPU and RAM).
And if everything fails, there's always this :
How to defeat facial-recognition machines and look like a rock star
You might be invisible to computers, but you're "glaringly obvious to other humans."
By David Kravets
Brooklyn artist Adam Harvey has developed a low-tech solution to protect your privacy — fashioned even before the Snowden revelations — using makeup and hairstyles he says could defeat facial-recognition machines. Privacy enthusiasts must be willing to look like Marilyn Manson or a rocker from Kiss, but this method just might make you safe from the facial-recognition technology that is being embraced by everything from sports stadiums to the FBI.
Harvey, who has also created some counter-surveillance garments, calls the facial-recognition project "CV Dazzle" — developed as a master's project at New York University. He writes:
CV Dazzle is a form of expressive interference that takes the form of makeup and hair styling (or other modifications). The name is derived from CV, a common abbreviation for computer vision, and Dazzle a type of camouflage used during WWI. Dazzle camouflage was originally used to break apart the gestalt image of warships, confusing observers about their directionality, size, and orientation. Likewise, the goal of CV Dazzle is to break apart the gestalt of a face, or object, and make it undetectable to computer vision algorithms, in particular face detection.
His website, which has a Creative Commons license, also comes with two downloadable "test patterns" for stylists.
These 'privacy glasses' make you invisible to facial recognition
There are a couple of things going on here, but essentially AVG is using infrared LEDs to mess with the filter most phones/cameras use when taking pictures. By futzing the light around your eyes and nose, there's enough damage to the image to prevent facial recognition from figuring out who you are. There's a reflective covering, too, that will light up when a flash is used, for similar results.
I think the fashion statement needs work...
I recommend the William S. Burroughs style...
Best way to avoid facial recognition software:
Bumping spam off the front page.
Great thread! Face captcha, sort of.
Has anyone used a burqa type garment for privacy? I mean, non Muslim Anons?
Reposting the same post in more than one thread is considered forum spamming. Your posts have been deleted according to WWP policy. Unbunch your panties and quit forum spamming.
pretty dank, i should make a outfit like dat
Privacy Advocates Walk Out in Protest Over U.S. Facial-Recognition Code of Conduct | The Intercept
Technology industry lobbyists have so thoroughly hijacked the Commerce Department process for developing a voluntary code of conduct for the use of facial recognition technology that nine privacy advocates involved withdrew in protest on Monday.
“At a base minimum, people should be able to walk down a public street without fear that companies they’ve never heard of are tracking their every movement — and identifying them by name – using facial recognition technology,” the privacy advocates wrote in a joint statement. “Unfortunately, we have been unable to obtain agreement even with that basic, specific premise.”
The Commerce Department, through its National Telecommunications and Information Administration, brought together “representatives from technology companies, trade groups, consumer groups, academic institutions and other organizations” early last year “to kick off an effort to craft privacy safeguards for the commercial use of facial recognition technology.”
The goal was “to develop a voluntary, enforceable code of conduct that specifies how the Consumer Privacy Bill of Rights applies to facial recognition technology in the commercial context.”
But after a dozen meetings, the most recent of which was last week, all nine privacy advocates who have participated in the entire process concluded that they were totally outgunned.
“This should be a wake-up call to Americans: Industry lobbyists are choking off Washington’s ability to protect consumer privacy,” Alvaro Bedoya, executive director of the Center on Privacy & Technology at Georgetown Law, said in a statement.
“People simply do not expect companies they’ve never heard of to secretly track them using this powerful technology. Despite all of this, industry associations have pushed for a world where companies can use facial recognition on you whenever they want – no matter what you say. This position is well outside the mainstream.”
In some countries, it is illegal to completely hide your face (some laws state that you must always be recognizable => is that even a word?).
In France, this was a big thing. Some Muslims were angry when the law was passed (no more burqa, they didn't like it).
How California police are tracking your biometric data in the field | MuckRock
Agencies are using mobile fingerprint scanners, tattoo and facial recognition software.
EFF and MuckRock teamed up in August to reveal how state and local law enforcement agencies are using mobile biometric technology in the field by filing public records requests around the country. With the help of members of the public who nominated jurisdictions for investigation, we have now obtained thousands of pages of documents from more than 30 agencies.
Because of the volume of records we’ve received so far - docs continue to flow in faster than EFF and MuckRock’s teams can read through them - we’re starting with California. Nine of the agencies have responded to our requests with documents, while many more claimed they didn’t have any records.
Of those that did respond, most employed a digital fingerprinting device. Facial recognition has also been widely embraced among agencies in San Diego County, with Santa Clara County law enforcement agencies close behind. In addition, the Los Angeles Sheriff’s Department’s biometrics system includes tattoo recognition, while the Orange County Sheriff's Department is also investigating iris recognition.
A Million Mask March facial enhancement for the maskless
Glad thread was bumped.
I'm doing the same.
Someone trying to legally avoid facial recognition can always do that to their face. But it will only work once.
The first arrest using facial recognition software has been made | Wales Online
"The technology will be tested in a variety of circumstances and locations in the months to come, assisting in our assessment of the viability of the project moving forward".
First Arrest Made From Cameras Linked to Facial Recognition Database | The Anti-Media
Facial recognition nabs violent fugitive 25 years after prison escape | Ars Technica
Like it or not, facial-recognition tech has become an everyday part of society.
Here at Ars, we often speak of facial-recognition technology as some Orwellian surveillance method that will one day be deployed by governments or other actors to chronicle our every move — perhaps for nefarious purposes. We reported Wednesday that the Department of Homeland Security is pushing a plan that would require all Americans to submit to a facial-recognition scan when flying out of the country. Whether that's good or bad is open for debate. And add to that, the nation's spy agencies have asked the public to help make biometrics more accurate.
While we're not at an Orwellian point in time yet with biometrics, facial-recognition technology is being used for good, no matter how scary the technology sounds. Consider that Nevada authorities have announced that biometrics was behind the arrest of a violent criminal who escaped from prison 25 years ago. It's another in a string of arrests in which biometrics essentially paved the way for a bad guy's capture.
What led to the recent arrest of 64-year-old career criminal Robert Frederick Nelson of North Las Vegas, who committed a number of felonies after escaping from a Minnesota prison in 1992? He applied for a Nevada ID card, and the Silver State's facial recognition tech doomed him.
"Nelson applied for a renewal of his Nevada identification card on June 5, 2017. Investigators withheld the card after the DMV's facial-recognition system showed the same person had previously held a Nevada driver's license in the name of Craig James Pautler," Nevada DMV officials said.
A background check showed numerous felonies under both names, the authorities said.
Biometrics technology is becoming an everyday facet of society, for both the private and public sectors. Facebook is among the best known private-sector players in the field when it comes to tagging people in photos. In the law-enforcement context, about half of all US adults have their images in a crime-fighting, biometrics database — that's about 117 million adults.
Continued at https://arstechnica.com/tech-policy/2017/07/biometrics-catches-violent-fugitive-25-years-on-the-run/
IBM Used NYPD Surveillance Footage to Develop Technology That Lets Police Search by Skin Color
By George Joseph, and Kenneth Lipp, The Intercept, September 6, 2018
Secret Service Announces Test of Face Recognition System Around White House
By Jay Stanley, Senior Policy Analyst, American Civil Liberties Union, December 4, 2018
In yet another step toward the normalization of facial recognition as a blanket security measure, last week the Department of Homeland Security published details of a U.S. Secret Service plan to test the use of facial recognition in and around the White House.
According to the document, the Secret Service will test whether its system can identify certain volunteer staff members by scanning video feeds from existing cameras “from two separate locations on the White House Complex, and will include images of individuals passing by on public streets and parks adjacent to the White House Complex.” The ultimate goal seems to be to give the Secret Service the ability to track “subjects of interest” in public spaces.
Physical protection of the president and the White House is not only a legitimate goal but a vital one for protecting the stability of our republic. And while this pilot program seems to be a relatively narrowly defined test that does not in itself pose a significant threat to privacy, it crosses an important line by opening the door to the mass, suspicionless scrutiny of Americans on public sidewalks. That makes it worth pausing to ask how the agency’s use of face recognition is likely to expand — and the constitutional concerns that it raises.
First, it represents yet another example of DHS’s determination to deploy face recognition (despite the fact that Congress has never authorized its use on the public within the United States). Like the technology’s incipient deployment by U.S. Customs and Border Protection at airport gates and its planned rollout by the Transportation Security Administration in airports more broadly, its use by the Secret Service would be a milestone.
Face recognition is one of the most dangerous biometrics from a privacy standpoint because it can so easily be expanded and abused — including by being deployed on a mass scale without people’s knowledge or permission. Unfortunately, there are good reasons to think that could happen. The Secret Service envisions using the technology to provide early warning about “subjects of interest” who are approaching the White House “prior to direct engagement with law enforcement.”
We don’t exactly know how the Secret Service determines if someone is a “subject of interest.” The agency says they could be flagged through a variety of means, including “social media posts made in public forums” as well as suspicious activity reports and media reporting. Unfortunately, our government agencies have a long history of labeling people threats based on their race, religion, or political beliefs. Just last year, for example, a leaked document revealed that the FBI had prepared an intelligence assessment wrongly profiling Black activists as threats based on their race and beliefs, labeling them “Black Identity Extremists.”
The Secret Service’s use of face recognition is of special concern when it comes to protesters. The Trump administration is already attempting to limit protests near the White House, and the Secret Service has a problematic history in its handling of protests — including mistreating protesters because of their political opinions. The addition of face recognition to the mix does not bode well in light of this history.
Then there’s the question of where this leads. Exactly how wide a radius does the Secret Service want to monitor? Is there any reason to think it wouldn’t want to follow its “subjects of interest” 24/7 and nationwide if technology makes that easy enough? Let’s also keep in mind that the agency’s mission includes protecting not only the White House but also presidents and vice presidents when they travel; presidents’ and vice presidents’ immediate families; former presidents, their spouses, and their minor children; major presidential and vice presidential candidates and their spouses; and foreign heads of state. The agency’s authority also includes investigation of certain financial crimes. If it begins using face recognition as a principal tool, that’s not going to be an issue just for people in downtown Washington, D.C.
Nor is the relative narrowness of the Secret Service’s mission necessarily going to limit the expansion of this technology. The record of military intelligence agencies charged with protecting the security of military bases on U.S. soil provides a good example of this. Those agencies have used their narrow mission as a rationale to engage in very broad surveillance — for example collecting data on millions of domestic airline passengers; creating a database logging “raw, non-validated” reports of “anomalous activities” anywhere within the United States; monitoring peaceful political protests by pacifist Quakers far from any military base; and even deploying undercover agents to infiltrate such groups.
How far-ranging does the Secret Service believe its monitoring efforts need to be to fulfill its mission? Whatever the answer is today, there is good reason to be concerned about what that answer might be in the future — especially if unregulated face recognition and other technologies make it cheap and easy to extend the tentacles of its surveillance outwards.
The deployment announced in this document is just a test, and for now, the agency promises not to retain images except matches with its volunteer employees. But thousands of people going about their business in the busy urban area around the White House are still having their faces scanned, some of whom will likely be falsely matched to target subjects. (The agency none-too-helpfully notes that “individuals who do not wish to be captured by … cameras involved in this pilot may choose to avoid the area.”) And there’s no promise that privacy protections will survive an expansion and normalization of public face surveillance.
The program is another blinking red light for policymakers in the face of powerful surveillance technologies that will present enormous temptations for abuse and overuse. Congress should demand answers about this new program and the government’s other uses of face recognition. And it should intercede to stop the use of this technology unless it can be deployed without compromising fundamental liberties.
Central Londoners to be subjected to facial recognition test this week | Ars Technica
Met Police: No worries — if you decline to be scanned, it won't be suspicious at all!
A Major Police Body Cam Company Just Banned Facial Recognition | The New York Times
Axon, the company that supplies 47 out of the 69 largest police agencies in the United States with body cameras and software, announced Thursday that it will ban the use of facial recognition systems on its devices.
“Face recognition technology is not currently reliable enough to ethically justify its use,” the company’s independent ethics board concluded.
Even as facial recognition systems are rolled out by private companies — from airlines to smartphone makers — institutions nationwide are balking at government’s use of algorithmically-powered surveillance tools.
In May, San Francisco’s Board of Supervisors voted to ban use of facial recognition technology by the city’s police and other agencies. Other cities, including Berkeley and Oakland, Calif., and Somerville, Mass., are also mulling or close to enacting bans. Earlier this month, California lawmakers announced they’re considering a statewide ban on facial recognition in police body cams.
In a 28-page report, Axon’s ethics board, which was handpicked by members of the Policing Project at New York University School of Law, argued that the technology “does not perform as well on people of color compared to whites, on women compared to men, or young people compared to older people.”
The report also cautioned that facial recognition is especially prone to inaccuracy when used with police body cameras, which frequently operate in low-light conditions and produce shaky footage.
“The tech is just not accurate enough,” Barry Friedman, founding director of N.Y.U.’s Policing Project and a member of the ethics board, told me. “Until that’s fixed we don’t need to say another word. And that could be years.”
Axon’s move is a rare departure from the “move fast and break things” style of innovation traditionally associated with new technologies. And it may very well indicate that, when it comes to facial tracking and privacy, policing may be where we draw the line.
Continued at https://www.nytimes.com/2019/06/27/opinion/police-cam-facial-recognition.html