Researchers use makeup to evade facial recognition via new black box attack

Researchers from Israel’s Ben-Gurion University of the Negev have found a way to thwart facial recognition cameras using software-generated makeup patterns applied with natural makeup techniques, achieving a success rate of 98%, Vice reports.

For the study ‘Dodging Attack Using Carefully Crafted Natural Makeup’, the five-person team used YouCam Makeup, a selfie app, to digitally apply makeup to highly identifiable areas of the faces of 20 participants. In a second condition, a makeup artist physically recreated the digitally applied, software-generated makeup patterns on the participants in a natural-looking manner. The participants then walked through a corridor filmed by two cameras. In both the digital and physical makeup tests, the participants were enrolled as blacklisted individuals so that the system would raise an alert if it identified them.
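
The article does not spell out how those identifiable areas were selected. One common way to approximate them, assuming the attacker has an offline surrogate face recognition model (an assumption for illustration, not necessarily the method used in the study), is an occlusion-style saliency map: mask one patch of the face at a time and measure how much the match score drops. A minimal sketch, with a hypothetical `embed()` placeholder standing in for a real FR backbone:

```python
import numpy as np

def embed(face_img):
    """Hypothetical placeholder for a surrogate face recognition model's
    embedding function; a real attacker would use an actual FR backbone."""
    rng = np.random.default_rng(int(face_img.sum() * 1000) % (2**32))
    v = rng.normal(size=128)
    return v / np.linalg.norm(v)

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def occlusion_saliency(face_img, patch=16):
    """Score each patch by how much occluding it lowers similarity to the
    unmodified face: the biggest drops mark the most identity-relevant areas,
    i.e. candidate regions for applying makeup."""
    base = embed(face_img)
    h, w = face_img.shape[:2]
    heat = np.zeros((h // patch, w // patch))
    for i in range(heat.shape[0]):
        for j in range(heat.shape[1]):
            occluded = face_img.copy()
            occluded[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch] = 0
            heat[i, j] = 1.0 - cosine(base, embed(occluded))
    return heat

face = np.random.rand(112, 112, 3)  # stand-in for an aligned face crop
heatmap = occlusion_saliency(face)
print("most identity-relevant patch:", np.unravel_index(heatmap.argmax(), heatmap.shape))
```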

The facial biometric system could not identify any of the participants to whom makeup was digitally applied. For the physical makeup recreation experiment, “the facial recognition system was only able to identify participants in 1.22% of the images (compared to 47.57% without makeup and 33.73% with random natural makeup), which is below a reasonable threshold of a realistic operational environment,” the paper states.

“[The makeup artist] didn’t do too many tricks; she just saw the makeup in the picture and tried to copy it in the physical world. It’s not a perfect copy. There are differences, but it still worked,” Nitzan Guetta, a doctoral student and lead author of the study, told Vice.

“Our attacker assumes a black-box scenario, which means the attacker cannot access the target FR model, its architecture, or any of its parameters. Therefore, [the] attacker’s only option is to change their face before being captured by the cameras that feed the target FR model,” according to the research paper.
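
In that black-box setting, the watchlist pipeline itself is untouched; the attacker only influences the image the cameras capture. A minimal sketch of such an identification check (the threshold and the random “embeddings” are illustrative assumptions, not values from the study):

```python
import numpy as np

THRESHOLD = 0.6  # illustrative match threshold, not taken from the paper

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def is_blacklisted(probe_embedding, blacklist):
    """Return the matched identity if the probe scores above the threshold
    against any enrolled blacklist template, otherwise None (no alert)."""
    scores = {name: cosine(probe_embedding, ref) for name, ref in blacklist.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] >= THRESHOLD else None

# Toy gallery and probes: random vectors standing in for real FR embeddings.
rng = np.random.default_rng(0)
blacklist = {"participant_07": rng.normal(size=128)}

bare_face = blacklist["participant_07"] + 0.3 * rng.normal(size=128)  # close to the template -> alert
made_up_face = rng.normal(size=128)  # makeup pushes the embedding away -> typically no alert

print(is_blacklisted(bare_face, blacklist))      # expected: "participant_07"
print(is_blacklisted(made_up_face, blacklist))   # expected: None
```

The attack succeeds when the makeup shifts the captured face’s embedding far enough from the enrolled template that the score stays under the threshold, without looking conspicuous in the physical world.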

Adversarial machine learning (AML) attacks have been carried out before. In June, the Israeli firm Adversa announced the creation of “Adversarial Octopus,” a transferable, black box-style attack designed to trick biometric models.

While facial recognition systems have not always been able to identify people wearing face coverings, the pandemic has accelerated the drive to advance this capability. Corsight AI announced in July that the company’s facial recognition system, Fortify, can identify people wearing motorcycle helmets and face masks at the same time.

Article topics

biometric identification | biometrics | biometric research | facial recognition | spoofing | video surveillance
