News Release

WVU cybersecurity researcher targeting security vulnerabilities, racial bias with NSF CAREER Award support

Grant and Award Announcement

West Virginia University

Nima Karimian tests biometrics security measures with WVU graduate student Banafsheh Adami. Sensors measure Adami’s heartbeat and blood flow to authenticate her identity and defeat spoofing attempts, while a video camera facilitates facial recognition. Karimian said he hopes the research may enable remote extraction of physiological biometrics from facial recordings.

Credit: WVU Photo/Maddy Watson

West Virginia University research is addressing critical security flaws in biometric systems like the fingerprint and face recognition features many people use to unlock their mobile devices.

Nima Karimian, assistant professor in the Lane Department of Computer Science and Electrical Engineering at the WVU Benjamin M. Statler College of Engineering and Mineral Resources, said biometric systems are vulnerable to significant security threats, especially attacks targeting hardware like phones or laptops.

With $632,000 in National Science Foundation CAREER Award funding, Karimian will be the first to investigate certain common hardware-based attacks on biometric systems. He will also tackle the problem of bias, in which biometric systems misclassify people who belong to certain demographics, such as when facial recognition software fails to recognize users of color.

“This pushes the boundaries of biometric engineering, bridging the knowledge gap between the seemingly isolated research fields of hardware security and biometric security,” Karimian said.

“We aim to develop the first hardware-based biometric template protection system that not only will deal with known forms of attack but unfamiliar new threats, too.”

The field of biometric recognition has exploded over the past two decades and now encompasses systems for reading users’ retinas and palm prints — even the veins in their fingers. Accuracy and convenience have driven widespread adoption, but serious security and privacy risks persist.

Karimian’s research will contribute to making biometric systems secure enough to be implemented by data-sensitive sectors like law enforcement, border control, health care and financial services.

“Right now, there are worries about user privacy,” he said. “Biometric systems are susceptible to leaking auxiliary information such as the user’s gender, age or ethnicity. There are concerns about bias, and how to train biometric systems to treat all users fairly. And there are security weaknesses — how to protect biometric systems from multiple attacks we’re currently ill-prepared to meet on the hardware side.”

Four specific attacks present the most risk, he said. In a spoofing attack, a bad actor tricks a biometric system with a fake sample, such as a copy or image of someone’s fingerprint, retina or face, to gain access to their device. In a template attack, the system’s stored copy of the authentic user sample (the original fingerprint or retinal scan) is stolen or replaced with a different sample. Side-channel attacks steal information from incidental device signals like sounds, power consumption or electromagnetic emissions. And fault-injection attacks trip up devices’ security protocols with physical stimuli like electromagnetic pulses or voltage surges.

Karimian said he believes his study is the first practical investigation of fault-injection and side-channel attacks on biometric systems.

“First, we look for potential risks that haven’t been thoroughly studied before,” he said. “Then we develop countermeasures and defenses. This project is about enabling the design of cost-effective, secure biometric authentication systems while preserving privacy and exploring anti-spoofing techniques that are discrimination aware.”

One form of biometrics discrimination is the tendency of facial recognition algorithms to have higher error rates when identifying individuals from certain ethnic backgrounds, particularly those with darker skin tones. That presents a hurdle when it comes to defending against spoofing attacks.

“To date, there is no existing research specifically addressing bias and fairness in anti-spoofing biometrics,” according to Karimian. “Part of the challenge is the fact that the current state-of-the-art technique for training face recognition software in a way that mitigates bias involves using synthetic data, not images of real people’s faces. But synthetic data generation won’t work if we’re trying to mitigate bias in anti-spoofing systems, because the whole point of anti-spoofing systems is to distinguish between fake samples and genuine data.

“We’re looking at how biometric anti-spoofing recognition systems perform across different demographic groups, whether the system exhibits bias misclassification by age, gender, race, ethnicity and country of origin — and if it does, we’re asking why that’s happening.”

High school students will join Karimian in the research through his Youth Cybersecurity Research program.

“During the summers of 2019 through 2022, six students from three local high schools joined my lab, and it was a very positive experience,” he said. “Now we’ll have both volunteer and stipend summer research positions available to bright, motivated high school scholars interested in conducting cybersecurity research. Each will be assigned a WVU faculty mentor and paired with a graduate student for an eight-week research project.”

Undergraduates will be involved as well.

“We’ll be modifying some existing WVU classes to engage with this research,” Karimian said. “I’ll also be developing related new courses: one on biometric and AI security, and another on the ethics of biometrics and machine intelligence that will bring together students from across departments to explore the impact of bias and fairness on AI decision-making.”
