BeyondTrust expert on why biometric data poses unique security risk

Morey Haber, CTO at BeyondTrust, explores the potential security risks associated with biometric data and provides some basic recommendations that consumers should consider before handing over biometric data to organisations.

We live in sensitive times. One under-discussed topic that we need to confront directly and talk about openly is the sensitivity of data itself. Yes, that’s right: what do people today actually consider ‘sensitive’ data?

The definition of Personally Identifiable Information (PII) often includes your name, email address, username, password, birthdate, home address, social security number, credit card information, medical history and so on.

I would venture that most people agree these are all sensitive data sets.

But there is an entire classification of sensitive data that we do not discuss, and it is going to be a problem in the very near future. The sensitive data we are failing to adequately address is the linkage of our physical, carbon-based human bodies to all the biometric data being stored by IoT devices and services in the cloud. If you think this sounds far-fetched, ask yourself whether you or any of your loved ones have taken an ancestry DNA test, or received a new notebook, mobile device or smartwatch that stores health data or uses fingerprints or facial recognition to log in. I am willing to bet that either you, or someone close to you, has.

Compromised biometric data poses unique risks

To understand the sensitivity of biometric data and why it should be a part of your conversations, consider the potential risk. You are a person. Typically, you have a single identity. One could argue that, even if you are a spy or have a criminal alias, you still have only one identity since, regardless of the aliases or names you impersonate, you have only one set of biometric data. You cannot change your fingerprints, voice, face, eyes, EKG or even the veins in your arm.

When information technology uses biometric data for either authorisation or authentication (and yes, they are different), it needs to compare the incoming sample with a stored profile of your biometric data. That profile is stored electronically.

While extraordinary safeguards can be placed on the storage and encryption of biometric data, at some point it needs to be reassembled (at least in part) to compare against the incoming sample. If the storage is flawed by design, has vulnerabilities, or the host system is misconfigured, we have a potential exposure of this most sensitive data.
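To make that window of exposure concrete, here is a minimal sketch of how a system might keep a biometric template encrypted at rest and decrypt it only in memory for the comparison. The template format, the Fernet cipher, the Euclidean-distance check and the threshold are all illustrative assumptions for this example, not how any particular vendor implements matching.

```python
import json
import math

from cryptography.fernet import Fernet  # pip install cryptography

def encrypt_template(template: list[float], key: bytes) -> bytes:
    """Encrypt a biometric feature vector before it ever reaches disk or the cloud."""
    return Fernet(key).encrypt(json.dumps(template).encode())

def verify_scan(scan: list[float], stored: bytes, key: bytes,
                threshold: float = 0.35) -> bool:
    """Decrypt the stored template only in memory, compare it to the new scan, then discard it."""
    template = json.loads(Fernet(key).decrypt(stored))
    distance = math.dist(scan, template)   # how far the new scan is from the enrolled template
    return distance <= threshold           # close enough is treated as the same person

key = Fernet.generate_key()                # in practice this lives in an HSM or secure enclave
stored = encrypt_template([0.12, 0.87, 0.33, 0.54], key)
print(verify_scan([0.11, 0.88, 0.35, 0.52], stored, key))  # True: near match
print(verify_scan([0.90, 0.10, 0.75, 0.05], stored, key))  # False: different person
```

Even in this tidy sketch, the plaintext template briefly exists in memory at verification time, which is exactly the exposure the paragraph above describes.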

However, the biggest problem with biometric data is not the storage or authentication technology used; rather, it is the static nature of biometric data itself. If a password is compromised, you can change it, putting a stop to any re-use attacks that rely on the compromised password.

However, if biometric data is compromised, you cannot change it. Your eyes, face and fingerprints are permanently linked to your identity (excluding bio-hacking, which is a topic for another day). Any future system that relies solely on that compromised biometric data becomes an easy target for threat actors.
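The contrast is easy to see in code. The short sketch below, with hashing parameters chosen purely for illustration, shows why a leaked password can simply be replaced while a leaked biometric template has no equivalent reset.

```python
import hashlib
import os
import secrets

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store only a salted hash, so changing the password replaces the whole record."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

# If this credential leaks, rotate it: choose a new password and store a new salt + hash.
old_record = hash_password("correct horse battery staple")
new_record = hash_password(secrets.token_urlsafe(16))  # the leaked value is now worthless

# A biometric template has no such reset: the underlying fingerprint or face cannot be
# reissued, so a stolen template remains usable against that person indefinitely.
```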

Biometrics alone should never be used to authenticate a user, authorise an action or commit a transaction. Biometrics should be paired with a password or, better yet, a two-factor or multi-factor authentication solution for a higher degree of confidence.
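As a rough illustration of that pairing, the sketch below grants access only when both a biometric match and a time-based one-time password (TOTP, per RFC 6238) succeed. The six-digit, 30-second parameters and the function names are assumptions made for the example, not any specific vendor's design.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp_now(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute the current RFC 6238 time-based one-time password."""
    key = base64.b32decode(secret_b32, casefold=True)
    msg = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def authenticate(biometric_match: bool, submitted_otp: str, secret_b32: str) -> bool:
    """Biometrics alone are never sufficient: both factors must pass."""
    otp_ok = hmac.compare_digest(submitted_otp, totp_now(secret_b32))
    return biometric_match and otp_ok

secret = base64.b32encode(b"demo-shared-secret").decode()  # enrolled per user or device
print(authenticate(True, totp_now(secret), secret))        # True: both factors pass
print(authenticate(True, "not-the-code", secret))          # False: second factor fails
```

Even if a biometric template is later stolen, the attacker in this model still lacks the rotating second factor.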

Assessing how your biometric data is being used and accessed

Some vendors emphasise security for biometric data (Apple’s Secure Enclave, for example), while others treat it with far less care. If you think that claim is questionable, consider the My Friend Cayla doll: the ramifications of its sale, its collection of children’s voice recordings and the mischievous potential it handed a threat actor targeting you or your children.

The storage of biometric data is increasing rapidly, but its implications are only beginning to be understood. We need to start discussing what we will allow to be stored about our identity, what is simply too risky and, most importantly, who gets to hold it.

Just consider all the new technology that may now possess your biometric data:

  • Personal assistants: Devices from Amazon, Google and Apple all process voice recognition commands and can be programmed to recognise individual voices. Your unique vocal patterns are stored and processed in the cloud. While attacks based on stolen voice patterns are still largely theoretical, be mindful that this data is being stored.
  • DNA kits: If you purchased or used one of these, your DNA is now on file. And, if you give permission, your data can be used by law enforcement to help solve outstanding criminal cases. Your most private and sensitive data, your DNA, is now in the hands of a third party. You should be aware of everything they can do with it and what the ramifications are if those services are ever breached.
  • Mobile devices and IoT: Cellular phones, tablets and even door cameras capture some form of biometric data and store it on the device or in the cloud, even if it is never used for authentication or authorisation. The risk here is obvious. Some door cameras record photos or video whenever they detect movement and may capture your image simply because you walked or drove past. Your likeness, unknown to you, is now potentially on another end user’s device or in the cloud. Your mobile phone or tablet also has fingerprints and facial metrics stored within it, and there are plenty of tools and documents describing how to bypass these security models if you have the device in hand. You cannot trust security models based on biometrics alone, and AI may make matters worse by performing the PII linkage for a threat actor.

Opening up a dialogue about biometric data

Now is the time to begin these sensitive discussions about biometric data. When you purchase a device, use a new technology or consider how you interact with a new service, ask yourself, and potentially the vendor (especially if the technology is used for work), the following:

  • How are you storing biometric data?
  • Where is it being stored? (Especially in which countries, since this may carry additional legal and compliance ramifications)
  • How is it secured? Who has access?
  • Is my biometric data being purged over time?
  • Do you sell my biometric data?
  • Does law enforcement have access to my biometric data or logs? Even with a warrant?

Biometric data is perhaps the most sensitive information you possess. It is a part of your identity and can never be changed. It is a conversation worth having in this sensitive world. Biometric data affects everyone, it does not discriminate and, as new technology emerges, it stands to cause real trouble unless we understand how our likeness is being captured, stored, processed and ultimately utilised.

 
