Artificial intelligence can create fake fingerprints that act as a “master key” for phones that use fingerprint sensors. According to the security researchers who developed the technique, the attack is not aimed at one specific person; instead, a single fake print can be expected to unlock the devices of a certain percentage of users.
In recent years, researchers have repeatedly demonstrated that biometric ID systems can be defeated, from fingerprint sensors to iris scanners and even hand-vein readers. In most cases, fooling biometric identification requires creating a physical replica, such as a fake fingerprint or a facial model, that matches a specific, existing individual. Now, however, researchers from New York University and Michigan State University have published details of how a machine learning algorithm can be trained to generate fake fingerprints that match a large number of the real prints stored in a database.
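The core idea, searching for one artificial print that matches as many enrolled prints as possible, can be illustrated with a deliberately simplified sketch. This is not the researchers' actual system (the article only says “machine learning algorithm”); here, fingerprints are stand-in bit vectors, the sensor “matches” when enough bits agree, and plain hill climbing plays the role of the learned generator. All sizes and thresholds below are illustrative assumptions.

```python
import random

# Toy stand-in for master-print search (NOT the researchers' method):
# each enrolled fingerprint is a random bit vector, and a candidate
# "master print" matches a template when enough bits agree -- a crude
# proxy for a sensor's partial-match threshold.  We hill-climb the
# candidate to maximize how many enrolled prints it matches at once.
random.seed(0)

N_BITS = 64       # assumed template size
N_USERS = 200     # assumed number of enrolled prints
THRESHOLD = 38    # bits that must agree to count as a "match" (assumption)

def rand_print():
    return [random.randint(0, 1) for _ in range(N_BITS)]

enrolled = [rand_print() for _ in range(N_USERS)]

def matches(candidate, template):
    agree = sum(c == t for c, t in zip(candidate, template))
    return agree >= THRESHOLD

def score(candidate):
    """How many enrolled prints this candidate unlocks."""
    return sum(matches(candidate, t) for t in enrolled)

# Hill climbing: flip one bit at a time, keep changes that don't hurt.
master = rand_print()
best = score(master)
for _ in range(2000):
    i = random.randrange(N_BITS)
    master[i] ^= 1
    s = score(master)
    if s >= best:
        best = s
    else:
        master[i] ^= 1  # revert the flip

print("enrolled prints matched by one optimized master print:", best)
```

The point of the sketch is only the objective: rather than copying one victim's print, the optimizer is rewarded for resembling many prints at once, which is what makes the result behave like a master key.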
Known as DeepMasterPrints, these artificially generated prints work much like the master key to a building. Although these researchers were not the first to propose the idea of a master fingerprint, they are the first to create a working version using a machine learning algorithm. The prints are specifically designed to target the types of fingerprint sensors found on phones.
The researchers point out that at the sensors' highest security setting, the master prints are “not overly good” at fooling them, succeeding less than 1.2 percent of the time.
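Even a sub-1.2-percent hit rate compounds if a phone allows several unlock attempts and the attacker tries a different master print each time. A quick back-of-envelope calculation shows this; the 1.2 percent figure comes from the article, while the attempt counts are assumptions for illustration.

```python
# If one master print matches p = 1.2% of users (figure from the article),
# and an attacker tries an independent master print on each allowed unlock
# attempt, the chance of at least one success against a random phone is
# 1 - (1 - p)^k.  Attempt limits here are illustrative assumptions.
p = 0.012
for k in (1, 3, 5):
    success = 1 - (1 - p) ** k
    print(f"{k} attempts: {success:.1%}")
```

Against a large pool of stolen or randomly encountered phones, even single-digit per-device odds translate into a meaningful number of compromised devices, which is why the result matters despite the low per-attempt rate.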
The research does not spell the end of fingerprint identification, but the researchers suggest that designers of such biometric protection will need to revisit their designs and weigh the trade-off between the convenience fingerprints offer and overall security.