From providing enterprise-level security to businesses across the globe to making smartphones & laptops safer against unauthorized access, fingerprint technology has permeated virtually every security system in use today. As the oldest biometric recognition system in use, it has been well integrated into enterprise-level security applications across industries. Its widespread acceptance & ease of operation, coupled with cost-effectiveness, give it an appreciable edge over other, less mature biometric technologies.
Fingerprint technology offers extremely high accuracy, but this accuracy is contingent upon many factors. The probability of false positives (illegitimate access due to a wrong match) is nearly zero, and live scanning reduces it further still. However, there is a good chance of false negatives (failure to recognize a legitimate user) if proper scanning doesn't take place. Some simple rules that facilitate proper fingerprint capture by the sensor were discussed in earlier posts.
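The balance between these two error types is usually quantified as the false accept rate (FAR, covering false positives) and the false reject rate (FRR, covering false negatives) at a chosen match-score threshold. A minimal sketch of how the two rates are computed; the scores and threshold below are purely illustrative and do not come from any real matcher:

```python
def far_frr(impostor_scores, genuine_scores, threshold):
    """Return (FAR, FRR) at a given match-score threshold.

    FAR: fraction of impostor attempts wrongly accepted (false positives).
    FRR: fraction of genuine attempts wrongly rejected (false negatives).
    """
    false_accepts = sum(s >= threshold for s in impostor_scores)
    false_rejects = sum(s < threshold for s in genuine_scores)
    return (false_accepts / len(impostor_scores),
            false_rejects / len(genuine_scores))

# Hypothetical similarity scores (higher = better match)
impostors = [0.10, 0.22, 0.05, 0.31, 0.18]
genuines = [0.91, 0.86, 0.40, 0.95, 0.78]

far, frr = far_frr(impostors, genuines, threshold=0.5)
print(far, frr)  # raising the threshold lowers FAR but raises FRR
```

Note the inherent trade-off: a stricter threshold makes illegitimate access less likely but rejects more legitimate users, which is why capture quality matters so much.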
The friction ridges present on the fingers form distinctive patterns that are unique to a person, and minutiae points, which form at ridge endings and ridge bifurcations, are extracted from these patterns for later use in verification or identification. The first major factor that affects the accuracy of a fingerprint scanner is human error. When a sensor reads a fingerprint, it creates digital templates from the minutiae points, which are then stored in the database for future matching. If subsequent fingerprint captures are not adequate to match against the stored templates, false negatives occur. Skin conditions such as wet, dry, or greasy skin, as well as finger injuries, are common impediments to optimal digital capture. Enrolling more than one finger generally solves most of these problems.
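A minutia is commonly represented by its position, ridge direction, and type (ending or bifurcation). The sketch below, with hypothetical coordinates and tolerances and a deliberately naive greedy pairing, only illustrates how such points might be compared against a stored template; production matchers use far more robust alignment and scoring:

```python
import math
from typing import NamedTuple

class Minutia(NamedTuple):
    x: float       # position in the image (pixels)
    y: float
    angle: float   # local ridge direction (radians)
    kind: str      # "ending" or "bifurcation"

def matched_pairs(template, probe, dist_tol=10.0, angle_tol=math.radians(20)):
    """Count probe minutiae that pair with an unused template minutia of
    the same kind within distance and angle tolerances (greedy, naive)."""
    used = set()
    count = 0
    for p in probe:
        for i, t in enumerate(template):
            if i in used or t.kind != p.kind:
                continue
            if (math.hypot(t.x - p.x, t.y - p.y) <= dist_tol
                    and abs(t.angle - p.angle) <= angle_tol):
                used.add(i)
                count += 1
                break
    return count

# Illustrative template and a fresh capture (probe)
template = [Minutia(100, 120, 0.50, "ending"),
            Minutia(140, 160, 1.20, "bifurcation")]
probe = [Minutia(103, 118, 0.55, "ending"),
         Minutia(300, 300, 2.00, "ending")]
print(matched_pairs(template, probe))  # 1
```

A poor capture shifts or drops minutiae, so fewer pairs fall within tolerance and the match score drops, which is exactly how false negatives arise.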
Device capabilities also play a pivotal role in the recognition process. Since minutiae points are quite subtle and skin conditions can hamper correct template formation, it is vital that the device (sensor & associated hardware) capture the best possible images even when external conditions are far from ideal. DPI (dots per inch), which essentially refers to the amount of information stored within an inch of the digital image, is a primary determinant of image quality. According to image quality & interchange standards such as ISO/IEC 19794-2/4, ANSI INCITS 378, & NIST's NFIQ, the recommended resolution is 500 DPI. The fingerprint biometric recognition systems that we distribute worldwide comply with such stringent quality standards.
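At a fixed DPI, the pixel dimensions of a capture follow directly from the physical scanning area. A small sketch; the platen size used here is an illustrative assumption, not a figure from any particular device:

```python
def capture_pixels(width_in, height_in, dpi=500):
    """Pixel dimensions of a scanned area at a given resolution (DPI)."""
    return round(width_in * dpi), round(height_in * dpi)

# Hypothetical 0.8 x 1.0 inch sensor platen at the 500 DPI resolution
# recommended by standards such as ISO/IEC 19794-4
print(capture_pixels(0.8, 1.0))  # (400, 500)
```

Halving the DPI quarters the pixel count, which is why low-resolution sensors struggle to resolve the subtle ridge detail that minutiae extraction depends on.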
The matching software, or algorithm, is another important factor in the accuracy of fingerprint recognition systems. Fingerprint enhancement algorithms are widely applied to the captured images so that the extracted minutiae points are as distinctive & clear as possible. Based on this extraction, matching algorithms then carry out the identification or verification process.
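One classic first step in such enhancement pipelines is mean/variance normalization of the raw image before ridge filtering. A minimal sketch using NumPy; the target mean and variance here are conventional illustrative values, not prescribed by any particular system:

```python
import numpy as np

def normalize(img, target_mean=100.0, target_var=100.0):
    """Mean/variance normalization: rescale pixel intensities so the
    image has a fixed mean and variance, evening out contrast from
    finger pressure or skin condition before ridge filtering."""
    img = img.astype(float)
    mean, var = img.mean(), img.var()
    if var == 0:
        # Flat image: nothing to stretch, just shift to the target mean
        return np.full_like(img, target_mean)
    return target_mean + (img - mean) * np.sqrt(target_var / var)

raw = np.array([[12.0, 58.0], [90.0, 200.0]])  # illustrative pixel values
enhanced = normalize(raw)
print(enhanced.mean(), enhanced.var())  # ~100.0, ~100.0
```

With contrast normalized, downstream ridge-frequency filters and minutiae extractors behave consistently across captures taken under very different skin conditions.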