From fingerprint scanning to facial recognition, biometrics are a useful and precise tool for identifying users, providing secure access, and protecting devices and data. Biometric authentication is becoming more prevalent in modern technology, but when did it go from future fantasy to daily routine?
Biometrics Portrayed in Film
Many of our first encounters with advanced biometric technology came from sci-fi movies set in a futuristic fantasy world. Star Trek (1966) showcased methods such as retina scans to access data, as well as voice recognition to talk to computers.
2001: A Space Odyssey (1968) used voice and face recognition with the AI HAL, Blade Runner (1982) portrayed retina scanning, and even Wall-E (2008) showcased technology that scanned heaps of trash for plant life. At the time these films were written, such technologies helped establish a futuristic setting beyond everyday reality.
But with the rapid advancement of modern technology, we have already incorporated many of these practices into our daily lives.
As cited by Avatier, the most common types of biometric technology portrayed in movies are face and voice recognition, along with retina scanning. While eye scanning hasn't become a common practice (yet), facial recognition has nearly become an industry standard: companies like Apple have implemented Face ID technology in mobile devices, granting users access to their phones within seconds.
Voice recognition has found its way into our phones and TVs too: some cable companies offer TV remotes fitted with voice control, allowing viewers to simply say which show or movie they wish to watch. But how did these technologies become so common?
The Beginnings of Basic Biometric Authentication
Law enforcement was one of the first fields to adopt biometric identification. Some of the earliest recorded use cases date to the 1800s, when criminals were identified by comparing body measurements. This method is still used today, but in a far more precise and accurate form.
Ink fingerprinting soon followed for signature and identification purposes, requiring a person to make an inked imprint of their fingers. The Henry Classification System, developed in the late 19th century, was one of the first recorded methods of fingerprint identification and remained the basis of such methods until the 1990s.
Biometric identification then grew immensely in the 20th century. Facial recognition research began in the 1960s as a way for early computers to identify a human face. The work had to be done manually, by marking facial landmark points on images.
In 1958, photos became a requirement on California driver's licenses, and photo identification soon became standard practice when the Department of Motor Vehicles (DMV) issued licenses across the United States. The DMV was also one of the first adopters of facial recognition technology and helped make it a standard means of personal identification.
Today these authentication practices are no longer exclusive to government agencies or big corporations. Fingerprint scanning reached the consumer market in 2013 with the iPhone 5S, allowing users to unlock their phone with a fingerprint instead of a passcode. What began as a brand-new feature is now standard practice across the smartphone industry.
Facial scanning, dubbed Face ID by Apple, debuted on the iPhone X and has been a mainstay feature on flagship iPhones ever since. The technology works with individual applications as well, giving users access to their apps quicker than ever. Banking and shopping apps that hold sensitive data, such as bank account information and credit card numbers, now offer facial recognition in place of a password, providing a more secure alternative.
Since passwords are vulnerable to hackers, biometric authentication can improve data protection, strengthen security, and create an easier login experience. Beyond our phones, computers and internal-use company devices are adopting this type of technology as well, with many benefits.
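Part of the security advantage comes from where the secret lives. In a typical biometric login flow, the scan itself never leaves the device: a successful fingerprint or face match simply unlocks a locally stored key, which then answers a one-time challenge from the server. The sketch below is a simplified, hypothetical illustration of that pattern (all names are invented, and a shared-secret HMAC stands in for the public-key signatures that real systems such as FIDO2 use); nothing reusable, like a password, ever crosses the network.

```python
import hashlib
import hmac
import secrets

# Hypothetical sketch: the biometric match unlocks a device-bound key,
# which answers a fresh server challenge. The key is provisioned once
# and, on real hardware, would live in a secure enclave.
DEVICE_KEY = secrets.token_bytes(32)

def server_issue_challenge() -> bytes:
    # A fresh nonce per login attempt, so responses can't be replayed.
    return secrets.token_bytes(16)

def device_login(challenge: bytes, biometric_ok: bool):
    # A failed scan leaves the key locked; nothing is sent.
    if not biometric_ok:
        return None
    return hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()

def server_verify(challenge: bytes, response) -> bool:
    if response is None:
        return False
    expected = hmac.new(DEVICE_KEY, challenge, hashlib.sha256).digest()
    # Constant-time comparison avoids timing side channels.
    return hmac.compare_digest(expected, response)

challenge = server_issue_challenge()
print(server_verify(challenge, device_login(challenge, biometric_ok=True)))   # True
print(server_verify(challenge, device_login(challenge, biometric_ok=False)))  # False
```

Because each challenge is a fresh random nonce, intercepting one response gains an attacker nothing, which is the core reason this flow resists the phishing and replay attacks that plague passwords.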
For instance, biometric authentication can be used in point-of-sale operations, allowing a customer to download an application to their device with the tap of a thumbprint. To learn more about the benefits of biometric login in retail spaces, check out our article How Biometric Authentication Enables A Secure And Efficient Workforce.
However, biometric technology also raises concerns about the protection of personal data and privacy, especially when it is used by younger audiences on social media apps such as Instagram and Snapchat, which offer hundreds of facial filters at the click of a button. It's important for any company that utilizes these features to ensure the safety of its consumers' and employees' information.
Biometric authentication has come a long way in a relatively short period of time, and it will no doubt become more common and advanced in the years to come.
Technology we once only dreamed of has quickly become a reality, and there is no sign of it slowing down. Given the security benefits, this type of authentication is well worth considering when setting up your phone, or when deploying rugged devices for company use.
To learn more about securing your company’s devices, visit us at https://bluefletch.com/enterprise-mobile-security/