By Michael Thieme
Biometric technology is so closely associated with privacy concerns that it is impossible to discuss biometrics without addressing the negative perceptions that surround its use. As with the perceptions surrounding most new technologies, many concerns are well-grounded, some are based on fundamental misconceptions of the technology’s operation, and others are unrelated to the technology. All privacy-related concerns must be addressed fully in any situation where biometrics might be deployed.
Privacy-related concerns expressed regarding biometric technology can be divided into two categories: personal privacy and informational privacy.
Personal privacy. There is a segment of the population for whom the use of biometric technology is inherently offensive, distasteful, invasive, or embarrassing. This may be attributable to a variety of cultural, religious, or personal beliefs.
The percentage of the population for whom the use of biometrics is inherently problematic is unknown; likewise unknown is the percentage of people whose personal aversion to biometrics is strong enough to increase the likelihood of non-compliance with biometric systems. In either case, fears and concerns relating to privacy of the person are difficult to address through legislation or system design requirements, and can only be partially addressed by public awareness campaigns. The presence of such concerns, though held by a small percentage of users, is an inevitable component of any potential biometric deployment.
Informational privacy. Of more immediate significance to many users is the issue of informational privacy. Fears and concerns classified under informational privacy are not expressions of inherent discomfort with biometrics, but are centered on the potentially ominous impact of the collection, use, retention, and disclosure of biometric data.
- Unauthorized collection. Although only certain technologies are even theoretically capable of collecting biometric information without the subject’s knowledge, the increased deployment of these technologies raises the prospect of biometric information being gathered, and biometric identification functions being performed, without consent. This would facilitate, if not itself constitute, unauthorized use of biometric technology.
- Unnecessary collection. Biometric technology, in its various forms, is normally deployed as a means of addressing a specific identity verification problem. Primary examples include controlling physical access to specific locations, controlling logical access to specific data, or ensuring that an individual does not enroll multiple times in a single-identity system. A potential fear, if and when biometric technologies become pervasive, is that they will be deployed in situations where there is little or no benefit to strong user authentication or identification. Unnecessary collection would also facilitate unauthorized use of biometric technology.
- Unauthorized use. Unauthorized uses of biometric technology are seen as representing the greatest risk biometrics pose to privacy. It is not the intended uses of biometrics that are seen as problematic, but the ways in which the technology might be used for purposes other than those originally intended. Concerns about unauthorized use can be classified as forensic usage and usage as a unique identifier.
- Forensic usage. Given the use of fingerprints as the primary means of forensic identification, it is natural that requirements to provide one’s fingerprints in order to receive public benefits are viewed with hesitation. The fear is that information provided for public or private sector usage will facilitate police searches, both automated and through the use of latent images. In effect, every database containing biometric data could be searched as though it were a criminal database, representing a significant increase in the potentially intrusive investigative powers of the state.
- Usage as unique identifier. The use of biometrics to monitor, link, and track a person’s daily activities is another commonly held fear. Because biometric technologies are based on physiological or behavioral characteristics, and because some of these characteristics (such as fingerprints) are unique, the fear is that biometric data can serve as a unique identifier, and that biometric information in "identifiable form", that is, "raw image" biometrics, will be used to link information across databases.
Unique identifiers are a danger in a world where databases are the underlying building blocks of almost every modern system, service, and transaction, because such identifiers can link disparate databases and information. Hence the opposition to the broad use of citizen ID numbers: such a unique number would facilitate searches in any database in which it resided.
When considering the various environments in which one might provide biometric information in the public or private sector, including banking, medical, public service, retail, and employment settings, the prospect of information linkage and collection is extremely problematic.
- Unauthorized disclosure. Unauthorized disclosure, in addition to being an obvious facilitator of unauthorized usage, undermines an individual’s control over his or her own information. Fears of the loss of control over one’s personal information are at the heart of privacy concerns. As a necessary condition of biometrics being considered for inclusion in any project, unauthorized disclosure must be prevented through privacy-sympathetic system design and procedural protections.
- Function creep. The fears categorized under informational privacy represent various types of function creep, or the expansion of a program, system, or technology into areas for which it was not originally intended. The use of the U.S. Social Security Number for a broad variety of applications illustrates the danger of function creep, as information-gathering services are able to use this unique identifier to locate and link information across databases.