Great discussion, guys! I have a comment to throw in here that has
always concerned me about biometrics. It's not so much the biometric
data itself, but how it is used, or more likely, misused. Comments,
clarifications, or corrections are welcome.
Take an analytical step back and look at the biometric data. The
measurement taken is going to be transformed into a "signature" of the
scan, fingerprint, or voice data. This signature/transform must
remove (most?) variations among different measurements over time and
various measuring devices. The data used (compared) will be relatively
static. We've learned from passwords that "static" can be bad.
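To make the "static is bad" point concrete, here is a minimal sketch (all names and the matching logic are hypothetical; real systems do fuzzy matching on feature vectors, not exact comparison) of why a static template behaves just like a password: the verifier compares a stored value with whatever is presented, so a captured template replays forever.

```python
import hashlib

# Hypothetical stand-in for a real feature-extraction transform:
# the enrolled "signature" is a static value derived from the raw scan.
def make_template(raw_scan: bytes) -> bytes:
    return hashlib.sha256(raw_scan).digest()

enrolled = make_template(b"alice-fingerprint-scan")

def verify(presented_template: bytes) -> bool:
    # Static comparison: anything that matched once matches forever.
    return presented_template == enrolled

# A template lifted from, say, a coffee mug replays indefinitely:
stolen = make_template(b"alice-fingerprint-scan")
assert verify(stolen)
```

Unlike a password, there is no "change your fingerprint" option after the template leaks.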
Biometric data has an extremely low degree of secrecy. I can get your
fingerprint from your coffee mug, a retinal scan from your eye doctor, a
face print from seeing you in the streets, etc. The signature/transform
algorithm is assumed to be known (autocorrelation function for voice,
etc.). Therefore, I can easily generate the biometric data necessary to
assume your identity. "Stealing" the data can be done much easier and
secretly than an attack on the body. I, for one, would barely notice a
missing coffee mug compared to a missing digit. Assume the data is
The high degree of user authenticity afforded by biometrics comes from
the ability of _only_ the valid user to present the biometric data to
the "system". A warm, pulsing thumb set upon a measuring device is a
good indicator of who you are. Now the problem is comparing that data
to a (remote?) database of data without allowing data to be inserted
between the measuring device and the compare operation. You must
completely authenticate the dialogue between the measuring device and
the compare stage and only allow transactions with trusted measuring
devices.
For example: The "Mission Impossible" scenario where the fingerprint
measuring devices appear to be in the wall, with "secured" (behind the
wall) wiring into the authentication system. This would be a nice
closed system. Only those measuring devices that are securely hardwired
into the system are allowed to authenticate.
On the other hand (pun intended): Your fingerprint device is
connected via a serial port to your PC. An attacker could easily unplug
the fingerprint device and plug in the coffee mug to give the same
response (the stolen biometric data) unless the measuring device itself
was authenticated. This is the type of biometric authentication I've
seen demo-ed so far.
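Authenticating the measuring device itself could look something like the following sketch (the key-provisioning scheme and names are my own assumptions, not any vendor's protocol): the verifier shares a secret key with each trusted device and issues a fresh challenge per transaction, so a reading replayed from the serial port fails the check.

```python
import hmac, hashlib, secrets

# Assumed setup: each trusted device is provisioned with a secret key
# known to the verifier (e.g. installed at manufacture).
device_key = secrets.token_bytes(32)

def device_respond(key: bytes, challenge: bytes, reading: bytes) -> bytes:
    # The device binds this particular reading to this fresh challenge.
    return hmac.new(key, challenge + reading, hashlib.sha256).digest()

def verifier_check(key: bytes, challenge: bytes,
                   reading: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, challenge + reading, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

challenge = secrets.token_bytes(16)   # fresh per transaction
reading = b"template-from-live-thumb"
tag = device_respond(device_key, challenge, reading)
assert verifier_check(device_key, challenge, reading, tag)
```

An attacker who unplugs the device and replays an old (reading, tag) pair fails, because the verifier issues a new challenge each time and the attacker lacks the device key.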
What I'd like to see is a "tamper-proof" token (a la SecurID) that
measures the biometric, takes a PIN, and an internal seed to generate
authentication data and/or unlock a stored private key. The biometric
data would be utilized to its best potential without a significant
threat of data insertion. All 3 authentication factors in one
credit-card sized token! Well, someday.
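A rough sketch of what that token's output might look like (the mixing scheme below is my own illustration, not how SecurID actually works): the internal seed (something you have), the PIN (something you know), and the biometric template (something you are) all feed a time-based one-time code, so none of the three factors ever leaves the token in the clear.

```python
import hashlib, hmac, struct, time

# Hypothetical three-factor token: seed + PIN + biometric template
# are mixed into a short-lived one-time code, TOTP-style.
def token_code(seed: bytes, pin: str, template: bytes, t=None) -> str:
    interval = (int(time.time()) if t is None else t) // 60
    msg = struct.pack(">Q", interval) + pin.encode() + template
    digest = hmac.new(seed, msg, hashlib.sha256).digest()
    # Truncate to a 6-digit display code.
    return f"{int.from_bytes(digest[:4], 'big') % 10**6:06d}"
```

A stolen template alone is useless without the seed and PIN, and the code expires every interval, so there is nothing static to replay.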
The proverbial Guido and Mac the Knife are still a problem.  How about a
duress finger? :-)