
Facial recognition creeps up on a JetBlue passenger and she hates it

Facial recognition systems don't want you to stop and think about them. This is what happens when someone does.
Written by Chris Matyszczyk, Contributing Writer
Surprise! (Screenshot by ZDNet)

"There's nothing to worry about."

Every time I hear those words, I start worrying. 

This may be because I've heard them uttered a little too often by tech CEOs who are subsequently shown to enjoy all the honesty of a congressperson's PR representative.

Also: Amazon offers up regulatory guidelines for facial recognition

I therefore well up with sympathy toward writer MacKenzie Fegan, who endured a troubling encounter last week with JetBlue's facial recognition technology, first introduced last year.

Naturally, she took to Twitter to register her troubles.

She began: "I just boarded an international @JetBlue flight. Instead of scanning my boarding pass or handing over my passport, I looked into a camera before being allowed down the jet bridge. Did facial recognition replace boarding passes, unbeknownst to me? Did I consent to this?"

A funny thing, consent. Sometimes, you have no idea you've already given it. Sometimes, you haven't given it at all.

JetBlue, as all good airlines do, was ready to offer Twitterized sympathy: "You're able to opt out of this procedure, MacKenzie. Sorry if this made you feel uncomfortable."

But once you start thinking about these things, your thoughts become darker. Fegan wanted to know how JetBlue knew what she looked like.

JetBlue explained: "The information is provided by the United States Department of Homeland Security from existing holdings."

Fegan wondered by what right a private company suddenly had her biometric data.

JetBlue insisted it doesn't have access to the data. It's "securely transmitted to the Customs and Border Protection database."

Ah, our old friend securely. The only thing you can be sure about that concept is that it isn't very secure.

Fegan wanted to know how this could have possibly happened so quickly. Could it be that in just a few seconds her biometric data was whipped "securely" around government departments so that she would be allowed on the plane?

JetBlue referred her to an article on the subject, which was a touch on the happy-PR side. Fegan was moved, but not positively, by the phrase "there is no pre-registration required."

Others weighed in to explain that her passport information was already in the database. The only difference here, said the wise, is that the machines look at your face and not your passport picture.

But wait. Fegan's reaction is what happens when real people start thinking about the technology that's being foisted upon them.

Tech companies are quite brilliant at sliding their latest brainwaves into the public sphere before those who implement them -- and those who are forced to be the objects of them -- consider the consequences.

Also: Microsoft: Here's why we need AI facial-recognition laws

The technology appears to solve one problem -- speeding up the security and boarding processes -- and in it goes.

JetBlue isn't the only airline that's already using facial recognition. Delta enjoys it at Atlanta airport. The airline claims it will save you nine whole minutes.

Yet if you extrapolate the use of facial recognition to an everyday occurrence -- which Apple already does with the occasionally erratic Face ID -- a few horrors tend to unfold. 

It isn't just that the technology can be inaccurate, especially when it comes to certain skin colors.

Imagine that, soon, everywhere you go you'll be immediately recognizable.

It's not just that cameras will know you're there. It's that they'll know precisely who you are. And, given the rivers of data that flow uncontrolled around the web, a store, a restaurant, or an airport will, within seconds, know if you're rich, poor, divorced, angst-ridden, and/or a fan of ska music.

Who knows what other information about you might be instantly accessible? Worse, who knows how the people using the technology will categorize you?

A clothing store might decide you're the perfect customer for a sudden special offer. An airline might decide you can only be seated near "your own kind."

Also: Commuters to pay for train fare with their face in China

And as for governments, well, the more unscrupulous might conclude you're undesirable, even for a two-week sojourn on one of their beaches. Because you have a couple of outstanding debts, perhaps.

We rarely learn. New technology is quickly introduced and only years later do we stop and see the consequences it's wrought and how those consequences are affecting us.

And then we start to worry.

Photos: From facial recognition to connected toys, a trip inside the invisible big data revolution
