The 2014 White House Report on Big Data and Privacy includes a fictional vignette in which a woman's life, and the lives of those around her, is made safer because she is continuously monitored by external and internal surveillance products.
It reads almost like Google's five-to-ten-year product projection showcase.
The privacy report made strong points about privacy risks, yet its weakened approaches to privacy protection and its stance on allowing uncontrolled data collection to continue left some wondering whose interests the report's recommendations will serve.
Google in particular has been singled out.
That's partly because PCAST includes Google's Eric Schmidt, but also because the report stresses that White House policy should leave data collection alone, stating that controlling collection isn't "scalable" and that enforcing collection control would be "economically damaging." And Google is conspicuously at the top of the collection business.
Another reason is that people like to blame Google for much of the seriously broken state of personal privacy these days. It doesn't help that Google has more senior employees involved in the report than any other company.
In this light, it's interesting to examine section 2.4, "Tradeoffs among privacy, security, and convenience" (pp. 17-18).
Noting that it's important to consider that "notions of privacy change generationally," the report advises the White House that kids in the near future will be okay with things that might freak out anyone upset by "1984."
The report states, "Raised in a world with digital assistants who know everything about them, and (one may hope) with wise policies in force to govern use of the data, future generations may see little threat in scenarios that individuals today would find threatening, if not Orwellian."
PCAST conjures up an entire world in which surveillance products form a system of guardians, protecting us not only from threats of criminal intent or benign missteps such as missing one's plane, but also from the threat of deviation itself.
As the report puts it, "PCAST’s final scenario, perhaps at the outer limit of its ability to prognosticate, is constructed to illustrate this point."
The 2014 White House Privacy Report section reads:
Taylor Rodriguez prepares for a short business trip. She packed a bag the night before and put it outside the front door of her home for pickup. No worries that it will be stolen: The camera on the streetlight was watching it; and, in any case, almost every item in it has a tiny RFID tag. Any would‐be thief would be tracked and arrested within minutes. Nor is there any need to give explicit instructions to the delivery company, because the cloud knows Taylor’s itinerary and plans; the bag is picked up overnight and will be in Taylor’s destination hotel room by the time of her arrival.
Taylor finishes breakfast and steps out the front door. Knowing the schedule, the cloud has provided a self‐driving car, waiting at the curb. At the airport, Taylor walks directly to the gate – no need to go through any security. Nor are there any formalities at the gate: A twenty‐minute “open door” interval is provided for passengers to stroll onto the plane and take their seats (which each sees individually highlighted in his or her wearable optical device).
There are no boarding passes and no organized lines. Why bother, when Taylor’s identity (as for everyone else who enters the airport) has been tracked and is known absolutely? When her known information emanations (phone, RFID tags in clothes, facial recognition, gait, emotional state) are known to the cloud, vetted, and essentially unforgeable?
When, in the unlikely event that Taylor has become deranged and dangerous, many detectable signs would already have been tracked, detected, and acted on?
Indeed, everything that Taylor carries has been screened far more effectively than any rushed airport search today. Friendly cameras in every LED lighting fixture in Taylor’s house have watched her dress and pack, as they do every day. Normally these data would be used only by Taylor’s personal digital assistants, perhaps to offer reminders or fashion advice. As a condition of using the airport transit system, however, Taylor has authorized the use of the data for ensuring airport security and public safety.
Taylor’s world seems creepy to us. Taylor has accepted a different balance among the public goods of convenience, privacy, and security than would most people today. Taylor acts in the unconscious belief (whether justified or not, depending on the nature and effectiveness of policies in force) that the cloud and its robotic servants are trustworthy in matters of personal privacy.
In such a world, major improvements in the convenience and security of everyday life become possible.
The section ends there.
Next is "Collection, Analytics, and Supporting Infrastructure." The rest of the White House Privacy Report lacks fanciful short fiction about productive citizens living in harmony with surveillance products connected to big-data tracking and analytics.
The authors might say it's "Orwellian," but Mr. Orwell's near-satire of Stalinism was obvious.
If it were a traditional product launch using so-called "first hint" and "conditioning" methodology, this kind of pitch would be far more subtle than Orwell. Even so, it's not hard to imagine any number of Google projects, patents, or acquisitions while reading the privacy report's little work of futurism.