When I was at university reading for a degree in Medical Anthropology, one assignment in particular remains prominent in my memory. A group of us were asked by our seminar leader to conduct a self-study: to bring together members of our family, on both sides and going back as far as possible, and compile a comprehensive record of family illnesses, allergies, diseases and anything else relevant to our medical history.
Faced with the prospect of our lecturers shortly becoming more knowledgeable about our medical histories than we likely were at that moment in time, almost everyone shrank in horror.
Within seconds, that horror turned to anger: mutinous looks, crossed arms, and a refusal to speak from every student eventually forced the session to end early.
We all did the assignment, as we had to for the sake of our degree aspirations. However, we were not happy about it -- no matter how enlightening such a study was for our own future health prospects and risks.
Why? Because of the deeply personal information we were being asked to reveal to people we did not know beyond the classroom.
This scenario came to mind when I read that an EU lobby group, the European Translational Information and Knowledge Management Services (eTRIKS), a collaboration between think tanks, research centers, universities, and pharmaceutical giants such as Pfizer and GlaxoSmithKline, wants to 'teach' and 'encourage' UK patients of the National Health Service (NHS) to willingly share their medical data.
According to The Register, eTRIKS released the results of a survey which revealed that many NHS patients don't trust the service with their personal information.
In the survey of 2,000 participants, 16 percent said they did not trust the NHS with their personal data at all, 20 percent were not sure, and only 43 percent were willing for their medical data to be shared "in the pursuit of research."
In addition, 21 percent said they believed their medical data had been shared without their consent and 38 percent were unsure whether their data had been shared with third parties without their knowledge.
These statistics pose a problem for today's medical scientists who want access to information for research purposes. The emergence of supercomputers such as IBM's Watson, and of projects worldwide which utilize large medical data sets to extend our knowledge of everything from genome sequencing to geographical health patterns and medical trends in cities, has highlighted a thirst for such data.
ETRIKS spokesman Paul Houston said the group wanted to "create a new culture of openness in research" which made the "sharing of data much easier" in order to pursue medical advances.
However, to do so, Houston said, "we also need a new culture of greater willingness from research participants and the general public."
These projects are immensely valuable, but there is a roadblock: medical data is immensely personal and private. This information is on a different level from our daily thoughts, pictures of our lunch, and our search and shopping habits which are routinely posted online and slurped up by various companies.
Tracking is becoming part-and-parcel of our lives in the West, whether through surveillance cameras on the streets or cookies used to push "suitable" adverts in front of your nose while you're surfing.
So why would we willingly give more of our data away -- especially when the UK government seems to always be grasping for more power to learn about us and watch our activities?
When you're being asked to share deeply personal data, trust comes into play, and vendors must prove they can keep this information safe and secure, and must ensure it is being used for the right reasons.
In today's world of data breaches by the second, with little punishment for perpetrators or for companies which held security in such low regard that they were practically begging to be breached through outdated patches or unsecured servers, the average member of the public has few reasons to trust agencies with their data, let alone share it willingly.
It is worth noting, of course, that security and data protection done well do not hit the news. Do something well and no one necessarily knows about it; the moment a security barrier is broken, however, the consequences hit entities like a ton of bricks.
When it comes to medical information, such as the data breach which exposed the details of HIV patients, a leak could spell disaster not just for the agencies involved, but for the average person on the street.
If you've already been hit by identity theft or have been affected by a data breach, little is going to induce you to part with more sensitive information about yourself, whether or not that information is anonymized or for the common good of the health of humanity -- especially if there is a risk of an agency revealing personal, deeply sensitive parts of you to the public.
Many of us consider privacy a right, and we choose -- to a point -- what information to share, whether in person or online. But when you're not sure how that data will be used, whether it is being collected just for the sake of it or to flog to third parties for profit -- and whether or not it can even be kept safe -- there is little inducement to sign up and say "okay."
Perhaps if the UK government were not so surveillance-happy and encryption-damning, the EU data sharing economy would have a chance. In the meantime, increased transparency and showing consumers just how data will be used and kept secure could perhaps "encourage" the kind of sharing the EU lobby group wants -- but I wouldn't hold my breath.
When it comes down to the wire, the UK government, and by association any organization linked to it, has shown little regard for the privacy rights of individuals (Snooper's Charter, anyone?), and you cannot have it both ways.
If you want to foster a willingness to share, you must also prove that you are worthy of trust and are respectful of being granted access to deeply personal information.