World Health Organization CIO on healthcare data, privacy, trust, and ethics (CxOTalk interview)

The CIO of WHO talks about the risks and opportunities of big data when it comes to health care. Large technology companies are important partners in building trust and advancing efforts around digital health. But determining how to partner is not always simple or easy.
Written by Michael Krigsman, Contributor

I recently spoke with the chief information officer of the World Health Organization, Bernardo Mariano, about digital transformation and healthcare on episode 364 of the CxOTalk series of discussions with the world's top innovators.

The subject of data ownership and related ethical considerations was one of the most interesting aspects of our conversation.

We all know that technology companies -- Facebook, Google, Amazon, and most other online companies -- gather, aggregate, share, and monetize their users' personal data. The scale of that aggregation, together with its impact on people's lives, raises questions about data privacy, data ownership, and legal protections for consumers.

The World Health Organization's agenda includes taking a leadership position on issues such as these in relation to health data. Given WHO's prominence and its ability to convene public discussions, its perspective, as Bernardo told me, deserves our attention.

You can watch our entire, in-depth conversation in the video above and read the complete transcript. Edited comments from WHO's CIO on the topic of data are below.

How can we navigate ethical considerations around data and data ownership?

The World Health Organization is key to ensuring that we bring about international regulation on health data. The European Union has the GDPR that protects privacy. Each country has its own national privacy laws to protect their citizens' data.

More countries are enacting laws that, for instance, forbid cloud providers from moving data out of the country. There are a number of examples of countries where health or national data, by regulation, must not leave national borders.

As these different interests and perspectives come into play, and with the power of machine learning and artificial intelligence to draw on big data for the diagnosis and treatment of disease, we want to strike a balance: privacy and ethical considerations must be addressed in a way that still allows data sharing for the global good. That sharing is what delivers positive health outcomes. It accelerates progress on diseases we otherwise struggle to address, including in remote areas, and through digitalization it can take primary healthcare to the next level.

Health data regulation is key, and this is enshrined in our global strategy, which goes for approval by our member states in May. We want to ensure that, as we get approval from our 194 member states on this global digital health strategy, one of its deliverables is international health data regulation that addresses exactly the issues we just talked about.

What is the role of public trust in advancing digital health technologies?

Without [trust], we will not unleash the potential of digital health. Some gains can be made, but the full potential will not be unleashed. Trust comes by [avoiding] incidents such as data leakages or others that we see happening.

A people-centric approach, involving people at the earliest stage and at every stage, creates the capacity for dynamic consent. The European Union, I think, is leading on that. It means that today I may want to share my data for research, and tomorrow I may not want to share it. I should have the capacity for dynamic consent.

Today, I give consent almost for life, in some respects. We need to move from blanket consent to dynamic consent. That requires technology and processes to be aligned.

Such dynamic consent will bring about trust, because if I can trust that my data is safe, and that I have the power to give consent and to revoke it, that is where the ecosystem needs to move.

Dynamic consent is important, and the platform providers, the technology, and the processes around it have to be enforced to ensure that happens. That's the only way to start building trust in the system. Trust is key.
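The shift Bernardo describes, from one-time blanket consent to consent that can be granted and revoked at any moment, is ultimately a systems-design requirement: every data access must be checked against the subject's current consent state. The sketch below is purely illustrative; the class and method names are my own assumptions, not any WHO or EU specification.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical sketch of a dynamic-consent ledger. Each purpose keeps an
# append-only history of (timestamp, granted?) events, and the newest
# event decides. Access checks consult the *current* state rather than a
# one-time blanket agreement signed long ago.

@dataclass
class ConsentLedger:
    events: dict = field(default_factory=dict)  # purpose -> [(time, granted)]

    def grant(self, purpose: str) -> None:
        self.events.setdefault(purpose, []).append(
            (datetime.now(timezone.utc), True))

    def revoke(self, purpose: str) -> None:
        self.events.setdefault(purpose, []).append(
            (datetime.now(timezone.utc), False))

    def is_allowed(self, purpose: str) -> bool:
        # No history means no consent was ever given; otherwise the most
        # recent grant/revoke event governs.
        history = self.events.get(purpose)
        return bool(history) and history[-1][1]

ledger = ConsentLedger()
ledger.grant("research")
print(ledger.is_allowed("research"))   # True: consent currently granted
ledger.revoke("research")
print(ledger.is_allowed("research"))   # False: revocation takes effect at once
```

The append-only event history, rather than a single mutable flag, reflects the auditability such a system would need: a custodian can show not only whether consent is active, but when it was granted and revoked.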

Will large tech companies share their data?

How can we ensure that AI trained on data from one country, and then sold in a different country, takes the local context into consideration? Data is an equal partner. Trust is important. Without data, research will not deliver the results that we need.

We are exploring a partnership with some of the tech giants for them to release depersonalized data for research, so that we can achieve some of the gains, some of the insights, that such data can provide.
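The interview does not describe how this depersonalization would be done, but one common first step is to replace direct identifiers with keyed pseudonyms and drop fields that research does not need. The snippet below is a minimal sketch under those assumptions; the field names and the key-handling are illustrative only, and real de-identification requires far more (handling of quasi-identifiers, aggregation thresholds, and so on).

```python
import hmac
import hashlib

# Illustrative only: not a WHO or vendor method. A keyed HMAC produces a
# stable pseudonym, so the same person links across records without the
# released data revealing who they are. The key must be held by the data
# custodian and never shipped with the dataset.
SECRET_KEY = b"held-by-data-custodian-only"  # assumption: custodian-managed key

def pseudonymize(identifier: str) -> str:
    """Derive a stable, non-reversible pseudonym from a direct identifier."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def depersonalize(record: dict, keep: set) -> dict:
    """Keep only research-relevant fields; replace the name with a pseudonym."""
    out = {k: v for k, v in record.items() if k in keep}
    out["subject_id"] = pseudonymize(record["patient_name"])
    return out

raw = {"patient_name": "Jane Doe", "diagnosis": "influenza", "age": 34}
released = depersonalize(raw, keep={"diagnosis", "age"})
print(released)  # the direct identifier is gone; a stable pseudonym remains
```

Keeping records linkable without identities is what makes the released data useful for research while limiting the privacy exposure the speaker is concerned about.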

Some companies are holding their data too close to their chest because they want to monetize it. But some of the tech giants realize that free data can address some of the key global challenges but also create new business avenues.

Think about geo-positioning data. Satellites were launched, and we have GPS geo-positioning all over the world. It creates new insights. It creates new business.

If depersonalized data for research are not made available to researchers, it might hinder those companies from finding new business models. From our discussion with some of the companies in Silicon Valley—I was there a month ago with a number of my colleagues—they realize that, yes, we have to partner to ensure that data for research is freely available.

CXOTalk offers in-depth conversations and learning with the world's top leaders in business, technology, government, and education.
