Q&A: Maria Bezaitis, Intel engineer, on our need for strangeness

Does technology connect us -- or barricade us from people and ideas we don't know? And what's being done about it?

As a principal engineer at Intel, Maria Bezaitis spends her days considering how technology and data enable our connections to each other -- and separate us from one another. From social networks that link us to people we already know to websites that introduce strangers, Bezaitis studies a variety of platforms while considering new business models and the future of safe data sharing.

I spoke with Bezaitis about our human need for "strangeness," what makes Airbnb tick and why Facebook and Google keep her up at night. Below are excerpts from our interview.

You have a humanities Ph.D., but your title at Intel is principal engineer. Describe your career trajectory.

It was a circuitous route. I finished a Ph.D. in French literature and cultural studies at Duke University in 1994. I knew I didn't want to stay in academia. I ended up in Chicago as the startup culture was booming. I became managing partner at a firm developing qualitative research and design methodologies for the consumer products space. I spent five years there helping build that company. We were acquired by a technology consultancy, which brought me to the East Coast. From there, I went on to Intel.

I came to Intel to run a group called People and Practices Research, a small R&D lab focused on understanding key social practices, in an attempt to get Intel to think more strategically about the social world. That group became part of a much larger lab, called Interaction and Experience Research. It's around a 100-person investment on the R&D side. In that organization, I'm a principal engineer working in the Insights division. I lead a capability that we call Flux. It's about looking at changes in the social world to understand what new business and technology opportunities might emerge from them.

We often think of technology in terms of how it connects us to others. But in your TED Talk you argue that technology actually barricades us from people and ideas that we don't already know. How so?

The social network world has cultivated this notion of connections and has tried to center it on people we know. There are all sorts of good reasons for that. The thinking behind it goes back to the 1950s and '60s and into the '70s, when a bunch of important work was published on networks and how our connections are, to a large degree, about nearness. The social networks today have remained centered on this idea of people who are near, or people who were once near that you want to bring near again. They've cultivated the notion that that's where our comfort zone is.

For me, what's so exciting about the world of data and our personal data circulating in the world is that at some level we have a new and important means to access things we don't already know. The hard challenge for algorithms and new technologies is to start to think about what it means to build connections -- and ultimately relations -- between things that aren't necessarily connected in some obvious way.

Why should we care about what you call 'strangeness,' which is access to people and ideas that are foreign to us?

What makes the digital world -- and our lives -- so exciting is that every day is another possibility for learning something we don't already know. How do we start to counter the natural tendency of algorithmic development to optimize for accuracy and efficiencies? It's that trajectory that, at some levels, is counter to the fundamental human need to be surprised and discover. The challenge is to do that in ways that are meaningful to people and not abrupt. These methods should be recognizable and accessible.

What are you working on at Intel around these ideas?

I'm focused on applying these concepts to security and the massive investments we have in the security space. Security is a fascinating space that has rested a lot of its energy on encryption as its core contribution. I'm working with our security researchers to get them to think more flexibly about how they categorize people and how they think about what data needs to do. How do you enable data to circulate safely? Security researchers tend to have rigid notions about how to categorize our relations. It's safe to share certain kinds of data with people we know and it's not safe to share certain kinds of data with people we don't know. We need to start undoing those binaries if we're going to develop useful security technologies in a future where so much of our digital experience is about data being shared.

At some level, this isn't just about enabling me to connect with people I don't know. It's also about how I rejuvenate and maintain my known relations. Imagine a future of lots of data being shared and circulated. You're building security technologies that allow you to shape intimate relations as much as relations with people you've had no contact with whatsoever.

Can you give examples of this work?

Something we can talk about is trusted environments for sharing data. How do we conceive of trusted, secure environments where the data that belongs to multiple people can be shared without compromising the identity of those persons? This is an environment where the data can be shared and analyzed, where correlations can be made and where people maintain some control over what aspects of their personal identity are revealed, and when and to whom and under what conditions. You can think of that trusted environment as a kind of stranger, a stranger that's brokering relationships between people who might not know each other and enabling people to discover value in their data that they might not otherwise have found.

The other area of research I'm doing, the qualitative research, is focused on Airbnb hosts. I'm interested in the segment of folks who are living in the homes where they're making a room available and also sharing common spaces, like bathrooms, kitchens and living rooms. I finished one round of work in Portland and I'm expanding it to Chicago and Europe. It'll end up being a global study. The point is to understand what kind of value is being created for both the host and the guest. Why are they doing this? One of the straightforward conclusions early on is that it's not just about money. There's a whole level of social value being created for hosts and guests that makes it a meaningful experience. Hosts and guests end up socializing. Occasionally, gifts are exchanged. Very occasionally, they stay in touch after a visit. There's a relating thing happening in a short-term setting that is new and interesting. It tells us about the ways in which algorithms might need to perform in the future. What does it mean for someone you don't know to get close to you? How does that change the existing intimate relations in the home?

How can these ideas enable new businesses and business models?

That's already happening. Part of what led to my interest in this work was watching how digital platforms have started to enable interactions among people who don't know each other. Beyond my friend list on Facebook, there are various digital platforms that are allowing people who don't know each other to connect. The world has turned this into the sharing economy and collaborative consumption. At Intel, we're calling this the data society. We're interested in the kinds of exchanges that are possible between people who don't know each other, whether it's people sharing their cars or people sharing their driveways.

One recent example I looked into is DogVacay. It's a platform that enables you to, instead of bringing your dog to a kennel or leaving him with a relative, take your dog to stay in the home of a dog lover. That's another great example of people opting for settings that are not known to them to enable some pretty intimate interactions to happen. As it turns out, maybe it's not blood ties and kennels that provide people with the best places to leave their pets. Maybe the fact that I don't know you matters less than the fact that you also love dogs. It's fascinating that those kinds of shifts are possible now.

How do you expect data to continue to shape our relations with others? Where do you see this trend going?

If things go well for us -- meaning individuals who are producers and consumers and sharers of data -- our world will involve much more control over our personal data. There will be capabilities that allow our data to circulate in the world and discover connections and insights and value that we wouldn't otherwise have. We wouldn't be reliant solely on the big data players or on retailers to provide us with value. Our data would be enabled to undergo that kind of discovery process more independently for us. We would both create and capture value on our own terms, for our own purposes.

As you continue to work on this, what worries you? What keeps you up at night?

I'm worried we won't move fast enough to allow people to maintain real agency in this world of data sharing. It's easy to imagine a world where all the power shifts to Google and Facebook. It's less easy to imagine a world where people have real control somewhat independently of those players. Today, that's lacking entirely. We have a window of time in which to work to make that possible. We produce the data, so we should be able to benefit from it.

Photo: Maria Bezaitis

This post was originally published on Smartplanet.com
