BONUS EPISODE: Ethics of Privacy Online with Andy Cullison

Eleanor Price: I’m Eleanor Price, and this is Ethics on My Mind, a space for us to focus on timely ethics issues. It’s a special bonus show from the Examining Ethics team, brought to you by the Janet Prindle Institute for Ethics at DePauw University.

[music]

Resident ethics expert Andy Cullison is here to talk about the ethics of privacy, sparked by the recent Cambridge Analytica scandal. The scandal is already beginning to change the way we look at our profiles on the internet, and Andy suggests that we might actually have a duty to protect our own privacy online.

Eleanor: All right, Andy. So what's up? What's on your mind?

Andy Cullison: So I've been thinking a lot about privacy. Rights to privacy, protection of privacy. I've been thinking that a lot of our conversations about privacy talk about all the goods for the person, like what's good for you about having privacy, but people seem to think that the ethics of privacy stops there. That there are things that are good for you, and that if you want to just give up your privacy, it's perfectly fine 'cause you're just giving up benefits that you know you're giving up, but you're well within your rights to do it.

And I'm actually starting to think that there are reasons why you might be in the wrong for doing that. I'm starting to think that privacy's not like, this good that you can just choose to give up. I think that people might have an obligation to protect their privacy, even if they think giving it up won't harm them at all.

So Cambridge Analytica is this consulting company that will work with political candidates to help them do their social media marketing -- advertising on Facebook and things like that. And what they did was managed to harvest data from people on Facebook, hundreds of thousands of Facebook users, through a little kinda ... It was kinda like a quiz app. To do whatever it was the app had you do, you had to give them access to pretty much all of your data, all of your likes and interests, but also access to your friends, even if your friends didn't know that they were gonna have their information harvested by this company. They harvested the data, and then they used that data to come up with sort of voter profiles to figure out who they should target with what kinds of political ads. And I think it probably shaped the election.

Eleanor: Why is the right to privacy, or just privacy in general, so important to people?

Andy: So I think it's important to start with what are the personal benefits to having privacy. What's good for you about having privacy? One is just safety, generally. Right? The world's a dangerous place, people might not like certain things about you, and so being able to choose to protect information about you can keep you safe.

Another one has to do with autonomy. We're social creatures, and so we make decisions based on how others are perceived. And, to some extent, we can't help that we are that way. And sometimes when things are known about you, you feel less free, or less comfortable to be out in the world doing things. Right? And so, sometimes having information about you not known can be liberating. It makes it easier for you to go about in the world and pursue your projects and interests.

And then a third benefit that's often talked about with privacy has to do with what you might call intimacy. Now, I don't just mean romantic intimacy. I mean the kind of intimacy that you might have even between just close best friends. I have a view that secrets, personal secrets, are kinda like social currency. Have you ever had a friend who's come up to you and said, "I'm gonna tell you something about myself, and you're only one of three people in the world who knows this about me." Have you had someone do something like that?

Eleanor: Yeah, definitely.

Andy: And how'd that make you feel?

Eleanor: I felt kind of special, but also very... like I had to protect this information. It was important that I didn't betray this person.

Andy: Yep. You feel special, and it also incurs this kind of obligation. But an obligation that you like to have, right? I think secrets are useful in kind of signaling to people that you are a close, dear, and trusted friend. And so, there's a way to reinforce intimate bonds between friends if you have secrets to share. So if you have secrets, even ones you're embarrassed by, I think that's a good thing. It's a way to open up to people, and it's a way to show someone that they are a dear and trusted friend.

So those are three different reasons someone might have, personally, for wanting their privacy protected.

Eleanor: But do we have a moral obligation to keep any of that information secret for our own sakes? Or for other people's sakes?

Andy: Yeah. So until about three weeks ago, I was in the camp of, "Hey, if you wanna give up your privacy, you are well within your rights to give up your privacy." 'Cause those three benefits - safety, autonomy, and intimacy - if you don't want 'em, and you'd rather have some cool social network where you're giving all this up, then more power to you. Give up your privacy if you want. The Cambridge Analytica thing got me thinking, well, wait a minute. Anybody who chose to give up their privacy contributed to this behemoth that could suddenly be used to manipulate the democratic process in a certain way. And I thought, well, now it looks like giving up your privacy isn't a harmless, purely personal decision. So then I started thinking, are there other reasons why people might be obliged to protect their privacy?

A lot of people who give up their privacy to Facebook will say things like, "My life's an open book. What do I care if people know all these things about me?" And that might be true for you, but there are a lot of other people for whom their life is not an open book. Right? Imagine you are a gay person in a region that is not particularly friendly to the LGBTQ community. Or imagine you have just left an incredibly physically abusive relationship. People have good, legitimate reasons to have their privacy protected. Now, if people are flippant about their privacy when they don't have a need to protect it, they fail to create market pressure for companies to protect privacy. If the vast majority of people aren't actively protecting their privacy, and aren't actively voicing concern when companies have privacy issues, then you make it difficult for vulnerable people to get goods, services, and products that will protect theirs.

So an analogy I like to use is sort of related to the sort of herd immunity argument when it comes to vaccinations. Young, healthy people won't get vaccinated because they say, "I don't care. I can get the flu. It's not gonna hurt me." But if enough people behave that way, then you create a lethal environment for vulnerable people, and I think there's something similar going on with privacy. Right? So if enough people who have nothing to hide are flippant about their privacy, they make it harder for other folks to protect their privacy who really need it.

Another reason you might think we have an obligation to protect our privacy is that giving up your privacy contributes to a big data behemoth that could cause harm to people. And it could cause harm, in part, because of the information you've given. There's this fascinating story about Target, how it inadvertently tipped off a teenage girl's dad that she was pregnant, based on her purchasing behavior. So here's what happened. A smart marketing VP figured out that with big data, based on purchasing habits, Target could predict a customer's due date. And the way he did this is women tend to buy different kinds of products at different stages of pregnancy. All that kind of information, especially spread out across millions of women, gives you an incredibly accurate idea of how far along someone is, and when her due date is. And so what Target started doing is creating custom flyers that they would mail out to women based on where they thought they were in their pregnancy. So there's a way in which Target, inadvertently, basically gave up some very personal information. Now, you might be a well-off, 30-something pregnant woman with no qualms about Target knowing your spending habits, but you contributed to that big data behemoth that indirectly divulged some very personal information about a young, vulnerable woman.

Another reason to think we have an obligation to protect our privacy has to do with ways in which you might actually inadvertently reveal something that directly causes harm to someone. So here's an example. Imagine a young woman just gets out of an abusive relationship, so she completely unplugs from social media. She stops posting things on Facebook, she stops posting things on Twitter, she's very careful about not posting her location. Imagine her ex-partner knows who her friends are, and so just starts monitoring the social media feeds of those friends, who don't really care all that much about their privacy. And imagine one of them posts something like, "Hey, ladies night! We're all hanging out at the pub having a drink. Come join us!" So there's a way in which, by just flippantly giving out information about yourself, you could cause harm to someone who's actively trying to protect their privacy.

So there are lots of examples where, by being open about what you're up to, you could reveal information about other people that could be harmful to them.

Eleanor: Andy makes the case that your responsibilities to protect your privacy online might go beyond yourself -- we have to think about others’ needs too. However, there are objections to this reasoning.

Eleanor: But aggregate data can also be used for good reasons, like health research, or sociological research. So one, how do we reconcile those things? And two, are we obligated to protect certain pieces of data about ourselves, but not others?

Andy: That's a really good question, and a good candidate objection. There are a lot of things that are like this. So take medicine and drugs. Medicine and drugs can be used in very bad ways, very harmful ways, but they also can be used in very good ways, and in many cases can be lifesaving. I think you'd say something similar about privacy. The main thing I want to say is, a decision to give up your privacy is not merely a personal decision. Your decision to give up your privacy has potential to cause real harm. And so, we don't have an unrestricted right to give up our privacy. That's the main claim. And I'm willing to say, yes, sometimes it's okay to give it up, and sometimes you shouldn't. And the real hard work is to figure out under what circumstances it's okay to give up your privacy, and under what circumstances it's not.

Eleanor: We’ve all got some serious thinking to do about privacy on the internet. When it comes to that “big data behemoth,” it’s hard to know what to do. Andy gave me some practical tips.

Andy: So one is we can just start being more intentional about what products we use. Educating ourselves about what they do, or do not do, to protect our privacy. Also, being more intentional about what we share. And then, more toward just what can you do to lock things down. A lot of these companies give you ways to turn off certain features. Google, for example, is very good about this, showing you how to protect your privacy. You can download what Google knows about you, and learn how to turn that stuff off so that Google stops learning that stuff about you. You can do the same thing for Facebook. Also, whenever you install third-party apps on your mobile phone, or on Facebook, there's a long list of things that they say they are going to access, and I would caution you against accepting a long list of those things. That's when you really start giving up some of your privacy, and probably even some of the privacy of your friends.

[music]

Eleanor: Checking your Google and Facebook settings might not feel like enough, but it’s a step in the right direction. And according to Andy, it’s the best way to help everyone have a bit more privacy.

If you’ve got questions or comments about today’s show, shoot us an email at . And if you’re an educator interested in using Examining Ethics in your classroom, we’re developing classroom toolkits and we’d love your input. Email us at .

Remember to subscribe to get new episodes of the show wherever you get your podcasts.

But regardless of where you subscribe, please be sure to rate us on Apple podcasts--it helps us get new listeners, and it’s still the best way to get our show out there.

For updates about the podcast, interesting links and more follow us on Twitter: @examiningethics. We’re also on Instagram: @examiningethicspodcast and Facebook.

Credits: Examining Ethics is hosted by the Janet Prindle Institute for Ethics at DePauw University. Christiane Wisehart and Eleanor Price produced the show. Our logo was created by Evie Brosius. Our music is by Blue Dot Sessions and can be found online at freemusicarchive.org. Examining Ethics is made possible by the generous support of DePauw Alumni, friends of the Prindle Institute, and you, the listeners. Thank you for your support.

The views expressed here are the opinions of the individual speakers alone. They do not represent the position of DePauw University or the Prindle Institute for Ethics.