
Laws to Norms - How Privacy can Influence Design

UXAustralia
August 30, 2019

Transcript

UX Australia 2019 - 30th August, Breakout session (AUUXAU3008E)

TIM KARIOTIS:

I am hoping you can all hear me. I am Tim, from the University of Melbourne, and I also work as a consultant around public health and privacy, and I manage a not-for-profit. I wear a few hats. I want to start today by acknowledging the traditional owners of the land on which I work and live, the Wurundjeri people of the Kulin Nation, and the Eora Nation land on which we are today. I also want to start by acknowledging some people who have inspired what I want to talk about today. These are people I would recommend you read, because they are at the forefront of how we conceptualise privacy, and what I will introduce you to today is a new way of thinking about privacy. I gave the talk its official title because of my bosses, but I also have another name for it. The actual title is 'Sex, Drugs and Privacy'. And you will start to realise why very soon. But on a completely unrelated topic, I am going to start with a pop quiz. And I need you to participate. I will assume that you are participating. Either way, you lose. Who is interested in working out how they can design better for privacy? Raise your hands. Who wishes that the banana flavour was the flavour of bananas? Who here has had consensual sex with another adult before? Cool. You put your hand down really quickly, though. Who here has ever taken illicit substances or illegal drugs before? Cool, cool. Who here values their own privacy? Two more big ones. Who here values other people's privacy? And who here was expecting to be asked about sex and drugs at this conference? Anyone? One person, OK. You are in the right place. So, some of you, I think, are liars. Because you engaged in some of those things, but then you said you valued other people's privacy. By answering the question about whether or not you have had consensual sex before, you just outed all the virgins in the room - anyone who just put their hand up. I get to define the variables.
By saying that you had taken illicit drugs, you outed yourself as having taken illicit drugs, but you also outed the people who haven't. So, you have just outed all those people as well. What this tells us, and this is a big thing in my talk, is that privacy isn't just important for individuals in our digital age, it's important for groups. And what you say and do doesn't just impact you and your privacy, it also impacts the privacy of those around you.
I come from a health background, so throughout my talk I will use lots of examples from My Health Record. For those of you who don't know, that's a national electronic health record that contains a summary of people's health information, which general practitioners upload to benefit other practitioners. I will use examples from that but try to make this as general as possible. Why do we care about privacy? We care about it in this space mostly because of this term – privacy by design. Privacy by design is this big monolithic concept at the moment which is being taken up more and more within the General Data Protection Regulation (GDPR), in Australia, and in privacy regulation generally: we have to react not just to privacy invasions, but to prepare for them and design privacy into our technology. When we say privacy by design, how it is practically being used is privacy engineering. That is how it is being actioned in the GDPR. And what I want to get us to think about is what it means to do privacy by design, and what we think of as design. What I would argue, and hopefully I can convince you of this, is that privacy by design for us is not about answering privacy, it's about questioning it. It's about being critical. One of the things I will talk about today has been reflected in many of the talks over the last day: the power we have is to question. Other people might do the doing. Why does society care about privacy now? We care about privacy now because we live in an information age where information is the commodity that we need to get through the world. In my research I look at health services and healthcare services, and for most of those services you have to consent to give information. If people don't consent to give information, they still get the service, but they might not get the support that they need. So, information is an essential commodity for interacting with the world. It provides benefits, but it also provides some creepiness sometimes.
And that is where the current tension comes from. Historically, though, privacy was all about having protection or space away from government – the state couldn't intervene in your space. I will talk about this in a second. What I want you to think about, though, is this: if privacy was protection from the state, what does it mean now, when lots of thinking about privacy is about privacy from private entities? So, keeping that point about the state, privacy started as a concept about individual rights – that we had a right to define ourselves, a right to be left alone. That is where it started. And lots of this was about control – that we should have control over what information about us was out there and who had access to it, and that in the private sphere we should be able to control what people knew about us. And there was this dichotomy of private and public: in private spaces we had control, but in the public space we didn't.
And what I would say is that this is really hard to sustain in today's age, when we spend a lot more time in public. The public and the private are collapsing with the internet and technology – if you are at home, on social media, are you in public or private? And we have many more ways to collect and use information in the public sphere. 50 or 60 years ago, public information was hard to use. Your address was in a phonebook somewhere and it didn't mean a lot; now that it can be aggregated and linked to other data, there are many more implications. This individual conception of privacy is very much the focus of the GDPR, which is very progressive but still sits in this individual conceptualisation of privacy. And the GDPR has seven principles. You can read them if you want. I think the one that fits with what we're thinking about and doing at the moment is principle number one. You have probably experienced it in this scenario – do you accept cookies? Do you want to change your privacy settings? I have just been travelling across Europe, so you have a nice slide there from the Louvre. The issue with the individual conception of privacy is that it relies on this idea that we have choice. I would argue that we don't have a lot of choice. We are also given a lot of information. I have never read a set of terms and conditions before, and I am a privacy researcher, but I haven't got a lot of time. What this hinges on, though, is the concept of power. And the individual conceptualisation of privacy doesn't recognise this. The person who really solidifies this for me is a researcher and author called Priscilla Regan, and what she says is that the idea that we as individuals have enough power to stand up against large organisations and states is laughable.
Think about situations, and I think the US provides many great examples, where the state or an organisation has said, "You have some individual privacy, but there is some other social good we have to balance that with." That might be fighting terrorism or public health, and your individual privacy doesn't stack up next to that social need. So, what Priscilla Regan said is that if we want privacy to work for us, we have to consider the social value of privacy. Another way to think about this is group privacy: most of the technologies we are concerned about today don't care about us as individuals, they think about us as groups. And that's important to consider in lots of our design and discussions – privacy has to be a social good if it is going to have power. One of the practical ways of using privacy which I am passionate about, and which I want to use as a framework to move forward, is contextual privacy, which is about what is appropriate in a specific context. One thing that frustrates me in healthcare is that we have these specific rules for healthcare, but
healthcare can be many things. I sound a bit biased because I spent the last couple of weeks in California with all of the contextual integrity privacy researchers. I have really drunk the Kool-Aid. Just remember there are negatives, which I will draw out. But I bought in. Contextual integrity says that privacy is about appropriate information flows in specific contexts. When new technologies come in and breach that appropriate flow, that is what creeps us out. A lot of what makes you feel creepy about technology is not some legal conceptualisation being violated; it's just that the flow is not what you expected. And what is appropriate is based on context-specific norms. In this room, there is probably not a norm to talk about sex and drugs. So when I brought that up, you probably were not expecting it. But in saying that, we are a pretty rowdy bunch, so maybe you did expect it. Under this theory, we can conceptualise norms by five parameters or dimensions: the actors – the subject of the data, the sender and the receiver – the information type, and the transmission principle. The transmission principle is the constraints on information flows, which might be consent, reciprocation, or law. We need to identify all of these parameters. Whether a flow of your information is acceptable, depending on who it is shared with, hinges on whether it is expected. For example, why do cameras out on the street creep people out? Isn't it just the same as any other person seeing you in the street? What is different? A few things: you don't know who is at the other end of the camera, so the receiver of the data has changed; and the transmission principle has changed too – if you take a photo of me on the street, I can see you and could take a photo of you, but with CCTV I can't. Contextual integrity is a descriptive way of understanding why technology challenges our expectations and norms.
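The five parameters above can be sketched as a small data model. This is a minimal illustration of my own, not something from the talk or from My Health Record – the class, function, and example norms are all hypothetical. It shows how a flow "creeps us out" when any one parameter (here, the receiver and the transmission principle) differs from an established norm:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class InformationFlow:
    """One information flow, described by contextual integrity's five parameters."""
    subject: str                  # whose data it is
    sender: str                   # who sends it
    receiver: str                 # who receives it
    info_type: str                # e.g. "diagnosis", "address"
    transmission_principle: str   # constraint on the flow, e.g. "with consent"

def breaches_norm(flow: InformationFlow, norms: set[InformationFlow]) -> bool:
    """A flow feels creepy when it matches no established norm for the context."""
    return flow not in norms

# Hypothetical healthcare norm: a GP may share a diagnosis with a specialist,
# with the patient's consent.
norms = {
    InformationFlow("patient", "GP", "specialist", "diagnosis", "with consent"),
}

expected = InformationFlow("patient", "GP", "specialist", "diagnosis", "with consent")
creepy = InformationFlow("patient", "GP", "cloud repository", "diagnosis", "no consent sought")

print(breaches_norm(expected, norms))  # False: every parameter matches the norm
print(breaches_norm(creepy, norms))    # True: receiver and principle have changed
```

The point of the sketch is that the norm is defined over all five parameters at once: change any one of them, and the flow no longer matches what people in the context expect.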
We can go further and ask: isn't all technology going to disturb our norms, and if so, how can we evaluate it and see whether the disturbance is justified? We need to consider a few different things. As I say these, I want you to think about how you might assess them if I gave you the task of evaluating a technology, and keep in mind the idea of power. The theory says you need to evaluate whether the technology has an impact on key interests – and whose interests? In healthcare, does it have an impact on the patient's interests? Does it uphold certain political or moral values? My Health Record says that it does; it says that it will give patients more control over their data. Is that a good thing? Is that what we are aiming for in the context of healthcare? Then, does it uphold the values of the context itself? If healthcare's goal is to make people healthier, does My Health Record make people healthier? I have argued in my research that it does breach privacy norms. In a normal health situation, you go to the doctor when you are sick, and they say that they will send you to another doctor, and they will
send them some information, and then you go to that other doctor. You know who is receiving the information; it is curated for them. With My Health Record, though, your information is dumped into the cloud – which someone in the Senate inquiry called a glorified Dropbox – and you don't know who will receive it or what will be done with it. It seems to breach our norms, but does it make people healthier? Does it give patients more control? That is how we would use contextual integrity: we would ask if it breaches the current norms, and whether that is justified. Does it have a moral or political argument behind it? I'm in healthcare, but some of you are in many other contexts. Another great example is education technology – is it breaching norms? I want to test the theory, looking at artists, designers... The first issue is the power of context, and this is something that I am very passionate about: context is very hard to define. My background is a community background – what is a community? What is context? This is really important. How you define a context determines whose information you have and who you are working with. What you define as your context defines who you are going to talk to. In healthcare, if you define the context as hospital and primary health care, you might only talk to doctors, and to patients in a GP's office. But for some people, healthcare can be a support worker. Suddenly, you are missing out on the experience that people have. One question I am always asking is: what are the different types of context? There is context as symbol – for example, a red cross that shows that something is medical. There is context as experience – how we experience a context like healthcare when we go out into the world, and what that means to you. And there is context as explained – and I want you to take note of this – where most of the time in privacy, we frame a context through the technology.
So, if you have given your information to My Health Record, the context is sometimes hospitals and doctors, because that is what My Health Record puts forward. When you are designing technology, you have an idea of the context you are designing for. It might not actually be that, but you arrive at a fixed notion of what healthcare, or what education, is, and you hold that in your thinking while you are designing. People make new connections with technology within contexts. Context is all about connections and disconnections: depending on how you define a context, it will contain connections between different entities, and disconnections. Coming back to privacy, it depends on what we expect in a certain context. Most fascinating for me is the idea of context collision – context collapse. This is one of the places where we really do see the creepiness around privacy: for example, when your mum adds you on Facebook, with all the pictures that you posted on a great night out, and you then have to curate your feed and put on some privacy controls. The contexts of your friends and your family have been collapsed together, and
your information norms – for example, the norms you have with your mum (I have a great relationship with my mum) – collide. I think this is very important for us as designers, and I want to show you why. I think of design as having a value set, even if we are not always aware of it. We want to make technology better, to fix things. How do we do that? For My Health Record, we do it by making sure that we share information as quickly as possible with the patient. When you set a value – I want to make people healthier – you can pursue it in many ways, and then you work out the technical specifications that you want. In my research – and this is from one of my thesis papers – what you have done is design in norms: how you think information should flow. You've done that by talking to lots of different people, different stakeholders. That has come about from a context of design: you think that you would talk to the people and make something. This will probably be different from the actual context. We can't always capture all the data, all of people's experiences. There will be some misalignments. This is context collision: the context that you defined will collide with the actual experience, and people will create new norms from your technology. The big question that I had in my research is: who really gets to define context and norms? Are we defining the context? In healthcare, is it the doctor who defines the norms? How do we get to the point where we have norms being defined? Contextual integrity says that out in the world we have set norms – we don't talk about sex and drugs at conferences before 5pm. People are creating new norms in technology, to fit their values. Who defines them? There is a great example from the UK where, as part of the design, patients weren't asked for consent before the record was opened. Why did doctors get to define this? What drives this is power, but also capital.
When you think about capital, not just in an economic sense, Pierre Bourdieu talks about different types of capital. In healthcare, it might be knowledge about health, it might be having a lab coat – there might be many things. Cultural capital is embedded in the norms: if you have this cultural capital, you understand the rules, you understand how healthcare works. You become a patient in relation to a doctor. Without that capital, you don't understand the game, you don't know what is happening, and you don't get the opportunity to define those norms and the context – and that is an issue. It is the data subject, the individual, who is probably going to be the most harmed. So, what is the solution to all this? We've gone on a bit of a journey. We are talking about norms... People have expectations, and if they get creeped out by technology, they might not turn up to healthcare. What is the solution?
I think the solution is participation. Not just any participation, and I am going to talk about what I mean. How great is this movie, by the way? I just want to talk about Disney's new streaming service, but we can do that at lunchtime. Participation is transformative – a way of life, a way of being in the world. It is not just when we bring people in and talk to them. What I think is more challenging – and I work in the participatory design space – is that norms are institutionalised: you don't know that something is a norm until you break it, and you don't know someone has power over you until you do something wrong. You don't know that you have capital. You are just kind of living your life. We actually need to help people – to work with people is a better way of putting it – to think about these broader questions of power and control. What I am thinking about in my design work, in my participatory design, is how we start to question power structures. How do we define the context and the norms before things go to crap, before we introduce the technology we have designed? The GDPR is great, but then people still get creeped out. The designers have talked to the doctors and the patients, and I think part of the answer is questioning: you can't build every single information norm into a technology, so you have to ask how you got there, and who controls that. And I think the big thing for me is disruption. And cultural capital is a practical way to disrupt. My Health Record is a good example. It asks patients to input their own information, but no doctor will ever see it. What if we allowed doctors to see it? There is a legal argument about whether that makes them liable. But what if doctors could see that information? Would it change the cultural capital? Would it legitimise patient information? And would patients then start to define the information norms about how they want their data used?
Because their data, their knowledge of health, is just as valid as the doctor's knowledge of health. Could the technology up-skill users to have more cultural capital? Could My Health Record take the data the doctors put in and translate it into simpler, layperson's information, so that patients can understand it and participate in conversations about the information and how it's used? Could that be a way for patients and doctors to work together to create these norms, rather than letting those in power, with the most cultural capital, define the norms themselves? For me, this is challenging, because I am not a patient or a doctor. I am talking to patients and doctors, but it's hard to know whether people would want to rise up and take control of their data. And one of the questions I am asking you, and myself, to think about is: what role do we as
designers play in facilitating some of this push? We started with individual conceptualisations of privacy – that you have an individual right to control your data. And hopefully I have convinced you that we need to move beyond that. It's important, and I think we have to consider the law, because it's important. But if we really want people to not get creeped out by the things we design, we can't just take the GDPR, apply it to our design, and then give it to the engineers and say, "Fix it. Make this align with the GDPR." We have to think about what people expect – what are the norms? What I am leaning towards in my work is this idea, and the first part says it well: do we really leave this up to the state, or to some independent governance board? For me, it's about governance. We have to think about ways to work with our users to give them power – to redistribute the power – to set up some of these norms. And users exercise power in the appropriation of technology. They appropriate it, so how do those norms emerge from appropriation when we are not there to support the redistribution of cultural capital? I think the idea of knowledge governance asks: what rules and roles are we setting up so that people can govern their knowledge, and establish norms around their technology that actually meet their expectations and values and can negotiate between different values? And I want to finish on this idea of community knowledge governance. How can we as designers design things – because they are things – that actually create a space for our users to start to build mechanisms for community knowledge governance? For me, My Health Record letting patients hold data they can contribute to is one step, but I think there is more. I am not an engineer or a user researcher – I am a sociologist and fit in nowhere. But it is about talking to our tech friends and asking what we can do with the technology to give people more control.
And I guess what I want to leave you with is this: even if you leave here thinking it's all too hard, these norms, people will adapt. I want you to think of privacy as something you constantly interact with. You can get hold of it. There is one thing you can take away for yourself, a question you can ask in every single focus group or interview: what do you expect to happen with your data? If I introduce this, how do you expect it to be used? That is a simple thing you can start doing to help understand whether the norms of your product or service will creep people out. Super simple. And then, if you really want to take up a challenge, you can start to be critical about who gets to decide the context. How can I spread cultural capital so that other people can have control over those norms? Please come and chat to me afterwards. I get up here and I walk around a lot, but I am actually very introverted. So come up and chat to me, or tweet me.
And one more thing – it is Wear it Purple Day, and I want to acknowledge all of our LGBTQIA+ friends in the room. And thanks, everyone, for coming to my talk. (Applause)