
Design in the Age of Synthetic Realities

UXAustralia
August 29, 2019


Transcript

UX Australia 2019 (AUUXAU2908A) Main Room, Day 1 – 29th August, 2019

ANDY POLAINE: Hello. I bought a fancy shirt, and it is so cold in here that I've got this sloppy jumper on instead. I'm going to be talking about design in the age of synthetic realities. I co-wrote the Fjord Trends. Every year we do the trends at Fjord, and this is one that I specifically wrote, called Synthetic Realities. I knew there was something going on here, but I couldn't put my finger on it. It turns out to be one of the hardest ones to give this sort of talk about, because there is so much stuff. Sometimes I show it to people and they are like, "Yeah, so what?", and some people are more in the middle. The main thing about this is that when you see it all together, you have this moment of, "Oh my God, things are going to change quite extraordinarily."

You have probably seen this. It was done by Jordan Peele for BuzzFeed, using face-swapping technology. I won't go too heavily into the technology; we'll just have a look.

SPEAKER: See, I would never say these things, but someone else would, someone like Jordan Peele. This is a dangerous time.

ANDY POLAINE: There is a longer bit where he calls Donald Trump lots of names. His whole point was that this is really dangerous: we can make any public figure, and our enemies can make anyone, say anything we like. Most people will have heard of deep fakes. They use generative adversarial networks, or GANs. These work kind of like user testing: one network generates, the other asks whether this looks like a face, and if not, the first keeps going. The two neural networks fight against each other and eventually start to generate convincing faces. One of the things that you can do with that is swap styles, and what people did was do that with faces. This is a very recent one; there are hundreds of them out there. Nicolas Cage is the poster child of deep fakes because of the film he was in called Face/Off. This is Bill Hader, who does a lot of impressions.
His face morphs into Tom Cruise's face and then back again, but only at the moments where he is doing Tom Cruise or talking about Tom Cruise. Most notably, watch his teeth and his eyebrows. Let's have a quick look.

SPEAKER: And then Tom Cruise walks in, and he is super stoked to be there. He's just immediately excited when he walks into a room.

ANDY POLAINE: Did you see that? The teeth give it away, and it is kind of spooky. That gave rise to a lot of rather alarmist headlines: there is a terrifying trend on the internet that can be used to ruin your
reputation, and nobody knows about it. There is a lot of this sort of stuff around: this is the end of news, this is the end of truth, nobody will know what is real any more. It didn't quite sit right with me, and I kept thinking there was something not quite right about this. It doesn't seem as though deep fakes are the problem. There is a very famous quote attributed to John McCarthy: "As soon as it works, nobody calls it AI anymore." Predictive text: nobody calls that AI. The same with Google Maps; nobody considers that AI. It's the same as Netflix recommending a movie; nobody thinks about that as AI.

I had this moment when I was having a conversation with my Google Home. It gave me a list of the top three restaurants it had found, and I said, "Tell me more about that last one," and it said, "I can't find 'more about that last one' on your Spotify playlist." It was like the microphone had just popped in from the top of the screen, and I realised that it was all made up. This wasn't a conversation I was actually having with a sentient being, so my suspension of disbelief was broken. I checked myself and realised that up until that moment I had been having a conversation with my Google Home. I wasn't having that moment like when your grandparents first experienced Skype.

I think the whole scare about deep fakes has it the wrong way around. The more interesting way of looking at this is not how shocking it is that people aren't going to know what is true or not; the shocking thing is that nobody cares, and how quickly we stop caring once it works. A good way to understand why fake news isn't fake news is to look at (unknown term). I would argue that the death of truth is decades if not centuries old. One of the first famous uses of digital retouching is from National Geographic: the pyramids at Giza here. There was this lovely photograph that the photographer took.
When the editors got it, they decided it didn't quite match the cover, so they used an expensive retouching system to shift the left-hand pyramid across a bit. You can see the notch where it lines up. At the time, the picture editor said, "It's just as if the photographer had taken a step immediately to the right and taken the photograph, just an alternative perspective," in a weird kind of echo of Kellyanne Conway's 'alternative facts'. There was an uproar from the readership, because National Geographic positions itself as a scientific journal, and people felt it should not be acting that way.

Not long after, and if you look at the copyrights in the About box here, they go all the way back to 1985, Photoshop 1.0 came out; they started working on it around 1986. I used this version of Photoshop, and it was astounding; it was amazing to be able to clone stuff, to select and copy and paste. I mention the dates just to show how quickly this went from an expensive custom system, to this, to a verb. We say, "Can't you just Photoshop it?" So much so that we all have it on our phones now. If you have Snapchat, you give yourself a fancy selfie; we all downloaded FaceApp and used it, but we don't think about the technology going on behind it. But there is a longer history to this. Hitler, Stalin,
all the great dictators have had censors who removed unwanted people from their pictures. The guy standing next to Stalin there, Nikolai Yezhov, was the Water Commissar. He fell out of favour, he was arrested and shot, and, just to add insult to injury, Stalin removed him from his Instagram feed.

What you find is that it's not the particular technology, it's the storytelling. You can mislabel something and that does just as good a job. What I'm about to show is a Nancy Pelosi video that has just been slowed down a bit to make her seem drunk. Rudy Giuliani, Donald Trump's best mate, tweeted it saying, "What's wrong with Nancy Pelosi?" Australians will remember the children overboard affair, which used cropped photos to make the case for John Howard being strong on immigration. The stories you attach to stuff make a big difference.

Putting the death of truth and fake news aside, what does this mean for the creative industry? What I wanted to do was take a bit of a journey through where this stuff is, then end up with what it means for design, and we will come onto some ethical questions as well. We are now at the stage where we have pretty high-fidelity generated imagery. The dog and the burger are pretty interesting. The burger looked pretty stodgy previously, but now it is much better. None of this exists, and it is pretty good because it illustrates how these systems can completely create images. Dog-ball is great; we're going to come back to these weird serendipitous things. I really like dog-ball. These are all generated.

I talked about Photoshop. In 2014 you got these really grainy, 1960s-mugshot-like faces, and now you have a very relatable-looking human being who never existed. This is what they look like when you see lots of them. It still kind of boggles my mind: none of these people ever existed. I feel a little bit bad, because none of them will ever exist again.
Then you see how fluid race and ethnicity are, because they are generating all of these, just adjusting things like hair colour, skin tone, glasses. You can see how attractive this would be to businesses, brands and marketing people. One of the things you could do is finally solve that diversity and inclusion problem, because you can generate content for your websites and apps and make it match people, make it about you. I'll come back to that point a little later. Just one more thing: have a look at that. None of those people ever existed; it is strange and a bit freaky.

Of course, brands have been playing with this. Dove brought out a campaign in the UK called Real Mums, #realmums, and they ran the advertising imagery through AI, and this was the average that came out, the image of a perfect mum. A friend of mine's wife said, "I think that is an advertising guy's dream of what a perfect mum looks like; it is not reality."
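For the technically curious, the generator-versus-discriminator game behind all of these generated faces can be shown in a toy form. This is a deliberately minimal sketch, not how production face generators work: a one-dimensional "generator" with two parameters learns to mimic a target distribution because a logistic "discriminator" keeps trying to tell its samples apart from real ones. All the numbers here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# "Real data": samples from N(4, 1.25). The generator must learn to mimic it.
def real_batch(n):
    return rng.normal(4.0, 1.25, n)

# Generator G(z) = a*z + b; discriminator D(x) = sigmoid(w*x + c).
a, b = 1.0, 0.0
w, c = 0.0, 0.0
lr = 0.01

for step in range(5000):
    z = rng.normal(0.0, 1.0, 64)
    x_real, x_fake = real_batch(64), a * z + b

    # Discriminator step: push D(real) toward 1 and D(fake) toward 0.
    d_real, d_fake = sigmoid(w * x_real + c), sigmoid(w * x_fake + c)
    grad_w = np.mean((d_real - 1) * x_real) + np.mean(d_fake * x_fake)
    grad_c = np.mean(d_real - 1) + np.mean(d_fake)
    w -= lr * grad_w
    c -= lr * grad_c

    # Generator step: push D(fake) toward 1, i.e. fool the discriminator.
    d_fake = sigmoid(w * (a * z + b) + c)
    g_grad = (d_fake - 1) * w          # dL/dG for the loss -log D(G(z))
    a -= lr * np.mean(g_grad * z)
    b -= lr * np.mean(g_grad)

fake_mean = float(np.mean(a * rng.normal(0, 1, 10000) + b))
print(round(fake_mean, 2))  # should drift toward the real mean of 4.0
```

Real systems like the face generators shown here use deep convolutional networks on images rather than two-parameter line fits, but the adversarial loop has exactly this shape: each network's training signal is the other's failure.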
Once you have your still faces, what you then need to do is animate them. You take your still, it turns it into a really cheesy Photoshop-style girl, and it uses the edges to generate the other output. So every speaking face where it says 'output' is completely generated, and again you can change attributes like face shape and skin colour. Then there are systems that look at source footage, capture the bodies and body segments, understand what they need to do in 3D, do the shading and synthesise the new performance. Everywhere it says 'output', it is generated by the GAN.

The technique Jordan Peele used is facial reenactment. This is Theresa May, who was a fantastic Prime Minister, just before Brexit, just before we got our current fantastic Prime Minister. As a British guy, I have to apologise. What you are seeing is the source footage. They are making her pout, but in the middle one they have locked off her pose and are editing her features, so she is just standing there blinking at the camera. That is how I imagine her doing the Brexit negotiations. (Laughter) She blinked last.

All of that requires quite a lot of training data, but even just one shot can be turned into an animated face. Here is the Mona Lisa as a living portrait. They have done it with people like Einstein as well. It is quite freaky. That is using imagery to generate other imagery. The other approach is to use text and dispense with the source image altogether. You still need a lot of training data, but you put in 'this bird is red and white and has a very short beak', and because the model has learned quite a lot about birds, it generates a good bird. It knows that birds sit on branches. It can generate a bit of background too; it doesn't know much about that, but it does know about birds. So we have some of those features, and what we need next are digital actors.
One of the very first examples of this was in Jurassic Park. Some people in the audience may be too young to have seen Jurassic Park, but I hope some of you have seen this: there is a scene where Lex is crawling through an air vent, falls, and hangs by her arm. The stuntwoman crawling through looks up, and instead of reshooting the scene, the team at Industrial Light and Magic mapped the actress's face onto the stuntwoman. The top one here is thousands of dollars of visual effects to map Carrie Fisher's face, and there she is; the bottom one was done in one day with a desktop computer. They used the old image of Carrie Fisher in Star Wars, and she looks more like the old Princess Leia. It is pretty remarkable.
So once you have faces and bodies, you can get rid of models altogether. There is a start-up in Japan that generates fashion models in different poses. You can see it interpolating: the clothes are generated, the poses are generated, and they switch between gender and race all the time. This is what I was talking about before. If you are in the business of being a model, particularly the more anonymous catalogue work for fashion, that is all going to go away.

With all of these synthetic things, as with my Google Home, you get the willing suspension of disbelief. Willing suspension of disbelief is what happens when you are watching a movie at the theatre, or reading a book: you know it is not real, but you suspend your disbelief to immerse yourself. One of the most famous examples is Lil Miquela. Who follows Lil Miquela in here? Lil Miquela is a virtual Instagram influencer. You liars. Here she is: she is absolutely styled, she is a little bit sexy, and she talks about her life, worries and woes. It is all about the storytelling. She says, "Let's go to the beach. Beach, that is all." She is so social media. She has 36,500 likes on that. Half of the comments feel like they are playing along, knowing that she is not real but commenting anyway, and the other half may not get it. I think this is only possible in the age of Instagram and social media: you don't have to actually do anything, but if you can write good stories, people will play along. She has a music career. She is a TV presenter. She was in a controversial Calvin Klein campaign where she kissed another model, even though she identifies as a straight bot. (Laughter)

Now if you think that is weird, this is Hatsune Miku. She is a Vocaloid, a vocal synthesiser: you put in lyrics and she sings them. She has over 100,000 songs recorded.
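As a toy illustration of the idea, not of how Vocaloid actually works (Miku's voice is built from recorded vocal samples, and real systems model phonemes, vibrato and transitions), here is a minimal "singing" synthesiser: you hand it a melody, one pitch per syllable, and it concatenates enveloped tones into a waveform. The frequencies and durations below are made up.

```python
import numpy as np

SAMPLE_RATE = 22050  # samples per second

def note(freq_hz, dur_s):
    """One 'syllable': a sine tone with a short fade-in/fade-out envelope."""
    t = np.linspace(0.0, dur_s, int(SAMPLE_RATE * dur_s), endpoint=False)
    tone = np.sin(2 * np.pi * freq_hz * t)
    # Ramp up over the first 10% and down over the last 10% of the note.
    envelope = np.minimum(1.0, 10 * np.minimum(t, dur_s - t) / dur_s)
    return tone * envelope

def sing(melody):
    """melody: list of (frequency_hz, duration_s) pairs -> one waveform."""
    return np.concatenate([note(f, d) for f, d in melody])

# A five-note phrase, one pitch per syllable of some made-up lyrics.
phrase = [(440.0, 0.3), (494.0, 0.3), (523.0, 0.4), (494.0, 0.3), (440.0, 0.6)]
wave = sing(phrase)
print(round(len(wave) / SAMPLE_RATE, 2))  # → 1.9 (total duration in seconds)
```

Writing `wave` out as an audio file is a few more lines with Python's standard `wave` module; the point is only that a voice, like a face, is just data that can be generated on demand.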
Here she is in concert. (Audio plays) (Sings in Japanese)

ANDY POLAINE: If anyone saw Tupac at Coachella, they did the same thing. If anyone wonders whether it is fake or real: she is just an artist in concert. There was an installation at the Dalí Museum in Florida where his body is an actor but his face is generated. Here is a bit where he takes a selfie, high up in the installation, and he breaks the
fourth wall. He then shows you a picture of yourself standing in front of him in the selfie. He blurs the line between what is real and what is AI. (Unknown term), I don't think I said that correctly. This news anchor doesn't exist. He is head and shoulders behind a desk, and here he is talking about how he is now coming out from behind the desk and has some more gestures and body language he can use. And of course, the nice thing about a synthetic news anchor is that he can say whatever you want.

If anyone has read Ian McEwan's book 'Machines Like Me', whose narrator buys one of the first androids on the market, there is a line in it: "There is nothing so amazing that we can't get used to it." The supercomputers in our pockets, which we all have, are only 10 years old. It is so remarkable, and we all got used to them. I think this will happen just as quickly. The whole scare about the death of reality misses that; it is long gone. My wife was crying the other day about the death of a character in a 3D animated movie.

You can animate animals; capture and animate poses and movement; modify the characters and expressions of anyone you like; synthesise places and landscapes; clone voices, which I haven't talked about much today; and generate writing. If you go to OpenAI's site, you can write a couple of sentences of an article and it will generate the rest of the article for you. It is a bit hit and miss, but when you train it, it can work well. You can generate video. You can link them all together. Pretty much everything is there: a whole suite of design tools, artificially generated.

This is Uizard, pronounced 'wizard'. You drop in a wireframe sketch and it generates the code and the assets, and you can link them together so that you can say, "This button goes to this screen." They are showing you the code here, and in a minute they will move the window around. That drudge work of going from your wireframe sketches to working screens is going to go away. One of the things you can do with that is start testing with a GAN.
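The standard machinery behind "generate lots of variants and let usage decide which wins" is the multi-armed bandit. Here is a sketch using Thompson sampling over four imaginary generated UI variants; the variant names and conversion rates are invented, and a real product would observe real clicks instead of simulating them.

```python
import random

random.seed(42)

# Pretend a generative model produced four UI variants. Each has a true
# conversion rate that is unknown to us and only observable through clicks.
TRUE_RATES = {"variant_a": 0.04, "variant_b": 0.06,
              "variant_c": 0.11, "variant_d": 0.05}

# Beta(1, 1) prior per variant: successes and failures observed so far.
stats = {v: {"wins": 1, "losses": 1} for v in TRUE_RATES}

def choose_variant():
    """Thompson sampling: draw from each posterior, serve the best draw."""
    draws = {v: random.betavariate(s["wins"], s["losses"])
             for v, s in stats.items()}
    return max(draws, key=draws.get)

for _ in range(20000):
    v = choose_variant()
    clicked = random.random() < TRUE_RATES[v]   # simulated user behaviour
    stats[v]["wins" if clicked else "losses"] += 1

served = {v: s["wins"] + s["losses"] for v, s in stats.items()}
print(max(served, key=served.get))  # traffic concentrates on the best variant
```

The appeal for design is exactly what the talk describes: instead of hand-picking two versions to A/B test, the generator proposes many, and the bandit quietly routes users toward whatever the data says is working.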
Instead of A/B testing just a few things, you can essentially let the AI generate and test a lot of different versions. It modifies itself for the users; all you know from the data is what is being successful or not. This is Adobe's content-aware fill tool. The way it works is that instead of painstakingly cloning things out, you just select the cars, in this case, that you don't want, and the AI generates imagery to fill the gap. This image is from a thing called GANbreeder, where you breed two images together. There are a lot of cats in it, because of the internet. Eventually you end up with these abstract or surreal illustrations.

One of the fears about AI in the creative industry is that AI is going to replace everyone's jobs, and everyone thinks that people in the creative industry will not be touched. Actually, humans get stuck in ruts all the time; we use all the same fonts. What AI can do is
create things that we would never have thought of. What will start to happen is that you will work with AI and nudge it in different directions; it becomes a kind of co-creation. It will start to learn from you, because you are training it.

Has anyone here seen Runway ML? It's quite interesting. All of the stuff I've been talking about so far lives in the realm of fairly obscure papers and things you can find on GitHub: you have to download and install all these Python libraries. The rest of us went to design school because we're not very good at maths. There is an equivalent here to the days of Web 2.0. In the early 2000s you had to be a web person to publish things online, and the rise of things like Blogger and WordPress changed all that. Just like Photoshop before it, this suddenly broadens the range of people using this stuff. Runway just came out of private beta, so you can go and download it. It shows you all of those models, and you can download them from the cloud and run them mostly on your machine, things like the face generator I showed you.

This is one of my colleagues, Dennis, and these are time-lapses of him messing around with a thing called GauGAN: you give it simple shapes and it generates close-to-photorealistic scenery. I don't know how long that sped-up video really was, probably a couple of minutes. For those of you who know Figma, the collaborative design tool, imagine this working the same way. That gives rise to the question of who owns this. If I have been building up a relationship with my AI tools, when I leave a company, do I own that, or do they own it? That's one of the questions we really have to resolve. Imagine having to start again, or taking over someone else's AI that is really hostile to your taste. Another thing we will see is synthetic twins.
There was a vlogger in China who was using an avatar to look like the young woman on the right. During one of her live streams, and she has hundreds of thousands of viewers, she had a tech glitch, and it revealed her to be the woman on the left. For those of you who have kids, this is what your teenagers do: they have the Instagram account that you can see, and then they have their Finsta, the real one, which only their friends can see.

This one is totally Black Mirror: it takes all of your online life, pushes it into machine learning that generates a version of you, and you can add a scan of your face, so your loved ones can have a little chatbot of you after you die. But if you've gone to all the effort of doing that, you are not going to wait until after you die to use it. I can have my work avatar be at work for me, and I can have my social media one, and so on. There are also versions of that where you send your digital self out to mess up the algorithms; this is a thing called URME. Facial recognition wants a face, so you just give it one, not your own. And what about this? I've just broken up with someone, and I know I can do one of these in Photoshop.
This took me 30 seconds: I just masked this out, and content-aware fill filled it in. What about when I change my relationship status on Facebook, and a little pop-up asks if I want to erase my ex from all of the photos on my timeline? Is it OK, as a designer generating imagery for a website I am building, to say, "I think I'll have it a bit more Asian, or a bit more African"? Do I, as a middle-aged white guy, know what that means? Is that OK, or is it the same as me just looking through a stock photo catalogue and choosing people randomly there? I don't really have an answer, but I think it's an important question to be asking.

Then there is Apple's FaceTime attention correction feature. When you are FaceTiming with someone, your eye line is on the screen, not on the camera. What the feature does is use AR to bend your eyeballs up a bit and make it look like you are looking the other person in the eye. Nobody would even notice without prior knowledge of the feature. But what if I'm in a culture where staring someone in the eyes all the time is impolite or rude? Or what if I'm in an interview and I turn on the spot-removal feature because I'm a bit hung over, just to take the bloodshot look out of my eyes? This is called 'emotional intervention', where the guy on the left is having his face made to look more smiley. If you are someone like me who has a grumpy resting face, I can make myself look happier. That is great, unless I am having an online therapy session and my therapist is looking for signs of depression. If I know that the company interviewing me over Skype is looking for diversity, you start to get over into that discomfort zone. What starts off as a well-intentioned feature starts to get a bit creepy. When you record a video call with someone, it tells you it is being recorded; maybe we also need a warning to say that emotional filters are active.
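Photoshop's content-aware fill uses far more sophisticated patch matching and, increasingly, generative models, but the simplest classical baseline for "erase this and fill the hole" is diffusion: repeatedly replace each masked pixel with the average of its neighbours until the hole blends in. A sketch of that baseline, with an invented greyscale test image:

```python
import numpy as np

def diffuse_fill(image, mask, iterations=300):
    """Fill masked pixels by repeatedly averaging each one's 4 neighbours.

    image: 2-D float array (greyscale). mask: boolean array, True where
    the unwanted content (the cars, the ex) has been painted out.
    """
    filled = image.copy()
    filled[mask] = filled[~mask].mean()   # crude initial guess for the hole
    for _ in range(iterations):
        neighbours = (np.roll(filled, 1, axis=0) + np.roll(filled, -1, axis=0) +
                      np.roll(filled, 1, axis=1) + np.roll(filled, -1, axis=1)) / 4.0
        filled[mask] = neighbours[mask]   # only masked pixels are rewritten
    return filled

# Demo: a smooth gradient image with an 8x8 hole punched in the middle.
truth = np.outer(np.linspace(0.0, 1.0, 32), np.ones(32))
mask = np.zeros((32, 32), dtype=bool)
mask[12:20, 12:20] = True
restored = diffuse_fill(truth, mask)
print(bool(np.abs(restored - truth)[mask].max() < 0.01))  # → True
```

Diffusion only works this cleanly on smooth regions; real content-aware fill has to copy plausible texture from elsewhere in the picture, which is exactly why it feels like magic and raises the design questions above.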
Looping back to the days of Photoshop: the photo editor at National Geographic said we have to rely on the ethics of the people using these tools. As Chris was talking about, as we start teaching machines to be our assistants, just like good children, they pick up on our ethics and repeat them. It remains to be seen whether that will still hold. Thanks very much. (Applause)