
What Philosophy and Neuroscience can teach us about UX design

Today, I want to talk about three things. The first is the philosophical discipline of phenomenology. The second is what that has to do with how we use tools, and hence, with how we should design tools. The third is that neuroscientists say the philosophers already got it right like a hundred years ago.

But first, let me take you to France. Paris, the city of love. In 1942. The city is under Nazi occupation. It’s night, long past the curfew, and outside SS patrols march down dark alleys with cobblestone streets. Inside, a man who would later become THE intellectual celebrity of post-war France together with his lifelong partner - think the Brangelina of the 50s - sat in front of a typewriter to work on the manuscript for his philosophical opus magnum “L’Être et le néant” - “Being and Nothingness”. Jean-Paul Sartre was a funny man. And not just because he had a disarmingly charming humour, but because he was actually funny looking. Most would actually describe him as plain ugly. He was short, ears sticking out, one eye pointing this way, one eye that way, he wore thick glasses and was notoriously unkempt.

Some people already knew him as a playwright - he brought us such memorable quotes as “Hell is other people”. Allegedly he wrote most of his plays to give the many young actresses he dated something to do - apparently his wit and intellect were rather seductive. But that night, he was tired. As he typed away at the manuscript, he tried to focus on the ideas, the concepts, the world the words he was typing were creating. But then his eyes became sore and weary, and the letters started to blur. He noticed how his attention shifted from the concepts to the letters that carried them. And then soon to his tired fingers that created those letters. Finally he couldn’t think about anything but his sore eyes and had trouble keeping them open.

Most ordinary people would just have decided to call it a night. But Sartre was anything but ordinary. He was a philosopher. Particularly, he was a phenomenologist. Phenomenology is the philosophical study of experiences, or more correctly, the study of structures of experiences. In a way, it’s a scientific discipline, a way of describing the world and theorising about it. But as opposed to physics or psychology, phenomenology is a first-person approach: you're analysing, deconstructing and describing your own experiences and consciousness. Phenomenologists don't study things for what they are, but for how they appear to us.

Sartre, being a phenomenologist, was quick to notice a certain pattern in his experiences. He noticed how his attention shifted from one thing to another - the ideas, the words, his fingers, his eyes. What's interesting here is that his attention shifts from the object of perception to the medium of perception. First he perceives the ideas through the words, then the typed words themselves become the centre of his attention. Finally, his own eyes become the object of his experience, rather than the medium, the method of experiencing.

His book ”Being and Nothingness" is of course a play on the title of Martin Heidegger's seminal work "Being and Time" - "Sein und Zeit" in German, which Sartre read in a German prison camp when he was a prisoner of war in 1941. Being and Time is kind of like the Hadoken punch in philosophical street fighting, blasting away anything in its way. You know, it tries to answer questions like “Why is there something instead of nothing?”.

The thing about this book is that it was written with a brutal accuracy that is only possible in German, where you can make up words like "Ersatzzeitgeist" and "Auskunftspflicht" - which loosely translates to "the obligation to inform authorities about any changes in permanent residential address or marital status in a timely fashion" - yes, that's a single word in German, and it has seven consecutive consonants. Even the Czechs are jealous of that one. These linguistic “features” make Being and Time almost unreadable in any language other than German.

Anyway, Heidegger introduces a distinction that is incredibly important in UX design, although we don’t call it that: presence-at-hand and readiness-at-hand. Or, in German: Vorhandensein and Zuhandensein. Guh, what's that? In his famous example, he considers a hammer. Alright, so most philosophers probably never used a hammer in their whole lives, which makes this example somewhat amusing.

There are two ways to understand a hammer. When it's lying there on the table, we can look at it, analyse it, describe it based on its constituents - wooden handle, heavy cast iron head - maybe even infer its function and use from the way it's shaped. The hammer is present-at-hand.

Magic happens when we pick up the hammer:

In order to pick up a hammer, I need to twist my hand and adjust my fingers so they’ll be able to grasp it. But as soon as the hammer is in my hand, when I use it to drive a nail into the wall, I do not think about which angle to hold my hand at to manipulate the hammer - I think about how to hold the hammer to manipulate the nail. The hammer becomes almost invisible to me; it transitions from being an object in the world to a way of interacting with the world through it. Same thing when I use a pen to write or a computer mouse to point to things. I don’t think about how to move my hand on the trackpad, I think about how to move the cursor on the screen. The hammer becomes part of my body in that way; I interact with the world through the hammer now. Now, to understand the hammer means knowing how to hammer. That's what Heidegger calls ready-at-hand. Only when the hammer is ready-at-hand does it become a means of action, rather than an object of it; only then can we achieve some fluidity in using it. When the hammer breaks, it immediately loses its readiness-at-hand and becomes merely present-at-hand, and our attention will immediately shift back from what we're trying to hammer to the hammer itself.

So we've seen two examples here of how our attention can shift back and forth between the tool itself and what we're perceiving (in Sartre’s case) or manipulating (in Heidegger’s example) through the tool.

I could easily talk about phenomenology as a tool in UX design all day, but the point I want to make today is a different one.

What I'm interested in is this transition. Moments ago the hammer was a distinct thing, an ontologically discrete entity lying there in the outside world. And as soon as I pick it up, it becomes part of me. I use it as naturally as my own hands. The boundary between "me" and the "world" shifts - now the hammer is part of me, and ceases to be part of the "outside" world.

You’re all familiar with the concept of affordances, right? Like the flat surface of a chair affords sitting on it and so on. Here’s what James Gibson had to say about hammers: “When in use, a tool is a sort of extension of the hand, almost an attachment to it or part of the user’s own body, and thus no longer a part of the environment of the user. But when not in use the tool is simply a detached object of the environment, graspable and portable, to be sure, but nevertheless external to the observer. This capacity to attach something to the body suggests that the boundary between the animal and the environment is not fixed at the surface of the skin but can shift. More generally it suggests that the absolute duality of ‘objective’ and ‘subjective’ is false. When we consider the affordances of things, we escape this philosophical dichotomy.” That’s a thesis that was actually crucial to Gibson’s work when he came up with it in 1977 but has unfortunately got lost a bit when affordances entered mainstream UX lingo.

This fact that tools can be so readily assimilated into our own body image has caused philosopher Andy Clark to call us "natural-born cyborgs." But he goes even further than this.

A little thought experiment: in 1993, Intel released the Pentium processor. Top-of-the-line home computers now had about 60 MHz of CPU power, 8 megabytes of RAM, and a 500 megabyte hard drive. And I remember this very well, because this is the year that the game Rebel Assault came out, and it was one of the first games to be distributed only on CD-ROM. No floppy disks. I got a CD drive for Christmas in 1993.

Fast forward 20 years. This is what computer games look like now. If you had a team of the most talented and experienced programmers in the world, do you think you could get a game like this to run on a 1993 PC? No. Of course not. Just not possible. No amount of clever engineering and advances in algorithms could make a game like Far Cry actually playable on a 1993 PC. Our hardware has improved by a factor of 1000 since then - THOUSAND!

Well, I’ve got bad news for you. Your hardware - your neuroanatomy - hasn’t changed much in the last 100,000 years. We’re ancient models! And yet we run software like differential calculus, post-modern art, flying fighter jets, contemporary ballet and arguing about unified string theory. Stuff that nobody was able to do 150 years ago! How is that possible? Because we're cyborgs. We use technology to enhance ourselves. And not just physical tools: we use the world itself as external memory - notepads and books to store and transfer knowledge and ideas. Or when I go shopping for groceries with a friend and we share a cart, and I put my stuff on one side of the cart and his on the other instead of both of us remembering who bought what: that’s external memory. That’s the world storing information for us. You could say we build smartness into tools to overcome our own shortcomings: hammers have a heavy head to create the momentum that we can't achieve by just swinging our hand alone.

That’s what makes us natural-born cyborgs: not just because we’re _good_ at using tools and memory aids, but because we can’t NOT use them. Using tools as naturally as our own body - or, you might as well say, using our own body as naturally as we use tools - is what defines our species. And on some phenomenological level, using a tool is indistinguishable from using our own hands. On this level, using an “external” mental process is just like using an internal one. You won’t even notice the difference.

So what I want you to remember so far is that

a) when we inspect our own experiences of tool use, we can see that tools become part of our body, and
b) philosophers say that this is the most natural thing ever

Alright, my scientifically and practically minded peers, you may now have two questions. The first question is: how do we know it’s true and not just something some crackhead philosophers made up in their cozy leather armchairs on a particularly absinthe-fueled night? The second question is: how does that help me as a UX professional?

Neuroscience to the rescue. I want to show and explain some studies done by Atsushi Iriki in Tokyo over the course of the past 10 years or so. What he did was take Japanese macaque monkeys and train them to use a simple tool like a rake to fetch food from the other side of a table that would otherwise be out of reach. Now, it’s interesting to note that, unlike apes such as chimps, macaque monkeys show very, very little tool use in the wild; however, they can learn how to use tools pretty quickly in a lab. So here’s the setup: the monkey is sitting at a table on a chair, and the researchers place some food, like a grape, on the table for the monkey to grab. First the food is placed close enough for the monkey to pick it up with its hands. But then it’s placed so far away that the monkey first has to use the rake that’s lying on the table to pull it closer. So far, so good.

What’s being studied is what’s happening in the monkey’s brain. For that, they stuck a tiny array of microelectrodes into an area of the monkey’s brain called the intraparietal cortex. That’s an area in the primate brain where somatosensory and visual information gets integrated. That means signals coming all the way from the retina through the visual cortex get merged with signals coming from the hands through the somatosensory cortex. You can think of it as two pipelines carrying different kinds of information joining each other there. Oh, and in case you’re wondering: there are actually no pain receptors in the brain, so the monkeys don’t actually feel the electrodes. We know this for sure because such electrodes are used in standard procedures in humans as well, for example for the treatment of epilepsy.

So, monkey sitting at the table, fetching food, while we’re looking under the hood in the intraparietal cortex. Now, what we actually measure with such electrodes is when single nerve cells - neurons - fire, that is, when they pass on a signal to the next nerve cell. In a very simplified model you can consider each nerve cell to be a mini-mini computer that calculates only one function - it gets some inputs from different other mini-mini computers, wrangles them a bit, and if the result seems significant enough to the cell, it tells the other nerve cells. Our cells here get their input indirectly from both the visual and the touch systems.
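
To make that “mini-mini computer” picture concrete, here’s a minimal sketch of such a threshold unit in TypeScript - purely illustrative, with made-up weights and thresholds, not the actual model recorded in the study:

```typescript
// Illustrative threshold unit: sums up weighted inputs from other cells and
// "tells the other nerve cells" (fires) only if the result crosses a threshold.
function thresholdNeuron(inputs: number[], weights: number[], threshold: number): number {
  const combined = inputs.reduce((sum, x, i) => sum + x * weights[i], 0);
  return combined >= threshold ? 1 : 0;
}

// A made-up cell that integrates a visual signal and a touch signal near the hand:
console.log(thresholdNeuron([1, 1], [0.6, 0.6], 1.0)); // both signals present -> 1 (fires)
console.log(thresholdNeuron([1, 0], [0.6, 0.6], 1.0)); // only one signal      -> 0 (silent)
```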

Now here’s the fun thing: the particular intraparietal nerve cells they were recording in this study are receptive to anything that happens close to the monkey’s hand. So if you flash a light next to the monkey’s hand, these cells will fire, and we can see that on our electrode. In fact, these cells will fire if you flash a light anywhere the monkey could reach with its hand! That’s called the nerve cell’s receptive field: the area in the real world that this cell responds to.

Now for the exciting part: if we move the food out of the monkey’s reach and give it a rake, then while the monkey is using the rake, these neurons’ receptive fields grow to encompass everything the monkey can reach with the rake. So now the nerve cells will fire whenever anything happens within the range of the tool, as if it were an extension of the hand. But only when the monkey is actually using the tool. When it’s just holding the rake without moving it, the nerve cells only respond to what the hand can reach. But as soon as it starts using the rake - zap! To the brain, or at least to these neurons, there’s no difference between the hand and the tool. It becomes completely integrated into the monkey’s body schema.
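
If it helps to picture it, here’s a toy sketch of that idea in TypeScript - a representation I’m making up for illustration, not how the recordings are actually modelled: the region the cell responds to expands from arm’s reach to rake’s reach only while the tool is actively in use.

```typescript
// Toy model of a receptive field: the area a cell responds to,
// which expands only while the tool is actively being used.
interface ReceptiveField {
  armReach: number;   // how far (in cm) the hand can reach
  toolReach: number;  // the extra reach (in cm) the rake adds
  toolInUse: boolean; // merely holding the rake doesn't count
}

function respondsTo(field: ReceptiveField, distanceFromBody: number): boolean {
  const reach = field.toolInUse ? field.armReach + field.toolReach : field.armReach;
  return distanceFromBody <= reach;
}

// Flash a light 70 cm away: silent while just holding the rake, firing while raking.
const cell: ReceptiveField = { armReach: 40, toolReach: 60, toolInUse: false };
console.log(respondsTo(cell, 70));                         // false
console.log(respondsTo({ ...cell, toolInUse: true }, 70)); // true
```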

And that’s very neat! So neuroscience has philosophy’s back on this one. But it gets even better than this. In a follow-up experiment, we’ve got the same set-up, same nerve cells being recorded. But this time the monkey can’t actually see its hand or the rake. All it can see is a video screen showing a top-down view of the table. And guess what? The same thing happens. If we flash a light on the screen where the monkey can reach it with its “virtual” hand, we get a response from the nerve cells, but not if we flash it outside the monkey’s reach. When we give the monkey a rake, this area grows to cover what the monkey can reach with its virtual rake. Even if we filter out everything but the tip of the rake from the image, the effect remains. There are still neurons accepting the virtual image of the tip of the rake as part of the monkey’s body schema. Even if we replace the tip of the rake on the screen with nothing but a white circle: same thing. Congratulations, the monkey has just learned how to use the mouse pointer. And we’ve just learned that the mouse pointer can, in fact, be part of your body.

And that’s a little bit crazy. Even if you buy the first part about hammers and pens, here we’re talking digital. It’s not just the physical mouse in my hand that becomes part of my body when I’m using it. It’s the purely virtual mouse pointer on the screen that does. This 16-pixel-tall moving blob of light is as much a part of me as my own hand.

And this is where I want to bring all of that neuroscience and philosophy back to UX design. It all comes down to a very simple, yet incredibly profound realisation: the user is not just what’s inside the skin. The cursor that allows the user to click on your pretty little buttons is not part of the interface. It’s part of the user. It’s part of how the user experiences herself. And this realisation won’t turn UX design upside down, but it can enable you to think about UX design from a new angle. Because, remember the golden rule of interface design?

The golden rule of interface design is: don’t chop off your user’s hands! We now know that the boundary between the user and the world, especially the boundary between the user and the interface, is not a given. It’s fluid. But it’s also easy to understand that if that boundary gets violated, it breaks the flow. So, for example: can we please, as a design community, get rid of the fucking beach ball? Seriously. If the system has to think and can’t accept new input, then show that as part of the system. Grey out the parts you can’t interact with right now, show spinners on buttons, or even show a full-screen overlay. Just don’t magically change the user’s cursor into a beach ball or an hourglass or whatnot. That’s almost like mutilating the user’s index finger, you know? If the system breaks, show a broken system. My cursor belongs to me!
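
In web terms, that advice might look something like this - a minimal sketch in TypeScript, where the element ids and the save() callback are made up for illustration:

```typescript
// Show the busy state on the system's side (disabled button + spinner),
// instead of hijacking the user's cursor with something like
// document.body.style.cursor = "wait".
async function saveWithVisibleSystemState(save: () => Promise<void>): Promise<void> {
  const button = document.getElementById("save-button") as HTMLButtonElement;
  const spinner = document.getElementById("save-spinner") as HTMLElement;

  button.disabled = true;   // grey out the part you can't interact with right now
  spinner.hidden = false;   // the system shows that it's thinking

  try {
    await save();
  } finally {
    button.disabled = false;
    spinner.hidden = true;
  }
}
```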

Think about what is user and what is not. And respect that boundary in your design. Right, so you know that a cursor is part of the user, not part of the interface. But what if the experience you’re designing doesn’t involve a mouse but, say, Nintendo’s Nunchuk controller? Or what if you’re creating wearables? Will they be part of the user? Is my wristwatch part of me, or an extension, or something different entirely?

Here’s a good heuristic: if it moves together, it belongs together. The reason the hammer or pen or rake or mouse cursor are so readily assimilated into our body schema is that every muscle I contract immediately and predictably creates a movement in the tool. There’s a statistical regularity between how I move and how the tool moves. The same muscle movement results in the same hand movement; the same muscle movement results in the same tool movement. The interaction is highly regular, and that’s why a tool becomes indistinguishable from my own body. This statistical regularity binds the tool to my body.

So, wristwatches: part of me? I’d say no. Why? Because we can’t actually do anything with them. We can’t interact with the world through them. It’s a wearable, but not a tool. And that’s the same for most wearable tech we currently have. I can’t actually do very much with it. Specifically, I can’t do anything that’s regular and continuous with it, which is why it isn’t a tool and will always remain a foreign object to us.

But earlier I briefly mentioned “tools of perception” - like, uh, your eyes. Or glasses. Or, say, white canes used by visually impaired people. Hearing aids. All of them, I’d argue, are tools that become part of our body, and you know why? Because the perception is continuous and regular. The same movement results in the same change in perception. The tools become integrated because they allow us to perceive something continuously and regularly. Wearables? Apple Watch? Not so. Boring. They don’t actually turn us into good cyborgs.

Do you know what would? An Apple Watch that, instead of buzzing and ringing to get our attention, would slowly get warmer when it’s about time to leave for the next meeting. And hot when you’re running late. A fitness tracker that gets heavier or tighter when you eat more calories than you burn. A bracelet that gently and almost subconsciously buzzes in the direction of your kids when they’re out playing on a big and crowded playground. Continuous and regular. That’s the kind of tool humans don’t need a manual for. That’s the kind of tool that becomes us.

So, let’s use phenomenology to better understand experiences. When we design with the user in mind, let’s take a second to think about where the user actually ends. And remember that we’re cyborgs. The best tools are not tools, they’re upgrades. So, let’s upgrade us cyborgs. Thank you.

Manuel Ebert

May 15, 2015

Transcript

  1. What Neuroscience and Philosophy can teach us about UX - Manuel Ebert (@maebert) #UXPABOS15
  2. Paris, 1942
  3. Jean-Paul Sartre
  4. L'Être et le Néant: "Any study of human reality must begin with the cogito."
  5. Martin Heidegger (DEAL WITH IT)
  6. Martin Heidegger: present-at-hand vs. ready-at-hand
  7. Present-at-hand: the hammer is just a thing in the outside world
  8. Ready-at-hand: to understand the hammer means knowing how to hammer
  9. Ready-at-hand: a tool becomes an extension of my body
  10. Let's have a seat & talk about affordances
  11. Star Wars: Rebel Assault (1993)

  12. Far Cry 4 (2014)

  13. We're all natural-born cyborgs
  14.-18. (image-only slides)
  19. The user is not just what's in front of the screen (Auntie Miranda)
  20. This has to die.
  21. #MyCursorBelongsToMe
  22. What moves together belongs together
  23. Tools of perception
  24. 1. Phenomenology rocks. Let's do more of that. 2. What's the user? Good question. Ask it more often. 3. We're cyborgs. Let's make better upgrades.
  25. Thank you. @maebert #UXPABOS15

  26. References
    Sartre, J.-P.: Being and Nothingness (1943)
    Heidegger, M.: Being and Time (1927)
    Dreyfus, H.: Being-in-the-World (1991)
    Gibson, J.: The Theory of Affordances (1977)
    Iriki, A. et al.: Tools for the Body (Schema) (2004)
    Mentioned in the discussion:
    Asimov, I.: The Secret Sense (1941)
    [notice a pattern there?]
