
Ruling the World: When Life Gets Gamed

Before AI ethics and human-centred AI, before Frank Pasquale's "The Black Box Society", before Brett Frischmann and Evan Selinger's "Re-Engineering Humanity", before Jerry Z. Muller's "The Tyranny of Metrics", before Kars Alfrink's "Contestable AI", before Dan Davies' "The Unaccountability Machine" --- we (= the tight filter bubble of second gen Internet intelligentsia) were musing gently about what it means when more and more of everyday life gets offloaded into and organised by software.

Oh those innocent days.

Part of that was me joining the stage with Tom Armitage (Hello Lamp Post!) and Kars Alfrink to talk about games. I used Zuckmayer's play The Captain of Köpenick, about an ex-con hacking a Prussian bureaucratic infinite regress, to discuss how offloading social regulation into automated decision-making requires manual overrides, and both invites and necessitates bad and desirable forms of "gaming the system".

In hindsight, I think many of these ideas have been fleshed out and evidenced way better since, including in the concept of contestability (https://contestable.ai/). But I still think there is some unexplored purchase in the idea that automated rule systems = game-like = invite gaming, and my throw-away taxonomy of types of gaming the system. Enjoy.

Talk given at Lift'12 in Geneva, February 23, 2012.

Sebastian Deterding

September 23, 2024
Transcript

  1. Let me start with a story – in fact, two

    stories. In 1906, Friedrich Wilhelm Voigt, a con man, was released from prison.
  2. Reformed, he actually wanted to become a good citizen. But

    he quickly ran into a problem: To get an apartment, he needed to document that he had a job. To get a job, he needed a work permit. But to get a work permit, he needed to document he had an apartment. And the Prussian bureaucrats wouldn‘t make an exception for him. They stuck to the rules – a bit like a computer, really. So Voigt was caught in a loop.
  3. So on October 16, 1906, Voigt puts on a Captain‘s

    uniform, grabs a group of soldiers from the street, marches over to the townhall of Köpenick, and occupies it ...
  4. … and in the course, has his work permit signed

    and stamped. This stunt immortalized Voigt in German folklore as the »Captain of Köpenick«.
  5. For the first time, I tried out one of these

    new gimmicks – a mobile ticket. All went well, until I switched my phone back on in Schiphol ...
  6. … and found that the QR code did not load

    – it was stored online. And because of the roaming charges, I wouldn‘t dare switch my WiFi on.
  7. So I walked over to these ticket machines to print

    a replacement ticket. But I got none. The ticket machine informed me that the ticket under my number was already drawn. I was stuck in a loop: The system did not foresee that someone might draw a mobile ticket, but then need a paper replacement as well.
  8. http://www.flickr.com/photos/erussell1984/2443450232 Fortunately, I could walk over to these people, who

    printed out another ticket for me so I could board in time. But on the plane, I started to wonder: What if they had not behaved like they did, but more like a Prussian bureaucrat? Like a computer? What if they had been replaced by a computer, like so many other service people at the airport? And it dawned on me that this question extended way beyond the airport. Increasingly, we live in a world ruled by computers.
  9. You experience it every time you buy something online and

    get recommendations what to buy (actually, every time you use any web site).
  10. Every time you drive on a highway and come by

    these automated traffic control systems that measure traffic and change speed limits accordingly.
  11. I am of course not the first person to observe

    this. Matt Webb of BERG calls this „The Robot Readable World“.
  12. HOW ALGORITHMS SHAPE OUR WORLD In his Lift talk last

    year, Kevin Slavin tracked „how algorithms shape our world“.
  13. HOW ALGORITHMS SHAPE OUR WORLD The geographers Kitchin and Dodge

    call this new world „code/space“. And „The new aesthetic“ that James Bridle traces tomorrow is basically the aesthetic expression of this code/space we live in today.
  14. Jane McGonigal »What if we decided to use everything we

    know about game design to fix what‘s wrong with reality?« reality is broken (2011: 7) This idea that we can put a »game layer« – goals, rules, feedback systems – over reality to »fix it«: to make it more fun, enjoyable, engaging.
  15. Life Or life itself. If you think about it a

    bit, gamification is the logical next step of the code/space: It takes this world of ubiquitous sensors and algorithms we already live in to actively steer and change people‘s behaviour.
  16. Now I don‘t know about you, but to me, this

    sounds like one big 1950s Scifi »What if?« novel turned into a real-life experiment.
  17. What if ... we let computers run our rule systems

    and put humans inside? What if … we let computers run our rule systems, and then put humans inside? That is the question I‘d like to answer today, or better: report some preliminary findings.
  18. The first thing we find are exceptions. If you look

    at the Captain of Köpenick, or my mobile ticket: Both were exceptions; they were not foreseen in the rule system.
  19. Exceptions are the rule And if you ever wrote programs

    yourself, you know that exceptions are not exceptions: They are the rule.
  20. They are the rule because the map is never the

    territory, and complexity can never be reduced: We can never foresee every edge case, and the more complex we make a model to include edge cases, the more interactions and complexities within our model we create, so that the model itself starts to produce bugs, errors, exceptions.
  21. This is something ecologists discovered when they tried to build

    ever-more complex models of ecosystems: At a certain point, making the model more complex and realistic decreased the power and quality of predictions it generated.
  22. So what we always needed and always will need is

    a manual override: A human stepping in, making sense of the situation, and handling the exception. Which is what I did when I walked from the ticket machine to the service people.
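The manual-override pattern described here can be sketched in a few lines of code. This is purely illustrative: the names (`print_replacement`, `manual_override`, the booking codes) are made up for the example, not from any real ticketing system.

```python
# Illustrative sketch: a rule system hits an unforeseen case and
# raises an exception; a human "manual override" handles it.

class TicketAlreadyIssued(Exception):
    """The rule system's dead end: a ticket was already drawn once."""

ISSUED = {"AB123"}  # booking codes for which a ticket was already issued

def print_replacement(booking_code: str) -> str:
    # The machine only knows the hard rule: one ticket per booking.
    if booking_code in ISSUED:
        raise TicketAlreadyIssued(booking_code)
    ISSUED.add(booking_code)
    return f"ticket:{booking_code}"

def get_ticket(booking_code: str) -> str:
    try:
        return print_replacement(booking_code)
    except TicketAlreadyIssued:
        # Manual override: a human agent makes sense of the situation
        # and handles the edge case the rule system did not foresee.
        return manual_override(booking_code)

def manual_override(booking_code: str) -> str:
    return f"agent-reissued:{booking_code}"

print(get_ticket("AB123"))  # → agent-reissued:AB123
```

The point of the sketch is that the `except` branch is not an afterthought: because the map is never the territory, some path out of the automated rules is always needed.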
  23. Ever-more removed But that‘s the thing: When we shift these

    systems into computers, the manual override becomes more and more removed from us. You already experience that every day when you interact with companies and end up in said phone trees. (Which is why there is a service like „Get Human“ to make the manual override accessible again.)
  24. http://www.flickr.com/photos/target_man_2000/5544736415/ Ever-more black-boxed And increasingly, even manual override is

    inaccessible: I couldn‘t check or fix what business rule kept the ticket machine from giving me a replacement ticket. And even if I were a programmer and had source code access: The more complex and older these systems become, the harder they become to fix or override.
  25. http://www.flickr.com/photos/target_man_2000/5544736415/ Take Cobol: Cobol was the main mainframe language back

    in the days. According to one estimate, 90% of all global financial transactions are still processed in Cobol. But all the programmers that ran these systems are retiring, and too few young people are learning Cobol. So increasingly, our financial transactions are operated by computer programs we cannot fix or override because no-one understands them anymore, and they are too »mission-critical« to stop, throw away and just start anew.
  26. of letter & spirit (No rule is ever explicit) Not

    only will rule systems always have exceptions: Rules are also never explicit. Rules always have a meaning, an intention. And for everyday life to work, we follow that intention – the spirit of the rule, not the letter.
  27. In fact, this is essential for rule systems to work

    in real life. Take a phenomenon like »work to rule«: People strike by sticking to the letter of their work regulations – like Austrian postal workers who once weighed every single piece of mail to ensure that proper postage was affixed, bringing the whole system to a screeching halt.
  28. But if you put a rule system into a program,

    the program will follow it to the letter – it cannot bend or overstep it toward »the spirit«. Take foursquare, for example: The system only knows the hard rule of not checking in more often than so-and-so-many times per hour.
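The letter-vs-spirit point can be made concrete with a small sketch, assuming a made-up rate limit in the style of the foursquare example (the constant and function names are hypothetical, not foursquare's actual rules):

```python
# Illustrative only: a check-in validator that enforces the LETTER of
# a rule (a numeric rate limit). Whether a check-in is in the SPIRIT
# of the game – checking in at home, auto-checkin, buses – is exactly
# what this code cannot express or judge.

MAX_CHECKINS_PER_HOUR = 3  # hypothetical limit

def allow_checkin(timestamps, now):
    """Allow a check-in iff fewer than N check-ins in the last hour.

    timestamps: seconds of previous check-ins; now: current time in seconds.
    """
    recent = [t for t in timestamps if now - t < 3600]
    return len(recent) < MAX_CHECKINS_PER_HOUR
```

Everything the community later debated at the Playful conference lives outside this function: the program has only the hard quantitative rule, so the spirit has to be negotiated by humans.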
  29. http://www.flickr.com/photos/37996583811@N01/5020671427 So what you get are these fine people at

    the Playful conference in London 2010, holding a public voting of London foursquare players to determine which kind of foursquare checkins are in the spirit of the game: Checking in at home? Using auto-checkin? Checking in at buses?
  30. intentions matter Or: Computers can‘t give credit And not only

    do rules have intentions. To us as humans, it makes a huge difference whether something is done by a person with intention or not.
  31. http://www.flickr.com/photos/beigeinside/50122570/ In a recent self-experiment for the magazine Popular Science,

    the journalist Matthew Shear tried to »gamify« all parts of his existence for a week, including »becoming a better fiancé«, where he would get points for washing dishes or taking the dog out. On the evening of day five, when he and his girlfriend went to bed, he said:
  32. »You look especially lovely tonight.« http://www.flickr.com/photos/beigeinside/50122570/ »Now I feel like

    you’re just doing it for the points.« We care whether people do something to follow a rule, or because they get an incentive for it, or because they genuinely mean it (like apologizing, or paying a compliment).
  33. Computers, however, can‘t do things and mean them, and this

    does make a difference to us. This was recently demonstrated in a nice scientific study with school kids using Scratch. If you don‘t know it, Scratch is a gorgeous software that allows kids to program video games with a very visual code editor, thus learning the principles of programming in the course.
  34. A core part of Scratch is the online community that

    enables people to remix and improve the games of other designers.
  35. To support that, there‘s an automated feature that shows if

    someone copied another person‘s project.
  36. In addition, users established the practice of thanking the original

    creator in the project notes. And in interviews, it came out that this personal, intentional note was much more important and engaging than the automated one. Indeed, many users even felt it was plagiarism if you didn‘t explicitly state in the notes which project you copied – even if the automatic attribution did it.
  37. campbell‘s law How Rules Beget Gamers So much for what

    happens when we let computers run our rule system. Now what happens when we put humans into these systems? The short answer: They become gamers. They game the system.
  38. Donald T. Campbell »The more a quantitative social indicator is

    used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.« assessing the impact of planned social change (1976) This is not a new observation. Already in the 1970s, the social scientist Donald T. Campbell formulated his famous law. What he was describing were things like schools evaluated by how students performed on certain tests, where school directors would fudge the numbers: They asked low-performing students to drop out of school, or reclassified them as »disabled«, because then they wouldn‘t be counted in.
  39. system intention But the observation is a general one: All

    social systems serve a purpose, an intention.
  40. system intention formal rules, quantified goals, something at stake And

    whenever you turn such a system into something game-like – with formal rules, quantified goals, and something at stake ...
  41. system intention formal rules, quantified goals, something at stake …

    weird things start to happen with the relation of system and intention.
  42. system intention »The Munchkin« Type #1 The first

    thing that happens is something we often observe in regular games: For some people, the system becomes its own end. People pursue the stated goal of the game and become blind to everything outside that. Among gamers, we even have a word for such people. We call them »Munchkins«.
  43. the rule of irrelevance Now to a certain extent, this

    focus is desired: We want people to want to win the game – otherwise it‘s no fun to play. Likewise, we want people to focus on the game itself. This is what sociologist Erving Goffman called »the rule of irrelevance«.
  44. Take strategy war games. Some of them, like Warhammer, are

    played on lush miniature landscapes with beautifully hand-painted figures costing hundreds of euros. But to a certain extent, while you‘re playing, that price and that beauty are irrelevant.
  45. http://boardgamegeek.com/image/1209336/advanced-squad-leader?size=original For the purposes of the game, those figures and

    landscapes might as well be represented with some cardboard counters on a simple map. The only thing that counts are the game-internal values of the units – how much damage do they do? How far away does one unit stand from another, and how does that affect my probability of scoring a hit?
  46. http://www.rasmusen.org/x/images/pd.jpg So in a certain sense, when you put humans

    into a game, they can become »rational actors« – strategic decision-makers myopically focused on maximising their outcomes, the kind of strange creature that otherwise only lives in the Prisoner‘s Dilemmas of mathematical game theory and economics. They become like computers, really.
  47. But in real games, every gamer knows there‘s a limit:

    If you go too far, you become a Munchkin. To quote from Wikipedia, »a munchkin seeks within the context of the game to amass the greatest power, score the most 'kills', and grab the most loot, no matter how deleterious their actions are to the other players' fun«. In other words, the Munchkin forgets that the purpose of playing a game is to have fun together. He forgets that he is not only a rational actor, but also a social actor enmeshed in a messy world where the beauty of the pieces and their worth and his friends and fair play and fun – where everything counts.
  48. And Munchkindom is pervasive. BMW recently tested a location-based game

    prototype to motivate fuel-efficient driving. The game challenged you to beat the amount of fuel used by other drivers for the route you entered into the navigation system. The prototype worked well – on average, test drivers used 0.4 l/100 km less fuel. In fact, the game was so motivating ...
  49. So you also played EcoChallengeTM? … that in order to

    save fuel, the test drivers engaged in not-so-safe driving practices, like dashing over a reddish light because stopping and restarting would use more fuel. In the US, »hypermiling« is the newly-minted word for this emergent consumer behaviour. Again generalising, once you add incentives or goals to anything, it can motivate all kinds of unintended behaviours. (Source)
  50. After the recent financial crisis, many critics have traced its

    origins back to Munchkindom: The market had become self-referential. In his recent book „Fixing the Game“, Roger Martin observed that tying incentives to shareholder value has turned CEOs into Munchkins focused solely on stock market price, destroying companies in the course, as they ignored that the stock market is a means to the end of funding sustainable growth of the company.
  51. <Insert Dilbert cartoon here> Similarly, the management consultant James

    Rieley observed that in every large organisation, people start to focus on the internal game of meeting their KPIs and targets and lose sight of whether these are helpful for the thriving of the organisation itself. In a word, they become office politics Munchkins. And I am sure you can think of many examples yourself.
  52. »negative externalities« Economists have their own word for this: negative

    externalities. Bad things happening as a consequence of an economic exchange that don‘t affect the exchange because they are external: They are not counted in. Again, we can generalise this: Create a rule system and targets, and everything not »counted in« tends to become an unaccounted negative externality.
  53. In a certain sense, Brenda Brathwaite‘s board game Train is

    a reflection on how we as humans are prone to become Munchkins. On the surface, Train is a transportation game with the goal to move as many people as quickly as possible from start to finish. So you have to move fast and stack people efficiently. But when the first player‘s train reaches the destination, he has to draw a „Terminus“ card, which reveals his destination. And on those cards, the player reads words like Auschwitz. Or Bergen-Belsen.
  54. »Just following orders« He discovers that he has become an

    Adolf Eichmann, »just following orders«. That he never questioned the goal he was given, or the intention of the system he was operating in.
  55. system intention »The Exploiter« personal gain Type #2 The

    second type of gamer is someone who knows the intention of the system full well, but doesn‘t care. Instead, he maliciously uses the rules for his own purposes.
  56. Take this story from the Australian economist Joshua Gans, who tried

    to raise his daughter with economic laws. She was to be potty-trained, so, good economist that he was, he introduced an incentive – Skittles – that she would get every time she went to the potty. So what would our smart gaming daughter do?
  57. She somehow managed to discipline herself so that she would

    go to the potty every twenty minutes – and eat herself sick with Skittles. And it gets even better.
  58. When her little brother was to be potty-trained, her father wanted

    to make it a social thing – so she would earn Skittles every time he went to the potty. And what did the clever lady do? She added water to the equation – that is, to her little brother. Lots and lots of water. (Source)
  59. Again, this behaviour is pervasive everywhere you have a rule

    system and something at stake. Think of filibustering in the US Senate, where Republican senators in 2010 stopped a law to disclose sponsors of political ads by using their right to speak as long as they wish, and their numbers to block the Democrats from invoking »cloture« to end it, until the Democrats gave in and abandoned the law.
  60. Think of online media: Buying Facebook or Twitter followers, black-hat

    SEO, ... or this Kindle stand that got a whopping 310 five-star reviews out of a total 335 on Amazon. Some people got curious and found that the company selling them packed a little note to the first stands it sent out. The note asked people to write an Amazon review, if they liked the product. So far, so good. But there was also this little sentence:
  61. »In return for writing the review, we will refund your

    order, so you will have received the product for free.«
  62. »The Player« Type #3 The third kind of gaming

    the system happens when people are more interested in exploring the possibilities the rule system holds than in producing a pragmatic effect. In a sense, they are the benign counterpart to the Munchkin – ignoring the original intention of the system, not out of forgetfulness, but out of curiosity. system intention
  63. Or tracking the most deleted, rather than the most listened,

    tunes. In short, exploring what effects and experiences are possible within a given system.
  64. system effect »The Hacker« intention Type #4 The

    fourth kind of gaming the system happens when people find the system itself to be broken: When the system serves an end that is not what it was originally intended for, people will hack it.
  65. Health care is a good example: It is heavily ruled

    and regulated to reduce costs. But for doctors, the point of health care is not costs, but healing patients. So when the system gets in the way of their patients, they game it: If a health insurer doesn‘t pay for a preventive MRI screening, say, they diagnose a patient as »having a brain tumor« instead of »screening for possible tumor«, to make sure people get the treatment that is best for them.
  66. And when our captain here did not get his work

    permit – well, you know the story.
  67. Plus ça change ... Whose rules? What game? I would

    like to end with a simple question: Who builds these rule systems? Whose intentions do they support? What kind of „fixing reality“ do they propose? The answer takes us back into the 1970s.
  68. Technologies of power Back then, the philosopher Michel Foucault coined

    a useful term: Technologies of power. What he meant were all the rules, procedures, machines, discourses that a society uses to control its individuals – to rule the world. And if we look at today‘s »code/spaces« and gamified applications, I'd argue they fit that bill.
  69. Stay in the game. Move on. They are designed by

    companies and governments to make you fit into the rules they devised: Fitter, happier, more productive – for their purposes.
  70. … are technologies of the self You see, technologies of

    power can also be used as technologies of the self. Technologies with which we are ruled, but also technologies we can use to rule ourselves, reflect on ourselves, transform ourselves – and in the course, lift ourselves out of the rules of society.
  71. Michel Foucault »What I mean ... are those intentional and

    voluntary actions by which men not only set themselves rules of conduct, but also seek to transform themselves, ... and to make their life into an oeuvre«. the use of pleasure (1985)
  72. And if that sounds a bit abstract, here‘s an example.

    In 1971, Luke Rhinehart wrote this thinly veiled autobiographical novel about a psychoanalyst named Luke Rhinehart who is utterly bored with his life – stuck in a rut. So one day, he sets himself one rule: Every decision he makes will be made by the throw of a die. He will write out six options and then let the die decide. As you would expect from a pulpy 1970s »cult classic«, the ensuing events are full of gratuitous sex (especially sex), violence, drugs, madness, and other social deviance. But I think the main point stands: We can use self-chosen rules to liberate, to grow, to empower ourselves.
  73. And if you prefer more recent examples, there‘s Fred Stutzman‘s

    »Freedom«, which allows you to rule yourself out of internet connectivity.
  74. »How do you use technology to generate more of those

    serendipitous encounters?« Even foursquare, in its original intention, was all about this: To use data collected about you and your friends to push you out of your rut into exploring your city, as co-founder Dennis Crowley explains.
  75. »I realized that I‘m surrounded by opportunities in life that

    I‘m not aware of.« And apparently, even game designer Will Wright is after this in his most recent venture »HiveMind«: Using recommendation engines to push us out of the trodden paths – and paradoxically into yet another comfort zone.
  76. But I think that Crowley and Wright miss a central

    insight of »The Dice Man«: Self-transformation is not about fancy technology. The »Dice Man« used the oldest and simplest game technology available: A die; dice go back before recorded history. In the end, what makes a rule system a technology of the self – or a technology of power – is how we, as human beings, relate to it. Whether we actively decide to make use of it.
  77. Whether we make sense of the rules in the situation

    at hand, and handle the exceptions.
  78. … and when we find them faulty, do not become

    myopic munchkins or self-serving exploiters, but hackers who fix what‘s really broken.
  79. If you liked this, you will enjoy ... don‘t play

    games with me! Promises and Pitfalls of Gameful Design