
Ethics, Data Science, and Public Service Media


In this talk, I will look at the contributions public service media organizations can make to the emerging understanding of the responsible and ethical practice of data science. We will look at some specific project examples: what works and where we can improve.

Among them are automated decision-making processes, which are coming to Public Service Media (PSM) because they offer a competitive advantage and competitive potential. But PSM are about making sure that people have a shared understanding of the world around them. How can you balance these two different expectations?

By the end of the talk, the audience should have examples and principles they can apply in their own data science practice.

As presented at EGG London 2019

Ben Fields

July 02, 2019

Transcript

  1. Ethics, Data Science, and
    Public Service Media
    Ben Fields
    @alsothings
    ben.fi[email protected]


  2. Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
    The Case for Public Service Recommender Algorithms
    Ben Fields, Rhianne Jones, Tim Cowlishaw
    BBC
    London
    [email protected]
    ABSTRACT
    In this position paper we lay out the role for public service organisations within the fairness, accountability, and transparency discourse. We explore the idea of public service algorithms and what role they might play, especially with recommender systems. We then describe a research agenda for public service recommendation systems.
    1 INTRODUCTION
    In traditional commercial applications of recommender systems, the goal is a straightforward extension of an organisation's overall commercial aims. This leads to a focus on designing and optimising recommender systems that above all else improve overall revenue (via increased purchasing) or, in the case of a subscription service, increased engagement (via increased consumption of items, e.g. listens for songs, views for short video). However, for a class of organisations that answer to the public rather than shareholders, a different drive exists: public service. Whilst there is no single definition of what constitutes public service motivations, there are several ways in which the notion of public service enshrines the principles of Fairness, Accountability and Transparency (FAT) and presents an opportunity for novel ways to design recommender algorithms, challenging the orthodoxy of commercial applications of this technology.
    2 CONTEXT
    Using data to deliver personalised services to audiences has become a key strategic priority for many Public Service Broadcasters across Europe [7]. However, the increasing use of algorithmic recommendations and personalisation in Public Service Media (PSM), specifically in journalism, has surfaced concerns about the potential risk these models pose for PSM values like universality and diversity: the potential to undermine shared and collective media experiences, to reinforce audiences' preexisting preferences, and the cumulative risk of PSM becoming more like a goldfish bowl than a window to the world [1, 17, 19, 23, 25]. However, counter to this is the view that recommender systems could be important in promoting diversity of supply and stimulating exposure diversity [8, 9]. The European Broadcasting Union (EBU) describes the need for due oversight and scrutiny [26] to ensure recommenders do not undermine editorial independence, impartiality [7] or PSM's trusted reputation. They must deliver recommendations that responsibly balance personalisation with the public interest.
    3 PSM VALUES AS A FRAMEWORK FOR RECOMMENDER SYSTEMS
    The notion that PSM values offer distinct frameworks for recommender systems is underpinning new EBU initiatives to develop distinctly PSM approaches to recommendations. As a public service broadcaster, the BBC's aims and operating principles are enshrined in our public purposes, which commit us to impartiality, distinctiveness, and diversity in our output. Issues of Fairness, Accountability and Transparency (FAT) thus inform approaches to recommendation and personalisation, as these values are baked into the BBC's very reasons for existing as an organisation, in a way that is not necessarily true of commercial organisations. John Reith's famous imperative for the BBC to "inform, educate and entertain" lies at the heart of the BBC mission. Whilst this has evolved over the years, the BBC's unique duty and role in society remains central.
    In the domain of recommender systems, the Reithian view of PSM commits to providing content which fulfils the public's need for diverse and balanced information, entertainment, and education in a manner which is unexpected or surprising, best expressed by Reith's assertion that "the best way to give the public what it wants is to reject the express policy of giving the public what it wants". Notions of public service inevitably vary across different geo-political and cultural contexts [8] and a one-size-fits-all model is likely to be unsatisfactory, but it is clear that the PSM remit has implications for how we design and evaluate recommenders to ensure principles such as exposure diversity and surprise are maintained.
    4 PUBLIC SERVICE ALGORITHMIC DESIGN: WHAT YOU OPTIMISE FOR MATTERS
    Why is this significant for recommender systems specifically? The metrics we choose to optimise for are critical. Many commercial providers optimise for engagement and audience figures; for example, collaborative filtering (CF) algorithms are often evaluated in terms of how accurately they predict user ratings. If PSM or-
    Human-centric evaluation of similarity spaces of news articles
    Clara Higuera Cabañes, Michel Schammel, Shirley Ka Kei Yu, Ben Fields
    [first name].[last name]@bbc.co.uk
    The British Broadcasting Corporation
    New Broadcasting House, Portland Place
    London, W1A 1AA
    United Kingdom
    Abstract
    In this paper we present a practical approach to evaluate similarity spaces of news articles, guided by human perception. This is motivated by applications that are expected by modern news audiences, most notably recommender systems. Our approach is laid out and contextualised with a brief background in human similarity measurement and perception. This is complemented with a discussion of computational methods for measuring similarity between news articles. We then go through a prototypical use of the evaluation in a practical setting before we point to future work enabled by this framework.
    1 Introduction and Motivation
    In a modern news organisation, there are a number of functions that depend on computational understanding of produced media. For text-based news articles this typically takes the form of lower-dimensionality content similarity. But how do we know that these similarities are reliable? On what basis can we take these computational similarity spaces to be a proxy for human judgement? In this paper we address this
    • Analogously, what are efficient and effective means of computing similarity between news articles
    • By what means can we use the human cognition of article similarity to select parameters or otherwise tune a computed similarity space
    A typical application that benefits from this sort of human-calibrated similarity space for news articles is an article recommender system. While a classic collaborative filtering approach has been tried within the news domain [LDP10], typical user behaviour makes this approach difficult in practice. In particular, the lifespan of individual articles tends to be short and the item preferences of users are light.
    This leads to a situation where in practice a collaborative filtering approach is hampered by the cold-start problem, where lack of preference data negatively impacts the predictive power of the system. To get around this issue, a variety of more domain-specific approaches have been tried [GDF13, TASJ14, KKGV18]. However, these all demand significant levels of analytical effort or otherwise present challenges when scaling to a large global news organisation. A simple way to get around these constraints while still meeting the functional requirements of a recommender system is to generate a similarity space across recently published
    https://piret.gitlab.io/fatrec2018/program/fatrec2018-fields.pdf
    https://research.signal-ai.com/newsir19/programme/index.html
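The second paper's workaround for cold start is a content-based similarity space over recently published articles. As a rough illustration of that idea, here is a minimal sketch using TF-IDF vectors and cosine similarity via scikit-learn; the example articles are hypothetical, and the paper's actual feature extraction and models may well differ.

```python
# Minimal sketch of a content-similarity space over recently published
# articles: neighbours in the space can be recommended without any user
# preference data, sidestepping the cold-start problem described above.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical example articles (in practice: recent publications).
articles = [
    "Parliament debates the new broadcasting charter",
    "Streaming services reshape television viewing habits",
    "Charter proposals spark fresh debate over broadcasting funding",
]

# Embed each article as a TF-IDF vector (one lower-dimensional proxy for
# content similarity; the paper's actual representations may differ).
vectors = TfidfVectorizer(stop_words="english").fit_transform(articles)

# Pairwise cosine similarities define the similarity space.
sim = cosine_similarity(vectors)

# Recommend the nearest neighbour of article 0, excluding itself.
neighbours = [i for i in sim[0].argsort()[::-1] if i != 0]
print(f"Most similar to article 0: article {neighbours[0]} "
      f"(score {sim[0][neighbours[0]]:.2f})")
```

Whether such a space is a reliable proxy for human judgement is exactly the evaluation question the paper takes up.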


  3. recommender algorithms
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  4. Public service recommender algorithms
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  5. Motivations and Aims
    Data to deliver personalised services
    has become a key strategic priority
    for Public Service Media
    organisations across Europe
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  6. potential to undermine shared and
    collective media experiences
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
    Motivations and Aims
    https://www.flickr.com/photos/woolamaloo_gazette/47571470732/


  7. reinforce audiences’
    preexisting preferences
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
    Motivations and Aims
    https://www.flickr.com/photos/garryknight/4659576761


  8. Motivations and Aims
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
    (Public Service) Media
    becoming more like a
    goldfish bowl, rather than
    a window to the world
    https://www.flickr.com/photos/60852569@N00/3321751008/


  9. Motivations and Aims
    For traditional
    commercial applications
    the goal is a
    straightforward extension
    of an organisation’s
    overall commercial aims
    https://www.flickr.com/photos/24354425@N03/16593266327
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  10. “inform,
    educate,
    and
    entertain”
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  11. Thus far, guidance on how public service (media) should
    approach personalisation has been vague and insufficient
    to drive implementation
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  12. How can public service values lead to
    novel ways to design recommender
    algorithms?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  13. How do we operationalise public service
    media (PSM) values as tangible concepts in
    specific contexts?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  14. operationalise PSM
    Caveat: Notions of public service inevitably vary across
    different geo-political and cultural contexts (Helberger
    2015)
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  15. operationalise PSM
    Key challenges: operationalising concepts like diversity,
    surprise, shared experience, etc.
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  16. operationalise PSM
    “any initiative to promote diversity
    exposure will first have to deal with
    ‘the question of what exposure
    diversity actually is’ as well as how
    to measure it” (Helberger et al. 2018)
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
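
To make the quote concrete: one simple (and admittedly reductive) way to operationalise exposure diversity is the Shannon entropy of the topic distribution a user is actually shown. This is an illustrative sketch only, with hypothetical topic labels, not an established PSM metric:

```python
# Illustrative sketch: exposure diversity measured as Shannon entropy
# over the topics in a recommended slate. Topic labels are hypothetical;
# this is one naive operationalisation, not an established PSM metric.
from collections import Counter
from math import log2

def exposure_diversity(topics):
    """Shannon entropy (in bits) of the topic distribution of a slate."""
    counts = Counter(topics)
    total = len(topics)
    return -sum((c / total) * log2(c / total) for c in counts.values())

print(exposure_diversity(["politics"] * 5))             # 0.0: no diversity
print(exposure_diversity(["politics", "science",
                          "arts", "sport", "health"]))  # ~2.32: maximal
```

Even this toy version forces the definitional questions Helberger et al. raise: diversity of what (topics, outlets, viewpoints?), counted over which window, and weighted how.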


  17. How much accuracy loss is acceptable in
    pursuit of new metrics, e.g. diversity?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  18. At what (accuracy) cost?
    Tradeoffs must be made explicitly to minimise
    unexpected and undesirable outcomes
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
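
One way to make the tradeoff explicit is to put it in the objective itself. The sketch below uses a greedy, maximal-marginal-relevance-style re-ranker in which a single parameter lam states how much predicted relevance we are willing to trade for diversity; all scores and similarities here are hypothetical inputs, not a description of any production system.

```python
# Sketch of an explicit accuracy/diversity tradeoff via greedy MMR-style
# re-ranking: score = (1 - lam) * relevance - lam * (max similarity to
# items already chosen). All inputs below are hypothetical.
def rerank(relevance: dict[str, float],
           sim: dict[tuple[str, str], float],
           k: int, lam: float) -> list[str]:
    chosen: list[str] = []
    pool = set(relevance)
    while pool and len(chosen) < k:
        def score(item: str) -> float:
            # Redundancy: similarity to the closest already-chosen item.
            redundancy = max((sim.get((item, c), sim.get((c, item), 0.0))
                              for c in chosen), default=0.0)
            return (1 - lam) * relevance[item] - lam * redundancy
        best = max(pool, key=score)
        chosen.append(best)
        pool.remove(best)
    return chosen

relevance = {"a": 0.9, "b": 0.85, "c": 0.4}
sim = {("a", "b"): 0.95, ("a", "c"): 0.1, ("b", "c"): 0.2}
print(rerank(relevance, sim, k=2, lam=0.0))  # ['a', 'b']: pure accuracy
print(rerank(relevance, sim, k=2, lam=0.5))  # ['a', 'c']: diversity wins
```

The value of writing it down this way is that lam is now a named, reviewable editorial choice rather than an implicit side effect of whatever metric was optimised.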


  19. At what (accuracy) cost?
    Can this question be answered
    through other research practices?
    (e.g. audience research, UX
    methodologies)?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
    https://www.flickr.com/photos/usc_annenberg_innovation_lab/8675548605


  20. How should transparency work? When and to whom is it
    useful, e.g. to regulators?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  21. Transparency for whom?
    Principles of transparency key to the mission of PSM:
    They provide the mechanism by which PSM are
    regulated and held accountable
    The Case for Public Service Recommender Algorithms | EBU AI Workshop | 9 November 2018


  22. Transparency for whom?
    • Transparency for stakeholders/audiences or regulators?
    • When we say transparency, when do we mean disclosure?
    • In a user-facing system this is an issue of enabling
    consent to be meaningful
    • However in a stakeholder arrangement this is about
    disclosure
    The Case for Public Service Recommender Algorithms | EBU AI Workshop | 9 November 2018


  23. How do we design for interpretability and
    explainability to enable appropriate oversight
    of how recommenders are making decisions
    and ensure due accountability?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  24. Design for accountability?
    • Accountability vital to PSM/BBC
    • Transparency/full disclosure vs meaningful
    explanations for a user
    • Does the desire for transparency push algorithmic
    design in particular directions?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  25. Design for accountability?
    • How can complex algorithmic systems be designed to be
    intelligible/interpretable?
    • What types of explanations can a system generate/what's
    possible?
    • What explanations will be sufficient for oversight?
    • How will explanation needs vary, e.g. regulators/
    stakeholders/editorial/public?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
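
To ground the question of what explanations a system can generate: for a neighbour-based recommender, a user-facing explanation can be as simple as a "because you read X" template, while editorial or regulatory oversight may need the mechanism and scores exposed. A hypothetical sketch of the two levels (item names, scores, and wording are illustrative, not a BBC implementation):

```python
# Sketch: two levels of explanation for one neighbour-based
# recommendation. All items, scores, and wording are hypothetical.
def explain_for_user(rec: str, anchor: str) -> str:
    # Audience-facing: short, templated, no internals exposed.
    return f"Recommended because you read '{anchor}'."

def explain_for_oversight(rec: str, anchor: str, score: float,
                          model: str) -> str:
    # Editorial/regulator-facing: expose the mechanism and its inputs.
    return (f"'{rec}' ranked via {model}: nearest neighbour of "
            f"'{anchor}' with similarity {score:.2f}.")

print(explain_for_user("Charter funding explained", "Charter debate"))
print(explain_for_oversight("Charter funding explained",
                            "Charter debate", 0.87, "TF-IDF cosine"))
```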


  26. What type/level of explanation will be most
    useful?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  27. Will explanations produced for editorial need to differ from
    the type of explanations PSM may provide to audiences?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  28. Explanation, how much? For whom?
    • Are we explaining algorithmic decision making to
    users or to experts (principally: editorial teams)?
    • What is the distance between these two groups?
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  29. In Practice
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  30. better align recommender systems in
    public service contexts with their
    underlying value frameworks
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  31. Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  32. Human-centric testing
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  33. Human-centric testing
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  34. Human-centric testing
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019


  35. Thanks!
    Let’s have some questions!
    We’re hiring: http://bit.ly/bbcDSjobs
    Ben Fields
    @alsothings
    ben.fi[email protected]
    http://bit.ly/psmegg19
    Question by fahmionline from the Noun Project
    https://thenounproject.com/fahmionline/collection/ask/
    Ethics, Data Science, and Public Service Media | EGG London | 2 July 2019
