Shannon Vallor - Who is responsible for responsible AI?

Turing Fest

July 05, 2023

Transcript

  1. Who is Responsible for Responsible AI? THE ECOLOGIES OF A RESPONSIBLE AI ECOSYSTEM. 29.07.23. Prof. Shannon Vallor, The University of Edinburgh. Turing Fest 2023
  2. RESPONSIBLE AI INNOVATION IS: o Trustworthy o Fair o Just o Accountable o Transparent o Safe o Beneficial
  3. AI represents a further consolidation of social power in the hands of tech companies, developers and users
  4. Innovation without responsibility: o endangers the ‘social license to operate’ o inhibits adoption; skews it to reckless actors o breeds a vicious cycle of social harms o incentivizes a short-term ‘race to the bottom’ o impedes public support for future innovation
  5. The cost to society of Model 1 for social media was high. We’ve got lasting damage: to democratic norms, to trust in institutions, to social and civic cohesion, to political rationality
  6. AI isn’t the new oil. It’s the new steel. Photos from top left by Dakota Roos, Pierre Pavlovic, and Ricardo Gomez on Unsplash
  7. 2. AI systems emerge from a complex, shifting global web of distributed yet interdependent actors. Photo by Ivan Bandura on Unsplash
  8. The Material Ecology of AI • Data suppliers • Energy suppliers • Human labour suppliers (from mining to Mturk) • Material resource suppliers (water, conflict minerals for GPUs) See: Kate Crawford and Vladan Joler’s ‘Anatomy of an AI System’ (2018)
  9. The Tech Ecology of AI • Data scientists • AI researchers • Model developers • ML Engineers • Tech leads/PMs • Model and product testers • API and UI developers • ML Fairness/Trust and Safety/Responsible AI teams • Professional tech societies and standards orgs • AI research publication venues
  10. The Corporate Ecology of AI • Large platform companies and cloud providers • Investors • AI startups and open source orgs • AI research orgs • Logistics and supply chain companies • Third-party apps and services • Consulting firms • AI auditing firms • Corporate boards and lobbyists • Business users
  11. The Public Ecology of AI • Individual consumers/end users • Impacted communities and non-users • Public institutions (universities, NHS, courts, media) • Civil society, policy and advocacy orgs • Academic societies • Public research funding agencies • Local, regional and national governments • National policymakers and legislators • Regulators • Intergovernmental bodies (UN, WEF, OECD)
  12. 1. A thriving ecosystem is balanced by the health of its component ecologies – none can be sacrificed for another
  13. 3. Healthy ecosystems are regulated by constant adjustment to a dynamic environment; brittle ecosystems collapse
  14. 5. Responsibility for the health of ecosystems rests with agents with the power to damage them and the knowledge that they can do otherwise
  15. Responsible AI Policy and Practice Must: 1. manage the health of all the AI ecosystem’s ecologies 2. build symbiotic relationships within and across ecologies 3. be guided by coordinated, agile and responsive regulation 4. create mechanisms of resistance and resilience to shocks 5. distribute duties of care to powerful actors across ecologies
  16. 1. We need better maps of the AI ecosystem’s regional and global ecologies, key interdependencies and dynamics
  17. 2. We need to boost powerful actors’ ecological knowledge of vulnerabilities in the AI ecosystem
  18. 3. We need to realign the interests and incentives of powerful actors across the AI ecosystem with its health
  19. 4. We need to bridge artificial divides between sectors and disciplines in the AI ecosystem that block ecological understanding, communication and coordination