Hosted by Ben Byford, The Machine Ethics Podcast brings together interviews with academics, authors, business leaders, designers and engineers on the subject of autonomous algorithms, artificial intelligence, machine learning, and technology’s impact on society.
This episode I’m talking with Derek Leben about his new book Ethics for Robots: How to Design a Moral Algorithm. We also dive into a general framework for machine ethics, contractarianism, Rawls’ original position thought experiment (which is one of my favourite ethical thought experiments), the maximin function approach to machine ethics, and whether robots should respect the consent of a person in life-threatening circumstances…
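For anyone unfamiliar with the maximin rule mentioned above, here is a minimal, hypothetical sketch in Python (not taken from Leben's book): each candidate action is scored by how its worst-off affected party fares, and the action with the best worst case is chosen. The option names and payoff numbers are invented purely for illustration.

```python
# Illustrative sketch of a maximin choice rule (hypothetical example, not from the book).
# Each action maps affected parties to a payoff (here, a made-up survival probability);
# the rule picks the action whose worst-off party fares best.

def maximin_choice(actions: dict[str, dict[str, float]]) -> str:
    """Return the action whose minimum payoff across affected parties is largest."""
    return max(actions, key=lambda a: min(actions[a].values()))

# Hypothetical autonomous-vehicle scenario with invented numbers:
options = {
    "swerve_left":  {"passenger": 0.9, "pedestrian": 0.2},
    "swerve_right": {"passenger": 0.6, "pedestrian": 0.5},
}

print(maximin_choice(options))  # "swerve_right": its worst-off party (0.5) beats 0.2
```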
Listen to the episode here:
Derek Leben is Associate Professor of Philosophy at the University of Pittsburgh, Johnstown. He works at the intersection of ethics, cognitive science, and emerging technologies. In his book, Ethics for Robots, Leben argues for the use of a particular moral framework for designing autonomous systems based on the Contractarianism of John Rawls. He also demonstrates how this framework can be productively applied to autonomous vehicles, medical technologies, and weapons systems.
This podcast was created by, and is run by, Ben Byford and collaborators. Over the last few years the podcast has grown into a place of discussion and dissemination of important ideas, not only in AI but in tech ethics generally.
The goal is to promote debate concerning technology and society, and to foster the production of technology (and, in particular, decision-making algorithms) that promotes human ideals.
Ben Byford is an AI ethics consultant, a teacher of coding, design and data science, and a freelance games designer with over 10 years of design and coding experience building websites, apps, and games. In 2015 he began speaking on AI ethics and started the Machine Ethics podcast. Since then, Ben has talked with academics, developers, doctors, novelists and designers about AI, automation and society.
Join in the conversation with us by getting in touch via email here or following us on Twitter and Instagram.