Hosted by Ben Byford, The Machine Ethics Podcast brings together interviews with academics, authors, business leaders, designers and engineers on the subject of autonomous algorithms, artificial intelligence, machine learning, and technology’s impact on society.
This month we’re chatting with Giulia Trojano about AI as an economic narrative, companion chatbots, the deskilling of digital literacy, chatbot parental controls, differences between social AI and general AI services, increasing surveillance in the guise of safety, advertising creeping into GenAI services, ReplikaAI, the lack of research into emotional AI, techno-determinism, and more…
Listen to the episode here:
Listen on YouTube here:
Giulia is a competition lawyer focusing on abuse of dominance actions against Big Tech companies as well as environmental claims. She recently completed her master's in AI Ethics & Society at Cambridge and writes for several journals and academic publications on the interplay between technology, politics, society, and contemporary art. She regularly gives talks on AI ethics, law and regulation, and in 2025 was recognised in the “100 Brilliant Women in AI Ethics” list.
This podcast was created and is run by Ben Byford and collaborators. The podcast, and other content, was first created to extend Ben’s growing interest in both the AI domain and its associated ethics. Over the last few years the podcast has grown into a place of discussion and dissemination of important ideas, not only in AI but in tech ethics generally. As the interviews unfold, they often veer into current affairs, the future of work, environmental issues, and more. Though the core is still AI and AI ethics, we release content that is broader and therefore hopefully more useful to the general public and practitioners.
The hope for the podcast is to promote debate concerning technology and society, and to foster the production of technology (and in particular, decision-making algorithms) that promotes human ideals.
Join in the conversation by getting in touch via email here or following us on Twitter and Instagram.