Your Undivided Attention
In our podcast, Your Undivided Attention, co-hosts Tristan Harris, Aza Raskin and Daniel Barcay explore the unprecedented power of emerging technologies: how they fit into our lives, and how they fit into a humane future.
Join us every other Thursday as we confront challenges and explore solutions with a wide range of thought leaders and change-makers, like Audrey Tang on digital democracy, Nita Farahany on neurotechnology, Yuval Noah Harari on getting beyond dystopia, and Esther Perel on Artificial Intimacy: the other AI.
Your Undivided Attention is produced by Executive Editor Sasha Fegan and Senior Producer Julia Scott. Our Researcher/Producer is Joshua Lash. We are a top tech podcast worldwide with more than 20 million downloads and a member of the TED Audio Collective.
Episodes
The Tech-God Complex: Why We Need to be Skeptics
Silicon Valley's interest in AI is driven by more than just profit and innovation. There's an unmistakable mystical quality to it as well. In this episode, Daniel and Aza sit down with humanist chaplain Greg Epstein to explore the fascinating parallels between technology and religion. From AI being treated as a godlike force to tech leaders' promises of digital salvation, religious thinking is shaping the future of technology and humanity. Epstein breaks down why he believes technology has become our era's most influential religion and what we can learn from these parallels to better understand where we're heading.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X.
If you like the show and want to support CHT's mission, please consider donating to the organization this giving season: https://www.humanetech.com/donate. Any amount helps support our goal to bring about a more humane future.
RECOMMENDED MEDIA
"Tech Agnostic" by Greg Epstein
Further reading on Avi Schiffmann's "Friend" AI necklace
Further reading on Blake Lemoine and LaMDA
Blake Lemoine's conversation with Greg at MIT
Further reading on the Sewell Setzer case
Further reading on Terminal of Truths
Further reading on Ray Kurzweil's attempt to create a digital recreation of his dad with AI
The Drama of the Gifted Child by Alice Miller
RECOMMENDED YUA EPISODES
'A Turning Point in History': Yuval Noah Harari on AI's Cultural Takeover
How to Think About AI Consciousness with Anil Seth
Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei
How To Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan
21/11/24•46m 32s
What Can We Do About Abusive Chatbots? With Meetali Jain and Camille Carlton
CW: This episode features discussion of suicide and sexual abuse.
In the last episode, we had the journalist Laurie Segall on to talk about the tragic story of Sewell Setzer, a 14-year-old boy who took his own life after months of abuse and manipulation by an AI companion from the company Character.ai. The question now is: what's next?
Sewell's mother, Megan Garcia, has filed a major new lawsuit against Character.ai in Florida, which could force the company, and potentially the entire AI industry, to change its harmful business practices. So today on the show, we have Meetali Jain, director of the Tech Justice Law Project and one of the lead lawyers in Megan's case against Character.ai. Meetali breaks down the details of the case, the complex legal questions under consideration, and how this could be the first step toward systemic change. Also joining is Camille Carlton, CHT's Policy Director.
RECOMMENDED MEDIA
Further reading on Sewell's story
Laurie Segall's interview with Megan Garcia
The full complaint filed by Megan against Character.AI
Further reading on suicide bots
Further reading on Noam Shazeer and Daniel De Freitas' relationship with Google
The CHT Framework for Incentivizing Responsible Artificial Intelligence Development and Use
Organizations mentioned:
The Tech Justice Law Project
The Social Media Victims Law Center
Mothers Against Media Addiction
Parents SOS
Parents Together
Common Sense Media
RECOMMENDED YUA EPISODES
When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
AI Is Moving Fast. We Need Laws that Will Too.
Corrections:
Meetali referred to certain chatbot apps as banning users under 18; however, the settings for the major app stores ban users under 17, not under 18.
Meetali referred to Section 230 as providing "full scope immunity" to internet companies; however, Congress has passed subsequent laws that carve out exceptions to that immunity for criminal acts such as sex trafficking and intellectual property theft.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on X.
07/11/24•48m 44s
When the "Person" Abusing Your Child is a Chatbot: The Tragic Story of Sewell Setzer
Content Warning: This episode contains references to suicide, self-harm, and sexual abuse.
Megan Garcia lost her son Sewell to suicide after he was abused and manipulated by AI chatbots for months. Now, she's suing the company that made those chatbots. On today's episode of Your Undivided Attention, Aza sits down with journalist Laurie Segall, who's been following this case for months. Plus, Laurie's full interview with Megan on her new show, Dear Tomorrow.
Aza and Laurie discuss the profound implications of Sewell's story for the rollout of AI. Social media began the race to the bottom of the brain stem and left our society addicted, distracted, and polarized. Generative AI is set to supercharge that race, taking advantage of the human need for intimacy and connection amidst a widespread loneliness epidemic. Unless we put guardrails on this technology now, Sewell's story may be a tragic sign of things to come. But it also presents an opportunity to prevent further harms moving forward.
If you or someone you know is struggling with mental health, you can reach out to the 988 Suicide and Crisis Lifeline by calling or texting 988; this connects you to trained crisis counselors 24/7 who can provide support and referrals to further assistance.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
The first episode of Dear Tomorrow, from Mostly Human Media
The CHT Framework for Incentivizing Responsible AI Development
Further reading on Sewell's case
Character.ai's "About Us" page
Further reading on the addictive properties of AI
RECOMMENDED YUA EPISODES
AI Is Moving Fast. We Need Laws that Will Too.
This Moment in AI: How We Got Here and Where We're Going
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
The AI Dilemma
24/10/24•49m 10s
Is It AI? One Tool to Tell What’s Real with Truemedia.org CEO Oren Etzioni
Social media disinformation did enormous damage to our shared idea of reality. Now, the rise of generative AI has unleashed a flood of high-quality synthetic media into the digital ecosystem. As a result, it's more difficult than ever to tell what's real and what's not, a problem with profound implications for the health of our society and democracy. So how do we fix this critical issue?
As it turns out, there's a whole ecosystem of people working to answer that question. One of them is computer scientist Oren Etzioni, the CEO of TrueMedia.org, a free, non-partisan, non-profit tool that can detect AI-generated content with a high degree of accuracy. Oren joins the show this week to talk about the problem of deepfakes and disinformation and what he sees as the best solutions.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
TrueMedia.org
Further reading on the deepfaked image of an explosion near the Pentagon
Further reading on the deepfaked robocall pretending to be President Biden
Further reading on the election deepfake in Slovakia
Further reading on the President Obama lip-syncing deepfake from 2017
One of several deepfake quizzes from the New York Times (test yourself!)
The Partnership on AI
C2PA
Witness.org
Truepic
RECOMMENDED YUA EPISODES
'We Have to Get It Right': Gary Marcus On Untamed AI
Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet
Synthetic Humanity: AI & What's At Stake
CLARIFICATION: Oren said that the largest social media platforms "don't see a responsibility to let the public know this was manipulated by AI." Meta has made a public commitment to flagging AI-generated or -manipulated content, whereas other platforms, like TikTok and Snapchat, rely on users to flag it.
10/10/24•25m 36s
'A Turning Point in History': Yuval Noah Harari on AI’s Cultural Takeover
Historian Yuval Noah Harari says that we are at a critical turning point. One in which AI's ability to generate cultural artifacts threatens humanity's role as the shapers of history. History will still go on, but will it be the story of people or, as he calls them, 'alien AI agents'?
In this conversation with Aza Raskin, Harari discusses the historical struggles that emerge from new technology, humanity's AI mistakes so far, and the immediate steps lawmakers can take right now to steer us towards a non-dystopian future.
This episode was recorded live at the Commonwealth Club World Affairs of California.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
NEXUS: A Brief History of Information Networks from the Stone Age to AI by Yuval Noah Harari
You Can Have the Blue Pill or the Red Pill, and We're Out of Blue Pills: a New York Times op-ed from 2023, written by Yuval, Aza, and Tristan
The 2023 open letter calling for a pause in AI development of at least 6 months, signed by Yuval and Aza
Further reading on the Stanford Marshmallow Experiment
Further reading on AlphaGo's "move 37"
Further reading on Social.AI
RECOMMENDED YUA EPISODES
This Moment in AI: How We Got Here and Where We're Going
The Tech We Need for 21st Century Democracy with Divya Siddarth
Synthetic Humanity: AI & What's At Stake
The AI Dilemma
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
07/10/24•1h 30m
‘We Have to Get It Right’: Gary Marcus On Untamed AI
It's a confusing moment in AI. Depending on who you ask, we're either on the fast track to AI that's smarter than most humans, or the technology is about to hit a wall. Gary Marcus is in the latter camp. He's a cognitive psychologist and computer scientist who built his own successful AI start-up. But he's also been called AI's loudest critic.
On Your Undivided Attention this week, Gary sits down with CHT Executive Director Daniel Barcay to defend his skepticism of generative AI and to discuss what we need to do as a society to get the rollout of this technology right, which is the focus of his new book, Taming Silicon Valley: How We Can Ensure That AI Works for Us.
The bottom line: No matter how quickly AI progresses, Gary argues that our society is woefully unprepared for the risks that will come from the AI we already have.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
Link to Gary's book: Taming Silicon Valley: How We Can Ensure That AI Works for Us
Further reading on the deepfake of the CEO of India's National Stock Exchange
Further reading on the deepfake of an explosion near the Pentagon
The study Gary cited on AI and false memories
Footage from Gary and Sam Altman's Senate testimony
RECOMMENDED YUA EPISODES
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn
Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet
No One is Immune to AI Harms with Dr. Joy Buolamwini
Correction: Gary mistakenly listed the reliability of GPS systems as 98%. The federal government's standard for GPS reliability is 95%.
26/09/24•41m 43s
AI Is Moving Fast. We Need Laws that Will Too.
AI is moving fast. And as companies race to roll out newer, more capable models, with little regard for safety, the downstream risks of those models become harder and harder to counter. On this week's episode of Your Undivided Attention, CHT's policy director Casey Mock comes on the show to discuss a new legal framework to incentivize better AI, one that holds AI companies liable for the harms of their products.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
The CHT Framework for Incentivizing Responsible AI Development
Further Reading on Air Canada's Chatbot Fiasco
Further Reading on the Elon Musk Deep Fake Scams
The Full Text of SB1047, California's AI Regulation Bill
Further reading on SB1047
RECOMMENDED YUA EPISODES
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn
Can We Govern AI? with Marietje Schaake
A First Step Toward AI Regulation with Tom Wheeler
Correction: Casey incorrectly stated the year that the US banned child labor as 1937. It was banned in 1938.
13/09/24•39m 9s
Esther Perel on Artificial Intimacy (rerun)
[This episode originally aired on August 17, 2023]
For all the talk about AI, we rarely hear about how it will change our relationships. As we swipe to find love and consult chatbot therapists, acclaimed psychotherapist and relationship expert Esther Perel warns that another harmful "AI" is on the rise, Artificial Intimacy, and that it is depriving us of real connection. Tristan and Esther discuss how depending on algorithms can fuel alienation, and then imagine how we might design technology to strengthen our social bonds.
RECOMMENDED MEDIA
Mating in Captivity by Esther Perel: Esther's debut work on the intricacies behind modern relationships, and the dichotomy of domesticity and sexual desire
The State of Affairs by Esther Perel: Esther takes a look at modern relationships through the lens of infidelity
Where Should We Begin? with Esther Perel: Listen in as real couples in search of help bare the raw and profound details of their stories
How's Work? with Esther Perel: Esther's podcast that focuses on the hard conversations we're afraid to have at work
Lars and the Real Girl (2007): A young man strikes up an unconventional relationship with a doll he finds on the internet
Her (2013): In a near future, a lonely writer develops an unlikely relationship with an operating system designed to meet his every need
RECOMMENDED YUA EPISODES
Big Food, Big Tech and Big AI with Michael Moss
The AI Dilemma
The Three Rules of Humane Tech
Digital Democracy is Within Reach with Audrey Tang
CORRECTION: Esther refers to the 2007 film Lars and the Real Doll. The title of the film is Lars and the Real Girl.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
06/09/24•44m 52s
Tech's Big Money Campaign is Getting Pushback with Margaret O'Mara and Brody Mullins
Today, the tech industry is the second-biggest lobbying power in Washington, DC, but that wasn't true as recently as ten years ago. How did we get to this moment? And where could we be going next? On this episode of Your Undivided Attention, Tristan and Daniel sit down with historian Margaret O'Mara and journalist Brody Mullins to discuss how Silicon Valley has changed the nature of American lobbying.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
The Wolves of K Street: The Secret History of How Big Money Took Over Big Government - Brody's book on the history of lobbying
The Code: Silicon Valley and the Remaking of America - Margaret's book on the historical relationship between Silicon Valley and Capitol Hill
More information on the Google antitrust ruling
More information on KOSPA
More information on the SOPA/PIPA internet blackout
Detailed breakdown of internet lobbying from OpenSecrets
RECOMMENDED YUA EPISODES
U.S. Senators Grilled Social Media CEOs. Will Anything Change?
Can We Govern AI? with Marietje Schaake
The Race to Cooperation with David Sloan Wilson
CORRECTION: Brody Mullins refers to AT&T as having a "hundred million dollar" lobbying budget in 2006 and 2007. While we couldn't verify the size of their budget for lobbying, their actual lobbying spend was much less than this: $27.4m in 2006 and $16.5m in 2007, according to OpenSecrets.
The views expressed by guests appearing on Center for Humane Technology's podcast, Your Undivided Attention, are their own, and do not necessarily reflect the views of CHT. CHT does not support or oppose any candidate or party for election to public office.
26/08/24•43m 59s
This Moment in AI: How We Got Here and Where We’re Going
It's been a year and a half since Tristan and Aza laid out their vision and concerns for the future of artificial intelligence in The AI Dilemma. In this Spotlight episode, the guys discuss what's happened since then, as funding, research, and public interest in AI have exploded, and where we could be headed next. Plus, some major updates on social media reform, including the passage of the Kids Online Safety and Privacy Act in the Senate.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
The AI Dilemma: Tristan and Aza's talk on the catastrophic risks posed by AI.
Info Sheet on KOSPA: More information on KOSPA from FairPlay.
Situational Awareness by Leopold Aschenbrenner: A widely cited blog from a former OpenAI employee, predicting the rapid arrival of AGI.
AI for Good: More information on the AI for Good summit that was held earlier this year in Geneva.
Using AlphaFold in the Fight Against Plastic Pollution: More information on Google's use of AlphaFold to create an enzyme to break down plastics.
Swiss Call For Trust and Transparency in AI: More information on the initiatives mentioned by Katharina Frey.
RECOMMENDED YUA EPISODES
War is a Laboratory for AI with Paul Scharre
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
Can We Govern AI? with Marietje Schaake
The Three Rules of Humane Tech
The AI Dilemma
Clarification: Swiss diplomat Nina Frey's full name is Katharina Frey.
The views expressed by guests appearing on Center for Humane Technology's podcast, Your Undivided Attention, are their own, and do not necessarily reflect the views of CHT. CHT does not support or oppose any candidate or party for election to public office.
12/08/24•36m 55s
Decoding Our DNA: How AI Supercharges Medical Breakthroughs and Biological Threats with Kevin Esvelt
AI has been a powerful accelerant for biological research, rapidly opening up new frontiers in medicine and public health. But that progress can also make it easier for bad actors to manufacture new biological threats. In this episode, Tristan and Daniel sit down with biologist Kevin Esvelt to discuss why AI has been such a boon for biologists and how we can safeguard society against the threats that AIxBio poses.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
Sculpting Evolution: Information on Esvelt's lab at MIT.
SecureDNA: Esvelt's free platform to provide safeguards for DNA synthesis.
The Framework for Nucleic Acid Synthesis Screening: The Biden admin's suggested guidelines for DNA synthesis regulation.
Senate Hearing on Regulating AI Technology: C-SPAN footage of Dario Amodei's testimony to Congress.
The AlphaFold Protein Structure Database
RECOMMENDED YUA EPISODES
U.S. Senators Grilled Social Media CEOs. Will Anything Change?
Big Food, Big Tech and Big AI with Michael Moss
The AI Dilemma
Clarification: President Biden's executive order only applies to labs that receive funding from the federal government, not state governments.
18/07/24•32m 47s
How to Think About AI Consciousness With Anil Seth
Will AI ever start to think by itself? If it did, how would we know, and what would it mean?
In this episode, Dr. Anil Seth and Aza discuss the science, ethics, and incentives of artificial consciousness. Seth is Professor of Cognitive and Computational Neuroscience at the University of Sussex and the author of Being You: A New Science of Consciousness.
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
Frankenstein by Mary Shelley: A free, plain-text version of Shelley's classic of gothic literature
OpenAI's GPT4o Demo: A video from OpenAI demonstrating GPT4o's remarkable ability to mimic human sentience
You Can Have the Blue Pill or the Red Pill, and We're Out of Blue Pills: The NYT op-ed from last year by Tristan, Aza, and Yuval Noah Harari outlining the AI dilemma
What Is It Like to Be a Bat?: Thomas Nagel's essay on the nature of consciousness
Are You Living in a Computer Simulation?: Philosopher Nick Bostrom's essay on the simulation hypothesis
Anthropic's Golden Gate Claude: A blog post about Anthropic's recent discovery of millions of distinct concepts within their LLM, a major development in the field of AI interpretability
RECOMMENDED YUA EPISODES
Esther Perel on Artificial Intimacy
Talking With Animals... Using AI
Synthetic Humanity: AI & What's At Stake
04/07/24•47m 58s
Why Are Migrants Becoming AI Test Subjects? With Petra Molnar
Climate change, political instability, hunger. These are just some of the forces behind an unprecedented refugee crisis that's expected to include over a billion people by 2050. In response to this growing crisis, wealthy governments like the US and the EU are employing novel AI and surveillance technologies to slow the influx of migrants at their borders. But will this rollout stop at the border?
In this episode, Tristan and Aza sit down with Petra Molnar to discuss how borders have become a proving ground for the sharpest edges of technology, and especially AI. Petra is an immigration lawyer and co-creator of the Migration and Technology Monitor. Her new book is "The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence."
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
RECOMMENDED MEDIA
The Walls Have Eyes: Surviving Migration in the Age of Artificial Intelligence: Petra's newly published book on the rollout of high-risk tech at the border
Bots at the Gate: A report co-authored by Petra about Canada's use of AI technology in their immigration process
Technological Testing Grounds: A report authored by Petra about the use of experimental technology in EU border enforcement
Startup Pitched Tasing Migrants from Drones, Video Reveals: An article from The Intercept, containing the demo for Brinc's taser drone pilot program
The UNHCR: Information about the global refugee crisis from the UN
RECOMMENDED YUA EPISODES
War is a Laboratory for AI with Paul Scharre
No One is Immune to AI Harms with Dr. Joy Buolamwini
Can We Govern AI? With Marietje Schaake
CLARIFICATION: The iBorderCtrl project referenced in this episode was a pilot project that was discontinued in 2019.
20/06/24•46m 19s
Former OpenAI Engineer William Saunders on Silence, Safety, and the Right to Warn
This week, a group of current and former employees from OpenAI and Google DeepMind penned an open letter accusing the industry's leading companies of prioritizing profits over safety. This comes after a spate of high-profile departures from OpenAI, including co-founder Ilya Sutskever and senior researcher Jan Leike, as well as reports that OpenAI has gone to great lengths to silence would-be whistleblowers.
The writers of the open letter argue that researchers have a "right to warn" the public about AI risks, and they lay out a series of principles that would protect that right. In this episode, we sit down with one of those writers: William Saunders, who left his job as a research engineer at OpenAI in February. William is now breaking the silence on what he saw at OpenAI that compelled him to leave the company and to put his name to this letter.
RECOMMENDED MEDIA
The Right to Warn Open Letter
My Perspective On "A Right to Warn about Advanced Artificial Intelligence": A follow-up from William about the letter
Leaked OpenAI documents reveal aggressive tactics toward former employees: An investigation by Vox into OpenAI's policy of non-disparagement
RECOMMENDED YUA EPISODES
A First Step Toward AI Regulation with Tom Wheeler
Spotlight on AI: What Would It Take For This to Go Well?
Big Food, Big Tech and Big AI with Michael Moss
Can We Govern AI? With Marietje Schaake
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
07/06/24•37m 47s
War is a Laboratory for AI with Paul Scharre
Right now, militaries around the globe are investing heavily in the use of AI weapons and drones. From Ukraine to Gaza, weapons systems with increasing levels of autonomy are being used to kill people and destroy infrastructure, and the development of fully autonomous weapons shows little sign of slowing down. What does this mean for the future of warfare? What safeguards can we put up around these systems? And is this runaway trend toward autonomous warfare inevitable, or will nations come together and choose a different path?
In this episode, Tristan and Daniel sit down with Paul Scharre to try to answer some of these questions. Paul is a former Army Ranger, the author of two books on autonomous weapons, and he helped the Department of Defense write a lot of its policy on the use of AI in weaponry.
RECOMMENDED MEDIA
Four Battlegrounds: Power in the Age of Artificial Intelligence: Paul's book on the future of AI in war, which came out in 2023
Army of None: Autonomous Weapons and the Future of War: Paul's 2018 book documenting and predicting the rise of autonomous and semi-autonomous weapons as part of modern warfare
The Perilous Coming Age of AI Warfare: How to Limit the Threat of Autonomous Warfare: Paul's article in Foreign Affairs based on his recent trip to the battlefield in Ukraine
The night the world almost ended: A BBC documentary about Stanislav Petrov's decision not to start nuclear war
AlphaDogfight Trials Final Event: The full simulated dogfight between an AI and human pilot. The AI pilot swept, 5-0.
'Lavender': The AI machine directing Israel's bombing spree in Gaza: An investigation into the use of AI targeting systems by the IDF
RECOMMENDED YUA EPISODES
The AI 'Race': China vs. the US with Jeffrey Ding and Karen Hao
Can We Govern AI? with Marietje Schaake
Big Food, Big Tech and Big AI with Michael Moss
The Invisible Cyber-War with Nicole Perlroth
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
23/05/24•59m 16s
AI and Jobs: How to Make AI Work With Us, Not Against Us With Daron Acemoglu
Tech companies say that AI will lead to massive economic productivity gains. But as we know from the first digital revolution, that's not what happened. Can we do better this time around?
RECOMMENDED MEDIA
Power and Progress by Daron Acemoglu and Simon Johnson: Professor Acemoglu co-authored a bold reinterpretation of economics and history that will fundamentally change how you see the world
Can We Have Pro-Worker AI? Professor Acemoglu co-authored this paper about redirecting AI development onto the human-complementary path
Rethinking Capitalism: In Conversation with Daron Acemoglu: The Wheeler Institute for Business and Development hosted Professor Acemoglu to examine how technology affects the distribution and growth of resources while being shaped by economic and social incentives
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
The Tech We Need for 21st Century Democracy
Can We Govern AI?
An Alternative to Silicon Valley Unicorns
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
09/05/24•46m 48s
Jonathan Haidt On How to Solve the Teen Mental Health Crisis
Suicides. Self-harm. Depression and anxiety. The toll of a social media-addicted, phone-based childhood has never been more stark. It can be easy for teens, parents and schools to feel like they're trapped by it all. But in this conversation with Tristan Harris, author and social psychologist Jonathan Haidt makes the case that the conditions that led to today's teenage mental health crisis can be turned around, with specific, achievable actions we all can take starting today.
This episode was recorded live at the San Francisco Commonwealth Club.
Correction: Tristan mentions that 40 Attorneys General have filed a lawsuit against Meta for allegedly fostering addiction among children and teens through their products. However, the actual number is 42 Attorneys General who are taking legal action against Meta.
Clarification: Jonathan refers to the Wait Until 8th pledge. By signing the pledge, a parent promises not to give their child a smartphone until at least the end of 8th grade. The pledge becomes active once at least ten other families from their child's grade pledge the same.
11/04/24•1h 5m
Chips Are the Future of AI. They’re Also Incredibly Vulnerable. With Chris Miller
Beneath the race to train and release more powerful AI models lies another race: a race by companies and nation-states to secure the hardware to make sure they win AI supremacy.
Correction: The latest available Nvidia chip is the Hopper H100 GPU, which has 80 billion transistors. Since the first commercially available chip had four transistors, the Hopper actually has 20 billion times that number. Nvidia recently announced the Blackwell, which boasts 208 billion transistors, but it won't ship until later this year.
RECOMMENDED MEDIA
Chip War: The Fight For the World's Most Critical Technology by Chris Miller: To make sense of the current state of politics, economics, and technology, we must first understand the vital role played by chips
Gordon Moore Biography & Facts: Gordon Moore, the Intel co-founder behind Moore's Law, passed away in March of 2023
AI's most popular chipmaker Nvidia is trying to use AI to design chips faster: Nvidia's GPUs are in high demand, and the company is using AI to accelerate chip production
RECOMMENDED YUA EPISODES
Future-proofing Democracy In the Age of AI with Audrey Tang
How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller
The AI 'Race': China vs. the US with Jeffrey Ding and Karen Hao
Protecting Our Freedom of Thought with Nita Farahany
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
29/03/24•45m 16s
Future-proofing Democracy In the Age of AI with Audrey Tang
What does a functioning democracy look like in the age of artificial intelligence? Could AI even be used to help a democracy flourish? Just in time for election season, Taiwan's Minister of Digital Affairs Audrey Tang returns to the podcast to discuss healthy information ecosystems, resilience to cyberattacks, how to "prebunk" deepfakes, and more.
RECOMMENDED MEDIA
Testing Theories of American Politics: Elites, Interest Groups, and Average Citizens by Martin Gilens and Benjamin I. Page: This academic paper addresses tough questions for Americans: Who governs? Who really rules?
Recursive Public: Recursive Public is an experiment in identifying areas of consensus and disagreement among the international AI community, policymakers, and the general public on key questions of governance
A Strong Democracy is a Digital Democracy: Audrey Tang's 2019 op-ed for The New York Times
The Frontiers of Digital Democracy: Nathan Gardels interviews Audrey Tang in Noema
RECOMMENDED YUA EPISODES
Digital Democracy is Within Reach with Audrey Tang
The Tech We Need for 21st Century Democracy with Divya Siddarth
How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller
The AI Dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
29/02/24•34m 38s
U.S. Senators Grilled Social Media CEOs. Will Anything Change?
Was it political progress, or just political theater? The recent Senate hearing with social media CEOs led to astonishing moments, including Mark Zuckerberg's public apology to families who lost children following social media abuse. Our panel of experts, including Facebook whistleblower Frances Haugen, untangles the explosive hearing and offers a look ahead as well. How will this hearing impact protocol within these social media companies? How will it impact legislation? In short: will anything change?
Clarification: Julie says that shortly after the hearing, Meta's stock price had the biggest increase of any company in the stock market's history. It was the biggest one-day gain by any company in Wall Street history.
Correction: Frances says it takes Snap three or four minutes to take down exploitative content. In Snap's most recent transparency report, they list six minutes as the median turnaround time to remove exploitative content.
RECOMMENDED MEDIA
Get Media Savvy: Founded by Julie Scelfo, Get Media Savvy is a non-profit initiative working to establish a healthy media environment for kids and families
The Power of One by Frances Haugen: The inside story of Frances's quest to bring transparency and accountability to Big Tech
RECOMMENDED YUA EPISODES
Real Social Media Solutions, Now with Frances Haugen
A Conversation with Facebook Whistleblower Frances Haugen
Are the Kids Alright?
Social Media Victims Lawyer Up with Laura Marquez-Garrett
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
13/02/24•25m 6s
Taylor Swift is Not Alone: The Deepfake Nightmare Sweeping the Internet
Over the past year, a tsunami of apps that digitally strip the clothes off real people has hit the market. Now anyone can create fake non-consensual sexual images in just a few clicks. With cases proliferating in high schools, guest presenter Laurie Segall talks to legal scholar Mary Anne Franks about the AI-enabled rise in deepfake porn and what we can do about it.
Correction: Laurie refers to the app 'Clothes Off.' It's actually named Clothoff. There are many clothes remover apps in this category.
RECOMMENDED MEDIA
Revenge Porn: The Cyberwar Against Women: In a five-part digital series, Laurie Segall uncovers a disturbing internet trend: the rise of revenge porn
The Cult of the Constitution: In this provocative book, Mary Anne Franks examines the thin line between constitutional fidelity and constitutional fundamentalism
Fake Explicit Taylor Swift Images Swamp Social Media: Calls to protect women and crack down on the platforms and technology that spread such images have been reignited
RECOMMENDED YUA EPISODES
No One is Immune to AI Harms
Esther Perel on Artificial Intimacy
Social Media Victims Lawyer Up
The AI Dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
01/02/24•42m 59s
Can Myth Teach Us Anything About the Race to Build Artificial General Intelligence? With Josh Schrei
We usually talk about tech in terms of economics or policy, but the casual language tech leaders often use to describe AI, summoning an inanimate force with the powers of code, sounds more... magical. So, what can myth and magic teach us about the AI race? Josh Schrei, mythologist and host of The Emerald podcast, says that foundational cultural tales like "The Sorcerer's Apprentice" or Prometheus teach us the importance of initiation, responsibility, human knowledge, and care. He argues these stories and myths can guide ethical tech development by reminding us what it is to be human.
Correction: Josh says the first telling of "The Sorcerer's Apprentice" myth dates back to ancient Egypt, but it actually dates back to ancient Greece.
RECOMMENDED MEDIA
The Emerald podcast: The Emerald explores the human experience through a vibrant lens of myth, story, and imagination
Embodied Ethics in The Age of AI: A five-part course with The Emerald podcast's Josh Schrei and School of Wise Innovation's Andrew Dunn
Nature Nurture: Children Can Become Stewards of Our Delicate Planet: A U.S. Department of the Interior study found that the average American kid can identify hundreds of corporate logos but not plants and animals
The New Fire: AI is revolutionizing the world - here's how democracies can come out on top. This upcoming book was authored by an architect of President Biden's AI executive order
RECOMMENDED YUA EPISODES
How Will AI Affect the 2024 Elections?
The AI Dilemma
The Three Rules of Humane Tech
AI Myths and Misconceptions
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
18/01/24•35m 50s
How Will AI Affect the 2024 Elections? with Renee DiResta and Carl Miller
2024 will be the biggest election year in world history. Forty countries will hold national elections, with over two billion voters heading to the polls. In this episode of Your Undivided Attention, two experts give us a situation report on how AI will increase the risks to our elections and our democracies.
Correction: Tristan says two billion people from 70 countries will be undergoing democratic elections in 2024. The number of countries expands to 70 only when non-national elections are factored in.
RECOMMENDED MEDIA
White House AI Executive Order Takes On Complexity of Content Integrity Issues: Renee DiResta's piece in Tech Policy Press about content integrity within President Biden's AI executive order
The Stanford Internet Observatory: A cross-disciplinary program of research, teaching and policy engagement for the study of abuse in current information technologies, with a focus on social media
Demos: Britain's leading cross-party think tank
Invisible Rulers: The People Who Turn Lies into Reality by Renee DiResta: Pre-order Renee's upcoming book that's landing on shelves June 11, 2024
RECOMMENDED YUA EPISODES
The Spin Doctors Are In with Renee DiResta
From Russia with Likes Part 1 with Renee DiResta
From Russia with Likes Part 2 with Renee DiResta
Esther Perel on Artificial Intimacy
The AI Dilemma
A Conversation with Facebook Whistleblower Frances Haugen
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
21/12/23•47m 15s
2023 Ask Us Anything
You asked, we answered. This has been a big year in the world of tech, with the rapid proliferation of artificial intelligence, acceleration of neurotechnology, and continued ethical missteps of social media. Looking back on 2023, there are still so many questions on our minds, and we know you have a lot of questions too. So we created this episode to respond to listener questions and to reflect on what lies ahead.
Correction: Tristan mentions that 41 Attorneys General have filed a lawsuit against Meta for allegedly fostering addiction among children and teens through their products. However, the actual number is 42 Attorneys General who are taking legal action against Meta.
Correction: Tristan refers to Casey Mock as the Center for Humane Technology's Chief Policy and Public Affairs Manager. His title is Chief Policy and Public Affairs Officer.
RECOMMENDED MEDIA
Tech Policy Watch: Marietje Schaake curates this briefing on artificial intelligence and technology policy from around the world
The AI Executive Order: President Biden's executive order on the safe, secure, and trustworthy development and use of AI
Meta sued by 42 AGs for addictive features targeting kids: A bipartisan group of 42 attorneys general is suing Meta, alleging features on Facebook and Instagram are addictive and are aimed at kids and teens
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
Inside the First AI Insight Forum in Washington
Digital Democracy is Within Reach with Audrey Tang
The Tech We Need for 21st Century Democracy with Divya Siddarth
Mind the (Perception) Gap with Dan Vallone
The AI Dilemma
Can We Govern AI? with Marietje Schaake
Ask Us Anything: You Asked, We Answered
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
30/11/23•35m 7s
The Promise and Peril of Open Source AI with Elizabeth Seger and Jeffrey Ladish
As AI development races forward, a fierce debate has emerged over open source AI models. So what does it mean to open-source AI? Are we opening Pandora's box of catastrophic risks? Or is open-sourcing AI the only way we can democratize its benefits and dilute the power of big tech?
Correction: When discussing the large language model Bloom, Elizabeth said it functions in 26 different languages. Bloom is actually able to generate text in 46 natural languages and 13 programming languages, and more are in the works.
RECOMMENDED MEDIA
Open-Sourcing Highly Capable Foundation Models: This report, co-authored by Elizabeth Seger, attempts to clarify open-source terminology and to offer a thorough analysis of risks and benefits from open-sourcing AI
BadLlama: cheaply removing safety fine-tuning from Llama 2-Chat 13B: This paper, co-authored by Jeffrey Ladish, demonstrates that it's possible to effectively undo the safety fine-tuning from Llama 2-Chat 13B with less than $200 while retaining its general capabilities
Centre for the Governance of AI: Supports governments, technology companies, and other key institutions by producing relevant research and guidance around how to respond to the challenges posed by AI
AI: Futures and Responsibility (AI:FAR): Aims to shape the long-term impacts of AI in ways that are safe and beneficial for humanity
Palisade Research: Studies the offensive capabilities of AI systems today to better understand the risk of losing control to AI systems forever
RECOMMENDED YUA EPISODES
A First Step Toward AI Regulation with Tom Wheeler
No One is Immune to AI Harms with Dr. Joy Buolamwini
Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?
The AI Dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
21/11/23•38m 44s
A First Step Toward AI Regulation with Tom Wheeler
On Monday, Oct. 30, President Biden released a sweeping executive order that addresses many risks of artificial intelligence. Tom Wheeler, former chairman of the Federal Communications Commission, shares his insights on the order with Tristan and Aza and discusses what's next in the push toward AI regulation.
Clarification: When quoting Thomas Jefferson, Aza incorrectly says "regime" instead of "regimen." The correct quote is: "I am not an advocate for frequent changes in laws and constitutions, but laws and institutions must go hand in hand with the progress of the human mind. And as that becomes more developed, more enlightened, as new discoveries are made, new truths discovered, and manners and opinions change, with the change of circumstances, institutions must advance also to keep pace with the times. We might as well require a man to wear still the coat which fitted him when a boy as civilized society to remain ever under the regimen of their barbarous ancestors."
RECOMMENDED MEDIA
The AI Executive Order: President Biden's Executive Order on the safe, secure, and trustworthy development and use of AI
UK AI Safety Summit: The summit brings together international governments, leading AI companies, civil society groups, and experts in research to consider the risks of AI and discuss how they can be mitigated through internationally coordinated action
aitreaty.org: An open letter calling for an international AI treaty
Techlash: Who Makes the Rules in the Digital Gilded Age?: Praised by Kirkus Reviews as "a rock-solid plan for controlling the tech giants," readers will be energized by Tom Wheeler's vision of digital governance
RECOMMENDED YUA EPISODES
Inside the First AI Insight Forum in Washington
Digital Democracy is Within Reach with Audrey Tang
The AI Dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
02/11/23•35m 24s
No One is Immune to AI Harms with Dr. Joy Buolamwini
In this interview, Dr. Joy Buolamwini argues that algorithmic bias in AI systems poses risks to marginalized people. She challenges the assumptions of tech leaders who advocate for AI "alignment" and explains why some tech companies are hypocritical when it comes to addressing bias. Dr. Joy Buolamwini is the founder of the Algorithmic Justice League and the author of "Unmasking AI: My Mission to Protect What Is Human in a World of Machines."
Correction: Aza says that Sam Altman, the CEO of OpenAI, predicts superintelligence in four years. Altman predicts superintelligence in ten years.
RECOMMENDED MEDIA
Unmasking AI by Joy Buolamwini: "The conscience of the AI revolution" explains how we've arrived at an era of AI harms and oppression, and what we can do to avoid its pitfalls
Coded Bias: Shalini Kantayya's film explores the fallout of Dr. Joy's discovery that facial recognition does not see dark-skinned faces accurately, and her journey to push for the first-ever legislation in the U.S. to govern against bias in the algorithms that impact us all
How I'm fighting bias in algorithms: Dr. Joy's 2016 TED Talk about her mission to fight bias in machine learning, a phenomenon she calls the "coded gaze"
RECOMMENDED YUA EPISODES
Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?
Protecting Our Freedom of Thought with Nita Farahany
The AI Dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
26/10/23•47m 46s
Mustafa Suleyman Says We Need to Contain AI. How Do We Do It?
This is going to be the most productive decade in the history of our species, says Mustafa Suleyman, author of "The Coming Wave," CEO of Inflection AI, and co-founder of Google's DeepMind. But in order to truly reap the benefits of AI, we need to learn how to contain it. Paradoxically, part of that will mean collectively saying no to certain forms of progress. As an industry leader reckoning with a future that's about to be 'turbocharged,' Mustafa says we can all play a role in shaping the technology in hands-on ways and by advocating for appropriate governance.
RECOMMENDED MEDIA
The Coming Wave: Technology, Power, and the 21st Century's Greatest Dilemma: This new book from Mustafa Suleyman is a must-read guide to the technological revolution just starting, and the transformed world it will create
Partnership on AI: Partnership on AI is bringing together diverse voices from across the AI community to create resources for advancing positive outcomes for people and society
Policy Reforms Toolkit from the Center for Humane Technology: Digital lawlessness has been normalized in the name of innovation. It's possible to craft policy that protects the conditions we need to thrive
RECOMMENDED YUA EPISODES
AI Myths and Misconceptions
Can We Govern AI? with Marietje Schaake
The AI Dilemma
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
28/09/23•32m 4s
Inside the First AI Insight Forum in Washington
Last week, Senator Chuck Schumer brought together Congress and many of the biggest names in AI for the first closed-door AI Insight Forum in Washington, D.C. Tristan and Aza were invited speakers at the event, along with Elon Musk, Satya Nadella, Sam Altman, and other leaders. In this update on Your Undivided Attention, Tristan and Aza recount how they felt the meeting went, what they communicated in their statements, and what it felt like to critique Meta's LLM in front of Mark Zuckerberg.
Correction: In this episode, Tristan says GPT-3 couldn't find vulnerabilities in code. GPT-3 could find security vulnerabilities, but GPT-4 is exponentially better at it.
RECOMMENDED MEDIA
In Show of Force, Silicon Valley Titans Pledge 'Getting This Right' With A.I.: Elon Musk, Sam Altman, Mark Zuckerberg, Sundar Pichai and others discussed artificial intelligence with lawmakers, as tech companies strive to influence potential regulations
Majority Leader Schumer Opening Remarks For The Senate's Inaugural AI Insight Forum: Senate Majority Leader Chuck Schumer (D-NY) opened the Senate's inaugural AI Insight Forum
The Wisdom Gap: As seen in Tristan's talk on this subject in 2022, the scope and speed of our world's issues are accelerating and growing more complex. And yet, our ability to comprehend those challenges and respond accordingly is not matching pace
RECOMMENDED YUA EPISODES
Spotlight on AI: What Would It Take For This to Go Well?
The AI 'Race': China vs. the US with Jeffrey Ding and Karen Hao
Spotlight: Elon, Twitter and the Gladiator Arena
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
19/09/23•26m 48s
Spotlight on AI: What Would It Take For This to Go Well?
Where do the top Silicon Valley AI researchers really think AI is headed? Do they have a plan if things go wrong? In this episode, Tristan Harris and Aza Raskin reflect on the last several months of highlighting AI risk, and share their insider takes on a high-level workshop run by CHT in Silicon Valley.
NOTE: Tristan refers to journalist Maria Ressa and mentions that she received 80 hate messages per hour at one point. She actually received more than 90 messages an hour.
RECOMMENDED MEDIA
Musk, Zuckerberg, Gates: The titans of tech will talk AI at private Capitol summit: This week will feature a series of public hearings on artificial intelligence. But all eyes will be on the closed-door gathering convened by Senate Majority Leader Chuck Schumer
Takeaways from the roundtable with President Biden on artificial intelligence: Tristan Harris talks about his recent meeting with President Biden to discuss regulating artificial intelligence
Biden, Harris meet with CEOs about AI risks: Vice President Kamala Harris met with the heads of Google, Microsoft, Anthropic, and OpenAI as the Biden administration rolled out initiatives meant to ensure that AI improves lives without putting people's rights and safety at risk
RECOMMENDED YUA EPISODES
The AI Dilemma
The AI 'Race': China vs. the US with Jeffrey Ding and Karen Hao
The Dictator's Playbook with Maria Ressa
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
12/09/23•43m 46s
The AI ‘Race’: China vs. the US with Jeffrey Ding and Karen Hao
In the debate over slowing down AI, we often hear the same argument against regulation. "What about China? We can't let China get ahead." To dig into the nuances of this argument, Tristan and Aza speak with academic researcher Jeffrey Ding and journalist Karen Hao, who take us through what's really happening in Chinese AI development. They address China's advantages and limitations, what risks are overblown, and what, in this multi-national competition, is at stake as we imagine the best possible future for everyone.
CORRECTION: Jeffrey Ding says the export controls on advanced chips that were established in October 2022 only apply to military end-users. The controls also impose a license requirement on the export of those advanced chips to any China-based end-user.
RECOMMENDED MEDIA
Recent Trends in China's Large Language Model Landscape by Jeffrey Ding and Jenny W. Xiao: This study covers a sample of 26 large-scale pre-trained AI models developed in China
The diffusion deficit in scientific and technological power: re-assessing China's rise by Jeffrey Ding: This paper argues for placing a greater weight on a state's capacity to diffuse, or widely adopt, innovations
The U.S. Is Turning Away From Its Biggest Scientific Partner at a Precarious Time by Karen Hao and Sha Hua: U.S. moves to cut research ties with China over security concerns threaten American progress in critical areas
Why China Has Not Caught Up Yet: Military-Technological Superiority and the Limits of Imitation, Reverse Engineering, and Cyber Espionage by Andrea Gilli and Mauro Gilli: Military technology has grown so complex that it's hard to imitate
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
A Fresh Take on Tech in China with Rui Ma and Duncan Clark
Digital Democracy is Within Reach with Audrey Tang
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
31/08/23•45m 45s
Esther Perel on Artificial Intimacy
For all the talk about AI, we rarely hear about how it will change our relationships. As we swipe to find love and consult chatbot therapists, acclaimed psychotherapist and relationship expert Esther Perel warns that another harmful "AI" is on the rise, Artificial Intimacy, and that it is depriving us of real connection. Tristan and Esther discuss how depending on algorithms can fuel alienation, and then imagine how we might design technology to strengthen our social bonds.
RECOMMENDED MEDIA
Mating in Captivity by Esther Perel: Esther's debut work on the intricacies behind modern relationships, and the dichotomy of domesticity and sexual desire
The State of Affairs by Esther Perel: Esther takes a look at modern relationships through the lens of infidelity
Where Should We Begin? with Esther Perel: Listen in as real couples in search of help bare the raw and profound details of their stories
How's Work? with Esther Perel: Esther's podcast that focuses on the hard conversations we're afraid to have at work
Lars and the Real Girl (2007): A young man strikes up an unconventional relationship with a doll he finds on the internet
Her (2013): In a near future, a lonely writer develops an unlikely relationship with an operating system designed to meet his every need
RECOMMENDED YUA EPISODES
Big Food, Big Tech and Big AI with Michael Moss
The AI Dilemma
The Three Rules of Humane Tech
Digital Democracy is Within Reach with Audrey Tang
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
17/08/23•44m 7s
Protecting Our Freedom of Thought with Nita Farahany
We are on the cusp of an explosion of cheap, consumer-ready neurotechnology, from earbuds that gather our behavioral data to sensors that can read our dreams. And it's all going to be supercharged by AI. This technology is moving from niche to mainstream, and it has the same potential to become exponential.
Legal scholar Nita Farahany talks us through the current state of neurotechnology and its deep links to AI. She says that we urgently need to protect the last frontier of privacy: our internal thoughts. And she argues that without a new legal framework around "cognitive liberty," we won't be able to insulate our brains from corporate and government intrusion.
RECOMMENDED MEDIA
The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology by Nita Farahany: The Battle for Your Brain offers a path forward to navigate the complex dilemmas that will fundamentally impact our freedom to understand, shape, and define ourselves
Computer Program Reveals What Neurons in the Visual Cortex Prefer to Look At: A study of macaque monkeys at Harvard generated valuable clues based on an artificial intelligence system that can reliably determine what neurons in the brain's visual cortex prefer to see
Understanding Media: The Extensions of Man by Marshall McLuhan: An influential work by a fixture in media discourse
RECOMMENDED YUA EPISODES
The Three Rules of Humane Tech
Talking With Animals... Using AI
How to Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
03/08/23•44m 7s
Social Media Victims Lawyer Up with Laura Marquez-Garrett
Social media was humanity's 'first contact' moment with AI. If we're going to create laws that are strong enough to prevent AI from destroying our societies, we could benefit from taking a look at the major lawsuits against social media platforms that are playing out in our courts right now.
In our last episode, we took a close look at Big Food and its dangerous "race to the bottom" that parallels AI. We continue that theme this week with an episode about litigating social media and the consequences of the race to engagement, in order to inform how we can approach AI harms. Our guest, attorney Laura Marquez-Garrett, left her predominantly defense-oriented practice to join the Social Media Victims Law Center in February 2022. Laura is on the front lines of the battle to hold social media firms accountable for the harms they have created in young people's lives over the past decade.
Listener warning: there are distressing and potentially triggering details within the episode.
Correction: Tristan refers to the Social Media Victims Law Center as a nonprofit legal center. They are a for-profit law firm.
RECOMMENDED MEDIA
1) If you're a parent whose child has been impacted by social media, Attorneys General in Colorado, New Hampshire, and Tennessee are asking to hear your story. Your testimonies can help ensure that social media platforms are designed safely for kids. For more information, please visit the respective state links.
Colorado
New Hampshire
Tennessee
2) Social Media Victims Law Center: A legal center that was founded in 2021 in response to the testimony of Facebook whistleblower Frances Haugen
3) Resources for Parents & Educators: Overwhelmed by our broken social media environment and wondering where to start? Check out our Youth Toolkit plus three actions you can take today
4) The Social Dilemma: Learn how the system works. Watch and share The Social Dilemma with people you care about
RECOMMENDED YUA EPISODES
Transcending the Internet Hate Game with Dylan Marron
A Conversation with Facebook Whistleblower Frances Haugen
Behind the Curtain on The Social Dilemma with Jeff Orlowski-Yang and Larissa Rhodes
Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
21/07/23•34m 54s
Big Food, Big Tech and Big AI with Michael Moss
In the next two episodes of Your Undivided Attention, we take a close look at two industries: big food and social media, which represent dangerous “races to the bottom” and have big parallels with AI. And we are asking: what can our past mistakes and missed opportunities teach us about how we should approach AI harms? In this first episode, Tristan talks to Pulitzer Prize-winning journalist and author Michael Moss. His book Salt Sugar Fat: How the Food Giants Hooked Us rocked the processed food industry when it came out in 2013. Tristan and Michael discuss how we can leverage the lessons learned from Big Food’s coordination failures, and whether the responsibility for regulation lies with consumers, the government, or the companies themselves. RECOMMENDED MEDIA Salt Sugar Fat: How the Food Giants Hooked UsMichael’s New York Times bestseller. You’ll never look at a nutrition label the same way againHooked: Food, Free Will, and How the Food Giants Exploit Our AddictionsMichael’s follow-up exposé of how the processed food industry exploits our evolutionary instincts, the emotions we associate with food, and legal loopholes in their pursuit of profit over public healthControl Your Tech UseCenter for Humane Technology’s recently updated Take Control ToolkitRECOMMENDED YUA EPISODESAI Myths and MisconceptionsThe AI DilemmaHow might a long-term stock market transform tech? (ZigZag episode) Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
06/07/23•34m 43s
What Can Technologists Learn from Sesame Street? With Dr. Rosemarie Truglio
What happens when creators consider what lifelong human development looks like in terms of the tools we make? And what philosophies from Sesame Street can inform how to steward the power of AI and social media to influence minds in thoughtful, humane directions?When the first episode of Sesame Street aired on PBS in 1969, it was unlike anything that had been on television before - a collaboration between educators, child psychologists, comedy writers and puppeteers - all working together to do something that had never been done before: create educational content for children on television. Fast-forward to the present: could we switch gears to reprogram today’s digital tools to humanely educate the next generation? That’s the question Tristan Harris and Aza Raskin explore with Dr. Rosemarie Truglio, the Senior Vice President of Curriculum and Content for the Sesame Workshop, the non-profit behind Sesame Street. RECOMMENDED MEDIA Street Gang: How We Got to Sesame StreetThis documentary offers a rare window into the early days of Sesame Street, revealing the creators, artists, writers and educators who together established one of the most influential and enduring children’s programs in television historySesame Street: Ready for School!: A Parent's Guide to Playful Learning for Children Ages 2 to 5 by Dr. Rosemarie TruglioRosemarie shares all the research-based, curriculum-directed school readiness skills that have made Sesame Street the preeminent children's TV programG Is for Growing: Thirty Years of Research on Children and Sesame Street co-edited by Shalom Fisch and Rosemarie TruglioThis volume serves as a marker of the significant role that Sesame Street plays in the education and socialization of young childrenThe Democratic Surround by Fred TurnerIn this prequel to his celebrated book From Counterculture to Cyberculture, Turner rewrites the history of postwar America, showing how in the 1940s and 1950s American liberalism offered a far more radical social vision than we now rememberAmusing Ourselves to Death by Neil PostmanNeil Postman’s groundbreaking book about the damaging effects of television on our politics and public discourse has been hailed as a twenty-first-century book published in the twentieth centurySesame Workshop Identity Matters StudyExplore parents’ and educators’ perceptions of children’s social identity developmentEffects of Sesame Street: A meta-analysis of children's learning in 15 countriesCommissioned by Sesame Workshop, the study was led by University of Wisconsin researchers Marie-Louise Mares and Zhongdang PanU.S. Parents & Teachers See an Unkind World for Their Children, New Sesame Survey ShowsAccording to the survey titled, “K is for Kind: A National Survey On Kindness and Kids,” parents and teachers in the United States worry that their children are living in an unkind worldRECOMMENDED YUA EPISODESAre the Kids Alright? With Jonathan HaidtThe Three Rules of Humane TechWhen Media Was for You and Me with Fred Turner Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
22/06/23•29m 36s
Spotlight: How Zombie Values Infect Society
You’re likely familiar with the modern zombie trope: a zombie bites someone you care about and they’re transformed into a creature who wants your brain. Zombies are the perfect metaphor to explain something Tristan and Aza have been thinking about lately that they call zombie values.In this Spotlight episode of Your Undivided Attention, we talk through some examples of how zombie values limit our thinking around tech harms. Our hope is that by the end of this episode, you'll be able to recognize the zombie values that walk amongst us, and think through how to upgrade these values to meet the realities of our modern world. RECOMMENDED MEDIA Is the First Amendment Obsolete?Tim Wu’s essay asking whether First Amendment doctrine, built for an era of information scarcity, can meet today’s threats to free expressionThe Wisdom GapThis blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of themRECOMMENDED YUA EPISODES A Problem Well-Stated is Half-Solved with Daniel SchmachtenbergerHow To Free Our Minds with Cult Deprogramming Expert Dr. Steven HassanYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
08/06/23•22m 56s
Feed Drop: AI Doomsday with Kara Swisher
There’s really no one better than veteran tech journalist Kara Swisher at challenging people to articulate their thinking. Tristan Harris recently sat down with her for a wide-ranging interview on AI risk. She even pressed Tristan on whether he is a doomsday prepper. It was so great, we wanted to share it with you here. The interview was originally on Kara’s podcast ON with Kara Swisher. If you like it and want to hear more of Kara’s interviews with folks like Sam Altman, Reid Hoffman and others, you can find more episodes of ON with Kara Swisher here: https://link.chtbl.com/_XTWwg3kRECOMMENDED YUA EPISODES AI Myths and MisconceptionsThe AI DilemmaThe Three Rules of Humane TechYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
02/06/23•55m 34s
The Tech We Need for 21st Century Democracy with Divya Siddarth
Democracy in action has looked the same for generations. Constituents might go to a library or school every one or two years and cast their vote for people who don't actually represent everything that they care about. Our technology is rapidly increasing in sophistication, yet our forms of democracy have largely remained unchanged. What would an upgrade look like - not just for democracy, but for all the different places that democratic decision-making happens?On this episode of Your Undivided Attention, we’re joined by political economist and social technologist Divya Siddarth, one of the world's leading experts in collective intelligence. Together we explore how new kinds of governance can be supported through better technology, and how collective decision-making is key to unlocking everything from more effective elections to better ways of responding to global problems like climate change.Correction:Tristan mentions Elon Musk’s attempt to manufacture ventilators early on in the COVID-19 pandemic. Musk ended up buying over 1,200 ventilators that were delivered to California.RECOMMENDED MEDIAAgainst Democracy by Jason BrennanA provocative challenge to one of our most cherished institutionsLedger of HarmsTechnology platforms have created a race for human attention that’s unleashed invisible harms to society. Here are some of the costs that aren't showing up on their balance sheetsThe Wisdom GapThis blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of themDemocracyNextDemocracyNext is working to design and establish new institutions for government and transform the governance of organizations that influence public lifeCIP.orgAn incubator for new governance models for transformative technologyEtheloTransform community engagement through consensusKazm’s Living Room ConversationsLiving Room Conversations works to heal society by connecting people across divides through guided conversations proven to build understanding and transform communitiesThe Citizens DialogueA model for citizen participation in Ostbelgien, which was brought to life by the parliament of the German-speaking communityAsamblea Ciudadana Para El ClimaSpain’s national citizens’ assembly on climate changeClimate Assembly UKThe UK’s national citizens’ assembly on climate changeCitizens’ Convention for the ClimateFrance’s national citizens’ assembly on climate changePolisPolis is a real-time system for gathering, analyzing and understanding what large groups of people think in their own words, enabled by advanced statistics and machine learningRECOMMENDED YUA EPISODESDigital Democracy is Within Reach with Audrey Tang They Don’t Represent Us with Larry LessigA Renegade Solution to Extractive Economics with Kate RaworthYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
25/05/23•38m 39s
Spotlight: AI Myths and Misconceptions
A few episodes back, we presented Tristan Harris and Aza Raskin’s talk The AI Dilemma. People inside the companies that are building generative artificial intelligence came to us with their concerns about the rapid pace of deployment and the problems that are emerging as a result. We felt called to lay out the catastrophic risks that AI poses to society and sound the alarm on the need to upgrade our institutions for a post-AI world.The talk resonated - over 1.6 million people have viewed it on YouTube as of this episode’s release date. The positive reception gives us hope that leaders will be willing to come to the table for a difficult but necessary conversation about AI.However, now that so many people have watched or listened to the talk, we’ve found that there are some AI myths getting in the way of making progress. On this episode of Your Undivided Attention, we debunk five of those misconceptions. RECOMMENDED MEDIA Opinion | Yuval Harari, Tristan Harris, and Aza Raskin on Threats to Humanity Posed by AI - The New York TimesIn this New York Times piece, Yuval Harari, Tristan Harris, and Aza Raskin call upon world leaders to respond to this moment at the level of challenge it presents.Misalignment, AI & MolochA deep dive into the game theory and exponential growth underlying our modern economic system, and how recent advancements in AI are poised to turn up the pressure on that system, and its wider environment, in ways we have never seen beforeRECOMMENDED YUA EPISODESThe AI DilemmaThe Three Rules of Humane TechCan We Govern AI? with Marietje SchaakeYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
11/05/23•26m 48s
Talking With Animals… Using AI
Despite our serious concerns about the pace of deployment of generative artificial intelligence, we are not anti-AI. There are uses that can help us better understand ourselves and the world around us. Your Undivided Attention co-host Aza Raskin is also co-founder of Earth Species Project, a nonprofit dedicated to using AI to decode non-human communication. ESP is developing this technology both to shift the way that we relate to the rest of nature, and to accelerate conservation research.Significant recent breakthroughs in machine learning have opened ways to encode both human languages and map out patterns of animal communication. The research, while slow and incredibly complex, is very exciting. Picture being able to tell a whale to dive to avoid ship strikes, or to forge cooperation in conservation areas. These advances come with their own complex ethical issues. But understanding non-human languages could transform our relationship with the rest of nature and promote a duty of care for the natural world.In a time of such deep division, it’s comforting to know that hidden underlying languages may potentially unite us. When we study the patterns of the universe, we’ll see that humanity isn’t at the center of it. Corrections:Aza refers to the founding of Earth Species Project (ESP) in 2017. The organization was established in 2018.When offering examples of self-awareness in animals, Aza mentions lemurs that get high on centipedes. They actually get high on millipedes. RECOMMENDED MEDIA Using AI to Listen to All of Earth’s SpeciesAn interactive panel discussion hosted at the World Economic Forum in San Francisco on October 25, 2022. Featuring ESP President and Cofounder Aza Raskin; Dr. Karen Bakker, Professor at UBC and Harvard Radcliffe Institute Fellow; and Dr. Ari Friedlaender, Professor at UC Santa CruzWhat A Chatty Monkey May Tell Us About Learning to TalkThe gelada monkey makes a gurgling sound that scientists say is close to human speechLemurs May Be Making Medicine Out of MillipedesRed-fronted lemurs appear to use plants and other animals to treat their afflictionsFathom on AppleTV+Two biologists set out on an undertaking as colossal as their subjects – deciphering the complex communication of whales Earth Species Project is Hiring a Director of ResearchESP is looking for a thought leader in artificial intelligence with a track record of managing a team of researchers RECOMMENDED YUA EPISODES The Three Rules of Humane TechThe AI DilemmaSynthetic Humanity: AI & What’s At Stake Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
04/05/23•24m 1s
Can We Govern AI?
When it comes to AI, what kind of regulations might we need to address this rapidly developing new class of technologies? What makes regulating AI and runaway tech in general different from regulating airplanes, pharmaceuticals, or food? And how can we ensure that issues like national security don't become a justification for sacrificing civil rights?Answers to these questions are playing out in real time. If we wait for more AI harms to emerge before proper regulations are put in place, it may be too late. Our guest Marietje Schaake was at the forefront of crafting tech regulations for the EU. In spite of AI’s complexity, she argues there is a path forward for the U.S. and other governing bodies to rein in companies that continue to release these products into the world without oversight. Correction: Marietje said antitrust laws in the US were a century ahead of those in the EU. Competition law in the EU was enacted as part of the Treaty of Rome in 1957, almost 70 years after the US. RECOMMENDED MEDIA The AI Dilemma Tristan Harris and Aza Raskin’s presentation on existing AI capabilities and the catastrophic risks they pose to a functional society. Also available in the podcast format (linked below)The Wisdom GapThis blog post from the Center for Humane Technology describes the gap between the rising interconnected complexity of our problems and our ability to make sense of themThe EU’s Digital Services Act (DSA) & Digital Markets Act (DMA)The two pieces of legislation aim to create safer and more open digital spaces for individuals and businesses alike RECOMMENDED YUA EPISODESDigital Democracy is Within Reach with Audrey TangThe AI DilemmaThe Three Rules of Humane TechYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
21/04/23•39m 47s
Spotlight: The Three Rules of Humane Tech
In our previous episode, we shared a presentation Tristan and Aza recently delivered to a group of influential technologists about the race happening in AI. In that talk, they introduced the Three Rules of Humane Technology. In this Spotlight episode, we’re taking a moment to explore these three rules more deeply in order to clarify what it means to be a responsible technologist in the age of AI.Correction: Aza mentions infinite scroll being in the pockets of 5 billion people, implying that there are 5 billion smartphone users worldwide. The number of smartphone users worldwide is actually 6.8 billion now. RECOMMENDED MEDIA We Think in 3D. Social Media Should, TooTristan Harris writes about a simple visual experiment that demonstrates the power of one’s point of viewLet’s Think About Slowing Down AIKatja Grace’s piece about how to avert doom by not building the doom machineIf We Don’t Master AI, It Will Master UsYuval Harari, Tristan Harris and Aza Raskin call upon world leaders to respond to this moment at the level of challenge it presents in this New York Times opinion piece RECOMMENDED YUA EPISODES The AI DilemmaSynthetic humanity: AI & What’s At Stake Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
06/04/23•22m 17s
The AI Dilemma
You may have heard about the arrival of GPT-4, OpenAI’s latest large language model (LLM) release. GPT-4 surpasses its predecessor in terms of reliability, creativity, and ability to process intricate instructions. It can handle more nuanced prompts compared to previous releases, and is multimodal, meaning it was trained on both images and text. We don’t fully understand its capabilities - yet it has already been deployed to the public.At Center for Humane Technology, we want to close the gap between what the world hears publicly about AI from splashy CEO presentations and what the people who are closest to the risks and harms inside AI labs are telling us. We translated their concerns into a cohesive story and presented the resulting slides to heads of institutions and major media organizations in New York, Washington DC, and San Francisco. The talk you're about to hear is the culmination of that work, which is ongoing.AI may help us achieve major advances like curing cancer or addressing climate change. But the point we're making is: if our dystopia is bad enough, it won't matter how good the utopia we want to create is. We only get one shot, and we need to move at the speed of getting it right.RECOMMENDED MEDIAAI ‘race to recklessness’ could have dire consequences, tech experts warn in new interviewTristan Harris and Aza Raskin sit down with Lester Holt to discuss the dangers of developing AI without regulationThe Day After (1983)This made-for-television movie explored the effects of a devastating nuclear holocaust on small-town residents of KansasThe Day After discussion panelModerated by journalist Ted Koppel, a panel of present and former US officials, scientists and writers discussed nuclear weapons policies live on television after the film airedZia Cora - Submarines “Submarines” is a collaboration between musician Zia Cora (Alice Liu) and Aza Raskin. The music video was created by Aza in less than 48 hours using AI technology and published in early 2022RECOMMENDED YUA EPISODES Synthetic Humanity: AI & What’s At StakeA Conversation with Facebook Whistleblower Frances HaugenTwo Million Years in Two Hours: A Conversation with Yuval Noah HarariYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
24/03/23•42m 25s
TikTok’s Transparency Problem
A few months ago on Your Undivided Attention, we released a Spotlight episode on TikTok's national security risks. Since then, we've learned more about the dangers of the China-owned company: We've seen evidence of TikTok spying on US journalists, and proof of hidden state media accounts to influence US elections. We’ve seen Congress ban TikTok on most government-issued devices, and more than half of US states have done the same, along with dozens of US universities that are banning TikTok access from university wifi networks. More people in Western governments and media are saying that they used to believe TikTok was an overblown threat. As we've seen more evidence of national security risks play out, there’s even talk of banning TikTok itself in certain countries. But is that the best solution? If we opt for a ban, how do we, as open societies, fight accusations of authoritarianism? On this episode of Your Undivided Attention, we're going to do a deep dive into these questions with Marc Faddoul. He's the co-director of Tracking Exposed, a nonprofit investigating the influence of social media algorithms in our lives. His work has shown how TikTok tweaks its algorithm to maximize partisan engagement in specific national elections, and how it bans international news in countries like Russia that are fighting propaganda battles inside their own borders. In other words, we don't all get the same TikTok because there are different geopolitical interests that might guide which TikTok you see. That is a kind of soft power that TikTok exercises on a global scale, and it doesn’t get talked about often enough.We hope this episode leaves you with a lot to think about in terms of what the risks of TikTok are, how it's operating geopolitically, and what we can do about it.RECOMMENDED MEDIATracking Exposed Special Report: TikTok Content Restriction in RussiaHow has the Russian invasion of Ukraine affected the content that TikTok users see in Russia? [Part 1 of Tracking Exposed series]Tracking Exposed Special Report: Content Restrictions on TikTok in Russia Following the Ukrainian WarHow are TikTok’s policy decisions affecting pro-war and anti-war content in Russia? [Part 2 of Tracking Exposed series]Tracking Exposed Special Report: French Elections 2022The visibility of French candidates on TikTok and YouTube search enginesThe Democratic Surround by Fred TurnerA dazzling cultural history that demonstrates how American intellectuals, artists, and designers from the 1930s-1960s imagined new kinds of collective events that were intended to promote a powerful experience of American democracy in actionRECOMMENDED YUA EPISODESWhen Media Was for You and Me with Fred TurnerAddressing the TikTok ThreatA Fresh Take on Tech in China with Rui Ma and Duncan ClarkYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
02/03/23•37m 13s
Synthetic Humanity: AI & What’s At Stake
It may seem like the rise of artificial intelligence, and increasingly powerful large language models you may have heard of, is moving really fast… and it IS. But what’s coming next is when we enter synthetic relationships with AI that could come to feel just as real and important as our human relationships... And perhaps even more so. In this episode of Your Undivided Attention, Tristan and Aza reach beyond the moment to talk about this powerful new AI, and the new paradigm of humanity and computation we’re about to enter. This is a structural revolution that affects way more than text, art, or even Google search. There are huge benefits to humanity, and we’ll discuss some of those. But we also see that as companies race to develop the best synthetic relationships, we are setting ourselves up for a new generation of harms made exponentially worse by AI’s power to predict, mimic and persuade.It’s obvious we need ways to steward these tools ethically. So Tristan and Aza also share their ideas for creating a framework for AIs that will help humans become MORE humane, not less.RECOMMENDED MEDIA Cybernetics: or, Control and Communication in the Animal and the Machine by Norbert WienerA classic and influential work that laid the theoretical foundations for information theoryNew Chatbots Could Change the World. Can You Trust Them?The New York Times addresses misinformation and how Siri, Google Search, online marketing and your child’s homework will never be the sameOut of One, Many: Using Language Models to Simulate Human Samples by Lisa P. Argyle, Ethan C. Busby, Nancy Fulda, Joshua Gubler, Christopher Rytting, David WingateThis paper proposes and explores the possibility that language models can be studied as effective proxies for specific human sub-populations in social science researchEarth Species ProjectEarth Species Project, co-founded by Aza Raskin, is a non-profit dedicated to using artificial intelligence to decode non-human communicationHer (2013)A science-fiction romantic drama film written, directed, and co-produced by Spike JonzeWhat A Chatty Monkey May Tell Us About Learning To TalkNPR explores the fascinating world of gelada monkeys and the way they communicateRECOMMENDED YUA EPISODESHow Political Language is Engineered with Drew Westen & Frank LuntzWhat is Humane Technology?Down the Rabbit Hole by Design with Guillaume Chaslot
16/02/23•46m 25s
The Race to Cooperation
It’s easy to tell ourselves we’re living in the world we want – one where Darwinian evolution drives competing technology platforms and capitalism pushes nations to maximize GDP regardless of externalities like carbon emissions. It can feel like evolution and competition are all there is.If that’s a complete description of what’s driving the world and our collective destiny, that can feel pretty hopeless. But what if that’s not the whole story of evolution? This is where evolutionary theorist, author, and professor David Sloan Wilson comes in. He has documented how an enlightened game of cooperation, rather than competition, is possible. His work shows that humans can and have chosen values like cooperation, altruism and group success – versus individual competition and selfishness – at key moments in our evolution, proving that evolution isn’t just genetic. It’s cultural, and it’s a choice. In a world where our trajectory isn’t tracking in the direction we want, it's time to slow down and ask: is a different kind of conscious evolution possible? On Your Undivided Attention, we’re going to update the Darwinian principles of evolution using a critical scientific lens that can help upgrade our ability to cooperate – ranging from small communities all the way to entire technology companies that can cooperate in ways that allow everyone to succeed. RECOMMENDED MEDIAThis View of Life: Completing the Darwinian Revolution by David Sloan WilsonProsocial: Using Evolutionary Science to Build Productive, Equitable, and Collaborative Groups by David Sloan WilsonAtlas Hugged: The Autobiography of John Galt III by David Sloan WilsonGoverning the Commons: The Evolution of Institutions for Collective Action by Elinor OstromHit Refresh by Satya NadellaWTF? What’s the Future and Why It’s Up to Us by Tim O’ReillyHard Drive: Bill Gates and the Making of the Microsoft Empire by James Wallace & Jim Erickson RECOMMENDED YUA EPISODES An Alternative to Silicon Valley Unicorns with Mara Zepeda & Kate “Sassy” SassoonA Problem Well-Stated is Half-Solved with Daniel Schmachtenberger Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
02/02/23•34m 57s
Ask Us Anything: You Asked, We Answered
Welcome to our first-ever Ask Us Anything episode. Recently we put out a call for questions… and, wow, did you come through! We got more than 100 responses from listeners to this podcast from all over the world. It was really fun going through them all, and really difficult to choose which ones to answer here. But we heard you, and we’ll carry your amazing suggestions and ideas forward with us in 2023.When we created Your Undivided Attention, the goal was to explore the incredible power technology has over our lives, and how we can use it to catalyze a humane future. Three years and a global pandemic later, we’re more committed than ever to helping meet the moment with crucial conversations about humane technology - even as the tech landscape constantly evolves and world events bring more urgency to the need for technology that unites us, invests in democratic values, and enhances our well-being.We’ve learned from our guests alongside all of you. Sixty-one episodes later, the podcast has over 16 million unique downloads! That’s a lot of people who care about the promise of humane technology and are working to construct a more humane version of technology in their lives, their family’s lives, and within their communities and society at large. We’re a movement! Thank you to everyone who submitted questions and comments for us. We loved doing this, and we’re looking forward to doing it again!Correction:When discussing DeepMind’s recent paper, Aza said the premise was four people entering their views and opinions, with AI finding the commonality between all of those viewpoints. It was actually three people entering their views and opinions.RECOMMENDED MEDIA CHT’s Recommended Reading List:Foundations of Humane TechnologyOur free, self-paced online course for professionals shaping tomorrow’s technologyThe Age of Surveillance Capitalism by Shoshana Zuboff Foundational reading on the attention economyAlgorithms of Oppression by Safiya Umoja Noble Seminal work on how algorithms in search engines replicate and reinforce bias online and offlineAmusing Ourselves to Death by Neil Postman Written in 1985, Postman’s work shockingly predicts our current media environment and its effectsAttention Merchants by Tim WuA history of how advertisers capture our attentionDoughnut Economics by Kate Raworth A compass for how to upgrade our economic models to be more regenerative and distributiveThinking in Systems by Donella MeadowsThis excellent primer shows us how to develop systems thinking skillsWhat Money Can’t Buy: The Moral Limits of Markets by Michael SandelSandel explores how we can prevent market values from reaching into spheres of life where they don’t belongEssay: Disbelieving Atrocities by Arthur KoestlerOriginally published January 9, 1944 in The New York TimesHumane Technology reading listComprehensive for those who want to geek outORGANIZATIONS TO EXPLORE Integrity InstituteIntegrity Institute advances the theory and practice of protecting the social internet, powered by their community of integrity professionalsAll Tech Is Human job boardAll Tech Is Human curates roles focused on reducing the harms of technology, diversifying the tech pipeline, and ensuring that technology is aligned with the public interestDenizenDenizen brings together leaders across disciplines to accelerate systemic changeNew_PublicNew_Public is place for thinkers, builders, designers and technologists to meet and share inspirationPsychology of Technology InstitutePTI is non-profit network of behavioral scientists, technology 
designers, and decision-makers that protects and improves psychological health for society by advancing our understanding and effective use of transformative technologiesRadicalxChangeRxC is a social movement for next-generation political economiesThe School for Social DesignThe School for Social Design offers three courses on articulating what’s meaningful for different people and how to design for it at smaller and larger scalesTechCongressTechCongress is a technology policy fellowship on Capitol HillRECOMMENDED YUA EPISODES An Alternative to Silicon Valley Unicornshttps://www.humanetech.com/podcast/54-an-alternative-to-silicon-valley-unicornsA Problem Well-Stated is Half-Solvedhttps://www.humanetech.com/podcast/a-problem-well-stated-is-half-solvedDigital Democracy is Within Reachhttps://www.humanetech.com/podcast/23-digital-democracy-is-within-reachYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
29/12/22•42m 51s
Can Psychedelic Therapy Reset Our Social Media Brains?
When you look at the world, it can feel like we're in a precarious moment. If you’ve listened to past episodes, you know we call this the meta-crisis — an era of overlapping and interconnected crises like climate change, polarization, and the rise of decentralized technologies like synthetic biology. It can feel like we’re on a path to destroy ourselves.That's why we’re talking to Rick Doblin, the founder and executive director of the Multidisciplinary Association for Psychedelic Studies, or MAPS. They’re a nonprofit focused on educating and researching the benefits of using psychedelic therapy to address PTSD and promote humane ways of relating worldwide.Doblin’s vision is for nothing less than a transformation of society through psychedelic-assisted therapy – not for the drugs themselves, but for their ability to help us react to one another with compassion, appreciate differences, and accept criticism.Given the perma-crisis we face, it’s provocative to think about a tool that, when prescribed and used safely, could help us overcome rivalrous dynamics out in the world and on social media. If we rescue our hijacked brains, we can heal from the constant trauma inflation we get online, and shrink the perception gap that splits us into tribes.Both MAPS and Center for Humane Technology want to understand what helps minds heal and be free. We invite you to keep an open mind about a different kind of humane technology as you listen to this episode. Correction: Doblin attributes a quote to Stan Grof about psychedelics helping your ego be “transparent to the transcendent.” In his book Pathways to Bliss, Joseph Campbell wrote, "When a deity serves as a model for you, your life becomes transparent to the transcendent as long as you realize the inspiring power of that deity. This means living not in the name of worldly success and achievement, but rather in the name of the transcendent, letting the energy manifest through you.” Grof was likely paraphrasing Campbell’s work and applying it to psychedelics. Additional credits:The episode contains an original musical composition by Jeff Sudakin. Used with permission. RECOMMENDED MEDIA Multidisciplinary Association for Psychedelic Studies (MAPS)The non-profit founded by Rick Doblin in 1986 focused on developing medical, legal, and cultural contexts for people to benefit from the careful uses of psychedelics and marijuana. MAPS has some open clinical trials; see details on their website. 
Rick Doblin’s TED talkIn this fascinating dive into the science of psychedelics, Doblin explains how drugs like LSD, psilocybin and MDMA affect your brain - and shows how, when paired with psychotherapy, they could change the way we treat PTSD, depression, substance abuse and more.How to Change Your Mind by Michael PollanPollan writes of his own consciousness-expanding experiments with psychedelic drugs, and makes the case for why shaking up the brain's old habits could be therapeutic for people facing addiction, depression, or death.How to Change Your Mind on NetflixThe docuseries version of Pollan’s bookBreath by James NestorThis popular science book provides a historical, scientific and personal account of breathing, with special focus on the differences between mouth breathing and nasal breathing.Insight timerA free app for sleep, anxiety, and stress RECOMMENDED YUA EPISODES You Will Never Breathe the Same Again with James Nestorhttps://www.humanetech.com/podcast/38-you-will-never-breathe-the-same-againTwo Million Years in Two Hours: A Conversation with Yuval Noah Harari https://www.humanetech.com/podcast/28-two-million-years-in-two-hours-a-conversation-with-yuval-noah-harariYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
15/12/22•42m 51s
Real Social Media Solutions, Now — with Frances Haugen
When it comes to social media risk, there is reason to hope for consensus. Center for Humane Technology co-founder Tristan Harris recently helped launch a new initiative called the Council for Responsible Social Media (CRSM) in Washington, D.C. It’s a coalition between religious leaders, public health experts, national security leaders, and former political representatives from both sides - people who just care about making our democracy work.During this event, Tristan sat down with Facebook whistleblower Frances Haugen, a friend of Center for Humane Technology, to discuss the harm caused to our mental health and global democracy when platforms lack accountability and transparency. The CRSM is bipartisan, and its kickoff serves to boost the solutions Frances and Tristan identify going into 2023.RECOMMENDED MEDIA Council for Responsible Social Media (CRSM)A project of Issue One, CRSM is a cross-partisan group of leaders addressing the negative mental, civic, and public health impacts of social media in America.Twitter Whistleblower Testifies on Security IssuesPeiter “Mudge” Zatko, a former Twitter security executive, testified on privacy and security issues relating to the social media company before the Senate Judiciary Committee.Beyond the ScreenBeyond the Screen is a coalition of technologists, designers, and thinkers fighting against online harms, led by the Facebook whistle-blower Frances Haugen.#OneClickSafer CampaignOur campaign to pressure Facebook to make one immediate change — join us!RECOMMENDED YUA EPISODES A Conversation with Facebook Whistleblower Frances Haugenhttps://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugenA Facebook Whistleblower: Sophie Zhanghttps://www.humanetech.com/podcast/episode-37-a-facebook-whistleblowerMr. Harris Zooms to Washington https://www.humanetech.com/podcast/episode-35-mr-harris-zooms-to-washingtonWith Great Power Comes… No Responsibility? https://www.humanetech.com/podcast/3-with-great-power-comes-no-responsibilityYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
23/11/22•26m 54s
Spotlight — Humane Technology on '60 Minutes'
The weekly American news show 60 Minutes invited Center for Humane Technology co-founder Tristan Harris back recently to discuss political polarization and the anger and incivility that gets elevated on social media as a matter of corporate profit. We're releasing a special episode of Your Undivided Attention this week to dig further into some of the important nuances of the complexity of this problem.CHT’s work was actually introduced to the world by Anderson Cooper on 60 Minutes back in 2017, and we’re honored to have been invited back. In this new interview, we cover the business model of competing for engagement at all costs - the real root of the problem that we’re thrilled to be able to discuss on a far-reaching platform.We also busted the myth that if you’re not on social media, you don’t need to be concerned. Even if you're not on social media, you likely live in a country that will vote based on other people’s collective choices and behaviors. We know that the media we engage with shapes the people who consume it. CORRECTION: Tristan notes that Facebook's Head of Global Policy, Monika Bickert, says in the interview that social media can't be the root of America's anger because it's people over the age of 60 who are most polarized. She actually said that people over the age of 65 are most polarized.RECOMMENDED MEDIA60 Minutes: “Social Media and Political Polarization in America”https://humanetech.com/60minutesAmusing Ourselves to Death by Neil Postmanhttps://www.penguinrandomhouse.com/books/297276/amusing-ourselves-to-death-by-neil-postman/Neil Postman’s groundbreaking book about the damaging effects of television on our politics and public discourse has been hailed as a twenty-first-century book published in the twentieth century.60 Minutes: “Brain Hacking”https://www.youtube.com/watch?v=awAMTQZmvPERECOMMENDED YUA EPISODES Elon, Twitter, and the Gladiator Arenahttps://www.humanetech.com/podcast/elon-twitter-and-the-gladiator-arenaAddressing the TikTok Threathttps://www.humanetech.com/podcast/bonus-addressing-the-tiktok-threatWhat is Civil War In The Digital Age? With Barbara F Walterhttps://www.humanetech.com/podcast/50-what-is-civil-war-in-the-digital-age
10/11/22•12m 5s
Spotlight — Elon, Twitter and the Gladiator Arena
Since it’s looking more and more like Elon Musk, CEO of Tesla and SpaceX, will probably soon have ownership of Twitter, we wanted to do a special episode about what this could mean for Twitter users and our global digital democracy as a whole.Twitter is a very complicated place. It is routinely blocked by governments who fear its power to organize citizen protests around the world. It’s also where outrage, fear and violence get amplified by design, warping users’ views of each other and our common, connected humanity.We’re at a fork in the road, and we know enough about humane design principles to do this better. So we thought we would do a little thought experiment: What if we applied everything we know about humane technology to Twitter, starting tomorrow? What would happen?This is the second part in a two-part conversation about Twitter that we’ve had on Your Undivided Attention about Elon Musk’s bid for Twitter and what it could mean in the context of the need to go in a more humane direction.RECOMMENDED MEDIA On Liberty by John Stuart MillPublished in 1859, this philosophical essay applies Mill's ethical system of utilitarianism to society and stateElon Musk Only Has “Yes” Men by Jonathan L. FischerReporting from Slate on the subject Foundations of Humane TechnologyThe Center for Humane Technology's free online course for professionals shaping tomorrow's technologyRECOMMENDED YUA EPISODES A Bigger Picture on Elon and Twitterhttps://www.humanetech.com/podcast/bigger-picture-elon-twitterTranscending the Internet Hate Game with Dylan Marronhttps://www.humanetech.com/podcast/52-transcending-the-internet-hate-gameFighting With Mirages of Each Other with Adam Mastroiannihttps://www.humanetech.com/podcast/56-fighting-with-mirages-of-each-otherYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
27/10/22•17m 36s
They Don’t Represent Us — with Larry Lessig
We often talk about the need to protect American democracy. But perhaps those of us in the United States don't currently live in a democracy.As research shows, there's pretty much no correlation between the percentage of the population that supports a policy and its likelihood of being enacted. The strongest determinant of whether a policy gets enacted is how much money is behind it.So, how might we not just protect, but better yet revive our democracy? How might we revive the relationship between the will of the people and the actions of our government?This week on Your Undivided Attention, we're doing something special. As we near the election, and representation is on our minds, we're airing a talk by Harvard Law professor and Creative Commons co-founder Larry Lessig. It's a 2019 talk he gave at the Politics and Prose bookstore in Washington, DC about his book, They Don't Represent Us.The book title has two meanings: first, they — as in our elected representatives — don't represent us. And second, we — as in the people — don't represent ourselves. And this is where social media comes in: we don't represent ourselves because the more we use social media, the more we see extreme versions of the other side, and the more extreme, outraged, and polarized we ourselves become.Last note: Lessig's talk is highly visual. We edited it lightly for clarity, and jump in periodically to narrate things you can’t see. But if you prefer to watch his talk, you can find the link below in Recommended Media. RECOMMENDED MEDIA Video: They Don't Represent UsThe 2019 talk Larry Lessig gave at Politics and Prose in Washington, DC about his book of the same nameBook: They Don't Represent UsLarry Lessig’s 2019 book that elaborates the ways in which democratic representation is in peril, and proposes a number of solutions to revive our democracy -- from ranked-choice voting to non-partisan open primariesTesting Theories of American Politics: Elites, Interest Groups, and Average Citizens Princeton's Martin Gilens and Benjamin I. Page study measuring the correlation between the preferences of different groups and the decisions of our government RECOMMENDED YUA EPISODESDigital Democracy is Within Reach with Audrey Tanghttps://www.humanetech.com/podcast/23-digital-democracy-is-within-reachHow Political Language Is Engineered with Drew Westen and Frank Luntzhttps://www.humanetech.com/podcast/53-how-political-language-is-engineeredYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
20/10/22•39m 37s
Stepping Into the Metaverse — with Dr. Courtney Cogburn and Prof. Jeremy Bailenson
The next frontier of the internet is the metaverse. That's why Mark Zuckerberg changed the name of his company from Facebook to Meta, and just sold $10 billion in corporate bonds to raise money for metaverse-related projects.How might we learn from our experience with social media, and anticipate the harms of the metaverse before they arise? What would it look like to design a humane metaverse — that respects our attention, improves our well-being, and strengthens our democracy?This week on Your Undivided Attention, we talk with two pioneers who are thinking critically about the development of the metaverse. Professor Jeremy Bailenson is the Founding director of Stanford’s Virtual Human Interaction Lab, where he studies how virtual experiences lead to changes in perceptions of self and others. Dr. Courtney Cogburn is an Associate Professor at Columbia's School of Social Work, where she examines associations between racism and stress-related disease. Jeremy and Courtney collaborated on 1000 Cut Journey, a virtual reality experience about systemic racism.CORRECTIONS: In the episode, Courtney says that the average US adult consumes 9 hours of media per day, but the actual number in 2022 is closer to 13 hours.Finally, Aza mentions the "pockets of 4.6 billion people" — implying that there are 4.6 billion smartphone users. The global number of social media users is 4.7 billion, and the number of smartphone users is actually 6.6 billion.RECOMMENDED MEDIA: Experience on Demand: What Virtual Reality Is, How It Works, and What It Can Dohttps://www.amazon.com/Experience-Demand-Virtual-Reality-Works/dp/0393253694Jeremy Bailenson's 2018 book exploring how virtual reality can be harnessed to improve our everyday livesExperiencing Racism in VRhttps://www.ted.com/talks/courtney_cogburn_experiencing_racism_in_vr_courtney_d_cogburn_phd_tedxrvaCourtney Cogburn's 2017 TEDx talk about how using virtual reality to help people experience the complexities of racismDo Artifacts Have Politics?https://faculty.cc.gatech.edu/~beki/cs4001/Winner.pdf Technology philosopher Langdon Winner’s seminal 1980 article, in which he writes, "by far the greatest latitude of choice exists the very first time a particular instrument, system, or technique is introduced."RECOMMENDED YUA EPISODES: Do You Want To Become A Vampire? with LA Paulhttps://www.humanetech.com/podcast/39-do-you-want-to-become-a-vampirePardon the Interruptions with Gloria Markhttps://www.humanetech.com/podcast/7-pardon-the-interruptionsBonus - What Is Humane Technology?https://www.humanetech.com/podcast/bonus-what-is-humane-technologyYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
06/10/22•59m 35s
Fighting With Mirages of Each Other — with Adam Mastroianni
Have you ever lost a friend to misperception? Have you lost a friend or a family member to the idea that your views got so different, that it was time to end the relationship — perhaps by unfriending each other on Facebook?As it turns out, we often think our ideological differences are far greater than they actually are. Which means: we’re losing relationships and getting mired in polarization based on warped visions of each other. This week on Your Undivided Attention, we're talking with Adam Mastroianni, a postdoctoral research scholar at Columbia Business School who studies how we perceive and misperceive our social worlds. Together with Adam, we're going to explore how accurate — and inaccurate — our views of each other are. As you listen to our conversation, keep in mind that relationship you might have lost to misperception, and that you might be able to revive as a result of what you hear.CORRECTIONS: In the episode, Adam says in 1978, 85% of people said they'd vote for a Black president, but the actual percentage is 80.4%. Tristan says that Republicans estimate that more than a third of Democrats are LGBTQ, but the actual percentage is 32%. Finally, Tristan refers to Anil Seth's notion of cognitive impenetrability, but that term was actually coined by the Canadian cognitive scientist and philosopher Zenon W. Pylyshyn.RECOMMENDED MEDIA Widespread Misperceptions of Long-term Attitude Changehttps://www.pnas.org/doi/abs/10.1073/pnas.2107260119 Adam Mastroianni's research paper showing how stereotypes of the past lead people to misperceive attitude change, and how these misperceptions can lend legitimacy to policies that people may not actually preferExperimental Historyhttps://experimentalhistory.substack.com/ Adam's blog, where he shares original data and thinks through ideasAmericans experience a false social reality by underestimating popular climate policy support by nearly halfhttps://www.nature.com/articles/s41467-022-32412-yAcademic study showing that Americans are living in what researchers called a “false social reality” with respect to misperceptions about climate viewsRECOMMENDED YUA EPISODES Mind the (Perception) Gap with Dan Vallonehttps://www.humanetech.com/podcast/33-mind-the-perception-gapThe Courage to Connect. Guests: Ciaran O’Connor and John Wood, Jr.https://www.humanetech.com/podcast/30-the-courage-to-connectTranscending the Internet Hate Game with Dylan Marronhttps://www.humanetech.com/podcast/52-transcending-the-internet-hate-game
22/09/22•39m 43s
Spotlight — Addressing the TikTok Threat
Imagine it's the Cold War. Imagine that the Soviet Union puts itself in a position to influence the television programming of the entire Western world — more than a billion viewers. While this might sound like science fiction, it’s representative of the world we're living in, with TikTok being influenced by the Chinese Communist Party.TikTok, the flagship app of the Chinese company Bytedance, recently surpassed Google and Facebook as the most popular site on the internet in 2021, and is expected to reach more than 1.8 billion users by the end of 2022. The Chinese government doesn't control TikTok, but has influence over it. What are the implications of this influence, given that China is the main geopolitical rival of the United States?This week on Your Undivided Attention, we bring you a bonus episode about TikTok. Co-hosts Tristan Harris and Aza Raskin explore the nature of the TikTok threat, and how we might address it.RECOMMENDED MEDIA Pew Research Center's "Teens, Social Media and Technology 2022"https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/Pew's recent study on how TikTok has established itself as one of the top online platforms for U.S. teensAxios' "Washington turns up the heat on TikTok"https://www.axios.com/2022/07/07/congress-tiktok-china-privacy-data?utm_source=substack&utm_medium=emailArticle on recent Congressional responses to the threat of TikTokFelix Krause on TikTok's keystroke trackinghttps://twitter.com/KrauseFx/status/1560372509639311366A revelation that TikTok has code to observe keypad input and all tapsRECOMMENDED YUA EPISODESA Fresh Take on Tech in China with Rui Ma and Duncan Clarkhttps://www.humanetech.com/podcast/44-a-fresh-take-on-tech-in-chinaA Conversation with Facebook Whistleblower Frances Haugenhttps://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugenFrom Russia with Likes (Part 1). Guest: Renée DiRestahttps://www.humanetech.com/podcast/5-from-russia-with-likes-part-1From Russia with Likes (Part 2). Guest: Renée DiRestahttps://www.humanetech.com/podcast/6-from-russia-with-likes-part-2 Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
08/09/22•23m 57s
Spotlight — How might a long-term stock market transform tech?
At Center for Humane Technology, we often talk about multipolar traps — which arise when individuals have an incentive to act in ways that are beneficial to them in the short term, but detrimental to the group in the long term. Think of social media companies that compete for our attention, so that when TikTok introduces an even-more addictive feature, Facebook and Twitter have to mimic it in order to keep up, sending us all on a race to the bottom of our brainstems.Intervening at the level of multipolar traps has extraordinary leverage. One such intervention is the Long Term Stock Exchange — a U.S. national securities exchange serving companies and investors who share a long-term vision. Instead of asking public companies to pollute less or be less addictive while holding them accountable to short-term shareholder value, the Long-Term Stock Exchange creates a new playing field, which incentivizes the creation of long-term stakeholder value.This week on Your Undivided Attention, we’re airing an episode of a podcast called ZigZag — a fellow member of the TED Audio Collective. In an exploration of how technology companies might transcend multipolar traps, we're sharing with you ZigZag’s conversation with Long Term Stock Exchange founder Eric Ries.CORRECTION: In the episode, we say that TikTok has outcompeted Facebook, Instagram, and YouTube. In fact, TikTok has outcompeted Facebook, but not yet YouTube or Instagram — TikTok has 1 billion monthly users, while YouTube has 2.6 billion and Instagram has 2 billion. However, we can say that TikTok is on a path toward outcompeting YouTube and Instagram.RECOMMENDED YUA EPISODESAn Alternative to Silicon Valley Unicorns with Mara Zepeda & Kate “Sassy” Sassoon: https://www.humanetech.com/podcast/54-an-alternative-to-silicon-valley-unicornsA Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solvedHere’s Our Plan And We Don’t Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know
25/08/22•38m 37s
The Invisible Cyber-War
When you hear the word cyber-attack, what comes to mind? Someone hacking into your email, or stealing your Facebook password?As it turns out, our most critical infrastructure can be hacked. Our banks, water treatment facilities, and nuclear power plants can be deactivated and even controlled simply by finding bugs in the software used to operate them. Suddenly, cyber-attack takes on a different meaning.This week on Your Undivided Attention, we're talking with cyber-security expert Nicole Perlroth. Nicole spent a decade as the lead cyber-security reporter at The New York Times, and is now a member of the Department of Homeland Security’s Cybersecurity Advisory Committee. She recently published “This Is How They Tell Me The World Ends” — an in-depth exploration of the global cyber arms race.CORRECTIONS: In the episode, Nicole says that "the United States could have only afforded 2 to 3 more days of Colonial Pipeline being down before it ground the country — our economy — to a halt." The correct number is actually 3 to 5 days. She also refers to a 2015 study researching why some countries have significantly fewer successful cyber-attacks relative to cyber-attack attempts. That study was actually published in 2016.RECOMMENDED MEDIA This Is How They Tell Me The World EndsNicole Perlroth’s 2021 book investigating the global cyber-weapons arms raceReporter Page at the New York TimesNicole’s articles while the lead cyber-security reporter at the New York TimesThe Global Cyber-Vulnerability Report (in brief)Brief of a 2015 study by the Center for Digital International Government, Virginia Tech, and the University of Maryland that researched why some countries have significantly fewer successful cyber-attacks relative to cyber-attack attemptsRECOMMENDED YUA EPISODES The Dark Side Of Decentralization with Audrey Kurth Cronin: https://www.humanetech.com/podcast/49-the-dark-side-of-decentralizationIs World War III Already Here? Guest: Lieutenant General H.R. McMaster: https://www.humanetech.com/podcast/45-is-world-war-iii-already-hereA Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solvedYour Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
04/08/22•58m 21s
An Alternative to Silicon Valley Unicorns
Why isn't Twitter doing more to get bots off its platform? Why isn't Uber taking better care of its drivers? What if... they can't?

Venture-capital backed companies like Twitter and Uber are held accountable to maximizing returns to investors. If and when they become public companies, they become accountable to maximizing returns to shareholders. They've promised Wall Street outsized returns — which means Twitter can't remove bots if doing so would significantly lower its user count and, in turn, its advertising revenue, and Uber can't treat its drivers like employees if doing so cuts into profits.

But what's the alternative? What might it look like to design an ownership and governance model that incentivizes a technology company to serve all of its stakeholders over the long term – and primarily, the stakeholders who create value?

This week on Your Undivided Attention, we're talking with two experts on creating the conditions for humane business, and in turn, for humane technology: Mara Zepeda and Kate "Sassy" Sassoon of the Zebras Unite Co-op. Zebras Unite is a member-owned co-operative that's creating the capital, culture, and community to power a more just and inclusive economy. The co-op serves a community of over 6,000 members, in about 30 chapters, across 6 continents. Mara is their Managing Director, and Kate is their Director of Cooperative Membership.

Two corrections:
The episode says that the failure rate of startups is 99%. The actual rate is closer to 90%.
The episode says that in 2017, Twitter reported 350 million users on its platform. The actual number reported was 319 million users.

RECOMMENDED MEDIA
Zebras Fix What Unicorns Break
A seminal 2017 article by Zebras Unite co-founders, which kicked off the movement and distinguished between zebras and unicorns
Meetup to the People
Zebras Unite's 2019 thought experiment of exiting Meetup to community
Zebras Unite Crowdcast Channel
Where you can find upcoming online events, as well as recordings of previous events

RECOMMENDED YUA EPISODES
A Renegade Solution to Extractive Economics with Kate Raworth: https://www.humanetech.com/podcast/29-a-renegade-solution-to-extractive-economics
Bonus — A Bigger Picture on Elon & Twitter: https://www.humanetech.com/podcast/bigger-picture-elon-twitter
Here's Our Plan And We Don't Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
30/06/22•51m 26s
Spotlight — Conversations With People Who Hate Me with Dylan Marron
This week on Your Undivided Attention, we're doing something different: we're airing an episode of another podcast that's also part of the TED Audio Collective.

Backing up for a moment: we recently aired an episode with Dylan Marron — creator and host of the podcast, Conversations With People Who Hate Me. On his show, Dylan calls up the people behind negative comments on the internet, and asks them: why did you write that?

In our conversation with Dylan, we played a clip from episode 2 of Conversations With People Who Hate Me. In that episode, Dylan talks with a high school student named Josh, who'd sent him homophobic messages online. This week, we're airing that full episode — the full conversation between Dylan Marron and Josh.

If you didn't hear our episode with Dylan, do give it a listen. Then, enjoy this second episode of Conversations With People Who Hate Me.

RECOMMENDED YUA EPISODES
Transcending the Internet Hate Game with Dylan Marron: https://www.humanetech.com/podcast/52-transcending-the-internet-hate-game
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
16/06/22•31m 25s
How Political Language Is Engineered — with Drew Westen and Frank Luntz
Democracy depends on our ability to choose our political views. But the language we use to talk about political issues is deliberately designed to be divisive, and can produce up to a 15-point difference in what we think about those issues. As a result, are we choosing our views, or is our language choosing them for us?

This week, Your Undivided Attention welcomes two Jedi Masters of political communication. Drew Westen is a political psychologist and messaging consultant based at Emory University, who has advised the Democratic Party. Frank Luntz is a political and communications consultant, pollster, and pundit, who has advised the Republican Party. In the past, our guests have used their messaging expertise in ways that increased partisanship. For example, Luntz advocated for the use of the term "death tax" instead of "estate tax," and "climate change" instead of "global warming." Still, Luntz and Westen are uniquely positioned to help us decode the divisive power of language — and explore how we might design language that unifies.

CORRECTIONS: In the episode, Tristan refers to a panel Drew Westen and Frank Luntz were on at the New York Public Library. He says the panel was "about 10 years ago," but it was actually 15 years ago, in 2007. Also, Westen refers to a news anchor who moderated a debate between George H. W. Bush and Michael Dukakis in 1988. Drew mistakenly names the anchor as Bernard Kalb, when it was actually Bernard Shaw.

RECOMMENDED MEDIA
The Political Brain: The Role of Emotion in Deciding the Fate of the Nation
Drew Westen's 2008 book about the role of emotion in determining the political life of the nation, which influenced campaigns and elections around the world
Words That Work: It's Not What You Say, It's What People Hear
Frank Luntz's 2008 book, which offers a behind-the-scenes look at how the tactical use of words and phrases affects what we buy, who we vote for, and even what we believe in
New York Public Library's Panel on Political Language
A 2007 panel between multiple 'Jedi Masters' of political communication along the political spectrum, including Frank Luntz, Drew Westen, and George Lakoff

RECOMMENDED YUA EPISODES
The Invisible Influence of Language with Lera Boroditsky: https://www.humanetech.com/podcast/48-the-invisible-influence-of-language
How To Free Our Minds with Cult Deprogramming Expert Dr. Steven Hassan: https://www.humanetech.com/podcast/51-how-to-free-our-minds
Mind the (Perception) Gap with Dan Vallone: https://www.humanetech.com/podcast/33-mind-the-perception-gap

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
02/06/22•36m 41s
Transcending the Internet Hate Game — with Dylan Marron
The game that social media sets us up to play is a game that rewards outrage. It's a game that we win by being better than other players at dunking on each other, straw-manning each other, and assuming the worst in each other. The game itself must be transformed.

And, we can also decide to step out of the game, and do something different. On this week's episode of Your Undivided Attention, we welcome Dylan Marron — who has been called by Jason Sudeikis "a modern Mr. Rogers for the digital age." Dylan is the creator and host of the podcast Conversations With People Who Hate Me. On the show, he calls up the people behind negative comments on the internet, and asks them a simple question: why did you write that? He just published a book by the same name, where he elaborates 12 lessons learned from talking with internet strangers. Together with Dylan, we explore how transforming the game and transforming ourselves can go hand-in-hand.

RECOMMENDED MEDIA
Conversations With People Who Hate Me (podcast)
Dylan Marron's podcast where he calls up the people behind negative comments on the internet, and talks to them. In this episode, we heard a clip of Episode 2: Hurt People Hurt People.
Conversations With People Who Hate Me (book)
Dylan's book where he elaborates 12 lessons learned from talking with internet strangers.
Won't You Be My Neighbor
Feature documentary chronicling the work and legacy of Fred Rogers.

RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate
The Fake News of Your Own Mind with Jack Kornfield and Trudy Goodman: https://www.humanetech.com/podcast/19-the-fake-news-of-your-own-mind

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
19/05/22•45m 54s
How To Free Our Minds — with Cult Deprogramming Expert Dr. Steven Hassan
How would you know if you were in a cult? If not a cult, then at least under undue influence?

The truth is: we're all under some form of undue influence. The question is to what degree and to what extent we're aware of this influence — which is exacerbated by social media. In an era of likes, followers, and echo chambers, how can we become aware of undue influence and gain sovereignty over our minds?

Our guest this week is Dr. Steven Hassan, an expert on undue influence, brainwashing, and unethical hypnosis. He's the founder of the Freedom of Mind Resource Center — a coaching, consulting, and training organization dedicated to helping people freely consider how they want to live their lives. Dr. Hassan was himself a member of a cult: the Unification Church (also known as the Moonies), which was developed in Korea in the 1950s. Since leaving the Moonies, Dr. Hassan has helped thousands of individuals and families recover from undue influence.

RECOMMENDED MEDIA
Freedom of Mind website: The website for Dr. Hassan's Freedom of Mind Resource Center, which includes resources such as his Influence Continuum, BITE model of authoritarian control, and Strategic Interactive Approach for freeing people from undue influence
The Influence Continuum with Dr. Steven Hassan: Dr. Hassan's podcast exploring how mind-control works, and how to protect yourself from its grip
Reckonings: A podcast that told the stories of people who've transcended extremism, expanded their worldviews, and made other kinds of transformative change. Start with episode 17, featuring a former paid climate skeptic, or episode 18, featuring the former protégé of Fox News chairman Roger Ailes

RECOMMENDED YUA EPISODES
Can Your Reality Turn on a Word? Guest: Anthony Jacquin: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word
The World According to Q. Guest: Travis View: https://www.humanetech.com/podcast/21-the-world-according-to-q
The Cure for Hate. Guest: Tony McAleer: https://www.humanetech.com/podcast/11-the-cure-for-hate

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
05/05/22•51m 28s
Spotlight — A Bigger Picture on Elon & Twitter
If Elon Musk owns Twitter, what are the risks and what are the opportunities? In order for Twitter to support democracy — and Musk's goal of becoming a multi-planetary civilization — we need a radical redesign that goes beyond free speech.

Note: this conversation was recorded on April 21, 2022. That was 3 days prior to the official purchase announcement, which revealed that Elon Musk will buy Twitter for $44 billion.

Clarification: In the episode, we talk about the creation of The Daily Show, featuring Jon Stewart. To be clear, The Daily Show was created by writer and producer Madeleine Smithberg and comedian and media personality Lizz Winstead — for comedian and host Craig Kilborn. Jon Stewart took over in 1999, which is when he had the conversation with executives that we reference in the episode, where he didn't want to see the viewership numbers.

RECOMMENDED MEDIA
Examining algorithmic amplification of political content on Twitter
Polarization of Twitter (Knight Foundation)
Pew Research on the political extremes drowning out centrist voices on Twitter
Chronological feed vs algorithm (Computational Journalism Lab)

RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
Here's Our Plan And We Don't Know: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know
A Problem Well-Stated Is Half-Solved: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
26/04/22•13m 42s
What Is Civil War in the Digital Age? — with Barbara F. Walter
Civil war might be the most likely escalation pathway towards disaster for our country. On the flip side, learning how to avoid civil conflict — and more ambitiously, repair our civic fabric — might have the greatest leverage for addressing the challenges we face.

Our guest Barbara F. Walter is one of the world's leading experts on civil wars, political violence, and terrorism. She's the author of How Civil Wars Start: And How To Stop Them, which provides insight into the drivers of civil war, how social media fuels conflict, and how we might repair our broken democracies. Together, we explore what makes for a healthy liberal democracy, why democracies worldwide are in decline, and the role of resentment and hope. Join us in an exploration of the generator functions for civil war in the digital age, and how we might prevent them.

RECOMMENDED MEDIA
How Civil Wars Start: And How To Stop Them
Barbara F. Walter's latest book and the subject of our conversation, identifying the conditions that give rise to modern civil war in order to address them
Political Violence At A Glance
An award-winning online magazine about the causes and consequences of violence and protest, co-authored by Barbara and other experts
The Center for Systemic Peace
Publications, analysis, and other resources from the organization that measures democracies and anocracies on a 21-point scale

RECOMMENDED YUA EPISODES
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
The Courage to Connect. Guests: Ciaran O’Connor and John Wood, Jr.: https://www.humanetech.com/podcast/30-the-courage-to-connect
Mind the (Perception) Gap with Dan Vallone: https://www.humanetech.com/podcast/33-mind-the-perception-gap

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
21/04/22•49m 31s
Spotlight — What Is Humane Technology?
“The fundamental problem of humanity is that we have paleolithic emotions, medieval institutions, and God-like technology.” — E. O. Wilson

More than ever, we need the wisdom to match the power of our God-like technology. Yet, technology is both eroding our ability to make sense of the world, and increasing the complexity of the issues we face. The gap between our sense-making ability and issue complexity is what we call the “wisdom gap." How do we develop the wisdom we need to responsibly steward our God-like technology?

This week on Your Undivided Attention, we're introducing one way Center for Humane Technology is attempting to close the wisdom gap — through our new online course, Foundations of Humane Technology. In this bonus episode, Tristan Harris describes the wisdom gap we're attempting to close, and our Co-Founder and Executive Director Randima Fernando talks about the course itself.

Sign up for the free course: https://www.humanetech.com/course

RECOMMENDED YUA EPISODES
A Problem Well-Stated Is Half-Solved with Daniel Schmachtenberger: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
Here's Our Plan And We Don't Know with Tristan Harris, Aza Raskin, and Stephanie Lepp: https://www.humanetech.com/podcast/46-heres-our-plan-and-we-dont-know

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
07/04/22•12m 50s
Digital Democracy is Within Reach with Audrey Tang (Rerun)
[This episode originally aired on July 23rd, 2020.] Imagine a world where every country has a digital minister and technologically-enabled legislative bodies. Votes are completely transparent and audio and video of all conversations between lawmakers and lobbyists are available to the public immediately. Conspiracy theories are acted upon within two hours and replaced by humorous videos that clarify the truth. Imagine that expressing outrage about your local political environment turned into a participatory process where you were invited to solve that problem and even entered into a face to face group workshop. Does that sound impossible? It’s ambitious and optimistic, but that's everything that our guest this episode, Audrey Tang, digital minister of Taiwan, has been working on in her own country for many years. Audrey’s path into public service began in 2014 with her participation in the Sunflower Movement, a student-led protest in Taiwan’s parliamentary building, and she’s been building on that experience ever since, leading her country into a future of truly participatory digital democracy.
24/03/22•47m 33s
The Dark Side Of Decentralization — with Audrey Kurth Cronin
Is decentralization inherently a good thing? These days, there's a lot of talk about decentralization. Decentralized social media platforms can allow us to own our own data. Decentralized cryptocurrencies can enable bank-free financial transactions. Decentralized 3D printing can allow us to fabricate anything we want.

But if the world lives on Bitcoin, we may not be able to sanction nation states like Russia when they invade sovereign nations. If 3D printing is decentralized, anyone can print their own weapons at home. Decentralization takes on new meaning when we're talking about decentralizing the capacity for catastrophic destruction.

This week on Your Undivided Attention, we explore the history of decentralized weaponry, how social media is effectively a new decentralized weapon, and how to wisely navigate these threats. Guiding us through this exploration is Audrey Kurth Cronin — one of the world's leading experts in security and terrorism. Audrey is a distinguished Professor of International Security at American University, and the author of several books — most recently: Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists.

Clarification: In the episode, Tristan refers to a video of Daniel Schmachtenberger's as "The Psychological Pitfalls of Working on Existential Risk." The correct name of the video is "Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign."

RECOMMENDED MEDIA
Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists
Audrey Kurth Cronin's latest book, which analyzes emerging technologies and devises a new framework for analyzing 21st century military innovation
Psychological Pitfalls of Engaging With X-Risks & Civilization Redesign
Daniel Schmachtenberger's talk discussing the psychological pitfalls of working on existential risks and civilization redesign
Policy Reforms Toolkit
The Center for Humane Technology's toolkit for developing policies to protect the conditions that democracy needs to thrive: a comprehensively educated public, a citizenry that can check the power of market forces and bind predatory behavior

RECOMMENDED YUA EPISODES
22 – Digital Democracy is Within Reach with Audrey Tang: https://www.humanetech.com/podcast/23-digital-democracy-is-within-reach
28 – Two Million Years in Two Hours: A Conversation with Yuval Noah Harari: https://www.humanetech.com/podcast/28-two-million-years-in-two-hours-a-conversation-with-yuval-noah-harari
45 – Is World War III Already Here? Guest: Lieutenant General H.R. McMaster: https://www.humanetech.com/podcast/45-is-world-war-iii-already-here
10/03/22•48m 19s
The Invisible Influence of Language — with Lera Boroditsky
One of the oldest technologies we have is language. How do the words we use influence the way we think?

The media can talk about immigrants scurrying across the border, versus immigrants crossing the border. Or we might hear about technology platforms censoring us, versus moderating content. If those word choices shift public opinion on immigration or technology by 25%, or even 2%, then we've been influenced in ways we can't even see. Which means that becoming aware of how words shape the way we think can help inoculate us from their undue influence. And further, consciously choosing or even designing the words we use can help us think in more complex ways – and address our most complex challenges.

This week on Your Undivided Attention, we're grateful to have Lera Boroditsky, a cognitive scientist who studies how language shapes thought. Lera is an Associate Professor of Cognitive Science at UC San Diego, and the editor-in-chief of Frontiers in Cultural Psychology.

Clarification: In the episode, Aza refers to Elizabeth Loftus' research on eyewitness testimony. He describes an experiment in which a car hit a stop sign, but the experiment actually used an example of two cars hitting each other.

RECOMMENDED MEDIA
How language shapes the way we think
Lera Boroditsky's 2018 TED talk about how the 7,000 languages spoken around the world shape the way we think
Measuring Effects of Metaphor in a Dynamic Opinion Landscape
Boroditsky and Paul H. Thibodeau's 2015 study about how the metaphors we use to talk about crime influence our opinions on how to address crime
Subtle linguistic cues influence perceived blame and financial liability
Boroditsky and Caitlin M. Fausey's 2010 study about how the language used to describe the 2004 Super Bowl "wardrobe malfunction" influences our views on culpability
Why are politicians getting 'schooled' and 'destroyed'?
BBC article featuring the research of former Your Undivided Attention guest Guillaume Chaslot, which shows the verbs YouTube is most likely to include in titles of recommended videos — such as "obliterates" and "destroys"

RECOMMENDED YUA EPISODES
Mind the (Perception) Gap: https://www.humanetech.com/podcast/33-mind-the-perception-gap
Can Your Reality Turn on a Word?: https://www.humanetech.com/podcast/34-can-your-reality-turn-on-a-word
Down the Rabbit Hole by Design: https://www.humanetech.com/podcast/4-down-the-rabbit-hole-by-design
24/02/22•40m 19s
How Science Fiction Can Shape Our Reality — with Kim Stanley Robinson
The meta-crisis is so vast: climate change, exponential technology, addiction, polarization, and more. How do we grasp it, let alone take steps to address it? One of the thinking tools we have at our disposal is science fiction. To the extent that we co-evolve with our stories, science fiction can prepare us for the impending future — and empower us to shape it.

This week on Your Undivided Attention, we're thrilled to have one of the greatest living science-fiction writers — Kim Stanley Robinson. His most recent novel is The Ministry for the Future, a sweeping epic that reaches into the very near future, and imagines what it would take to unite humanity and avoid a mass extinction. Whether or not you've read the book, this episode has insights for you. And if this episode makes you want to read the book, our conversation won't spoil it for you.

Clarification: In the episode, Robinson refers to philosopher Antonio Gramsci's "pessimism of the intellect, optimism of the will." This phrase was originally said by novelist and playwright Romain Rolland. Gramsci made the phrase the motto of his newspaper, because he appreciated its integration of radical intellectualism with revolutionary activism.

RECOMMENDED MEDIA
The Ministry For The Future
Robinson's latest novel and the subject of our conversation — which reaches into the near future, and imagines what it would take to unite humanity and avoid a mass extinction
A Deeper Dive Into the Meta Crisis
CHT's blog post about the meta-crisis, which includes the fall of sense-making and the rise of decentralized technology-enabled power
Half Earth Project
The project based on E. O. Wilson's proposal to conserve half the land and sea — in order to safeguard the bulk of biodiversity, including ourselves
ClimateAction.tech
Global tech worker community mobilizing the technology industry to face the climate crisis

RECOMMENDED YUA EPISODES
18 – The Stubborn Optimist's Guide to Saving the Planet: https://www.humanetech.com/podcast/18-the-stubborn-optimists-guide-to-saving-the-planet
Bonus – The Stubborn Optimist's Guide Revisited: https://www.humanetech.com/podcast/bonus-the-stubborn-optimists-guide-revisited
29 – A Renegade Solution to Extractive Economics: https://www.humanetech.com/podcast/29-a-renegade-solution-to-extractive-economics

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
10/02/22•40m 38s
Here’s Our Plan And We Don’t Know — with Tristan Harris, Aza Raskin and Stephanie Lepp
Renowned quantum physicist Richard Feynman once wrote, "It is our capacity to doubt that will determine the future of civilization." In that spirit, this episode is a little different – because we're talking openly about our doubts, with you, our listeners. It's also different because it's hosted by our Executive Producer Stephanie Lepp, with Tristan Harris and Aza Raskin in the hot seats.

How have we evolved our understanding of our social media predicament? How has that evolution inspired us to question the work we do at Center for Humane Technology? Join us as we say those three magic words — I don't know — and yet pursue our mission to the best of our ability.

RECOMMENDED MEDIA
Leverage Points: Places to Intervene in a System
Systems theorist Donella Meadows' seminal article, articulating a framework for thinking about how to change complex systems.
Winning Humanity's Existential Game
The Future Thinkers podcast with Daniel Schmachtenberger, where he explores how to mitigate natural and human-caused existential risks and design post-capitalist systems
Ledger of Harms of Social Media
The Center for Humane Technology's research elaborating the many externalities of our technology platforms' race for human attention
Foundations of Humane Technology Course
CHT's forthcoming course on how to build technology that protects our well-being, minimizes unforeseen consequences, and builds our collective capacity to address humanity's urgent challenges

RECOMMENDED YUA EPISODES
36 - A Problem Well-Stated Is Half-Solved: https://www.humanetech.com/podcast/a-problem-well-stated-is-half-solved
42 - A Conversation with Facebook Whistleblower Frances Haugen: https://www.humanetech.com/podcast/42-a-conversation-with-facebook-whistleblower-frances-haugen
43 - Behind the Curtain on The Social Dilemma: https://www.humanetech.com/podcast/43-behind-the-curtain-on-the-social-dilemma

Your Undivided Attention is produced by the Center for Humane Technology. Follow us on Twitter: @HumaneTech_
03/02/22•35m 47s
Is World War III Already Here? — with Lieutenant General H. R. McMaster
Would you say that the US is in war-time or peace-time? How do you know? The truth is, the nature of warfare has changed so fundamentally, that we're currently in a war we don't even recognize. It's the war that Russia, China, and other hostile foreign actors are fighting against us — weaponizing social media to undermine our faith in each other, our government, and democracy itself. World War III is here, it's in cyberspace, and the US is unprepared — and largely unaware. This week on Your Undivided Attention, we're fortunate to be speaking with Lieutenant General H. R. McMaster. General McMaster was the United States National Security Advisor from 2017 to 2018. He has examined the most critical foreign policy and national security challenges that face the United States, and is devoted to preserving America's standing and security.
13/01/22•35m 22s
A Fresh Take on Tech in China — with Rui Ma and Duncan Clark
Who do you think the Chinese government considers its biggest rival? The United States, right? Actually, the Chinese government considers its biggest rival to be its own technology companies. It's China's tech companies that threaten its capacity to build a competitive China. That's why the Chinese government is cracking down on social media — for example, by limiting the number of hours youth can play video games, and banning cell phone use in schools. China's restrictions on social media use may be autocratic, but they may also protect users more than anything we see coming from the US government. It's a complicated picture.

This week on Your Undivided Attention, we're having a surprising conversation about technology in China. Here to give us a fresh take are two guests: investor, analyst, and co-host of the Tech Buzz China podcast Rui Ma, and China internet expert and author of Alibaba: The House That Jack Ma Built, Duncan Clark.
10/12/21•48m 37s
Behind the Curtain on The Social Dilemma — with Jeff Orlowski-Yang and Larissa Rhodes
How do you make a film that impacts more than 100 million people in 190 countries in 30 languages?

This week on Your Undivided Attention, we're going behind the curtain on The Social Dilemma — the Netflix documentary about the dark consequences of the social media business model, which featured the Center for Humane Technology. On the heels of the film's one-year anniversary and its two Emmy Award wins, we're talking with Exposure Labs' Director Jeff Orlowski-Yang and Producer Larissa Rhodes. What moved Jeff and Larissa to shift their focus from climate change to social media? How did the film transform countless lives, including ours and possibly yours? What might we do differently if we were producing the film today? Join us as we explore the reverberations of The Social Dilemma — whose effects we're still feeling more than a year later.
11/11/21•43m 40s
A Conversation with Facebook Whistleblower Frances Haugen
We are now in social media's Big Tobacco moment. And that’s largely thanks to the courage of one woman: Frances Haugen.Frances is a specialist in algorithmic product management. She worked at Google, Pinterest, and Yelp before joining Facebook — first as a Product Manager on Civic Misinformation, and then on the Counter-Espionage team. But what she saw at Facebook was that the company consistently and knowingly prioritized profits over public safety. So Frances made the courageous decision to blow the whistle — which resulted in the biggest disclosure in the history of Facebook, and in the history of social media.In this special interview, co-hosts Tristan and Aza go behind the headlines with Frances herself. We go deeper into the problems she exposed, discuss potential solutions, and explore her motivations — along with why she fundamentally believes change is possible. We also announce an exciting campaign being launched by the Center for Humane Technology — to use this window of opportunity to make Facebook safer.
18/10/21•55m 24s
Spotlight — A Whirlwind Week of Whistleblowing
In seven years of working on the problems of runaway technology, we've never experienced a week like this! In this bonus episode of Your Undivided Attention, we recap this whirlwind of a week — from Facebook whistleblower Frances Haugen going public on 60 Minutes on Sunday, to the massive outage of Facebook, Instagram, and WhatsApp on Monday, to Haugen's riveting Congressional testimony on Tuesday. We also make some exciting announcements — including our planned episode with Haugen up next, the Yale social media reform panel we're participating in on Thursday, and a campaign we're launching to pressure Facebook to make one immediate change. This week it truly feels like we're making history — and you're a part of it.
06/10/21•4m 56s
Making Meaning in Challenging Times — with Jamie Wheal
What helps you make meaning in challenging times? As you confront COVID, the climate crisis, and all of the challenges we discuss on this show, what helps you avoid nihilism or fundamentalism, and instead access healing, inspiration, and connection?

Today on Your Undivided Attention, we're joined by anthropologist and writer Jamie Wheal. Wheal is the author of Recapture the Rapture: Rethinking God, Sex and Death In a World That's Lost Its Mind. In the book, he makes the case that in order to address the meta-crisis (the interconnected challenges we face, which we talked about in Episode 36 with Daniel Schmachtenberger), we must address the meaning crisis — the need to stay inspired, mended, and bonded in challenging times. Jamie argues that it doesn't matter whether we stay inspired, mended, and bonded through institutionalized religion or other means, as long as meaning-making is inclusively available to everyone.

What we hope you'll walk away with is a humane way to think about how to address the challenges we face, from COVID to climate — by enabling us to make meaning in challenging times.
30/09/21•43m 3s
Spotlight — The Facebook Files with Tristan Harris, Frank Luntz, and Daniel Schmachtenberger
On September 13th, the Wall Street Journal released The Facebook Files, an ongoing investigation of the extent to which Facebook's problems are meticulously known inside the company — all the way up to Mark Zuckerberg. Pollster Frank Luntz invited Tristan Harris along with friend and mentor Daniel Schmachtenberger to discuss the implications in a live webinar. In this bonus episode of Your Undivided Attention, Tristan and Daniel amplify the scope of the public conversation about The Facebook Files beyond the platform, and into its business model, our regulatory structure, and human nature itself.
21/09/21•1h 5m
The Power of Solutions Journalism — with Tina Rosenberg and Hélène Biandudi Hofer
What is the goal of our digital information environment? Is it simply to inform us, or also to empower us to act? The Solutions Journalism Network (SJN) understands that simply reporting on social problems rarely leads to change. What they’ve discovered is that rigorously reporting on responses to social problems is more likely to give activists and concerned citizens the hope and information they need to take effective action. For this reason, SJN trains journalists to report on “solutions angles.” More broadly, the organization seeks to rebalance the news, so that people are exposed to stories that help them understand the challenges we face as well as potential ways to respond. In this episode, Tina Rosenberg, co-founder of SJN, and Hélène Biandudi Hofer, former manager of SJN’s Complicating the Narratives initiative, walk us through the origin of solutions journalism, how to practice it, and what impact it has had. Tristan Harris and Aza Raskin reflect on how humane technology, much like solutions journalism, should also be designed to create an empowering relationship with reality — enabling us to shift from learned helplessness to what we might call learned hopefulness.
03/09/21•40m 23s
Do You Want to Become a Vampire? — with L.A. Paul
How do we decide whether to undergo a transformative experience when we don’t know how that experience will change us? This is the central question explored by Yale philosopher and cognitive scientist L.A. Paul. Paul uses the prospect of becoming a vampire to illustrate the conundrum: let's say Dracula offers you the chance to become a vampire. You might be confident you'll love it, but you also know you'll become a different person with different preferences. Whose preferences do you prioritize: yours now, or yours after becoming a vampire? Similarly, whose preferences do we prioritize when deciding how to engage with technology and social media: ours now, or ours after becoming users — to the point of potentially becoming attention-seeking vampires? In this episode with L.A. Paul, we're raising the stakes of the social media conversation — from technology that steers our time and attention, to technology that fundamentally transforms who we are and what we want. Tune in as Paul, Tristan Harris, and Aza Raskin explore the complexity of transformative experiences, and how to approach their ethical design.
12/08/21•36m 35s
You Will Never Breathe the Same Again — with James Nestor
When author and journalist James Nestor began researching a piece on free diving, he was stunned. He found that free divers could hold their breath for up to 8 minutes at a time, and dive to depths of 350 feet on a single breath. As he dug into the history of breath, he discovered that our industrialized lives have led to improper and mindless breathing, with cascading consequences from sleep apnea to reduced mobility. He also discovered an entire world of extraordinary feats achieved through proper and mindful breathing — including healing scoliosis, rejuvenating organs, halting snoring, and even enabling greater sovereignty in our use of technology. What is the transformative potential of breath? And what is the relationship between proper breathing and humane technology?
23/07/21•37m 46s
A Facebook Whistleblower — with Sophie Zhang
In September of 2020, on her last day at Facebook, data scientist Sophie Zhang posted a 7,900-word memo to the company's internal site. In it, she described the anguish and guilt she had experienced over the last two and a half years. She'd spent much of that time almost single-handedly trying to rein in fake activity on the platform by nefarious world leaders in small countries. Sometimes she received help and attention from higher-ups; sometimes she got silence and inaction. “I joined Facebook from the start intending to change it from the inside,” she said, but “I was still very naive at the time.” We don’t have a lot of information about how things operate inside the major tech platforms, and most former employees aren’t free to speak about their experience. It’s easy to fill that void with inferences about what might be motivating a company — greed, apathy, disorganization or ignorance, for example — but the truth is usually far messier and more nuanced. Sophie turned down a $64,000 severance package to avoid signing a non-disparagement agreement. In this episode of Your Undivided Attention, she explains to Tristan Harris and Aza Raskin how she ended up here, and offers ideas about what could be done at these companies to prevent similar kinds of harm in the future.
09/07/21•28m 8s
[Unedited] A Problem Well-Stated is Half-Solved — with Daniel Schmachtenberger
We’ve explored many different problems on Your Undivided Attention — addiction, disinformation, polarization, climate change, and more. But what if many of these problems are actually symptoms of the same meta-problem, or meta-crisis? And what if a key leverage point for intervening in this meta-crisis is improving our collective capacity to problem-solve?

Our guest Daniel Schmachtenberger guides us through his vision for a new form of global coordination to help us address our global existential challenges. Daniel is a founding member of the Consilience Project, aimed at facilitating new forms of collective intelligence and governance to strengthen open societies. He's also a friend and mentor of Tristan Harris. This insight-packed episode introduces key frames we look forward to using in future episodes. For this reason, we highly encourage you to listen to this unedited version along with the edited version.

We also invite you to join Daniel and Tristan at our Podcast Club! It will be on Friday, July 9th from 2-3:30pm PDT / 5-6:30pm EDT. Check here for details.
25/06/21•2h 2m
A Problem Well-Stated is Half-Solved — with Daniel Schmachtenberger
We’ve explored many different problems on Your Undivided Attention — addiction, disinformation, polarization, climate change, and more. But what if many of these problems are actually symptoms of the same meta-problem, or meta-crisis? And what if a key leverage point for intervening in this meta-crisis is improving our collective capacity to problem-solve?

Our guest Daniel Schmachtenberger guides us through his vision for a new form of global coordination to help us address our global existential challenges. Daniel is a founding member of the Consilience Project, aimed at facilitating new forms of collective intelligence and governance to strengthen open societies. He's also a friend and mentor of Tristan Harris. This insight-packed episode introduces key frames we look forward to using in future episodes. For this reason, we highly encourage you to listen to this edited version along with the unedited version.

We also invite you to join Daniel and Tristan at our Podcast Club! It will be on Friday, July 9th from 2-3:30pm PDT / 5-6:30pm EDT. Check here for details.
25/06/21•37m 6s
Mr. Harris Zooms to Washington
Back in January 2020, Tristan Harris went to Washington, D.C. to testify before the U.S. Congress on the harms of social media. A few weeks ago, he returned — virtually — for another hearing, Algorithms and Amplification: How Social Media Platforms’ Design Choices Shape Our Discourse and Our Minds. He testified alongside Dr. Joan Donovan, Research Director at the Harvard Kennedy School’s Shorenstein Center on Media Politics and Public Policy and the heads of policy from Facebook, YouTube and Twitter. The senators’ animated questioning demonstrated a deeper understanding of how these companies’ fundamental business models and design properties fuel hate and misinformation, and many of the lawmakers expressed a desire and willingness to take regulatory action. But, there’s still room for a more focused conversation. “It’s not about whether they filter out bad content,” says Tristan, “but really whether the entire business model of capturing human performance is a good way to organize society.” In this episode, a follow-up to last year’s “Mr. Harris Goes to Washington,” Tristan and Aza Raskin debrief about what was different this time, and what work lies ahead to pave the way for effective policy.
10/05/21•32m 36s
Can Your Reality Turn on a Word? — with Anthony Jacquin
Can hypnosis be a tool to help us see how our minds are being shaped and manipulated more than we realize? Guest Anthony Jacquin is a hypnotist and hypnotherapist of over 20 years, author of Reality is Plastic, and he co-runs the Jacquin Hypnosis Academy. He uses his practice to help his clients change their behavior and improve their lives. In this episode, he breaks down the misconceptions of hypnosis and reveals that despite the influence of hypnotizing forces like social media, we all still have the ability to get in touch with our subconscious selves. “What can I say with certainty is true about me — what is good, true and real about me?” Anthony asks. “Much of what we’ve invested in is actually transient. It will change. What is unchanging?” Anthony draws connections between hypnosis and technology and the impacts of both on our subconscious minds but identifies a key difference — technology is exploiting us. But maybe a little more insight into one more dimension of how our minds work underneath the hood can help us build better, more humane and conscious technology.
29/04/21•47m 27s
The Stubborn Optimist's Guide Revisited — with Christiana Figueres (Rerun)
[This episode originally aired May 21, 2020] Internationally-recognized global leader on climate change Christiana Figueres argues that the battle against global threats like climate change begins in our own heads. She became the United Nations’ top climate official, after she had watched the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, Christiana began performing an act of emotional Aikido on herself, her team, and eventually delegates from 196 nations. She called it “stubborn optimism.” It requires a clear and alluring vision of a future that can supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. In this episode, we explore how a similar shift in Silicon Valley’s vision could lead 3 billion people to take action for the planet.
22/04/21•59m 56s
Mind the (Perception) Gap — with Dan Vallone
What do you think the other side thinks? Guest Dan Vallone is the Director of More in Common U.S.A., an organization that’s been asking Democrats and Republicans that critical question. Their work has uncovered countless “perception gaps” in our understanding of each other. For example, Democrats think that about 30 percent of Republicans support "reasonable gun control," but in reality, it’s about 70 percent. Both Republicans and Democrats think that about 50 percent of the other side would feel that physical violence is justified in some situations, but the actual number for each is only about five percent. “Both sides are convinced that the majority of their political opponents are extremists,” says Dan. “And yet, that's just not true.” Social media encourages the most extreme views to speak the loudest and rise to the top—and it’s hard to start a conversation and work together when we’re all arguing with mirages. But Dan’s insights and the work of More in Common provide a hopeful guide to unraveling the distortions we’ve come to accept and correcting our foggy vision.
15/04/21•1h 2m
Spotlight — Coded Bias
The film Coded Bias follows MIT Media Lab researcher Joy Buolamwini through her investigation of algorithmic discrimination, after she accidentally discovers that facial recognition technologies do not detect darker-skinned faces. Joy is joined on screen by experts in the field, researchers, activists, and involuntary victims of algorithmic injustice. Coded Bias was released on Netflix April 5, 2021, premiered at the Sundance Film Festival last year, and has been called “‘An Inconvenient Truth’ for Big Tech algorithms” by Fast Company magazine. We talk to director Shalini Kantayya about the impetus for the film and how to tackle the threats these challenges pose to civil rights while working towards more humane technology for all.
08/04/21•23m 56s
Come Together Right Now — with Shamil Idriss
How many technologists have traveled to Niger, or the Balkans, or Rwanda, to learn the lessons of peacebuilding? Technology and social media are creating patterns and pathways of conflict that few people anticipated or even imagined just a decade ago. And we need to act quickly to contain the effects, but we don't have to reinvent the wheel. There are people, such as this episode’s guest, Shamil Idriss, CEO of the organization Search for Common Ground, who have been training for years to understand human beings and learn how to help them connect and begin healing processes. These experts can share their insights and help us figure out how to apply them to our new digital habitats. “Peace moves at the speed of trust, and trust can’t be fast-tracked,” says Shamil. Real change is possible, but as he explains, it takes patience, care, and creativity to get there.
01/04/21•1h 16m
Disinformation Then and Now — with Camille François
Disinformation researchers have been fighting two battles over the last decade: one to combat and contain harmful information, and one to convince the world that these manipulations have an offline impact that requires complex, nuanced solutions. Camille François, Chief Information Officer at the cybersecurity company Graphika and an affiliate of the Harvard Berkman Klein Center for Internet & Society, believes that our common understanding of the problem has recently reached a new level. In this interview, she catalogues the key changes she observed between studying Russian interference in the 2016 U.S. election and helping convene and operate the Election Integrity Partnership watchdog group before, during and after the 2020 election. “I'm optimistic, because I think that things that have taken quite a long time to land are finally landing, and because I think that we do have a diverse set of expertise at the table,” she says. Camille and Tristan Harris dissect the challenges and talk about the path forward to a healthy information ecosystem.
18/03/21•55m 45s
The Courage to Connect — with Ciaran O’Connor and John Wood, Jr.
It’s no revelation that Americans aren’t getting along. But it’s easier to diagnose the problem than come up with solutions. The organization Braver Angels runs workshops that convince Republicans and Democrats to meet, but not necessarily in the middle. “Conflict can actually be a pathway to intimacy and connection rather than division, if you have the right structure for bringing people together,” says Ciaran O’Connor, the organization’s Chief Marketing Officer. We’re delighted to have Ciaran and the Braver Angels National Ambassador John Wood, Jr. on the show to describe their methods, largely based on marriage counseling techniques, and talk about where to go next. “How do you scale that up and apply that to the digital space, given that that is the key battlefield?” asks John. Technology companies play a role here, and the wisdom of the people doing the work on the ground is a valuable guide.
04/03/21•1h
A Renegade Solution to Extractive Economics — with Kate Raworth
When Kate Raworth began studying economics, she was disappointed that the mainstream version of the discipline didn’t fully address many of the world issues that she wanted to tackle, such as human rights and environmental destruction. She left the field, but was inspired to jump back in after the financial crisis of 2008, when she saw an opportunity to introduce fresh perspectives. She sat down and drew a chart in the shape of a doughnut, which provided a way to think about our economic system while accounting for the impact to the world around us, as well as for humans’ baseline needs. Kate’s framing can teach us a lot about how to transform the economic model of the technology industry, helping us move from a system that values addicted, narcissistic, polarized humans to one that values healthy, loving and collaborative relationships. Her book, “Doughnut Economics: Seven Ways to Think Like a 21st Century Economist,” gives us a guide for transitioning from a 20th-century paradigm to an evolved 21st-century one that will address our existential-scale problems.
11/02/21•1h 26m
Two Million Years in Two Hours: A Conversation with Yuval Noah Harari
Yuval Noah Harari is one of the rare historians who can give us a two-million-year perspective on today’s headlines. In this wide-ranging conversation, Yuval explains how technology and democracy have evolved together over the course of human history, from paleolithic tribes to city states to kingdoms to nation states. So where do we go from here? “In almost all the conversations I have,” Yuval says, “we get stuck in dystopia and we never explore the no less problematic questions of what happens when we avoid dystopia.” We push beyond dystopia and consider the nearly unimaginable alternatives in this special episode of Your Undivided Attention.
15/01/21•1h 59m
Won't You Be My Neighbor? A Civic Vision for the Internet — with Eli Pariser
You’ve heard us talk before on this podcast about the pitfalls of trying to moderate a “global public square.” Our guest today, Eli Pariser, co-director of Civic Signals, co-founder of Avaaz, and author of "The Filter Bubble," has been thinking for years about how to create more functional online spaces and is bringing people together to solve that problem. He believes the answer lies in creating spaces and groups intentionally, with the same kinds of skilled support and infrastructure that we would enlist in the physical world. It’s not enough to expect the big revenue-oriented tech companies to transform their tools into something less harmful; Eli is encouraging us to proactively gather in our own spaces, optimized for togetherness and cooperation.
23/12/20•48m 25s
Are the Kids Alright? — with Jonathan Haidt
We are in the midst of a teen mental health crisis. Since 2011, the rate of U.S. hospitalizations for preteen girls who have self-harmed is up 189 percent, and with older teen girls, it’s up 62 percent. Tragically, the numbers on suicides are similar — 151 percent higher for preteen girls, and 70 percent higher for older teen girls. NYU social psychologist Jonathan Haidt has spent the last few years trying to figure out why, working with fellow psychologist Jean Twenge, and he believes social media is to blame. Jonathan and Jean found that the mental health data show a stark contrast between Generation Z and Millennials, unlike any demographic divide researchers have seen since World War II, and the division tracks with a sharp rise in social media use. As Jonathan explains in this interview, disentangling correlation and causation is a persistent research challenge, and the debate on this topic is still in full swing. But as TikTok, Instagram, Snapchat and the next big thing fine-tune the manipulative and addictive features that pull teens in, we cannot afford to ignore this problem while we sit back and wait for conclusive results. When it comes to children, our standards need to be higher, and our burden of proof lower.
27/10/20•40m 35s
Your Nation's Attention for the Price of a Used Car — with Zahed Amanullah
Today’s extremists don’t need highly produced videos like ISIS. They don’t need deep pockets like Russia. With the right message, a fringe organization can reach the majority of a nation’s Facebook users for the price of a used car. Our guest, Zahed Amanullah, knows this firsthand. He’s a counter-terrorism expert at the Institute for Strategic Dialogue, and when his organization received $10,000 in ad credits from Facebook for an anti-extremism campaign, they were able to reach about two-thirds of Kenya’s Facebook users. It was a surprising win for Zahed, but it means nefarious groups all over the African continent have exactly the same broadcasting power. Last year, Facebook took down 66 accounts, 83 pages, 11 groups and 12 Instagram accounts related to Russian campaigns in African countries, and Russian networks spent more than $77,000 on Facebook ads in Africa. Today on the show, Zahed will explain how the very tools that extremists use to broadcast messages of hate can also be used to stop them in their tracks, and he’ll tell us what tech and government must do to systematically counter the problem. “If we don’t get in front of this,” he says, “this phenomenon is going to amplify beyond our reach.“
06/10/20•43m 17s
Spotlight: The Social Dilemma
A new documentary called The Social Dilemma comes out on Netflix today, September 9, 2020. We hope that this film, full of interviews with tech insiders, will be a catalyst and tool for exposing how technology has been distorting our perception of the world, and will help us reach the shared ground we need to solve big problems together.
09/09/20•4m 26s
Facebook Goes '2Africa' — with Julie Owono
This summer, Facebook unveiled “2Africa,” a subsea cable project that will encircle nearly the entire continent of Africa — much to the surprise of Julie Owono. As Executive Director of Internet Without Borders, she’s seen how quickly projects like this can become enmeshed in local politics, as private companies dig through territorial waters, negotiate with local officials and gradually assume responsibility over vital pieces of national infrastructure. “It’s critical, now, that communities have a seat at the table,” Julie says. We ask her about the risks of tech companies leading us into an age of “digital colonialism,” and what she hopes to achieve as a newly appointed member of Facebook’s Oversight Board.
02/09/20•35m 43s
When Media Was for You and Me — with Fred Turner
In 1940, a group of 60 American intellectuals formed the Committee for National Morale. “They’ve largely been forgotten,” says Fred Turner, a professor of communications at Stanford University, but their work had a profound impact on public opinion. They produced groundbreaking films and art exhibitions. They urged viewers to stop, reflect and think for themselves, and in so doing, they developed a set of design principles that reimagined how media could make us feel more calm, reflective, empathetic; in short, more democratic.
06/08/20•37m 7s
Digital Democracy Is Within Reach — with Audrey Tang
Imagine a world where every country has a digital minister and technologically-enabled legislative bodies. Votes are completely transparent and audio and video of all conversations between lawmakers and lobbyists are available to the public immediately. Conspiracy theories are acted upon within two hours and replaced by humorous videos that clarify the truth. Imagine that expressing outrage about your local political environment turned into a participatory process where you were invited to solve that problem and even entered into a face to face group workshop. Does that sound impossible? It’s ambitious and optimistic, but that's everything that our guest this episode, Audrey Tang, digital minister of Taiwan, has been working on in her own country for many years. Audrey’s path into public service began in 2014 with her participation in the Sunflower Movement, a student-led protest in Taiwan’s parliamentary building, and she’s been building on that experience ever since, leading her country into a future of truly participatory digital democracy.
23/07/20•46m 33s
Spotlight — Beyond the Boycott
#StopHateforProfit is an important first step, but we need to go much further.
10/07/20•9m 20s
The World According to Q — with Travis View
What would inspire someone to single-handedly initiate an armed standoff at the Hoover Dam, lead police on a 100-mile-an-hour car chase while calling for help from an anonymous internet source, or travel hundreds of miles alone to shoot up a pizza parlor? The people who did these things were all connected to QAnon, a decentralized, cult-like internet conspiracy theory. Our guest this episode, Travis View, is a researcher, writer and podcast host who has spent the last few years trying to understand the people who’ve become wrapped up in QAnon, and the concerning consequences as Q followers increasingly leave their screens and take extreme action in the real world. As many as six candidates who support QAnon are running for Congress and will be on the ballot in the 2020 elections, threatening to upend long-held Republican establishment seats; it just happened to a five-term Republican congressman in Colorado. Travis warns that QAnon is an extremism problem, not a disinformation or political one, and that dismissing it as a fringe threat underestimates how quickly its views can leapfrog into mainstream debates on both the left and the right.
08/07/20•59m 13s
The Bully’s Pulpit — with Fadi Quran
The sound of bullies on social media can be deafening, but what about their victims? “They’re just sitting there being pummeled and pummeled and pummeled,” says Fadi Quran. As campaign director of Avaaz, a platform of 62 million activists worldwide, Fadi and his team go to great lengths to figure out exactly how social media is being weaponized against vulnerable communities, including those with no voice online at all. “They can’t report it. They’re not online,” Fadi says. “They can’t even have a conversation about it.” But by bringing the voices of survivors to Silicon Valley, Fadi says, tech companies can not only hear the lethal consequences of algorithmic abuse but also start hacking away at a system that, he argues, was “designed for bullies.”
22/06/20•55m 53s
The Dictator's Playbook Revisited — with Maria Ressa (Rerun)
[This episode originally aired on November 5, 2019] Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
17/06/20•52m 11s
The Fake News of Your Own Mind — with Jack Kornfield and Trudy Goodman
When you’re gripped by anxiety, fear, grief or dread, how do you escape? It can happen in the span of a few breaths, according to meditation experts Jack Kornfield and Trudy Goodman. They have helped thousands of people find their way out of a mental loop by moving deeper into it. It’s a journey inward that reveals an important lesson for the architects of the attention economy: you cannot begin to build humane technology for billions of users until you pay careful attention to the course of your own wayward thoughts.
02/06/20•49m 22s
The Stubborn Optimist’s Guide to Saving the Planet — with Christiana Figueres
How can we feel empowered to take on global threats? The battle begins in our heads, argues Christiana Figueres. She became the United Nations’ top climate official after watching the 2009 Copenhagen climate summit collapse “in blood, in screams, in tears.” In the wake of that debacle, she began performing an act of emotional Aikido on herself, her team and eventually delegates from 196 nations. She called it “stubborn optimism”: a clear and alluring vision of the future, one strong enough to supplant the dystopian and discouraging vision of what will happen if the world fails to act. It was stubborn optimism, she says, that convinced those nations to sign the first global climate framework, the Paris Agreement. We explore how a similar shift in Silicon Valley’s vision could lead 3 billion people to take action.
21/05/20•52m 54s
The Spin Doctors Are In — with Renée DiResta
How does disinformation spread in the age of COVID-19? It takes an expert like Renée DiResta to trace conspiracy theories back to their source. She’s already exposed how Russian state actors manipulated the 2016 election, but that was just a prelude to what she’s seeing online today: a convergence of state actors and lone individuals, anti-vaxxers and NRA supporters, scam artists and preachers and the occasional fan of cuddly pandas. What ties all of these disparate actors together is an information ecosystem that’s breaking down before our eyes. We explore what’s going wrong and what we must do to fix it in this interview with Renée DiResta, Research Manager at the Stanford Internet Observatory.
07/05/20•52m 57s
When Attention Went on Sale — with Tim Wu
An information system that relies on advertising was not born with the Internet, but social media platforms have taken it to an entirely new level, becoming a major force in how we make sense of ourselves and the world around us. Columbia law professor Tim Wu, author of The Attention Merchants and The Curse of Bigness, takes us from the birth of the eyeball-centric news model and the ensuing boom of yellow journalism to the backlash that rallied journalists and citizens around creating industry ethics and standards. Throughout the 20th century, radio, television and even posters elicited excitement, hope, fear, skepticism and greed, and people worked together to create a patchwork of regulation and behavior that attempted to point those tools in the direction of good. The Internet has brought us to just such a crossroads again, but this time with global consequences that are truly life-and-death.
28/04/20•45m 22s
Changing Our Climate of Denial — with Anthony Leiserowitz
We agree more than we think we do, but tech platforms distort our perceptions by amplifying the loudest, angriest and most dismissive voices online. In reality, they’re just a noisy faction. This Earth Day we ask Anthony Leiserowitz, Director of the Yale Program on Climate Change Communication, how he shifts public opinion on climate change. We’ll see how tech platforms could amplify voices of solidarity within our own communities. More importantly, we’ll see how they could empower 2 billion people to act in the face of global threats.
22/04/20•1h 6m
Stranger than Fiction — with Claire Wardle
How can tech companies help flatten the curve? First and foremost, they must address the lethal misinformation and disinformation circulating on their platforms. The problem goes much deeper than fake news, according to Claire Wardle, co-founder and executive director of First Draft. She studies the gray zones of information warfare, where bad actors mix facts with falsehoods, news with gossip, and sincerity with satire. “Most of this stuff isn't fake and most of this stuff isn't news,” Claire argues. If these subtler forms of misinformation go unaddressed, tech companies may not only fail to flatten the curve — they could raise it higher.
31/03/20•1h 2m
Mr. Harris Goes to Washington
What difference can a few hours of Congressional testimony make? Tristan takes us behind the scenes of his January 8th testimony on disinformation in the digital age before the Energy and Commerce Committee. With just minutes to answer each lawmaker’s questions, he talks with Committee members about the immense challenge posed by the urgency and complexity of humane technology issues. Tristan returned hopeful: though it sometimes feels like Groundhog Day, each trip to DC reveals evolving conversations, advancing legislation, deeper understanding and stronger coalitions.
30/01/20•42m 10s
Trust Falls — with Rachel Botsman
We are in the middle of a global trust crisis. Neighbors are strangers, local news sources are becoming scarcer, and institutions that once symbolized prestige, honor and a sense of societal security are ridiculed as antiquated and out of touch. To fill the void, we turn to sharing-economy companies and social media, which come up short, or worse. Our guest on this episode, academic and business advisor Rachel Botsman, guides us through how we got here and how to recover. Botsman is the Trust Fellow at Oxford University and the author of two books, including “Who Can You Trust?” The intangibility of trust makes it difficult to pin down, she explains, and she speaks directly to technology leaders about fostering communities and creating products the public is willing to put faith in. “The efficiency of technology is the enemy of trust,” she says.
14/01/20•51m 22s
The Cure for Hate — with Tony McAleer
“You can binge-watch an ideology in a weekend,” says Tony McAleer. He should know. A former white supremacist, McAleer was introduced to neo-Nazi ideology through the U.K. punk scene in the 1980s. But after his daughter was born, he embarked on a decades-long journey from hate to compassion. Today’s technology, he says, makes violent ideologies infinitely more accessible and appealing to those who long for acceptance. Social media isolates us and can incubate hate in a highly diffuse structure, making it nearly impossible to stop race-based violence without fanning the flames or driving it further underground. McAleer discusses solutions to this dilemma and the positive actions we can take together.
19/12/19•41m 17s
Rock the Voter — with Brittany Kaiser
Brittany Kaiser, a former Cambridge Analytica insider, witnessed a two-day presentation at the company that shocked her and her co-workers. It laid out a new method of campaigning, in which candidates greet voters with a thousand faces and speak in a thousand tongues, automatically generating messages aimed ever more precisely at an audience of one. She explains how these methods of persuasion have shaped elections worldwide, enabling candidates to sway voters in strange and startling ways.
05/12/19•52m 20s
The Dictator's Playbook — with Maria Ressa
Maria Ressa is arguably one of the bravest journalists working in the Philippines today. As co-founder and CEO of the media site Rappler, she has withstood death threats, multiple arrests and a rising tide of populist fury that she first saw on Facebook, in the form of a strange and jarring personal attack. Through her story, she reveals, play by play, how an aspiring strongman can use social media to spread falsehoods, sow confusion, intimidate critics and subvert democratic institutions. Nonetheless, she argues Silicon Valley can reverse these trends, and fast. First, tech companies must "wake up," she says, to the threats they've unleashed throughout the Global South. Second, they must recognize that social media is intrinsically designed to favor the strongman over the lone dissident and the propagandist over the truth-teller, which is why it has become the central tool in every aspiring dictator's playbook.
05/11/19•50m 44s
The Opposite of Addiction — with Johann Hari
What causes addiction? Johann Hari, author of Chasing the Scream, travelled some 30,000 miles in search of an answer. He met with researchers and lawmakers, drug dealers and drug makers, those who were struggling with substance abuse and those who had recovered from it, and he came to the conclusion that our whole narrative about addiction is broken. "The opposite of addiction is not sobriety," he argues. "The opposite of addiction is connection." But first, we have to figure out what it really means to connect.
22/10/19•48m 58s
Pardon the Interruptions — with Gloria Mark
Every 40 seconds, our attention breaks. It takes an act of extreme self-awareness even to notice. That’s why Gloria Mark, a professor in the Department of Informatics at the University of California, Irvine, started measuring the attention spans of office workers with scientific precision. What she has discovered is not simply an explosion of disruptive communications but a pandemic of stress that has followed workers from their offices to their homes. She shares the latest findings from the “science of interruptions” and explains how we can stop forfeiting our attention to the next notification, and the next one, ad nauseam.
14/08/19•43m 54s
From Russia with Likes (Part 2) — with Renée DiResta
In the second part of our interview, Renée DiResta, disinformation expert, Mozilla fellow, and co-author of a report for the Senate Intelligence Committee’s Russia investigation, explains how social media platforms use your sense of identity and personal relationships to keep you glued to their sites longer, and how those design choices have political consequences. The online tools and tactics of foreign agents can be precise and deliberate, but they don’t have to be; Renée has seen how deception and uncertainty are powerful agents of distrust, and easy to create. Do we really need the frictionless global amplification of information-sharing that social media enables, anyway? We don’t tolerate spam in our email inboxes, so why do we tolerate it in our social media feeds? What would happen if we had to copy and paste, and click twice or three times? Tristan and Aza also brainstorm ways to prevent and control disinformation in the lead-up to elections, particularly the 2020 U.S. elections.
01/08/19•28m 53s
From Russia with Likes (Part 1) — with Renée DiResta
Today’s online propaganda has evolved in unforeseeable and seemingly absurd ways; by laughing at or spreading a Kermit the Frog meme, you may be unwittingly advancing the Russian agenda. These campaigns affect the integrity of our elections, our public health, and our relationships. In this episode, the first of two parts, disinformation expert Renée DiResta talks with Tristan and Aza about how these tactics work, how social media platforms’ algorithms and business models allow foreign agents to game the system, and what these messages reveal about ourselves. Renée gained unique insight into this issue in 2017, when Congress asked her to lead a team of investigators analyzing a data set of texts, images and videos from Facebook, Twitter and Google thought to have been created by Russia’s Internet Research Agency. She shares what she learned, and in part two of the conversation, Renée, Tristan and Aza discuss what steps can be taken to prevent this kind of manipulation in the future.
24/07/19•45m 47s
Down the Rabbit Hole by Design — with Guillaume Chaslot
When we press play on a YouTube video, we set in motion an algorithm that taps all available data to find the next video that will keep us glued to the screen. Because of its advertising-based business model, YouTube’s top priority is not to help us learn to play the accordion, tie a bow tie, heal an injury, or see a new city; it’s to keep us staring at the screen for as long as possible, regardless of the content. This episode’s guest, AI expert Guillaume Chaslot, helped write YouTube’s recommendation engine, and he explains how those priorities spin up outrage, conspiracy theories and extremism. After leaving YouTube, Guillaume made it his mission to shed light on those hidden patterns through his website, AlgoTransparency.org, which tracks and publicizes YouTube’s recommendations for controversial content channels. Through his work, he encourages YouTube to take responsibility for the videos it promotes and aims to give viewers more control.
10/07/19•54m 29s
With Great Power Comes... No Responsibility? — with Yaël Eisenstat
Aza sits down with Yaël Eisenstat, a former CIA officer and former White House advisor. When Yaël noticed that Americans were having a harder and harder time finding common ground, she shifted her work from counter-extremism abroad to advising technology companies in the U.S. She believed that, as danger at home increased, her public-sector experience could help fill a gap in Silicon Valley’s talent pool and chip away at the ways tech was contributing to polarization and election hacking. But when she joined Facebook in June 2018, things didn’t go as planned. Yaël shares the lessons she learned and her perspective on government’s role in regulating tech, and Aza and Tristan raise questions about our relationships with these companies and the balance of power.
25/06/19•55m 41s
Should've Stayed in Vegas — with Natasha Dow Schüll
In part two of our interview with cultural anthropologist Natasha Dow Schüll, author of Addiction by Design, we learn what gamblers are really after a lot of the time — it’s not money. And it’s the same thing we’re looking for when we mindlessly open up Facebook or Twitter. How can we design products so that we’re not taking advantage of these universal urges and vulnerabilities but using them to help us? Tristan, Aza and Natasha explore ways we could shift our thinking about making and using technology.
19/06/19•39m 11s
What Happened in Vegas — with Natasha Dow Schüll
Natasha Dow Schüll, author of Addiction by Design, has spent years studying how slot machines hold gamblers spellbound in an endless loop of play. She never imagined that the addictive designs she first witnessed in Las Vegas would go bounding into Silicon Valley and reappear on virtually every smartphone screen worldwide. In the first segment of this two-part interview, she offers a prescient warning to users and designers alike: how far can the attention economy go toward stealing another moment of your time? Farther than you might imagine.
10/06/19•40m 51s
Launching June 10: Your Undivided Attention
Technology has shredded our attention. We can do better.
16/04/19•3m 16s