The TED AI Show: Can AI read your mind? The battle for your brain w/ Nita Farahany
TED Tech · September 10, 2024 · 35:22 · 64.72 MB


Imagine a world where your thoughts are no longer private – where employers, friends, and even companies can see, hack, or exploit your thinking. According to ethicist Nita Farahany, that reality is closer than you think. Nita and Bilawal discuss the rapidly advancing field of neurotechnology and its potential to completely transform our everyday lives, from tools that could help you deeply understand your health to tech that could manipulate your dreams. Nita also shares why we need to protect our "cognitive liberty" and how to exercise our rights to think freely in an age of mind-reading technology.

For transcripts for The TED AI Show, visit go.ted.com/TTAIS-transcripts

Learn more about our flagship conference happening this April at attend.ted.com/podcast


Hosted on Acast. See acast.com/privacy for more information.


[00:00:00] [SPEAKER_01]: TED Audio Collective

[00:00:07] [SPEAKER_03]: Scene 1, interior.

[00:00:09] [SPEAKER_03]: An office space.

[00:00:11] [SPEAKER_03]: Fluorescent.

[00:00:12] [SPEAKER_03]: There.

[00:00:13] [SPEAKER_03]: The year is 2035, and neurotechnology is the new normal.

[00:00:19] [SPEAKER_03]: An employee sits at their desk.

[00:00:21] [SPEAKER_03]: They're wearing earbuds that their workplace has issued them.

[00:00:24] [SPEAKER_03]: But these earbuds are doing much more than playing music, because they have brain sensors

[00:00:29] [SPEAKER_03]: embedded inside them.

[00:00:31] [SPEAKER_01]: It's also tracking their stress levels while they're interacting with their screen and

[00:00:35] [SPEAKER_01]: typing a memo.

[00:00:36] [SPEAKER_01]: And then they're sitting back relaxing and they're reviewing their brain data over the

[00:00:41] [SPEAKER_01]: past couple of months, and they notice that there's something unusual going on when

[00:00:45] [SPEAKER_01]: they're sleeping.

[00:00:46] [SPEAKER_01]: And so they pull up a new email, type it through their brain sensing earbuds,

[00:00:52] [SPEAKER_01]: and they send off a quick message to their doctor and say like, hey, notice that there's

[00:00:56] [SPEAKER_01]: something unusual going on here.

[00:00:58] [SPEAKER_01]: Could you take a look and let me know what you think?

[00:01:00] [SPEAKER_03]: There are no physical screens or keyboards here.

[00:01:03] [SPEAKER_03]: Only desks where the workers sit and stare straight ahead, clicking around office software

[00:01:08] [SPEAKER_03]: with their minds.

[00:01:10] [SPEAKER_03]: Our worker is sitting with her arms folded at a desk.

[00:01:13] [SPEAKER_03]: As she waits for her doctor to respond, her thoughts wander.

[00:01:16] [SPEAKER_01]: She starts to fantasize about one of her colleagues, and then suddenly starts to

[00:01:22] [SPEAKER_01]: worry realizing that her employer has access to all of her brain data.

[00:01:26] [SPEAKER_01]: And she notices that a little message pops up on her screen warning about interoffice

[00:01:31] [SPEAKER_01]: romances and she doesn't want to get in trouble in that context.

[00:01:34] [SPEAKER_01]: She's relieved when later in the day she gets an email from her boss that tells her

[00:01:38] [SPEAKER_01]: that she's getting a performance bonus, and the reason is her brain metrics show

[00:01:44] [SPEAKER_01]: that she's really just been on it.

[00:01:47] [SPEAKER_01]: She leaves work, she's still jamming to the music with her brain sensing earbuds in.

[00:01:52] [SPEAKER_01]: Playlists will become increasingly responsive to brain activity.

[00:01:55] [SPEAKER_01]: That's kind of a given.

[00:01:57] [SPEAKER_01]: She gets home, has her brain sensing earbuds in while she sleeps at night.

[00:02:01] [SPEAKER_01]: That allows her to track brain activity during the night.

[00:02:04] [SPEAKER_01]: It also potentially could allow people to do things like market to her while

[00:02:08] [SPEAKER_01]: she's sleeping.

[00:02:10] [SPEAKER_01]: And then she comes into work the next day.

[00:02:12] [SPEAKER_01]: There's a somber cloud that's fallen over the office and the reason is that

[00:02:16] [SPEAKER_01]: one of her colleagues has been arrested under suspicion of engaging in some kind of wrongdoing.

[00:02:23] [SPEAKER_01]: And she's really worried because she's been secretly working with that person on

[00:02:29] [SPEAKER_01]: a startup.

[00:02:30] [SPEAKER_01]: But it turns out that as you work more closely with someone, you can start to

[00:02:33] [SPEAKER_01]: see synchronization of brain data between two people.

[00:02:37] [SPEAKER_01]: And she's worried that that synchronization data is going to be used by

[00:02:41] [SPEAKER_01]: the authorities to implicate her in his wrongdoings.

[00:02:46] [SPEAKER_03]: A world overrun by neurotechnology.

[00:02:48] [SPEAKER_03]: It all sounds very sci-fi.

[00:02:50] [SPEAKER_03]: But our guest today, Nita Farahany, says that many of our devices already have

[00:02:55] [SPEAKER_03]: these sensing capabilities.

[00:02:57] [SPEAKER_03]: And this kind of workplace experience is just within reach.

[00:03:00] [SPEAKER_01]: There are already workplaces that are issuing these earbuds that have brain

[00:03:04] [SPEAKER_01]: sensing devices in them, which could really both benefit an

[00:03:09] [SPEAKER_01]: individual and be used against them.

[00:03:14] [SPEAKER_03]: I'm Bilawal Sidhu and this is the TED AI Show, where we figure out how to

[00:03:18] [SPEAKER_03]: live and thrive in a world where AI is changing everything.

[00:03:26] [SPEAKER_02]: Imagine this.

[00:03:28] [SPEAKER_02]: In 2030, the CFO of a Fortune 100 company is a bot.

[00:03:33] [SPEAKER_02]: I'm Paul Michaelman and on Imagine This, we'll be exploring possible futures and

[00:03:37] [SPEAKER_02]: the implications they hold for organizations.

[00:03:40] [SPEAKER_02]: Joining me will be BCG's top experts as well as my co-host GENE, BCG's

[00:03:44] [SPEAKER_02]: conversational gen AI agent.

[00:03:46] [SPEAKER_02]: Blending human creativity with AI innovation, this podcast promises an

[00:03:51] [SPEAKER_02]: unmatched listening journey.

[00:03:53] [SPEAKER_02]: Join us on Imagine This from BCG.

[00:03:57] [SPEAKER_00]: Hey, TED Tech listeners.

[00:03:58] [SPEAKER_00]: If you enjoy this show, we think you'll enjoy a new show about how tech is

[00:04:02] [SPEAKER_00]: changing our politics.

[00:04:03] [SPEAKER_00]: It's called Wired Politics Lab, hosted by Wired senior politics editor Leah

[00:04:08] [SPEAKER_00]: Feiger and a rotating cast of Wired's political reporters.

[00:04:12] [SPEAKER_00]: Each week, you'll be guided through the exciting, challenging and sometimes

[00:04:16] [SPEAKER_00]: entertaining vortex of Internet extremism, conspiracies and

[00:04:19] [SPEAKER_00]: disinformation.

[00:04:21] [SPEAKER_00]: From AI chat bots running for political office to deep fakes

[00:04:24] [SPEAKER_00]: influencing voters, you'll hear in-depth analysis and conversations based on facts

[00:04:28] [SPEAKER_00]: and research.

[00:04:29] [SPEAKER_00]: Find and follow Wired Politics Lab wherever you listen.

[00:04:37] [SPEAKER_03]: Today, we're going to dive into AI-enhanced neurotechnology with ethicist

[00:04:41] [SPEAKER_03]: and legal scholar, Nita Farahany.

[00:04:44] [SPEAKER_03]: Nita is the author of The Battle for Your Brain: Defending the Right to

[00:04:48] [SPEAKER_03]: Think Freely in the Age of Neurotechnology, a book I found both

[00:04:52] [SPEAKER_03]: fascinating and terrifying.

[00:04:54] [SPEAKER_03]: It's about how neurotechnology without regulation has the power to infringe upon

[00:04:59] [SPEAKER_03]: our last bastion of privacy, our privacy of thought.

[00:05:03] [SPEAKER_03]: But is this kind of technology inevitable?

[00:05:06] [SPEAKER_03]: And if so, how do we preserve our cognitive liberty, a term that Nita

[00:05:11] [SPEAKER_03]: has used that I find myself thinking about a lot lately?

[00:05:14] [SPEAKER_03]: I followed up with Nita because I had so many questions.

[00:05:20] [SPEAKER_03]: Nita, welcome to the TED AI Show.

[00:05:22] [SPEAKER_01]: Thanks for having me.

[00:05:23] [SPEAKER_03]: First, I have to ask, can you start by giving us a brief overview of

[00:05:27] [SPEAKER_03]: neurotechnology?

[00:05:28] [SPEAKER_01]: Sure.

[00:05:29] [SPEAKER_03]: What it is, how does it work at a high level and what are some of the

[00:05:33] [SPEAKER_03]: dominant use cases that companies are pursuing?

[00:05:36] [SPEAKER_01]: Up until now, if somebody hears neurotechnology, they associate it with

[00:05:41] [SPEAKER_01]: maybe something like what Elon Musk is doing, which is with his

[00:05:44] [SPEAKER_01]: Neuralink company, drilling a small hole into the skull,

[00:05:49] [SPEAKER_01]: implanting electrodes deep into the brain and then enabling somebody who

[00:05:54] [SPEAKER_01]: maybe has lost the ability to communicate or has lost the ability to move to

[00:05:59] [SPEAKER_01]: regain some of that functionality.

[00:06:01] [SPEAKER_01]: That is also neurotechnology, but it is not the neurotechnology that's

[00:06:04] [SPEAKER_01]: going to impact most of us.

[00:06:06] [SPEAKER_01]: What I'm focused on is the Internet of Things and wearable technology.

[00:06:10] [SPEAKER_01]: So the fact that people are increasingly wearing devices like a

[00:06:15] [SPEAKER_01]: smartwatch or earbuds or XR devices like a virtual reality and augmented

[00:06:20] [SPEAKER_01]: reality glasses that are packed with sensors and those sensors are picking

[00:06:24] [SPEAKER_01]: up different aspects of our bodily function.

[00:06:28] [SPEAKER_01]: Anytime you think, anytime you do anything, neurons are firing in your

[00:06:31] [SPEAKER_01]: brain.

[00:06:32] [SPEAKER_01]: They give off tiny electrical discharges.

[00:06:34] [SPEAKER_01]: Hundreds of thousands of neurons may be firing at any given moment

[00:06:37] [SPEAKER_01]: that reflect different mental states.

[00:06:40] [SPEAKER_01]: So when you're happy, when you're sad, when your mind is

[00:06:43] [SPEAKER_01]: wandering, or when you think up, down, left, right and you want to navigate

[00:06:46] [SPEAKER_01]: around a screen, those are electrical signals that can be picked up by these

[00:06:51] [SPEAKER_01]: sensors and then AI enables the decoding of that into commands.

[00:06:56] [SPEAKER_01]: Most people are used to, for example, heart rate or the number of

[00:07:00] [SPEAKER_01]: steps that they're taking or even temperature being something that's

[00:07:03] [SPEAKER_01]: tracked.

[00:07:03] [SPEAKER_01]: And what's already on the market but starting to come in a much more

[00:07:07] [SPEAKER_01]: widespread fashion is embedding brain sensors into those devices.

[00:07:12] [SPEAKER_01]: Mostly electroencephalography sensors, EEG sensors, and those sensors can be

[00:07:18] [SPEAKER_01]: put into earbuds or headphones or even small wearable tattoos behind the ear

[00:07:23] [SPEAKER_01]: that pick up electrical activity in the brain.

[00:07:27] [SPEAKER_01]: And so really the way to think about neurotechnology is not some separate

[00:07:32] [SPEAKER_01]: device although there are many of those that already exist on the

[00:07:35] [SPEAKER_01]: marketplace like a forehead band or sensors that can be put into a hard

[00:07:39] [SPEAKER_01]: hat.

[00:07:40] [SPEAKER_01]: But instead to think about our wearable devices we're already wearing with new

[00:07:45] [SPEAKER_01]: sensors that have the capability of picking up brain activity.

[00:07:50] [SPEAKER_03]: Even recently with the release of the Apple Vision Pro or the Meta Quest 3,

[00:07:54] [SPEAKER_03]: the way I'm thinking about it is on one hand it's a VR headset,

[00:07:57] [SPEAKER_03]: on the other hand it's a biometric recorder on your face.

[00:08:00] [SPEAKER_03]: That's right.

[00:08:01] [SPEAKER_03]: What is currently possible with this technology as far as mind reading

[00:08:05] [SPEAKER_03]: goes and what is not quite possible yet but is on the near horizon?

[00:08:09] [SPEAKER_01]: Yeah, it's a good question.

[00:08:10] [SPEAKER_01]: XR is in many ways from the ground up new technology.

[00:08:15] [SPEAKER_01]: It's computing on your face but it doesn't have to be constrained by the same old

[00:08:19] [SPEAKER_01]: rules and the same old rules being something like a keyboard and a mouse or a joystick

[00:08:24] [SPEAKER_01]: that doesn't have to be how we navigate through that.

[00:08:27] [SPEAKER_01]: And so as you build a new class of technology it's possible to reimagine what

[00:08:33] [SPEAKER_01]: interacting with that technology looks like.

[00:08:35] [SPEAKER_01]: So that's what these companies have done is they've packed them full of biometric sensors

[00:08:40] [SPEAKER_01]: that whether those are cameras or facial recognition or eye gaze or brain sensors,

[00:08:45] [SPEAKER_01]: all of them are being trained on the ability to be able to make

[00:08:49] [SPEAKER_01]: inferences about brain and mental state.

[00:08:51] [SPEAKER_01]: So are you happy?

[00:08:52] [SPEAKER_01]: Are you sad?

[00:08:53] [SPEAKER_01]: Are you tired?

[00:08:55] [SPEAKER_01]: Are you awake?

[00:08:56] [SPEAKER_01]: Is your mind wandering or are you paying attention?

[00:08:59] [SPEAKER_01]: Some of these kind of basic emotional states.

[00:09:02] [SPEAKER_01]: Also most of them are pretty good at being trained on brain activity to figure out

[00:09:07] [SPEAKER_01]: cursor activity.

[00:09:08] [SPEAKER_01]: So up, down, left, right.

[00:09:10] [SPEAKER_01]: So a lot of the things that you could do with your mouse

[00:09:12] [SPEAKER_01]: you could do with a brain computer interface device.

[00:09:16] [SPEAKER_01]: And then they're getting better at being able to decode an intention to type.

[00:09:20] [SPEAKER_01]: What's not there yet from a true mind reading capacity for wearable devices

[00:09:26] [SPEAKER_01]: is literally what you're thinking or continuous language in your brain.

[00:09:30] [SPEAKER_01]: You can't quite at this point think, oh, I need to send a quick text message

[00:09:34] [SPEAKER_01]: to my husband and then have that somehow decoded from your brain sent to your

[00:09:39] [SPEAKER_01]: mobile device and then sending off a message to him by thinking about doing so.

[00:09:45] [SPEAKER_01]: But that's coming.

[00:09:47] [SPEAKER_01]: So that's the kind of thing where the intention to communicate is something

[00:09:51] [SPEAKER_01]: that can increasingly be decoded by these devices.

[00:09:54] [SPEAKER_01]: And that's where generative AI really can be a game changer because generative AI

[00:09:59] [SPEAKER_01]: trained on natural language and conversations becomes much more powerful

[00:10:04] [SPEAKER_01]: at being able to predict the next word that it is that you want to type.

[00:10:08] [SPEAKER_01]: And so that kind of auto complete feature of generative AI paired together with brain

[00:10:13] [SPEAKER_01]: activity and the increasing capability of being able to put large language models

[00:10:19] [SPEAKER_01]: on a device mean that the devices and the brain sensors can co-evolve with the

[00:10:25] [SPEAKER_01]: individual so that it becomes increasingly more precise at more and more mind reading.

[00:10:29] [SPEAKER_03]: It's really powerful.

[00:10:30] [SPEAKER_03]: And so it sounds like this technology can kind of coarsely infer your brain states,

[00:10:36] [SPEAKER_03]: not exactly read your internal monologue, but your point as you introduce new modalities,

[00:10:40] [SPEAKER_03]: you're looking at what the camera feed sees,

[00:10:42] [SPEAKER_03]: what the user's eye gaze is focused on, and then throw in generative

[00:10:47] [SPEAKER_03]: AI to add a layer of prediction on top of that.

[00:10:50] [SPEAKER_03]: Even with these very coarse sensing capabilities,

[00:10:52] [SPEAKER_03]: you can actually start making very accurate predictions.

[00:10:55] [SPEAKER_01]: That's right.

[00:10:55] [SPEAKER_03]: So I want to understand why is this something so many companies are racing towards and

[00:11:00] [SPEAKER_03]: investing in heavily? What are some of the most positive use cases that you're excited about?

[00:11:05] [SPEAKER_03]: How is this actually going to help users?

[00:11:07] [SPEAKER_01]: Yeah, I think there's a lot of ways it can help users.

[00:11:11] [SPEAKER_01]: And I don't think about neurotechnology as just those brain sensors that are put into

[00:11:16] [SPEAKER_01]: an earbud. I think about this as an entire class of cognitive biometrics,

[00:11:21] [SPEAKER_01]: which allow predictions about what a person is thinking and feeling.

[00:11:24] [SPEAKER_01]: And that can have a lot of really positive use cases, the more we can actually gain

[00:11:28] [SPEAKER_01]: really accurate insights about what's happening inside of our brains.

[00:11:33] [SPEAKER_01]: So the companies who are racing to this space are all of the major tech companies, right?

[00:11:38] [SPEAKER_01]: Apple has all kinds of patents and investments in this space.

[00:11:40] [SPEAKER_01]: And you see the same thing at Meta and Google and others, because if XR technology

[00:11:45] [SPEAKER_01]: really takes off, which I think eventually it will, it doesn't make sense to have it powered

[00:11:51] [SPEAKER_01]: or have us tethered to a keyboard and a mouse or a joystick. There has to be a more natural

[00:11:56] [SPEAKER_01]: and seamless way of interacting with these devices. The approach has not been like,

[00:12:02] [SPEAKER_01]: let's figure out how to commodify all the brain data, but it's like, how do we build

[00:12:05] [SPEAKER_01]: a new class of technology from the ground up and have a new way of thinking about

[00:12:10] [SPEAKER_01]: interacting with that technology? But then there's also a huge amount of investment

[00:12:17] [SPEAKER_01]: that's been happening on the side, which is on mental health and recognizing that the brain

[00:12:22] [SPEAKER_01]: is an untapped potential of areas and products that could be targeted at mental health.

[00:12:29] [SPEAKER_01]: And so this is things like seeing a huge number of apps that are focused on

[00:12:33] [SPEAKER_01]: whether it's AI-based mental health or if it's journaling or meditation apps.

[00:12:40] [SPEAKER_01]: There you have billions to trillions of dollar industry potentially around brain

[00:12:45] [SPEAKER_01]: health and wellness. And then they start to converge because suddenly if you have

[00:12:49] [SPEAKER_01]: this new class of devices that have been focusing on neural interface and you have

[00:12:53] [SPEAKER_01]: this untapped potential of brain health and wellness, suddenly you have access to much

[00:12:58] [SPEAKER_01]: better insights and much better ways of being able to actually interact with the brain and

[00:13:02] [SPEAKER_01]: to gather data that could be used for much more targeted products. So I think it's like

[00:13:08] [SPEAKER_01]: the biggest untapped market if you think of it that way.

[00:13:11] [SPEAKER_03]: That's well said. I believe Zuckerberg called neural interfaces the holy grail of VR, right?

[00:13:17] [SPEAKER_01]: That's right. Yeah. The QWERTY keyboard doesn't make any sense when you're thinking about

[00:13:21] [SPEAKER_01]: XR, right? It just doesn't make sense. Nor does the mouse, right?

[00:13:27] [SPEAKER_01]: Totally.

[00:13:28] [SPEAKER_01]: It's become second nature to us, but the idea that I have to work on a mouse to

[00:13:32] [SPEAKER_01]: navigate around my screen, touch screen was a good innovation that way, but that's still

[00:13:37] [SPEAKER_01]: awkward. And we just were inefficient in our interactions with technology.

[00:13:47] [SPEAKER_03]: This world you painted sounds very efficient, if that's the right word for it, but there

[00:13:53] [SPEAKER_03]: is something uncanny about it, right? And I want to get into what makes it uncanny.

[00:13:58] [SPEAKER_03]: In evaluating some of the risks of this technology, you brought a lot of awareness

[00:14:02] [SPEAKER_03]: to this idea of cognitive liberty. So how do you define cognitive liberty and why do you

[00:14:08] [SPEAKER_03]: find it to be a useful term when you're looking at this new wave of neurotechnology?

[00:14:12] [SPEAKER_01]: I came upon the term cognitive liberty around 2010 or so, and it resonates well with me,

[00:14:17] [SPEAKER_01]: because what it reflects from my perspective is this right to self-determination over

[00:14:23] [SPEAKER_01]: our brain and mental experiences as a fundamental matter. And what does that even

[00:14:27] [SPEAKER_01]: mean to have self-determination? There's this huge amount of literature that has developed

[00:14:32] [SPEAKER_01]: the past couple of decades around what self-determination means and why self-determination

[00:14:37] [SPEAKER_01]: is fundamental to human self-actualization. You need the basic autonomy, the competence,

[00:14:44] [SPEAKER_01]: and the capacity for relatedness to other people. And so for me, it's about that.

[00:14:48] [SPEAKER_01]: It's about those pillars of self-determination, the ways in which technology can both enable it,

[00:14:54] [SPEAKER_01]: but also have increasingly come to interfere with the capacity for self-determination

[00:14:59] [SPEAKER_01]: and how cognitive liberty would give us the goalpost to say, like, what is it that we're

[00:15:04] [SPEAKER_01]: trying to achieve in technological design or in technological regulation? We're trying to

[00:15:09] [SPEAKER_01]: preserve this space of self-determination for individuals to enable them to form their identity,

[00:15:14] [SPEAKER_01]: to have a space of mental privacy, to have the capacity of freedom of thought,

[00:15:18] [SPEAKER_01]: the preconditions to be able to become a fully self-actualized human. So it resonates

[00:15:24] [SPEAKER_01]: really well for me as a liberty interest, as a kind of fundamental right that individuals have

[00:15:30] [SPEAKER_03]: as a precondition to being able to flourish. It's like our minds are this sanctum sanctorum,

[00:15:37] [SPEAKER_03]: right? We feel like we have complete dominion over it, though already technology is influencing

[00:15:44] [SPEAKER_03]: us at a very deep visceral level without us even knowing. And now we're creating these

[00:15:48] [SPEAKER_03]: higher bandwidth forms of sensing. And I'm kind of curious, is there an interesting analogy here

[00:15:55] [SPEAKER_03]: where many folks got their DNA sequenced over the past decade or two without really thinking

[00:16:01] [SPEAKER_03]: about a scenario where maybe a data breach would occur? And it made me wonder from a medical

[00:16:06] [SPEAKER_03]: data perspective, what are going to be the effects of increasing brain and neurological

[00:16:11] [SPEAKER_03]: health transparency without adequate privacy regulations? And obviously America is very

[00:16:16] [SPEAKER_03]: unique in that there is no federal law on internet privacy. Could this medical data

[00:16:22] [SPEAKER_01]: kind of be weaponized against users? Yeah, I mean, I think very much so. So

[00:16:28] [SPEAKER_01]: part of it is people went into direct to consumer genetic testing and could never have

[00:16:34] [SPEAKER_01]: imagined it would eventually be used to solve cold cases and for law enforcement agencies to

[00:16:41] [SPEAKER_01]: collect all of that data and put us all into warrantless searches for here on out, right?

[00:16:48] [SPEAKER_01]: And then some people are like, well, that's okay. I don't mind that. I didn't commit a crime.

[00:16:52] [SPEAKER_01]: And so if it helps find that person, that's great. But as you start to imagine every

[00:16:59] [SPEAKER_01]: possible use case, there are so many use cases that people just don't even contemplate

[00:17:03] [SPEAKER_01]: about the ways in which data can be used or misused against them. I think with brain data,

[00:17:08] [SPEAKER_01]: even more fundamental than that. I don't think it's just let's point to how law enforcement might

[00:17:13] [SPEAKER_01]: use it one day or how others might use it one day. I think it's about the importance of having

[00:17:19] [SPEAKER_01]: a space, like the inner sanctum, the mental privacy, the space we need to even just be us.

[00:17:28] [SPEAKER_01]: And that's almost impossible for us to even grapple with or imagine a world in which we

[00:17:33] [SPEAKER_01]: don't have that. But that's where I spend a lot of my mental energy is imagining that

[00:17:37] [SPEAKER_01]: world where we don't have that. And think about like as a kid, all of the thoughts where you're

[00:17:43] [SPEAKER_01]: like, maybe I'm weird. I have different gender identity or preferences and sexual orientation,

[00:17:51] [SPEAKER_01]: or maybe I don't want to be a doctor and my parents really want me to be a doctor.

[00:17:57] [SPEAKER_01]: And all of these little thoughts that you have every day. If you don't have a space

[00:18:01] [SPEAKER_01]: where you can do that, where you feel safe to just be that person that figures

[00:18:09] [SPEAKER_01]: out who you are. What does that world look like for humans to become? Right? This like

[00:18:14] [SPEAKER_01]: act of becoming that we're constantly engaged in. That's the world that I think we're

[00:18:19] [SPEAKER_01]: entering into without realizing it. It's a world where what we've taken for granted is

[00:18:25] [SPEAKER_01]: most fundamental aspect of being human that suddenly may no longer exist. And we're not

[00:18:32] [SPEAKER_01]: putting into place the right protections to ensure this linchpin of humanity is still safe.

[00:18:40] [SPEAKER_03]: This is what it has come to. We're talking about

[00:18:44] [SPEAKER_03]: neurosurveillance. Let's go even further down the rabbit hole and like imagine this future

[00:18:49] [SPEAKER_03]: a bit more. Do you think we'll get to a place where our brain states, our inner thoughts

[00:18:53] [SPEAKER_03]: are fully transparent to one another? How would this impact our personal relationships?

[00:18:59] [SPEAKER_03]: How's this going to impact society? It kind of feels like Twitter on steroids,

[00:19:03] [SPEAKER_03]: like without the crude thumb typing to express your thoughts. Just

[00:19:07] [SPEAKER_03]: knowing what everyone is thinking at all times just feels wild.

[00:19:11] [SPEAKER_01]: So I taught a class at Duke called Let's Talk About Digital You. And it was a class

[00:19:16] [SPEAKER_01]: for undergraduates where the goal was to have them think critically about digital

[00:19:20] [SPEAKER_01]: technologies and how they interact with them. Then also think about what does it mean for them

[00:19:25] [SPEAKER_01]: and who they are as a person. And one of the things I learned in that class when we were

[00:19:29] [SPEAKER_01]: talking about privacy was how almost every kid in the class shared their location data on their

[00:19:35] [SPEAKER_01]: phone with every one of their friends, like hundreds of friends who were tracking them at

[00:19:40] [SPEAKER_01]: all times. I was shocked by this. And then I think about how in the writing of the book,

[00:19:45] [SPEAKER_01]: I was interviewing this person who leads a meditation class that uses neural devices

[00:19:51] [SPEAKER_01]: and how they've created Facebook groups where they're sharing readouts of their

[00:19:58] [SPEAKER_01]: meditation sessions. They're like, oh, look at my gamma activity there. Look at what's happening

[00:20:01] [SPEAKER_01]: with my alpha activity here. And I think is this going to be the latest status update where

[00:20:06] [SPEAKER_01]: you're like, Nita's in a bad mood. Don't talk to her right now. Or like, Nita is

[00:20:12] [SPEAKER_01]: thinking about food. And my friends reach out and say, like, hey, I'm thinking about food too.

[00:20:15] [SPEAKER_01]: Like, let's go get a bite to eat. Like it just becomes something that we decide like we're

[00:20:21] [SPEAKER_01]: going to share all this data with. And it's not hard to imagine that we get to that place

[00:20:27] [SPEAKER_01]: where it becomes something that is much more transparent. Is that all bad? I don't know.

[00:20:31] [SPEAKER_01]: Right. There are some people who believe total transparency is better. I can buy into

[00:20:37] [SPEAKER_01]: that argument up until you get to mental privacy. And then I really think this act of becoming

[00:20:43] [SPEAKER_01]: requires that we have much greater control over what, if anything, is shared from our brain data.

[00:20:49] [SPEAKER_03]: You're bringing up a point about this technology that is very much a consent issue.

[00:20:53] [SPEAKER_03]: And sometimes I feel like utility with technology almost ends up being a Trojan horse.

[00:20:58] [SPEAKER_03]: Like we do get a lot of benefit from this stuff. But by consenting to these things,

[00:21:02] [SPEAKER_03]: all these other things happen that we did not expect and we may not even be aware of.

[00:21:08] [SPEAKER_01]: Right. And I think that's what I'm trying to really highlight for people. There's been a

[00:21:14] [SPEAKER_01]: lot that's happened over the past year. And part of it has been me worrying about

[00:21:21] [SPEAKER_01]: normalizing neural surveillance, like the risks become invisible to us. And we accept this

[00:21:27] [SPEAKER_01]: new technology without even stopping to recognize all of the implications of what it is that we're

[00:21:32] [SPEAKER_01]: adopting. And part of it is interesting. It's how the technology is introduced to us. A lot of

[00:21:37] [SPEAKER_01]: times it's with, as you put it, utility. Right? There's some utility that we buy into

[00:21:43] [SPEAKER_01]: or there's some experience where it's like the only way you get this fun experience is by

[00:21:49] [SPEAKER_01]: opting into this new technology without then sensitizing us to all of the risks of doing so

[00:21:56] [SPEAKER_01]: and really contemplating like we're crossing a new barrier here. We're crossing some

[00:22:01] [SPEAKER_01]: threshold that we've never passed before and it has profound implications. It is this space of

[00:22:07] [SPEAKER_01]: what it means to be human. Part of how we define vulnerability and intimacy with each other

[00:22:13] [SPEAKER_01]: is what we choose to share and what we choose to hold back. And I don't think that's just a

[00:22:18] [SPEAKER_01]: consent issue, right? Because like at some point there's enough peer pressure and social pressure

[00:22:23] [SPEAKER_01]: that like if you're not sharing your location data, like you're weird, you know, why aren't

[00:22:26] [SPEAKER_01]: you sharing it with everybody else? So there's this coercive pressure toward the norm where we

[00:22:32] [SPEAKER_01]: have this collective action problem. If each of us individually decides like, oh, I'm okay

[00:22:37] [SPEAKER_01]: with sharing my neural data and I don't care if I don't have mental privacy, I don't have

[00:22:40] [SPEAKER_01]: anything to hide. Suddenly we have the collective having not recognized that they've given up

[00:22:47] [SPEAKER_01]: this space of what it means to be human without us ever having really talked about it,

[00:22:53] [SPEAKER_01]: recognized how profound it is of a thing to actually see to other people or to companies.

[00:22:59] [SPEAKER_03]: Yeah, it's like we like targeted ads. Oh, that's fine. I like using Instagram.

[00:23:04] [SPEAKER_03]: These algorithms have the coarsest view into our hopes, wishes and desires

[00:23:10] [SPEAKER_03]: and are still doing such a great job. It feels like we're going into this future where

[00:23:14] [SPEAKER_03]: people are like, oh yeah, I'll get that cheap Alexa and then suddenly they're having dreams

[00:23:19] [SPEAKER_03]: about Bud Light or Coars Light or whatever, but they're like, oh, that's cool with me.

[00:23:23] [SPEAKER_01]: And like I'm asleep. I don't really care. Yeah. I mean, that example is a real one,

[00:23:27] [SPEAKER_01]: right? It's like I write about this in my book where, you know, Coars was really

[00:23:32] [SPEAKER_01]: frustrated. They've been locked out of the halftime show for years for the Super Bowl.

[00:23:37] [SPEAKER_01]: So they got together with dream researchers, figured out that it's possible to incubate in

[00:23:43] [SPEAKER_01]: people's minds when they're in their most suggestible state because their conscious

[00:23:47] [SPEAKER_01]: brain is basically checked out as they're falling asleep. Associations between Coars and

[00:23:53] [SPEAKER_01]: mountains and lakes and streams and like, so that you have this idea that like Coars

[00:23:58] [SPEAKER_01]: is crisp and refreshing. But then I imagine this dystopian future of you're wearing your

[00:24:03] [SPEAKER_01]: sleep earbuds to track your sleep activity. The entire economic system of these companies

[00:24:10] [SPEAKER_01]: is based on targeted advertisement and real time bidding of, you know, here's new real

[00:24:16] [SPEAKER_01]: time state to be able to access Nita, whether she's on her search engine or the app store.

[00:24:21] [SPEAKER_01]: And it's like, Hey, we've never targeted when Nita is asleep, but here's her most

[00:24:25] [SPEAKER_01]: suggestible sleep state. She's got a Amazon Echo in her bedroom, like prime time to

[00:24:32] [SPEAKER_01]: advertise and to play a little jingle is right now. And so like what stops that?

[00:24:36] [SPEAKER_01]: What prevents targeted advertisements to us while we are sleeping? Nothing, right?

[00:24:42] [SPEAKER_01]: I mean, other than like the company shouldn't do that. Okay. What's going to stop them from

[00:24:47] [SPEAKER_01]: doing that if that becomes the most effective way to advertise to us and becomes the most

[00:24:51] [SPEAKER_01]: expensive ad real estate that they could sell to ad brokers? Oh, good Lord. Oh, good Lord.

[00:24:58] [SPEAKER_03]: I mean, this study, this targeted dream incubation, basically demonstrates what companies

[00:25:05] [SPEAKER_03]: and social media algorithms can do and also what neurotechnology is capable of already.

[00:25:10] [SPEAKER_03]: And so it's wild because social algorithms not only extract or infer our way of looking at the

[00:25:17] [SPEAKER_03]: world, but they are actively shaping how we perceive the world. That's right. They predict

[00:25:22] [SPEAKER_03]: what we want, but as these platforms become such an intricate part of our lives, they can almost

[00:25:27] [SPEAKER_03]: define what we want. So when it comes to this threat of neuro surveillance, how do you think

[00:25:34] [SPEAKER_03]: the rise of this tech could influence what we feel safe to think on a conscious level

[00:25:39] [SPEAKER_03]: or even more subliminally, like what we're capable of thinking? That's the question I struggle with

[00:25:44] [SPEAKER_01]: the most these days is, you know, as I try to unpack, like what does self-determination mean

[00:25:51] [SPEAKER_01]: anymore in a world where we're being steered constantly. And, you know, the easiest way

[00:25:57] [SPEAKER_01]: people can really understand that is if you sit down to watch a single episode of a show

[00:26:03] [SPEAKER_01]: you end up watching four, it's by design, you are less likely to get up and leave if it's just

[00:26:11] [SPEAKER_01]: automatic. Like if you don't have to make a self-control choice and we can see these

[00:26:15] [SPEAKER_01]: differences. Like if you look at the studies, there's a difference between when you are making

[00:26:20] [SPEAKER_01]: a choice about what video to watch next versus an algorithm is deciding for you what video

[00:26:26] [SPEAKER_01]: to watch next in that the parts of your brain that are responsible for self-control

[00:26:30] [SPEAKER_01]: are basically just turned off. They're silent when you're just being fed content over and

[00:26:34] [SPEAKER_01]: over again. And when you see that, right, self-control is turned off, you are being fed

[00:26:41] [SPEAKER_01]: information and that's changing how you feel. That's changing what you believe.

[00:26:46] [SPEAKER_01]: We are being steered. And is there such a thing as self-determination in a world that's

[00:26:52] [SPEAKER_01]: increasingly connected to technology and a world where that steering will become much, much more

[00:26:57] [SPEAKER_01]: precise? If it's not just the crude interpretation of how many seconds or milliseconds you spent on

[00:27:04] [SPEAKER_01]: a video to try to make an interpretation of you, but literally a direct measurement of your

[00:27:09] [SPEAKER_01]: reaction to information and then you're in an immersive environment. You're not even in the

[00:27:13] [SPEAKER_01]: real world anymore where at least there are some things that are static and that immersive

[00:27:16] [SPEAKER_01]: environment can continuously change in response to your brain activity. It feels like we're

[00:27:21] [SPEAKER_01]: approaching the matrix pretty quickly. And so then I struggle with like, okay, well, like

[00:27:27] [SPEAKER_01]: is like, is there such a thing as self-determination in that world of self-actualization in the world

[00:27:32] [SPEAKER_01]: where there's true autonomy, even if it's relational autonomy, where there's true

[00:27:37] [SPEAKER_01]: competence, where we're exercising self-control and critical thinking skills, where we're

[00:27:41] [SPEAKER_01]: fostering relationships with each other and relatedness with ourselves and with other

[00:27:46] [SPEAKER_01]: people. I still believe there is. I think that we have cognitive liberty as a guiding principle

[00:27:50] [SPEAKER_01]: that points us to different ways of designing technology that align with self-determination

[00:27:57] [SPEAKER_03]: rather than align with human diminishment. I love the phrasing that you're using of

[00:28:02] [SPEAKER_03]: technologies steering us and really the question is who is at the steering wheel,

[00:28:07] [SPEAKER_03]: right? Like who controls the objective function. And so this future starts sounding rather

[00:28:13] [SPEAKER_03]: scary. So I want to zoom back out just a bit here and ask you, you know, you've been a

[00:28:20] [SPEAKER_03]: leader in the world of self-determination, right? And so you've been working to

[00:28:27] [SPEAKER_03]: encode cognitive liberty as a legal right. I'm curious, what would that look like in

[00:28:32] [SPEAKER_01]: regulation and how would it be enforced? First, I'll just emphasize, I think cognitive

[00:28:37] [SPEAKER_01]: liberty, to your point, it's systemic change, right? It's in part about encoding it into

[00:28:42] [SPEAKER_01]: law, but it's also about changing our incentives, changing, you know, economic

[00:28:46] [SPEAKER_01]: alignment with cognitive liberty, commercial design and redesign. And so I think that's

[00:28:50] [SPEAKER_01]: from a legal perspective, I think it starts with a human rights approach. And this is so that

[00:28:55] [SPEAKER_01]: there's a universal global norm around a right to cognitive liberty and recognizing that as an

[00:29:01] [SPEAKER_01]: organizing principle for how we interpret existing human rights law. It's recognizing

[00:29:05] [SPEAKER_01]: privacy includes a right explicitly to mental privacy, which also safeguards from

[00:29:10] [SPEAKER_01]: interference and interception with the automatic processes in our brain. And so much

[00:29:15] [SPEAKER_01]: of this technology is really about that. It's not about robust thought. It's about these

[00:29:19] [SPEAKER_01]: automatic ways that our brain reacts and interfering with them, hijacking them,

[00:29:23] [SPEAKER_01]: manipulating them. And then there's freedom of thought. As long as it's been a right,

[00:29:27] [SPEAKER_01]: it has been recognized as an absolute human right, but it's pretty narrowly constructed

[00:29:31] [SPEAKER_01]: because it's an absolute right. And so that's really around protecting the interception,

[00:29:37] [SPEAKER_01]: manipulation and punishment of thought. And here we're really in the mind reading realm,

[00:29:41] [SPEAKER_01]: right? Which is protecting this space of, you know, what we would really commonly in common

[00:29:47] [SPEAKER_01]: language think of as thought and images in our mind. That's the human rights perspective.

[00:29:53] [SPEAKER_01]: Any good critic of human rights law will say, well, that's only as good as like,

[00:29:57] [SPEAKER_01]: you know, whether people adhere to it and how it's implemented at a national level.

[00:30:02] [SPEAKER_01]: So I also think it's important to recognize what that looks like, you know, at nation state

[00:30:06] [SPEAKER_01]: levels. And part of that is doing things as part of privacy laws to, you know, create

[00:30:12] [SPEAKER_01]: robust rights around cognitive biometric data. Employees have a right to mental privacy,

[00:30:19] [SPEAKER_01]: giving children and students in the educational system a right to not be surveilled for their

[00:30:25] [SPEAKER_01]: mental activity and giving them that space, right? So it's taking these high principles

[00:30:30] [SPEAKER_01]: and then context by context creating robust laws that actually implement those high concepts

[00:30:37] [SPEAKER_01]: into law at different, you know, nation and nation state levels.

[00:30:42] [SPEAKER_03]: So where are we currently at there? I'm curious if there's been any notable progress on this subject

[00:30:47] [SPEAKER_01]: since your book came out. A lot. Yeah. So in the US,

[00:30:51] [SPEAKER_01]: there's been finally some momentum. I'd say I'm not that excited about what's happened here

[00:30:56] [SPEAKER_01]: yet, but I am appreciative of the fact that there have been conversations. So Colorado

[00:31:00] [SPEAKER_01]: passed a new law that provides some protections around neural data when it's used for

[00:31:06] [SPEAKER_01]: identification purposes. That's a very narrow subset, but it's something. California has a

[00:31:11] [SPEAKER_01]: broader law that's currently pending that would make it a sensitive category of data.

[00:31:16] [SPEAKER_01]: The Uniform Law Commission here in the United States, which has appointed commissioners from

[00:31:20] [SPEAKER_01]: every state, has just agreed to create a study committee that might lead to a drafting

[00:31:26] [SPEAKER_01]: committee to have model legislation for all of the states about how to protect this category

[00:31:31] [SPEAKER_01]: data around cognitive biometrics, which is exciting. That launches the summer. UNESCO has

[00:31:37] [SPEAKER_01]: had a major process underway. 194 member countries voted to move toward adopting a global standard

[00:31:43] [SPEAKER_01]: around the ethics of neurotechnology. The first draft of that was published in April.

[00:31:49] [SPEAKER_01]: I'm part of that process. So I was appointed by the US to that process and the co-chair

[00:31:54] [SPEAKER_01]: of the expert committee. And the second draft will come out at the end of August.

[00:31:58] [SPEAKER_01]: And then internationally, there's been a lot of conversations happening around this. The concept

[00:32:04] [SPEAKER_01]: of cognitive liberty, the concept of whether or not there need to be special rights,

[00:32:09] [SPEAKER_01]: broadening the conversation beyond neurotechnology to understand this is a

[00:32:12] [SPEAKER_01]: new class is moving forward. So it's encouraging, but what you find is trying to claw back

[00:32:17] [SPEAKER_01]: rights is much harder than from the get-go having a set of rights in place. So I think we

[00:32:27] [SPEAKER_01]: need to real change before these products are wide scale across society.

[00:32:33] [SPEAKER_03]: This technology is coming fast and furious. And as you've alluded to, and actually as you've

[00:32:37] [SPEAKER_03]: outlined, it's being infused into tech that we use every single day. People are just going to

[00:32:41] [SPEAKER_03]: go buy the next-gen AirPods. One question I have is for the listeners that are perhaps

[00:32:46] [SPEAKER_03]: being exposed to this idea of neurosurveillance and neurotechnology, what advice would you have

[00:32:52] [SPEAKER_03]: as they go about navigating the world and become exposed to these technologies,

[00:32:59] [SPEAKER_03]: both in their personal life but also in the workplace?

[00:33:01] [SPEAKER_01]: We still have time to make choices individually and collectively. We should demand that if

[00:33:08] [SPEAKER_01]: Apple launches EEG sensors, that they be very clear about what their data privacy policies are

[00:33:13] [SPEAKER_01]: with respect to that data. And they've done that with Apple Pro Vision, right? They've

[00:33:16] [SPEAKER_01]: said like, they've been incredibly specific to say, here's what's happening with the eye

[00:33:20] [SPEAKER_01]: tracking data. Here's what lives on device. Here are the inferences that leave the device.

[00:33:25] [SPEAKER_01]: That kind of transparency is great, and I applaud them for doing that. We should only

[00:33:30] [SPEAKER_01]: buy the products that are both transparent with respect to how they're thinking about each

[00:33:34] [SPEAKER_01]: stream of sensor data and not buy products from companies that do not offer those same

[00:33:39] [SPEAKER_01]: assurances. That's a space we have a right to protect and that we should demand as a set

[00:33:46] [SPEAKER_01]: of protections. And so advocate for the rights that are being called for, be part of the process

[00:33:52] [SPEAKER_01]: of advocating for UNESCO to move forward with this process for states, for countries to move

[00:33:56] [SPEAKER_01]: forward with a robust set of rights around cognitive liberty and be part of the change

[00:34:01] [SPEAKER_03]: that makes that happen. That's beautifully put. Thank you so much for joining us.

[00:34:06] [SPEAKER_03]: Thanks for having me. This tech is amazing. And it's not hyperbole to say that we're

[00:34:17] [SPEAKER_03]: mind reading here. One of the central benefits of this technology is gaining visibility into our

[00:34:23] [SPEAKER_03]: own minds, quantifying that endless stream of activity that really influences our day-to-day

[00:34:29] [SPEAKER_03]: perception. And through this new tech, what seems to be as opaque as emotional experiences

[00:34:35] [SPEAKER_03]: can be translated into measurable neurobiological patterns. This means that

[00:34:41] [SPEAKER_03]: we can better understand what's happening in our heads. And it also means that we can more

[00:34:46] [SPEAKER_03]: change what's happening in our heads. If you understand what's going on in a person's mind,

[00:34:51] [SPEAKER_03]: you can create a stimulus to influencing whether that's Orwellian workplace monitoring or the

[00:34:57] [SPEAKER_03]: most persuasive advertisement you've ever seen. It's giving away read, write access to the

[00:35:03] [SPEAKER_03]: core of who we are. So what happens to a society where there are no secrets?

[00:35:09] [SPEAKER_03]: We're all open books. Maybe this is an inevitable transition. We already share so

[00:35:14] [SPEAKER_03]: much of ourselves via social media and do so proactively. That said, we need to place a lot

[00:35:21] [SPEAKER_03]: more value in our personal data, data that we currently view to be innocuous and often sign

[00:35:27] [SPEAKER_03]: away without even thinking about it. But when it comes to tech that has both read and write

[00:35:32] [SPEAKER_03]: access into our minds, we have to be more proactive than we had been about social media.

[00:35:38] [SPEAKER_03]: And we have to value the data before it becomes table stakes to function in society.

[00:35:44] [SPEAKER_03]: After all, the stakes here are high. We're talking about our hearts,

[00:35:48] [SPEAKER_03]: minds and how we view the world. The TED AI show is a part of the TED Audio Collective

[00:35:57] [SPEAKER_03]: and is produced by TED with Cosmic Standard. Our producers are Ella Federer and Sarah McCray.

[00:36:04] [SPEAKER_03]: Our editors are Ben Ben Chang and Alejandra Salazar. Our showrunner is Ivana Tucker and

[00:36:09] [SPEAKER_03]: our associate producer is Ben Montoya. Our engineer is Asia Pilar Simpson. Our technical

[00:36:15] [SPEAKER_03]: director is Jacob Winnink and our executive producer is Eliza Smith.

[00:36:20] [SPEAKER_03]: Our fact checker is Christiana Parta and I'm your host, Bilal Vosadou. See y'all in the next one.