5 ethical principles for digitizing humanitarian aid | Aarathi Krishnan
TED Tech · December 06, 2024 · 17:53 · 16.39 MB

Over the last decade, humanitarian organizations have digitized many of their systems, from registering refugees with biometric IDs to transporting cargo via drones. This has helped deliver aid around the world, but it's also brought new risks to the people it's meant to protect. This week we're revisiting a talk by tech and human rights ethicist Aarathi Krishnan who points to the dangers of digitization — like sensitive data getting into the hands of the wrong people — and lays out five ethical principles to help inform humanitarian tech innovation. After the talk, our host Sherrell shares a practical way to assess the costs and benefits of digitizing aid using Krishnan's principles.

Learn more about our flagship conference happening this April at attend.ted.com/podcast


Hosted on Acast. See acast.com/privacy for more information.

[00:00:00] TED Audio Collective

[00:00:02] It is expensive to be poor. For example, if you're low-income and an unexpected setback strikes, like a natural disaster or a medical crisis, riding it out might be more than you can afford.

[00:00:24] And to receive institutional support, you might be required to pay with your privacy.

[00:00:30] Take, for instance, the growing practice of placing surveillance and biometric technologies in subsidized housing without prior notification or resident consent.

[00:00:44] Think cameras that use facial recognition technology for entry into units.

[00:00:50] Residents of Atlantic Plaza Towers, a rent-stabilized apartment complex in Brooklyn, pushed back against their landlord for installing these technologies, citing issues like the cameras' failure to correctly recognize women and people of color.

[00:01:05] And also, invasion of their personal privacy.

[00:01:10] This is an issue a lot of people are talking about.

[00:01:14] Last year, a bill was reintroduced in Congress that seeks to limit invasive surveillance technology practices in 1.2 million homes subsidized by the federal government.

[00:01:27] It's called the No Biometric Barriers to Housing Act of 2021.

[00:01:33] Long story short, the bill pushes for an end to surveillance within public housing to protect the privacy of residents and would make it illegal for landlords to use facial recognition devices on their properties.

[00:01:49] The bill would also require that the Department of Housing and Urban Development study the impact of tech on vulnerable communities in public housing, including tenant privacy, civil rights, and fair housing.

[00:02:05] It would also ask HUD to report to Congress on why the technology is being used, and on the demographic data of the tenants it is being used on.

[00:02:16] So, what happens when tech goes wrong?

[00:02:20] How does it impact people differently based on their socioeconomic status?

[00:02:25] Or their gender?

[00:02:27] Or their race?

[00:02:28] And who is responsible for making sure issues like the ones that arose at Brooklyn's Atlantic Plaza Towers never happen in the first place?

[00:02:38] I'm Sherrell Dorsey, and this is TED Tech.

[00:02:41] Today on the show, we'll be hearing from Aarathi Krishnan about how to weigh the promise of technology against the potential harms.

[00:02:50] Krishnan is a tech and human rights ethicist who has been a thought leader in the field of international humanitarian aid for over a decade.

[00:02:59] In her talk, she'll touch on the difference between tech and what she calls "dumb tech."

[00:03:06] And she'll offer some guidelines for how to implement the former while avoiding the latter.

[00:03:11] Be sure to stick around afterwards so we can take a deep dive into Krishnan's recommendations for how to evaluate the costs and benefits of digitizing aid.

[00:03:22] How will humans and machines work together in the future?

[00:03:34] We spend so much time discussing how the world's changing.

[00:03:38] It would be absolutely absurd to believe the role of the CEO is not.

[00:03:42] This is Imagine This, a podcast from BCG that helps CEOs consider possible futures for our world and their businesses.

[00:03:51] Listen wherever you get your podcasts.

[00:03:53] Add a little curiosity into your routine with TED Talks Daily, the podcast that brings you a new TED Talk every weekday.

[00:04:04] In less than 15 minutes a day, you'll go beyond the headlines and learn about the big ideas shaping your future.

[00:04:11] Coming up, how AI will change the way we communicate, how to be a better leader, and more.

[00:04:16] Listen to TED Talks Daily wherever you get your podcasts.

[00:04:21] Sociologist Zeynep Tufekci once said that history is full of massive examples of harm caused by people with great power who felt that, just because they had good intentions, they could not cause harm.

[00:04:41] In 2017, Rohingya refugees started to flee Myanmar into Bangladesh due to a crackdown by the Myanmar military, an act that the UN subsequently said was carried out with genocidal intent.

[00:04:56] As they started to arrive into camps, they had to register for a range of services.

[00:05:02] One of these was to register for a government-backed digital biometric identification card.

[00:05:08] They weren't actually given the option to opt out.

[00:05:13] In 2021, Human Rights Watch accused international humanitarian agencies of sharing improperly collected information about Rohingya refugees with the Myanmar government without appropriate consent.

[00:05:28] The information shared didn't just contain biometrics.

[00:05:32] It contained information about family makeup, relatives overseas, where they were originally from, sparking fears of retaliation by the Myanmar government.

[00:05:44] Some went into hiding.

[00:05:46] Targeted identification of persecuted peoples has long been a tactic of genocidal regimes.

[00:05:53] But now, that data is digitized, meaning it is faster to access, quicker to scale, and more readily accessible.

[00:06:01] This was a failure on a multitude of fronts.

[00:06:04] Institutional, governance, moral.

[00:06:09] I have spent 15 years of my career working in humanitarian aid from Rwanda to Afghanistan.

[00:06:15] What is humanitarian aid, you might ask?

[00:06:18] In its simplest terms, it's the provision of emergency care to those that need it the most, at desperate times, after a disaster or during a crisis.

[00:06:28] Food, water, shelter.

[00:06:30] I have worked within very large humanitarian organizations, from leading multi-country global programs

[00:06:38] to designing drone innovations for disaster management across small island states.

[00:06:45] I have sat with communities in the most fragile of contexts,

[00:06:52] where conversations about the future are the first ones they've ever had.

[00:06:56] And I have designed global strategies to prepare humanitarian organizations for these same futures.

[00:07:03] And the one thing I can say is that humanitarians,

[00:07:07] we have embraced digitalization at an incredible speed over the last decade,

[00:07:12] moving from tents and water cans, which we still use, by the way,

[00:07:17] to AI, big data, drones, biometrics.

[00:07:23] These might seem relevant, logical, needed, even sexy to technology enthusiasts.

[00:07:30] But what it actually is, is the deployment of untested technologies on vulnerable populations without appropriate consent.

[00:07:39] And this gives me pause.

[00:07:42] I pause because the agonies we are facing today as a global humanity didn't just happen overnight.

[00:07:50] They happened as a result of our shared history of colonialism.

[00:07:56] And humanitarian technology innovations are inherently colonial,

[00:08:00] often designed for and in the good of groups of people seen as outside of technology themselves.

[00:08:09] And often not legitimately recognized as being able to provide for their own solutions.

[00:08:15] And so as a humanitarian myself, I ask this question.

[00:08:19] In our quest to do good in the world,

[00:08:22] how can we ensure that we do not lock people into future harm,

[00:08:28] future indebtedness, and future inequity as a result of these actions?

[00:08:33] It is why I now study the ethics of humanitarian tech innovation.

[00:08:38] And this isn't just an intellectually curious pursuit.

[00:08:43] It's a deeply personal one, driven by the belief that it is often people that look like me,

[00:08:50] that come from the communities I come from, historically excluded and marginalized,

[00:08:55] that are often spoken on behalf of and denied voice in terms of the choices available to us for our future.

[00:09:04] As I stand here on the shoulders of all those that have come before me,

[00:09:09] and in obligation to all those that will come after me, to say to you

[00:09:14] that good intentions alone do not prevent harm,

[00:09:18] and good intentions alone can cause harm.

[00:09:23] I'm often asked, what do I see ahead of us in this 21st century?

[00:09:28] And if I had to sum it up: deep uncertainty, a dying planet, distrust, pain.

[00:09:37] And in times of great volatility, we as human beings, we yearn for a balm.

[00:09:43] And digital futures are exactly that, a balm.

[00:09:47] We look at it in awe of its possibility, as if it could soothe all that ails us,

[00:09:52] like a logical inevitability.

[00:09:56] In recent years, reports have started to flag the new types of risks that are emerging

[00:10:02] about technology innovations.

[00:10:05] One of these is around how data collected on vulnerable individuals

[00:10:10] can actually be used against them as retaliation, posing greater risk,

[00:10:16] not just against them, but against their families, against their community.

[00:10:22] We saw these risks become a truth with the Rohingya.

[00:10:26] And very, very recently, in August 2021, as Afghanistan fell to the Taliban,

[00:10:32] it also came to light that biometric data collected on Afghans

[00:10:37] by the U.S. military and the Afghan government,

[00:10:40] and used by a variety of actors,

[00:10:42] were now in the hands of the Taliban.

[00:10:45] Journalists' houses were searched.

[00:10:49] Afghans desperately raced against time to erase their digital history online.

[00:10:55] Technologies of empowerment then become technologies of disempowerment.

[00:11:01] It is because these technologies are designed on a certain set of societal assumptions,

[00:11:08] embedded in markets and then filtered through capitalist considerations.

[00:11:12] But technologies created in one context and then parachuted into another will always fail,

[00:11:20] because they are based on assumptions of how people lead their lives.

[00:11:25] And whilst here, you and I may be relatively comfortable providing a fingerprint scan

[00:11:31] to perhaps go to the movies,

[00:11:33] we cannot extrapolate that out to the level of safety one would feel

[00:11:39] while standing in line,

[00:11:41] having to give up that little bit of data about themselves

[00:11:44] in order to access food rations.

[00:11:48] Humanitarians assume that technology will liberate humanity,

[00:11:53] but without any due consideration of issues of power,

[00:12:00] exploitation and harm that can occur for this to happen.

[00:12:03] Instead, we rush to solutionizing,

[00:12:07] a form of magical thinking that assumes that just by deploying shiny solutions,

[00:12:12] we can solve the problem in front of us

[00:12:15] without any real analysis of underlying realities.

[00:12:20] These are tools at the end of the day,

[00:12:22] and tools like a chef's knife.

[00:12:25] In the hands of some,

[00:12:27] the creator of a beautiful meal,

[00:12:29] and in the hands of others,

[00:12:32] devastation.

[00:12:33] So how do we ensure that we do not design the inequities of our past

[00:12:40] into our digital futures?

[00:12:42] And I want to be clear about one thing.

[00:12:44] I'm not anti-tech.

[00:12:46] I am anti-dumb tech.

[00:12:54] The limited imaginings of the few should not colonize

[00:12:58] the radical reimaginings of the many.

[00:13:02] So how then do we ensure that we design an ethical baseline

[00:13:06] so that the liberation that this promises

[00:13:09] is not just for a privileged few,

[00:13:13] but for all of us?

[00:13:15] There are a few examples that can point to a way forward.

[00:13:19] I love the work of Indigenous AI,

[00:13:23] which, instead of drawing from Western values and philosophies,

[00:13:27] draws from indigenous protocols and values

[00:13:29] to embed into AI code.

[00:13:31] I also really love the work of Nia Tero,

[00:13:34] an indigenous co-led organization

[00:13:37] that works with indigenous communities

[00:13:38] to map their own well-being and territories,

[00:13:42] as opposed to other people coming in

[00:13:43] to do it on their behalf.

[00:13:46] I've learned a lot from the Satellite Sentinel project

[00:13:49] back in 2010,

[00:13:50] which is a slightly different example.

[00:13:52] The project started essentially to map atrocities

[00:13:58] through remote sensing technologies,

[00:14:00] satellites,

[00:14:01] in order to be able to predict

[00:14:03] and potentially prevent them.

[00:14:05] Now, the project wound down

[00:14:07] after a few years for a variety of reasons,

[00:14:11] one of which was that it couldn't actually generate action.

[00:14:14] But the second and probably the most important

[00:14:17] was that the team realized

[00:14:21] they were operating without an ethical net.

[00:14:24] And without ethical guidelines in place,

[00:14:27] it was a very wide-open question

[00:14:30] about whether what they were doing

[00:14:32] was helpful or harmful.

[00:14:35] And so they decided to wind down

[00:14:38] before creating harm.

[00:14:40] In the absence of legally binding ethical frameworks

[00:14:46] to guide our work,

[00:14:48] I've been working on a range of ethical principles

[00:14:51] to help inform humanitarian tech innovation.

[00:14:54] And I'd like to put forward a few of these here for you today.

[00:14:58] One, ask.

[00:15:00] Which groups of humans will be harmed by this and when?

[00:15:05] Assess.

[00:15:06] Who does this solution actually benefit?

[00:15:11] Interrogate.

[00:15:12] Was appropriate consent obtained from the end users?

[00:15:17] Consider. What must we gracefully exit out of

[00:15:22] to be fit for these futures?

[00:15:24] And imagine. What future good might we foreclose

[00:15:29] if we implemented this action today?

[00:15:33] We are accountable for the futures that we create.

[00:15:37] We cannot absolve ourselves of the responsibilities

[00:15:40] and accountabilities of our actions,

[00:15:43] if our actions actually cause harm

[00:15:46] to those that we purport to protect and serve.

[00:15:49] Another world is absolutely, radically possible.

[00:15:54] Thank you.

[00:15:59] Krishnan's talk raises the question:

[00:16:01] is aid always going to require

[00:16:04] that we give up something precious to us,

[00:16:06] like privacy?

[00:16:07] And if so,

[00:16:09] what does that say about our society?

[00:16:13] Something I'd guess the residents

[00:16:14] of Brooklyn's Atlantic Plaza Towers

[00:16:16] and other affordable housing units grapple with

[00:16:19] is deciding whether speaking up is worth the cost.

[00:16:23] Pushing back could mean real consequences,

[00:16:26] like losing access to aid

[00:16:27] you and your family desperately need,

[00:16:30] or even being evicted.

[00:16:31] I love how Krishnan refocused our attention

[00:16:34] on not shaming the progress of technology,

[00:16:37] but questioning its merit when she said,

[00:16:40] I am not anti-tech.

[00:16:42] I am anti-dumb tech.

[00:16:44] The limited imaginings of the few

[00:16:47] should not colonize

[00:16:48] the radical reimaginings of the many.

[00:16:51] So let's reimagine together for a second.

[00:16:54] What would it look like to apply Krishnan's rubric

[00:16:58] to challenge the power dynamics that exist today?

[00:17:02] Krishnan gives us an excellent rubric

[00:17:04] with which to assess the relationship

[00:17:06] between aid and privacy.

[00:17:09] And we don't have to be humanitarians

[00:17:11] by profession to follow this advice.

[00:17:15] First, ask. Who are the people

[00:17:19] this technology will affect the most?

[00:17:21] Who exactly are they

[00:17:23] and what do they really need?

[00:17:25] Second, assess.

[00:17:27] How do I meet those needs

[00:17:30] without infringing upon their privacy?

[00:17:32] Third, interrogate.

[00:17:35] Have I communicated clearly the intent

[00:17:38] and how the technology will be deployed?

[00:17:41] Have I provided enough opportunity for feedback?

[00:17:44] Fourth, consider.

[00:17:46] How has the communication and relationship

[00:17:50] with existing clients or aid recipients

[00:17:52] been in the past?

[00:17:54] What can be improved based on the feedback?

[00:17:58] And finally, imagine.

[00:18:01] Is this the best technology to deploy

[00:18:03] or are there more considered

[00:18:05] and better routes

[00:18:06] to accomplishing the same goal?

[00:18:09] Technology should not be foisted

[00:18:11] on people or groups

[00:18:13] without ensuring, first,

[00:18:15] the safety, security, and privacy

[00:18:18] of those whom it will most affect.

[00:18:25] TED Tech is part of the TED Audio Collective

[00:18:27] and is produced by TED

[00:18:29] in partnership with Transmitter Media.

[00:18:31] Our editor is Sammy Case

[00:18:33] and the show is fact-checked

[00:18:35] by Christiane Aparthe.

[00:18:37] I'm Sherrell Dorsey.

[00:18:39] Let's keep digging into the future.

[00:18:41] Join me next week for more.