In this podcast episode, Petra Molnar, a lawyer and anthropologist specializing in border technologies, makes the critical point that technology is far from neutral: it mirrors and perpetuates power dynamics in society, and it can do real harm to individuals when those dynamics are not carefully considered during the innovation process.
Replication of Power
Power Imbalance: Technology often mirrors and reinforces existing power imbalances within society. Those with more resources and influence can shape the development and deployment of technology, potentially leading to biases and discriminatory outcomes.
Impact on Marginalized Groups: The replication of power dynamics in technology can disproportionately affect marginalized groups, such as refugees, migrants, and individuals facing legal proceedings. These vulnerable populations may bear the brunt of technological solutions that prioritize control and surveillance over human rights and dignity.
Considerations for Innovation
Ethical Frameworks: Molnar suggested that considering ethical frameworks is a starting point for technologists. However, she cautioned that ethics alone may not be sufficient to address the complex societal implications of technology.
Regulation and Governance: While ethics provide a moral compass, regulatory frameworks are necessary to enforce accountability and ensure that technology serves the common good. Molnar highlighted the importance of robust regulations tailored to specific contexts to address the real-world impacts of technology.
Collaboration and Multidisciplinary Approach
Inclusive Innovation: Molnar stressed the need for collaboration between different disciplines and lived experiences in the innovation process. By engaging with diverse perspectives, including those directly impacted by technology, technologists can gain a deeper understanding of the potential consequences of their innovations.
Human-Centered Design: Taking a human-centered approach to technology development involves prioritizing the well-being and rights of individuals over technological advancement. By centering the needs and experiences of users, technologists can create solutions that empower and benefit society as a whole.
Supported by:
https://getinsync.ca/mspradio/
💼 All Our Sponsors
Support the vendors who support the show:
👉 https://businessof.tech/sponsors/
🚀 Join Business of Tech Plus
Get exclusive access to investigative reports, vendor analysis, leadership briefings, and more.
👉 https://businessof.tech/plus
🎧 Subscribe to the Business of Tech
Want the show on your favorite podcast app or prefer the written versions of each story?
📲 https://www.businessof.tech/subscribe
📰 Story Links & Sources
Looking for the links from today’s stories?
Every episode script — with full source links — is posted at:
🎙 Want to Be a Guest?
Pitch your story or appear on Business of Tech: Daily 10-Minute IT Services Insights:
💬 https://www.podmatch.com/hostdetailpreview/businessoftech
🔗 Follow Business of Tech
LinkedIn: https://www.linkedin.com/company/28908079
YouTube: https://youtube.com/mspradio
Bluesky: https://bsky.app/profile/businessof.tech
Instagram: https://www.instagram.com/mspradio
TikTok: https://www.tiktok.com/@businessoftech
Facebook: https://www.facebook.com/mspradionews
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
[00:00:02] This was rare to me because the story got me. In 2022, the U.S. Department of Homeland Security announced it was training robot dogs to help secure the U.S.-Mexico border against migrants. Those machines were equipped with cameras and sensors that would join a network of drones
[00:00:18] and automated surveillance towers. It was nicknamed the Smart Wall. It's part of a larger trend: as more people are displaced by war, economic instability, and a warming planet, countries are turning to AI-driven technology to manage that influx.
[00:00:32] So based on years of researching borderlands across the world, lawyer and anthropologist Petra Molnar's book, The Walls Have Eyes, is a global story. It's a dystopian vision turned reality where your body is your passport and matters of life and death are determined by algorithms.
[00:00:50] And she joins me to discuss that perspective and we get some lessons on what IT service providers can take away on this bonus episode of the Business of Tech. This episode sponsored by Skykick, new sponsor for MSP Radio.
[00:01:06] Skykick has been helping over 30,000 MSPs for the past 10 years be more successful in the cloud, migrating, protecting, securing and managing their Microsoft 365 customers. A highlight in their offerings is their Microsoft 365 data protection solution, Cloud Backup. They've recently enhanced it with a new feature called Smart Insights.
[00:01:26] This feature delivers visual insights, empowering partners to engage more efficiently with customers on Microsoft 365 data protection. And MSP Radio listeners get a special offer: get a free M365 email migration for a customer when you bundle it with backup. Visit skykick.com slash MSP Radio to learn details.
[00:01:48] Well, Petra, thanks for joining me today. Thank you so much for having me. So I'm really interested to have this conversation with you because one of the things I've been doing a lot on my show is talking about interesting AI use cases and the way technology is applied
[00:02:06] to make things better. And you've spent a lot of time looking at technology and how it's being used at the border. Tell me a little bit about how big the business of technology is in border control. Sure, I think that's a great place to start, Dave.
[00:02:21] And thanks for having me. I'm really looking forward to the conversation. You're right. Yeah, I've spent about six years now trying to understand how new technologies are playing out at the border and precisely how big of a business this really is, because we're talking
[00:02:34] about a billion dollar global industrial complex that is implicated in how technologies are developed and deployed. And what I find helpful sometimes is looking at it kind of from a temporal perspective, trying to understand what happens when a person is crossing borders, because at this point,
[00:02:53] your entire journey is impacted by tech. Stuff happens before you even move, you know, things like social media scraping, other types of data analytics. There's all sorts of things that are happening right at the border when it comes to surveillance,
[00:03:06] really experimental AI projects like robodogs or lie detectors and things like this. And then there's a class of technologies that also impact you once you're already in a country, things like voice reporting, different types of carceral technologies and immigration detention,
[00:03:20] and even some of the less sexy stuff like algorithmic decision making that happens under the hood that a lot of governments are experimenting with. So it's a really broad class of technologies. Yeah, and I would think that most of this starts from the perspective that most technology
[00:03:37] does, right? Like it probably starts from the perspective of, well, we've got a bunch of humans doing things. I bet if we had automation, it would help us. Like give me a little bit of a sense of the genesis of like how it went from human to accelerated
[00:03:52] technology for processing all this. Yeah, yeah. I think that's a great place to look into it because also for me personally, I used to be a practicing refugee lawyer and someone who was in court and representing people and
[00:04:04] really seeing firsthand how inefficient and problematic the system really is, right? We're talking about immigration systems that have huge backlogs that take forever. There's also real issues around how decisions get made and how discriminatory a lot of the processes are.
[00:04:21] So the impetus behind introducing technology into this space comes from there. A lot of governments say, well, we need more efficiency. We need more technology to help us with our processes.
[00:04:33] But the thing is, the system in and of itself has problems and you don't fix a broken system by putting a band-aid solution on it that actually exacerbates some of these issues. But that's really where it comes from, this kind of need to control or manage migration
[00:04:46] that a lot of states feel. And then you have this kind of ready-made solution sometimes presented to you by the private sector saying, hey, you know what you need? Instead of maybe training for officers or more resources for lawyers, we're going to
[00:04:58] give you a robodog or an AI lie detector. That's kind of the impetus behind it. Now, tell me a little bit about how this happens because there's a natural business inclination of, hey, we have solutions to solve problems, right?
[00:05:11] That literally is the space that we're dealing with right now is we are all technologists and we have solutions to problems. I think there's a certain bit of like, well, how are we missing the mark when we're trying
[00:05:22] to work with customers here and saying, okay, we're offering solutions. How are we missing the mark then to end up in a situation where we may be over engineering this or over automating this or losing the core piece when ultimately both sides are trying to solve the problem?
[00:05:40] Yeah, you know, I think, and not to get too philosophical, but if you permit me just for one second, I think it's about who gets to define what the problem even is. And right now people on the move, refugees, people who are stateless, they're seen as
[00:05:54] the problem to be managed, right? In quotes. And when that's the logic that animates what we innovate on and why, what we're really talking about is a massive power differential between the people who are on the move and
[00:06:06] also, you know, as an aside, exercising their internationally protected right to asylum. That's a right that we all have, right? We forget that sometimes. But again, it's a group that's oftentimes tested upon and seen as this kind of laboratory
[00:06:20] for experimentation when it comes to tech, because it can be right. There's fewer rights that are available to you at the border. Oftentimes you might not even have a lawyer present and it creates this kind of environment
[00:06:33] where different types of actors who are more powerful states, private sector are able to say, yes, you know what we need? We need this technical solution to a really complex problem instead of actually asking,
[00:06:43] well, what really is the problem and who gets to define it and how can we work together to ensure that the technology that's available is actually empowering people or making court processes more fair or just assisting with psychosocial support and a host of issues
[00:06:57] that exist when it comes to migration. So how is it that there are sort of little or no regulations? Because particularly, again, I spent a lot of time on this. We're distinctly behind, you know, particularly when we look at the US like compared against
[00:07:11] EU in terms of regulation and such. But that's not to say that we don't have a series of compliance laws that we have to follow. We've got a whole space of compliance.
[00:07:18] How is it that this sector has so few regulations in place to govern this kind of tech? Yeah, I think this is again bringing it back to the context of migration and borders is
[00:07:30] really important because, you know, forever those spaces have been this kind of lawless frontier kind of attitude, right, that's there and it allows for the experimentation around different policies and laws to happen.
[00:07:43] And when it comes to tech, right, there is not a lot of incentive to regulate a lot of this high risk technology because it is this kind of lucrative laboratory. You test stuff out at the border and then you can say, well, OK, you know, it works
[00:07:56] there. Let's implement it somewhere else. Now, I mentioned those robodogs. The Department of Homeland Security a few years ago proudly announced that they want to roll them out at the US-Mexico border. Well, and then just last year, the New York City Police Department said, you know what's
[00:08:09] a good idea? We're going to bring back these robodogs to the streets of New York to keep New York safe. I mean, it's interesting, right, because this stuff doesn't just stay at the border. It actually bleeds over into other spaces of public life.
[00:08:21] But we are seeing this massive gap in terms of governance and regulation. And if I put my lawyer hat on, it's a big problem, right? Because at the end of the day, we don't have a lot to pin responsibility to for when things go wrong.
[00:08:35] Yes, there's, of course, you know, privacy legislation and things like the General Data Protection Regulation or the European Union's new AI Act, which is coming into force shortly. That's a start for sure, but a lot of the law has to be much more context specific, kind
[00:08:51] of understanding what's actually happening on the ground and how it's hurting real people. And there really, I think at the end of the day, also needs to be a bit of a societal conversation about some no-go zones. Like are we really OK with robodogs in public spaces?
[00:09:05] Are we OK with facial recognition at the airport? If not, why not? I feel like we skipped a few steps in the kind of conversation that we need to have. So I think I'm totally on board with that.
[00:09:17] Now, one of the things I wanted to get a sense of is I want to get your take on something because one of the things I've been advocating for IT companies and solution providers to do to leverage work like what NIST is doing around ethical frameworks and the
[00:09:32] application of technologies to help keep them out of trouble. And I believe that there's actually work that can be done from IT companies, MSPs, IT solution providers to help advise companies of, hey, there are risk assessments and guidance that companies can implement.
[00:09:49] And by doing so, you'll actually end up in a better place with your implementations. What am I missing from that? Because obviously I think this is probably a more complicated bit. What am I missing from that basic premise?
[00:10:00] Because it feels like a really valuable service that we can offer to customers. Absolutely. I mean, I think that's the start. Thinking through the ethical and the moral framework that a company is built on and what it wants to innovate on and why.
[00:10:13] But I mean, again, I think as a lawyer, to me, that doesn't go far enough. We do need some hard lines on how this is manifesting in real time and in real life, because ethics are all well and good, but they're not enforceable.
[00:10:26] Right. And we see this kind of what some colleagues have called ethics washing, where companies have all sorts of ethical statements. There's, you know, consortia that say, oh, AI for good. And then we wring our hands and say, oh, OK, well, we really need to think about what
[00:10:38] this is doing. But at the end of the day, that is not enforceable. So I think we do need some stronger regulation. And also then maybe trying to get away from that kind of dichotomy of saying, well, regulation stifles innovation. I think that's way too simple.
[00:10:55] In fact, when you have regulation that allows you to actually think about the human impact of what you're doing and then work collectively also with those who are affected and impacted by the tech, you actually end up with a much healthier ecosystem
[00:11:07] of innovation. And, you know, you can come up with projects and technologies that are actually beneficial to people. Yeah, I would agree that I think saying regulation stifles innovation is oversimplifying because we have too many cases actually where regulation has driven innovation because
[00:11:25] either every time we've looked at antitrust historically, we've actually seen more innovation come from that all the way back to the railroads, the oil companies and Microsoft in the 90s, a whole bunch of Internet companies. So I think it's oversimplification.
[00:11:41] I think there is a space to be talked about if we want to make sure that we have space for that innovation. But that's a different statement. Now, the thing that I wanted to sort of get a sense of is, like, how are these
[00:11:57] sectors that are so unregulated kind of driving the rest of the conversation? Give me a little sense into what your research has shown in this space. Yeah, I mean, that's one of the more kind of disturbing parts of my work, seeing just
[00:12:12] how kind of broad the overreach of the private sector and the really powerful actors really is. You know, I had a chance to go to a variety of different conferences, for example, that are, you know, for the purposes of connecting the private sector with
[00:12:25] government, governments and kind of presenting the tools and the solutions that they're innovating as, again, like the viable tools to help you with migration. Things like the World Border Security Congress and others.
[00:12:36] And it is fascinating, you know, because you go into these halls and all of a sudden you see all these companies selling their wares. And there's this kind of conflation between military tech, things like tanks and machine guns with border tech, facial recognition, robodogs and things like that.
[00:12:52] And it is really interesting and troubling that that conflation is just being made without critically analyzing why. And one of the reasons why is because, again, it's big business and you don't see human rights lawyers there. You don't see impacted communities.
[00:13:07] Even media has been having a hard time getting into these spaces because, again, they're seen as this kind of bilateral space where a lot of deals get made, a lot of money changing hands. And again, that kind of sets up what we innovate on and why.
[00:13:22] And I think what's disturbing, too, is, again, who gets to kind of set the norms of what we innovate on? Why are we developing AI lie detectors to test out on refugees when we could be using AI to root out racism at the border?
[00:13:36] Right. Like that's a choice. That's a normative choice that some set of powerful actors is making. And we need to pay attention to that and how power operates in this space. Give me a little bit more insight into that, because you said something really interesting
[00:13:48] there, because AI and racism is an issue that comes up and you actually very literally just said we could use that technology to do it. Give me your take on that balance between, you know, is AI racist or can it help solve
[00:14:03] it? Give me a little bit of your insights. I think it's both right. I mean, it's a tool that's developed by a society in a world that is racist and discriminatory against people on the move and people who are crossing borders.
[00:14:14] And so, you know, I think on face value, makes sense that it would be replicating or maybe even creating biases that are already present in our world as it exists. But that doesn't mean that we can't kind of push for different types of models and
[00:14:28] different types of thinking when it comes to this kind of technology. But again, I think it ultimately comes down to who makes the decision of what the priority is. And if it's a powerful actor that's saying we need to control borders, we
[00:14:40] need to put more surveillance, more AI at the border for the purposes of enforcement, that's very different than saying, OK, we know that there are complex problems in terms of how decisions get made. Also very discretionary in the immigration space.
[00:14:54] Right. You can have two officers look at the exact same set of evidence and render two completely different yet equally legally valid decisions. That's just how the system is built. So if that's the kind of premise that we're starting on, you know, are there ways that
[00:15:08] we can actually slow it down and say, well, these are the real problems and maybe this is a way that technology or new ways of thinking can help us. But it can't happen again without the involvement of those who are impacted and
[00:15:20] affected and also those who have training in human rights law and ethics and sociology and anthropology. I mean, again, we talk in silos in these spaces so much. That's why I think these conversations are really important, because we need to find
[00:15:32] a way to come to the table and then like really look at what's happening and maybe also build new tables altogether, right? Different ways of thinking. So I want to ask you something. I'm a technologist, right?
[00:15:45] I spend a lot of time thinking about the way to apply technology to solve customer problems. You're someone who's actually looked at this system from the other side. What is the one thing about this system that you would want technologists to know
[00:15:58] and consider as they're thinking about solving customer problems? Like, what do we need to know about this kind of system? Yeah, I think, you know, for me, it ultimately goes back to the kind of foundational idea that technology is not neutral, right? It replicates power in society.
[00:16:13] And it's unfortunately given the vast differential in power, it hurts people if you don't think about how you innovate and why you innovate and what you're actually doing. That's why building these bridges between different disciplines, different lived experience is super important because I get it.
[00:16:28] I mean, if you're in a space of innovation and you know, you have a lot of money to do this and you maybe have never met a refugee or someone who is facing a court proceeding,
[00:16:36] maybe as a result of predictive policing or is a victim of a discriminatory welfare algorithm, there's a huge divide right between lived experience. So I think that's part of it. We really need to think about what is this doing in real time and why.
[00:16:50] Well, for those of us delivering services, understanding customer pain points and getting into the system remains the key execution piece. Petra Molnar is a lawyer and anthropologist specializing in border technologies, and her book is called The Walls Have Eyes, Surviving Migration in the Age of Artificial
[00:17:08] Intelligence. She runs the Refugee Law Lab at York University and is a faculty associate at Harvard's Berkman Klein Center for Internet and Society, and she joins me today from New York. Petra, thanks for joining me today. Thank you so much, Dave.
[00:18:18] The Business of Tech is written and produced by me, Dave Sobel, under ethics guidelines posted at businessof.tech. If you like the content, please make sure to hit that like button and follow or
[00:18:33] subscribe. It's free and easy and the best way to support the show and help us grow. You can also check out our Patreon where you can join the business of tech community at patreon.com slash MSP radio or buy our Why Do We Care merch at businessof.tech.
[00:18:52] Finally, if you're interested in advertising on the show, visit MSP radio dot com slash engage. Once again, thanks for listening to me. I will talk to you again on our next episode of the Business of Tech. Part of the MSP radio network.

