In this episode of The Business of Tech, host Dave Sobel engages in a thought-provoking discussion with Aaron Painter, CEO of NameTag, about the evolving landscape of identity verification in the face of emerging threats like deep fakes. The conversation begins with an exploration of traditional identity verification methods, particularly multi-factor authentication (MFA), and their limitations. Painter emphasizes that while MFA is a critical first step in securing accounts, it is not sufficient on its own, especially when recovery processes can be exploited by malicious actors impersonating legitimate users.
As the dialogue progresses, Painter introduces the concept of behavioral biometrics and AI-powered verification as potential advancements in security. He explains how these technologies can monitor user behavior to determine if the person accessing an account is indeed the same individual who logged in. However, he also highlights the challenges associated with these methods, particularly in scenarios where identity verification is crucial, such as during account recovery or hiring processes. The discussion underscores the need for more robust identity verification practices that go beyond traditional methods.
The episode also delves into the growing adoption of passkeys as a solution for identity verification. Painter notes that while passkeys are gaining traction, particularly among larger tech companies, their implementation in small to mid-sized businesses remains limited. He points out that passkeys are not a complete replacement for passwords and still rely on traditional recovery methods, which can create vulnerabilities. This leads to a broader conversation about the importance of understanding the limitations of current technologies and the need for a comprehensive approach to identity security.
Finally, the conversation shifts to the implications of deep fakes on identity verification. Painter warns that the technology enabling deep fakes has become increasingly accessible, posing significant risks for organizations relying on visual verification methods. He advocates for a more advanced approach to identity verification that combines AI, cryptography, and biometrics to create a stronger defense against impersonation. The episode concludes with a call to action for organizations to rethink their identity verification processes and adapt to the evolving threat landscape, emphasizing the importance of knowing who users are in a digital world.
💼 All Our Sponsors
Support the vendors who support the show:
👉 https://businessof.tech/sponsors/
🚀 Join Business of Tech Plus
Get exclusive access to investigative reports, vendor analysis, leadership briefings, and more.
👉 https://businessof.tech/plus
🎧 Subscribe to the Business of Tech
Want the show on your favorite podcast app or prefer the written versions of each story?
📲 https://www.businessof.tech/subscribe
📰 Story Links & Sources
Looking for the links from today’s stories?
Every episode script — with full source links — is posted at:
🎙 Want to Be a Guest?
Pitch your story or appear on Business of Tech: Daily 10-Minute IT Services Insights:
💬 https://www.podmatch.com/hostdetailpreview/businessoftech
🔗 Follow Business of Tech
LinkedIn: https://www.linkedin.com/company/28908079
YouTube: https://youtube.com/mspradio
Bluesky: https://bsky.app/profile/businessof.tech
Instagram: https://www.instagram.com/mspradio
TikTok: https://www.tiktok.com/@businessoftech
Facebook: https://www.facebook.com/mspradionews
Hosted by Simplecast, an AdsWizz company. See pcm.adswizz.com for information about our collection and use of personal data for advertising.
[00:00:02] Well, we know multi-factor authentication is required, but what's beyond that? Are passkeys the answer? I thought so. Maybe I'm totally wrong. I talked to Aaron Painter, who's CEO of NameTag, and we talk about identity and what it means in the future, particularly as we think about deep fakes, on this bonus episode of The Business of Tech.
[00:00:24] Well, Aaron, thanks for joining me today.
[00:00:27] Oh, it's a privilege to be here, Dave. I'm a big fan of the show.
[00:00:29] Well, thank you. I'm excited to have you because in particular, as I was briefing for this, you've got a particularly strong viewpoint that traditional methods of identity verification are going to be obsolete due to the rise of Deepfakes.
[00:00:43] So I really wanted to learn from you, like, what's the state of identity verification right now as it relates to like IT security? And why is all the traditional stuff now becoming obsolete?
[00:00:54] Yeah, I'm a huge fan of identity security. And today, the standard that most people think about when you say, how do I protect access to an account is traditionally multi-factor authentication or MFA.
[00:01:07] And MFA is one of those things where the same word can mean many different things, and there are different types and flavors of MFA.
[00:01:14] You know, sending a text message to someone's phone is considered a lighter weight form of MFA.
[00:01:19] A more sophisticated way today is using something that's hardware-backed, like a YubiKey, or maybe an authenticator app that you might have downloaded on your phone with rotating PIN codes.
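The rotating codes mentioned here are typically TOTP, the standard behind most authenticator apps (RFC 6238). As a stdlib-only sketch of how a code is derived from a shared secret and the current time (the secret shown is the RFC test value, not a real credential):

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, t=None, digits=6, step=30):
    """Derive the current rotating code from a shared Base32 secret (RFC 6238)."""
    key = base64.b32decode(secret_b32)
    counter = int((time.time() if t is None else t) // step)  # 30-second window index
    msg = struct.pack(">Q", counter)                          # counter as 8 big-endian bytes
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                                # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test secret ("12345678901234567890" in Base32), purely illustrative.
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ"))
```

Because the server derives the same code independently, nothing secret crosses the wire at login time, which is what makes it stronger than SMS; it still says nothing about who performs a reset, which is Painter's point below.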
[00:01:29] The challenge that exists is that as necessary as MFA is, it is not sufficient to protect accounts because MFA is really only as good as the reset or the recovery process.
[00:01:43] Which means if it's easy for someone to call and say that they are you and that they are locked out of their account, then all the effort you put into making it a technologically secure method suddenly relies on someone at a help desk asking security questions, or possibly a few other methods, to try and determine if you are the rightful account owner and if they should reset access, maybe when you're locked out or when you've got a new phone.
[00:02:07] But more importantly, if someone is pretending to be you, they can claim that they are locked out or they got a new phone, and therefore they need their access reset.
[00:02:14] And that has become the leading way in which people take over an account and then deposit ransomware and cause harm.
[00:02:22] Okay, so for listeners, I'm going to put aside the statement, because I know Aaron and I will both agree on this: look, you've got to do the basics right.
[00:02:30] If you're not doing something at this level, then this conversation we're having is irrelevant.
[00:02:34] So you've got to do the basics.
[00:02:36] Now, let's assume we're doing some of the basics and that you are good about multi-factor authentication.
[00:02:40] And now you're thinking of, okay, I need to go beyond that.
[00:02:44] Tell me a little bit about what behavioral biometrics and AI-powered verification mean to the evolution of that higher level of security.
[00:02:55] How is that different from what we're doing now with good, and I'm putting good in quotes, multi-factor authentication?
[00:03:04] So MFA is, again, as we're aligned, a critical first step.
[00:03:08] The concept of behavioral biometrics often comes into play when you think, how can I use different signals to monitor?
[00:03:13] Is this the same person, let's say, who logged in or authenticated?
[00:03:17] So imagine you're working in a call center or you're in a hospital scenario where maybe a nurse is taking notes.
[00:03:23] So let's say, well, a medical interaction is going on.
[00:03:25] Maybe they've logged into their account.
[00:03:27] Hopefully, they've used some kind of credentials.
[00:03:29] Hopefully, they've used some kind of MFA.
[00:03:31] And then what if they were to walk away from the keyboard?
[00:03:33] The idea of behavioral biometrics is to say, hey, is the typing pattern, for example, one form of behavioral biometrics, the same?
[00:03:40] Is someone typing the same way as they were when they first logged in?
[00:03:43] You know, has the speed of typing changed?
[00:03:45] Has the way they're using the keyboard changed?
[00:03:47] You know, in a visual sense, sometimes we look at gait detection, the way someone walks, for example.
[00:03:53] Gosh, is the way they walk different than another person?
[00:03:55] Those are signs of using behavior to say, is this the same person as the one who logged in?
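As a toy illustration of the typing-pattern idea, one could compare a session's inter-keystroke timing against an enrolled baseline; the function names and the 3-sigma threshold below are illustrative assumptions, not any vendor's method:

```python
import statistics

def cadence_profile(intervals):
    """Summarize inter-keystroke gaps (in seconds) as mean and standard deviation."""
    return statistics.mean(intervals), statistics.stdev(intervals)

def looks_like_same_typist(baseline, current, z_threshold=3.0):
    """Flag a session when current typing speed drifts far from the enrolled baseline."""
    mu, sigma = cadence_profile(baseline)
    z = abs(statistics.mean(current) - mu) / max(sigma, 1e-9)  # guard zero variance
    return z < z_threshold
```

Real products combine many such signals (dwell time, key pairs, mouse dynamics), but even a rich version only answers "is this the same typist as before," not "which human is this," which is the limitation discussed next.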
[00:04:02] The challenge with those often is that you don't know which human that is behind the screen.
[00:04:08] So you might know it's the same person at the keyboard as the one who logged in.
[00:04:11] But we're not necessarily sure that the person who logged in is who we expect them to be.
[00:04:17] And that's what we're seeing happen, particularly at this moment of recovery.
[00:04:20] But actually, really right now, we're seeing it widely taken advantage of at the moment of hiring.
[00:04:25] Where you've got foreign actors, bad actors, adversaries, you know, at the national stage level, applying for jobs in companies of all sizes.
[00:04:34] And they might not be who they claim to be.
[00:04:37] They're often either impersonating someone that they want to be, right?
[00:04:41] Or they are inviting others to impersonate them, sort of outsource their job.
[00:04:45] And so those become scenarios where behavioral biometrics or MFA aren't sufficient, because you haven't set up the account for the right person, or you're not recovering the account for the right person when someone claims to be locked out.
[00:04:57] So then all your efforts at kind of focusing on that login or that authentication stage aren't sufficient.
[00:05:04] Okay.
[00:05:04] So we've talked a lot now so far about stuff that isn't sufficient.
[00:05:07] What do we need to be moving towards doing?
[00:05:11] The key thing for all of us is to understand this and to understand how this happens so that you can apply your creativity for the businesses of all different sizes maybe that you're working with.
[00:05:21] And there may be different levels of sophistication, different levels of an approach.
[00:05:24] Let's say you're a largely in-person organization or you're supporting a client who's largely in-person and on-site.
[00:05:30] You know what?
[00:05:31] Requiring people, let's say, who are new hires or onboarding or who are otherwise locked out of their account to come in person to a security or to an IT desk is actually still a good method.
[00:05:42] Now, that's not practical for everyone, particularly in our remote workforce world.
[00:05:46] But that concept is something that works.
[00:05:49] So let's work backwards from there.
[00:05:50] That's effective.
[00:05:51] Sending a text message to someone's phone or using the default tools that might exist in, you know, Entra or Duo or Okta are not sufficient because they rely on things like SMS verification.
[00:06:03] Another method that's become increasingly popular, and it became popular about nine months, almost a year ago now, you know, it was last August.
[00:06:11] So after the MGM breach, and poor MGM, they've sort of become the poster child for this type of breach.
[00:06:18] You know, a bad actor called the employee help desk at MGM.
[00:06:22] You know, eight minutes later, the help desk rep did their best effort, but reset access for that bad actor.
[00:06:28] And the account had privileged access credentials.
[00:06:31] You know, ransomware was deposited, and MGM was offline for two weeks.
[00:06:36] Okta, who was helping support MGM's login efforts at that point, through their CISO, David Bradbury, came out and advised companies to think about something called visual verification.
[00:06:47] And visual verification is a lot like what you would do in person if someone came to that help desk.
[00:06:52] But it's launching a video call, Teams or Zoom, whatever it might be, and asking that person to show up on video, maybe hold up a form of government issued ID, maybe ask them questions.
[00:07:02] And it was a way to make it sort of more real than things like sending SMS.
[00:07:06] So that became sort of the default guidance, call it last summer.
[00:07:10] But in recent months, that's been called into question.
[00:07:14] Well, right, because I mean, I want to layer in there deep fakes.
[00:07:17] But before I get to deep fakes, I want to ask one more question.
[00:07:19] It feels like, from my perspective on the consumer side, we're heavily pushing passkeys now as a solution.
[00:07:27] I'm seeing significantly less SMB and mid-market adoption of that.
[00:07:32] I think we're starting to see it in enterprise, but I'm not convinced that that's happening a lot.
[00:07:38] What's your take on how passkeys fit into this whole question?
[00:07:42] I think you're exactly right on both fronts.
[00:07:44] Passkeys are growing and you're seeing them more, and you're not seeing as much in SMB or mid-market.
[00:07:49] And part of the reason why is that the initiative behind the technology decision for passkeys was driven by some of the larger tech companies.
[00:07:56] And so those tech companies have been some of the early adopters of the standard that they helped create and write.
[00:08:02] And so, you know, it's easier for Microsoft and Google to push that out because they've been writing the standard.
[00:08:07] Then when you start thinking about implementing that as a smaller size organization, you need other tools or you need kind of prepackaged solutions and products to help do that.
[00:08:16] The other complexity with passkeys is that it is a great idea in theory, but really it's not a replacement for the password.
[00:08:25] It is not true passwordless.
[00:08:27] It allows you to log in faster, but you often still need a password.
[00:08:33] When you create a new account, it says create a password, and then, would you like to enroll in passkeys?
[00:08:38] And so you haven't actually gotten rid of the concept of needing to store or manage or secure the actual password.
[00:08:45] So it unfortunately creates that same vulnerability, because, you know, that password can be compromised, and someone can log in without the passkey infrastructure and still take over an account.
[00:08:55] And the other big issue that arises often with passkeys is things like switching devices.
[00:09:00] You know, if you operate in a Mac ecosystem and then, you know, maybe you have an iPhone and you have a Windows PC, passkeys don't work well in those scenarios.
[00:09:08] And they don't secure onboarding and they don't secure recovery.
[00:09:12] So you don't necessarily know who's enrolling in that passkey.
[00:09:16] Let's say they're a new user, and you don't necessarily know who the person is if they are, for whatever reason, locked out of their passkey.
[00:09:23] So it's a useful tool to make logins a bit more express, particularly on the consumer side where we see it more often.
[00:09:28] But it has a lot of limitations still.
[00:09:31] And so for that reason, it's something I would think very carefully about if I were deciding to implement that for a small or medium sized business.
[00:09:38] Okay, interesting.
[00:09:39] Because I would feel like it's a first step and I don't want to spend too much time here.
[00:09:43] But I would feel like this is a step, because the obvious next step when you put a passkey infrastructure in place is to then turn off passwords, right?
[00:09:51] And make it a requirement versus a transitional technology.
[00:09:54] Am I misreading this?
[00:09:56] Yeah, the passkey architecture today isn't really for getting rid of passwords.
[00:10:01] And because it still breaks down, unfortunately, you need a recovery method. Which is, again, if I exist in an iCloud infrastructure and I log into my PC, I might not have a passkey set up yet.
[00:10:11] So I actually need to log in with that password before I can even configure the passkey.
[00:10:15] So the way that passkeys are designed still very much relies on the password existing to cover some of those fringe scenarios.
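The passkey login discussed above is, at its core, a challenge-response: the server sends a random challenge and the device proves possession of an enrolled key. Below is a toy sketch of that shape only; real passkeys (WebAuthn/FIDO2) use hardware-backed asymmetric signatures where only the public key leaves the device, whereas this stdlib-only stand-in uses a shared HMAC secret, and every name here is hypothetical:

```python
import hashlib
import hmac
import secrets

class ToyAuthenticator:
    """Toy stand-in for a passkey authenticator: holds a key, signs server challenges."""

    def __init__(self):
        # Real passkeys keep an asymmetric private key in hardware; a shared
        # secret is used here only to keep the sketch standard-library-only.
        self._key = secrets.token_bytes(32)

    def register(self):
        # Real WebAuthn registration returns only the PUBLIC key; returning the
        # secret is part of the toy simplification.
        return self._key

    def sign(self, challenge):
        return hmac.new(self._key, challenge, hashlib.sha256).digest()

def server_verify(stored_key, challenge, signature):
    """Server-side check that the response came from the enrolled key."""
    expected = hmac.new(stored_key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, signature)
```

Note what even the real version proves: possession of the enrolled key, not who enrolled it or who is asking for recovery, which is exactly the onboarding and recovery gap Painter points to.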
[00:10:22] Okay, gotcha.
[00:10:23] Important to know.
[00:10:24] So then now let's transition a little bit into what we're looking at with the layered complexity of deep fakes.
[00:10:30] Like what are we adding now to the problem with essentially the ability to create deep fakes nearly instantaneously with the technologies that are available at a consumer level?
[00:10:40] Like what more do we have to protect against now?
[00:10:43] You're spot on.
[00:10:44] So deep fakes, for everyone, you know, just to educate and baseline: deep fakes are a term that started roughly in 2017, 2018.
[00:10:50] It was the username of a Reddit user who was talking about creating essentially adult content, imitating someone else, changing the faces and bodies of, you know, known celebrities.
[00:11:01] And the term sort of caught on.
[00:11:02] And obviously the technology around generative AI has really accelerated since then.
[00:11:06] And so deep fakes often refer to impersonating the likeness of someone else.
[00:11:12] And that could be a voice impersonation, which, you know, five years ago meant reading a 20-minute script of set words so a machine could train on your voice and maybe replicate it.
[00:11:24] And Microsoft researchers have now proven that can be done with three seconds of someone's audio: a voice clip, an Instagram post, you name it, let alone a podcast.
[00:11:32] And the other form comes from sort of deeper video representation.
[00:11:36] And that video can be pre-made, or it can be a real-time sort of emulator, where you've trained a system and can then conduct, you know, a real-time interview, a real-time video call, pretending to be someone that you're not.
[00:11:49] And so the risk is that these tools have become so prevalent that bad actors are using them, particularly when we think of the term as social engineering.
[00:11:58] So now let's imagine, let's go back to an early scenario where you're locked out of your account or someone wants to pretend that they are you locked out of an account.
[00:12:06] And suddenly they can call a help desk.
[00:12:08] Maybe you're a smaller organization and you say, hey, I know all my employees.
[00:12:11] I know all my users.
[00:12:12] Or, you know, I support a few customers in my MSP and I know who normally calls me.
[00:12:17] But suddenly someone's replicating their voice and they are purposely trying to deceive you.
[00:12:22] That's a very real threat today.
[00:12:23] Or let's say you follow that advice maybe from Okta last summer and you have a visual verification.
[00:12:28] You set up a Teams call, a Zoom call, and you're going to, you know, interview that person, so to speak, before you reset their access.
[00:12:35] Well, now you don't really know if the person on that video call is the real human or if they're running a deep fake emulator because the platforms, Teams and Zoom, for example, weren't meant to prevent against this.
[00:12:47] As easy as it is to select a different camera or microphone in a Zoom call, it is as easy to select a piece of software that is running a live real-time emulator of a deep fake software.
[00:13:01] And so you can now get on that call and really impersonate someone in a way that was never intended and certainly not in a way that we can really trust to do these kind of visual verifications in a high assurance way.
[00:13:12] Okay.
[00:13:13] Okay.
[00:13:14] So it appropriately freaked me out.
[00:13:15] And it's basically because, mind you, listeners, as a podcaster and YouTuber, there is lots of video and audio of someone like me available.
[00:13:24] So I would be exactly the kind of candidate who might get emulated.
[00:13:29] Knowing there is no magic wand and knowing there is no one single thing that you can tell people to do.
[00:13:35] Give some directional guidance on the kinds of things that providers should be thinking about addressing to get this solved.
[00:13:45] Like what do they need to start working on to address this problem?
[00:13:49] Yeah.
[00:13:50] One of the really important things is I would argue to think about the concept of identity verification.
[00:13:55] Okay.
[00:13:55] I think across the internet, we have a greater responsibility to know who a user is.
[00:14:00] It doesn't mean they can't operate with an alias or a pseudonym or a username, but the platform or in a lot of our cases, the employer should know who that person really is.
[00:14:11] And so the idea of doing identity verification in person or in a remote way that's highly secure is critical when you're trying to protect accounts.
[00:14:21] And so that means, let's say someone's calling your MSP and wants to make a change or wants to onboard.
[00:14:25] You have a new hire into your infrastructure that you want to be a client or one of your own employees that you want to support.
[00:14:32] Really thinking about who are they?
[00:14:34] Do I have a way to confirm their identity from one or multiple channels before, let's say, I onboard them or I bring them onto my infrastructure or before I make the change they might request?
[00:14:44] That concept of verifying identity needs to go to a more advanced level and can no longer be these nonsensical security questions.
[00:14:52] Right.
[00:14:52] The awareness on this will allow all of us to sort of be more creative, I believe, if we're aware of the problem.
[00:14:59] So essentially, we really want to start with a good process implementation.
[00:15:02] It can be as simple as: I will have a secret word with my spouse, so that if I call her with something like,
[00:15:13] maybe I need you to move $10,000 right now, she should ask, hey, what's the code word?
[00:15:20] It's a process problem much more than a technology one.
[00:15:24] Am I reading this right?
[00:15:25] Yeah.
[00:15:25] Often, you know, sometimes it is tooling as well.
[00:15:28] But, you know, especially when you bring it to your personal life, you know, I often think a lot about the person, the channel, the context.
[00:15:34] You know, hey, do I know this person?
[00:15:36] For example, are they going to message me randomly from a large WhatsApp group that I'm in?
[00:15:41] Channel, in that case, I might not trust as much.
[00:15:43] Is it a one-on-one conversation I've been having with someone I know well for a long time?
[00:15:47] I might know the person.
[00:15:48] I might know the channel.
[00:15:49] But then is the context right?
[00:15:51] You know, are they suddenly asking me for something that seems out of ordinary or uncharacteristic?
[00:15:56] Things like that might cause natural flags or skepticism, to say, you know, that perhaps this person isn't who they're purporting to be.
[00:16:04] Gotcha.
[00:16:04] Well, the last sort of question I want to get some insight into is, so how do the regulatory frameworks need to change here?
[00:16:10] Because it feels like this is an area where government is part of the problem and part of the solution, too, because they do some level of identity issuing.
[00:16:20] They have a component in this.
[00:16:22] How does the regulatory framework need to change?
[00:16:25] I believe firmly that we need to start to think about the reset and the onboarding process as critically as MFA itself.
[00:16:33] And, you know, there's the White House directives in implementing MFA.
[00:16:36] There have been a variety of other standards that suggest or require it now.
[00:16:39] You know, cyber insurance policies and premiums often require that.
[00:16:43] But the breaches that we've seen overwhelmingly, particularly in the last 12 months, you know, over 90 percent of cyber attacks are identity related.
[00:16:51] And the easiest, the lowest hanging fruit right now is to impersonate someone, compromise their credentials and take over their account.
[00:16:58] And so the frameworks across all boards, regulatory and commercial, need to evolve to take that into account.
[00:17:05] And frankly, nowhere is that even more relevant today than in health care.
[00:17:09] You know, I think we've seen over the last nine months in particular, almost every major regulatory body in the U.S., from Health and Human Services to Homeland Security to the FBI, come out and say, hey, in the U.S.
[00:17:20] in particular, health care is under attack.
[00:17:22] The health care help desk is literally under attack because on the dark web, the value of protected health information and personal information exceeds that of your social security number and your credit card these days.
[00:17:34] People want access to that data because it doesn't change.
[00:17:37] Right. It can't be changed.
[00:17:39] And if you have that level of insight on a person, you can use GenAI
[00:17:43] to target in a much more precise way: targeted phishing emails, targeted conversations where you know something deeply personal, and you can use it to extract other information and elevate the threat matrix.
[00:17:55] OK, so you brought it up.
[00:17:56] But before I wrap: is biometrics the right way to go, given that it is unchangeable?
[00:18:02] You know, are we going down the right direction for this, or should we be thinking about other ways of implementing identity verification?
[00:18:10] You know, the basis of where I spend my time, my team, and the products that we create is that biometrics is an important component.
[00:18:16] But alone, it's not sufficient.
[00:18:18] Just like AI is.
[00:18:19] You know, I don't believe in deep fake detection.
[00:18:22] I think that's an AI arms race.
[00:18:24] You know, is the bad person's AI model better than the good person's?
[00:18:28] Somebody will always be slightly ahead, likely the bad actor.
[00:18:31] But I think you can combine AI, cryptography, and biometrics. Cryptography is something that's been powering a lot of our financial transactions,
[00:18:40] and the Internet, for many years, decades now. For example, on mobile devices, there's the idea of verifying someone's identity using the device itself.
[00:18:50] You get to use the three-dimensional depth-map camera that powers Face ID.
[00:18:54] Now, Face ID knows that you are the face that enrolled.
[00:18:57] It doesn't know whose face that is, but we can use that camera.
[00:19:00] We can use the cryptography element of hardware, knowing that someone can't inject, you know, a deep fake emulator or fake piece of software.
[00:19:07] And we can use AI.
[00:19:09] My gosh, you have a much stronger arsenal to go compete against deep fakes and AI-driven impersonation.
[00:19:16] So that's the stuff that gets me really excited: how we can take different elements of technology and weave them together into a stronger arsenal that really is competitive.
[00:19:26] Aaron Painter is the CEO of NameTag, a leading company in identity verification technology, where he focuses on safeguarding digital identities through AI powered and biometric solutions.
[00:19:36] With over two decades of experience in global technology leadership, he's a recognized voice in the future of cybersecurity and identity management.
[00:19:43] Aaron, this has been fascinating.
[00:19:44] Really appreciate you joining me today.
[00:19:47] Thanks for having me, Dave.
[00:19:49] Are you ready to get your brand in front of the tech leaders shaping the future of managed services?
[00:19:55] Here at the Business of Tech, we offer flexible sponsorship opportunities to meet your needs, whether it's live show sponsorship, podcast advertising, event promotion or custom webinars.
[00:20:07] From affordable exposure options to exclusive sponsorships, our offerings are designed to fit businesses and vendors of all sizes looking to make an impact.
[00:20:16] Prices start at just $500 per month, making our packages a fraction of typical event sponsorship costs.
[00:20:25] Be a part of the conversation that matters to IT service providers worldwide.
[00:20:31] Join us at MSP Radio and amplify your message where it counts.
[00:20:36] Visit MSP Radio dot com slash engage today to explore all the ways we can help you grow.
[00:20:45] The Business of Tech is written and produced by me, Dave Sobel, under ethics guidelines posted at businessof.tech.
[00:20:52] If you like the content, please make sure to hit that like button and follow or subscribe.
[00:20:57] It's free and easy and the best way to support the show and help us grow.
[00:21:02] You can also check out our Patreon where you can join the Business of Tech community at patreon.com slash MSP Radio or buy our Why Do We Care merch at businessof.tech.
[00:21:15] Finally, if you're interested in advertising on this show, visit MSP Radio dot com slash engage.
[00:21:22] Once again, thanks for listening to me and I will talk to you again on our next episode of the Business of Tech.
[00:21:31] Part of the MSP Radio Network.

