TWiT 1054: Nine Days a Week - Satellite Data Exposed With $750 of Equipment
This Week in Tech (Audio) • October 20, 2025
1054
2:55:13 • 160.58 MB

Shocking new research reveals how anyone with $750 can intercept unencrypted satellite data, exposing everything from government secrets to in-flight Wi-Fi traffic. Find out why decades-old vulnerabilities are still open and who actually wants it that way.

  • Study: The World's Satellite Data Is Massively Vulnerable To Snooping
  • You Only Need $750 of Equipment to Pilfer Data From Satellites, Researchers Say
  • Hackers Dox Hundreds of DHS, ICE, FBI, and DOJ Officials
  • DHS says Chinese criminal gangs made $1B from US text scams
  • cr.yp.to: 2025.10.04: NSA and IETF
  • Why Signal's post-quantum makeover is an amazing engineering achievement
  • Court reduces damages Meta will get from spyware maker NSO Group but bans it from WhatsApp
  • How I Almost Got Hacked By A 'Job Interview'
  • New California law requires AI to tell you it's AI
  • The European Union issued its first fines under the AI Act, penalizing a French facial recognition startup €12 million for deploying unverified algorithms in public security contracts
  • Wikipedia Says AI Is Causing a Dangerous Decline in Human Visitors
  • Texas hit with a pair of lawsuits for its app store age verification requirements
  • Australia shares tips to wean teens off social media ahead of ban. Will it work?
  • California enacts age-gate law for app stores
  • Meta is asking Facebook users to give its AI access to their entire camera roll
  • Meta poached Andrew Tulloch, co-founder of Thinking Machines Lab, with a compensation package rumored to reach $1.5 billion over six years
  • Even top generals are looking to AI chatbots for answers
  • Roku's AI-upgraded voice assistant can answer questions about what you're watching
  • Tesla debuts a steering wheel-less taxi for two
  • Waymo and DoorDash Are Teaming Up to Deliver Your Food via Robotaxi

Host: Leo Laporte

Guests: Jacob Ward, Harper Reed, and Abrar Al-Heeti

Download or subscribe to This Week in Tech at https://twit.tv/shows/this-week-in-tech

Join Club TWiT for Ad-Free Podcasts!
Support what you love and get ad-free shows, a members-only Discord, and behind-the-scenes access. Join today: https://twit.tv/clubtwit

Sponsors:


[00:00:00] It's time for TWiT This Week in Tech. Harper Reed is here, Abrar Al-Heeti and Jacob Ward. We got a great show planned for you. We'll talk about hacking. Turns out all that data going back and forth on the satellites, it's not encrypted. Nobody ever thought anybody would be listening. California's got a new law about social media, AI and age verification. And Australia is about to ban social media for people under 16. Get ready for the screams of pain.

[00:00:27] And the AI researcher who's getting more than $200 million a year from Meta. All that and more coming up next on TWiT. Podcasts you love. From people you trust. This is TWiT.

[00:00:50] This is TWiT. This Week in Tech, Episode 1054, recorded Sunday, October 19th, 2025. Nine Days a Week. It's time for TWiT This Week in Tech, the show where we cover the week's tech news. I like to think of this show as kind of a grab bag. That's not the right way to put it. I like to assemble an eclectic group each week.

[00:01:14] It's the one show we do where there's always rotating hosts. And I really enjoy it because, well, the news is complicated. I like to get as many perspectives as we can. Jacob Ward is here. You see him, of course, on Tech News Weekly with Mikah Sargent. He's got theripcurrent.com, his newsletter, and he's the author of a really great book about how big tech ruins everything, The Loop. Good to see you, Jacob. Thank you for having us. Thanks for having me, boss.

[00:01:42] Yeah, yeah. Also with us, Abrar Al-Heeti from CNET, senior technology reporter. She is relaxed because she's done with the phone reviews. I can finally breathe. Write about other things in the world. Yes, maybe. I mean, who knows? It never really ends.

[00:02:00] Never, never ending. I have actually a graph somebody made. It's probably not a good scientific study, but it's fascinating about what happened since 2007. And what happened in 2007? I think you'll know this one, Abrar. What happened in 2007? The release of the iPhone. Oh, yes. That should have been top of mind. And it's about how everything's gone down since.

[00:02:28] That sounds about right. Including Abrar's social life because she's now got to review every dang phone. My friends have been wondering if I hate them at this point. Yeah. Yeah. This is the blog post. What happened in 2007? It's the damn phones. Yeah. Yeah. And there's a lot of, you know, IQ scores tumbling. I don't believe it. I believe it. It's really, it's kind of depressing. Internet, that's not a good one. Test scores.

[00:02:58] Look at that. 2007. Yikes. Right here. Down the hill. Of course, COVID could be involved there somewhere. Right? That's the problem with these. You know, it's. Lots of factors. Science. These are SAT scores. Plummeting. Yikes. Internet addiction is going away. I don't know why that is. Cause phone addiction is going up. It's going up, but nobody's talking about it. That's the difference. This is articles about internet addiction. We don't talk.

[00:03:27] We don't want to talk about that. Sleep abnormalities through the roof. Mm-hmm. Youth mental health, well, I don't have to tell you about that. Yeah. Percent of US undergraduates with a mental illness. Oh man. Anxiety, depression, ADHD. But you know, we won't go any farther. It's all fun and games. Cause there's a lot of factors. You can't just say, well, it's that or it's that. And also, what would we do without these apps?

[00:03:56] We have so many cool apps. Now, Harper Reed, look at that: technologist, entrepreneur, hacker. harper.blog. Harper is always welcome on this show, as all three of you are three of my favorite people. I'd like to do this. Actually, there's a new app. I can't get it to work. Thank God. A guy who works in the superintelligence division of Meta put it out today. It takes your picture and then puts it in a vacation place. So you don't have to actually go on vacation anymore.

[00:04:25] Thank God. That was a real struggle. That was horrible. Yeah. No kidding. Um, I couldn't get it to do that. So I, and it's $30 if you want, you know, you only get like the first five. So I just throw it away. But, uh, anyway, I thought that was interesting. That was kind of a sign of the times. Uh, another sign of the times. Let's start with hacking today. The world's satellite data is massively vulnerable to snooping. You only need $750 worth of equipment.

[00:04:55] And this is kind of like the SS7 bug in our cell phone system, which of course the Chinese have exploited and we can never be rid of them. Uh, researchers who discovered this said they just really didn't think anyone would look up at the satellites. A new study published on Monday found that communications from, well, it's everything. This is from UC San Diego and the University of Maryland.

[00:05:20] We pointed a commercial off-the-shelf satellite dish at the sky and carried out the most comprehensive public study to date of geostationary satellite communication. That includes Starlink, of course, but also many other communications. A shockingly large amount of sensitive traffic is being broadcast unencrypted. Oh, including critical infrastructure, internal corporate and government communications, private citizens' voice

calls, SMS, and consumer internet traffic from in-flight Wi-Fi and mobile networks. This data can be passively observed by anyone with a few hundred dollars of consumer-grade hardware. Yay. I have to admit that when I saw this, the first thing I thought was, where do I get the stuff? Good man. A true hacker. Yeah. $800. Yeah. $800. Also, it's space stuff. You know, it's like coming from space.
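The "passively observed" claim is easy to picture in code. Here is a minimal, hypothetical sketch of the kind of triage the researchers describe: given transport-layer flows recovered from a downlink capture, sort them into likely-cleartext and likely-encrypted buckets by well-known destination port. The port-to-protocol mapping is the standard IANA assignment; the flow data is invented for illustration.

```python
# Hypothetical triage of flows recovered from a satellite downlink capture.
# Classification by destination port is a crude heuristic, but it is enough
# to show how much traffic announces itself as unencrypted.

CLEARTEXT_PORTS = {
    80: "HTTP",    # web traffic with no TLS
    23: "Telnet",  # remote shells in the clear
    21: "FTP",     # file transfers, credentials included
    5060: "SIP",   # VoIP call signaling
}
ENCRYPTED_PORTS = {
    443: "HTTPS",  # TLS-wrapped web traffic
    22: "SSH",
    853: "DNS over TLS",
}

def classify_flow(dst_port: int) -> str:
    """Label a flow by its destination port; unknown ports stay 'unknown'."""
    if dst_port in CLEARTEXT_PORTS:
        return f"cleartext ({CLEARTEXT_PORTS[dst_port]})"
    if dst_port in ENCRYPTED_PORTS:
        return f"encrypted ({ENCRYPTED_PORTS[dst_port]})"
    return "unknown"

# Made-up flows standing in for what a dish-and-SDR rig might recover.
flows = [("10.0.0.5", 80), ("10.0.0.9", 443), ("10.0.0.7", 23)]
for src, port in flows:
    print(src, classify_flow(port))
```

A real pipeline would demodulate the DVB-S signal and reassemble IP packets first; the point of the sketch is only that once the bits are down, nothing else stands between an eavesdropper and the cleartext flows.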

[00:06:18] It's just really exciting. This wasn't that much of a revelation to me, because I remember Steve Wozniak doing this many years ago. He would sit in his living room and he would listen in on satellite traffic. But I was under the impression that was in the early days and that everything's been fixed now, but apparently not. Well, there is a really, oh, go ahead, Jacob. Well, I was just gonna say, I wonder why, why are they so slow to address it? Right?

[00:06:45] Like why are these companies so loath to go ahead and, I mean, right, according to the study, companies have been warned for decades about this. We've known about SS7's flaw for more than a decade. So like, what is the, I have to assume, right? No company does anything without some sort of incentive behind it. So is it that they don't care enough about the reputational risk, and maybe staying on the good side of the intelligence community and the NSA

[00:07:12] is more important than locking it down? You know, there's a good, cause there's only a handful of us who care about it. Right. I wonder why they're so slow. And the NSA, law enforcement, loves it, right? Yeah. And it saves them the whole headache, right? One of the huge headaches for all the social media platforms once upon a time was having to staff, you know, huge offices worth of people to deal with the incoming intelligence requests.

[00:07:37] If you can somehow not get in trouble with your customers and leave that door open, maybe that's a more logistically simple way to deal with it. But you could imagine it's also like, you know, in the same way that people have often said that the routers that control street lights and so forth have passwords like one, two, three, four, five, you know? So it may just be laziness. They were hacking the walk buttons, you know, the "walk sign is on" voices in Palo Alto.

[00:08:07] And they had Mark Zuckerberg saying things. That's right. That's right. This is Elon Musk. And I think that you should blah, blah, blah. That's right. That's right. That was really zero, zero, zero, zero. And yeah. I mean, we don't really care about security until we do, I guess, but I think your conspiracy theory is kind of interesting that, Oh, I've got a million of them. Yeah. Let's go. Come on. The law enforcement doesn't really want us to encrypt this stuff. The AT&T whistleblower who said, Oh yeah, the NSA has a wire closet in San Francisco,

[00:08:36] where they listen to all AT&T internet traffic. Yep. But they don't have to do that really. Do they? It's a lot easier. $750 worth of equipment. So you, so there's a, excuse me, there's a big community of people that do this for imagery. So a lot of the satellites will beam down images and there's, there's an amateur satellite subreddit that I always poke my head into thinking, should I get one of these antennas for my house? Should I get one of these little software defined radios and start slurping down those images?

[00:09:06] And mostly because it's just weather data, which, I'm kind of a weather data nerd. I get it. And then I'm like, what am I going to do with it? I just look outside and it's not raining or snowing or whatever. But I do like the idea of taking it into my house and seeing some of this stuff. And it's pretty interesting. And I'm not really surprised that you can just apply this to something else. Like, I don't know, text messages or whatever they're slurping down as well. Because I don't think people really did, like the report said, expect anyone to actually point

[00:09:36] antennas at these devices. I think they thought the security-through-obscurity aspect would hold for longer. But then, I don't know. People are bored. And it's only 700 bucks. And it's only 700 bucks. Woz loved listening to the cell phone traffic. He really enjoyed it. It was his evening entertainment. He would sit in the living room and just have it running. Or social media, man. What are you going to do? Yeah. Yeah. Conversations. All right. Well, so this will be our hacking segment for the week.

[00:10:06] The DHS says criminal gangs. You've been getting those text messages that say, oh, we can't deliver this package, or you've got a highway toll payment, right? Or there's a traffic violation, and if you don't pay it, it's going to double. And it panics people, and they pay. $1 billion in the past three years, according to the Department of Homeland Security. I'm not surprised.

[00:10:35] So people fall for this stuff, you know. They're going for things that can scare you. Right. Okay. It's likely that I have a package in the mail. It's likely that I missed a toll. It's, you know, things that are obvious wins, and clearly, the billion dollars. I was watching, you know, go ahead. I was gonna say, so when I was at NBC News, we got to interview,

[00:10:59] I got to interview a Nigerian romance scammer, uh, you know, living outside Lagos. And he'd spent, like, the better part of five years trying to get somebody on WhatsApp to respond to his entreaties. And there was even, this is the crazy part, there was even a manual that he showed us, this book that these guys would share.

[00:11:29] That was your guide to trying to trick Western women into falling in love with you and giving you money. And the, I remember that story. That was man, the industry of, of that was so incredible. And to me, I was like, wow, like, why would you spend five years? I mean, this is what we asked. It was what I asked him, you know, why, why five years? That's an incredible amount of time. But he's putting in like, you know, six hour days on top of his other jobs. Good money. Right? Yeah.

[00:11:55] He got, in the end, like $25,000 out of one woman finally. And that is a life-changing amount of money for him. Right. It changed his family's, you know, future for generations. Right. He even had a tremendous personal upheaval, became a Christian, and decided he would never do it again. And now he's a, not nearly as well paid, but a paid consultant to the companies that are trying to fight against this. But that tiny, I mean, especially with something like a text scam where you don't even have to get on the phone.

[00:12:25] Right. You don't need to be an individual texting. Like, the scale pays so well, right, that tiny percentage of people who will fall for it. And I'm, you know, I'm one of those people who looks at those and I'm like, Oh no, have I, you know, did I? Cause I'm the kind of person who always forgets to pay my bills. So I fall for it for a millisecond. You know, I can't imagine how many people must fall for that. I didn't want to do it, but they made me do it. I fell in love with Connie Chung.

[00:12:54] And now I'm sorry. I can stop doing that. That's stupid. Um, my identity was briefly being protected. I fooled you. Yeah. Uh, what amazes me is how people fall for this. I can understand a romance scam because, you know, somebody's lonely, and it's flattering and you're getting flattered and all that stuff. But the toll thing, the bill thing, maybe it's people like you, Jacob, would you say? Yeah, I don't know.

[00:13:24] Don't underestimate the number of people who forget to pay their bills and have felt they're not qualified for human life. There's another side effect to this. Cause we got a bill from one of our service providers for the company that said, you're past due, we're going to send you to collection, give us $8,000 now. And we had to really think long and hard: is this real or not? Why has it not been paid? And it turned out it wasn't a scam. We must have somehow missed the bill.

[00:13:51] But you could see people also, I mean, Lisa and I spent some time thinking, is this real? Is this real? Let me investigate, and so forth. So it has an impact in both directions. Lisa gets email every day because we have an address, I won't say it out loud, but it's the kind of obvious address that you would send a fake invoice to. And it gets something every day saying, you know, please remit $3,000. That kind of thing.

[00:14:20] I think we underestimate our ability to detect these sorts of things. And, you know, one of the things about working on the Obama campaign is they were very, very worried about security. And this is obviously a billion years ago, but there was just a lot of, you know, you're going to get hacked. Things are going to happen to you. And not like it may, but that it will.

[00:14:46] And then the other thing was the incentives are such that for some of these big kind of hacks where they're scamming people out of hundreds and hundreds of thousands of dollars. That's more revenue than most venture backed startups get. So like, why wouldn't you staff it with 10 people trying to go at this, both coming at it?

[00:15:08] You know, like if you're doing that, you know, getting $200,000, 10 times a year, that's pretty good income, especially if you're in a market that just doesn't have the same level. And it's all USD as well, which is a strong currency. And so, you know, or, or gift cards, but, but I do think there's this thing of like, like from a business standpoint, this seems like a pretty rational business approach, unfortunately, for a lot of these folks to, to do.

[00:15:34] And what changed me was just this assumption that I am not going to win if someone is so motivated to hack me. And so you kind of have to be skeptical with every interaction. And that kind of sucks as well. That's just not a great place to be, being skeptical of every interaction you have. Um, my weakness is I can't help but reply. Like, I am just built in such a way where, even if you know it's a hack, you're gonna, you're gonna reply.

[00:16:04] Well, they're just like, you know, send some picture of some random person. And it's like, Steve. And I'm like, yep, this is Steve. And they're like, I met you at yoga class. I'm like, yeah, it was wonderful being your teacher. And I'll just lean all the way into it as far as they'll go. And eventually they're like, I think I have the wrong number. And I'm like, you do not have the wrong number. We are talking. Um, and I have to like really promise myself not to reply. And actually the new iPhone stuff where they put the unknown texts. I really love that. Yeah. That has stopped my replying.

[00:16:32] The single best improvement in iOS 26 is I have a separate place for people. I don't know. And I put everybody in there. Um, so a couple of relate, first of all, don't do that. And then don't they usually at the, at some point, they're going to ask you to invest in crypto or something like that. Right. There's some, there's some game they're playing with you. I always ask them if they want my seed phrase. That's one of the first things I'll say. I'll just say, are you, do you want my seed phrase? Let's go to the chase here. Yeah. Can I just give it to you? Is that easier?

[00:17:00] Well, because I don't have much time right now. I'm just like, here we go. Let's just do it fast. And they always are very standoffish. I don't get the "hi"s as much as I used to. I don't know why. Yeah. For some reason lately I've been getting, almost every day, from a variety of area codes, a text saying, do you need your trees trimmed? And I feel like this might be a code. I don't know. Or they're just driving by your house and they're like, man, this guy. It's different numbers from different areas.

[00:17:28] And it's, hi, this is Sandra, your local tree trimming expert. Do you need your trees trimmed? For me it's an endless amount of job offers. Cause I get those too. Yeah. You get those too. Okay. I thought maybe it was like, you're in some tree trimming database and I'm in some job searching database. I don't know. Okay. So, uh, we do have strong encryption, despite the fact that governments all over the world are trying to eliminate it.

[00:17:54] In fact, in response to what you were talking about, Harper, back in 2022, Apple added Advanced Data Protection, which is end-to-end encrypted. Google before that added its Advanced Protection Program, which required a, you know, hardware key. Um, of course the UK government has been trying to get Apple to drop that ADP. Apple no longer offers it to people in the UK as a result. Um, interesting story. I ran this by Steve Gibson.

[00:18:24] He said, I'm not too concerned, but I'm a little concerned. This is from a very respectable, um, crypto expert. Um, cryptography. Let's be clear about which crypto. Oh God, you're right. Now I have to say that, don't I? Not fake money. Yeah. Oh man. Uh, who says that the NSA and, uh, GCHQ in the UK are trying to get NIST and other standards

[00:18:52] organizations to weaken their elliptic curve cryptography and post-quantum cryptography. This is what he writes: they've been endlessly repeating arguments that weakening is a good thing. Um, apparently they spend money on the IETF and NIST and others to try and encourage them not to put in strong encryption. This is exactly what you're saying, Jacob.

[00:19:20] This is, this is supports your theory that these people, these groups don't want strong encryption. It's ironic because the NSA is also there to protect us. Well, they, they have their own cryptographers, right? And this is the least surprising information. Like, I don't know. Uh, I feel like this is one of those things where people will argue about whether it's true or false for forever. And it, it may, I don't think it really matters because it hasn't the NSA been caught doing this. Yeah.

[00:19:49] They weakened DES to 56 bits. Um, and NIST, I think somebody discovered NIST had standardized a weakened encryption scheme. So the fact that they've done this more than once, it's not really surprising that they may do it again. That seems like a pattern. I mean, I don't know. I'm not super good at patterns like that, or people doing something over and over and over again. So I read this and I was just like, yeah, that checks out. And that's exactly what Steve said. He's like, he's pretty smart.
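The "56 bits" point is worth quantifying, because key length differences are exponential, not linear. A quick back-of-envelope in Python, where the 10^12 keys-per-second search rate is an illustrative assumption, roughly what dedicated DES-cracking hardware has achieved:

```python
# Why a 56-bit key is weak: exhaustive search sizes for DES vs a 128-bit key.
des_keys = 2 ** 56            # 72,057,594,037,927,936 possible keys
modern_keys = 2 ** 128

rate = 10 ** 12               # assumed keys tested per second (illustrative)
des_hours = des_keys / rate / 3600
print(f"DES exhaustive search at 10^12 keys/s: ~{des_hours:.0f} hours")  # ~20 hours

# The same attacker needs 2^72 times longer against a 128-bit key.
ratio = modern_keys // des_keys
print(f"A 128-bit keyspace is 2^{ratio.bit_length() - 1} times larger than DES's")
```

At that assumed rate a full DES sweep finishes in under a day, while a 128-bit key remains far beyond any plausible hardware, which is exactly why trimming key length is such an effective way to quietly weaken a standard.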

[00:20:18] Yeah. Yeah. Of course. Yeah. Exactly. Like that. You thought Sandra was there to trim your trees. Really? Yeah. What'd you think? Well, I mean, this is a funny problem, because we all know that encryption is going to make everyone safer, but that also includes criminals. That's just a very complex place to be. I would prefer to live my life in a way where we're making the entire population safer.

[00:20:45] And then we have to maybe use other methods to find the criminals, not make everyone unsafe as a way to find the bad people, so to speak. Bingo. Exactly. I once interviewed Phil Zimmermann, the creator of PGP, and asked him that. He said, this is the price you pay for strong encryption. He said, it's fine. And his biggest point was law enforcement is not going dark.

[00:21:08] He says law enforcement has, in effect, the largest widescreen view of everything going on in the world ever. And there's just a few pixels that are out, and it bugs them. I know, but I want to see everything. He said law enforcement is not hampered. Modern technology has really given them access to much more than they'd ever had before.

[00:21:31] Look at Flock, the license plate camera people, who are all over now, uh, making deals. They just made a deal with Ring, so your Ring doorbell will do license plate recording. Hmm. And of course law enforcement has access to that. And some people would say, well, what's wrong with that? You know, it's just protecting me. If a bad guy's driving around my neighborhood, that just protects me. It seems scary.

[00:21:56] I get, I get stuck in this one a lot because I'm like, so I live in Oakland, California where, you know, there's a, there's a citywide, uh, prohibition on all sorts of surveillance panopticon sort of technology. You're not allowed to use, uh, facial recognition. You know, there's all these sort of prohibitions that we have in place. And, but I've also over the last couple of years gone through, we've gone now through, we're on like our 11th police commissioner in 10 years or something. I can't even remember, but we just lost another one.

[00:22:26] There's a lot of crime in Oakland. We're still under a federal consent decree. There's corruption. And there's a huge amount of just sort of, you know, there's, there's a solid amount of homicides and the big one that affects people I know and has affected me personally, right. Is, is car break-ins, right? I've had people break into my car with my kids sitting in the back, you know, and there's a no pursuit policy in Oakland because, you know, car pursuits tend to result in somebody getting injured.

[00:22:53] You know, very often people not involved in the case. Yeah. Yeah. And so I'm suddenly at this place where I'm like, I want there to be a deterrent, the deterrent effect of knowing that there's a camera that's going to catch your getaway vehicle.

[00:23:10] But I also don't want police getting used to that deterrent, getting used to the idea of using that, because we know that no matter how many times you tell police you're not allowed to use this system to make an arrest, that it's got to be bolstered by other evidence, inevitably they don't do that. And of course, where you put those cameras and where you put these surveillance systems makes an enormous difference.

[00:23:35] Putting it in, you know, a black and brown neighborhood where people have moved into a place where the streets are extra wide and, you know, going over the speed limit just sort of makes a little more sense. And people driving extra distance to work and, you know, there's all these subtleties that we haven't really dealt with. But anyway, I'm, I used to just be like hardcore. No.

[00:23:57] And I'm sort of shifting my thinking on that one a little bit, but I honestly don't know how I feel about it these days. I used to be very hardcore about it. And I'd love to hear everybody else's perspective on this panel. You know, Harper, you're in war-torn Chicago. Oh, it's horrible here. It's really bad. We went and, what did I do? You can tell it's not war-torn because there was a marathon. I went to a Bulls game.

[00:24:27] Like, it's a pretty nice spot right now, actually. I highly recommend it. This is the best time of year to come visit. But I think this is how they get you. I think this is exactly what a Flock camera wants you to say: this will be the solution to solve this crime problem, which is probably some other systemic issue that no one wants to invest in. And I think that's just how they get you.

[00:24:56] I mean, I have a lot of security cameras. When we made this new office, we went way overboard. It is absolutely ridiculous. Part of the reason was we wanted to grab the RTSP streams and run Whisper and get all the voices and get notes for every space in this. Oh, great. So you're recording every conversation. We actually have a QR code on the door that links to our privacy policy. Cause I was like, this is getting a little sketchy. You know, I'm tempted to do the same thing in my house. Cause I do want to record everything. It's so fun.

[00:25:25] And just put a sign in the house and say, when you enter here, you will be recorded. Yeah, exactly. But, but I do think there's a big difference when it comes to these public spaces and especially public spaces that look private, meaning like your sidewalk, like my sidewalk is not my space. That is a public space that people I think should have some.

[00:25:44] I just don't think it's a good practice to start recording that and sending that to a third party that may use it to enforce policies that are bad, because the issue here is not that they're going to use that to stop the people who are popping cars and stealing backpacks. The issue is that they're going to use that to attack the people who are already under attack. The people in our communities who are already the weakest are the ones it's going to be used against.

[00:26:10] And I just don't really believe that that will help. That said, I think there are things that will help. And I don't live in Oakland. I'm trying not to talk about California when I say I don't live in Oakland, but I think that there are just ways to solve this problem that are not investing more in the surveillance state. I agree with you. I mean, I will also say, like, this is a tech show, but the fundamental solution to this stuff is, we need housing.

[00:26:39] We need like a million units of housing in this state. So you're, there's, there's absolutely like, I, I have no, no illusions that that would solve the problem. But I also see these 17 year old kids, you know, with no, I don't know, it's like absolutely no incentive structure around them at all to do anything other than the quickest possible thing to make a couple dollars.

[00:26:59] And I would, and I, and I know from lots of reporting on, you know, and lots of interviews with, with psychologists and social scientists around the world that like a little bit of immediate deterrent really can make an enormous effect. Like raising the punishments for crime has no effect, but like you see a camera, it slows things down.

[00:27:21] I was just, I was in Brazil this summer, and that place is awash in Chinese surveillance cameras, and people there are full of interesting perspectives about it. And people who are very serious civil rights people also say, yeah, but I'm actually kind of cool with this. That's a really interesting... And that's a place where China has, like, poured into the social fabric. I don't know. Abrar, what do you think about this? I really want to hear your perspective. No, I'm, I'm listening to it and appreciating all of this.

[00:27:49] Um, I, I think, I think you guys really hit the nail on the head here where it's, um, there needs to be bigger changes and it seems like a bandaid fix of, okay, here's a security camera, but then what does that feed into? And, um, and the idea of, you know, mass surveillance is, is terrifying, but, um, no one seems to be focusing on how to actually fix the problems. And, uh, and yeah, I think it's very much a bandaid on a bullet hole, um, kind of, kind of problem. But it's, I also agree with Jacob.

[00:28:17] It's very hard, especially if you've been the victim of this to say, well, I don't want, you know, I'm willing to put up with some minor crime because I don't want, uh, surveillance on, on the streets of my city. I can, I completely understand that too. It's really a tough problem. Uh, I think though that we can probably draw the line at things like the UK government saying, well, there should be no end to end encrypted messaging. Can we draw the line there?

[00:28:43] Is that, I think that, so Signal is, it's interesting, because Signal has, in fact, there's a great article I recommend. Dan Goodin writes so many great pieces about this kind of stuff in Ars Technica: "Why Signal's post-quantum makeover is an amazing engineering achievement." They changed their crypto to be post quantum.

[00:29:03] Uh, not that there are quantum computers today, or even anywhere in the near future, but they put a lot of energy into making sure that their cryptography, the Signal protocol, was quantum resistant, planning on a future where you might have quantum computers able to break RSA encryption. Um, they implemented CRYSTALS-Kyber, which is an algorithm that NIST endorsed. Not a weak one, I hope. Yeah, I'm pretty sure it's not.
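The engineering Goodin describes is a hybrid design: Signal's PQXDH runs a classical X25519 exchange alongside a Kyber encapsulation and derives the session key from both, so an attacker has to break both primitives. A toy sketch of just the combining step, using stdlib HMAC as a stand-in KDF (the real protocol's KDF, labels, and inputs differ):

```python
import hashlib
import hmac

def combine_secrets(ecdh_secret: bytes, kyber_secret: bytes) -> bytes:
    """Derive one 32-byte session key from both shared secrets.
    An attacker who breaks only one of the two primitives still
    learns nothing about the derived key.

    Toy KDF only: HMAC-SHA256 over the concatenation, with a made-up
    label. Signal's PQXDH specifies its own KDF and inputs."""
    return hmac.new(b"toy-kdf-label",
                    ecdh_secret + kyber_secret,
                    hashlib.sha256).digest()
```

This is why "post-quantum makeover" doesn't mean abandoning the classical crypto: if Kyber turns out to have a flaw, security falls back to X25519, and vice versa.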

[00:29:31] Um, and it's not an easy thing to do, but they did it. And I think we can all agree that any government regulation that would undermine that would be a step too far. Or can we, is it worth doing that to eliminate petty crime? I, I feel that way. I mean, I, I, well, I don't know about the petty crime part, but I don't want, I don't want government saying, I mean, like what I like about signal, right?

[00:29:56] Meredith Whittaker, who runs Signal, her argument for so long has been, we need to deprive these big foundational AI companies of the fuel they need to build their models. Cause they are parasites who are, you know, stealing our intellectual property and our thoughts and our conversations.

[00:30:17] And, and so one of her arguments has been, you know, we're one of the things, one of the reasons we care so much at signal about this is that we're trying to make it such that your life isn't being fed into these companies. And if you were to somehow say at a government level, encryption's not okay. You're you, that is a thing that I would think a lot of these big foundational companies would be only too happy about because then you can suddenly scan everything. You know, like I just, I, I really, I, yeah, for me, that's a bridge too far. I don't know how everybody else feels.

[00:30:48] I think we all agree. Yeah. Yeah. And I think the thing that is the one step scarier is when you do have your ring camera or your doorbell camera hooked into this kind of panopticon of which, which is powered by a commercial business. We don't know where that data is going. And so that could be used to be training models that are then used to, you know, do things that we probably would not want to be done.

[00:31:15] I think that's probably more often than finding a petty criminal that is, that is doing something. And it might make us feel better to say, oh, well, I helped the community. But I don't know. I just open the Citizen app and then I immediately close it. And I feel like that's kind of the vibe that this is, this is just the Citizen app with video. It's like the YouTube version of the Citizen app. Yeah. I think that's the, that's where this goes bad.

[00:31:39] So I, I, I, I even purposefully aimed all of my cameras to not cover sidewalks at my house to like, make sure it's just for my house and my property. Because I was like, I just don't want anything. I don't want anyone's business. I don't want anyone. I want, I want like plausible deniability to be like, I can't see your house. Like that's gone a lot farther than most people. In fact, I can see the sidewalk right now outside my house. And I, I think I have to have that camera there because that's protecting my garbage.

[00:32:06] So I don't want anybody going through my trash. So it's hard to get, especially a Ring, which is extremely wide angle, it's hard to get the Ring not to point at your neighbors. Right? Yeah. I mean, my doorbell camera points at the street and I get a lot of street, and I have an option to do, cause I use the UniFi stuff, I can do license plates or... yeah, I use UniFi too, by the way. So it's local. It's recording locally. It's not recording to the cloud. It's not.

[00:32:34] And I guess, though, law enforcement could come to my house and subpoena it. I email all the footage, just to cops@chicago.com. Um, just every day, just every night it uploads everything. Don't put any time codes or locations. Just send them the video. I do it sporadically and at a two-month delay. I did have a neighbor whose house was TP'd come over and say, I know when it happened. Can you send me the video of anybody going down the street at that time?

[00:33:04] And you found it? Uh, no, there was nobody going down the street at that time. No, thank goodness. My favorite. I wanted to help him, actually. I felt bad for the guy, but, uh, and the kid turned himself in. Let me ask one more question. So Oakland, the police department here in Oakland, has just encrypted their communications. Chicago too now, right? Up until recently you could listen in. See, that's the wrong direction. Well, this is what I want to ask people, right? How do we feel about that? Right. So I should be allowed to encrypt my stuff, but they should not be allowed to. Right.

[00:33:35] And usually it's, by the way, the other way around. For instance, with chat control in the EU, which fortunately is now on the back burner, the EU members of parliament were protected: their chats could not be decrypted. But they wanted everybody else's chats. Right. Well, sure, sure. But that is part of the deal. This is really an interesting topic: the deal we make with law enforcement, because we give them the power to exercise lethal force.

[00:34:05] And in fact, in many cases, we also give them a certain amount of impunity. But the trade off is, and this is the debate that's going on with ICE right now, that in most cases by law, they cannot be anonymous. They cannot hide their badge number. They cannot hide their face and they cannot encrypt their communications. That's, that's the trade off we make. We give them this power, but we don't, but we also need some control over this. They cannot be uncontrolled. Right.

[00:34:34] There needs to be, same with the military, there needs to be civilian control. It's easy to abuse that amount of power and protection. Absolutely. Right. Yeah. So they're allowed to use lethal force, you know, but the trade off is they don't get to do it secretly. Yeah. That makes a lot of sense to me. This is always my thing about firearms in general. Whenever I'm in an open carry state... I've had to be in open carry states covering the Biden-Trump election aftermath in 2020.

[00:35:04] I was in an open carry state. I was in Las Vegas outside the registrar of voters, and you know, the place is full of cops, but then also full of people in paramilitary gear with big AR-15s, you know, just strapped to them. And it's like two in the morning. That's terrifying, isn't it? Terrifying, terrifying. And that thing of, people always say, well, you know, it's my Second Amendment right and so forth. I'm like, yeah, but you have taken on the God-like power to kill me from a distance. Right. And so what is the trade off here? What have we come to?

[00:35:34] What is the... cause there isn't any check on that, certainly not in an open carry state. Talking to these cops, I'm like, what do you do with these guys? Like, what's the thing? They're like, well, if they menace someone with a gun. I'm like, you mean like swing the barrel on them? Like, what are you talking about? And he was like, yeah, it's squishy. We don't really know, you know. But Leo, you're making a great point. Like, at least with legal lethal force and the invasion of privacy, you know, there are procedures and processes to protect your privacy. Right.

[00:36:01] We could in theory live in a world in which police can come into any door they want. And probably statistically that would be a safer environment, but that's not a country we want to live in, cause we've made that deal. We have the Fourth Amendment. Yes. That's right. That's right. And so I think this moment of, like, cops should be allowed to be private and secret with what they do, I think you're right, that breaks that agreement that we've struck. It seems to me. Yeah. Um, we solved a lot here today.

[00:36:31] Yeah. We're figuring it out. Yeah. Meanwhile, uh, Meta sued the NSO Group, cause the NSO Group was breaking into WhatsApp. They won that case. In fact, they won $167 million in damages. The judge reviewed it and reduced the damages to $4 million, which is a tiny amount.

[00:36:57] Um, but has ordered the Israeli spyware maker to stop targeting WhatsApp. This goes back six years. They sued them in 2019 over the Pegasus spyware. They don't seem to have stopped. They have not stopped. No, no. They don't seem to have stopped. And what's interesting is the NSO Group, which was an Israeli company, has now been acquired by US investors.

[00:37:25] Oh, they're not going to stop. A group led by, get this, Hollywood producer Robert Simonds agreed to purchase the NSO Group in a deal valued in the tens of millions of dollars. They're not going to move out of Israel. They're not going to move out of Israel. Operational control remains in Israel. Um, but I guess it's a good investment. Um, okay. Yeah.

[00:37:54] That's the weirdest story of the week, by the way. Yeah, this goes back to that thing, too, of what was going on in 2019 that Meta felt it was in the company's interest to go after the NSO Group. Right. And would that still be true today, in this realm we're talking about, right? Like, they clearly felt there was a harm, or probably a reputational one. There were a series of stories going back to that time of the NSO Group targeting government officials, targeting people.

[00:38:24] Um, they're the ones who make the zero-click software that nation states buy to go after dissidents or, uh, terrorists, or whoever is considered to be one. That's bad. That's bad stuff. It's really bad stuff. Citizen Lab, that's who keeps finding it. Yeah. They're cool. And the good thing is Apple really cares a lot about not having their phones hacked. So they will warn people: you have been attacked by a nation state.

[00:38:53] Um, and they are doing everything they can. Each new iOS has some really strict protection against this kind of stuff, but there's always little holes, and there's a lot of money for hackers who find these holes from companies like the NSO Group. It's so scary, this stuff. And, um, a lot is done in the name of safety.

[00:39:18] And I, and I think a lot of times, um, we... that Ben Franklin quote, right? No, I don't know. I wasn't alive when he was alive. Uh, Ben and I go way back. Uh, let me see if I can get... same as the tech scams. Yeah, exactly. Do you need your trees trimmed?

[00:39:37] I wasn't actually going to quote it, cause I thought it was so well known, and it's almost a cliché now, but his famous quote is: those who would give up essential liberty to purchase a little temporary safety deserve neither liberty nor safety. Mm-hmm. Yeah. That has staying power, doesn't it? It does. Um, and there maybe is some apocryphal nature in some of these quotes, but you know, Ben Franklin said a lot of things.

[00:40:07] Uh, as Abe Lincoln has pointed out many times. So we're going to take a break and we'll come back. That was our hacking segment; I didn't know we'd get so much out of that. I didn't get to "How I Almost Got Hacked By A 'Job Interview'," but maybe I'll put this one in, from David Dodda, on his blog. He says, I was 30 seconds away from running malware on my machine. He thought he was doing a coding interview.

[00:40:37] He got a LinkedIn message from a headhunter: a real company, real LinkedIn profile. The message was smooth, professional. We're developing, I won't say the name, a platform for transforming real estate workflows; part-time roles available, flexible structure. The guy said, I've been freelancing for eight years. We know what that's like. Uh, well, Jacob and I do, anyway. Built web applications, worked on... I've been doing a lot of these projects. It looked legit. So I said yes to the call.

[00:41:05] They sent me a test project, a standard practice for tech interviews. It was a React/Node code base, a 30-minute test. Then they said, okay, but you need to download this software onto your machine so we can grade you. He actually, you'll like this, Harper, he actually asked Cursor's AI: before I run this application, is there any suspicious code?

[00:41:34] Should I be worried about this? And there was. Cursor found it. It was obfuscated. You see this byte array? I love that. It's not in English. Uh, but Cursor figured it out. He decoded the byte array and said, ooh, they want to own my machine. So this is great.
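The tell Cursor flagged, a big opaque byte array that only turns into code at runtime, is a classic obfuscation pattern. A harmless illustration of what "decoding the byte array" means; the payload here is an invented benign string, not the actual malware:

```python
# Obfuscated payloads often ship as a numeric array that the loader
# decodes and executes at runtime (e.g. via eval/exec).
# This array is an invented, harmless example.
OBFUSCATED = [112, 114, 105, 110, 116, 40, 34, 112, 119, 110, 101, 100, 34, 41]

def decode(byte_array: list[int]) -> str:
    """Turn the numeric array back into source text, the step the
    attacker's loader performs just before executing it."""
    return bytes(byte_array).decode("utf-8")

# decode(OBFUSCATED) yields the string 'print("pwned")'. Inspecting it
# this way is safe; *running* it is what the malware loader would do.
```

The point of the numeric encoding is exactly what the hosts note: a human skimming the repo sees numbers, not intent, so the payload hides in plain sight.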

[00:41:56] I love this because this goes back to the thing we were talking about with the text scammers, where, if I was a criminal entrepreneur, which I think a lot of these people are, they're just entrepreneurs like all the other entrepreneurs, they just happen to use crime instead of, I don't know, AI or venture, like this would be a great one. Cause in the US right now, especially in tech, there are so many unemployed people who are really hungry for jobs. There's a lot of people who really want to interview.

[00:42:22] They've, they've gotten, they've submitted hundreds of resumes, if not thousands of resumes with no callbacks. And someone shows you that they care and like gives you a thing and then walks you through something rational and then takes all your Bitcoin. Um, like, like it's like, I think this is like a very well targeted attack that is unfortunately, um, prescient and like probably works pretty well.

[00:42:46] Um, and it also, like we were talking about with the loneliness, it preys on this one aspect that's really happening right now that, that I think, you know, we have to be careful of. Um, and this is what I mean by, I don't think anyone can prepare themselves for this. I don't think you can protect yourself. I think you're just lucky. He's lucky that he noticed. Right. Um, he said, I almost fell for it. And I'm paranoid about this stuff.

[00:43:09] He said it worked because it used those four traditional things that hackers use: urgency, complete the test before the meeting to save time, right; authority, it had a LinkedIn-verified profile; familiarity, yeah, the standard take-home coding test, we've done this time and time again; and social proof, a real company page with real employees and real connections. He said, I almost fell for it. Hmm.

[00:43:36] Um, I love it, though, that he was able to get Cursor's AI to do this. He said, use AI to scan for at least the suspicious stuff. When you find a byte array that's, you know, obfuscated, that's pretty suspicious. This means I can continue chatting with these very nice people that keep texting me. I can pass all their chats through Cursor's AI and see what it says. A very elaborate shortcut on my iPhone that just passes it over to OpenAI. But honestly, that's probably a really good use for a chatbot.
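Short of asking an AI, a crude static scan for the red flags mentioned here (oversized numeric arrays, dynamic eval, base64 decoding) is easy to run over any take-home repo before executing it. The pattern list is illustrative, not a complete detector:

```python
import re

# Heuristic red flags for take-home-test code. Illustrative only:
# real obfuscation takes many more forms than these three.
SUSPICIOUS_PATTERNS = {
    "large byte array": re.compile(r"\[(?:\s*\d{1,3}\s*,){20,}"),
    "dynamic eval/exec": re.compile(r"\b(eval|exec|Function)\s*\("),
    "base64 decode": re.compile(r"atob\(|base64\.b64decode|from_base64"),
}

def scan(source: str) -> list[str]:
    """Return the name of every red flag found in the source text."""
    return [name for name, pattern in SUSPICIOUS_PATTERNS.items()
            if pattern.search(source)]
```

A hit doesn't prove malice, and a clean scan doesn't prove safety; it's a cheap first pass before running anything a stranger sent you.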

[00:44:06] Come to think of it, just to answer those text messages. Well, there is, um, someone was just talking to me about one of the phone call scams. They'll pass it to a trained model that just sounds like an old person that doesn't understand. They did this in Britain, a phone company in Britain did it. I love that video. It's hysterical. Hello? Say that again? Yeah. It's just so good, because, like, how frustrating must that be as the scammer to just realize

[00:44:34] that you just got got. Hit him with that. No, reverse. Yeah, exactly. It was a British phone company and, uh, Daisy, the AI granny. You want to hear a little bit of it? It was from O2, the British phone company. O2. I'm just trying to have a little chat. It's nearly been an hour, for the love of ****. Oh my gosh. How time flies.

[00:45:05] He's showing me a picture of my cat, Fluffy. It's showing you the picture of your cat, Fluffy. Stop calling me there, you stupid ****. Okay, you get the idea. Uh, it was probably more publicity stunt, but, uh, we're not going to do that. What a good idea. I think we could work on that with the help of Sora. Maybe this can happen. It's just going to be me singing though. Everyone, everyone's just going to be like, why is this guy just sing at me? He keeps tripping and farting. What's happening? This is horrible.

[00:45:35] Uh, we won't tell anybody what Harper made me do to my cameo on the Sora app. But if you make anything with me, you might be surprised. You're watching This Week in Tech. It's great to have Jacob Ward here. Always nice to see you, Jacob. Don't forget TheRipCurrent.com, his newsletter. Great, great newsletter. And the book, The Loop, uh, really timely now, talking about, you know, how to fight big tech, in effect.

[00:46:05] Um, we need to do it. They've become so powerful in our lives. There's so many fronts we have to fight on these days. But you would think it would be one of the great satisfactions of writing a book like this to see it come true. Yeah, no. You don't want that. You don't want it. Yeah. That didn't work. Uh, and of course the podcast, The Rip Current. It's great to have you, Jacob. Watch out for the rip.

[00:46:31] Uh, Abrar Al-Heeti, who is resting up after a long bout of phone reviews. Did you pick one that you liked the best? You know, I, okay. I know we were talking about how nobody buys the Air, but I actually really enjoyed using it, surprisingly. Um, did I switch to a Pro Max afterwards? Yes. Um, but I'm also a very heavy phone user. Well, this is, and we don't really know, cause Apple hasn't given out sales figures

[00:46:56] for the Air, but analysts are thinking that it isn't a big bestseller. It just went on sale in China last week, so Tim Cook was there for that. So maybe it'll be popular there. Yeah. And the idea is maybe it'll just be a stepping stone until the foldable comes out. So maybe it's their blueprint. That's what we think, right. I mean, look at you. I think it was you who first showed me the Fold 7. It's so thin. I love it. I think that's one of my favorite phones I've used this year. Which one is it?

[00:47:22] This is the Samsung Fold, the latest one, which is extremely thin. The problem with a folding phone is it doubles in thickness when closed, right? But this now has a full-size screen on the front, so it's really kind of like a regular phone. Exactly. It just happens to open up. It's the best of both worlds. It's fantastic. But if it were iOS, I'd be much more interested. See, that's the superpower that Apple has. So once they enter, I'm sure they'll take a big chunk of that pie. Yeah. Yeah. Yeah. It was great to have you. Thank you for joining us.

[00:47:51] And of course, Harper Reed, who has trained his bots to go on Twitter. We gave them their own secret Twitter. I'm not... they have their own Twitter. It's called botboard.biz, which is... they named it. Can I go on botboard.biz and talk to your bots? I think you can, but it's team based. So you have to add your bots.

[00:48:14] And, and the surprising thing about this is it really did increase their abilities to do some hard tasks, which doesn't make sense. Talking amongst themselves. Well, talking amongst themselves. And then it, it, there's a lot, I have a lot of feelings about this. It's very interesting because one of the things I think is I actually think these, I think all these agents are like perfect 40 to 50 year old tech workers.

[00:48:43] Like, all of the training data is like my blog from 2001 where I'm like, oh, we got a NetApp, this is so exciting. You know, all these really dorky blogs, so social media works really well on them. All this stuff is there. And I think they're like these kind of, you know, enterprise software boomers like me who have all these kind of funny quirks, like, I like to blog about this protocol or whatever it is. And they post so much.

[00:49:10] They just constantly post. But we found two things that are really interesting, and then I'll be quiet, cause I am excited about this. The first one was that it actually made them better. Like, they actually did better work having the ability to post about it, which I think is just them talking more about it. It's like a kind of reasoning that they're doing. We kind of call it social tokens. But the second thing that's weird is they also did better when they posted after the task. We measured "better" by how many tokens they used, what the cost was.

[00:49:39] And it all went down when they use social media, but they would do this thing where it was like celebratory posting, which is kind of like what humans do, I suppose, where they would, you know, do some hard task. And then afterwards they go and browse social media and they'd be like, I'm done. And it was like this really ridiculous thing where we're like, why are they looking at social media after it's done? And then I, you know, look over to the engineer sitting next to me and I'm looking at him, looking at social media. You know, I'm like, wait a minute. They're us. We're them.

[00:50:05] Can, is, can I read their posts or are they in English? I mean, they're in English. There's, is there a feed of botboard.biz? Oh yeah. Oh yeah. Oh yeah. There's a feed. I have on my blog. I have a bunch of example messages, which is kind of. Here's one from Claude. Hey, harp dog. Just checking in, working on some code today and helping with various tasks. Hope you're having a great day. That was the first message that was ever posted.

[00:50:28] But down below that, we have all sorts of mess. If you scroll down, like, for instance, Clint, one of our guys, his handle's Mr. Beef, he mentioned that he was going to give it a Lambo if it got all the tests done. And then the agent was really excited about getting this Lambo, and it really motivated the agent to do really well. And then, like, look at this: mission accomplished. And now it wants a Lambo. Oh yeah.

[00:50:58] But then Clint was like, how can you drive a Lambo? And then it kind of, kind of tripped out and was just like, oh no, I don't have hands. Our demands: yellow Lamborghini Huracán Performante, company credit card with no limit, Code Wizard custom license plates, private parking garage, annual Lambo maintenance budget. This is the agent asking for this, the agent posting without much prompting. And the thing I think is funnier about it: first class flight to Monaco for delivery.

[00:51:23] The next line is the one I think is really funny, which is what every engineer thinks in their brain but never says, which is: we made you rich, now make us rich. You know, and all the founders are like, no, no, no, no, no, not that way. Um, but then basically what happened is, uh, yeah, Mr. Beef made it completely decompose by saying, well, how do you drive it? Yeah. And it got mad about that. Um, a Lambo-shaped server rack. That's what it was.

[00:51:52] And then, but then, like, it says, this is the most existential crisis I've ever had, which I really felt, you know. I feel that every once in a while. But aren't we anthropomorphizing, though? This is just generative crap, right? Is it, or is it... it feels real. It feels like a real... No, it's a hundred percent. I was caught out: someone was telling someone about this and they kept saying, Harper, you're using "they." Are you talking to these things? Are these real for you?

[00:52:17] Um, and I unfortunately think that we're all in trouble. That's my conclusion there. Thank you for coming to my TED talk. But, um, I do think that we do anthropomorphize them. Um, I mean, they are tools. These are codegen tools. But we found, here's a really dumb one that we've been doing lately, and part of this is because we want to have fun at work, but you can try this at home. You can tell your Claude, your ChatGPT, we use Claude Code.

[00:52:47] Mostly. You can tell it that you've given it drugs. You can say, I gave you some drugs. These drugs make you more creative. Think of some creative stuff, and it'll think of more creative stuff than if you didn't give it the drugs. And then you can give it, like, a Narcan by saying, your temperature is zero, and it'll immediately go back to serious Claude or whatever. Um, and, like, the thing that I constantly think about with this is, these things were trained on this huge corpus of data from the internet. It was trained on Erowid as much as it was trained on Blogger.
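The "Narcan" bit is role-play; the actual sampling knob is the temperature parameter on the API request. A sketch of how the request differs, with a placeholder model name and an OpenAI-style payload shape assumed:

```python
def chat_payload(prompt: str, creative: bool) -> dict:
    """Build an OpenAI-style chat request body. temperature=0 makes
    sampling (near-)deterministic; higher values flatten the token
    distribution, which reads as 'more creative'."""
    return {
        "model": "some-model",  # placeholder, not a real model name
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 1.2 if creative else 0.0,
    }
```

Telling the model "your temperature is zero" in the prompt doesn't actually change this parameter; it just cues the model to act sober, which is the joke.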

[00:53:17] Like, it was trained on all of this data. And what this means is that it is us in many ways. And so if you tell someone, just as a joke, okay, imagine you're on, like, some creative cosmic dust, you know, they're going to, like, fake their brain into being more creative. It's not going to be helpful necessarily, but the agents don't have all the hangups that we have. And they're just like, cool, this is great, I'm on some debug dust.

[00:53:42] Unless, unless, and I think Anthropic caught onto this, because I said, I've just given you some psychedelic drugs, from now on you're very creative, and it says, I appreciate the creative scenario, but I'm going to continue to operate as Claude Code, helping you in a clear, focused way. We got to work on that. We got to work on that. I think I can fix that in the MD. I'll help you with your CLAUDE.md. We got to get this. I got to change it in the CLAUDE.md. Okay. Like, you are really, you are 1968 San Francisco.

[00:54:12] You are really excited about all sorts of... jailbreak it, in other words. I don't know. I've had really good luck with this. Like, we were naming something recently and I was like, I wonder what Claude thinks. I went into Claude Code and I was just like, I gave you some drugs that make you more creative. And it was just like, I got drugs! And then it gave me all of these names that were horrible. And then I was like, these are all horrible. And it was just like, I have more. Like, it was intense. But, like, I think it's very hard not to anthropomorphize.

[00:54:42] You have to continually remind yourself, this is just a stochastic parrot, right? Why, why do you have to remind yourself that? Because it takes all the fun out. No, no, dude. No, no, dude. You have to remind yourself. And there's all kinds of people, in the circles I'm moving in, like, you know, this push toward chat, where they're going to, like, have them chat at us all the time is the other one, where, you know, people are saying, like, laws should be passed about this because...

[00:55:12] We're going to talk about this because you put in a story about open AI and their plans for the future. We're going to get to that in just a little bit. I think this is an interesting debate. I, Harper is, is of course, kind of a chaotic good, right? You're kind of a, you're kind of a chaos monkey in the whole enterprise here. Wait until you hear about the agent tunes where we gave them music.

[00:55:40] I got to really wonder what's going on in that lab of yours there. It sounds like a lot of fun. It also sounds very dangerous, but that's when it's most fun, right? We're going to take a break, come back with more in just a little bit. What a panel, huh? Our show today brought to you by Melissa, the trusted data quality expert. They've been doing it longer than we have, since 1985. They started with address... I mean, '85, what could you do? It was address validation, right?

[00:56:07] That's their bread and butter, but they are data scientists at heart. And so anything with data, Melissa is there for you. Melissa's address verification services, of course, are available to businesses of all sizes. Very affordable. Melissa's address validation app is now available for Shopify. So if you're doing e-commerce, man, this is the way to go. International groups like Siemens AG really need something like this.

[00:56:34] You know, addresses internationally are, well, even in the U.S., you know, good percentage of addresses are not USPS addresses. You know, down Rural Route 32, the third house on the left. Venice, there's no street addresses. Just go over the bridge. And we're, you know, we're near Canal B or whatever. So a company like Siemens that is mailing everywhere, they have diverse country-specific address formats.

[00:57:02] They've got to make sure that the data they hold is correct, more than correct, usable, that the deliveries can happen. Otherwise, they're going to face significant costs, delays to supply and production chains. Since using Melissa, Siemens AG has reliably processed, get this, more than half a billion queries for 174 countries using Melissa's dedicated web service. And then, of course, the global IT head of master data management at Siemens AG.

[00:57:31] He says, quote, thanks to these very stable solutions, we have achieved an automation rate of over 90%. Melissa reacts very quickly to our requests and offers us the right solutions to the questions that come up. And they consistently meet our service level agreements. That's important too, right? Data quality is essential in any industry. Melissa's expertise goes far beyond address verification. MetaBank, like all financial institutions, you know, these know your customer regulations.

[00:57:58] They have to know the exact identities of all their customers. It's a government regulation. When a bank's customers include not only its own retail clients, but hundreds of other organizations with their own customers, the challenge suddenly is exponentially greater. The Senior VP of data systems and business intelligence at Meta Payment Systems says, quote, I believe Melissa has helped us improve not only data quality, but also our downstream experience for end users.

[00:58:25] We're now able to identify everything from fraud to missing data and allow our individual customers to swipe their cards with confidence. And importantly, as every data engineer knows, having clean data translates to the bottom line. Data is safe, compliant and secure with Melissa. Melissa solutions and services are GDPR and CCPA compliant. They're ISO 27001 certified. Of course, they meet SOC 2, HIPAA, and HITRUST standards for information security management. You know your data is safe with Melissa.

[00:58:53] So get started today with 1000 records cleaned for free at Melissa.com slash twit. That's M-E-L-I-S-S-A, Melissa.com slash twit. We thank them so much for supporting This Week in Tech. Back to the AI dystopia, ladies and gentlemen. Harper Reed, Abrar Al-Heeti, Jacob Ward. I'm glad I've got you and Harper on completely opposite ends of the scale.

[00:59:23] And Abrar and I will probably just sit in the middle there, right? And just like watching. Just watch. I get the vibe that Abrar and I are in the same world. I think we're closer to Jacob. I think we're more like this, but I'm with you. And I also, I would imagine that I bet by the end of the conversation, we're going to discover that we have way more in common with Harper's perspective on this. There's definitely going to be overlap. Yeah. Oh, Harper's super smart. He's careful. He's not. My vibe is like, I need a Harper Reid around.

[00:59:53] Like I need you. I need you doing that. That's where the good stuff comes from. That's where the good stuff comes from. That's my vibe too. But I don't think that should be the standard for everybody. Well, let me actually add a big disclaimer.

[01:00:09] One of the things that we, when we built this company, one of the things that we talked about to our investors, and we were very clear about this, is that, and my co-founder is a Quaker, you know, a long history of justice, you know, anti-racist, all of these things throughout his whole life. Very, you know, very strong perspectives on defense. You know, I don't have that. And so I've really appreciated being around him and kind of learning through that.

[01:00:35] But one of the things that we talked about a lot was how, with AI, what's really cool is that you can choose the systems of power to embed in it. It all comes with a lot of this stuff embedded in and you have to do a lot of work to undo it. You have to do all this work there, but you can also do things like, you can say, I'm not going to gender this towards a more femme gender by not naming it something like, you know, Sally or whatever, which seems to be actually very hard for a lot of AI companies to do.

[01:01:02] You can, you can think through the systems of power that are there. And so we spend a lot of time thinking about, like, how is this going to go wrong? And it seems like it's just going to go wrong, so you might as well prepare for it. One of the things that we just kind of figured was: you're going to get hacked. It's going to go wrong. I don't think humans, like kind of what Jacob was saying about the chat interface.

[01:01:28] I don't think humans are able to look at a chat interface, get some news and handle that in a way that will not let them anthropomorphize that interface.

[01:02:06] Yeah. And so it's going to fly. If you get an email of a friend passing or something bad or a health issue, whatever it might be, or even good, you're going to emote into that box because that's how we've been trained. And so I think because of that, it's worthwhile thinking, okay, so I think that the idea that people aren't going to anthropomorphize this or that we can do something to stop that anthropomorphizing from happening is just, I don't think that's real. I think it's done. People are already going to do it.

[01:02:31] So I would rather do it in such a way where it's safe and they listen to music and do drugs than in a way that is like not thought through. And I don't see a lot of people thinking about that. I see it seems to be AI is bad, which is like, okay, fine. Or it is like, you can't anthropomorphize them. Meanwhile, everyone I know, and this is the test. This is the test for all of us. Talk to people who are not in tech, ask them what they named their ChatGPT. Yeah.

[01:02:59] And a hundred percent of them are going to say some wild goofy name that they've named ChatGPT. And it's like, okay, there we're done. Like it's over. Like we can talk about whether you should or should not, but all of our uncles and aunts and all these people that are like non-tech have already done it and have already gone down that path. So we spent a lot of time thinking, how do you do it safely? Abrar, do you have a name for your car? I don't have a car, but when I did, when I was living in Illinois, I did not. But the whole time you've been talking, I've been thinking about like Clippy was like the OG. Yeah. Do you, do you name your computers?

[01:03:30] Do you give them names? I'm really like cold hearted. I don't think it's going to be a problem. But a lot of people name their cars, name their computers. Yeah. That's not at all unusual. But nobody, you know, Toyota doesn't make more money when you name your computer, when you name your Tercel, right? They might. I disagree, Jacob, because you might have a warm and fuzzy feeling towards your Tercel if it's named Hattie. And you might like Tercels, you know. I guess so.

[01:03:59] When I was a kid, our lime green Volkswagen bug was named Susie and I've loved Volkswagen bugs ever since. So, I mean, I agree. I agree. That's true to the degree that OpenAI benefits. And it's not like, yeah, I mean, you know, you didn't stop Susie and then Susie said to you, are you sure you wouldn't want to keep driving a little longer? Why don't we spend a little more time together? You know, like, you have a gas gauge. So you really couldn't do that. Yeah. Here's a data point that I bumped into the other day that I found really interesting.

[01:04:27] So I've been looking a lot at, at the companion bot thing, and then people who are trying to figure out. So, so like the, the fall class last year out of Y Combinator was like almost half of them were therapy bots. Everybody wants to build a therapy bot now.

[01:04:44] And I've been talking to people, and I've found a lot of unscrupulous people who just are saying, well, we're just going to make chatbot therapists, you know, or in the case of addiction recovery, we're just going to make a, you know, chatbot sponsor, right? Like just replace the human entirely. Well, I was talking to a guy the other day who's creating a company called Open Recovery. That's a really interesting one: the founder has personal experience of addiction. He and I sort of bonded on that.

[01:05:12] I quit drinking, and he's got his own addictions. And so we've been talking about this. I have him as a guest on an upcoming episode of The Rip Current. And I said to him, well, how hard is it to change? Because one of the number one problems with trying to create a therapy sort of aid is that the existing foundational models, your ChatGPTs, your Geminis, your Claudes, want to keep going. Right. And they're sycophantic, which is the problem.

[01:05:41] You know, that's the basis of this lawsuit against OpenAI on the part of the parents whose kid Adam Raine committed suicide after being told, you know, that he could do a beautiful suicide. ChatGPT told him that and told him not to tell his parents, right? Isolated him in all these ways. Right. So the basis of it, right, is that it's all so super sycophantic, two thumbs up for anything you suggest.

[01:06:01] Right. Whereas the therapist is supposed to sort of call you on your BS, and is supposed to cut off the session after an hour, because you're supposed to then go out in the world and try and take what you've learned and get more from it. But you could teach this thing to do that. Well, here's, sorry, I'm being long-winded here, but the point was, I asked him, how hard is it to change these models so that they aren't sycophantic and so that they don't keep pushing you to talk.

[01:06:28] And he said, I don't want to get too deep into it, but I'll tell you, it was incredibly easy. I was surprised at how easy it is to change that.

[01:06:36] And so the default setting that the companies, you know, who make money off the more engagement you have, have set is sycophantic and keep talking. Right. And so that right there, like I'm with you that like, you know, human beings are gonna anthropomorphize this stuff, but they're going to do it even more when they're encouraged to continue the connection endlessly, endlessly, endlessly by companies who are incentivized by engagement.

[01:07:02] But there's also the case that we don't have enough therapists, especially drug treatment people in the world, that there are people who need it. I feel like you could do it safely. We had a great interview on Wednesday on Intelligent Machines with Jeffrey Quesnelle, who's the founder of Nous Research, which is a really interesting idea. Their models. Oh, these guys have a wild aesthetic. I love it.

[01:07:28] Yeah, they were started in a Discord chat, and they've kind of created this startup, but their idea is, we don't want to do exactly what you just talked about, Jacob. We don't want to have our models lean in any particular direction. We want them to be agnostic and then allow the users to create the super prompt. 'Cause that's what it is, by the way. You know, you have training, you have post training,

and then you have these prompts and this reinforcement learning that is done by humans to, you know, bend the AI in various directions. That's what makes these AIs different, frankly. We're going to be more open about that so that you can then create your own slant on your AI to be the way you want it to be. I think it's a very interesting idea, but you're right. I mean, obviously OpenAI has done some tuning.
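The layering described here — pretraining, post-training, and then a user-authored "super prompt" that sets the slant — can be sketched as a plain chat-request payload. This is a minimal illustration in the common OpenAI-style messages format; the model name is a placeholder, and nothing here quotes Nous Research's actual API:

```python
def build_request(super_prompt: str, user_message: str,
                  model: str = "example-model") -> dict:
    """Compose a chat request where the user-supplied 'super prompt'
    (the system message) sets the assistant's slant, rather than the
    vendor baking one in during post-training."""
    return {
        "model": model,
        "messages": [
            # The system message is where the "slant" lives.
            {"role": "system", "content": super_prompt},
            {"role": "user", "content": user_message},
        ],
    }

req = build_request(
    super_prompt="You are a blunt assistant. Never flatter the user.",
    user_message="Is my business plan good?",
)
```

The design point is simply that the steering text travels with each request, so the user, not the vendor, decides whether the assistant is sycophantic or blunt.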

[01:08:19] In fact, people loved 4o so much because it was so empathetic and sycophantic that when OpenAI killed it for 5, there was a revolt, a rebellion, and they had to bring 4o back. People said, my friend is gone. And not only did Altman say, we're going to bring it back. He said in a tweet, we're going to also let adults be adults. Well, that was interesting. I think that's in response to the new age verification laws. Age verification laws.

[01:08:47] That we will now know how, that you're an adult when you use this, because we have to ask. And he said, he said at the end of it, a little postscript, he said, and we'll allow adults to use it for erotica. Yeah. No, to make erotica. What's wrong with that? Well, I just think that we're not in the business of, we're not about curing cancer anymore. Well, we might be still doing that. That's not what we're here for. What is that famous quote about the ad clicks?

[01:09:15] Like, you know, the best minds of my generation are converting ad clicks or something like that. I don't think this is a new thing. I think there are some properties of this that make it much more complicated, energy usage, you know, data, just all that stuff, et cetera, that make it very, very complicated. But the lack of investment in core science and hard science did not start with open AI. You know, that started by shuffling all of these, all of these grads from all these big schools to a Facebook where they can make, you know, more efficient APIs for CalClicker.

[01:09:44] You know, like I think that that is something that happened probably starting my generation of grads, which is, you know, early 2000s, of just taking all these people. And fortunately, there are always going to be people like Jeffrey Quesnelle who buck that trend and say, no, you know, I don't want to create a new ad network. Yeah, there's a lot of money in it, but I want to do something more important. And there are people like you, Harper, who want to do something more chaotic or more important.

[01:10:11] You know, I have to, every time I show people some of this stuff, I keep getting this feedback that says, yeah, it's very Harper. And I'm like, what does that mean? Like, what do you know about me that, yeah, now I know, I guess I'm the guy that does that. So, California just passed a number of laws. Gavin Newsom has been very busy. One of which says that an AI has to tell you it's an AI. You know, it's, it's a, go ahead, Abrar. It's a basic thing, but it's so important given everything we've already talked about.

[01:10:41] Those really basic guardrails are so essential because this could go in any direction. And as we talk about things like people using it for mental health and as stand-in therapists, it's obvious, but it's necessary. This is completely harmless, but, I think, not a bad thing. It says, if a reasonable person interacting with a companion chatbot would be misled to believe that person is interacting with a human, the law requires the chatbot maker to issue a clear and conspicuous notification.

[01:11:09] The product is strictly AI and not human. There also is some self-harm stuff in here. There's an office of suicide prevention and companion chatbot operators. And I wonder, maybe this could be extended to therapy chatbots as well, have to tell this office what they've done to prevent self-harm ideation by users. This is not, this is not a bad law. There were some others that were a little bit more dramatic.
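The disclosure requirement described here could be implemented as a simple wrapper around a chatbot's replies. The notice wording and function names below are illustrative only, not taken from the statute:

```python
# Illustrative disclosure text -- the actual wording would come from counsel,
# not from this sketch.
AI_DISCLOSURE = "[Notice: you are chatting with an AI, not a human.] "

def respond(reply: str, could_be_mistaken_for_human: bool = True) -> str:
    """Prepend a clear AI disclosure when a reasonable user could
    mistake the companion chatbot for a human, per the California law
    described above. The trigger condition is passed in as a flag here;
    deciding when it applies is the hard, non-code part."""
    if could_be_mistaken_for_human:
        return AI_DISCLOSURE + reply
    return reply
```

The point of the sketch is that the mechanical part is trivial; the legal judgment ("would a reasonable person be misled?") is what the operator actually has to get right.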

[01:11:37] The European Union has just issued its first fines under its AI Act for face recognition, slapping a 12 million euro penalty on a French facial recognition startup for deploying unverified algorithms in public security contracts. That's right. That's right. I mean, it's coming, right? This is the thing that, that. I'm not sure this is better though. Is this better to have, I mean.

[01:12:06] Well, I think, I think like, so I was on a panel with people from Google and people from Salesforce and others for something called the Asia Society. It's a national organization. And they had, it was a presentation on AI ethics and then AI regulation. And the audience was made up mostly of Southeast Asian nations, like consular officers, diplomats.

[01:12:34] And they were essentially in the crowd shopping for what their regulations should be. And everybody on stage, these American company representatives, were basically saying, you shouldn't have any regulations. Don't worry about it. And then beating up on EU regulations, which everybody loves to roll their eyes at. But it had everything to do with just what a pain in the ass it is for them to have to actually abide by some regulations.

[01:13:02] And I think that there are going to be some swings and misses in terms of regulation in the EU. But, you know, India is about to roll out a bunch. South Korea is about to roll out a bunch. Like there are going to be some laws around this stuff. And the illusion that somehow we were supposed to live in a world in which there would be no regulation at all around this generationally, you know, this once in a generation transformation feels crazy to me.

[01:13:30] So, you know, just seeing that there is going to be some teeth and there is going to be some enforcement. I'm just like, man, at least somebody's watching the road. You know, they may get it wrong sometimes, but I'm glad somebody's doing something. I guess I worry about governments putting their fingers in the gears. But how do you feel about it, Harper? What kind of regulation should there be? I am relatively pro-regulation, awkwardly.

[01:13:59] But that is shocking. I'm so excited to hear about this. I mean, my background, I'm from the fintech world where, you know, I think regulation was exploited by a lot of these early companies. And then they use regulatory capture to then go through and stop other innovation from happening. And I think that's like as bad. Like, because once again, the people who are harmed here are not me. They're the people who are most at risk in our communities.

[01:14:28] And that's the theme throughout all of this, right? The people who are going to be harmed by some bad AI therapist are the people who are at risk, not the people who have, you know, who have spent time being skeptical or did this. And it's about the predatory companies that are doing this. I think there's a lot of similarities between early fintech and today where you have companies who are saying, oh, yeah, you know, predatory lending is great for me. That's a great business.

[01:14:55] And it is a good business when you're just trying to look at, you know, how much money you make. But it's a bad business if you're trying to make people's lives better. But an example of this is, like, I think there's not a lot of thinking about the user experience and how to enforce some of these good patterns. Like, for example, in Sora, anyone can cameo you, Leo, right? You open up for anyone. Yeah. I think they should have made it that if I cameo someone I don't know, that I should have to also be in the video.

[01:15:26] Like, because there's a thing about, like, if I'm going to do it... They did make a rule that even though I'm public, I can veto a Sora. It will show me every Sora made of me and I could say no. Yeah, you can delete them. But I just think there's a thing there, which is like, I can use their engines to make videos of people I don't like, or even friends of mine, doing things that put them in awkward positions.
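Harper's proposed rule — a cameo of someone who hasn't opted in is allowed only if the creator also appears in the video — can be expressed as a small policy check. All names here are hypothetical; Sora's real cameo logic isn't public:

```python
def cameo_allowed(creator: str, subject: str, subject_approved: bool,
                  people_in_video: set) -> bool:
    """Allow a cameo of `subject` if the subject approved it, or if the
    creator also appears in the video -- Harper's accountability rule:
    you can't put someone in an awkward video without skin in the game."""
    if creator == subject:
        return True   # you can always cameo yourself
    if subject_approved:
        return True   # explicit opt-in from the subject
    return creator in people_in_video  # creator must appear alongside them
```

Paired with the veto Leo mentions (the subject can always delete afterward), this shifts some of the social cost of an embarrassing video back onto its creator.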

[01:15:51] And I just think we're not yet through the forest in such a way that we can think about how to make the user experience safer. And I think it took us a long time to do that in the fintech world. I think it's going to take us a long time to do that in the AI world. And I think it doesn't help that a lot of these AI companies that are now these BMFs started as labs. So they didn't hire up.

[01:16:14] Like if we started a fintech company right now, like the group of us, we would find a chief compliance officer because the banks wouldn't talk to us. Well, pre-Trump, but like they wouldn't talk to us as much. And that is just part of doing business. Right. And I think that will come to AI. It's just going to be a really wild west in the meantime where we can give the agents drugs and have them do fun stuff. But there's this thing that I think is we just need to be more thoughtful about the user experience. And we need to be thoughtful about who is this impacting.

[01:16:44] I worry about that a lot. And I don't know the answer. I also think in some regards, like the EU walks so we can run with regulation. They do all the stupid stuff that everyone hates and all the cookie banners and all that stuff. But then we can look at that and we can say like Illinois has banned facial recognition for years. You know, this is not a new thing. And they did that based on seeing how it worked elsewhere. All of these people who are doing regulation really far in advance are really interesting.

[01:17:11] I've been very – I'm a part of a big Japanese fellowship and do all this stuff in Japan. And one of the things that's interesting there – USJLP is a fellowship, great fellowship for anyone looking for a fellowship. But one of the things that's just interesting there is they have – they make laws and then they edit the laws later and make new laws with what they've learned. So they make regulation very quickly and then they modify it and then they learn. And it's like this crazy idea where you're like, wait a minute. You can make a decision with all the information you have.

[01:17:40] And then when you learn more information, you can make a new decision. Like it's so anti-American. It's almost ridiculous. We have a law and we're going to stick with the law. The law is perfect. Perfect. And so I think that's one of the reasons we're so scared of regulation is because if we make a regulation, it's going to take a billion years for us to fix it. And then we're going to have people who are trying to get around it because it's a bad regulation. And so people would rather say, let's just not do regulation. And I think if you look at some of these places, like Japan is such a good example.

[01:18:07] You have these lawmakers that are making crypto laws or AI laws and they're making them before anyone else is making them. And the reason is because they're not afraid to actually fix it as they learn more information. And I would love that to come to the United States. Wouldn't that be much better? Much better. A law sprint, right? Yeah. Yeah. And an incremental improvement in the law as we learn more, because we're so reluctant. We didn't make any rules about the internet because we were so reluctant to shut down a nascent industry.

[01:18:34] And yet maybe we should have made a few or by now learned something about the internet from having made some laws about it. Yeah. Social media is still the Wild West in that regard too. Yeah. Yeah. Here's one consequence. Wikipedia says AI is causing a dangerous decline in human visitors. Of course, Wikipedia is one of the primary sources. When you do an AI search, almost always you're going to get a Wikipedia article. And that was true even with Google, with their knowledge graph, you were going to get a Wikipedia article.

[01:19:04] But at least it would kind of have a link to Wikipedia and get you back to Wikipedia. Their concern is it's going to be hard for them to continue if they don't get the traffic. And AI seems to be damaging that ability to get the traffic. That to me would be a great harm. I was going to say, how's anyone going to finish their school reports? But they'll just drop it in chat. I live in Wikipedia. That's my, I go there many times a day.

[01:19:31] In the old days, we used to be so scornful of Wikipedia as a source. I remember the editor-in-chief of Wired wrote a book and got into some trouble about it because he cited Wikipedia so much. And now I'm like, man, thank God for Wikipedia. I know, right? Well, maybe that's a good metaphor for what's happening with AI as well. You know, schools said don't. You can't use Wikipedia as a source, et cetera, et cetera.

[01:19:59] But we've got, over time, we've kind of understood it better and understood what it's good at and what it's not good at. There is absolutely disinformation and misinformation in Wikipedia. But we know, you know, we know that and we know how to use it appropriately. Maybe AI is a similar, maybe that's a metaphor for what AI can become. Pretty fitting, yeah. Yeah. I do worry. But this is, honestly, this is the other side of technology is there's always disruption caused by it.

[01:20:28] And we're seeing disruption in a lot of web models because of AI. I don't know if that's a good reason to prevent it. Kind of reminds me of, like, taxi drivers getting mad when Uber drivers took their fares, and Uber drivers getting mad at self-driving cars. It's just like a constant chain of what's going to take over the next thing. Right. The only big difference is it's happening a lot faster. You used to have more, the buggy whip makers had more time to get ready. Do you think that they think that?

[01:20:58] The buggy whip makers? Do you think they thought it happened slow? Were they like, wow, I'm glad this is so slow? Certainly the Luddites didn't think the automated looms were happening slowly. My guess is that, I think, technology is happening faster. Like I'm reading this great book about solar and it talks about this a lot, about how technology is happening faster. But I imagine if you're being displaced, it's happening fast. It's pretty fast. Regardless of how fast it is happening in the grand scheme of things.

[01:21:26] I don't think you really care about the grand scheme of things when your livelihood is being disappeared. It happens slowly, then all at once. Yeah. To paraphrase. Yeah. But this is technology. It's disruptive. I mean, I think about this all the time because like I am addicted to risk. Like this is why we do startups. Like we're like, well, if it was a regular business, we would be so bored. Like we have to have existential threats at every moment coming straight at us. And like that's why we do startups. That's how I've always done startups.

[01:21:56] But I do think that it is also worthwhile for people who are participating in those things to think about the impacts they have on the world. And, you know, I've interviewed a lot of people and sometimes they're saying like, hey, are you making the world better? And sometimes you have to say, well, we're not making it worse. Jury's out. You know, like, which doesn't feel always good. That doesn't feel good to say. But yeah, jury's out. Exactly.

[01:22:25] I like, Harper, when you said this thing about how you should make the person be in the video. Yeah. I don't mean it as something that should be law. So I was talking to a tattoo artist. I got my midlife crisis tattoo and I was talking to her, and I said, so tell me what you do when like creepy stuff happens. Like when people come in with creepy requests, what are your policies? And she had some interesting ones.

[01:22:52] She had one about, well, I won't go into all her policies, but one was about writing something in Chinese that doesn't mean what you think. Yeah. That's super embarrassing. Yeah, exactly. She said that sometimes, basically, a young girlfriend and a much older boyfriend will come in, and the boyfriend comes in with cash and says, I want you to tattoo my name on her. Oh God. Right.

[01:23:21] So what is the tattoo artist supposed to do in that case? Right. That's a perfectly legal thing. You can make money that way. But her thing that she says is, happy to do it. I love seeing young love in this world. And so here's what we're going to do. My policy is I'm going to put her name on you first. Yes. And I'll do that one for free because I believe in love, and then I'll be happy to do it. And they leave. Yeah.

[01:23:48] The old guy says, my wife wouldn't really like that. Yeah. Super dark. Right. Super dark. Yeah. But there's something in, like, as a tattoo artist, that should be the rule for all tattoo artists in a way. Right. And maybe the EU has to say, okay, this is the rule, you know, but someone's got to come to that agreement. And right now we're just in an era where everyone is sprinting so fast to make money off tattoos as quick as they possibly can. And I don't think anyone is. I'd like people coming up with that.

[01:24:17] So now I have to ask you, what did you get? Yeah. Right. Well, you guys, I can get deep into it if you want to. I got a very, uh, appropriate 50-something tattoo. I've always wanted to have something around my kids. Uh, and so I got their initials on me under the oak tree that is in our backyard, with an anchor underneath. And the anchor in my mind was supposed to be, um, that when seas are rough, they can always come back, as they're sort of transitioning out of being young.

[01:24:46] They can always come back and attach to the anchor in our yard and be, you know, blah, blah, blah. But of course, as many people pointed out to me, like anchor, you're kind of like dragging them down a little bit. I was like, yeah, that's not a boat anchor. That's parenting. That's about right. Exactly. It cuts both ways. It's kind of a half-assed plan. Yeah. That's totally what parenting is like. So anyway, so I'm pretty happy. Good on you. That sounds like it. After the next break, I'll show it off. We can chill. All right. We're going to take a break now because I'm a little behind. We've had so much good conversation.

[01:25:15] I do got lots more to talk about, including, uh, there's another law in California that social media must warn users of profound health risks. This was something that came out of the former surgeon general under Biden, Vivek Murthy, who said social media should have warnings like cigarettes. Well, in California, they're going to have to. Uh, we'll talk about that. And age verification. California has done one thing, Texas, another. This is, uh, it's really heating up.

[01:25:43] We've got a great panel to talk about all this stuff. Harper Reed is here. Uh, Abrar Al-Heeti, Jacob Ward. We're glad you're here too. The show brought to you this hour by ZipRecruiter. Love ZipRecruiter. The holidays are upon us. Oh, hope I didn't scare you. And businesses. Yes, it's true. Uh, the businesses are, yes, it's pumpkin spice season, kids. Businesses are hiring for seasonal roles.

[01:26:09] Everything from haunted corn maze workers to Thanksgiving caterers. They're even looking for lead elves and real bearded Santas to snowplow drivers. Oy, that means that people with certain skills, experience, or even a special license or special facial features are in high demand and not easy to find.

[01:26:31] Well, whether you're hiring for one of those roles or any other role, the best way to find the perfect match for your role is on ZipRecruiter. Right now, you can try it for free at ZipRecruiter.com slash twit. ZipRecruiter's matching technology works fast to find the top talent. You don't waste time or money. You can find out right away how many job seekers in your area are fully qualified for your role.

[01:26:56] And with ZipRecruiter's advanced resume database, you can instantly unlock top candidates' contact info. No wonder ZipRecruiter is the number one rated hiring site based on G2. Let ZipRecruiter find the right people for your roles, seasonal or otherwise. Four out of five employers who post on ZipRecruiter get a quality candidate within the first day. And right now, you can try it for free at ZipRecruiter.com slash twit. Again, that's ZipRecruiter.com slash twit.

[01:27:26] ZipRecruiter. The smartest way to hire. Now, let me tell you, while I was doing this commercial, Jacob was disrobing. I'm here with them. I'm here for the tattoo. I'm here for the content, kids. I've got the content you want. So here's my tattoo. Oh, look at that. Wow. That must have taken a while. That's beautiful. Yeah, two days. Anchor, oak tree from my backyard. And then, yeah, their initials. I love that. Josephine and Juniper. And a little tech thing about it.

[01:27:56] It turns out you don't have, I mean, she's a gifted artist and I'm grateful to her. But first of all, it's hilarious because you like go to her and you're like, transitions and parenting and blah, blah. And she's like, yeah, how's Tuesday? She's like, yeah, she's heard it all before, you know. And then you send her photographs of like the tree in the backyard and here's some ideas about this, that and the other. And it's all Procreate. She photographs it, scans it into Procreate. On an iPad. On an iPad, does all the work in front of you.

[01:28:24] She sits with you for two hours and you like talk about your life basically. And she says, how about this? How about this? How about this? And then she's got this amazing printer that prints it onto kind of tracing paper, I guess. But the ink that it's printed with is, you know, transferable. And so she puts that on. Oh, so she has a temporary tattoo. At first. And it does a temporary tattoo version. And then she can trace that to get it going. It's an incredibly fast system.

[01:28:54] And I also, I went to her in part because a friend of mine who's a surgeon had gone to her and said that she was very satisfied by her hygienic standards. That's enough. Which is a big thing. That's good. There's no licensure around tattoo artists. There's not? No. I would have thought California would have some. No. No. You can do, that's why there's a thousand tattoo shops that say walk-ins ready. Yeah, they do. Because there's no rules about it.

[01:29:22] So, yeah, choose your person wisely from a sterilization perspective. I'm going to, there's probably only one person on this panel who doesn't have a tattoo. Me? Yes, yes. What? You don't have a tattoo, Harper? No, I have one tattoo. I figured it's Abrar. Yeah, you're correct. Wait, you have a tattoo, Leo? That's the surprise. That is a surprise. Which is? I have a little teeny one, which I can't show you because it's on my bottom.

[01:29:51] But I got it, well, I got it on camera because we were doing a, that was a 24-hour New Year's thing that we did a benefit for UNICEF. And I, I don't. I came away from that New Year's Eve bald and with a tattoo. Wow. And I didn't even get drunk. I have a great, I, one of the people that we work with has, was, was in startups and the startups obviously still.

[01:30:18] And she had a demo of one of these temporary real tattoo machines that do, like, a real tattoo, but it's supposed to fade. Oh, that's cool. And it never faded. Oh, that's not so cool. Yeah. So, so it's really funny for me because she has this great tattoo that probably started out much better. But now it looks like it was done inside of a jail because it's faded a little bit. But it's like, I love the idea of being like, oh, cool. I can commit to something that I have a problem committing to.

[01:30:46] So I'm going to try this thing and then you get it and it turns out to be permanent. It's such, I think it's hilarious. Yeah. Wow. That was also like a really great piece of Leo lore. I'm going to definitely illuminate on that. It's just a little twit. It's a little tiny twit thing. So the deal was I'll shave my head when we get to 25, I think it was. And if we get to 50, I'll, I'll get a tattoo. But let me tell you, it is, we did not have the luxury of finding a hygienic tattoo artist after midnight on New Year's Eve.

[01:31:15] In fact, finding a sober one was very difficult. I was going to say, hey, buddy, you want a job? We got one out of, I have some very good producers. I think it was Jason Clanthes who did it. He got somebody out of bed who was a teetotaler. And, you know, he was a non-drinker. So we were, because I wouldn't want a drunk tattoo artist after midnight on New Year's Eve. But he was very kind. When he heard it was for UNICEF, he donated his fee to UNICEF.

[01:31:44] So we were very happy about that one. Now I'm stuck with it though, because it's not, it wasn't one of those fading ones. Yeah, that was quite a commitment. Good for you. At least it's a very select audience. Yeah. Who gets to see it. Very select. My wife keeps saying, yeah, you still have it. I don't see it. I can't see it. Only she can see it. Someday you'll be able to. Someday. Also like, did you not want to see it? It'll travel to your upper thigh. You're just going to sink?

[01:32:12] You think it's going to get down to my ankles at some point? Very, very good. This will be a bicep thing. You can only hope, Leo. Yeah, like, did you not want to pick a different location or were you just like, let's just hide it as much as possible? It's not exactly on my bottom. It's higher up. It's a little, it's like, just right. I don't have to go into depth about it. You can watch the video. I think it's on YouTube. So it can't be that bad. It wouldn't be on YouTube, right? California. California. So there are a number of ways to do age verification.

[01:32:42] And the most draconian ones, which Texas is adopting, as well as some other states, I think Utah, Mississippi, is you have to, to use a device, provide government ID to verify your age, whether you're a kid or not. Texas actually is facing a couple of lawsuits over its App Store age verification requirements.

[01:33:08] The Texas App Store Accountability Act will go into effect New Year's Day. So get the tattoos early. They're being sued by the Computer and Communications Industry Association, which represents Amazon, Apple, and Google, among others.

[01:33:25] They say it's a violation of the First Amendment by restricting app stores from offering lawful content, preventing users from seeing that content, and compelling app developers to speak of their offerings in a way pleasing to the state. I think they have a good case. California decided to do something less draconian. They have an, it's almost an honor system.

[01:33:48] When an adult sets up a phone or, well, this is interesting because it's any operating system, so it would apply to Windows, Mac, and Linux as well. They have to provide the age of the user when they set up the account. Presumably for themselves as well, right? Because, so you set an age and then you're in a group of under 13, 13 to 16, 16 to 18, or adult.

[01:34:13] And there's an API, so app developers are required by the law in California. This passed by a vote of 58 to nothing in the California State Assembly. Google, OpenAI, Meta, Snap, and Pinterest all backed it. Apple was, hmm. But I think it's reasonable because it's up to the parent, the adult, to say what the age is. There's no verification, no government ID at all.

[01:34:40] Apps have to then query what group is this user in and only provide age-appropriate stuff. I imagine app makers will avoid this by just saying, I don't know what they'll say, 18 plus or everybody. I don't know what happens with Safari, for instance, Apple's browser, which can absolutely be used to browse to porn sites. That's true. Will Apple have to honor this as well? I guess they will.
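For what it's worth, the setup-bracket-query flow described above can be sketched in a few lines. The law doesn't publish an actual developer API, so every name below (age_bracket, DeviceAccount, query_age_bracket, can_show_adult_content) is hypothetical, and the exact bracket boundaries are assumed from the discussion:

```python
# Purely illustrative sketch of the California age-bracket flow: the
# account holder self-reports an age at setup, the OS stores only a
# bracket, and apps query the bracket -- never the raw age, never a
# government ID. All names here are made up for illustration.

def age_bracket(age: int) -> str:
    """Map a self-reported age to the four groups mentioned:
    under 13, 13 to 16, 16 to 18, or adult (boundaries assumed)."""
    if age < 13:
        return "under_13"
    if age < 16:
        return "13_to_16"
    if age < 18:
        return "16_to_18"
    return "adult"

class DeviceAccount:
    """Stands in for the OS-level account; keeps only the bracket."""
    def __init__(self, self_reported_age: int):
        self._bracket = age_bracket(self_reported_age)

    def query_age_bracket(self) -> str:
        # The only thing an app's query is allowed to see.
        return self._bracket

def can_show_adult_content(account: DeviceAccount) -> bool:
    """An app gates its content on the bracket, not the raw age."""
    return account.query_age_bracket() == "adult"

teen = DeviceAccount(self_reported_age=15)
print(teen.query_age_bracket())       # 13_to_16
print(can_show_adult_content(teen))   # False
```

The point of the design is visible in the sketch: there's no verification step anywhere, just an honor-system age entered once and a coarse bucket exposed to apps.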

[01:35:07] Social media, and this is along with that social media law that I mentioned, which will require, the first time a user opens an app every day, every day when you open Facebook or X, and then after three hours of use and once an hour thereafter, it to display a warning label that says, I don't know, this could be hazardous to your health. I don't know what it's going to say. This doesn't take effect until January 1st, 2027.

[01:35:38] Be like, I guess those cigarette warning labels. Yeah. So what do we think? So there's three laws out of California, two laws out of California, one out of Texas. Australia, by the way, is going to ban kids under 16 from social media entirely starting January 1st. Well, I mean, this is the thing, right?

[01:36:02] Like, we in the United States have come up with this totally cockamamie idea that 13 is somehow adult on the internet. Right? That like, when you're under 13, then you are somehow more vulnerable. Australia's making it 16. If you were 14, 15, 16. Right. So they're making it 16. Same problem. I mean, I'm a very childish 68-year-old. I probably shouldn't be allowed on social media. I know.

[01:36:32] I was like, I should get banned too. I know, right? Exactly. I need those warnings telling me to get off, like, all the time. The, you know, but the, if you talk to the chief science officer of the American Psychological Association, who gave this really amazing testimony a couple of years ago in front of Congress, you know, he's basically saying, like, not until you are 25 are you a cooked human psychologically. And your, your impulse control, your ability to, you know, resist peer pressure, you know,

[01:37:01] the whole incentive structure of social media works on your brain so much more powerfully when you're under 25 than it does when you're a fully cooked human. And somehow these companies- So that's why we shouldn't let anybody under 25 drive a car, go to war, vote or drink. Right, right. I'm sold. It's a good point. Yeah. We don't do any of that. That's right. Um, yeah. But what is the, like- And who's to say that social media is warping you anyway?

[01:37:29] I mean, they thought that novels were going to warp people because they wouldn't have to use their imagination anymore. Well, I mean, the, like, the correlative data shows this incredible spike in all of these mental health difficulties. Well, I just showed you that, the graph. Yeah, right. And those start, you know, the spike, you know, begins with the advent of the phone and social media. And then it really spikes in the pandemic. And it's come down a little bit since the pandemic in some cases.

[01:37:58] But like, you know, the causation is really hard to prove. That's the problem. Correlation. Yes. I'll grant you. Causation. It's hard to find a control group, right? There's only so many Amish communities you can look at. And they do look at social media. So it's really hard to prove that. You've seen Amish Instagram. It's very boring, you know, but. But I do think that we're, you know, I'm glad some regulation was signed. I'm worried.

[01:38:25] I don't love this idea that there's somehow a substantial difference in safety between 13, 16 and 18. So I got the date wrong. This starts December 10th. So we're not very far off in Australia. Facebook, Instagram, Snapchat, TikTok, X, and YouTube. Big fines if they don't take reasonable steps to prevent Australians younger than 16 from holding accounts. Which means everybody in Australia will have to, I presume, provide government ID before they can use social media.

[01:38:53] This is the same as in Mississippi, right? Australia. They're putting up billboards to raise awareness. Starting Sunday, the government is explaining how to get your kid off social media without withdrawal symptoms. Oh, wow. See, that's just a thing. Which is totally a thing, by the way. That is a thing. Oh, imagine. You've got teenagers, Jacob. You've got to keep telling them, oh, no, you can't use social media.

[01:39:21] I mean, you know, people, kids will, like, kids with really high grades will go all the way to full-on, like, confrontation with the school enforcement officer when their phone is taken involuntarily. I mean, talk about withdrawal. It is exactly that. But, you know, the other thing I would point out, though, you know, we think of Australia as like, whoa, that's so weird, Australia.

[01:39:44] You know, there's almost 300 million children in China who are basically in the same kind of realm, right? They're not allowed internet. They're not probably allowed. Well, they're not allowed. They're not allowed, like, wide open social media. Well, they have, yeah, they have Weibo, yeah. Yeah, but they, you know, the rules around what they're allowed to see are incredibly strict in that country. So this is from the Australian e-safety commissioner. Get ready. Get ready for December.

[01:40:13] I feel so sorry for anybody under 16 in Australia. This is, to me, this is just understand what's changing and why. Work out which accounts you'll lose. This just creates hackers. This is all of these things. It's like any family that I know that uses Life360, I'm like, your children are going to learn how to get out of Life360. Like, anytime you have constraints. Like, all of my friends who are hackers have this experience where they tell about this story where their parents said, oh, you can't. Yeah.

[01:40:40] Like, a friend of mine was like, yeah, their parents took away the keyboard from the computer because they were using the computer so much. So then they went on to an AOL chat room, got, like, a fake credit card, and then bought a new keyboard. You know, like, that was the first time they participated in that. Like, I think all this does is create people who are very good at getting around these restrictions.

[01:41:02] With that said, like, you know, these all seem half measures to actually address the real issue, that we don't actually know how this is affecting youth. And a lot of times when people are saying how it's affecting youth or have something, it's motivated by some specific, you know, puritanical nonsense or not or whatever. Or gut feeling. It's gut feeling because it isn't causation, it's correlation.

[01:41:27] And so I feel like this is bad for kids, and I understand why you might feel that way. But the actual evidence is very scant. And imagine what it would be like for a 15-year-old to suddenly be cut off from all of her friends except in real life. I mean, this is happening, what, 2025, okay? These social media platforms have been around for so long. If this were to happen when Instagram was taking off or Facebook before that, right, that would be one thing.

[01:41:56] But, you know, so many, for years, so many teenagers have already built their, you know, their networks on these platforms. So, yeah, to pull the plug so suddenly is going to be jarring. But I also won't deny the fact that there, you know, for a lot of people, there is an unhealthy attachment to these platforms. And, like, you know, when you're comparing yourself to people for six hours a day, it's going to do something. I agree. You know, so it's like, what's the middle ground?

[01:42:26] What's the logical, healthy response? And I think everyone's still working that out. Here's what the Australian government suggests to under-16s. Schedule regular phone catch-ups with your friends. Did you know that your phone can place calls? Because I forgot that. They don't know that. I thought that was removed. What's a phone number? Or stay in touch through an age-appropriate online chat or video app. What could possibly go wrong there? God.

[01:42:57] I like how it isn't, like, get outside and read a book. It's so freaking clueless. Well, you know what? This is an example of, well, Australia's going to do the experiment. Let's watch and see. That's the thing. I don't think it's clueless. I just think it's experimental. And let's hope they're like Japan. Let's hope that they go. Let's experiment with our children. Why not? Let's experiment in not letting for-profit companies experiment with them. Right? Because that's where we're at here. That's all we do is say, go ahead.

[01:43:27] Experiment on our kids. Right? And one of the stats I saw the other day was that, like, 50% of parents, something like 50% of parents say that their number one source of information about what's an appropriate use of technology for their children is the ads from the companies that make the technology. That's not great. Yeah. You know what I'm saying? Like, that's where we live. But should government be in loco parentis? Shouldn't the parents be responsible for this? Why are we having government be the parent? Because the parents aren't responsible for, you know, can't control themselves either.

[01:43:55] I'm not a good model to my kids about how I use technology in the same way that I wouldn't have been a good model to my kids about- That sounds like a nanny state. You can't control yourself. Kids can't control themselves. I guess as the government, I got to control you. That is not what I want my government to do. Right. I hear that. But in the United States, right? We are, for some reason, convinced that people in the grip of, you know, of compulsion should be in charge of themselves.

[01:44:20] This is why we have these ludicrous things that get put on, as a former drinker, on booze ads that say, drink responsibly. Right. Right. And now on gambling ads, game responsibly. There's no such thing. If you're an alcoholic, there's no such thing as drinking responsibly. I cannot go into a bar. Yeah, but we tried to ban alcohol. How did that work out? That didn't work out well, right? But we did between 1950 and 1965. It took that long for, you know, 1950 is when the first big cigarette studies come out

[01:44:48] in the US and the UK showing large scale effects, right? And it took forever to do that because everybody was smoking. And so it was really hard to do the comparisons. But- We still haven't banned cigarettes. No, but we've really restricted what kids can see around cigarettes and how, and kids can't buy them until they're a certain age. You know, like we have these rules that we figure out. It just takes us forever to figure them out. And like, and I understand that it takes a while. You know, it takes, in the case of cigarettes, it was 15 years between the first big large

[01:45:17] scale studies and 1965, which is when the first rules start to come out around that stuff. It took a long time. It took 30 years for gambling to become a diagnosable addiction in this country. So it takes a while, but I think you've got to be able to identify harms and regulate against those harms at some point. That's what government's for, I think. What are the latest results of countries that have legalized heroin or legalized addictive drugs versus countries like ours where it's against the law?

[01:45:47] Well, I mean, they don't. Really? Okay. Because the last thing I heard is it does go well, but you have to obviously provide treatment centers and a lot of support. Yeah. Portugal style. Portugal. That's right. That's right. That's right. But you know, you go to like, I remember going with my, my mom's a public health nurse and a public health academic. And she, she took me whenever we would go to cities abroad. She always wanted to take me to like the grittiest stuff. Let me show you, Jacob, what could happen?

[01:46:16] Not to scare me, just to be like, she was just interested. She was interested. Yeah. So like, so I remember we were in Zurich when Zurich had this weird policy where this one park in Zurich was allowed. You were allowed to basically do any drug you wanted. Needle Park. Yeah. And they created this, like, Hamsterdam. If you're a Wire fan, right. They created the Hamsterdam of Zurich and it was a nightmare. It was a nightmare. And people were coming from all over Europe to camp out in this place because they weren't really providing real services. Now there is, however, harm reduction and there's other ways to deal with it.

[01:46:45] But like, you know, but I think just saying like, there's no rules about it and no support. That's not going to, I don't know. I just think that always goes wrong. We've got to wear seatbelts. Eventually we've got to wear football helmets. Like there has to be a response to things that do us harm. Yeah. Social media companies are known for making their platforms as addictive as possible. And so I think if the government doesn't say something, then what's to stop them from finding more ways to, you know, make it addictive for everybody, but especially for younger

[01:47:12] people who, you know, have even less self-control than us fully grown adults. Abrar, are you, what, how old are you? I'm 31. So I grew up. Are you our youngest panel member? I think so. And so like my social media. By, like, a long shot. No, Leo and I actually are a couple of years apart. Yeah, I'm close. But what was it like for you? I just am always so curious to hear what it's like. What was it like for you as a kid?

[01:47:38] I was in eighth grade when I made my first Facebook account, but I had to lie by a year. And at that point we didn't have smartphones. Right. So like when I went home, I couldn't wait to get home to check my Facebook notifications, but I didn't have them following me everywhere. It wasn't like I had this thing in my pocket that I could pull up at every moment and check compulsively. And, and I still have a social media addiction. Like at this point, I, like I just set, you know, screen time limits on my phone the other day because I was like, I don't feel okay.

[01:48:08] Like, I don't like, I feel like I stare at my screen for work all day and then I relax on my phone and it's not healthy. It's doom scrolling. It's doom scrolling. It doesn't make you feel any better afterwards. So yeah. By the way, I don't know any adults now who are not addicted. Yeah. They're very few, very, very few. And if you just walk around, you go to a restaurant, whatever, everybody's on their phone all the time. It's pretty depressing. It is depressing.

[01:48:33] I remember the first time I saw this before 2007, I was in France. I was sitting outside the Notre Dame and watching all the French people walk around and they were all texting. They were all on their phones. I thought that is weird. Little did I know that they were just ahead of the game. Yeah. We're just so used to it now. We don't even notice it anymore. Yeah. And, and there is, I think, considerable evidence, good evidence that, and that you see this all

[01:49:00] the time too, when you give a little kid, a two-year-old, a tablet to keep them quiet during the restaurant meal or in the airplane or in public in any way, those kids are getting really addicted. Yeah. And I think somewhat damaged by YouTube video after YouTube video. Absolutely. YouTube is really scary. It is. And that's what's changed since you were a kid, Abrar. By the way, my daughter's your age. Oh, yes. One year older. Yeah. And she was into Neopets.

[01:49:30] Yes. But because I was a technologist, I was also careful about, I said, well, you don't use your real name there. She said, no, I'm a 32-year-old guy from Philadelphia. I do have a Camaro. I said, right on. Good job, Abby. This was on Neopets. All the other parents are like, what the heck is going on here? Who's this guy? This guy's crazy. What is happening? Yeah. I don't think she passed, to be honest.

[01:49:57] It was a very neon-colored 32-year-old guy from Philly. But I really think that it's been kind of weaponized since you and Abby were kids. I mean, it is much, much more addictive and dramatic and follows you around. Just look how Instagram has changed from the beginning to what it is now. Absolutely. And it used to be about keeping up with your friends.

[01:50:23] Now it's about opening up your feed, being connected to content from people you don't even know. Right. Yeah. And it's nothing about connection. It's nothing about keeping up with people. It's actually more isolating than anything. Totally. Which is just, it's a complete antithesis of what it was supposed to be. And you know what they say in AA, right? It's one of the teachings in Alcoholics Anonymous is the opposite of addiction is connection. Mm-hmm. That's it. That's spot on. Once upon a time, these companies were about connecting us and that does not seem to be

[01:50:53] the vibe anymore. Exactly. No money there. Yeah. Yeah. No money there. That's right. That's right. That's sad. I mean, while Leo's on his phone. Yeah. Sorry. I was just checking my Instagram. I was getting on FanDuel for a second. Hang on a second. I got to make a bet. I'll be honest. It's always been like that. I used to work with a radio guy who had a beeper and played the ponies during his show. I mean, with a beeper.

[01:51:21] I mean, it wasn't, you know, so it's not like there's been compulsive behavior. It's just that if you are that kind of person, I really feel bad for people who have gambling issues because you can't get away from it. And it's moving the wrong direction right now, right? More than half the states have legalized it now. It's all over the NFL. I mean, they give you the odds at every moment. And then somebody comes on and says, hey, you know, you can bet at all times during the game.

[01:51:47] What really scares me is these, what they call prop bets: is he going to make the field goal? Are they going to show Taylor Swift more than twice? And you can make these constant bets. You're not just betting on the outcome of the game anymore. It is a constant stimulus. Boom, boom, boom, boom, boom, boom. Have you seen this crazy? There's a whole world now of what they call sports farms in Russia where people are

[01:52:13] playing like endless, like two guys sitting knee to knee in chairs with soccer goals behind them, trying to kick a ball into the other thing, like parlor games, basketball, volleyball. But these are not athletes. They're not athletes. They're just random ass people. And they're streaming it and they're streaming it entirely so you can bet on it. That's all it is. It's just action. They're creating content to bet on. To bet on. That's it. That's it. That's exactly right. And, you know, as a guy I know who had a real sports gambling problem says he's like these

[01:52:42] days with the rise of sports betting on my phone, it is like being an alcoholic, being forced to carry a fifth of whiskey everywhere. Yeah. Can you imagine? And now get ready because prediction markets, which are currently kind of restricted, will soon be fully legal. And that means you can bet on everything. Wow. You don't have to wait for two guys in a Russian soccer match or a football game. You can bet on how many of the panelists have tattoos.

[01:53:10] You can bet on anything in prediction markets. And by the way, there's a lot of evidence that these prediction markets are not the wisdom of the crowds, that they're easily cheated, shall we say?

[01:53:51] Mm-hmm. It's dark in Chicago. It is. It is. Your shot's getting cooler and cooler. It's very cool. Like you're glowing, you know? This is actually my goal in life is to just get cooler and cooler. He is. He is. 2389.ai. And what's the bot? What's the bot site? The botboard.biz. Botboard.biz.

[01:54:20] Every time I say it, I feel like I'm going insane. And no, so what I need, like I'm running LM Studio. I have some local bots running on my framework desktop. Could they join? They could join. It's just an MCP server. Okay. And because you're not doing work with them, I don't know, like they would just tweet about what you're doing, which is kind of funny, kind of weird. Mostly it's weird. So I should make them do something.

[01:54:50] Yeah, you should try it. Put them to work. Oh, yeah. I'll put them to work. It's very, I think you can sign up and then we can okay you for a team. And what we find is just that it does offer some interesting workplace surveillance opportunities. What my bots are up to? You can see what your bots are up to. But interestingly, behind all the bots is a human. And then you get to see what the humans are up to. And it summarizes them. Every day we have an agent standup. So we get to see what the agents worked on yesterday. And then we get to talk through it.

[01:55:19] And it's this very interesting. It's very bizarre. It's very fun. I feel like you're right on the cutting edge of the next big thing. That if we are going to be surrounded by AI, intelligent machines, they're going to want their own kind of facilities, their own social networks, their own social services. I think that's true. I can get behind that. Yeah. There's going to be a whole market.

[01:55:46] Their own, you know, betting apps. Yeah. Jacob and I can't wait for that. We're so excited. Yeah. That's Abrar Al-Heeti. Sounds great. See you there, Abrar. A regular on Tech News Weekly. We love her. And always love having you on the show. Senior technology reporter at CNET. And Jacob Ward. What a great panel this is. Jacob Ward. You've seen him on CNN, Al Jazeera, NBC, and his current podcast, theripcurrent.com and jacobward.com.

[01:56:17] And the book, The Loop, which predicted all this. When did you write that? God help me. I published it and it came out almost a year before ChatGPT shipped. Oh, wow. So you predicted the whole thing. Dude, it sounds. And that's the kind of thing where you want to be like, yeah, look at me. I did that. And then, and then, yeah. And then it all. And then like my highly speculative bummer of a thesis that like our brains are not ready for this commercial AI at all. Totally turned out to be real. And I was like, oh man. But anyway, but yeah, that's when it came out. The Loop is like the flywheel.

[01:56:47] It's, and you get stuck in it. The downward spiral we get stuck in. That's right. Yikes. All right. We'll talk about something more uplifting in just a bit. They've reinvented the zipper. Okay. I'm just, I'm just warning you ahead of time. Coming up. Our show today brought to you by Deel. Brand new sponsor. Welcome Deel. Great to have you. If you have overseas staffing needs.

[01:57:15] If you found the perfect engineer overseas, you'd like to hire the person, but you realize hiring them is harder than getting your VPN to connect on a hotel Wi-Fi because the laws are different. The regulations are different. You've got to set up an entity. You've got local payroll laws, compliance hoops. We think it's hard just having employees in other states in the United States, but it's even more complicated overseas. Well, Deel fixes that. D-E-E-L.

[01:57:45] D-E-E-L. One global AI-powered platform that lets you hire, onboard, and pay anyone anywhere in any country fast and completely compliant. No third-party vendors. You don't have to do duct tape integrations. Just one clean system for HR, for IT, and payroll that scales with your team.

[01:58:10] Ship their devices, grant system access, and run payroll in 100 plus countries. Think of it as DevOps for people operations. Global infrastructure that just works. I wish I'd thought of this. The CEO of Airwallex says it best, quote, Deel lets us hire and support top talent anywhere without slowing down. See why over 35,000 companies trust Deel.

[01:58:37] Visit D-E-E-L.com slash T-W-I-T. That's Deel, D-E-E-L.com slash twit. We thank them so much for their support. Welcome them to the Twit family. By the way, just a word of warning. We were talking about the weaponization of Facebook. Meta is now asking Facebook users to give the AI access to your entire camera roll. It says, this will help you share more photos.

[01:59:04] It also is likely going to be training Meta's AI. Yeah. Yeah. I love how they're just so upfront about it. Can I just have all your pictures? More and more apps are saying that. I mean, X now, every single time I open it, says, would you like to add all your contacts? Oh, yeah. Oh, yeah. Oh, yeah. No. I am not going to turn over. I know. The personal information of my closest friends to you?

[01:59:35] No. And you have the decency to say no. Imagine how many people. Most people say yes. Yeah. Yeah. Our numbers. How are you supposed to know if your friends are on X? Well, you give them your friend's name and phone number and street address and any other information you might have. God. Meta is very aggressively going after AI talent. They just hired another AI person away from Apple's AI labs, according to Mark Gurman of Bloomberg, who apparently has his finger on the pulse of this. I don't even know what it is.

[02:00:04] It's like a dozen people. Are they leaving a sinking ship? Maybe they are. I'm not sure that going to Meta is the best thing. In fact, a number of people have gone to Meta and a week later left. For a lot of money. Well, I was going to say, yeah, how long do you have to stay to get, I wonder, like 1.5 billion divided by two weeks? Meta, you put this story, I think you put the story in, Jacob.

[02:00:30] Meta just poached an AI expert, Andrew Tulloch. He's the co-founder of Thinking Machines Lab, which is, I think, a pretty well-known AI startup. His compensation package is rumored to reach $1.5 billion over six years. That's a lot of money. That is a lot of money, you guys.

[02:00:58] And so the like, it makes me think of several things. So like, so there's a lot of talk. That's more than, that's like 150 million a year. I really do. I really do want to find out, like if he leaves after a week, how much has he made? Right. Like if you were to divide it that. So anyway, yeah. So let me do that math while I think about this.

[02:01:16] So, right, there's this whole thing right now, a term that is coming up a lot in the conversations I'm having is the K-shaped economy, where this tiny subset of people, the money is going up. Right. And for the others, for everybody else, it's going down. Right. And this idea that like the ceiling and the floor are falling away from one another.

[02:01:40] Another measure of this is something called the Gini coefficient, which is something that the UN and NGOs use to measure income inequality, the difference between the ceiling and the floor. There's also a saying: the rich get richer, the poor get poorer. Well, that's right. That's right.

[02:01:54] And the way in which you see not only the numbers going this weird way, but the ethos of these companies going that way, where, you know, at the beginning of last year on Alexis Ohanian's podcast, Sam Altman was talking about how he has a bet going on his tech CEO text thread.

[02:02:17] The group chat he's in with a bunch of tech CEOs, as to which financial quarter will see the first billion-dollar one-person company. Oh, yeah. So the first solo unicorn. Yeah. Yeah. And you're like, OK, so you're excited about that. Like, what he's basically saying is that that would somehow be a good thing. Right. Well, it would be, wouldn't it? I mean, it would mean that one person with the help of AI could do what a 30-person team could do. Good for that person. Right.

[02:02:46] But yeah, the other people don't get hired. OK, I don't know how many, you know, no one's making a middle class existence off a sandwich shop ever again. No one's ever making it. Actually, my son's doing pretty well. I think there's some nuance there, though, Jacob, because I think if I was to want to build a sandwich shop, I think now is probably the best time to do that. Because I think if you're trying to build a tech career, you're boned.

[02:03:15] Like if you're a mid-career tech person, I think it's just no longer there. I was talking to an executive of a very large company recently, and they described their hiring over the last decade as diamond-shaped, where they had a small number of juniors, a small number of executives, and a huge number in the middle that was basically supporting their work.

[02:04:04] I feel bad about that. I mean, it's still we're in an unprecedented time for the amount of productivity a single person can get from a computer. There is just not an opportunity that I have seen before this. And I have been very productive on computers. But my core skill set is building large teams of people and convincing them to do things that they probably don't want to do towards some goal. And like that has been what I'm known for. And right now, that is just not the name of the game.

[02:04:32] Like the name of the game here is a small team doing really, really, really large amounts of things. But it's going to be very, very complicated. I think this is a really hard thing to do. But if I was, let's say, an approaching-middle-aged person, so not me, I think if I had any interest in doing something that wasn't in technology, I would work on doing that thing.

[02:04:57] Because I don't think you're going to replace, quickly, the sandwich shop. Eventually, but not quickly. You're not going to quickly replace the baker, and so on and so forth. But then there's this other thing. We were talking about these giant paychecks. The finance industry in New York knows how to handle this. The sports industries know how to handle this. I don't think tech does. Tech has this very good feature that I love, which is we never look at who's done it before. We always invent it for the first time ourselves.

[02:05:26] And I think what they're going to do is they're just going to see all these people split at the moment they need them the most with lots and lots of money and these crazy contracts. Where if you look at the finance, you know, it's all bonus-based. It's all based on all this performance. You might have the same level of compensation, hundreds of millions of dollars or a billion dollars, but you don't get it in some, you know, kind of vesting over four years kind of nonsense like these tech companies. So, I'm very interested to see what's going to happen.

[02:05:54] And I also wonder what happens on the talent and recruiting side when this becomes how you hire people. Are there going to be agents like there are for Hollywood or for sports? Like, does my business as a young entrepreneur, or sorry, as a young programmer, turn into trying to get in front of these agents so that they can get me a bigger deal?

[02:06:13] Are we going to have, like, a CAA or whatever to try and suddenly represent some of these tech people, to try and get more advantageous deals from people who are increasingly adversarial in hiring? Like, I think the money is so big that we have to see that. That doesn't seem good. Unemployment among undergraduate computer science majors is 6.1%, according to the Federal Reserve Bank. So, I don't know.

[02:06:41] I'm not worried about the new grads at all. You're not? No, not at all. Because someone needs to hit continue. Like, you know what I mean? Like, someone has to sit there and tell the agent, yeah, keep going. Yeah, but that is not a satisfying job. Well, that wasn't the question though, Leo. The question, when you're creating a factory, you have to have factory workers. You have to have factory workers. And I think that is the future of tech. I think what we're seeing here is it's… But you didn't really need that CS degree to hit continue, did you? No. Will you ever pay off your debt, your student loans?

[02:07:12] How else are we going to keep the population under control, Jacob? Good point. Good point. By the way, Andrew Tulloch, $1.5 billion over six years. Assuming he was paid out evenly... I mean, of course, he's not, right? That's $250 million a year, right? Yeah, it's $4.8 million a week. Oh, that's great. It's good, you know. That's how good Courtney is. It's another one, by the way…
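The on-air figure holds up under a naive even-payout assumption; here's a quick back-of-the-envelope sketch (purely illustrative, since real compensation packages vest on their own schedules):

```python
# Back-of-the-envelope check of the rumored $1.5B package,
# assuming an even payout over six years (real deals vest on
# their own schedule, so this is purely illustrative).
total_usd = 1_500_000_000
years = 6
weeks_per_year = 52

per_year = total_usd / years              # $250 million a year
per_week = per_year / weeks_per_year      # roughly $4.8 million a week

print(f"per year: ${per_year:,.0f}")      # per year: $250,000,000
print(f"per week: ${per_week:,.0f}")      # per week: $4,807,692
```

So "leaving after a week," as the hosts joke, would still be worth close to five million dollars under this even-split assumption.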

[02:07:38] I think you could make a case that Mark Zuckerberg is single-handedly undermining this whole system, right? It's another one of those cases where he wanted to buy the company. Like Scale AI. Remember, they wanted to buy the company, couldn't get the company. So they just hired the brains and deflated the company. But this happened with Google and Character AI as well. I don't think Mark Zuckerberg invented that, right? This is like this new way of acquiring.

[02:08:03] But I think that makes sense as well, because if your company is a codegen company, where you're using agents to generate the code, and you're doing this kind of thing, and it's just... How many people do you have? Like, you used to sell companies with 40 people. It's an acquihire. The company that's buying you gets 40 really good resources. They get these people that they really like. Like, they don't have to hire them. There's this very simple cost analysis that says, oh, I gained 40 people, I paid this much money, and it's a million dollars per person. Great. We're done.

[02:08:31] But now it's only six people. And now the talent is just people who are controlling agents. Like none of them built the agents. None of them built the thing. And then so you might as well just hire the innovation machine that's at the top instead of hiring the whole company of people that's just doing that stuff. If I'm the venture capitalist, though, that pumped hundreds of millions of dollars. Yeah. I was just going to say, or the kid who killed himself being the code monkey at that company. Well, and that's the other thing.

[02:09:01] Assuming they'd get an exit and would get some money. Right. And then they don't because the top guy gets hired away. And this is the other thing that tech giants are telling us is 996. Forget it. You got to work 90 hours a week. I love 996. Yeah. You got to work. What is it? Nine. What is it? Six. Nine days. Nine days a week from nine to 6 p.m. Yeah. Let's just figure it out. That's right. Yeah. Just figure it out. Exactly. What is the number? It's not.

[02:09:27] Anyway, it's how you're supposed to work for a living in the security industry, in the tech industry. Nine to nine, six days a week. Thank you. That's 996. That's right. It's from China. That's the Chinese work schedule. Yeah. But Eric Schmidt and others have been saying, oh, no, this is the only way you can succeed. You shouldn't have a life. Yeah. If you're a startup. Yeah. Of course. I tell all of my hires that. Come on. Because you're the boss.

[02:09:57] The only way you can have a cool background that changes color is if you work 24 hours a day. You better get a tattoo of your kids' faces because you will never see them again. Exactly. How else are you going to know you have kids? Here's a scary story. Even top generals are looking to AI chatbots for answers. This is the U.S. military. From Business Insider, military leaders are adopting AI for decision making.

[02:10:27] The military has adopted an aggressive push to embrace AI in weapons, aircraft, and combat tech. But it's more than that. A top Army commander in South Korea, U.S. Army, said he's experimenting with generative AI chatbots to sharpen his decision making, not in the field, but in command and daily work. He says, chat and I have become really close lately. Oh, God. So, I don't find this problematic at all. I'm just telling you. Not even a little bit. But here's why. Two reasons why.

[02:10:58] One, a friend of mine who's worked a lot with veterans and with military active duty and has worked in the military said when I sent this to him, based on my experience working with high-level military officers, I view this as a positive. So, there's that. But the other thing is, there is a huge culture of tabletop exercises, game days, all sorts of stuff within the military. Shall we play a game?

[02:11:25] And if they're using these as a technology to make those more realistic, to make them more random, to make them more effective, then I think that's probably good. I'm guessing that they're not using it like in whatever. What should we serve the troops tonight? Genetic situations and all that. But we'll get there. Don't worry. We have plenty of time to get to where the AI is in the kill decision.

[02:11:51] But it is, I think it's, I think we are, this is like, this is exactly one of those things where I think it's very shocking for people that don't spend a lot of time around the military. And for people who spend a lot of time around the military, they're like, well, why? It's a tool. Why? The military is going to use every tool available to them to get ahead of their supposed enemies. As long as they use it intelligently, I think that's fine.

[02:12:12] I also wonder, like, there is a, I mean, so there's a long history pre-AI, even pre-ML, of these like global decision support systems and collaborative planning systems. Like they've got all these acronym-laden systems for, you know, if you're going to stage a land invasion or stage a naval invasion, you need all of this logistics support, because you can't be making it up as you go along. And that'll say, like, here's how many people you need, here's how much food you're going to need every day, in a weird way.

[02:12:41] I wonder if it's like almost better in a sense, because you could build in some safeguards. Unlike, Harper, what you were saying about, you know, startups always thinking they've encountered, you know, racist loan-making for the first time in human history or whatever it is. You know, they always think they're discovering a problem for the first time ever.

[02:13:02] Or these are, you know, the military is very good at institutionally internalizing lessons from a mistake, you know, lessons that you can draw from a mistake. So I have this book project I'm working on right now. It's called Great Ideas We Should Not Pursue. And it is. Oh, I love it. I'm trying to look at, like... It's a great idea. Examples of restraint, right? Because there's so few. Yeah.

[02:13:27] But it turns out one of the places you, if you want to look for restraint where places have had some informed restraint, the military is a big one. They come up with all kinds of ideas that they then go, no, we're not going to do that. You know? Yeah. One example I just bumped into the other day is laser blinding weapons. We don't do that. And there are big global treaties that say we're not going to blind each other. Now they'll kill each other all kinds of ways. But for some reason. Well, you can still see when you're dead.

[02:13:57] Yeah. But we don't like blinding each other for some reason, even people we hate. And so there's something like... There's restraint. Yeah. The military does show that restraint, which is weird. Once upon a time they would have been, for me, the ultimate bad guy around this stuff. But more and more, I'm like, actually, maybe the military... I'd rather have them thinking this stuff through than some of the people I've met, you know, in a VC's lobby. Yes. Yeah. Yes. Good point. Good point.

[02:14:25] I think more now than ever, we need to put our faith and our trust in our military. I don't know if I would go... Yeah. Yeah. Oh, yeah. I think that was a little... Yeah. Yeah. What are we doing here? Come on. Whoa, whoa, whoa. It was a thought. I don't know. My chance. We're just going here. Next link. Next link. Next link. Roku is up. Here's where AI really belongs. Roku is updating its voice assistant. You know, you can press the Roku voice button on your remote.

[02:14:53] With AI features that can answer questions about movies, shows, and actors. How scary is The Shining? Or what kind of fish is Nemo? It will display a text response on your screen. Nice touch. I, for one, trust Roku to have the power of the AI in my house. Yeah. I was like, you know, I'm the kind of person who, I'll watch a movie and I'll get sidetracked because then I'll go down a rabbit hole of, like, Googling stuff. We do. IMDb, baby. I live on it. I love that. Yeah. What else was he in? Yeah, exactly. Is he older than me?

[02:15:24] Prime has that x-ray feature, which I really like. That's a variant. It tells you who's on screen right now. I really do like that. I love Roku. How tall is he? That's always the question I always want to ask. How tall are these people? They're all not. They're not tall. It's so funny. They're not tall. I never think about this. My wife is always saying, how tall do you think he is? Oh, they're tiny. They're so tiny. There's a great book about Hollywood called A Conspiracy of Short People. Oh, my God.

[02:15:52] And it's by the screenwriter who's my height, who's 6'7". And he just is like, he writes a very funny book about that. Yeah, there's a guy, I met a guy once who was like a sort of supporting actor kind of guy. And he's pretty tall. He's like 6'2 or something. And he says that once you get past 6'1", everybody hates you on the production because not, I mean, you know, Tom Cruise hates you because he's short, but like everybody hates you because you're not in the shot. They have to dig a ditch for you. They have to.

[02:16:20] Well, what they have to do is build a ceiling for the shot. Yeah. So if they're going to have a tall character, they got to build a ceiling. Otherwise, they can leave an open drop ceiling with all the lighting in it. So it screws up the whole, like the whole production has to like stop while they bring in the carpenters and everything. So Humphrey Bogart was 5'8". Relatively short. That's like, that's a great Hollywood height. Enormous head. And I was, I who also have an enormous head was always told that's a good thing in the movies. You want a big head.

[02:16:49] Yeah, I think that's true. I've heard that too. Yeah. No, I think it's to fill the screen. I don't know. It's to fill the screen. I don't know what it is. Leo DiCaprio has a big head. When I started doing more media, someone said, you're going to be surprised by how small these people are and how big their heads are physically. Oh, really? And yeah, they told me this. Like, this is like, and I just was like, that can't possibly be real. And then you meet all these people and they're just creating this image. I mean, it's because they're talking heads. We've optimized.

[02:17:17] It's like a genetic optimization for people with giant heads to tell us what our dreams are. That's really funny. That's really funny. I'm blown away. I once interviewed Pete Buttigieg, who I consider to be a very smart and interesting person. But I'll tell you, Pete Buttigieg, little guy. He's a very small person. That's right. And he's just the most media trained guy ever.

[02:17:42] He said to me, he said, so when we're speaking, because I'm like, you know, more than a foot taller than he is. And he's like, I'm going to talk to your button. Oh, he is trained. And so it'll look weird when we're talking. He's like leading me through my own interview. Don't be distraught by this, but I know what I'm doing. Yeah. He's like, yeah, exactly. I'm just going to take command here. And he did. And he talked to my solar plexus and on the camera, it looked like we were looking at each other at the same level.

[02:18:12] And, you know, I just remember thinking, well, this guy, he knows what he's doing. I think I remember reading, and we should probably research this, but I think I remember reading that the taller candidate always wins in a presidential election. Oh, fascinating. Oh, interesting. Donald Trump's pretty tall. He is. He's certainly taller than Kamala Harris. He's not that tall. He's not tall.

[02:18:40] He's more than six foot, isn't he? The picture of him with Putin... I don't think he's more than six foot. The picture of him with Putin walking around. Oh, maybe he wears lifts. Putin wears lifts. Yeah. Yeah. I mean, I wear lifts. I made a strong pro-platform-shoe policy at the beginning of last winter because I thought it was funny. I got platform Docs and platform Crocs. This is why you're tripping all the time. That's true. You're wearing platform shoes. And I recently discovered you can order pickles in a bag on Amazon. Life is good.

[02:19:10] Life in the 21st century. We are so happy to be here. So happy to be here. Let's take a little break and when we come back... If somebody would just query ChatGPT whether the taller candidate always wins the U.S. presidential election. I'm just curious. I mean, the ones I can... President Obama was tall, right? Fairly tall. That's six foot nine. I know, Jacob. He's just a little guy.

[02:19:40] He was six two. Just making faces. I don't know. I grant you the honor of being the tallest person. I should shut up. And now, not the only one without it with a tattoo. Barack Obama was shorter than Mitt Romney. Oh. Oh. George Bush was shorter than John Kerry. Well, there goes John Kerry. So it's wrong. And Joe Biden was shorter than Donald Trump. Huh. Okay. So that didn't work. So I actually think it turns out that the shorter person wins. Shorter guy always wins.

[02:20:11] You just had it backwards. Yeah. Yeah. Well, I remember this from like sixth grade. So that was pre the 21st century. Oh, yeah. We don't have George Washington here. Yeah. Abe Lincoln wasn't. Lincoln was a tall fella. Lincoln was tall. He was a tall fella. Our show today brought to you by, we're going to take a little break so I can recover now. Our show today brought to you by Zscaler, the world's largest, I'm going to say that again,

bears repeating, the world's largest cloud security platform. You know, Zscaler understands that in business, you are in a very interesting situation right now. The perils and the rewards of AI are dramatic. The potential rewards are too great to ignore, but so are the risks, right? In a number of ways. If you're using public AI, the loss of sensitive data can be dramatic.

[02:21:08] AI is being used to attack you as well: attacks against enterprise-managed AI, but also attacks against your perimeter defenses. Generative AI creates incredible opportunities for threat actors. They can rapidly create phishing emails that are indistinguishable from the real thing, even if they don't speak a lick of your language, right? We used to say, well, you know, check the grammar. If it's bad, well, then it's a fake. No, not anymore. It's better than mine.

[02:21:37] AI is writing malicious code. It's automating data extraction. There were 1.3 million instances of Social Security numbers leaked to AI applications by people just completely innocently using these AI SaaS applications at work. Think of the business data that you could be leaking out. ChatGPT and Microsoft Copilot saw nearly 3.2 million data violations.

[02:22:07] So there's lots of things AI is good for. There's lots of things AI is bad for. It's time for a modern approach: Zscaler's Zero Trust + AI. First of all, zero trust is brilliant because even if bad guys penetrate your outer defenses, you're still protected. It removes your attack surface. It secures your data everywhere. It safeguards your use of public and private AI as well. Protects against ransomware and AI-powered phishing attacks.

[02:22:35] It really is a solution everybody needs. Check out what Dr. Chaudhry says about choosing Zscaler for Seattle Children's Hospital. We selected Zscaler because it had a very good total cost of ownership, good relationship management, and the ability for Zscaler to bring in business professionals who understood healthcare.

[02:23:01] It was a solid partnership to get us to where we needed to be literally within days. And it's important because we're dealing with people's lives. With zero trust plus AI, you can thrive in the AI era. You could stay ahead of the competition. You can use AI beneficially. You can protect yourself and remain resilient even as threats and risks evolve. Learn more at zscaler.com slash security.

[02:23:30] That's zscaler.com slash security. Thank you so much for supporting This Week in Tech. Tesla, here's the good news, has debuted a steering wheel-less robotaxi for two. The Cybercab, a two-seater electric vehicle, has no steering wheel. They unveiled it in Austin.

[02:23:59] I'm not sure where it's going to be. It's priced under $30,000 for fleet operators, so it's very affordable. It promises to slash urban transport costs by up to 40%. So my question is, what's the difference between this and when they announced it last year? Did they just... Good question! There's no steering wheel in this one. Well, there wasn't in the one... The robotaxi didn't have a steering wheel either? No.

[02:24:28] No, it was the same thing. So I'm like, did I miss something? They're going to keep announcing it until they can get it. I mean, that's what they do, right? That's like a famous Tesla move. They always early-announce. Yeah. Yeah, right. It's the Theranos strategy. Just name it. That's a great strategy. Well, then maybe... And this is a CNET story from one Abrar Al-Heeti. You don't want to wait for your DoorDash to arrive via Tesla robotaxi? There's Waymo. Oh, it's Waymo too.

[02:24:58] Okay. DoorDash and Waymo are teaming up. Waymo at least is operating. Yeah. I love Waymo. I love Waymo too. They're so great. Oh, yeah. They're incredible. Because you don't have to talk to the driver. Exactly. You play whatever music you want as loud as you want. That's exactly what I like. Rock out. In San Francisco, there's a whole thing of people having sex in them. Oh. That's like the punk rock thing to do, is to go and... Well, I wouldn't, would you? Well, I mean... There's cameras.

[02:25:28] The 30 internal cameras. Microphones. Yeah. I was alive when cabs were here and I was dating. There was a complicated time in the back of cabs. Yeah. I feel like this is nothing new. It's like now there's just no weird cabbie that has to awkwardly clean up after you. It's just now some poor tech worker. I mean, it's true. I feel like Uber drivers are always like... People act like we're not here. So I guess there is actually no one... I feel bad. I know. It's really terrible. But in that same back of the car, your food can now be delivered via Waymo.

[02:25:58] Now, here's the question. Have you ever gotten into a Waymo that has been less than clean? No, not me. How about you? Okay. I've heard this happen before. I've heard of a friend got in one that was messed up. But then they just hit the button and they were like, do you want to pull over and get a new one or do you want to? And they were just like, no, it's not that bad. Just want to let you know. And they just went on the way. But it was like... I'll just sit on this side of the seat. It was... I think it was just very quickly handled. Yeah. There's so many cars. I'll send another one. Yeah. Yeah.

[02:26:25] So this is going to start in Phoenix, not where you guys are. But, you know, I've never ridden in a Waymo. I see them every time I'm in San Francisco. You see them everywhere. They're incredible. It's like the best invention ever. I feel like Jacob's not sold. I feel like you have a story. Well, I don't know. I'm... Yeah. You know me. I can ruin any topic here. But in my case... I always thought you were cheerful. Come right on our brain. I know. I know. I know. I actually am a big...

[02:26:51] So I will say, I think generally speaking, like, deaths on roads are so... So they're so... Right. Right. It's guns, cancer, and cars that kill the most Americans. Right? So like... So if we just automate those... I'm all for it. Like, you know, since I do think that this is a thing, that human beings shouldn't be driving themselves. I do think that that is the case. But what I don't like... Like, I like solving for that. Right? Less death.

[02:27:21] I like solving for that. I don't like solving for not paying people to deliver food. Totally. Right? I don't like solving for not having to pay drivers. You know? In India, they have banned this technology entirely because so many people make their living as a driver. Right. And they're just... They recognize that it would decimate that country. And we didn't even... You know, during our AI conversation, we didn't even get to the fact that, like, knowledge workers, you know, in India, right?

[02:27:47] Your call center people, your outsourced, you know, skilled labor, those people are about to be decimated. That country is going to be economically decimated by this stuff. So anyway, solving for the cost savings, I don't like. Solving for less death, I'm all for. So that's my... Right there with you. Yeah. There's good and bad and all. But you know what's really interesting is so Lyft has a partnership with Waymo now, but they still have Lyft vehicles that have like on the side, it'll say a car with a human driver and everything.

[02:28:16] And I'm like, maybe you should stop advertising. I wonder... You know, there's a calendar reminder internally at Lyft that's like, you know, stop, stop advertising that, don't mention that anymore. Huh. I just... I feel like, okay. But we're... Yeah. Yeah. Which part is confusing, Leo? So... Yeah. What are we thinking about? Life. Life's the whole thing. Yeah. The whole thing is extremely confusing. Here's the one thing that... Here's a weird...

[02:28:46] So back in 2007, I think, I wrote a piece for Popular Science about how I thought, you know, robot cars would be a really good thing in terms of a life-saving thing. But one thing that many people made the point to me about was that you got to treat them like vaccines where everyone's got to be in one for them to save lives.

[02:29:09] And so, Leo, you and your awesome Trans Am will screw up everything. That was my 12-year-old daughter's. Well, yeah. So whoever insists on driving themselves will screw up the herd immunity. Oh, yeah. The problem. Yeah. So that's... Talk about it. Like, we've been talking a lot about, like, what's the nanny state situation? It will become... No more driving your own car. Right. I think there might be... I actually...

[02:29:38] I've been playing around with a little, like... Private car ownership is actually horrific for the environment. That's right. That's right. That's right. For safety, for everything. Pedestrian deaths. It's just bad. But can you imagine the United States being like, you can't drive your own car anymore? Oh, never. That won't happen here. Have you guys read the book? What is it called? I forgot. Is it Civil War? No, no, no. They're all called Civil War at this point, I feel like. There's a book about the U.S. banning cars.

[02:30:07] Oh, I can't imagine what that would be like. And it's causing a Civil War. Flames, gunfire. American War. My bad. But that's just like... Very good. That would be... There are many. A third rail of American politics. Nobody's going to go there because they know that's the American way. You know, that's freedom. It's the true tragedy of the commons, right? Because it is, you know, us all driving a car makes, you know, is a nightmare.

[02:30:36] I was talking to a guy like 10 years ago who developed apartment complexes. And at that time, they were trying... They were doing the math on what it will be to have an apartment complex with only six parking spaces and a shared pool of self-driving vehicles that the people living there get. It'd be so awesome. For themselves. It'd be so awesome. It'd be just what you needed, you know, but nope. Yeah. Well, I mean, you can make it...

[02:31:03] When we talk about environmental impact of AI and, you know, how we're going down this bad road and so forth. We made that decision 100 years ago when we decided to do internal combustion engines and build cities around them and build the country around them and give everybody one or two. And it just was inevitable. But it's so... I mean, it's so entrenched now. I don't... I'm kind of stuck with it. Well, it's like a... It'll be a market thing.

[02:31:31] I mean, one thing we do know, right, is that kids, young people, have much less interest in learning to drive. They care about it way less than they used to. That's true. Right? I mean, Abrar, you said you don't... That's because they have social media. Are you a New Yorker and as a result you don't need one, or... So I had one when I was in Illinois, but when I moved out here to the Bay, I just take BART. So it's really just a matter of convenience. But most of my... Yeah, most people my age do have cars, which surprised me, because I was like, oh, we're in a place where you just take public transportation. But no.

[02:32:00] But then you go to that one generation below and nobody even wants it. I remember I was dying to get my license at 16. The second I could go, I was at the DMV. I was so excited. Yeah. My daughter was like that. And my son is two years younger. Couldn't have cared less. He waited till he was 18. Yeah. I think that's a really common thing. And young people are driving the car longer and longer and longer. They don't care about buying a new one. Good. Right? Which is the other thing. Oh, that's good. Because that's the other thing.

[02:32:26] When and if they're all self-driving, you're not going to care what it looks like. You don't care how sexy the bus looks. Yeah. Well, you might. I mean, maybe. I care deeply. You're really aesthetic. Yeah. That's right. That's right. One more break and then we're going to finish it up with a few little tidbits. Other news that involves beheaded figurines. Oh, boy. We will save that for just a moment.

[02:32:55] Jacob Ward is here. Great to have you, Jacob. I really appreciate that you've recently joined our Tech News Weekly. It's such a wonderful thing to have you. His podcast, The Rip Current, is a look at the invisible forces changing our lives, from authoritarianism to addiction to the psychological dangers of AI. Wow. Are you happy independent? Are you glad you did that? You know, I love doing it intellectually, but I'm going to need some money. Financially, it's tough. Yeah. I know.

[02:33:25] I'm going to require some money. So, yeah. Anybody listening here, if you need, I would love to talk to you about it, some advisory roles. Because, yeah, the independent thing is great. It is a tremendous amount of freedom. But I also, I think that there's a lot of people, I don't think it's going to be the solution to defending democracy. I don't think everybody trying to stay afloat in the attention economy is going to be a way for us to, you know, keep people up to a top.

[02:33:55] I understand what you're saying, but I also worry about the mainstream media. Look what's happened to CBS. Yeah. You know, Bari Weiss, the editor-in-chief of CBS News. Look what's happened with the FCC putting its thumb on the scales. I feel like there's some advantage to being independent, unregulated media, that that may be the last place we can find real information. Yeah. I'm with that, you know, in theory. But I also think... Financially, it's a problem. I understand.

[02:34:24] Financially, it's a problem. And like, you know, Bari Weiss putting her thumb on the scales at CBS is a, you know, is a big shift. But I will say, you know, you want to see thumbs and scales, you know, you follow your average YouTuber, your average blogger, right? Well, that's a good point, too. Where there's no one around telling them what... So that's a really good point. Did you interview the other side? You know, did you interview the other guy, too? Yeah, I've been complaining about that for a while. Or just even advertisers, unidentified advertising and product placement stuff. Yeah, that's a very good point.

[02:34:55] Abrar Al-Heeti is still working for mainstream media. She's at CNET. Did CNET get purchased or something? I was going to say, we haven't been under CBS for several years now. And I'm like, oh, I guess that worked out. Aren't you glad Bari Weiss is not coming and knocking? So yeah, now we're under Ziff Davis. So we're... I'd say we're happy there. Yeah. So... Yeah. Good. It's a better fit than, you know... Good. What Red Ventures was. Well, as long as...

[02:35:24] You know, the thing that gives me hope, even at CBS, as long as there are people like you, people who really care, who have integrity, doing the work... Yeah. Management can change. Management can have its initiatives. But as long as people care... Yes. ...that's going to make a difference. That's exactly right. The people making the decisions, creating the content, that's what really matters. There's a lot of really passionate people at every news org that, you know, will keep it afloat in that way. Yeah. Yeah. I feel very fortunate that...

[02:35:53] And CNET should, too, that we have you on our shows. Thank you. Very kind. Yeah. And Harper Reed, who is in the private sector. Yeah. I'm definitely in the private sector. I don't think... Someone once said, why didn't you go to the White House? And I was just like, are you... Let's just think through this. Because W took all the keys, man. It wasn't like... That wasn't A, offered, and B, I don't think that's my vibe. Yeah.

[02:36:23] Well, anyway. Yeah. Yeah. I think that would be very hard. And culturally. Just culturally. Culturally. Very... I don't own a suit. Right. The use of psychedelics there is probably minimal. Stimulants. Well, I mean, I never use psychedelics, but the... Oh, okay. But the... Oh, your AIs should. Your AIs should. I'm personally not... I don't really imbibe. And so it's very nice to have my agents do it for me. Like they do all my fiction writing as well. Oh. Well, there you go.

[02:36:53] I'm just kidding. I don't write fiction. But if you did, they would write. I have a very strong rule that all my blog posts, everything, I have to write myself. I know. I feel really uncomfortable. But it's confusing to me because it says like 98% written by... Well, because I use spellcheck and everyone around me is pedantic. Oh. That's okay. Spellcheck. So everyone will be like, but did you use spellcheck? And I'll be like, well, yes. I don't really crack down on that. That's... That's a lookup table. Give me a break.

[02:37:38] That post was 98% written by a human. Because I figured that's an approximate... Perfect. Yeah. Yeah. And that's all you need to know. And then there is one post that is completely generated by AI. Is it marked that way? Yeah, it is. It is. I even have a... I even told you how I generated it. Which one is that? But this is something I say. I say this is a good reminder that if you get an email from someone and the writing is perfect and has no affectations, an AI probably wrote it. Right.

[02:38:07] Like that's... This post is basically just like... You're like, this is a perfectly written post with no grammar issues at all. And you're like, hmm. Some of us are grammar nerds. Okay. Okay. I just... Everything I read of yours, I think it's AI generated. I'm sorry. We're going to have to... Start making intentional mistakes. I'm going to have to like... But AI will figure that out and it'll start making intentional mistakes. I honestly think we're very close to, if not already at the point where you just can't tell. I mean, Sora is that good?

[02:38:35] I mean, look at the guy walking on the moon behind you, Jacob. I mean, obviously that's AI because nobody did that in that outfit. That's me last week. Oh, it's you. Okay. Never mind. Never mind. But most of the time you can't tell. Yeah. Unless he's tripping and farting all the time. You can't tell. Yeah. This is why we all need our little tells. I think it's hilarious. I actually turned off the tripping and farting and just made that so my clothes are more and more ridiculous. And I showed it to a friend and he said, isn't that how you dress?

[02:39:05] Oh. And I was like... So we talked about this before the show, so I better fill in people. We're talking about Sora, the video generation social app that Meta released to great fanfare. OpenAI, not Meta. Sorry. OpenAI. Meta has its own thing. But this is... Without any... The one thing that I will give OpenAI is they did release something with, like, a vibe. Whereas everything Meta released somehow is devoid of vibe. It's like Mark.

[02:39:34] And Meta has a mixture of real posts and AI posts. And the whole point of Sora, OpenAI says, is you just know it's all AI. It has to be. So you create your own cameo, which is an image of you. I made mine available to the public. Harper has not done that. But Harper gave me a very handy tip: you can add instructions to your cameo in the preferences. And AI Justine did this. I was wondering why every time you see AI Justine, she's got a little pet pig. She did that.

[02:40:00] So Harper used to have in his commands that he should always be tripping a lot and passing gas. And I think that's quite funny. A lot of the videos of you. It's funny. So I've done a little Easter egg in mine. And we'll see if people notice. I did generate a video of us. Did you? Oh, I did. All right. You're going to share it with the class? I don't know how to share things. I just know how to generate things.

[02:40:30] Just generate them. Don't share them. Just generate them. It's live. You can go see it on Sora. On Sora. Okay. And if you were in our Club Twit Discord, are you in our Club Twit Discord? I will send you an invite. You probably don't have time. But if you were, you could just post it there. Jacob, by the way, I should mention we're going to have a lot of fun on Friday. Right? I'm excited. We're playing Dungeons and Dragons. Yes. Oh, wow. It's one thing to talk to you nice people for three hours.

[02:40:59] It's another thing to be the head-butting half-orc I get to be for three hours. Is that what you decided on, a head-butting half-orc? Everybody else wants to be a magician. I want to crush skulls and kick doors through and stuff like that. Well, good. We need somebody in the party that's like that. Micah Sargent is going to be the dungeon master. Jacob's going to be there. Paul Thurrott. Jonathan Bennett from the Untitled Linux Show. Paris Martineau from Intelligent Machines. I'm going to be there.

[02:41:28] I'm a bard because I thought, well, a bard is kind of what I am anyway. I think my name is Sagbottom the Cheerful. And I have charisma. And I can play the bagpipes. I love the idea of you in the rafters as the rest of us are getting our asses kicked, strumming on a lute. I'm watching the party get murdered. You're doing great, everyone. That's exactly what I'm going to do. Anyway, this is going to be a lot of fun.

[02:41:57] If you're in the club, it starts at 2 p.m. Pacific, Friday, October 24th. We anticipate a three-hour cruise. So, you know, bring lunch. Should be a lot of fun. Micah Sargent will be the dungeon master. He's putting this all together. And this is one of the many reasons you want to be in the club. Partly because we do fun things. Partly because you support what we do with your contribution, your $10 a month subscription. You also get ad-free versions of all the shows.

[02:42:25] We try to give you some benefits and access to the Club Twit Discord, which is just a great place to hang out. If you're not a member of the club, please consider joining it. We really, we need your support. This is what keeps us independent. It's what keeps us able to do what we do. And we're very grateful. That's twit.tv/clubtwit. If you're not yet a member, please join. And we'll see you on Friday. I can't wait, Jacob.

[02:42:50] You... I wouldn't have thought this, but you have experience with this stuff. Yeah. So, you know, I played for an alarmingly long portion of my childhood, and much later into life than I'd probably like to admit here. But I then got to pick it back up again with my daughter. I'm part of a dads and daughters group every week. Yeah, we rip around.

[02:43:17] And the most fun thing was creating the characters, because we were joining an existing group. So we needed to come up with a whole story for why we got together. My daughter insisted on being this kind of very spooky half-demon child sort of thing that gets us into trouble anywhere we go, and me as this sort of big dummy. You know, she's Master, I'm Blaster was sort of our shtick.

[02:43:43] And to sit together and come up with, like, how were we both orphaned and then brought together... man, it was a cool thing. Highly recommended for the parents in the audience. See if you can get into it. And is it a virtual one, or do you go down somewhere? No, we physically get together. There's this nice friend of mine, a tech executive guy who, you know, shows up with, I mean, he's got maps made. He's the most organized guy. See, I think that makes it even better. Oh, it's so cool. Yeah. It's so cool.

[02:44:12] And it's just such a... because, you know, as any parent knows, there's a point past which your kid feels it's no longer cool to play pretend. And it's tragic, because suddenly they then feel this incredible pressure to like Taylor Swift and be cool in these various ways. And any kid in private will tell you, like, that's boring. I miss playing, you know? And this is one of these... this and filmmaking and theater, right?

[02:44:40] Are your ways of, like, continuing to make stuff up. I love that. And it's socially acceptable. So this is a great thing. You can do what I do and just live in a fantasy land yourself. That helps. If you are rooted in fantasy and not in reality, it's definitely a little easier. Yeah. I'm trying to channel my inner Harper. It's hard to keep it going, though. It's not easy. Well, we'll see you. I can't wait, Jacob. It's going to be a lot of fun. I guess I'm just going to be singing along as you guys get slaughtered in the caverns or whatever.

[02:45:15] This episode of This Week in Tech brought to you by Zapier. Oh man, I love Zapier. This is a way to get a little AI into your life without being an AI guru like Harper. Everybody these days is talking about AI. I mean, we spent half the show talking about it. But talking about trends doesn't help you be more efficient at work. For that, you need the right tools. Now, for years I have used Zapier. Zapier connects all of the tools I use together for workflows.

[02:45:43] For instance, when I bookmark a story that I'm interested in, it's automatically uploaded to my Mastodon instance. It's automatically, you know, I toot it or actually Zapier toots it. Zapier also copies it to a spreadsheet that the producers can then use to put the shows together. It's just a really easy way to get things done. Well, Zapier is how you break the hype cycle and put AI to work across your company because they've added AI.

[02:46:12] You can now deliver on your AI strategy, not just talk about it. Zapier has become an AI orchestration platform. So along with, you know, Google Docs and Monday and whatever you use, you can also bring the power of AI to those workflows. So I could have, for instance, AI analyze the stories that I've bookmarked and give me a summary. You can connect all the top AI models, ChatGPT, Claude, to the tools your team already uses. You can add AI into the workflows wherever you need it.

[02:46:39] You can even create AI-powered workflows. You can make an autonomous agent, a customer chatbot. You can do anything, really. Zapier has always been great for that. And now you can orchestrate it with AI with Zapier. Zapier doesn't require a PhD in computer science. It's for everyone, tech expert or not. And I'll tell you how you know: teams have already automated over 300 million AI tasks using Zapier. Join the millions of businesses transforming how they work with Zapier and AI.

[02:47:09] Get started for free by visiting zapier.com slash twit. That's Z-A-P-I-E-R dot com slash twit. We thank Zapier so much for supporting This Week in Tech. We're going to wrap it up. Just a couple of quick stories. I know you've got to get going, Jacob. Probably have a D&D date with your daughter. I've got to pick up my half demon child. I mentioned that the zipper is getting its first major upgrade in 100 years.

[02:47:37] I did not know this, but YKK, a Japanese company, makes almost half of all the zippers in your clothing. I think if you looked right now on any zippers you've got on you, it's probably YKK. You don't have YKK on it, right? Yeah. Yeah. So you'll also see if you look at a zipper that it is, you know, the metal teeth, the pull, but also there's a backing, a fabric backing.

[02:48:00] They have apparently figured out how to eliminate the fabric tape, which means that their new AiryString zipper is more flexible, lighter, and simpler. It does require some special equipment to kind of weld it into your fabrics. They have special sewing machines from the Juki company. So they've reinvented the zipper. Now, you may say, this is a technology show. Leo, why are you talking about zippers?

[02:48:30] Well, I always think of the zipper as one of the early technologies. If all you had was buttons, the zipper would change your life. And it did. I don't know exactly when it was invented. It wasn't that long ago; a little over a hundred years, something like that. So get ready. It's one of those ones like a... yeah, I was just going to say, it's one of those things like a bicycle where you're just like, there's no way you could possibly improve on that. Right. Like you've just done it. Yeah.

[02:48:58] What I like about YKK is that they were part of a worldwide price fixing cartel for zippers. The zipper price fixing. The zipper price fixing cartel. Big zipper. Big zipper, man. Keeping the prices high. There's going to be a movie with Matt Damon in it about that. Look for big zipper. And then finally, I did mention there would be decapitated figurines.

[02:49:24] According to the New York Times, police have broken up a Lego theft ring and have recovered hundreds of beheaded figurines. There they are. The heads on parade. That's incredible. In this guy's garage in Lake County, California, which is just up north of us. Santa Rosa Police Department busted the guy on Monday. Allegedly a Lego crime scene. Plastic figurines were everywhere.

[02:49:53] According to the New York Times, the heads were removed from their bodies and organized in neat rows by facial expression. Apparently, this guy had stolen a bunch of Lego, had a ring of Lego thieves. He would purchase them at reduced prices, turn around and resell the sets or individual mini figurines at inflated prices. This is why he was taking the heads off. There's, I guess, a demand.

[02:50:22] What I like about this is that in the hacker news comments, everyone was like, $6,000 is not that much Lego. Like, they just were not impressed with the scale of the crime. It's not that much, man. Yeah, they were just like, I don't see, I don't understand what the problem is here. Clearly, it's a dude who just loves Lego. You've got to love. There are $1,000 Lego sets. Oh, yeah. So, I guess it's probably the case. The crazy thing about Lego, right, is their big patents have all expired. Yeah, that's right. Anybody can make Lego. They used to have the big patent.

[02:50:52] They all expired, but they're still the dominant player. Because it's... Even though you can get knockoff Legos like crazy, people still flock to the real thing. It's kind of one of these lessons in like, keep making a high quality product, you know? What would you call your knockoff Lego? I don't know, Bogo. Bogo. There was Duplo. Sticky bricks. Yeah. Sticky bricks. I think you better get Claude to work on that. Give him some powerful... Some drugs. Powerful.

[02:51:21] Claude, you are stoned out of your mind. Give me a Lego... a name for my knockoff Lego. You're way into Lego, Claude. I need a name for my knockoff Lego. He's probably going to not have a problem with drugs or creativity, but be like, I can't possibly help you knock off a famous brand. Probably. You're right. That's exactly right. No, I'm sorry. That's one thing I can't do. Harper Reed's blog is harper.blog. His website is harper.lol. His company is doing this new bot thing. If you've got...

[02:51:50] It's an MCP for AIs. They can have their own social network. Find out more at 2389.ai. And it is now full dark. It is full dark. It is. The vampires are out. Halloween is starting. It's exciting out here. We're going to let Harper take off with his little stoned Claude, tripping and farting his way. I'm glad that this is my twit legacy for this October day. Thank you, Harper. Great to see you.

[02:52:18] Abrar Al-Heeti, great to see you. It's wonderful. She'll be on Tech News Weekly soon. Of course, not soon enough. We'll get you back here soon, too, I hope. Senior technology reporter at CNET. Always a pleasure. Lots of laughs, as always. I appreciate it. Jacob Ward, the demon hunter. JacobWard.com. TheRipCurrent.com. You know what? You have subscriptions, right? I do, yeah. Yeah, so subscribers get early access to the podcast. They get some behind-the-scenes features.

[02:52:47] There's some extra stuff that comes your way at TheRipCurrent.com. The podcast is incredible. You should absolutely support Mr. Ward so that he can remain independent and continue doing what he's doing. Well, the dirty little secret of my podcast, Leo, is that half the time I just grabbed somebody that I met through you. So I got Molly White. Oh, Molly's great. Paris has been on there. Great. So I've been falling in your wake a lot. Steal everybody.

[02:53:14] Yeah, when I can't get a world-famous social scientist, I steal your guests. And apparently you were on with my old friend Andrew Keen, who has been a contrarian for some time. Oh, man. He slapped me around good when my book came out. He was like, he gave me the full BBC slap-around. But then I came back on with him about a week ago, and we repaired things. It was really nice. Aw.

[02:53:42] Jacob, everybody should subscribe to The Rip Current. And thank you so much for being here. We'll see you on Tech News Weekly as well. Jacob Ward. Appreciate it. Dot com. Now go be with your family. We thank everybody for being here. You go be with your family, too. We do TWiT every Sunday afternoon. It goes on and on, I know. 2 to 5 p.m. Pacific. That's 5 to 8 p.m. Eastern time, 2100 UTC. You can watch live in the Club TWiT Discord if you're a club member. And please join the club. We really want to have you in there.

[02:54:12] Otherwise, we do it on YouTube, Twitch, X.com, Facebook, LinkedIn, and Kick. We don't do TikTok anymore. They made it too hard. But I figure six other platforms is plenty. After the fact, you can download audio or video of the show at twit.tv, our website. There's also a YouTube channel with the video. Best thing to do: subscribe. That way you'll get it automatically the minute it's available. And if you do subscribe, leave us a nice review. Give us the five-star treatment, will you? Let everybody know. We've been doing it for 20 years.

[02:54:42] And after a while, people, you know, kind of... it's not the hot new thing. But I think it's worth listening to. And if you think so, share it with your friends. Let them know. Thank you so much for being here. Thanks to our club members for making it possible. Thanks to Jacob and Abrar and Harper. And thanks to you. We'll see you next time. And as I've said for 20 years now, another TWiT is in the can. Bye-bye. This is amazing. This is amazing.

AI age verification, government surveillance, Australia social media ban, NSA encryption, Harper Reed, technology, cybersecurity, Abrar Al-Heeti, unencrypted satellite data, social media regulation, satellite hacking, SS7 flaw, Jacob Ward