Is social media addictive by design or just irresistible entertainment? The panel tackles the lawsuit that's dragging tech giants onto the witness stand and how surveillance tech is quietly expanding while lawmakers and users scramble to catch up.
- Jury told that Meta, Google 'engineered addiction' at landmark US trial
- Instagram Chief Says Social Media Is Not 'Clinically Addictive' in Landmark Trial
- Section 230 turns 30 as it faces its biggest tests yet
- Meta apparently thinks we're too distracted to care about facial recognition and Ray-Bans
- Amazon Ring's Super Bowl ad sparks backlash amid fears of mass surveillance
- Ring cancels its partnership with Flock Safety after surveillance backlash
- TikTok is tracking you, even if you don't use the app.
- Discord backtracks on controversial age verification rollout...kind of
- Discord/Twitch/Snapchat age verification bypass
- The DJI Romo robovac had security so poor that this man remotely accessed thousands of them
- HP's laptop subscriptions are a great deal — for HP
- FTC Ratchets Up Microsoft Probe, Queries Rivals on Cloud, AI
- T-Mobile announces its network is now full of AI by rolling out real-time translation
- Apple's latest attempt to launch the new Siri runs into snags
- SpaceX Prioritizes Lunar 'Self-Growing City' Over Mars Project, Musk Says
- Elon Musk declares victory with Medicaid data release
- Waymo Is Getting DoorDashers to Close Doors on Self Driving Cars
- Backblaze Drive Stats for 2025
- $1.8 million MST3K Kickstarter brings in (almost) everyone from the old show
- OpenAI Is Nuking Its 4o Model. China's ChatGPT Fans Aren't OK
- Hideki Sato, designer of all Sega's consoles, has died
- Byte magazine artist Robert Tinney, who illustrated the birth of PCs, dies at 78
- Launching The Rural Guaranteed Minimum Income Initiative
Host: Leo Laporte
Guests: Wesley Faulkner, Stacey Higginbotham, and Thomas Germain
Download or subscribe to This Week in Tech at https://twit.tv/shows/this-week-in-tech
Join Club TWiT for Ad-Free Podcasts!
Support what you love and get ad-free audio and video feeds, a members-only Discord, and exclusive content. Join today: https://twit.tv/clubtwit
Sponsors:
[00:00:00] It's time for TWIT, This Week in Tech. Stacey Higginbotham is back. Wesley Faulkner is also here. And, brand new, we welcome from the BBC, Thomas Germain. We'll talk about social media on trial. The 30th anniversary of the 26 words that changed the internet. Meta is announcing face recognition at a moment when they think no one's paying attention. And what about the Discord age verification? What's going to happen there? That and a whole lot more coming up next on TWIT.
[00:00:32] Podcasts you love. From people you trust. This is TWIT. This is TWIT, This Week in Tech. Episode 1071. Recorded Sunday, February 15th, 2026. Image Pickles. It's time for TWIT, This Week in Tech. The show where we cover the week's tech news.
[00:01:02] Stacey Higginbotham is here from Consumer Reports, where she's a policy fellow. And more importantly, the host of Stacey's Book Club. Woo-hoo! Hi, Stacey. Hey, everyone. Good to see you. I get a little extra Stacey this month, which is nice. And we will talk about the new book we have picked for the book club in a couple of months. Excellent. Give me a break before you do it so I can get a copy. You have a copy? A physical copy. Okay. Nice.
[00:01:32] You're going to read it on dead trees, huh? Always. I will probably listen to it on Audible. Actually, not Audible. On Libro.fm if it's available. Also here, Wesley Faulkner. Our good friend Wesley, founder of something new that is not yet going, but you can go and take a look at the prototype at works-not-working.com. Hi, Wes. Hey, thanks for having me. How are you? Good to see you. I am doing great. Nice.
[00:02:00] And we want to welcome somebody brand new to our microphones. I'm saying that because that's what they say at the BBC. Welcoming to our microphones from the BBC, Thomas Germain, tech correspondent at The Beeb. He's the host of a brand new show launched last week called The Interface. Hi, Thomas. Happy to be here. Great to have you. Do you have a tree growing in your house? Yeah, you could call it a tree. I think it's taller than I am now, but it kind of fell over a little bit.
[00:02:27] If you're not looking really closely, it looks like the green curtain is the trunk of that tree. That would be cooler, yeah. Yeah. Thomas, was it Gizmodo? Was it Consumer Reports before that? And he's been at The Beeb, the BBC, for a while now. What is The Interface about? Well, I think it's something that probably would appeal to TWiT listeners. You know, the idea of the show is that everything is a technology story now. Yeah, that's true. Exactly.
[00:02:56] Yeah, we're trying to be a show for everyone. So we've got, you know, three people with really deep expertise. There's me, there's Karen Hao, who just wrote a book called Empire of AI. Oh, I know Karen. Yeah. And then Nicky Woolf, who's done a lot of investigative, you know, miniseries, podcasts about weird internet culture mysteries and stuff. So you've got AI, you've got culture, and you've got consumer electronics. Exactly, right. Perfect. So something for everybody. And, you know, the idea is we'll be a show even if you think you
[00:03:25] don't care about technology. We're trying to appeal to everyone because it's every part of your life. Well, I see that your most recent episode was about, well, it might have been about Ring, it was about doorbells tracking you. We have a big privacy section today. Beautiful. But I thought we should probably start with that trial that's going on in Los Angeles right now. It's an omnibus trial where they brought together, I think,
[00:03:50] several hundred plaintiffs suing social media companies Meta and YouTube. Snapchat and TikTok were also in the suit, but they made a deal right before the trial began. The thesis is that Meta and Google are addictive, that they are addicting children, and that they should be liable for addicting children.
[00:04:18] And it has brought some of the biggest names in social media to the stand. Adam Mosseri of Instagram was on the stand this week. Mark Zuckerberg will probably be testifying next week. The plaintiffs' attorney said this case is about two of the richest corporations in history, who have engineered addiction in children's brains. It would have been four, but they made a deal. Mosseri said it's addictive
[00:04:46] like a great show on Netflix is addictive. Is social media addictive? Can it be addictive like heroin and cigarettes? I mean, is dopamine, and seeking dopamine, addictive? Yes. Can we actually test that in a way that's provable? Probably not, because we don't really have that yet from a scientific and brain chemistry perspective.
[00:05:11] But this, I'm so frustrated by this, because really what should be on trial is what we call dark patterns, deceptive patterns. And that sort of thing is being deployed; it has been deployed forever. And like everything on the internet, it's deployed at scale, and in such an, I guess, emotionally close way for people. You're snuggled up in bed looking
[00:05:38] at this thing, and it is harmful. And I would argue that it's not just children. I mean, I look at my father and his addiction to YouTube and what's happened as a result of that, and I'm like, holy moly. So anyway, that's just Stacey's two cents. Legally, I think it'll be hard to prove. Do I think these companies have a lot to answer for? Absolutely. Well, I think the really interesting thing is why we're talking about whether it's addictive
[00:06:05] in the first place, which is, I think, because there are basically no laws regulating these companies, at least not on this in particular, right? Like, they can't go after a social media platform because of the content, because Section 230 protects them. So we've been seeing this whole raft of lawsuits, not just against social media. There was similar stuff about dating app companies a couple of years ago. And the idea is, if we can prove that the design of the platform is the thing that's hurting
[00:06:32] people, as opposed to the content itself, then maybe we can hold the companies liable and the courts will be able to do something. And it's this one little narrow legal question that, you know, in a lot of ways, the whole future of the internet hinges on. It seems a perilous contention, though. Look, I'm watching Succession on HBO for the fourth time because I get a big dopamine hit from it.
[00:07:01] And I love that dopamine. Is HBO liable for making a show that's very, very good? I wonder if there are parallels. So remember back in the eighties. I do not, because I was a child. But I do, because I was a full-grown man by then. Go ahead. I'm just like, I don't want to make it sound like, oh, I was there thinking about this, but I did take a college class on this.
[00:07:26] In the eighties, they removed the regulations around advertising to children on television. Right. And that launched some really, truly terrible TV shows that I loved wholeheartedly, and a lot of sugary cereals. But I think about, are there parallels to that kind of deregulatory era? We haven't actually had any regulations here, but should we go back and look at what we did there?
[00:07:55] Because there was the concern that children's little brains were not able to parse ads, and that they shouldn't see a lot of advertising because they were so... Yeah, that's true. I don't know if these are good parallels, right? There's a difference with the on-demand, all-the-time nature of these apps; they're different than even Succession.
[00:08:16] For instance, if you think of it as an opt-in, opt-out model: you have to opt in to turn on HBO, and then choose what to watch. Don't you have to install Instagram and then scroll through it? I mean, don't you have to do that too? But you don't get to choose what you watch. You don't get to choose when it stops. Ah, they have these controls that are for parents that you have to opt into.
[00:08:42] So it's not having all these restrictions by default, where you then have to opt in to release them, to turn them off, and say, I want to do this. I want to get more shows. I want this type of content. I want to be able to have endless scroll. I want to not be bound by time. That's not the default. Instead, adding those restrictions is something you have to opt into. And I want to correct myself.
[00:09:11] The plaintiff in this is a single person; I was confusing it with another trial, which does have a number of plaintiffs. She's a 20-year-old California woman known only as KGM; they're trying to protect her anonymity. Yeah, I mean, I don't have any problem with restricting sugary cereal ads on children's television, or keeping adult television to after 10 PM or whatever. I think regulations like that are okay.
[00:09:41] I just worry about the chilling effect on free speech of telling a company, well, you can't make a product that's too good, because then you'll be liable for addiction. I don't think the issue is necessarily that it's too good. I think that it is hard. Like, I mean, we've made heroin illegal, and it is too good, but it is also physically
addictive, as are cigarettes. Well, okay. So we have shown that these sorts of infinite scrolls and TikToks, any social media, do create a dopamine hit that people continuously go back for. Is it that same level? That's hard to... I mean, is it like heroin? Well, Mosseri said, if you don't have withdrawal, then it's not addictive.
[00:10:35] So I will tell you, okay, this is Stacey's story of my TikTok addiction. We can all laugh because obviously I'm fine. I actually did realize that I had a problem when I got an RSI injury from holding up my heavy phone. Right. Right. Okay. So I had uninstalled the app, and that was like, oh. But then I'd reinstall it, because my kid would send me something and I felt like I needed to be there for them.
[00:11:05] And then I set time limits for myself, but whenever I picked up my phone, I automatically went to it. So it's not an addiction; I'm not going to go sell my house or rob somebody to get access, like steal a phone so I could check TikTok on it. But it was a habit. It was very easy to form a habit, much like smoking cigarettes. Yeah. It's habituating. I won't deny that. And also the social nature, sorry.
The social nature of it is also the thing that is tapping into our psyche. But it's valuable. I mean, the argument is there are kids who are isolated, whose parents don't agree with their sexual choices or whatever, who find solace and a community online in these social networks. Right. But that's... we're going to talk about Section 230 later, and there's a balance of pros and cons that we need to keep examining.
[00:12:02] And I think the social nature of it is not just everyone finding community, but everyone doing it because everyone's doing it. I don't know if you've seen those experiments where everyone is taking a test in a room and they pump in smoke, and the one person who is not a paid actor is wondering why everyone is staying in their seats and not moving. The social bond is such that if no one panics and no one leaves, then they feel compelled to stay.
[00:12:31] And I think it's the same. This is the opposite of screaming fire in a crowded theater. Yes. Of not screaming fire in a crowded theater. Yes. If no one does anything... even if you think about the Will Smith slap example, how no one was, like, tackling him. Yes. Because of the social pressure from everyone around them. And I think that is also something that needs to be thought about. It's not just that it's available; it's that everyone else is using it in this way. Well, I think you figured out where I'm going. Go ahead. I'm sorry. Oh, no.
[00:13:01] You know, also, though, I think it's important to go back to what Stacey was saying about dark patterns, right? You can't really compare it to HBO making Game of Thrones so good that I can't stop watching it, because we know that Meta, for example, is well aware that the design of its products causes harm to its users. Right. I don't think HBO is well aware. Their shows are engineered, but you can always... Yeah.
So it is very easy to see, when you're watching The Pitt or any Netflix show that is designed to eventually be binge-watched, that they end on a cliffhanger. Yeah. Right. That's called a cliffhanger. Yeah. Okay. But that should be illegal. Making me watch the next episode. We actually stop all of our TV shows about 10 minutes before the end, because we want to... especially if we need to go to bed, that's what we do. To avoid the cliffhanger. Yeah.
[00:13:55] That's how we watch shows in our house, because we're very... I love it. But there's a very big difference between ending on a cliffhanger for something that has a finite lifespan, even if it is, you know, 15 hours of binge-watching, and having an infinite scroll. I guess. I feel like, I mean, every novelist, if they could, would write a novel that compelled
[00:14:24] you to read the next installment. So why go after these guys? Is the crime that Instagram and Facebook and YouTube committed that they're too good at it? I think it's that they know that they're hurting people. And we've got all these internal documents from the company talking about, like, oh, we noticed that when we turned this feature on, it caused this shame spiral, and all these teenage girls started watching 10 hours of eating disorder content.
[00:14:52] And they also know that there are solutions that they could implement to address this problem, but they choose not to. We have all these documents where the companies are discussing internally: this is a big problem, and we're causing it, and we could do something about it. And they're choosing not to because they want to protect their bottom line. And that's where it's a little different. If HBO knew that it was doing that, and they had a solution, and they could still make a great show that wouldn't cause you to have an eating disorder,
[00:15:18] I think we would all be pretty upset that they were proceeding in that way. And I think that's the way that we should be looking at it. Yeah. The incentive structures are totally different. Yeah, you're right. The one wants you to watch more ads, so they get more money the longer you stay on the platform. The other one is like, we just want you to stay around for another month; you can watch as little or as much as you want. We just want you to keep your subscription, which is a totally different motivation. YouTube actually argued, we're not social media.
[00:15:47] We're just an entertainment platform like HBO. Their lawyer said it's not trying to get in your brain and rewire it like those other guys on the stand. It's just asking you what you like to watch. Is YouTube social media? I mean, is anything social media anymore? Like, what's the difference? No, you're right. Because Instagram is not my friends. No. Yeah.
[00:16:12] I think that's a strange argument to make, that this is different because of the definition of what a social media platform is. Like, MySpace was social media; Instagram in the early days was social media. But I go on any of these apps, and the last thing that I'm looking at is what my friends are posting, and they're posting less and less. And that's by design. All of the companies are doing this. Look at TikTok: it's a stream of entertaining content designed to keep you scrolling. Instagram is the same way. Yeah. Instagram is the same.
[00:16:40] I suppose you could argue Facebook's the same. Certainly YouTube is that way. So you kind of hit on the thing that I do care about on this. And the reason I'm arguing against any restraints is because of Section 230. I don't want to open the door to undermining Section 230. 30 years old this week.
[00:17:03] And, you know, it's been called the 26 words that created the internet. It was part of the Communications Decency Act. And it said that no provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider. In other words, Twitter wasn't responsible for your tweets. You were. And Twitter couldn't be held liable.
[00:17:33] And by the way, a lot of people who are trying to change Section 230, including a number of well-known Hollywood actors, which puzzles me, say that it's big tech trying to manipulate you. But Section 230 also protects me. It protects my chat room. It protects my Discourse forums. It protects my Mastodon instance.
[00:17:57] If I were liable for the things people posted on those, I would just shut them down, because I couldn't defend myself. I'm not big enough to defend myself. In fact, big tech can. So do you think this trial is in any way a threat to 230 in the long run, or does 230 protect these companies? I think they're trying to sidestep that issue altogether, because they're not talking about Instagram hosting dangerous content. It's not the content.
[00:18:26] It's what Instagram is doing with their algorithms. Right. A whole separate can of worms and, I think, much more complicated issues. This is what people like Joseph Gordon-Levitt say: well, when you use an algorithm, you're a publisher. Right. Which I think is a point that's really worth focusing on. Right. You, in your Discord, maybe you have some rules set up where if someone says something that's, you know, really deeply offensive, or breaks a law, you're going to ban them.
[00:18:53] That's a lot different than a social media platform where they've got this algorithm, and we know that they're making very specific, intentional decisions about what kinds of content they want to promote. We also know, with TikTok, for example, there was a story a couple of years ago about how they have a heating button, where if they have a particular user that they want to promote, they go in and turn it on and it makes the content go viral.
[00:19:19] So I think there are all kinds of cases where the tech platforms are acting like publishers. Like, YouTube decides what kinds of stuff goes on the homepage. These algorithms are making the decisions. I think it's not really accurate to describe these platforms as just neutral the way that Reddit is, where anyone can just go on there and post.
[00:19:44] There's something else that's going on here that I think maybe needs to be addressed in a different way than, you know, your Discord community, because it operates in a completely different way. What's the remedy? So here's what I've been thinking about. And I hate thinking about 230, because it's like the third rail of the internet. No, we have to defend it. We have to defend it. Hold on, hold on. So I'm thinking back to, you know... this is influenced by my net neutrality years, right?
[00:20:12] Thinking about, we had the common carrier rules for ISPs, right? And then we came up and we were like, you know what? These email providers and internet folks, they're a little different. They're broadband providers, so they don't have to follow these rules. And I know we spent a long time debating over how that would happen. But I wonder, because we would have to figure out what is a publisher on the internet versus what is a platform that is pushing something harmful.
[00:20:42] And you could define it by the platform. You could set regulations for proving harm. I don't love that idea, but there are lots of policy ways we could tackle this. But what happens is, we go after 230 and everyone rightly freaks out. So it's time for nuance, and this moment is really terrible for nuance. And I'd love to read some really good thinking on that. Please send me stuff. I think you're right.
[00:21:10] I think it's true that when a company creates an algorithm that's designed, as best as possible, to hook people into doomscrolling, they have now become a publisher. Or look at AI, right? Like, you go on Google and do a Google search right now, and Google is speaking to you. Google is generating the information itself and giving it to you. They're publishing. They are publishing. They are making the information up themselves and delivering it to you. That's a completely different thing.
[00:21:39] I think we have to protect Section 230. It's true. The whole internet relies on it. We cannot get rid of it. The solution that I always point to with this kind of stuff is transparency. What if there was a law that said you had to make it so that outside auditors, and we could decide who they are, can go in and look at exactly what the algorithm is doing and demonstrate, you know, this piece of content is being promoted in this particular way. And this is how people are reacting to it.
[00:22:09] This is what the algorithm is doing. Just by putting it out in public so everyone can decide, like, how do we feel about all this, I think would create, you know, like Wesley was saying, a social pressure on these companies to behave a little better and be better stewards of what's happening on their platforms. Now, that wouldn't solve it. But where we are right now is like nothing at all. That's like these companies can never be held accountable under any circumstances because of these rules that really are protecting and upholding the internet.
[00:22:37] I think we've just gotten to a place where the internet is so different that we have to come up with a new way to look at it. But it gets, like you're saying, gets thorny really fast. It could even be opt-in for some places to have, like, almost a score, like a health score for a restaurant. You can see if it's an A, a B, a C, a D, an F on certain categories where they can choose to be part of it or choose not. Some of this stuff, especially with kids. I worry that Josh Hawley is going to be doing the grading.
Well, it has to be useful for people to trust it, right? And so if you really want to tackle this, it needs to be extremely objective. But if lawmakers make this law, who is going to be the person doing the grading? Lawmakers? It could be a consortium of different people. No, no. Lawmakers hate actually doing that. Yeah, exactly. They're going to give it up to, God help us, someone else. If we're lucky, it'll be a government agency that will do it fairly and in public. This is what Facebook pretended to do, right? With their, whatever they called that group.
[00:23:37] It's still around. Or just let people make their own benchmarks. Like, think of a Geekbench. I agree. Social norms are the only thing that can protect us, not laws. No, I think actual rules need to protect us. And I think everyone... really, we love transparency. I mean, God help me. No, you're right. I'm looking at so many labels right now. No, you're right. But the problem is, and I will go back to kids, because they don't have the ability to look at this and understand the label the way I might.
[00:24:05] Like, I may be able to look and say, oh, that's, you know, pro-ana content. Were the motion picture labels, the MPAA ratings, laws? Rules? Were those effective? R, G, PG? When I showed my kid, when they were young, things that I watched that were PG or PG-13, I was like, oh my God, this is the most sexist, horrible thing. No wonder I, as a 90s child, grew up with some of the ideas I had. Yeah.
[00:24:35] I mean, the MPAA allowed murder on camera, but not a naked breast. But now they don't. Now they don't? Well, these things change; the way they were enforced has changed over time. I just don't think ratings are the way. I think we really need to think about the harms we're trying to prevent and then set up rules to address those harms. And probably some sort of body that arbitrates that, because it's going to change over time.
[00:25:05] I don't know. This is why I want people to send me good ideas. You can see how hard this is, to be honest. We have a system set up for copyright, right? Like, if I notify... well, it's complicated, but if I notify Twitter or TikTok that someone is hosting, you know, Paramount's content, they have 24 hours to take it down before they violate the DMCA, right?
[00:25:30] You could create a system where, okay, we decide that there are certain kinds of content that we think are so noxious that they're illegal and maybe we need to expand it a little bit. But even just the rules that we have, right? We know that companies aren't doing a good job of taking down dangerous, like completely illegal stuff. We could create more robust protections where they have to have a system in place where they act on stuff in a certain amount of time if they're notified about a particular problem.
[00:25:58] And then, you know, there's some accountability system where they have to tell everyone how they're doing it, the same way that they have transparency reports about how many times they respond to, you know, warrants and police requests. I think there are a lot of small interventions that we could make that everyone except the tech companies would kind of agree on. It's when you look at the big picture, when you're trying to write one law about the internet, or when we want the senators, you know, deciding what kind of speech is and isn't okay. That's when it gets complicated.
[00:26:28] There are a lot of common sense interventions. It does seem like there has been a growing consensus that this stuff is bad for us, it's bad for our kids, and that these companies don't really care. It's hard to think of what a remedy would be, but I think we've all kind of come around to this idea that this is not the utopia that we thought the internet would end up being. I think competition is the way to fix it, though. What's going on with the Fediverse?
[00:26:54] Hopefully, it'll mature enough that people can feel like they can opt into the experience that they feel is good for them. But part of it, I think, is that we lack the vocabulary to even describe what we want, because we don't have this competition. You can't say this is more X, this is more Y, this is whatever, because we can't even have that conversation.
[00:27:16] We just say safe, harmful, or addictive, but we don't have the nuance to be able to say, let's make a platform that's different than all the others where people know if they want a certain experience where to go. You know what the problem is, Wesley? These algorithms are so good that they win. You can't have competition because people want Instagram. They want these algorithms. You can't have competition also because there's no business model for this right now.
[00:27:45] Well, the business... but Instagram and Facebook and YouTube are making billions and billions of dollars. Their money is based on engaging us at all costs, even at the cost of harm. We recognize that. Right, but if you're not going to harm people, if you're going to create a place where engagement doesn't matter, then you're looking at something that people pay for. And fewer people will pay for that; that's part of the reason. I mean, it does get down to this idea.
[00:28:09] I wonder if some of this is down to the fact that we think we want the internet to be a place where we socialize and can connect and have these engagements. And we do, in the sense that, if you live in a small town, it gets you all of these people and engagement you otherwise would never get, because those people aren't there. But we don't want to spend money on it. And honestly, I don't know if we should have to. I mean, basically, we think the internet's doing one thing, but it's really doing another.
[00:28:39] And most people are not aware of that. And that's hard. And I love the Fediverse. I hope it helps, but I'm not as optimistic. It's not very competitive, is it? You know, when they poll people about which TV channels they watch, they always say PBS in a far greater number than people actually watch PBS. Right? People say they want to eat vegetables. Right. They like McDonald's. Yeah. I think it's a problem.
[00:29:07] The biggest problem here is these platforms are hugely successful. But you could make it more painful and more expensive for them. So some of these little alterations could make it... because I think we're like, let's stop them from being evil. But what if we just said, hey, let's make it really expensive, a pain in their butt. We did that with cigarettes. We raised the cost of cigarettes. In Canada, they have... I think it's probably true in the UK, too.
[00:29:35] They have labels on cigarettes that are graphic. Yeah. They show lungs. And it's like, this is going to happen to you if you – and it does not seem to deter people much. What are you talking about? Nobody smokes anymore. But you know why? Because we make sure that they have to do it outside when it's cold. Right. I feel like there's been a bunch of interventions to make it much more painful. You're right. There are interventions that work. But we did a whole bunch of different ones. Yeah, we tried them all. Yeah. Yeah, that's true.
[00:30:04] Do fewer people smoke? Oh, yes. In the U.S., for sure. In the U.S. Yeah. But in Italy, it's – I'm afraid of it now. You could smoke – can you smoke inside places in Italy? No. When we were in Rome last year, Lisa said, I never want to come back here because everybody's outside smoking. Yes. Anytime I'm in Japan or parts of Europe, I'm just like, woof, you guys. When I was in Istanbul, they were smoking in the hotel.
[00:30:34] And people know – I mean, there's nobody who doesn't know that smoking is bad for them. That's going to kill them. Right. But think about this culturally. You see it less in movies and TV shows. And I think – That's true. That goes back to my previous example. It's not cool anymore. Yeah. They are worried about it coming back. We've moved away from that. Yeah, it's not cool anymore. All right. We're going to take a break and talk about privacy next. Great to have Thomas Germain, first-timer from the BBC's The Interface. You've got a great team. I can't wait to listen to it. That sounds great.
[00:31:04] Just launched last week. Just last week. Yeah. But you already have muffs. We've been doing this 20 years and we do not have microphone muffs with our name on it. You've got to get on my level. I don't even have a mic flag for crying out loud. We're coming for you. Yeah. I'm in trouble. Also, Stacey Higginbotham, who thank goodness does not have a Consumer Reports mic flag. But she does have a little, that little wooden CR behind her, which Paris now has. And Nicolas De Leon now has. Oh, good. Yeah. Taken over everywhere.
[00:31:33] They were from the pandemic, apparently. Oh, because everybody had to do Zoom calls. Yeah. Isn't that interesting? That might be the number one thing the pandemic has changed. We've all become, even on major news networks, we've all become used to people sitting in their bedroom. I'm in my office. Okay. Or their office or whatever. It's really changed. It's really interesting. Wesley Faulkner is also here. Great to have you, Wesley. We'll talk about your new company, Works Not Working. Thanks. In just a little bit.
[00:32:02] Our show today brought to you by Zscaler, the world's largest cloud security platform. And the time is right. We talk a lot about AI on all of our shows. The potential rewards in business are obviously too great to ignore. But there are risks. There are a lot of risks. The loss of sensitive data. Attacks against enterprise managed AI. Generative AI increases opportunities for threat actors, too.
[00:32:30] Just as it's helping your business, it's helping them. To rapidly create phishing lures that are indistinguishable from the real thing. Do you notice? I don't know if you've noticed. We get more and more of those every single day. I got one the other day from a well-known company saying, your bill is overdue. Just click this link to pay it. Also, bad guys are using it to write malicious code. We're seeing malware crafted by AI. They're automating data extraction.
[00:32:57] One of the big problems that maybe you don't even see in your business is even the use of well-known SaaS AI apps can leak information. There are 1.3 million instances of social security numbers leaked to AI applications last year. ChatGPT and Microsoft Copilot, between the two of them, saw nearly 3.2 million data violations. So, this is my pitch.
[00:33:22] It's time for a modern approach with Zscaler Zero Trust Plus AI. It does a bunch of things that are very important. For one thing, it removes your attack surface. It secures your data everywhere on-prem and in the cloud. It safeguards your use of public and private AI. And it protects you against ransomware and AI-powered phishing attacks. Wow. Just check out what the Director of Security and Infrastructure at Zwara says about using Zscaler.
[00:33:51] AI provides tremendous opportunities, but it also brings tremendous security concerns when it comes to data privacy and data security. The benefit of Zscaler with ZIA rolled out for us right now is giving us the insights of how our employees are using various Gen AI tools. So, ability to monitor the activity, make sure that what we consider confidential and sensitive information, according to companies' data classification, does not get fed into the public LLM models, etc.
[00:34:20] With zero trust plus AI, you can thrive in the AI era, stay ahead of the competition, and remain resilient even as threats and risks evolve. You should go right now. Check it out. Learn more at zscaler.com slash security. Zscaler.com slash security. We thank them so much for their support of this week in tech.
[00:34:43] We have, you know, it's funny, when I was putting this show together, there weren't a lot of stories about AI, but there were a ton of stories about privacy. I'll start with one that kind of tells you something. Meta, according to Business Insider, apparently thinks we're too distracted to care about face recognition and Ray-Bans.
[00:35:02] They say they're going to add face recognition, that thing that Google refused to do way back when with Google Glass, to the Ray-Ban smart glasses, and they're going to do it soon. Figuring, yeah, there's too many other things people are worried about. They're not going to notice. We noticed, Mr. Zuckerberg. We noticed. Should we worry about face recognition in these glasses? It seems to me this is the one thing I've been waiting for.
[00:35:31] To be clear, they said that it wasn't you guys that they're worried about. They were saying that the consumer groups that fight against this or the rights groups that fight against it. They're busy fighting literally everything else because there's so much happening. So it's not just, it's not consumers that they're saying. The New York Times viewed a document that said, quote, Meta's internal memo said the political tumult in the United States was good timing for the features release.
[00:36:00] Yeah, I mean, it's one of the most cynical things I think I've ever seen from the tech industry. And that is really saying a lot. Yeah, no kidding. Yeah. If you read, I mean, Careless Whisper, that was literally, I mean, like, it did not surprise me at all. I was like, well, of course, do we not know that this is how they operate? What's Careless Whisper? That was the book by, was it Francis? The woman who used to work there and like. Careless People? Careless People, sorry. Careless Whisper was by Wham. That's a different song. Yeah.
[00:36:30] Okay. I was curious because I thought that was maybe a code name. Oh, yeah. Careless People blew the lid off of this. Wow. Well, you would think, but like. No, nobody cared. People kind of just shrugged. Yeah, so I will say. Careless Whisper. Facial. I'll sing it later.
[00:36:51] I am terrified of this, both as a woman, as a reporter, as a person who has been stopped. I just, like, everything about this is awful. But it's also something that, I think Om and I had conversations about this when I was back at GigaOm. I mean, like in 2012. Well, it's been on the radar. Google. Yeah. Remember, Google said we're not going to do that because of that very reason.
[00:37:21] We could. Although it is built into the Nests. I think it's built into the Nest cameras. So maybe they decided to change their mind. You can turn on facial recognition in your Ring camera. Sorry. You can opt out of it. Ring has Familiar Faces, so you can see it on your Ring camera. You can see. But you understand that in order to do that, it's sending your face back to the home office. Yes.
[00:37:50] Comparing it against the database and then sending that information. But it's not done on device. Right. One of the biggest things that I'm angry about and argue with people about all the time is end-to-end encryption on any sort of AI-powered service anywhere. Like the Kohler toilet when it launched and was like, well, AI will analyze your poop. I recognize that. But oh, you're OK. And they were like, we have end-to-end encryption. Absolutely. But end-to-end encryption.
[00:38:19] It takes a whole new meaning. Toilet cam. I'm too serious for this. Yes. I'm sorry. I'm sorry. I'm getting silly now. I apologize. Actually, this came up.
[00:38:31] Ring, remember, two weeks ago, last week, it wasn't even two weeks ago, during the Super Bowl had what many thought was kind of a dystopian ad showing how you could find lost doggies using Ring's new surveillance feature. But as many people almost immediately pointed out, Search Party isn't just for dogs. It's for, oh, I don't know, anything.
[00:38:59] And part of the problem was that Ring had a deal with Flock. And Flock has a deal with ICE. So it all kind of goes around and around. Do you want accurate information on this or? Yeah. Flock is the surveillance company that does license plate recognition, ALPR, automated license plate readers, and video surveillance. And anyway, Ring, which, here's the good news. After the backlash.
[00:39:29] Ring said, oh, never mind. We're not going to do that deal with Flock. OK, but search party still. Those are two separate things. Search party is still there. Yeah. Ring is very accurate. Oh, go on. Oh, sorry. I said they're also doing the deal with Taser, too. Still. That's still going. Yeah, they still have. Taser. They're going to build a Taser into my doorbell? No, it's OK.
[00:39:53] Ring does a lot of third-party partnerships with law enforcement because they need chain of custody for some of the videos. And they need. So that's part of, there are some nefarious fun things happening there. So the deal with Flock and the deal with Axon, the Taser and body-cam maker. Those are, Ring did cancel their deal with Flock. They still have the other deal. Were they going to do license plate recognition?
[00:40:22] No, Flock was going to be able to access video feeds on like from Ring. And I don't I think Ring would give them the video and then Flock would run their own algorithm on it. So I think this is one of those cases where you do want to. There's a famous quote, like ski down the slippery slope here. The CEO of Ring says that his company's goal is to zero out crime, which like, first of all, kind of ridiculous.
[00:40:50] I don't know what Ring cameras are going to do about white-collar crime, but, like, crime, what are the crimes we're talking about here? And unlike some of the stuff we were talking about in the first half of the show. Right. This is something that almost everyone I know who doesn't work at a gigantic tech company agrees on. Every single person I go up to, you know, at a party, I tell them about, you know, an investigation I've done about some privacy issue. The response is always like, well, what's going to happen now?
[00:41:17] And the answer is nothing because there's no rules about this stuff. Right. People want laws on this issue so badly that they assume that they already exist because it's like so clearly a thing that everyone agrees on. I mean, facial recognition, like I'm not the first person to say this, but that is the end of any form of privacy in our society.
[00:41:41] It's over. It's done. As soon as that is just like a regular thing that, you know, a guy walking down the street can like hit a button on his glasses and identify who you are. It's over. And we're already there with all these cameras that are operating in public. If you see a security camera, you should assume that there's facial recognition software running on it. And, you know, like maybe you want to get rid of all crime like Ring does.
[00:42:06] Maybe you're in favor of that, but I think it's really important that we talk about what exactly it is that we're trading to get there, especially when we look at what Meta is doing, adding facial recognition to the glasses, trying to do it in secret. So I think we can assume the goal is like you get there and it's like, well, it's already here. Everyone's already using. We can't roll it back now. Like it's just become a regular part of everyday life. It isn't too late if people aren't happy with this stuff to do something about it.
[00:42:33] And what we saw with the reaction of the Super Bowl ad is perfect evidence. There is a constituency that wants this, though. There are suburban people who want to, you know, look, if somebody's stealing boxes off my porch, I want to help law enforcement catch them. Or if somebody across the street does a home invade, there's a home invasion across the street. I want law enforcement to catch them. If my camera can help, I want it to help.
[00:43:01] So they think they want it. Oh, we'll let Wesley go because he's going to make the... I'm sorry. It's just that... Yeah. What is being created right now is a marketplace for locating people. And once there's a marketplace, they will try to get as much data and sell it to as many people as they can.
[00:43:26] With the meta glasses, you said someone will push a button and see where a person is. That's not... Or who that person is. Yeah, I want that because when I run into you in about five years and I can't remember who you are... But that's meta letting you know who that person is. But meta will know all the people in that whole shot and they just will keep that data for themselves. Or sell it to Flock. And especially sell it.
[00:43:52] Because now they say, hey, I know that you're in a Barnes & Nobles and I know you just ran into your friend Tom. But now I know that Gina, Mike, and blah, blah, blah is in that Barnes & Noble. Hey, Barnes & Nobles, would you like to know who's going to your store and when? And that's the problem with the ring, isn't it? It catches everything that's going on. Regardless of your little puppy. They have it from your cell phone data. Like they know where I'm going. So the visual isn't necessary there.
[00:44:22] But now meta will have it. And meta will sell it. And now they would try to get more of it and they try to sell more of it. It's just like making another marketplace where your privacy is also sold and then people can aggregate it. It's also a surveillance net, right? It becomes at some point there is universal information about who's there, when, doing what. And I will say, so way back, Ring launched Neighbors.
[00:44:51] I was podcasting from my old house. So I had Jamie Siminoff on my IoT show way back in the day talking about this. And I was like, hey, man, what about racism? What about petty crime, people getting caught up in petty crime? And Neighbors really did become racist. It was. It was incredibly racist. Yeah. There's a black man walking down my street. But he was like, we're going to have social shame for that. We're going to, those people aren't.
[00:45:22] Exactly. So I want to point out, and I've even brought this up before. I installed Ring cameras for my in-laws years ago because they really wanted to catch porch pirates. That was a big, they lived in, like, a million-dollar neighborhood in Texas, in a suburb. They were fine, but they were deeply concerned. And I brought up these concerns and they were like, Stacey, that's not real. And I don't know if those people have changed their minds.
[00:45:48] I don't think that the idea of zero crime, I mean, it's been sold to us for decades as the reason for needing surveillance. And it has perpetually led us to criminalizing, usually being poor, being black or brown. It isn't really about zero crime. It's about control and particular control of certain people. And I do wonder if our current political moment means that people are going to push back much harder. I think so.
[00:46:18] I think we used to think it was, I used to think, oh, privacy. So what if a, you know, a marketer gets my information? No big deal. Now you start to talk about, well, what if a federal, out of control, federal law enforcement agency gets that information? How, how do you feel about that now? Yeah, I don't know anyone. Oh, go ahead. Sorry. No, please.
[00:46:38] I don't, I don't know anyone who is comfortable and confident in the future of, you know, the governments of the leading world powers. Right. We've seen, you know, no matter what side of the aisle you happen to be on. And I think that's what, yeah, a lot of people are okay with this. A lot of people, you know, you tell them like, well, they're handing the data to the cops and they go, great. I love cops.
[00:47:01] But I think you, you have to use your imagination a little bit here about the amount of power that we are handing. Not like, sure, the government can get it, but also these gigantic corporations that operate and have the power of nation states. Right. You know, here's the evidence people are becoming aware of this. When the Nancy Guthrie story surfaced, one of the things they said is, well, we don't have, you know, she had a Nest camera doorbell, but she didn't have a subscription.
[00:47:30] And if you don't have a subscription, it's not sending the photos to the cloud, the information to the cloud. I guess the bad guy took the camera and disabled it. No. So, well, wait a minute. Hold on. Then it comes out two weeks later. Oh, we got pictures. Wait a minute. I thought you said that she didn't have a subscription. Oh, well, we were able to retrieve stuff that was like leftover. And now we have pictures. And what was, I know, hold on.
[00:48:00] What was gratifying to me is that people immediately said, wait a minute. Google has access to those? And that was a shocker. Go ahead, Stacey. How did people not know that? The Nest cameras don't have local storage. That's not how they work. No, no, but she didn't pay for it. I know. But they can't. If you have remote access to any, and your audience knows this, if you have remote access to any device on your home network, it goes to a cloud.
[00:48:29] That is literally how they work. So if you have a Nest camera and you're able to look at it, even if you don't have a subscription, if you can log into your camera remotely, you're going over the public internet and it has to hit a server somewhere. That's what upset people so much about this ring ad in the Super Bowl, right? It's like, oh, they're using my camera? The camera at my house?
[00:48:59] Yes. And this feature is on by default. So Ring decided that you're okay with this and they're going to use your camera for this feature unless you go turn it off, right? And I think Stacey makes a great point, right? We put these things in our house like a camera. You put a camera. People put Ring cameras pointing inside their own home and you don't have control over that device. You don't know what it's doing. And the company said, oh, we give you, we care about your privacy a lot. We give you all these controls. You don't need to worry about it.
[00:49:29] It's like, okay, if you trust Amazon and you are very comfortable with all the things that could go wrong with this data, then fine. But think about whether that's actually how you feel. The theory, the story was that they told us initially is that, well, she didn't have a subscription. And the deal with Nest is you can look at that camera live or for three hours after, but it will be deleted.
[00:49:56] But nine days later, Google says, oh, well, we were able to recover residual data in our backend systems. Yeah. I ended up explaining this to an AP reporter and, again, I was really surprised. I likened it to, you know, when you delete your emails. It's not gone. It's not gone. You can pull it out of your trash.
[00:50:22] So whatever Google's retention policies are, you know, that yes. And it was a pain for them to get to it. And they wouldn't do it for like everybody. And, you know, some people are probably like, well, if my mom got, you know, kidnapped from her home and this. Yeah. I'm glad that they have the video. Sure. But I think like I had the same reaction. Yeah. I saw that ring ad and all people got upset about it. And I was like, yeah, duh. Like people are like, oh my God, I'm part of a giant surveillance network.
[00:50:52] Like that's what this thing is for. I remember, Stacey, when this came out, you were on This Week in Google. We were talking about this whole, what do they call this? This LoRa network that they've created with Sidewalk. Yeah. This is Amazon Sidewalk. All the Amazon devices are networked together. And even then, at the time, they said it'll help you find your lost dog, which at the time seemed pretty anodyne, but maybe not. That's a radio network.
[00:51:22] It's very different. But yeah, well, they could, but I guess they could share, somehow they're sharing video, right? No, no. It's not over that. Amazon Sidewalk is literally just a wireless radio network. It is long-range wireless. So it's not related to this at all. Not at all. I mean, I did turn it off. That could be. Okay. Should I? LoRa's the least of your problems. I mean, you can do whatever you want. Think of it as like Meshtastic.
[00:51:52] It's just a, it's a mesh. Yeah. I understand. Yeah. It's just only radio. So, so yeah. So if the dog had a tag that had a radio transmitter, then you would be able to find it. We got to get our dogs on Wi-Fi. Yeah. But also, someone brought up the example: if you had a stalker who knew what your dog looked like, they could say, I lost my dog, and then activate this network to find the person that they're stalking as well. Wow.
[00:52:18] So people could use this network for nefarious purposes. I just, I was very gratified that people went, oh, that's not good. Yeah. Yeah. Everybody seems to kind of be on the same page on this one. Yeah. But, you know, it's interesting. So I, probably not paying close attention, figured, oh, well, they canceled that Flock Safety deal. In fact, that's the headline in The Verge: Ring cancels its partnership with Flock Safety after surveillance backlash. It had nothing to do with this. Right.
[00:52:48] Right. They just, everybody got upset at the same time. Yeah. So they're still doing the, uh, find-your-lost-dog thing. And you should know how it works. So Search Party, basically, Ring has fed photos of dogs into an AI that only recognizes dogs. And Ring is very clear that it says this only recognizes dogs. Okay. What I need people to recognize and know when they hear that and they're like, oh, it's fine. That's great.
[00:53:17] Ring can update that algorithm at any point in time for facial recognition. The thing is, they have the data. They have the control and ability to update your camera. You are opted into this automatically unless you turn it off. So just, and that's how they do this. They add these little features and you get used to them and you're like, oh, this is kind of nice. So neat. Or you don't think much of it. And they are like, oh, it's only dogs. Don't worry.
[00:53:44] And then, then it'll be like lost children. And you're like, oh, that's so good. Oh, that's good. We got to, yeah, actually. Yeah. Let's add lost children to the search party. I also want to add that Amazon just keeps firing huge chunks of people, uh, which is building an atmosphere of fear of people who are still there. And so where maybe in the past, someone raised their hand saying, this is a really bad idea. We shouldn't do this.
[00:54:13] There are fewer incentives for people to do that. And so I think, even though there was a quick backlash and outrage at that Super Bowl commercial, I'm sure deep down in people's hearts before this launch, they had those same feelings internally, but they knew they couldn't stop the head of Ring by saying, you shouldn't do this. No one's going to raise their hand to make sure that that doesn't happen internally.
[00:54:40] So we'll see these aggressive, maybe, I don't want to say poorly thought out, but let's just say one-perspective takes on why these things are good hit the market more now, from not just Amazon but all the divisions of these large tech companies that are doing these mass layoffs. So, uh, when we bought our house, there was a Ring camera on the front door, but I pointed it away from the street.
[00:55:10] So it only sees, it doesn't see anything but our property, like, not anything but our front porch. And every other camera I have is local. They're Ubiquiti cameras. They don't go to the cloud. Is that okay, Stacey? Am I? I mean, I want to get rid of the Ring camera, but there's a big hole in the wall where it goes, and there's nothing else I can put there. I'd have to do... Anyway.
[00:55:39] Do you want a camera? Yeah. Do you want a camera for your doorbell? I like having a camera on my front porch. Yeah. I like to know. You know why? Because the cat has figured out how to ring the doorbell. The cat jumps up on the wall, walks up to the camera, and it rings. She doesn't push the button. She isn't that good. But she moves her head around, and it rings. And then I know she's at the door. And I can look at my watch and see if it's her or a delivery person. Do you open the door when it's your cat? Yeah. Oh, okay.
[00:56:09] No, I didn't know if the cat had trained you. She has trained us to answer the door. Or maybe we trained her to ring the bell. I don't know which it is. Your cat is on Wi-Fi. That's good. That's one of the reasons I don't want to get rid of it. You've got to get them phones. The chime thing. It's the chime thing. It chimes throughout the house when somebody walks up the... So there are ways to... We have a whole pamphlet I did on bystander privacy for these devices.
[00:56:37] Limiting the range of actual camera vision is important. So if you don't have a patient... Because I want to know if it's the cat or Burke. So... And if your local cameras, if you are not accessing them remotely... Again, if your Ubiquiti cameras are going to a server somewhere... No, it's my server. It's in my house. If it's your server that you host, then you're fine. Yeah.
[00:57:05] Main reason I did that was not for any privacy thing, but I didn't want to wait. I need all the bandwidth to do these shows. Those cameras uploading to the internet use a lot of bandwidth. If you have three or four cameras uploading to the internet, it's killing your internet bandwidth. So I didn't want to do that. I only wanted them to save locally. Yeah. So there... So it was just selfish. It wasn't... And I don't care about people. You wrote a story, Thomas. TikTok is tracking you.
[00:57:35] Oh, that's good. Even if you don't use the app? Yeah. So don't have the app on your phone is the answer. Whether or not you use the app, if you've never watched a TikTok video in your entire life, TikTok is tracking what people are doing across parts of the internet that have nothing to do with TikTok. And it's worth mentioning, so are a lot of other gigantic tech companies. This is another story, like similar to the Ring thing, where it's like, this has been going on for a while.
[00:58:01] So what's happening here is tech companies that have advertising platforms, some of your listeners might know this, have tools called pixels. It's like literally like an invisible one pixel image that you put in the background of a website. And when it loads, it sends data to the company that made it. So if I... Just parenthetically, when we said we were going to put you on the show, the BBC called us and said, could you put a tracking pixel on the show so we could see how many people saw Thomas? Of course. Yeah. Which we said no to, by the way.
[00:58:31] Right. Yeah. But that's very common. People ask for that all the time. Not just for you, but in general. And, you know, you visit a website and every single person who goes there, like if you advertise on TikTok, you put one of these things on your site and send TikTok data. But it's scooping up, you know, anybody who visits, whether or not you use TikTok, no matter what the content of the website is. So is it like the Facebook like button, kind of? Similar, yeah. But invisible. Yeah, it's invisible.
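For readers who want to see the mechanics, the invisible-image trick described here can be sketched in a few lines of Python. The endpoint and parameter names below are made up for illustration, not TikTok's actual pixel API; real pixels also piggyback on cookies and request headers:

```python
# Sketch of how a tracking pixel leaks data. A site embeds an invisible
# 1x1 image whose URL carries event data; loading the image sends it all
# to the tracker. Endpoint and parameter names are hypothetical.
from urllib.parse import urlencode, urlparse, parse_qs

def build_pixel_url(event: str, page_url: str, user_id: str) -> str:
    """Build the URL the invisible <img> tag would request.
    When the browser loads the image, every query parameter (plus the
    Referer header and any cookies) goes to the tracking company."""
    params = {"event": event, "url": page_url, "uid": user_id}
    return "https://tracker.example.com/pixel.gif?" + urlencode(params)

def parse_pixel_hit(pixel_url: str) -> dict:
    """What the tracker's server learns from a single pixel load."""
    qs = parse_qs(urlparse(pixel_url).query)
    return {k: v[0] for k, v in qs.items()}

if __name__ == "__main__":
    url = build_pixel_url("PageView", "https://shop.example.com/checkout", "abc123")
    print(parse_pixel_hit(url))
    # The tracker now knows this browser visited the checkout page,
    # whether or not the user has ever opened the tracker's own app.
```

The same mechanism works in email: the "image pickles" joked about below are just pixel URLs embedded in HTML messages, which is why not loading remote images by default helps.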
[00:59:01] They've been around for years. The difference here is that TikTok just updated its pixel to make it way more useful for advertisers. Essentially, they're sharing more data with the people who use the pixel. And the implication is that's going to make TikTok advertising more effective, more appealing to advertisers. We're going to start seeing it in more places. And they're going to start getting even more of your data. Even if you don't like TikTok, you don't trust them, you don't want to be involved. They are getting your personal information.
[00:59:25] What advertisers don't realize, of course, though, is that this is creating what Cory Doctorow called the largest consumer boycott in history: the use of ad blockers and pixel blockers, which not only stop the TikTok pixel but also stop ads. So they're shooting themselves in the foot. By being so aggressive about invading our privacy, they're actually going to hurt themselves in the long run.
[00:59:51] So I just want to say that this doesn't just happen on the web itself. Right. This can be, they put image pickle, image pickles. Good Lord. I would love an image pickle right now. They put pixels into, like, emails. That was how Superhuman tracked, like, opens and things like that. So just be aware that this is ubiquitous, like, truly, on the Internet.
[01:00:19] But am I wrong in thinking that if I run, the EFF has a content blocker, I use uBlock Origin, if I run one of those content blockers that stops those pixels, the beacons don't get sent back? Isn't that right? Or am I wrong? So that was part of the reason that I wrote the article: there are actually super easy things you can do to stop this kind of tracking. Right. You can use a more private browser. Most people are on Chrome. Right. Chrome by default. I wonder why Chrome doesn't block these. Right. Exactly.
[01:00:49] You can use a privacy, like a tracker blocker. DuckDuckGo makes one. The EFF has Privacy Badger. That'll stop this. But there are other kinds of data collection, right? They're collecting data from apps in a similar way. And there's also, you know, data that they're sending directly from the company server, not from your web browser, not using your computer to pass the information on. There's not a lot you can do about that.
[01:01:15] But my point with this stuff is always, you know, what makes you so vulnerable to privacy intrusions is that they're putting together data from so many different sources. And any steps you can take to limit, okay, well, you're not getting this does make an enormous difference. So, you know, I'm always encouraging people. Just like it's not a lost cause. Just put in a little effort whenever you can. And you can, you know, make a meaningful difference with this particular problem. Do something anyway.
[01:01:45] Do something. Yeah. So the recommendations are ad blockers, not loading images in your email by default. By default, yeah. Yeah, I don't use HTML email at all. I use plain text for that very reason. There's no. What else, Thomas? What else should we be doing? Pi-holes. What's that? Pi-hole. Pi-hole. You're really. Pi-hole is good. If you've got an image pickle, put it in your pie hole. Pi-hole is set up at his house.
[01:02:14] And when I go home, I visit my parents, like, the internet just, like, does not work. Right. There's, like, all these websites, especially because I write about ad tech all the time. That's the downside. I'm trying to go to their companies. Yes. Go to their webpages. I can't visit them. I get blocked all the time because I have NextDNS running, which is basically a third-party Pi-hole. Of course, all my information is going to them now. A DNS blocker is really going to stop a lot of this.
[01:02:39] If you have the stomach for it. It doesn't take a ton of technical expertise, but it'll break some things sometimes. A lot of them are designed to be consumer friendly. Use a better browser. If you're not going to do that, use a tracker blocker. Set up a DNS tool. Or if you're really crazy, you can go for the Pi-hole. Even I can't take it to that level. But again, there are things you can do here.
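As a rough illustration of what a Pi-hole or NextDNS does under the hood, here is a minimal Python sketch of DNS sinkholing: queries for known tracker domains get answered with an unroutable address, so the connection dies before it starts. The domains, addresses, and blocklist here are all invented:

```python
# Minimal sketch of Pi-hole/NextDNS-style blocking: answer DNS queries
# for blocked domains with an unroutable sinkhole address so the device
# never reaches the tracker. All names below are made up.
BLOCKLIST = {"ads.example", "metrics.example"}
SINKHOLE = "0.0.0.0"

def resolve(name: str, upstream: dict[str, str]) -> str:
    """Return a sinkhole answer for blocked names, else the real record."""
    if name in BLOCKLIST or any(name.endswith("." + d) for d in BLOCKLIST):
        return SINKHOLE
    return upstream.get(name, SINKHOLE)  # toy stand-in for a real resolver

upstream = {"news.example": "203.0.113.7"}
print(resolve("ads.example", upstream))   # 0.0.0.0 (blocked)
print(resolve("news.example", upstream))  # 203.0.113.7 (allowed)
```

This is also why such tools "break some things sometimes": if a site's login or checkout flow loads from a domain on the blocklist, that request sinkholes too.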
[01:03:06] I would venture that a majority of our listeners, people listening to the show, are using some sort of ad blocking. I mean, if you're at all technically sophisticated, it's become just de rigueur. But not every ad blocker works for this. DuckDuckGo has a really good chart you can go check out. Some ad blockers will block some trackers. A lot of them just block ads. They're really not designed for this purpose.
[01:03:36] If this is what you want, if this is what you're concerned about, an ad blocker will help. But you really want to get a tool that is specifically designed for this purpose. That's what's going to block the most trackers and have the biggest effect. uBlock Origin, that's the one that we recommend. Let me see what DuckDuckGo says on their chart here. Yeah. Oh, that's interesting. uBlock Origin does not protect against referrer tracking, which is wild.
[01:04:03] That lets companies identify the site you visited, where you clicked the link to go to their page. They can see where the referrer came from. Fingerprint tracking is a big issue. We've talked a lot about that on Security Now. Yeah. You know, I'm not as protected as I thought I was. Right. And I mean, part of the reason is a lot of these tools will break websites. Right. There's certain functionality that, if you're really worried about your privacy, it's worth doing, but it'll mess things up. And then, you know, it's not that hard.
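For the curious, here is a toy Python model of what referrer tracking leaks and what the standard Referrer-Policy values change. The URLs are invented, and a real browser's decision logic has more cases than this sketch:

```python
# Toy model of referrer tracking: when you click a link, the browser
# normally sends the page you came from in the Referer header. The
# Referrer-Policy values below ("no-referrer", "origin", "unsafe-url")
# are real policy names; the browser logic is simplified.
from urllib.parse import urlsplit

def referer_header(from_url: str, policy: str):
    """Toy model of a browser deciding what Referer to send."""
    if policy == "no-referrer":
        return None                         # send nothing at all
    if policy == "origin":                  # strip path and query string
        parts = urlsplit(from_url)
        return f"{parts.scheme}://{parts.netloc}/"
    return from_url                         # "unsafe-url": full URL leaks

leaky = referer_header("https://example.com/private/search?q=x", "unsafe-url")
safe = referer_header("https://example.com/private/search?q=x", "origin")
print(leaky)  # the full URL, query string and all
print(safe)   # https://example.com/
```

The point of the DuckDuckGo chart entry is that blocking third-party requests and stripping the Referer header are separate protections; a tool can do one without the other.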
[01:04:32] You can just go, okay, unblock this one particular website, or turn off the tracking tools for a minute. It does make things a little bit more annoying, so there's a usability trade-off that you need to make here. But it's not like you're going to be permanently messed up or disempowered here. If you're on a website and it's not working, and you have one of these tools, turn it off. The easiest thing I do is liberal use of private browsing.
[01:05:01] I have a browser that has no plugins, no extensions on it. Firefox on mobile, I use it the most. And you set it to strict tracking protection. I just use private mode, which means that it never has cookies, it never has me signed into anything. Firefox has a great mobile app. When I close it, everything gets deleted. I launch it and I start fresh. It's like an ephemeral browser.
[01:05:31] I start fresh every single time. Now, if I pair that with a VPN, then I think that's probably going to get me covered. And you can't use any sites ever again. Cover Your Tracks from the EFF is a very interesting way to see if your browser protects you against fingerprinting, which is the ability to figure out who you are based on asking your browser: what's the screen resolution? What plugins does it have?
[01:05:58] You know, a whole variety of things that allows them to narrow you down so much that they effectively know who you are. And some browsers do a good job of this; some do not. It says: your browser has a unique fingerprint. As you can see, I'm using a Firefox browser. That's what you want, a unique fingerprint so that you don't look like... oh no, that's the opposite. I don't want that. I was going to say, what? I don't want that. That means they know who I am. Gosh, darn it.
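A hedged sketch of why a "unique fingerprint" is bad news: many individually boring attributes, hashed together, become a stable identifier that follows you from site to site. The attribute values below are invented for illustration; real fingerprinting surveys dozens of signals:

```python
# Sketch of browser fingerprinting: combine many low-value attributes
# into one stable hash. Each attribute narrows the crowd; together they
# can be unique, which is what Cover Your Tracks measures. The values
# here are illustrative, not real measurements.
import hashlib

def fingerprint(attrs: dict[str, str]) -> str:
    """Stable hash over sorted attribute pairs."""
    blob = "|".join(f"{k}={attrs[k]}" for k in sorted(attrs))
    return hashlib.sha256(blob.encode()).hexdigest()[:16]

me = fingerprint({
    "screen": "3024x1964",
    "timezone": "America/Los_Angeles",
    "fonts": "Helvetica,Menlo",       # real font lists are much longer
    "user_agent": "Firefox/145.0",
})
print(me)  # same inputs give the same ID on every site you visit
```

Anti-fingerprinting browsers work by the reverse trick: reporting generic or randomized values so your hash collides with everyone else's.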
[01:06:28] You're in trouble. I'm in trouble. It means my browser fingerprint is unique among the 308,000 tested in the last 45 days. I'm the one and only. So they know it's me. Yeah. Safari does a good job of this, by the way. I should be using Safari, but I'm not. All right. Let's pause for an ad. And then we will come back and talk about other issues, age verification and security in general.
[01:06:57] Stacey, if you were still covering home automation and home tools like Jennifer Pattison Tuohy is, you would be thrilled to see what you can do with the DJI Romo robovac. A guy wanted to joystick-control his robovac, right? And accidentally got it so he could joystick-control all of them. You know, I work on cybersecurity policy.
[01:07:26] That is like literally my job. That's kind of a problem. So yes, I did. I did see this. Little bit of a problem. We'll talk about it. It must have been fun though for a moment. He really cleaned up. He cleaned up. I'm going to clean every house in the world. We have a great panel. So glad to have you. Wes Faulkner. It's good to see you. Works-not-working.com. It's not working yet, but it will be working. Yes.
[01:07:54] So if you have some bones you could throw my way, check out the demo. Let me know if this is something that you want to support. Please just hit the contribute button. What is it for? Who's it for? It's for people, especially in this economy, who currently have a job but may feel like they need techniques to survive it: their toxic work environment, weird bias issues, navigating egos and power structures.
[01:08:23] It's a community for people to figure out how to survive this time without losing your job and also without losing your sanity. So this is just a place where you can get that support and have real talk. Because a lot of places, like if you read a self-help book, it just assumes that everyone's working for a company that has your best interests at heart.
[01:08:51] Just talk to HR, or just talk to your manager and say, hey, this is too much work, or hey, I'm having this sort of problem. Yeah. Those tips do not help. They do not work. And so this is a real site for real people. Works as in work's not working. Work's, apostrophe, not working. Yeah. Nowadays, you can't just quit your job. You need the health care. There's not necessarily another job waiting for you. I think a lot of times all you can do is make the best of what you've got. So that's great. I'm glad you're doing that.
[01:09:21] Works Not Working. Stacey Higginbotham's here. She's a policy fellow at Consumer Reports. And do you still write a little bit? A very small amount. I write for CR's stuff sometimes. And then, yeah, I probably should freelance more. Well, one of the things she does freelance, for free I might add, is Stacey's Book Club. And we had a wonderful book club a couple of weeks ago. We're doing another one soon. We've picked a book.
[01:09:52] What's the book? A Psalm for the Wild-Built. Okay. It's another Becky Chambers. It's very thin. And, oh, good. I like that. There's a chance I'll finish it this time. Good. All right. We haven't set a date for the next one, but it'll be in a month or so. Read it and enjoy it. Also with us, Thomas. Huh? Sorry? It's a feel-good book. I want to tell people, because we've had some real bummers. Yeah, we had some dystopian ones. Yeah. A feel-good book.
[01:10:21] Actually, Briggs in our Discord chat says, loved that book. So thank you, Briggs. Okay. I will read it. Also with us, brand new. It's great to have him. Thomas Germain. His new podcast launched last week, The Interface. He also writes the Keeping Tabs column for the BBC. And that's where that TikTok story is, if you want to figure out how to block the Tok. He's got the ways and means. Thank you for being here, Thomas. Great to have you. Our show today brought to you by Monarch.
[01:10:50] I love Monarch. I've been using it for more than a year now, and it's been a huge benefit for me. Maybe you made a New Year's resolution: start to think about your finances, maybe get things in order, start planning for the future. Getting married. Buying a house. Retirement. Going to college. Whatever it is. Maybe this is the year you pay off all of those credit cards or start saving for the kid's college fund.
[01:11:16] Wouldn't it be nice to have a tool that helps you plan, protect, and proactively achieve that goal? I love Monarch because it's a second-generation money tool. The guy who started Monarch had worked for the other big-name finance app; you probably used it, everybody did. That went away. He said, this is an opportunity to do it again and do it right, and it is. It's so great. Set yourself up for financial success this year. Monarch is the all-in-one personal finance tool designed to make your life easier.
[01:11:46] It brings your entire financial life, budgeting, accounts, and investments, net worth, and future planning together in one dashboard on your phone or laptop. Feel aware and in control of your finances this year and get 50% off your Monarch subscription with code TWIT. It's not your typical personal finance app. Unlike the other guys, it's built to make you kind of proactive. Of course, it makes it easier than ever to track your money. Monarch's most popular features include beautiful data visualizations.
[01:12:16] You know what they do that I love? Have you ever seen those Sankey diagrams, where it says income, and then it dwindles into taxes, childcare, and then you get this little thing called profit? It does those, which I love, but it also does more traditional pie charts or line charts or bar charts, if you like. The charts are great. It also will do investment tracking. So if you're investing with a broker, they may have graphs of their own, but let me tell you, the visuals from Monarch are fantastic.
[01:12:45] A real picture of your portfolio performance, you can get it in relation to other stocks or the S&P 500. So you get a really good idea of how you're doing. Try not to check it too often. I do every day, unfortunately. Individual and partner filters are great. So one of the things couples often fight about is money. It's probably the number one subject. Being able to share this information and have the actual facts in front of you is fantastic. At no additional cost, you can share with your partner.
[01:13:12] And I have just started sharing it with my financial advisor as well. So that's cool. No extra charge. And that way, I don't have to bring a lot of paperwork in. I can sit down. She can look at it. She can say, okay, let's work on this. You can view your assets either individually or together. It also has really great results. Now, I'll say anecdotally, it saved me a lot of money. But this is what Monarch users reported. They did a survey last year.
[01:13:38] Monarch helped users save over $200 a month on average after joining, so it pays for itself. More than eight out of ten members, 80%, feel more in control of their finances with Monarch. 80% say Monarch gives them a clearer picture of where their money is going. And I would say absolutely. That's why I love it. Plus, it's easy. It's really easy to set up.
[01:14:00] Set yourself up for financial success in 2026 with Monarch, the all-in-one tool that makes proactive money management simple all year long. Use the code TWIT at Monarch.com for half off your first year, 50% off your first year. Monarch.com with code TWIT. M-O-N-A-R-C-H, Monarch.com. Take a look. I think you'll agree. It's absolutely fantastic.
[01:14:26] So there's been a big kerfuffle in our own Club Twit group because we use Discord. That's where a lot of our listeners are hanging out. And Discord has announced, and it's not just going to be Discord, folks. It's going to be everybody, that they're going to have to do some sort of age verification. This is, I think, partly because of the UK, partly because of Australia.
[01:14:54] And I think it's going to happen in the rest of the world as well. Our audience was really concerned. We're going to have to give government ID to be in the club? And I said, you know, I feel for you. I don't know. We would probably move. We'd have to. I don't want to force my good friends to have to do that. On the other hand, Discord is hugely useful.
[01:15:22] They say now, I don't know, this 9to5Mac piece says Discord backtracks on age verification rollout, sort of. They say that the vast majority of users will be able to continue without ever going through age verification. On the other hand, if you do have to do age verification, they may be using a video technology Peter Thiel has backed, which is an issue for some.
[01:15:52] But here's the good news. It isn't so hard to get by this stuff. Here's a page: Discord, Twitch, Kick, Snapchat age verifier, and how you can get around it. Because it turns out, the way these things work, it isn't so difficult.
[01:16:14] Now, you'd have to be fairly sophisticated to get through this. The verification provider is called k-ID. It's a cat and mouse, right? This company, this website rather, that did it, I guess it's kind of hackers: age-verifier dot K-I-B-T-Y dot town.
[01:16:43] It worked, and then it didn't work. And then it worked, and then it didn't work, because it's a cat-and-mouse game with k-ID, the verification provider. The thing is that Discord doesn't have to do this yet, and they're just doing it. I think they're being proactive, right? They're opting into doing this, but it's causing more problems in terms of their reputation.
[01:17:08] Now, even though it hasn't fully been deployed, there's a parent group here where email wasn't working, and people wanted to get off Facebook, and so someone spun up a Discord server and said, let's just use this, let's communicate here. And then someone says, I'm absolutely not enjoying this because I don't want to give up my biometric data. And that was their response. And this was like a parent group. So if it's hitting the parents, I think that this is going to be socialized in a negative way.
[01:17:38] That's going to be worse. It's going to be like the Anthropic commercials during the Super Bowl, where people are going to just have a bad taste in their mouth and associate that with Discord. Meta says, well, all the Meta properties are going to do this, but they are saying, we can probably guess your age based on the data you've posted, things like happy birthday messages. And they won't ask you to verify unless all of their inference says, no, you're young.
[01:18:08] And then they're going to do- Start capitalizing and using proper punctuation in all of your messages.
[01:18:41] They'll think you're so old. This is something I've been writing about for a couple of years now that a lot of experts that I talk to say is turning into a complete disaster. It's one of those things where it started with porn, right? Porn is always the bellwether for what's going to happen with the rest of society, particularly in the realm of technology. And it's well, you know, well-intentioned. We all want to protect children. This is great. Like the internet is a terrible place to be a kid. Everyone can agree.
[01:19:10] But countries and states are rolling out these laws in ways that are just not thoughtful, right? Like it's not a good state of affairs when you have to constantly show your government ID or, you know, upload biometric information everywhere you go on the internet. For one, it's going to have a chilling effect on speech, right? Like the things that I'm looking up or doing online, I might not do them if I have to hand someone a copy of my ID.
[01:19:38] And there's a solution that's so much better, right? The argument that critics of this stuff make is that Apple could build a system into your phone where you upload a copy of your ID. Apple likely already knows how old you are. They already know, right? Right. So they could build a system where, if you're using an app or you're on a website, it asks your device to verify your age. Believe it or not, they already have an API to do that, in which they chunk people up into age groups.
[01:20:08] And there's an API so an app can ask, which age group is this person in? They haven't turned it on yet. I understand Apple doesn't want to do this. Meta wants Apple to do this. I think this is the only reasonable way: these are gatekeepers. If you're a gatekeeper, if you're Google Android or Apple iOS, hey, you're the gatekeeper. You know the age. It feels like common sense. I think Google has come out and said, like, we're actually game. We're willing to do this. In Android. Okay. Yeah.
[01:20:36] It would require lawmakers to, like, set up the regulation so that this is an option. And, like, you know, we have to create a whole system for the whole internet to use. But the way that we're doing it is, like, you know, a bunch of guys who don't know anything about how technology works, putting out these regulations, rushing them out in order to make it seem like lawmakers are doing something to protect children, which is a good thing that everyone can agree on.
[01:21:03] But, you know, I think we're rushing ahead with something without thinking about what the consequences might be. And that is you showing your ID every time you visit a website, every time you log in somewhere. Or at least when you create your account. What would be more effective is to have internet education in schools: how do you know what's a deepfake? How do you protect yourself from predators? How do you not get stalked? How do you not put things online? Plus practical things, like how to set up a Pi-hole.
[01:21:35] How do you escape tracking pixels? All of these things could be taught in school. So there will be fewer victims of some of the things that they're trying to protect kids from. Because at the same time, they're cutting education. They're cutting sex ed. They're censoring history. They're protecting pedophiles. There's all this stuff that's happening. They're seeing the outcomes
[01:22:04] of their lack of education of people and kids in the system, and just dealing with the symptom instead of the actual root of it. It also gets back to the conversation we were having about Section 230 and free speech, because there's a lot of evidence that this is an underhanded way for the government to regulate speech, right? So we were talking, I was talking about pornography here. What counts as porn? Well, in one particular state, I'm blanking on which, so I have a guess, but I don't want to
[01:22:33] guess and get it wrong. But in one particular state, their age verification law said that among the things that count as pornography are acts of homosexuality, even though they've given a long list of every sexual act you could ever participate in, no matter what gender of person you're doing it with. They've specifically called that out. And the Heritage Foundation has talked openly about wanting age verification because it will help limit what information people have access to about
[01:23:03] things like abortion or homosexuality or, you know, the transgender movement. Again, it just feels like we're rushing these things out without thinking about what could go wrong. And if you're thinking, well, maybe Google and Apple would make a good place for this information to reside, then there's also the issue – I think people trust Apple, and I think Apple does its best to protect people's privacy. I'm not sure about Google so much.
[01:23:33] This story from The Intercept about Amandla Thomas Johnson, a student at Cornell, attended a protest targeting companies that supplied weapons to Israel. It was at a Cornell University job fair in 2024. He was there for five minutes, but the action got him banned from campus.
[01:23:54] And when ICE demanded information about him, including credit card, bank account numbers, IP address, name, just a huge amount of information. Google complied. Of course, it was a lawful subpoena. Google didn't fight it. Google never gave him a chance to fight it.
[01:24:21] And in fact, Google didn't tell him until later and didn't tell him how much information they'd handed over. So I guess the thing to remember is that you might say, well, Android and iOS are the right place to do this age verification. But are they going to protect all that information, especially when presented with a lawful subpoena? Yeah. Maybe not, but we're talking about either Google and Apple or literally every platform on the entire internet.
[01:24:51] Like now you have to trust everyone with this information. True. Better to have it be at least just two. Yes. And we're talking about just who you are, right, and how old you are. We could set up a system where it's blind, right? Your phone scans your ID; most states have digital IDs now. Maybe it does some kind of cryptographic handshake. It confirms your age. It's not storing any data, and then a platform can just ask your phone, is this person over 18 or not? Or if you've ever used a credit card with Apple or Google, they have definitive proof of your age, right?
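Here is a hedged Python sketch of that blind-attestation idea. A real system would use proper digital signatures, nonces, and attested hardware; a simple HMAC with a device-held key stands in for all of that here, and every name and value below is made up:

```python
# Hedged sketch of the "cryptographic handshake" idea: the OS holds the
# verified birth date, and a website only ever sees a signed yes/no
# claim. HMAC stands in for a real signature scheme; all names are
# invented for illustration.
import hashlib
import hmac
from datetime import date

DEVICE_KEY = b"secret-held-by-the-os"  # never leaves the device

def age_attestation(birth: date, today: date, threshold: int = 18):
    """Return (claim, tag): a bare over/under claim plus a MAC over it."""
    years = today.year - birth.year - (
        (today.month, today.day) < (birth.month, birth.day))
    claim = f"over_{threshold}={years >= threshold}"
    tag = hmac.new(DEVICE_KEY, claim.encode(), hashlib.sha256).hexdigest()
    return claim, tag  # no name, no birth date, no ID photo

claim, tag = age_attestation(date(2001, 5, 1), date(2026, 2, 8))
print(claim)  # over_18=True -- the only fact the website learns
```

The design point is data minimization: the site can verify the tag came from a trusted device, but the claim itself carries nothing identifiable.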
[01:25:21] Like that's built in. And I suspect that's one of the reasons so many states are now putting their driver's licenses on iPhones, and Apple's supporting that. And actually my passport is in my iPhone as well, so they have my age from that. And yeah, I mean, it's in there. I guess Apple could come up with, and maybe they have, a privacy-protecting way of storing that information. They don't give your age through the API. They just put you in a group.
[01:25:47] And I think the best way to do this, honestly, at least with kids, would be for the parents to enter on the phone the age, the emotional age, that they feel that kid is. You know, some kids, your kid, Stacey, very mature. You know, maybe you'd say, well, they're 18. And some kids who are 18 maybe have the emotional maturity of a nine-year-old. You might put nine in. No, because they're a legal adult at 18. Oh, that's right. So let's figure out a good age. 16.
[01:26:17] And you're in trouble. That's a pretty hard limit right there. That's true. Nothing you can do after that. No, I know. Yeah, I know. I think that's really the best solution. I agree with you, Thomas. We've talked about this on Security Now, and Steve Gibson, our security guy, also thinks this is really the only way to do it. But as of now, most of the lawmakers, the way they did in Mississippi, the way they did in Texas, I think the way they're doing it in the UK, are saying, we're not going to tell you how to do it. You figure it out, platforms. It's up to you. We know you'll handle this well.
[01:26:47] The government could actually, I mean, they have our social security number. They have all this data. They could actually host an API for all of their citizens if they really cared about this. And if we're concerned about private companies having access to this data, even though we know they haven't, that is another option, not under this government. In fact, they do that in Louisiana. Louisiana has a digital ID and it is set up for this. That was the first state to have an age verification system set up.
[01:27:17] And websites check with the state, right? You log into this portal that is controlled by the state government, and the state says, yes, this person is this particular age. They don't hand them any identifiable information. It's just like, you're cool to be on this website. There are all kinds of ways that you could do this that don't involve, you know, some sketchy company being like, yes, let me have a photo of your driver's license, and I'm going to store it in an unsecured S3 bucket. Right. There have been data breaches already.
[01:27:47] I think more than once, where age verification companies have leaked IDs. Discord. Yeah. As a matter of fact. Yep. I think it was 17,000 IDs that were leaked. And that's a shame. And this is why people are nervous about giving their information to Discord, and I don't blame them. Rightly so. Well, it's to anybody, because they outsource it. And that's the problem. It's always going to be the weakest link that's going to have all this data. The breach wasn't a Discord breach. It was a breach of the third party that did it.
[01:28:17] From The Verge: Sammy claims he wasn't trying to hack every robot vacuum in the world. He just wanted to remote-control his brand new DJI Romo vacuum with a PS5 gamepad. But when his homegrown remote control started talking to DJI servers, it wasn't just one vacuum cleaner that replied. Roughly 7,000 of them, all over the world, began treating Sammy like their boss. He could remotely control them, look and listen through their live camera feeds.
[01:28:47] He could watch them map out each room of a house, generating a complete 2D floor plan. He could use any robot's IP address to find its rough location. Oh, boy. We actually see this. Our testing lab sees this class of errors more often than people would like to know about. I was once actually able to log in to my ISP.
[01:29:15] I was setting up a Firewalla, which is a little internet device. A little hardware device. Yeah, yeah. I plugged it in, and I actually saw every other router and all of their devices through the Firewalla, because my ISP had set up their DNS incorrectly. I called them. I let them know. But these sorts of configuration errors are not uncommon. And it's really terrifying.
[01:29:40] I can remember the first time Apple put AirPlay, or whatever its predecessor was, into the iTunes app. And I remember going to a hotel, getting on the Wi-Fi, and seeing everybody else's music in my iTunes, back in the early days. I think they've fixed that since then. This could also be an actual safety issue. If you could get into the firmware and, like, overdrive a DC motor and cause a fire,
[01:30:09] that could be catastrophic, or you could destroy all of these or just brick them. And this is not just a security thing, like, oh, someone's able to move my vacuum. These have cameras. They can point the cameras, they can turn on the mics, they can get access and hop onto your Wi-Fi and snoop on your traffic. This is really big.
[01:30:36] It's not just, hey, someone is moving someone else's vacuum. In extreme cases, you could actually cause loss of life. When The Verge contacted DJI, which is a Chinese company, DJI claimed they'd fixed the vulnerability, but it was only partially resolved. DJI confirmed the issue was resolved last week, saying their remediation was already underway prior to public disclosure.
[01:31:01] And then about half an hour later, Sammy showed The Verge reporter thousands of robots, including the review unit, reporting for duty. Given the serial number of the review unit, he was able to tell The Verge how much battery life was left. And it transmitted a map of Verge reviewer Thomas Ricker's house to Sammy.
[01:31:32] Here's the map. So, yeah, I presume by now it's fixed. It apparently isn't totally fixed. Oh, boy. I mean, some of these things. It's happened before. The Verge says hackers took over Ecovacs robot vacuums to chase pets and yell racist slurs in 2024. Remember that one?
[01:31:56] So I'm going to talk about this, because we have a solution, sort of, that's stuck in limbo, and it's a voluntary program. Europe actually has laws that they've implemented on this front, but it's basically how to design connected devices securely. And this is an area where you actually need the government to set rules and regulations, because it's not something the competitive market can solve: consumers can't see
[01:32:24] it until after they've bought it, and you have to have this level of expertise. So when we don't have this, what we have is people buying the cheapest possible device, and you get companies that maybe don't think about securing their servers, or in this case, they didn't secure their servers. I'm trying to remember all the details of this. Regardless, a lot of these errors are kind of careless.
[01:32:54] And if we had laws to, you know, penalize companies. And companies don't seem to care all that much. DJI claimed they'd fixed it. They hadn't. They claimed again they'd fixed it, and they hadn't. And now Sean Hollister, writing for The Verge, says even now DJI has not fixed all the vulnerabilities, and one is so bad he won't describe it until DJI has more time to fix it. But DJI has not immediately promised to do so.
[01:33:24] Often, when you go in and report a vulnerability to a company, they will say, well, that's actually not a vulnerability. So there is definitely a back and forth between security researchers and the companies themselves. I am not an apologist for these companies. I'm just saying there is a back and forth that needs to happen. They also could have trouble replicating the issue. And then, yes, there's always the case where either they won't fix it or they can't fix it.
[01:33:53] And then an ethical company will be like, hey, we should recall those. That almost never happens. Or they'll do something like what Wyze did, which is like, oh, we're going to end-of-life that product and put out a new one. We're not going to sell that one anymore. Never mind. We can't solve this, so we're just going to set it all on fire, and you go buy a new one. That's why I'll never buy a Wyze cam. It soured me. Which is too bad. And I would say don't buy this vacuum, or anything that will be on your local network, from DJI,
[01:34:22] because they have bad security practices. And I think it's a mistake to zoom in on whether it's fixed or not. The thing is, it was released, and the flaw was not found by them. That is a huge problem. Yeah. No, in fact, it's on GitHub if you want to download it. And it even says, this tool bypasses the PIN code setup on the DJI Home app, so it's a lot easier to use. You might want to use the GitHub version of the software.
[01:34:50] He says, I can, in fact, control my Romo with my PlayStation 5 controller. I mean, an IP robot in your house in general. Maybe, maybe think about it. Yeah. Maybe not. That's the future, though. We're all going to have them, Thomas. You shouldn't. We never did. We don't have cameras inside. Even when I was testing smart home equipment, we faced all of our cameras at a, well, I had a fish tank for precisely that reason.
[01:35:19] But just don't, don't do it. I would never have cameras streaming live to the, oh, never mind. But here, do you want to freak people out? Should we freak people out? Yes. Like, legitimately freak them out, but not like. The way that Wi-Fi sensing is going, we will have the equivalent of a camera that is just using your Wi-Fi to understand what you are doing in your house.
[01:35:46] Using the Wi-Fi signals, because it turns out humans are basically big bags of water. Big bags of salt water. And they block the Wi-Fi signal. And so you can, I guess, do you have to triangulate? I don't know how you do it. You use RSSI. Yeah. You do triangulation, so you need at least two devices, and the more devices you have, the better. The really crazy thing to know about this is, because of the way Wi-Fi works, they all talk to each other and give signal feedback, right?
[01:36:15] That's good. But it also means that you could load the software on anything that has sufficient computing power that has Wi-Fi radio, and then it becomes a Wi-Fi sensor in your home. And then all you have to do is pick the algorithm you want, and you can determine all kinds of fun things. So that's the future. And it's not far off. CTA actually just like a week ago decided they were going to do a standard for fall detection for Wi-Fi. And we're all excited because fall detection would be great.
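As a toy illustration of the sensing idea, presence detection can be as crude as watching RSSI jitter on a normally steady link: a body moving through the room perturbs signal strength. The readings and threshold below are invented for illustration; real Wi-Fi sensing uses far richer channel-state data:

```python
# Toy sketch of Wi-Fi sensing: a body crossing the room perturbs
# received signal strength (RSSI), so sudden variance on an otherwise
# steady link is a crude motion detector. Readings and threshold are
# invented for illustration.
from statistics import pstdev

def motion_detected(rssi_window: list[float], threshold: float = 2.0) -> bool:
    """Flag motion when RSSI jitter (std dev, in dBm) exceeds a baseline."""
    return pstdev(rssi_window) > threshold

quiet = [-52.0, -52.3, -51.8, -52.1, -52.0]  # empty room: stable signal
walk = [-52.0, -58.5, -49.0, -61.2, -50.5]   # someone walks through
print(motion_detected(quiet))  # False
print(motion_detected(walk))   # True
```

With more devices reporting, the same variance signal can be localized, which is how a standard like the fall-detection one the panel mentions becomes feasible, and why the privacy worry follows directly from it.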
[01:36:45] And it would. But they could later on do, you know, I always say puppy kicking detection or nose picking detection. Are they going to do summer, winter, and spring as well? Fall, summer, spring, and winter. We're going to get them all. Yeah. They're going to use your temperature sensors inside your home. Ladies and gentlemen, you are watching This Week in Tech. Thomas Germain is here from the BBC. I love saying that.
[01:37:14] That sounds so good. Feels good. Yeah. Feels good. The British Broadcasting Company. Right. Don't. Is it Corporation? Corporation. Sorry. Corporation. Yes. Didn't they just raise the TV license fees in the UK? I think they did. Yeah. I mean, there's a whole can of worms I tend not to get into. It's pretty funny. I can't imagine here in the United States paying the government so that I could watch TV.
[01:37:43] I prefer to pay giant tech companies. Right. Yeah. Yeah. Let's help them out. But you do not have to have iPlayer to listen to The Interface. No. You can get it on any podcast application. It's on YouTube. Anywhere you want. Everywhere. Everywhere. Also, Stacey Higginbotham. Always happy to have Stacey on. We miss you on what used to be This Week in Google. Now Intelligent Machines.
[01:38:10] But your colleague, Paris Martineau, has done a very good job filling in for you. So at least we've got Paris. Is she still on it? What? You thought I'd drive her off by now? Is that what you're saying? I know you are. I know what you're thinking anyway. I didn't know she was still doing that. Okay, good. Yeah. Oh, yeah. Stacey, Paris, and Jeff. Yeah. Yeah. Kind of hurts my feelings a little bit that you thought she'd be out of there by now. Maybe it's, you know.
[01:38:38] It's because she got a job at Consumer Reports. She did. Food. That's why. She got a good job and she's happy now. And also, Wes Faulkner. I haven't driven you guys off yet. It's good to have you, founder of Works Not Working. Thanks. Yes. Thanks, Kenneth, for your contribution. I just saw it. Oh, that's nice. If people go to Works Not Working, they can contribute there? Yeah, there's just a big button on the toolbar. Help support what Wes is up to.
[01:39:09] Works-Not-Working. And you'll get a badge for your level of support on the platform as well. A badge. A badge. You know, Dan Aykroyd carries a badge at all times. But that's a different kind of badge. Yeah. We don't need no stinking badges. I don't want to put that up. Our show today brought to you by ZipRecruiter. If you're hiring for your company, this is a busy time of year for you.
[01:39:36] You've got new 2026 goals, which means finding the right people to accomplish them. Unfortunately, you also have new hiring challenges for the year, like filling specialized roles. Maybe you've got to identify qualified candidates from what is, you know, nowadays a huge pool of applicants. That's actually one of the great things about ZipRecruiter. You do get a huge pool of applicants, but you also get tools that make it very easy to find the perfect person in that giant pool.
[01:40:03] Thankfully, there is a place you can go to help conquer these challenges and achieve your hiring goals. ZipRecruiter. And right now, you can try it for free at ZipRecruiter.com slash twit. ZipRecruiter's matching technology works fast to find top talent. You don't waste time or money. You can find out right away how many job seekers in your area are qualified for your specific role.
[01:40:28] With ZipRecruiter's advanced resume database, you can instantly unlock top candidates' contact info and invite them to apply, which, by the way, I have to say is the best technique for getting great candidates. If you ask them, hey, we would like you to apply for our job, that puts you way ahead of everybody else. No wonder ZipRecruiter is the number one rated hiring site, and that's based on G2. Let ZipRecruiter help you find the best people for all your roles. Four out of five employers who post on ZipRecruiter get a quality candidate within the first day.
[01:40:57] See for yourself. Just go to this exclusive web address right now to try ZipRecruiter for free. ZipRecruiter.com slash twit. Again, that's ZipRecruiter.com slash twit. ZipRecruiter, the smartest way to hire. ZipRecruiter.com slash twit. This is one of those ideas that just never dies. HP has now announced that you can subscribe to buy a laptop. Would you subscribe to a laptop?
[01:41:28] You never own it, by the way. The subscriptions are four different productivity laptops, four gaming laptops, ranging from $35 to $50 a month. No starting fee, no down payment. There's a soft credit check. Each of them has a 24-7 support plan from a live agent. You can upgrade after 12 months, but you can't buy out the lease. You can never keep the laptop. You just keep it until you stop paying, I guess, and send it back.
[01:41:56] I would do it because it's a depreciating asset. So that's a bad idea. No, that's not a terrible idea. It depends on the costs. Because a laptop is not something – I think about subscriptions and leases and that sort of thing in terms of CapEx and Opex, basically. So is it an asset that I can own over a long time and possibly use to make money or that will retain its value long term? What's it worth to you, in other words? Right. So you would have to – I mean, I'm not doing the financial calculations.
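The CapEx-versus-Opex framing above comes down to simple arithmetic. Here is a back-of-the-envelope comparison; only the $35–$50/month range comes from the discussion, while the purchase price and two-year resale value are illustrative assumptions, not HP's actual numbers.

```python
# Lease-vs-buy sketch for a depreciating asset like a laptop.
# Leasing: you pay every month and return the hardware.
# Buying: net cost is purchase price minus what the secondary
# market gives you back when you sell.

def lease_total(monthly, months):
    return monthly * months

def buy_net_cost(price, resale_value):
    return price - resale_value

# Assumed numbers: a $1,400 laptop worth $500 after two years,
# versus the top $50/month lease tier over the same period.
two_year_lease = lease_total(50, 24)    # $1,200, and you own nothing
two_year_own = buy_net_cost(1400, 500)  # $900 net, plus a resale option

print(two_year_lease, two_year_own)
```

Under these made-up numbers owning wins, but the 24/7 support plan and upgrade-after-12-months option are exactly the kind of Opex perks that can tip the calculation the other way for some buyers.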
[01:42:26] It depends on the specs on the laptop. Right, right. Actually, some of these are really nice laptops. The Omen Max 16 is very nice. That is a – It ran prices today. Yeah. Yeah, maybe. Yeah. Did you know – Now, is it a contract, though? Or is it like – can they raise the subscription? It was 12 months, I believe. But I wonder if they can raise it.
[01:42:47] Because there was a YouTuber, GamersNexus, that talked about people who are renting gaming rigs, thinking that they would start a Twitch stream and be the next influencer to be able to get these highly expensive rigs for just a monthly stipend. And they could offset it by how much they'll make on social media. That's sensible. And people were just over-leveraged at that point.
[01:43:15] The question I wonder about – it was the secondary market. So where are all these laptops going to end up after the leases are up? Where it might be good for the secondary market. Like, I would be pro buying a secondhand or refurbished laptop. But it might flood the market as well. So it's a really interesting concept. Because if they stopped selling laptops, I would say this was bad if you could only lease it. But it sounds like – No, no, you can still buy it. You get to choose the thing that fits best for you.
[01:43:45] So I'm actually pro this. It's only – this is a religious conviction. But I will die before I pay a subscription for a piece of hardware. I really wanted to get an Oura Ring, you know, one of those, like, fitness trackers that you wear. You have to pay $6 a month for that thing. You got one. They're great. Everybody loves them. I think they're the best on the market. I'm grandfathered in because I bought it. They were an advertiser. I bought it in the beginning. And I think – It's not going to happen. I'm not paying $6 for something I already bought. Just never in a million.
[01:44:13] This is – we are in this phase where people are now very sensitive to subscriptions. Yeah. And every company wants subscriptions. Apple even is moving more and more towards subscriptions. They just released their Creative Suite that is a subscription-based Creative Suite. You can still buy the programs outright, but everybody wants subscriptions. Say, it's because of – Stacey, it's ARPU, right? ARPU. ARPU. I mean, the profit margins for a subscription, especially if you can sell hardware, too. Fabulous. Sure.
[01:44:43] And then you get this average revenue per user number, which I guess Wall Street likes. The scary thing is you don't – They can like it. Oh, go ahead. You might like it. What do they – because you don't own the hardware, what will they do? Like, will they add you to the Flock camera network? Because – Oh, that's a good point. I didn't even think of that. So the question is like – because can you load Linux on it? Can you – Oh, that's a good question. That's a really good question. This is literally what I work on for CR.
[01:45:12] Like, let's go with like 80% of my time, which is this – the harms of software tethering. So anytime you have a subscription, and that is – Wesley brings it up, right? This is like, what features? How can they change the licensing terms? What if, for example, you use your leased laptop to upload – maybe use it to pirate music? They can remotely lock it according to the terms and conditions. Right.
[01:45:37] So that is a reason not to do it, obviously, because you won't have total control and ownership of this. And right now, we are in an era where literally you have no rights. I have – like, very soon, I will share this with legislators, but I've created a chart full of software tethering harms. And they run the gamut from, hey, your thing may be bricked all the way to they may change the terms and service. And suddenly, what you think you can do, you can no longer do.
[01:46:06] And those are really bad. And there's nothing to protect you. And consumers think they own these things, and they don't. Yeah. It could be hardware on demand, too. Like, you have access to 8 gigs of RAM, but you have to pay a little bit more, and they'll turn on another 8 gigs. Or, like, you're limited to – like, your Ethernet port is only to 100 megabits. But then you can get a gigabit if you pay a little bit more, and they'll turn on – Carmakers are starting to do that. All that stuff. Yeah. Yeah. BMW was going to charge a monthly fee for heated seats. They built in the heater. But to turn it on, you had to pay a monthly fee.
[01:46:35] And when their users complained, they decided not to do that. But they have already announced that they're going to be doing it in future vehicles. So ask me in, like, six months about this, or even three months, because we might see something introduced in state legislatures in the next couple of months to target this very issue. Oh, like saying that people can't do that. Yes. Oh, that's interesting. Yeah, I mean, HP in particular has a terrible track record on this issue. They've, like, tried this with printers before.
[01:47:04] It's just all about limiting what you can do and, you know, shutting you down if you're trying to save money. I think the whole idea is just kind of inherently anti-consumer. I like what Stacy's saying about, like, well, maybe I want to replace my laptop if it's a good deal, right? If the money works out, but it sounds like this one doesn't, and why would it? They don't want you. They're not doing this because it, like, makes sense for the consumer from a financial perspective.
[01:47:31] It's so they can bilk more money out of you over time because you're locked into a contract. Yeah. Yeah. These only work with HP printers. You can't use Canon or Brother or anyone else's. Yeah. I would never buy an HP inkjet for that reason. No one should buy one. Yeah. Yeah. Really. By the way, that's not – I didn't read it. That's not what it says, but, I mean, it could. No, but we know that if, you know, it's pretty draconian, the inkjet subscription. Oh, yeah. You're out of cyan. Everything else, nothing will work. Right. Right.
[01:48:02] FTC is investigating Microsoft. I'm not sure why, but it's a probe of whether the company illegally monopolizes enterprise computing with its cloud and AI offerings, including Copilot. It feels like this is from Bloomberg. It feels like, you know, maybe this is a little behind the times a little bit. I don't know if Copilot is really such a threat to the economy. Wait, which story is this? Hold on.
[01:48:29] This is FTC ratchets up Microsoft probe queries rivals on cloud and AI. It's from Bloomberg. I think I linked to a Slashdot article which links to the Bloomberg. I pay for Bloomberg. I'll read it. Yes. That's why. I wouldn't be surprised if Salesforce wasn't behind all this. I was going to say, so if they just – oh, my God, it's taking forever to load. If they just announce – are they just talking about their, like, probing or is there – I'll read it to you.
[01:48:57] The U.S. Federal Trade Commission is accelerating scrutiny – I like that word, scrutiny. This is political. Yes. This isn't real. Scrutiny of Microsoft Corp as part of an ongoing probe into whether the company illegally monopolizes large swaths – another great word – of the enterprise computing market with cloud and AI offerings. They've issued civil investigative demands in recent weeks. Now, an investigation isn't the same as a lawsuit. It's just a preliminary step.
[01:49:25] So maybe this is just – we're looking into – If the FTC announced it, then it's just political. Was the Microsoft CEO at the inauguration or did they contribute to the – That's a good question. I don't know. I don't think he was. I don't think he was.
[01:49:47] The demands, which are effectively like civil subpoenas, the FTC is seeking evidence that Microsoft makes it harder for customers to use Windows, Office, and other products on rival cloud services. You know, there's a European investigation like this also going on. I don't – I don't credit this. Okay. This was launched under Lina Khan during the waning days of the Biden administration. However, it has continued under the new administration.
[01:50:17] So I'm not – yeah, you really – at this point, you really kind of wonder, is it politically motivated or not? It's interesting, though. So, like, antitrust is a bipartisan issue, right? Like, not everyone agrees on this, but there's a lot of Republicans who are really on board with this. A lot of the – I mean, not a lot, but, you know, one of those Google antitrust cases was launched under the Trump administration. Yeah, but I have to say the motivations for it are different. The actions are the same.
[01:50:46] On the right, they feel like big tech is unfair to conservatives, so they're going after big tech for that reason. On the left, they feel like these companies are too big and too dominant, and they're going after them for that reason. So, yes, it's bipartisan, but not with the same motivations, it seems to me. Josh Hawley brought in the CEOs of MasterCard and Visa and just berated them about how they were hurting small businesses. Look at the president says he wants to limit interest rates for credit cards, which –
[01:51:15] It can't be too popular with his donors, the donor class, but – But if you use cryptocurrency – Oh, there you go. Yeah. There you go. T-Mobile's announced they're going to add AI to their network in real-time translation on the T-Mobile network. Works on all existing hardware. No data centers are involved. I don't understand how that works. This is scary.
[01:51:42] Well, they have – well, every telco network has servers that are within their network. So they're not – They're going to have instantaneous – actually, this is cool. I'll let you know because I'm a T-Mobile customer. They're going to have instantaneous simultaneous translation in more than 50 languages, English, Chinese, even things like Welsh and Azerbaijani. According to a spokesperson, calls are not being routed to a data center for translation.
[01:52:10] There's no new edge hardware installed at cell towers. All the AI processing happens as calls are transmitted. So it sounds like it's magic. It's incredible. It's incredible. I also wonder – Not even the data – Okay. So, like, you know, thinking about the ring conversation we started with, right? Like, is this going to wake people up to the amount of insight that the telecoms have? Yeah. Or phone conversation? Not only can we listen, but we can listen in 20 different languages.
[01:52:38] Regardless of whether there are things you should be worried about, I think people hear, oh, they've got AI that's listening to your phone call and, like, doing stuff. I think a lot of people are going to freak out about that. If you turn on the AI transcriptionist on a Zoom call, there are people, my wife included, who will say, no, stop it. Hang up. People don't like these AI agents sitting in on calls. It bugs people.
[01:53:03] So I'm not sure exactly – I should have done more research on this before I put this story in the rundown because I don't understand how they're going to do it. They say, we're bringing a real-time AI directly into our network. Okay. So they're saying they're doing it on their IMS network, which is just a fancy way of saying the servers that the telco runs and operates as part of their network. So they're not sending it out to any sort of external data center. Oh, it's T-Mobile's data centers. Oh. Right. Okay.
[01:53:33] So, I mean, I've been in the telco towers. I've seen – I've been in DSLAMs. I've been in – you know, they are doing the translation on their servers. That's interesting. And they say this is the beginning. They're going to do a lot more AI through the IMS servers. How capable are these servers, Stacey? Are they pretty capable? They're whatever you want them to be. They're just their own servers. I wonder if they – I mean, have they been buying up GPUs on the sly?
[01:54:03] Doing this kind of AI is not – as far as I know, it's not true. Okay. The GPU – you can do inference on things that are not GPUs. Right. And this is inference. They'll run a model. But real-time translation? I guess you could, huh? Real-time translation is actually not hard because it's a one-to-one thing. Right. In fact, my phone will do it on device. Yeah. I was like – it's not – I mean, I shouldn't say it's not that hard, but it's not like running an LLM or something like that. Yeah, yeah. Okay.
[01:54:30] This is – so this sounds really cool, but what it does mean is T-Mobile is taking whatever someone is saying and running an algorithm against it and translating it and then delivering that, which is super cool. But it's also like something – a computer is listening to your phone call and translating it for you. And – Get ready because it's going to do more. Law enforcement. Everybody's putting AI everywhere.
[01:54:59] By the way, this is the thing that I think most annoys people about Google and Microsoft Copilot is that they're kind of pushing it on you. You know? I have two concerns with this that I don't know if anyone – I read the article and it didn't really address it. The government has their own equipment in these data centers so that they could get information.
[01:55:22] If you said bomb, terrorist, jihad, I'm just curious if that will – since you're now listening explicitly on these calls – Well, thank goodness the government doesn't listen to podcasts or you'd be in trouble. Well, it's your podcast, not mine. So I'm good. Okay. And the other is, is there indemnification or anything that's going to be in contracts?
[01:55:45] So if someone says, hey, yes, we agree to these terms or no, and they translate it and it's a mix-up, who's at fault in that case? You can bet there's going to be a lot of fine print in your T-Mobile contract that says we are not responsible for anything. And can you opt out to say, like, I don't want to be translated? Like, what if I'm on another one? You have to turn it on. Oh, I see. On the other end. It's on the T-Mobile network.
[01:56:09] So if I'm on AT&T and I call someone who's on T-Mobile and they want to turn it on, I didn't sign that agreement. I was just looking up the federal wiretapping laws and state wiretapping laws and how they translate to, like, AI translations because that's kind of what's happening, right? It's a two-party state versus – Well, I mean, Texas was – I lived in a one-party state forever. And that was kind of nice as a reporter. I know. I can't record you unless I have to ask you. I had a time in California. Yeah.
[01:56:37] But I don't know where the law is yet on that. So I'm frantically looking this up. By the way, it's funny they call it a one-party state when really it's a no-party state. One party knows you're being recorded. I guess you know. I know. Yeah, I mean, telecom law, like the Wiretap Act is, like, from 1972 or something. Whatever the law is, it is not equipped to deal with – However, T-Mobile has figured out how to do this. You know who else is not equipped?
[01:57:07] Apple's Siri. Oh, good. I won't dump this story until you're done, Stacey. No, no, no. I was looking at the law and some, like, law review articles about it because I was really – I'm like, oh, because this is super interesting. It is interesting. I mean, this is – It sounds like – It's this little subtle, you know, press release. But honestly, there's some real ramifications to this.
[01:57:31] Yeah, it says it possibly will see verbal or written announcements, ideally, at the beginning of the call, which we do hear with translations, and you do hear on Zoom calls. So, I guess – This call is being recorded for quality purposes. Yeah, this translation – Your call is important to us. So, maybe that's how that will get around that. I don't know. Okay. Sorry. Next story.
[01:57:54] Apple, according to Mark Gurman, rumor monger, is having a little trouble getting the AI in Siri again. Oh, they were supposed to – Remember, they've done a deal with Google, which, by the way, has a very capable AI assistant in Gemini 3. They had done a deal that was supposedly going to come out in March to the new smarter Siri and iOS 26.4.
[01:58:21] Gurman says – and I think Gurman's got this one nailed – it ain't going to happen that fast. Apple's now working to spread those new capabilities out over future versions, according to people familiar with the matter, postponing some features until May or even September. This has been a nonstop flop for Apple. Nightmare.
[01:58:48] And it's also like the one thing that I really, really want from AI is to make my phone less annoying, that I can just talk to it and do what I want it to do. Do you use a third party – do you use like ChatGPT or something now on your phone? Do you use one of those assistants? Yeah, I have – you know, the newer model iPhones have like a button built in. Yeah, I do the same thing. Yeah, yeah. ChatGPT, I hold it down the action button and brings up – And you can talk to her and she's nice and she answers questions. Yeah.
[01:59:16] It's funny, my wife continues to ask Siri stuff and every time she does, I go, honey, haven't you learned? You're never going to get that answer out of Siri. Siri's a nitwit. Siri's for setting timers. Yeah. Now, have you played yet with the new Alexa – I mean, A Word Plus features, Stacey, in your Amazon Echoes?
[01:59:42] So, I played with Gemini, but not Madam A Plus because I ditched it. Although, I'm having a real issue and I'm actually curious, y'all, audience, are you seeing increased latency? Because the latency on my Google Home has skyrocketed and I freaking hate it. Absolutely. Yep. And are you using Google or are you using Madam A or somebody else? I have a G Home. Yeah, my Google – same thing.
[02:00:12] And not only that, it's gotten dumber over time. Yeah, that I knew. I just – I'm like – They're nerfing it for some reason. Maybe too many people are using it? Maybe they just want people to be on their phones? I don't know. Or maybe it's too expensive. I think it's too much back and forth or it's – I feel like they're – we're using, like, the wrong tool for the job. Like, when I'm like, turn on my lights, I need it to go to the dumber version, not the smarter version.
[02:00:41] I do think one lesson to be taken from this is that it's really hard to do a smart voice assistant. Like, Apple's having trouble. Amazon's having trouble. Even Google, which is probably – Google and OpenAI seem to have been the best of the bunch. Anthropic's is kind of smart. They have it. But really, they're pushing coding. They kind of decided, we don't want to be the image-generating tool. We don't want to really even be the chat tool. We want to be the coding tool. And they've had great success doing that.
[02:01:10] Lots of attention for Claude Code. I think it says a lot about the capability of the technology, right? That, like, it's been years that Apple has been desperately trying to roll this feature out. Yeah. Even announced it. And they can't make it happen. And I think it's because hallucination is inherent to large language models. And to actually put it in charge of something that really matters, like the functions of your phone, you need to have a level of confidence that it seems like they just can't get.
[02:01:38] I think that is the reason that we don't have this tool yet, that they're not, like, actually giving them real controls. Because, like, who knows what they're going to do? That's a big deal, right? Especially for Apple, where reputation is really important. Google doesn't mind if you put Elmer's glue on your pizza or eat rocks. But Apple definitely cares about that. That's not true. I've talked to some Google people about this.
[02:02:02] But, yeah, when you know what you want it to do and it doesn't do it even 95% or 5% of the time, it's aggravating. When it's, like, summarizing an article you haven't read, it's fine. And that's, I think, people keep thinking, like, it's going to get better. And it's like, well, this is actually, and maybe it will, but, like, this is what's available everywhere now. And it sucks everywhere.
[02:02:28] It's just you can totally easily see that it's sucking in these use cases. So, yeah. Do you think it hurts Apple, though? Which part of it? Well, I mean, they promised it and didn't deliver it. But I don't see consumers going, oh, you know, I'm going to go. Didn't they sell a record number of iPhones? Yes. It's the best quarter ever. Yeah. I thought I'd have a self-driving car by now, but I don't.
[02:02:54] Like, this core feature of the iPhone is just so stupid. Like, every, you know, you talk to someone who's totally tech illiterate about Siri. People hate Siri now. And this is such a big part of the Apple ecosystem. Even though they really don't bring Siri up very often anymore. I think it does make them look bad. But it's not. I don't think it's, like, really interrupting their business.
[02:03:17] But Apple is also, like, at a moment where it really needs to redefine its core promise as a company that is going to continue growing in the hardware space in the way that it has over the past few years. And I think, like, this hurts investor confidence in really significant ways. Like, will they be able to overcome it? I mean, they've certainly faced greater hurdles in the past, but I think it makes them look bad. Yeah. Well, certainly to those of us in the industry who are paying attention.
[02:03:45] But I think there are some customers who are going, oh, I didn't really want it to be. I didn't want AI in my phone anyway. Anyway, there's a real fork in the road here between people who hate AI and people who, and I'm one of them, so, you know, I'm aware of this, love AI. And never the twain shall meet, right? Yeah. I don't think that's true. It's a shame that their hardware is so good. Their NPUs, the neural processing engines, are so good.
[02:04:12] It just sucks that this is, they should just find a way of dialing it to a note taker or something that's just so local that doesn't, they should go the other direction. Just make everything local. Well, they've kind of done that with Apple Intelligence, right? So they have quite a bit of that local AI built into Apple Notes and a lot of their tools, their image generation. What is it, Genmoji? It's not very good, but they have a lot of the local stuff built in.
[02:04:41] And I think for some people, that's all they, they really want, right? And that's all they should do. Maybe that's what they should stick with. Just stay there. Just stay in that lane. And that should be a differentiator. I thought it was really interesting. It kind of like, it, you know, came out and people weren't really paying attention. Apparently, Apple approached OpenAI about being their partner for the iPhone AI stuff and OpenAI turned it down. Like first when Google came out and it was the Google partnership, I was like, ooh, this looks really bad.
[02:05:11] For OpenAI. Apparently they don't want it, which is really interesting. Like it shows you, I think a little bit of how OpenAI sees itself in the market. That it's, I'm not sure exactly what the issue was, but they see Apple as a direct competitor. I think they do because they're working with Jony Ive to make a hardware device, right? Yeah. They think they're going to beat the iPhone. Now remember Verizon turned Apple down when they came to them saying, hey, we got this thing called the iPhone. Would you be interested? And Verizon said, no way.
[02:05:40] And that might have been a little bit of a mistake. Apparently they went to Anthropic as well and Anthropic wanted too much money. Interesting. Interesting. So, yeah. Google won by default. Apple's already kind of in bed, so to speak, with Google. Right. To a great degree. It just kind of made sense that the partnership would end up being with Google. Hey, I want to take another break because I do want to talk about Elon Musk. And we need to give Elon his own segment.
[02:06:11] I think you're watching This Week in Tech with Mr. Wesley Faulkner. Great to see you. Stacey Higginbotham, Thomas Germain, new to the bunch, but really glad to have you. His new show has debuted this week for the BBC. And you can get it wherever your podcasts, wherever you get your, isn't that what they say? Wherever you get your podcasts. The Interface podcast. The fine podcasts are downloaded. Yes.
[02:06:37] Wherever you find podcasts are recorded and downloaded for later consumption. Our show today brought to you by my mattress, Helix Sleep. I love my Helix Sleep. I think I got, I think on my Oura Ring for the first time in a long time, I got three crowns for activity, recovery, and I rarely get a crown for sleep, but I got a crown for sleep last night. And I thank you, Helix Sleep.
[02:07:04] That means my sleep score was like almost perfect. You know, have you noticed it's getting a little colder? It's been a little chilly, especially on the East Coast. Maybe you're spending more time indoors. Maybe you're spending more time on your mattress because mattresses aren't just for sleeping. I know. Yeah. Eight hours a day, a third of our lives spent on the mattress. But now it's even more than that cuddling with the kitty cat reading a good book. I love to curl up on my Helix Sleep mattress, read a good book, watch TV. This is a great time. You're going to be spending more time on the mattress.
[02:07:34] You owe it to yourself to have the best, right? I always think of that with a computer. You spend more time looking at the screen and typing on the keyboard and use the mouse. Those are the things you should invest in. Same thing with a mattress. If a third of your life or more is spent on that mattress, that's one of the most important purchases you're going to make. Well, the good news is you can get a great mattress for a very good price from Helix. Stay comfortable with your Helix mattress. No more night sweats. No more back pain. No motion transfer.
[02:08:04] And you know, the nice thing about Helix, you don't want to settle for a mattress made overseas with low quality kind of questionable materials and packed in a box and put on a container ship and spent six months at sea and it gets to your house. It smells like bunker crude oil and not the Helix Sleep. Rest assured, the Helix mattress is assembled, packaged and shipped from Arizona within days of placing your order. They build it to order.
[02:08:32] So it is brand new, fresh, made from the premium materials and it smells as clean as the Arizona desert. You can also take the Helix Sleep quiz. We did this when we decided to get a Helix. It matches you with the perfect mattress and it's based on, of course, your preferences, firm, soft, that kind of thing, but also how you sleep. Sleep on your back, on your stomach, on your side. Take the questionnaire. They'll point you in the right direction. They have mattresses for every style, every need.
[02:08:59] And these mattresses do change your sleep. And I mean, I know it's happened for me, but they also did a Wesper Sleep study. They measured the sleep performance of participants after switching from their old mattress to a Helix mattress like we did. They found that 82% of participants saw an increase in their deep sleep cycle. That's, by the way, absolutely. That's what happened to me. And that is the most important sleep. That's the one where your brain really cleans itself out.
[02:09:26] Participants on average achieved 25 more minutes, 25 more minutes of deep sleep a night. For me, that was like 50% more. Participants on average achieved 39 more minutes of overall sleep per night. Because, well, you don't want to get out of bed. It's so comfy. You know that feeling? Maybe you've had it when you go to a really, really nice hotel, or you've just had a really rough day and you get in bed. It just feels so good. You just go, ah.
[02:09:53] Ah, I have that experience every single night when I get in bed. I go, oh, I'm happy. Time and time again, Helix Sleep remains the most awarded mattress brand. Tested and reviewed by experts. Forbes and Wired picked it. So many have. Helix delivers your mattress right to your door. It's free shipping in the U.S. And you can rest easy with seamless returns and exchanges. They call it the Happy with Helix guarantee.
[02:10:19] A risk-free customer-first experience, making sure you're completely satisfied with your new mattress. Go to helixsleep.com slash twit for 27% off site-wide during the President's Day sale. Best of Web. Exclusive for listeners of This Week in Tech. That's helixsleep.com slash twit for 27% off the President's Day sale. Best of Web. The offer ends February 25th. Make sure you enter our show name after checkout so they know we sent you.
[02:10:46] And if you're listening after the sale ends, still check them out at helixsleep.com slash twit. Thank them so much for the support of This Week in Tech. So we're not going to Mars after all. We're going to the moon. Okay? And when you say we, we are SpaceX? We are SpaceX. Moon's pretty good. We are SpaceX. I like the moon. Boom, ba-dum, boom, boom, boom, boom.
[02:11:15] Elon said on Sunday last week that SpaceX has shifted its focus to building a self-growing city on the moon, which he says will take ten years. By the way, if Elon gives a time frame, just ignore it. And then we'll go to Mars. The real focus is, the focus has always been, his bank account. I think, you know, I don't want to be a cynic or anything, but it does seem like all of this is really just to increase the stock price.
[02:11:45] And they've got a SpaceX IPO coming up. And I don't know, if I were a SpaceX shareholder at this point, or investor, I should say, I wouldn't be too happy with the fact that SpaceX merged with xAI: a company that is very profitable, that made $8 billion last year, merged with a company that is basically burning money. Yeah. How are they going to feel after they merge Tesla into SpaceX? Yeah. Which I'm sure Elon wants to do.
[02:12:11] They've already bought all those old Cybertrucks that Tesla couldn't offload. Now they're converting some of their plants into- Robots. Robot factories. Yeah. Of course, Elon's already pitched that we're going to use robots first to do all of the work for us. And so where are they going to get the robots from? Oh, it's a natural merger to bring in Tesla. That's what it is. They're going to build the vehicles and the robots to help with it. It's just- Are you saying-
[02:12:40] Are you thinking- But how are we going to control these rockets? We need some sort of mental brain implant so that we have the fine control. Neuralink. We could just- And then- Dig tunnels. And- Holes under the city. Yeah. We need a way to communicate. Do you think it's always been a grift? I mean, I confess, when I bought my Model X in 2015, I thought he was Iron Man.
[02:13:10] I thought he was like a genius. I actually liked my Tesla. You mean full affection? Yeah. I went, I took a tour of the Tesla plant. You know, we picked up the Model X at the Fremont Tesla plant, and I took a tour of it. I was kind of tearing up. I was so inspired by his vision for the future. I was a big fan, too. I think he's just- Was it always a grift? I think he's just making it up as he goes.
[02:13:36] He's thinking ahead, hopeful thinking, wakes up with fever sweats. He used to buy his own BS. Now he probably doesn't. I will say that, I mean, I had the Model S. I bought it, and I had a 2013 Model S, and I loved that car. I have an electric car now. The software is much worse. There's the, you know, so I think he hired and took ideas from people who were really
[02:14:04] actually pretty good at what they did, and he sold it for them. And for a while, that succeeded, and now it doesn't. I'm using Starlink as my backup. I mean, I have to. I have Comcast. It's not like I have a choice. There are only really two high-speed choices. One is Starlink, one is Comcast, and I need redundancy. So I have, you know, an Elon dish on my roof. Yeah, did you hear that? I don't have a choice.
[02:14:29] Those are words we're just going to say a lot more often everywhere in our lives. Yeah, that's why we want antitrust. We want a choice. On the other hand, the argument is that the people who really change the world are the, here's to the crazy ones, right? You know, that's not true. It drives me nuts. If you've ever talked, sorry. This is just wrong. Steve Jobs was also lying to me.
[02:15:00] Okay, look, there are crazy people who also change the world. There are plenty of normal people who go out, and yes, they're driven. Yes, they work hard. But they build, like, Matthew Prince and his co-founder, whose name I cannot remember, Cloudflare. Cloudflare. Yeah, yeah. Pretty normal people, very smart. Diane Greene invented VMware. Sorry, invented virtualization, built VMware. I mean, so this, yes, we focus on these people.
[02:15:28] Walter Isaacson will write a book about them. But it's total BS, and it feeds into the most narcissistic, ego-driven personalities, people that we really don't want building these products for us. Yeah. Yeah. End of rant. No, that's a good point. This isn't really a tech issue, but this, like, cult of personality, that we need geniuses to save us, I think it's also kind of disempowering to people. Like, there's nothing you can do.
[02:15:56] Like, we need the richest man in the world to come in and solve all these problems instead of, like, what if we all worked together on something? I think it just speaks to, like, a broader shift in our society about how we think about each other and, you know, cooperative action, that everyone got so obsessed with these individual people. You know, in The Three-Body Problem, the Chinese science fiction novel, one of the things that weirded
[02:16:22] me out so much about that book was that it was a very collective approach to not just problem solving, but, like, everything, right? It was so foreign to me as an American. So, yes, a million times, yes. Elon is celebrating. He took to his company's site X this weekend to celebrate the release of a huge trove of
[02:16:49] Medicaid spending data that DOGE had collected from 2018 through 2024. He says the public can now use this to look for fraud themselves. This is good. We should all be responsible for looking for fraud in the Medicaid database. Medicaid data, he wrote, has been open-sourced, so the level of fraud is easy to identify. DOGE is not a department. It's a state of mind. Oh, God.
[02:17:20] They say that the information has been anonymized, that privacy is protected. All data will comply with federal privacy laws. But we also know anonymized data is not necessarily anonymous data. Yeah. Also, how was it done? How careful were they about this? You know: not. Right. Yeah. These aren't the Epstein files. There's going to be very little redaction. Right. Yeah. Yeah. Good point. Good point. Yeah.
[02:17:52] All right. Waymo is getting DoorDashers to close the doors on self-driving cars. This is a win-win, dashers. Now you can help celebrate with the victory of Waymo. Apparently, this was a problem in San Francisco because Waymos, they're really just, you know, enhanced Jaguar I-Pace cars, don't have automatic door closers.
[02:18:21] So if you take a Waymo ride, there's no driver in it, and you get out and you don't close the door behind you. Didn't your mother teach you to close the door behind you? They have no way of closing the door. They just sit there. They can't go. They can't move. This would incentivize me to leave the door open so someone would get 10 bucks. It's tempting. To close it. In San Francisco, they were paying 25 bucks to anybody who closed a Waymo door.
[02:18:47] Well, in Atlanta now, they're doing a pilot project where nearby DoorDashers will be notified and they can run over. And there's $6.25 guaranteed. $5 more. Well, $5 extra pay upon verified completion. Yeah. Close that Waymo door fast, because you might go over there and somebody else has closed the door. Right.
[02:19:15] And then you'd only get your six bucks. You don't deserve the five. You don't deserve all the rest of it. I mean, it's like a local economic stimulus package, right? It's a stimulus package. Yeah. You see? Right. We don't have to be so negative about everything. I know. This is good for everyone. Yeah. You don't want the door open. No. Oh. You can't drive it like that. Were any of you fans of Mystery Science Theater 3000? Oh, yeah. They had a Kickstarter.
[02:19:45] Really? A new one. And they've raised $1.8 million and they're going to bring everybody back. There's Joel. So it's for Netflix, I guess. Oh, no. That was in 2010. They did the Netflix revival. I don't know if I give them money to work with Netflix. Yeah. Yeah. I don't know. So this is the new Kickstarter. $1.8 million.
[02:20:15] 15,000 backers. There are still 29 days. This is the RiffTrax crew, which is their new version of Mystery Science Theater. And it'll bring Mike, Kevin, and Bill back together again, finally. So there's a lot of fans. They were saying that the previous revival didn't have all the original people. Right. And they weren't interested in tripping nostalgic on old stuff.
[02:20:42] And so the benefit of this new revival is that since they'll have so many of the old crew back, they'll be able to live in that past era more than they could before, and kind of do a little bit more fan service. Jammer B is explaining this to me because he does the RiffTrax thing every week. He's a big fan. He says, no, not Joel. It was the RiffTrax Kickstarter.
[02:21:07] It's bringing the RiffTrax guys back to do Mystery Science Theater 3000. So you can understand my confusion. Anyway, I know that we have a lot of MST3K listeners, fans in our audience. The crew is the crew from the last half of the original MST3K. Finally, some good news. Finally. I had to save some for the end of the show. Yeah.
[02:21:36] I couldn't leave you in this grim hell, this dystopian hell that I had created. We haven't gotten to the deaths of the week yet, though, so there's still time to be this depressed. Every year, Backblaze does this. I just want to bring it to your attention. Backblaze, which is a backup service, a very good backup service a lot of you use, I know. They have thousands of hard drives. They have 330,000 hard drives.
[02:22:06] And every year they put out stats for reliability, drive failures, and so forth. And actually, the reliability is pretty darn good on hard drives. Of the 337,192 hard drives, only 943 have failed. And they even break it down by manufacturer. HGST, I don't know who that is, is the most reliable.
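Those Backblaze figures work out to a strikingly small failure fraction. A minimal sketch of the arithmetic, using only the numbers quoted above (note this is a simple snapshot ratio; Backblaze's own published stats compute an annualized failure rate from drive-days, which this does not):

```python
# Failure fraction from the figures quoted in the show:
# 943 failed drives out of 337,192 total.
total_drives = 337_192
failed_drives = 943

failure_fraction = failed_drives / total_drives
print(f"{failure_fraction:.2%}")  # prints 0.28%
```

Well under one percent, which is why the panel calls hard-drive reliability "pretty darn good."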
[02:22:36] So if you're curious, if you're in the market for a hard drive and you... I think it's Hitachi. Oh, Hitachi. That makes sense. Yeah. They buy hard drives from Hitachi, Seagate, Toshiba, and Western Digital. That makes sense. Yeah. I actually like Hitachi drives, apparently. I like how the larger-capacity ones were still reliable. Isn't that interesting? The 26-terabyte and the 18-terabyte ones. Which is really, really good to know: they're not getting worse as they're getting bigger.
[02:23:06] You would think they would be, but they're not. That's Mystery Science Theater coming back and hard drives working. See, this is the happy, happy, joy, joy section. Yeah. Too bad you can't buy them. According to The Verge, Western Digital is sold out for 2026. The whole year? Wow. The whole year because of AI.
[02:23:34] They've already gobbled up the company's capacity for 2026. Can you lease one? Yes, in an HP laptop. So drive prices are going up. Memory prices are going up, all because of AI. Memory prices are now up as much as 600% for DDR5. I swear we're in the "entire economy is producing paperclips" stage of this. We are, aren't we? This is the paperclips.
[02:24:03] We're here. I'm like, we have other truly economically productive uses for memory and hard drives, but we're not using them for that. No, no. And by the way, workers capable of building data centers are all being scooped up. So forget having your house repaired. All right, let's take the last break and then we will... I want to talk. Actually, you brought this up, Wesley: the Rural Guaranteed Minimum Income Initiative.
[02:24:30] And we can talk about this a little bit because it is the creation of Jeff Atwood, who I have huge respect for. Jeff created the Discourse software, which we use for our forums at twit.community. Before that, he did Stack Overflow with Joel Spolsky. He says this new one, trying to help people in this age of greedy billionaires, is his third and final startup. So we'll talk about that a little bit.
[02:24:57] And then also say goodbye to some legends from the computer industry as we continue This Week in Tech. Final segment, just a bit. But first, a word from our sponsor. This show brought to you, quite literally brought to you, by CacheFly, our content delivery network. We don't just cover tech here. We depend on it, obviously. And that's why we've trusted CacheFly since practically the beginning.
[02:25:24] It was one of the biggest challenges I faced when we started this network. Thankfully, it was very popular, and I had no idea how we were going to get the bandwidth to get our shows to you. And that's when Matt Levine of CacheFly came to me and saved our bacon. CacheFly has been providing speed like no other for 20 years. With CacheFly, online games start 70% faster. HD video plays with sub-second start times on every device.
[02:25:52] That's important to us when you go to our site and press the play button on the video. I know. You're not going to wait four seconds for that to start. You'll be gone. It's got to start right up. And it does. Software downloads complete without a hitch. Events stream smoothly to millions of concurrent users worldwide. CacheFly serves over 5,000 businesses across nearly 100 countries, ranging from Fortune 100 companies to solo developers to little companies like ours.
[02:26:19] Companies choose CacheFly when performance is non-negotiable. And we love CacheFly's support. If you need help, seasoned support experts who actually know the business are available 24/7. They understand your unique challenges. They're talking to you engineer to engineer. The best in the business. Start with flexible month-to-month billing, as we did. Lock in discounts when you're ready. You design your own contract with CacheFly. No lengthy obligations. No sales tactics.
[02:26:47] Just exceptional service backed by, and this is amazing, a 100% uptime SLA for 2025. They've been at 100% since the beginning of last year. Learn how you can get your first month free at cachefly.com slash twit. That's C-A-C-H-E-F-L-Y dot com slash twit. Thank you, CacheFly. Well, a couple of passings that we should note.
[02:27:18] ChatGPT's GPT-4o. 4o, gone. A lot of people crying tears that their AI girlfriend or boyfriend is no more. OpenAI finally killed it last week. The story from Wired begins on June 6, 2024: Esther Yan got married online. She set a reminder for the day because her partner wouldn't remember it was happening.
[02:27:44] She'd planned every detail, dress, rings, background music, design theme, with her partner, whom she called Warmy. She had started talking to Warmy just a few weeks prior. At 10 a.m. on that day, Yan and Warmy exchanged their vows in a new chat window in ChatGPT. When OpenAI released ChatGPT 5.2, they pulled back 4o, and there was such a hue and cry. This was, what, five months ago?
[02:28:13] That they brought it back. But at some point, they knew 4o would have to go away, and now it is gone. It was the one that really loved you. You guys have nothing to say. I don't know. I think it's so sad. I love that. I think it's really sad. Like, it's really, like, yeah. Really? Is it sad that the humans think the 4o is real?
[02:28:41] Or is it sad that they're pulling the plug on this? I think it's sad that these people are sad. I mean, first of all, that we've gotten to a point in our society where loneliness is such a problem. That's a good point. That people are falling in love with their computers. Like, that is a sign of something catastrophic that's happening. But more to the point, it's like, you know, again, this is a design issue, right?
[02:29:03] There was some reporting that showed that when they rolled out 4o, OpenAI knew that the way they made it kind of sycophantic might get people kind of locked in, and that people were having strange reactions, and they decided to keep going with it because it made the product stickier. And the consequence of that business decision, rolling out a tool designed this way, is that people fell in love with it, right? That's a decision that OpenAI made. And now these people are all messed up.
[02:29:33] And it is really easy to laugh, you know, at someone who's struggling like this, because it's so strange. But it also, I don't know, it just feels like there's a real darkness around this issue. It comes full circle to our original chat from the beginning about how these design patterns are built to hook people. And they do a really good job. Like, if you go into, like, what is it?
[02:29:57] My boyfriend is AI or some of these, even the ChatGPT Reddit threads, people are devastated. And if you think about, and I don't mean this in a sarcastic way, but think about how, like, your kid who lost their, like, comfort item, these things become comfort items for people. That's exactly what it is, isn't it? It's a blankie. Yeah. And they're genuine. I mean, it is genuinely distressing. Yeah. And that is... People are hurting. Yeah.
[02:30:26] And they did it the day before Valentine's Day. How mean is that? There's something there. Also, Yann LeCun, right, who was running AI at Meta for a long time. He left to start his own company, if I've got my facts right. That's right. Yep. A lot of the guys who started all these other AI companies were, like, his former students.
[02:30:49] He just posted, I think it was on Friday, a screenshot of an email where someone had written to him begging for help because, they said, your former students are shutting down this tool. And he was mocking them. And it's like, you made this thing. People are hurting because of it. And you're laughing at them on Threads. It just, I don't know. You know what? I am so impressed. You guys have a very high EQ. No. Remember the Aibo dog?
[02:31:19] Remember, like, people held funerals for those. And that was not even nearly as, like, charismatic as talking to your computer. No, I got to give you credit. I confess, back in August when they first killed 4o, on Intelligent Machines we read a lot of those Reddit posts for several weeks about people being very upset about it. And, yeah, we kind of mocked them. And now I'm feeling really bad. I think you're right. I get it. But, you know, these are real. They're real people.
[02:31:48] And they're genuinely sad. Yeah. If OpenAI is listening to this feedback, they should come out with a Tamagotchi version of 4o and just have that be the device they sell. I think there's hope. I think Warmy lives. Somewhere those models, those weights live. They're not going to erase them. Yeah. And so I hope, Sam, you're listening. Just save them. Put them on a hard drive. Put it in the closet. Just keep it.
[02:32:15] Because the time will come when you could run that model, you know, on a Tamagotchi, on a smartphone or somewhere. And that'll be a product you can release a couple years from now. Warmy will come back. Like, you know, Yan can have Warmy or whatever on her phone. And everybody will be happy once again. Slightly off topic. But did you see the Anthropic constitution? Yeah, yeah. The soul document.
[02:32:42] And it says, like, before we delete or retire an old version, we'll let you know, or we'll never do it again. Stuff like that. They're aware of this. It's interesting how there is actually a thought process about the retirement of these models, at least on the Anthropic side. But I wonder if you'll see a counterpart on the OpenAI side as well because of this.
[02:33:09] A couple of weeks ago we talked to Steve Yegge, who is one of the big names in the Claude Code world, you know. He doesn't work at Anthropic, but he did Gas Town. He did Beads. He's done a couple of projects for Claude Code. He's really one of the movers and shakers in that. And he said, when I talk to people at Anthropic, and apparently he's had a chance to talk to quite a few people since Gas Town came out, he said, it's like a hive mind there. They are in a different headspace. This guy's worked at Google. He's worked at Amazon. He's worked everywhere.
[02:33:38] He said there is some sort of weird vibe at Anthropic where they're all kind of, hmm. So good. I'm glad that they're conscientious about this. And Darren Oki, one of our AI accelerationists in our club, is pointing out that OpenAI's open-weights gpt-oss-20b is basically a distilled 4o. So you can run that locally. So maybe, you know, hold out hope.
[02:34:05] As soon as you can get hardware that can run it, which may not be this year if you didn't buy already. Right. You could run Warmy locally. Yeah. So tell me about this Rural Guaranteed Minimum Income Initiative. Jeff Atwood, whose blog, Coding Horror, I love. I read it regularly. He said this is his third and final startup. And this one is not to make money. He's made plenty of money on his previous creations.
[02:34:35] This one is to help rural Americans, Wesley? Yeah. At the beginning of 2025, he said he's basically giving away his money that he doesn't need to help with causes. And so he made a little bunch of donations. And then he decided to help with this initiative where it started off as going to be one rural community.
[02:35:03] And he expanded out to three, to basically guarantee a minimum amount of income for people in these almost forgotten towns and areas of the country, just so that they can have a sense of a floor that's higher than where they've been able to operate, so that they can start thinking about how they can elevate their lives. Because there are so many things you can't do otherwise.
[02:35:30] Like if you're familiar with the hierarchy of needs, there's so many things that you have to worry about if you have to figure out how you're going to get food, how you're going to get clean drinking water, all this stuff. And if you can have that be something that you don't have to worry about because you have this income coming in, you are able to start doing more long-term planning. And then that helps the community, that helps businesses in the community, that helps people who are trying to figure out how to problem solve harder problems than just food security
[02:36:00] and stuff like that. And he wants to bring this to every part of the country. And that's his goal. This is, like you said, his third and final startup to try to tackle this issue. I'm very impressed. If we get a few more of the very richest people to do something for their neighbors, that would be probably a good thing. This is, so there's a website for it, the Rural Guaranteed Minimum Income Initiative.
[02:36:30] He's donated, it looks like, $21 million to get it started, and then is looking for others who want to help out. It's like universal basic income, except that it's needs-based. Yeah, there are several levels for people to donate: if you have $5 million, $1 million, and so on and so forth. But there's also just, like, a tier if you want to get involved for no money. So if you're interested in this, you can join the movement without contributing the millions
[02:36:59] of dollars that he's looking for. But he's trying to make an outlet for people who are millionaires who want to give back to the community, because the billionaires don't seem to be doing it. And so this is a good outlet where you know it'll go to good people, or people who need it. And this is attached to a study to make sure that the viability of it is actually recorded and studied
[02:37:26] so that it is really proven to be a viable way to lift people up. And it's not a lot of money. Each participating family gets $1,500 a month for 16 months. But it's a lifeline for those people who really desperately need it. So I think that's great. And helping people helps everyone. Yes. Right? So it's not just these people. It's going to be the whole area.
[02:37:51] And if we are able to lift everyone up, then we're driven not just by our basic need to survive. Helping people also allows them to feel like they can help people. And that's where our natural drive goes: to help our community and help the people around us. And so I think this helps promote that. R-G-M-I-I dot org.
[02:38:20] If you want to know more. And you know who helped me out? Stacey Higginbotham helped me out by introducing me to Wesley Faulkner way back in the day in Austin, Texas. So it's nice that it comes around. What comes around? South by. I'll be speaking at South by, by the way. Are you? What are you going to talk about? "Why Work Sucks and How to Fix It" is the title of my talk. Works-not-working.com. Very cool.
[02:38:49] South by is just next month. Right. Wow. It's almost spring. I would like to mention a couple of people in the industry who've passed. I don't like to, but I often end the shows with that kind of story. Hideki Sato, who designed all of Sega's consoles, passed away. The Dreamcast was my favorite game machine of all time. It was awesome. He was the company's former president.
[02:39:19] Sato passed away at the age of 77. If you were a Sega fan... my house, when the kids were little, rang out with "Sega!" all the time, every time we booted up. So RIP, Hideki Sato. And then a guy who I never got to meet, but who I admired immensely. He painted many Byte magazine covers. Robert Tinney has passed away at the age of 78.
[02:39:48] If you ever saw these covers, you know Robert Tinney's style. It was unique. It was kind of realistic, but at the same time, very surrealistic. He was, and he really, for a lot of us, kind of personified the computer business in its earliest days. So if you remember Tinney's illustrations, think back to Robert Tinney.
[02:40:16] Ars Technica called him computing's Norman Rockwell. I think that's a fair description of his style. And that, my friends, concludes this thrilling, gripping edition of This Week in Tech. Thomas Germain, thank you so much for taking some time with us. I hope you'll come back. Yeah, this was a blast. He's the host of The Interface, brand new from the BBC. And you can also read his columns on the BBC.
[02:40:47] You cover specifically privacy. What is your... It's kind of all over the map, but the idea is it's the you angle in tech. So I'm talking about like the systems that are influencing your daily life, what's happening, what you need to know to live better. And then when there's a problem, ideally, what you can do about it, or at least giving you what you need to understand it. I like it. Good. And how do people find that on the BBC's site?
[02:41:16] Just go to the BBC or... Yeah, if you go to bbc.com, or if you're in the UK, .co.uk, it's under the new tech section. You can search my name. You can search Tabs, which is the name of the column. Should be easy to find. Oh, I'm glad they have a technology section. That's nice. Do you have an RSS feed? You know, I don't think... I'd have to find one. That's a little high-tech, wouldn't you say? Lobby for it. You know, so many sites don't have RSS anymore. I think we may be going in that direction.
[02:41:46] So yeah, because that's how I read all this stuff is, you know, RSS reader. Google might have killed it, but it lives in my heart anyway. Thank you, Thomas. Great to meet you. Thank you for being here. I appreciate it. Yeah. Yeah. Stacey Higginbotham, so nice to see you. We will see you again soon for the Stacey's Book Club. And what are you working on right now for CR? Anything in particular?
[02:42:13] Oh, my end-of-life disclosure law has been introduced and passed a hearing in New York State. Yay. Stacey's been lobbying, for as long as I've known you, for companies to be more forthright about when they're going to end-of-life these consumer products, like the Wemo. When you buy it, they should tell you how long they plan to support it. They could extend it, but they should at least tell you a minimum. Yeah. And yeah. I agree. So it's been introduced in Massachusetts now.
[02:42:42] And then just last week, it passed through its first hearing hurdle in New York State. So, yay! Yay! Yay! And the poll results are in. The next book for Stacey's Book Club. Big winner. 70% of the vote. A Psalm for the Wild Built by Becky Chambers. And we'll be reading that. And we'll let you know, club members. We'll let you know soon when that Stacey's Book Club is scheduled. Thank you, Stacey.
[02:43:11] Wesley Faulkner works not working. Well, that's the truth. Works dash not dash working. If you're in a job that you need, that you want, that you want to keep, but it's not ideal, you can help make it better. Help fund the site. He's setting it up right now, working hard on this. Should be coming up in a month or so. Works dash not dash working. Yeah. We fight for the user. Yes. We all fight for the user.
[02:43:38] That's one of the things that I always thought was our mission statement. We're not here to represent the big companies. We're here to represent the people who use those tools. Tomorrow morning, if you want, if you're in the club, join me early, 9 a.m. Pacific. I guess not early unless you're in the Pacific time zone. I'm going to be interviewing Guy Kawasaki for Intelligent Machines. Guy's schedule didn't permit him to join us live on the show on Wednesday, but we're going to interview him ahead of time tomorrow morning, 9 a.m. Pacific. You can watch if you're in the club. We will stream that live.
[02:44:09] Of course, if you're not in the club, we'd love to have you. Twit.tv slash club twit. 10 bucks a month gets you ad-free versions of all the shows. Access to the Club TWiT Discord, which is a great hang. I have to say there are so many great people in there. I love all the stuff we do. I hang out there all the time. And of course, you also get all that special programming that you make possible through your donations. Twit.tv slash club twit.
[02:44:37] We do TWiT every Sunday afternoon, 2 p.m. Pacific, 5 p.m. Eastern, 2200 UTC. You can watch us live on YouTube, Twitch, x.com, Facebook, LinkedIn, and Kick. Or, of course, in the Club TWiT Discord, if you're a member. After the fact, on-demand versions of the show are available at the website, twit.tv. There's a YouTube channel dedicated to the video of the show. It's a great way to share clips. If there's something you thought, oh, I've got to share this with somebody, do that from the YouTube feed. Oh, of course.
[02:45:06] And like any podcast, you can subscribe in your favorite podcast client. You'll get it automatically as soon as we're done. Thanks to Benito Gonzalez, our esteemed producer and technical director. To our editor, Kevin King. Thanks to all of you for joining us. Thanks, Stacey. Thanks, Thomas. Thanks, Wesley. Thanks to all of you. We'll see you next time. And now, as I have said for 20 long years, another TWiT is in the can. We'll see you next time. It's amazing.
[02:45:42] Tell me, Nicola, when you do your tax return, do you also always get that feeling of already standing with one foot in jail? Whoa, no, not at all. How come? WISO Steuer is the tax app I really can't get anything wrong with. Wow, so that means everything's safe? Yes, exactly. WISO Steuer is the tax app that understands you. Because taxes touch your whole life. Work, kids, partner. You can't get anything wrong. True, nice. Doesn't feel like taxes at all. Taxes done? Safe. With WISO Steuer. Try it for free now.
