From AI-powered code generation boosting productivity to adversaries using the same tools to hunt zero-days, the panel exposes the coming wave of AI-fueled cyberattacks—and why most companies aren't ready for it.
- Cotton blocks Trump-backed effort to make daylight saving time permanent
- The End of Cybersecurity
- Amazon says it didn't cut 14,000 people because of money. It cut them because of 'culture'
- Here's How the AI Crash Happens
- US government is getting closer to banning TP-Link routers
- Neato cloud shutdown sees robocleaners robbed of their smarts
- FCC will vote to scrap telecom cybersecurity requirements
- Trump FCC Votes To Make It Easier For Your Broadband ISP To Rip You Off
- Swedish Death Cleaning But for Your Digital Life
- The F5 Hack is a Big Deal
- OpenAI Releases Agentic Security Researcher
- 'Do not trust your eyes': AI generates surge in expense fraud
- Proton Data Breach Observatory aims to alert you in near real-time
- Using a Security Key on X? Re-Enroll Now or Your Account Will Be Locked
- YouTube denies AI was involved with odd removals of tech tutorials
- 10M people watched a YouTuber shim a lock; the lock company sued him. Bad idea.
- Samsung's $2000 smart fridges are getting ads - gHacks Tech News
- ESPN, ABC, and other Disney channels go dark on YouTube TV
Host: Leo Laporte
Guests: Jill Duffy, Alex Stamos, and Stacey Higginbotham
Download or subscribe to This Week in Tech at https://twit.tv/shows/this-week-in-tech
Join Club TWiT for Ad-Free Podcasts!
Support what you love and get ad-free shows, a members-only Discord, and behind-the-scenes access. Join today: https://twit.tv/clubtwit
Sponsors:
[00:00:00] It's time for TWiT This Week in Tech. Stellar panel for you this week. Stacey Higginbotham is here from Consumer Reports. Jill Duffy from PC Magazine and Wired Magazine. And our favorite security guru after Steve Gibson, Alex Stamos is here. We're going to talk about the time change and how we came this close to avoiding it. Thanks, Congress. We'll talk about YouTube TV and Disney. Alex ain't too happy about his Cal game.
[00:00:30] And the AI layoffs, are they really AI? And is the big crash coming? All that and more next on TWiT. Podcasts you love. From people you trust. This is TWiT.
[00:00:53] This is TWiT, This Week in Tech, Episode 1056, recorded Sunday, November 2nd, 2025. The Big Sleep. It's time for TWiT This Week in Tech, the show where we cover the latest tech news. And man, we got a great panel. Oh man, I know I say this every week, but it's true this week.
[00:01:20] Stacey Higginbotham is here. She couldn't make it last week due to a power outage on her tight little island. But she consented to join us this week and I'm so thrilled. Hi, Stacey. Hi, happy to be here. Policy fellow at Consumer Reports. She's the person responsible for getting Microsoft to back down on... Wait a minute. They didn't back down. They didn't back down. Oh well. I'm the person tilting at windmills. I know.
[00:01:45] They kind of, they made it as easy as possible to get another year out of them. But they didn't do what you and your colleagues called for, which was to just give it another year, Microsoft. You're going to do it anyway. Give it another year. It's great to see you, Stacey. We will have another Stacey's Book Club and I will tell you what that book is soon. She's reading it right now. And she's depressed.
[00:02:09] Also with us, Jill Duffy. We haven't seen Jill in a long time. It's so great to see you. She's a contributor to PC Magazine and Wired and always welcome on our network. Thank you. Even though it's five in the morning where she is today. I'm so sorry. Yeah. Shout out to everybody on VNC in Laos. They're up early in Laos. It's Monday. By the way, I noticed Brave New World is a library book. You might want to get that back before it's overdue.
[00:02:38] I think it was like a library sale. Oh, good. Okay. When they were liquidating old stock. Yeah, yeah, yeah. I'm very aware of it because I'm reading a book right now. I'm listening to an audiobook from the library on Libby, and it's only got eight days left, and I don't think I'm going to make it. And that's a bad feeling. I don't know if you can extend audiobooks. You got to get back on that list. I know. There were four people ahead of me. So yeah, it's gonna be a while.
[00:03:03] All right. So it's 5am in Laos, but it could be 4am. But you live in a country, a wise country, a thoughtful country, where they don't change the clock twice a year. People who have been here for an hour waiting for us to start the show maybe didn't notice: we are now in standard time, which means we start an hour later. But that's good for you.
[00:03:31] For me today, it's good for me. Yes, it's good for you. Yeah. You know, the closer you get to the equator, the less it matters. It doesn't matter. That's right. Because the sun rises and sets at nearly the same time all year round. It's the people who are farthest from the equator who I think benefit the most from daylight saving time. Ironically, the person who blocked it in the United States is from Arkansas.
[00:03:54] We were this close, this close, in the United States to having permanent daylight saving time. In fact, even the president said the change in the clocks is, quote, and he said this, of course, on Truth Social, a big inconvenience. And for our government, all in caps, a very costly event. Yeah, you got all those people going around winding the clock.
[00:04:20] What do you mean? Anyway, there was a bill to fix this that could have passed in time to keep the change from happening last night. Right. Sheldon Whitehouse, Democrat of Rhode Island. You couldn't get farther left than Sheldon or farther right than Tommy Tuberville, Republican of Alabama. They called for the Senate to pass the bill, the Sunshine Protection Act, on Tuesday.
[00:04:48] But Tom Cotton of Arkansas single handedly blocked it. There has been a bill like this. Every year. I'm going to say every year for at least the last 10 years, maybe longer than probably longer than that. And every year Americans get their hopes up and it never happens. It's not controversial. Well, you know, it's controversial. I think one of the reasons it stops is the controversy is, well, should we do daylight saving time or standard time? And it's about 50 50.
[00:05:18] And that's probably part of the controversy. Cotton says if permanent daylight saving time becomes the law of the land, it will again make winter a dark and dismal time for millions of Americans. The sun wouldn't rise till after eight or even 8:30 during the dead of winter in Arkansas. Try Alaska. It would be like 10 in the morning before the sun comes up. You know, why isn't Alaska fighting this? I don't know.
[00:05:45] Anyway, the darkness of permanent daylight saving time would be especially harmful for school children and working Americans. Another year lost. I think the biggest issue is kids waiting for the bus in the dark. Did we want to introduce Alex? Did I not introduce Alex? Son of a gun. I'm just going to worm my way in here at some point. Oh, so sorry. God, it's not that I didn't know you were here.
[00:06:13] I was just so upset about this. Alex Stamos is here. You know, we get Alex like twice a year. I'm thrilled to have him on. His newest job. He's the chief security officer at Corridor.dev. Very timely because Corridor.dev is designed to protect people using AI to code. Yeah. The security layer for AI coding. You couldn't be more timely on this.
[00:06:39] And we're going to talk a little bit about agentic browsers, because there are more and more problems with them. Yeah. Alex Stamos, his name is legend. He founded the Stanford Internet Observatory, which is now, sad to say, defunct. But that was watching for disinformation. And of course we know there's no disinformation, so we didn't need it. CSO at Facebook, Yahoo. Zoom called him in to fix their problems during COVID.
[00:07:09] He's a legend in the business. Alex, it's so great to have you on the show. I appreciate it. Thank you for being here. And congratulations on the relatively new gig. Yeah. Just jumped a couple months ago. It seemed a lot more fun than being a CSO at a big public company. Corridor is a startup, I would guess, right? It's a startup, started by two of my former Stanford students actually. Oh, nice.
[00:07:35] I was, I was one of their first seed funders and they seem to be having a lot more fun than I was, you know, CSOing at a public company. So decided to make the jump. You don't have all those pesky shareholders to think about. You just do your job. Yeah. Yeah. Yeah. Yeah. Yeah. Well, I'm, I'm sorry about the Stanford Internet Observatory. That's again, another fine act of Congress. Yeah. Well, yeah.
[00:08:04] Because I mean, our two big things were disinformation and online child safety. We actually did a ton of work on online child safety and AI and child safety stuff. That's totally right. Gosh darn it. Well, that's probably a little more important than the clocks changing twice a year. So I'm sad. I mean, people also keep bringing up the idea that this would be the time for us to go to just two time zones in the U.S., just an east and a west, right? Get rid of central and mountain.
[00:08:30] Well, there are even people who say we should just go to UTC worldwide and forget about it. And you just figure out when the sun's up in your neck of the woods. Right. And schools could, you know, start at whatever, 1400 UTC. They wouldn't have the whole issue. But I guess that's a little, that's a bridge too far. It's really silly to be moving the clocks, but you know what? We can't even use the metric system. I mean, you really knew what was going to happen. I know. You know, it's funny, Alex.
[00:08:59] I was vibe coding this morning. I got up early because my biological clock thought it was 7am, but it was 6am. So I thought, well, I've got some free time. I'll sit down with Claude Code, clean up my Emacs configuration in time for a couple of coding challenges coming up this month and next. And three hours went by and it did it. By the way, it's mind-boggling what it did. I mean, I thought, oh, it's not going to really know Emacs. That's kind of obscure.
[00:09:28] It's not going to know Common Lisp because that's what I code in. No, it's great. In fact, the older the language, probably the better, right? I mean, what it needs is lots of examples. And there's tons of examples; people have their Emacs configs out there. There's lots of Lisp. I think Structure and Interpretation of Computer Programs, you know, the famous computer science book. The classic. My favorite book. That's what I learned in CS61A. I think that's actually completely open source. So it's been trained on the whole thing. That's right. And How to Design Programs, which is Racket, but similar.
[00:09:57] Both are Scheme. Yeah. Which are Lisps. Yeah. I mean, that's what got me into Common Lisp. I followed that path and Scheme was a little restrictive, but Common Lisp, you know, it's as old as I am. I like a language that's as old as I am. Anyway, enough of that. But vibe coding is amazing. It is. And it's just mind-boggling. I use Claude Code, but there's also Codex from OpenAI, Cursor.
[00:10:27] I mean, there's so many of these tools now. Lovable. And I think companies more and more. Is there a risk to companies adopting Vibe Coding? I mean, it does seem like you kind of risk not really understanding what the AI has done. So, first off, I think we have to separate these two things. There's two totally different uses, right? So, there's the Vibe Coding, which is like amateurs using these tools to do fun things, to do amateur projects. Me. That's me. Yes.
[00:10:57] Right. And that's great. I mean, I think this is actually a wonderful thing. Over the next couple of years, we're going to see really positive impacts from normal people being able to use computers for the first time ever, as they really should be used. Like, it used to be if you were just a normie and you had to parse through megabytes and megabytes of data, you had to either have a data scientist or you had to get it in Excel and try to work your way through it.
[00:11:24] And now you can go into Claude Code and, if it can't load it into its own context window, it will write Python to do it for you. Right? Yeah, that's what it does. Yeah. That's the kind of thing that, you know, a lot of CS folks took for granted, and now billions of people have access to that level of capability over the next couple of years. That's amazing. That's a positive thing.
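The kind of throwaway script Alex is describing, the thing an agent writes when the data won't fit in its context window, might look like this minimal Python sketch (the file layout and column names here are invented for illustration):

```python
import csv
from collections import defaultdict

def summarize(path):
    """Stream a large CSV row by row instead of loading it whole,
    tallying a total per category. This is the sort of one-off
    script a coding agent generates when the raw data is too big
    to paste into a context window."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["category"]] += float(row["amount"])
    return dict(totals)
```

Nothing clever is happening here; the point is that someone who has never heard of a DictReader can now get this written, run, and explained for them.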
[00:11:47] That is different than software, professional software engineers, utilizing AI enabled tools to help them speed up, which is also a really positive thing, but does have some interesting challenges. And that's what we're doing at Corridor is basically building the frameworks that if you're a bank, if you're an aerospace company, if you have to live up to existing frameworks, to privacy rules, to safety and security rules, these tools are really cool, but they don't understand those things.
[00:12:17] They don't understand architecture very well. And they certainly don't understand the rules you've had to live under before. They make things really fast, but there are still human beings who will go to jail if you break those rules. Right? So you want to have- Or customers whose social security numbers will be leaked. There was actually, I'm sure, Alex, you saw it: Jen Easterly wrote something in, I think it was Foreign Affairs maybe?
[00:12:38] She had a really good article or op-ed about basically vibe coding, allowing for better cybersecurity on the development side because it'll be cheap and easy. And that's what you need to embed security into things because people don't really pay for it. So I don't know. Yeah. I think it cuts both ways. I think- Exactly.
[00:13:02] The AI tools are going to make fewer of the individual mistakes, but we're also ending up with adversaries being able to find bugs faster. I think we're going to talk about that in some of our stories. Yes. Yes. And also the other crazy thing is adversaries are using vibe coding tools as well. That's actually been a big change in the last six months. Both OpenAI and Anthropic have threat intel reports. It's kind of a model they took from the big social media companies, these quarterly threat intel reports.
[00:13:31] And this is how we see people abusing our platforms. And they've always said folks have done spear phishing and spam and this kind of stuff. In the last six months, these reports have really changed. It's really that now all of the steps of the traditional intrusion kill chain are being automated on top of OpenAI and Anthropic, including tool creation, exploit creation, and such.
[00:13:52] And this is a big deal, because one of the dirty little secrets of the attacker world is traditionally only a relatively small number of adversary teams have had the ability to do true oh-day creation and exploit development. Right. You've got, obviously, the Five Eyes, the US and our Anglophone allies, you know, France and Germany, a couple of European countries, Israel, obviously Russia, China, some of the best Iranian groups. Most of the Iranian groups don't do oh-days, right?
[00:14:20] Like they get away with really good password sprays and social engineering and such. India and Pakistan do terrible things to each other, mostly without oh-days and without new exploits. You're saying oh-day, you mean zero day? Zero day, right. Like without actually creating new exploits or new vulnerabilities. I just want to make sure everybody understands, because I've always called it zero day. Yeah, I'm sorry. No, no, no. I'm going to call it oh-day from now on. If Alex Stamos calls it oh-day, it's oh-day. Zero day. I'm sorry.
[00:14:50] But like, meaning that same thing, Leo. My apologies. I was thinking, I'm still thinking about the metric system over here. Right. The oh-day is not the SI unit of exploitation. My apologies. But yeah, so very few adversaries have ever actually been able to do that.
[00:15:14] And with vibe coding, which will mostly not be on OpenAI and Anthropic, it's mostly going to be open source tools that are specifically retuned for this. And you can actually find some of this stuff. If you go to Hugging Face, people have built models, open source models, specifically to do things like write exploit code. That is how these exploits are going to be built. Wow. And that's good. Can they innovate or is it just rehashing existing exploit code? It's rehashing, but that's good enough, right? Oh yeah. Yeah.
[00:15:42] If you use these tools and you take some off-the-shelf open source thing that's super widely used and you scan through it and you find a use-after-free bug. You know, use-after-free is use-after-free, right? Yeah. It's good everywhere. And AMD64 is AMD64. And so if you use something that's been trained on 5,000 AMD64 exploits in the past, then, like, there we go. It knows where to look. Yeah, exactly. That's really interesting. I mean, script kiddies have always been a problem.
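To make the pattern concrete, here is a deliberately naive Python sketch of the kind of scan being described. It is nothing like a real analyzer or a trained model, just an illustration of the shape being hunted: a pointer that gets referenced after free() without being reassigned.

```python
import re

def find_use_after_free(c_source: str) -> list:
    """Toy single-pass scan of C source: record each free(ptr),
    then flag later lines that mention that pointer, unless the
    line reassigns it. Real tools track aliasing, control flow,
    and lifetimes across functions; this only shows the idea."""
    freed = {}   # pointer name -> line number where it was freed
    hits = []    # line numbers that touch a freed pointer
    for lineno, line in enumerate(c_source.splitlines(), start=1):
        m = re.search(r"\bfree\(\s*(\w+)\s*\)", line)
        if m:
            freed[m.group(1)] = lineno
            continue
        for name in list(freed):
            if re.search(rf"\b{name}\s*=", line):
                del freed[name]        # reassigned, no longer dangling
            elif re.search(rf"\b{name}\b", line):
                hits.append(lineno)    # use after free
    return hits
```

A model trained on thousands of real use-after-frees is doing a vastly richer version of this same pattern match, which is exactly why the bug class travels so well between codebases.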
[00:16:11] These are people who don't have the skills perhaps to create good hacking tools, but can use other people's hacking tools. Is it your sense that if somebody is good enough to write these tools from scratch, they're less likely to be evil? And so making it accessible to script kiddies makes it worse? Or is that just a nutty premise? Well, I mean, even among the adversaries who've had this capability, very few of them have, right? Like, let's just talk about Russia. Yeah. So you're making it more widespread.
[00:16:40] I mean, that's bad enough all by itself. You look at the Russian adversaries, right? You've got, like, the difference in quality between, like, the top hacking teams at the SVR versus the not as good hacking teams at the GRU versus the monetarily focused ransomware teams is humongous, right? Yeah.
[00:16:58] The people who did the SolarWinds hack, they built a custom in-kernel Windows rootkit, which swapped out SolarWinds' source code in memory. It swapped out the page in memory without touching disk. It decrypted it, swapped it out just for the moment it was compiled and linked, and then swapped it back in. Oh, my God. To not leave a trace on disk. So I got to work on the investigation and the response at SolarWinds, right?
[00:17:26] This is the first project Chris Krebs and I did for our consulting firm. And they did an incredibly good job because they have probably the best, you know, engineers who work on hacking in Russia working for them. This is SVR? SVR, right. Yeah. So that's the part of the- Worse than the GRU. Much worse. I mean, much more capable than the- This is the government. This is the Russian government, basically. Yeah, SVR is part of when the KGB split up.
[00:17:54] The foreign intelligence service component of the KGB became the SVR. The domestic and kind of near-abroad part became the FSB. That's the GRU. No, it became the FSB. GRU has always been GRU. GRU is military intelligence. So they're part of- Yeah, they're the thugs. Right, they're the thugs, yes. And so the GRU, when they break in, they don't mind you knowing. They're the ones who literally, like, will break into people's apartments and then, like, leave a crap in the toilet. They do stuff like that. So if you're gonna be thrown out a window, it's probably the GRU. That's true.
[00:18:24] Or FSB, right. FSB is like the KGB part that, like, terrorized Russians. But if your memory is gonna be swapped in vitro, that's gonna be the SVR. Yeah, right. Because what they- They're sophisticated. Yes. Ah, they don't want to get caught. Yeah. Yes, it's quite different. Yeah. And so my fear with all the AI tools is that it's kind of, you know how, like, the Premier League, you can get sent down, you can go back up, right, if you do well? Right, right. That everybody's gonna step up to the next league with AI tools, right?
[00:18:52] And that's gonna be really terrifying. So on the defensive side, we all have to do the same thing. We all have to take a step up, because the attackers are all gonna be able to go to the next league up with the kinds of things that they're gonna be able to build that they've never been able to build before. So, Jen Easterly, former director of CISA, and you're right, Stacey, it was Foreign Affairs. What a great reference. She wrote an article called The End of Cybersecurity: America's digital defenses are failing.
[00:19:21] And this is primarily because of the effectiveness of these AI coding tools. It's because, no, it's because we don't invest in security. Because we don't. Yeah, because we've basically shut CISA down. Yeah. But AI can save them, she says. So what's the good news? Well, the good news is that, because firms don't invest in cybersecurity until they have to, right? Right. And we don't have a really great regime for forcing that.
[00:19:51] Her argument is that we can use AI to make it easier to build secure software. She also makes the same point that Alex made, which is we have to do it anyway because everybody's going to use AI to attack us. Yeah. So we should be leveling up anyway and AI will just help us level up. But she's making kind of an economic argument here, which is that it will be cheaper and easier, presumably, to build more secure software.
[00:20:19] Cheaper is always, you know, more likely to happen. Sad to say. She says AI systems can autonomously find and patch software flaws. Has that been your experience, Alex? Is that the case? It is, to a certain extent. I mean, they certainly can find them. Patching is a little bit harder. But I think, for the first time ever, we're in a place where it is economical to refactor code. We're getting to the point where it's becoming more economical to actually refactor large amounts of code than to just fix bugs.
[00:20:49] That we might get to a place where rewriting your old C++ code to be on a new C++ standard with a secure templating library is better than trying to find all the memory management problems. We're not quite there yet, but in a couple of years, converting your C++ code to Rust is actually going to be economical. Yeah. That was insane two years ago. Right. It would have been cheaper just to rebuild from scratch. It's not automated though.
[00:21:18] You still need engineers to supervise, yes? To supervise, yeah. But look at the amount of code they're writing. I mean, human engineers are basically becoming really technical product managers, right? They're doing architecture. They're doing design. Again, the coding tools are quite bad at architecture. The only one that does a reasonably good job here is Amazon Kiro, which is still in beta, which is an IDE-based coding tool that makes you do a PRD. It makes you do an architecture.
[00:21:46] It makes you write tests. So it forces you through the steps that you have to do as a professional engineer. The rest of them, like Claude Code, you say, I want to build an incredibly complicated client-server architecture that is going to take 17 different components on AWS. It's like, yep, let's get going. You're like, no, no, no, wait, wait, wait. Yeah, and you see the code scrolling off the screen. Yeah. Yeah, it just goes, right?
[00:22:13] And so I think that's, I mean, that's like part of our thesis at Corridor is like, hey, we got to build frameworks around these things if you're an enterprise. And to their credit, you know, OpenAI and Anthropic in particular, and Cursor understand this, right? Like Cursor in particular is a company that's like trying to target enterprises.
[00:22:32] It's quite different than Lovable and Vercel, where they're building stacks that are targeted in particular at the hobbyist consumer, to make it easy for them. So I think you're going to see more bifurcation. Even Claude has started to say things like, you know, it'd be good if you made a little to-do list here, if you set some structure around it instead of letting me just go like crazy.
[00:22:58] Although you can still hit shift-tab and let it just go, which I did this morning, I must say. Right. I think what you're going to have is professional engineers are not going to use the CLIs. Right. They're going to use a tool like Cursor. They'll use an IDE. Yeah. And then Cursor's going to use, you know, Sonnet 4.5. They'll use Anthropic's models, but you'll be using it through a professional tool that forces you through some kind of engineering plan. Well, I didn't anticipate getting into this, but you know what? When Alex Stamos is in the house, I think it's pretty important we do. Sorry. I just dive right in.
[00:23:27] And it does segue into the next story, which we're going to get to in a minute, which is Amazon firing 30,000 people. Did they do it because of AI? Well, maybe, maybe not. Not exactly. We'll talk about that in just a bit. Jill Duffy is here. She's spending the morning with us. So nice to have you. I hope you have a nice cup of tea. Thank you. I have some strong coffee. Yes, I do. Yes. What is this vigilante? I like your mug. It says vigilante. Oh, this is a great roastery in Maryland. Nice.
[00:23:57] What a good name for a roastery. Yeah. You know, it's the lead roaster's last name. And it's got the evolution of coffee drinkers on the back, which I like. Yeah. Early man to coffee drinkers. From monkey to, looks like a fisherman, oh no, a spear carrier, to coffee drinkers. That's good. Yeah. Yeah. I like it. Vigilante. Good. I'm going to try some. Nice to have you, Jill. Alex Stamos is also here. His new company, Corridor.dev, helps devs use AI to code securely.
[00:24:26] So you're in the right place at the right time. And Stacey Higginbotham from Consumer Reports, where she's a policy fellow. It's great to have all three of you. We will get back to the news in just a second. But first, a word from our sponsor. Our show today brought to you by ZipRecruiter. We use ZipRecruiter. What if you could consistently find whatever it is you're looking for right away? We're talking about everything from parking spots to something on my mind right now, holiday
[00:24:54] gifts, jackets or jeans that fit perfectly. Imagine how much time you would save. Well, you may never instantly find those things. But if you're hiring, you can find qualified candidates right away, time and time again with ZipRecruiter. And today you could try it for free at ziprecruiter.com slash twit. And this is their secret sauce. ZipRecruiter's powerful matching technology works fast to find top talent.
[00:25:23] With ZipRecruiter's advanced resume database, you can unlock top candidates' contact info instantly. No wonder ZipRecruiter is the number one rated hiring site based on G2. If you want to know right away how many qualified candidates are in your area, look no farther than ZipRecruiter. Four out of five employers who post on ZipRecruiter get a quality candidate within the first day. Right now you can try it for free. ZipRecruiter.com slash twit.
[00:25:51] Again, that's ZipRecruiter.com slash twit. ZipRecruiter, the smartest way to hire. I'm glad some people are hiring, because some people are firing. Amazon, according to Reuters, is preparing to cut as many as 30,000 corporate jobs starting Tuesday. Now, they have a million and a half employees overall, but that's almost 10% of their corporate workforce.
[00:26:21] And Amazon says it's not because of money. It's because of, and this is the quote, culture. According to CNN, this is from Andy Jassy, CEO. It's not really financially driven, not even really AI driven, not right now. It's culture. Amazon added head count. Of course, in recent years, a lot of companies did. Jassy says, quote, you end up with a lot more people than what you had before.
[00:26:52] He's so folksy, that Andy Jassy. And you end up with a lot more layers, sometimes without realizing that you can weaken the ownership of the people you have who are doing the actual work. Okay. I know it's not AI, although, you know, a lot of tech companies are kind of claiming, oh yeah, we're replacing people with AI. I think the truth is maybe a little closer to this: AI is very, very expensive.
[00:27:19] And our burn rate is insane buying all those NVIDIA GPUs. And how are we going to fund that? What if we fire 30,000 people? Would that help? Does that make sense? Yes. If you look at the quarterly results, yes. Thank you for that. Affirmative. I was like, are you waiting for your comment? Sure. Yeah. Just jump right in anytime.
[00:27:46] I will talk until somebody else says something. That's an old radio habit. We don't do dead air on this show. No dead air. Go ahead, Stacey. You know, I'm old. I've been through a couple of downturns, and what we're seeing is everybody prepping for things to get tight. And I will say, sure, it's easy to be like, oh, it's AI, but they're also slowing down on their hiring front. If you can get rid of some expensive workers, great.
[00:28:17] Um, so I think this is part of the beginning of a hunker-down kind of mentality. And I think we're seeing it across the board. I mean, we're seeing it in other firms, not just Amazon. Microsoft did it last quarter after a record profitable quarter. They fired a bunch of people, which looked kind of unseemly. Alphabet, which announced its quarterly results this week: unbelievable revenue, over a hundred billion dollars in revenue for one quarter.
[00:28:45] That's growth in the cloud of 34%. Their profits are phenomenal. Amazon's very similar. I mean, the tech sector is doing pretty well right now. Yeah. But we're seeing it elsewhere. UPS announced layoffs, Target announced layoffs right ahead of Christmas. It's not good news. Yeah. And that's the other thing.
[00:29:13] And, you know, when we talk about this stuff, sometimes I forget to say: this is terrible for the people who get laid off. That's 30,000 people who will have a terrible Christmas. It's only 14,000 so far. So far. Right. I mean, it's still a lot. Only 14,000. Being in Seattle is pretty grim right now. I bet it is. I bet it is.
[00:29:35] I bet it is. Net income technically held steady, but that's with them taking two charges. Right. They had an FTC settlement, and they took a severance charge as part of this. Right. 1.8 billion. So, those two things aside, quarterly income for AWS was up to 21 billion from 14 billion.
[00:30:05] But what's interesting, free cash flow is down to 14 billion, and that's down from 47 billion, trailing 12 months. So that is a huge difference. And there you go. That's driven by a year-over-year increase of 50 billion for property. So they have been massively investing in AWS. And that has taken up a ton of their free cash flow. That doesn't hit their net income because they amortize that, right?
[00:30:32] That's CapEx that gets amortized over, you know, the depreciation schedule of that stuff. So I think it might be, they are reading the tea leaves, like Stacey said, on a downturn. Amazon has more data than probably any other organization, other than maybe JPMorgan Chase, on the position of our economy, right? Like they see all consumer sentiment. They see all business sentiment via AWS.
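The amortization point can be shown with toy numbers (illustrative only, not Amazon's actual figures): the cash leaves in year one, while straight-line depreciation spreads the expense over the asset's useful life, so net income barely notices.

```python
def year_one_views(capex, useful_life_years, operating_cash):
    """Contrast the cash view and the accounting view of the same
    build-out. CapEx leaves as cash the year it is spent, but is
    expensed (depreciated) evenly over the asset's useful life."""
    depreciation = capex / useful_life_years
    free_cash_flow = operating_cash - capex            # cash walks out now
    net_income_effect = operating_cash - depreciation  # expense is spread
    return free_cash_flow, net_income_effect

# Hypothetical: 50 (billion) of chips and datacenters, 5-year life,
# 60 of operating cash. Income barely moves; free cash flow craters.
fcf, income = year_one_views(50.0, 5, 60.0)
print(fcf, income)  # 10.0 50.0
```

In this toy version, the income statement sees only 10 a year of expense while the cash flow statement eats all 50 up front, which is the gap the panel is pointing at.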
[00:31:01] They're more exposed than any other organization to both the business world and the consumer world, right? And so if there's a downturn and they have been spending $50 billion, $60 billion of their free cash flow just on increases in CapEx, then it's not a crazy thing for them to start to trim early. What about Meta? It would be really interesting to look at, actually. So Azure, Microsoft, Amazon, and.
[00:31:30] Google. Google. Sorry. It was like the other one. The big three cloud data. Because they have the CapEx. It's a really good point, especially with the costs of NVIDIA chips, looking at that versus the companies that are buying from them. Because that was the beauty of the cloud: it switched all of your original CapEx to OpEx. So I wonder if there should be a new classification for some of these companies, just for this particular element. I like that subtlety. And you know who's doing great? NVIDIA crossed the $4 trillion.
[00:32:00] Yeah. Valuation mark this week. Yeah. Does that mean that $50 billion charge goes straight to NVIDIA? It's all NVIDIA. I'm sure there's some Amazon. Amazon. Amazon. Well, everybody's working as fast as they can, Amazon, Google, and everybody else, to create chips, because they don't want to spend all this money with NVIDIA. But NVIDIA is in the catbird seat right now. Yeah. Although Amazon does talk about it; their number one bullet in their highlights is continued strong adoption of their custom AI chips.
[00:32:29] So they are talking very aggressively about how they're looking. Samsung, everybody, everybody's working hard. Yeah. Google's got the TPUs, so they don't have to worry quite so much, but. Right. And they highlight that Anthropic trained a lot of their new models on Amazon's custom chips. So a lot of that was not NVIDIA. That is one of the issues, isn't it? NVIDIA has a proprietary system, CUDA, which no one else can duplicate because it's proprietary. And a lot of this stuff is based on CUDA, right? It needs CUDA.
[00:33:00] No, I mean, you can build on other libraries. I mean, yes, for consumers it's a little bit of lock-in, but if you're Anthropic, that's not a lock-in. Okay. If you're big enough, you don't care. Yes. Yeah. Now, Meta. I think Mark Zuckerberg lost $29 billion or something last week, because Meta stock tanked because they had a $16 billion income tax.
[00:33:30] Charge. But I think the market's also a little jumpy because Mark's been spending like crazy on AI, paying as much as a billion dollars for talent. Well, how many times can Mark be wrong and/or late and acquire his way to success? And I'm not being. That's a good question. That's what the market's asking, isn't it? There was a great story in the Atlantic that I really liked that sort of
[00:33:59] zoomed out and said, how is the AI bubble going to crash? Charlie. Oh, Charlie Warzel. Yeah. And Matteo Wong, I think, was the other writer on the story. Yeah. Matteo Wong. Here's How the AI Crash Happens. Yeah. And it explains it reasonably simply, but reminded me that nothing financial in the United States is ever simple. Money is moved around. It's sort of what you all were saying, you know,
[00:34:28] what's amortized and whatever. But so much of this money right now is going into real estate and building out these huge server farms for AI. So when you are acquiring real estate, the Atlantic article poses, they're not really interested in taking on debt to get that land and to build. So instead of taking traditional loans, they're looking to
[00:34:57] essentially package up the money they need and get it from. Oh, this is the securitization of the data center loans. Yes. The power contracts, the data center contracts by the tech companies. Right. So they're going to private investment. Am I too simple on this? Everything is sort of being packaged and sold in ways that make it harder to trace, harder to figure out who owes money to whom,
[00:35:27] how much debt they actually have. And there's a lot of this circular stuff, where NVIDIA is giving a billion dollars to a company so they'll buy a billion dollars' worth of NVIDIA chips, things like this. It's very confusing. Well, okay, so there's a bunch of stuff at play there. That's the OpEx versus CapEx, right? No, no. Okay, I'm going to shut up now and let the smart people talk. Well, so securitization, which is what this, is this the securitization article? Okay.
[00:35:57] that is when you do a deal, like an extended financial deal, and then you slice it up into little slices and then. Oh, like Red Lobster. It is exactly what caused the mortgage bubble in 2008. Oh dear. Oh dear. As a former bond reporter, this was my jam. Oh dear. So, okay. So what's happening here is what you're saying. This is a bubble. Well, it's not transparent.
[00:36:25] So the issue is, if you sum up the whole thing, there's a lot of insider dealing that people don't necessarily understand. And it's not transparent, either to the media, to people watching, to investors, or even within the securities and banking world. So there's a lot of, I would call it, financial chicanery or shenanigans. Yeah. But it is actually legal. It's not illegal chicanery. It's legal chicanery. I'm like, so I don't know if that's actually,
[00:36:55] if I were writing an article, I don't know if I could use those words. Wong and Warzel say the U.S. is a burgeoning AI state, and in particular an NVIDIA state. Number keeps going up, which is buoying the markets in the short term. That's good, but it's precarious. And I think there are a lot of people, I, you know, I'm at retirement age. My IRA is heavily in US equities.
[00:37:22] Everybody's a little worried about a bubble or a crash. But we just spent half an hour talking about how valuable AI is and how transformative it is to businesses and coding and security and so many areas. In fact, the chairman of the Fed said this isn't exactly a bubble, because there's value being created. I mean, I would ask, where is the value? Yeah. So where is it right now? Yeah. Yeah.
[00:37:52] We're throwing out AI. We're hyping it up to be this economic powerhouse, which I think it absolutely could be, eventually. So it's not like a tulip bubble, where there's no underlying value except for, like, flowers are good. It's more like a railway, or the very early days of, I don't know, oil. That's what Jeff Bezos said. It's not a financial bubble. It's an industrial bubble. So what happens with industrial bubbles, like railways or the dot-com bubble, is you get infrastructure.
[00:38:21] Which survives the bankruptcy of a number of companies. And the infrastructure is of value. Except we have fundamentally changed our definition. We've changed how long our infrastructure can last, because this infrastructure is built on a chip that has an 18-month life, right? If you're really lucky. So we have to think about, like, if you're doing your financials and you're thinking,
[00:38:48] I'm going to amortize this over 18 months, versus I'm going to amortize it over, computers have, what is it, a depreciation schedule of like five years? Yeah. Well, in reality it's probably like that, but yeah. Yeah. So I don't know where we are yet on our accounting for this. So we're basically saying, it's like figuring out what we can do with oil and gas, or even electricity, right? Like, we've got to put all this infrastructure in place,
[00:39:17] but the infrastructure is not as long-lived. So we have to have different financial tables, and account for it and value it differently. We aren't. So what you're saying is, to paraphrase Terminator 2, we are in uncharted territory, making up history as we go along. Is that what you're saying? It's charted. We're just using the wrong depreciation schedule. Our maps don't match the landmass that we are actually traveling. Is that what you're saying? No.
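The schedule argument above can be made concrete with a straight-line sketch. The cluster cost is a made-up number; the point is just how much the assumed useful life changes the annual expense:

```python
# Straight-line depreciation under two assumed useful lives, showing why
# the schedule chosen for AI hardware matters so much. Figures are
# illustrative, not any company's actual accounting.

def annual_depreciation(cost: float, useful_life_years: float) -> float:
    """Straight-line: the same slice of the cost is expensed each year."""
    return cost / useful_life_years

gpu_cluster_cost = 10.0  # $B, hypothetical

short = annual_depreciation(gpu_cluster_cost, 1.5)  # 18-month chip life
long = annual_depreciation(gpu_cluster_cost, 5.0)   # classic 5-year schedule

print(f"18-month schedule: ${short:.2f}B per year")
print(f"5-year schedule:   ${long:.2f}B per year")
```

With these numbers, the 18-month assumption expenses roughly $6.67B a year against the 5-year schedule's $2B, more than a threefold difference in reported annual cost for the same hardware.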
[00:39:46] Am I not doing a good job here? Jerome Powell, chairman of the Fed, says, unlike the dot-com boom, AI spending isn't a bubble. I mean, his proof of this is, I can't name names, but some of these companies have earnings. That's. He did this in. So, okay. Think about Akamai in 1999. Think about building out the telecommunications infrastructure for the underlying internet, way back in the late nineties, right before the dot-com crash.
[00:40:16] And all of that eventually became super valuable, right? We were not wrong. There's a lot of dark fiber. It was going to be valuable. Yeah. But one of the reasons it became so valuable is because the financials, the bubble aspects of it, became untenable, and we got bankruptcy for a lot of these companies, and it wiped out all of their debt. Right now we have a different financial structure. This isn't necessarily debt-funded. This is venture capital-funded, which is a little different, but. Anyway,
[00:40:46] we're basically spending money too fast on this, we're putting too much money too quickly into it, or we're not accounting for the actual how. Never mind. Let me quote Wong and Warzel from that Atlantic article that Jill gave us. If you really want dystopia, listen to the AI crowd talk enough. You'll get a sense we may be on the cusp of an infrastructure boom.
[00:41:15] And yet something strange is happening to the economy. Even as tech stocks have skyrocketed since 2022, the companies' share of net profits from S&P 500 companies has hardly budged. Job openings have fallen. Despite a roaring stock market, 22 states are in or near a recession. And despite data centers propping up the construction industry, U.S. manufacturing is in decline.
[00:41:40] They say AI is obscuring the wobbling American economy. Yeah. AI can have value. It's just not enough. Part of the problem here is there's a handful of AI skeptics for whom their skepticism has become their entire personality and their entire brand. And we know who they are. Yes. Yeah. And I'm not going to say the name, because they have, like, followers. We know who they are because we've had them on our shows. Okay. Yeah. Some of these people's followers will send you death threats and stuff. Like,
[00:42:10] yeah, no, they're very, yeah. Yeah. And so they make it hard to talk about this, because if you're not fully in their camp, you're a mush. And then they also make any kind of AI skepticism seem like you're a Luddite. Right. Right. And from my perspective, AI is incredibly valuable. I use it every day in a variety of ways. And I think these companies will make a bunch of revenue, but there's a bunch of ways this can still be a bubble. One, if the real economy is hollowing out, right?
[00:42:39] The way the AI companies will make money is from the real economy. Yeah. They're paying them money to do things. Right. So if State Farm and Walmart and, you know, every consumer products company, the Procter & Gambles and Boeing, and all of these folks are losing money, then they will not have as much economic flow. They will not have as much free cash flow to shave money off the top and send it to the AI companies. Right. Right. So,
[00:43:07] if we end up, because of a confluence of issues, including tariffs, the business cycle, a bunch of other things, hollowing out the real economy, then we can be in a bubble no matter what's going on with the AI companies. Second, even if things were going great, you can have this bizarre financialization that Stacey's talking about. I saw this personally: a friend of mine got in on one of these deals and then offered for me to invest in a data center. And I didn't do it, because in the end, you know,
[00:43:35] I do some seed investing in companies every once in a while. I know what I'm buying. I'm buying stock in a company, or I'm getting a SAFE, which is like debt that can turn into stock. I had no idea what I was buying, what my money was being turned into. It's like Bitcoin. It's like NFTs. At least with Bitcoin, I know I'm getting some kind of BS. You're getting the promise of a payment from a bank that is getting a payment, or a chunk of a payment, from the data center company. But yes,
[00:44:06] it's right. But it was, like, repackaged. Yeah. And it was, like, a contract with an entity, with another entity, and there's, like, five entities with made-up names. And you're like, this is so spectacularly sketchy, I'm going nowhere near this. Right. And so it screamed financial bubble to me. Right. Like, and so everything could be going great. Even if the economy was going great, it could be insane that you have this crazy financialization, where you have the AIG mortgage-era stuff going on. Now, like Stacey said, we're not talking about banks here.
[00:44:36] We're talking about hedge funds and VCs and a bunch of, who cares if they go belly up? Right. Yeah. But it's still, who knows? That's what people said about AIG, right, until you needed them to start selling all their debt. And then I think there's a third way this could be an interesting bubble, which is, with the railroads, we laid all this infrastructure, and there was no situation in which, all of a sudden,
[00:45:06] the railroad tracks themselves were not going to be useful, or the engines. When all this railroad track was being laid in the mid-19th century, we were not going to have diesel electrics invented, all of a sudden, in like 1865 or 1870. Right. But we're in these revolutions. Stacey talked about 18-month chips. I mean, I think, you know, the depreciation on these things, you can use them longer and stuff, but all this money is going, for example, I just gave this talk.
[00:45:36] I gave a talk at Purcell, and I gave a different talk at InfoSec World, but in both cases, I talked about, Gary Marcus talks about this a lot: if you think about what AI is, AI is a field of computer science about making computers act like humans. Within that you have machine learning, right, which is getting computers to learn from data. And within that you have reinforcement learning. And within that you have LLMs. And that is,
[00:46:00] LLMs are like a tiny little bubble in these concentric circles, of which AI is the huge outer circle. And all the money is going to LLMs, right, because it's the first thing that people are like, oh, this is actually useful. And so Marcus has said that's a mistake, because we're spending so much on LLMs, we're not paying attention to other systems. They're so fundamentally limited. We just like them because they look like us. That is all it is. And so there's all this cool other research, right?
[00:46:28] Like, people are building models that are much smaller and much better at solving real, interesting problems. They can't write poetry, but they solve useful things. And so I think we also could have technological breakthroughs that either make LLMs much more efficient, and therefore massively change the economics of all this, or that replace LLMs for certain use cases and then invalidate a bunch of this spend. I mean,
[00:46:55] there's all kinds of stuff that's unpredictable that could happen in the next five years, and that could blow up parts of this. So that wouldn't be an overall bubble, but it could create micro bubbles, or partial bubbles, and cause all kinds of churn. And overall it'd be great, because we'd have a new efficiency, but it could absolutely kill one of these companies that looks absolutely dominant right now. In the long run it would be great; in the short term it would take out some of them. And so I think it's,
[00:47:24] it's crazy. But also, all of us have decided it was good to put all of our money in the S&P. But the way that the S&P 500 allows these companies to expand, I think somebody at Standard & Poor's needs to get slapped. Because Richard Campbell told me there is an S&P 500 minus the Magnificent Seven. There's an ETF that has, it's like the 493. And I guess that might be a good investment.
[00:47:54] I don't know. I'm not an investment expert, and I don't know what to do, because yes, AI is highly overvalued, even in something supposedly as diversified as the S&P 500. If you're going to call it an index and you're going to say there's 500 companies, then you should not allow seven of the companies to be 50% of the 500, whatever. Right. I think this is the crazy thing: as the companies get bigger, this index investing has gone too far. Right.
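The concentration complaint can be illustrated with a toy cap-weighted index. The names and market caps below are invented for illustration; the real S&P 500 weights constituents by float-adjusted market cap:

```python
# Toy cap-weighted "index": a handful of giant constituents can dominate
# hundreds of smaller ones. All names and values here are hypothetical.

market_caps = {"MegaA": 4.0, "MegaB": 3.5, "MegaC": 3.0}  # $T, made up

# 493 smaller companies at roughly $0.02T each, also made up.
for i in range(493):
    market_caps[f"Small{i}"] = 0.02

total = sum(market_caps.values())
giants = sum(v for k, v in market_caps.items() if k.startswith("Mega"))

# Three companies end up as roughly half the whole index by weight.
print(f"Top 3 share of the index: {giants / total:.0%}")
```

In this sketch, three of 496 constituents carry about half the index weight, which is the mechanic behind both the "seven companies are half the 500" complaint and the appeal of an equal-weight or ex-Magnificent-Seven fund.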
[00:48:23] Like, we need to. Don't tell me that. That's the only way I can invest, because I can't buy individual stocks. Yeah. I'm just waiting for somebody to start using the term too big to fail about all of these seven tech companies. Because, I mean, they're all in the administration, right? Like, why do you think they've donated so much money, so cozy with the president? They need not just laws and legislation on their side, but
[00:48:51] I have a feeling that they may, in the near future, need some help making regulation problems go away. More investment. Are they too big to fail in that respect? Can they be bailed out like the banks? Yeah. Well, that's kind of what I mean. If you have 56% of the S&P riding on these seven companies, how can you let any one of them fail? Well, thank goodness the president is a businessman. Those.
[00:49:21] Oh, never mind. He knows the art of the deal. He knows the art of the deal. I'll grant you that. There is an argument to be made that they're too big to fail on the actual infrastructure front, as we saw with all these network operations centers that are being built. This is. Although, does anyone else remember when AWS US-East-1. Yeah. That used to go down all the time. I mean,
[00:49:51] I remember when Adrian Cockcroft was doing his, you know, Chaos Monkey stuff, and that's why that came around. So I actually think they're really amazing now, but it is too big to fail from that perspective. Yeah. It's amazing how many people were dependent on AWS East. That was a big deal. All the zones going down at the same time was a big deal. Yeah. We learned a lot. That should not have happened. Yeah. Yeah. But financially, too. I mean, Leo, like you were saying, you know, your,
[00:50:19] your retirement accounts are completely invested in these companies. I'm terrified. If we have this massive number of baby boomers who are starting to collect on their retirement, and that money collapses, that value collapses, is the government going to allow that to happen? Is there going to be some rectification there? Yeah. Because we baby boomers vote, don't forget. They can sell their houses. They'll be fine. Oh, that's right. All their houses, all our houses.
[00:50:48] We just sell them. Yeah, no, it's a scary time. Because if you look at the S&P 500 for the last 10 years, boy, that was a good thing to buy into. You know, in the financial news world, everything I keep hearing is the K-shaped economy, K-shaped economy. Which is, you know, for the people who are doing well, it's going up, up, up; for the people who are not doing well, it's going down. Right. Is that just not the story of America? Yeah. Yeah.
[00:51:16] And income inequality is really on a fast track. You're going to sell your houses, but to whom? Who's going to buy houses? Well, that's the point. You know, if you fire everybody to make money for AI, who's going to buy the AI you're making? Who's going to buy your stuff? Yeah. The other story, I mean, not to get too much into financial news and the economy, but the other one I keep hearing about is the birth rate, right? So the birth rate is low. People are not having enough children. Big, big problem in South Korea.
[00:51:46] But it's starting to be talked about again. And China. In China. Yeah. And, but in the United States now. So the replacement birth rate for women is a little over two children, but once it goes under two, that's. Well, it's already a little too low. So it's a little too low. For the economy, right? Like, it's not a problem for anything but the economy. It's not necessarily a bad thing for the environment,
[00:52:14] for keeping the human race alive. Like, it's fine for most things. It's a problem for the economy, and it's because we want our economy to keep growing. And that is just a fundamental problem that we're going to have to deal with at some point. It is. We need to make more customers. Get to work. Our standard of living, in addition to the economy. Yeah. So I'm not, by any stretch of the imagination, arguing that we all should be. So you have one kid. I guess I am not contributing.
[00:52:45] Jill, how many kids do you have? Zero. Okay. Alex, how many kids do you have? Two. Three? I have two, but we have three wives. So the five of us only made three kids. So we're negative. This whole panel, even with Alex's overproduction. My wife and I did our part. She did the hard part, obviously. You did the hard part. All right. I want to take a break on that. No, we will talk about, well, there's many other things to talk about.
[00:53:14] And we have such an excellent panel. No more financial talk, because, I don't know, we're not a financial podcast, but this is kind of what tech has wrought, in a big way. Have your ads started? Are you going to start selling food and survivalist stuff? Gold. Yeah. Gold, baby. Gold. Gold. It'll turn into the same ads as, yeah, AM radio, talk radio. Yep. Gold, baby.
[00:53:46] All right. Let's take a break. We do have a fabulous panel and lots of good things to talk about. We'll get to those. The president actually is in Asia right now. He's apparently made a deal for TikTok. No one knows what it is, but we'll find out maybe. We'll get to that in just a little bit. And the FCC has voted to make it easier for your ISP to rip you off. Isn't that good news? All that and more. Still to come. Alex Stamos is here.
[00:54:13] Always a pleasure to have him, from Corridor.dev, our security guru. Stacey Higginbotham is here. We'll talk about the Neato vacuum cleaners in just a bit. Policy fellow at Consumer Reports. And Jill Duffy is also here. We could talk about EVs. What's your beat these days? You're doing PCMag and Wired. Yeah. I'm still writing about organization a little bit. I had some fun stuff recently about death. Death and organization. Yes.
[00:54:43] I want to talk about death. Swedish death cleaning. I believe you brought this up. Okay. We'll get to it. It's not as bad as it sounds. I've actually embarked on a little Swedish death cleaning myself, because I am almost 70. I mean, it's time, right? To start. I don't want to dump all this on my kids. Anyway, we'll get to that. Uplifting topic. Just a moment. This is This Week in Tech. I'm glad you're here anyway. I hope you feel the same way.
[00:55:12] Our show today brought to you by Zscaler, the world's largest cloud security platform. They could not be in a more timely position. As you can tell, AI is changing the world. The rewards for enterprise are too big to ignore. The risks for enterprise are also humongous: the loss of sensitive data, and attacks against enterprise-managed AI. Generative AI increases the opportunities for threat actors.
[00:55:42] We were just talking about, I can't remember if it was before the show or during the show, but helping threat actors rapidly create perfect phishing emails, or write malicious code, or automate data extraction. I mean, these numbers are terrifying: 1.3 million instances of social security numbers leaked. And how were they leaked? Through AI applications, often by your employees who are using AI, right?
[00:56:11] ChatGPT and Microsoft Copilot alone saw nearly 3.2 million data violations. So there's all sorts of things to think about. This is the time to think about your organization's safe use of public and private AI, and perhaps to protect yourself against overachieving bad guys. Chad Pallet is acting CISO at BioIVT, and he says Zscaler helped them reduce their cyber premiums.
[00:56:40] I was just looking at our cyber premiums. What a depressing number that is. They got theirs down 50% and doubled their coverage, plus improved their controls. Take a look at this. We got a video from Chad. With Zscaler, as long as you've got internet, you're good to go. A big part of the reason that we moved to a consolidated solution away from SD-WAN and VPN is to eliminate that lateral opportunity that people had,
[00:57:07] and that opportunity for misdirection or open access to the network, it also was an opportunity for us to maintain and provide our remote users with a cafe-style environment. With Zscaler Zero Trust Plus AI, you can safely adopt Gen AI and Private AI to boost productivity across the business. Zscaler Zero Trust Architecture Plus AI helps you reduce the risks of AI-related data loss,
[00:57:36] protects against AI attacks, because, you know, zero trust is the way, and guarantees greater productivity and compliance. Learn more at zscaler.com/security. That's zscaler.com/security. We thank you so much, Zscaler, for supporting This Week in Tech. I have to say, AI has been very, very good to us, advertising-wise.
[00:58:04] There's no question about that. So maybe we're the baby Magnificent Seven, the beneficiaries of all this. The U.S. government has voted to tighten restrictions on Chinese tech companies deemed threats. This is from the FCC. Including, they're getting close to banning the number one, according to Wirecutter,
[00:58:34] the number one router. And I know there are millions of TP-Link routers in the US. Yeah, they've got like a 52% market share or something. 52%. For years, Wirecutter said, you know, the TP-Link routers are the best. The Washington Post says a number of U.S. government agencies are backing a move by the Commerce Department to fully ban TP-Link routers. Now, we've done this, and I think understandably, with things like Huawei networking here and the infrastructure gear.
[00:59:04] Should we be worried about TP-Link routers? Alex, what's your take on this? So I think this is a hard one to justify for a ban. I mean, TP-Link has a really bad security history. They have lots of bugs, and they get exploited a lot. But so does MikroTik. So does, I mean, so many. Consumer routers are awful. Consumer routers are awful. You know, this is why I usually push
[00:59:34] friends and family to Eeros and other ones that are made by big companies. Eero is Amazon-owned; Amazon makes Eero, and they're cloud-connected, and they auto-update, right? That's the thing, keeping it updated. That's the key, isn't it? Yeah, yeah. Because if there is a flaw, you want it to be owned by a company that cares enough to fix it, and then you need the mechanism to do that without you checking. Because no one checks their router firmware. Is it up to date? I mean,
[01:00:02] normal people will never log in again, right? They'll set it up once, and they'll never touch it. I like the Eeros too, because then I can manage them remotely, right, over the cloud. I set my mom up with Eero. Yeah. For that reason, exactly. But they're expensive, right? Yeah, they're very pricey. I use Ubiquiti here. Is Ubiquiti okay? I love Ubiquiti. That's what I have at home. But I mean, that is prosumer slash SMB gear. Talking about expensive, yeah. And they're great if you have Cat6 through your house,
[01:00:33] or in your office. Right. But their mesh stuff isn't that great, right? Like, so the Eero mesh works pretty well. It's the best, isn't it? Do you think it's the best? Or is Orbi? Orbi is another one that people seem to like. I think Orbi, I mean, I've heard Orbi's fine; I just haven't used it myself. Some people were saying Asus. I like Asus stuff, right? But I mean, they're famous for getting owned. They run a version of DD-WRT, which is an open-source router firmware. So you're. But again, if you're not updating it, anybody can have a flaw. But like,
[01:01:03] unless, I have not seen evidence. I mean, there's a difference between a TP-Link and a Huawei, right? Like, the Huawei stuff is going into American core networks. It's going into, you know, the Verizons and the T-Mobiles and stuff like that. And, you know, we are coming out of the largest mass hacking incident, which we stopped talking about, right? Which was the Salt Typhoon campaign. Oh my God.
[01:01:32] Against the U.S. telecom industry. And they're still in there, right? We can't get rid of them. It is widely discussed that they are still in there. There was supposed to be an investigation by the CSRB, the Cyber Safety Review Board, which was unfortunately disbanded in the first month of the Trump administration. Because who needs security? Completely nuked the CSRB. And it was never restaffed.
[01:01:59] And my understanding is the report they were going to put out was scathing. The one they did on Microsoft was extremely enlightening, and it would have been great to get this one. I saw a briefing from a CISO of a telecom company that is not an American telecom company. This was a closed-door briefing, so I can't really say who it was, but it was shocking. Well, the telecoms in the U.S. say, yeah, we could get rid of it.
[01:02:25] We'd have to swap out all our equipment, and the entire telecom infrastructure would be down for a few days. Is that okay? It's money. Basically, they never patched these devices. They never replaced devices, to a certain extent. And is SS7 still a problem? SS7 is definitely a problem. That is not the core problem here; a lot of the attacks here are IP-level attacks. But SS7 is definitely a problem and causes a lot of the, you know,
[01:02:54] every one of our mobile devices has an SS7 stack on it, right? So when you send an SMS message, that actually has SS7 framing. SS7 was never meant to face untrusted devices. It was hacked years ago, I mean, more than a decade ago, right? Yeah. It's a problem. And SS7 causes all kinds of issues. That's why foreign adversaries can track Americans around the world, through SS7 signaling attacks and such. There's this home location register attack that actually some colleagues of mine
[01:03:25] demoed at Black Hat, I think in like 2006 or 2007, that still hasn't been fixed. So, yeah, I mean, there are all kinds of layers of problems here. The idea that TP-Link is at the top of the list is wild to me, in a high-inflation environment where we're raising prices on consumers. I don't see this as the top of my list, unless there's some classified data out there that these really bad bugs in TP-Link are intentional. I think they're just crappy devices. We're about to ban DJI drones because they're made in China. Right.
[01:03:55] Yeah. And in the WaPo article, they talk about this being another bargaining chip for Trump. And look, I don't love TP-Link. I've been doing a lot of research on cybersecurity for consumer routers for CR; that's one of my side projects that one day you'll see out in the world. Yay. And TP-Link, like all consumer routers, is terrible. Or, not all, many are.
[01:04:23] Many do not have the features that we need. And there was actually an effort. Uh, Biden did a 2021 executive order that was like, Hey, it's the same one that did the trust mark. They were like, let's make a router standard for secure routers. And NIST actually did it. Very few router companies actually participated in the NIST. I remember that. Yes. There was going to be a, a seal that you could put on the box and all that. Well, they didn't, they didn't know what was going to become of that, but, but yes,
[01:04:52] NIST actually came up with a secure router framework. And one of the top things, like Alex said, was you've got to be able to update it over the air and people, it should be opt in and you can opt out if you like, feel like you're a super expert and want to, there's a bunch of other recommendations in there about like remote access, what port should be open. My favorite thing. You have to declare the end of life for the router when you're planning to stop supporting it because thank you, does it happen? Thank you.
[01:05:22] So, yeah. So I, on the TP link front, we test a lot of routers at CR. We see a lot of the same stupid bugs. I'm just going to call them stupid bugs because they are, there are things like your SSID and password going over the network in plain context. We see this across like multiple brands and I'm like, why is this still happening in the year of our Lord, 2025? And we tell consumers for this front, look, don't panic.
[01:05:52] If you've got a TP-Link router, just... for your next router, maybe look at some other options. I know they're expensive, but think about... They're all made in China, though, I mean, right? Everything's made in China. They're all made with basically the same underlying chips. A lot of them use the same reference designs. I mean, this is not like... But the Post quotes Jeff Seidman, a spokesman for TP-Link, saying it's nonsensical to suggest that any measure
[01:06:20] taken against the company could serve as a bargaining chip in U.S.-China talks, as the administration suggests. Any adverse action against TP-Link would have no impact on China, but it would harm an American company. If that's true, I'm sure China knows that, you know. Well, so, I mean, his brother's still in China. The person who owns the company, I mean... But I don't know, there are people... It's split, but it's not fully split. Or they say they're fully split. They fully split, but his family is in China.
[01:06:50] Okay. I mean, it seems like a pretty tenuous... Well, the FCC has done some really... The FCC itself has been very focused on China, like much of the administration. So they've done a lot with their "bad labs" effort. They're trying to make... You know how everything electronic gets tested by the FCC for interference? Yeah. So they're trying to take all of that testing capacity outside of China. That's their bad labs effort.
[01:07:17] So they don't want Chinese-owned companies, or labs that are in China, testing this stuff. The reason that stuff's in China is because it's all built there, and you don't want to ship your boards back and forth across the ocean. And a lot of the Cyber Trust Mark right now is actually up in the air, because the FCC is investigating the fact that UL has Chinese testing relationships. I'm disappointed because, as you know, every time I buy a drone, I crash it.
[01:07:48] But the new DJI drone has LIDAR built in. And it may be banned as well, because, guess what? Made in China, Chinese company. And I guess if I have a Chinese drone, it could be sending pictures of my house or something. If we go down this route, we're not going to have anything. It hoses consumers so much. Well, it's stupid too, because the Chinese are way ahead of us in drone technology, right? Like, yes,
[01:08:15] the irony here is that China is begging for American tech, right? In these negotiations, what they're trying to do is get the barriers for American exports lowered. They need chips. Yes. There's this basic arrogance, the idea that the U.S. is better at everything tech. And that is just not true. And drones are one of those things. So getting American kids to play with drones, taking DJI drones apart,
[01:08:41] and then going to college and then going and working at Anduril or whatever the next generation of American drone companies are, and then competing with DJI to build, unfortunately, the weapons that World War III will be fought with... that is way smarter. Like, this is as dumb as China boycotting NVIDIA, right? If they did that, we'd say that is ridiculous. It is as ridiculous as us boycotting DJI right now. Right.
[01:09:10] So just to get back to this, because I see in our chat room a lot of our listeners have TP-Link routers, some of them still in the box. Should... I'll send all those listeners a link they can click. And just, what's your advice? Make sure you check the firmware regularly? Is that it? Update the firmware, change the default password. Right. Yeah. The usual. That's the other thing, because a bunch of these have CSRF attacks against the admin interface. Right.
[01:09:38] Also, turn on WPA3 encryption, and turn off WAN administration. Most of them should have nothing open on the WAN, but yes, if you have WAN SSH or WAN admin enabled, turn that off. Admin off, for sure. Yeah. The usual. But you should do that with any router you put on your network. You know, if you know what you're doing, you can nmap yourself from the public internet, or you can look yourself up, see what happens. Yeah. Yeah. I mean, you know,
[01:10:06] after you have it set up for a day or two, you can. There's some advice: look yourself up on Shodan. You know, I'm scared to do that. Where you just put your IP address into Shodan and see what happens. Yeah. Oh God. Your public IP address. Don't put 192.168.1.1 in. No, right. But, I mean, or, you know, go to a coffee shop, go to a place where you have free Wi-Fi, see if you can log in, and then nmap yourself, if you know what you're doing. I mean, people on the TWiT Discord,
[01:10:37] they know what they're doing. brew install nmap. Yes. Yeah. brew install nmap, ladies and gentlemen, there's your tip for the week. And if your router is more than five years old, look it up: put the model number in, Google it with "end of life," and see if it's actually still supported. Were you shocked that Neato vacuum cleaners are no longer going to work? Because the Neato folks...
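[Editor's note: nmap is the standard tool for the self-scan the panel describes. As a rough illustration of the same idea, here is a minimal Python sketch using only the standard library; the 192.168.1.1 address and the port list are placeholders, not the hosts' exact advice. Substitute your own router's address, and only scan devices you own.]

```python
import socket

def check_ports(host, ports, timeout=0.5):
    """Return the subset of TCP ports on host that accept a connection."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Ports that generally should NOT be reachable on a home router's WAN side:
# 22 (SSH), 23 (telnet), 80/443 (admin UI), 7547 (TR-069 remote management)
risky_ports = [22, 23, 80, 443, 7547]
print(check_ports("192.168.1.1", risky_ports))  # placeholder: your router's address
```

An empty list is what you want to see when probing from the WAN side; anything listed is a service worth turning off in the router's admin settings.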
[01:11:06] So they announced that in 2023. What they announced at the time was that they were going to try to do the cloud updates for five years. Yeah. And keep inventory for... They went out of business in 2023, although they're owned by the same company that makes the thing I bought that you told me to buy. Thermomix. My Thermomix. When I saw that Vorwerk owns Neato, I thought, is my Thermomix going to stop working in a year or two? I mean, maybe.
[01:11:36] It was expensive. Yeah. I love it. I'm curious, where did you buy your Thermomix? Oh, I don't know. Why do you have a Thermomix? No, no, no. But for the longest time they were not available in the U.S. Oh, are they in the U.S. now? Yeah. They came to the U.S. like three or four years ago. So you know about the Thermomix. Oh yeah. Oh yeah. Makes the best mashed potatoes, the best risotto,
[01:12:04] the best tomato soup I ever had. All of them. That's all I ever made. You could get them in Europe, but then it would have a 220 plug, right? And then I think they finally started selling them in Canada, but they wouldn't ship them over, so you physically had to go to Canada if you wanted to buy a 110 version. You can now buy them in the U.S. That's exciting. For $1,699. Yeah. This will give you some idea of this device: if you've ever watched Below Deck,
[01:12:34] which is the Bravo reality show about living and working on a yacht, all the super yacht kitchens have Thermomixes in them. Yeah. Should we tell people what it is? Yeah. I've never figured out a way to... It's a blender that heats up. It heats and cools, right? So you can make soup, you can make ice cream, all in one unit. Wait a minute. Right. Is that right? Can you make ice cream in a Thermomix? I don't think it cools. It doesn't cool. What does it... Stacey, does it cool?
[01:13:05] She's looking it up. Oh my God, I have to stop the show right now. I might be wrong, but it definitely has a heating function. Right. Definitely heats. That's why it's good for soup. Anyway, I don't know... It can steam. It doesn't say cooling. Yeah. You probably... I know what you'd have to do. You would need like an external... You would need to run a Freon tube outside. You can make your custard for your ice cream in the Thermomix, but then you still have to put it in the fridge.
[01:13:35] Yeah. Yeah. You probably have one of those slushy machines that's all the rage right now. The Ninja, or the really fancy Italian one? Yeah. The Ninja. No. Well, I don't know anything about that. I only know about Ninjas. This looks like the closest thing to the machine in Back to the Future, right? Like, yeah, look at this. It comes in purple. It is. It's like the Mr. Fusion. Just throw your garbage in it. And no, no, no, I meant like one of their cooking ones, where it's like the old...
[01:14:05] Oh yeah, totally. Right. Where it's the family... It's all messed up. Yeah. They put something in, it just comes out. Yeah. Larry in our Discord says we have the Ninja Creami to make ice cream. This seems like such a bad idea. I don't want anything that can make ice cream in the house. I have enough problems as it is. Yeah. Now I want to buy one of these things and just run it through its paces. Sure. Oh, here's a bundle. This will make slushies and ice cream. You get the Creami and the Slushi, two for one, baby.
[01:14:35] Okay, let me just tell you the best ice cream maker. I got one as a wedding present. It's the Whynter automatic compressor ice cream and yogurt maker. It is a heavy... That's a giant, difficult thing. Yeah. And you can't clean it, because you put all the ingredients in this stainless steel bowl that's built into the Whynter. Whynter is spelled W-H-Y, and that's my question.
[01:15:05] It really makes amazing... This looks like my giant vacuum sealer. It's huge. Oh my God. Because it actually does have a compressor built in. Yeah, no, it's amazing. An amazing machine. But if you have a Thermomix... Well, I might as well, right? I'm out of counter space, to be honest. But this thing... Oh my God. I'm just going to let y'all... It's like,
[01:15:33] we're not going to let the laws of thermodynamics get between us and ice cream on top of your counter. It makes the smallest amount of ice cream. I mean, it only makes a little tiny bowl. It does it in 30 minutes though. I mean, it is awesome. Can I make the custard in the Thermomix and then pour it into the Whynter? Yeah, you probably could. You could mostly automate it. All right. We're going to take a break now.
[01:16:03] 180 watts. Okay, I'm going to have to get a new fuse box put in, but other than that... I'm getting another 40-amp circuit for this thing. And the Thermomix. All right. I'm sorry, folks. I apologize. And TrustNoOne says, I can't believe I'm saying this, but $500 for both the Creami and the Slushi is not a bad deal. The Ninjas are very affordable, very affordable.
[01:16:34] We're having fun. This is This Week in Tech with the wonderful Jill Duffy. How do you schlep it? You move around a lot. Do you bring your Thermomix with you everywhere you go? I don't have a Thermomix. Oh. I actually do a lot of old-school stuff. I've been making homemade yogurt lately, but I do it just in glass jars. And I have a water distiller, because I can't drink the water out of the tap here, but the water distiller is warm, so I just use it for making yogurt. In glass, like Ball jars.
[01:17:04] I keep it just on top of the distiller, and then 12 hours later, I have yogurt. Aren't you thrifty? Very. Yeah. Old school, old school. Yeah. That's it. Now I'm hungry for yogurt. If you just got the Slushi, you could make fro-yo. I could do that. Yeah. But I'd need 220 here. That's a problem, isn't it? When you move around, and you move around a lot, you have to always have the plugs, the adapters. Yeah. So there's something called a step-down converter,
[01:17:33] which is a giant box, right? So that I can plug in... I usually have them provided in our housing here. Yeah. Yeah. Right. Yeah. We're going to have the house... You'd better put in a converter, an inverter, or whatever. Yeah. Yeah. It's great to have you, Jill. Thank you for being here. Contributor to Wired and PC Magazine. Stacey Higginbotham is also here. She's a policy fellow at Consumer Reports, working on that router stuff. I'm excited about that.
[01:18:01] Your colleague Paris Martineau has become a radioactive-shrimp-and-lead-in-your-protein-powder superstar. I loved the radioactive shrimp story. It answered so many questions. She figured out where the cesium was coming from. Pretty impressive. I love CR. I've been a customer, a subscriber, a member since 1980. My whole life. Yeah. It's the best. I was so happy when you went there. You give it a full pizza,
[01:18:31] full pizza, all eight slices. Alex Stamos. He's the CSO, the chief security officer, at corridor.dev. If you're using agentic AI to do vibe coding, you'd better go visit Corridor. You need some security. You can't just, you know, do what I did this morning and let Claude Code go crazy on your repositories. Well, it was kind of fun, I admit. Tickled a little bit. Great to have all three of you.
[01:19:00] Our show today brought to you by Miro. Oh, we love Miro. Micah and I used Miro for a long time as we were planning the show, because we were in different places. We did the Ask the Tech Guys show together, right, and Miro was such a help. Whether you're a startup, or you're planning a show, or you're working on a project together, the gap between idea and impact can kill your team's progress.
[01:19:28] And I think it's safe to say that just throwing AI at the problem, without the clarity, without the planning, doesn't help. That's where Miro's amazing. Yeah. It's powered by AI, but with Miro, teamwork that typically takes weeks gets done in days. So your team focuses on building the right things, with AI available in a contextual, easy-to-apply way, because the AI sees your planning. It's all there in Miro.
[01:19:58] Here's how Miro helps teams get great done. Uh, your second brain with Miro AI sidekicks. Okay. AI sidekicks are trained to think like product leaders, like agile coaches, like product marketers, to review your materials, to recommend areas, to double down on, or to clarify inputs, or to add direct feedback.
[01:20:22] You can build custom sidekicks yourself that integrate into other workflows for exactly what your team needs. Think of it as an extension of your team's capabilities. It's not a replacement. No, no, it's an extension. It's great. You can generate meaningful insights faster by eliminating the need to switch between tools with Miro insights. Miro AI sorts through everything you input, even if it's in different formats. That's the beauty of Miro. So you throw in your sticky notes, your research, your ideas,
[01:20:51] then Miro combines them into structured research summaries and product briefs. It can even do things like sentiment analysis. You can take 20 concepts and test them rapidly with Miro prototypes. You can generate instant, contextually embedded prototypes right from your board without needing to do it anywhere else. It's kind of miraculous.
[01:21:15] You'll iterate rapidly on near-limitless variations and then feel confident about the feasibility and visualization before you even get into the high-fidelity builds. It's nice to have that confidence that we're on the right track. Spend time on building, not digging for information. Miro doesn't replace the design tools your team loves; it works with them, aligning things before you need them. Okay.
[01:21:41] Miro's blueprints and spaces organize your team's work in an intuitive and easy-to-follow format. Help your teams get great done with Miro. Check out Miro.com to find out how. That's M-I-R-O dot com. We thank them so much for supporting This Week in Tech. Check it out: Miro.com. And if they ask you, I don't know if they do, but if they ask you, say, yeah, I heard it on TWiT. So, I mentioned this in passing before the last break. Let's get to it now.
[01:22:10] Now, the FCC voted, or is about to vote, to scrap... This is almost stunning, but I guess we shouldn't be surprised: cybersecurity requirements. Brendan Carr, who's the chair, voted against the rules when he was just a commissioner in January. He calls them ineffective and illegal.
[01:22:31] So they'll be voting this month on whether to eliminate cybersecurity requirements for telecom carriers. These are requirements that were enacted after Salt Typhoon. In January. Yeah. Yeah. This was basically, you know... CALEA is fine. I mean, CALEA is not the best avenue to do this, but we don't have another avenue.
[01:22:59] Wyden actually introduced a bill last year, and I think this year as well, to require the FCC to set cybersecurity rules for carriers. So Carr said they moved too fast, that it was an 11th-hour ruling that not only exceeded the agency's authority but wasn't effective or agile enough for the cyber threat. I've heard that Mr. Carr really cares about the FCC staying within its lane and not exceeding its authority. We're correcting course, he says.
[01:23:31] All right. Well, as long as... Yeah, they're terrible. And they do need cybersecurity rules for our carriers, because they are not... I mean, I'm sure Alex can tell you that if you don't force companies, if you don't require them to have some sort of cybersecurity rules to follow, they will not follow them unless it becomes too painful for them. And you would think Salt Typhoon was too painful, but... Yeah, the market's not going to take care of cybersecurity.
[01:24:03] It's getting hard not to be conspiracy-minded here. For all of the statements that this administration makes about China, Salt Typhoon completely owned America's telecom networks, to the point where they were able to listen in on the calls of the then-candidate Trump and the vice president,
[01:24:32] apparently possibly even sitting members of the administration. Those same vulnerabilities have not been taken care of, but we're going to get rid of the TP-Link routers, because those are the problem. They closed the Cyber Safety Review Board before the CSRB issued a report that would address those issues. They have destroyed CISA. They've eliminated the vast majority of the good people there. CISA does not have a confirmed director.
[01:25:00] A ton of good people are gone. The National Security Council used to have 396 people working for it, including a fully staffed cyber division. There are now fewer than 40 people working for the National Security Council, and nobody working on cyber, as far as I can tell. The only person who's been confirmed doing anything for cyber is the National Cyber Director, who seems like a perfectly nice guy but has no background in this, and admits as much if you ever meet with him, which I was able to do during the RSA conference.
[01:25:29] And if you look at the long-term advantages the United States has over the People's Republic of China, our ability to attract the best and brightest people from around the world, high-skilled immigration, the quality of our university system, our alliances, free trade, all of those things have been destroyed. It has been absolutely the best possible outcome for the People's Republic of China, and Xi since January.
[01:25:59] I don't know what else to say. So, like, this just adds to kind of a complete surrender, at least on the cyber side, to the Chinese. We are spectacularly poorly prepared right now for a cyber attack. But another thing that we haven't talked about yet is just recently it was discovered that something like 200 different local municipalities, including especially like their water departments have been compromised
[01:26:28] by a different hacking group inside of China, which is something you would only do if you're preparing for potential significant conflict with the United States. That follows years and years of the PRC planting backdoors in power grids, water grids, and railroad systems throughout the Western United States. Are those backdoors still in there? No, they're discovered and then taken out, right? But what we've seen over the last several years is a real change
[01:26:54] in the kind of goals of Chinese hacking, right, which used to be very much focused on financial and intellectual property focused. And now you see way more focus on infrastructure and the planting of access capabilities, which could only be used in conflict, right? So it's a time bomb.
[01:27:20] Should they, say, invade Taiwan and we decide to respond? That's the threat. They've got a time bomb. That's right. There are situations that would... Both. A booby trap. So that's like the power and water grid in Guam, the rail system in Southern California, which is the mechanism by which Marines would ship out to San Diego to board ships, you know, to be shipped out to the Pacific in case of,
[01:27:49] you know, a conflict: shutting off water and power to our air base in Guam. But then also, if you're talking about 200 water systems throughout the United States, that is a threat meant to make life more difficult for Americans. And we have not responded appropriately to this in any way. And the staff doesn't even exist, honestly, to do so. We've also lost a massive number of people in the FBI. There have been political purges, and a ton of people took early retirement.
[01:28:19] Because if you've put your 25 years in, if you've dedicated your life to the FBI, you're going to go take your pension and then go make a bunch of money doing something else in the private sector, right? Instead of possibly waiting to be purged out and losing all of that time. So I don't know what to say, other than it's a terrifying time. When you talk to CISOs who work on critical infrastructure, at companies that are possible targets, they feel like they're possibly alone. Here in California, it's terrifying.
[01:28:48] Because we have the World Cup in '26, the Super Bowl in '27, the Olympics in 2028. And without the federal government possibly here to protect us... So the state of California and our critical infrastructure companies are kind of gearing up to be able to do our own defense without having CISA, the FBI, or Cyber Command, which also had their director fired. The most qualified commander of Cyber Command and director of the NSA that has ever existed,
[01:29:18] fired because he worked for Joe Biden, which, by definition, every general currently serving in the U.S. military does. That's how it works in the military. They're not just hired and fired randomly. They've spent their entire careers and lives there. They're non-political. They're non-partisan. But he was fired on the recommendation of Laura Loomer. And then his deputy has just been told that he is not going to be confirmed. So we have no idea who will be in charge of Cyber Command and the NSA.
[01:29:48] The definitive expert in cybersecurity, Laura Loomer. Yes. Can I ask you about your partner, Chris Krebs? Because of course he was the head of CISA under Trump, and had the temerity after the 2020 election to say it was the most secure election in American history, giving the lie to, you know, the big lie. And Trump's been going after him. Yes. I know he resigned as your partner. Yeah.
[01:30:17] They killed his Global Entry membership. I mean, it's little, you know, harassment stuff. Although the DOJ is apparently investigating him. What's the situation with that? Do you know? I haven't heard anything about any further DOJ action. Yeah, because there's nothing to investigate. No, he did his job. And he didn't even do anything particularly political. He just told the truth, which is that he believed the 2020 election was secure. Yeah. And, you know, he was fired at the time. And since then,
[01:30:46] he has not been particularly political. He's talked publicly about cyber issues. Yeah. He was your partner. And I thought he, in a really respectable, upstanding way, decided to resign that partnership because he didn't want to bring trouble to you and to the company. So, yeah, we had sold the company to SentinelOne, so we were both working at SentinelOne, which is a much larger public company. Yeah, I remember. That was the last time he was on; we were talking about that. Yeah. And he stepped down after the executive order,
[01:31:16] because the executive order said that everybody who worked at SentinelOne was going to get their security clearance revoked, which, obviously, for a company that does $100 million in government work, is a big deal. I expected... well, not this exact situation, but this is, I think,
[01:31:44] why the Constitution specifically says that you're not allowed to single out individuals and say this person is a bad person. It says that twice, I believe. But, you know, the Constitution does not seem to be too much of a barrier to this. So, yes, Chris stepped down, and I have left as well. It was less fun without him. I bet. All respect to Chris Krebs. It's just shameful. So, let's talk about the FCC again.
[01:32:14] Brendan Carr began dismantling the rules requiring your ISP to offer detailed information. Remember? We really liked this: the nutrition label. You've probably seen it. It actually showed the real broadband numbers and the information about what you were buying. They're going to get rid of those. You probably didn't see it, because it wasn't marketed super well. I saw it because I was looking for it. So, if you looked for it, you could see it. And,
[01:32:43] so we are actually filing... I am writing up comments this coming week about it. I'm also writing comments about the cybersecurity stuff too. So, you know, yay, CR. Arguing against this. So, they want to kill the broadband label. It's part of Carr's Delete, Delete, Delete efforts. He's got a bunch of them. They're all pretty... well, some of them are actually reasonable. They got rid of the click-to-cancel rules the FTC passed, because Comcast said,
[01:33:14] if you make it as easy to cancel as it was to sign up, we're going to lose customers. So they said, oh yeah, you're right, we shouldn't. This is so blatantly anti-consumer. So that was the FTC's click-to-cancel. Yes. Yes. I know it's not the FCC, it's the FTC, but it's similar. It's the same idea. It's like broadband... Who doesn't like these broadband nutrition labels? Oh, the broadband companies, because they don't want you to know you're not getting what you pay for. Well,
[01:33:43] and they don't want it to be easy for people to comparison shop and to see... I think, based on how they've complied with this... Like, if you could actually pull the labels... So, Comcast will give you an Excel file with all of the label data. There are 3,497 rows, one for each individual plan. And the idea is to make it as hard as possible to compare broadband pricing across providers. We wouldn't want to do that. You wouldn't want to show that.
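[Editor's note: the point about machine-readable labels is that, with the fees enumerated, comparison shopping becomes trivial to automate. Here is a small illustrative sketch; the column names and numbers are invented, not Comcast's actual export format. It ranks hypothetical plans by true monthly cost per megabit, advertised price plus every listed fee.]

```python
import csv, io

# Invented label data for illustration; real carrier exports differ.
label_csv = """plan,monthly_price,provider_fees,passthrough_fees,download_mbps
Basic,29.99,5.00,3.50,100
Fast,49.99,8.00,3.50,500
Gigabit,79.99,12.00,3.50,1000
"""

def total_cost(row):
    """True monthly cost: advertised price plus every enumerated fee."""
    return (float(row["monthly_price"])
            + float(row["provider_fees"])
            + float(row["passthrough_fees"]))

plans = list(csv.DictReader(io.StringIO(label_csv)))
# Rank by cost per Mbps so the fees can't hide in the headline price.
for row in sorted(plans, key=lambda r: total_cost(r) / float(r["download_mbps"])):
    print(f'{row["plan"]}: ${total_cost(row):.2f}/mo, {row["download_mbps"]} Mbps')
```

Stripping the pass-through fees from the label, as the proposal would, removes exactly the inputs this kind of comparison needs.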
[01:34:13] No. So, yeah, they're doing this too. Now, it's a rulemaking, so there are 60 days when you can file comments. You can file comments. Everybody should absolutely file comments. Although, remember the net neutrality thing? The comments... there were a lot of fake comments. You can file real comments. Yeah. Like, I'm filing real comments. Y'all can also file real comments. Yeah. If you do file real comments, you should say, like...
[01:34:43] There are a couple of points that are really important. And that is that... I don't want to get into this. Yeah. Do you want me to get into it? It's a lot. I mean, we could spend all day. You like the labels. You like that they're machine-readable. You should say that they provide useful information. You should say that it is important to have the labels in the same language your broadband subscription was sold to you in.
[01:35:13] But you should also say that it's important to have the fees, like the pass-through fees. One of the things they're mad about is, you know, they want to take out the state fees. They're enumerated in the label, but they want to take that out. So it would just be what Comcast wants to charge you, as opposed to what Comcast wants to charge you plus the utility rates and all the things associated with your bill. Now, I should point out, Comcast gave a couple of million dollars to the ballroom. So if you really want to be effective in this comment,
[01:35:42] you should probably donate to the ballroom. That was your mistake. Don't do that. Comcast, T-Mobile, Meta, Apple, Microsoft, and Google all donated to the ballroom. So I guess that probably carries some weight with Brendan Carr. You know, he's probably, you know... Well, these are good people. There are good people.
[01:36:12] I'm being cynical. Yeah. Make your comments. Although, Stacey, do you think it's going to make a difference? It is important to be on the record. Yes. Get on the record. Get on the record. Because maybe one day we will not have the current administration. And that's a good point. That's a good point. We only have three more years of this. Everything Alex told you about, like what's happening in cybersecurity...
[01:36:41] That is literally like every day. That's true. I look at that from a policy perspective and I'm like, okay, what do I fight on now? Yeah. It's sad. It's frustrating. Death cleaning, death cleaning. Can we talk about death? So what is Swedish death cleaning? Swedish death cleaning. It's a concept that was around in Sweden for a long time. And then
[01:37:10] an author put it into a book, and it sort of became well known through that. It's the Marie Kondo for this generation. Let me... a side note on Marie Kondo for a moment. I love her. She had three kids, and she said she was sloppy now. People were like, so what do you do with three children? And she's like, oh, I don't do that. I don't do the organizing anymore. I have three kids, leave me alone. Who has time? So much respect. Because she could have spun out, like,
[01:37:39] The Artful Joy of Tidying Up for Mothers, The Artful Joy of Tidying Up for Seniors... like, she could have spun it out. Anyway, she was just like, no, I'm done. My kids are messy. She said thank you to that, and goodbye. I appreciate it. Your kids spark more joy than all the clutter. Yeah. So, Swedish death cleaning. Anyway, it's this really phenomenal concept that... This is your article, by the way. And why? This is my article on why we have too much stuff.
[01:38:08] we spend the first half of our life collecting stuff and we should spend the second half of our life very slowly getting rid of things we don't need and keeping the things, things that we like in good condition and thinking about handing them down so that we are not leaving a burden to other people of our big mess. Now, let me ask you, does this include digital? Yeah. So, so this is what my article was really about. Um, I was, I came up with this idea of beyond just passing along my passwords. Like,
[01:38:37] what about organizing my photos so that the person who inherits them isn't just faced with thousands and thousands of digital images they don't know what to do with. In my case, 32,000. What about my diaries? Do I want anybody to see them? Do I want to keep them encrypted and protected and make sure that nobody has that password? That's funny. Yeah, that's a good point. And then I started interviewing people and it just got more and more interesting. So I, I interviewed this wonderful woman, Tina O'Keefe.
[01:39:05] She has a personal organizing business and she specializes in working with seniors. Um, so the senior element is really important because a lot of people think like, Oh, I'll do my, my own Swedish death cleaning throughout my life, little by little as I get older. But the way that most people first encounter it is with a parent or a grandparent, one of your elders. I've done this. My mom is in assisted living. She's got Alzheimer's. She's 92. She'll be 93 in a couple of months. And we have her house.
[01:39:34] And so I'm amazed at all the crap that's in there. But also, you know, as we age, a lot of people have, um, dementia, Alzheimer's, different, different, um, cognitive abilities that start to decline. They also have physical abilities that start to decline. Right. And that's why she's in assisted living. Yeah. And not everybody has a lot of tech skills either. So anyway, this woman I interviewed, Tina O'Keefe, she's a personal organizer. She specializes in working with seniors. And she said, um,
[01:40:02] she was working with a woman who was terminally ill, who had had two marriages. And she said, I've kept all these mementos from my first marriage that I don't want my second spouse to see. Fine. So they had to come up with a way: okay, do I give it to my kids? Do I save it for myself? Do I burn it? Like, what do I do with it? And then the digital aspect was just so interesting. So I interviewed a guy named Adam, who is Swedish, and he went to help his grandfather with Swedish death cleaning. And, uh,
[01:40:32] one of the things he found was a phone full of malware. So not only did he then want to rescue all of the photos and important documents. Send it to Alex. I bet he could use it. So it was, it was super interesting just to see like the real world challenges people have in, in thinking about how do I, how do I organize my digital life? So then I'm passing it down in a way that's sort of like respectful. And I'm giving people the things that I want to make sure that some people have.
[01:41:02] And so you can think: what do I not have to wait until I die to pass on? Pass it along now. If you're dead, you're not going to do it. So share photos and videos, for example; put them in a folder, organize it neatly. Um, but thinking about things like, if you give somebody access to your computer or your phone, is there a very simple read-me file right on the desktop? Right? Or are they going to have to go digging through my desk?
[01:41:29] There is a folded piece of paper under the leaf there. It's not visible, but if you know it's there, you can open the leaf. My wife knows about it. It says "in case of my death or dismemberment," and it has... that's probably a little overdramatic way to put it, but it has my... actually, you know, Bitwarden has a form that you fill out. And I think other password managers do this. They do. Yeah, it's clever. So that's an easy way to pass on your online accounts, but there are tricky things you have to think about too.
[01:41:59] So some social media have terms of service that technically don't allow you to give your account to somebody else. You have to be careful with that. You can't hand that down. No, you can't. Um, or you really, really shouldn't give anybody your password for your banking accounts. Like, that needs to go through the beneficiary form. Oh really? I shouldn't give that to my wife? Right. It's in my will. You know, it's something you have to educate people about. People,
[01:42:27] people don't know this from birth. You have to tell people this. So younger people may not know that you can't just give somebody your financial logins. This is important. Here's my Bitwarden security readiness kit, which I hope you can't read through the paper. I'm hiding it. Let me just put that away, but this has all the important stuff in it. Right. And then, uh, and then, uh, Lisa can,
[01:42:52] but that's a good point that you probably shouldn't give your access. Well, it's got all my passwords. Right. But you really don't want anybody touching or moving your money from the minute you die. She's my executor. So they'll get in some big trouble if they don't do it through the proper channels. You have to do it right. You have to have a death certificate and bring it to the bank. But that's also why you have a trust, right? Yeah. I have a trust. We have a trust. But there are lots of other... yeah, yeah, yeah. There are a lot, but you have to go through those channels, right?
[01:43:22] But there are lots and lots of other digital assets we have that we don't necessarily think about organizing or planning for. Every time I put another picture in my photos collection, I go, nobody's ever going to want to see this after I'm gone. Right. There's another article I wrote I want to tell you about. There's a wonderful newsletter called Gloria... it's Hello, Gloria. Um, it's for women in midlife, and I've been writing a little bit for them. So, uh,
[01:43:48] a piece I kind of wrote in tandem with this one was about creating a death notification list in my email. Oh. So I started... I had a friend who died unexpectedly young. I'm sorry. Thank you. It was really upsetting. Um, in part because I didn't find out for almost five months after she died. We had an email-based friendship. We didn't really have any friends in common. And it was just, like, so weird. I haven't heard from her in a while.
[01:44:18] And I went to look up her name online thinking, like, let me see what her bylines are. Maybe she got laid off; she works in media. And I found an obituary, and I was like, oh, what? It was so unnerving. It was really, really shocking. Yeah. But it sent me down this path of thinking, like, how do we tell people anymore when we die? And there are so many connections I have that, like, my partner doesn't know these people. My sisters don't know these people. Like, how are they going to know? Um, so I,
[01:44:47] I came up with this idea of making sure that I have an email list in my email that I plan to go through once a year and update. Uh, and it's lots of, like, acquaintances, and my accountant. Have you written the email that's going to be sent already? Yes. Yes. So I wrote the email. I am dead. If you're reading this, I am dead. Yeah, basically. Um, it's a mad lib. I should let you know: Jill E. Duffy has died on such-and-such date of such-and-such cause.
[01:45:17] So your husband has to fill that part in? I have a different trusted person for this, um, who will inherit some of my passwords. And yeah, I mean, I think it's a little goofy. It's a little wacky. It's a little low tech in some ways. Um, I'm thinking I should have a dead man's switch so that, like, every month I have to say, I'm still alive. But if I don't, then it sends it out to everybody. Leo must be dead, because he didn't say he's still alive. That's how the emergency...
[01:45:48] Yeah, I know. You're not a spy, Leo. Alex, you sound like you might have some experience. I'm just saying. There's the Google Inactive Account Manager, right? So it will pass on your Google account to a trusted person if you don't touch it in so many months. Our sponsor, Bitwarden, and every other password manager I know of has an emergency access, uh, legacy feature. And they have a dead man's switch. So if I, uh,
[01:46:17] if Lisa requests my password, it sends me an email, and then you get to set how long it waits; if you don't respond, then it says, okay, go ahead, he must be gone. Now, somebody's saying Leo sounds so morbid, but when you get to my age, you start thinking about these things, and everybody probably should, because, you know, this Swedish death cleaning is helpful for the people you leave behind. I mean, you're doing it for them. Yeah. You're doing it for them. Think about it that way. Yes.
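The dead man's switch the panel keeps describing, whether Jill's monthly "I'm still alive" email, Google's Inactive Account Manager, or a password manager's emergency access, is the same simple mechanism: a timer that only the owner's check-in resets. Here's a minimal toy sketch of that pattern; the class name, the 30-day interval, and the notification list are all invented for illustration, not any real service's API:

```python
from datetime import datetime, timedelta

# How long the switch waits for a check-in before it "fires".
# (30 days is an arbitrary illustrative choice.)
CHECK_IN_INTERVAL = timedelta(days=30)

class DeadMansSwitch:
    """Toy dead man's switch: check in periodically, or the
    pre-written notification goes out to the recipient list."""

    def __init__(self, recipients):
        self.recipients = recipients          # who gets the notification
        self.last_check_in = datetime.now()   # owner starts "alive"

    def check_in(self):
        """Owner confirms they're still around; resets the timer."""
        self.last_check_in = datetime.now()

    def should_trigger(self, now=None):
        """True once a full interval has passed with no check-in."""
        now = now or datetime.now()
        return now - self.last_check_in > CHECK_IN_INTERVAL

    def trigger(self):
        # A real system would send the pre-written email here;
        # this sketch just returns the list of people to notify.
        return list(self.recipients)
```

Real implementations work the same way in spirit: Google's version watches account activity, and password managers' emergency access inverts it slightly (the trusted person requests access, and silence from the owner is what lets the request through).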
[01:46:47] They should rename your article, Gloria: Somebody Will Have to Do Something With Your Stuff After You Die. I think about that. Don't tell them. Yeah. They're going to throw it out. I, for instance, have a painting from the early 19th century of an ancestor hanging on my wall. And I don't think either of my kids really wants this, you know, 1830s portrait of their ancestor.
[01:47:18] I don't know what to do with it. I guess it's going to end up at Goodwill, and somebody is going to have my... Antiques Roadshow. Antiques Roadshow. It's not worth much, I doubt. Well, you never know. I should look; maybe there's stock certificates in the lining or something. Maybe it is worth something. Probably the Magna Carta. Yeah. Yeah. The Magna Carta. Uh, Swedish... You know what? This is good advice. Wired.com has a story; it was just last week. There's a link to various things in there. And Jill's story with Hello, Gloria is also on their front page.
[01:47:48] You can read about that. I think this is wise. This is good planning. Uh, let's talk about the F5 hack in just a little bit. Also, why X may be telling you you need to re-enroll your YubiKey. Huh? And Proton's new Data Breach Observatory, a little competition for Have I Been Pwned. It's got a good name. Well, we'll talk about that and a lot more, as well as, uh,
[01:48:17] ads on your $2,000 smart fridge, uh, when we continue. We've got Stacey Higginbotham here. She looks sad. Are you making your list of people to send emails to if you pass away? You're going to be with us a long time. I'm wondering how often I need to, like, maintain my prep for death. No, don't even think about it. Well, like she said, you do it once a year, Jill. So I'm like, once a year? That feels reasonable. So check,
[01:48:48] make sure everything's copacetic. Yeah, I guess. You have to rewrite the angry ones, because you have your enemies' emails that go out too, right? Oh yeah. Hey, good news: you know that guy you really hate, Leo Laporte? I'm dead. And I hope I really regret it. That's so depressing. You won. Okay. You won. You can dance on my grave. Uh, it'll be at the cemetery in a few minutes. Wait, your dead man's
[01:49:17] switch doesn't have all your blackmail material in it, does it? It should. Yeah. That's not part of Swedish death cleaning. You can't use it anymore. You can't blackmail me. I guess we have different... I'm gone. I'm done. Uh, Alex Stamos is here. So good to have you. CSO, corridor.dev. And the Swedish death cleaning queen, Jill Duffy. The Marie Kondo of Swedish death cleaning. I,
[01:49:47] I did. I did the Kondo thing. I went through my whole closet, but everything sparked joy, so it wasn't really much of a net gain. All these shirts. The one thing I don't like about the Marie Kondo method is it's sort of framed as do it one time: take a couple of hours, take a two-day break, and do it one time. And I'm much more of the idea of thinking about it as hygiene, you know, something you do a little bit every day. If you miss a day, it's not a big deal. A little bit at a time, all the time. Yeah. That's the key. It's like tidying up.
[01:50:17] I have about 60 of these cuckoo shirts and Lisa keeps inviting her son over to take them. No, these are mine. I do admit, uh, I have at least three quarters of the closet space in the house. That's probably not as it should be. I know. Isn't that awful, Stacy? You wouldn't think, you don't, you don't see me. I really wouldn't, Leo. No, I know. It's not the first thing you'd think. Oh, Leo. He's probably got a lot of suits. No,
[01:50:47] it's all cuckoo shirts is what it is. And costumes, a lot of costumes. We will continue in just a moment. Ah, Alex, you probably know what this is. This is our sponsor. This, my friends, is a honeypot. This is brilliant. This is the Thinkst Canary, our sponsor for this segment of This Week in Tech. Now, I disconnected my honeypot, so I'm going to get an alert in just a little bit saying, your honeypot's disconnected. But that's the beauty of this. Thinkst Canaries.
[01:51:17] Honeypots traditionally are very complicated. I remember talking to, I think it was Bill Cheswick, who wrote one of the very first honeypots. And he said, it's complicated to create a good honeypot. A honeypot is something that attracts not flies but hackers: something sitting on your network that will let you know if a bad guy has penetrated your defenses. And you need this, not just for a bad guy but for a malicious insider too. But because it's so complicated,
[01:51:47] you really want to get a Thinkst Canary. This is a honeypot that can be deployed in minutes. It can impersonate almost anything, from a Windows server... I have a 2019 SharePoint server running in my network, wink wink, nudge nudge. Bad guys can't resist that one. To a SCADA device. It could be a SCADA device. It could be anything. It's absolutely indistinguishable from the real thing, down to the MAC address. They actually have appropriate MAC addresses.
[01:52:16] My Synology NAS Thinkst Canary has the DSM 7 login, just like the real deal. So bad guys see it on the network and they can't resist. They're irresistible. You can also create lure files with your Thinkst Canary. I have a fake WireGuard configuration file sitting on Google Drive, just waiting. How can a bad guy resist that? Right? It's the keys to the kingdom. The minute someone accesses your, you know, your,
[01:52:45] your lure file or your, or brute forces, your fake internal SSH server, your Thinkst Canary will immediately tell you, you got a problem. No false alerts, just the alerts that matter. And you can get them any way you want them. SMS, email, syslog, it supports web hooks. There's even an API, any way you want them. The thing is, if you hear from your Canary, there's a good reason. Just choose a profile for your Thinkst Canary device. And it's easy to pick. There's hundreds to choose from.
[01:53:15] It's actually so easy, I change mine regularly. It's fun. Then register it with the hosted console for monitoring and notifications. And then you just, you know, sit back and you wait. Attackers who've breached your network, malicious insiders, and other adversaries are going to make themselves known by hitting your Thinkst Canary. They can't help it. Visit canary.tools. $7,500 a year gets you five Thinkst Canaries. You get your own hosted console. You get upgrades. You get support. You get maintenance.
[01:53:45] Ah, and if you use the code TWIT in the, how did you hear about us box? You'll get 10% off the price. And not just for the first year, but for as long as you have your Thinkst Canaries. 10% off. You can always return your Thinkst Canary. They have a two month money back guarantee, 60 days for a full refund. I have to tell you, they've been advertising with us since 2016. We've been talking about this. That's how long we've been using them. And in all that time, no one has ever claimed the refund.
[01:54:13] Visit canary.tools. Canary.tools. Don't forget to enter the code TWIT in the "how did you hear about us" box for 10% off. Canary.tools. This is a must for every network. Thinkst Canary. This is from Alex and Risky.biz. F5 says an advanced persistent threat stole source code.
[01:54:43] This was just a couple of days ago. F5 is one of the... is a huge security company, right? Yeah. They make network devices in particular. Yeah, not security. Load balancers. Yeah, yeah, yeah. So this is a security threat, though, because it's one of the largest... everybody has F5 equipment. Lots of large companies use F5 on the edge of their network. So what do we do? It's a real problem.
[01:55:12] So it looks like it was the Chinese, and they were in F5's network for at least a year. So this looks like it could be a SolarWinds-level attack. So F5 released a patch with at least 40 vulnerabilities. Those look like bugs that were reported to F5 during that time that the Chinese know about. So this is an interesting story. Microsoft... I'd heard this, and maybe you can confirm it. Of course, Microsoft's had some real problems with Exchange and other servers.
[01:55:42] And there was some concern because Microsoft was getting its patches made in China. Yes. And they would send the security flaw information to China for the fix before it was publicly announced. And weirdly, the bad guys knew about it somehow before the patch went out. I know, it's a shock. How did that happen? Yeah, so that's the SharePoint bug from this summer.
[01:56:11] There's a huge bug in on-prem SharePoint. And it turns out that Microsoft moved SharePoint sustained engineering to China. And so when there was a vulnerability found in SharePoint, it was actually found as part of one of the Pwn2Own competitions. That exploit code was turned over to Microsoft. That's why you do Pwn2Own, so that Microsoft gets it first before the bad guys. Yeah, so Microsoft pays a bunch of money for this exploit. What do they do? They ship it to their engineers in China.
[01:56:39] And then it gets used by the Ministry of State Security and a couple other groups in China to own, over a weekend, tens of thousands of American targets. In fact, I was still at SentinelOne at the time. One of the Chinese servers, we were actually... we had data from upstream. We found that they did DNS requests for about 120,000 targets. And that included a huge number of industrial targets.
[01:57:03] Most of the local and state government SharePoint servers in the state of California, lots of folks. Yeah, important. If you think this doesn't affect you, it does, because those servers had your information, which was then exfiltrated. Yes. So what they're doing is they're just spraying everybody, planting a backdoor so they can get back later, before the patch was even available. This was a true zero-day. It was not something you could possibly have patched yet. So Microsoft had to rush the patch out super fast.
[01:57:33] But yeah, so I actually, I was just... Do we know how the F5 breach happened? No, we don't know exactly how the F5 breach happened yet. There hasn't been a release of a full root cause analysis or investigation yet. But F5 says hackers were in the company's network for at least 12 months. At least a year. By the way, did I mention the Thinkst Canary? Yes, if they had a Thinkst Canary... They would have known. It's possible they would have known; the Chinese might have tripped on one and set off an alarm.
[01:58:02] So they spent a year inside. F5 says they have not seen evidence that they planted a backdoor, but if they have access to source, then they can be finding vulnerabilities that nobody else knows about. Is that the first thing these state actors do? They look for source code so that they can then implant other exploits? Well, if they find the exploit, then they have what's called a bug door, right? They don't need a real backdoor. They can have a bug door. They can introduce bugs. Well, they don't have to introduce it, right?
[01:58:29] So changing the code is really risky, right? So if you look at F5's code, remember, F5 just patched 40 vulnerabilities, right? So you're talking about a product that had 40 unknown vulnerabilities. So if their code is that buggy and you're the Chinese, like the first thing you do is you pull down the code and you're like... You don't need to introduce bugs. They're there. Right. And so if you're the Chinese, you might look at that and say the risk-reward ratio here is not worth to plant something.
[01:58:56] Let's just find five bugs that they don't know about and then use them in targeted circumstances. And then when one gets patched, move to the next one, move to the next one. Could you use AI on a code base to look for vulnerabilities? In fact, you can. And I believe this is one of the stories that we have there, if you want me to pitch to it. But, like, OpenAI just announced exactly this. Google has a project to do this too. So OpenAI just announced a project called Aardvark, where they have an agent specifically to do this.
[01:59:25] And they're going to have this as an agent that you use for your own code, but they're also running it against open source. Do you know how effective it is? It looks... I haven't gotten access to the beta yet, but from what I've seen, it's been pretty effective. Google's is very effective. Google has a project called Big Sleep, which is a little... I mean, from a naming perspective, I'll look to Stacey or Jill, our professional writers. The Swedish Big Sleep, yes. Yeah.
[01:59:51] Like if you're like a multi-trillion dollar tech company, would you call something Big Sleep? I mean, it just sounds a little Bond villainy. But it comes from... It's Raymond Chandler. It's The Big Sleep. It's a Humphrey Bogart movie. It's the long sleep, the sleep that doesn't end, that sleep. I guess. It's not a good name for a bug hunter, but okay. Right. By the way, so Google's running their bug hunter against large amounts of open source. And it's very effective.
[02:00:21] Is this the FFmpeg story here? Because they were upset over this. They were very upset. Yeah. So you can go to Google; if you search for, like, Google Big Sleep bugs, Google keeps a listing of all the big bugs they have found. And they have reported a bunch of bugs in FFmpeg. FFmpeg is this extremely important open source library that all of us are probably using right now. I have it on all my machines. Even if you didn't think you did, you do. Yeah. Plex installs it.
[02:00:51] A lot of programs install it. It's a transcoder. It's used all the time. It's extremely important for encoding and decoding video. Google uses it in a ton of their products. So as a result, Google cares a lot, because Chrome the browser, Chromium, ChromeOS, and Android all use FFmpeg. And so Big Sleep, their AI, is going through the source code. But that's good, right? They found some in ImageMagick as well.
[02:01:20] Another very widely used tool. Yes. And ImageMagick has been used to exploit Android over and over again. The problem is what Google is doing: they are reporting using AI. AI is writing up these huge bug reports that are extremely detailed. It's great. But there are no fixes attached. And what the FFmpeg people are now tweeting about is they're calling this AI slop CVEs. Right? Just like AI slop images.
[02:01:46] And they're complaining that Google is burying them in these reports. Are they not doing responsible disclosure? They're just publishing these? Well, no. They're doing responsible disclosure. But Google's policy comes from the Project Zero days, when human beings were finding this stuff. Right. And it's 90 days. They give 90 days to these developers to fix it. And they're not giving them any money. Right? So the FFmpeg people are complaining, like, wait, you're just burying us. You are one of the world's most valuable companies, and you're burying us.
[02:02:14] You have the world's best security researchers, now backed by the world's best AI, and you just expect us, for free, to fix these bugs in the software that you guys use to make trillions of dollars. FFmpeg says, is it really the job of a volunteer working on a hobby 1990s codec to care about Google's security issues? Yeah. Well, it is everybody's security issue. I mean, FFmpeg does need to fix it. But 90 days is insufficient. These are volunteers.
[02:02:44] These are, you know... But this is generally a problem in the world, which is a lot of small open source. It's that famous XKCD cartoon where there's one little brick supporting this giant infrastructure, maintained by some random guy in Nebraska. And he's not getting paid to do it. And they both have a good point. Like, FFmpeg says that Google themselves, because of these bugs, have now driven one of the FFmpeg developers away. Oh.
[02:03:12] Because FFmpeg tries to, like... their goal is to decode every video format that's ever existed in the world. Right. And, like, this bug was found in a video format that was only used for, like, one LucasArts game in the 90s. Right. And so it's like, what do you care? Well, the reason they care... Does it affect every user of FFmpeg? It does. This is the problem Google points out, which is, by default, when you compile FFmpeg, this codec is included.
[02:03:39] And FFmpeg has a bunch of modes in which it will automatically detect the format. So you put on a browser page a phony file from an old LucasArts game; it gets downloaded, and FFmpeg jumps in and says, I'll take care of that. And you're owned. Right. And so they're both right. I feel really bad for FFmpeg here, because they're getting buried in these reports. And they are not easy bugs to fix.
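The attack surface Alex describes, obscure codecs compiled in by default plus automatic format detection, is something FFmpeg's own build system lets downstream packagers trim. A rough sketch, assuming you're building from an FFmpeg source tree; the specific decoder names below are illustrative examples, and the authoritative list comes from `--list-decoders`:

```shell
# Sketch: shrinking FFmpeg's attack surface at build time.
# Run inside an FFmpeg source checkout. Decoder names below are
# illustrative assumptions; enumerate the real ones first:
./configure --list-decoders

# Then compile out legacy game codecs a deployment never needs:
./configure --disable-decoder=sanm --disable-decoder=binkvideo
make -j"$(nproc)"

# On any installed build, check what is actually compiled in:
ffmpeg -hide_banner -decoders | grep -i sanm
```

The catch, as the panel notes, is that distributions and apps like Plex ship FFmpeg with nearly everything enabled, which is exactly why a codec last used in a 1995 game can still be reachable from a downloaded file.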
[02:04:05] My position here would be: Google should not be reporting these bugs without patches or tests. Right. Like, if AI can do these incredibly beautiful reports, it can also come up with a proposed patch. And if Big Sleep's not ready for that, then they should not be doing the reports at this point. Or Google can have their human engineers go write it up, because Google does have the best paid security researchers in the world.
[02:04:34] Project Zero is amazing. And this has been a bit of a problem with Google Project Zero. I know some of these people; they're great. They're also kind of arrogant. Yeah. "Kind of" is doing a lot of work here. They're extremely arrogant. And they embody a bit of what you see with security researchers, which is, like, lording it over them: I found problems in your code, it is now your responsibility to fix them.
[02:05:02] And, like, crushing them under the brilliance of... and then doing that with your AI system now is, like, very arrogant. And it is punching down. Right. And I do think it is on Google here to do everything they can, if not paying these guys directly. At least every one of these reports should also have attached, like, hey, our AI did its best to create a patch. Make it as easy as possible. Yes. Yes. Yeah.
[02:05:31] We quote Tavis Ormandy all the time because he's discovered so many... we talk about him on Security Now all the time. So many, you know, Heartbleed and other very serious issues. Right. I want to give credit to Manuel Lauss, who did patch the LucasArts SMUSH codec that's only used in one case: the first 10 to 20 frames of Rebel Assault 2 from 1995.
[02:06:02] He fixed it. Thank you, Manuel Lauss. But, you know, you're counting on some guy. And if you look in their tracker, let's see how many more are open for just FFmpeg... there's like a half dozen more open bugs. Yeah. And FFmpeg says, arguably the most brilliant engineer at FFmpeg left because of this. He had reverse engineered dozens of codecs by hand as a volunteer.
[02:06:30] It was hugely demotivating and contrary to the fun and enjoyment of reverse engineering. Yeah. This is tough. I have sympathy on both sides. The problem really is, open source code like this, written by hobbyists for the enjoyment of it, is everywhere. Yeah. And it's the volume of what AI can do. Right. It can massively outstrip the capability of the maintainers to fix stuff.
[02:07:00] But the flip side is, the bad guys are now getting these capabilities too. So Google is trying to front-run. Google and OpenAI. Yeah. And Anthropic is probably going to be doing the same thing. They're going to try to front-run the bad guys, which is a totally reasonable thing, right? Because you want the stuff fixed before the bad guys have the same bug. Right. And it's amazing what AI can do.
[02:07:24] Financial Times story says that AI is generating a surge in expense account fraud because it can create such believable receipts. I got so mad about this story. Why? I got so mad. What happened, Julia? I mean, sorry, Jill. Jill E. Duffy. What happened? It is not hard to fake a receipt. Ever. That's true. It is not hard.
[02:07:54] You go to Staples, you buy a pack of receipts, you can fill them out yourself. It has never been hard. What's frustrating is corporations require so much bulls**t in very minimal expense report accounting that is not actually required by the IRS. Yes. So, I used to work for this company. They were a fine organization, but they outsourced their expense account management to another company.
[02:08:23] And so they didn't really know what it was all about. But they said, everything needs a receipt. Everything needs a receipt. And I left hotel tips. And I was like, well, I don't have a receipt for the hotel tips, but I'm just going to write them in. We might get audited for that. I said, honey, it's less than $95. It's in the IRS code that you will not get audited for that. And if you're not going to reimburse your employees when they stay in hotels, that starts to look bad on you when you have 200 employees staying in a hotel for a conference.
[02:08:50] But, Jill, now you can use ChatGPT to create your tip receipts. The problem is not employees faking receipts. The problem is that companies are so – they have such a bug up their ass about this for no reason whatsoever. There is no fraud going on for a $28 meal. You know what I'm saying?
[02:09:11] Like, this is the dumbest place to spend any of your time and energy is approving somebody's $8 cab ride to get three blocks quickly. I suspect that everyone on this panel shares your annoyance. I worked for Ziff Davis. I remember a very famous story that was told to me by a Ziff Davis executive. He said, Leo, you got to hide the boots. I said, what?
[02:09:40] He said, a while back we had an account executive who was wining and dining, you know, advertisers. And, of course, he'd write off the wining and dining. But at one point, he bought a $400 pair of Tony Lama boots and tried to write them off on his expense report. And, of course, got denied, because you can't expense the boots that you bought. And he said all he had to do was hide the boots.
[02:10:08] Just put it on the dinner. It was a bottle of wine. He said, Leo, when you do your T&E, hide the boots. But the thing is, the majority of employees are honest people. I know. Who takes a receipt anymore? We forget. Like, people are not creating false expense reports. Just want to get paid back for my expenses, please. Yes. Please. Yes. It's a matter of... you're making it difficult. Just give me a per diem, right?
[02:10:36] Like, just let me collect a per diem. It's such a minuscule amount of money for large companies. It does not matter. And it's just such a stupid thing to point the finger at employees and say, oh, you're making up this expense report. Like, probably the person did spend the money. They just want to be reimbursed for it. I suspect there's going to be even more link-baity articles like this. This is the Financial Times. Not known for link bait.
[02:11:04] But if you think about it, this is what you're going to see, is a lot of articles about, oh, look what AI can do now. Which is too bad, because there is some legitimate complaining about AI. Sora, for instance, is OpenAI's extraordinarily popular video generator. And now they have a Sora app that's often number one on the iPhone app store.
[02:11:30] I decided instead of trying to fight it, I just took my generated persona. They call it a Cameo. Cameo, by the way, the company, is not happy about that. They're suing them. They call it a Cameo. I did one of my cat the other day. You can now do pets and you can do objects. And I just made it public. I said, anybody can do it. Because that way, I mean, it's like, well, it was made up. It was AI. It's just, you know, everything is going to be AI. But there are people complaining.
[02:11:58] Apparently, there are influencers, racist influencers, who are using Sora to generate videos of poor people, you know, flaunting their use of SNAP food stamps as a way of saying, see, we don't want to give these people benefits. And so there is a legitimate misuse of AI. I agree with you. Forged receipts from a company that really should be paying you back for those anyway, it kind of seems like.
[02:12:27] Yeah, like, are we going to use AI to start looking at where wage theft is happening? Because I feel like that's a much more important concern for employees who are getting shafted. I agree. I did mention Proton is now going into competition with Troy Hunt and his very famous Have I Been Pwned website. It's fabulous. And I think this is fine. Proton's calling it the Data Breach Observatory. And they say, unlike Have I Been Pwned, they don't wait until a breach is reported. They actually search the dark web.
[02:12:56] Alex, do you think this is legit? Is this a good idea? I mean, I'd have to look. Troy does a lot of work to try to keep his consolidation from being abused by people. So it's like, not easy, right? It's not easy to pull that data in one place and then let people query it without that being abused. A little conservative in that regard, yes? Yeah. Yeah. Have I Been Pwned's pretty conservative. So much more conservative. I mean, there are services in which you can just buy that data, which is good.
[02:13:26] I mean, you buy that data and then that data is then used by companies to go reset, mass reset passwords, give people notifications and stuff. And that would not be in the Have I Been Pwned database? Necessarily. Probably a big overlap, right? Yeah, yeah. But not all of it is. So companies that are concerned about password breaches might go to a third-party service other than Have I Been Pwned. Yeah. I mean, there are – this is what you pay intelligence firms for, right?
[02:13:55] There are a bunch of intelligence firms that go into the dark web and then they have a bunch of competing numbers of I'm the best, I'm the best. And they'll give you advertisements of who – Yes, we have advertisers that do that. Yes. Yes. Right. Of course. Right. And they are the best, by the – I just want to point out, they really are the best. Right. If we – yes. I was really interested – oh, go ahead. Oh, go ahead, CB. Sure.
[02:14:17] No, I'm interested in the verification side of it because they're pulling it from the dark web and they're saying that they're going to use a firm to verify that it is actually legit because sometimes you see hackers put in – They're working with Constella Intelligence to verify it. Hmm. So, I mean, if you're a company buying this, you test against your own stuff, right? So, like, when I was at Facebook, we would buy stuff ourselves and we also had our own intelligence team go.
[02:14:46] And then we'd run username-password pairs and we'd find, like, for any reasonably sized breach, about 5% of username-password pairs would match up with the username-passwords people would use on Facebook. And so, if there's a match, what we do is lock the account so nobody could – you could not log in to a new device. So, that's how that happens. Oh. Yeah. So, we could basically – you could – the cookies that you had. So, if you had a browser that was already allowed on Facebook, if you had a mobile device that was already allowed, that's fine.
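The check Alex describes can be sketched in a few lines of Python. Everything here is hypothetical (usernames, passwords, the breach dump); a real system would use a hardened, tuned password-hashing setup and would lock or flag the matched accounts rather than just returning a list:

```python
import hashlib
import os

def hash_password(password: str, salt: bytes) -> bytes:
    """Slow, salted hash standing in for a production password-hashing scheme."""
    return hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)

# Hypothetical user database: username -> (salt, password hash)
users = {}

def register(username: str, password: str) -> None:
    salt = os.urandom(16)
    users[username] = (salt, hash_password(password, salt))

def check_breach(breach_pairs):
    """Return usernames whose breached password matches their current password."""
    flagged = []
    for username, leaked_pw in breach_pairs:
        if username in users:
            salt, stored = users[username]
            if hash_password(leaked_pw, salt) == stored:
                flagged.append(username)  # in production: lock the account
    return flagged

register("leo", "hunter2")
register("jill", "correct-horse")

# A purchased breach dump: one reused password, one stale, one unknown user
breach = [("leo", "hunter2"), ("jill", "old-password"), ("bob", "x")]
print(check_breach(breach))  # only "leo" reused a breached password
```

Since the site only stores salted hashes, the check has to run the leaked plaintext through each matching user's own salt and hash, which is exactly why it can only be done pair by pair.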
[02:15:14] And then you would get notifications, please change your password. That's cool. Right. But a new one couldn't, right? So, somebody all of a sudden couldn't come from Romania and add a new browser to say, I'm Leo Laporte with a password that was known to be – And it's a – I mean, it's a controversial thing, right? Because either you're paying directly or you're effectively – you know, for a number of those intelligence companies are paying money for those – the only way they can get that stuff is to pay. They're making a market. They're creating a market.
[02:15:41] Now, the upside of it is they're also destroying trust in that market, right? Because then on this black market, the person they're selling to might be an intelligence vendor that the moment they sell it to that person, that goes out to Google and Facebook and a couple other companies. And then the value of the thing they sold goes to the floor. Right. Right. And so, if you can create dishonor among thieves, if you make people not trust the market, then it makes a much less liquid market. And then it reduces the economics.
[02:16:11] So, it is a very sketchy slash ethically difficult area. For this specific one, like in any case where you're not selling it to big trustworthy companies, companies that aren't going to just use that for their own purposes, then you have to be extremely careful. And, you know, Troy has been really careful in, you know, verifying that people are who they say they are, that they have access to that account and stuff. And so, I don't know if that's going to be true for Proton. I have huge respect for Troy.
[02:16:40] And I use – he has a, I think, lesser-known feature on haveibeenpwned.com where you can enter a password and see if it's shown up in a breach. And I think people might go, I'm not giving it my password. But it's actually quite cleverly done so that no information is exfiltrated. They hash it and then look for hash matches. Yeah. I think that's really cool. I think what Troy does is really important and very cool. So – Yeah. I mean, I would stick with haveibeenpwned unless these guys show that they have something similar.
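The clever bit is the k-anonymity scheme the Pwned Passwords API documents: the client SHA-1 hashes the password locally, sends only the first five hex characters of the hash to the range endpoint, and checks the returned suffixes itself, so the server never learns the password. A sketch of the client side (the range URL is HIBP's published API; the response body here is fabricated rather than fetched):

```python
import hashlib

def hash_split(password: str) -> tuple:
    """SHA-1 the password; return the 5-char prefix sent to the server
    and the 35-char suffix that never leaves the client."""
    digest = hashlib.sha1(password.encode()).hexdigest().upper()
    return digest[:5], digest[5:]

def pwned_count(suffix: str, range_response: str) -> int:
    """Parse a 'SUFFIX:COUNT' body as returned by
    https://api.pwnedpasswords.com/range/<prefix>; return the breach
    count for our suffix, or 0 if it never appeared."""
    for line in range_response.splitlines():
        candidate, _, count = line.partition(":")
        if candidate == suffix:
            return int(count)
    return 0

prefix, suffix = hash_split("password123")
# In real use you'd fetch this body over HTTPS; this one is made up.
fake_response = f"{suffix}:126927\nABCDEF0123456789ABCDEF0123456789ABC:3"
print(prefix, pwned_count(suffix, fake_response))
```

Because many real suffixes share each five-character prefix, the server can't tell which one the client was actually asking about.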
[02:17:10] I mean, his data set is pretty impressive. Yeah. So, I don't see any reason to switch. Yeah. If you used a YubiKey or a hardware key on X, X wants you to know they are abandoning the twitter.com domain. And your key won't work anymore. So, make sure you re-enroll your hardware keys. Yeah. They'll let you know. Are they really abandoning it? Like, they'll let it actually expire?
[02:17:39] Or are they just abandoning it? Oh, I'd like to buy that. You're not going to have the infrastructure. I'm just curious. I should buy that. How far does the abandonment go? I can't imagine they're not going to pay the $12.99. They're going to keep you. That's what I'm like. They're going to keep the domain. They're just not going to use it to direct to X. Any other business, I wouldn't even question it. But – You never know. They've been known to make interesting choices. Elon might need that $12. He might. You never know. Yeah, they don't have a choice here, right? Like, a FIDO token, the token itself, it's tied to the domain name, the URL.
[02:18:07] Yeah, it has a cryptographic relationship with twitter.com. Like, with, you know, auth.twitter.com or whatever. And so, they can't swap that over to X. Right. So, yeah. They'll let you know. If you have done that, they will let you know. But you should know that this is a possibility. I immediately went in, and it turned out I also have – maybe this is a bad idea – but also an authenticator, a TOTP that I can use. So, I wouldn't have gotten locked out. It's just my YubiKey wouldn't work anymore. Yeah, that's fine.
[02:18:36] I mean, that's – the only time that that's worse is if you're being attacked with an active man-in-the-middle. Right? So, as long as you're careful about making sure you're actually on X.com and such. Oh, yeah. I did go once to T-V-V-I-T-T-E-R.com. Ah. In that case, I would not use your TOTP. Yeah. That was – But that's a great example. Yeah. That's why the FIDO token is locked to the domain. Right? Yeah. Because in that case, you might not notice the VV. You might not notice a – I didn't.
[02:19:05] It's a Unicode attack where it looks like Twitter, but it's actually a, you know, character – It's fairly easy to spoof. A Cyrillic character set or something. Right? Yeah. But the FIDO cannot be spoofed. The FIDO can't be spoofed. It understands Unicode. It understands the bytewise UTF-8. It cannot be spoofed. That's actually one of the values of using a password manager, right? The autofill doesn't get spoofed either, right? Yeah. There is some trickiness there in that you have to be careful.
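The reason the token can't be fooled by a lookalike domain is that a WebAuthn assertion is signed over, among other things, a SHA-256 hash of the relying-party ID the browser reports. A toy illustration of just that binding (not a real WebAuthn implementation; the real protocol also signs a server challenge and more):

```python
import hashlib

def rp_id_hash(domain: str) -> bytes:
    # WebAuthn authenticator data carries SHA-256(rpId),
    # and the server verifies it against its own rpId.
    return hashlib.sha256(domain.encode()).digest()

real = rp_id_hash("twitter.com")
phish = rp_id_hash("tvvitter.com")  # looks alike on screen, differs here

print(real == rp_id_hash("twitter.com"))   # True: legitimate login
print(phish == rp_id_hash("twitter.com"))  # False: assertion rejected
```

The hash is computed over the exact bytes of the domain, which is why a visually identical homoglyph or "vv" domain produces a completely different value no matter how convincing it looks to a human.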
[02:19:33] Like, the autofill stuff can be tricked in situations where, like – Because it's JavaScript. It's a little bit – Right. If you have, like, a JavaScript injection, if there's an injection vulnerability. The other problem there is, like, in situations where there are subdomains that are under the control of attackers, right? And so, like, some of them are, like, way too aggressive about filling in. Oh, yeah. Yeah. You went to hacker.twitter.com, and I'm going to fill that one in, too. Yeah. That makes sense. Yeah. Yeah.
[02:19:59] And so, what the good password managers do is they try to have a list of these are domains that, you know, have subdomains that you can't trust. This stuff is hard. Or they'll lock – Yeah. Or they'll lock it to auth dot whatever. Yeah. These bad guys. They suck. Right. This is why FIDO is good. FIDO does – Yeah. Use a hardware key. It's a good idea. Yeah. Let's take a little break.
[02:20:25] Final segment coming up in just a bit with a wonderful panel, which I would like to spend many, many more hours with. But the sun has come up in Laos. And so, Jill has got to go run some errands and make some ice cream. I think my brain's finally awake fully now. It's so nice to have you. Jill Duffy writes for PC Magazine and Wired. But don't worry. She's not going to defraud you with her phony receipts for the tips that she gives the busboy. That's not going to happen.
[02:20:54] She's an honest, honest person. Do you want us to put the E in there? I forgot. We usually do put the E in your name. We could put it in there. Yeah. If you want to find me online, I'm Jill E. Duffy. Yeah. Everywhere. We'll put that in. Yeah. That way people can find you. Sure. Alex Stamos is here. He's the one, the only. He's here. CISO for Corridor.dev. There's an unfortunate high school kid in Chicago who's got his name. Does he get hacked a lot? People go after him. Yeah.
[02:21:23] That's terrible. Do not ever. You never want the name of a famous security researcher because you're going to just be in trouble. Sorry, kid. Sorry, kid. By the way, I was going to use Hide the Boots as the show title. You cannot fool AI anymore. You know, you told that Hide the Boots anecdote in 2009 and it was used then. Thank you.
[02:21:50] I think there should be a, I mean, there's a statutory limit, right? I mean, after 2009, after 16 years, I should be able to tell the same anecdote one more time. What do you think, Stacey? I think it's okay. I would even give you one or two years. Yeah. Nice. 16 years. Come on, man. But no, AI knows all. AI or our engineer, Patrick Delahanty. Maybe it was Patrick's doing. Stacey Higginbotham is also here from Consumer Reports.
[02:22:19] Glad to have all three of you and glad to have our club members who are so great and fund a lot of what we do here. About 25% of our operating expenses come from Club Twit. If you're not a member, lots of benefits, including Stacey's book club. We've decided on a book. Stacey says it's a depressing book, but it's going to be fun, isn't it, Stacey? So fun.
[02:22:44] If you want to extrapolate out, it's, it's kind of like what, I mean, okay, this is way too much credit, but like, you know how Neuromancer predicted the future? Yeah. Well, this kind of predicts the future. It's just a really terrible future. Well, what the heck? Well, these things happen. Anyway. It's not like the times we're living in are so great. Have we set a date for Stacey's book club? I think we're going to do it next month, but I don't know if we set a date yet. Don't have a date. Okay. So we will set a date, but that's one.
[02:23:12] We do a lot of things in the club because we want to make it fun. We want you to be a member and be glad you're a member. 10 bucks a month there. By the way, this is a good time to join because we have a coupon at twit.tv slash club twit for 10% off the annual plan. Best price. That's a good way to do it. Also good for gifting to the geek in your life. There are family plans and corporate plans as well. And as I said, we, we make it fun. You get ad free versions of all the shows.
[02:24:10] You get lots of stuff going on in the club Discord. Our AI user group is Friday. That's going to be fascinating. Show how you can make your own MCP. I think Darren has said he would help us with that. That's great. Snow Crash, not Neuromancer. Neuromancer talks about the future too. Both of them do. Both of them count. Yeah. But Snow Crash is like the quintessential one. Yeah. It's where the word metaverse came from, right? Yes. But on the other hand, cyberspace came from Neuromancer.
[02:24:10] So both of them are important. Anyway, join the club. I don't know. I'm easily distracted. I'm a little ADD. Twit. Twit.tv slash club. Twit. And we thank you all so much for your support. Our show today brought to you by Melissa, the trusted data quality expert. They've been doing it longer than we have since 1985. Of course, it started with address validation. Address validation today still. 40 years later is Melissa's bread and butter. But they do so much more.
[02:25:08] They're really data scientists. But let's talk about address verification for a minute. Melissa's address verification services are available to businesses of all sizes. They're very affordable. And Melissa's address validation app for Shopify, as an example, is vital for e-commerce merchants. Especially if you want to do international business. International companies like Siemens AG, they manage a diverse group of customers and clients all over the world.
[02:25:08] They have to have country-specific address formats for many, many countries. They've got to make sure the data they hold is correct. If not, they can face significant costs and delays to supply and production chains. It could be a nightmare. Well, since they started using Melissa, Siemens AG has reliably processed more than half a billion queries for 174 countries. That's close to all of them using Melissa's dedicated web service.
[02:25:38] Ask the global IT head of master data management at Siemens. He says, thanks to these very stable solutions, we've achieved an automation rate of over 90%. Melissa reacts very quickly to our requests and offers us the right solutions to questions that come up, and they consistently meet our service level agreements. They're happy. You will be too. Data quality, it's not just Siemens. It's essential in any industry. Melissa's expertise, as I said, they're data scientists. It goes far beyond address verification.
[02:26:07] Many banks all over the world have know your customer regulations, similar regulations. MetaBank, like any bank, absolutely must know the exact identities of all its customers. These are federal regulations. However, it's problematic, especially if a bank's customers include not only its own retail clients, but also hundreds of organizations with their own customers. And that's what MetaBank faces, an exponentially greater challenge. Senior VP of Data Systems and Business Intelligence at MetaPayment Systems says, quote,
[02:26:37] I believe Melissa has helped us improve not only data quality, but also our downstream experience for end users. We're now able to identify everything from fraud to missing data and allow our individual customers to swipe their cards with confidence. And importantly, as every data engineer knows, having clean data translates to the bottom line. Melissa saves you money. And of course, your data is always safe with Melissa. They're compliant. It's secure.
[02:27:06] Melissa's solutions and services are GDPR and CCPA compliant. They're ISO 27001 certified. They meet SOC 2 and HIPAA high trust standards for information security management. You know your data is secure with Melissa. They go the extra mile. Get started today. 1,000 records cleaned for free. Melissa.com slash twit. That's Melissa.com slash twit. We thank them so much for supporting Twit. They've been with us for a long, long time.
[02:27:35] We're really glad to have them. This was actually a topic of conversation in our community. A bunch of tech tutorials were removed from YouTube. I don't know if this is the right denial. They say AI didn't do it. Well, okay. Educational videos that YouTube had allowed for years are suddenly being flagged as dangerous or harmful.
[02:28:04] No way to trigger human review to overturn them. Creators were pretty sure AI was running the show Friday. Yesterday, a YouTube spokesperson said, we've reinstated those. It wasn't AI and we're making sure that doesn't happen again. Sorry. What do they have? A renegade content guy? I don't know.
[02:28:29] Some of them were, you know, people thought, well, you know, how to install Windows 11 without having to use a Microsoft account? Maybe Microsoft thought that was piracy. It's not harmful. Anyway, YouTube says, sorry, it won't happen again. Now, here is something that might not happen again.
[02:28:54] Maybe you saw the YouTube video by Trevor McNally. He is a lockpicker, a former Marine staff sergeant with 7 million followers. You probably watched his videos, 2 billion views. A Florida lock company called Proven Industries in March posted a promo video on their social media accounts saying, you guys keep saying you can easily break off our latch pin lock.
[02:29:25] No, you can't. So McNally made a video. Apparently they didn't like it, because he was drinking a Juicy Juice, swinging his feet; he hops down from his seat, goes over to a Proven lock on a trailer hitch, and uses a shim from a can of Liquid Death that he cut, and opens the lock. So Proven didn't like that so much. They tried to shut McNally down.
[02:29:55] They tried to get him taken down with DMCA takedowns on YouTube. He didn't bow to the pressure. In fact, he made several more. In fact, because Proven said, oh, he's doing a very, it's very tricky. You got to cut this shim carefully. He said he actually did it on camera. He finished the can, cut the shim, opened the lock in seconds. Proven sued him.
[02:30:21] Hey, there's this thing called the Streisand effect you might want to know about, Proven. They charged him with copyright infringement, defamation, false advertising, violating the Florida Deceptive and Unfair Trade Practices Act, and tortious interference with business relationships, civil conspiracy, trade libel, and unjust enrichment. But what they really didn't like was he was drinking a juice box, swinging his legs.
[02:30:51] It's actually in the court papers. McNally appears swinging his legs and sipping from an apple juice box, conveying to the purchasing public that bypassing plaintiff's lock is simple, trivial, and even comical. How dare he? Of course, as you know, McNally had, as I just mentioned, quite a few fans who immediately
[02:31:16] started commenting on Proven's posts and product videos, mocking Proven. Uh, they doxed the, uh, proven executive, uh, which no one should do, but, uh, I think the whole thing is going to go away. The judge said, did the plaintiff bring a lock and a beer can?
[02:31:48] She wanted the plaintiff to actually show that it couldn't be done. There was going to be no live shimming in the, uh, in the courtroom. The judge, after several hours, said, I'm declining to grant the injunction, and, uh, the purpose and character of the use to which Mr. McNally put the alleged infringed work is transformative. It's a critique. It's his own way of challenging it.
[02:32:14] He went to court and he, he got his fair use proven, which I tell you is something I'm not prepared to do. So don't come after me anyway. The company dismissed the lawsuit. They ran as fast as they could in the other direction with their tail between their legs. So 10 million people watched the, the lockpick defeat of the Proven lock, and, uh, Proven was not able to, uh, shut them down.
[02:32:42] I love those stories because most of the time, as, uh, I think Mike Masnick on Techdirt said, fair use is just the right to hire a lawyer. And most people say, including us, we're just going to back down when you, when you do that. That's why we don't show videos anymore. My favorite line in the article was, um, the judge stepped in and just declined the injunction and said, um, for her to grant it, Proven would have to show that it was likely to win at trial, among other things. It had not.
[02:33:14] Yeah. They gave up. They gave up. Uh, oh, well, that's a, it's a happy ending. Now, how do you feel about Samsung's $2,000 fridge showing you ads on that big screen in the front? Yeah. I'm against it. And you know what? It's part of that whole software tethering we yelled about, like, back in the day. It's the concept that you don't actually – if you're going to have a cloud connection, you don't actually own your product. That's right. We don't have rights around them yet.
[02:33:43] My friend has a Samsung fridge and the browser is so out of date, it doesn't work anymore. So maybe they'll use the money from the ads to pay to update the browser. Next week, I am finally, finally producing the longevity-by-design recommendations for device manufacturers. It's appalling. Just appalling. Yeah. Uh, WhatsApp can now use passkeys to secure your backups?
[02:34:12] That's good news. Um, right? We like passkeys, Alex. Do we like passkeys? We do. Yeah. Just checking. I mean, this has always been a real challenge with end-to-end encrypted messengers: the expectation of people is that, uh, if you lose your phone and restore it, your, your chats are there. Uh, but if you're going to do that, if you're not Apple and you don't own the whole stack, and
[02:34:39] you have this really kind of complicated way of, uh, providing, uh, end-to-end encryption, uh, it's turned out to be really hard. In fact, iCloud backup has some really not so great, uh, security properties itself. Um, and so WhatsApp first provided the ability to back up your chats. And when they first did that, the backup itself was not encrypted. So all the work they did around end-to-end encryption, which shipped while I was at Facebook, um, was negated by turning on backup.
[02:35:06] So they then, uh, created the ability to encrypt backup, but you had to remember the passphrase. Uh, and so now this does allow you to, it basically kicks the problem to pass key sync. Um, but for, for example, if you're doing it with, uh, uh, uh, an Apple device, um, Apple has a secure way of syncing pass keys between devices. That is, that's relatively new, right? The original Fido spec that had, had no way to do that. No, no pass keys.
[02:35:34] There are, there are different kinds of passkeys. And so this is where it starts to get a little bit complicated, and now things are starting to get a little confusing for consumers. There are passkeys that are hardware only. So, uh, let me, I can grab an example here. So here's, like, a, uh, YubiKey. Um, different kinds of FIDO tokens, uh, FIDO2 tokens, have identifiers.
[02:35:59] And so if you're an admin, uh, of, like, a Microsoft Entra or Okta, for example, if you have, like, an advanced enterprise, uh, authentication environment, you can actually tell the difference between different kinds of passkeys. And you can say, I only allow hardware passkeys versus ones that can be synced. Uh, and so from a Google Gmail or Facebook perspective, they're all the same. From an enterprise perspective, you can actually differentiate. Ah, interesting.
[02:36:26] But yes, uh, FIDO2 does have the ability to have the syncable passkeys. They're supposed to still be biometrically, uh, tied to people, and they're supposed to still be stored in a way that is encrypted and hardware-secured. And then if they're synced, there are rules around, uh, syncing them, that they shouldn't just be stored in an insecure manner. Um, but, like, for example, the ones that are stored in 1Password and such, uh, you can come up with these scenarios in which they can be compromised. People often... It does reduce the security in some ways.
[02:36:54] Assume that these encrypted messengers, uh, the backups are also encrypted. They are not necessarily. As, uh, as Spook and Sugar and, uh, and, uh, Peso and Scary Terry learned, uh, iCloud backups, uh, not necessarily, uh, encrypted. That's... Those are the mafia guys who got busted in that, uh, gambling scandal. How did they find out?
[02:37:21] Uh, the, the feds got into the, uh, iCloud backups. That's right. Unless you're running Apple's Advanced Data Protection, uh, which you can opt into, iCloud backups are not necessarily... Sorry, quack, quack. Yeah. You're going to jail. Oh, I don't know if they're going to jail. I shouldn't say that. It's alleged, allegedly, that they were involved in the poker scam. Allegedly. Allegedly.
[02:37:51] Which is an incredible story, isn't it? And one of the biggest scandals in NBA history. Certainly since, like, the Tim Donaghy... Yeah, although I think some have pointed out that it's a little overstated, uh, that the FBI was really hyping up this NBA angle, but, uh, it was really mafia guys. They were using the NBA people, like, as shills to attract, like, for advertising. I'm sorry, yeah. I, I, the, the real NBA scandal is the other one, the, the betting scandal where people
[02:38:18] have been, um, there's also, like, simultaneously there's been a betting scandal, uh, involving, like, a friend of LeBron James, who was leaking, uh, to two friends who were betting on whether LeBron, uh, was, uh, injured, on the injured list and stuff. So there's been... Oh, I missed that one. Okay. Yeah, yeah. So this is another whole NBA betting scandal based upon all the, all the prop bets, right? Uh, because you can bet on these weird things. This is just, this is just the beginning of that. I mean, you see now...
[02:38:46] This is why legalized gambling is just a really awesome thing. Especially with prop bets, because every second there's another bet. And, I mean, this is a nightmare. And if I feel, I feel terrible for anybody who has a gambling problem, because it's, it's now in your pocket. There's no way to avoid it. And you watch an NFL game, it's nonstop, the advertising for it. Well, I don't know if you just saw, just a couple days ago, Brian Armstrong, the CEO
[02:39:12] of Coinbase blew up a bunch of this, because there was side betting on the prediction markets of whether he would say certain words on his, his results call, on the Coinbase earnings call. Yeah. And somebody on his team had told him about this. So he said all the words? He just read through the list of words at the end of the call. Prediction markets are the sneaky way to get these prop bets into real life.
[02:39:42] Unbelievable. Good for him, I guess. I mean, there's some unhappy people probably as a result. But it's also like, you should not, you cannot, you should not allow betting on things like, does a person say a word on a call? But right. How is that going to stop? Right. At least, like, with an NBA game, there's a ton of people who have, you know, personal money riding on it. Like, the players want to win, you know. Like, there's. Whether you say a word or not?
[02:40:12] He doesn't care. Right. Like, there's just no, there's, there's no externalities. It's crazy. Yeah. But it's like Bitcoin. There's no externalities there either. You still a Kings fan? I am still a Kings fan. It's, it's a little rough. Sorry about that. I'm a Kings fan and Cal fan. I'm wearing my Cal colors. Uh, you had another story about, like, I, I couldn't watch the game yesterday because I had no ESPN on my YouTube TV. That's still going on.
[02:40:37] And it's going to be an issue because tomorrow it's Monday Night Football, which is on ABC and ESPN. So people with YouTube TV are not going to be able to watch Monday Night Football. I pay for the NFL Sunday Ticket. And, uh, what are they going to, what is YouTube going to do? They're going to make a deal eventually. Eventually. Yeah. Disney. Uh, part of the problem is, and this is kind of new.
[02:41:02] You've seen these carriage battles go on in the past, but what's different now is Disney ESPN, they have their own streaming services. They have Fubo. They have Hulu. They have ESPN streaming. They have an incentive not to let YouTube rebroadcast. Yeah. And they don't have to abide by any FCC public license. Oh yeah. They're not going over the public airwaves. So there's no, I mean, there's no regulator in the picture here.
[02:41:31] So it was a disaster on Saturday with college football. Just this is capitalism at work. Uh, yeah. No Kings broadcasts, huh? No, I mean, Kings are in, uh, it's TNT, so it's, it's okay. But yes. Oh, okay. Uh, the Cal game was, it was ESPN. So that was rough. I didn't get to watch. I had to listen on AM radio. Like it was the fifties. There've been all these articles, uh, how you can watch these games.
[02:41:59] If you have YouTube TV, and in most cases they're really, you know, yeah. Listen to the AM radio. They should do the voices, right? It should be like, ah, he's got the ball, the 10-yard line. It's like, it's like Ronald Reagan going, it's a hit, it's going up. That, by the way – when Dutch Reagan, before he was the president, before he was the governor, before he was an actor, used to do baseball play-by-play, he would do reenactments on the
[02:42:25] radio where he would hit with a stick to make the sound of a ball. He would be reading it off the wire. Uh, what happened in the game? Wow. Doing the play-by-play. No, I feel really old these days because I work with all these Gen Z-ers, but I feel better now coming on TWiT. That's really old, isn't it? You make me feel young. That's really old. I have, I got a million of them. That's Alex Stamos, ladies and gentlemen. Check out Corridor.dev. He's the CISO there. He's one of the good guys. And, uh, thank goodness.
[02:42:57] Thank goodness you're, you're there. Just keep up the, I know it's hard. Keep up the good work. Uh, we didn't even get, uh, we didn't even get to the story about how someone snuck into the Microsoft Teams call with Cellebrite. 404 Media had this story, and, and leaked the phone-unlocking details that Cellebrite was, was talking about, was pitching.
[02:43:22] Almost all Google Pixel phones, except for the most recent, are hackable by Cellebrite. So I guess I'm going to put, uh, I'm going to put, uh, GrapheneOS on my Pixel 9. Is Graphene safe, Alex? Uh, it sounds like it from this. It doesn't, it's not Cellebrite-able. It's not Cellebrite-able. So that's good. Um, yeah, I mean, I think, I mean, one of the big upsides of the Android ecosystem is the ability to change out your OS. Yeah.
[02:43:50] Well, not on all, not on all Android phones. And I think. Not on the Pixels, right? The Pixels are all unlocked. Yeah. You can root them. I'm going to put GrapheneOS on it tonight. Thank you, Alex. Great to have you. Stacey Higginbotham. We will make a date, uh, soon for the book club. I loved our last book, A Memory Called Empire. In fact, I'm reading the second volume of that. I can't wait to find out what's going to happen in London. What's, you're reading it now. What's the name of the book? Uh, oh, why, why do you, the hollow heist of London?
[02:44:21] The hollow heist of London. Yeah. No, The Heist of Hollow London by Eddie Robson. That's going to be our book for the book club next month. So get, get going, read it. Uh, look for Stacey's work, uh, at Consumer Reports, where she's a policy fellow. I can't wait to see the stuff you're, it sounds like you're working on some big stories, big stuff right now. It's great. It's all big stuff. It's all important. It's all important. No, you're doing again, you're doing, uh, the work of the angels. Thank you, Stacey.
[02:44:50] And Jill E. Duffy is an angel, a contributor at PC Magazine and Wired. She got up very, very early to join us. I appreciate that. Thank you so much. It's great to see you again. It's been too long. Thank you. Anything else you want to plug? You can find me online anywhere at Jill E. Duffy. Um, Instagram I've been using, and Mastodon I like to use. I gave up Twitter. I don't really use BlueSki, but I have an account there. I like it. BlueSki.
[02:45:20] That's good. I'm calling it BlueSki from now on. I like it. It's the Polish Twitter. BlueSki. Yeah. Yeah. Um, somewhere in that region. Instagram and, uh, what was it? Instagram Mastodon is the other one I use. And which server are you on on Mastodon or does it matter? I'm on the big one. Mastodon.social, I think. Yeah. So if I go to our Mastodon on twit.social and I enter Jill E. Duffy, it should be able to find you. I think it, I think it works like that.
[02:45:50] Thank you, Jill. Appreciate it. Thanks for having me. Yeah. Thanks to all of you for joining us. We do TWiT on Sunday afternoons. Uh, yes. And we did go to standard time. So yes, we started an hour late. I hope you didn't get here an hour early. We are at 2 to 5 PM Pacific. That's 5 to 8 PM Eastern standard time. That's 2200 UTC, because we moved, but UTC didn't. 2200 UTC.
[02:46:19] Uh, you can watch us live on YouTube, Twitch, TikTok. No, we took down TikTok. It's too complicated. YouTube, Twitch, Facebook, LinkedIn, x.com, and kick.com. Plus, of course, if you're in the club, you can watch on the Discord. You don't have to watch us live. That's just if you want the freshest version. We'll take out all the swear words and we'll put it up on the internet on our website, twit.tv. There's a YouTube channel dedicated to the video. There's audio and video available for subscription in your favorite podcast client. That's probably the best way to get it.
[02:46:49] Leave us a review. Let the world know. When you've been doing a show for 20 years in podcasting, people, you know, wonder if you're still around. So make sure you put a review up so people go, yeah, they are still around. And you know what? I bet you're pretty glad you listened to this show. Lots of good stuff. Thank you for being here. We will see you next time. And as I've said for 20 years, and I hope I'll say for another 20 years, thanks for joining us. Another TWiT is in the can. Bye-bye. This is amazing.
