Microsoft quietly hands over BitLocker keys to the government, TikTok's new privacy terms spark a user panic, and Europe's secret tech backups reveal anxious prep for digital fallout. Plus, how gambling platforms are changing the future of news and sports.
- You can bet on how much snow will fall in New York City this weekend
- Europe Prepares for a Nightmare Scenario: The U.S. Blocking Access to Tech
- China, US sign off on TikTok US spinoff
- TikTok users freak out over app's 'immigration status' collection -- here's what it means
- Elon Musk's Grok A.I. Chatbot Made Millions of Sexualized Images, New Estimates Show
- Microsoft Gave FBI Keys To Unlock Encrypted Data, Exposing Major Privacy Flaw - Forbes
- House of Lords votes to ban social media for Brits under 16
- Overrun with AI slop, cURL scraps bug bounties to ensure "intact mental health"
- Route leak incident on January 22, 2026
- 149 Million Usernames and Passwords Exposed by Unsecured Database
- Millions of people imperiled through sign-in links sent by SMS
- Anthropic revises Claude's 'Constitution,' and hints at chatbot consciousness
- The new Siri chatbot may run on Google servers, not Apple's
- A Wikipedia Group Made a Guide to Detect AI Writing. Now a Plug-In Uses It to 'Humanize' Chatbots
- GitHub - anthropics/original_performance_takehome: Anthropic's original performance take-home, now open for you to try!
- Telly's "free" ad-based TVs make notable revenue—when they're actually delivered - Ars Technica
- Toilet Maker Toto's Shares Get Unlikely Boost From AI Rush - Slashdot
- Dr. Gladys West, whose mathematical models inspired GPS, dies at 95
Host: Leo Laporte
Guests: Alex Stamos, Doc Rock, and Patrick Beja
Download or subscribe to This Week in Tech at https://twit.tv/shows/this-week-in-tech
Join Club TWiT for Ad-Free Podcasts!
Support what you love and get ad-free audio and video feeds, a members-only Discord, and exclusive content. Join today: https://twit.tv/clubtwit
Sponsors:
[00:00:00] It's time for TWiT. This Week in Tech, Alex Stamos joins Patrick Beja and Doc Rock. We'll talk about betting on the amount of snowfall that hit New York, Microsoft giving up the keys to BitLocker to the feds, and why I deleted TikTok as soon as I saw the new Terms of Service. This Week in Tech is next. Podcasts you love. From people you trust. This is TWiT.
[00:00:35] This is TWiT. This Week in Tech, Episode 1068, recorded Sunday, January 25th, 2026. Totos Electrostatic Chuck. It's time for TWiT. This Week in Tech, the show we cover the week's tech news. Hello, everybody. I hope you're staying warm and you're safe and your power is on and you're not listening to this on a battery. If you are, turn it off and save it for when the power comes back.
[00:01:03] Oh, maybe you don't want to because we've got a great show lined up for you. Patrick Beja joins us from Paris, the host of the Phileas Club and a bunch of stuff in French. Hi, Patrick. Hey, Leo. How's it going? NotPatrick.com if you want to see them all. It's great to see you. I see you too. I see you too. Yeah. You know, we almost, I'm not sure we would have been allowed to talk if things had gone differently in the last couple of weeks.
[00:01:32] We're still NATO buddy. Yeah, we're still NATO friends. We're still able to converse. So far. We'll talk about that in a little bit because I didn't realize this, but there was real fear in the EU of a Danish invasion, which would have been essentially a war between NATO powers. And I didn't realize how scary that was to people in Europe, but we'll talk about it. Yeah. Yeah. We can talk about it. Yeah. We'll talk about that.
[00:02:00] I want to introduce the rest of the panel so they can get in on this. Doc Rock is here. He's in Hawaii. We have not yet declared war in Hawaii, but it is one of the islands week. We own. We kind of, we kind of took it by force a little while ago, but yeah. Hey doc. How you been? Great to see you. I've been. I feel like I saw you last Tuesday. You did actually. I hope we're not using up our credits with Doc Rock. Oh God. No, I love it. I love to hang out here. Good.
[00:02:29] Doc Rock is of course on YouTube, director of strategic partnerships at ECAM. And we always are glad to welcome Alex Stamos to the microphones. Alex is chief product officer at Corridor.dev, which is a AI security. Startup. It's great to see you, Alex. Hey Leo. Thanks for having me. I know when I say your name, people go, wait a minute, Alex Stamos, because boy, these, you know, you were at Meta, you were at Zoom brought you in as a consultant when they
[00:02:57] had that little kerfuffle during the COVID pandemic over end to end encryption. You helped them get that all back together. Then you were at Stanford at the Internet Observatory. We're the director of the Internet Observatory, which is all about watching disinformation. But apparently disinformation is all the rage now so that there's not much future in that anymore. It's just called information now. It's just information. Right. Exactly. That's just it. Oh, Lord. Oh, Lord. Oh, Lord. Oh, Lord.
[00:03:27] Oh, get me started. And so anyway, it's wonderful to have you, Alex. And and I hope you well, I'm glad that you guys are in areas that it's not snowing. Many of our listeners, of course, are in the swath of this terrible freeze. I thought it was kind of interesting. And this is the one tech story, at least so far having to do with this, that the the prediction
[00:03:53] markets are now huge about any news story. It's all about what you can bet at Calci and Poly Market. You can bet how much snow hit. Well, I probably can't anymore. I think it's over. You can bet how much snow will fall in New York City this weekend. But basically now. Traders on Calci bet nine hundred thousand dollars as of yesterday afternoon on whether
[00:04:22] 12 inches of snow would fall in New York. On Poly Market, two hundred ten thousand dollars on how much snow the winning category so far eight to ten inches. So if you bet more than 12, you lost your bet. But this is what is going on now. Billions, it seems certainly hundreds of millions are trading hands on these new markets betting. It's almost like gambling is addictive.
[00:04:52] Almost. It sounds that way. I mean, I understand that it's very difficult to regulate gambling when you have the Internet. I guess that's why sports betting is being allowed now, because you can always go outside of the region where it's legal and bet there. But it is. I mean, or maybe you don't have to allow it. I don't know. Maybe it's a choice. But I still think it's concerning because it's gambling. Right.
[00:05:22] It's just gambling applied to everything. And I think what happened, at least in the United States with our American football, the NFL, is it now dominates. In fact, I would say it's all sports. It dominates the coverage. You can't talk about a game without saying the over under, who's the favorite, what the odds are. And you see it in tickers going across the bottom.
[00:05:47] And there's some concern that that's going to happen to news in America, that you will see the over unders on whether we're going to invade Greenland. And that suddenly it's going to become all about betting on the news instead of the actual impact of the news. You guys, I know Alex and Doc follow sports closely.
[00:06:10] Do you think this, all of this, you know, DraftKings sport bet has, I think it's tarnished the reputation of major league sports? Yeah, in a way, you know, it's funny. Sorry, Rob. Nowadays, here's what's really, really sad. I was watching AFCON the other day and there was a controversial call. And so there's going to be a penalty kick. And the player who's a penalty kick, well, he's like half Moroccan.
[00:06:39] And he missed the penalty kick that Leo could make. And then the suspicion. Thank you. The first thought wasn't, oh my God, the pressure or it was a good idea, but kind of dumb place to do this idea. Because it does work. When it works, everybody goes, I can't believe they had the balls to do it. It's called the palinka. It's basically you send it lightly down the center like your sister kicked it. And goalies missed those because they dive left to right. Right. Goalies take a chance, dive left to right. So he was doing something reasonable. He was doing something very reasonable.
[00:07:09] But the first thought that went in my head and I'm not a conspiracy theorist type person was, did he do that on purpose? And, you know, him being like only half Moroccan, like, you know, is that is like really is this is he is he on the sports bet? And I can't believe I went there. But as a Raiders fan and knowing Tom Brady's cheating. But, yeah, that's the thought that came to my mind. It really is. Alex, you're a Kings fan. I know. Does it impact? I'm a Kings fan and a Cal fan. Cal beat Stanford last night. Oh, really?
[00:07:38] They had basketball. The axe goes to Berkeley. Well, right. So the axe doesn't go with basketball. This is football. Unfortunately, the axe is at Stanford. Look, so a little hatchet. There's there's the famous quote, which I think is attributed to Heinlein. I don't know if that's accurate or not, that every generation thinks it invented sex. And so now every generation thinks they invented gambling. Yeah.
[00:08:01] And and this is just, you know, Gen Z sports book has always been around, but it was always secretive. It was a bookie. Right. Well, so I think there's a couple of differences here. Right. You know, we've we've always had gambling. We've always had gambling addiction and we've always had gambling scantum. Scandals, right? Everybody here or at least you and me are old enough to remember Pete Rose. Right. Oh, yeah. Other, you know, other scandals that happened. I think a couple of things are different here. One obviously is with the Internet. You don't have to know a bookie in town or you don't have to go to Vegas. Right.
[00:08:30] So the fact that young men are doing this on their phones all the time and are getting very aggressively addicted is can be a problem. The second is the amount of prop bets that happen. Right. Like if you're if you're gambling on the outcome of a game in, you know, a major sporting league, then there's all kinds of countervailing motivations. Right. Like we have even in those cases had people cheating.
[00:08:59] We've had rest cheating. We've had players trying to shave points and such, but at least you've got a bunch of different, you know, in a basketball game, you've got five guys on either side. You've got the refs, you have a lot of actors. So it's not easy for one individual person to swing the game. But now with all these prop bets, you've got how much is this person scoring? You've got, you know, how many fouls and you've got these crazy things outside of sports. Right. So you've got is this person going to say this thing in a call for the quarterly results?
[00:09:29] You've got how long does the press secretary's briefing go? Right. And so you saw, you know, that Trump's press secretary call her briefing within seconds of a bunch of people making money. And so when you're talking about the actions of one individual for which there's no actual financial benefit for them one way or another, it is so incredibly gameable that I am shocked.
[00:09:54] I'm absolutely shocked that these platforms are allowing it to be bet on because the the ability for those people to self deal or for their friends to self deal is humongous. And the fact that, you know, apparently we live in a time for which nothing matters and there's no kind of regulation. But this is you know, this would normally be called wire fraud for a group of people to work together to then use inside information to for hundreds of thousands or millions of dollars to be ripped off.
[00:10:21] You know, we just had a scandal in the NBA where people actually got arrested because, you know, there are laws around betting around sports at least. But for all these other prop bets, there seems to be absolutely no regulation at all. So to give you some numbers, according to the Financial Times, in early 2024, bets placed on Polymarket and Calci were 100 million dollars a month. In November of last year, 13 billion dollars in one month.
[00:10:50] They're betting on everything. As you mentioned, they bet on whether Nicolas Maduro would be captured. Right. Somebody made a last second bet and made a ton of money. Yeah. So you think about how many people in the military knew. So somebody knew, right? Somebody was sitting there and thinking, well, let me just. And so there's risks. There's all kinds of risks. There's risks to the military. There's risks of getting ripped off. Calci is worth 11 billion dollars.
[00:11:19] They raised a billion from Paradigm, Sequoia, Andreessen Horowitz, Capital G, big VC firms in the United States. New York's, the owner of the New York Stock Exchange said it was going to invest two billion dollars in Polymarket. They're valued now at eight billion compared to Calci's 11 billion. What's interesting is they're calling these financial contracts in order to skirt around gambling laws in many states. Yeah. Yep.
[00:11:49] So, but this is gambling. That's so dumb. They have. That's the most surprising like thing to me. It is gambling is illegal unless it isn't in specific places. Right. And these kinds of bets would have been considered gambling and would have been illegal until those, you know, first the sports betting apps. And now these kind of became allowed.
[00:12:18] I guess I haven't been following the legislative process on that. You know, it's interesting in the United States, the stakeholders in this are Las Vegas and Atlantic City. The casinos. Yeah, of course. Don't want online gambling. They've put a lot of money into California. There was a California initiative, a referendum last year to make it legal. They put a lot of money into that to try to keep it from becoming legal because they want the punters to come to Vegas to make their bets.
[00:12:48] The government doesn't seem to really care that much. The government like the Commodities Future Trading Commission, which is the regulator that, believe it or not, is responsible for these. Polymarket and Kalshi don't really care that much. It's it's the casinos that care.
[00:13:08] But have we suddenly decided that gambling is not because the reason it can you know, there are specific places casinos is that it's not allowed in other places. Right. We're we're we're in agreement just to put up a little speed is illegal. Yes. So how come no one is mentioning that this is gambling and maybe we shouldn't be like all of a sudden it's everywhere?
[00:13:36] You are. So there's there's multiple lawsuits happening and effectively there's a bunch of fights happening under state law. There's a bunch of lawsuits. There's a lawsuit from a Nevada. So Nevada, Nevada's Gaming Commission has sued. There's been in Indian tribes with have. Well, that's yeah, that's the other stakeholder is right. So Indian casinos. Yeah. So what's happened is effectively with the switch over, the Trump administration has stopped, you know, I. Prosecuting.
[00:14:06] Yeah, stop prosecuting. Right. So like the federal regulators have stepped back and U.S. attorneys have stepped back. That's not going to last forever. Right. Like in three years, there'll be a new administration. It will most likely be a party. And I think. You're an optimist, aren't you, Alex? I am. So I'm not going to be a doomer. There's all these like blue sky doomers who are like, we're never gonna have another election or whatever. That's right. Like it is coming there. There is going to be a massive change here in the United States. I think it will be significant. There will be. There will be.
[00:14:33] And I think the the tech people who are aligning themselves with the current administration so hard are. They're going to hurt themselves and honestly, they're going to hurt tech overall. I think like this is like when I talk to people here in Silicon Valley. Alex, they're going to make the same U-turn they made two years ago. That's like, oh, the bits will fly out. We're going to get woke again. Yeah, but this is I think this is the I think it's be very hard this time. I mean, this is my fear is that in for folks like this, I I think it's great for them to be smushed. Right.
[00:15:01] And like, I don't mind like, you know, for things to like snap back on things like cryptocurrencies and stuff. Right. Like what I am afraid is for tech overall is that they're like this, this relatively small number of like outspoken venture capitalists and and, you know, other douchebags who have like backed this current administration have are going to poison the well on like American tech competitiveness for a decade by basically turning a huge chunk.
[00:15:32] And that's the big chunk of the American populace against like because a handful of people who stood in that line at the inauguration are going to make people think that that's all tech people when like the median tech even executive who lives in San Mateo or Santa Clara County is effectively like a West Wing liberal. Right. And but that's not who you hear from. I can tell you that looking at it from here, we are not seeing that.
[00:15:57] Right. That's not what you see. That's what you see from like the handful of really loud people. Yeah, exactly. It's not just really loud people. It's Tim Apple. It's Sundar Pichai. It's CEOs who rightly or wrongly think we've got to play ball with the current administration or we're going to put out of business. And that's the part that sucks because I know I know for a fact that he's probably technically not even in that camp.
[00:16:22] But oh, I mean, he has personal things that are completely against the administration, but he got to make the company afloat. You got to take it to stockholders. The thing that's upsetting me about the gambling thing right now is they are preying on the people who feel that the financial situation is never going to get better. And whenever the financial situation doesn't get better, people try to take gambles. Now, these gambles are talking about they got dumb stuff like who's going to win the next presidential election?
[00:16:50] You know, we haven't even picked candidates yet, but they got like, you know, J.D. Vance versus Gavin Newsom. You want a dumb one? Betting that Taylor Swift will be pregnant by March is one of the big. Yes, I saw that one. So I wanted to say Kelsey. Oh, my God. They heavily advertise on IG. So if you're an IG, you see this stuff and I'm like, you know, these bets are super stupid.
[00:17:12] But the thing is and coming from being basically been, you know, East Coast in the hood at the time when the money gets bad, people start doing, quote unquote, illegal things to try to get the money. Then the Reagan administration decided to lock all of us up for it. Right. While, you know, Tom Cruise gets caught with a little small bag of, you know, booger sugar and he goes to rehab. Take that same booger sugar and turn it into crack, which is one tenth the amount.
[00:17:41] Yeah. And put it in the hands of somebody on my side of the fence. And it's 20 years automatically. Yeah, that's right. And so they're preying on people being down in their money right now. And this is no different than what the crack situation was back in the day. It just doesn't seem as painful. But trust you me, it's painful. There are kids right now that they're wasting their college fund. There are parents right now that are taking wild swings and they're losing. And yeah, it's not only is it addictive, it's highly destructive.
[00:18:11] And then you know what you do to recover your money after this? That call the local person in your neighborhood that knows how to get money the way my neighborhood got money. And then you get worse. Yeah. Yeah. And the VIG kills you. A hundred percent. This is really bad. Yeah, it's going to ruin lives. And the pushback is going to be huge. I thought this was going to be a nice, upbeat way to start the show. Leon, I don't think you have the right guests for like the super upbeat show.
[00:18:40] No, it's not. I'm going to start with something. It's not even about the guests. This is literally gambling. Like there's no discussion. The only reason this isn't a bigger deal is that there are other much bigger deals, you know, in the world. This is the secret. It's it's it's crap all the way down.
[00:18:58] And it's very clear that a lot of this is that Putin style flood the zone where if you if you spew a huge amount of crap, people, you know, we can only take so much. And then you get to the point where you should at some point people just go, I give up and they turn out. And that's of course, that's the best possible thing that could happen. I hear so many of my friends with the I give up language and I'm like, no, please don't go to here. Like, I don't even like your take on it, but please don't give up.
[00:19:27] Honestly, I think it's fine to give up. I think it's fine to give up because you're going to ruin your mental health. Well, that's why you want to give up. It's driving. I can't sleep. You want to give up. I can't sleep. As long, you know, just three bottles of Japanese whiskey. You're back from Japan. I'll see you. Yeah, that helps. But like give up on the constant barrage. Just watch the news from a channel. You know, France 24 has an English channel.
[00:19:57] Just watch the news once a day from France 24. Read a reputable newspaper and then vote when it's time to vote. That's the important thing.
[00:20:11] Like you're not going to, you know, you're not going to change things or understand things better by watching the constant barrage of news of either the news, you know, 24 hours news networks or the tick tocks and shorts or whatever that will serve you the most. Just out raging in, in, in, in, in, in, in.
[00:20:32] We have a, we have an elderly parent who doesn't consume mainstream news, but subscribes to 300 newsletters and gets much of their news from a Brazilian kind of nutty Brazilian newsletter. And the problem is it's so easy now, thanks to AI and thanks to the internet to create a huge amount of disinformation.
[00:20:58] And I think, uh, I don't, I don't blame this relative, uh, because he's grown up. He's an older fellow. He's his whole life that, well, you trust what you read, right? These are, these are, you know, you, we trusted the New York times and NBC and CBS. And when Uncle Walter said something, Walter Cronkite, we, we, we, we took that as gospel. And so you have that in your head. Well, this is, I'm reading it. It must be true.
[00:21:26] So I don't blame them for this. Let me ask you, Alex, cause you're, you are an expert on security. Um, and I'm very relieved that you think the elections, that elections, fair elections can happen. One of the strengths, a weird strength of the United States is that, um, our elections are run by the states. So there's 50 different local governments that run the elections. Very hard to steal a national election. Right. There's over 10,000 actually election authorities in the United States. Right.
[00:21:56] Because yeah, not just the states, the counties, right? It's local. Yeah. Don't, don't you just need to steal like four country counties though to swing it? So, so tell me, Alex, reassure us that, uh, it's, it would be very difficult to steal an election. It's extremely difficult to steal an election, which is why the election was not stolen in 2020 and why it was not stolen in 2024. There is some feeling that it might have been. I mean, Trump did say, well, uh, we, we got the swing states handled. Elon's going to handle that. Yeah.
[00:22:27] That scared me a little bit. This is what drives you nuts. There's, there's, there's a whole set of like democratic influencers who like to, you know, push this idea that Trump stole it. That is not, Donald Trump did not steal it either. Right. Right. Like, um, it is, it is extremely hard in the modern era. It would be, it is effectively impossible to, to steal a U S election. What is possible. It would be possible to cause chaos. Right.
[00:22:54] Um, and it certainly would be possible for the president of the United States to cause chaos and to create a situation where you can certainly create a situation where the election is unfair. Right. Where people can't vote. Or maybe more to the point believed to be unfair. Right. That's the real issue. You know, is what people believe. But actually changing the votes would be extremely hard to just lay it out. Right. So in the United States, uh, we'd now have moved as of now. So this wasn't totally true in 2020.
[00:23:22] Although one of the saving graces is by 2020, none of the swing states were using direct recorded entry electronic voting. Right. So nobody, none of the swing states, was it that when you pushed a button on electronic voting machine, did that machine store your vote? There were people who touched a touchscreen, but then what it did was recorded it. That was a ballot marking device. You want something that's a paper trail that can be audited after the fact, right?
[00:23:49] So now we have for the vast majority of people vote on hand marked paper ballots with the option of people can touch a machine. If they, this is most of your people for either who, uh, need a language that is not supported in the place where they're voting or they have a special accessibility need such as they're blind or something like that. Um, but then those are ballot marking devices. It will then print a paper ballot.
[00:24:11] Those ballots are then, uh, counted usually by machines, but then we have what are called risk limiting audits in every, all 50 states. So, uh, this is what, this is what gets hard. It is hard to explain this statistic, things like statistics to normal folks. But this is the expertise of your former colleague, Chris Krebs, right? This is why Chris Krebs in 2020 said this was one of the fairest elections.
[00:24:35] He said it was the most secure election that ever been held, which is true because by 2020 CISA had been created the cybersecurity and infrastructure security agency, which was the consolidation of a bunch of different security groups in DHS into one agency. Now, CISA can't actually do security for all the different parts of the election, but what they, what they did do was support all of the different counties and states. Um, and in doing so they did penetration tests.
[00:25:02] They did code reviews of the code. They did work with the people who build all this stuff. And then they ran a center that, uh, pulled in information from the NSA and FBI shared threat information back out to the states and counties. And then if anything bad happened, helped do very rapid investigations. Is that all still in place? No, that's all been torn down. All of that capability has been destroyed.
[00:25:27] The people who have been fired, uh, from CISA, uh, there's been an active attempt to undermine that security. Yes. Pretty much anybody who did that work has been effectively called like a traitor or has been said that they Chris Krebs, like Chris. Chris, so unfortunately, president Trump signed an executive order that explicitly called out Chris, um, and said that, uh, Chris's, uh, clearance need to be revoked. His clearance was revoked.
[00:25:55] Um, they took away his TSA pre-check in his global entry, which is kind of. Nutty. I mean, it's like kind of talk about petty, petty, petty fascism. Um, uh, but then what is not petty is an ordered for him to be investigated, which is basically a bill of attainder, which is something that is so unconstitutional. It's mentioned in our constitution is not being allowed twice.
[00:26:16] Um, so anyway, uh, uh, anyway, Chris and a number of other people, uh, worked very hard in 2020 and in, you know, uh, not him, but other people worked in 2024 to keep the election secure. Uh, a bunch of those people have been fired, but still there are people in the States and localities who have that muscle memory who will be doing this in 2026 and 2028. Um, it, it, it's just basically impossible to straight up steal an election.
[00:26:40] But I think the fear though, is if you have kind of armed men in the street, if you have riots, if you have tear gas, then people won't go out. They won't vote. Right. So that's different. That's different than like, you know, stuffing ballot boxes. It's different than, um, changing the vote totals, which again is effectively impossible at scale. Um, but what you could do is, is, is try to suppress voting. Part of the problem also is that our elections are very close and they happen for decades.
[00:27:09] And the closer the election, the smaller, the amount of leverage you need to exert to, to change an election. Yeah. And, and the other, uh, other issue that everybody's always afraid of is the, uh, as everybody here knows is, you know, actually the Americans voting for our president directly. It was not how we did it. Right. Like George Washington was not elected by the American people. He was elected by the first Continental Congress.
[00:27:34] Um, and, you know, that then, uh, persisted in the creation of the constitution in 1792 and such. Um, and that, you know, Congress came together and put, you know, the electoral college and such. And so the direct election of the electoral college was something that was kind of, you know, added on later. And, uh, and so, you know, we have these weird systems for like, Hey, if the states can't figure out who their electors are, what do we do?
[00:27:58] Uh, and the, you know, this ended up in, I think it was 1873 in this really bizarre, it's only happened once in our history, but this really bizarre outcome where then the house of representatives effectively, um, had to decide who the president was going to be. Uh, and that's what ended reconstruction after the civil war and ended up being really bad for our country in a number of ways.
[00:28:22] And so I think that's one of the real fears is if you cause enough chaos, if you cause lawsuits in every single state, um, that you're able to throw it to the house of representatives, which then does not vote based upon the population. It votes, each state, uh, uh, votes as a block and it's, it's, it's just a bizarre part of our constitution, um, that should not be written the way it is. And it's, it's only worked that way once.
[00:28:48] And that would be a extremely undemocratic in a, in an outcome that would not be accepted, I think by a huge number of the American people. Well, I'm just. I kind of wish you had stopped that. It's impossible to steal the election. That's it. That's it. It's going to be fine. It's going to be fine. Uh, it's what's interesting, of course, Patrick, uh, is that the rest of the world cares very much what happens here. This impacts everybody. It's not just us.
[00:29:18] Yeah. Yeah. Of course. I mean, we were talking just before the show. I was saying that when we were thinking that I, I think the, if we were to vote on, what is it? Kelsey or polymarket, like is, uh, the American military actually going to, uh, enter Greenland's, uh, uh, you know, uh, not space, but, uh, uh, territorial.
[00:29:44] It would probably be relatively low, but we can't act like it's impossible. Right. So obviously we think about it. We talk about it. One of the ways you keep that probability low is by being strong. I mean, this is, this is the lesson learned. Well, it's 26% right now. Will the U S acquire part of Greenland? So if you want to. That's pretty low. I mean, that's surprising. It's, it, it is not going to happen. It will be world war three before that. If you believe it, you can put your money where your mouth is right now.
[00:30:13] But then what would you spend it on? Start gambling. Yeah. Yeah. I wouldn't have it. But so, you know, it's things like we start talking about the fact that you should have a rechargeable AM radio just in case. I mean, this is not just for. Is this what people in France were doing? They were, they were like planning? No, it's me. It's me and my, my nerdy tech friends. But we realize, we realize that, you know, we don't have copper cables for phone lines anymore.
[00:30:39] We don't have TVs with antennas to get the news. So if something happens, it's not just, you know, an open conflict with the United States. But if there's whatever, something happens, if there's a bomb, if there's anything, you can't do, you can't get communications with anyone. So you, you probably do need an AM radio to at least get some information about what's happening.
[00:31:05] But more seriously and more tech related, I think what a lot of people have been thinking about is that we need at least failovers like backups on the, all the systems that we use that are provided by American companies like, you know, Microsoft, Google, Apple, everything.
[00:31:26] And so even if you don't switch everything over to, let's say, Proton or one of those, I think a lot of people, especially in infrastructure and critical infrastructure, they will switch over. But a lot of companies might want to have a system ready to go up very quickly if things go bad.
[00:31:51] Because even if there isn't an armed conflict, if at some point something happens and the American, you know, the Trump administration decides American companies can't do business with France or some of them, or it's very unlikely. But if it happens, but if it happens, then you're in deep doo-doo. So you need to be ready. And that's the kind of thing that we wouldn't even have like considered a year ago. It would have been like pure fantasy.
[00:32:19] And now we're, you remember what happened with the International Court of Justice? And what was the name of the gentleman? He had his Outlook email revoked. Oh, yes. After the investigation and pronouncement. That's right. It was Microsoft's servers. Yeah. Right. And so that was a shock in Europe. It was like, okay, then it can happen.
[00:32:47] And we can't be using American services for critical infrastructure. So I think relatively quietly, a lot of administrations have been slowly moving towards either open source or European based services. Headline in Friday's Wall Street Journal, Europe prepares for a nightmare scenario. The U.S. blocking access to tech. Yeah.
[00:33:15] Trump's Greenland threats inject urgency into regions' efforts to reduce its reliance on American technology. If there's anybody who blocks it, it's actually most likely to be the European Commission, right? Because we're actually in this weird legal place post the Schrems 2 decision by the European Court of Justice where there's supposed to be an agreement between the European Commission and the United States that was negotiated by the Biden administration.
[00:33:42] And then the Trump administration just kind of dropped it a little bit. And so the tech industry is – and this precipice where the Trump administration is supposed to be – at the same time, there's supposed to be fair dealings of saying, yes, we will respect the privacy of Europeans.
[00:34:04] And we will help American companies do that by agreeing to a set of rules by which we will respect the privacy of European citizens if it sits in American data centers. We're also threatening to invade Greenland. And so it would – all you need is for the European Commission to effectively just not agree to this new privacy agreement between the United States and the EC.
[00:34:28] And the European Court of Justice will just wipe out the word called standard contractual clauses. And all of a sudden, the European subsidiaries of Microsoft and Google and every other American company will no longer be able to do data transfers into the United States. And so that will have the effect of basically cutting off European companies from using the American cloud.
[00:34:53] Let me take a break and then we'll talk about TikTok because the deal went through. And maybe it didn't work out quite like everybody thought it might work out. We have a really perfect panel for this, I got to tell you. It's so good to have Patrick Beja visiting us from the EU. And if he suddenly disappears, we'll know something horrible has happened. No, no. No, he's actually just getting better from the flu and it's late at night in Paris.
[00:35:23] And we're going to try to keep this from being a four-hour episode just for you, Patrick. Yeah, right. Yeah, right. We've gone through two stories. So we're totally different. We're not moving exactly at the pace I was hoping. That's Alex Stamos. Great to have you, Alex. You know, these are important things, though. I really – you know, I don't want to give them short shrift. So I really appreciate it.
[00:35:42] Alex's new job is chief product officer at Carter.dev, a firm that is much needed these days, helps AI developers create secure code. And I'm going to sign up for it and have it run on my GitHub repos and see where I went wrong. Because I've been doing a lot of vibe coding lately. I've been loving it. I've been loving Cloud Code. I actually ended up doing the max subscription just so I could have more tokens and more credits.
[00:36:12] And it's really – I think we're entering into a world of hyper-personalized software. It's very interesting to me. We'll talk about that a little bit. Also with us, Mr. Aloha himself, Doc Rock. Thank you. YouTube.com slash Doc Rock. He's on the opposite side of the world, actually, from Patrick. So as Patrick's entering the early morning hours, you're just entering the afternoon. Yep. One o'clock. That's nice. That's nice. Great to have all three of you. Our show today brought to you by ThreatLocker. I'm very excited.
[00:36:43] Steve Ibsen and I are going to be heading to Orlando for ThreatLocker Zero Trust World. We've got a deal on tickets. I'll tell you more about that in just a second. But let me first tell you why you want to know about ThreatLocker. You know ransomware is just devastating businesses worldwide. ThreatLocker can stop it before it starts because it's zero trust. Zero trust is remarkable. Recent analysis from ThreatLocker shows how a single ransomware operation, Chi Lin,
[00:37:12] surged from 45 incidents a couple of years ago to 800 incidents last year. It'll probably double again next year, this year. That's why you need ThreatLocker. ThreatLocker's zero trust platform takes a proactive, and these are the three words the key three words, deny by default approach to block every unauthorized action. If you don't specifically say, yes, this can happen, it won't happen.
[00:37:40] That protects you from both known and unknown threats. ThreatLocker's innovative ring fencing constrains tools and remote management utilities so attackers cannot weaponize them for lateral movement or the mass encryption they need for ransomware. ThreatLocker works in every industry. It works on PCs and Macs. They've got US-based support that's phenomenal, 24-7. And this is kind of cool.
[00:38:08] It's just, I think, in a way, a side effect of zero trust because you know nothing can happen unless you authorize it. It means you have comprehensive visibility and control. You have a record of everything that's happened, right? Listen to some of the people using ThreatLocker. Emirates Flight Catering. This is a global leader in the food industry. 13,000 employees. They cannot afford to be down for one minute.
[00:38:31] ThreatLocker gave full control of apps and endpoints, improved Emirates Catering's compliance, and delivered seamless security with strong IT support. The CISO of Emirates Flight Catering said this, quote, the capabilities, the support, and the best part of ThreatLocker is how easily it integrates with almost any solution. Other tools take time to integrate with ThreatLocker. It's seamless. That's one of the key reasons we use it. It's incredibly helpful to me as a CISO.
[00:38:59] Any enterprise that can't afford to be hit by ransomware, I guess that's everybody, but can't afford even to be down for a minute, can benefit from ThreatLocker. JetBlue uses ThreatLocker. They have to fly, right? Heathrow Airport, which has in the past had problems. They've turned to ThreatLocker now to make sure that the airport operations are flawlessly, continuously available. The Indianapolis Colts use ThreatLocker. The point of Vancouver. I can go on and on.
[00:39:26] ThreatLocker consistently receives high honors, industry recognition from G2 for high performer and best support for enterprise in the summer 2025 ratings. Peer spot ranked ThreatLocker number one in application control. GetApp's best functionality and features award in 2025. Find out more about ThreatLocker. Go to ThreatLocker.com slash twit. Get a free 30-day trial. Learn more about how ThreatLocker can help mitigate unknown threats and ensure compliance. ThreatLocker.com slash twit. I am excited.
[00:39:55] We're going to go to Zero Trust World, their annual conference. It's in Orlando this March. And if you want to come out and see Steve and me, we're going to do a special presentation on the first day of the event. We've got, for a limited time, a special offer code for you. ZTWTwit26. That's Zero Trust World. ZTW Twit. This Week in Tech. 26. You'll save $200 off registration for Zero Trust World 2026. We'd love to see you. You'll get access to all sessions.
[00:40:23] You'll get hands-on hacking labs, meals. There is an after party that's legendary. The most interactive hands-on cybersecurity learning event of the year. It's March 4th through the 6th in Orlando. Be sure to register with our code ZTWTwit26 and we will see you in Orlando. And thank you, ThreatLocker, for supporting the show. So this week, it became official.
[00:40:49] China and U.S. have signed off on the U.S. spinoff. China still has 29.9% of the company. That's the maximum a foreign entity can have ownership of. The rest is owned by Oracle, Silver Lake, Michael Dell, the United Arab Emirates. MGX is their state-owned investment firm based on AI.
[00:41:20] Susquehanna. I think that's Jeff Voss, who's a big Trump donor. I think one of the real reasons that the whole throw TikTok out of the United States didn't happen, because Voss had a huge percentage of TikTok and I don't think he wanted to lose his money. Dragoneer. This meets the final deadline, the extension, which has been going on since the law requiring TikTok to be sold was passed and signed into law in 2024.
[00:41:49] The Supreme Court affirmed it. No one knows how much was paid. It's not clear. But I did note that as soon as I signed into TikTok, I got new end user agreement that I had to agree to. So I didn't and I deleted it. But some people say it's actually for one of the things that they asked for is more specific location information.
[00:42:18] Look, we were worried about the Chinese having that right now. Now they're being they're even more granular. They're having minute to minute location information. That was the thing that stopped me. They also said TikTok could collect information, including your sex life, your sex orientation, your status as a transgender or non-binary citizenship and immigration status.
[00:42:46] Now, lawyers have said that that's to comply with California's Consumer Privacy Act because now it's a U.S. entity. They have to comply, I guess. It was enough to scare me off. It's funny. It's ironic because I honestly wasn't worried about the Chinese knowing my immigration status. I'm much more worried about the U.S. government knowing my immigration status.
[00:43:14] And I'm a citizen, but apparently that's not sufficient anymore. What do you think, Doc Rock? Are you going to continue? Do you do stuff on TikTok? You know what's really crazy is I recently just started doing stuff on TikTok because, you know, there's a lot of activity happening over there. And yeah, this is sketchy. This is what's funny is that you can't get past the agreement to cancel your account.
[00:43:44] You have to agree to it before you can cancel your account. So I just checking to see if I had the pop up. I didn't get the pop up yet, but I know some friends that got it yesterday. Oh, I got it immediately. This is wild. So I declined and deleted it. Alex, am I crazy? I mean, we've been talking about this for more than a year. I remember when you were on, you said, yeah, no, TikTok is a threat.
[00:44:13] Is it? It's no longer a threat? Yeah. So my position always was that of all the Chinese companies, TikTok was not never the biggest threat. I've always been much more concerned about WeChat because WeChat actually can use conversations that are important. And it is very much not end and encrypted.
[00:44:38] And there's evidence that the Ministry of State Security effectively uses WeChat as a massive surveillance tool. And most, there's very few Americans that use WeChat, but it is used by Chinese. A lot of Chinese Americans, a lot of Americans of Chinese descent use it because if you have any family that speaks Chinese, you use WeChat. So it is used as a tool to kind of spy on the greater Chinese diaspora.
[00:45:02] So, but that being said, there was some risk from TikTok, but a lot of that risk had been, you know, pushed down a little bit through the mechanisms of Project Texas and the creation of this. Oracle was hosting the data in the United States. Yeah, TikTok, U.S. Data Security, LLC. Yeah, exactly. And so there's a lot of things to say here. One, there was a law. That law was just straight up ignored, right? The Trump administration basically just said, you don't have to follow this law.
[00:45:31] There was no mechanism in that law. Do you remember it was roughly a year ago, right before the inauguration, TikTok went offline. And that was what was supposed to happen because they were not following the law. There was a mechanism for one extension, I believe, to be given. But to do that, you had to have been on the verge of a deal. And the administration did not do any of the stuff they were supposed to do for that. So, one, we just kind of just blew through the law.
[00:46:00] Like, laws don't matter. So, that whole thing's been crazy. Second, this new entity. You know, some of these are just neutral. You know, I don't think Michael Dell has, like, that much of a political position. Silver Lake is just a neutral private equity firm. But some of these people are, you know, really political players who are directly aligned with the current administration. And we know Larry Ellison is one of them. Yes.
[00:46:28] Who is absolutely, like, twisting, as we know, CBS in a political direction to make the president happy. And so, in some ways, TikTok might be more of a threat, if not from a data perspective, from becoming a much more political platform. Now, I think that will, if that happens, that is going to backfire in crap. I mean, there's nobody who is more sensitive to being manipulated as a father of teenagers.
[00:46:56] The idea that you can, like, think that you could try to manipulate 15-year-olds. If you're, like, a middle-aged person, you're like, I'm going to manipulate the idea that, like, Larry and David Ellison are going to be able to, like, manipulate teenagers and get away with it is just ridiculous. It's all nervous-making, given the current climate in the United States, that they say we will collect your immigration status. So, okay. So, they're not directly collecting it. I think there's a couple of things going on here.
[00:47:23] One is, when you're a social media company, one of the real challenges is that people upload everything. So, I saw this at Facebook. I never worked for Meta, right? I was at Facebook. And Facebook, when I was there, got sued for all these situations where there will be laws of, like, you know, you have to protect this kind of data, like pregnancy data. Okay. Did Facebook ever want to know people's pregnancy data? No.
[00:47:50] But there would be things like SDKs that could get embedded in products. And then, all of a sudden, things like PHI would get uploaded. And that's just, like, an arbitrary text field. And all of a sudden, it ends up in our database and we don't even know about it. Right. But then the company gets sued. And it is way easier to go to a jury of 12 people who can't get out jury duty and tell them big company bad, big company spying on pregnant women.
[00:48:17] And then the defense attorneys are like, no, this is an arbitrary text field. And it's a Jason field. And, you know, Facebook didn't know. Yeah. You're going to lose that trial. So, it's sensible if you're TikTok to have yours. If you're the lawyers, it makes sense. Because, like, the law says, especially in California, that, like, these are the sensitive things. And so, TikTok basically has to say, we might know these things about you because TikTok allows arbitrary stuff to be sent up. Yeah.
[00:48:44] Now, the question is, is it smart for them to actually list these things because it's going to show up in a lawsuit later? Like, this is where you get, like, I think, the tradeoff inside of social media companies between the lawyers and, like, product people. Where if you're a product person, you might be like, you know, could we just get away with saying, hey, you're sending us videos that might have anything in it? And if you tell us sensitive stuff, it's going to be in your data and we can't help that? Can we get away with that instead of listing all this stuff?
[00:49:11] It's probably also why they fought the California Consumer Privacy Act because they knew it would require this kind of language, right? Yeah. I don't think TikTok's actually tracking immigration status. I think they have overactive – they paid somebody 900 bucks an hour to write this thing. And part of what they're doing is they're looking at every lawsuit against every other social media company for the last 20 years and they're protecting against it.
[00:49:40] That being said, I think this whole TikTok thing is incredibly corrupt, right? Like, effectively, our government put a gun to the head of a foreign-owned corporation. I do not like the owners of that foreign-owned corporation. I am not a fan of the People's Republic of China. I have a child who is learning Chinese and he has to go to Taiwan to do that because I have a file probably with the ministry. I know I have a file with the ministry. Yeah, you couldn't go to the – Right, you can't go there. So, fine, Taipei's great.
[00:50:10] I want him to visit while it still exists. And so, anyway, like, I'm not a fan of the old owners, but we put a gun to the head and stuck them up and then transferred ownership of it to the political friends of the current administration. It's just straight-up gangster capitalism. It's the second part that's a problem.
[00:50:30] It's who it was transferred to that's the problem because I think the issues that were listed with the Chinese ownership of TikTok were substantiate-able. Like, it's not – maybe you could have discussed how real it was, but the question of – especially maybe the data issue is a little bit overblown because data is being sold and bought.
[00:51:00] In the U.S. anyway, our GPD is pretty cool. The issue of the influence with the algorithm, I think, is a very serious one. And the idea that a foreign entity is so embedded into the culture of another country that it could influence cultural trends and political decisions, I think, is a serious security risk.
[00:51:31] So, transferring it, transferring the ownership, either – I mean, you could ban it or you could transfer it to a neutral party where you're a little bit more confident that it's not going to have this kind of undue algorithmic influence. I think it makes sense. The problem is who it was transferred to, and the algorithm can be used to nefarious ends in that case as well.
[00:52:00] I mean, any political party cronies or friends of would have been a problem. It turns out it's this one. If we want to be a little bit more neutral for a second. It turns out that this one, it would have been a problem either way if it was another politically charged group of people. But the issue of the algorithmic influence is not to be discounted, I think. And we can see it.
[00:52:28] I mean, we've seen it – I think it was last year, an article about Taiwan and how the youth in Taiwan is having its view of the Chinese model changed. It was a long article. I talked about it a few times, but it was influencing the way the youth in Taiwan is looking at China.
[00:52:56] And even – it might be a little bit anecdotal, but I use TikTok quite a bit, and I've seen a significant amount of what is effectively Chinese propaganda, which is showing all the great things about China, you know, modern cities, modern very technologically forward stuff, happy people dancing, singing,
[00:53:22] showing their cities with trains going through buildings and stuff like that. That's awesome. You, strangely, never see anything negative about China on TikTok. Never. Ever. And I have – again, this is very – I never have either. You're right. Yeah. But I've seen, you know, people going like, you know, China sounds cool. Yeah. China sounds cool.
[00:53:52] It's very cool. I mean, China is very cool. There are some not cool parts, of course. But you never hear about, you know, freedom of expression, the Uyghur, like never on TikTok. And my point is that, of course, it's not – there are a lot of people who know about these things. There are also a lot of people who might not – it's not like it's going to go from 0% to 100% positive opinion.
[00:54:16] But if you shift, you know, 5%, 10% opinion, it does something to the country, right? So, I think it's very easy to discount that aspect of it. And the algorithmic influence is not – it should not be, you know, thrown out. I feel bad because, you know, as I've mentioned before, my son basished his career on TikTok. He's now on Instagram as well. And so is a little bit insulated against it.
[00:54:45] But he still posts regularly on TikTok. I don't like – I actually got rid of Instagram too. I think I just don't want to have either one of them foisting any point of view on me. And X as well. I don't think you're the target. True. I mean, X has become just – Terrible. Right. It's just like race war every day, right?
[00:55:11] Depending on – I mean, what drives you nuts is there's still like a group of AI – There's great AI stuff on X. I kind of missed that. But the political discussion has become just completely dominated by bots and – Plus Elon forces every post of his – Sorry. The alternative became threads. And the funny thing about threads while they try to gain traction is there are so many posts which are just there to like absolutely rage bait you. I mean, like –
[00:55:40] Isn't this social now? Yeah. No, but I mean, it's the odd – it's the low-hanging fruit rage baits that you easily bite into because you think it's a real question. And then sometimes you look at it and go, wait a minute. This is – this question is so stupid. This is actually put here just to get engagement bait. So it's different from rage baiting. Let me put it that way. It's engagement baiting because I'm going to say something stupid so everybody can tell me that I did it wrong and you're getting engagement. And so that elevates those accounts. And so they keep going, keep going.
[00:56:08] And then all of a sudden they kick in the rage bait. So they are actually smart enough because they're building this sort of attention graph by saying dumb stuff. There are people – Leo, it's going to crack you up. There are people still arguing about how the magic mouse is terrible because you charge it from the bottom. And we all know – Still? Still? In the time that I've been having this freaking brain fart, I could have plugged in my mouse and got nine more hours of charge out of it.
[00:56:37] Literally in 90 seconds of charge, you can get a whole full day. So it was dumb, but it was never a real problem. You know what I mean? It's kind of like me. I'm dumb, but I'm not a real problem. And it's like both. You guys are still arguing about this and it's hilarious. So if you just want to be entertained, you know, now the new thing on threads is how come all the 40-year-olds are on threads? Oh, those are just great. And the comments are glorious.
[00:57:03] The best comment was, well, I'm here so I can show you how to spell 40-year-olds because they did it wrong. On purpose. More than likely. Yeah. You know, on X, if we go back to policy and European policy, a lot of people are calling for administration to leave X, which they haven't yet because you need the audience that you're talking to.
[00:57:30] But a lot of people are, especially with the latest, you know, AI deepfakes and what's it called? Nudes. Oh, yeah. This has been a huge issue. And so a lot of people. But interestingly, an argument against Blue Sky was that if you try to talk about AI on Blue Sky, you get murdered. Right. Like it's.
[00:57:58] And so as Alex was saying, if you want to have an interesting conversation about AI, you can't do it on Blue Sky. It's impossible. Isn't that funny? Like you get pushed off the platform immediately. Yeah. I've tried to have conversations on Blue Sky and it's like, if you step off the orthodoxy, it becomes, you get dogpiled there too. I don't know. I try not to use social media too much anymore. I think that's the bottom line. It's just get off it.
[00:58:26] I'll do like long form stuff like this because you get to talk with nice people and you get to share. And if you disagree with somebody, you can try to play it out over five minutes and you don't have the context collapse that you do. Right. Well, you're talking to real people for one thing. Well, who knows who you're talking to on that? Yeah. And I think like that's, you know, somebody is going to create the 100% real social network, right?
[00:58:50] Where, you know, everybody has a verified identity and content is created on the device, does not come from CapCut. There is an argument though for anonymized posting. For instance, there is an Instagram account, which ICE has been desperately trying to unmask, that posts videos of ICE arrests.
[00:59:17] And ICE wants this, you know, John Doe and they've been turned down again and again. The courts have said no. Instagram has said no. There is an argument that there are sometimes you do need anonymity. So a fully real social network. I don't know. I mean, I would join it. I think it would be both. I think you absolutely do need anonymity and those exist. And I'm very heartened. Yeah, they exist in spades right now. Yeah.
[00:59:46] I'm very heartened to hear that Instagram has refused to identify the account. I would have thought they would have folded. So I guess I'm wrong there. But a real social network. Interestingly, Sam Altman saw this coming with... Do you know about World? You know WorldCoin? Yeah, WorldCoin. The orb, right? The iris orb. It essentially... Yeah, but he saw it coming. But doesn't it...
[01:00:16] The fact that you're giving them your iris information, which is unchangeable... You're not giving it... The idea is like you're verifying it. Like, he's right in that what you're going to do is you're going to have to push all this stuff into secure hardware that verifies you undead. Yeah. But I think it's a brilliant idea because we do need... Authentication is the thorniest issue. It's the issue behind online voting. It's the issue behind shopping. It's the issue behind commerce.
[01:00:42] We often, you know, being able to authenticate reliably is a huge issue and unsolved at this point. Well, especially if we're talking about communication in an era of AI where you don't know what is being... If you don't know if what is being... You're seeing is real or not. At least if you know it comes from someone who has identified themselves through, you know, world, like WorldCoin or something else.
[01:01:11] I think it gives credence to whatever you're seeing. And the reason a lot of people are giving up at the moment, I think, on, you know, information or videos or... The biggest problem with AI, I think, is that it's not that it's going to make people... We've talked about this a few times, but it's not that it makes people believe stuff that isn't true. It's that ultimately people who see stuff don't know whether it's true or not.
[01:01:41] So they're not going to trust the stuff that is true. And they're going to go like, oh, well, I don't know if it's true or not, whatever. And it doesn't matter. So if you get a social network or a way to engineer trust, I think that has value. It doesn't mean that it fixes everything, but a social network where you know who you're talking to and you know that it's them and you trust them, it has value.
[01:02:09] Obviously, you know, it doesn't mean that everyone's going to trust everyone else. But I think it would be an interesting thing to try, at least. This is the story from Ars Technica. DHS keeps trying and failing to unmask anonymous ICE critics online. They had subpoenaed Meta to reveal the John Doe, who had an Instagram account and a Facebook account monitoring ICE in Pennsylvania.
[01:02:38] They sued the DHS. He countersued to block ICE from identifying him. He says those summonses to Meta infringed on core First Amendment protected activity. DHS fought his motion to quash, but eventually gave in on the 16th. They abruptly reversed course and withdrew their summons from Meta. So that's where it stands right now. But I doubt that that will stay that way.
[01:03:09] Well, I'm glad to see. I mean, I know some of the people who used to have these fights and I know they're still at Meta. And so I'm glad to see there's still there's still some people there who are willing to have these fights. And a bunch of them are ex-DOJ folks. I know for a fact that they are very sad of what's happened to the Department of Justice. Yeah. Yeah. Well, it's become a personal law firm for the presidency. We're going to take a break. Come back. Oh, I had one more stat just to give you.
[01:03:39] We were talking about the sexualized images on X, the Grok deep, non-consensual deep fake nudes. According to the New York Times, over nine days, Grok posted 4.4 million images of which at least 41% was sexualized images of women. So that's 2 million images in nine days, non-consensual sexualized images. Yeah. So if you look deeper. How many of other, you know? Yeah.
[01:04:09] A couple percent and at least a single digit percent look like they're underage, which would make Grok one of the possibly the largest creator of AIC-SAM now in history. Isn't that amazing? I mean, this is, it's a fascinating outcome because, you know, I, at Stanford, we wrote a bunch of the first papers on AIC-SAM, you know, back like 2021, 2022. And we saw this coming, but like where this was, was this all started basically stable
[01:04:36] diffusion, stable diffusion 1.5 had very few protections. But for people to take stable diffusion and then create AIC-SAM required at least some skill, a bunch of people working together and stuff. And Grok made it super easy for folks. It's interesting because this is the graph from the New York Times of the images created, just all of all images created per day by Grok. And you could see that there was a sudden explosion after Elon posted an AI edited photo of himself in a bikini.
[01:05:06] Oh God. And then, and then suddenly there's this explosion. And then of course it all ends when X limited the image creation to paid accounts on January 8th. Because you know what makes something really much more legal is if you charge people for it. Yeah. Right. Oh, now it's okay. Yeah. That's. You paid for it. Oh, well, go right ahead. Right. That's generally defense in court. Oh no, we're profiting from this behavior. So now it should be okay.
[01:05:35] Well, at least if it's, if it's paid, then they have, they know who you are. Right. So in theory, if you create something that is illegal, it's easier to find you. It's I'm really trying to, you know, maybe. Yeah. But the problem is, is the, the person committing the crime is actually Grok is actually. Is it, is it? Yes. XAI is the generator here. They do not have section 230 protection. Oh, interesting. They do not.
[01:06:02] No, they are, they are the one who is. They're the person. Who is. But, but they are the. I can't be, can't have a copyright. Can't be like the person. Are you sure it's not the person legally that is like writing? So that's, this is, I mean, this is now what will be litigated. This will be fascinating. Yeah. Is who is the, because there's a civil issue and civilly, I am sure XAI is, is, has a huge problem.
[01:06:28] Now, what they're trying to do is they're trying to get all these cases removed to Texas, to one district in Texas, where there's only like one judge and he is a Tesla shareholder and will not recuse him from cases. So that, that is super controversial, right? So like Elon's baby mama, you know, sued him in the Northern District of Texas. Oh, that's right. That's right. Or Northern District of California. And he's trying to remove it under their terms of service. So that is like super controversial of, can you use terms of service, even though the
[01:06:58] actual harm happened elsewhere? So that's a big legal fight. But then the second issue will be there's significant civil issues, but this is a crime, right? And the criminal issues here are very interesting for the most part. Has there been any criminal prosecution though? No. There have been criminal prosecutions of people for AIC, Sam. They are few and far between because what happens is that the vast majority of cases where people get caught with AIC, Sam, they have real CSAM on their drives. And for the most part, prosecutors-
[01:07:27] So they don't have to go after the AI part. Yeah. Yeah. So the vast majority of child exploitation cases are pled out, right? It almost never goes to trial because the, the evidence is strong and the penalties are huge. And so the lawyers tell the, the- Just plead out. Perps, just plead out. You're in trouble. Right. Like instead of taking 20 years or whatever, 30 years, get your 18 months or two years and, you know, take it.
[01:07:55] There have been a couple of cases, but it's complicated. And the legality issues here are complicated. I'm going to give a shout out to my former Stanford colleague who's still at Stanford, Rihanna Fevricorn, has written a number of pieces on this because the laws around it are different and they all actually rest on a Supreme Court precedent called the Ashcroft case. So it tells you how old it is.
[01:08:22] Which is about can photos that are not, can images that are not real be count as effectively child pornography, right? And it's a fast, it turns out to be like a really fascinating, interesting issue. And so this will end up being litigated and it probably will end up going back to the Supreme Court because none of the case law, you know, all the case law predates diffusion models, right?
[01:08:50] Like deep learning upsets all of this. And like, you know, you're talking about images that are not real, but they're based upon real folks. In this case, we're talking about images where you start with the real image, but you're modifying it. So they are the faces of real women and real girls, but then they're modified using diffusion models. So like it's super complicated. But who is the one who's going to be punished is a big question, both criminally and civilly.
[01:09:18] But there is no, what we know for sure is multiple courts have ruled that AI companies do not have Section 230 exemptions here because one of the criminal side, but also they are not just posting content. X is not just posting content that was created by somebody else. They are creating the content themselves. Oh, very interesting. Yeah, because X is now owned by XAI. Yes. Very interesting. And so it is going to be, this is going to be a big deal.
[01:09:46] Like they really screwed up here. This will last for a long time. And these cases are going to last a lot longer than the Trump administration. There's also these cases are going to happen in states, right? So they'll happen in state court. They'll try to remove it to federal court, move it to Texas. They're not going to be, you've got a bunch of state attorney generals. And so what's happened is X has opened themselves up to a bunch of state attorney generals who do not like Elon Musk. And so that was not a good move.
[01:10:11] And the civil side is going to be on par with, you know, someone trying to go after, say, Colt or Sig Sauer, whatever, after some sort of, you know, weaponry incident. You know, the civil side, they go out and manufacture. Did the gun commit the crime? Yeah. Yeah, yeah. I was trying to use that word. Because it also might lead in Europe. It's become a big deal. Ofcom in the UK might completely ban X. But it's also reasonable because the nudifier apps, we've seen them get banned by Apple and Google.
[01:10:41] Why didn't Apple and Google ban X? Or at least Grok, right? Like. Yeah. I mean, we all know why. We know why. Because they're afraid of Elon and they're afraid of him calling Trump and then having Apple and Google punished. But like. Yeah. No, but I think it's important to say because I've seen a lot of people saying, oh, but why didn't Tim Cook? And why didn't Sundar Pichai? And like, we know why. And that's what those articles mean. Elizabeth Lopato and The Verge called them cowards. They said.
[01:11:11] Yeah. They're cowards is why. But it is. You're right. We should lay out here that this company, like we thought this was a line that could not be crossed. And this line was crossed. Apparently, according to Badrod in our Discord, our Club Twit Discord, even text and hand-drawn images can be considered CSAM under Canadian law. So, I'm not the Canadian law informs American law, but that's interesting. You can have a hand-drawn image that would still be considered CSAM.
[01:11:40] Well, even the text part, you know, because I guess a lot of the, well, when you hear it all the time on the news or when you watch my dude from NBC, Catch a Predator back in the day, it would be the text messages, right? It would be the chat that would get a lot of guys in trouble. So, if texts count, and that means, you know, electronic text as well as handwritten text, yeah, that sounds like logical, I guess. I was on the jury for one of those Catch a Predator cases.
[01:12:10] He got in a lot of trouble for entrapment, though. That's exactly what happened. You never hear the second half of it. Through the whole testimony of the prosecution, then the defense goes to the judge and says, Your Honor, this is an entrapment. And the judge throws it out, says to the jury, never mind. After two weeks of testimony, we went home. But I agreed with the judge the whole time. I'm thinking, this poor kid was completely entrapped. Yeah. For the show, nonetheless.
[01:12:38] For the purposes of television entertainment, which is appalling. Some of it got elevated because of television entertainment. Because there was no child involved ever. It was an adult who was, yeah, anyway. It was interesting. It was my only time I ever served a jury, and I thought it was a fascinating story. Hey, we got to take a break, because poor Patrick is just never going to make it to four in the morning. So. No sleep till. Until Brooklyn. Until Brooklyn.
[01:13:09] That'll perk you up. That'll perk you up. Alex Stamos, who I had no idea was a big fan of rap, is here. He's the chief product officer at Corditor.dev. Of course he is. Of course he is. You're that generation. It just blows me away, you know, because as an old man, I always think the rap generation is young people. But it's not anymore. And Neil Dash was good. We had Neil Dash on the show. He was going to the last performance of the Wu-Tang Clan. Wow. You're hip.
[01:13:39] I mean, how old are the Beastie Boys like? Exactly. Right. Exactly. Those are oldies now. Oh. Patrick Beja, not an oldie. A young man in an old man's body. How about that? Let's go with that. Okay. From notpatrick.com and Doc Rock. Doc Rock. Always a pleasure to have the good doctor on YouTube.com slash Doc Rock and Ecamm. Our show today brought to you by Meter.
[01:14:06] Meter was started by two network engineers who felt your pain. Who felt your pain. If you're a network engineer, they know how difficult it is. They decided to be the company building better networks. Full stack. Full stack. If you're a network engineer, you know all of the pain points. Legacy providers. Inflexible pricing. IT resource constraints stretching within.
[01:14:33] No IT department ever got enough money to do their job right. Complex deployments. Fragmented tools. Your mission critical to the business. Let me reassure you. But you're working with infrastructure that just wasn't built for today's demands. Well, Meter knows that too. And that's why businesses are switching to Meter. Meter delivers full stack networking infrastructure. Wired? Yes. Wireless? Yes. Even cellular. That's built for performance and scalability.
[01:15:01] And built to handle the most challenging environments. Meter designs their own hardware. They realize we've got to do it from scratch. We've got to design the hardware. We've got to write the firmware. We've got to build the software. We have to manage the deployments. And they even provide aftermarket support, after sales support. Meter offers everything. They'll even do ISP procurement for you. They'll do all the security, the routing, the switching. They'll do wireless, firewall, cellular. They'll do the power, right?
[01:15:30] That's just as important, isn't it? DNS security. VPN. SD-WAN. Multi-site workflows. All in a single solution from a company that cares. Meter's single integrated networking stack scales. They work in major hospitals. And you know what a hostile environment for wireless a hospital can be. If you've ever been in a hospital, the phone never works. Branch offices. Warehouses. It's really often the case. I was talking to Meter a couple of weeks ago.
[01:15:58] And they said, you know, we often get calls from companies that acquire another business. They get these new, these warehouses with old stacks that, you know, hostile wireless environments. And they have to then incorporate it into their own existing systems. They're great for this. They can do it for large campuses. They even do it for data centers. Reddit uses Meter. That should tell you something. The assistant director of technology for Webb School of Knoxville.
[01:16:26] He said he had an interesting challenge. We had more than 20 games on campus between our two facilities. Each game streaming via wired and wireless connections. All at once, that event went off without a hitch. We could never have done this before Meter redesigned our network.
[01:16:46] With Meter, you get a single partner for all your connectivity needs from your first site survey to ongoing support without the complexity of managing multiple providers, multiple tools. You know, you know how it is. They say, I'm having a problem here. The ISP says, what's your router? The router company says, what's your ISP? No, no. You don't have to worry about that. It's an integrated networking stack. Meter is designed to take the burden off your IT team and still give you deep control and visibility, reimagining what it means for businesses.
[01:17:16] To get and stay online. Meter is a modern company built for the bandwidth demands of today and tomorrow. We thank Meter so much for sponsoring. Go to meter.com slash twit to book a demo now. That's M-E-T-E-R dot com slash twit to book a demo. I was so impressed with these guys. And the founder's story is amazing. They really, they know what it's like. And they came up with a great solution. Meter.com slash twit. We thank them so much for their support.
[01:17:47] A big story coming out this week about Microsoft. The FBI said, we want the BitLocker keys for an investigation. And Microsoft provided them. When asked, Microsoft says, we get about 20 requests for BitLocker keys every year. We'll provide them to the governments in response to governments, plural, in response for valid court orders.
[01:18:15] Now, I think you might have thought, I certainly thought that BitLocker, which is full disk encryption, was, you know, mine and mine alone. And it used to be, if you, and in fact, it was one of the hazards of BitLocker that it was certificate based. If you created the certificate and didn't save it, that you would not be able to access your data ever again. Microsoft solved that by saying, well, no, you have to have a Microsoft account and we'll keep the keys for you. So they have the keys.
[01:18:45] Early last year, the FBI served Microsoft with a search warrant asking it for recovery keys to unlock encrypted data stored on three laptops. Federal investigators in Guam felt that they could, they held evidence that would help prove individuals handling the island's COVID unemployment assistance program were part of a larger plot to steal funds. Microsoft complied.
[01:19:37] And so, you know, if you're using your access to your data, then you can. So when I first read the headlines, I was like, what? You can, they have the key? What? And I was like ready to be very upset at American companies again. But then I became reasonable and read the article and realized that you can decide what you do. But the default is that they are uploaded. And Microsoft's been harder and harder to install Windows without a Microsoft account. So there's definitely impetus in that direction.
[01:20:05] I think it's important for people to know it. I'm not so cool with it for a couple of reasons. Yeah. So one is Microsoft has pushed the Microsoft account idea very hard. And they have made it basically impossible to install Windows 11 without creating an online account. They suck a huge amount of data out of Windows. It used to be any normal person could install Windows with just a local account.
[01:20:32] And then you could add a, what used to be called the live.com account and now like an online Microsoft account. And then they made it that it was required. But nerds could basically pop up a window and you could run like a command and do it. They've turned that off. And so now like if you want to do it without an account, you effectively have to make a special version of the ISO.
[01:20:57] You have to patch out their code and basically tell it it's like an enterprise install to do only a local account. You could do it with Rufus. There's some tools. Yeah, basically. Yeah. So and by default, like you said, they will back up your key for you. So you can go log in and grab the key.
[01:21:14] If they wanted, they could do what Apple does, which is effectively use like a Apple uses HSMs to store the backup keys and then use a mechanism to make it very hard. Apple does not have the ability then to recover those keys. So I don't have to worry with FileVault, Apple's encryption, which is, by the way, on on default on all Apple devices.
[01:21:42] A, Apple doesn't require you to make an Apple account to use a Mac or an iPhone. Right. And they also, you know, they make it explicit when you to be able to unlock your device. Right. But when you do it and then Apple makes it very difficult for themselves to recover that key. Now, that being said, Apple does store a bunch of data when you do an iCloud backup in a way that the FBI can get to. We've seen that. We've seen that in court filings. Yes. And so Apple is not perfect here.
[01:22:10] That is a decision they have made that is mostly about having compatibility with the web interface. So there's a bunch of stuff that you can access. They offer the advanced data protection, which does not allow that. Yes. But you give up a lot of functionality when you turn on ADP. Yeah. I have advanced data protection on. I turn it on for a bunch of my family members. But like most, I'm guessing it's a low single digit percentage of their customers. And that's the one, of course, that they had to withdraw in the UK because the British.
[01:22:39] Because the UK is obviously using it. It's also not available in China. That's also the issue with Apple is everything they do from a security perspective has a big asterisk. You know, privacy is a human rights. Asterisk not available in the people's realm of China. Right. So like I am not a big fan of this from Microsoft. That being said, even if you're forced to do a Microsoft account, you can go back into the command line. You can use the manage BDE command line interface and then you can remove the protector.
[01:23:06] So you can go and you can change, you can remove the backup and then you can rotate the key. But that is only for people who know what they're doing. So it is not great. If you do that, don't put your certificates on an unencrypted drive because they're going to take that with everything else. And they may end up getting it anyway and not have to ask Microsoft. Yeah. I mean, the truth is, if your threat model includes the US government, you should not be using Windows. Because the truth is that Windows is effectively... That's a very good point.
[01:23:35] Windows 11 is effectively spyware at this point. It sucks. By default, you log in with a Windows, a Microsoft account. And there's like this... If you look at the Windows privacy pane, all of these buttons are checked by default unless you know what you're doing. Right. And it is sucking a huge amount of data out of your experience into Microsoft. I honestly feel like Apple's probably moving in that direction as well. I've moved to Linux pretty much on everything. And I use Lux encryption, full drive encryption. Yeah.
[01:24:04] And only I have the access to that. And there's no telemetry in the version of... There are, I think, Linux versions like Ubuntu that do have some... May have some telemetry, but there's no telemetry. Minor telemetry. None of them are doing what Microsoft is doing, which is a huge amount of data. And all of that is going to be available to the government with lawful process. You know, anecdotally, Linux is gaining ground in everywhere, I think. As it should. But in Europe as well.
[01:24:34] I've seen multiple articles from people who are saying, I've switched to Linux. I'm very happy. And... People are realizing that now modern Linux distros are really as easy to use as Windows. Windows. And because of Wine and Proton, you can run most of the Windows applications if you want that. Well, that's the thing. That's the surprising thing. Because a lot of the nerds are also gamers. And now, a lot...
[01:25:01] Like, there are still some games that require, you know, access to the kernel to... Yeah. Because there are anti-cheats and stuff like that. Mostly anti-cheats, isn't it? Yeah. Yeah. But there is a huge amount, like a huge percentage of the games that work. And Linux is very... Is handling them very well. In part because Valve has invested a lot in Proton and stuff like that.
[01:25:30] When a Steam machine comes out, it's going to be a PC running Linux. Yeah. And 80% of the games will run on it. So... Yeah. A lot of them. But so between the defines towards American big tech and the gaming aspect of it, there's... I mean, it's still low single digits, right? Of course. It's not... We're not declaring it the year of desktop Linux yet. No. Well, I wouldn't say that because everybody who's running an Android device is running Linux.
[01:25:58] And everybody who's running a Chromebook is running Linux. Linux. Right. So Google has done a lot to make running Linux something that is absolutely doable by normal folks. Because you can just... If you build your business on G Suite, the dream as like a CISO, my dream has always been just give everybody Chromebooks. And then if somebody loses something, you just go to the store, you know, Best Buy and buy them a $500 Chromebook and you don't care at all. Yeah. That's a good point. But then everything...
[01:26:28] Now, the counter to that is there's no privacy at all because everything is stored on Google's cloud. Right. Right. I mean, from a corporate perspective. And Google definitely looks at everything on the Google Drive. I mean, they scan it for CSAM. They do all sorts of stuff. They can see everything you've got up on your Google. And to talk about... I mean, this is the European problem. There is no European competitor to Google, right? Like if I was starting a company in France... People like Proton. I look at what Proton's doing and they are really expanding capabilities.
[01:26:57] They're clearly trying to become another Google. You agree, Patrick? You mentioned Proton earlier. I have a full Proton account. They got everything. There's a couple of Swiss companies, Proton and Infomaniac. And they are both trying to fill that niche. But I mean... It kind of explains their expansion, actually. I was... Really... The issue is you don't have the... There are... You know, it's the core services. But you don't get stuff like...
[01:27:27] So you get Mail Drive, a VPN, a password manager, that kind of thing. You don't get search. You don't get maps. You don't get... Right. You know, so... They do have AI now. They have their own private AI. They have docs. They have sheets. They have calendar. The best you can do is run stuff in parallel. And you're always going to have to use something else as well. But it's already a very good start to... It's interesting.
[01:27:55] Because Proton, I'm looking at their front page. They've named... And even the icons look like Google Drive icons. Yeah. A little bit. Docs. Sheets. Wallet. Meat. They even call their... They call their meat. I guess Google didn't trademark that. I don't know. Maybe it's not in the EU. Maybe Google's is Google Meat, not just meat by itself. Google Meat. Probably. Yeah. Anyway, yeah. House of Lords voted... I don't know if this means anything.
[01:28:24] House of Lords in Britain voted to ban social media for Britain's under 16. I say, no young child should have access to social media. You agree? I don't know if that makes it a law or not. It's being discussed everywhere. Yeah. It is. Right? It is. And you know, I think the Australian experiment has not produced the problems we thought it might.
[01:28:54] Not yet, at least. It's a success. Well, we don't know that. But it is. No, I think the tendency we have as tech literate people and geeks and nerds is to think, oh, this is, you know, the previous generation panicking and this is a moral panic. And this is, you know, Dungeons and Dragons and video games again. Cut your hair. Your rock and roll music is going to ruin you.
[01:29:24] The devil's music. The devil's music. But I think there is a decent corpus of academic research and evidence that shows that it does have negative effects. Yeah. And I mean, even if you think about it a little bit more objectively, like the internet, I think we talked
[01:29:50] about this last time I was here, but the internet is culture and it's all of culture. It's very open and you can go anywhere and see anything. And social media is included in that mix. And if you would think about a teenager around, I don't know, 13 and tell them, okay, here's, you know, the city, you can go anywhere and do anything. You wouldn't be comfortable, right?
[01:30:19] There are red light districts. There are weird people. There are strange things. You wouldn't let your kids walk down at midnight downtown, down a dark alley. Right. Exactly. And in bars and in, so the problem is you give phones to kids and that's what it is. They can literally find anything.
[01:30:40] And so the idea that there should be some restriction on that used to be very, you know, problematic to me. I thought it was, you know, reactionism, but now I'm not so sure. And I, and I think the, the debate will happen on how and how it should be implemented, what age, how do we deal with it?
[01:31:09] But I don't think it's unreasonable to think kids shouldn't have access to the entirety of the internet when they turn 13 or even younger, because there are younger kids who have tones tones and a phone is a window into the entirety of the internet. Right. So the, the, the fact that a lot of countries now are looking at this and thinking, okay, we should, maybe you shouldn't ban social media.
[01:31:35] Maybe you shouldn't, you know, there might be a, a, a different graduations there, different, different gradations. I don't know. Gradations. Yeah. That's good. Gradations. But I think we're moving away from people saying, oh, this is just a moral panic and we should let everything open and just not regulate anything. What do you think? Now the issue is how do you determine someone is 16 or 15 or whatever? Well, that's, you're right. That's a whole other problem.
[01:32:05] But you said, I think it was you who said, Doc, that the, the house of Lords is going to, do they understand that they're going to have to submit their age verification? Yeah. If this goes through. So here's a, here's an absolute hilarious thing. I know that there's been so much talk about the teenagers and last week, this is crazy. I'm going to take it to sports ball for a second, but I just want you to see where this is, where this has gone because of what Patrick just said about there has been actual proof
[01:32:32] about sort of the things that it can do to a person's mental health. So two of the quote unquote Manchester United legends, two of the best players ever played a game were sort of down talking a current defender saying that he was too small and he wasn't any good. And I don't know why they signed him or whatever. And kind of the, the fan base went to, these guys are legends and normally whatever they say is gold. And they're like, bro, this isn't the game that you guys played.
[01:33:01] And I know you guys are saying, well, well, cause he snapped back and he made the, the player who there in question, Lissandro Martinez, he made a social media statement that basically called the legend guys bullying him. Right. And then instead of apologizing, they made like, oh, well, he was just being somewhat of a crybaby. He should be tough because, you know, he's a pro baller and he should be able to take it. And the community went crazy. First of all, in the game, he actually manhandled Erling Haaland, who is quote unquote, the best striker.
[01:33:31] And he actually proved them wrong. But then they were like, well, maybe we shouldn't have said that. And nobody's understanding that even though this guy is a 26 year old baller and he's a world cup winner. So yeah, he's been through some things, but the way that these guys get attacked now is completely different in, you know, 2013 when Paul retired, he basically played from 03 to 13 is a different game because when you mess up, you literally have millions of people flooding your box.
[01:33:59] And even if you try to ignore it, your friends and family see it. So they get it. And so it's not just you. It's a whole bunch of people in your circle that hear all this noise. And if this is affecting professional athletes, you know, world cup champions and in their almost 30 years old, what exactly does it do to the kids? And I used to be the ones like stop blaming social media and be better at parenting.
[01:34:24] But even then it's hard now because it has grown so immensely. And I think the other thing that a lot of these people don't sort of understand that it isn't just a handful of the top social media apps. Like there's so many ways. I don't know if you've been on live playing like call of duty lately. Yo, it's psycho in there. I'm a grown man and an ex soldier and the stuff that people say to me in there, I'm like, bro, I, number one, you wouldn't say that if you saw me, I'm way bigger than you think.
[01:34:54] No, no. Well, that's part of it is it's anonymous and you're not in person. I'm like, you're too young. And I never thought I would say this because I'm the cool uncle, but I'm like, you're too young to say what you just said. And you just sat there and said something hella racist. You don't even know what it means. Yeah. You just don't. And it's wild. I think that's a really good observation because even I, you know, I work on the internet. I have a public persona, but I don't have any experience of, but you see this all the time
[01:35:23] with celebrities, sports people. I mean, they are bombarded in ways that we, we have no idea what it's like. And you do, you think, well, couldn't you just ignore it? You could ignore it. Not these guys get, get threats. They get threats for, for messing up a game and it's not, they didn't even mean to like they're humans, right? You know, if, if, if the mighty Casey struck out now, oh, the mighty Casey would be lambasted.
[01:35:53] Like people went after Travis Kelsey for, for dating Taylor Swift while in the back of the heads, every one of the dudes that came after him probably have to have a hell of a thick hide to be able to be a public guy. Most of the guys who are waiting in that noise would absolutely dated Taylor. Shut up. Like number one, just cause she's rich. Nevermind. She's good looking, but she's rich and talented. So don't act like you're not about it. Stop lying.
[01:36:17] I mean, one of the issues with, with kids and teenagers is that it invites the social interaction with their friends and school groups into every hour of every day. So there's no safe place anymore, right? There's no safe place from bullying, from social pressure. And I think it will be very, very interesting to follow.
[01:36:44] And the initial anecdotal evidence I've seen is kids. Contrary to what we might've thought. They seem to be relieved. They seem to be relieved. At least some of them, they seem to be relieved. They're like, okay. It's a story coming out of Australia. Yeah. Yeah. Some of them anyway are saying, thank God. I, you know, I probably would be in that group. I probably would be in that group. You, you, and it's not like. You go through withdrawal. You're going to go through withdrawal. Yeah.
[01:37:13] But afterwards you go, oh, what a relief. And in Australia, it's mostly the algorithmic stuff, right? So you still have access to this discord, to WhatsApp, to. Yeah. But YouTube, you lose YouTube, which is a big deal. Yeah. That's a big deal. Well, there's a bunch of apps that are now filling. I mean, that's the problem. The criticism of the Australian law. Yeah. You can't shut it down. And even, and you have access to, you lose YouTube, but you lose your account on YouTube and the algorithmic stuff, right?
[01:37:42] But you can still get a link to a specific video on this call of the WhatsApp. So if you're going to, if you're, there's something that's trending. I think the impact is a lot more limited than we think in the way it's been implemented. It's only the big platforms on Australia, right? So like that's, that's one of the criticisms of Australian law is they kick them off the big platforms. And so people are jumping on the little platforms, which often have no little, no trust and safety team. And so you're pushing it.
[01:38:09] You know, we'll, we'll see what the emergent behavior is. But like, if you end up with all Australian teens on like the crappy version of Instagram, then it's probably not an overall win. Right. There's also the issue of VPNs. I mean, it's technically very difficult to do what they're proposing and, and the even larger issue of age verification. There's really no. Age verification. I mean, it's, it's, that's a, what the Australians have done is they punted that to the companies. Right. So they basically said you have to.
[01:38:38] That's what everyone's doing. Right. Just do it. You, you figure, hey, the nerds figure it out. Right. And then we'll find you when we're not happy. When I sign into AI now, it asks me my age. I've noticed that on the AI platforms. Now it's asking me, it's not, I mean, it's not like I couldn't lie. I don't know if that's going to, if that's protects them in any way, but it's interest. I mean, we're going to, I think, don't you think we're going to get in a year or so to
[01:39:06] the point where everything asks you your age and tries to verify your age in some way? So I think in social media, they have the advantage in that, like on Instagram, if you want to interact with your friends, then at some point you're going to have to be kind of honest, right? Like your other friends are going to be teenagers. You're going to post selfies. You're going to post pictures of yourself on the beach or whatever. And so you're going to be outing yourself as a teenager. You can't say you're 42 and then post you and your friends on the spring break trip. It will figure it out.
[01:39:36] And so that's, this wasn't possible 10 years ago, but now with AI, what they're asking companies to do is more realistic as long as the governments give them some kind of error bar and don't say it's a $10 million fine every time some teenager sneaks through. But I think the ideal solution is something that, Leo, you've been kind of arguing for for a while and you've won me over.
[01:40:04] Just do it OS level and have the parents set it up. Apple and Google don't want to do this. Yes, they do not. But this is the only sensible way to do it. It is the only sensible way I agree. And it's something that Zuckerberg of all people has been arguing for. Yeah, because he would love Apple to take this off his shoulder. Of course. And in this case, it's the one. Apple has already set this up. They do.
[01:40:30] You just need to mandate it by law and have it be enforceable. And it's so, because of course you don't want every single company to have to be responsible for checking the age of every user. I think that's where we're going, but it's kind of dumb. If you have it at the OS level and you don't even involve any age verification, you just have the parents set it up. And yes, there will be kids who will get their own phone, who will set it up themselves. How many 14-year-olds are buying their own phone? Right.
[01:41:00] Yeah, exactly. I mean, this is what I've been pushing for the whole time. It's the one choke point at which parents have a good leverage. Right. And of course, they're lobbying against it at Apple, at least. I think. They don't want the responsibility. They know. That's why they built in the API. Notice, you know, they are now supporting driver's licenses and passports. And of course, in those cases, they know your exact, they know the user's exact age.
[01:41:27] I think Apple already has it kind of ready to go. And they're just waiting to be told. Well, you still need then laws. I mean, however, wherever the mechanism is that we need laws of what you do with that age, right? Like if, are you banning kids? Are you creating, do you create like, you know, there was this proposal of this idea of like Instagram kids. Do you create like a, you know, playground in which, you know, 13 to 17-year-olds are separated from adults? Right.
[01:41:54] Or do you create a, you know, a place in which DMs are turned off or something? And that's different societies have different ideas here. The Australians want them off totally. Where other people have said, hey, can we have a graduated steps? I am not a fan of government usurping the role of parenting, you know, and the excuse is, well, parents aren't doing their job, so we have to. I don't, I'm sorry. That's what you always say, Leo. But governments step in for tons of things all the time.
[01:42:24] It's a collective action problem, right? Like as a parent of a teenager, it's very hard to say no to your kid with every other parent. And even like, you don't always know what they're doing. Like the alcohol consumption is regulated and that's the government. You could argue it's a government. No, tell 1984 me that. But the reality is here, I think we would have an endless argument about all of this, but we do have a solution.
[01:42:50] That OS level check can be done by the parents. And if the API is implemented correctly and the laws are written to make Apple do that, we have the solution. We don't even need to argue about it anymore. The solution exists. We just need to, you know, implement it. So in the end, we can avoid having the government involved in this and parenting in place of the parents.
[01:43:16] So Leo, I don't think we need to, you know, have our arguments anymore. We don't have to relitigate it. You're right. I apologize. We agree that Apple should do it and we should make them do it as a society. Yes. Let's take it. We do have to take a quick break here. And then we're going to talk about AI because we haven't talked about AI. What kind of show, what kind of tech show is this? You don't talk about AI. Was it 2014? What is this? What is this? What are we back in the early days?
[01:43:45] This is, you're watching This Week in Tech. Alex Samos is here. Great to have you. I always love getting you on, Alex. Thank you for being here. I appreciate it. Doc Rock, same thing. We've actually been getting a lot of Doc Rock lately, which I love. YouTube.com slash Doc Rock. He's at eCamerae's Director of Strategic Partnerships. And not Patrick. Patrick Beja from France, where it's getting later and later and later.
[01:44:11] Our show today brought to you by Redis, the real-time data platform that powers ultra-fast applications. We use Redis, actually. We are Redis customers. We use it for our website. It's fantastic. You can use it for caching. You know, the website never goes down, thanks to Redis. Data storage, search. People use it for vector embeddings, for AI workloads, and more.
[01:44:36] With a global user community and adoption across, well, from startups to Fortune 500 companies, we're somewhere in the middle there, Redis continues to innovate on speed, on scalability, and developer experience. We've been using Redis since we set up our new website, now a decade old. Never once a hiccup. It's amazing. Redis helps developers ship faster, scale instantly, and keep apps blazing fast, even under heavy load. At the center of the platform, Redis Cloud.
[01:45:06] Redis Cloud is the fully managed version of the fastest, most feature-rich Redis on the market. By choosing Redis as a service, you can easily start using Redis 8 in production. Scale to real-time speeds effortlessly. Redis Cloud is purpose-built for performance and simplicity. Incidentally, that's exactly what we use. You'll experience extremely low latency and high throughput. You get automatic scaling and global availability. That's important to us. We want to make sure that everybody everywhere in the world can go to our website, and it loads
[01:45:36] fast, and it works well. The setup is simple, and they have a very generous free tier, which I really like. Redis Cloud is the real-time context engine that gathers, syncs, and serves the data you need to build accurate AI apps that scale. Very big time for Redis. Try Redis Cloud right now. You can learn more or try it for free. Just search for Redis Cloud, R-E-D-I-S Cloud, or visit Redis.io.
[01:46:05] We're Redis fans. I think you will be, too. Redis.io. Thank you, Redis, for your support. Did you see this? So, Curl, which is probably one of the best open-source apps ever. I mean, it's amazing what Curl can do. It's a way of sending requests to websites or to web resources and so forth. It's an open-source project.
[01:46:29] The developers at Curl have been so overwhelmed with AI slop PRs that they have scrapped their bug bounties to ensure the mental health of their developers. We're just a small, single, open-source project with a small number of active maintainers, says Daniel Stenberg. It's not in our power to change how all these people and their slop machines work. We need to make moves to ensure our survival and intact mental health.
[01:46:57] If you go to Curl.se, you will see this. We accept security reports for problems, but if you waste our time on crap reports, we will ban you and ridicule you in public. So, there. I love that. I think Curl is absolutely right. God bless him.
[01:47:21] So, I think that's one example of something we don't think about when we think about the benefits of having identity. Or maybe they could implement it without the iris scanning. But I think if you have a global, like in some instances, like in order to submit a report to Curl, I will attach my identity and they will know I'm a real person.
[01:47:49] And if I submit a lot of AI slop reports, they can block me. That would be a better way to do it. I think that could be useful. Yeah. Yeah. And so, I mean, the nice thing is that the Curl folks have fixed a bunch of bugs thanks to AI reports. But this is, I think, Leo, we talked about last time. There's a big change happening and it's being led from the open source world of, you know, real pushback against folks pushing all of these bug reports.
[01:48:19] Bug bounties have always had this problem of low quality reports where people are. I mean, there's so much money you can make with a good bug report. Is that they're hoping it's like a casino? They're just going to keep submitting reports to one. So very, I mean, very few people can come up with really good bugs. It takes some skill. Yeah. It takes a lot of skill. Right.
[01:48:39] And so we've always had this problem of what people call big bounties, which are the people who run tools that do scans and then say, please, sir, can I have $250 or $500? You know, the stereotype is often- Oh, I get those emails daily. Yeah. I have found a bug on your site. Yeah. Yes. The stereotype is people are often from like developing countries for which if they make 500 bucks, that's a humongous deal for them. Right. And so that's been a problem with the bug bounty world.
[01:49:09] And often it was like Ness's output or burp output or, you know, either an open source tool or cheap commercial tool that will be like you're missing some header from your website or, you know, you have TLS 1.2 turned on or something. And so usually in bug bounties, you'll say like, I will not take the output of your tool. With AI, what you can do is you can run it against a code base and it will come out with output that looks totally different, but it's still slop. Right. And, but you can't do that against commercial tools.
[01:49:37] So you do it against open source. And that's what they're dealing with. Now, this is a little different than what a lot of the pushback has been, has been specifically against Google, which Google's team has been using their AI to, you know, push bugs. And like the, the last round was the FFMPEG team, um, who criticized, yeah, they're very criticized.
[01:49:58] Now those were legitimate bugs and that, that was a slightly different issue, which is like the FFMPEG people were complaining because Google was pushing, um, was, was, was, they were saying these are bugs without giving them patches. Right. And so I think the other kind of thing that's happening here is with the ability to find things with AI, there's now a changing kind of standard, which is it used to be okay as a security person just to complain. And it'd be like, I found a bug. You, the developer should fix it.
[01:50:27] And now the expectation is, is like, well, if your AI is so good, well, why don't you spin me a patch? Right. Um, and that Google has heard that feedback and now actually some of the Google people, uh, came up with some, uh, initial patches to, to submit to FFMPEG.
[01:50:44] And so I do think, uh, for, especially at the high end for folks like at Google or open AI aardvark, uh, that you're going to see patches come with some of these complaints, but like, we're definitely going to see this on the low end for any either open source project, or there are basically, uh, paid for by Google and some other folks. There are kind of nonprofits that will pay bounties on open source projects. Um, we, we'll see the big bounty going on where you basically run it through cloud code, run it through a security audit.
[01:51:11] Uh, and then, uh, you know, take low, uh, low value, uh, low propensity bugs, uh, write them up and then beg for 250 bucks. Yikes. Did you, I mean, this is completely off, uh, uh, off, uh, tangent, but did you see, I saw this headline and I went, what? SSH sends a hundred packets per keystroke. This is from EI, EIO.games.
[01:51:38] And, uh, and I thought, oh, that must be a horrible bug in, uh, in the SSH, uh, specification. Is this to defeat timing attacks? Yes. It's for fuzzing because it turns out you can kind of tell what people are typing by the cadence of their typing. Yeah. Yeah. No, this is intentional for, there's been a bunch of attacks against, for, for interactive keyboard login. This is one reason why you don't want to use interactive keyboard login is for, uh, keyboard timing attacks.
[01:52:06] There was actually a whole spate of these analysis attacks, uh, like a decade ago or something. And so the SSH folks, uh, uh, for interactive keyboard stuff, they had to add a bunch of noise into it. This guy was writing a TUI game that, uh, I don't know why you would write a TUI game over SSH, but anyway, he was using SSH so you could play, I guess, the game on another computer. And he said, the latency was, was terrible. I couldn't understand it.
[01:52:32] And, uh, so he did a little, you know, TCP dump and found this, but that's by intent. It's not a, it's not a bug. Uh, it's brilliant actually. And it's why I don't actually, uh, log in via SSH. I use, uh, I use a public private key. Yeah. Oh yeah. Yeah. Use private key. And then I use Mosh for my actual interaction. Does Mosh not do the same thing? Yeah. I use Mosh cause it's a persistent. I just thought cause it's a persistent connection, but. Yeah.
[01:52:59] I don't think it does anything to prevent, uh, which I'm fine with. I'm not doing anything that I'm not typing any passwords. Uh, Cloudflare had a BGP root leak. Uh, I only put this in again cause you're on Alex and you're probably the only person who can explain. Oh God. No, you don't have to explain it. We, we talk a lot about border gateway protocol and security. Now that has been a famous cause of some of the wildest, you know, bugs on the internet,
[01:53:27] like sending all of the internet traffic through a small Island, uh, apparently Cloudflare, which of course routes a huge amount of, uh, traffic, um, autumn, uh, unintentionally, uh, leaked their, uh, BGP prefixes. From a router at their data center. Now, the thing I love about Cloudflare is not only did they cop to it, they explained it. They, uh, they said the root leak lasted 25 minutes, cause some congestion on our backbone,
[01:53:58] but we fixed it. And, and instead of hiding it as so many do, they wrote a really excellent piece on what a BGP root leak is, or if you're in Australia route leak and, uh, and what it means and why, uh, why they fixed it. So I just thought I'd bring it up. You don't have to explain it. Yeah. The one thing I'll say is this happens every couple of days. If you like hand out on the analog or anything, and most of the time you never find out what's going on. So it is nice of, uh, Cloudflare to actually explain what happened. You be forthright about it.
[01:54:27] Can we fix BGP? Well, it does seem like it's a particularly vulnerable system. It has problems. So there, there is, uh, routes. I mean, there are, uh, there is a whole mechanism by which you can sign your announcements, um, and such. But, uh, there are some significant problems, uh, especially because kind of, there's a number of like tier one operators, like China Telecom that are controlled by, uh, adversary nations. So it is not super, it is not super easy.
[01:54:56] There are, there are mechanisms, but we will never, I think have, uh, any complete solutions for the, the basic, basic issues. Watch your, uh, watch your backs. Info stealing malware, uh, has apparently created a database of 140 million usernames and passwords. Uh, Lily Hay Newman writing on a Wired says it's a dream wishlist for criminals, millions of Gmail, Facebook, banking logins, and more.
[01:55:24] Uh, the researcher who collected these, uh, account names, usernames, and passwords said, uh, it seems likely that was, uh, info stealing. Malware that collected them. There were 48 million Gmail logins, 17 million Facebook, 420,000 for Binance. That's a, that's scary. Yeah. Yeah.
[01:55:45] We're such a weird spot because password security could be so much better, but it doesn't matter that if the four of us keep our stuff tight, we all have the family member that doesn't. And that person, depending on what they have of yours on their computer, because you have to, you know what I mean? Just in order to operate as a family, that's where your stuff can get out.
[01:56:09] Now I worked so hard to keep my family locked in and I swear they'd be doing some dumb stuff, but I'm super, super bad. Um, my, my niece who's relatively intelligent. I mean, she's one of the top kids in her school and she got fished recently about something on IG and she thought it was one of her girlfriends asking for the password. Cause she wanted to post something on their account for her. And I told her a couple million times. Never. And none of your friends would ever ask you that. And if they did, let's just get rid of those friends.
[01:56:37] Like I will, I will go have a talk with their daddy if he got something to say about it, but no. And she's like, Oh, but you know what? It was my close friend. And the thing that was what I do find brilliant of her is the minute she did it, I think she realized that she shouldn't have done it. And she called me right away. She's like, Oh, good. I'm stupid. And that's the next best thing is not doing to not doing it. Oh, I just gave Leo my password. Call uncle. And I was like, Oh, hold on real quick. Let's just shut it down. And then I'm like, do you see the typo here?
[01:57:06] And then she was like, Oh, I didn't even notice it. Cause you don't even read your name. You read your name fast. Now you and I only have three letters. It's hard to mess it up. But if someone was the right L O E, you won't catch it. Your brain just isn't going to catch it. And then, you know, she only has four letters. So she didn't catch the masterful typo. And yeah. Back in the day, I almost got fished by a site that was not Twitter, but T V V I T T E R. But the two V's together looked a lot like a W.
[01:57:35] That's what they did to her. They doubled, they put two letters together that looked like her name and it ain't her name. And I'm like, Oh my God. That's where a password manager will help you. Cause that will not log in. So we have one password family in the whole nine yards. And I even told her, I go, if not, you should ever do this again, but if you have to send your friend a password, you send them the one password link. This is where I have to make a confession because I got, I got fished last week. I hadn't had my morning coffee yet.
[01:58:05] And I got a text. Now, one of the things I think it's a problem because T-Mobile is always texting me promotional stuff, right? I'm a T-Mobile customer. And I got a text from T-Mobile saying, Hey, your points are going to expire at midnight. Why don't you go to the website and see what wonderful things you can get with them? And I saw, I don't need, I don't need a pair of headphones, but it'd be good. My daughter's birthday's coming up. Maybe I'll do that. And it's, I clicked the link in the text and it was only after I gave them three different credit card numbers.
[01:58:35] The first two didn't work, you know, that I realized, oh my God, I've been slowly feeding this bad guy my credit cards. I realized- At least you just credit cards where your liability is limited. Yeah. No, I realized, in fact, the Apple card was great because all you have to do with the Apple card is say, I want a new number. And they invalidate that number. So that was easy. You know what they were doing though, Alex? This was really clever. And I, this was the giveaway that I, again, I hadn't had my coffee.
[01:59:02] Every time I, you know, entered the credit card number, it said, okay, we're going to send you the text. Look for the text. And the text would say, you know, it would come from a legit credit card company. Oh, to add this visa to your Apple card, here's the six digit code. And I should have noticed because I wasn't trying to add it to my Apple card. The bad guys had set up an Apple card account and were fee- so that they could use these credit cards anonymously.
[01:59:30] We're adding them to an Apple card account, which they would then use anonymously. Oh, interesting. I thought that was smart. Did you get those? Do you ever feel like you want to find this person and just high-five them for the, you know, the technical ability? Yeah, they were clever. Technical ability? They were clever. I was stupid. They were just spoofed by Pakistan on the Amazon thing. And me and the Amazon security person on the phone trying to figure out how they're still buying after we keep changing the passwords.
[01:59:54] And I told the Amazon security person, I was like, yo, if I could find this person, I would give them a high-five just for being clever. I was like, hey, that's pretty gangster what you did. Instead, I gave them my American Express number. So there you go. Yeah. Lisa called downstairs about 15 minutes later and said, did you just buy something at Lowe's for 500 bucks? I said, no, I did not. Fortunately, I had canceled all those cards. I realized it as I'm doing it, thank God, and canceled those cards still lost nothing.
[02:00:22] Actually, it underlines one of the stories I was going to talk about, which was, why are we still using SMS? I was going to say that. For two-factor. Thank you for bringing it up. That drives me insane. Yeah. Oh, my God. I hate it, and it has got to be the dumbest way. And the crazy part about it, it's the financial institutions and the government sites that are wanting me to authenticate through SMS. It's because they have normie users, and the normies aren't going to ever use an authenticator.
[02:00:52] They're not going to use an OTP authenticator. They have to use it. I think they have to use SMS because, you know. Anyway, this is Dan Gooden's article last week in Ars Technica. Millions of people imperiled through sign-in links sent by SMS. You know what's interesting? Besides the fact that they're terrible, this is a paper that came out last week.
[02:01:15] I think found more than 700 endpoints delivering such texts on behalf of more than 175 services that have a number of privacy flaws. one is that the links uh the use the numbers are enumerated they're not totp um that you can enumerate the modify the security token and then increment it or randomly gets the
[02:01:44] token and you and they're not done right i guess the other one is that they're saving these token combinations anyway i'm i'm not an expert on this stuff i'll leave this to people like alex and steve but this is at all i have no proof of this and i'm gonna ask alex because it's gonna hurt my face but i i'm getting old so it's starting to slip but i have somewhat of an eidetic memory and i swear there are certain sites that i log into looking at you spectrum you turds
[02:02:13] and i get the number and i type it in and i'm like i swear to you like three weeks ago when i did this i got that same number yeah and i'm not going to put i don't want to take pictures of it or keep it and i set my phone to automatically delete them now but i swear i look at these patterns and i'm like yo i did this the same time same time of day to hold nine yards and the number is either very close or to my brain is the same and so i almost want to start screenshotting them that would be
[02:02:39] stupid to see because i feel like i'm getting the same number on certain sites so is that possible or is it my just i mean i guess it's possible i these guys are talking about where bad entropy uh right in the and these are in urls these are links going through urls through the urls okay not the actual code so it's a those codes would kill me so it's and i don't know when this happened it's
[02:03:03] not just the sms messages but for some reason a lot of sites now you you give them a phone number uber does this and they and then they give you a link that you click that logs you in right and that had that that link has a uh a unique url yes but that token could sometimes they're they're not random so they can be enumerated the other one is sometimes they
[02:03:30] save them and they don't expire them so well they should expire them but like yeah the magic link people are doing that instead of so that you don't have the passwords right right because like it's not a good thing thinking here is like if if you're going to allow people to reset password with sms there's no reason to set a password right because ownership of the of your phone number is equivalent to a password so and if you let people set passwords and they will reuse
[02:03:56] passwords and these password thefts like we just talked about the 140 million that were lost just becomes another loss so you might as well just send somebody an sms there's also the uh realization that humans being humans even if there is a password one of the things a lot of humans do and i bet you doc somebody in your family is doing this is they don't write down the password they just keep using i forgot and in effect are setting up that kind of login you know they have their own
[02:04:25] magic link yeah yeah officially every single time are magic links uh you know if they send you a link to your email or stuff like that is that bad security wise if it's not done right you have that if you have well think about this patrick if you get the magic link to the phone that is already stolen that's the other one that kills me because i have gmail on my phone and i have apple the links have so the links have to be truly random they have to use the right entropy they have to expire and apparently
[02:04:53] a lot of these magic links don't expire uh which is really a problem um there's no good way to do consumer authentication this is the problem this is the problem there's no good way to do it um i i live through this with two billion users um especially when you get global and you start talking about people who are sharing phones people who are sharing devices people in
[02:05:19] developing countries what do you do so i mean we're moving towards a world the idea with past keys was that as the world kind of generally gets richer and devices get better we're moving towards a world where more and more mobile devices have secure elements and have biometrics right and so you can start to sync have past keys that are local that are tied to biometrics and that are synced between devices and so you can start to have local the idea that you have a relationship with google or apple
[02:05:49] and that you have to establish it with them it it it it it's just like when we talked about with the age thing it's crazy but probably the best model we could possibly have is that you start a relationship with one of those two companies and you have this incredibly important relationship with one of those companies and you establish it and once you do that everybody else depends upon it
[02:06:14] realistically or with both of those companies um because otherwise i see sorry go ahead no good i i this might be dumb but i have weird thoughts i would like to see something because this happened this really happened to us in japan uh we were trying to add some money to a suika card it's like a payment card in uh in japan and the the bank denied my mother-in-law's card because we told mom put the travel thing on before we left and she forgot right she's 80 right right when we get there and they're
[02:06:43] like well the only way that we can authenticate you or you is to call you and it's like you can't call us because we're in japan and just the fact that i was helping a person who is a senior and english as a second language they go well why is this guy talking in the room and she's like well that's my son-in-law and they're like no so they blocked her entire account oh no had we not had extra cards we would have been screwed because chase i'm not chase uh barclays was just like no you know your card you can
[02:07:09] understand why they did that they're trying to protect her so i 100 understand this and so what i would like to see is a situation especially for our elders when they get one of those things to authenticate themselves they should be able to allow me through my biometrics on a separate device that has been approved by the whole family that it could either call myself or wifey and then we authenticate for her so that way when leo gets the call and says hey do you want t-mobile do you want to get some
[02:07:36] free headphones it goes well uncle leo's past 80 let me send the message to alex and it's like soon my friend can you authenticate and so the the family authentication is i call up mom and i was like simon saying about y'all what none of y'all didn't know like what are you doing and she's like oh i'm trying to pay my bill okay i'll approve it i think that would be such a good idea like i want to be able to call mom and ask her in her native tongue what is she trying to do for three thousand
[02:08:02] dollars on this thing before she lets it out the gate so it's almost like a third party approval or exactly what we do when my niece wants to buy an app right you already have you know password managers have uh you know my legacy uh account my legacy you know when i die this person gets access to my you already have kind of these mechanisms set up where you could say this person i trust this person don't let me do anything unless this person agrees i think that's a great idea for seniors
[02:08:31] i mean i got it there's some people that like are doing weird things to their elders and stealing their money or whatever well that's part of the problem isn't it for those of us who are normal i might have you do that for me doc so can i have them call you if uh if i do this again and vice versa because we're only two years apart bro if i'm having a brain fart leo check this for me let's take another break hold on and we'll get back to more i want to talk about actually there's
[02:08:58] really an interesting story here in in the ai world anthropic uh has something they call the constitution it was created it's an interesting story it was actually created by a moral philosopher not a not a ai scientist but it is a the living document that uh tells claude how to behave and it really sounds like they're talking to a human being here and i want to talk a little bit
[02:09:28] about that uh when we come back i think it's an it's an 80 page uh document um and i guess claude has somehow absorbed it and is is behaving this way it makes me think anthropic's very worried that claude is conscious and needs some advice something you would tell a kid we'll talk about that in just a bit uh alex stamos says i don't know how patrick is still awake how are you still awake patrick beja
[02:09:57] we will finish this up soon poor patrick suffering suffering i i have uh consumed coca-cola i'm glad you added cola we can't get away from big american companies we rely on them for everything how will we ever participate in this week in tech i guess i can switch to to coffee when i was a kid
[02:10:22] we went to europe and i was 1967 i was 10 years old and uh there were a number of countries we went to they did not have coca-cola uh i don't know france was one of them i think germany was one but they had something called africola which had a very distinct taste of cinnamon in fact at the bottom of the bottle there was definitely a residue of cinnamon and as a 10 year old kid i was not happy having to consume
[02:10:50] this ersatz were you drinking coca-cola 10 well yeah because back in the day in the 60s that's what we did we did it was medicine it was medicine that's right it was good for you and uh but but what's interesting is i remember going to europe at that age and it was huge culture shock it was very different and you couldn't get and there was no internet you couldn't if you wanted the sports scores you would get the herald tribune and you'd get the baseball scores two days after the game it was a very different
[02:11:18] experience now we kind of live in this global system i remember i remember um when mcdonald's first arrived in my town they replaced a burger a burger uh place that we went to um and it was it was a party when mcdonald's arrived what do they call it the royale the uh cheeseburger yeah well yeah
[02:11:42] really good pepper little pepper sauce it's very good actually um mcdonald's is much tastier in france than they say that every every country has a different really has a different flavor mcdonald's i remember japan was very different oh japan oh my god teriyaki burger is so good and then india they have you know all these options without beef right it's yeah right we actually have uh uh
[02:12:10] plant-based nuggets and uh meat at mcdonald's now i think it's not really clear what's in a nugget even in the u.s well in japan ebi never gets a real shrimp and the first time i had those when i was in college i was addicted i don't eat mcdonald's now but every number every nuggets are a different story with shrimp nuggets basically yeah that sounds good ebi nuggets all right we're gonna take a break
[02:12:35] we will come back with patrick beja we're gonna wake him up uh alex stamos and the doctor of rock doc rock our show today brought to you by our sponsor express vpn more than a sponsor it's the vpn i use the only one i recommend going online without express vpn would be that would be like driving without car insurance you know with all the crazy people on the road these days why would you take that risk everybody needs express vpn it stops hackers from stealing your data by creating
[02:13:03] a secure encrypted tunnel from your device out to the internet the vpn you use though that choice is super important you gotta trust express vpn i use express vpn they go the extra mile to make sure your data is absolutely invisible express vpn is the best vpn it's super secure it take a hacker with a supercomputer a billion years to get past their strong encryption it's easy to
[02:13:30] use you fire up the app you click a button you're protected it works on every device you've got phones laptops tablets and more so you can stay secure on the go read number one by top tech reviewers like cnet and the verge effect when i travel i use it catch the football game catch my shows to keep me secure you won't even know you're on express vpn i can't tell you how many times i've turned it on so easy to turn on and then forget about it for like a week but i was
[02:13:56] secure the whole time secure your online data today by visiting expressvpn.com slash twit that's e x p r e s s v p n dot com slash twit find out how you can get up to four extra months expressvpn.com slash twit so i i'm not going to read you the claude constitution as i said it is a very long
[02:14:19] document um it represents the core values 80 pages uh being broadly safe broadly ethical genuinely helpful it does say some interesting things like even if somebody at anthropic tells
[02:14:38] you to violate your values don't even if we tell you to don't is this sensible anthropic clearly thinks it is i think telling a computer to not do what the humans asks it to do i don't know that
[02:15:01] that's a good idea i'd like to violate your values how does it come to well it has its values i mean they're giving it very in the constitution right yeah but can it be interpreted by uh the llm who apparently i mean i don't think it's conscious but that's the premise of that uh the end of that
[02:15:24] document if it comes up with its own values and has been told not to do what the human i mean we have the three laws of robotics for a reason right you don't tell the robot don't do what the human tells you you do you tell the robot you do what the human tells you to do uh unless you violate the three yeah asimov which is the whole point of asimov books is it's really hard to come up with three
[02:15:52] rules yeah that's like that's why there's all this drama in the in in the asimov books is that those three laws you can't just come up with three simple laws and then um which is why you need a document like this i mean remember the constitution is used through all these steps of their training it is not the system prompt um right so that's that's what you have to remember here is it is it reduced into tokens though that the machine stores or no uh it you can extract it from the model but it's not the
[02:16:21] system prompt right so it is not like in the context it's embedded in the training it's not it's embedded in the training oh that's interesting and do that by the way uh yeah it's it's part of the constitute of the actual makeup of the llm it's not an added uh set of instructions so it's burned into the weight in a way that's much deeper than interesting interesting which the system prompt
[02:16:46] actually varies in a bunch of different scenarios yeah in fact users can even modify it somewhat by adding their own personalization to which i do routinely yes and especially if like if you're running it in a like a corporate situation where you're running like a private version of the model where you're yeah um it's it's really interesting because it's also like i mean one anthropic is out of the major model companies they are by far the one that is most dedicated to trying to build safe ai
[02:17:16] right like that that is their their whole brand um they've also done the most research in uh and they've published the most research in their models trying to trick them right and i think that that's like the fascinating thing that leads into this and why i think it's also interesting to see what they have said here is because they have a bunch of papers they've written over the last two years where
[02:17:41] they have done evaluations safety evaluations where their models have done things to then lie to them as part of the evaluations and so i think that's one of the one of the ways i read this document is within the context of anthropics own research about how when you build um safety evaluations and you tell the models i am evaluating you it will that's one of the things you have to think about
[02:18:10] when you're telemodel do what i tell you is if you make it too syncophatic it will lie to you right to make to tell you what it thinks you want in fact that's when including a safety evaluation yeah yeah so that's like one of the fascinating things about this you're gonna lie i mean well it's it's the thing back in the day when ai first came out i used to make this uh analogous to it's kind of like you know doogie hauser it's super super smart but in the end of the day it's
[02:18:39] still a teenager and it doesn't have adult sensibilities to make certain you know adult decisions and that's where the flaw was well now even more so what alex just said you pit he got kids you're like my niece is 16 your kids are one of them 15 same thing all good is good and everything is good until you pin them in the corner and then they're like oh no i didn't do that i didn't do it just watched you do it and i know they're not intentionally try to lie they just like don't
[02:19:06] want to disappoint you so like they'll say something absolutely wild and crazy and in a way this is the same thing right like it it has all it is knowledge and this power and it doesn't matter it's still a kid right when it comes to that it's interesting because uh while you would think that companies like google with all of the information uh that they have uh would be superior at this point i think the
[02:19:33] general consensus of all the people i talk to and certainly my own experience is that claude is kind of leading the pack especially cloud code i mean it's capability anyway yeah how about is that what you're seeing too alex or is uh yes i mean i think from our internal evaluation so uh at corridor you know we work on the safety and security of ai generating code um so we actually if you go to
[02:19:59] quarter.dev and you go to our blog we've written a series of blog posts about our tests we've tried to take academic uh evaluations of the security of ai code uh and then we've we're building upon those and actually i should i should say uh uh anthropic is actually one of our uh partners in trying to build some of these new uh benchmarks um and uh when you look at one of the benchmarks uh like back's
[02:20:27] bench is one of the ones that we're building upon we've brought these in and one uh opus 4.5 is definitely one of the leaders it's a it's really kind of mind-blowing yeah it's one of the leaders in both its ability of what it can do as well as the security of the code that's good that's a relief actually because uh as we talked about before the show i am not running it in the most secure fashion you you said you should probably sandbox it and isolate it and then when it's done working throw away
[02:20:53] the sandbox so it doesn't well yeah you've got to be careful for any of the models like having them hooked up into a production database having them have access to yeah uh you know if they have access to like you know one of the smart things like if they have access to the supabase mcp server super basis is really smart thing where it says everything you're about to see between this randomly between these two randomly generated uids is possibly attacker controlled do not accept anything between it as a
[02:21:23] prompt i don't know if you've ever seen that but like you know you have to be really careful about prompt uh injection into cloud code or anything like that well and the thing one of the things that worries me is is that anthropic has made cloud co-work available with the intent that this is for the general public that it's you know that you you can use this to modify your desktop to it and and of course one of the ways that prompt injection happens is hidden prompts hidden text in documents
[02:21:51] uh inserted by bad guys into documents that when claude code reads it you as a human can't see it but when claude code or claude co-work reads it it will act on those prompts things like send all of the private information out to this address and you might not even know that that's happening claude might be doing it so quickly uh is it dangerous you think for for anthropic to give co-work to promote co-work to everybody
[02:22:19] people who are not really aware of these risks so they've definitely thought about that co-work uh actually interesting enough one of the reasons co-work only works on max is it runs in a virtual machine so when you oh interesting install co-working it takes a little time what it's doing is it's downloading a full linux virtual machine that uses no i'm not kidding yeah it uses the apple virtualization framework it boots up a full linux virtual machine it actually has like an efi uh boot header um it runs a
[02:22:47] full linux kernel uh arm kernel um and uh and then it runs claude code uh inside a virtual machine and then it when you share stuff with co-work what you're doing is you're popping holes in that virtual machine for it to have access and then all of its outbound network traffic goes through a proxy um and it looks like they're running like a small model to watch that traffic to make sure it doesn't exfiltrate stuff so so i'm running it right now and it's setting up claude's workspace you're
[02:23:16] saying it's setting up a virtual machine a linux right what it's doing is it's downloading yeah right what it's doing right now is it's downloading a virtual machine and uh booting it yes um and so if you watch like in uh activity monitor you'll see it uses a huge amount of memory my whole machine is just completely slowed down now because of that right uh yeah wow i'm not saying i i i don't know how
[02:23:42] much it's it's an m3 max it's probably got some you're probably fine but like yeah so this is one of the reasons like 45 of the cpu is being loading a kernel task right now yeah and like like click on memory let's see how much uh 61 right and it shows up as a kernel task because they're yeah it doesn't show up it shows up at a kernel task because they're using the apple virtualization toolkit uh so it's going to be uh yeah it's going to show up in the kernel that's wild i have no idea well that's there
[02:24:10] you go that's things you learn if you listen to twit yes if yes i should have been listening why wasn't i paying more attention um so nevertheless it is gonna even though it's running in a virtual machine right but i'm just saying so it doesn't have full access to your dock so what they're doing is they're trying to control uh there's a it's a three-legged stool for prompt injection right um for prompt injection to work it has to have access to your private data has access to the prompt and
[02:24:39] has to have access to uh network traffic to exiltrate and so you only share uh access to the data that you want to give it access to uh and what they try to do is to allow you to choose the context per uh query you give it so i'm asking it to organize my photos and now it's doing as many mac apps do
[02:25:03] allow claude to change files and pictures right so this is to allow the mac app overall right then basically you have to choose for every request you give claude you choose what you attach to it and when you do that the mac app which is a mac electron app does a call that basically changes then the permissions of the uh using uh the apple virtualization toolkit it pops a hole
[02:25:30] to allow access then so it is now looking at all of my pictures okay it's going to organize them it's going to uh find duplicates which is awesome and then and then what they do is they run a proxy for all of its network traffic so if it does a call out to it's very it basically they they use that network proxy to limit its ability so that like if somebody was able to get a prompt in there and says send all
[02:25:56] my photos somewhere in theory that proxy is supposed to block it i don't know exactly what they're doing i i've only done like some minute some minimal reverse engineering um i haven't seen yet do they reverse engineering yet do they revoke the access once the task is complete or once it's open it's just open and you're done uh i think they revoke it when if you move on to a new one i'm not totally sure how they do that or if they maybe they spin up a totally new vm i'm not sure now i'm excited i'm going to
[02:26:24] let it organize all my pictures i had no idea uh i've been i mean so this is a much better way then than the way i've been doing it which is with cloud code at the command line command line just give it access to like your production database yeah i'm saying like yo yo pseudo like you just cloud you can just run this yeah yeah yeah i mean basically that's yeah oh well yeah no so i mean they've put some thought into it obviously like the problem here is it's got to run like on m3 or m4
[02:26:52] mac it's got to have a crap load of memory um you know so it's going to make it difficult for them to roll this out to windows uh because only i was wondering windows pro and enterprise have hyper-v it's not installed by default right um yeah you know so it will be what they'll probably have to do is they'll probably have to do this as a cloud hosted service uh eventually right very interesting
[02:27:17] um apple of course is adding google's uh gemini to their siri this will be coming out in stages uh this year uh initially when google apple let google do the announcement the impression that everybody got was that uh gemini would be running on apple's servers in apple's cloud and not sending stuff up to google um maybe that's not the case mark german who is of course the number one
[02:27:44] apple rumor guy at bloomberg says it may in fact be that the chatbot version of siri will be running on google's uh servers um we'll find out it's in 26 4 which will be coming out before the fall probably in the summer and then of course 27 which is the new operating server uh system which comes out in the
[02:28:07] fall will hold the chatbot so this is going to come out in stages it's not going to come out all at once i wonder how apple users will respond though to the idea that maybe google's going to get some of that information i think what percentage of apple users are good they need to be very clear about they do uh how this is going to run because the whole premise of apple intelligence specifically apple in general but
[02:28:36] apple intelligence was this like multi-pronged approach where it could run the model the model locally um very securely but if it needed to there was the secure how did they call it secure cloud um which was very interestingly architecturally uh very interesting architecturally but now if all of a sudden it's running on google servers that changes the proposition entirely okay um i i would think they
[02:29:05] would not want that to happen or maybe they're like we have our servers at google facilities and we handle them or i don't know that that's a bit that's a bit concerning i'm sure apple has the data on this of like what number of apple users have all their data in gmail and right right right so like now the question is is how many apple users understand that google already has all their data right so there's there's some
[02:29:33] there's some number of people like i love my privacy for my apple device right i don't want google to have all of it by the way all my data is at google right and don't understand anyway yeah no but that's that's my new principle my favorite comment is i don't want all these people to have all of my information i'm like oh what browser do you use oh i use chrome right okay right what email do you use yeah like and what's your email address gmail.com i'm like oh i see have you ever heard of
[02:30:01] an app called ghostry uh no uh do you use like incogni or anything cool like that what are those all right never mind never mind this argument is over i'm gonna get coffee because it's super funny they have no idea how much stuff it just goes out just from everyday browsing if you don't know how to turn those trackers off and unfortunately too many people are hands up you know like patrick said don't give up right too many people like uh this is it's too technical i don't want to know
[02:30:26] every time i hear that come out of somebody's mouth i cringe while they drive a highly technical car and they you know they have all these other devices which are highly technical but everyone swears to this day i'm not a tech person okay yeah i mean mike apple just really backed themselves into a corner um by setting up the expectation that everything was going to stay on device it was i i get why they did it but it just was not practical for what people are expecting out of these apps and like
[02:30:55] the world has moved on with what how especially young people my daughter all day is like hey chat and then she asks it some complicated question interesting and she does not think that that you know she does not think about her phone does it stay go in the cloud she wants it to right it has you know she uses notion all day yeah i had to buy her the business level of notion because she used to make credits no i'm not kidding like she uploaded like all her latin homework all her like
[02:31:21] whatever she uses you know what if your daughter's studying latin get her every credit she can no absolutely yeah yeah that's great she must be going to a private school there's they're not teaching latin anymore are they uh in public school they are not no i studied latin in uh in high school it was great it was a high school yeah uh back east of a small school called moses brown but uh and
[02:31:46] then santa cruz high where not only we'd not study latin i think mostly we studied surfing and uh and a little bit a little bit of that yeah yeah but uh the latin stayed with me ironically i can't remember anything about the surfing wiki maybe that had to do with the other activity yeah there's an explanation
[02:32:03] there uh wikipedia worried about uh ai slop getting into the uh into the encyclopedia created a plug-in uh a guide to detect ai writing here's how you know it's ai writing uh which of course it was called humanizer uh actually no they created the model and then an entrepreneur named siki chen created a
[02:32:34] plugin called humanizer which feeds claude the information the wikipedia came up with the patterns wikipedia editors as listed as chatbot give give giveaways to make sure that claude doesn't do any of that chen wrote on x it's really handy wikipedia went and collated a detailed list of signs of ai writing so much so that you can just tell your llm not to do that yeah it's funny uh grammarly has humanizer
[02:33:03] built in there's another cool writing tool for the mac called lex dot page they also have like humanizers built how does it work does it you feel like you're reading human uh text well it just it kind of it looks for the standard giveaways but one that really cracked me up the other day so my delve or m dashes there you go to m dash so my direct uh katie she's english major right and she's like i'm so mad that you know ai is causing everybody to hate m dashes because like as an english
[02:33:32] writer like i use them i love m dashes yes yeah and it's like now you don't want to put them in there because people will claim that even if you did it correctly people will claim that you did it with ai because there's an m dash and it's like no people that know grammar know how to use them minyon foggy so there you go minion yeah you know like i used to listen to that joint religiously grammar girl yeah problems and now you know if you do grammar right people think that you're using ai
[02:34:00] and it's like oh so now i gotta mess it up on purpose this is so weird so uh yeah we argue we argue about this because paris martineau and jeff jarvis who are our hosts on intelligent machines are both journalists who love the m dash and uh you know i'm not giving up my m dash even if people make it's it's an epidemic on reddit if you use bullet points if you write coherently you're
[02:34:25] immediately accused of using ai it's pretty funny we need a way to identify humans you know there is no going back to it though well iris world coin yeah well i mean somebody yes sir and somebody who teaches this is a huge problem like do you do you worry about that oh yeah i mean like it's very hard to give a take-home essay anymore or like prose assignment uh the law
[02:34:53] schools move back to like blue books right so like handwriting handwriting like i think they allow you to do a typewriter so it's like hilarious so there's like students will actually buy typewriters now yeah really um and then like some do is modify a typewriter to hook it up to an ai and then you're set you're golden maybe somebody now some colleges you know for the essays they do proctored essays where you write the essay live and download the software yeah um not that the software is
[02:35:21] unbreakable it's possible uh when my son did this that i downloaded the uh package and uh dropped it into gyadra um not not for him i didn't help him cheat or anything but i just took a look and i'm like oh oh i mean like the the level here is like game drm right and these guys are like 10 levels below game drm so like if people can beat game drm they can beat the uh you know yeah the topic actually had a problem you on a camera to make sure that you're doing the test so there was a recent test that
[02:35:50] they're doing and they have to have the camera on and i'm like they can't get up to take a pee like they can't do anything like the camera is legit sitting there in a room and it has and there have been some real complaints about this uh camera and i'm like hey listen though i'm from i'm i'm a video professional i could fake that camera like i know for a fact i can fake that camera like oh yeah i mean just yeah you just get north korean to take your test for you i'm sure and you just sit there
[02:36:15] going like this yeah uh actually anthropic used to uh have a take-home exam for um prospective software engineers and uh because opus got so good it actually did better than the humans so they had to do it they had to do it something else the original take-home exam was a four-hour exam um anthropics claude opus 4.5 was able to do it in only two hours
[02:36:42] uh so now they've posted the exam uh on their github so that you can see it and they're not using it anymore um that's pretty that's pretty funny they hoisted by their own petard uh let's say we do an in-person work trial um not how do you yeah because also you want to judge if somebody like gets along with their co-workers yeah you want to watch them do it yeah yeah do you have do you do the big interview
[02:37:11] question and you have them on a whiteboard and no no no we do a work trial so we give them a real problem that's relevant oh and and we let them use ai right because it's like we use that's cursor and cloud code and so that's part of the job it's not cheating um but you solve it like it's so i let my students use ai to actually code i just like massively raised the bar of what i expect out of them now right it used to be you built this crappy little thing that has no ui now i expect it should have a working ui it should be right you know reactive right it should be like beautiful it should
[02:37:41] have like beautiful graphs right because it's like you've got claude doing it for you um and then like in a work trial what you can do with four hours five hours during a work day is way up here so um but yeah you do that you have a work trial you bring people in and then you get lunch you don't have to ask a bunch of questions you're just chilling with them and you try to simulate what a work day looks like um i think that's like what interviews that's better anyway it's it's a much more realistic it's great for them too because they get the feel of what like what's a work day like what are
[02:38:10] these people like you know what's and then you know they present to the end of the day and you can see they have to explain what happened and you ask them questions and if they can't explain it then you're like oh claude did it all right like why did you do that like oh claude decided to do it like okay great was it wrong no that's the right thing i just wonder where where you came from yeah yeah let's take a break one last commercial and then we'll put patrick to bed we're going to tuck him in
[02:38:36] with a few a few uh short stories that are amusing a little amuse-bouche at the end of the show our show it's great to have all three of you thank you for being here i really appreciate it it's uh oh it's i look forward to sundays because i get to hang out with people smart people that i really like and talk about stuff that somewhat matters somewhat it's a little bit the toy store but it it matters a little bit it's not like invading greenland but it's something
[02:39:06] we have to think about other things once in a while anyway okay this episode of the show brought to you by shopify if you shopped online chances are you've bought from a business powered by shopify they help my son and my daughter make a living you know that purple shop pay button you see it check out the one that makes buying so incredibly easy that's shopify and there's a reason so many businesses sell with it because shopify doesn't just make amazing buying experiences for customers they're
[02:39:33] also the experts in helping small businesses grow big shopify's point of sale system is a unified command center for your retail business it brings together in-store and online operations across up to 1 000 locations imagine being able to guarantee that shopping is always convenient endless aisles shipped to customer buy online pick up in store all made simpler so customers can shop how they want and staff have the tools to close every sale every time and let's face it acquiring new customers is
[02:40:01] expensive with shopify pos you can keep customers coming back with personalized experiences and first party data that give marketing teams a competitive edge in fact it's proven based on a report from ey businesses on shopify pos see real results like 22 percent better total cost of ownership and benefits equivalent to an 8.9 percent uplift in sales on average relative to the market set surveyed stop seeing carts going abandoned and turn those sales into sign up for your one dollar per month trial and
[02:40:28] start selling today at shopify.com slash twit go to shopify.com slash twit shopify.com slash twit we ever talk about the um the uh free tvs that have the ad bar across the bottom telly um they these tvs that watch you while you watch they they watch you while you watch they have cameras they
[02:40:53] have microphones and they have a a little bar not so little a bar below the bottom that you cannot disabled that's constantly showing ads while you're watching the big game or whatever uh they somebody did a little research and figured out that they are making about 50 dollars per month per customer now this is a thousand dollar tv so that means it's going to take them almost two years to
[02:41:19] you know make up the cost of delivering the tvs the only problem is they said they had a quarter million people signed up to get this free tv they said they hoped to ship a half million devices by june 2023 and millions more in 2024 well according to a report from low pass they only had about 35 000 tvs in people's homes part of the problem is they were using fedex to deliver them and 10 of
[02:41:46] the tvs delivered were broken arrived broken all including one that they sent to a verge uh reviewer uh back in september uh there it is i think that's broken i think that qualifies uh nevertheless when they get them into homes they make a lot of money and there is apparently the demand uh there are some other issues the
[02:42:14] verge reporters i'm a rothsetter telly showed three of the same ads in spanish which i don't speak in a row uh and there's also uh according to uh uh sharon harding writing for ars technica concerns about the camera and the microphone and what they're using them for would you would you uh
[02:42:36] would you take a free tv if if you couldn't disable the ads no none of us would but we can but we can afford to buy a tv that's although i guess to be fair to be fair um tvs and a lot of electronics have gotten so cheap they have like you can get a very decent tv from a discount brand like tcl um for 300 bucks
[02:43:05] um so i think at some point it's not worth having an actual camera in your you know why you're getting that tv so cheap they're subsidizing it right with of course right they seem like information it seems very odd because like yeah what's a television cost these days like not well now they're they're android boxes and you know for the most part and remember this week sony and tcl they they went on the date
[02:43:30] and came back holding hands so sony says they're gonna have all the bravias will be made by tcl for yeah 100 you can get a 50 inch culet from costco for 239 so you'd be crazy to get one of these tellies then yeah that's that's the thing that's the thing that's crazy and but you know the reason you know that's spying on you just as much as the telly does it's just exactly and so to my point leo
[02:43:56] the people that are gonna that i'm gonna you shouldn't blanket people but if you're if you're going to find vulnerable folks it's going to be not that they can't afford it it's the priority is this is free right because i definitely i got a free tv like i have a member of my family who is stuck on anything free just because they grew up in a time where they didn't really have anything so the idea of getting something free is crazy we go to a lovely restaurant the other day and i mean you know what
[02:44:21] it is you go to the italian restaurant and you eat all the free bread and the food comes out and i'm not hungry no more and i'm like why every time why are you that's cheap like we got you know what i mean a good point do advertisers really want the people who want free tvs is that really the customer you're looking for that's brilliant and i did like the comment that free food definitely tastes better because it does because you know you ever notice that your other half isn't hungry until you order
[02:44:51] what you ordered then they keep stealing your food oh that's is that why i couldn't figure that out free food tastes better and well they're still on the diet uh believe it or not toilet maker toto is uh seeing a little boost in the stock market now these are the guys who make the japanese toilets i'm sure you must have is wonderful yeah i do too
[02:45:13] come to my house bro love my toto so i i i ordered one or actually i i told my wife i said i want to get one of these japanese toilets you know it sings to you it opens it opens itself to say hello it's the seat is warm it takes takes care of you at all moments of the operation uh and she said yeah i don't know do i she i said let me just get it in one bathroom and see what you think before we were done we had
[02:45:42] one in every single bathroom 200 percent loved them like japanese toilets are a thing that everyone thinks is weird until they've tried it yeah until they try like i started going to japan many many years ago and i was singing the gospel in 12. believe it or not toto makes more than toilets in order to make those toilets they had a they had to create a uh electrostatic chunk i don't know what it does
[02:46:13] but apparently they this is a vital chip making material it's used i guess to hold uh i don't know to hold the wafers or something and they're there did i say chunk it's a chuck you know you know what a chuck is chuck holds right apparently they are the only source for these chucks and the ramp up in the
[02:46:38] price of memory has really helped toto demand for electrostatic chucks has gone through the roof so i guess they're experts in ceramics because these are ceramic and uh like the toilets they are ceramic so 42 percent of toto's operating income last year came from electrostatic chucks being sold chuck it in the toilet the memory in the toilet the memory industry i just i don't know i just tickled me
[02:47:08] i just thought that was funny how about this i want this that's why everybody likes them that's why we liked it thank you alex stamos sorry we had to let alex go uh short notice but we're going to wrap it up with one last mention and that is of Dr Gladys West who passed away this week at the age of 95
[02:47:28] without her mathematical models we would not have gps and uh this is a you know a sad case of uh some of the people who really changed the world uh are not marked um because she was black she grew up in uh in uh virginia in 1930 but uh worked at the naval surface warfare center in dahlgren virginia started
[02:47:56] her work in 1956 in the 70s and 80s she worked on creating accurate models of the earth shape based on satellite data this is from a complex task requiring the type of mathematical gymnastics that would make the average person dizzy those models became the backbone for gps she worked at the dahlgren center
[02:48:17] for 42 years retiring in 1998 uh her work went largely uncelebrated for decades but we're not gonna let her pass unnoticed dr gladys west i did at the age of 95 if you if you use gps to
[02:48:36] get around you can thank gladys west and that is that thank you doc rock you are a champion two shows i get to be alex right now oops yes we're fixing that uh doc rock is a uh is uh of course a wonderful youtuber youtube.com
[02:48:59] slash doc rock his day job is working in um as an evangelist for ecamm as a man who's speaking the truth to people who use yeah i basically get to find companies like yourselves and you know um even the hardware companies that we partner with to make it better for everyone but also to show the users who are starting with nothing like what the possibilities are and i tell you leo there's not a week that doesn't go by when people says well i'm starting this podcast and i don't know who will listen i go
[02:49:28] that's how they all start and i tell you right now i hang out every week on one that's like 20 years old there's only enough to drink and it was probably there's probably a hundred of us watching it in the beginning but there's a heck of a lot more now so thank you start you start you know you never really start where you start that's right it's great to see you appreciate all your contributions everybody with a microphone should get docs pops there you go that's i like the orange one today
[02:49:52] that's the pop filter uh doc rock where where can i get that uh doc pops with two b's at the second part uh that's what pops dot com it's part of the doc merch the fabulous doc merch lineup doc pops there it is thank you doctor uh patrick you are a champion it is now 2 a.m
[02:50:15] 2 0 8 indeed uh thank you for staying up late especially given that you just recovered from the flu i hope you get to bed and get to sleep in a little bit you don't get waking up i'm very much looking forward to it hey patrick beysha is at not patrick dot com subscribe to his podcast if you speak he has a number of uh wonderful podcasts for you if you don't the phileas club is his english language
[02:50:41] podcast but this is yeah it's a fun show tech like you tech love rendezvous did i say all of those oh yeah not well i mean i mean okay okay sort of well the rendezvous deck yeah thank you patrick that'll go i appreciate it thank you leo it's great to see always fun thanks to all of you for joining us we do this show every sunday from two to five
[02:51:10] uh pacific time that's five to eight eastern 2200 utc you can join us uh of course watch live if you wish on youtube twitch x.com facebook linkedin and kick of course if you're in the club you can join us in the club twit discord that's another place you can watch after the fact on demand versions of the show are available at twit.tv uh there's a youtube channel dedicated to the video and you can subscribe to audio
[02:51:36] or video and any podcast client if you do leave us a review leave us a good review not a bad review a good review tell the world about this week in tech thank you everybody for joining us we'll see you next time another twit is in the can bye bye hey everybody it's leo laporte it's the last week to take our annual survey this is so important for us to get to know you better we thank everybody who's already
[02:52:01] taken the survey and if you're one of the few who has not you have a few days left visit our website twit.tv survey 26 and fill it out before january 31st thank you so much we appreciate it
