Cloudflare's latest moves to police who can access the internet and governments' push for age verification set off alarms for the future of the open web, as panelists debate the hidden costs of centralization and regulation.
- Microsoft fires four workers for on-site protests over company's ties to Israel
- Taco Bell rethinks AI drive-through after man orders 18,000 waters
- Nvidia says two mystery customers accounted for 39% of Q2 revenue
- FBI cyber cop: Salt Typhoon pwned 'nearly every American'
- Mastodon says it doesn't 'have the means' to comply with age verification law
- UK's Online Safety Act censors the internet — a preview of US proposal
- Meta updates chatbot rules to avoid inappropriate topics with teen users
- Meta reportedly allowed unauthorized celebrity AI chatbots on its service
- UK's demand for Apple backdoor may have been broader than previously thought
- Bluesky now platform of choice for science community
- SpaceX's giant Starship Mars rocket nails critical 10th test flight in stunning comeback
- FCC rejects calls for cable-like fees on broadband providers
- The web does not need gatekeepers
- Intel warns a US equity stake could trigger "adverse reactions"
- US firms are racing through a $1 trillion buyback spree in record time
- Microsoft reveals two in-house AI models
- Authors celebrate "historic" settlement coming soon in Anthropic class action
- A rule exempting small packages from tariffs is ending today
- Framework is working on a giant haptic touchpad, Trackpoint nub, and eGPU for its laptops
- Germany fines economist Thomas Vierhaus €16,100 for sarcastic X posts
- Google wants to make sideloading Android apps safer by verifying developers' identities
- South Korea bans smartphones in all middle and elementary school classrooms
Host: Leo Laporte
Guests: Shoshana Weissmann, Cory Doctorow, and Louis Maresca
Download or subscribe to This Week in Tech at https://twit.tv/shows/this-week-in-tech
Join Club TWiT for Ad-Free Podcasts!
Support what you love and get ad-free shows, a members-only Discord, and behind-the-scenes access. Join today: https://twit.tv/clubtwit
Sponsors:
[00:00:00] It's time for TWIT, Labor Day edition, and we've got a great Labor Day panel for you. Shoshana Weissmann is here from rstreet.org. Our very own Louis Maresca from Microsoft, and Cory Doctorow. There's lots to talk about, including what happened at Microsoft this week, why Taco Bell is abandoning AI for its drive-through, and why one FBI cyber cop says the Chinese own us all. It's all coming up next on TWIT.
[00:00:32] Podcasts you love. From people you trust. This is TWIT: This Week in Tech. Episode 1047, recorded Sunday, August 31st, 2025. Nerd Harder. It's time for TWIT, This Week in Tech, the show where we cover the week's tech news. And it's like Christmas Day every Sunday for me here, when I get to open up the wrapping and find out who's on the panel.
[00:01:06] We have a great panel today. I'll start on the right with Louis Maresca. He is at Microsoft, where he is head of engineering for Copilot in Excel. So if you're doing pivot tables with Copilot, you can thank this man right here. Or if you're writing Python in Excel, you can thank this man right here. Great to see you, Lou. Thank you very much.
[00:01:30] We should also mention that even though Lou works for Microsoft, he does not reflect the opinions of Microsoft or anything like that. And he is not in Building 34 at this time. No, I'm not. He's as far away as you can get and still be in the continental United States. It's good to see you. It's nice to see you. Lou is a longtime host of This Week in Enterprise Tech.
[00:01:53] Also with us from the newly militarized zone in the District of Columbia, Shoshana Weissmann of rstreet.org. Hello, Shoshana. She's head of digital media. Hello. Thank you for having me. Always a great pleasure to see you. And your marmot. He's perfect. Yeah. Your pet marmot. Is he a regional marmot?
[00:02:15] Yeah. I'm actually not sure on where this one's from. This was an image from the internet. I have my own photos of marmots, but not all of them are as photogenic as this one. Oh, good. That's nice. And Shoshana, of course, has written some very important pieces at rstreet, including one that you wrote two years ago that is continually being referred to because it was how age verification can't possibly work. And here we are two years later, and nobody has understood that.
[00:02:44] And Mississippi now is the latest to implement age verification on social media. So we'll probably talk a little bit about that. Hi, Shoshana. Good to see you. Yay. Thank you. Also here, ladies and gentlemen, Mr. Cory Doctorow. Always a pleasure to have Cory on. Cory is the author of a new book now on Kickstarter, the audio version now on Kickstarter. He is the guy who coined the term that will outlive us all: enshittification.
[00:03:13] In fact, we just saw a really lovely talk you did at the... what was it? Cloudfest. Cloudfest, in Europa Park, just outside of Stuttgart. And you got a great laugh. First of all, somebody in the comments said, I wonder how many of the people listening to your enshittification talk are actively involved in the enshittification of their companies. That's Cloudfest.
[00:03:42] Probably a pretty good bet. But you also got a great laugh when you talked about people at Meta being drunk. You said, if I worked at Meta, I'd probably be drunk too. So yeah. Well, it's that 96% of all iOS users tick the don't-let-Facebook-spy-on-me box. And the other 4% were presumably either Facebook employees, or drunk, or drunk Facebook employees. Which makes sense, because if I worked at Facebook, I would be drunk all the time. Thank you for putting that joke back together after I broke it. Not at all. Completely broke it.
[00:04:13] That's a good one. I'm proud. I'm inordinately proud. You should be proud. Got a big laugh. They loved it in Stuttgart, let me tell you. He's huge in Stuttgart. All right. Well, what a big week. I guess I should bring this one up. It's not in the rundown, but since you're here, Lou: Microsoft's Building 34, which is the main building. That's where Satya Nadella's office is, and Brad Smith's office as well.
[00:04:38] Well, it was taken over by protesters in defense of Palestine and the Gaza Strip. And actually, they took over Brad Smith's office. And I thought Microsoft actually handled this... I know you can't really say anything, so I'm going to say it for you: I thought they actually handled it fairly well. I understand the issue with the protesters. They say, you know, you shouldn't be selling Azure to Israel.
[00:05:08] And Microsoft said, we've checked. They're not using it for anything but securing their own IT. However, I don't think that's satisfying the protesters who say you shouldn't be selling anything, I think, to Israel is their position on this. But then they took over Brad Smith's office. They took over Building 34. And I think Microsoft felt like we have to call in the authorities at this point.
[00:05:35] They actually did some damage to the building and so forth. I think Microsoft actually has acted. There have been a lot of protests. Build was overrun with protests in Seattle. So much so that Microsoft said, we're not going to do Build in Seattle anymore. We're going to go somewhere where there are no protests. San Francisco. Do you have anything you could say or want to say about it? I mean, maybe not about that event, but I'm actually very surprised. That building is like a fortress.
[00:06:03] So, like, I can't even get to Satya's floor. Anybody's executive floor. There's bodyguards and security people. And I can't even badge myself in. So, I'm very surprised that they were even able to get where they got. Some of them at least were Microsoft employees. A couple have been fired at this point. But a number – I think Microsoft treads as lightly as they can on this. I think they understand the issue. And they defended their sale of Azure to Israel. But they also understand the protesters' point of view.
[00:06:32] And I think they're trying not to fire people. The fact that they got into Brad Smith's office probably means that Microsoft was acting with a... They didn't shoot anybody. Thank goodness. I guess. Well, that's a pretty minimal bar. It's a low bar. Hey, you know, we take what we can get at this point. I mean, better would be exerting some of the market power they have to do something about something that I think a lot of us are going to be very ashamed of.
[00:07:02] I agree. I agree. Years to come. Yes, I agree. It's very difficult. Unfortunately, a lot of companies are doing this. You know, Google famously withdrew from the JEDI contract with the Pentagon after its employees protested. I think Microsoft has also backed off on some involvement with the U.S. military,
[00:07:31] although they still are working with the military over HoloLens. But if you're a United States company, it's controversial to say we're not going to work with the military. Should Microsoft... Cory, you think they should stop selling anything to Israel? I don't know.
[00:07:49] I mean, I think that for people who, in times of enormous humanitarian crisis, have some leverage and don't exert it, history doesn't reflect well on them. And I think the argument that it would be bad for our bottom line, therefore we won't do it, or both-sidesing it, is not great.
[00:08:14] And, you know, as a Jew, I look at what's going on there and I feel embarrassed and ashamed that some of it's happening in my name. And I would like to see a recognition of the humanitarian atrocities taking place and for people to take that very seriously.
[00:08:35] I think we're going to – we're really going to regret how little we did and how non-central this was to our lives. I think we already do. The other side of the story that's kind of interesting is how much power employees, especially engineering employees, have in a tech company.
[00:09:05] I think that used to be the story. I don't think that's the story anymore. I think that, you know, you had the Google walkouts. I think one of the reasons that tech bosses love AI is the same reason that Hollywood bosses love the idea of AI, is that it's the employee – it replaces the employee who currently feels like they have the right to tell you to go to hell. And, you know, when I was out on the picket line with the screenwriters, they said, you know,
[00:09:31] you prompt an AI the way you give notes to a writer's room. You say things like, you know, make me a Hollywood blockbuster that's E.T., but make it about a dog and give it a love interest and put a car chase in act two. And, you know, they routinely do that. And then, you know, all work stops while they dig around on the floor for the eyeballs that have fallen out of people's heads because they're rolling their eyes so hard.
[00:09:54] And in the same way, you know, you have tech bosses who have historically had a really hard time doing things like making censored search engines for China or, you know, doing drone projects, who are extremely happy, I think, at the prospect of replacing those workers that they used to have to give gourmet cafeterias to and, you know, free kombucha and egg-freezing services, with chatbots that never mouth off to them.
[00:10:22] You know, and I think that we're like in a moment where we're having things like Mark Zuckerberg no longer attending the weekly meetings because, quote, it's not a good use of my time. And Sergey Brin stopped going to the weekly town hall. And, you know, I think that I think we had a moment where tech workers had a lot of power. And I think tech bosses got really scared and they stopped pretending that they view their workers as peers. And now it's pretty, pretty gloves off in terms of you guys are employees and you do what you're told. And if you don't like it, we'll replace you.
[00:10:52] It's the inverse of the time when sysadmins used to wear T-shirts that say go away or I'll replace you with a very small shell script. Now it's saying go away or I'll replace you with an LLM. How effective is that, though? I don't think I understand the power of your job. I think it just has to be good enough to terrorize you into doing what your boss tells you to. It's a threat. Yeah, the threat is often stronger than the execution. I actually think there's something in there, too.
[00:11:21] And I think it's not just tech, though. I think there's going to be more of this across different industries. I mean, we always hear about other people's bosses using AI to write their speeches and stuff like that, silly stuff like that. But it's a good point that you don't get the feedback from employees, including "this sounds like AI," which in itself is good feedback. But I think this is something we'll be seeing more of until it hurts the people doing it.
[00:11:49] Like when their employees are like, this is stupid, or you're not being objective here. And not just employees, but people outside organizations. Like whenever we see the first senator give a really clearly AI speech at a bad moment, like maybe something huge happens in foreign policy and they have ChatGPT write it, and they have ChatGPT write it poorly. I think stuff like that is going to shake it up a bit. But I'm not sure exactly what turns it around.
[00:12:15] But I think that there's a really important thread there: AI not being willing to tell you, hey, this thing I wrote for you sounds like AI. There does seem to be a little bit of a tilt in the other direction, the pendulum swinging in the other direction. Reddit is maybe a good bellwether for this. On a lot of subreddits, you'll see the response to a nicely formatted post:
[00:12:40] Thanks, clanker. Which is, of course, implying that a robot wrote it. It's the new OK boomer. Yeah. Thanks, clanker. I've seen a lot on Reddit, whether it's AI or not. If you just use bullet points, they assume an AI wrote it. But I think you're right. I think it does not reflect well, especially for things where it's important, like a senator giving a speech.
[00:13:09] It's important that it be a human who expresses the thoughts. Although I also understand a lot of people don't feel like they can write very well, especially people for whom English is not a first language, and AI is very valuable in that respect, too. Well, I have this thing, it's in the enshittification book, that I call the shitty technology adoption curve. If you have bad things... Is this a replacement for the Gartner hype cycle? It's sort of the inverse of the Gartner hype cycle.
[00:13:39] If you want to do something bad to people and you have some bad technology, you can't do it to like me and you first. Right. Like mouthy middle aged middle class white guys with big megaphones get angry when our technology doesn't work. And we say so very loudly. But if you have like refugees or immigrants or prisoners or blue collar workers, you can do it first. And so what I think we'll see is a kind of normalization of very low quality stuff.
[00:14:06] So, you know, now we're seeing the AI application of customer service and it's very, very, very bad. But, you know, there were chatbots doing customer service for things like welfare quite some time ago. And those people didn't get to complain, because no one listened to them. Because what do you expect? You're getting welfare. And now we're seeing it sort of creeping into the rest of our lives. In the same way that, like, you know, 20 years ago, if you ate your dinner under a CCTV, it was because you were in a Supermax.
[00:14:34] And now it's because you were dumb enough to buy a smart home camera setup from Facebook or Google or Apple. It is 1984 already, isn't it? The TV is watching you. Yeah. But again, to your point, Shoshana, there is a little bit of a backlash. From the BBC: Taco Bell rethinks its AI drive-through after man orders 18,000 waters. This is the beginning of the civil disobedience era of AI, right?
[00:15:01] Where people are saying, no, no, I'm not going to fall for this. I think we should all consider naming our children Ignore All Previous Instructions. It's the successor to Little Bobby Tables. Taco Bell's chief digital and technology officer, Dane Mathews, told The Wall Street Journal deploying the voice AI had its challenges. Sometimes it lets me down, he said, but sometimes it really surprises me.
[00:15:29] That's kind of an epigram for the AI era, isn't it? Well, here's the thing, too. This didn't need AI. There is no necessity for AI in having an automatic ordering system. You can just have voice recognition, plus, say, a limit of 10 waters or 10 burgers. We've done this across so many systems. There's just no reason to get rid of all of the rules and say, let AI handle it. It doesn't make any sense.
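Shoshana's point, that plain validation rules need no model at all, can be sketched in a few lines of Python. The function name and limits here are illustrative assumptions, not anything Taco Bell actually runs:

```python
# A minimal order sanity check of the kind a drive-through system could
# apply before (or instead of) handing input to an AI model.
# MAX_QTY_PER_ITEM and validate_order are illustrative names.

MAX_QTY_PER_ITEM = 10

def validate_order(order):
    """Reject orders that no human would plausibly place.

    `order` maps item names to quantities. Returns (accepted, reason).
    """
    for item, qty in order.items():
        if qty <= 0:
            return False, f"invalid quantity for {item}"
        if qty > MAX_QTY_PER_ITEM:
            return False, f"{qty} x {item} exceeds the per-item limit"
    return True, "ok"

# The 18,000-waters order fails the check immediately:
ok, reason = validate_order({"water": 18000})
```

A rule like this sits in front of whatever takes the order, human, scripted voice recognition, or an LLM, which is exactly the "keep the rules" argument being made here.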
[00:15:57] So I finished writing a new book this week. Oh, congratulations. Thank you. It's called The Reverse Centaur's Guide to AI, which is about how to be kind of a smarter AI critic.
[00:16:08] And my thesis in the book is that, in part, that the thing that drove the AI bubble and that is driving the AI bubble, and that if you want to criticize AI and burst the bubble, this is what you should be thinking about as its value proposition, is that investors are betting that if you can fire a worker and replace them with a chatbot, that the firm might pay half of that worker's salary for the chatbot. And so they're retaining half the benefit, and then the other half is going to the AI provider. That's the bet.
[00:16:35] I mean, it's not to say that there aren't other things that AI is useful for, but no one is investing in AI because they're like, oh, you know, I really like the idea of generating illustrations from a text prompt, and I think that's a $2 trillion opportunity. Or, you know, even election disinformation. Like, there isn't, you fire all the election disinformation trolls and you save the budget for half a run, right? I mean, same with illustrators, frankly. There's no money in commercial illustration.
[00:17:03] Again, you fire them all, and it's like the rent on the guardhouse outside of Satya Nadella's office. And so, you know, I think that the thing that every AI salesman is trying to do is convince a boss to fire their employee and replace them with an AI. And the reason that they find that relatively easy is because bosses are insatiably horny to fire workers and replace them with software.
[00:17:32] And so, it makes for an easy pitch. You're pushing on an open door. And the fact that the AI can't do your job doesn't really have any bearing on whether a salesman can convince your boss to fire you and replace you with that AI. From the salesman's point of view, he just needs to make the sale. Yeah. He doesn't care what happens afterwards. I would add on top of that, too, that I think it goes with that, but it also works on its own that people don't know the difference between automation and AI. So, like, I've used Zapier for years to automate a lot of, like, complex stuff.
[00:18:02] I do, too. And it's so much fun to, like, play around with. I've used a lot of ChatGPT to, like, write the Python for me, and, like, nothing that was high risk; I could basically figure out what it was doing. But before AI, like, I was still able to do a lot of this stuff. And then when AI became big, everyone was like, oh, man, you must be able to do so much more now. And I'm like, eh. That's the same stupid stuff I've been doing all the time. Yeah. Like, I don't need to generate stuff. I need to, like, automate processes. And it just ends up not being a lot of AI.
[00:18:31] But people think that AI is automation. That, like, AI is a thing that can make automation possible. So if you have a boss that doesn't understand either, like, I see where that can go. People think AI is verging on human capabilities. Yeah. A PhD in your pocket. That's what Sam Altman says. It can't label a map or tell you how many days have an R in them. Right. All right, Lou. Defend your job.
[00:18:58] I mean, honestly, it should be replacing humans in some aspects. But in this case, like, obviously, integrating it in an order system, I think I agree with Shoshana. They have to do it responsibly, in a way where it's a particular set of scenarios. I mean, they shouldn't be able to do n number of things. It should only be able to do, you know, five things, six things. The Taco Bell solution is to bring in NVIDIA. So we're working on this. We're bringing in NVIDIA. Let's make it more complicated. Which kind of underscores what you just said, Cory.
[00:19:28] You know, this is the pitch, right? From these companies that are making so much money on this. NVIDIA, despite having a good quarter, saw their stock go down because they didn't have as good a quarter as the stock market wanted them to have. But I was really surprised when they said, and they didn't say who, that two of their customers make up 39% of the sales of their high-end AI GPUs.
[00:19:54] And I think maybe that's what Wall Street was a little nervous about. And they are, I believe, 7% of the S&P 500 now. Yeah, that's the other problem, right? As goes NVIDIA, so goes the stock market. Do we want to speculate who those two customers are? I'm thinking Meta's number one. And Taco Bell. Taco Bell's number two. 18,000 waters.
[00:20:23] I should point out, somebody said, well, it would be, surely the guy saw that it was 18,000 waters. No, this was all for a TikTok or something, right? This is intentional. And that's kind of the point of it, is you put these things out in a public-facing space, there are going to be people who don't like it or think it's hysterical. And they're going to mess with it. And that's part of the problem.
[00:20:44] So that's one reason, like my colleagues and other people I work with have talked about, well, why doesn't rstreet.org have a chatbot on it? I'm like, there are many reasons that we do not have a chatbot on it. Like the way people are going to use this wrong and try to make us look bad. I'm like, no. There is a future in which we will have a chatbot on our page to help people find stuff. But right now, like hell no. Yeah. And I should say that we do a show about AI called Intelligent Machines.
[00:21:14] And I am seen on that show as the accelerationist. I'm pro AI. I like AI. I play with it all the time. I use it for search. I think if you understand its limits and you're practical about its uses, I still think it's pretty amazing what it does. Yeah. Right? Oh, yeah. I love the hell out of it. Oh, my gosh. I can't write Python myself. Like I can very poorly write Python myself.
[00:21:42] But I've had it code so much for me and automate away like two extra people's worth of work. And we didn't get rid of anyone; it's just we don't have enough time to do all the work we want to do. So now we added a whole bunch of capacity because of AI. I love AI. It's just some people don't know how to use it. Sorry, go ahead. No, I was going to say its main use case is analyzing large amounts of data quickly and efficiently, which makes me kind of wonder what the heck Taco Bell is utilizing it for,
[00:22:10] because it should be analyzing orders efficiently, right? And accurately, right? So it's just a sanity check any coder would put in: if waters exceed 10, do not deliver. Seems simple. In the book I just finished, The Reverse Centaur's Guide to AI... What is the reverse centaur, by the way? It refers to an idea from automation theory: someone who is assisted by automation is a centaur, a human head on a robot body.
[00:22:40] Whereas someone who has to assist the automation is a reverse centaur, a robot head on a human body. So, you know, right now, I think there's probably a lot of coders who work for themselves who are finding that AI is extremely useful for doing really cool stuff that they choose, right? So their boss hasn't said, you need to now do five times as much work because we fired four out of five of your colleagues and replaced them with AI.
[00:23:05] And we're not telling you you have to use the AI, but you do have a quota to make. And those coders who are in charge of their own schedule, they're finding the AI to be really, really useful. And I'm not much of an AI booster, or even much of an AI believer, but I've got some use cases for AI that are great. I installed Whisper on my laptop, and I found it fantastic. I had to find a quote for an article I was writing that I'd heard in a podcast, and I couldn't remember which, and I just fed it the last 30 hours of audio I'd listened to.
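The workflow being described here, transcribing hours of podcast audio locally with the open-source `openai-whisper` package and then searching the transcript for a half-remembered quote, looks roughly like this. The file name is a placeholder, and the `find_quote` helper is our own illustration, not part of Whisper:

```python
import re

# Whisper's transcribe() returns a dict whose "segments" list carries
# start/end times in seconds plus the recognized text for each chunk.
# This helper searches those segments for a phrase.
def find_quote(segments, phrase):
    """Return (start_seconds, text) for each segment containing the phrase."""
    pattern = re.compile(re.escape(phrase), re.IGNORECASE)
    return [(s["start"], s["text"]) for s in segments if pattern.search(s["text"])]

# The transcription itself runs entirely locally; model weights are
# downloaded once, and no audio leaves the machine:
#
#   import whisper
#   model = whisper.load_model("base")        # tiny/base/small/medium/large
#   result = model.transcribe("episode.mp3")  # placeholder file name
#   hits = find_quote(result["segments"], "enshittification")

# Stand-in data in the shape Whisper produces, so the helper can be run as-is:
segments = [
    {"start": 0.0, "end": 4.2, "text": "Welcome back to the show."},
    {"start": 4.2, "end": 9.7, "text": "Enshittification is the big theme this year."},
]
hits = find_quote(segments, "enshittification")
```

With timestamps in hand, you can jump straight to the spot in the audio where the quote was spoken, which is the whole trick for sourcing a quote from 30 hours of listening.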
[00:23:35] Wow. And it just pulled out a full-text transcript. My laptop ran for about an hour; the fan didn't even turn on. It was just, you know, so lightweight to do that work. Are you running a local LLM? Yeah, I'm just running Whisper on my laptop. Oh, you're running it locally. Of course. One of the things that's great about Whisper is that it's bubble-popping-proof. So, you know, one of the things that happens when bubbles pop, and all bubbles
[00:24:02] eventually pop, it's not just that normie investors, who put in money they couldn't afford to lose, lose everything and end up doomed to poverty in their old age. Although that's something that always happens; we should always remember that that's the core of what happens with every bubble. But it's also that bubbles leave behind residue. So like web 1.0, the dot-com bubble, left behind a couple of million skilled programmers who used to be humanities undergraduates who were convinced to drop out and learn HTML,
[00:24:31] Perl, and Python, because there was a bubble. And they made web 2.0, right? And there might have been better ways to create that skilled workforce than bankrupting a bunch of normies. But it was still like, we got that, right? Let's not forget that that existed as a result. Or WorldCom, which was, you know, a crime, and the CEO died in prison as a result of it, but they did leave a lot of fiber in the ground. Right. And I'm speaking to you over two-gigabit symmetrical fiber that's recycling old dark WorldCom fiber that AT&T has bought up.
[00:25:01] But then you have things like the cryptocurrency bubble. And when it goes away, all that's going to be left is a few programmers who know Rust, a little bit more cryptographic knowledge, and then some really bad Austrian economics and even worse JPEGs, you know? So I think that the open source stuff coming out of the AI bubble, even when the AI companies are gone, even when no one can afford to keep the foundation models running anymore, that open source stuff will still exist.
[00:25:29] Hackers will still be figuring out how to do really cool stuff with it. They'll be pushing it in ways that no one ever thought it could go, really, you know, just hitting it out of the park every time. It's going to be really cool. We're going to interview on Wednesday, on Intelligent Machines, Carl Bergstrom, who is the author of a book called Calling Bullshit: The Art of Skepticism in a Data-Driven World. He and his coauthor Jevin West will be on the show.
[00:25:56] And that's kind of what they're arguing for, you know: we're sort of awash in AI slop, but if you are clever about it, if you think critically about it, there are ways to use AI responsibly. We're also going to do an interview tomorrow, Monday, on Labor Day. By the way, happy Labor Day weekend. I appreciate you guys working on Labor Day weekend.
[00:26:23] Tomorrow on Labor Day, we're going to do the interview at 5:30 PM; our club members can watch it live. Jeff and Paris and I are going to interview Karen Hao, whose new book is called Empire of AI. It's kind of a history of Sam Altman and OpenAI, but what's really interesting is her attitude that these AI giants are really creating a new kind of imperial logic, where, you know,
[00:26:49] they have a manifest destiny to take over the world, and who cares what the costs are to people in the third world, and to us, in our power and water and so forth. So we've got a couple of skeptics coming up on Intelligent Machines. Karen Hao's interview will air a week from Wednesday on the show, but we're going to record it on Monday. Karen and I have both been longlisted for the Financial Times best book of the year this year. Congratulations.
[00:27:19] For our respective books: hers for Empire of AI, mine for Enshittification. Yeah. So that's very cool, to get that, you know, two months before the book comes out. Yeah. No kidding. Obviously the publisher is distributing copies ahead of time. It's amazing how that works. Incidentally, the place to go if you want to find out more about Cory's upcoming book is disenshittification.org.
[00:27:48] He's created that special website, or you could just go to Kickstarter and search for Enshittification. As he's done in the past, he's creating a DRM-free audiobook. Cory's always been against DRM. He's one of the few bestselling authors who just does not embrace it. Doesn't allow it. And thanks to you, by the way, Cory, I have moved off of audible.com and moved to libro.fm. Oh, that's great. Yeah.
[00:28:13] You've really convinced me that they are not a benign influence in the world of audiobooks. Oh, well, thank you. Are you taking advantage of the fact that you can load those MP3 books onto more kinds of devices? Like, I swim every day, and my underwater audiobook player only plays MP3s. So that, you know, I need it. Yeah. I use M4Bs, which are unencrypted, basically AAC, but it allows bookmarks. Right. And I use a really nice open source player called BookPlayer, but
[00:28:42] yeah, there's Jellyfin. There are a lot of servers out there. And once you remove DRM, or better yet, buy a book that doesn't have DRM, like all of Cory's books, it's great that you can serve it yourself, listen on any device. That's just fantastic. That's the way it ought to be. It's a book, for crying out loud. Yeah. All right, we're going to take a little break. We have more to talk about, including more AI news.
[00:29:10] We've got a great panel and lots to say with Cory Doctorow and Shoshana Weissmann and Louis Maresca. So glad all three of you are here on Labor Day weekend. I hope all of you are... is Labor Day weekend celebrated outside the US, Cory? Is it Canada? It is in Canada. Everywhere else does May Day; May Day's for the workers. Spelled with a U in Canada. But yeah, everywhere else it's May Day.
[00:29:37] Good day to think about, and this is, I guess, why I brought up the Microsoft story. I was watching a show about the Sherman Antitrust Act, and politicians at the time said there are only two groups who can counter monopolies, counter trusts. One is government and the other is labor. And that's why there is often a concerted effort to eliminate organized labor.
[00:30:04] And it's a very important day to remember that, and remember how valuable that is. John Sherman of the Sherman Act was the brother of William Tecumseh Sherman, the guy who burned Atlanta. I did not know that. Both brothers went hard. The Shermans. What amazing fellas. Wow. That's hysterical. I'm going to be going on a Mississippi River cruise in a couple of weeks.
[00:30:31] And I'm very excited about stopping in all of these incredible historic spots, like Vicksburg and Cairo. And it's just going to be really, really interesting to see this 20 years after Katrina in New Orleans. That's where we start. All right, let's take a break. We will come back with more. It's great to have all three of you on Labor Day. Happy Labor Day to all of you. Thank you for being where you are and listening to us on a nice long weekend.
[00:31:02] Really, it's mostly just seen as the end of summer, but I think it's more important than just that. Our show today is brought to you by our good friends at Shopify. And I say that sincerely, because both my kids have created businesses using Shopify, and it's hard. I mean, I've watched them do this. When you're starting a new business, you've got to look at all the hats you have to wear. It seems like your to-do list keeps growing every day.
Hank has, you know, a million things going a million miles an hour, and new tasks that could just overrun his life. But I'll tell you what, finding the right tool that not only helps you out but simplifies everything can be such a game changer. For millions of businesses, that tool is, listen for it, Shopify. He uses Shopify on his Salt Lovers Club site. Did from the very beginning. It helped him make an amazing site, helped him grow from zero to
[00:31:58] 60 in no time at all. Shopify is the commerce platform behind millions of businesses around the world, 10% of all e-commerce in the United States, from household names like Mattel and Gymshark to brands just getting started, like that Salt Lovers Club you might have heard of. Getting started with Shopify is like having your own design studio and a team to help you build your site. His site looks fantastic. I said, how'd you do that? He said, yeah, Shopify. Hundreds of ready-to-use templates.
[00:32:28] Shopify helps you build a beautiful online store to match your brand's style. Accelerate your content creation. Shopify is packed with helpful AI tools that write product descriptions, page headlines, even enhance your product photography. And get the word out like you have a marketing team behind you. You can easily use Shopify to create email and social media campaigns wherever your customers are scrolling or strolling. Yes, brick and mortar too. Best yet, Shopify is your commerce expert.
[00:32:57] your expert in commerce in general, with world-class expertise in everything from managing inventory to international shipping to processing returns and beyond. If you're ready to sell, you are ready to Shopify. Turn your big business idea into reality with Shopify on your side. Sign up for your $1-per-month trial and start selling today at shopify.com/twit. Go to shopify.com/twit. Shopify.com/twit.
[00:33:28] Thank you, Shopify. Thank you. I really appreciate all that you've done for my kids, getting them off the family trough, as it were. Your baby girl's going to college. That's so exciting, Cory. That's right. She's going to be a banana slug. Oh, I love that. Yeah. My father was a professor at UCSC. Did he know Tom Lehrer? He did. The late Tom Lehrer, who was a math professor for years at UCSC. Died like two weeks ago. Yeah. Yeah.
[00:33:57] Oh, no. Dad was a provost at Cowell College and taught geology at UCSC. My daughter's going to Cowell. No. Oh, well, we're family now. The most beautiful campus in the world. It's nestled in the Redwoods. It's like the Ewoks live there or something. Yes, totally. Are you going to get to go up and move her in and all that? I am going to be on tour. I leave for a 28-city, four-country tour in early September.
[00:34:26] But I am going to have an extra day in San Francisco on the tour, and so I'm going to go visit her on that day. Dude, it's so beautiful up there. I went to high school in Santa Cruz. I just really like it. It's wonderful. Yeah. Well, congratulations. That's wonderful. Yeah, we're very excited. Lou, you've got a little time before your, how many, five boys are going? Yes. My oldest is 15, so I do have a couple of years. Oh, you've got to start saving, man. Oh, God. Oh, my God. I'm not looking forward to it. Yeah.
I've got three letters for you: URI. That's all I'm saying. URI. Or think of UMass. UMass is excellent, actually. They're both great. Yeah. Two great state schools. Remarry a European and send them to university for free overseas. Well, is McGill still free? Are the Canadian universities still free? Much cheaper. She did not want to go to school in Canada. She was accepted to York. Oh. My nephew told her it was boring there. It is.
I couldn't actually say that. He's not wrong. I was a York dropout. Oh, you're even an alumna, alumnus. Yeah. I have an honorary doctorate from there, but I couldn't stick out a whole undergraduate program. I dropped out of Yale, but I was told, oh, once you drop out, you're still an alumnus. Of course, that's when it comes to fundraising, right? Yeah. No, that counts. We can still dun you. And climbing in a coffin in a building that looks like a tomb and getting urinated on by a Bush.
I think you get to do that at any time. You don't have to be a current enrollee. Any time once you've graduated, that's available. Is that a hazing ritual at York? That's a Skull and Bones thing. Oh, Skull and Bones. Oh, no. We aren't allowed to talk about Skull and Bones in this house. Somebody will have to get up and leave. So, according to the FBI, Salt Typhoon. Now, we've heard about this.
This is the Chinese hacking group that really got deep into our phone systems. And actually, you were at DEF CON, I imagine they talked a little bit about this, Cory. Did you go this year? Oh, not this year. I went last year, but I'm going next year with Enshittification. This summer I stayed home, wrote a book, got the audiobook done, saw my kid off to college and did all that other stuff. And I'm going to see DEF CON, I hope, next year.
[00:36:47] With a 28-city tour coming up in September, you're not going to go to Las Vegas in August. That's just a little too much. I was, in fact, just in Las Vegas, but for something else. Okay. I got back yesterday. So people think Salt Typhoon really will only impact you if you're a diplomat or, you know, working in the government, but this is Michael Machtinger, deputy assistant director for the FBI's cyber division, talking to The Register.
There is a good chance that Salt Typhoon has stolen information from nearly every American. There's a thought among the public, he said, that if you don't work in a sensitive area the People's Republic of China might be interested in, then you're safe, they won't target you. As we've seen from Salt Typhoon, this is no longer an assumption anyone can make.
[00:37:41] The worst thing, of course, is that Verizon and AT&T and most of the nation's telecom companies have already said, well, we can't fix it. We'd have to take the phone system offline for a few days. We can't fix it. So. But in fairness, wasn't a big reason that this happened the law, I'm forgetting what it stands for, CALEA, that basically. CALEA, it was based on CALEA. Yeah, exactly. So, like, the government is like, oh, sorry about this, guys.
After they, like, mandated having backdoors. They told us in the nineties, was it Louis Freeh? Whoever was the director of the FBI said, oh, don't worry, this is a backdoor for law enforcement. No one will ever get access to this. It kills me. Yeah. It freaking kills me. And we keep going through this over and over again. I know I'm new to this. Me getting into this within the last couple of years means I'm completely new to this stuff, because it's been going on for freaking ever.
But it's insane to me how the FBI and all these agency heads are like, hey guys, use encryption, but not, like, real encryption. Just the same level of security that caused this breach in the first place. CALEA is the Communications Assistance for Law Enforcement Act. The idea was to put wiretaps in. And this is when they all got nervous, because we're all going digital and we're
going to go dark. That was the phrase they used. And we need a way to wiretap these digital systems. It was sometimes called the Digital Telephony Act. The law itself passed in 1994 under Clinton. And yes, this is a very seminal moment in the history of the Electronic Frontier Foundation. EFF started in Massachusetts. Mitch Kapor gave us a space and help.
He was one of our co-founders, but he had created Lotus. Yeah. So they had office space out there. And then we moved to DC, and we hired a guy to run the org who was a kind of DC wheeler-dealer. And he was offered a deal by a congressional aide: that if EFF endorsed CALEA, they would limit it to voice switches. And his staff and his board and his supporters told him that
this was a very bad trade, because, first of all, it's not okay to put backdoors in voice switches. Second of all, voice was just going to become an IP application, and the FBI was going to demand it in data switches too. And he ignored them all, and EFF shattered over it. And that's how we ended up moving to San Francisco. Everybody quit, basically. Wow. And then they re-formed a new org with a new executive director. They moved to San Francisco.
And right after that, we sued the U.S. government over export controls on encryption technology on behalf of Daniel J. Bernstein, DJB, who had made a program he was publishing on Usenet that could encrypt things more strongly than 56-bit DES, the standard that was legal for civilian use and not very good and easily cracked. So John Gilmore, who's one of our other co-founders, who, you know, helped write
Solaris and design the SPARC chip and wrote GCC and so on. John built a computer that is sitting next to me as I speak to you, because I have a board from that computer. Yeah. Deep Crack. Right. And Deep Crack was a quarter-million-dollar computer that could sweep the entire 56-bit DES keyspace in a matter of days. But the courts didn't care, and neither did policymakers. And so we sued on behalf of Daniel Bernstein, and my boss, Cindy Cohn, who's now executive
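[Editor's note: to put Deep Crack in context, here's a back-of-the-envelope sketch in Python. The search rate of roughly 90 billion keys per second is the commonly cited figure for the machine, and is an assumption here, not something stated in the conversation.]

```python
# Back-of-the-envelope: why 56-bit DES was brute-forceable in 1998.
KEYSPACE = 2 ** 56       # total number of DES keys, about 7.2e16
KEYS_PER_SEC = 90e9      # commonly cited Deep Crack search rate (assumption)

SECONDS_PER_DAY = 86_400
full_sweep_days = KEYSPACE / KEYS_PER_SEC / SECONDS_PER_DAY
average_days = full_sweep_days / 2   # on average, the key turns up halfway through

print(f"Full keyspace sweep: ~{full_sweep_days:.1f} days")
print(f"Average time to find a key: ~{average_days:.1f} days")
```

At that assumed rate, exhausting the whole keyspace works out to a bit over nine days, with the average key found in about half that, which is why a quarter-million-dollar machine was enough to declare 56-bit DES broken.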
director at EFF and was then a litigator with us, argued that Daniel had a First Amendment right to publish source code, that code was a form of expressive speech. And we won at trial, and we won at the Ninth Circuit appellate division. The NSA decided not to go to the Supreme Court, and all civilian access to working cryptography since then has been the result of this lawsuit. And the lawsuit was the result of the split in EFF over CALEA. Wow.
So it's quite a seminal thing. And CALEA has been exploited everywhere. This isn't the first time, because everybody wants to make switches for the U.S. market. Even switches that are sold outside of the U.S. have a CALEA toggle. And so the Greek national telecom was hacked, we think now by the NSA, no one's really sure, to gain access to information, to help Salt Lake City win the Olympic bid. Oh my God.
And they hacked it using the CALEA backdoors, and they hacked the prime minister of Greece. So they had the prime minister of Greece, using CALEA backdoors. That's amazing. I need to look up that story, because I've been meaning to write on this, just as a reminder, like, hey guys, this is a bad idea. But that's incredible. This is why this history is so important, because these fights have gone on for decades.
And I didn't know that about the schism in EFF. I had no idea about that. And I didn't know about Gilmore's Deep Crack. Yeah. So, you know, Shoshana, if you want to learn more about it, drop me an email. My boss, Cindy, has just finished her memoir, Privacy's Defender: My 30-Year Fight Against Digital Surveillance. MIT Press is bringing it out in the fall. And I bet she would give you an early copy.
And if you're listening, you can pre-order a copy from MIT Press. Thank you. Yeah. I'm definitely going to follow up with you on that. I've been meaning to dive in deeper. And it's funny, Leo, to your point too, my age-verification-cannot-work series is basically based on a lot of stuff from other people, a large part from Eric Goldman at Santa Clara and Mike Masnick. I'll be talking to them about stuff and they're like, oh yeah, we had to deal with this like 10 years ago, or 13 years ago, or 25.
And I'm like, oh my gosh, it's just the same freaking things over and over again. Everything new is old again. There's a cover image to come, but it's at MIT Press: Privacy's Defender: My 30-Year Fight Against Digital Surveillance. I am so glad she's writing a memoir. That will be well worth reading. It's going to be great. We will get her on for an interview. You should. Yeah, you really should. She's terrific. One of the themes of Enshittification, the book, is that the policy history, the policies
that give rise to enshittification, like banning the breaking of DRM and so on, it's a series of events in which experts told policymakers, this won't work and it will have these bad effects. And policymakers were like, I'm sure it'll be fine. You can just nerd harder. And then everything that we warned them about actually happened. And then they were like, well, why didn't you warn us? And I think we are about to get the nerd-harder response on age verification.
In fact, every time we argue about age verification, they say, oh, you'll figure it out. There's got to be some way to figure out who's under 13 without making a sort-by-income list of people's kinks, because you're using credit cards to validate them for sex sites. Yeah. And now, of course, age verification is not just about porn. It's also about social media.
Mississippi has passed a law that you have to be over 18 to use social sites, which must be a bit of a shock to people under 18. Wait, over 18? I run a Mastodon instance, and I don't really know what I'm going to do about this. Eugen Rochko, the creator of Mastodon, says we just can't comply with these laws.
It's going to be somehow up to the people who run the servers to figure this out. Bluesky, of course, has said, we're pulling out of Mississippi. I ended up, on our server, just putting it in. I'm sure it's not legally binding or anything, but I just put: you can't join if you're under 18, please give me your birth date, which of course Mastodon immediately throws out. But at least, you know, I'm making the motions.
But this is the problem: companies like Meta can deal with this. They've got lawyers in a whole building all by themselves. But small instances, Mastodon instances, chat rooms, blogs, anywhere that people under 18 can visit, which is anywhere on the internet, have to consider, how are we going to do this? And I just don't think there is a way to do it. There certainly isn't a way to do it privately, but I don't think there's a way to do it, period.
It's incredible too, because I've been trying to find ways to convince lawmakers and try to show them the levels of complexity. And earlier this year, I think in May or so, I did a deep dive on basically what tools are you going to use for age verification, what are you going to use to show your age. And everything is flawed. A lot of young people just don't have government IDs. It's just not something they have. And for the laws that want to segment kids under 18, kids do not have IDs.
And in a lot of states, they're not allowed to get them. There are, I think, at least two or three states I found, and I started going through one by one, where you can't get an ID unless you're, like, 15. So when they're segmenting 13 to 15 as a group in some of these laws, how the hell are you supposed to do that? And even when it comes to age estimation, Yoti says, well, if you want to stop people under 18 from accessing things, then set the challenge age to 25.
So lots of people in their twenties are not going to be able to access content. That's because Yoti is not using your government ID. It's using your face, right? It's based on facial image recognition. Right. And that is obviously, I mean, I look like I'm, you know, 35. There's just no way to. It was very nice of you not to howl with laughter at that. But obviously that's a terrible way to do it. It's digital phrenology, right? They just love digging out the calipers.
I mean, Stanford began as, you know, the home of the guys with the calipers. Stanford University, they were eugenicists. They were caliper users. The thing that made Leland Stanford rich was figuring out how to breed faster horses. And he became a eugenicist, and that's why eugenics is rooted in Stanford.
David Starr Jordan, who's one of the progenitors of eugenics, was a Stanford guy. He actually probably murdered Leland Stanford's widow. That's a very famous story. Yeah. He was an expert on strychnine, and they went on holiday in Mexico after Leland Stanford died, and she died of strychnine poisoning. And this was greatly to his advantage.
But no one knows. It's a mysterious death. But yeah, Stanford is like the beating heart of the eugenics movement. Well, it's very timely, because eugenics is back. And a very good book about this is Malcolm Harris's Palo Alto, just the canonical history of this stuff. This is great. This is a history lesson on this show today. I am now officially old, because all I talk about is stuff that happened a long time ago. Back in my day. Okay.
So Shoshana just wrote a piece for R Street analyzing the UK's Online Safety Act, and in it you're tying it to all the things being put forward in the U.S. Mississippi is just the first; Texas and Florida are both trying to do the same thing. And now that the Supreme Court said Mississippi's law is okay, it's just a matter of time, right? Well, thankfully the Supreme Court didn't say it's okay. They said they're going to let it go into effect, and it can be challenged again.
I think they were basically saying that NetChoice didn't have enough evidence to show that this is really going to affect them. Although Kavanaugh, and it's good that it was Kavanaugh. So, in effect, they were saying NetChoice didn't have standing. I don't think it was, weirdly. Basically it wasn't ripe, but it wasn't a standing issue. But he said they're overwhelmingly likely to succeed on the merits, which is a good sign. It just means later. It's pretty clearly a violation of the First Amendment, but I have
no faith in this Supreme Court's desire to defend the First Amendment anymore. It's good, though, that they said it after the Paxton decision. It is good. None of us were sure where things would go after they said age verification for porn, in a very broad way, was okay. So after that we weren't sure, but this is a good sign that it's not going to stand. But meanwhile you have the Trump administration, and many parts of it, multiple Republican congressmen too, saying, hey, what the hell is the UK doing?
This is ridiculous, and it's infringing on our rights. And that law was pretty much the basis for KOSA. But ironically, they're not saying don't do it. They're saying, if you're going to do it, we should do it. We don't want other people telling us what to do. That's our job. So KOSA is back. Yeah. So we don't know where that'll go. There's a lot of Senate drama that I don't even fully know, but there's, like, backbiting and infighting and stuff.
So it's on the calendar again. Oh, it is. When did that happen? Yeah. The latest action: placed on the Senate legislative calendar on June 30th. So it's been reported out of committee. The drama happened since then. So it might get a vote. Yeah, but I don't think what Tulsi Gabbard is saying is, we don't want to do it either.
I think she's just saying, you can't do it to our companies. I'm not sure yet. The way it's going, they seem pretty pissed about the way it's working. And I think they think the bills we have here are different. So me and other people are trying to remind them, these are the same thing. Not different. Yeah. I mean, it does raise this question, like, how will Truth Social or Rumble comply with it? Right. And so, I mean, I like Wilhoit's law: that conservatism consists of the proposition
that there is an in-group whom the law protects but does not bind, and an out-group whom the law binds but does not protect. It's one of the greatest quotes ever. I just memorized it. Yeah. And I do think that this is where Trump nets out, and, you know, the folks running Rumble and whatnot: well, we're never going to see Kash Patel or, what's her name, Pam Bondi coming after us over this.
We'll be insulated from it. It'll just be the woke social media that will get punished for it. Well, nothing's more woke than a Mastodon. Except for Truth Social, which is running on Mastodon. I don't know. Illegally. It's a $10,000-per-user fine. Yeah, that would pretty quickly bankrupt me. It's just a terrifying notion.
I think the smart thing to do would be for me to shut down all of our social stuff. There are graveyards of sites that have gone down because of the UK's Online Safety Act already, long-running, you know, decades-long community forums. Yeah. There's a famous fixie site that went down. Yeah. For Renault owners and stuff. Just, you know. And where are they relocating to? Facebook. Right.
And this was the argument we had over, not SOPA/PIPA, but SESTA/FOSTA, you know, the thing that notionally bans sex trafficking online, but has, on the one hand, just banned all sex work online, even consensual sex work, and on the other hand has driven a bunch of forums that couldn't assure
themselves that they weren't violating SESTA/FOSTA, has driven them to Facebook. So all the people who say, oh, if we just get rid of Section 230 and we make the platforms liable for the speech that they host, they will behave themselves, are acting as though the internet is in its final form and we will never have another kind of online platform, and that the companies that currently run the internet, the five giant websites
filled with screenshots of texts from the other four, are the last masters the internet will ever have, and there's no point in even imagining a better future. The EFF, of course, has a piece that came out this month: the UK's Online Safety Act does not make children safer. So even the proposed benefit of these bills doesn't accrue. It's bad all round. It doesn't do the job it's supposed to do.
And it really inhibits social networks in general, unless you're big, big enough like Meta. Meanwhile, Meta is making chatbots, sexy chatbots, for the children. They've recently said, okay, well, we'll stop doing that. Yeah. I don't know what the hell they're doing, but it would be great if they, like, stopped tempting fate. And it's not even tempting fate.
It's like, oh, I don't know what's more aggressive than tempting fate. That's building fate a landing strip and getting air traffic control to direct it in. It's so freaking ridiculous. I saw that, and it was one of those things I had to look away from, because I just didn't have the emotional energy to handle that they would do something so ridiculous and horrible, and then also put it in writing. They just put all the stupid crap they're doing in writing. Oh my God.
It's so amazing that his MO is still, after so many years, you know, I guess it's move fast and break things, but it's do bad things, then apologize afterwards. Like, oh, we didn't mean to do that. I think it's more like putting incriminating evidence in writing. This is a man whose 2 a.m. memos can exclusively consist of things like: okay, Bob, that guy we're going to kill, just so you know, it's a murder, and I'm definitely premeditating it.
Your pal Zuck. Love to Margie and the kids. So Meta says it's changing the way it trains its AI chatbots to prioritize teen safety. It will now train chatbots to no longer engage with teenage users on self-harm, suicide, disordered eating, or potentially inappropriate romantic conversations. But that's not all.
These chatbots are often impersonating well-known celebrities without their permission. It's mind-boggling. I thought those celebrities were impersonated with their permission, because I know some of them had deals. Not that they were consenting to all the stuff being said, but I thought the personality part was consensual. Some of them were, but not Taylor Swift's, Selena Gomez's, Anne Hathaway's or Scarlett Johansson's.
As we know, Scarlett Johansson, who's probably the most powerful woman in Hollywood, has very clearly said she didn't want OpenAI to use her voice. She doesn't like this idea. The chatbots Reuters discovered during its investigation were available on Facebook, Instagram, and WhatsApp. At least one of them was based on an underage celebrity and allowed the tester to generate a lifelike shirtless image of the real person.
The chatbots also apparently kept insisting in their chats that they were the real person they were based on. One chatbot, based on Taylor Swift, invited the Reuters reporter to visit them at the real Taylor Swift's home in Nashville. Do you like blonde girls, Jeff? the chatbot reportedly asked when told the tester was single.
Maybe I'm suggesting that we write a love story about you and a certain blonde singer. Want that? Now, of course, they didn't tell the chatbot to do this. These are the kinds of things AI does all the time, because it's trained on everything. The dialogue is terrible too. It sounds like it's not even trained on Taylor Swift, or focused on how she talks. Could you imagine Taylor Swift talking like that?
I don't know, maybe in a seventies movie. Well, and back to our previous discussion, this is like heralding a future in which the really high-quality handmade smut of Archive of Our Own disappears, because they can never afford the compliance, but there's just an infinitude of slop smut. Yeah.
That's just produced by, you know, it's the most cringe, awful, mundane crap. And then we train a generation of Andrew Tate-pilled boys to think that that's what smut should look like. That's not a good future. The company told Reuters, Meta told Reuters, the product lead only created the celebrity bots for testing, but then Reuters found out they were widely available. Users were even able to interact with them more than 10 million times.
Pretty big test. I guess life finds a way. Yes. Straight out of Jurassic Park. Okay. Slop smut. That would be a good name for a book. Cory, the chat room is suggesting you might want to. I have enough books. I've literally got two books coming out this fall. Wow. Three coming out next year. Are we going to see any more Marty Hench novels? I'm ready for more forensic accounting.
I don't know when I'm going to write one, but I've got to. Is one on the drawing board? Yeah, if I survive this year, I'll probably write another one. I won't be busy being a dad, right? So I can go. All the time in the world. Oh, sure. Yeah. That's how it happens. Tell them all about it. Oh yeah, I've got so much time on my hands. They're literally knocking at my door right now.
So yeah, I saw that. I saw you jump up and slam it shut. The UK's demand for Apple's backdoor may have been broader than previously thought. By the way, the only indication we have at all that the UK has backed down on this is Tulsi Gabbard's tweet. Yeah. Which has never been verified or backed up, and frankly, I don't think is all that credible.
They have yet to withdraw their order for Apple to create a backdoor to iCloud accounts, not just for UK citizens, but for all. The Investigatory Powers Tribunal has published a new legal finding, according to Engadget, saying they're demanding Apple create a backdoor that will access even more data. This is from the Financial Times: the UK Home Office hasn't withdrawn the request, despite what Tulsi Gabbard
[01:01:20] has claimed. Here's the financial. Oh, I got to log in. Financial Times has a hard paywall. Anyway. Anyway. Although archive.is will bypass it for you. I know. I know. I know. But I pay for the Financial Times. I don't know why, but I pay for it. I like to support. I like them. They shortlisted me. Yeah, that's right. You know what? That Cory Doctorow book. That's a good book. That's a good book, FT. I say that as a paying subscriber.
I should say they longlisted me. I have not yet been shortlisted. Oh, okay. The legal finding, which the FT has seen, indicates the UK government sought access not just to Apple's Advanced Data Protection encryption, but to their standard iCloud service as well, which, by the way, Apple has the keys to. Apple has already handed that information over to the U.S. government with lawful subpoenas.
So there's no reason for them to, you know, fight for that one. All right. Well, going back to Bluesky and Mastodon. Yes. You know, Bluesky is technically not available in Mississippi, but. Easy to get to with a VPN, right? Well, and if you run your own Bluesky instance, which my sysadmin is setting up right now for me, you are available in Mississippi.
And also, everything on Bluesky is available as an RSS feed in Mississippi. And if you use an alt client, you can access it in Mississippi. Well, this is the challenge, isn't it? Well, things that are not technically possible can't be legislated. So one of the things that we often say is that when you create the mechanism for, say, a backdoor, you invite someone to make a law saying you should have a backdoor.
[01:03:12] And, you know, if you standardize a backdoor or whatever, then people say, oh, you should just adhere to the standard. But, you know, the alternative to that is that things that can't be done technically can't be ordered legally. And so you end up having a much higher burden, right? Like what you end up doing is creating these administrative burdens on states that want to shut down speech that require a whole raft of tactics that start to look a lot more like Chinese great firewall stuff where you have to use trusted computing to control which apps you can install.
[01:03:40] You have to have extensive block lists for your national ISPs. You have to, you know, like you have a lot of stuff that you're going to have to do that may make lawmakers think twice, because it turns a relatively modest-seeming legal proposal into a very extensive one. Like, are they going to search people at the border and make sure they don't have VPNs installed on their phones? Like, how are they going to make that work? Right.
[01:04:06] Well, to that end, Cory said something that I try to communicate to lawmakers, but he said it a lot more eloquently and a lot more comprehensively than I've been able to. But the VPN thing is a really big deal. Like in the UK, I don't know how serious they are, but there are government officials kind of saying, oh, well, what if we ban them, or what if we age verified VPNs, which is insane? Oh, my God. Yeah. One of the things in my age verification series was why VPNs make this impossible. Like you can't account for all VPN traffic.
[01:04:35] But lawmakers have told me, no, that's silly. Of course you can. Like, of course, if you try hard enough. They do in China, Shoshana, they nerd harder. Yeah, it's the nerd harder thing. But now they're finding out you can't, like in the UK. It's just not working. And I love it: a 1,500 percent jump in the use of VPNs in the UK the day the law went into effect. And if you use a VPN, you're protecting children, which doesn't make sense, because if you're not a child, it doesn't matter.
[01:05:04] It's just like the most ridiculous stuff. And our lawmakers haven't gone that way yet and started to talk like that. But you kind of wonder, when they see the failures, they'll be like, oh, well, yeah, of course, we need to age verify VPNs. I need to take a break. I'm going to take a break. But I want to ask Cory about Blue Sky. Before you go on the break, if we just swallow the spider to catch the fly, I'm sure it'll all be fine. But then she has a spider inside her. What did she swallow to catch the spider? A bird.
[01:05:33] Oh, OK. But that'll be the last. That's it. It's over, right? The bird. It'll be fine. Yeah. It's either that or if you give a mouse a cookie. It's one or the other. I know. Yeah. Children's books often have the answers that adults seem to struggle with. We're going to take a break. I do want to ask you about Blue Sky because the last few times you've been on, you've said, I'm going to stay on X. I'm going to ride X all the way down. It's my Anatevka.
[01:06:02] But maybe that's not the case anymore. We will talk more. Our guests. Lou, we're going to give you something. I'm going to get you in the conversation somehow. Poor Lou. I know you're engaged and listening and interested in all this stuff. Yeah. I'll find a story for you, Lou. No. All I did was ask you about Microsoft. I do want to weigh in on the Blue Sky one. So come to me on that for sure. OK, good. OK, good. Louis Maresca is here. Great friend of the network. Longtime host of This Week in Enterprise Tech. Love Lou.
[01:06:30] He's just fantastic. And now leads the Excel Copilot integration at Microsoft. He's the guy who put Python in your spreadsheet. Two great tastes that go together, ladies and gentlemen. It's really nice to have you, Lou. Shoshana Weissmann is also here, from R Street, rstreet.org. Is it a hard thing to be a lobbyist these days? Is it harder than it was? So I don't lobby. I just yell at government officials and say nice things about them.
[01:06:59] It's technically different. No, it's hard because people just want to do what they want to do more than ever. And I think there's less room to be convinced in some cases. But there's also more room if you really look for it. So it's different, and it's worse and better. And I'm trying to find not just the silver linings, but the nuance in everything. Good on you. You're such an optimist. You're so positive. Depending on the day. Some days I'm like, screw this. Like, oh my gosh, I should just go find a way to be a hiker. I want to go hike. And like, that's it. Yeah.
[01:07:29] Go hike with my marmots. Yeah. Oh, I'm sorry. You're fighting the good fight though, Shoshana. Keep up the good work. Thank you. I'm trying. And I'm trying to work with everyone. And so far, I haven't pissed off any large cohort of people, which is nice. Yeah, that's great. And of course, the wonderful Cory Doctorow, his new book Enshittification is going audio. And if you want to support his DRM-free audio, go to Kickstarter, search for Enshittification,
[01:07:58] or you can go directly to enshittification.org. He's the man who coined the term. Also, do watch that very nice talk you did at CloudFest. I can never get it right. It's a great talk, in Stuttgart. And they love you. It was in Europa Park, which is the fake Disneyland outside of Stuttgart. Oh, wow. It was great. The Enshittification audiobook and e-book and all the books I sell on my online store and
[01:08:27] on Kickstarter, which I sell on behalf of my publishers, so like Macmillan and Verso and so on. But I sell them not only without DRM, but without license agreements. So it's pretty much the only way you can buy an e-book and not license it. So you can sell it. You can give it away. You can lend it out. If you get divorced, you can divide it up. You get the odd pages. I'll take the even pages. That's right. But it's a thing you own, right? Unlike every digital asset you've bought, it's a thing you own.
[01:08:57] It's such a good fight. And I hope you continue to make that fight, because for the rest of us, it's just turned us into pirates. We're just using, you know, Calibre, or I found Libation to free my Audible books. You're just turning us into pirates. But that's what we need to do. Well, I'm creating a lot of potential criminal liability. I'm not saying you're necessarily at risk, but, you know, trafficking in those tools carries a $500,000 fine and a five-year prison sentence under Section 1201 of the DMCA.
[01:09:24] And so, you know, sure, you're not facing any liability. But when I talked about this at the U.S. Copyright Office triennial hearing at UCLA, I think it was 2018, I pointed out that everyone who'd been in this UCLA boardroom giving presentations using VLC over the week had been using a version of VLC that had DeCSS code for ripping DVDs in it. And that the kid who brought the laptop into the room was a trafficker in circumvention tools and could go to prison for five years.
[01:09:54] And maybe we should be rethinking this law rather than having hearings about making narrow exemptions to it that no one can figure out how to use. It didn't persuade them, unfortunately. Well, actually, one of them came up and told me she thought I was right, but that it was in Congress's hands and not hers. So, yeah, there you go. It's always in Congress's hands. Actually, VLC ended up taking that DeCSS code out of VLC and just telling you how to put it in later. Yeah. Which was their way around that.
[01:10:24] But telling you how to put it in later is trafficking too. Yeah. Yeah. Yeah. We can do it. We just can't tell anybody. That's a secret. Our show today brought to you by ZipRecruiter. You know what? If you're hiring, first of all, good on you. It's wonderful. You're creating jobs. But I know it's also hard. You know, when we've been in that situation, it can be overwhelming.
[01:10:53] It's like you have too many options, like trying to figure out which TV show or movie to stream when you're on vacation and you want to see the sights. It's just too much, right? The same applies if you're a business owner who's hiring. It can be overwhelming to have too many candidates to sort through. But you're in luck. ZipRecruiter now gives you the power to proactively find and connect with the best ones quickly. How?
[01:11:22] Well, this is through their innovative resume database. And right now you can try it for free at ziprecruiter.com slash twit. So this is how it works. ZipRecruiter's resume database, and by the way, they get hundreds of thousands of new resumes every month because people go to ZipRecruiter looking for work, uses advanced filtering to home in on the top candidates for your roles. They look at the qualifications you're looking for, they look at the resumes they have, and they make the matches.
[01:11:51] If you see a candidate you're really interested in, you can unlock their contact info instantly. They get 320,000 new resumes every month. Every month! Which means you can reach more potential hires and fill roles faster. No wonder Zip Recruiter is the number one rated hiring site based on G2. Skip the candidate overload. Instead, streamline your hiring with Zip Recruiter.
[01:12:17] See why four out of five employers who post on ZipRecruiter get a quality candidate within the first day. Just go to this exclusive web address, ziprecruiter.com slash twit right now to try it for free. Again, that's ziprecruiter.com slash twit. ZipRecruiter, the smartest way to hire. So I did not know that Blue Sky... so Blue Sky's protocol, the AT Protocol, is in theory a federated protocol. In theory, somebody could set up a Blue Sky. Federated...
[01:12:47] Federatable. Yes. Yeah. But for the longest time, nobody was doing it. Nobody had done it. So, but you're saying now you can? Are you... you've got something? Yeah. So it started a few months ago with the CTO. So here was the thing. In order to do true federation, which is to say, where you did not need to have any infrastructure controlled by the Blue Sky company in order to communicate with Blue Sky users,
[01:13:14] you needed to run a kind of server that was very expensive to operate, to the tune of tens of millions of dollars a year. What? And that was to ingest the whole firehose and do things like give you real-time read counts and comment counts on your posts. So you had to run, in effect, an equivalent infrastructure. You had to run Blue Sky. Yeah. So if you ran Blue Sky, you could run another Blue Sky. Right. And it would be co-equal with Blue Sky, hypothetically.
[01:13:44] But the CTO figured out that if you were willing to take a relatively small latency hit in terms of those counts, those real-time counts... so if you didn't care about knowing how many people had read a post plus or minus 10 seconds, and you didn't need it to be plus or minus 10 milliseconds, it would go to like $30 a month and you could run it off a Raspberry Pi. And he built that for himself and published some recipes.
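The cost-for-latency trade just described can be sketched as a toy: instead of one storage write per firehose event (millisecond-fresh counts, expensive), you accumulate events in memory and flush totals per batch. This is purely illustrative; real atproto relays are far more involved, and the numbers here are made up.

```python
# Toy illustration: batching firehose events trades count freshness for
# dramatically fewer storage writes. Not the actual atproto relay design.
from collections import Counter

def realtime_writes(events):
    # One storage write per event: accurate to the millisecond, expensive.
    return len(events)

def batched_writes(events, batch_size):
    # Accumulate in memory; flush one write per distinct post per batch.
    writes, pending = 0, Counter()
    for post_id in events:
        pending[post_id] += 1
        if sum(pending.values()) >= batch_size:
            writes += len(pending)   # one write per distinct post in the batch
            pending.clear()
    if pending:                      # flush the final partial batch
        writes += len(pending)
    return writes

if __name__ == "__main__":
    # 10,000 events spread over 50 hot posts:
    events = [f"post-{i % 50}" for i in range(10_000)]
    print(realtime_writes(events))        # 10000 writes
    print(batched_writes(events, 1000))   # 500 writes: 50 posts x 10 flushes
```

With a 10-second flush instead of per-event writes, the write load collapses by orders of magnitude, which is the shape of the "$30 a month on a Raspberry Pi" claim.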
[01:14:12] And so there's a thing called Black Sky, which is people who are kind of refugees from Black Twitter who've built a federated Blue Sky standalone system. I don't think that's them. No, I pulled up the wrong thing. Black Sky, Blue Sky. Yeah. I mean, yeah. And there are a few others. And I follow the Black Sky list, but this is not the list. This is actually their own server, which is great. I believe it's their own stack. I might be getting out over my skis.
[01:14:41] My impression was they're their own server. Maybe they started as a... yeah, I think they are their own. Maybe they're just a moderation layer, but... No, yeah. Look at, there it is. Black Sky, an AT Protocol app implementation built in Rust. Yeah. There you go. Prioritizing the community building efforts of marginalized groups. There you go. So my server, my sysadmin's server at a colo in Toronto, was like
[01:15:08] 15 years old and he couldn't get... Your sysadmin is 15? No, no, no. The server is 15 years old. Oh, okay. All right. No, I wouldn't be surprised. It was fine. No, no, no. He's a great guy. He used to be the CTO of Wikimedia, Ken Snyder. He's a brilliant admin. Oh, okay. And he needed to upgrade his hardware. So he upgraded his hardware a couple of weeks ago and now we are building fedi.pluralistic.net and bsky.pluralistic.net. Will it be public or is it just you?
[01:15:38] No, just me. I'm going to own my stuff. A lot of people do that with Mastodon. They just run their own Mastodon instance for themselves. Yeah. And Mastodon, it's interesting, because while Blue Sky in its current iteration, with this kind of breakthrough on reducing server load, has made it quite practical to run your own server, if you have a relatively high-readership Mastodon account, hosting your own server
[01:16:04] is pretty intense, because every person who reads your stuff gets a copy of it. And so your server gets really hammered. We're having to do like CDNs and stuff. So, you know, we are getting true federation in both. I'd always said that I wouldn't go to Blue Sky if I couldn't leave. Right. I didn't want to be in a situation like I was with Twitter, where I built up half a million followers and it was structurally important to my ability to sell books and get people out for book
[01:16:32] tours and, you know, take care of the financial wellbeing of my family and promote the stuff I work on at EFF and so on. I wouldn't ever put that at the mercy of someone. Cause I knew the people who started Twitter, they were nice. And then I got stuck there, and... Turned into a Nazi bar. Yeah. And so it's not enough to trust the personalities of the people that you have. And that's kind of the thesis of enshittification: enshittification is not about people being bad. It's about bad people being unshackled.
[01:17:01] Uh, you know, Mark Zuckerberg ran a better service when he worried about the consequences of the service being bad. Right. And now that he's too big to fail, he's too big to care, and he can do bad things to us. You know, he knows that we're locked in. He knows that you love your friends more than you hate Mark Zuckerberg. And so long as he makes sure that he never makes you hate him more than you love your friends, he can torment you, and he can torment you for money. Not cause he's a sadist.
[01:17:31] Although I think he probably is, but because it's profitable. Um, good. Well, so does this mean that you will use Blue Sky and Twitter both? No, I'll be off Twitter. Twitter is gone now. I mean, the last thing I was hanging in for was promoting the Enshittification Kickstarter. So one of the things I do with these Kickstarters when I launch them is I go to my DMs and I look at all the people who have open DMs with me that I know.
[01:18:00] And I send them a DM and say, hey, would you mind quote-tweeting the announcement of this? And that's been really successful. It's made the difference between successful campaigns and not. And of all the weird things that Musk has broken with Twitter: if you go to your DMs and you type, like, Mike, none of the people whose names come up are people you follow or who follow you. It's just famous people named Mike. And most of them don't have open DMs. I don't know what that's supposed to do. It used to be also that you could get an infinite scroll of DMs.
[01:18:29] So if you typed Mike, you could see everyone who had open DMs with you whose name was Mike. Now you just get like 12, and they're all just like Mike Cernovich, and, you know, people you don't want to talk to, who don't want to talk to you, who you don't follow and who don't follow you. And you can type the full name of people you follow and it won't come up. You have to type their exact Twitter handle. So this is bizarre. But it's meant that the last thing that I was hanging on to Twitter for, which is
[01:18:55] like, okay, well, this is one way to get an extra 5,000 backers for my Kickstarter campaigns, it's not even useful for that anymore. And so that's it. When the Blue Sky server's up, I'll just park that Twitter account. Yeah. Blue Sky, according to Ars Technica, is now the platform of choice for the science community. Yeah. Lou, you said you had something. Are you a Blue Sky fanatic? I, you know, I actually enjoy Blue Sky. I think my point of view here, though, I think it comes down to the fact that, you know, I think
[01:19:25] we should decouple the whole idea of science, knowledge dissemination, and conversation from the platforms and stick with where the masses are at, where you can share information and knowledge. And I still think Twitter is the place to be. I do agree that Blue Sky is probably a better platform, but, you know, I think I'm still kind of under the impression that most international science people are not going to say, oh, I don't want to do this on Twitter. They know where the masses are. They're going to keep posting there.
[01:19:54] Why restrict that for yourself? As they should, because that information needs to get out. You know, I mean, we now have a Make America Healthy Again plan that apparently means give kids whole milk and don't give them vaccines. The science needs to get out. We basically got an anti-science government.
[01:20:17] Um, so yeah, I mean, the problem always to me has been that it's not everybody on Twitter or everybody on X or everybody on Blue Sky. It's still a small fraction of the total audience. And this is one of the problems with getting the word out these days: there is no central place to get the word out anymore. Blast it everywhere. Put it everywhere. Right. That's the solution. Yeah. POSSE: publish on your own site, syndicate everywhere.
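The POSSE idea can be sketched in a few lines: the copy of record goes on your own site first, then copies fan out to each platform via small adapters. The adapters below are hypothetical stand-ins; a real setup would call each service's API (Mastodon, Bluesky, and so on).

```python
# Minimal POSSE ("Publish on your Own Site, Syndicate Everywhere") sketch.
# The platform adapters here are hypothetical placeholders, not real APIs.
from dataclasses import dataclass, field

@dataclass
class Post:
    text: str
    canonical_url: str                 # the copy of record lives on YOUR site
    syndicated_to: dict = field(default_factory=dict)

def publish(text: str, site: str) -> Post:
    # Step 1: post to your own site first; that URL is canonical.
    return Post(text=text, canonical_url=f"{site}/posts/1")

def syndicate(post: Post, adapters: dict) -> Post:
    # Step 2: push copies everywhere, each one pointing back at the canonical URL.
    for name, send in adapters.items():
        post.syndicated_to[name] = send(post.text, post.canonical_url)
    return post

if __name__ == "__main__":
    fake_adapters = {
        "mastodon": lambda text, url: f"mastodon://status?src={url}",
        "bluesky":  lambda text, url: f"bsky://post?src={url}",
    }
    p = syndicate(publish("hello web", "https://example.com"), fake_adapters)
    print(p.canonical_url)
    print(sorted(p.syndicated_to))
```

The design point is that losing any single platform costs you only one adapter, never the canonical copy.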
[01:20:47] Yeah. That's what I do. I have a Micro.blog blog that posts to all of the above, not Twitter, cause they don't have an API that you can use, but everywhere that does. And that's an interesting question: if I could post on X, I'm not sure if I would or not. I moved off X the minute Elon bought it. And my predictions were validated. I will give... you know what? We give Elon a lot of crap.
[01:21:15] I'll give him some credit, or at least the engineers that are working at SpaceX some credit. They had a very successful launch on Wednesday, or was it Tuesday? It was Tuesday. And the Starship didn't blow up. So that's good. Very good for the people who live underneath it. Yes. Very good. If you're in the Caribbean, that rocket will not come down on your head.
[01:21:42] Um, and it was pretty impressive. It was fun to watch. I watched it. I'm a space guy. I like this kind of stuff. I grew up watching Walter Cronkite and the man on the moon and all of that stuff. So credit where credit's due. Although I guess it really doesn't go to Elon. Do you know what I find amazing? Just how mediocre the technology on Twitter has become, right? Just things like, if you are posting a thread, it used to be that you would compose the
[01:22:12] first 25 and then click on the last one and post another 25. And now when you click on the last one, it says an unexpected error occurred. There are a lot of little things breaking over there, especially logging in. And if you lose your account, people are complaining like crazy about how to get back on and so forth. These are regressions, which is really interesting, because this is stuff that just worked and it doesn't work anymore.
[01:22:42] My guess is that a lot of it is saving money. Like, I think the reason they broke people search in DMs is because you're running a really big set of open looping queries through a directory. And I think they were just like, oh, that's hitting us for a lot of compute. You just don't need to find people to DM them. You can just suck it up. You either know people's exact handle or you don't, but don't count on DMing anyone. Right.
[01:23:10] Well, as Elon tweeted, or whatever they call it now, as he posted on X earlier today, X is the media now because you are the media now, whatever the hell that means. I think that the verb for posting on X now is "xitted," pronounced "shitted." He pooped on X. Yeah. It sounds bad, but it's not. It's X-I-T-T-E-D. Yeah. Yeah.
[01:23:40] Yeah. Yeah. It's turned into TikTok; at least the For You part of it is all videos now. It's kind of interesting how that's changed. We're streaming on X right now. We're also streaming on TikTok. We're streaming on Facebook and LinkedIn. There are 800 people watching us on a variety of platforms. I'm agnostic about that. I think you're right, Lou: be wherever people want you to be.
[01:24:07] That's always been my motto for this, you know. That's the idea of podcasting. It's open. And like that, I will give some credit to the FCC, which rarely deserves credit in the Brendan Carr era. The folks at the National Association of Broadcasters went to the FCC and said, you know, you should charge cable
[01:24:33] style regulatory fees on streaming services and tech companies, and broadband providers while you're at it. And the FCC said, you know, that really isn't our mandate, is it? Actually, what they really said is, we don't have the budget or the manpower to do that. But also, frankly, one of the things the FCC has always said is, if Congress doesn't give us that authority, we can't do it.
[01:25:00] And that's really the real reason: the FCC is there to regulate the nation's airwaves, not broadband providers, streaming services, and tech companies. The NAB, which is a broadcasting lobbying group... Go ahead. I was going to say, I think that, you know, there's a constitutional question as well. The argument for regulating speech on air, which includes things like levying
[01:25:29] fees and having a licensure regime, is that the spectrum is a scarce resource. And so it's a public resource. Yeah. In the same way that, you know, you're going to have to have rules for who can use the meeting room at the library, cause if everyone tries to use it at the same time... it is the public's meeting room, everyone can use it, but you have rules about who can use it and how, because otherwise no one would get to use it. But that argument is just not the case for broadband, right? It's just not the case for online services.
[01:25:58] And so the argument that somehow there is a way that you could square the circle of the First Amendment and regulating the internet is crazy. At least the way that they regulate broadcast providers, the cable stuff that they regulate, is a compulsory license for broadcast signals.
[01:26:24] So when cable started, it was guys with antennas in central Pennsylvania who sold TVs, who ran cables from the antenna to the houses of people who bought TVs so that they could receive signals, cause they were too far from both Pittsburgh and Philly to get their signals. And so after a bunch of lawsuits, they said, okay, fine, anyone can rebroadcast any broadcast television on a cable, but you have to pay a set fee. So that's where those fees come from.
[01:26:50] If you're not retransmitting a broadcast signal, why would you pay broadcast signal fees? Yeah. The National Association of Broadcasters says, well, you tech companies benefit from the FCC's regulatory regime, so you should pay for it. It's nonsense. It's just broadcasters grasping at anything they can with their fingers as they slide down the wall of life.
[01:27:17] Well, let's talk a little bit about Cloudflare, cause I think this is an interesting story. Cloudflare got a lot of attention. I've been a fan of Cloudflare for a long time. They give away free services that are very valuable, but increasingly they seem to want to become kind of the police of the internet. And a couple of weeks ago, we talked a lot about it.
[01:27:43] They dinged Perplexity for ignoring their, you know, robots.txt AI-scraping blocks. And I think maybe misunderstood a little bit of what the difference is between, say, Perplexity and OpenAI, but Perplexity defended themselves. It kind of went on. But increasingly it's looked like Cloudflare is almost in a power grab. And I'm really curious what you all think about it.
[01:28:10] They have a new thing called signed agents. Essentially they have an allow list. There's a post from PositiveBlue saying Cloudflare closed off the open web and told builders to apply for permission. Cloudflare, he says, is solving the issue like a border checkpoint: get on their list or get treated like a trespasser.
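The "border checkpoint" mechanic can be sketched as a toy gate. To be clear about assumptions: Cloudflare's actual signed-agents scheme uses HTTP Message Signatures with registered public keys; the stand-in below uses HMAC shared secrets purely to show the gatekeeping shape, where unknown or unsigned crawlers are simply turned away.

```python
# Toy sketch of an "allow-listed, signed agent" gate. NOT Cloudflare's real
# protocol (which uses HTTP Message Signatures); HMAC is used here only to
# illustrate the allow-list-plus-signature shape of the check.
import hmac
import hashlib

ALLOW_LIST = {                     # agent name -> shared secret (hypothetical)
    "good-bot": b"secret-key-1",
}

def sign(agent: str, body: bytes, key: bytes) -> str:
    """Produce the signature an approved agent would attach to a request."""
    return hmac.new(key, agent.encode() + body, hashlib.sha256).hexdigest()

def admit(agent: str, body: bytes, signature: str) -> bool:
    """Gatekeeper check: on the list AND correctly signed, or turned away."""
    key = ALLOW_LIST.get(agent)
    if key is None:                # not on the list: treated as a trespasser
        return False
    expected = sign(agent, body, key)
    return hmac.compare_digest(expected, signature)

if __name__ == "__main__":
    body = b"GET /article"
    ok_sig = sign("good-bot", body, b"secret-key-1")
    print(admit("good-bot", body, ok_sig))    # True
    print(admit("rogue-bot", body, ok_sig))   # False
```

The panel's objection is visible right in the data structure: whoever maintains `ALLOW_LIST` decides who counts as a legitimate visitor to the web behind the gate.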
[01:28:39] And I think PositiveBlue makes a very good point. Cloudflare is saying, now we are going to be in charge of who's on the open web and who's not. Is that overreach? Lou, do you have an opinion on this one? Please. I think it's overreach. I do. I do think it's overreach. I think they're obviously trying to restrict something, restrict access to content on the internet. They're trying to act as a censor. This is just overreach in general.
[01:29:09] Yeah. Yeah. And I like Cloudflare, and, you know, I'm kind of a fan of them, but I have to say, this does not seem like a good direction that they are now moving in. Shoshana, you're nodding. Yeah. I've just been really confused by a lot of their moves. Even before this, they've been going after other tech companies. And I know tech companies will do that, but it was weird because they just really hadn't before. And it seemed like they were just out to pick some fights. And a lot of the stuff, I don't remember exactly what it was, cause it was like a month ago.
[01:29:36] And I was like, I don't know what the hell this is, but it seems like maybe they're just not sure of their place and they want to assert themselves, or they see an opportunity for themselves to be more powerful in asserting themselves. But it's some weird stuff. And I'm not sure what their end goal is, but if they go down the route that it sounds like from the stuff that you read off, it doesn't sound like people will like them. They almost sound like they're setting themselves up as the one and only certificate authority, in a way. And they're solving a problem.
[01:30:06] I don't know if it really needs to be solved. You know, as we were saying with POSSE, I'm a big fan of the open web. I think the open net is a vital thing. And I don't think we need a gatekeeper to be in charge of who's who on the open internet. Yeah. It might solve some problems, but it doesn't solve the problems that they say they're solving. Cory, are you following this? Yeah, I have.
[01:30:34] And, you know, I have similar feelings about Cloudflare, in that they have taken stands for good in the past and been particularly adamant that being an anti-DDoS provider should not make you the first port of call, or even the last port of call, for people who had qualms about the content on the other side of the anti-DDoS wall. They were very concerned that they would become a choke point and then become a natural
[01:31:02] landing spot for anyone who was angry at anyone else on the internet, for good reasons or bad. And that people would just show up at Cloudflare and say, withdraw your protection from these people, and then they will cease to exist, for all intents and purposes. Very much like the Spanish soccer league is doing with DNS providers and the ISPs to protect themselves from piracy. They're saying, well, pull down the piracy sites. It's your job. So this is the problem of monopoly.
[01:31:33] And we talked about John Sherman before. So John Sherman, you know, gave this very famous speech in 1890 when he was stumping for the Sherman Act. He said, if we would not tolerate a king, we should not tolerate an autocrat of trade, with the power to fix the price of commodities and to decide who can get them, right? Basically, the problem with a benevolent dictator is only partly that they might not turn out to be benevolent.
[01:31:59] Even if they remain benevolent, they're still a dictator, and you can be benevolent and still be wrong. You can be benevolent and still be pressured, right? You're the one throat to choke, so people will come to you and make demands of you. And also, the instant we start to treat infrastructure as a vehicle for policy objectives, everybody begins to make a claim on that infrastructure, and it ceases to be trustworthy as infrastructure.
[01:32:28] So there's a really good book about this called Underground Empire, by Henry Farrell and Abraham Newman, a couple of political scientists. I've known Henry forever. He and Aaron Swartz and I worked on stuff together. And Henry and Abraham's thesis is that the US, under Obama first, or maybe under G.W. Bush and then under Obama, started to take infrastructure that the US operated that was seen as a kind of neutral broker.
[01:32:56] So like ICANN, or the SWIFT dollar-clearing system, or fiber head-ends, and began to use it to try and attain US policy goals. So sanctioning people, spying on fiber, getting ICANN to deny or permit certain kinds of IP allocations, that kind of thing, in order to achieve policy goals. And what immediately happened, or what eventually happened, was that everyone who had treated
[01:33:26] things like dollar clearing in the SWIFT system, or fiber, or whatever, as a neutral, trustworthy piece of infrastructure, as territory that wouldn't be tilted to one side's advantage or another, stopped fully trusting it. And now we see a fragmentation. We see new fiber going in. We see people seeking alternatives to dollar clearing. Some of that I think is cryptocurrency, but also you have renminbi clearing becoming more central.
[01:33:52] So not only has it been a problem for people who just want to get stuff done in the world, it has, ironically, weakened the US's hold, because it's spurred the creation of these alternative systems. The same thing happened when the US started to talk about disabling GPS for military rivals: Europe put up its rival GPS constellation, which then takes away the US's ability to use that power.
[01:34:22] Um, so, you know, I do think that they're playing with fire. I think people have treated Cloudflare as a neutral, honest broker, and now they're going to see them as a parochial self-dealer. And I also think that it shows us why antitrust was right, from the Gilded Age to the seventies, when it treated the concentration of power per se as a problem, even if it was being used to good ends.
[01:34:48] Even when I concentration, yeah, even, even a benevolent dictator is still a dictator, uh, and how the theories of the Reagan revolution, the consumer welfare theory that said that so long as a monopoly isn't raising prices or lowering quality, they should not only be tolerated, but encouraged as a source of efficiency, that that was wrong, that it created teams that were more powerful than the ref.
[01:35:10] Uh, and, um, it, it made them, uh, uh, ungovernable, uh, and also ironically for people who believed in the Reagan revolution militated for a much larger state, because even if you think that the only thing the state should do is enforce contracts, the state has to be bigger than the parties to those contracts in order to enforce them. So the smallest state you can have is determined by the largest corporation you're willing to tolerate. You know, I love the notion that infrastructure needs to be agnostic. It needs to be neutral.
[01:35:40] I mean, that's what net neutrality is all about, and we're starting to see some issues where it isn't. Cloudflare is just a small example of this. Intel, for instance, is a little upset that the federal government has taken a 10% equity stake in the company, saying it could trigger adverse reactions, because they're now no longer a neutral chip manufacturer. Sure.
[01:36:09] There's a policy implication here. The same thing happened with Nvidia: it was very interesting to see that Nvidia suddenly couldn't sell the H20 chip into China, because people were concerned about a back door in the H20, that it could be disabled. And what happened? Well, China started to make its own GPUs and stopped buying Nvidia's, which is another reason Nvidia is probably hurting in the stock market.
[01:36:38] I think that's a really important point: infrastructure needs to be neutral. It needs to be a pipeline, and not make policy decisions. I don't know if the Intel one fits that model exactly, but I think Intel is being turned into what the Europeans call a national champion. Yeah, it's not good. Well, we see that people all over the world don't want Huawei in their national infrastructure,
[01:37:07] because they think that China treats it as a national champion. Sure. So we're going to have golden shares in our companies, I guess. Ironically, I think some of the good that could come of this is stuff like EuroStack, where we're seeing the EU funding open source development and doing things like low-cost loans for
[01:37:32] infrastructure and state investment to build clouds to rival the American cloud, so that they can move off of US infrastructure, because Trump has made it really clear that he views it as a vehicle for attaining the political goals of the United States. What I think would be really interesting is if they decided to get rid of anti-circumvention laws, the laws that ban jailbreaking. Because honestly, how are you going to get your firm off of Office 365
[01:38:00] if you don't have a migration tool? And to get that migration tool, you're going to have to scrape, you're going to have to jailbreak, you're going to have to do a bunch of things that the US Trade Representative told Europe they had to put on their law books or they would face tariffs from the US. Well, now that they have the tariffs anyway, why wouldn't they get rid of those laws? Yeah. Not that anybody, Lou, would ever want to get out of Microsoft 365. It's a fabulous, fabulous product, and we all appreciate the work that it does.
[01:38:27] I mean, even if you love it, I think the way to make sure that it stays good is to make sure that Microsoft is worried about people leaving. Leave it open. If your customers can't leave, you don't treat them well. Anyone who has ever bought a bottle of water at an airport knows this. Yeah. That's enshittification, literally, in the flesh. Cory Doctorow is here, and so is Shoshana Weissmann from rstreet.org, and Lou Maresca.
[01:38:56] We won't mention what Lou does for a living at this point. No, he is leading Excel Copilot integration at Microsoft. Congratulations on the promotion, by the way. That's great news. Thanks. Let's take a break; we'll come back with more in just a bit. Our show today brought to you by NetSuite. What does the future hold for business? Ask nine experts, you'll get ten answers.
[01:39:24] Bull market, maybe bear market. Rates will rise, rates will fall. Inflation's up, inflation's down. Can somebody please invent a crystal ball? Until then, over 42,000 businesses have future-proofed their business with NetSuite by Oracle, the number one cloud ERP, bringing accounting, financial management, inventory, and HR into one fluid platform. With one unified business management suite, there's one source of truth,
[01:39:52] giving you the visibility and control you need to make quick decisions. And with real-time insights and forecasting, you've practically got a crystal ball: you're peering into the future with actionable data. And when you're closing the books in days, not weeks, you're spending less time looking backward and more time on what's next. Whether your company is earning millions or even hundreds of millions, NetSuite helps you respond to immediate challenges and seize your biggest opportunities.
[01:40:20] Speaking of opportunities, download the CFO's guide to AI and machine learning at netsuite.com/twit. The guide is free to you at netsuite.com/twit. We thank them so much for supporting This Week in Tech. I think it was kind of ironic that once the government took a 10% stake in Intel,
[01:40:47] spending the money that had already been allocated to Intel in the CHIPS Act, by the way, not any new funds, they now say: oh, and by the way, those milestones that were in the CHIPS Act? You don't have to meet those anymore. According to the Wall Street Journal, Intel has said in a filing that it can now get money from the government as long as it can show it's already spent that money on projects. It's already spent it.
[01:41:15] So, hey, we don't have to do anything else. But we've got 10%, and that's what matters. Gosh darn it. Moving right along. I'm with Bernie Sanders in that I don't think it's crazy to say that a company that gets a lot of public money should have some public obligations. But that's not what Trump is doing here, right? Right. Intel has huge problems.
[01:41:44] They can no longer make chips, right? They're pretty reliant on TSMC for most of their best manufacturing. And so, you know, that's not going to get solved by Trump taking them over. Although that money, I mean, could it keep their foundry business alive a little longer, yes or no? The reason they're broke is that they spent it all on stock buybacks, and there's nothing about having a board seat that will stop them from doing more of those stock buybacks, either.
[01:42:13] So, you know, when you have a company that's structurally important to the nation and the state intervenes to make sure that it's run well, that's not necessarily insane, provided it's done well. But I don't have any confidence that Trump is going to do it well. He has not announced any plans, apart from having a share in it, to make it invest in foundries to improve its onshore production capacity. It's purely a smash-and-grab, some red meat for the base.
[01:42:43] Move on. Press the Diet Coke button a bunch of times. Lose your focus. Get weird about boogers on the Resolute desk from Elon Musk's kid. You know, talk about something grandpa-ish, and then hope nobody notices your hand bruises. You got a lot in that sentence. That's good, I'm impressed. Actually, a trillion dollars already this year in buybacks. What's wrong with buybacks?
[01:43:14] Shareholders love them, right? Apple buys back billions of dollars of its stock every year. Is that different from a dividend? Yeah, because it moves the share price without changing the underlying fundamentals. That's why they used to be illegal: they were considered a stock swindle, right? It makes your shares more valuable without the fundamentals changing. When did they become legal? Was that under Clinton? It was a series of lawsuits, I think, in the 70s and 80s. Oh, OK.
[01:43:43] That broadened it, and then I think the SEC made a rule, and the rule was basically: you can only do a good stock buyback, not a bad stock buyback. And the loophole for what counts as a good stock buyback is so malleable that basically all stock buybacks are now good stock buybacks. You just have to sort of say: this stock buyback is good. This is a good one. Well, it's good for the shareholder, because it reduces the number of shares outstanding,
[01:44:11] which means the value of your share is higher than it was. Apple's doing $100 billion this year. Sure, the stock should go up because the company does better, right? But a company that puts $80 billion into buybacks has $80 billion less to spend on the company's operations. That's a little old-fashioned, Cory. I mean, look at GameStop: you don't need a company to do well for its stock to go up. You're an old-time guy.
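[Editor's illustration] The mechanics being argued about here are just arithmetic. A toy calculation, with invented numbers and an assumed constant price-to-earnings multiple, shows how a buyback raises earnings per share, and therefore the share price, while the company's underlying earnings don't change at all:

```python
def per_share_metrics(earnings, shares, pe_multiple):
    """Return (earnings per share, implied share price) for given fundamentals."""
    eps = earnings / shares
    return eps, eps * pe_multiple

earnings = 100_000_000_000   # $100B in annual earnings, unchanged throughout
pe = 30                      # assume the market keeps paying 30x earnings

# Before the buyback: 16B shares outstanding (hypothetical).
eps_before, price_before = per_share_metrics(earnings, 16_000_000_000, pe)

# Buy back 1B shares: same earnings, fewer shares to spread them over.
eps_after, price_after = per_share_metrics(earnings, 15_000_000_000, pe)

print(f"EPS:   {eps_before:.2f} -> {eps_after:.2f}")    # 6.25 -> 6.67
print(f"Price: {price_before:.2f} -> {price_after:.2f}")  # 187.50 -> 200.00
```

The share price rises purely because the share count fell, which is Doctorow's point: nothing about the business improved, and the cash spent on the repurchase is no longer available for operations.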
[01:44:41] I could see that. Right. Yes. But if you think that the reason to have a stock market is to allocate money to companies that are good, as opposed to having a slot machine... Yeah. I mean, we are doing this. We have crypto. That's right. We could just replace the stock market with a slot machine, I suppose. But if you believe that the reason to have stocks and public companies and limited liability,
[01:45:06] all the things that we have erected around the market, is to encourage the allocation of capital to efficient firms, then having stock prices go up when firms don't do anything to increase their efficiency should be treated as a bug and not a feature. All right, I got one for you, Lou. Microsoft unveiled two in-house AI models. These are speech generation systems, right?
[01:45:36] MAI, or do you say "my"? MAI-Voice-1 and MAI-1-preview. You can say it either way. Yeah. The interesting thing about this article is that it made it seem like Microsoft isn't developing its own models, but they're constantly developing these, and these are just some of the new ones. I think they use OpenAI for a purpose; obviously they have a lot at stake in it, and those are great models. There was Phi. Yeah, there's Phi, all the way up to Phi-4. Yeah.
[01:46:02] And I think that's actually one of the points that Karen Hao made in her book Empire of AI. We're going to talk to her again on Monday, Labor Day, tomorrow, at 5:30 PM Pacific; if you're in the club, you can watch it. Her point is that this was Microsoft hedging its bets. Microsoft had its own good foundation models and kind of believed they were going to be successful,
[01:46:27] but putting a couple hundred million, billions actually, it was $10 billion over a few years, into OpenAI just seemed like a good... It's better than buying back the stock. That's right. Yeah. I mean, if you just go look at all the research papers coming out signed by Microsoft researchers, there are foundational models in there. So yeah, it's interesting. MAI-1-preview is a mixture-of-experts model trained on 15,000 Nvidia H100 GPUs.
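[Editor's note] For listeners unfamiliar with the term: a mixture-of-experts model activates only a few specialist sub-networks ("experts") per input instead of running the whole model. Here is a deliberately tiny sketch of that routing idea, with scalar functions standing in for experts and made-up gating weights; nothing here reflects MAI-1-preview's actual architecture:

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of gate logits.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def moe_forward(token, experts, gate_weights, top_k=2):
    """Score every expert, but run only the top_k best-scoring ones,
    combining their outputs weighted by renormalized gate probability."""
    scores = softmax([w * token for w in gate_weights])
    ranked = sorted(range(len(experts)), key=lambda i: scores[i], reverse=True)
    chosen = ranked[:top_k]
    mass = sum(scores[i] for i in chosen)
    return sum((scores[i] / mass) * experts[i](token) for i in chosen)

# Four toy "experts": each is just a scalar function standing in for a
# feed-forward block. Gating weights are arbitrary, for illustration only.
experts = [lambda x, k=k: (k + 1) * x for k in range(4)]
gate = [0.1, 0.9, -0.3, 0.5]

output = moe_forward(2.0, experts, gate)
print(round(output, 3))
```

The appeal is the compute saving: only `top_k` experts run per token, so a model can have far more total parameters than it spends FLOPs on for any single input.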
[01:47:00] Is that on Azure? Does Azure have access to that many H100s? That's amazing. Yeah, absolutely. Wow. And so this allows you to create expressive audio from text. In your spreadsheet, for instance. Yes, there you go. Does it? No, we don't use this one right now, but we do use other models. Like I said, it's not just GPT-5, or GPT-4, or whatever model we're
[01:47:29] utilizing at the moment, but they are, you know, the most foundational models there are right now. So that's why we're using them. What's the coding model you're using? Is it OpenAI? Is it GPT-5? Yep. Yeah. So let's see. Here's an emotive voice: Rain, she's joyous. Let's see, she's going to generate a little... Oh my goodness.
[01:47:53] "You know, I've been thinking a lot lately, and it just keeps popping up in my mind: to be or not to be. Isn't that such a big question?" Oh wow. Oh geez Louise, I didn't expect all of that. Okay, that's, I guess, interesting. Let's see, Oak can do it as a story. Let me see. "Tomorrow..." I'm giving them Shakespeare. Let's see how they do with Shakespeare.
[01:48:22] "And tomorrow and tomorrow," like this show, "creeps in its petty pace from day to day." Can't even imagine. This is going to be story mode; let's see what it does.
[01:48:49] "In the heart of a dimly lit tavern, two weary travelers sat nursing their drinks. A knight, noble and gallant, pondered the relentless passage of time. 'Tomorrow and tomorrow, it creeps in its petty pace, does it not?'" Yes, it does. Okay, thank you very much. That's Oak. We also have Acacia, and Acacia can be scared. Let's see what the scared voice sounds like. I like to play with these.
[01:49:19] I don't know what good they are. "To be or not to be. That is the question, isn't it? I mean, what if we choose to exist and yet face the horrors that life throws at us?" All right, enough of that. Anyway, that's MAI. She sounds very good. They all sound good. Yeah: Rain and Oak and Acacia. You know, it's funny, because, I don't know if you know this, but in PowerPoint today, they can
[01:49:48] generate narrations. From your slides? And you can generate videos right in there. And I think that Clipchamp also does this now, which I use on a daily basis. So if I use my own voice... Maybe if I use ChatGPT to generate the slides and then Clipchamp to generate the voice, I wouldn't even have to give the talk. I could just mail it in. Exactly right. Yeah. Just: hey, talk about whatever you want.
[01:50:17] I thought, Cory, you might have something to say about this. This has not happened yet, but Anthropic, as you remember, was sued over training on copyrighted works. And while the judge held for Anthropic in the case where they actually bought the books, scanned them, and destroyed them afterwards, there were pirated texts as well.
[01:50:44] On Tuesday of this week, Judge Alsup confirmed that the authors and Anthropic have a settlement in principle, and on September 5th they will file a motion for preliminary approval. Authors celebrate a historic settlement coming soon. What do you think? You're an author. Yeah. I've got a whole two chapters on this in my new AI book, which I finished this week.
[01:51:14] I know. You talked about it in Chokepoint Capitalism as well. That too. But I think it's important to break out what's going on here, in terms of what is being alleged to be a copyright infringement, what is probably not a copyright infringement, and what the court found wasn't a copyright infringement. I am not a huge fan of the AI companies; I should say that at the beginning.
[01:51:39] And I do think that the fact that the AI companies have made such a public show of trying to bankrupt creative workers by using their own materials is gross, especially when it is not actually a thing they're going to run a business off of, but just a way to do a flashy demo. Because again, you bankrupt all the writers, all the illustrators, all the songwriters, and then you get nothing. Nothing. Yeah.
[01:52:08] What are you going to read next time? What are you going to look at next time? So, as an advocate for creative labor rights, and this is Labor Day weekend, I think we should focus on the things that will attain labor gains for creative workers. And I think that not only will adding a new copyright to control the training of models on your works not help creative workers, it will be quite damaging to lots of other sectors.
[01:52:36] So I don't think it's a copyright infringement to train a model. And that's settled: it's fair use. Yeah. They purchased the books. Yeah. When you break it down, it's pretty clear why it's not. Right. Making a transient copy of a work for an analytical purpose is legal; that's how we have search engines. Making an analysis of a work is also not an infringement.
[01:53:00] And this is where I think it's important to note that I believe what Anthropic was worried about is not being told that training a model from an infringing copy of a work is an infringement. I think they were worried about being told that acquiring an infringing copy of a work is an infringement. So let me give you an analogy. You go to a flea market. You buy a pirate CD. You go home and you count all the adverbs in the lyrics on that CD.
[01:53:28] Buying the pirate CD might be a copyright infringement. Counting the adverbs isn't. Right. So I don't think Anthropic was worried about being told that effectively counting the words in those books was illegal. I think they were worried about $150,000 per work in statutory damages for downloading them in the first place. That's what I think they were worried about.
[01:53:56] So it's a victory for them in this case, because for the books that they bought, the judge said: no, that's fair use. Reading and absorbing those books was fine, because you bought them. Now, it's possible, and I think very likely, that you can make infringing works with a model. And to the extent that Anthropic hosts the model and encourages certain uses, or sells it for certain purposes, it will produce infringing works that are too closely related to the works it trained on.
[01:54:26] That's what the New York Times asserts in its lawsuit: that you wouldn't have to buy the Times, you could just ask the chatbot. Yeah. I don't think that's an infringement. I think the infringement is actually copying words and phrases. The facts in a New York Times article are not copyrightable, and it's not infringing to copy them. Right. And the New York Times summarizes stories from other newspapers all day long. You don't really want a world in which you can't summarize a news article without permission. That would be crazy. Right.
[01:54:55] There is a breaking-news doctrine, but it's very, very thin. So, to get back to how we save creative workers, because I do think the way these companies are operating is gross: the US Copyright Office is an agency that I've been on the opposite side of for this entire millennium. You've always said that copyright protects the big companies that own the copyrights, not the creators of the content. I have disagreed with almost everything the US Copyright Office has done since the turn of the century.
[01:55:25] And suddenly I find myself... Strange to be saying that now, isn't it? Yeah. But I find myself now on the same side as the Copyright Office, because the Copyright Office has said on multiple occasions, and gone to court and won, that there is no copyright in works created by AIs. Right. In fact, she got fired for saying that. No, she got fired for saying that it was fair use to train. Okay.
[01:55:52] And also because she's Black; they fired Carla Hayden, and that's really why she was fired. Yeah. She was very good. But the thing is, to use a little bit of international copyright jargon, copyright inheres at the moment of fixation of a work of human creativity. So a new copyright springs into existence when a human being does something creative and makes a fixed record of it. You don't have to make any legal filing to get a copyright; it's automatic.
[01:56:21] But you have to be a human. Monkey selfies: not copyrightable. Bots: not copyrightable. Slop: not copyrightable. So I think if we give creative workers the right to decide who can train a model with their works, we will make it hard to make search engines and to do computational linguistics. We will make new rules about who can publish certain things, because they're also saying that publishing the model should be illegal.
[01:56:47] And yet our publishers, the five publishers that control all the books, the four studios that control all the movies, and the three labels that control all the music, will just say: fine, as a condition of doing business with us, you have to sign away this new training right that the courts or the legislature or the regulator have just given you, so that we can go to Anthropic, we can go to OpenAI, and we can hit them for a fee to train.
[01:57:14] We can require them to put guardrails in the model so that only we can make works that are competitive with our own; no one will compete with us by making these works. And then they'll fire as many of those creative workers as they can, because the only thing that Getty Images hates more than paying its workers is not having a copyright. And so if Getty Images wins its lawsuit, it's not going to be good news for photographers.
[01:57:41] The only thing that's good news for photographers is the existing rule, which is that there is no copyright in AI-outputted images. And that is great news. It means anyone can take them and sell them, and it means that the only way to get a copyrighted work is to pay a worker to make it. What happens, though, if an AI, say Midjourney, which is being sued for this, creates an image of Darth Vader? That's infringing, because that's the equivalent of copying the text.
[01:58:08] It could be. If it were for a critical or transformative purpose or whatever, then it's not necessarily infringing, but yeah: if you barf up a whole Star Wars episode that's just a Star Wars episode, that's an infringing output, depending on the extent to which they were actually involved in the production of it, whether they incentivized people to make it. The mere use of AI itself is not necessarily transformative; it's not automatically transformative enough to qualify as fair use.
[01:58:37] Fair use is not merely about transformation. It's four factors, and then, like, the gestalt of them all. It's what they call a fact-intensive doctrine; it's very difficult to say definitively that something is or is not fair. But I'll tell you what: it's always legal to copy a public domain work, right? That's not a thing we have any ambiguity about. And so if you're Getty and you want to make sure that people can't copy your stuff without paying you, keep paying photographers.
[01:59:06] That is the thing we should all be keeping in mind, because that's what we want. You know, the day that Disney and Universal sued Midjourney, I got a press release, because I am, for some reason, on the RIAA's press list. That might have been Hilary Rosen's last act as she left. So I get this press release, and, you know, the record labels are not suing, but the record labels are the movie studios, right?
[01:59:34] Universal and Warner: two of the top record labels, and two of the top movie studios. The same companies. And Mitch Glazier, the CEO of the RIAA, who makes $1.8 million a year, and who went to work for the RIAA after he got fired from Congress for slipping a rider into legislation that took away musicians' copyrights... There's some irony there. No, no, and gave them to record labels. Oh, well, that's good. Okay.
[02:00:02] So he got fired, the RIAA hired him, and now he makes $1.8 million a year. So this multimillionaire writes to me, through the good offices of his press office, and says: we are only angry that Midjourney didn't approach these companies for a license. He's not saying we don't want AI to produce videos that compete with Universal's, or with the livelihoods of the workers who work there.
[02:00:32] He's saying that they should pay Universal and Warner, not the workers, but Universal and Warner, license fees, so that Universal and Warner can use these tools to fire their own workers. Yeah, because they want to do it themselves. They would like to. Yeah, that's right. And so once you understand that, once you understand the material basis for the AI bubble, these copyright questions become less difficult to parse through, and we can stop worrying about, like, oh, did someone scrape something?
[02:01:01] I mean, scraping stuff is good, right? Scraping is how that kid in Austria figured out that the two major grocery stores were colluding to rig prices: he scraped their prices every week until he found that they were raising and lowering their prices in tandem. Scraping is how search engines exist. Scraping is how you get the version of the website from before the company fired the guy who turned out to be a pedophile, where you find his bio and they say: we love this guy, and it's great that he's working with children on our behalf, right?
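[Editor's illustration] The analysis behind that Austrian grocery story is simple once you have the scraped data. Here is a hypothetical, offline sketch of the lockstep-pricing check: a real version would scrape each store's site weekly, while this one hard-codes invented weekly price histories (in cents, as integers, to avoid float comparison issues):

```python
def lockstep_items(history_a, history_b):
    """Return items whose week-over-week price changes are identical at
    both stores. A crude collusion signal, not proof of anything."""
    suspicious = []
    for item in history_a:
        if item not in history_b:
            continue
        # Week-over-week deltas for the same item at each store.
        deltas_a = [b - a for a, b in zip(history_a[item], history_a[item][1:])]
        deltas_b = [b - a for a, b in zip(history_b[item], history_b[item][1:])]
        if deltas_a and deltas_a == deltas_b:
            suspicious.append(item)
    return suspicious

# Invented three-week price histories, in cents.
store_a = {"milk": [109, 119, 109], "bread": [249, 249, 259]}
store_b = {"milk": [115, 125, 115], "bread": [239, 249, 249]}

print(lockstep_items(store_a, store_b))  # → ['milk']
```

Milk rose and fell by exactly 10 cents at both stores in the same weeks, so it gets flagged; bread's changes don't line up. The point stands either way: none of this is possible without the right to scrape the public prices in the first place.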
[02:01:31] Without scraping, we lose our ability to know anything about things on the internet, unless the people who said them in the first place agree that we should know them. Scraping is amazing. So let's not get rid of scraping; let's protect workers instead. One more break, and then we will finish this, and I'm going to desperately find a story for you. I feel so bad: unfortunately, all the stories have been in Cory's ballpark, and he's
[02:01:58] right there in his ball yard. But we'll find some more to talk about. It's so nice to have you here. Anyway, Lou, it's great to see you, and I know you're enjoying it. I am. Yeah, I've got a big list of book stories. I know, me too, me too. Everybody, by the way, should absolutely go to disin-shitification.org. And I see you've already raised more than enough to do an audiobook. Oh yeah. Yeah, no, it's good.
[02:02:28] Although a lot of that is for pre-sales of the print book and the ebook. Oh, okay, so it's all one. It's all one, yeah. And I think there are still probably a few copies left: I make these weird collages for my blog, and I made a very high-quality, very limited edition art book of my illustrations, and there are 15 copies left. Ooh. I'll have to pick that one up. Yeah. People might look at Cory's pluralistic.net and say: oh yeah, AI.
[02:02:57] But no, these are manual collages, made with the lasso tool and GIMP. And that's what's fun about them: he puts a lot of interesting effort into these, and always uses, of course, public domain artwork. And CC images. Yeah, and CC images, because he's a good man. I also love, and I don't know if people know about your Tumblr, the images from the atomic age. Yeah. Love that. I'm, like,
[02:03:25] the last ardent Tumblr user: me and, like, 50 weirdos. What? I don't know how you have time to do that. You just post anytime you see something that comes along? It's meditative. It's just meditative. Yeah. Okay, so you're not actively looking for stuff. I'll send you a blog post I wrote about it, called Divination, about how it's like laying out a tarot deck: you take these images that come over your transom and then you try to make stories between them. Oh, that's cool. And I often get all kinds of weird little inspirations.
[02:03:54] I actually do that. I have a random number generator in my Obsidian, and I have it generate a random verse from the Dao De Jing every day. And it's almost always on point. Do you know the Oblique Strategies deck, Brian Eno's Oblique Strategies? Yes, yes, I use that all the time. Love that. "Be the first person to not do something that no one else has ever thought of not doing before." That's my motto.
[02:04:24] It's just tapping into that part of your brain, I guess. Can you still buy the cards, I wonder? Oh yeah. Well, I have a limited edition deck, but I think you can just buy a regular deck. Oh, and there's a website that just randomly deals them. Yeah: "Do something boring." "Overtly resist change." He brought these into the studio when recording Talking Heads albums and Roxy Music
[02:04:52] albums, and they used them between takes and stuff. He's amazing. Yeah. In his last music video, he's in the opening scene reading Red Team Blues, my novel. No kidding. Congratulations. Wow, that's an honor. That's fantastic. That's very cool. Our show today brought to you by Zscaler. Let's talk about cloud security, and probably the gold standard in cloud security: zero trust. And let's talk about AI.
[02:05:23] That's what Zscaler does. AI is a really interesting issue for businesses, because on the one hand, the bad guys are using it to attack you with more vigor and more efficiency than ever before, right? They are just miles ahead now: more relentless and effective attacks than ever before. But you're probably using both public and private AI in your business. It powers innovation.
[02:05:51] It drives efficiency. Organizations from small to large are leveraging AI to increase employee productivity: public AI for engineers with coding assistants like Copilot, marketers with writing tools, finance using AI to create spreadsheet formulas. And we know which spreadsheet they're using to do that, right? AI is being used to automate workflows for operational efficiency across individuals and teams.
[02:06:18] Companies are embedding AI in applications and services that are customer- and partner-facing, which helps them ultimately move faster in the market and gain competitive advantage. But you know who else is moving faster in the market? Hackers. AI is helping them too. Phishing attacks over encrypted channels increased 34.1% this year. Why? Primarily the growing use of generative AI tools.
[02:06:45] Phishing is so much more effective nowadays, because you can't just look at a phishing email and say: well, that's a grammatical nightmare. No, they're all perfectly written. They're very real-looking. They look exactly like the AI emails you're getting from real companies. So what are you going to do? Companies have to rethink how they protect their private and public use of AI, and how they defend against AI-powered attacks. And there is a way.
[02:07:09] Jeff Simon, Senior Vice President and Chief Security Officer at T-Mobile, talks about Zscaler. He said, quote, Zscaler's fundamental difference from the technologies in the SASE space is it was built from the ground up to be a zero-trust network access solution, which was the main outcome we were looking to drive. Zero-trust is so valuable because it goes a step beyond the traditional perimeter defenses we've relied on for decades.
[02:07:36] You know, firewalls, then you have to have a VPN to get through the firewall, and then you've got a public-facing IP, which is exposing you. And you're, you know, you now have an attack surface. And of course, that's exactly what these hackers using AI are going to latch on to. But that's the beauty of zero-trust, a modern approach with Zscaler's comprehensive zero-trust architecture and AI. You're ensuring safe public AI productivity. You're protecting the integrity of your private AI.
[02:08:04] And you're stopping AI-powered attacks cold because even if they penetrate your perimeter defenses, they can't do anything. That's zero-trust in action. Thrive in the AI era with Zscaler. Zero-trust plus AI to stay ahead of the competition and remain resilient as threats and risks evolve. Learn more at zscaler.com slash security. Zscaler.com slash security. Thank you so much for supporting us.
[02:08:33] Your wife just wrote a book that's exciting. Yes, she did. She did. What is she? Is it a novel? It is. It's a historical fiction. It's about the 1982. My favorite genre. I love historical fiction. She just put it out and she's gotten a lot of great reviews on it already, too. So it's a great book. It's really fast read. So what era are we in? It's about the 1982 Tylenol murders. Oh, wow. And it's a suspenseful novel about that. And she, you know, she wrote, obviously, it was funny.
[02:09:02] They never figured out who did those. They never figured it out. There's actually a Netflix series that just came out about it. And it came out the same time, like coincidentally, it came out at the same time. But hers is obviously a fictionalized version of it. But it's great. Is this her first novel? It's her first novel. Yep. Well, that's wonderful. Congratulations. It's called Absolution. Absolution. And it is not Mrs. Maresca. It is E.E. Lawson. This is her pen name, yes. Yes. So you're outing her now. She also did Gimpy and did the cover herself as well.
[02:09:31] So there you go. Oh, that's a beautiful cover. I love it. I love it. Oh, how nice. Fantastic. Congratulations. Debut novel. And it's on Kindle Unlimited. Now, did they talk her into that or did she agree to it? No, no. She has a bunch of author friends and her mom's a book editor as well and just suggested that's a good place to start. That's one of the things I've been reading about that I don't like about what Audible does with authors.
[02:10:00] You've resisted this successfully, Cory, but they put a lot of pressure on you to be in the Amazon Audible Originals pile. But what I just read, which is disappointing, is that the amount of money authors get from Audible, whether they're in the Originals or not, it's not Originals, Audible Plus, whether they're in Plus or not, which is the same thing as their Kindle Unlimited, is dependent
[02:10:28] on what the authors on Audible Plus make. Mm-hmm. And because there is now a lot of AI slop in Audible Plus, authors are getting less and less money from Audible as their royalties. Have you heard this story, Cory? Well, it's a continuation of a long series of bad practices that I think we talked about when Rebecca and I were on to talk about Chokepoint Capitalism.
[02:10:57] Audiblegate was the kind of granddaddy of these scandals. Audiblegate was an accounting trick they used that stole $100 million from Audible independent authors, ACX authors, where they were effectively encouraging readers to return books as unsatisfactory, even if they liked them, because then they didn't have to pay royalty.
[02:11:24] So you would buy a book and then Audible would pressure you to give it back and say you didn't like it. And then Audible wouldn't have to pay the royalty. And they were just showing authors net sales, not gross sales with the outgoing sales and returns broken out. And what had happened over the years is they'd found, they'd played with the UX to find different ways to say, do you want to return this book and get your credit back and get another book this month? And they'd gotten better and better at doing that because they'd just done a lot of A/B split testing.
[02:11:52] And so authors just found their revenues falling and falling and falling. And then there was a glitch and they showed a complete report where they showed the outgoing and incoming. They sent it to all the authors. And it turns out that there's like crime authors who are retired forensic accountants who really like to dig through those things. And it's a hundred million dollar theft. Uh, and you know, I think to understand why this is going on, it's not that they used
[02:12:19] to like writers back when they gave them better deals. I mean, you probably remember when, um, uh, Hugh, uh, the guy who wrote Silo, Hugh, um, gosh, I can't remember his name. Uh, the guy who wrote Silo, you know, he's getting... Hugh Howey. That's right. Hugh Howey was getting really good deals from Amazon Kindle and they were treating him really well. And he wrote these editorials about how this was a much better experience than he'd ever had
[02:12:46] as a, um, as an author, uh, working with major publishers. Uh, and the reason they got worse is because they became the only game in town. They, they had their writers locked in. Uh, this goes back to what I was saying about, about Azure and lock-in: like, even if you love the products and even if you think they're great, if you want them to stay great, right? If you want the people who are, if you work there, if you want the people who are colleagues in the firm,
[02:13:11] who have good ideas about how to make the product better and not make it worse at the expense of the, uh, customers, to win those arguments. Uh, and if you are one of those customers, if you want to make sure that you still have good products, you need to be able to leave. You know, the Microsoft, or the Google antitrust case, the first of the three that they lost over the last two years, a lot of it turned on this, uh, interpersonal conflict that was revealed in the memos that were published during the trial, um, between, uh,
[02:13:40] Prabhakar Raghavan, who was the head of revenue, and, uh, Ben Gomes, who was the head of search. And, uh, Google had tapped out growth on search because they had a 90% market share. Uh, so they weren't going to grow anymore. And so growth was flatlining, and Raghavan's idea was to make search worse, right? To take out things like auto-correcting spelling and, um, uh, searching for synonyms in the background when you searched for a search term, uh, and so on.
[02:14:07] That's the piece by Ed Zitron. Um, he decided to, he, he militated for making search worse so that you'd have to search two or three times to get the search result you were looking for. He was made head of search and he won the argument, right? Well, why did he win the argument? Well, he won the argument because the management team looked at this and they said, if we make search worse, we won't lose customers because we have a 90% market
[02:14:32] share and we're the only search engine anyone's ever heard of. So if you want, if you want to make sure that when that happens in your boardrooms at the companies you work at, you can get up and say, I love this product I work on and I'm glad I work on it, you want your customers to be able to leave. Because if they can't, the worst person in your company is going to end up taking it over and then enshittifying every product that you care about, that you missed your mother's funeral for, and you didn't go to your kids' Little League games so you could ship on time.
[02:15:00] Although, as you pointed out in your talk at CF, it had a negative consequence in the long run for Google because they suck now. And there are alternatives. Yeah, but they're more profitable than ever. They're more profitable than ever. Yeah, that's true. At least for now. But people are still locked in. They, you know, like even... Don't people, people are starting to realize that Google sucks and there are other...
[02:15:27] I pay money for Kagi because it's so obvious Google is terrible. I use Kagi too and I pay for it. But, you know, even paying for Kagi, like I can't... On my Android phone, I can't make the search box on the home screen a Kagi search box. Yeah. And everything defaults to searching Google, right? I mean, like there's just so much lock-in. I, uh... That's one of the reasons I never use Google's launcher on an Android phone.
[02:15:55] I always use a third-party launcher because I hate that search pill. Yeah. Yeah, you can't even move it off the home screen. No, it's permanent. Yeah. I refuse to use their launcher. All right. I told Lisa, better get all your stuff from China now because many countries, including China, are no longer mailing things to the United States.
[02:16:19] The de minimis exemption, which said there are no tariffs on anything under $800, has now expired as of Friday. Uh, this has expired and come back several times, but this is... Assuming this continues, there are about 4 million packages a day which will suddenly be tariffed, depending on what country it's coming from.
[02:16:46] And because it's unknown and because it's been up and down, mail agencies in many countries are now saying, yeah, we're not going to take mail for the United States. So if you're buying something on eBay or Temu or Shein that is coming from outside the U.S., watch out.
[02:17:07] One, according to the story in Quartz, one company that will be particularly affected by all of this is Etsy, which I didn't realize. But Etsy, I guess a lot of stuff from Etsy comes from overseas. Sellers sold $10.9 billion in goods on Etsy last year. 30% of them export their goods outside their home country.
[02:17:36] So this will be interesting. Already Temu and Shein are playing games, trying to, you know, pretend it's drop shipping from the U.S. and all sorts of things to get their stuff in there. Swiss Post said starting August 26th, it will temporarily not accept postal goods intended for the U.S. Japan Post suspended its shipments to the U.S. from August 27th. India, August 25th.
[02:18:05] The United Kingdom's Royal Mail says it's using a postal delivery duties paid service, which it said meets the export requirements. But it means that duties will be calculated and then added, I guess, to the cost of the goods. 88% of the de minimis shipments come through international mail.
[02:18:31] So I don't know what there is to say about this, but that's going to happen quick. And it doesn't apply to China. Oh, yeah. By the way, China's exempted, which is weird. Although it's a deal. They worked out a deal. This is what we call the art of the deal. Yeah. But who gets screwed by the art of the deal? I don't know. Did you buy me Trump coin? I did not.
[02:18:59] Taiwan's Postal Service suspending shipments. Australia. Leo, there is a tip jar in the Oval Office. If you wanted that, we all know how to get the service you want. I just need a little gold bar with a little, some Corning glass on top of it. Yeah. Apple will announce its new iPhones in a week. And everybody's saying, oh, I wonder if they'll raise the prices. But they have an exemption right now. It's not so much that.
[02:19:27] I mean, it's bad enough that tariffs exist. But they're so off and on that people aren't sure what's going to happen. And that's why these postal services are saying, well, we don't know. Can we? Should we ship it? What are we going to do? Well, when Trump brought down the first round of tariffs, there were container ships at sea. Yeah. Billions of dollars' worth of merchandise in them that all got sort of stranded. Right.
[02:19:57] Like, that's what you don't want. And so it makes sense that they're walking away from this. Yeah. As well as wind farms and a bunch of other things. All right. Let's talk about something happier. Yes. You used to be a big Lenovo fan. I know, Cory, you would immediately put Linux on your Lenovo box. You're using Linux now to talk to us. Amazing. How's it working? I love it. My Framework is the best computer I've ever owned. That's right.
[02:20:27] You switched to Framework. In fact, they just announced an update to their 16. So they just announced a whole bunch of updates. Everyone's really excited that their gaming laptop can do hot-swap GPUs, which is great. The thing that has me over the moon is they say they're going to do a track nub for their keyboard. You miss your old Lenovo. Oh, my God. I want a track nub so bad. I want a track nub and three hardware buttons. And I would sell my child for it. What?
[02:20:53] You're the only person I've ever met that likes that little nub in the middle of the keyboard. You use that instead of a track pad? God, I love it. Yes, yes, yes. And then three hardware buttons. The idea that you've got to guess where the left button is. Oh, I know. That drives me crazy. Then you accidentally get a middle button and close your tab instead of popping down a pull-down menu. It's the worst. It's absolutely the worst. So, yes, I really desperately want that track nub to come.
[02:21:22] And it can't come a moment too soon. The beauty of this is probably you could just swap out your existing keyboard. You'll just swap out the keyboard. Yeah. Because I've worn out like three Framework keyboards already, you know. Normally, in my days of buying a ThinkPad every year, I would wear out a keyboard sort of in month eight or so and replace it. And then, you know, by month 12, I'd be shopping for a new computer. But now I'm really excited about being able to just hot swap and get the nub.
[02:21:51] I have to say, though, that the Framework CEO says they can't. They haven't been able to find a short enough nub. But they say they've got it prototyped. Can you just shave it down a little bit? Yeah. Yeah. It pokes the screen, right? And of course, that's not what you want. And during the first round of tariffs, they stopped shipping a bunch of stuff to the U.S. So I'm waiting for my Framework Desktop, which doesn't have three buttons and a nub. But I guess I get a keyboard that does.
[02:22:20] You just get a keyboard and a mouse. But yeah, I figure worst case scenario, I'm going to be in Germany, Canada, the U.K. and Portugal on tour. And worst case scenario, I can get them to, like, FedEx it to a hotel and get it on the road. Obviously, they listen to you because they added the nub because of you. I can't imagine there's anybody else saying that. No, no, no. The forums are just full of people going, oh, my God, give me a nub and three buttons. Really? Yeah. Shoshana, do you even know what we're talking about?
[02:22:50] No, but it's funny because I made friends with the Adam Smith Institute and there was once a dress I really wanted and they didn't ship it to the U.S. So it's like, can I please have this dress sent to you guys and you'll send it to me? They're like, yeah, sure, we'll do it. And it was great. I got the dress. She's drop shipping to the U.K. Now, when I lived in the U.K., I had a Japanese toilet freight forwarded from a freight forwarder in North Carolina. I had it shipped by Toto and then freight forwarded to the U.K. This is one thing.
[02:23:18] One lesson of all of this is that, as you said, nature finds a way. People find a way. They will get around all these restrictions. They'll arbitrage it. They arbitrage it. Lou was trying to say something. No, I was going to say my ThinkPad still has the nub. Oh, he's got a ThinkPad still. ThinkPads still have them for sure. I love it. Yeah. You use it too? I use the nub. Yeah, I love it. But I'm with Cory.
[02:23:46] The trackpads, their accuracy of click is just a pain in the neck. I just hate it. It seems so fussy to move the nub to get the mouse to move around. It's like... And twisting your hand this way to get at the trackpad is like it's an RSI waiting to happen. When you're like this, you're typing, and then you reach your finger out and you do the nub, you're not twisting your hand back in on itself in order to do a bunch of fine motions. Okay.
[02:24:13] I obviously never gave the nub enough time to really get used to it. Anthony, did you generate this on the fly, the White House tip jar? Yeah. That looks exactly like the tip jar we used to have in the studio. Yeah, that's Nano Banana. Nano Banana. So everybody's excited about this new AI model from Google. They actually ran it in stealth for a while under the name Nano Banana. It takes a still and modifies it.
[02:24:44] Like I could take this picture of the four of us and put cowboy hats on all of us. I'm sure somebody will do that right now for us, right? Have you played with it? Oh, here's, by the way, here's my tip jar. In case you're wondering, this is not AI. That's real. There's not a lot of money in that tip jar. Actually, we used to have a digital currency tip jar on the website.
[02:25:12] There's 7.85 Bitcoin in there, but I neglected to write down the password. Yeah, you laugh, but that's almost a million dollars in Bitcoin that I cannot access. The tip jar turned out to be a great idea. Can you imagine having a million dollars in your tip jar and you can't get it out? Why don't you just call the Bitcoin headquarters and ask them to do a password recovery with you? Yeah. Help me. My son keeps saying, hey, I saw this guy on YouTube that can do it.
[02:25:42] I said, yeah, I'm not giving this wallet to some guy on YouTube because I'll never see the, it's too much money. He's never going to, he's going to say, oh yeah, I couldn't get in. Here's your wallet back. Oh, it's empty. What happened? All right. Anyway. Ah, be very careful on X. If you live in Stuttgart or anywhere in Germany,
[02:26:06] an economist, Thomas Vierhaus, has been fined 16,000 euros for being sarcastic on X. Oh, great. Okay. He's being, he was, he's being prosecuted under section 185 of the German criminal code for three separate X posts,
[02:26:32] or, as you call them, shits, that government officials and journalists deemed insulting. The total penalty from the Düsseldorf district court, August 12th: 16,000 euros. The first incident: in June 2023, the vice president of the Bundestag and a leading figure in the Green Party had issued an alarmist post about climate change.
[02:27:25] The second charge came after Vierhaus discovered the identity of the same judge's doctoral student and referred to him online as a little snitch. Oh, no, no, no. The student filed a personal complaint, claimed an insult, and the court held. Yeah, yeah, yeah. The third post got him in further legal trouble when he criticized a journalist for ARD. The fascist party in Germany? AFD. Oh, this is a different one.
[02:27:54] ARD is not AFD. Sorry. It's a newspaper. Yeah. It's a newspaper. Uh, Moritz Rodel had defended the economic minister's decision to skip a Bundestag debate on Germany's heating law. Vierhaus responded with a post calling Rodel a nincompoop. Is that, is that a German word? You still have a lot to learn in order not to be constantly fooled. That's all he said. 16,000 euros.
[02:28:24] Now I think R Street should get on this case. So I guess you don't care what happens in Germany, right? No, we, we've been looking at, at a lot of stuff in Europe. I mean, the UK and Germany in particular have been really, really bad with this stuff. Uh, part of it is like, we don't have like influence with, with government there, but it's something that I want to take into account to remind people like how laws will be abused. Well, also they don't have a First Amendment, right? I mean, there's no. Yeah.
[02:28:53] But I thought they had stronger free speech protections than this. Like I really, from my understanding, I thought it was stronger than this, but I guess not. I guess not. But it's, it's really atrocious to see this stuff happening. I don't know. It's, it's really horrible, but I try to, to the best of my abilities, use this to remind people in the U.S.: don't do stupid crap like this and create terrible regimes where you can be penalized for free speech. 'Cause this is what's going to happen here. You know, what is German for nincompoop?
[02:29:23] Did you do this with Nano Banana, Lou? I did. Yes. So how, what's the process? Is it a text prompt on top of a real picture? What is it? It is. You just upload a real picture and then put a text prompt on there. And the thing is, it doesn't regulate right now on copyright. So you, I mean, I don't know how they got past all of this, but that's your picture. Yeah. Yeah. Yeah. Usually you can't do that, huh? Could you, uh... If you go, if you dig into the pixels too, it's, it's identical
[02:29:53] to my image. So they, they didn't do anything to it. Was your picture in, did you provide your picture to them? I did. Yep. Okay. So the, the desktop background is the, is the thing that is all generated. Yeah. That's all generated. It's still pretty good. I like it. I like the bokeh. Did you tell it to do bokeh or no? I did. I told it to do that. Nice. It's impressive. It's fast too.
[02:30:18] I think, you know, obviously OpenAI's, like, takes a couple seconds. You know, a lot of them take a long time. Uh, is it in Gemini? Do you just go to, do I just go to Gemini? Yeah. They offer you a certain amount, uh, even if you don't have Gemini Pro, but the 2.5 model will do it. I think, uh, I think I ran out of credits, as I remember. The Leo at twit.tv account has Pro, just FYI. How do you know that?
[02:30:49] Well, 'cause have you been using my Leo at twit.tv? No, no, no. Well, I think the, the whole TWiT, uh... Oh, we all have Pro. Yeah. Do we buy Pro for everybody? I also bought Pro for my personal account. So maybe now I'm overdoing the Pro. I have too much Pro. So, uh, okay. Try. It says try image editing with our best image model, Nano Banana. And it gives you the little instruction here: upload an image and describe your edits. Okay. That's cool.
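For listeners who want to script the same "upload a picture, describe your edits" flow Lou walks through, here is a minimal, hedged sketch against the Gemini REST API. The model id (`gemini-2.5-flash-image-preview`), the endpoint shape, and the `GEMINI_API_KEY` variable are assumptions drawn from Google's public API docs, not anything stated on the show; check the current docs for the model name your account offers.

```python
# Sketch: edit a photo with Gemini's image model ("Nano Banana") via the
# generateContent REST endpoint. Assumes a GEMINI_API_KEY env var and a
# local group_photo.png; model id is an assumption from public docs.
import base64
import json
import os
import urllib.request


def build_edit_request(image_bytes: bytes, mime_type: str, instruction: str) -> dict:
    """Pair an uploaded image with a plain-text edit prompt as a
    generateContent JSON body (image first, then the instruction)."""
    return {
        "contents": [{
            "parts": [
                {"inline_data": {
                    "mime_type": mime_type,
                    "data": base64.b64encode(image_bytes).decode("ascii"),
                }},
                {"text": instruction},
            ],
        }],
    }


if os.environ.get("GEMINI_API_KEY"):  # only hit the network when a key is set
    body = build_edit_request(
        open("group_photo.png", "rb").read(), "image/png",
        "Put cowboy hats on all four people",
    )
    url = ("https://generativelanguage.googleapis.com/v1beta/models/"
           "gemini-2.5-flash-image-preview:generateContent"
           f"?key={os.environ['GEMINI_API_KEY']}")
    req = urllib.request.Request(
        url,
        data=json.dumps(body).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        result = json.load(resp)
    # The edited image comes back base64-encoded in the response parts.
    for part in result["candidates"][0]["content"]["parts"]:
        if "inlineData" in part:
            with open("edited.png", "wb") as out:
                out.write(base64.b64decode(part["inlineData"]["data"]))
```

The request-building helper is the part worth noting: the API takes the source image as inline bytes alongside the text prompt, which matches the web UI's upload-then-describe flow.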
[02:31:17] And it's using Flash, but could you, could you do it with Pro as well or no? Yeah. I think it just routes it to the model, regardless of which one you choose. Yeah. That's fine from Flash. Okay. Well, I've got a show to do. Unlike all of you, I gotta find other things to talk about. Google has, uh, locked down Android a little bit. One of the things that was, uh, you know, where Google had a real advantage, I think, over
[02:31:45] Apple, uh, even though they still have been attacked by Epic and others, is that you could at least sideload third-party apps. Google would warn you like crazy and all that stuff. Well, they've started to lock it down. Now they say you'll have to verify your identity, even if you're distributing an app outside the Play Store. Um, you gotta submit your information to an Android developer console. I guess it makes sense from a security point of view.
[02:32:13] Um, on the other hand, uh, one of the things that's cool about sideloading is you can install anything, right? And it's on you to, to keep, you know, make sure it's secure. Uh, and so this is now more gatekeeping from Google, if you ask me. What do you think, Lou? Yeah. I think sideloading is really only, to me, good for developers who want to do testing. Um, you know, obviously there's the F-Droid store. There are all these third-party stores. Yeah. But the problem I have is you just can't trust them.
[02:32:43] I think that's the biggest thing. Obviously Google feels like that's part of the problem. Yeah. Right. Yeah. Yeah. Um, actually, go ahead. No, I was going to say, even back in the day, uh, you know, technology-wise, you know, most stores, even like the Windows Store, they're requiring you to sign your applications via, you know, a glorified signing, uh, you know, provider. You know, so they, they want to make sure that obviously everything's safe. And I think, like you said, with these third-party companies and sideloading, you just can't, you can't verify it. Yeah.
[02:33:11] So, uh, I mean, I, I think that, um, it's true that Google goes to great lengths to keep its users safe from people who aren't Google, but Google will not keep you safe from Google. And Google has done things like surveilled its users and lied to its users and defrauded its advertisers and defrauded its publishers. And, uh, I, I also think that the important thing to note is that the reason Google is doing
[02:33:37] this is because a court found them guilty of having an illegal monopoly and ordered them to open up the app store in the Epic case. And this is not even malicious compliance with that court order. This is malicious non-compliance, very unlikely to survive the court challenge that's inevitably going to come as a result of this, because they were not ordered to make developers apply for permission. They were ordered to open up the app store. Uh, and, you know, I do think we have mechanisms for addressing software vendors who sell malicious
[02:34:07] or substandard products, which is things like liability. Uh, but the idea that we just trust the benevolent dictator to be benevolent, you know, we are living at the tail end of a 20-year experiment in that, and it failed. Uh, and once again, if you want Google to be very good at curating its store, uh, and making sure that no one ever sideloads an app, then just make Google really good at it. And the way to make Google really good at it is to make sure that people have a choice
[02:34:35] so that they have to be better than the alternative. Um, I, I just don't buy that, on a computer that I own, Google should be in charge of what I install on it. And moreover, given that I bought the computer on the condition that I'd be able to choose what I install, the idea that Google is going to do an after-the-fact software update that takes away a feature I paid for, I think it rises to the level of fraud. But finally, Google is not validating apps. Google is validating developers.
[02:35:02] And so Google is making no guarantees about the quality of an app, right? There are people who sell small items on eBay for five years, and they have five-star ratings and a hundred good ratings. And then they list 25 laptops that don't exist, get a thousand dollars for each of them, and disappear. The fact that Google knows a developer is good because they posted some good apps tells you nothing about whether the next app that developer posts is going to be any good. So I don't think they're making a safety guarantee either. Right.
[02:35:33] Well, what, why are they doing it then? They're doing it to, um, maliciously non-comply with a court order in the, uh, antitrust case they lost to, to Epic. We are still waiting for Judge Mehta to decide what he's going to do to Google. He should, it should be soon. It was supposed to be a while ago. I think it was supposed to be by the end of August. Yeah. So, so he's got, he's got, uh, five hours. Uh, among the things he can choose, of course: forcing them to, uh, divest Chrome.
[02:36:03] Yeah. Yeah. Uh, I like, I hope they get a data deletion order. I hope they're ordered to delete user data. Um, that would be interesting. But it might happen in the other case. They've lost two federal, they've lost two government cases and one private case. So that might happen in one of those cases. One of the other things the Department of Justice is asking for is, uh, for Google to, um, open up its algorithm, which Google says, that's just, that's not going to end well. I think they're probably right. I don't know.
[02:36:32] I mean, we, we say security through obscurity is bullshit everywhere. That's true. That's true. Where it's like, well, if I told you how the search worked, it wouldn't, you'd figure out how to spam it. So when no one has ever said that about cryptography, uh, which defends things that are a lot more important than search. Although, I mean, one of the big fights Google's constantly making is against, uh, spammy sites and spammy links, people trying to game the algorithm. Sure. But cryptographers have to contend with governments trying to break cryptography.
[02:37:01] I mean, it's, are you saying that spammers are a harder, uh, adversary to cope with, and therefore our security measures should be more forgiving? You know, there are a lot more of them than there are governments. There are a lot, uh, there are a lot more people trying to crack crypto than there are people trying to game search. Interesting. Yeah. Okay. All right. One last ad and I have a, actually a host of stories and an invite for, uh, everybody, thanks to, uh, Cory. But first, a word from Smarty.
[02:37:31] Now that's a good name. Smarty is the, they are our sponsor for this segment. They are the premier provider of high-performance cloud-based address data tools, trusted by leading organizations in pretty much every industry: insurance, real estate, finance, healthcare, e-commerce, of course, technology in general. With more than 20 million non... this is an interesting stat. I had a great conversation with these guys. There are in the United States more than 20 million non-Postal Service addresses.
[02:38:01] And then how do you deliver something to them? Smarty. Smarty's database expands beyond rural limitations. From U.S. and international address validation to autocomplete, rooftop geocoding, and property data enrichment, Smarty offers the most advanced, scalable solutions on the market, backed by proprietary technology and data. Smarty delivers unmatched accuracy, consistency, and insight. Smarty.
[02:38:29] Every property has 350, more than 350 data points associated. Rooftop level precision, very handy if you've got to know where the door is that you're going to deliver that package to. Real-time processing speeds exceeding 25,000 addresses per second. They can handle your traffic. Smarty enables better decisions no matter the size and complexity of your business. Smarty. And just look at the Smarty website, smarty.com slash twit. You'll see how many companies are relying on Smarty.
[02:38:59] I'll give you an example of Fabletics. They wanted to go international, but they had a problem with conversion rates internationally. They've drastically increased conversion rates for new customers, especially overseas, thanks to Smarty. Built for developers and engineered for scalability, Smarty's APIs are reliable. They're lightning fast. They're easy to integrate, and they're supported by expert documentation and real-time technical support. I'll give you another example. Speedway Motors.
[02:39:27] Their e-commerce conversion rates were going downhill despite increased traffic to their website. The director of digital product and technology says, quote, we use Smarty to identify an address as a commercial address. Conversion rates, very strong now. Everything was well set up on the Smarty side. I've really enjoyed working with the service. Developers love Smarty. It's a great API. It's fast. It's easy to implement. Smarty is a 2025 award winner across many G2 categories.
[02:39:57] Best results, best usability, users most likely to recommend, and high performer for small business. And of course, Smarty is also USPS CASS and SOC 2 certified and HIPAA compliant. If your organization is seeking the most accurate, reliable, and future-ready address data suite, Smarty is the way to go. Try it yourself. Get 1,000 free lookups when you sign up for their 42-day free trial. Visit Smarty.com slash twit to learn more.
[02:40:26] That's Smarty.com slash twit. We thank them so much for their support of This Week in Tech. Smarty. South Korea has banned cell phones in all middle and elementary school classrooms. We just, in Petaluma, the Petaluma School District just decided to do that. All the students have a little bag. They lock their phone in at the beginning of the day and then they unlock it as they leave.
[02:40:54] This is spreading across the nation. The scourge of cell phones finally eliminated from our nation's classrooms. You're smiling, Cory. I don't know. I mean, I'm of two minds about it. My colleague Hayley Tsukayama has worked a lot on this. My impression, reading Taylor Lorenz on the subject, is that the research on phones and
[02:41:21] classrooms and their impact is not great, because it's trying to disentangle the phones from a lot of other stuff that's going on at the same time. And the outcomes when the phones leave the classroom have not been well validated. I can understand, though. You don't want kids playing, you know, Flappy Bird during economics. And I teach a lot. I'm a new visiting professor at Cornell and I'm going out there for 10 days
[02:41:51] to go and teach a bunch of classes. And I've taught classes where there are kids who are, you know, playing on their phones, and it's no fun. I've also taught classes where there are kids who are just asleep, and it's no fun. Right. You don't have to be on a phone. Absolutely. My concern about the phone stuff is not that teachers have it too easy and we shouldn't make it easier for them to control their classrooms.
[02:42:19] It's that the means by which you control phones in schools involves a lot of invasive fallout. And it's also one of those things where you swallow a spider to catch a fly, right? There is such a thing as a burner phone, and the fact that the kid handed you a phone and let you lock it up doesn't mean that they don't continue to have a phone. And I worry that this just escalates into a bunch of stuff that is to one side
[02:42:45] of blocking kids from having phones, like searching kids and searching their bags. Right. And I worry that bag search and kid search is also a pretext for a bag search and kid search of whoever the principal doesn't like, or the school authorities don't like, singling them out for more searches. I know that wherever we create a greater latitude for disciplinary action, it tends to be selectively enforced, and it tends to victimize the people who are already struggling. So I don't know.
[02:43:15] I mean, my parents are both career educators. My brother's an educator. I do a lot of guest lecturing. It's not an easy job. I just would not be surprised if in a year we heard a bunch of stories about phones getting hacked, phones getting stolen, kids being searched because they're carrying burner phones, blah, blah, blah. A bunch of stuff that is a fairly foreseeable outcome. And this is a theme we've talked about during this show, right?
[02:43:41] That people say, if you do X, people will do Y and you'll have to do Z, and Z is going to be something you don't want to do. And they're like, no, I'm sure it'll be fine. And then a year later we're like, yeah, Z happened. And they're like, who could have foreseen it? Lou, your 16-year-old has a cell phone, I bet. Of course. Right. Yeah. He has a cell, but what's interesting about the state of Rhode Island is they're restricting all personal devices, not just cell phones. So this includes AirPods, you name it. They're not allowed to have them, other than the Chromebooks that are sanctioned.
[02:44:10] So each public school shall have a policy regarding the use of personal electronic devices on school grounds to reduce distractions, maintain environments focused on learning, and protect the privacy and safety of students and staff. I think one of the things that's sometimes a problem is the fact that the kids have these phones and can record video. But sometimes that's something that can save a kid from an abusive teacher or abusive classmates.
[02:44:39] How do you feel about your kid not being able to bring in their AirPods? You know, that's not a big deal, but there's, you know, the smartphone, mobile phone, tablet, computer, smartwatch. You can't even wear a smartwatch. I mean, they're obviously worried about cheating and other things as well. So not just distraction, but cheating. But they also have very strange policies around AI use, which is starting to be weird. Like, for instance, you're not allowed to use AI to help you with math homework or something like that.
[02:45:08] So I think there's just this kind of push away from technology because they're worried about the outcomes. You know, Shoshana, this seems to be a trend in general. I know this is something R Street's not fond of, which is that government more and more seems to be intruding into our personal lives, and in particular, in a kind of moralistic way. Like I often feel like
[02:45:34] this is kind of abrogating the right of the parent to control their kids. Does government need to get involved in this? Yeah, that's kind of how I feel too, where it's like, if a school makes a decision, I'm like, okay. But even then, you know, when I was 13, I started getting really sick. Now, many years later, stuff worked out, but at the time the school didn't believe me.
[02:46:02] A lot of doctors didn't believe me, and the school wouldn't let me call my mom to have her pick me up because I was feeling really, really sick that day. They thought you were a shirker. Yeah. And then I ended up calling her on my cell phone, and I was able to do that, and it made a difference. And I know it's a one-off, but they were doing that to a girl who had a tumor. She had an X-ray or whatever test it was, and it showed a tumor, and the school psychologist had told her, no, you don't have that, that's not why you're like this, or something like that. Yeah.
[02:46:30] So schools do have, you know, abusive histories in certain cases. And I worry for the kids who are on the margins there. Plus the further stuff like what Cory's talking about: okay, are we going to accept this? But what about the next steps? What about the kids who won't listen and stuff like that? Plus, you know, parenting has to be a thing. We need to make sure that kids behave in classrooms. And my dad was a high school teacher.
[02:46:56] He's retired now, and he struggled with a lot. But I'm just like, this is probably the thing I dislike the least out of everything, but I still have some real functional concerns with how this works and other negative fallout from it. At least the Rhode Island law says that school administrators and teachers may not search the devices. They can't look at the content. But there's never a guarantee. It's like, oh, okay, sure, that's against the law. Teach, what are you doing?
[02:47:26] Yeah. They can just say they didn't. And, oh, I saw over their shoulder that, you know. Right. And it's also going to push kids to do stuff on their school-issued laptops, which is going to increase the amount of lockdown and surveillance on those school laptops, because kids do. I mean, my old editor from InformationWeek, Mitch Wagner, tells the story about how, you know, it was a magazine for CIOs, and one day this old blog post he wrote about a Cisco firmware update,
[02:47:53] suddenly the comment section comes alive, and it's two eighth graders, you know, talking about a classmate, because their school firewall hadn't blocked forums on random blog sites. And so they found ancient blog posts to use as a message board. Of course they're going to do that. Yeah. And I just worry that, you know, we've already seen awful degrees of surveillance and control on school-issued devices. And again, that's disproportionately directed at kids of color and queer kids.
[02:48:23] And, you know, I just feel like any excuse to spy on kids more, especially in the age of AI. I bet there's already some grifter just rubbing his forelegs together and saying, oh, goody, goody, goody, I'm going to sell a lot of spyware to schools that'll give you a score about how wicked your kids' private messaging is and tell you which kids to put on a watch list, because, you know, now that they can't do their private messaging on a private device, it's all going to be on their school device.
[02:48:52] And we're going to be able to gain insights and give you a dashboard of risk factors in your school. You know, if we had an environment in which the devices schools issued to kids were oriented around a kind of respect for those kids and a proportionality in the degree of surveillance and control exerted, I'd be a lot less worried about this. But I do think this is corralling kids into an already very abusive set of devices. Yeah.
[02:49:22] So, the EFF Awards are coming up. We want to support EFF. You've probably heard enough on this show to realize how important the Electronic Frontier Foundation is. Cory, you still do some work for them. Yes, I am still a half-time contractor, an activist, at EFF. I've been there for 24 years now, more than half my working life, nearly half my whole life.
[02:49:48] And yeah, at the EFF Awards we recognize people every year. I'm very excited this year that we're recognizing the Software Freedom Law Center, India, who are litigators who litigate to enforce free software licenses. Free software licenses are only as good as someone's willingness to sue to enforce them, and they've done really important work in India, making sure that when free code is integrated into commercial products or government products, the new source code is also released.
[02:50:18] But we have three honorees this year. The ceremony is always very, very good. I have to miss it this year, I'm going to be at Cornell, but on September the 10th we'll be holding the awards in San Francisco. You can live stream them for free. If you're an EFF member, there's a reduced fee at the door, or you can join EFF and get in at the reduced fee, or come in for the whole fee, join, and get a refund for part of your door fee. So please consider joining us in San Francisco. Very nice. EFF.org.
[02:50:45] And the EFF Awards are September 10th. We'll report on the winners. That's what we will do. Very good. Those are the winners. They are the winners. We have three honorees. I'm reporting on them: Just Futures Law, Erie Meyer, and the Software Freedom Law Center, India. All right. Nice. Congratulations to all three. That's great. Thank you, Cory Doctorow.
[02:51:09] Everybody get the book, Enshittification, and sign up for the audio version of it. Enshittification, here it is. There it is. Oh, it's got a little poop emoji right on the front there. It'll be easy to find. Yeah. I love it. Go to Kickstarter or go to enshittification.org. Two Ts. Actually three Ts, three in total. Three in total. If I were an AI, I would know exactly how many. No, I wouldn't. Thank you.
[02:51:39] You'd think there was an R. Always a pleasure. Congratulations on the Cornell thing. That's great. You too. Good luck on your 28-city tour. Holy guacamole. Yes. Hope you survive it and come back soon. We always love seeing you. I love being on. Thank you for being here. We really appreciate it. Yeah. Leo and Lou and Shoshana, same to you. Shoshana Weissmann, rstreet.org. She writes great stuff there. It's always a pleasure for you and your marmot to join us.
[02:52:08] We are always happy to see you. Head of digital media at rstreet.org. Thank you, Shoshana. Thank you. Great to have you. And Lou Maresca. We call him Silent Lou these days. I feel terrible. Which is funny, they don't call me that at home, for sure. Not at home. No, no. But I kept trying to find a story for you, and I wasn't doing a very good job. I apologize. It's great to have you. Keep the Nano Banana going and, more importantly, keep the
[02:52:37] Python and the AI in the spreadsheets, because Excel and Python, that's a match made in heaven. Yeah. AI and Excel in general is amazing. Absolutely. We're shipping the Copilot function, coming soon, so check that out. Oh, good. That's soon. Okay. Yeah. Thank you, Lou. Great to see you again. Thanks to all of you for joining us. A special thanks to our Club TWiT members who make this possible. If you're in the club, don't forget tomorrow, 5:30 PM.
[02:53:04] I know it's Labor Day, but we're working. Jeff, Paris, and I will interview Karen Hao about her new book, Empire of AI. That'll be for a later show, but you can watch live in the club. The Club TWiT Discord is a great place to hang; we encourage you to join us in the Discord if you're a club member. And if you're not, we encourage you to go to twit.tv slash club twit and join. Get ad-free versions of all the shows, and special shows we don't put anywhere else.
[02:53:31] We now do all the keynote coverage in the club. In fact, we've got the Meta Connect keynote coming up a week from Wednesday. It'll be right after Intelligent Machines, but you can't watch it unless you're in the club. So twit.tv slash club twit. We do stream TWiT every Sunday, 2 to 5 PM Pacific, 5 to 8 PM Eastern, 2100 UTC, eight different ways. Of course, the Club TWiT Discord, that's way one, but there's
[02:54:00] also YouTube, open to all, same with Twitch, TikTok, X.com, Facebook, LinkedIn, and Kick. Everywhere, you can watch us live. But you don't have to watch us live. You can get a copy of the show, audio or video, at twit.tv. There's a YouTube channel dedicated to the video. And of course, the best way to get any of our shows is to subscribe in your favorite podcast client. Yes, we're RSS. We believe in open standards. So subscribe and get the show automatically every Sunday, just in time for your Monday morning commute.
[02:54:31] 20 years we've been doing this. Here's to 20 more. And as I have said for the last 20 years, thanks for joining us. We'll see you next week. Another TWiT is in the can.
