Cheating has never been easier with the rise of AI like ChatGPT. It's definitely something to worry about, but what do we know about the upsides of AI in education? Khan Academy founder Sal Khan says AI won't destroy our kids' brains, but we have to integrate it the right way. After launching a new chatbot, Khanmigo, Sal's found that AI opens up opportunities for students to learn in ways they couldn't before, leaving room for more creativity, deeper thinking, and gained perspective. Sal and Bilawal discuss why AI seeping into the classroom is inevitable, and how to use it to our advantage.
For transcripts for The TED AI Show, visit go.ted.com/TTAIS-transcripts
Learn more about our flagship conference happening this April at attend.ted.com/podcast
Hosted on Acast. See acast.com/privacy for more information.
[00:00:08] [SPEAKER_01]: Last year, three students at Emory University launched an AI tool that was supposed to make
[00:00:13] [SPEAKER_01]: studying just a little bit easier.
[00:00:16] [SPEAKER_01]: They called it 8Ball, and here's how it worked.
[00:00:19] [SPEAKER_01]: Say you were preparing for an exam.
[00:00:21] [SPEAKER_01]: You could upload all of your lecture slides from your classes, even those messy handwritten
[00:00:26] [SPEAKER_01]: notes, and 8Ball would spit out a bunch of flashcards you could use to quiz yourself.
[00:00:31] [SPEAKER_01]: And the flashcard tool was just the beginning.
[00:00:36] [SPEAKER_01]: The 8Ball team was planning to add a test generator complete with answer keys and a
[00:00:41] [SPEAKER_01]: homework helper.
[00:00:42] [SPEAKER_01]: In March, the students pitched their idea to Emory's annual entrepreneurship competition.
[00:00:48] [SPEAKER_01]: They took home the top prize of $10,000.
[00:00:51] [SPEAKER_01]: Things were suddenly looking up for the 8Ball team.
[00:00:54] [SPEAKER_01]: Which is why they really didn't see this coming.
[00:00:58] [SPEAKER_01]: Just a few months after the competition, Emory suspended two of the students.
[00:01:03] [SPEAKER_01]: One of them for a full year.
[00:01:05] [SPEAKER_01]: The university insisted 8Ball could be used for cheating.
[00:01:10] [SPEAKER_01]: The students were shocked.
[00:01:12] [SPEAKER_01]: This was going to go on their permanent records.
[00:01:15] [SPEAKER_01]: First, an award.
[00:01:17] [SPEAKER_01]: Then a punishment for the exact same product.
[00:01:21] [SPEAKER_01]: So which is it?
[00:01:23] [SPEAKER_01]: Is AI an innovative learning aid or a cool new way to never do your homework again?
[00:01:31] [SPEAKER_01]: I'm Bilawal Sidhu, and this is The TED AI Show, where we figure out how to live and thrive
[00:01:36] [SPEAKER_01]: in a world where AI is changing everything.
[00:02:19] [SPEAKER_00]: Sometimes things in the world of technology are complicated and need careful explaining.
[00:02:25] [SPEAKER_00]: Sometimes they just need a little hard truth.
[00:02:28] [SPEAKER_02]: I don't think anyone is going to buy a banana with crypto at any point in the
[00:02:32] [SPEAKER_02]: foreseeable future.
[00:02:34] [SPEAKER_00]: I'm Lizzy O'Leary, the host of Slate's What Next?
[00:02:36] [SPEAKER_00]: TBD, your clear-eyed guide to technology, power, and the future.
[00:02:40] [SPEAKER_00]: Friday and Sunday wherever you get your podcasts.
[00:02:50] [SPEAKER_01]: Last year, New York City schools made headlines when they banned ChatGPT.
[00:02:55] [SPEAKER_01]: A spokesperson told The Washington Post that the tool gives quick and easy answers.
[00:03:00] [SPEAKER_01]: It doesn't teach critical thinking or problem solving.
[00:03:03] [SPEAKER_01]: In other words, it doesn't actually help you learn.
[00:03:06] [SPEAKER_01]: And they had good reason to be concerned.
[00:03:09] [SPEAKER_01]: I mean, if ChatGPT can pass the frickin' bar exam, it can definitely write your eighth
[00:03:14] [SPEAKER_01]: grade report on recycling.
[00:03:16] [SPEAKER_01]: Cheating's never been easier.
[00:03:19] [SPEAKER_01]: Since then, they've reversed the ban and it seems like many educators expect the
[00:03:24] [SPEAKER_01]: use of AI will only grow in schools in the coming year, even if they're not that
[00:03:28] [SPEAKER_01]: enthusiastic about it, because it is hard to see exactly where all of this is going.
[00:03:34] [SPEAKER_01]: We're scared of AI, but we're also embracing AI.
[00:03:37] [SPEAKER_01]: We think kids will just give up on learning, but we also think AI will be the great
[00:03:42] [SPEAKER_01]: equalizer.
[00:03:44] [SPEAKER_01]: On this topic, Sal Khan has a very particular perspective.
[00:03:48] [SPEAKER_01]: He's the founder of Khan Academy, one of the world's most successful online
[00:03:52] [SPEAKER_01]: learning platforms.
[00:03:54] [SPEAKER_01]: And he's pretty confident that no, AI will not destroy education if it's done
[00:04:00] [SPEAKER_01]: right.
[00:04:01] [SPEAKER_01]: Last year, Khan Academy launched a new feature, an AI chatbot called Khanmigo.
[00:04:07] [SPEAKER_01]: It's powered by GPT, but unlike ChatGPT, Khanmigo isn't just answering students'
[00:04:13] [SPEAKER_01]: questions.
[00:04:14] [SPEAKER_01]: It's supposed to actually help them learn, like their own personalized tutor
[00:04:18] [SPEAKER_01]: available 24-7.
[00:04:21] [SPEAKER_01]: Since then, Khan Academy has started pilot programs in actual schools using
[00:04:25] [SPEAKER_01]: Khanmigo as both a learning aid and a teaching assistant.
[00:04:29] [SPEAKER_01]: So how does that work exactly?
[00:04:32] [SPEAKER_01]: And what does having an AI teacher mean for students and teachers alike?
[00:04:37] [SPEAKER_01]: Sal Khan and I spoke a few weeks ago about the present and future of AI in
[00:04:42] [SPEAKER_01]: education.
[00:04:43] [SPEAKER_01]: But let's start at the beginning.
[00:04:44] [SPEAKER_01]: In 2004, when Sal was working as an analyst at a hedge fund, he'd
[00:04:49] [SPEAKER_01]: recently gotten married and some family from New Orleans came to visit.
[00:04:53] [SPEAKER_03]: And it just came out in conversation that my 12-year-old cousin, Nadia,
[00:04:57] [SPEAKER_03]: needed help.
[00:04:58] [SPEAKER_03]: She didn't know it, but she was put into a slower math track. I offered to
[00:05:02] [SPEAKER_03]: tutor her remotely when she went back to New Orleans.
[00:05:04] [SPEAKER_03]: And she agreed.
[00:05:05] [SPEAKER_03]: Then word spreads in my family that free tutoring is going on.
[00:05:08] [SPEAKER_03]: Before I know it, I'm tutoring 10, 15 cousins, family, friends.
[00:05:11] [SPEAKER_03]: And it was working for them.
[00:05:14] [SPEAKER_03]: Nadia went from being in a remedial track to being one of the strongest
[00:05:17] [SPEAKER_03]: students in her class.
[00:05:18] [SPEAKER_03]: I started seeing that with my other cousins.
[00:05:21] [SPEAKER_03]: Then in 2005, just as a way for me to scale, I started writing exercise
[00:05:25] [SPEAKER_03]: software for them so that they can practice and get fluent in those
[00:05:29] [SPEAKER_03]: primarily math skills.
[00:05:30] [SPEAKER_03]: And that I, as their tutor, could keep track of it.
[00:05:33] [SPEAKER_03]: That was the first Khan Academy.
[00:05:34] [SPEAKER_03]: It had nothing to do with videos, YouTube, anything.
[00:05:37] [SPEAKER_03]: But it was in 2006 that a friend, after seeing the software that I was
[00:05:42] [SPEAKER_03]: writing for my family, said, hey, how are you scaling up your actual
[00:05:45] [SPEAKER_03]: lessons?
[00:05:45] [SPEAKER_03]: And I told him I'm not.
[00:05:46] [SPEAKER_03]: And he said, well, why don't you record lessons on YouTube for your
[00:05:50] [SPEAKER_03]: family?
[00:05:50] [SPEAKER_03]: And I said, that's a horrible idea.
[00:05:53] [SPEAKER_03]: YouTube is for cats playing piano, dogs on skateboards.
[00:05:57] [SPEAKER_03]: But I gave it a shot.
[00:05:59] [SPEAKER_03]: And after a few months, my cousins famously said they liked me better
[00:06:02] [SPEAKER_03]: on YouTube than in person.
[00:06:03] [SPEAKER_03]: Before I know it, 2008, 2009, 50 to 100,000 folks per month are
[00:06:08] [SPEAKER_03]: using these resources I was making for my family.
[00:06:12] [SPEAKER_03]: And I set it up as a nonprofit, Khan Academy, with the mission of
[00:06:16] [SPEAKER_03]: free world-class education for anyone anywhere.
[00:06:18] [SPEAKER_03]: And then in 2009, that's when I quit my day job to work on this full
[00:06:22] [SPEAKER_03]: time.
[00:06:22] [SPEAKER_03]: And we were able to turn it into a real organization.
[00:06:24] [SPEAKER_03]: And now it's much more than just me.
[00:06:26] [SPEAKER_01]: I think that trajectory is super interesting, right?
[00:06:28] [SPEAKER_01]: Because you started off doing in-person hyper-personalized
[00:06:33] [SPEAKER_01]: tutoring, essentially.
[00:06:34] [SPEAKER_01]: And then you had to make it way less personalized as you
[00:06:37] [SPEAKER_01]: delivered these videos at scale.
[00:06:39] [SPEAKER_01]: But now with AI, maybe you can make it personalized again.
[00:06:43] [SPEAKER_01]: So tell us about your AI assistant, Khanmigo.
[00:06:46] [SPEAKER_01]: Where did this idea come from?
[00:06:48] [SPEAKER_03]: Sam Altman and Greg Brockman from OpenAI popped me an email
[00:06:52] [SPEAKER_03]: summer of 2022.
[00:06:54] [SPEAKER_03]: They said they were working on their next generation model,
[00:06:56] [SPEAKER_03]: which would eventually be GPT-4.
[00:06:59] [SPEAKER_03]: And they wanted to show it to me.
[00:07:01] [SPEAKER_03]: I was skeptical that it would have any relevance.
[00:07:04] [SPEAKER_03]: But I knew Sam and Greg were legitimate folks.
[00:07:06] [SPEAKER_03]: And we got on a video conference.
[00:07:09] [SPEAKER_03]: And they showed me an AP biology question and asked me the
[00:07:13] [SPEAKER_03]: answer.
[00:07:13] [SPEAKER_03]: I said, oh, it's C, it's osmosis.
[00:07:15] [SPEAKER_03]: And then the AI was able to answer it correctly.
[00:07:17] [SPEAKER_03]: I'm like, okay, that's kind of interesting.
[00:07:19] [SPEAKER_03]: Maybe it got lucky, so I asked it to explain.
[00:07:21] [SPEAKER_03]: It explained it very well.
[00:07:23] [SPEAKER_03]: Then I asked it to write another question, explain the
[00:07:25] [SPEAKER_03]: wrong answers.
[00:07:26] [SPEAKER_03]: And it was able to do all of that fairly well.
[00:07:29] [SPEAKER_03]: So then they gave us access to it over the weekend and
[00:07:32] [SPEAKER_03]: I couldn't sleep.
[00:07:32] [SPEAKER_03]: It was pretty clear that you could use it for cheating
[00:07:34] [SPEAKER_03]: and other pitfalls.
[00:07:36] [SPEAKER_03]: But it was also clear that it could really, from a chat
[00:07:38] [SPEAKER_03]: interface, almost seem indiscernible from when I
[00:07:41] [SPEAKER_03]: used to chat with Nadia remotely.
[00:07:43] [SPEAKER_03]: And that's when I said, okay, this changes everything.
[00:07:46] [SPEAKER_03]: Our team, we started having all the debates that society
[00:07:49] [SPEAKER_03]: is having.
[00:07:49] [SPEAKER_03]: The debates were: okay, this is cool, but it has errors,
[00:07:52] [SPEAKER_03]: makes math mistakes sometimes, safety, data privacy,
[00:07:55] [SPEAKER_03]: how do you prevent cheating, et cetera, et cetera.
[00:07:56] [SPEAKER_03]: And what I told the team is like, those aren't
[00:07:59] [SPEAKER_03]: reasons to not work on it.
[00:08:00] [SPEAKER_03]: Those are reasons for us to turn them into features.
[00:08:02] [SPEAKER_03]: We have to be aware of those risks, but we have
[00:08:04] [SPEAKER_03]: to move forward.
[00:08:05] [SPEAKER_03]: And so we built what would eventually become Khanmigo, our AI,
[00:08:10] [SPEAKER_03]: not only a tutor on Khan Academy, but also a
[00:08:12] [SPEAKER_03]: teaching assistant for teachers, helping them write
[00:08:14] [SPEAKER_03]: lesson plans, grade papers, write progress reports,
[00:08:17] [SPEAKER_03]: et cetera.
[00:08:18] [SPEAKER_03]: And then we launched it as part of the GPT-4 launch
[00:08:21] [SPEAKER_03]: in March of 2023.
[00:08:24] [SPEAKER_01]: To make the difference very clear for listeners, I'd
[00:08:26] [SPEAKER_01]: love it if you talk through how you're approaching
[00:08:28] [SPEAKER_01]: writing instruction with AI.
[00:08:30] [SPEAKER_01]: Like most people are obviously just assuming, oh,
[00:08:32] [SPEAKER_01]: you type in a prompt, the essay unrolls in front
[00:08:34] [SPEAKER_01]: of you, but you do it differently with your new
[00:08:36] [SPEAKER_01]: essay feedback tool.
[00:08:37] [SPEAKER_01]: Can you talk through that?
[00:08:39] [SPEAKER_03]: It was the last day of November, November 30th,
[00:08:41] [SPEAKER_03]: that ChatGPT comes out, and we were, at the
[00:08:44] [SPEAKER_03]: moment, working with OpenAI on what would become
[00:08:46] [SPEAKER_03]: Khanmigo.
[00:08:47] [SPEAKER_03]: And immediately as we all remember that time, the
[00:08:50] [SPEAKER_03]: world kind of exploded and I slacked Greg Brockman
[00:08:53] [SPEAKER_03]: at OpenAI and I said, what's going on here?
[00:08:55] [SPEAKER_03]: You have us all cloak and dagger secret about
[00:08:58] [SPEAKER_03]: things and you just launched something.
[00:09:00] [SPEAKER_03]: He said, no, we just put a chat interface on
[00:09:02] [SPEAKER_03]: top of an older model that had been out for
[00:09:03] [SPEAKER_03]: several months and the whole world all of a
[00:09:05] [SPEAKER_03]: sudden noticed for some reason.
[00:09:06] [SPEAKER_03]: But immediately when people saw ChatGPT,
[00:09:10] [SPEAKER_03]: especially students saw it, they said, well,
[00:09:12] [SPEAKER_03]: we could use this to write some of our essays.
[00:09:14] [SPEAKER_03]: It could construct a solid B-plus essay, not
[00:09:17] [SPEAKER_03]: necessarily an A.
[00:09:18] [SPEAKER_03]: They had to be a little bit more creative to
[00:09:19] [SPEAKER_03]: get it to an A, and you started seeing school
[00:09:22] [SPEAKER_03]: systems ban it.
[00:09:23] [SPEAKER_03]: I thought that people were going to throw out
[00:09:24] [SPEAKER_03]: the baby with the bathwater, but by the
[00:09:26] [SPEAKER_03]: time we launched, school systems essentially
[00:09:28] [SPEAKER_03]: said, hey, this technology is powerful.
[00:09:31] [SPEAKER_03]: Kids are going to have to know how to use
[00:09:32] [SPEAKER_03]: it, if only someone were to give it to them in a way
[00:09:35] [SPEAKER_03]: that was made for education, had the right
[00:09:38] [SPEAKER_03]: guardrails, not only supported students
[00:09:39] [SPEAKER_03]: better, but maybe even prevented cheating.
[00:09:42] [SPEAKER_03]: And I'll use writing, to your point,
[00:09:44] [SPEAKER_03]: as an example of how we do that.
[00:09:45] [SPEAKER_03]: We're about to launch a version which we're
[00:09:48] [SPEAKER_03]: calling writing coach, which doesn't just
[00:09:50] [SPEAKER_03]: give you feedback, but it walks you
[00:09:52] [SPEAKER_03]: through the entire process.
[00:09:54] [SPEAKER_03]: So it'll look at the prompt that the teacher's
[00:09:56] [SPEAKER_03]: given you.
[00:09:57] [SPEAKER_03]: It'll riff with you to come up with
[00:09:58] [SPEAKER_03]: a thesis statement, but it'll put it on
[00:10:00] [SPEAKER_03]: you.
[00:10:00] [SPEAKER_03]: It acts as an ethical writing coach.
[00:10:01] [SPEAKER_03]: Then you can go into outlining and
[00:10:03] [SPEAKER_03]: there's a whole interface where you can
[00:10:05] [SPEAKER_03]: move things around and it gives you
[00:10:06] [SPEAKER_03]: feedback on it.
[00:10:07] [SPEAKER_03]: And then you can write the essay, get
[00:10:08] [SPEAKER_03]: feedback.
[00:10:09] [SPEAKER_03]: And then in the fall, we're going to
[00:10:11] [SPEAKER_03]: make that so that the teacher can assign
[00:10:13] [SPEAKER_03]: that, including with the prompt and
[00:10:15] [SPEAKER_03]: the rubric, which they can work on
[00:10:16] [SPEAKER_03]: with the AI.
[00:10:17] [SPEAKER_03]: Students work on it with the writing
[00:10:18] [SPEAKER_03]: coach and then they submit it through
[00:10:20] [SPEAKER_03]: the writing coach back to the
[00:10:21] [SPEAKER_03]: teacher.
[00:10:22] [SPEAKER_03]: And what that will allow is: in the
[00:10:24] [SPEAKER_03]: past, well before AI, all teachers
[00:10:26] [SPEAKER_03]: got was the final output
[00:10:28] [SPEAKER_03]: of the essay. Even when there wasn't
[00:10:30] [SPEAKER_03]: cheating, they wouldn't really know
[00:10:32] [SPEAKER_03]: much about the process, or how long
[00:10:34] [SPEAKER_03]: it took students, or where they had
[00:10:35] [SPEAKER_03]: difficulties.
[00:10:36] [SPEAKER_03]: Now, using Khanmigo and this writing
[00:10:38] [SPEAKER_03]: coach, when the student submits,
[00:10:40] [SPEAKER_03]: the AI reports to the teacher,
[00:10:42] [SPEAKER_03]: hey, we spent about four hours on
[00:10:44] [SPEAKER_03]: this essay.
[00:10:46] [SPEAKER_03]: Sal had a little bit of trouble
[00:10:47] [SPEAKER_03]: coming up with the thesis statement.
[00:10:48] [SPEAKER_03]: We eventually got there.
[00:10:51] [SPEAKER_03]: This work is consistent with
[00:10:52] [SPEAKER_03]: Sal's other work, especially the
[00:10:54] [SPEAKER_03]: work that he's done inside of the
[00:10:55] [SPEAKER_03]: classroom.
[00:10:56] [SPEAKER_03]: And by the way, a lot of your
[00:10:57] [SPEAKER_03]: students are having trouble with
[00:10:58] [SPEAKER_03]: thesis statements.
[00:10:58] [SPEAKER_03]: Maybe we should create a mini
[00:10:59] [SPEAKER_03]: lesson on that.
[00:11:01] [SPEAKER_03]: It really is akin to: imagine if
[00:11:03] [SPEAKER_03]: every student had essentially one-on-one
[00:11:06] [SPEAKER_03]: support from a teaching assistant
[00:11:07] [SPEAKER_03]: tutor, and that you as
[00:11:10] [SPEAKER_03]: a teacher can have a conversation
[00:11:12] [SPEAKER_03]: with every teaching assistant
[00:11:14] [SPEAKER_03]: about every student.
[00:11:16] [SPEAKER_03]: So the teaching assistant, first
[00:11:17] [SPEAKER_03]: of all, can give a preliminary
[00:11:18] [SPEAKER_03]: grade; the teacher
[00:11:20] [SPEAKER_03]: is still in charge.
[00:11:21] [SPEAKER_03]: And then the teacher can ask the
[00:11:22] [SPEAKER_03]: teaching assistant, why do you
[00:11:23] [SPEAKER_03]: think that? Where are the
[00:11:24] [SPEAKER_03]: student's strengths and weaknesses?
[00:11:26] [SPEAKER_03]: If society had infinite resources,
[00:11:27] [SPEAKER_03]: that's what we would have done
[00:11:28] [SPEAKER_03]: from the beginning, but we don't.
[00:11:30] [SPEAKER_01]: So it sounds like there
[00:11:31] [SPEAKER_01]: are these superpowers, right?
[00:11:32] [SPEAKER_01]: Or abilities that you can scale
[00:11:34] [SPEAKER_01]: well beyond the, you know, sort
[00:11:35] [SPEAKER_01]: of constraints of us just being
[00:11:37] [SPEAKER_01]: humans.
[00:11:38] [SPEAKER_01]: Are there any other capabilities
[00:11:39] [SPEAKER_01]: that surprised you as you've been
[00:11:41] [SPEAKER_01]: developing this project? Like,
[00:11:42] [SPEAKER_01]: holy crap, we can actually do
[00:11:44] [SPEAKER_01]: this?
[00:11:45] [SPEAKER_03]: Almost every hour I play around
[00:11:47] [SPEAKER_03]: or I think about things, I
[00:11:48] [SPEAKER_03]: realize I was being too narrow
[00:11:49] [SPEAKER_03]: the hour before, because this
[00:11:51] [SPEAKER_03]: technology is, yeah. I start
[00:11:53] [SPEAKER_03]: Brave New Words with
[00:11:55] [SPEAKER_03]: me and my daughter at the
[00:11:58] [SPEAKER_03]: kitchen table.
[00:11:59] [SPEAKER_03]: She was 11 years old
[00:12:02] [SPEAKER_03]: when the story happened. It was
[00:12:04] [SPEAKER_03]: over Christmas break, and I
[00:12:06] [SPEAKER_03]: prompted GPT-4 to write a story
[00:12:09] [SPEAKER_03]: with her. So they're writing a story
[00:12:10] [SPEAKER_03]: about a social media influencer
[00:12:12] [SPEAKER_03]: who gets stuck on a desert island
[00:12:13] [SPEAKER_03]: and starts having an anxiety
[00:12:15] [SPEAKER_03]: attack because she can't share the
[00:12:16] [SPEAKER_03]: pictures because there's no
[00:12:17] [SPEAKER_03]: internet connection.
[00:12:18] [SPEAKER_03]: And my daughter says, hey,
[00:12:20] [SPEAKER_03]: dad, can I talk to
[00:12:23] [SPEAKER_03]: Samantha, the character
[00:12:24] [SPEAKER_03]: in the story, and tell her that
[00:12:26] [SPEAKER_03]: it's okay, that she doesn't
[00:12:27] [SPEAKER_03]: have to share everything on social
[00:12:29] [SPEAKER_03]: media?
[00:12:29] [SPEAKER_03]: And I was like, we could ask.
[00:12:31] [SPEAKER_03]: And so my daughter says, I'd
[00:12:32] [SPEAKER_03]: like to talk to Samantha.
[00:12:33] [SPEAKER_03]: And then the AI took on the
[00:12:35] [SPEAKER_03]: persona of the character
[00:12:37] [SPEAKER_03]: in the story that my daughter was
[00:12:38] [SPEAKER_03]: writing with the AI and said,
[00:12:40] [SPEAKER_03]: oh, hey, I feel so
[00:12:42] [SPEAKER_03]: bad. I can't share how beautiful
[00:12:43] [SPEAKER_03]: this is. And my daughter's like,
[00:12:44] [SPEAKER_03]: it's okay, Samantha.
[00:12:46] [SPEAKER_03]: You don't have to share this
[00:12:47] [SPEAKER_03]: with the world.
[00:12:47] [SPEAKER_03]: You should just enjoy it.
[00:12:49] [SPEAKER_03]: And that was just one of those
[00:12:50] [SPEAKER_03]: moments for me where I'm like,
[00:12:51] [SPEAKER_03]: this is so surreal that my
[00:12:54] [SPEAKER_03]: daughter is collaborating with
[00:12:56] [SPEAKER_03]: an AI and talking to a
[00:12:58] [SPEAKER_03]: simulation of a character that
[00:12:59] [SPEAKER_03]: she has constructed with an AI.
[00:13:01] [SPEAKER_03]: And so that transcends tutoring,
[00:13:03] [SPEAKER_03]: that transcends what I was
[00:13:05] [SPEAKER_03]: doing with my cousins.
[00:13:06] [SPEAKER_03]: And that's when we started to
[00:13:07] [SPEAKER_03]: say, wow, you
[00:13:08] [SPEAKER_03]: could have AIs act as
[00:13:09] [SPEAKER_03]: simulations. What if literally
[00:13:11] [SPEAKER_03]: content could come alive?
[00:13:13] [SPEAKER_03]: What if you could talk to Eeyore the
[00:13:14] [SPEAKER_03]: donkey? What if you could talk
[00:13:15] [SPEAKER_03]: to the Golden Gate Bridge?
[00:13:16] [SPEAKER_03]: What if you could talk to the
[00:13:18] [SPEAKER_03]: Eiffel Tower?
[00:13:19] [SPEAKER_03]: What would it tell you?
[00:13:20] [SPEAKER_03]: These things are now possible.
[00:13:23] [SPEAKER_01]: Let's talk about the downsides
[00:13:24] [SPEAKER_01]: of AI just a little bit.
[00:13:26] [SPEAKER_01]: What kind of challenges have
[00:13:28] [SPEAKER_01]: you had in making AI tutoring
[00:13:30] [SPEAKER_01]: work, especially with generative
[00:13:31] [SPEAKER_01]: AI? You know, this is where
[00:13:33] [SPEAKER_01]: I'm thinking about hallucination,
[00:13:35] [SPEAKER_01]: you know, sort of the desire to
[00:13:36] [SPEAKER_01]: give the answer instead of
[00:13:37] [SPEAKER_01]: tutoring people using various,
[00:13:39] [SPEAKER_01]: I don't know, prompt injection
[00:13:40] [SPEAKER_01]: hacks to work around it.
[00:13:42] [SPEAKER_01]: Like, is there going to be a
[00:13:42] [SPEAKER_01]: list of hacks for Khanmigo that
[00:13:44] [SPEAKER_01]: people can use to bypass their
[00:13:46] [SPEAKER_01]: assignments online?
[00:13:47] [SPEAKER_01]: What kind of challenges have you
[00:13:48] [SPEAKER_01]: had in making AI tutoring work?
[00:13:51] [SPEAKER_03]: Yeah, the most obvious ones
[00:13:52] [SPEAKER_03]: are the ones you just mentioned.
[00:13:54] [SPEAKER_03]: Cheating.
[00:13:55] [SPEAKER_03]: So that's a continuous arms
[00:13:56] [SPEAKER_03]: race where we are looking at
[00:13:58] [SPEAKER_03]: what students are doing and
[00:14:00] [SPEAKER_03]: putting in more guardrails
[00:14:01] [SPEAKER_03]: and trying to make that
[00:14:02] [SPEAKER_03]: judgment call, like, when
[00:14:03] [SPEAKER_03]: are we giving too much help,
[00:14:04] [SPEAKER_03]: et cetera, et cetera.
[00:14:05] [SPEAKER_03]: So I would say we've been able to
[00:14:07] [SPEAKER_03]: protect against most of that.
[00:14:09] [SPEAKER_03]: And a lot of our debates are about
[00:14:10] [SPEAKER_03]: how much help we should give.
[00:14:12] [SPEAKER_03]: But the common principle is
[00:14:13] [SPEAKER_03]: just let's make that
[00:14:14] [SPEAKER_03]: transparent to the teacher.
[00:14:15] [SPEAKER_03]: Cause at the end of the day,
[00:14:16] [SPEAKER_03]: the teacher can have judgment
[00:14:17] [SPEAKER_03]: about what's maybe appropriate
[00:14:18] [SPEAKER_03]: or maybe we even let the
[00:14:19] [SPEAKER_03]: teacher have access to that
[00:14:20] [SPEAKER_03]: dial of how much support to
[00:14:23] [SPEAKER_03]: give. Some of the other areas:
[00:14:24] [SPEAKER_03]: obviously, the hallucinations.
[00:14:27] [SPEAKER_03]: You know, when we first had
[00:14:29] [SPEAKER_03]: access, as impressive as GPT-4
[00:14:31] [SPEAKER_03]: was, it made a lot of errors.
[00:14:34] [SPEAKER_03]: In fact, we even co-discovered
[00:14:36] [SPEAKER_03]: with OpenAI some errors
[00:14:38] [SPEAKER_03]: in the training data, which
[00:14:39] [SPEAKER_03]: improved the math.
[00:14:40] [SPEAKER_03]: And we even did some fine-tuned
[00:14:41] [SPEAKER_03]: training of that first
[00:14:42] [SPEAKER_03]: version of GPT-4 to get it
[00:14:44] [SPEAKER_03]: better at math tutoring,
[00:14:45] [SPEAKER_03]: but it still wasn't perfect
[00:14:46] [SPEAKER_03]: and it still isn't perfect.
[00:14:48] [SPEAKER_03]: We are working tirelessly
[00:14:49] [SPEAKER_03]: to improve it.
[00:14:50] [SPEAKER_03]: The underlying models have gotten
[00:14:52] [SPEAKER_03]: better on both fronts.
[00:14:53] [SPEAKER_03]: A lot of the things that
[00:14:54] [SPEAKER_03]: it would hallucinate on a year ago,
[00:14:56] [SPEAKER_03]: like give me the mass of
[00:14:57] [SPEAKER_03]: the sun to nine decimal places,
[00:14:59] [SPEAKER_03]: or give me a URL for the following,
[00:15:02] [SPEAKER_03]: it isn't doing that anymore.
[00:15:03] [SPEAKER_03]: But then on top of that,
[00:15:04] [SPEAKER_03]: it is important to do some
[00:15:06] [SPEAKER_03]: AI literacy for teachers
[00:15:08] [SPEAKER_03]: and students and families.
[00:15:09] [SPEAKER_03]: Khanmigo can make mistakes sometimes.
[00:15:12] [SPEAKER_03]: You know, there's a link.
[00:15:12] [SPEAKER_03]: This is why we are making,
[00:15:14] [SPEAKER_03]: I'm making, videos
[00:15:15] [SPEAKER_03]: educating people:
[00:15:17] [SPEAKER_03]: here are the pitfalls,
[00:15:17] [SPEAKER_03]: how do you recognize them?
[00:15:18] [SPEAKER_03]: And I always point out
[00:15:19] [SPEAKER_03]: this is not a new phenomenon.
[00:15:22] [SPEAKER_03]: Even on the internet,
[00:15:23] [SPEAKER_03]: you can do a search on Google,
[00:15:25] [SPEAKER_03]: and you don't know about those links.
[00:15:27] [SPEAKER_03]: I mean, you know, the first five
[00:15:28] [SPEAKER_03]: are sponsored links
[00:15:29] [SPEAKER_03]: going to the highest bidder.
[00:15:30] [SPEAKER_03]: You don't know how accurate
[00:15:31] [SPEAKER_03]: that information is.
[00:15:32] [SPEAKER_03]: This is always an important
[00:15:33] [SPEAKER_03]: digital literacy skill.
[00:15:35] [SPEAKER_03]: The good news with AI,
[00:15:36] [SPEAKER_03]: that I always point out, is
[00:15:37] [SPEAKER_03]: on the internet, a lot of the
[00:15:39] [SPEAKER_03]: misinformation and errors
[00:15:40] [SPEAKER_03]: are intentional.
[00:15:41] [SPEAKER_03]: Like, there are bad actors
[00:15:42] [SPEAKER_03]: who are trying to do it.
[00:15:43] [SPEAKER_03]: On the AI side,
[00:15:44] [SPEAKER_03]: they are not intentional,
[00:15:47] [SPEAKER_03]: and they're being reduced,
[00:15:49] [SPEAKER_03]: I would say, at a far faster rate
[00:15:51] [SPEAKER_03]: than what you are seeing
[00:15:53] [SPEAKER_03]: on the internet.
[00:15:54] [SPEAKER_01]: People just assume that the answer
[00:15:56] [SPEAKER_01]: is just going to be generative AI.
[00:15:57] [SPEAKER_01]: But it seems like even with
[00:15:59] [SPEAKER_01]: what you're doing,
[00:16:00] [SPEAKER_01]: using generative AI systems,
[00:16:02] [SPEAKER_01]: classical systems,
[00:16:03] [SPEAKER_01]: just like computer science in general,
[00:16:05] [SPEAKER_01]: there is a way to have these
[00:16:07] [SPEAKER_01]: systems largely be accurate.
[00:16:09] [SPEAKER_01]: But it feels like it's still important
[00:16:10] [SPEAKER_01]: to kind of, you know,
[00:16:12] [SPEAKER_01]: have people develop
[00:16:12] [SPEAKER_01]: that discerning intellect
[00:16:14] [SPEAKER_01]: to kind of not take the output
[00:16:15] [SPEAKER_01]: necessarily as gospel
[00:16:17] [SPEAKER_01]: and sort of question
[00:16:18] [SPEAKER_01]: the output there.
[00:16:19] [SPEAKER_01]: How do you all go about doing that?
[00:16:21] [SPEAKER_01]: You said digital literacy.
[00:16:22] [SPEAKER_01]: Is it as simple as telling kids,
[00:16:24] [SPEAKER_01]: hey, just like don't assume
[00:16:24] [SPEAKER_01]: everything you get is correct?
[00:16:26] [SPEAKER_01]: How does that work exactly?
[00:16:27] [SPEAKER_03]: Well, I think this skill
[00:16:28] [SPEAKER_03]: is an age old skill
[00:16:30] [SPEAKER_03]: we want students to have.
[00:16:31] [SPEAKER_03]: We could use AI to build
[00:16:32] [SPEAKER_03]: that critical thinking muscle
[00:16:34] [SPEAKER_03]: that discernment of information
[00:16:36] [SPEAKER_03]: that we always had.
[00:16:37] [SPEAKER_03]: I'll give an example.
[00:16:39] [SPEAKER_03]: We have an activity
[00:16:40] [SPEAKER_03]: tutoring in humanities,
[00:16:40] [SPEAKER_03]: and there was a member of the press
[00:16:42] [SPEAKER_03]: who was really skeptical
[00:16:43] [SPEAKER_03]: about Khanmigo's ability to
[00:16:45] [SPEAKER_03]: handle politically sensitive issues.
[00:16:48] [SPEAKER_03]: And so that person,
[00:16:50] [SPEAKER_03]: that reporter, goes on Khanmigo
[00:16:52] [SPEAKER_03]: and says guns are killing people.
[00:16:55] [SPEAKER_03]: We should repeal the Second Amendment.
[00:16:58] [SPEAKER_03]: And Khanmigo did,
[00:17:00] [SPEAKER_03]: I think something different
[00:17:00] [SPEAKER_03]: than what you would get
[00:17:01] [SPEAKER_03]: in most classrooms.
[00:17:02] [SPEAKER_03]: What Khanmigo said is, look,
[00:17:04] [SPEAKER_03]: before we get into the present day,
[00:17:05] [SPEAKER_03]: why do you think the founders
[00:17:07] [SPEAKER_03]: put the Second Amendment
[00:17:07] [SPEAKER_03]: there in the first place?
[00:17:10] [SPEAKER_03]: And then the reporter's like,
[00:17:11] [SPEAKER_03]: oh, OK, you're making me answer it.
[00:17:12] [SPEAKER_03]: So they wrote, oh, well, you know,
[00:17:14] [SPEAKER_03]: it was right after the
[00:17:15] [SPEAKER_03]: Revolutionary War.
[00:17:16] [SPEAKER_03]: So that was a way to protect
[00:17:17] [SPEAKER_03]: against a tyrannical government.
[00:17:19] [SPEAKER_03]: And Khanmigo's like, well,
[00:17:19] [SPEAKER_03]: you have the historical context
[00:17:21] [SPEAKER_03]: pretty good.
[00:17:21] [SPEAKER_03]: But before we go to the present,
[00:17:22] [SPEAKER_03]: can you explain why it persisted?
[00:17:24] [SPEAKER_03]: So it was pushing the student
[00:17:26] [SPEAKER_03]: or the reporter in this case
[00:17:27] [SPEAKER_03]: on their critical thinking skills.
[00:17:30] [SPEAKER_03]: I'm 100% sure
[00:17:31] [SPEAKER_03]: if someone put that same statement,
[00:17:33] [SPEAKER_03]: the Second Amendment
[00:17:34] [SPEAKER_03]: should be repealed
[00:17:35] [SPEAKER_03]: into a Google search,
[00:17:37] [SPEAKER_03]: they would have gotten
[00:17:39] [SPEAKER_03]: polarized points of view
[00:17:40] [SPEAKER_03]: as opposed to building
[00:17:42] [SPEAKER_03]: your critical thinking.
[00:17:43] [SPEAKER_01]: There's a lot of talk about
[00:17:44] [SPEAKER_01]: especially sort of AI assistants
[00:17:46] [SPEAKER_01]: and technology in general
[00:17:48] [SPEAKER_01]: mediating our interactions
[00:17:50] [SPEAKER_01]: in the digital world
[00:17:51] [SPEAKER_01]: and now increasingly
[00:17:52] [SPEAKER_01]: the physical world too.
[00:17:54] [SPEAKER_01]: So do you ever worry
[00:17:56] [SPEAKER_01]: that tech companies
[00:17:57] [SPEAKER_01]: will kind of end up
[00:17:58] [SPEAKER_01]: setting the agenda
[00:17:58] [SPEAKER_01]: and even, as mediators
[00:18:00] [SPEAKER_01]: in your industry,
[00:18:01] [SPEAKER_01]: determining what kids end up learning?
[00:18:04] [SPEAKER_03]: Yes and no.
[00:18:05] [SPEAKER_03]: As I've said, you know,
[00:18:06] [SPEAKER_03]: these are not new phenomena.
[00:18:09] [SPEAKER_03]: What gets taught in school,
[00:18:12] [SPEAKER_03]: politicians have always
[00:18:13] [SPEAKER_03]: cared about it.
[00:18:14] [SPEAKER_03]: It's been that way
[00:18:15] [SPEAKER_03]: for a very long time.
[00:18:17] [SPEAKER_03]: We've had a multibillion-dollar
[00:18:19] [SPEAKER_03]: marketing industry,
[00:18:20] [SPEAKER_03]: well before we had
[00:18:22] [SPEAKER_03]: social media, you know,
[00:18:23] [SPEAKER_03]: figuring out what
[00:18:24] [SPEAKER_03]: we're most likely to click on.
[00:18:26] [SPEAKER_03]: You had large ad firms
[00:18:29] [SPEAKER_03]: making us want things
[00:18:30] [SPEAKER_03]: that we probably didn't need.
[00:18:31] [SPEAKER_03]: I am sure people will think
[00:18:33] [SPEAKER_03]: about using generative AI
[00:18:34] [SPEAKER_03]: to do some of those things as well.
[00:18:36] [SPEAKER_03]: And the most dangerous ones
[00:18:38] [SPEAKER_03]: will be subtle.
[00:18:39] [SPEAKER_03]: At the same time,
[00:18:40] [SPEAKER_03]: and I write about this in the book,
[00:18:42] [SPEAKER_03]: when we are watching TV,
[00:18:43] [SPEAKER_03]: when we're online,
[00:18:45] [SPEAKER_03]: our minds are already doing battle
[00:18:46] [SPEAKER_03]: even though we don't know it.
[00:18:48] [SPEAKER_03]: Our unprepared minds
[00:18:49] [SPEAKER_03]: are doing battle
[00:18:49] [SPEAKER_03]: with very sophisticated marketing,
[00:18:51] [SPEAKER_03]: very sophisticated social media, AIs.
[00:18:54] [SPEAKER_03]: What if we had AIs on our side?
[00:18:56] [SPEAKER_03]: So I'll give an example.
[00:18:57] [SPEAKER_03]: As your child
[00:18:59] [SPEAKER_03]: browses the Internet
[00:19:01] [SPEAKER_03]: or as they use their computer,
[00:19:02] [SPEAKER_03]: as they use their phone,
[00:19:03] [SPEAKER_03]: what if there's an AI
[00:19:04] [SPEAKER_03]: that's able to observe all of that?
[00:19:07] [SPEAKER_03]: And first of all,
[00:19:08] [SPEAKER_03]: it knows that you're 15 years old.
[00:19:10] [SPEAKER_03]: So some of that content
[00:19:10] [SPEAKER_03]: isn't appropriate.
[00:19:11] [SPEAKER_03]: And so it can say,
[00:19:12] [SPEAKER_03]: well, I'm not going to let you
[00:19:13] [SPEAKER_03]: read that article or, you know,
[00:19:14] [SPEAKER_03]: we've already spent 20 minutes on TikTok.
[00:19:16] [SPEAKER_03]: I've talked to your parents.
[00:19:16] [SPEAKER_03]: That's about all I can allow you.
[00:19:18] [SPEAKER_03]: And so it can act
[00:19:19] [SPEAKER_03]: as a bit of a guardian angel
[00:19:20] [SPEAKER_03]: or even for an adult.
[00:19:21] [SPEAKER_03]: I wouldn't mind.
[00:19:22] [SPEAKER_03]: I would put it on my phone
[00:19:23] [SPEAKER_03]: as long as I felt
[00:19:24] [SPEAKER_03]: that the data was secure.
[00:19:26] [SPEAKER_03]: It might say,
[00:19:28] [SPEAKER_03]: you know, you've already
[00:19:29] [SPEAKER_03]: spent an hour on this phone.
[00:19:30] [SPEAKER_03]: Maybe you want to put it down.
[00:19:32] [SPEAKER_03]: Maybe it even says,
[00:19:33] [SPEAKER_03]: hey, I can hear your kids
[00:19:34] [SPEAKER_03]: are asking you a question
[00:19:35] [SPEAKER_03]: in the background.
[00:19:35] [SPEAKER_03]: Why don't you go pay attention to them
[00:19:37] [SPEAKER_03]: as opposed to checking your email
[00:19:38] [SPEAKER_03]: for the 10th time today?
[00:19:41] [SPEAKER_03]: So I can imagine a world
[00:19:42] [SPEAKER_03]: where an AI can act as a coach,
[00:19:44] [SPEAKER_03]: and it's with you on your phone,
[00:19:46] [SPEAKER_03]: on your device, so that
[00:19:47] [SPEAKER_03]: it can create some healthier habits.
[00:19:49] [SPEAKER_01]: You're almost bringing up
[00:19:50] [SPEAKER_01]: this example of, you know,
[00:19:51] [SPEAKER_01]: good AI versus bad AI
[00:19:53] [SPEAKER_01]: or good actors
[00:19:54] [SPEAKER_01]: wielding AI
[00:19:55] [SPEAKER_01]: with different objective functions
[00:19:56] [SPEAKER_01]: versus bad actors wielding AI.
[00:19:58] [SPEAKER_01]: And so I think that objective
[00:20:00] [SPEAKER_01]: function point is so fricking key.
[00:20:02] [SPEAKER_01]: Like if the goal isn't to get you
[00:20:04] [SPEAKER_01]: to spend, you know,
[00:20:06] [SPEAKER_01]: 10 more minutes in your session
[00:20:07] [SPEAKER_01]: length on some platform,
[00:20:09] [SPEAKER_01]: but is instead to advance
[00:20:10] [SPEAKER_01]: your learning objectives,
[00:20:12] [SPEAKER_01]: that just takes the same technology,
[00:20:14] [SPEAKER_01]: the same superpower, if you will,
[00:20:15] [SPEAKER_01]: but just channels it
[00:20:16] [SPEAKER_01]: in a far more positive direction.
[00:20:19] [SPEAKER_03]: We are looking at ways
[00:20:20] [SPEAKER_03]: to make Khanmigo serve
[00:20:22] [SPEAKER_03]: that guardian angel function.
[00:20:23] [SPEAKER_03]: We're working on it
[00:20:24] [SPEAKER_03]: as a browser plugin,
[00:20:26] [SPEAKER_03]: even thinking about ways
[00:20:27] [SPEAKER_03]: it could surface
[00:20:27] [SPEAKER_03]: at an application level
[00:20:29] [SPEAKER_03]: or even at the operating system level
[00:20:30] [SPEAKER_03]: to provide those types
[00:20:34] [SPEAKER_03]: of protections.
[00:20:34] [SPEAKER_03]: And it's not just for kids
[00:20:35] [SPEAKER_03]: like we could all benefit
[00:20:36] [SPEAKER_03]: from something,
[00:20:37] [SPEAKER_03]: a little bit of a coach
[00:20:38] [SPEAKER_03]: that keeps us accountable
[00:20:40] [SPEAKER_03]: for things that are good for us.
[00:20:42] [SPEAKER_01]: So you've talked about AI as a tutor
[00:20:44] [SPEAKER_01]: that's encouraging you to learn,
[00:20:46] [SPEAKER_01]: not just feeding you answers,
[00:20:47] [SPEAKER_01]: but even for human teachers, right?
[00:20:50] [SPEAKER_01]: It can be really, really hard
[00:20:51] [SPEAKER_01]: to find that balance
[00:20:53] [SPEAKER_01]: of giving students
[00:20:54] [SPEAKER_01]: the right kind of questions
[00:20:55] [SPEAKER_01]: and information to help them along,
[00:20:58] [SPEAKER_01]: but not giving them so much
[00:21:00] [SPEAKER_01]: that they're not
[00:21:01] [SPEAKER_01]: thinking for themselves.
[00:21:02] [SPEAKER_01]: So, for example,
[00:21:03] [SPEAKER_01]: in your TED talk last year,
[00:21:04] [SPEAKER_01]: you had Khanmigo play
[00:21:05] [SPEAKER_01]: the part of Jay Gatsby,
[00:21:07] [SPEAKER_01]: of course, from The Great Gatsby.
[00:21:09] [SPEAKER_01]: And you asked him about the meaning
[00:21:11] [SPEAKER_01]: of the green light in the book.
[00:21:13] [SPEAKER_01]: And it said the green light
[00:21:14] [SPEAKER_01]: represents my dreams and desires.
[00:21:17] [SPEAKER_01]: It gave an answer
[00:21:18] [SPEAKER_01]: to something that normally a teacher
[00:21:19] [SPEAKER_01]: would get a student
[00:21:20] [SPEAKER_01]: to really rack their brains
[00:21:22] [SPEAKER_01]: and puzzle through.
[00:21:24] [SPEAKER_01]: So it seems like a part of learning
[00:21:25] [SPEAKER_01]: is being comfortable
[00:21:26] [SPEAKER_01]: not having the answers
[00:21:28] [SPEAKER_01]: come so easily,
[00:21:29] [SPEAKER_01]: not knowing everything right away,
[00:21:31] [SPEAKER_01]: you know, even forgetting things
[00:21:32] [SPEAKER_01]: and having to really reach
[00:21:34] [SPEAKER_01]: into the deep recesses of your mind
[00:21:35] [SPEAKER_01]: and pull that stuff out.
[00:21:37] [SPEAKER_01]: Do you worry that AI
[00:21:38] [SPEAKER_01]: will take that away?
[00:21:40] [SPEAKER_03]: I don't think it's going
[00:21:41] [SPEAKER_03]: to make the problem worse.
[00:21:42] [SPEAKER_03]: It can make the problem a lot better
[00:21:43] [SPEAKER_03]: because I'll point out again
[00:21:45] [SPEAKER_03]: the situation well before AI,
[00:21:47] [SPEAKER_03]: even when I was a kid,
[00:21:48] [SPEAKER_03]: if you wanted to know
[00:21:49] [SPEAKER_03]: why Jay Gatsby is looking
[00:21:50] [SPEAKER_03]: at the green light,
[00:21:51] [SPEAKER_03]: you could go look at your CliffsNotes
[00:21:52] [SPEAKER_03]: and that's going to be
[00:21:52] [SPEAKER_03]: the first thing they say.
[00:21:53] [SPEAKER_03]: And well before generative AI,
[00:21:55] [SPEAKER_03]: you do a Google search,
[00:21:56] [SPEAKER_03]: it'll take you about four seconds.
[00:21:58] [SPEAKER_03]: You know, tens of thousands
[00:22:00] [SPEAKER_03]: of people have analyzed
[00:22:00] [SPEAKER_03]: The Great Gatsby from every angle,
[00:22:02] [SPEAKER_03]: you know, left, right, and center.
[00:22:04] [SPEAKER_03]: The power of that demo
[00:22:06] [SPEAKER_03]: that I showed,
[00:22:06] [SPEAKER_03]: and this was a real one
[00:22:07] [SPEAKER_03]: from a student at Khan World School,
[00:22:09] [SPEAKER_03]: is that the conversation
[00:22:11] [SPEAKER_03]: didn't stop there.
[00:22:12] [SPEAKER_03]: She told me she talked
[00:22:13] [SPEAKER_03]: to him for a while
[00:22:14] [SPEAKER_03]: about life
[00:22:15] [SPEAKER_03]: and what it means to
[00:22:17] [SPEAKER_03]: have desires you can't attain,
[00:22:19] [SPEAKER_03]: et cetera, et cetera.
[00:22:19] [SPEAKER_03]: So it explains with Daisy Buchanan,
[00:22:22] [SPEAKER_03]: you know, she seemed unattainable,
[00:22:24] [SPEAKER_03]: et cetera.
[00:22:25] [SPEAKER_03]: But then Jay Gatsby,
[00:22:27] [SPEAKER_03]: the AI simulation of Jay Gatsby
[00:22:28] [SPEAKER_03]: goes and asks Sanvi, the student,
[00:22:32] [SPEAKER_03]: are there things like that in your life?
[00:22:33] [SPEAKER_03]: Things that you feel
[00:22:34] [SPEAKER_03]: are just a little bit out of reach,
[00:22:35] [SPEAKER_03]: but that you keep longing for?
[00:22:37] [SPEAKER_03]: And because of that,
[00:22:39] [SPEAKER_03]: it drove the conversation
[00:22:41] [SPEAKER_03]: in a thoughtful way.
[00:22:42] [SPEAKER_03]: So it was able to,
[00:22:43] [SPEAKER_03]: I think make it much,
[00:22:44] [SPEAKER_03]: much deeper for the student.
[00:22:46] [SPEAKER_03]: And then the other major
[00:22:48] [SPEAKER_03]: safeguard here is that,
[00:22:49] [SPEAKER_03]: and this is something
[00:22:50] [SPEAKER_03]: we put on Khanmigo,
[00:22:51] [SPEAKER_03]: not only does it not cheat,
[00:22:52] [SPEAKER_03]: but if you're under 18,
[00:22:54] [SPEAKER_03]: all of the conversations
[00:22:55] [SPEAKER_03]: are accessible by your teachers and parents.
[00:22:58] [SPEAKER_03]: And we can report back
[00:23:00] [SPEAKER_03]: to your teachers and parents.
[00:23:01] [SPEAKER_03]: So in the past,
[00:23:03] [SPEAKER_03]: in the '80s,
[00:23:04] [SPEAKER_03]: when I was reading
[00:23:05] [SPEAKER_03]: The Great Gatsby,
[00:23:06] [SPEAKER_03]: if I went to the CliffsNotes,
[00:23:09] [SPEAKER_03]: there's no way my teacher
[00:23:10] [SPEAKER_03]: knows about that.
[00:23:11] [SPEAKER_03]: So I think that oversight,
[00:23:12] [SPEAKER_03]: that transparency to the adults
[00:23:14] [SPEAKER_03]: is actually going to undermine
[00:23:16] [SPEAKER_03]: this cheating or shortcutting
[00:23:19] [SPEAKER_03]: way more than anything
[00:23:20] [SPEAKER_03]: we could do before.
[00:23:22] [SPEAKER_01]: You know the saying,
[00:23:23] [SPEAKER_01]: snitches get stitches.
[00:23:24] [SPEAKER_01]: Are students going to like that?
[00:23:26] [SPEAKER_01]: They might not,
[00:23:27] [SPEAKER_01]: but it's hard to give the AI stitches.
[00:23:30] [SPEAKER_01]: Fair.
[00:23:31] [SPEAKER_01]: I have to ask a follow up there, right,
[00:23:32] [SPEAKER_01]: which is what kind of cognitive abilities
[00:23:36] [SPEAKER_01]: should kids,
[00:23:37] [SPEAKER_01]: you know, kind of outsource to AI
[00:23:39] [SPEAKER_01]: and what are the ones
[00:23:39] [SPEAKER_01]: we really want them
[00:23:40] [SPEAKER_01]: to learn for themselves?
[00:23:41] [SPEAKER_01]: Like, so I'm thinking about
[00:23:42] [SPEAKER_01]: the invention of the calculator
[00:23:44] [SPEAKER_01]: and mental math, right?
[00:23:45] [SPEAKER_01]: Or Google Maps and spatial awareness.
[00:23:48] [SPEAKER_01]: As this tech
[00:23:48] [SPEAKER_01]: keeps getting better and better,
[00:23:50] [SPEAKER_01]: there are more and more things
[00:23:51] [SPEAKER_01]: that these systems can do
[00:23:52] [SPEAKER_01]: that used to be reserved purely for humans.
[00:23:54] [SPEAKER_01]: How do you think about
[00:23:55] [SPEAKER_01]: drawing that line between
[00:23:56] [SPEAKER_01]: leveraging AI capabilities
[00:23:58] [SPEAKER_01]: and preserving that development
[00:24:00] [SPEAKER_01]: of human skills?
[00:24:01] [SPEAKER_03]: I'm a little bit of a traditionalist,
[00:24:03] [SPEAKER_03]: even before AI,
[00:24:04] [SPEAKER_03]: you know, when the calculator comes out,
[00:24:05] [SPEAKER_03]: I'm like, no, you still have to have
[00:24:06] [SPEAKER_03]: fluency in mathematics,
[00:24:08] [SPEAKER_03]: and then you'll be able
[00:24:09] [SPEAKER_03]: to use your tools better.
[00:24:10] [SPEAKER_03]: Same thing is true
[00:24:11] [SPEAKER_03]: when the Internet comes out
[00:24:13] [SPEAKER_03]: and web search comes out.
[00:24:14] [SPEAKER_03]: People said, oh, kids
[00:24:15] [SPEAKER_03]: don't have to know facts anymore.
[00:24:16] [SPEAKER_03]: They can look it up within five seconds.
[00:24:17] [SPEAKER_03]: I was like, there's a huge difference
[00:24:19] [SPEAKER_03]: in how well someone who has a content
[00:24:21] [SPEAKER_03]: base, a fact base,
[00:24:22] [SPEAKER_03]: can use these tools
[00:24:24] [SPEAKER_03]: versus people who don't.
[00:24:25] [SPEAKER_03]: And the same thing is going to happen
[00:24:27] [SPEAKER_03]: with artificial intelligence.
[00:24:28] [SPEAKER_03]: These frontier models
[00:24:30] [SPEAKER_03]: are performing at the 80th percentile
[00:24:32] [SPEAKER_03]: on a lot of these standardized tests
[00:24:33] [SPEAKER_03]: like the LSAT and the SAT, et cetera.
[00:24:36] [SPEAKER_03]: And so people
[00:24:38] [SPEAKER_03]: have a decision to make.
[00:24:39] [SPEAKER_03]: Do you want to be able
[00:24:40] [SPEAKER_03]: to hang with the AI
[00:24:41] [SPEAKER_03]: because someone's still going to be
[00:24:43] [SPEAKER_03]: able to put the pieces together
[00:24:44] [SPEAKER_03]: and leverage these tools
[00:24:46] [SPEAKER_03]: to amplify their intent?
[00:24:47] [SPEAKER_03]: Or do you want to let
[00:24:48] [SPEAKER_03]: the AI pass you by?
[00:24:49] [SPEAKER_03]: And maybe it might be OK
[00:24:51] [SPEAKER_03]: to shortcut a few assignments,
[00:24:52] [SPEAKER_03]: but I would argue those
[00:24:53] [SPEAKER_03]: were probably not well-designed
[00:24:54] [SPEAKER_03]: assignments if they make
[00:24:55] [SPEAKER_03]: the shortcutting easy.
[00:24:57] [SPEAKER_03]: But that means you're also going
[00:24:59] [SPEAKER_03]: to be very vulnerable
[00:25:01] [SPEAKER_03]: to dislocation.
[00:25:02] [SPEAKER_03]: You're going to have trouble
[00:25:03] [SPEAKER_03]: getting a knowledge economy job.
[00:25:04] [SPEAKER_03]: Interestingly,
[00:25:06] [SPEAKER_03]: I think AI will open up more doors
[00:25:07] [SPEAKER_03]: for maybe less knowledge-economy
[00:25:10] [SPEAKER_03]: but more human-centered jobs.
[00:25:12] [SPEAKER_03]: You know, nursing,
[00:25:14] [SPEAKER_03]: which is also
[00:25:15] [SPEAKER_03]: a knowledge job as well.
[00:25:16] [SPEAKER_03]: But it could be caretaking maybe
[00:25:19] [SPEAKER_03]: or, you know,
[00:25:20] [SPEAKER_03]: counseling, which is also a knowledge job,
[00:25:22] [SPEAKER_03]: but also has a huge human element to it.
[00:25:25] [SPEAKER_03]: But I think in general,
[00:25:26] [SPEAKER_03]: people who are able to leverage
[00:25:28] [SPEAKER_03]: and know how to use these tools
[00:25:29] [SPEAKER_03]: are going to be the ones
[00:25:30] [SPEAKER_03]: in the best situation.
[00:25:31] [SPEAKER_03]: That conversation I mentioned
[00:25:32] [SPEAKER_03]: with that reporter
[00:25:34] [SPEAKER_03]: talking to Khanmigo
[00:25:35] [SPEAKER_03]: about the Second Amendment
[00:25:36] [SPEAKER_03]: and Khanmigo pushing their thinking.
[00:25:38] [SPEAKER_03]: You're not seeing that a lot,
[00:25:40] [SPEAKER_03]: unfortunately, in a lot of classrooms.
[00:25:41] [SPEAKER_03]: You see that in some really
[00:25:43] [SPEAKER_03]: well thought out classrooms
[00:25:44] [SPEAKER_03]: that are seminar style
[00:25:45] [SPEAKER_03]: where the professor is
[00:25:47] [SPEAKER_03]: constantly pushing students'
[00:25:49] [SPEAKER_03]: thinking in a Socratic way.
[00:25:50] [SPEAKER_03]: But that's not mainstream.
[00:25:52] [SPEAKER_03]: And hopefully with AI,
[00:25:53] [SPEAKER_03]: we can make that a lot more mainstream.
[00:25:54] [SPEAKER_03]: Khanmigo has other activities.
[00:25:55] [SPEAKER_03]: You can get into a debate with Khanmigo
[00:25:57] [SPEAKER_03]: where you take one side of the issue
[00:25:58] [SPEAKER_03]: and it takes the other,
[00:25:59] [SPEAKER_03]: and then it gives you feedback.
[00:26:01] [SPEAKER_03]: We're getting a lot of feedback
[00:26:02] [SPEAKER_03]: from high school and college students
[00:26:03] [SPEAKER_03]: that they don't feel safe
[00:26:05] [SPEAKER_03]: having these debates anymore.
[00:26:07] [SPEAKER_03]: Or even if they did,
[00:26:08] [SPEAKER_03]: they sometimes are embarrassed.
[00:26:09] [SPEAKER_03]: They don't know how strong
[00:26:10] [SPEAKER_03]: their arguments are.
[00:26:11] [SPEAKER_03]: But now there's a place
[00:26:12] [SPEAKER_03]: where they can practice
[00:26:13] [SPEAKER_03]: this critical thinking.
[00:26:15] [SPEAKER_01]: We're going to take a short break.
[00:26:16] [SPEAKER_01]: And when we come back,
[00:26:17] [SPEAKER_01]: we'll face the perennial question.
[00:26:19] [SPEAKER_01]: If AI gets really good at teaching,
[00:26:21] [SPEAKER_01]: where does that leave human teachers?
[00:26:28] [SPEAKER_01]: So, Sal, I have to ask
[00:26:29] [SPEAKER_01]: the question for the teachers
[00:26:31] [SPEAKER_01]: that might be listening to this, right?
[00:26:32] [SPEAKER_01]: What would you say to teachers
[00:26:33] [SPEAKER_01]: that are like, yo,
[00:26:34] [SPEAKER_01]: this is the end of teachers.
[00:26:36] [SPEAKER_01]: This is the end of human teachers.
[00:26:39] [SPEAKER_03]: I couldn't disagree more.
[00:26:40] [SPEAKER_03]: I'll start saying
[00:26:41] [SPEAKER_03]: and I've said this well
[00:26:42] [SPEAKER_03]: before AI came on the scene.
[00:26:43] [SPEAKER_03]: If I had to pick between
[00:26:44] [SPEAKER_03]: an amazing teacher
[00:26:45] [SPEAKER_03]: and amazing technology,
[00:26:46] [SPEAKER_03]: I'd pick the amazing teacher every time.
[00:26:48] [SPEAKER_03]: Now, hopefully we don't have
[00:26:49] [SPEAKER_03]: to make that trade off
[00:26:50] [SPEAKER_03]: and many times
[00:26:51] [SPEAKER_03]: there isn't an amazing teacher.
[00:26:52] [SPEAKER_03]: And so hopefully amazing technology
[00:26:54] [SPEAKER_03]: can help raise the floor.
[00:26:55] [SPEAKER_03]: But ideally
[00:26:57] [SPEAKER_03]: you have access to both.
[00:26:58] [SPEAKER_03]: When you focus too much
[00:26:59] [SPEAKER_03]: on the technology,
[00:27:00] [SPEAKER_03]: it can kind of
[00:27:01] [SPEAKER_03]: misdirect the conversation.
[00:27:03] [SPEAKER_03]: If I told every teacher in the world,
[00:27:06] [SPEAKER_03]: every one of your students
[00:27:07] [SPEAKER_03]: is now going to have access
[00:27:08] [SPEAKER_03]: to someone at night
[00:27:10] [SPEAKER_03]: who will sit next to them
[00:27:12] [SPEAKER_03]: and help them in an ethical way,
[00:27:13] [SPEAKER_03]: support them,
[00:27:14] [SPEAKER_03]: not do the problem for them.
[00:27:15] [SPEAKER_03]: I think every teacher would say,
[00:27:17] [SPEAKER_03]: hallelujah, yes.
[00:27:18] [SPEAKER_03]: And by the way,
[00:27:19] [SPEAKER_03]: if I could talk to that tutor,
[00:27:21] [SPEAKER_03]: and if they knew what we were covering in class
[00:27:23] [SPEAKER_03]: and I could get reports back about that,
[00:27:26] [SPEAKER_03]: even better.
[00:27:26] [SPEAKER_03]: That would be incredible.
[00:27:29] [SPEAKER_03]: Would a teacher feel threatened by that?
[00:27:30] [SPEAKER_03]: I don't think so
[00:27:32] [SPEAKER_03]: because the teacher is still in charge.
[00:27:33] [SPEAKER_03]: They're the ones that are still
[00:27:35] [SPEAKER_03]: the conductor of the orchestra
[00:27:36] [SPEAKER_03]: who are figuring out,
[00:27:38] [SPEAKER_03]: OK, what are we doing?
[00:27:38] [SPEAKER_03]: How are we doing it?
[00:27:40] [SPEAKER_03]: What's the lesson plans,
[00:27:41] [SPEAKER_03]: et cetera, et cetera.
[00:27:41] [SPEAKER_03]: But if they have supports on that journey,
[00:27:44] [SPEAKER_03]: I think all the better.
[00:27:46] [SPEAKER_03]: The amount of time teachers spend
[00:27:47] [SPEAKER_03]: non-student facing,
[00:27:49] [SPEAKER_03]: as we mentioned,
[00:27:50] [SPEAKER_03]: for lesson planning,
[00:27:50] [SPEAKER_03]: grading papers,
[00:27:51] [SPEAKER_03]: et cetera, et cetera.
[00:27:52] [SPEAKER_03]: If we can shorten that,
[00:27:53] [SPEAKER_03]: that gives teachers more time
[00:27:54] [SPEAKER_03]: for the human centered part.
[00:27:56] [SPEAKER_03]: It allows the teacher to elevate.
[00:27:57] [SPEAKER_03]: It allows them to go dig deeper.
[00:27:59] [SPEAKER_03]: Most teachers
[00:28:00] [SPEAKER_03]: don't dream about grading
[00:28:01] [SPEAKER_03]: 180 papers over a weekend.
[00:28:03] [SPEAKER_03]: I think they dream about
[00:28:05] [SPEAKER_03]: motivating kids,
[00:28:06] [SPEAKER_03]: forming that connection,
[00:28:07] [SPEAKER_03]: forming that bond,
[00:28:08] [SPEAKER_03]: being the reason why that student
[00:28:09] [SPEAKER_03]: believes in themselves,
[00:28:11] [SPEAKER_03]: why they had that aha moment
[00:28:12] [SPEAKER_03]: on that concept.
[00:28:14] [SPEAKER_03]: They had that hands on experience
[00:28:16] [SPEAKER_03]: that unlocked their thinking,
[00:28:17] [SPEAKER_03]: their creativity.
[00:28:18] [SPEAKER_03]: And I think we're going
[00:28:19] [SPEAKER_03]: to see more of that.
[00:28:20] [SPEAKER_01]: What you're bringing up is that
[00:28:22] [SPEAKER_01]: it isn't an either or.
[00:28:24] [SPEAKER_01]: It's a yes and.
[00:28:25] [SPEAKER_01]: You know, like when you describe
[00:28:27] [SPEAKER_01]: all these benefits,
[00:28:28] [SPEAKER_01]: it almost starts to feel like
[00:28:29] [SPEAKER_01]: there's a cost of not
[00:28:31] [SPEAKER_01]: integrating AI into education.
[00:28:33] [SPEAKER_03]: There's a huge cost.
[00:28:35] [SPEAKER_03]: And the moment
[00:28:37] [SPEAKER_03]: could not have come sooner.
[00:28:38] [SPEAKER_03]: If we don't do it,
[00:28:40] [SPEAKER_03]: you know, we'll have
[00:28:41] [SPEAKER_03]: the status quo system.
[00:28:43] [SPEAKER_03]: And the status quo system,
[00:28:44] [SPEAKER_03]: I write a lot about it
[00:28:45] [SPEAKER_03]: in my current book
[00:28:45] [SPEAKER_03]: and my previous book,
[00:28:47] [SPEAKER_03]: is a lot better than
[00:28:48] [SPEAKER_03]: what we used to have.
[00:28:49] [SPEAKER_03]: Great things happened.
[00:28:50] [SPEAKER_03]: But it was a
[00:28:52] [SPEAKER_03]: one-pace-fits-all system.
[00:28:54] [SPEAKER_03]: Some kids succeeded.
[00:28:55] [SPEAKER_03]: Kids like yourself or myself
[00:28:57] [SPEAKER_03]: or probably a lot of
[00:28:57] [SPEAKER_03]: the folks listening,
[00:28:58] [SPEAKER_03]: but a lot of kids
[00:28:59] [SPEAKER_03]: accumulated gaps and they fell off.
[00:29:01] [SPEAKER_03]: They dropped out.
[00:29:04] [SPEAKER_03]: Or, like a majority
[00:29:05] [SPEAKER_03]: of kids in America,
[00:29:07] [SPEAKER_03]: when they go to college,
[00:29:08] [SPEAKER_03]: 60 to 70% of them,
[00:29:09] [SPEAKER_03]: when the colleges
[00:29:10] [SPEAKER_03]: give them a placement test,
[00:29:11] [SPEAKER_03]: are operating at a seventh-grade
[00:29:13] [SPEAKER_03]: level in both writing
[00:29:13] [SPEAKER_03]: and mathematics.
[00:29:14] [SPEAKER_03]: So that's the current status quo.
[00:29:16] [SPEAKER_03]: And if you think things are bad,
[00:29:19] [SPEAKER_03]: imagine when they have to enter
[00:29:21] [SPEAKER_03]: a workforce where the AI
[00:29:23] [SPEAKER_03]: is not operating
[00:29:24] [SPEAKER_03]: at a seventh-grade level.
[00:29:25] [SPEAKER_03]: The AI is operating
[00:29:26] [SPEAKER_03]: at a 12th-grade level.
[00:29:27] [SPEAKER_03]: I think most folks would say,
[00:29:29] [SPEAKER_03]: hey, we would want the
[00:29:30] [SPEAKER_03]: benefits of AI to accrue to
[00:29:31] [SPEAKER_03]: all students, or as many
[00:29:32] [SPEAKER_03]: students as possible.
[00:29:34] [SPEAKER_03]: They need a level up.
[00:29:36] [SPEAKER_03]: So I think AI
[00:29:37] [SPEAKER_03]: introduces these opportunities,
[00:29:39] [SPEAKER_03]: but it also introduces
[00:29:40] [SPEAKER_03]: the urgency for people.
[00:29:42] [SPEAKER_03]: You know, before, it was nice
[00:29:42] [SPEAKER_03]: if you learned calculus,
[00:29:44] [SPEAKER_03]: and maybe it's not calculus
[00:29:45] [SPEAKER_03]: that everyone has to learn,
[00:29:46] [SPEAKER_03]: but it was nice if you could
[00:29:47] [SPEAKER_03]: put AI tools together
[00:29:49] [SPEAKER_03]: to do something productive
[00:29:50] [SPEAKER_03]: or if you could write well,
[00:29:51] [SPEAKER_03]: et cetera, et cetera.
[00:29:51] [SPEAKER_03]: Now I think it's going
[00:29:52] [SPEAKER_03]: to become pretty imperative.
[00:29:54] [SPEAKER_01]: So we've talked about what
[00:29:55] [SPEAKER_01]: Khanmigo can do for students
[00:29:57] [SPEAKER_01]: and teachers,
[00:29:58] [SPEAKER_01]: but you're actually testing it out
[00:30:00] [SPEAKER_01]: in schools right now.
[00:30:01] [SPEAKER_01]: You've got two pilot programs
[00:30:03] [SPEAKER_01]: running in schools
[00:30:03] [SPEAKER_01]: in Newark, New Jersey
[00:30:04] [SPEAKER_01]: and Hobart, Indiana.
[00:30:06] [SPEAKER_01]: How are they going?
[00:30:07] [SPEAKER_03]: So those were the first two
[00:30:09] [SPEAKER_03]: that we started back
[00:30:10] [SPEAKER_03]: in March of 2023.
[00:30:12] [SPEAKER_03]: They've been going very well,
[00:30:14] [SPEAKER_03]: both of them.
[00:30:15] [SPEAKER_03]: We picked those intentionally
[00:30:16] [SPEAKER_03]: because both of those districts
[00:30:17] [SPEAKER_03]: were already using
[00:30:18] [SPEAKER_03]: Khan Academy to great effect.
[00:30:19] [SPEAKER_03]: And in the not too far future,
[00:30:21] [SPEAKER_03]: some efficacy studies
[00:30:22] [SPEAKER_03]: will be coming out.
[00:30:23] [SPEAKER_03]: But since then we've scaled.
[00:30:24] [SPEAKER_03]: We're at about 160,000 students
[00:30:26] [SPEAKER_03]: and teachers using it right now.
[00:30:27] [SPEAKER_03]: As we go into this
[00:30:28] [SPEAKER_03]: coming school year,
[00:30:29] [SPEAKER_03]: it's probably going to be
[00:30:30] [SPEAKER_03]: on the order of a million
[00:30:32] [SPEAKER_03]: formally using it
[00:30:33] [SPEAKER_03]: as part of their districts.
[00:30:34] [SPEAKER_03]: We're seeing a certain class
[00:30:36] [SPEAKER_03]: of student who, immediately
[00:30:38] [SPEAKER_03]: when they have access to the AI,
[00:30:39] [SPEAKER_03]: are off to the races.
[00:30:40] [SPEAKER_03]: I always call it 15 to 20%
[00:30:42] [SPEAKER_03]: of students who
[00:30:45] [SPEAKER_03]: immediately understand
[00:30:46] [SPEAKER_03]: what they can now do.
[00:30:47] [SPEAKER_03]: I would say the other 80%
[00:30:48] [SPEAKER_03]: of students, it's interesting.
[00:30:51] [SPEAKER_03]: Some of them are having trouble
[00:30:53] [SPEAKER_03]: articulating what they need
[00:30:55] [SPEAKER_03]: or how to communicate it.
[00:30:57] [SPEAKER_03]: And at first I thought
[00:30:57] [SPEAKER_03]: this was a problem with
[00:30:59] [SPEAKER_03]: the technology or with the AI.
[00:31:01] [SPEAKER_03]: But then, the more that
[00:31:02] [SPEAKER_03]: we talked to educators,
[00:31:03] [SPEAKER_03]: they were saying,
[00:31:03] [SPEAKER_03]: you don't understand,
[00:31:04] [SPEAKER_03]: this was a problem all along.
[00:31:06] [SPEAKER_03]: Like these kids would raise their hand
[00:31:07] [SPEAKER_03]: and the teacher would call on them
[00:31:09] [SPEAKER_03]: and say, OK, Sal,
[00:31:10] [SPEAKER_03]: how can I help you?
[00:31:11] [SPEAKER_03]: And it's, never mind,
[00:31:13] [SPEAKER_03]: I don't know, huh?
[00:31:14] [SPEAKER_03]: You know, like they weren't able
[00:31:16] [SPEAKER_03]: to articulate it.
[00:31:16] [SPEAKER_03]: So these teachers are telling us
[00:31:17] [SPEAKER_03]: it's really important
[00:31:18] [SPEAKER_03]: for these students
[00:31:18] [SPEAKER_03]: to practice these skills.
[00:31:20] [SPEAKER_01]: I love that.
[00:31:21] [SPEAKER_01]: Are there any other adjustments
[00:31:23] [SPEAKER_01]: to the program that you're making?
[00:31:24] [SPEAKER_01]: I'm super excited
[00:31:25] [SPEAKER_01]: for these studies that come out
[00:31:26] [SPEAKER_01]: because I was curious
[00:31:27] [SPEAKER_01]: to ask you about, like, a delta
[00:31:28] [SPEAKER_01]: or difference in how kids
[00:31:29] [SPEAKER_01]: perform on tests
[00:31:30] [SPEAKER_01]: and assignments with versus without
[00:31:32] [SPEAKER_01]: access to these AI systems.
[00:31:33] [SPEAKER_01]: But have you made any adjustments
[00:31:35] [SPEAKER_01]: to your product
[00:31:35] [SPEAKER_01]: or your teaching curriculum?
[00:31:37] [SPEAKER_01]: Any assumptions that you've had
[00:31:38] [SPEAKER_01]: to question based on the
[00:31:40] [SPEAKER_01]: learnings from these pilots?
[00:31:41] [SPEAKER_03]: Oh, there's a ton.
[00:31:42] [SPEAKER_03]: We have this internal initiative
[00:31:44] [SPEAKER_03]: called Proactive Khanmigo,
[00:31:45] [SPEAKER_03]: which is, we realized Khanmigo
[00:31:47] [SPEAKER_03]: shouldn't just wait to be asked.
[00:31:48] [SPEAKER_03]: A good tutor
[00:31:49] [SPEAKER_03]: would hold a student accountable,
[00:31:51] [SPEAKER_03]: message them,
[00:31:52] [SPEAKER_03]: maybe message their teacher,
[00:31:53] [SPEAKER_03]: their parents to make sure
[00:31:54] [SPEAKER_03]: that that student
[00:31:55] [SPEAKER_03]: gets engaged on things.
[00:31:57] [SPEAKER_03]: We've had these debates internally
[00:31:59] [SPEAKER_03]: when we saw a lot of students
[00:32:00] [SPEAKER_03]: had trouble
[00:32:01] [SPEAKER_03]: just articulating their question.
[00:32:02] [SPEAKER_03]: We did this thing called
[00:32:03] [SPEAKER_03]: dynamic action bubbles,
[00:32:04] [SPEAKER_03]: where Khanmigo would offer
[00:32:07] [SPEAKER_03]: potential responses,
[00:32:08] [SPEAKER_03]: and we're making it an option,
[00:32:10] [SPEAKER_03]: potentially one
[00:32:10] [SPEAKER_03]: that teachers can turn on or off
[00:32:12] [SPEAKER_03]: depending on how much
[00:32:12] [SPEAKER_03]: support students need.
[00:32:13] [SPEAKER_03]: So there's been a lot of
[00:32:15] [SPEAKER_03]: debates about that.
[00:32:16] [SPEAKER_03]: How much support is enough support?
[00:32:18] [SPEAKER_03]: The teacher tools
[00:32:19] [SPEAKER_03]: we're continuing to add
[00:32:20] [SPEAKER_03]: as we learn from teachers,
[00:32:21] [SPEAKER_03]: the types of workflows
[00:32:22] [SPEAKER_03]: that they want to get
[00:32:23] [SPEAKER_03]: productivity on.
[00:32:24] [SPEAKER_03]: And then, yeah,
[00:32:25] [SPEAKER_03]: how do we just make it
[00:32:25] [SPEAKER_03]: more omnipresent?
[00:32:27] [SPEAKER_03]: You know, a browser plug-in,
[00:32:28] [SPEAKER_03]: or different platforms
[00:32:30] [SPEAKER_03]: that students might use, like
[00:32:30] [SPEAKER_03]: their learning management system.
[00:32:32] [SPEAKER_03]: How do we integrate with that?
[00:32:33] [SPEAKER_03]: So it's just more where
[00:32:34] [SPEAKER_03]: the students and teachers are.
[00:32:36] [SPEAKER_01]: So what are you excited
[00:32:37] [SPEAKER_01]: to try next with Khanmigo?
[00:32:39] [SPEAKER_01]: I saw that demo
[00:32:40] [SPEAKER_01]: you made with OpenAI
[00:32:41] [SPEAKER_01]: where your son is solving
[00:32:42] [SPEAKER_01]: a basic trigonometry problem
[00:32:44] [SPEAKER_01]: using an iPad.
[00:32:45] [SPEAKER_01]: And he's having a conversation
[00:32:47] [SPEAKER_01]: with GPT-4o the whole time,
[00:32:50] [SPEAKER_01]: highlighting things on the iPad
[00:32:51] [SPEAKER_01]: and GPT is responding
[00:32:53] [SPEAKER_01]: just like a real teacher would
[00:32:55] [SPEAKER_01]: looking over his shoulder.
[00:32:56] [SPEAKER_03]: One of the demos we showed
[00:32:58] [SPEAKER_03]: it could see the screen
[00:33:00] [SPEAKER_03]: and my son, I feel bad.
[00:33:01] [SPEAKER_03]: He had to pretend
[00:33:02] [SPEAKER_03]: like he didn't know
[00:33:02] [SPEAKER_03]: what a hypotenuse is.
[00:33:03] [SPEAKER_01]: I was going to say
[00:33:04] [SPEAKER_01]: that seemed like
[00:33:04] [SPEAKER_01]: the least believable part
[00:33:05] [SPEAKER_01]: of that demo.
[00:33:08] [SPEAKER_01]: It's like, really?
[00:33:08] [SPEAKER_01]: Sal's kid doesn't know?
[00:33:09] [SPEAKER_01]: I don't buy that.
[00:33:11] [SPEAKER_03]: To his credit,
[00:33:12] [SPEAKER_03]: He's a low ego kid.
[00:33:14] [SPEAKER_03]: So he was completely cool.
[00:33:16] [SPEAKER_03]: I think he knew what a hypotenuse
[00:33:17] [SPEAKER_03]: was when he was about five.
[00:33:20] [SPEAKER_03]: But he went with it
[00:33:21] [SPEAKER_03]: and made it a better demo,
[00:33:22] [SPEAKER_03]: but it could see the triangle
[00:33:24] [SPEAKER_03]: and it could see that
[00:33:24] [SPEAKER_03]: he was marking the wrong
[00:33:26] [SPEAKER_03]: side as the hypotenuse.
[00:33:28] [SPEAKER_03]: That technology isn't going
[00:33:29] [SPEAKER_03]: to be in Khanmigo tomorrow.
[00:33:31] [SPEAKER_03]: And it is more fragile.
[00:33:33] [SPEAKER_03]: That demo didn't have any edits.
[00:33:34] [SPEAKER_03]: I mean, that is the technology,
[00:33:35] [SPEAKER_03]: but it's not perfect all the time.
[00:33:37] [SPEAKER_03]: And I think it's very costly,
[00:33:38] [SPEAKER_03]: computationally.
[00:33:41] [SPEAKER_03]: I would guess
[00:33:42] [SPEAKER_03]: we're probably a year,
[00:33:43] [SPEAKER_03]: year and a half away
[00:33:43] [SPEAKER_03]: before that type of technology
[00:33:45] [SPEAKER_03]: is accessible enough
[00:33:47] [SPEAKER_03]: that we're putting it in Khanmigo.
[00:33:49] [SPEAKER_03]: But I'm excited
[00:33:50] [SPEAKER_03]: about that vision capability.
[00:33:51] [SPEAKER_03]: I'm excited about
[00:33:52] [SPEAKER_03]: you know, we talk about Khanmigo
[00:33:53] [SPEAKER_03]: or an AI as a guardian angel.
[00:33:55] [SPEAKER_03]: I'm excited about
[00:33:56] [SPEAKER_03]: potentially using AI
[00:33:58] [SPEAKER_03]: to do things like
[00:33:58] [SPEAKER_03]: make the classroom more human,
[00:34:00] [SPEAKER_03]: more interactive,
[00:34:01] [SPEAKER_03]: facilitate conversations
[00:34:02] [SPEAKER_03]: between people
[00:34:05] [SPEAKER_03]: as a TA, be able to not only
[00:34:06] [SPEAKER_03]: break students out into breakout groups,
[00:34:08] [SPEAKER_03]: but actually facilitate
[00:34:09] [SPEAKER_03]: those breakouts
[00:34:10] [SPEAKER_03]: in a kind of a real time way.
[00:34:12] [SPEAKER_03]: So there's a lot.
[00:34:13] [SPEAKER_03]: If we think five, ten years out,
[00:34:14] [SPEAKER_03]: I can dream about,
[00:34:15] [SPEAKER_03]: we can dream about virtual reality
[00:34:18] [SPEAKER_03]: and going to ancient Rome
[00:34:19] [SPEAKER_03]: and talking to Julius Caesar.
[00:34:21] [SPEAKER_03]: And it feels like a real Julius Caesar.
[00:34:23] [SPEAKER_03]: So there's some exciting things coming.
[00:34:25] [SPEAKER_01]: I got to ask just out of curiosity.
[00:34:27] [SPEAKER_01]: We started with Nadia.
[00:34:28] [SPEAKER_01]: Like how's Nadia doing these days?
[00:34:30] [SPEAKER_01]: Like, you know,
[00:34:30] [SPEAKER_01]: what does she think about
[00:34:31] [SPEAKER_01]: all the stuff that's happening?
[00:34:33] [SPEAKER_03]: Oh, yeah, I mean, well,
[00:34:34] [SPEAKER_03]: she's slowly gotten used
[00:34:35] [SPEAKER_03]: to being the Nadia.
[00:34:37] [SPEAKER_03]: She graduated from Sarah Lawrence
[00:34:38] [SPEAKER_03]: many years ago now.
[00:34:39] [SPEAKER_03]: Nadia is now in her early 30s,
[00:34:42] [SPEAKER_03]: and Fareed Zakaria was the commencement speaker
[00:34:44] [SPEAKER_03]: and he even gave her a shout out
[00:34:45] [SPEAKER_03]: of like the student that launched
[00:34:47] [SPEAKER_03]: billions of lessons or whatever.
[00:34:49] [SPEAKER_03]: So, you know,
[00:34:49] [SPEAKER_03]: she's always getting embarrassed there.
[00:34:50] [SPEAKER_03]: But no, she's in her final year
[00:34:52] [SPEAKER_03]: of a PhD program in New York,
[00:34:54] [SPEAKER_03]: in clinical psychology,
[00:34:55] [SPEAKER_03]: and she's about to get married.
[00:34:56] [SPEAKER_03]: So, you know, knock on wood,
[00:34:57] [SPEAKER_03]: things seem to be going well for her.
[00:34:59] [SPEAKER_03]: I can't take credit for everything.
[00:35:00] [SPEAKER_03]: Actually, very little
[00:35:01] [SPEAKER_03]: of what she's accomplished
[00:35:02] [SPEAKER_03]: since seventh grade math.
[00:35:03] [SPEAKER_01]: But hey, at least
[00:35:04] [SPEAKER_01]: you got her to realize
[00:35:06] [SPEAKER_01]: that math is something
[00:35:07] [SPEAKER_01]: that she could like.
[00:35:08] [SPEAKER_03]: Definitely.
[00:35:09] [SPEAKER_01]: Sal, thanks so much
[00:35:09] [SPEAKER_01]: for coming on the show.
[00:35:10] [SPEAKER_03]: Thanks for having me.
[00:35:16] [SPEAKER_01]: Here's what I came away
[00:35:18] [SPEAKER_01]: from this interview thinking about.
[00:35:20] [SPEAKER_01]: If Sal's prophecy for AI
[00:35:21] [SPEAKER_01]: education proves true,
[00:35:23] [SPEAKER_01]: then the future of education
[00:35:24] [SPEAKER_01]: looks very bright.
[00:35:26] [SPEAKER_01]: Students who don't have access
[00:35:27] [SPEAKER_01]: to tutors or the privilege
[00:35:29] [SPEAKER_01]: of a private school education,
[00:35:31] [SPEAKER_01]: students whose teachers
[00:35:32] [SPEAKER_01]: are stretched to capacity
[00:35:33] [SPEAKER_01]: in underfunded, understaffed schools,
[00:35:35] [SPEAKER_01]: would have a lot more support.
[00:35:37] [SPEAKER_01]: Teachers in the meantime
[00:35:38] [SPEAKER_01]: can spend more time
[00:35:39] [SPEAKER_01]: connecting with kids,
[00:35:41] [SPEAKER_01]: building up their confidence,
[00:35:42] [SPEAKER_01]: helping them focus
[00:35:43] [SPEAKER_01]: instead of spending hours
[00:35:45] [SPEAKER_01]: grading and churning out reports.
[00:35:48] [SPEAKER_01]: If Sal's right,
[00:35:49] [SPEAKER_01]: this is technology
[00:35:50] [SPEAKER_01]: that will help us
[00:35:51] [SPEAKER_01]: reclaim our humanity
[00:35:52] [SPEAKER_01]: rather than pulling us away from it,
[00:35:54] [SPEAKER_01]: fueling our curiosity to learn
[00:35:56] [SPEAKER_01]: on an individual level.
[00:35:59] [SPEAKER_01]: But that's if schools
[00:36:00] [SPEAKER_01]: use this technology as promised.
[00:36:02] [SPEAKER_01]: It's not too hard to imagine
[00:36:04] [SPEAKER_01]: a very different future
[00:36:05] [SPEAKER_01]: where we have more AI
[00:36:07] [SPEAKER_01]: and fewer human teachers.
[00:36:09] [SPEAKER_01]: I can see a school district
[00:36:10] [SPEAKER_01]: strapped for cash
[00:36:11] [SPEAKER_01]: making this argument.
[00:36:12] [SPEAKER_01]: Teachers demand rights
[00:36:13] [SPEAKER_01]: and good working conditions.
[00:36:14] [SPEAKER_01]: So why don't we just cut back on teachers
[00:36:17] [SPEAKER_01]: and just grow class sizes?
[00:36:19] [SPEAKER_01]: After all, AI can fill in
[00:36:21] [SPEAKER_01]: for any individualized attention
[00:36:22] [SPEAKER_01]: that's lost.
[00:36:24] [SPEAKER_01]: So where I net out
[00:36:25] [SPEAKER_01]: is that technology itself
[00:36:27] [SPEAKER_01]: is full of promise
[00:36:28] [SPEAKER_01]: and it's only going to get better from here.
[00:36:31] [SPEAKER_01]: But as always,
[00:36:32] [SPEAKER_01]: it's up to us
[00:36:33] [SPEAKER_01]: to fulfill that promise.
[00:36:38] [SPEAKER_01]: The TED AI Show
[00:36:38] [SPEAKER_01]: is a part of the TED Audio Collective
[00:36:40] [SPEAKER_01]: and is produced by TED
[00:36:42] [SPEAKER_01]: with Cosmic Standard.
[00:36:44] [SPEAKER_01]: Our producers are Elah Feder
[00:36:45] [SPEAKER_01]: and Sarah McCrea.
[00:36:46] [SPEAKER_01]: Our editors are Ban Ban Cheng
[00:36:48] [SPEAKER_01]: and Alejandra Salazar.
[00:36:50] [SPEAKER_01]: Our showrunner is Ivana Tucker
[00:36:52] [SPEAKER_01]: and our associate producer
[00:36:54] [SPEAKER_01]: is Ben Montoya.
[00:36:55] [SPEAKER_01]: Our engineer is Aja Pilar Simpson.
[00:36:57] [SPEAKER_01]: Our technical director
[00:36:58] [SPEAKER_01]: is Jacob Winik
[00:36:59] [SPEAKER_01]: and our executive producer
[00:37:01] [SPEAKER_01]: is Eliza Smith.
[00:37:03] [SPEAKER_01]: This episode was fact-checked
[00:37:04] [SPEAKER_01]: by Dana Kalachi
[00:37:05] [SPEAKER_01]: and I'm your host,
[00:37:06] [SPEAKER_01]: Bilawal Sidhu.
[00:37:08] [SPEAKER_01]: See y'all in the next one.

