Why hasn't the dream of having a robot at home to do your chores become a reality yet? With three decades of research expertise in the field, roboticist Ken Goldberg sheds light on the clumsy truth about robots — and what it will take to build more dexterous machines to work in a warehouse or help out at home.
Learn more about our flagship conference happening this April at attend.ted.com/podcast
Hosted on Acast. See acast.com/privacy for more information.
[00:00:00] TED Audio Collective.
[00:00:06] One of my favorite streaming series was a three-season sci-fi show called Humans.
[00:00:14] It's set in a universe where human-like robots called Synths are the advanced must-have technology for every home.
[00:00:21] They nanny, they cook, they clean, they can do the grocery shopping, and in many ways move, look, and behave like real people.
[00:00:29] The show explores tricky questions about humanoid robots and how they would fit into our lives.
[00:00:35] But in reality, a world where humans and machines are alike enough for robots to be woven into our society like this is still the stuff of science fiction.
[00:00:44] I'm Sherrell Dorsey and this is TED Tech.
[00:00:49] UC Berkeley robotics professor Ken Goldberg has been fascinated by robots for quite a while,
[00:00:55] which means he knows all about their potential and their limitations.
[00:01:00] On the TED stage, he dissects the complex idea of robots easily taking over our daily tasks for us.
[00:01:06] It's a fun problem to consider and it's going to take more time to solve.
[00:01:17] Canva presents stories to keep you up at night.
[00:01:27] It was an ordinary work day until...
[00:01:30] The Singapore presentation is at 3 a.m.
[00:01:33] The office was shocked.
[00:01:35] That's when we sleep!
[00:01:38] Maya made it less scary with Canva.
[00:01:41] I'll just record my presentation so Singapore can watch it anytime.
[00:01:45] Record and present anytime with Canva presentations at canva.com, designed for work.
[00:01:53] This show was brought to you by Schwab.
[00:01:55] With Schwab investing themes, it's easy to invest in ideas you believe in,
[00:02:00] like artificial intelligence, big data, robotic revolution and more.
[00:02:04] Choose from over 40 themes.
[00:02:06] Buy as is or customize the stocks in a theme to fit your goals.
[00:02:10] Learn more at schwab.com slash thematic investing.
[00:02:16] Support for TED Tech comes from Odoo.
[00:02:18] To put it simply, Odoo is built to save.
[00:02:21] Odoo saves time, Odoo saves money, but most importantly, Odoo saves businesses.
[00:02:26] That's right, Odoo's superhero software rescues companies from the perils of disconnected platforms.
[00:02:31] And Odoo's utility belt of user-friendly applications puts the power of total business management in the palm of your hand.
[00:02:38] Learn more at odoo.com slash TED Tech.
[00:02:41] That's O-D-O-O dot com slash TED Tech.
[00:02:44] Odoo, saving the world one business at a time.
[00:02:49] Hackers and cyber criminals have always held this kind of special fascination.
[00:02:54] Obviously, I can't tell you too much about what I do.
[00:02:57] It's a game. Who's the best hacker? And I was like, well, this is child's play.
[00:03:03] I'm Dina Temple-Raston, and on the Click Here podcast, you'll meet them and the people trying to stop them.
[00:03:09] We're not afraid of the attack. We're afraid of the creativity and the intelligence of the human being behind it.
[00:03:15] Click here. Stories about the people making and breaking our digital world.
[00:03:19] Satellite, enter ignition.
[00:03:23] Click here. And liftoff!
[00:03:25] Click here. Every Tuesday and Friday, wherever you get your podcasts.
[00:03:30] I have a feeling most people in this room would like to have a robot at home.
[00:03:36] It'd be nice to have it do the chores and take care of things.
[00:03:40] Where are these robots? What's taking so long?
[00:03:43] I mean, we have our tricorders, and we have satellites.
[00:03:48] We have laser beams.
[00:03:51] But where are the robots?
[00:03:54] I mean, okay, wait. We do have some robots in our home.
[00:03:58] But not really doing anything that exciting, okay?
[00:04:03] Now, I've been doing research at UC Berkeley for 30 years with my students on robots.
[00:04:11] And in the next 10 minutes, I'm going to try to explain the gap between fiction and reality.
[00:04:17] In the field, there's something that explains this that we call Moravec's paradox.
[00:04:21] And that is: what's easy for robots, like picking up a large,
[00:04:28] heavy object, is hard for humans.
[00:04:31] But what's easy for humans, like being able to pick up some blocks and stack them,
[00:04:38] well, it turns out that is very hard for robots.
[00:04:42] And this is a persistent problem.
[00:04:45] So the ability to grasp arbitrary objects is a grand challenge for my field.
[00:04:51] Now, by the way, I was a very klutzy kid.
[00:04:56] I would drop things. Anytime someone would throw me a ball, I would drop it.
[00:05:01] I was the last kid to get picked for a basketball team.
[00:05:04] I'm still pretty klutzy, actually.
[00:05:06] But I have spent my entire career studying how to make robots less clumsy.
[00:05:11] Now let's start with the hardware.
[00:05:14] So the hand, it's a lot like our hand.
[00:05:17] And it has a lot of motors, a lot of tendons and cables.
[00:05:21] It's unfortunately not very reliable.
[00:05:24] It's also very heavy and very expensive.
[00:05:26] So I'm in favor of very simple hands.
[00:05:29] So this has just two fingers.
[00:05:32] It's known as a parallel jaw gripper.
[00:05:34] So it's very simple, it's lightweight and reliable, and it's very inexpensive.
[00:05:40] Now, actually, in industry, there's an even simpler robot gripper,
[00:05:44] and that's the suction cup.
[00:05:46] And that only makes a single point of contact.
[00:05:48] So again, simplicity is very helpful in our field.
[00:05:51] Now let's talk about the software.
[00:05:53] And this is where it gets really, really difficult.
[00:05:56] Because of a fundamental issue, which is uncertainty.
[00:06:01] There's uncertainty in the control,
[00:06:03] there's uncertainty in the perception,
[00:06:05] and there's uncertainty in the physics.
[00:06:08] Now what do I mean by the control?
[00:06:10] Well, if you look at a robot's gripper trying to do something,
[00:06:14] there's a lot of uncertainty in the cables and the mechanisms
[00:06:18] that cause very small errors,
[00:06:19] and these can accumulate and make it very difficult to manipulate things.
[00:06:24] Now in terms of the sensors, yes,
[00:06:27] robots have very high-resolution cameras, just like we do.
[00:06:30] And that allows them to take images of scenes in traffic
[00:06:34] or in a retirement center or in a warehouse
[00:06:37] or in an operating room.
[00:06:39] But these don't give you the three-dimensional structure
[00:06:42] of what's going on.
[00:06:44] So recently there was a new development called LiDAR.
[00:06:46] And this is a new class of cameras that use light beams
[00:06:50] to build up a three-dimensional model of the environment.
[00:06:54] And these are fairly effective.
[00:06:56] They really were a breakthrough in our field,
[00:06:59] but they're not perfect.
[00:07:01] So if the objects have anything that's shiny or transparent,
[00:07:06] well then the light acts in unpredictable ways
[00:07:08] and ends up with noise and holes in the images.
[00:07:11] So these aren't really the silver bullet.
[00:07:13] And there's one other form of sensor out there now
[00:07:14] called a tactile sensor.
[00:07:16] And these are very interesting.
[00:07:18] They use cameras to actually image the surfaces
[00:07:21] as a robot would make contact.
[00:07:23] But these are still in their infancy.
[00:07:26] Now, the last issue is the physics.
[00:07:29] Take a bottle on a table and have a robot push it,
[00:07:32] pushing it in exactly the same way each time.
[00:07:35] But the bottle ends up in a very different place each time.
[00:07:39] Why is that?
[00:07:41] Well, it's because it depends on the microscopic
[00:07:44] surface topography underneath the bottle as it slides.
[00:07:48] For example, if you put a grain of sand under there,
[00:07:51] it would react very differently
[00:07:53] than if there weren't a grain of sand.
[00:07:55] And we can't see if there's a grain of sand
[00:07:57] because it's under the bottle.
[00:07:59] It turns out that we can predict the motion
[00:08:01] of an asteroid a million miles away
[00:08:05] far better than we can predict the motion
[00:08:08] of an object as it's being grasped by a robot.
[00:08:11] Now, let me give you an example.
[00:08:14] Put yourself in the position of a robot.
[00:08:17] You're trying to clear the table,
[00:08:19] and your sensors are noisy and imprecise.
[00:08:22] Your actuators, your cables and motors are uncertain,
[00:08:25] so you can't fully control your own gripper.
[00:08:28] And there's uncertainty in the physics,
[00:08:30] so you really don't know what's going to happen.
[00:08:32] So it's not surprising that robots are still very clumsy.
[00:08:36] Now, there's one sweet spot for robots,
[00:08:38] and that has to do with e-commerce.
[00:08:41] And this has been growing.
[00:08:43] It's a huge trend, and during the pandemic,
[00:08:45] it really jumped up.
[00:08:47] I think most of us can relate to that.
[00:08:49] We started ordering things like never before.
[00:08:52] And this trend is continuing,
[00:08:54] and the challenge is to meet the demand.
[00:08:57] We have to be able to get all these packages
[00:09:00] delivered in a timely manner.
[00:09:02] And the challenge is that every package is different.
[00:09:05] Every order is different.
[00:09:06] So you might order some nail polish
[00:09:10] and an electric screwdriver,
[00:09:13] and those two objects are going to be somewhere
[00:09:17] inside one of these giant warehouses.
[00:09:19] And what needs to be done is someone has to go in,
[00:09:22] find the nail polish, and then go and find the screwdriver,
[00:09:25] bring them together, put them into a box,
[00:09:27] and deliver them to you.
[00:09:29] So this is extremely difficult, and it requires grasping.
[00:09:31] So today, this is almost entirely done by humans.
[00:09:34] And humans don't like doing this work.
[00:09:36] There's a huge amount of turnover.
[00:09:38] So it's a challenge, and people have tried
[00:09:41] to put robots into warehouses to do this work.
[00:09:45] It hasn't turned out all that well.
[00:09:47] But my students and I, about five years ago,
[00:09:51] we came up with a method using advances
[00:09:53] in AI and deep learning to have a robot
[00:09:56] essentially train itself to be able to grasp objects.
[00:09:59] And the idea was that the robot would do this in simulation.
[00:10:02] It was almost as if the robot were dreaming
[00:10:03] about how to grasp things and learning
[00:10:06] how to grasp them reliably.
[00:10:08] This is a system called Dex-Net that is able
[00:10:11] to reliably pick up objects that we put
[00:10:13] into these bins in front of the robot.
[00:10:15] These are objects it's never been trained on,
[00:10:18] and it's able to pick these objects up
[00:10:20] and reliably clear these bins over and over again.
[00:10:23] So we were very excited about this result,
[00:10:26] and the students and I went out to form a company.
[00:10:29] And we now have a company called Ambi Robotics.
[00:10:31] And what we do is make machines
[00:10:34] that use the algorithms, the software we developed
[00:10:37] at Berkeley to pick up packages.
[00:10:40] And this is for e-commerce.
[00:10:42] The packages arrive in large bins,
[00:10:44] all different shapes and sizes,
[00:10:46] and they have to be picked up, scanned,
[00:10:48] and then put into smaller bins depending on their zip code.
[00:10:51] We now have 80 of these machines operating
[00:10:54] across the United States sorting
[00:10:56] over a million packages a week.
[00:10:58] Now, that's some progress,
[00:11:01] but it's not exactly the home robot
[00:11:04] that we've all been waiting for.
[00:11:06] So I want to give you a little bit of an idea
[00:11:09] of some of the new research that we're doing
[00:11:11] to try to be able to have robots more capable in homes.
[00:12:14] And one particular challenge is being able
[00:12:17] to manipulate deformable objects: strings in one dimension,
[00:12:20] sheets in two dimensions,
[00:12:22] and objects in three dimensions like fruits and vegetables.
[00:11:25] So we've been working on a project to untangle knots.
[00:11:29] And what we do is we take a cable,
[00:11:31] and we put that in front of the robot.
[00:11:34] It has to use a camera to look down,
[00:11:36] analyze the cable, figure out where to grasp it,
[00:11:38] and how to pull it apart to be able to untangle it.
[00:11:41] And this is a very hard problem
[00:11:43] because the cable is much longer
[00:11:45] than the reach of the robot.
[00:12:47] So it has to go through
[00:12:49] and manage the slack as it's working.
[00:12:51] And I would say this is doing pretty well.
[00:12:53] It's got a good amount of flexibility,
[00:12:55] and it's gotten up to about an 80% success rate
[00:12:57] at untangling a cable
[00:12:59] when we give it a tangled one.
[00:12:01] The other one is something I think
[00:12:03] we're also all waiting for:
[00:12:05] a robot to fold the laundry.
[00:12:07] Now, roboticists have actually been looking
[00:12:09] at this for a long time,
[00:12:11] and there was some research that has been done on this,
[00:12:14] but the problem is that it's very, very slow.
[00:12:17] So this was about three to six folds per hour.
[00:12:22] Okay?
[00:12:24] So we decided to revisit this problem
[00:12:28] and try to have a robot work very fast.
[00:12:30] So one of the things we did was try to think
[00:12:32] about a two-armed robot that could fling the fabric
[00:12:34] the way we do when we're folding,
[00:12:36] and then we also used friction, in this case,
[00:12:38] to drag the fabric to smooth out some wrinkles.
[00:12:40] And then we borrowed a trick
[00:12:42] which is known as the two-second fold.
[00:12:45] You might have heard of this.
[00:12:47] It's amazing because the robot
[00:12:49] is doing exactly the same thing,
[00:12:50] it just takes a little bit longer,
[00:12:52] so we're making some progress there.
[00:12:54] And the last example is bagging.
[00:12:56] So you all encounter this all the time.
[00:12:58] You go to a corner store,
[00:13:00] and you have to put something in a bag.
[00:13:02] Now, it's easy, again, for humans,
[00:13:04] but it's actually very, very tricky for robots,
[00:13:07] because humans know how to take the bag
[00:13:09] and how to manipulate it,
[00:13:11] but for robots, the bag can arrive
[00:13:13] in many different configurations,
[00:13:15] and it's very hard for the robot
[00:13:17] to tell what's going on and figure out how to open up that bag.
[00:13:18] So what we did was have the robot train itself:
[00:13:21] we painted one of these bags with fluorescent paint,
[00:13:24] and we had fluorescent lights
[00:13:26] that would turn on and off,
[00:13:28] and the robot would essentially teach itself
[00:13:30] how to manipulate these bags.
[00:13:32] And so we've gotten it now up to the point
[00:13:34] where we're able to solve this problem
[00:13:36] about half the time.
[00:13:38] So it works, but I'm saying
[00:13:40] we're still not quite there yet.
[00:13:43] So I want to come back to Moravec's paradox.
[00:13:45] What's easy for robots is hard for humans.
[00:13:46] And what's easy for us is still hard for robots.
[00:13:50] We have incredible capabilities.
[00:13:53] We're very good at manipulation.
[00:13:55] But robots still are not.
[00:13:58] I want to say, I understand.
[00:14:01] It's been 60 years,
[00:14:03] and we're still waiting
[00:14:05] for the robots that the Jetsons had.
[00:14:08] Why is this difficult?
[00:14:10] We need robots because we want them
[00:14:12] to be able to do tasks that we can't do
[00:14:14] or we don't really want to do.
[00:14:22] But I want you to keep in mind
[00:14:24] that these robots, they're coming.
[00:14:26] Just be patient
[00:14:28] because we want the robots,
[00:14:30] but robots also need us
[00:14:32] to do the many things
[00:14:34] that robots still can't do.
[00:14:37] Thank you.
[00:14:42] Canva presents Unexplained Appearances.
[00:14:45] It was an ordinary workday,
[00:14:47] but something strange was going on.
[00:14:51] That presentation appeared out of thin air.
[00:14:54] Also, it's eerily on-brand.
[00:14:56] Wait, did that agenda just write itself?
[00:14:59] Words appear,
[00:15:01] making this unexplainable case...
[00:15:03] Unexplainable?
[00:15:05] It's Canva's AI tools.
[00:15:07] I can generate slides and words in seconds.
[00:15:09] Really?
[00:15:11] The real mystery is why I'm only learning this now.
[00:15:17] Canva.com. Designed for work.
[00:15:20] That's it for today.
[00:15:22] TED Tech is part of the TED Audio Collective.
[00:15:25] This episode was produced by Nina Lawrence,
[00:15:28] edited by Alejandra Salazar,
[00:15:30] and fact-checked by Julia Dickerson.
[00:15:33] Special thanks to Maria Larias,
[00:15:35] Farah DeGrunge,
[00:15:37] Kory Hajim,
[00:15:39] Daniela Valarezo,
[00:15:41] and Michelle Quint.
[00:15:43] I'm Sherrell Dorsey.
[00:15:45] Thanks for listening
[00:15:47] and talk to you again next week.
[00:15:50] Bye!

