by Chris Woodford. Last updated: July 9, 2013.
Will there come a point in the future when robots collect our garbage, serve our food in restaurants, and even act as our pets and companions? Science writers and futurologists (people who think about the future) have been wondering about this for much of the 20th century, ever since the word "robot" entered the language in the 1920s. Robots have all kinds of advantages: they can work 24 hours a day without complaining and they need no pay or other rewards. Why, then, don't we just manufacture billions of robots so we can all put our feet up and relax? The answer is simple: even the most complex robots now under development are no match for the all-round versatility of the human brain and body. Like it or not, humans are here to stay—at least for the foreseeable future.
Photo: This robot mannequin, nicknamed "Manny," is designed to test the effectiveness of protective clothing in dangerous situations. Developed by the Pacific Northwest Laboratory, Manny can "breathe", "sweat", and "move" just like a human. Photo by courtesy of US Department of Energy.
Make your own human
Given an infinite supply of scrap parts and plenty of ingenuity, how would you go about making a human-like robotic machine? Although humans are complex, psychologists (people who study human minds and behavior) break down our everyday actions into three main areas. There's sensory perception, in which our major sense organs (such as our eyes and ears) receive "inputs" and send them to the brain. There's cognition, which is essentially the brain processing sensory inputs and deciding what to do about them. And there's action—how the brain instructs our muscles and inner organs to operate in response. Let's look at how robotic machines could mimic each of these areas.
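That three-part breakdown, perceive, think, act, maps neatly onto the control loop at the heart of most robot software. Here's a minimal sketch in Python to make the idea concrete; the sensor, decision, and motor functions (and the toy "world" they operate on) are invented stand-ins, not any real robot's programming interface:

```python
# A minimal sense-think-act loop: the skeleton of most robot control
# software. The three stages mirror perception, cognition, and action.

def sense(world):
    """Perception: turn the raw world state into measurements."""
    return {"distance_cm": world["obstacle_cm"]}

def think(percept):
    """Cognition: decide what to do based on the percept."""
    if percept["distance_cm"] < 30:
        return "turn"
    return "forward"

def act(action, world):
    """Action: drive the (simulated) motors."""
    if action == "forward":
        world["obstacle_cm"] -= 10   # rolling closer to the obstacle
    else:
        world["obstacle_cm"] = 100   # turned away; the path is clear again

world = {"obstacle_cm": 50}
for step in range(5):
    action = think(sense(world))
    act(action, world)
    print(step, action, world["obstacle_cm"])
```

Real robots run a loop like this many times a second; the hard part, as the next sections explain, is making the "sense" and "think" stages cope with the messy real world.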
Perception: The eyes have it—or do they?
Humans have five senses, but robots typically rely on just one: sight. It's not impossible to make robots that respond to sound, touch, smell, or taste, but, so far, vision is the sense robot designers have developed furthest. You may have seen humanlike robots with "eyes," but what robots really have are digital cameras that feed images to a central computer. Human eyes are incredibly complex, but they represent only a small part of the human visual system (the apparatus in our heads that allows us to see the world and understand the significance of what we're looking at). Most of our visual perception happens in the brain—and the same is true of robots.
Photo: One day, robots may look just like humans, with real human emotions. This robot, Emo, has a computer-controlled face with digital cameras for eyes and lips that move to show how it's feeling. Pictured at Think Tank, the science museum in Birmingham, England.
Taking digital images of a scene and feeding them into a robot's computer brain is easy, but figuring out what the image represents is a much harder problem. Suppose you're a robot looking at a digital photo of the room where you're now sitting. How do you know where objects start and end? How do you know which objects are just lifeless, static bits of wood and plastic—and which ones are potentially threatening creatures? If there's a tiger-skin rug on the carpet, how do you know it's not a real tiger waiting to eat you? When the light changes in the room, and all the objects appear different shades and colors, how do you know you're still looking at the same things? If you go to another place in the room and look around you once more, how do you know you're looking at the same objects when you see apparently different shapes from completely different angles? All these things sound trivially simple to us, but to a computer they are extremely difficult problems to solve. Processing images and figuring out what they represent is the hardest part of machine vision, but it's essentially a computational, "brain" problem.
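To get a feel for why "where do objects start and end?" is hard, here's a toy Python sketch of the very first step a vision program might take: scanning a grid of pixel brightnesses for sharp jumps, which may (or may not) mark the edge of an object. The image and threshold are made up for illustration; real machine-vision systems use far more sophisticated techniques, but the starting point is the same:

```python
# Toy edge detection: flag pixels where brightness jumps sharply
# compared with the neighbor to the left. Edges of objects tend to
# show up as abrupt changes in intensity, but so do shadows, patterns,
# and reflections, which is part of what makes vision hard.

image = [
    [10, 10, 10, 200, 200],
    [10, 10, 10, 200, 200],
    [10, 10, 10, 200, 200],
]

THRESHOLD = 50  # how big a brightness jump counts as an "edge"

def horizontal_edges(img, threshold):
    edges = []
    for y, row in enumerate(img):
        for x in range(1, len(row)):
            if abs(row[x] - row[x - 1]) > threshold:
                edges.append((x, y))
    return edges

# Every row has one sharp jump, where the dark region meets the bright one.
print(horizontal_edges(image, THRESHOLD))
```

Notice that this finds brightness changes, not objects: deciding that those edges outline a rug rather than a tiger is the part computers still struggle with.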
Cognition: The brain game
How do you give a robot a brain? That one's easy: you just stick a computer inside it. That's not quite the end of the matter, though, any more than it is in humans. After we're born, our brains develop through a mixture of nature and nurture. For example, most psychologists agree that we have an innate ability to learn language, but we have to be taught the specific words of our own language and what they actually mean. Computers are much more about nurture than nature: they can't do anything that we don't program them to do.
Photo: This NASA 3D object scanning robot isn't intelligent: it can do only what it's programmed to do. Photo by Dominic Hart courtesy of NASA Ames Imaging Library System.
That means computerized robots designed to do specific tasks have to be taught to do those tasks in minute detail. They have to have programs running in their computers that allow them to react to every possible situation they are likely to encounter. This is a far cry from human intelligence, which is a general, "common sense" ability to understand a situation and react in the most appropriate way—even when you've never encountered it before. There may come a time in the future when computers and robots have a kind of artificial intelligence so they can figure out what they need to do in any situation. But, for the time being, the best they can do is to follow the orders we give them. In other words, robots don't think; they just follow instructions.
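The gap between following instructions and genuine understanding is easy to see in code. A robot programmed this way is essentially consulting a table of situation-and-response rules written in advance by a human; anything outside the table leaves it helpless. A hypothetical sketch (the situations and responses are invented for illustration):

```python
# A robot "brain" as a hand-written rule table. It can only react to
# situations its programmer anticipated; anything else falls through
# to a default, because it has no common sense to improvise with.

RULES = {
    "obstacle_ahead": "stop",
    "path_clear": "move_forward",
    "battery_low": "return_to_dock",
}

def react(situation):
    # Unknown situations get a do-nothing default: the robot has no
    # general understanding to fall back on.
    return RULES.get(situation, "wait_for_help")

print(react("battery_low"))      # return_to_dock
print(react("cat_on_keyboard"))  # wait_for_help: never programmed for this
```

A human in the same unfamiliar situation would improvise; the rule-following robot can only wait for a programmer to add another entry to the table.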
Action: One step beyond
Making a machine move is one of the easier problems of robotics. After all, inventors have been developing moving machines—from the wheel to the space rocket—for thousands of years. What makes robots different from most other machines is the extreme precision with which they can move.
Photo: Robots can be programmed to move with incredible precision. This one is learning to play the drums at Think Tank, the science museum in Birmingham, England.
A robot hand that can pick up an egg without breaking it has to be able to make precise gripping movements to an accuracy of millimeters. How does it do it? The joints in the hand are operated by very accurate servo motors or stepper motors. Unlike a normal electric motor, which rotates by an unpredictable amount when you switch on the electric current, a servo or stepper motor can be made to rotate with great precision, which allows a robot shoulder, arm, elbow, wrist, or finger to turn through an exact number of degrees.
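The arithmetic behind that precision is simple. A stepper motor turns a fixed angle for each electrical pulse it receives (1.8° per step, or 200 steps per full revolution, is a common specification), so rotating a joint to an exact angle is just a matter of counting pulses. A sketch, assuming a 1.8° motor:

```python
# Stepper motors rotate a fixed angle per pulse, so an exact rotation
# is just the right number of pulses. 1.8 degrees per step (200 steps
# per revolution) is a common specification.

STEP_ANGLE = 1.8  # degrees per step

def steps_for(angle_degrees):
    """Whole number of pulses needed to rotate (close to) the angle."""
    return round(angle_degrees / STEP_ANGLE)

for target in (90, 45, 360):
    n = steps_for(target)
    print(f"{target:>3} degrees -> {n} steps ({n * STEP_ANGLE:.1f} actual)")
```

Because the motor can only move in whole steps, the achievable angles come in 1.8° increments; real robot joints add gearing (and often feedback sensors) to get the accuracy down to fractions of a degree.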
Some robots use hydraulics (the kind of fluid-filled tubes used on mechanical excavators and tipper trucks) so they can lift or move heavy objects with relatively little effort. The Space Shuttle's manipulator arm is another example of a robot-like device, although it's actually driven by electric motors at its joints rather than by hydraulics. (It's not really a robot as such, either: the arm is operated by one of the crew; it's not controlling itself.)
What are robots used for?
The idea of a robot as a general-purpose human servant is a long way from the reality of the robots in widespread use today. The most common robots are industrial, factory machines: hydraulic, mechanical arms precisely controlled by computers. You may have seen robots like this working in automobile factories, where they assemble, weld, and spray-paint new cars. Clothes factories use similar robot arms, fitted with lasers, to cut fabric with extraordinary speed and precision. Although most industrial robot arms are general-purpose machines, they are "trained" (essentially, programmed) to do one highly specific job and they never do anything else. To use them for another purpose, you'd have to completely reprogram them.
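"Training" an industrial arm typically means recording a sequence of joint positions (waypoints) while a human guides it through the task, then replaying that sequence forever. Here's a stripped-down sketch of the teach-and-replay idea; the class, method names, and joint angles are invented for illustration, not any manufacturer's actual programming system:

```python
# Teach-and-replay: the classic way industrial robot arms are
# "trained". Record joint angles as a human guides the arm through a
# task, then replay the same sequence indefinitely.

class TeachableArm:
    def __init__(self):
        self.program = []  # recorded waypoints (joint angles in degrees)

    def teach(self, shoulder, elbow, wrist):
        """Record the arm's current pose as the next waypoint."""
        self.program.append((shoulder, elbow, wrist))

    def replay(self):
        # On a real arm, each waypoint would drive servo or stepper
        # motors; here we just return the sequence it would move through.
        return list(self.program)

arm = TeachableArm()
arm.teach(0, 45, 90)    # reach toward the workpiece
arm.teach(10, 60, 90)   # lower the welding tip
arm.teach(0, 45, 90)    # pull back
print(arm.replay())
```

This is why reusing such a robot for a different job means reprogramming it from scratch: the "skill" is nothing more than the recorded list of waypoints.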
Photo: A typical industrial robot. This one welds the seams of car bodies at high speed and with high precision. It's a demonstration at Think Tank, the science museum in Birmingham, England.
Another important job robots do is to go to places or do things that no human would want to do. The military have long used remote-control robotic machines to defuse bombs. A typical bomb disposal robot has tracks to maneuver it around, a camera that lets the operator see what it's doing, and a robot arm for manipulating whatever it finds. Some of these machines also have a remote-controlled rifle attached so they can destroy suspect packages. While we tend to call them "robots," machines like this are not really robotic: they are simply remote-controlled machines operated at a safe distance by a human being: they don't have an onboard computer and they're not controlling their own movements. Space explorer robots (like the ones that have landed on Mars) usually combine remote control (they can be steered from mission control on Earth) and autonomous operation (they can navigate themselves, explore, and send pictures or samples of what they find back to Earth).
Photo: A remote-controlled bomb-disposal robot in action. The grey box at the top is a camera that allows operators to see what the robot is doing. Photo by courtesy of US Air Force.
Medicine is another area where robots are becoming increasingly important. Thanks to the Internet, surgeons in one country can operate on patients in another by sitting at remote-control consoles and operating joysticks. The robot's precise arms, controlled by hydraulics and stepper motors, carry out the movements exactly as the surgeon wishes—even though doctor and patient may be thousands of miles apart. In Japan, friendly robotic teddy bears are being used to help patients recuperate, while hydraulic robot "exoskeletons," worn by orderlies on top of their own bodies, can make it much easier to move elderly patients around. Many people have speculated that nanobots will be tomorrow's most important medical robots. A million times smaller than a millimeter, these nanomachines could be injected into people's bodies to fight diseases and carry out repairs.
There's no shortage of jobs for robots to do. The question is whether robots can do those jobs as well as the best machines on the planet: human beings!
Find out more
On this website
- Handwriting recognition (OCR)
- How computers work
- History of computers
- Internet and brain
- Speech synthesis
- Stepper motors
- Voice recognition
For older readers
- The Robotics Primer by Maja J. Matarić. MIT Press, 2007. An accessible easy-to-understand overview suitable for most readers.
- Robotics: Modelling, Planning and Control by Bruno Siciliano, Lorenzo Sciavicco, and Luigi Villani. Springer, 2009. A much more theoretical introduction to robot control.
- Physics of the Future by Michio Kaku. Knopf Doubleday Publishing Group, 2011. A whistle-stop tour through the future, considering how current scientific and technological trends will play out in coming decades. Suitable for readers of all ages. A bit of a mixed bag, this book is at its best when it stays firmly grounded in science (and at its worst when it drifts off into empty speculation).
- The Robot Builder's Bonanza by Gordon McComb and Myke Predko. McGraw Hill, 2006. A hands-on guide to robot hacking for hobbyists, packed with ideas for robot projects.
- 123 Robotics Experiments for the Evil Genius by Michael Predko and Myke Predko. McGraw-Hill Professional, 2004. After a brief introduction to robotics, the Predkos get straight to work with toilet paper, glue, nuts, bolts, and anything else they can find.
For younger readers
- Ultimate Robot by Robert Malone. DK, 2004. Combines history, science, and technology in a visually attractive format that will appeal to teenagers, in particular.
- Eyewitness: Robot by Roger Murrell. DK, 2004. A great resource for school projects, covering the history of robots and surveying the many everyday applications for which they're now indispensable.
- Automaton, Know Thyself: Robots Become Self-Aware by Charles Q. Choi, Scientific American, February 24, 2011. Can robots learn to adapt in the same way as humans?
- Robots and cars for the future: BBC News, 26 June 2009. Ian Hardy visits the famous MIT Media Lab, where tomorrow's robots are being developed.
- Robots learn to move themselves: BBC News, 6 August 2008. Researchers in Germany develop software allowing robots to improve their movements by trial and error.
- Robots could demand legal rights: BBC News, 21 December 2006. Just how human will we allow robots to become?
- Wired: Geek Dad: Robots: Wired's excellent Geek Dad blog has regular posts about robots.
The best way to learn about cutting-edge robots is to watch them in action. So here's a small collection of short videos I've compiled from YouTube (and elsewhere) that illustrate the past, present, and future of robotics. As you watch these films, try to imagine the engineering challenges the robot designers have had to solve in each case:
- A quick tour of robot history: An entertaining whistle-stop tour of how robots have got where they are today, compiled by Diagonal View. There's no commentary, but it's fun to watch.
- A robot nose: Could a robot ever learn to smell? Apparently, yes!
- The MIT Leg Lab: Teaching robots to walk like turkeys!
- Robots inspired by animals: Why should robots be modeled on humans? Here's a great summary of robotic creatures inspired by other marvels from the natural world.
- Robot dog and robot cheetah developed for the US military.
- Two chat robots argue about God: What happens when two robots try to hold a meaningful conversation about a difficult subject?
- Developing emotional robots: How Kismet and other robots express emotions with humanlike face movements.
- Robot octopuses and boneless robots show how innovative materials could make robots that will go to places no ordinary, metal, "mechanical" robot ever could.