

  • Robots that Feel Our Pain

    According to the International Federation of Robotics, industrial robots are now a $6 billion a year market, and another $12 billion is spent annually on software and support systems for them.1 Currently, there are about 1 million industrial robots working on assembly lines around the world, heavily concentrated in the automotive and electronics industries.

    However, looking ahead, the most spectacular growth will come in the market for service bots, which clean floors, wash windows, mow lawns, and entertain people. Since 2002, a company called iRobot has sold 2 million Roomba floor-cleaning robots. According to the United Nations Economic Commission for Europe, by 2008, 7 million service robots will be sold, with an estimated value of $7.4 billion.2

    Until now, robots have been nothing more than programmable machines, with as much warmth and personality as a vacuum cleaner. But a new breed of service bots will be surprisingly lifelike because they will be able to interact with humans by responding not just to words, but also to emotions.

    According to reports in Wired3 magazine and other sources, the latest advances in robotics were on display at the annual conference of the American Association for Artificial Intelligence in Boston last July.

    A robot made by the University of Alberta won a poker tournament, which was significant because it showed that the machines could overcome challenges like “randomness and hidden states,” as typified by not knowing the cards of their opponents.

    A robot made by Kansas State University triumphed in a scavenger hunt by using sonar to locate objects that were scattered around the room. This is important because it showed that bots could interact with their environment.
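    The poker result can be made concrete with a toy example. The sketch below is an invented simplification, not the University of Alberta system: it shows the core idea of reasoning under hidden states, where the opponent's card is unknown, so the program averages over every card the opponent could plausibly hold.

```python
# Toy illustration of decision-making with "hidden states" (hypothetical,
# not the Alberta poker bot): a one-card game with cards numbered 1-10.
# Our opponent's card is hidden, so we enumerate every remaining card.

def win_probability(our_card, deck=range(1, 11)):
    """P(our card beats a hidden opponent card drawn from the rest of the deck)."""
    remaining = [c for c in deck if c != our_card]
    wins = sum(1 for c in remaining if our_card > c)
    return wins / len(remaining)

print(win_probability(10))  # holding the best card: 1.0
print(win_probability(6))   # 5 of the 9 possible opponent cards lose: 5/9
```

    Real poker programs face vastly larger hidden-state spaces, but the same principle applies: rather than knowing the opponent's hand, the system reasons over a probability distribution of what that hand could be.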

    A robot made by the Music Technology Group at Spain’s Universitat Pompeu Fabra showed that it could play improvisational jazz. Instead of merely reproducing the notes of a song mechanically, the Spanish robot uses predictive algorithms to “feel” the moments when it should stretch a note or cut it short.

    Other advances in robotics and artificial intelligence are also coming to light:

    Last year, a Stanford University robotics team won a $2 million competition sponsored by DARPA by programming a car to drive 131 miles across the Mojave Desert without a human at the controls.

    Stanford is also developing a robot that can put dishes in a dishwasher, take out the garbage, and put together IKEA-type furniture using a hammer and screwdriver.

    Poseidon Technologies of France is marketing an artificial intelligence system that uses underwater sensors to recognize when people are struggling in swimming pools. The system alerts human lifeguards, who have already rescued swimmers who might have drowned without it.

    MIT scientists have developed a robot called Kismet that can hold realistic conversations with people. According to BBC News,4 Kismet can use seven facial expressions, change the tone of its voice, and move its eyes and head toward the person with whom it is conversing.

    University of Texas at Dallas researchers have created a robotic face that looks human. Among the 28 facial expressions it can convey are smiling, sneering, raising its eyebrows, and frowning.

    Hewlett-Packard’s engineers have created an electronic DJ that mixes music. The designers say that it can sense the emotions of people in a nightclub and then respond with the right music to sustain or improve their moods.

    It’s fitting that all of this progress is being made now, in the 50th year since scientists made their first tentative steps toward artificial intelligence. According to the New York Times,5 the new breeds of bots are taking advantage of recently discovered insights into how the human brain works. This has inspired some researchers to use the term “cognitive computing” instead of artificial intelligence to describe the new machines that are beginning to give the appearance of human levels of intelligence.

    Unlike the earlier attempts to create artificial intelligence, today’s researchers can harness computer processing power that is millions of times greater than the machines scientists used in the 1960s. They have also gained a deeper understanding of how the human brain works, so they can apply this knowledge to create robots that think more like people do.

    At the same time, researchers are learning how to make communications between machines and people feel more natural. Experiments by a team of scientists from MIT’s Media Lab and Mitsubishi Electric Research Laboratories have found that humans feel a closer connection to robots when the bot follows the person’s face with its eyes and head during a conversation.6

    In a research paper published last year, the researchers explained how they set out to study the concept of engagement: the process by which people maintain a connection when they talk. In human-to-human communications, when the listener looks away while the speaker is talking, or makes a facial expression that shows he isn’t receptive, the speaker will change strategies.

    However, in robot-to-human communications, robots traditionally haven’t responded to these cues that suggest they should change strategies. Instead, they continue to use the same approach. The MIT and Mitsubishi researchers used a series of studies to show that when robots were designed to track people’s faces during conversations, people directed their attention to the bots more frequently and found the interactions more appropriate.
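    The design the researchers tested can be sketched as a simple control loop. The logic below is hypothetical, not the MIT/Mitsubishi implementation: the robot keeps its current approach while the listener looks at it, and switches strategies after several consecutive moments of inattention.

```python
# Sketch of conversational engagement tracking (invented logic for
# illustration): watch the listener's gaze each frame and change
# strategy after sustained inattention.

ATTENTION_THRESHOLD = 3  # consecutive "away" frames before reacting

def plan_responses(gaze_frames):
    """For each frame ('at_robot' or 'away'), decide the robot's action."""
    actions = []
    away_streak = 0
    for gaze in gaze_frames:
        if gaze == "away":
            away_streak += 1
        else:
            away_streak = 0
        if away_streak >= ATTENTION_THRESHOLD:
            actions.append("change_strategy")  # e.g. pause, ask a question
            away_streak = 0
        else:
            actions.append("continue")         # keep tracking the face
    return actions

print(plan_responses(["at_robot", "away", "away", "away", "at_robot"]))
```

    A threshold matters here: reacting to a single glance away would make the robot seem twitchy, while never reacting reproduces the unresponsive behavior the study set out to fix.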

    The next logical step is for programmers to develop machines that can go beyond tracking people’s faces to “read” people’s facial expressions and interpret what they are feeling. Based on recent breakthroughs in artificial intelligence, an emotionally intelligent robot is now possible.

    According to BBC News,7 scientists demonstrated an “emotionally aware” computer system at the Royal Society Summer Science Exhibition in London in July 2006.

    University of Cambridge computer technology professor Peter Robinson explained that a computer uses a camera to study 24 areas of a person’s face, looking for various facial movements, such as a headshake, a nod, a raised eyebrow, or a smile. Then the computer’s software interprets the underlying emotions of all of the facial expressions to determine how the person is feeling: for example, sad, angry, happy, or skeptical.
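    The pipeline Robinson describes, where a camera detects facial movements and software then infers the underlying emotion, can be sketched as a weighted-voting classifier. The cue names and weights below are invented for illustration; the article does not describe the Cambridge system's actual model.

```python
# Illustrative emotion inference from detected facial cues. The
# cue-to-emotion weights are made up for this sketch, not taken from
# the University of Cambridge system.

CUE_WEIGHTS = {
    "smile":          {"happy": 2.0},
    "frown":          {"sad": 1.5, "angry": 0.5},
    "raised_eyebrow": {"skeptical": 1.0},
    "headshake":      {"skeptical": 1.0, "angry": 0.5},
    "nod":            {"happy": 0.5},
}

def infer_emotion(observed_cues):
    """Sum weighted votes from each observed cue; return the top emotion."""
    scores = {}
    for cue in observed_cues:
        for emotion, weight in CUE_WEIGHTS.get(cue, {}).items():
            scores[emotion] = scores.get(emotion, 0.0) + weight
    return max(scores, key=scores.get) if scores else "neutral"

print(infer_emotion(["smile", "nod"]))  # happy
```

    In a real system the upstream step, reliably detecting the cues themselves from 24 facial regions in video, is the hard part; the mapping from cues to emotions is comparatively simple.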

    Robinson said that the technology could be used as a sales tool. He envisions that customers using webcams would transmit their facial expressions to a Web site that would then show the customers products based on their moods.

    Meanwhile, a research team at the MIT Media Lab is developing a similar device that can be worn by people who have trouble recognizing emotions, such as those with autism; using the technology, they will be able to identify other people’s emotions. According to CNET News, the device is called the Emotional Social Intelligence Prosthetic, or ESP.8

    Essentially, ESP will allow wearers to do what most people do subconsciously every day: Scan other people’s facial expressions and head movements for clues to what they’re thinking and feeling.

    To use ESP, the individual wears a small video camera on a baseball cap or around the neck, plus an earphone, a vibrating device that can be clipped onto a belt, and a handheld computer. When the user is engaged in a conversation, the device picks up signals from the other person’s facial expressions. For example, if the other person looks bored, the technology alerts the user, who can then change the subject or ask the listener a question.

    The researchers hope to use the technology to improve the social skills of people with Autism Spectrum Condition, which keeps people from recognizing emotions in others.

    However, the Trends editors foresee that it could also easily be applied to robotics. A “bot” equipped with ESP technology would be able to communicate much more effectively than today’s machines. For example, it could scan its owner’s face for non-verbal signs that he or she is sad, happy, angry, tired, or in pain.

    Based on these intriguing developments, we offer the following five forecasts:

    First, in 2008 or sooner, we’ll see the arrival of the first “emobots.” These will be robot toys that interact with their environment and make connections with people on an emotional level. While they will only be used for entertainment, they will open the minds of mainstream consumers to the enormous potential of robots. According to Forbes9 magazine, thousands of people have sent e-mails to the developer of a robotic dinosaur called Pleo, hoping to be among the first to pay $250 for the first “emobot.” Caleb Chung, the inventor of the Furby interactive toy that sold 50 million units, founded a company called Ugobe to develop Pleo with $2.7 million in seed money, and expects to raise $8 million more this fall. Pleo has its own operating system, a microcontroller, 14 motors, and 31 sensors, so it can react to motions and sounds and respond to its environment in more than 1,000 ways. Chung says, “Our job is to create a relationship between Pleo and a person. You’ll interact with it [and] share an emotional language with it.” Chung demonstrated Pleo at a technical conference in early 2006 and expects it to reach the market as early as March 2007.

    Second, by the end of 2007, we are likely to see a milestone in the development of robotic cars. Sebastian Thrun, whose Stanford University team won the DARPA competition in 2005, is programming an artificial intelligence system to drive a Volkswagen Passat that will compete in DARPA’s next challenge: to send a driverless car 60 miles through a U.S. city in November 2007.10

    Third, within five years, computer systems that recognize human emotions will be included in existing products, such as cars. BBC News11 reports that one automobile manufacturer is planning to use the University of Cambridge mood-identification software, along with a dashboard-mounted camera, as a safety device. It will detect facial expressions that reveal that the driver is confused or fatigued. Then an on-board computer (or a system similar to OnStar) could respond by adjusting the car’s interior lighting and temperature, sending alerts to the driver, providing navigational help to the driver’s destination, and offering directions to the nearest rest stop or coffee shop.

    Fourth, within 15 years, robotic butlers will become available to wealthy customers, and within 25 years they will be as common in middle-class homes as televisions and toasters are today. It may happen even sooner. Robert Hecht-Nielsen, who founded HNC Software, asserts that he could build an electronic butler within five years if he had the right resources. According to the New York Times,12 at HNC, Hecht-Nielsen successfully used neural network technology to duplicate the processes of the human brain to detect credit card fraud. He sold the company to the Fair Isaac Corporation, where he is now leading research into “confabulation,” a theory about how the human brain makes decisions. At IBM, he has demonstrated how software based on confabulation can read two sentences and then create a third sentence that makes sense. Using confabulation, Hecht-Nielsen believes he can build a robot butler that will hold conversations with its owner and handle such chores as ordering groceries.

    Fifth, within 30 years, advances in robotics will merge with advances in computing power and breakthroughs in nanotechnology to yield intelligent, life-like machines that will revolutionize every area of human life. Computing power will continue to double every 18 months due to Moore’s Law, and computers will shrink to the size of molecules thanks to nanotechnology. Ultimately, nanobots will be swallowed in capsules to prowl our bloodstreams to eradicate diseases and cancerous cells, which will transform healthcare. Robots of various sizes will assume many of the functions that people perform now. They will be put into service as office receptionists and building security guards. They will shop for clothes and prepare meals. They will diagnose medical conditions and perform surgeries. This will lead to a dramatic change in society as fewer humans are needed in the workforce. However, the gains in productivity as human workers are replaced by low-cost bots, which will never need sick days, lunch hours, or vacations, will create unprecedented wealth, allowing people to pursue a more leisurely and spiritual life.
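    The arithmetic behind the 30-year computing forecast is straightforward: doubling every 18 months means 20 doublings over 30 years, roughly a million-fold increase in capacity. A quick check, assuming the 18-month doubling period the forecast cites:

```python
# Compute growth under a fixed doubling period (18 months, per the
# article's reading of Moore's Law).

def growth_factor(years, doubling_months=18):
    """Multiplicative growth after `years`, doubling every `doubling_months`."""
    return 2 ** (years * 12 / doubling_months)

print(growth_factor(30))  # 30 years -> 20 doublings -> 1048576.0
```

    Whether the trend actually holds for 30 more years is the real question; the formula only shows what sustained exponential growth would imply.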
    References List:
    1. Forbes, September 4, 2006, “The Robots Are Coming!” by Elizabeth Corcoran. © Copyright 2006 by Forbes Publishing, Inc. All rights reserved.
    2. Ibid.
    3. To access Wired magazine’s reporting of the 2006 conference of the American Association for Artificial Intelligence, visit their website at: www.wired.com
    4. For information about MIT’s development of the more realistic robot, Kismet, visit the BBC News website at: www.bbc.co.uk
    5. The New York Times, July 18, 2006, “Brainy Robots Start Stepping into Daily Life,” by John Markoff. © Copyright 2006 by The New York Times Company. All rights reserved.
    6. To access the MIT Media Lab’s and the Mitsubishi Electric Research Laboratory’s joint study on robot engagement, visit the MIT website at: web.media.mit.edu
    7. To access BBC News’ report on the 2006 Royal Society Summer Science Exhibition, visit their website at: news.bbc.co.uk
    8. To access CNET News’ report on the Emotional Social Intelligence Prosthetic, visit their website at: news.com.com
    9. Forbes, September 4, 2006, “Emobots,” by Elizabeth Corcoran. © Copyright 2006 by Forbes Publishing, Inc. All rights reserved.
    10. To access Wired magazine’s reporting of the 2006 conference of the American Association for Artificial Intelligence, visit their website at: www.wired.com
    11. To access BBC News’ report on the 2006 Royal Society Summer Science Exhibition, visit their website at: news.bbc.co.uk
    12. The New York Times, July 18, 2006, “Brainy Robots Start Stepping into Daily Life,” by John Markoff. © Copyright 2006 by The New York Times Company. All rights reserved.