
The Best Australian Science Writing 2015


  Social robots are coming

  Wilson da Silva

  There is something unnerving about Geminoid F. She looks like a Japanese woman in her 20s, about 165 cm tall with long dark hair, brown eyes and soft pearly skin. She breathes, blinks, smiles, sighs, frowns, and speaks in a soft, considered tone.

  But the soft skin is made of silicone, and underneath it lie urethane foam flesh, a metal skeleton and a plastic head. Her movements are powered by pressurised gas, and an air compressor is hidden behind her seat. She sits with her lifelike hands folded casually on her lap. She – one finds it hard to say ‘it’ – was recently on loan to the Creative Robotics Lab at the University of New South Wales in Sydney, where robotics researcher David Silvera-Tawil set her up for a series of experiments.

  ‘For the first three or four days, I would get a shock when I came into the room early in the morning,’ he says. ‘I’d feel that there was someone sitting there looking at me. I knew there was going to be a robot inside, and I knew it was not a person. But it happened every time!’

  The director of the lab, Mari Velonaki, an experimental visual artist turned robotics researcher, has been collaborating with Geminoid F’s creator, Hiroshi Ishiguro, who has pioneered the design of lifelike androids at his Intelligent Robotics Laboratory at Osaka University. Their collaboration seeks to understand ‘presence’ – the feeling we have when another human is in our midst. Can this sensation be reproduced by robots?

  Velonaki has also experienced the shock of encountering Geminoid F. ‘It’s not about repulsion,’ she says. ‘It’s an eerie, funny feeling. When you’re there at night, and you switch off the pneumatics … it’s strange. I’ve worked with many robots; I really like them. But there’s a moment when there’s an element of … strangeness.’

  This strangeness has also been observed with hyper-real video or movie animations. It even has a name – ‘the uncanny valley’ – the sense of disjunction we experience when the impression that something is alive and human does not entirely match the evidence of our senses.

  For all her disturbing attributes, Geminoid F’s human-like qualities are strictly skin deep. She is actually a fancy US$100 000 puppet, who is partly driven by algorithms that move her head and face in lifelike ways, and partly guided by operators from behind the scenes. They oversee her questions and answers to ensure they’re relevant. Geminoid F is not meant to be smart. She’s been created to help establish the etiquette of human–robot relations.

  Which is why those studying her are cross-disciplinary types. ‘We hope that collaborations between artists, scientists and engineers can get us closer to a goal of building robots that interact with humans in more natural, intuitive and meaningful ways,’ says Silvera-Tawil.

  It is hoped Geminoid F will help pave the way for robots to take their first steps out of the fields and factory cages to work alongside us. In the near future her descendants – some human-like, others less so – will be looking after the elderly and teaching children.

  It will happen sooner than you think.

  * * * * *

  Rodney Brooks has been called ‘the bad boy of robotics’. More than once he has turned the field upside down, bulldozing shibboleths with new approaches that have turned out to be prophetic and influential.

  Born in Adelaide, he moved to the US in 1977 for his PhD, and by 1984 he was on the faculty at the Massachusetts Institute of Technology. There he created insect-like robots that, with very little brainpower, could navigate over rough terrain and climb steps. At the time, the dominant paradigm was that robot mobility required massive processing power and a highly advanced artificial intelligence. Brooks reasoned that insects had puny brains and yet could move and navigate, so he created simple independent ‘brains’ for each of the six legs of his robots, which followed basic commands (always stay upright, irrespective of direction of motion), while a simple ‘overseer’ brain coordinated their collective movement. His work spawned what is now known as behaviour-based robotics, an approach now used in mining field robots and bomb disposal robots.

  But it is the work he began in the 1990s – developing humanoid robots and exploring human–robot interactions – that may be an even greater game changer. First he created Cog, a humanoid robot of exposed wires, mechanical arms and a head with camera eyes, programmed to respond to humans. Cog’s intelligence grew in the same way a child’s does – by interacting with people. The Cog experiment fathered social robotics, in which autonomous machines interact with humans by using social cues and responding in ways people intuitively understand.

  Brooks believes robots are about to become more commonplace, with ‘social robots’ leading the way. Consider the demographics – the percentage of working-age adults in the US and Europe is around 80 per cent, a statistic that has remained largely unchanged for 40 years. But over the next 40 years, this will fall to 69 per cent in the US and 64 per cent in Europe as the boomers retire.

  ‘As the people of retirement age increase, there’ll be fewer people to take care of them, and I really think we’re going to have to have robots to help us,’ Brooks says. ‘I don’t mean companions – I mean robots doing things, like getting groceries from the car, up the stairs into the kitchen. I think we’ll all come to rely on robots in our daily lives.’

  In the 1990s, he and two of his MIT graduate students, Colin Angle and Helen Greiner, founded iRobot Corp, maker of the Roomba robot vacuum cleaner. It was the first company to bring robots to the masses – 12 million of its products have been sold worldwide, and more than one million are now sold every year.

  The company began by developing military robots for bomb disposal work. Known as PackBots, they’re rovers on caterpillar tracks packed with sensors and with a versatile arm. They’ve since been adapted for emergency rescue, handling hazardous materials or working alongside police hostage teams to locate snipers in city environments. More than 5000 have been deployed worldwide. They were the first to enter the damaged Fukushima nuclear plant in 2011 – although they failed in their bid to vent explosive hydrogen from the plant.

  With the success of the Roomba, iRobot has since launched other domestic lines: the floor-mopping Braava, the gutter-cleaning Looj and the pool-cleaning Mirra. Its latest offering is the tall, free-standing RP-VITA, a telemedicine health care robot approved by the US Food and Drug Administration in 2013. It drives itself to pre-operative and post-surgical patients within a hospital, allowing doctors to assess them remotely.

  Other companies have sprouted up in the past 15 years manufacturing robots that run across rocky terrain, manoeuvre in caves and underwater, or that can be thrown into hostile situations to provide intelligence.

  Robot skills have grown through advances in natural language processing, artificial speech, vision and machine learning, and the proliferation of fast and inexpensive computing aided by access to the internet and big data. Computers can now tackle problems that, until recently, only people could handle. It’s a self-reinforcing loop – as machines understand the real world better, they learn faster.

  Robots that can interact with ordinary people are the next step. This is where Brooks comes in. ‘We have enough understanding of human–computer interaction, and human–robot interaction, to start building robots that can really interact with people,’ he says. ‘An ordinary person, with no programming knowledge, can show it how to do something useful.’

  In 2008 Brooks founded another company, Rethink Robotics, which has done exactly that – created a collaborative robot that can safely work elbow-to-elbow with humans. Its robot, Baxter, requires no programming and learns on the job, much as humans do. If you want it to pick an item from a conveyor belt, scan it and place it with others in a box, you grasp its mechanical hand and guide it through the entire routine. It works out what you mean it to do and goes to work.

  Baxter is cute too. Its face is an electronic screen, dominated by big, expressive cartoon eyes. When its sonar detects someone entering a room, it turns and looks at them, raising its virtual eyebrows. When Baxter picks something up, it looks at the arm it’s about to move, signalling to co-workers what it’s going to do. When Baxter is confused, it raises an eyebrow and shrugs.

  Baxter, priced at an affordable US$25 000, is aimed at small to medium businesses, for which robots have been prohibitively expensive until now.

  While robots are a big business today, generating US$29 billion in annual sales, the market is still dominated by old-school industrial machines – disembodied arms reliant on complex and rigid programming. These automatons haven’t really changed much from those that began to appear on factory floors in the 1960s. They are stand-alone machines stuck in cages, hardware-based and unsafe for people to be around. Nevertheless, 1.35 million now operate worldwide, with 162 000 new ones sold every year. They’re used for welding, painting, assembly, packaging, product inspection and testing – all accomplished with speed and precision 24 hours a day.

  But Baxter and his ilk are starting to shake up the field. ‘In the new style of robots, there’s a lot of software with common-sense knowledge built in,’ says Brooks.

  Launched in 2012, Baxter is now used in 18 countries, in applications such as manufacturing, health care and education. Rethink Robotics’ backers include Amazon’s Jeff Bezos, whose own company is a big user of robots to handle goods in its warehouses.

  When Google revealed in December 2013 it had acquired eight robotics companies, it sent a thunderbolt through the field. Google created a division led by Andy Rubin, the man who spearheaded Android, the world’s most widely used smartphone software, and who began his career as a robotics engineer. Only a month later, Google shelled out US$650 million to buy DeepMind Technologies, a secretive artificial intelligence company in London developing general-purpose learning algorithms.

  ‘As of 2014, things are finally changing,’ says Dennis Hong, who heads the Robotics and Mechanisms Laboratory at the University of California in Los Angeles. ‘The fact that Google bought these companies shows that, finally, it’s time for the robotics business to really start.’

  * * * * *

  Where does that leave social robotics? Since Baxter came on the scene, ‘everybody’s saying they’ve got collaborative robots,’ chuckles Brooks. ‘But some of them are just dressed up, old-style interfaces. Industrial robots have not been made easy to use because it’s engineers who use them, and they like that complexity. We made them popular by making them easy to use.’

  But as people and money flood into the field, artificial intelligence with social smarts is developing fast, says Brooks. Take Google’s self-driving car: the original algorithms were found to be useless in traffic. The cars would become trapped at four-way stop sign intersections because they couldn’t read other drivers’ intentions. The solution came in part by incorporating social skills into the algorithm.

  Brooks hopes that Baxter will become smart and cheap enough that researchers will develop applications beyond manufacturing. Updates to its operating system already allow the latest model Baxters to be twice as accurate and operate three times faster than earlier models.

  Brian Scassellati, who studied under Brooks and is now a professor of computer science at Yale University, also believes robots are about to leave the factory and enter homes and schools. ‘We’re entering a new era … something we saw with computers 30 years ago. Robotics is following that same curve,’ he says. ‘They are going to have a very important impact on populations that need a little bit of extra help, whether that’s children learning a new language, adults who are ageing and forgetful, or children with autism spectrum disorder who are struggling to learn social behaviour.’

  In 2012, Scassellati’s Social Robotics Lab began a five-year, US$10 million US National Science Foundation program with Stanford University, MIT and the University of Southern California to develop a new breed of ‘socially assistive’ robots designed to help young children learn to read, overcome cognitive disabilities and perform physical exercises.

  ‘At the end of five years, we’d like to have robots that can guide a child towards long-term educational goals … and basically grow and develop with the child,’ he says.

  Despite the progress in human–robot interaction that has led to machines such as Baxter, Scassellati’s challenge is still daunting. It requires robots to detect, analyse and respond to children in a classroom; to adapt to their interactions, taking into account each child’s physical, social and cognitive differences; and to develop learning systems that achieve targeted lesson goals over weeks and months.

  To try to achieve this, robots will be deployed in schools and homes for up to a year, with the researchers monitoring their work and building a knowledge base.

  Early indications are that real gains can be made in education, says Velonaki. Another of her collaborators, cognitive psychologist Katsumi Watanabe of the University of Tokyo, has tested how autistic children interact, over several days, with three types of robot: a fluffy toy that talks and reacts; a humanoid with cables and wires visible; and a lifelike android. Children usually prefer the fluffy toy to start with, but as they interact with the humanoid, and later the android, they grow in confidence and interaction skills – and have been known to interact with the android’s human operators when they emerge from behind the controls.

  ‘By the time they go to the android, they’re almost ready to interact with a real human,’ she says.

  * * * * *

  The number one fear people have of smart, lifelike humanoid robots is not that they’re creepy – but that they will take people’s jobs. And according to some economists and social researchers, we are right to worry.

  Erik Brynjolfsson and Andrew McAfee of MIT’s Center for Digital Business say that, even before the global financial crisis in 2008, a disturbing trend was visible. From 2000 to 2007, US GDP and productivity rose faster than they had in any decade since the 1960s – yet employment growth slowed to a crawl. They believe this was due to automation, and that the trend will only accelerate as big data, connectivity and cheaper robots become more commonplace.

  ‘The pace and scale of this encroachment into human skills is relatively recent and has profound economic implications,’ they write in their book, Race Against the Machine.

  Economic historian Carl Benedikt Frey and artificial intelligence researcher Michael Osborne at the University of Oxford agree. They estimate that 47 per cent of American jobs could be replaced ‘over the next decade or two’, including ‘most workers in transportation and logistics … together with the bulk of office and administrative support workers, and labour in production occupations’.

  Perhaps unsurprisingly, the robot industry takes the opposite view: that the widespread introduction of robots in the workplace will create jobs – specifically, jobs that would otherwise go offshore to developing countries. And they may have a point.

  In June 2014, for example, the European Union launched the US$3.6 billion Partnership for Robotics in Europe initiative, known as SPARC. The EU calls it the world’s largest robotics research program and expects it will ‘create more than 240 000 jobs’.

  Job growth is certainly happening at Denmark’s Universal Robots, which also makes collaborative robots. The company has grown 40-fold in the last four years, employs 110 people and is putting on another 50 in 2014. Its robots – UR5 and UR10 – look like disembodied arms with cameras attached. They are operated by desktop controllers and taught tasks using tablet computers. They are not as social as Baxter, but they are able to work alongside humans.

  ‘The more a company is allowed to automate, the more successful and productive it is, allowing it to employ more people,’ chief executive Enrico Krog Iversen told The Financial Times in May 2014. But the jobs they will be doing will change, he argues. ‘People themselves need to be upgraded so they can do something value-creating.’

  That’s been true for some robot clients. Both Universal Robots and Rethink Robotics say customers have hired more people as output in small companies has increased.

  Brooks believes the fear that robots are going to take away all the jobs is overplayed. The reality could be the opposite, he argues. It’s not only advanced Western economies that are faced with a shrinking human workforce as their populations age. Even China is facing a demographic crisis: the proportion of its adults who are of working age will drop to 67 per cent by 2050, he says. By the time we’re old and infirm, we could all be reliant on robots.

  ‘I’m not worried about them taking jobs,’ quips Brooks, ‘I’m worried we’re not going to have enough smart robots to help us.’

  How dust affects climate, health and … everything

  Tim Low

  Big things are made up of many small things.

  That was especially obvious in September 2009 when extreme winds roared across outback Australia, agitating soil laid bare by drought to produce the giant dust storm known as Red Dawn that engulfed eastern Australia, reddening skies from southern NSW to north Queensland, fanning bushfires, damaging crops, delaying planes, halting construction work, triggering smoke alarms, driving up hospital admissions, smearing windows and walls and seeping inside homes to coat floors and furniture in fine powder.

  This herculean event, which elicited comparisons with nuclear winter, Armageddon and the planet Mars, swept on to New Zealand, where it sent asthmatics to hospital and dusted alpine snow. In NSW alone the event cost an estimated $330 million in lost topsoil, crop damage, car smashes, worker absenteeism, cleaning and the closure of Sydney Airport.

  The particles behind the strife were so small that 100 000 weighed a mere gram, but they rose up in such numbers that Australia managed to lose more than a million tonnes of soil, broadcast into the Tasman Sea and sprinkled over New Zealand. The drama surprised the nation, but Red Dawn was by no means the first onslaught of dust to hit the east coast and it won’t be the last. Inland, they’re more common: the most recent struck Bedourie, western Queensland, last December, when day turned to night and dust enveloped the town for more than 90 minutes. Australia is one of the great dust-producing lands, the main source in the southern hemisphere. If Australia faces a drier future, it will be a dustier one as well.

 
