This vendor-written piece has been edited by Executive Networks Media to eliminate product promotion, but readers should note it will likely favour the submitter's approach.
Siemens' research laboratories are sprouting spiders that can work together, hands that move like those of humans, and arms that can, without programming, independently determine how to pick up a given object. What these projects have in common - and what they portend for the future - may well turn out to be the most fundamental change ever witnessed in industry: the birth of autonomous machines.
Unlike automated systems, which are characterized by the repetition of painstakingly preprogrammed sequences of motions, autonomous systems - be they industrial or service robots, cars or drones - are those that, according to Roland Siegwart, founding co-director of the Wyss-Zurich Translational Center and professor for autonomous mobile robots at Eidgenössische Technische Hochschule (ETH) Zurich, "have some freedom to make decisions based on their ability to sense and process information in a changing environment."
Such environments will include tomorrow's factories where, as most production experts foresee, humans and robots will work collaboratively on a wide variety of assignments. This will require robots to cope with uncertainty about what their human co-workers will do next and to decide how to respond in the most productive way possible.
Although achieving this kind of seamless collaboration may take years, the first step in this direction is now being taken at Siemens Corporate Technology (CT) in Munich. There, researchers are testing a new technology that will make it possible for a robot to automatically generate its own detailed behavior from a generalized task description. "What we want to be able to do," says Georg von Wichert, who heads the Robotics, Autonomous Systems and Control Research Group at CT, "is to tell a robot what it is supposed to do, but not how to do it. In short, we want it autonomously to decide what it needs to do to perform an assignment." Experts calculate that such a technology could save roughly 50 percent of the cost of setting up new robotic manufacturing cells.
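The idea of telling a robot what to do rather than how to do it can be illustrated with a toy task-level planner. The sketch below is purely hypothetical and is not Siemens' actual technology: the operator states a goal ("part on tray"), and the robot searches its own library of primitive skills to decide the sequence of actions. All skill names and facts are invented for illustration.

```python
# Toy task-level planner (illustrative only, not Siemens' system).
# The goal is declared; the action sequence is derived automatically.
from collections import deque

# Each primitive skill: (name, preconditions, effects) over a set of facts.
SKILLS = [
    ("move_to_part", {"at_home"}, {"at_part"}),
    ("grasp_part", {"at_part"}, {"holding_part"}),
    ("move_to_tray", {"holding_part"}, {"at_tray"}),
    ("release_part", {"at_tray", "holding_part"}, {"part_on_tray"}),
]

def plan(start, goal):
    """Breadth-first search over skill applications until all goal facts hold."""
    queue = deque([(frozenset(start), [])])
    seen = {frozenset(start)}
    while queue:
        state, steps = queue.popleft()
        if goal <= state:          # every goal fact is satisfied
            return steps
        for name, pre, eff in SKILLS:
            if pre <= state:       # skill is applicable in this state
                nxt = frozenset(state | eff)
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, steps + [name]))
    return None                    # no skill sequence achieves the goal

print(plan({"at_home"}, {"part_on_tray"}))
# → ['move_to_part', 'grasp_part', 'move_to_tray', 'release_part']
```

Given only the goal fact "part_on_tray", the planner derives the full pick-and-place sequence itself - the kind of autonomy von Wichert describes, reduced to a minimal search problem.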
Indeed, given the rapidly growing size of the robotics market, the economic significance of autonomous systems could be immense. In 2005, worldwide spending on robotic systems was $11 billion. By 2025 it is expected to reach $67 billion. The flexibility of autonomous machines is widely expected to reduce installation and energy costs, shorten the time it takes to launch new products on the market, and thus lower overall costs compared with conventional automation systems.
Mimicking Human Hands
A key part of most assignments for semiautonomous robots will be the ability to grasp objects. In this connection a team of CT researchers in Beijing has developed a prototype data glove that can be used to capture the movements, gestures and pressure levels of human hands, and thus describe complex commands and the safe handling of a wide variety of objects to robots. Based on a combination of sensory inputs, the gestures of such a glove - including the movements of individual fingers - have been fused and transferred to a "trainee" robotic arm and hand to perform specific tasks in real time. "As this technology evolves," says Dr. Yue Zhuo, who heads CT's Beijing robotics
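The fusion step the glove performs - combining per-finger bend and fingertip-pressure readings into commands a robot hand can follow - can be sketched in a few lines. This is a hypothetical simplification, not the Beijing team's actual glove interface; the function name, the 0-to-1 sensor normalisation, and the scaling constants are all assumptions made for illustration.

```python
# Illustrative sketch (hypothetical, not Siemens' glove API): fuse
# per-finger flex and fingertip-pressure readings from a data glove
# into target joint angles and a grip-force limit for a robot hand.

def fuse_glove_sample(flex, pressure, max_angle=90.0, max_force=20.0):
    """flex and pressure are per-finger readings normalised to 0..1.
    Returns (joint_angles_deg, force_limit_newtons)."""
    angles = [round(f * max_angle, 1) for f in flex]  # finger bend -> joint angle
    force = round(max(pressure) * max_force, 1)       # hardest press sets grip limit
    return angles, force

# One sample for five fingers (thumb..pinky): flex and fingertip pressure.
angles, force = fuse_glove_sample(
    flex=[0.8, 0.9, 0.9, 0.7, 0.5],
    pressure=[0.4, 0.6, 0.5, 0.2, 0.1],
)
print(angles, force)  # → [72.0, 81.0, 81.0, 63.0, 45.0] 12.0
```

In a real system this mapping would run per sample in the glove's streaming loop, so the trainee arm mirrors the operator's hand in real time, as the CT researchers describe.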