Who’s the Boss? Next-Gen Factory Robots Could Call the Shots

Humans and robots will work elbow to elbow on the shop floor, but you'll be surprised by who's giving the orders

The minute Michael Dawson-Haggerty burst into my office, clad in a blackened lime-green welding jacket and wearing a big smile, I knew he and his partner had won. Their test: weld a metal space frame for a Humvee—a military vehicle ubiquitous in Iraq and Afghanistan—faster than a team of experts with decades of experience.

This was Dawson-Haggerty's first professional job—he had just completed his master's degree and joined the engineering staff at Carnegie Mellon University's Robotics Institute—and it is fair to say that he had been a little nervous as he got started. Truth be told, I was more worried about his partner, who was reliable enough but generally lacked people skills.


Dawson-Haggerty's partner on this project was a robot, similar to those huge industrial machines we typically associate with assembly-line work at Ford or General Motors. Yet whereas those mechanical monsters operate inside cages to keep humans safely apart from unforgiving automated thrusts, we modified Spitfire—our 13-foot-tall, one-armed welding robot equipped with a laser for an eye—to work right alongside a person. And instead of Spitfire taking orders from Dawson-Haggerty, the team tended to work the other way around: the robot dictated the next steps, with the hard work of positioning and welding divided between Dawson-Haggerty and Spitfire according to who could most efficiently complete each task. The robot, not the human, often called the shots.

With the work so split, Dawson-Haggerty and his robot partner built the frame in 10 hours for $1,150, including raw materials and labor. The experts we had hired to serve as our control group performed the same task in 89 hours and billed us $7,075.

The economic consequences of a human's ability to work with a robot, and vice versa, are potentially enormous. Factories could do away with painstakingly configured assembly lines, saving billions in equipment setup costs. Need to modify a popular product? Human-robot teams can create custom versions of anything from electronics to airplanes without the need for expensive retooling. The technology will allow companies to respond quickly to consumer demand, updating products in cycles measured in weeks, not years. And workers should find the ever-changing challenges of the factory floor rewarding. For these reasons and more, we need to realize that robots may ultimately be more effective as supervisors, not slaves.

Keeping Your Head

There is always a lot of discussion surrounding what, exactly, a “robot” is. The robotics research community defines them as machines that can sense, think and act autonomously. This is not quite right—your house's thermostat can do all these things, yet you would not classify your house as a robot. The difference is that your thermostat is just a small part of what your house does. Only when “robotic” functions are used in service of an object's core responsibility can the object itself be considered a robot. For example, when a self-driving car uses sensors and artificial intelligence to enable transportation—a car's essential function—it becomes a robot.

Manufacturers have deployed robots for more than half a century to improve efficiency through automation. Yet robots have been special-purpose machines—excellent at, say, welding a certain set of joints on every car coming down an assembly line. Humans have done the organization, setting up the assembly line to capitalize on their robots’ strength and precision.

The process works well for products such as cars that come down assembly lines by the tens of thousands. Yet with the rise of custom manufacturing, where suppliers create small batches of products on demand, the time it takes to set up a process such as welding or machining becomes a major bottleneck. It takes far too long to prep the robot for its job—sometimes months. People must plan the welding sequence, fasten the parts, program the robot, prepare stock material and optimize welding parameters.

Partnering someone like Dawson-Haggerty with a manufacturing robot could cut setup time dramatically. In the past, programmers used special code to tell robots how to move. Now a product's computer-aided design (CAD) file is all that's needed to set up a smart assembly line. Algorithms will translate these designs into the robot's to-do list.
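
To make that idea concrete, here is a minimal sketch of how a parts list pulled from a CAD model might be turned into an ordered to-do list of cuts, fixturing steps and welds. The data structures, field names and task strings are hypothetical stand-ins for illustration, not the actual software used with Spitfire.

```python
# A simplified sketch of turning a CAD-derived parts list into a robot to-do list.
# Part, its fields and the task strings are hypothetical, for illustration only.
from dataclasses import dataclass

@dataclass
class Part:
    name: str
    length_mm: float      # cut length of steel tube
    joints: list[str]     # names of the parts this one is welded to

def build_task_list(parts: list[Part]) -> list[str]:
    """Translate a parts list into an ordered sequence of shop-floor tasks."""
    tasks = []
    for p in parts:                      # 1. cut every tube to length
        tasks.append(f"cut {p.name} to {p.length_mm:.0f} mm")
    seen = set()
    for p in parts:                      # 2. fixture parts, then weld each joint once
        tasks.append(f"position {p.name} in fixture")
        for other in p.joints:
            joint = tuple(sorted((p.name, other)))
            if joint not in seen:
                seen.add(joint)
                tasks.append(f"weld {joint[0]} to {joint[1]}")
    return tasks

frame = [
    Part("rail_left", 1200, ["crossmember_front"]),
    Part("crossmember_front", 800, ["rail_left"]),
]
for step in build_task_list(frame):
    print(step)
```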

Designing an assembly line is not the only challenge, however. Robots and people have had a hard time working together. Industrial robots move from position to position and essentially insist on reaching their final destination—whether or not a person is in the way. Manufacturers program their robots to do the same task over and over again until the parts run out. If a rigid object makes a move impossible, industrial robots go into an error state and basically power down. This condition is better than the alternative of going through someone's head, but it is hardly helpful, either. Consider how much would get done if co-workers just froze when they got too close to one another.

Next-generation industrial robots will be intrinsically safe around humans. If a robot accidentally hits a human, the blow should not be fatal or even dangerous. Machines will be aware of where people are in their work space, and they should be able to communicate with their human counterparts using voices, gestures, “facial” expressions, text and graphics.

Robot makers are already building machines to meet modern manufacturing's workforce needs. Spitfire is based on a robot made by Zurich-based ABB, augmented with special features designed and built at Carnegie Mellon. ABB also offers Frida, a two-armed robot designed to operate safely around people. Meanwhile Boston-based Rethink Robotics, established by iRobot co-founder Rodney Brooks, has developed Baxter, which has two arms as well as an array of sensors to make programming easier than it was for previous generations of robots.

An operator programs Baxter by manually guiding the machine through a series of motions, which the robot later repeats. This feat is accomplished with simple learning algorithms and image processing. For example, if a person shows Baxter how to pick parts off a moving conveyor belt, Baxter will adapt and learn how to do it—even if the parts come down the belt at irregular places and times.
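
The record-and-replay idea behind this kind of programming by demonstration can be sketched in a few lines. This is a toy example with a made-up robot interface, not Rethink's actual software: joint angles are sampled while a person guides the arm, then played back.

```python
# A toy sketch of programming by demonstration: record joint positions while a
# person guides the arm, then replay them. FakeArm stands in for a real robot
# interface so the example runs on its own.
import time

class FakeArm:
    """Stand-in for a real robot arm, assumed to expose read/move methods."""
    def __init__(self):
        self.angles = [0.0] * 7
    def read_joint_angles(self):
        return list(self.angles)
    def move_to(self, angles):
        self.angles = list(angles)

class DemoRecorder:
    def __init__(self, robot, sample_hz=10.0):
        self.robot = robot
        self.period = 1.0 / sample_hz
        self.waypoints = []

    def record(self, duration_s):
        """Sample joint angles while a person physically guides the arm."""
        end = time.time() + duration_s
        while time.time() < end:
            self.waypoints.append(self.robot.read_joint_angles())
            time.sleep(self.period)

    def replay(self):
        """Drive the arm back through the recorded waypoints."""
        for angles in self.waypoints:
            self.robot.move_to(angles)
            time.sleep(self.period)

arm = FakeArm()
teach = DemoRecorder(arm)
teach.record(duration_s=0.5)   # in reality, a person guides the arm here
teach.replay()
print(f"replayed {len(teach.waypoints)} recorded waypoints")
```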

Willow Garage in Menlo Park, Calif., has created a mobile demonstration robot called the PR2 with two arms, a head and an array of sensors. Like Frida and Baxter, the PR2 is designed to work safely side by side with humans. Here at Carnegie Mellon we use the PR2 to serve drinks and snacks to visitors in relatively chaotic environments.

Deferential Treatment

Spitfire does not just learn from humans. It is also smart enough to instruct them. Spitfire breaks up big projects into little steps and divides those tasks according to who can do them faster—robot or human—with no preference given to either.
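
A simplified way to picture that division of labor: estimate how long each step would take each worker and give the step to whoever is faster, treating "cannot do it at all" as an infinite time. The time estimates below are illustrative only, not measurements from our experiment.

```python
# A minimal sketch of dividing tasks between robot and human by estimated time,
# with no built-in preference for either. All numbers are illustrative.

def assign_tasks(tasks: dict[str, dict[str, float]]) -> dict[str, str]:
    """tasks maps a task name to its estimated duration (minutes) per worker.
    A duration of float('inf') means that worker cannot do the task at all."""
    assignment = {}
    for name, estimates in tasks.items():
        # Whoever is faster at this step gets it.
        assignment[name] = min(estimates, key=estimates.get)
    return assignment

tasks = {
    "fetch and position tube":  {"human": 2.0, "robot": 6.0},
    "two-inch structural weld": {"human": 4.0, "robot": 0.1},
    "weld in tight corner":     {"human": 5.0, "robot": float("inf")},  # robot can't reach
}
print(assign_tasks(tasks))
# {'fetch and position tube': 'human', 'two-inch structural weld': 'robot',
#  'weld in tight corner': 'human'}
```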

Dawson-Haggerty and Spitfire began their frame-welding job by extracting a “bill of materials” from the space frame's CAD description. Based on this shopping list, the robot's computer automatically planned which parts to order from suppliers and how to cut standard-size steel tubing to precise lengths. The computer then planned the best sequence to perform the welding operations and specified the optimal way to hold parts so they were secure during the welding.
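
One of those planning steps, deciding how to cut the required lengths out of standard-size stock, can be sketched with a simple first-fit heuristic. The stock length, kerf allowance and heuristic here are assumptions chosen for illustration, not the planner Spitfire actually ran.

```python
# A simplified cut-planning step: group required tube lengths onto standard
# stock bars using a greedy first-fit-decreasing heuristic.

def plan_cuts(required_mm: list[float], stock_mm: float = 6000.0,
              kerf_mm: float = 3.0) -> list[list[float]]:
    """Assign cut lengths to stock bars; each cut also consumes a saw kerf."""
    bars: list[list[float]] = []      # cuts assigned to each stock bar
    remaining: list[float] = []       # unused length left on each bar
    for length in sorted(required_mm, reverse=True):
        for i, free in enumerate(remaining):
            if length + kerf_mm <= free:
                bars[i].append(length)
                remaining[i] -= length + kerf_mm
                break
        else:                         # no existing bar fits, so start a new one
            bars.append([length])
            remaining.append(stock_mm - length - kerf_mm)
    return bars

print(plan_cuts([1200, 800, 2500, 3100, 900, 450]))
# [[3100, 2500], [1200, 900, 800, 450]]
```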

We also gave Spitfire a miniature classroom projector so it could display images and text directly on the space frame. The images became a type of augmented reality. The robot used its projector to tell Dawson-Haggerty, step by step, how to set up the complex construction process—where the parts and the fixtures went in the work space and the order of welding operations. Dawson-Haggerty moved everything into place. Here the human was the better option for what we would consider “grunt work” because the parts were relatively lightweight and came in a variety of shapes that could be easily grasped by a human hand.

Spitfire also used its laser-displacement sensor to accurately perceive its three-dimensional work space and check to make sure that all parts were properly aligned. Using the projector and the sensor, it could highlight precise locations on the space frame and lead the human through the building process.

Once the team arranged the parts to be welded, Spitfire could take over and make quick work of that job. Not only is Spitfire a fast welder—taking just five seconds to make a two-inch weld—its welds are superb. Typically before each job, a welding expert will tune about 20 critical welding parameters such as voltage, welding speed and weld-wire feed rate. In our experiment, we instructed Spitfire to set up trials that it could run on its own to optimize all these variables. As these experiments proceeded, Spitfire measured the results of trial runs and adjusted its settings to improve its performance. The robot taught itself to be an expert welder.
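
The underlying loop is easy to sketch: run a trial, measure the result, and keep a parameter change only if it improves the score. The parameter names and the scoring function below are placeholders, not Spitfire's real settings or measurements.

```python
# A toy version of the self-tuning loop: perturb one welding parameter at a
# time and keep the change only if the scored trial improves.
import random

def score_weld(params: dict[str, float]) -> float:
    """Stand-in for measuring a real trial weld; higher is better."""
    # Pretend the ideal settings are 24 V, 12 mm/s travel, 7 m/min wire feed.
    ideal = {"voltage": 24.0, "travel_speed": 12.0, "wire_feed": 7.0}
    return -sum((params[k] - ideal[k]) ** 2 for k in ideal)

def tune(params: dict[str, float], trials: int = 500) -> dict[str, float]:
    best, best_score = dict(params), score_weld(params)
    for _ in range(trials):
        candidate = dict(best)
        key = random.choice(list(candidate))
        candidate[key] += random.uniform(-0.5, 0.5)   # small perturbation
        s = score_weld(candidate)
        if s > best_score:                            # keep only improvements
            best, best_score = candidate, s
    return best

print(tune({"voltage": 20.0, "travel_speed": 15.0, "wire_feed": 5.0}))
```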

Considering that the space frame required 400 welds, Spitfire's speed and prowess were a huge advantage. But Spitfire is not perfect. In some cases, the robot could not reach particular welds, so it instructed Dawson-Haggerty to step in and perform those tricky operations.

Lights On

While it is difficult to predict exactly how soon human-robot teams will first dance on the factory floor—manufacturers are often slow to adopt new technologies—the clear advantages of intelligent automation should push companies toward collaborative systems within the next five years. Our vision of advanced manufacturing has come a long way from “lights out” production, made famous in Kurt Vonnegut's 1952 novel Player Piano, wherein automated factories do all the work. As the story goes, automation makes labor obsolete, but it also makes people embittered by their meaningless lives—an unacceptable (and unnecessary) path.

A better way forward is robots and humans cooperating as teams in which tasks are dynamically assigned according to capability. The hope is that people can take pleasure from the satisfaction of being deeply involved in the process of making things—even if they are sometimes taking orders from a machine.