Industrial robots are truly incredible CNC machines: not just for their speed, power, and precision, but for their adaptability. Unlike other CNC machines, an industrial robot is completely transformed by the tool you put on the end of its arm: put a sprayer on it, and it becomes a painting robot; put a gripper on it, and it becomes a material handling robot; put a welder on it, and it becomes a spot welding robot.
This adaptability has made the industrial robot a key piece of infrastructure for factory automation over the past 50 years. But despite their adaptability, industrial robots are fairly dumb machines: they have little-to-no awareness of the environment outside of their programmed tasks. This is one of the main reasons why industrial robots have thrived only in highly controlled environments, like factories. They need places where unpredictable objects (a.k.a. people) are strictly separated from their work zones.
But an industrial robot's adaptability is useful beyond the factory. Putting a film camera onto an industrial robot gives a director precise, complex, and repeatable camera moves. Putting a loader onto an industrial robot gives a construction worker a way to move heavier quantities of materials. Putting a light onto an industrial robot gives a photographer more precise control of a scene's ambiance. While these are somewhat mundane use cases, they tease out the two biggest challenges for bringing industrial robots outside of the factory: because they are blind to the world, they are very dangerous to use; and because they require highly technical skills to program, they are very difficult to use.
For Pier 9's Fall 2015 Artist in Residence program, I decided to tackle these two challenges and build a way for industrial robots to be safer and easier to use in uncontrolled settings. I created Quipt, a gesture-based control software that gives industrial robots spatial awareness and spatial behaviors for interacting closely with people.
Find out more about how Quipt was made in the next few steps...
See the full project page here.
Quipt is a gesture-based control software that facilitates new, more intuitive ways to communicate with industrial robots. Using wearable markers and a motion capture system, Quipt gives industrial robots basic spatial behaviors for interacting closely with people. Wearable markers on the hand, around the neck, or elsewhere on the body let a robot see and respond to you in a shared space. This lets you and the robot safely follow, mirror, and avoid one another as you collaborate.
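To give a sense of how a behavior like "follow" can stay safe, here is a minimal sketch of a follow-with-standoff rule: the tool moves toward a tracked marker but holds a fixed safety distance from it, and never moves more than a small step per update. This is an illustrative assumption, not Quipt's actual implementation; the function name, units (meters), and parameter values are all hypothetical.

```python
import math

def follow_target(marker, tool, standoff=0.5, max_step=0.05):
    """Compute the tool's next position (hypothetical follow behavior).

    marker, tool -- (x, y, z) positions in meters
    standoff     -- safety bubble radius the tool never enters
    max_step     -- maximum motion per update, for smooth, safe moves
    """
    delta = [m - t for m, t in zip(marker, tool)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist <= standoff:
        return tuple(tool)  # inside the safety bubble: hold position
    # Move toward the marker, but stop at the standoff boundary
    move = min(dist - standoff, max_step)
    scale = move / dist
    return tuple(t + d * scale for t, d in zip(tool, delta))
```

Running this rule in a loop against live marker positions yields a robot that trails a person at arm's length and freezes when they step too close; inverting the direction of `delta` would give an "avoid" behavior instead.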
Quipt augments an ABB IRB 6700 industrial robot by giving it eyes into its environment. Quipt receives data from a Vicon motion capture system and reformats it into corresponding movement commands for the 6700. Movement commands are generated using our open-source library, Robo.Op (see it on github). Quipt also visualizes debugging data in an Android app, so a human collaborator has a mobile, continuous view of what the robot is seeing.
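The receive-and-reformat step can be pictured as a tiny translation function: one motion-capture sample in, one movement command out. The sketch below assumes a sample is a dict carrying an (x, y, z) position in millimeters and emits a generic move-to string; the `MOVETO` syntax, field names, and speed parameter are invented for illustration and are not Robo.Op's actual command format.

```python
def mocap_to_command(sample, speed=100):
    """Translate one (hypothetical) mocap sample into a move command.

    sample -- dict with a "position" key: (x, y, z) in millimeters
    speed  -- commanded speed value appended to the instruction
    """
    x, y, z = sample["position"]
    # Format a single move-to instruction for the robot controller
    return f"MOVETO {x:.1f} {y:.1f} {z:.1f} V{speed}"
```

In a system like this, such a translation runs continuously: each incoming Vicon frame is converted and streamed to the controller, while the same data is forwarded to the Android app for debugging.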