Robots are quickly becoming more integrated into our day-to-day lives. They clean our floors, make our coffee, and are even used for telepresence. Since they are becoming so vital to society, why not give our robotic companions the gift of vision?
In this instructable, I will show you how to use the Microsoft Kinect to provide three-dimensional vision and depth to a robotic arm in order to assist in the automation of basic tasks. By the end of this instructable, you will be able to use the Microsoft Kinect to detect the three-dimensional position of a randomly placed object and relay that position to a robotic arm, which can then pick up the object with no input from the user.
For more detailed instructions and additional information, please visit http://kinectkontrol.weebly.com/. To complete this project, you will need:
- Rhino XR-4 Robotic Arm
- Mark IV Controller (connected to a PC via an RS-232C interface)
- Microsoft Kinect (first iteration)
- Windows 7 32/64 bit (64 bit is preferred) or later
- MatLab R2011b or later.
- CMEX Compiler: Microsoft Visual Studio 2010 Express Edition (VC++)
- Simulink Support for Kinect
Step 1: Software Installation
Once you have the necessary hardware, you will want to install the following software on your Windows 7 (or later) computer.
1. Download MatLab R2011b or later
- Available from http://www.mathworks.com/
2. Download Microsoft Visual Studio 2010 Express Edition (VC++)
- Available from http://www.microsoft.com/visualstudio/eng/download...
3. Download OpenNI
- Available from http://www.openni.org/Downloads/OpenNIModules.asp...
- Select, download, and install: OpenNI Binaries-Stable-OpenNI Stable Build for Windows Development Edition, and select your system (32 bit or 64 bit)
- Select, download, and install: OpenNI Compliant Middleware Binaries-Stable-PrimeSense NITE Stable Build for Windows, and select your system (32 bit or 64 bit)
- Select, download, and install: OpenNI Compliant Hardware Binaries-Stable-PrimeSensor Module Stable Build for Windows, and select your system (32 bit or 64 bit)
4. Download Simulink Support for Kinect
- Available from http://www.mathworks.com/matlabcentral/fileexchang...
5. Configure the C Compiler in MatLab
- Type mex -setup in the MatLab command window to select the compiler configuration, and follow the on-screen instructions to select Microsoft Visual Studio 2010 Express Edition as the compiler.
More information on configuring the C compiler in MatLab can be obtained from: http://www.mathworks.com/help/matlab/ref/mex.html
6. Install Simulink Support for Kinect
- Reboot your PC
- Unzip the Simulink Support Folder and save it to a directory of your choosing
- Open and run slkinect/setup_openni.m. If everything is installed correctly, a CMEX file (sfun_nid.mexw32) will be built.
- Play with the different demo models in the slkinect/Samples directory. Please note that the Kinect Microphone Array and Sensor Angle demos do not work with the OpenNI SDK, but they are not needed for this project.
Step 2: Microsoft Kinect and Rhino XR-4 Robotic Arm
The Microsoft Kinect is a physical device that contains cameras, a microphone array, and an accelerometer as well as a software pipeline that processes color, depth, and skeleton data. The Microsoft Kinect contains:
- An RGB camera that stores three-channel data at 1280x960 resolution. This makes capturing a color image possible.
- An infrared (IR) emitter and an IR depth sensor. The emitter emits infrared light beams and the depth sensor reads the IR beams reflected back to the sensor. The reflected beams are converted into depth information measuring the distance between an object and the sensor. This makes capturing a depth image possible.
- A multi-array microphone, which contains four microphones for capturing sound. Because there are four microphones, it is possible to record audio as well as find the location of the sound source and the direction of the audio wave.
- A 3-axis accelerometer configured for a 2G range, where G is the acceleration due to gravity. It is possible to use the accelerometer to determine the current orientation of the Kinect.
More information regarding the Microsoft Kinect can be found here.
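To make the depth data concrete: each depth pixel can be back-projected into a 3-D point with a simple pinhole camera model. The Simulink model used later already provides the X, Y, and Z matrices for you, so this short Python sketch is purely illustrative; the focal length and principal point below are assumed, commonly cited values for the Kinect v1 depth camera, not calibrated ones.

```python
# Sketch: back-projecting a Kinect depth pixel (u, v) with depth z (meters)
# into a 3-D point using a pinhole camera model. The intrinsics below are
# assumed typical values for the Kinect v1 depth camera, not calibrated
# ones -- calibrate your own device if you need real accuracy.

FX = FY = 585.0         # assumed focal length in pixels
CX, CY = 320.0, 240.0   # assumed principal point (640x480 depth image)

def pixel_to_xyz(u, v, z):
    """Convert a depth-image pixel and its depth (m) to Kinect (X, Y, Z)."""
    x = (u - CX) * z / FX
    y = (v - CY) * z / FY
    return (x, y, z)

point = pixel_to_xyz(320, 240, 1.0)  # a point 1 m straight ahead
```

This is the same idea behind the X, Y, and Z matrices you will index into in Step 5.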
Rhino XR-4 Robotic Arm and Mark IV Controller
The Rhino XR-4 Robotic Arm with the Mark IV Controller should be set up according to the instructions of the robotic arm and controller. The Mark IV Controller should be connected to the computer via an RS-232C interface.
More information regarding the Rhino XR-4 and Mark IV Controller can be found at: http://kinectkontrol.weebly.com/rhino-xr-4.html
Step 3: Position, Secure, and Angle the Microsoft Kinect
1. Position and Secure the Microsoft Kinect
*The placement and securing of the Microsoft Kinect with respect to the XR-4 is very important.*
- Position the Microsoft Kinect so that it is directly facing the XR-4, as shown in the pictures (I placed it about 0.9 m from the base of the XR-4). The Kinect should be a minimum of 0.62 meters away from your closest object/target. If you can, secure the Kinect to the table (I used double-sided tape, which works very well) and set it at the desired angle. Ensure that the top of the Kinect is not tilted toward one side and is as parallel to the ground as possible.
2. Record the Angle
- You will also need to record the angle, with respect to the ground, at which you set the Kinect. For your convenience, I have included the approximate angles for each notch, which have held true in my experience. Each notch refers to an audible click that holds the Kinect at that position when you manually adjust the angle. You can also calculate the angle through any other method of your choosing. From this point forward, do not move or re-angle the Kinect.
- 1 Notch = 0 degrees
- 2 Notches = 11.83 degrees
- 3 Notches = 19.11 degrees
- 4 Notches = 25.86 degrees
- 5 Notches = 32.20 degrees
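If you prefer to keep the notch table in code, it fits in a small lookup. This Python sketch is illustrative only (the project itself uses MatLab); it converts a notch count to radians, which is the form the cos/sin terms in the transformation equations of Step 5 expect.

```python
import math

# Notch-to-angle lookup from the table above (degrees with respect to
# the ground). The Kinect's tilt clicks into these discrete positions.
NOTCH_ANGLES_DEG = {1: 0.0, 2: 11.83, 3: 19.11, 4: 25.86, 5: 32.20}

def kinect_angle_rad(notches):
    """Return the Kinect tilt angle in radians for a given notch count."""
    return math.radians(NOTCH_ANGLES_DEG[notches])
```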
Step 4: Download the Attached Simulink Models and MatLab Files
In order to transform the coordinate system of the Microsoft Kinect into the coordinate system of the Rhino XR-4, control the Rhino XR-4 from MatLab, and automate the robotic arm with vision from the Microsoft Kinect, I have written several MatLab scripts and a Simulink model for your convenience. The files should be downloaded to the following directory: slkinect/Samples/win
The files are available for download here:
The markerlocater.mdl file is a Simulink model, which can be easily rebuilt. It should be noted that within the TransformCoordinates function there are three equations. Each equation pertains to the transformation, translation, and scaling of the Kinect coordinate axes to the XR-4 coordinate axes, specific to my setup. These equations are based on very specific information regarding the layout of the Kinect and the XR-4 and are further explained here: http://kinectkontrol.weebly.com/kinectrhino.html
The main.m script controls the Rhino XR-4 robotic arm. It was written to reach for, grab, and pick up a randomly placed object, place it back down, return to the base, and then prompt the user to continue with another run on another randomly placed object.
The init.m file initializes the connection between the computer and the Rhino XR-4 robotic arm and saves the serial connection as the variable 's'. It contains the information needed to initiate the serial connection. This file also assumes that serial port COM1 is used; if you are not using COM1, change it to the right serial port.
The movexyz.m file moves the XR-4 gripper to the specified location (x, y, z), in mm, relative to the origin of the Rhino XR-4. The 'a' refers to the angle of the gripper in relation to the Z-axis of the XR-4 coordinate system, and the 't' refers to the angle of the gripper in relation to the X-axis of the XR-4 coordinate system.
This file sends commands to the XR-4, which were obtained from the XR-4 manual. Some commands include 'TH', which puts the XR-4 into host mode so that it can be controlled from the computer, and 'HH', which tells the XR-4 to go to hard home. Hard home allows the XR-4 to start from a more accurate reference.
This file tells the XR-4 to wait for a specified amount of time.
This file closes and deletes the serial port connection; it is very important to run it when you are finished.
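Taken together, the scripts above implement a simple session: open the serial connection, enter host mode, hard-home, move the gripper, and close the connection. The Python sketch below mirrors that flow against a recording stub instead of real hardware. The 'TH' and 'HH' mnemonics are the ones described above; the 'MOVE' command and the carriage-return terminator are purely hypothetical placeholders, so consult the XR-4 manual for the real motion commands and framing.

```python
# Sketch of the command sequence the MatLab scripts above implement,
# using a recording stub in place of a real RS-232 connection.
# 'TH' (host mode) and 'HH' (hard home) come from the XR-4 manual as
# described above; 'MOVE' and the '\r' terminator are ASSUMPTIONS for
# illustration only.

class StubSerial:
    """Stands in for the serial object 's'; records everything sent."""
    def __init__(self):
        self.log = []
    def write(self, data):
        self.log.append(data)
    def close(self):
        self.log.append("<closed>")

def run_session(s, xyz):
    s.write("TH\r")                    # enter host mode
    s.write("HH\r")                    # hard home for an accurate reference
    s.write("MOVE %d %d %d\r" % xyz)   # hypothetical move command (mm)
    s.close()                          # always close the serial connection

s = StubSerial()
run_session(s, (100, 200, 50))
```

The key point is the order: host mode and hard home before any motion, and a guaranteed close at the end, exactly as the scripts above do.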
Step 5: Coordinate Transformation
1. Microsoft Kinect Coordinates
- Open and run the downloaded markerlocater.mdl Simulink Model in MatLab. Ignore the point and coordinates that appear on the ImageViewer.
- Now, place several different objects (at least 6, I used 6 Dry-Erase Markers) into the view of the Microsoft Kinect and within grabbing distance of the XR-4. The more objects that you use, in more locations, the more accurate your transformation will be. Then click pause on the Simulink model.
- On the Tools bar in the Image Viewer, click Pixel Region. Using this tool, identify the pixel coordinates of the top of each object. Make sure to select pixels slightly below the top edge of each object. Record these pixel locations. Then use these pixel coordinates as indices in the X, Y, and Z matrices within the MatLab function of the Simulink model, to determine X, Y, and Z. This will provide the (X,Y,Z) location of the object with respect to the Microsoft Kinect's coordinate system.
- Note: For the Kinect coordinate system, the Y-axis is considered the vertical axis and the Z-axis is considered the depth axis (imagine the axis is pointing out of the lens of the Kinect).
2. Rhino XR-4 Coordinates
- Next, open and run the downloaded file labeled init.m. This will initialize the connection between the computer and the XR-4 (make sure the Mark IV Controller is turned on). Then open and run the movexyz.m file to move the robot gripper to each marker location by guessing and checking (the numbers you enter are in mm). Record the position of each marker location.
- Note: For the XR-4 coordinate system, the Z-axis is now considered the vertical axis and the Y-axis is now considered the depth axis.
3. Coordinate Transformation Equations
- Once you have the (X,Y,Z) coordinates of the objects in both the Kinect and the XR-4 coordinate systems, you can use matrices and basic digital image processing to create the transformation equations for each axis. These equations will be able to transform any coordinate that the Kinect reads into the XR-4's coordinate system. To do this, several basic transformations need to be performed. For a more complete understanding, please refer to pages 36-40 in "Digital Image Processing" by Rafael C. Gonzalez and Paul Wintz. The equations that you will solve for will be of the following form:
- Rx = Sx * ( x + xo )
- Ry = Sy * ( y + yo ) * cos ( a ) + Sz * ( z + zo ) * sin ( a )
- Rz = -Sy * ( y + yo ) * sin ( a ) + Sz * ( z + zo ) * cos ( a )
- Rx, Ry, Rz = Robot coordinate
- Sx, Sy, Sz = Unknown scaling factor for the x, y and z axis
- x, y, and z = the Microsoft Kinect coordinates
- xo, yo, and zo = Unknown displacement
- a = Angle of the Microsoft Kinect with respect to the horizon
- Given the above equations, the angle of the Kinect, and at least 6 Microsoft Kinect coordinates with their respective Rhino XR-4 coordinates, the equations can be solved. This can be done using MatLab, Mathematica, an advanced scientific graphing calculator (such as a TI-89), or any other method of your choosing.
- You can then replace the equations in the MatLab function labeled TransformCoordinates with the new ones you calculated. Be sure to switch the y and z variables in the equations, since these axes are swapped between the Microsoft Kinect and the Rhino XR-4 robotic arm.
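As a worked example of the solving step: once the angle a is known, substituting Sx and the product Sx*xo (and likewise for the y and z pairs) makes every equation linear in the unknowns, so ordinary least squares over your recorded point pairs recovers all six values. The Python sketch below is illustrative only; in MatLab the same least-squares solve is the backslash operator, and any point pairs used to try it out are made up.

```python
import math

# Sketch: solving the transformation equations above for the scale
# factors and offsets, given the Kinect tilt angle 'a' and corresponding
# (Kinect, XR-4) point pairs. With 'a' known, the substitutions
# [Sx, Sx*xo] (and [Sy, Sy*yo, Sz, Sz*zo]) make the system linear.

def solve_normal(A, b):
    """Least-squares solve A p = b via the normal equations
    (Gaussian elimination on A^T A p = A^T b); fine for tiny systems."""
    m, n = len(A), len(A[0])
    M = [[sum(A[k][i] * A[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    v = [sum(A[k][i] * b[k] for k in range(m)) for i in range(n)]
    for i in range(n):                       # elimination with pivoting
        piv = max(range(i, n), key=lambda r: abs(M[r][i]))
        M[i], M[piv] = M[piv], M[i]
        v[i], v[piv] = v[piv], v[i]
        for r in range(i + 1, n):
            f = M[r][i] / M[i][i]
            for c in range(i, n):
                M[r][c] -= f * M[i][c]
            v[r] -= f * v[i]
    p = [0.0] * n
    for i in range(n - 1, -1, -1):           # back substitution
        p[i] = (v[i] - sum(M[i][c] * p[c] for c in range(i + 1, n))) / M[i][i]
    return p

def fit_transform(kinect_pts, robot_pts, a):
    """Return (Sx, xo, Sy, yo, Sz, zo) from >= 6 point pairs."""
    ca, sa = math.cos(a), math.sin(a)
    # Rx = Sx*x + (Sx*xo): unknowns [Sx, Sx*xo]
    Ax = [[x, 1.0] for (x, y, z) in kinect_pts]
    bx = [rx for (rx, ry, rz) in robot_pts]
    sx, sxxo = solve_normal(Ax, bx)
    # Ry =  ca*(Sy*y + Sy*yo) + sa*(Sz*z + Sz*zo)
    # Rz = -sa*(Sy*y + Sy*yo) + ca*(Sz*z + Sz*zo)
    # unknowns [Sy, Sy*yo, Sz, Sz*zo], both equations stacked together
    Ayz, byz = [], []
    for (x, y, z), (rx, ry, rz) in zip(kinect_pts, robot_pts):
        Ayz.append([ca * y, ca, sa * z, sa]);   byz.append(ry)
        Ayz.append([-sa * y, -sa, ca * z, ca]); byz.append(rz)
    sy, syyo, sz, szzo = solve_normal(Ayz, byz)
    return sx, sxxo / sx, sy, syyo / sy, sz, szzo / sz
```

With exact measurements this recovers the unknowns exactly; with real, noisy measurements it gives the best least-squares fit, which is why using more than six objects improves the transformation.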
Step 6: Defining a New Volume
Now, you can define a new volume of space in which to detect the tallest object. To do so, replace the values in the MatLab code of the TransformCoordinates function in the Simulink model. These values are in mm and are measured with respect to the origin of the XR-4 coordinate axes. To determine the approximate location of the origin, you can use movexyz.m to move the robot to a known location and work backward from there.
Optional: You can move this part of the code in front of the transformation equations, with the necessary changes in variables. This way, only the isolated volume will be transformed. If you do this, the new isolated volume will be in meters and with respect to the Kinect coordinate system rather than that of the XR4.
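To illustrate the volume test itself: in XR-4 coordinates the check is just a box membership test followed by a maximum over Z (the vertical axis, as noted earlier). Here is a minimal Python sketch; the box limits are placeholder values that you should replace with your measured volume.

```python
# Sketch: isolating a working volume and picking the tallest point in it.
# Points are in XR-4 coordinates (mm), where Z is the vertical axis.
# The box limits below are PLACEHOLDERS -- substitute the volume you
# measured with movexyz.m.

BOX = {"x": (-200.0, 200.0), "y": (150.0, 500.0), "z": (0.0, 400.0)}

def tallest_in_volume(points, box=BOX):
    """Return the point with the largest Z among those inside the box,
    or None if no point falls inside."""
    inside = [p for p in points
              if box["x"][0] <= p[0] <= box["x"][1]
              and box["y"][0] <= p[1] <= box["y"][1]
              and box["z"][0] <= p[2] <= box["z"][1]]
    return max(inside, key=lambda p: p[2], default=None)
```

This is the same filtering the TransformCoordinates function performs, whether you run it after the transformation (in mm, XR-4 frame) or before it (in meters, Kinect frame) as described in the optional variant above.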
Step 7: Putting It All Together!
To communicate the coordinates of the highest point in the defined volume from the Kinect to the XR-4, open both the markerlocater Simulink model and the main.m file. Run the model, and when the highest point on the object has been found, click pause; this saves the x, y, z data to the workspace. Now run the main.m file. The XR-4 will first reset to hard home, and then the gripper will move to the object, pick it up, place it back down, and finally return to its base. The command window will then ask whether you would like to continue. To continue, un-pause the Simulink model, move your object, and ensure that the top point of the object is being measured. If it is, pause the simulation again, then type 'y' in the command prompt and hit enter. The XR-4 gripper will move to the new location of the object, pick it up, put it down, and again return to the base. When you are done, type 'n' and hit enter; this will reset the XR-4 to hard home and close the serial connection.