Introduction: Mobile Robot Arm DoArm S7 by WiFi Arduino Uno R3

About: Specializing in open-source hardware. Skype: yichone

This manual covers the use of the DoArm based on the ESPduino (compatible with Arduino Uno R3) development kit from Doit company. Users can further develop and/or improve its functions on the basis of the source code provided by Doit company. For more details, please visit Doit and this link.

Step 1: Robot Arm

Today, industrial 6-DoF robot arms are widely used in many engineering applications, such as automated joining and assembly. Here, the ABB model "IRB4400" is taken as a reference for the DoArm model. This robot arm is developed on the basis of the ESPduino (compatible with Arduino UNO R3, with WiFi) and the 2-way motor & 16-way servo driver board from Doit company, and it can be controlled over WiFi to grasp an object. DoArm has the following features:

(1) It is a miniature model of the ABB IRB4400, as shown in Figure 1;

(2) It is controlled over WiFi, based on the ESP8266 SoC;

(3) The control program can be debugged through the built-in Telnet interface and updated by OTA;

(4) It is driven by a built-in robotic inverse-kinematics algorithm, which can move the end of the robot arm in the Cartesian and Tool coordinate systems.

Step 2: Product Description

DoArm has the following functions.

(1) Calibration and initialization

After assembly, we must first calibrate each servo's position so that the initial position is known when the servos start to work. At the same time, a safe range of motion must be defined to protect the metal structure and the servos.

(2) Single-axis control

The ESPduino controls the 16-way servo driver board over I2C. Each channel on the driver board can be mapped to one of the six axis servos and to the gripper servo, respectively, as sketched below.
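As an illustration, here is a minimal sketch of driving one servo channel over I2C, assuming the 16-way servo board is PCA9685-based and using the Adafruit_PWMServoDriver library; the channel number and pulse limits are placeholders, not values taken from the DoArm code.

#include <Wire.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm = Adafruit_PWMServoDriver(0x40); // default I2C address

const int SERVO_MIN = 150; // pulse count near 0 degrees (board/servo dependent)
const int SERVO_MAX = 600; // pulse count near 180 degrees (board/servo dependent)

// move one servo channel to an angle between 0 and 180 degrees
void setServoAngle(uint8_t channel, float angle) {
  int pulse = map((int)angle, 0, 180, SERVO_MIN, SERVO_MAX);
  pwm.setPWM(channel, 0, pulse);
}

void setup() {
  pwm.begin();
  pwm.setPWMFreq(50);   // standard 50 Hz servo refresh rate
  setServoAngle(0, 90); // center the servo on channel 0
}

void loop() {}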

(3) Multi-axis control

Using the D-H link matrices, the forward kinematics equation can be set up, and the inverse kinematics can then be solved to control the movement of the end effector of the robot arm.

(4) Human-computer interaction by Telnet

A Telnet server is set up on the ESPduino to provide human-computer interaction, through which the robot arm can be controlled and its state displayed.

The DoArm robot arm is shown in Figure 2. The arm is controlled by the ESPduino and the driver board over the WiFi link between the ESPduino and a router. In this way, the robot arm can be controlled from a PC or from an app on a mobile phone.

To realize the above functions, we need the following materials: an ESP8266 SoC ESPduino development board, a Micro USB cable, a 2-way motor & 16-way servo driver board, the robot arm, and the gripper (paw).

Step 3: Installation for Mobile Robot Arm

On our official site http://bbs.smartarduino.com/, we have posted the installation instructions for the tank chassis (http://bbs.smartarduino.com/forumdisplay.php?fid=22) and for the DoArm (http://bbs.smartarduino.com/forumdisplay.php?fid=21). Note that when you install the robot arm, the metal structure and the servos should be installed at the same time; otherwise the robot arm cannot be assembled properly. Before installing each servo, note its initial position. For the convenience of later programming, each servo should be set at its middle position, i.e., 90 degrees is taken as the default middle position. From this state, we start to assemble the robot arm. The completed arm is shown in Figure 3.

http://www.smartarduino.com/doarm-s6-6-dof-robot-a...

After installing the metal robot arm and the tank chassis, the next step is to install the ESPduino development board kit. Then connect the servo extension cables to the driver board. If a servo extension cable is not long enough, it can be extended with a DuPont cable and then inserted into the board. The channels can be assigned arbitrarily, because the channel mapping is decided later in the program. The completed mobile robot arm is shown in Figure 4.

Step 4: Model for Robot Arm

The DoArm uses 6 servos as its driving components. It is very easy to drive a single servo to a required angle. However, to make the 6 servos move jointly so that the end effector reaches a preset position, we must set up a link coordinate system. By computing the D-H matrices, we obtain the forward kinematics equation. If the position of the end effector is already known, the rotation angle of each of the 6 axes can be computed by solving the inverse kinematics. This is not a simple problem; if you are interested, please refer to the related textbooks.

First, we define the initial position as shown in Figure 5. With the link coordinate system, the D-H parameters are listed in Table 1.
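As a rough sketch of what a D-H based forward kinematics step looks like (illustrative only, not taken from the DoArm source; the real parameter values would come from Table 1):

#include <math.h>
#include <string.h>

// build the standard 4x4 D-H link transform from one parameter set
// theta: joint angle, d: link offset, a: link length, alpha: link twist
void dhTransform(float T[4][4], float theta, float d, float a, float alpha) {
  float ct = cos(theta), st = sin(theta);
  float ca = cos(alpha), sa = sin(alpha);
  float M[4][4] = {
    { ct, -st * ca,  st * sa, a * ct },
    { st,  ct * ca, -ct * sa, a * st },
    {  0,       sa,       ca,      d },
    {  0,        0,        0,      1 }
  };
  memcpy(T, M, sizeof(M));
}

// multiplying the six link transforms T1*T2*...*T6 in order gives the pose of
// the end effector; inverse kinematics solves the reverse problem.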

Step 5: Control for Robot Arm

4.1 Human-computer interaction

Since the ESPduino already has the Telnet function built in, we can use Telnet for human-computer interaction: checking the running state, the current position, the current velocity, and so on. At the same time, we can set the movement mode for a single axis or for multiple axes, and set the target position.

When the ESPduino connects to a wireless router, a PC can connect to the IP address of the ESPduino with a Telnet tool (such as SecureCRT). The connection port is 23, as shown in Figure 6.
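For reference, a minimal Telnet-style server on the ESP8266 might look like the sketch below; the SSID and password are placeholders, and the real DoArm server is implemented in netTask.ino and may differ.

#include <ESP8266WiFi.h>

WiFiServer telnetServer(23); // Telnet port 23
WiFiClient telnetClient;

void setup() {
  Serial.begin(115200);
  WiFi.begin("your-ssid", "your-password"); // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(500);
  Serial.println(WiFi.localIP()); // connect SecureCRT to this IP, port 23
  telnetServer.begin();
}

void loop() {
  // accept one client at a time
  if (telnetServer.hasClient()) {
    telnetClient = telnetServer.available();
    telnetClient.println("DoArm telnet ready, type 'h' for help");
  }
  // read one command line from the client and echo it back
  if (telnetClient && telnetClient.connected() && telnetClient.available()) {
    String line = telnetClient.readStringUntil('\n');
    telnetClient.print("received: ");
    telnetClient.println(line);
  }
}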

Through Telnet, the robot arm can be controlled with the following commands.

“h”: print the help information;

"z": return all axes of the robot arm to the zero position (i.e., the default initial position defined above), which was defined when we set up the D-H matrix. After power-up, the target position of each axis is therefore 0, as shown in Figure 7.

"p": print the current state of the robot arm, as shown in the following picture. The information includes the state, the set velocity, the current angle of each axis, the target angle, and the current position matrix of the end effector.

"d": check or set the current velocity. The velocity grade ranges from 1 to 10, where "1" is the slowest and "10" is the fastest.

"c": starting from the current position of the end effector, move it along the x-, y-, and z-axes of the Cartesian coordinate system, respectively. The unit is mm.

"t": starting from the current position of the end effector, move it along the x-, y-, and z-axes of the Tool coordinate system, respectively. The unit is mm.

"s": single-axis movement. At the beginning, each axis is at its zero position and its angle is defined as 0; the range is -90 to 90 degrees. The positive and negative directions follow the right-hand rule.

"w": move to a preset position.

"g": control the gripper (paw). For example, "g:45" opens the paw to a 45-degree angle by controlling the gripper servo.
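As an illustration of how the "s:x,y" and "g:x" strings above might be parsed (the real parsing is done in parseData.ino and may differ):

// parse one command line and report what it asks for
void handleCommand(const String &line, Stream &out) {
  if (line.length() == 0) return;
  char cmd = line[0];
  if (cmd == 's') { // "s:axis,angle" single-axis move
    int colon = line.indexOf(':');
    int comma = line.indexOf(',', colon + 1);
    int axis = line.substring(colon + 1, comma).toInt();
    float angle = line.substring(comma + 1).toFloat();
    out.print("single-axis move: axis ");
    out.print(axis);
    out.print(" to ");
    out.print(angle);
    out.println(" deg");
  } else if (cmd == 'g') { // "g:angle" gripper
    float angle = line.substring(line.indexOf(':') + 1).toFloat();
    out.print("open gripper to ");
    out.print(angle);
    out.println(" deg");
  } else if (cmd == 'h') {
    out.println("supported commands: h z p d c t s w g");
  } else {
    out.println("unknown command, type 'h'");
  }
}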

Similar to the smart 3-color LED project, the DoArm source code provides web configuration and OTA update functions, as shown in Figure 7.
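A minimal OTA setup on the ESP8266, assuming the standard ArduinoOTA library (the DoArm code implements its OTA server in httpserver.ino, which may use a different mechanism):

#include <ESP8266WiFi.h>
#include <ArduinoOTA.h>

void setup() {
  WiFi.begin("your-ssid", "your-password"); // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(500);
  ArduinoOTA.setHostname("doarm"); // hypothetical hostname
  ArduinoOTA.begin();              // listen for OTA uploads from the Arduino IDE
}

void loop() {
  ArduinoOTA.handle(); // must be called regularly to accept updates
}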

4.2 Calibration and initialization

If the servo extension cables are connected to the driver board arbitrarily, we must first map each axis of the robot arm to its corresponding channel on the driver board. Then, determine the offset for the zero position; finally, determine the permitted range of motion.

(1) Determine the channel number on the driver board for each axis of the robot arm

When connecting the servo extension cables to the driver board, it is difficult to tell which servo corresponds to which channel number on the board. Therefore, we simply insert the servo extension cables into the board arbitrarily, as shown in Figure 8.

Figure 8 Servo extension cables on the driver board

In the source code, in the function initParseData(), set bias[6] to {0, 0, 0, 0, 0, 0}, and set the range of motion for each axis to -90°~90°:

//bias (offset) for each axis

float bias[6] = {0, 0, 0, 0, 0, 0};

arm.setServoBias(bias, 6);

//range of motion for each axis

arm.setServoRange(-90, 90, -90, 90, -90, 90, -90, 90, -90, 90, -90, 90);

Pass the actually connected channels to arm.setChannel(). As shown in Figure 8, the connected channels are 1, 3, 4, 9, 11, and 13, so we call

arm.setChannel(1, 3, 4, 9, 11, 13);

Compile the code, download it to the ESPduino, open the human-computer interface through Telnet, and use the "s:x,y" command to determine the correspondence for each axis. For example:

Sending "s:1,10" makes axis 5 move, so channel "1" is assigned to the 5th axis;

Sending "s:2,10" makes axis 6 move, so channel "3" is assigned to the 6th axis;

Sending "s:3,10" makes axis 4 move, so channel "4" is assigned to the 4th axis;

Sending "s:4,10" makes axis 2 move, so channel "9" is assigned to the 2nd axis;

Sending "s:5,10" makes axis 1 move, so channel "11" is assigned to the 1st axis;

Sending "s:6,10" makes axis 3 move, so channel "13" is assigned to the 3rd axis.

This gives the final correct correspondence between the axes of the robot arm and the channels on the driver board, which is set by the following call.

arm.setChannel(11, 9, 13, 4, 1, 3);

Since there is only one gripper, plug its cable into any idle channel and set it with arm.setGripperChannel().

//set the channel number for the gripper

arm.setGripperChannel(14);

(2) Determine the offset (for the zero position of each axis)

After the correspondence between axes and channels is determined, the source code needs to be re-compiled and downloaded to the ESPduino again. Using the Telnet commands, we can then determine the offset (bias) for each axis.

First, set the velocity to the lowest grade; the command is d:1,1

Figure 9 Setting the lowest velocity grade

Then, use "s:x,y" to move each single axis to its zero position. In general, the zero position (middle position) is 0. Use "s:x,y" to check whether an axis is at its zero position; if not, use the "s" command to bring it there. For example, "s:1,65" means this axis has to move to 65 degrees to reach its zero position, so its offset is 65. In this case, the offsets are set as follows.

//offset for the middle (zero) position

float bias[6] = {65, 5, -8, 5, 0, 0};

arm.setServoBias(bias, 6);

After recompiling and downloading, the robot arm will return to the zero position automatically, as shown in Figure 10.

Figure 10 DoArm at the initial status

(3) Determine the range of motion for each axis

After the zero position is determined, we can determine the range of motion for each axis by using "s:x,y" in the Telnet human-computer interface. Because of mechanical restrictions, the angle range of each servo is set to -90°~90° by default. In this case, the ranges are set as follows. Note that if a command exceeds the permitted range, the servo will be stopped.

//range of motion for each axis in this case

arm.setServoRange(-90, 30, -30, 60, -40, 35, -90, 90, -50, 90, -80, 80);
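As a sketch of the range check described above (illustrative only; the actual check is inside DoitArm.cpp):

// reject any target angle outside the configured range for an axis
bool withinRange(float angle, float minDeg, float maxDeg) {
  return angle >= minDeg && angle <= maxDeg;
}

// example: with axis 1 limited to -90..30 degrees, a request such as
// "s:1,45" would be refused and the servo left where it is.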

(4) Control of the gripper

The gripper is controlled separately from the other 6 axes, so we can use the "g:x" command to control it, as shown in Figure 11.

(a) Position of the gripper at 0 degrees

(b) Position of the gripper at 20 degrees

(c) Position of the gripper at 45 degrees

Figure 11 Control of the gripper

4.3 Single-axis control

In the Telnet interface, we can use the "s:x,y" command to control a single axis, where x denotes the axis to be moved (range 1~6) and y denotes the absolute angle (a float value within the range permitted for that axis). After each action, the current state of the robot arm is displayed automatically.

Figure 12 The output of a single-axis movement

4.4 Joint control

Joint control means moving the 6 axes at the same time. There are 3 modes: 1. continuous movement through preset positions; 2. movement of the end effector in Cartesian coordinates; 3. movement of the end effector in Tool coordinates.

(1) Continuous movement through preset positions

We provide the methods arm.setWaitPosition() and arm.gotoWaitPosition(), which set a preset position and move to that position, respectively. Our source code provides 9 preset joint movements. DoArm can be moved with "w:x", where "x" is 0~8. In particular, the command "w:9" runs the 9 movements continuously. A video can be seen at:

http://v.youku.com/v_show/id_XMTQ5ODk0NzEyMA==.html?from=y1.7-1.2
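One possible way to store such preset positions is sketched below; the angle values are placeholders, and the real presets (and the exact signatures of setWaitPosition()/gotoWaitPosition()) are defined in the DoArm source.

const int NUM_PRESETS = 9;

// one row of six joint angles (degrees) per preset, selected by "w:0".."w:8"
const float presets[NUM_PRESETS][6] = {
  {  0,   0,   0,   0,   0,   0 }, // w:0 - zero position
  { 30, -20,  15,   0,  45,   0 }, // w:1 - placeholder angles
  { 45,  10, -30,  20,   0,  60 }, // w:2 - placeholder angles
  {  0,   0,   0,   0,   0,   0 }, // w:3..w:8 would follow the same pattern
};

// "w:9" would step through all presets in order, waiting for each move to finish.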

(2) Movement of the end effector in Cartesian coordinates

The inverse kinematics function calcReverseKinematics() computes the joint angles for a designed position of the end effector. The Telnet interface provides a command for moving along the x-, y-, and z-axes of the Cartesian coordinate system: "c:x,y,z", where x, y, z denote the increments along the x-, y-, and z-axes in mm (float type).

For example, to move the end effector 20 mm along the z-axis, the command is "c:0,0,20", and "c:0,0,-20" brings it back. The robot arm first moves 20 mm vertically and then 20 mm in the reverse direction, so the arm finally returns to its original position. Starting from the zero position, after receiving the command "c:0,0,20" the joint angles are (0.000, 2.358, -11.361, 0.000, 9.003, 0.000); after running the command "c:0,0,-20", they are (0.000, -0.000, 0.000, 0.000, -0.000, 0.000), as shown in Figure 13.

Figure 13 Movement in Cartesian coordinates

(3) Movement of the end effector in Tool coordinates

This is similar to movement in Cartesian coordinates; the difference is that the end effector moves in the Tool coordinate system (i.e., the frame O6). The command is "t:x,y,z", where x, y, z are the increments along the x-, y-, and z-axes in mm (float type).

For example, from the initial status, let the end effector move 20 mm along the x-axis. The command is "t:20,0,0", and "t:-20,0,0" returns it to the original position. The end effector first moves 20 mm along the x-axis and then returns 20 mm in the reverse direction. We can check the results in the Telnet interface. Starting from the zero position, after receiving the command "t:20,0,0" the joint angles are (0.000, 2.358, -11.361, 0.000, 9.003, 0.000); after running the command "t:-20,0,0", they are (0.000, -0.000, 0.000, 0.000, -0.000, 0.000).

Figure 14 Movement in the Tool coordinates

Step 6: Source Code Analysis

The code for this project has already been uploaded to GitHub; please download it from the following link: https://github.com/SmartArduino/DoitArm

The source code consists of the following files: "DoitArm_demo.ino", "project.h", "webPage.h", "httpserver.ino", "netTask.ino", "parseData.ino", "DoitArm.cpp", and "DoitArm.h". "DoitArm_demo.ino" is the main file, used for initialization, configuration, and the main loop.

"project.h" is the header file, which defines the SSID, password, version, and so on for the soft AP.

"webPage.h" holds the web server page in the form of a character string.

"httpserver.ino" sets up the HTTP and OTA servers for the web-server and OTA functions.

"netTask.ino" sets up the Telnet server, which is used to monitor the socket connection data.

"parseData.ino" initializes the robot arm, handles the Telnet data, and controls the robot arm.

"DoitArm.cpp" is the driver for the robot arm, including the drive functions, initialization functions, and many other robot-arm related functions.

"DoitArm.h" is the header file for the robot-arm driver.
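Putting the pieces together, the top level of the main sketch roughly follows the pattern below (an outline only; the real entry points are in DoitArm_demo.ino):

void setup() {
  // read the SSID/password defined in project.h and join the network
  // start the HTTP and OTA servers (httpserver.ino)
  // start the Telnet server (netTask.ino)
  // initialize the arm: channels, bias, and range (parseData.ino / DoitArm.cpp)
}

void loop() {
  // service HTTP and OTA requests
  // read Telnet data and parse incoming commands (parseData.ino)
  // update the servo targets through the DoitArm driver (DoitArm.cpp)
}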

The overall process is shown in Figure 15.

Video link: http://v.youku.com/v_show/id_XMTQ5ODk0NzEyMA==.html?from=s1.8-1-1.2#paction

Step 7: Support and Service

Our professional sites:

http://bbs.smartarduino.com/

www.doit.am

Skype: yichone

Email: yichoneyi@163.com

WhatsAPP: +8618676662425