
Overview

This is a description of the project I did when invited to the Intel Labs at Swindon. The original plan was to produce a motorized platform to carry a camera; image processing software would then watch for movement and rotate the platform to keep the subject in the centre of the camera view. This was perhaps asking too much in the short time we had available in the Lab, but this article shows what we managed to get working and our thoughts for future development.

I was using the Intel Edison Module and Arduino Breakout Board. In addition, I added the following items:

- 2 x stepper motors, with their driver modules
- 1 x joystick, used for setting the centre points
- 1 x push button
- 1 x LED, to show the system had booted correctly

As you might see from the photographs, the motors are mounted in a hastily built bracket that holds them at 90 degrees to each other: one for horizontal rotation, one to tilt the platform. This was definitely not a production model!

Set up environment

The Edison was set in the Arduino Breakout board which made it extremely easy to get things working. Grove make a number of sensors that just plug and play, so it's easy to try a number of different solutions quite quickly.

On the software side, I used the Intel IoT XDK, which lets a programmer write Arduino sketches, JavaScript apps using node.js, or C programs. A quick test of a blinking LED using an Arduino sketch showed that the system was working; then I moved on to JavaScript and node.js for the main project. The XDK provides a number of templates for different sensor projects, which makes it easy to start a solution.

The XDK gives you a modern editing interface, with syntax colouring and pop-ups showing code-completion options. It connects to the Edison board and lets the user compile, transfer the application to the board, and then start and stop it. It will even allow more than one app to run at the same time: I noticed this after running my Arduino sketch to flash an LED, running a node.js app to do the same, and then stopping the node.js app; the Arduino sketch was still running.

System layout and configuration

The two stepper motors are quite easy to drive, but each needs a five-wire connection to the main breakout board. They need both standard pins and pins marked with a ~ symbol, showing they can be used for PWM (pulse-width modulated) output. The LED, push button and joystick can just be plugged into the sockets on the expansion board; we only needed to check that we weren't using the same pin twice.

Step 1: Code for the Platform

node.js on the Edison

I used a simple node.js application for the program. Not very sophisticated; I wrote it all in-line, but it did all that we wanted it to. The code initially turned on a blue LED to show the system was working, then set up a web server which served up the basic html page. This page opened a web socket connection to the server and was able to send some commands to the system.

The web page was just for testing, so it wasn't very sophisticated. It had a button for each of the three commands, and logged the replies, which were just the same commands bounced back. The Edison needs to be connected to the available Wi-Fi network; this was done using the ssh connection in the XDK. It's quite easy and only needs to be done once, as the connection is remembered. If the system is powered down and restarted, the application starts up again automatically. Once started, several people could connect with their smartphone browsers and send a message to rotate the turntable. A simple but pleasing demo.
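The test page's script itself isn't reproduced in this article, but its message handling can be sketched with a couple of plain helpers. These names (makeCommand, parseCommand) are my own illustrations, not from the original code; the server parses the same "command space number" format by splitting on a space:

```javascript
// Hypothetical helpers mirroring the command format the test page sends:
// "<command> <delta>", e.g. "a 512" or "b -256"; 'c' takes no parameter.
function makeCommand(name, delta) {
    return delta === undefined ? name : name + ' ' + delta;
}

function parseCommand(msg) {
    var args = msg.split(' ');
    return { name: args[0], delta: parseInt(args[1], 10) };
}

// On the page, each button would send one of these over the socket,
// e.g. socket.send(makeCommand('a', 512));
console.log(parseCommand(makeCommand('b', -256))); // { name: 'b', delta: -256 }
```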

It's easy to open a web server using the 'http' module; no routing facility was needed, as we were serving just a single page. That page was preloaded into a string, 'mainPage', and returned when the browser requests the URL 'localhost:1337'. If other libraries are needed, they are added to the configuration file and the XDK will download them to the Edison with an 'npm install' before running the program.

// Start web service
var http = require('http');

var app = http.createServer(function (req, res) {
    'use strict';
    res.writeHead(200, {'Content-Type': 'text/html'});
    res.end(mainPage);
}).listen(1337);

console.log("Web listening on port 1337");

The web socket handler picks out the three different messages. The commands 'a' and 'b' turn the motors, and 'c' puts the system into 'centre adjustment' mode. I used the 'socket.io' library to open a web socket listener.

//Create socket.io server
var io = require('socket.io')(app);
console.log("Simple socket.io messaging....");

//Attach a 'connection' event handler to the server
io.on('connection', function (socket) {
    'use strict';
    console.log('a user connected');

    //Emits an event along with a message
    socket.emit('connected', 'Welcome');

    //Attach a 'message' event handler to the socket
    //a and b commands will come in with a number parameter for the rotation
    socket.on('message', function (msg) {
        console.log('message--> ' + msg);
        var args = msg.split(" ");
        var delta = parseInt(args[1]); // NaN for the parameterless 'c' command
        console.log("Value: " + delta);
        if (delta !== 0) {
            switch (args[0]) {
            case "a":
                rotate(motorA, delta);
                break;
            case "b":
                rotate(motorB, delta);
                break;
            case "c":
                centre();
                break;
            default:
                // unknown command, do nothing
            }
        }
        socket.emit('message', msg);
    });
});

The motors are set up with the following code, which sets the pins each motor is connected to and its speed of movement. The numbers are the pins shown on the expansion board.

var Uln200xa_lib = require('jsupm_uln200xa');

// Instantiate two stepper motors on ULN200XA Darlington motor drivers.
// This was tested with the Grove Geared Step Motor with Driver.
var motorA = new Uln200xa_lib.ULN200XA(4096, 8, 9, 10, 11);
var motorB = new Uln200xa_lib.ULN200XA(4096, 4, 3, 5, 6);
motorA.angle = 0;
motorB.angle = 0;

var setSpeed = function (motor, speed) {
    motor.setSpeed(speed); // e.g. 5 RPM
};

When the commands 'a' and 'b' arrive, the parameter sets how much of an angle to turn; a negative delta reverses the direction. This function is called from the web socket code when a message arrives. The stepper motor takes a value of 4096 steps to turn one complete rotation; it would probably be more logical to turn this into an angle in degrees.

var rotate = function (motor, delta) {
    if (delta > 0) {
        motor.setDirection(Uln200xa_lib.ULN200XA.DIR_CW);
    } else {
        motor.setDirection(Uln200xa_lib.ULN200XA.DIR_CCW);
    }
    motor.stepperSteps(Math.abs(delta));
    motor.angle += delta;
};
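As a sketch of that degrees idea (this helper is mine, not part of the hack-day code): with 4096 steps per revolution, the conversion is just a scale factor.

```javascript
// Illustrative degrees-to-steps conversion; 4096 steps is one full
// revolution of the Grove geared stepper.
var STEPS_PER_REV = 4096;

function degreesToSteps(degrees) {
    return Math.round(degrees * STEPS_PER_REV / 360);
}

// A degrees-based rotate would then be, e.g.:
//   rotate(motorA, degreesToSteps(90));
console.log(degreesToSteps(90));  // 1024
console.log(degreesToSteps(-45)); // -512
```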

When the 'c' command arrives, the system goes into centre adjustment mode, where I could use the joystick to move the two motors to a set start position; the X and Y movements correspond to the two motors. A simple push button returns the system to normal mode.
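The joystick handling isn't in the listing above; a minimal sketch might look like this, assuming the joystick axes are read through the mraa library as 10-bit analog values resting near 512. The pin numbers, dead-zone threshold and step size here are all illustrative, not from the original code:

```javascript
// Dead-zone mapping from a raw joystick axis reading to a small step delta.
// Assumptions: 10-bit readings (0-1023), rest point near 512.
var CENTRE_READING = 512;
var DEAD_ZONE = 100;
var STEP_NUDGE = 8;

function axisToDelta(reading) {
    var offset = reading - CENTRE_READING;
    if (Math.abs(offset) < DEAD_ZONE) { return 0; }
    return offset > 0 ? STEP_NUDGE : -STEP_NUDGE;
}

// In centre-adjustment mode the polling loop might be (needs mraa on the
// Edison, so shown commented out):
//   var mraa = require('mraa');
//   var xAxis = new mraa.Aio(0), yAxis = new mraa.Aio(1);
//   setInterval(function () {
//       rotate(motorA, axisToDelta(xAxis.read()));
//       rotate(motorB, axisToDelta(yAxis.read()));
//   }, 50);
console.log(axisToDelta(900)); // 8
```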

The hack day code is on my GitHub pages. It may not be in a particularly tidy state, but may be useful to view the whole thing. https://github.com/happyt/edison-platform

Step 2: Future Thoughts

Camera input

I had some problems getting the RealSense camera set up on my old laptop, but I was at least able to see its cameras as webcams. I used the depth camera to pick up movement, as it showed only the objects close to it, such as my hands and face. By calculating the rectangles around the objects in view, I could find the largest and simply track the position of that one. I wrote a small application in C# to do this, using the excellent AForge library for the image processing. It has a number of processing modes and can be hacked quite easily.

I had this application send the same web socket commands as the test web page. In this case, the delta angle was calculated from the movement of the rectangle across the angle of view, and used to rotate the turntable to keep the rectangle centred. I ran out of time at this point and my platform was not the most stable, but we had correct movement, so we were optimistic about a future version.
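A rough sketch of that calculation (the field of view, image width and step count here are assumptions of mine, not measured values): the rectangle's horizontal offset from the image centre maps to an angle, which maps to a step delta for the 'a' command.

```javascript
// Illustrative pixel-offset to step-delta conversion.
var STEPS_PER_REV = 4096;  // steps per full turn of the stepper
var H_FOV_DEGREES = 60;    // assumed horizontal field of view of the camera
var IMAGE_WIDTH = 640;     // assumed image width in pixels

function deltaSteps(rectCentreX) {
    var offsetPx = rectCentreX - IMAGE_WIDTH / 2;       // +ve: right of centre
    var offsetDeg = offsetPx * H_FOV_DEGREES / IMAGE_WIDTH;
    return Math.round(offsetDeg * STEPS_PER_REV / 360); // sent as 'a <delta>'
}

console.log(deltaSteps(480)); // rectangle right of centre -> positive delta
```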

Future thoughts

If I were to do this again, I would probably use servos, as they give more accurate control of the movement. The stepper motors have the benefit of being able to turn all the way around, whereas servos are limited to an arc of less than 360 degrees. The problem with stepper motors is that they drift a little, so you cannot be sure of returning to the same position after several movements; some other event would be needed to check the centre point regularly or at a known position. You would also need to keep count of how many revolutions have been made, as the cables will get wound around the axis.
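That revolution check could be a simple clamp on the cumulative step count. A sketch, where the one-revolution limit is an arbitrary choice of mine:

```javascript
// Clamp a requested move so the cumulative position never winds more than
// one revolution either way from the centre (the limit is illustrative).
var STEPS_PER_REV = 4096;
var WIND_LIMIT = STEPS_PER_REV;

function clampDelta(currentSteps, requestedDelta) {
    var target = currentSteps + requestedDelta;
    if (target > WIND_LIMIT) { return WIND_LIMIT - currentSteps; }
    if (target < -WIND_LIMIT) { return -WIND_LIMIT - currentSteps; }
    return requestedDelta;
}

console.log(clampDelta(4000, 500)); // 96 - stops at the +1 revolution limit
```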

I was using the RealSense camera as input, which has a great SDK and, with the correct drivers installed, easily gives face-tracking information that could be used to drive the rotations; the image processing is all done for you. But for projects that need to track objects further away, I would wait for the newer 'world-facing' camera that should be available soon. It should have a similar SDK and sounds very promising.

