Introduction: DFRobot HuskyLens Object Tracking Mission.


I've had the DFRobot HuskyLens for over 6 months now and thought it was time to get it out of the box and see what it can do.

There are various videos out there on how to use the DFRobot HuskyLens AI Vision, and they cover onboard applications such as Face Recognition, Object Tracking, Object Recognition, Line Tracking, Colour Recognition, Tag Recognition and Object Classification.

There is also an excellent Wiki which covers everything, including upgrading the firmware; incidentally, that is one of the first things you should do before setting out with the HuskyLens, and the process is well documented in the Wiki.

Rather than cover everything that the HuskyLens can do and tie myself in knots with it, I'm going to concentrate on Object Tracking. I think this application warrants an Instructable of its own, and I would like to demonstrate various scenarios in which object tracking could be used.

The HuskyLens platform will be one of the 4 Motor 2 Layer Chassis kits. It has Mecanum wheels and it's a bit battered and bruised, but it should be OK for this application.

I will be using an Arduino Uno in conjunction with an L293D motor driver shield.

Supplies

DFRobot Huskylens AI Vision

L293D motor driver shield

4 Motor Chassis

Arduino Uno

2 x 18650 battery holder with on/off switch

18650 Batteries x 2

Step 1: The Plan.

My thought with this Instructable is to try various scenarios, i.e. different surfaces and terrain, with the DFRobot HuskyLens from an Object Tracking point of view.

I have a Wall-e robot which I can remote control. The first scenario will be using the HuskyLens to learn the rear of the Wall-e from various angles, and then trying various situations where the HuskyLens will track the Wall-e on the move. There are various parameters within the code which can be adjusted for smoother operation.

So we have an object to track and the HuskyLens is on a platform to follow the object; time to set things up.

Step 2: Connections.

Connecting the 4 motor chassis motors to the shield is pretty straightforward: Motor 1, the right front motor, goes to M1 on the shield, Motor 2 to M2 and so on. If a motor spins the wrong way, just reverse its wires.

From the battery holder, fasten the red wire into M+ on the shield and the black wire to GND using the terminal block on the shield.
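With the motors and power connected, a quick test sketch is a handy way to confirm each motor spins the right way before going any further. This is only a minimal sketch of my own, assuming the L293D shield is one of the common Adafruit Motor Shield V1 style boards driven by the AFMotor library; only the M1 = right front mapping comes from the wiring above, the rest is an assumption.

#include <AFMotor.h>

// One AF_DCMotor object per screw terminal; the number matches the M1..M4 label.
AF_DCMotor frontRight(1);   // M1 - right front motor, as wired above
AF_DCMotor motor2(2);       // M2..M4 - remaining motors (assumed mapping)
AF_DCMotor motor3(3);
AF_DCMotor motor4(4);

void setup() {
  frontRight.setSpeed(150);  // speed range is 0-255
  frontRight.run(FORWARD);   // if the wheel spins the wrong way, swap its two wires
  delay(1000);
  frontRight.run(RELEASE);   // let the motor coast to a stop
}

void loop() {
  // nothing here - this sketch only checks wiring and direction
}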

The connections for the HuskyLens are as follows. I am using the I2C protocol, so initially we need to connect 5V and GND to available pins on the shield; I just soldered some header pins onto existing pins to accommodate the female Dupont-style connectors on the HuskyLens cable.

Looking at the front of the HuskyLens where the connector is, the T (Tx) pin is SDA for I2C; this is the green cable and it goes to pin A4 on the shield. The R (Rx) pin is SCL for I2C; this is the blue cable and it goes to pin A5 on the shield. It's as easy as that, the connections are made. Note that the connection diagram on the Wiki has the SDA and SCL descriptions reversed (SDA is labelled Serial Clock and SCL Serial Data), which is just a typo.
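Before moving on, a short sketch can confirm the Uno is actually talking to the HuskyLens over I2C. This is just a connectivity check I would suggest, using the DFRobot HUSKYLENS Arduino library; it assumes the protocol on the HuskyLens itself has been set to I2C, which is covered in the next step.

#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;

void setup() {
  Serial.begin(115200);
  Wire.begin();                      // Uno I2C pins: SDA = A4, SCL = A5
  while (!huskylens.begin(Wire)) {
    Serial.println(F("HuskyLens not found - check wiring and that Protocol Type is I2C"));
    delay(500);
  }
  Serial.println(F("HuskyLens connected over I2C"));
}

void loop() {
  if (huskylens.request() && huskylens.available()) {
    HUSKYLENSResult result = huskylens.read();   // first tracked block on screen
    Serial.print(F("Object centre x: "));
    Serial.println(result.xCenter);              // 0-320, 160 is dead centre
  }
  delay(100);
}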

So with the connections made, we can move on to setting up the HuskyLens for Object Tracking.

Step 3: HuskyLens Object Tracking.

Note: only one object can be set up with Object Tracking; multiple objects are not supported.

Powering up the chassis powers up the HuskyLens. On top of the HuskyLens are two controls: a rocker-style switch, which is the function button, and a second button, which is the learning button.

Toggling the function button scrolls through the various applications such as Face Recognition, Object Tracking and so on. A long press on the function button takes you into another menu where you can set related parameters, one of them being General Settings; a short press on the function button there takes you to where the I2C protocol can be set. Dialling the function button to the left, hit "Save & Return" to save and exit.

The following is from the HuskyLens Wiki page for setting up Object Tracking:

Long press the function button to enter the parameter setting of the object tracking function.

Dial the function button to the right to select "Learn Enable", then short press the function button and dial it to the right to turn "Learn Enable" ON, that is, the square icon on the progress bar is moved to the right. Then short press the function button to confirm this parameter.

Turning on automatic saving of models works the same way; follow the steps above to switch "Auto Save" ON.

You can also adjust the size of the frame by setting "Frame Ratio" and "Frame Size" to match the shape of the object. Dial the function button to the left to select "Save & Return", and short press the function button to save the parameters and return automatically.

Object Learning: point the HuskyLens at the target object, adjusting the distance until the object is enclosed in the yellow frame at the centre of the screen. Then long press the learning button to learn the object from various angles and distances. During the learning process, a yellow frame with the words "Learning: ID1" will be displayed on the screen.

I found some code online in a video by Michael Klements which was useful and served this purpose well. I uploaded it to the Arduino Uno and just tweaked the gains to get favourable results.
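To give an idea of what such a sketch looks like, below is a rough outline of the tracking idea rather than Michael Klements' actual code: steer from the horizontal error of the tracked object and drive from its apparent width. The gain values, the target width and the motor-to-corner mapping (other than M1 = right front) are all assumptions that need tuning on your own chassis.

#include <Wire.h>
#include <AFMotor.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;

AF_DCMotor frontRight(1);            // M1 - right front, as wired earlier
AF_DCMotor rearRight(2);             // M2..M4 mapping assumed
AF_DCMotor rearLeft(3);
AF_DCMotor frontLeft(4);

const int   SCREEN_CENTRE_X = 160;   // HuskyLens screen is 320 x 240 pixels
const int   TARGET_WIDTH    = 60;    // apparent width that sets the following distance (tune)
const float KP_TURN         = 0.8;   // steering gain (tune)
const float KP_DRIVE        = 2.0;   // distance gain (tune)

// Drive the two motors on one side of the chassis at a signed speed.
void driveSide(AF_DCMotor &a, AF_DCMotor &b, int speed) {
  uint8_t dir = (speed >= 0) ? FORWARD : BACKWARD;
  int mag = constrain(abs(speed), 0, 255);
  a.setSpeed(mag); b.setSpeed(mag);
  a.run(dir);      b.run(dir);
}

void setup() {
  Wire.begin();
  while (!huskylens.begin(Wire)) delay(200);    // wait for the I2C link
}

void loop() {
  if (!huskylens.request() || !huskylens.available()) {
    driveSide(frontLeft, rearLeft, 0);          // object lost: stop and wait
    driveSide(frontRight, rearRight, 0);
    return;
  }
  HUSKYLENSResult r = huskylens.read();

  int turnError  = r.xCenter - SCREEN_CENTRE_X; // positive = object is to the right
  int driveError = TARGET_WIDTH - r.width;      // positive = object is too far away

  int drive = KP_DRIVE * driveError;
  int turn  = KP_TURN  * turnError;

  driveSide(frontLeft,  rearLeft,  drive + turn);  // left side faster to turn right
  driveSide(frontRight, rearRight, drive - turn);
}

In a sketch like this, raising the turn gain makes the chassis react more sharply when the object wanders off-centre, and raising the target width makes it follow more closely; that is the sort of gain tweaking referred to above.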

With all this information it's time to set the Wall-e up to be tracked and try various scenarios with the Object Tracking algorithm.

Step 4: The Scenarios.

The first scenario was using the HuskyLens to learn the rear view of the Wall-e from various angles, then getting the HuskyLens chassis to follow the Wall-e as it set off. I tried this in the lounge and was pleasantly surprised when it worked: I set the Wall-e off with forward movement and the HuskyLens hesitated for a while, then set off after it. I don't think the Mecanum wheels are right for this platform, as the chassis waddled along.

I swapped the Mecanum wheels for some standard wheels and ventured outside with the Wall-e and RC transmitter, plus the HuskyLens chassis and my phone on a tripod to record the experience. I tried it on the decking first and this worked OK; it was a lot smoother with normal wheels. Then I tried it on the lawn, which went OK until the Wall-e took a tumble :))

The next option was my concrete drive, which curves as it goes down to the garage. This should be a good test for the HuskyLens as it's not entirely smooth, but at least it's a level playing field, so to speak.

I set the Wall-e approximately 500 mm in front of the chassis and set the HuskyLens up to learn the back of the Wall-e, long pressing the learn button while moving the Wall-e into various positions with the joystick. My phone was set to record and I had a camera further down the garden. I set the Wall-e off going slowly down the drive, and the HuskyLens and chassis followed close behind, all the way to the garage including the right-hand bend in the concrete. The HuskyLens lost contact with the Wall-e once, probably because of the undulating surface; I just moved the Wall-e back a bit and it recognised it again and continued. I thought that was pretty impressive, and what a difference the normal wheels on the chassis made as well.

Step 5: Assumptions

So my first experience with DFRobot HuskyLens object tracking was pretty cool; seeing the HuskyLens and chassis following the Wall-e was really good. I think the set-up part is really easy if you follow the correct procedure, and after a few attempts and getting the correct wheels on the chassis I had it pretty much dialled in.

With 7 different applications to go at with the HuskyLens there is a lot of scope for various projects in the future and there are some examples in the Arduino library to get started with.

I hope you enjoyed this Instructable and thanks for watching.