Introduction: Roomba Scout Explorer

Among the most highly anticipated and heavily researched American space efforts, the Mars rover missions stand as landmark accomplishments in the ever-advancing development of high-tech autonomous systems built to investigate and interpret the surface of the red planet. As a more personal project in homage to the Mars missions, our objective was to create a Roomba robot that could act autonomously over a set time frame and react appropriately to conditions within its vicinity.

To make the project our own, we focused on generating a diagram that traces each path the robot takes from its origin. In addition, the robot counts the number of objects within its vicinity by taking images in a panoramic style.

Step 1: Equipment

-Roomba with attachable camera (know its specific name/number)

-Connected Server

-Windows 10 / Mac with Internet connectivity

-Bright platform

-Dark floor

-Any stray objects of monochromatic design

Step 2: MATLAB Setup

In order to create tasks and functions for your Roomba, you must first have the specific code files and toolkits containing the Roomba commands.

With MATLAB 2016a or later installed, create a folder to contain the robot files, place the MATLAB file below into that folder, and run it to install the remaining necessary Roomba files.

After that, right-click in the Current Folder window, hover over "Add to Path," and select the option that adds the current folder. The path is now set up so that each of these files can be used to operate the Roomba.

Now, use the command below in the command window to set up the roomba:

r = roomba(#)

Here # stands for the number of the specified Roomba; if you merely want a simulator of the Roomba, type the following command instead:

r = roomba(0)

The simulator is recommended for testing movement patterns.

If you are curious as to what commands the roomba can follow, type the following into the command window:

doc roomba

For more details, visit the following website:

https://ef.engr.utk.edu/ef230-2017-08/projects/roomba-s/setup-roomba-instructable.php
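Once the path is set, a quick sanity check like the one below can confirm that everything is wired up. This is only a sketch: the method names (setDriveVelocity, getBumpers) and their argument forms are assumptions about the toolbox, so confirm them with doc roomba before relying on them.

r = roomba(0);                 % connect to the simulator; use your robot's number for real hardware
r.setDriveVelocity(0.1, 0.1);  % drive both wheels forward at 0.1 m/s
pause(2);                      % let it move for two seconds
bumpers = r.getBumpers()       % display the bumper readings (a struct of sensor fields)
r.setDriveVelocity(0, 0);      % stop the wheels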

Step 3: Function: Movement

With regard to movement, the Roomba should drive automatically for a set period of time given in the inputs. The goal of the robot's movement is to react properly when its sensors (bumpers, light bumpers, and cliff sensors) change in the presence of various obstacles. This part acts as the base for all of the Roomba's commands, with more features added to the code later. A few specifications were needed, as listed here and sketched in the example after the note below:

-To reduce damage, the robot must slow to a lower velocity when an obstacle is detected nearby.

-When approaching a cliff or wall, the robot will move in reverse and alter its angle depending on the point of impact.

-After some time driving, the Roomba will eventually stop and take images of the surrounding area.

Note that the values used were tuned for the simulator; values such as the turn angles, turn velocities, and sensor thresholds should be modified when using the actual robot to ensure stability and to account for equipment error.
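For reference, a minimal sketch of this kind of drive loop is shown below. It is not our exact script: the method names (setDriveVelocity, getBumpers, getCliffSensors, getLightBumpers, turnAngle), the sensor field names, and every threshold are assumptions that should be checked against doc roomba and tuned for your own robot.

% Reactive drive loop sketch: cruise, slow near obstacles, back away from bumps and cliffs
r = roomba(0);                          % simulator; use the robot's number for hardware
runTime = 60;                           % total run time in seconds (example value)
tic
while toc < runTime
    bump  = r.getBumpers();             % bumper readings (struct)
    cliff = r.getCliffSensors();        % cliff sensor signals (struct)
    light = r.getLightBumpers();        % light bumper signals (struct)
    if cliff.left < 100 || cliff.right < 100        % low signal: edge or dark floor (placeholder threshold)
        r.setDriveVelocity(-0.1, -0.1);             % back away from the cliff
        pause(1);
        r.turnAngle(90);                            % turn away before continuing
    elseif bump.left || bump.right                  % ran into something
        r.setDriveVelocity(-0.1, -0.1);
        pause(1);
        if bump.left
            r.turnAngle(-45);           % left bumper hit: turn right
        else
            r.turnAngle(45);            % right bumper hit: turn left
        end
    elseif light.leftCenter > 100 || light.rightCenter > 100   % something close ahead (field names assumed)
        r.setDriveVelocity(0.05, 0.05); % slow down to reduce damage
    else
        r.setDriveVelocity(0.2, 0.2);   % clear path: cruise
    end
end
r.setDriveVelocity(0, 0);               % stop when the time runs out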


Step 4: Function: Image Processing

As part of the requirements, we were tasked with processing the data of an image (or several images) received from the robot's camera, so we decided to make the Roomba "count" the number of objects it sees in each image.

We followed the technique of having MATLAB draw boundaries around dark objects that contrast with a white background. However, this function is prone to difficulty in an open area, as the variety of shapes and colors perceived by the camera results in unusually high counts.
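A minimal sketch of this kind of counter is shown below. It is not our exact script: it assumes the Image Processing Toolbox and a camera method named getImage (confirm with doc roomba), and the crop and noise values are placeholders.

img = r.getImage();                     % grab a frame from the Roomba's camera
img = img(60:end, :, :);                % crop off the top rows to hide the time stamp (row count is a guess)
gray = rgb2gray(img);                   % convert to grayscale
bw = ~imbinarize(gray);                 % dark objects become white blobs on a black background
bw = bwareaopen(bw, 50);                % discard tiny specks of noise
bw = imclearborder(bw);                 % ignore blobs touching the image border
B = bwboundaries(bw, 'noholes');        % trace a boundary around each remaining blob
numObjects = numel(B)                   % the object count
imshow(img); hold on                    % overlay the boundaries to check the result
for k = 1:numel(B)
    plot(B{k}(:,2), B{k}(:,1), 'r', 'LineWidth', 2)
end
hold off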

Note that this function cannot work in the simulator, since no camera is provided; if attempted, an error will occur stating that only an image array of size (:,:,3) can be used.

Step 5: Function: Mapping

One additional feature we wanted the robot to have was mapping out its locations as it directly interacts with the environment. Thus, the code below seeks to open a map and set up a coordinate system that records each location at which the robot's bumper sensors are pressed. This proved to be the longest of the three parts to test individually, but it was much simpler once applied to the final script.

To limit the function's run time during testing, an n < 20 cap was placed on the while loop.
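A minimal sketch of the idea is shown below (coarse dead reckoning, not our actual script); it assumes the toolbox provides getDistance and getAngle odometry readings in addition to the methods used earlier, so verify the names and units with doc roomba.

% Mapping sketch: dead-reckon the position from odometry and mark each bump location
r = roomba(0);                          % simulator; use the robot's number for hardware
x = 0; y = 0; theta = 0;                % start at the origin, facing along +x
figure; hold on; grid on; axis equal
plot(x, y, 'ro')                        % mark the origin
r.setDriveVelocity(0.2, 0.2);
n = 0;
while n < 20                            % cap the number of bumps for testing
    pause(0.1);
    theta = theta + r.getAngle()*pi/180;   % heading change since the last call (degrees assumed)
    d = r.getDistance();                   % distance traveled since the last call (meters assumed)
    x = x + d*cos(theta);
    y = y + d*sin(theta);
    bump = r.getBumpers();
    if bump.left || bump.right
        plot(x, y, 'ko')                % black circle at the bump location
        r.setDriveVelocity(-0.1, -0.1); % back away, turn, and continue
        pause(1);
        r.turnAngle(90);
        r.setDriveVelocity(0.2, 0.2);
        n = n + 1;
    end
end
r.setDriveVelocity(0, 0);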

Keep in mind that, because of the complexity of the code, more errors accumulate the longer the code segment runs; from previous tests, about ten bumps appear to be the limit before significant errors occur.

Step 6: Juxtaposition

Since all of this must be placed in a single file, we created a function that uses each of the previous two steps as its subfunctions. A final draft was made with the following modifications to the combined function, called "recon." To avoid confusing MATLAB, the "counter" and "rombplot3" scripts were renamed as the embedded functions "CountR" and "plotr," respectively.

Several changes had to be made in the final version as opposed to the previous scripts; a rough skeleton of how the combined file can be laid out follows the list:

-The origin is always marked with a red circle.

-Every time the Roomba stops because of its bumpers, its location is marked with a black circle.

-Every time the Roomba stops because of its cliff sensors, its location is marked with a blue circle.

-Every time the Roomba stops to investigate the area, its location is marked with a green circle.

-Images have their top portion removed, since the time stamp could interfere with the results.

-Objects touching the image borders are no longer counted, due to the rather high counts they produced.

-Several variables have been changed, so to avoid confusion use the versions above for reference.
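A rough skeleton of how such a single file can be organized is shown below. The function and subfunction names mirror the ones mentioned above, but the bodies are placeholders rather than our actual code.

function recon(t)
    % Main routine: drive for t seconds while mapping stops and counting objects
    r = roomba(0);                      % or the robot's number for hardware
    figure; hold on; grid on
    plotr(0, 0, 'ro')                   % origin marked with a red circle
    tic
    while toc < t
        % ... drive loop from Step 3, updating (x, y) as in Step 5 ...
        % plotr(x, y, 'ko')   black circle when stopped by the bumpers
        % plotr(x, y, 'bo')   blue circle when stopped by the cliff sensors
        % plotr(x, y, 'go')   green circle when stopping to survey the area
        % CountR(r.getImage());
    end
    r.setDriveVelocity(0, 0);
end

function numObjects = CountR(img)
    % Embedded function: count the dark objects in an image (see Step 4)
    numObjects = 0;                     % placeholder body
end

function plotr(x, y, style)
    % Embedded function: mark a point on the map (see Step 5)
    plot(x, y, style)
end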

Step 7: Testing

Tests of each individual component produced rather mixed results at times, which is why modifications to certain preset values were necessary. The test environment for exercising the robot's capabilities in a closed area simply consisted of a whiteboard laid on a much darker floor. Objects can be scattered around the area, either as obstacles to be bumped into or as distant objects outside the robot's driving area.

After its allotted time and base velocity were set, the Roomba demonstrated adequate movement behavior, stopping and backing away from each "cliff" or object it rammed into, as well as slowing down when it detected something nearby. Upon reaching the desired three-meter travel distance, the robot would stop and survey the area, taking images of each 45-degree region, and then proceed onward if time allowed. However, its turns appeared larger than requested, meaning that the coordinate data would be skewed.

Each time it stopped, a new point was placed at the approximate region of its position on the coordinate system; however, the initial direction in which the Roomba starts plays a pivotal role in the layout of the map. If a compass feature could be implemented, it would serve as a crucial part of the map design.

The actual time the function takes to run always exceeds the requested time, which makes sense considering that it cannot stop in the middle of one of its recovery maneuvers. Unfortunately, this version of image counting has its problems, especially in areas that are mostly monochromatic or vary in brightness; because it attempts to distinguish between two shades, it tends to perceive objects that are not there, and hence counts up to absurdly high numbers.

Step 8: Conclusion

While this task was an adventurous and creative piece of work that brought about both relief and joy, from my personal observations I could see a great number of errors that could be problematic, both in the code and in the behavior of the robot.

Using time as the condition on the while loop causes the total run time to be longer than desired; the panorama routine and image processing can take even longer if run on a slow computer or without having been run beforehand. In addition, the Roomba used in our presentation exhibited a great number of errors, especially in movement, compared to the simulator. It unfortunately had a tendency to drift slightly to the left while driving straight and made larger turns than desired. For this reason and many others, it is highly recommended that the turn angles be adjusted to compensate for these errors.

Nevertheless, this was a long yet intellectually stimulating project that served as an interesting learning experience in applying code and commands to directly affect the behavior of an actual robot.