Introduction: AnimeBOT - Animatronics Face

Introduction:

This is my first Instructable. The AnimeBOT was a project that was part of the course ME 511 Mechatronics II. It was a group project focused on concepts of mechatronics. Here's an Instructable on how we did it!

Animatronics is the use of electronics and robotics to create lifelike animated characters or creatures, often used in film and television, theme parks, and other entertainment venues to create realistic and interactive experiences for audiences.

The AnimeBOT animatronic face is powered by the Arduino platform and has two forms of interaction: joystick operation to control eye movement, and object detection, upon which the bot moves its jaw along with an audio output.

My fellow project partner Saurabh Bharane volunteered to let us use his face for the project.


Concept:

-Eyes will move laterally, i.e. left and right, upon the input signal given.

-Eyes will be controlled with the help of a joystick.

-There will be an audio output whenever the BOT detects an object nearby. Detection is done with the help of an ultrasonic sensor.

-Upon detection of an object, along with the audio output there will be a jaw movement.

Supplies

Step 1: CREATING 3D CAD MODELS

In this project, an animatronic face with an eye mechanism is constructed using simple linkage systems and servo motors to control side-to-side eye movement.

To begin, 3D models for the linkages, connectors, and servo links were created using Autodesk TinkerCAD. These models were then used to identify the necessary 3D printed parts, which are listed in a figure provided for reference. The mechanism can be expanded to allow for up and down eye movement.

The first step is to print all required 3D models before constructing the mechanism.

Step 2: PRINTING OUT 3D MODELS AND ASSEMBLY

Developing the animatronic eye mechanism involved designing and printing the necessary 3D parts. This required careful attention to the design of the linkage systems and servo motors to ensure efficient and effective operation.

Once the parts were printed, the next step was to assemble them and mount them onto the structure of the animatronic face. This phase demanded attention to detail to ensure that all components were correctly aligned and secured.

Finally, the mechanism was mounted onto the structure of the animatronic face, producing the desired eye movement. A provided picture shows the completed assembly and demonstrates the mechanism in action.

Step 3: PREPARING AN ALGORITHM AND PROGRAMMING

Step 3 involves preparing an algorithm and writing a program to make the project work as desired. Here is the algorithm we followed:


1. Declare and initialize variables, and define the pins used for the ultrasonic sensor, servos, and joystick.

2. Attach the servo motors to their respective pins and set their initial positions.

3. Enter an infinite loop that does the following:

a. Read the X and Y values of the joystick.

b. Map the joystick X value to the position of servo1, which controls the horizontal movement of the eye.

c. Write the new position to servo1.

d. Generate a 10-microsecond pulse on the TRIG pin and measure the duration of the echo pulse on the ECHO pin. The duration of the pulse corresponds to the distance between the ultrasonic sensor and an object in front of it.

e. Map the distance value to the position of servo2, which controls the vertical movement of the eye.

f. Write the new position to servo2.

g. If an object is detected within 10 cm, activate servo1 to move the eye left and right in a sweeping motion.

4. The loop repeats indefinitely until the program is terminated.

In summary, the program uses two servo motors to control the movement of an animatronic eye in response to input from a joystick and an ultrasonic sensor. The program continuously reads the joystick position and maps it to the position of servo1 to move the eye horizontally. The program also measures the distance to any object in front of the ultrasonic sensor using the ECHO pin and maps that distance to the position of servo2 to move the eye vertically. If an object is detected within 10 cm of the sensor, the program activates servo1 to move the eye in a sweeping motion.
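For readers who want to reproduce this logic, here is a minimal Arduino sketch of the algorithm above. The pin numbers, servo angles, and the 50 cm mapping range are illustrative assumptions for this write-up, not the team's actual wiring or code.

```cpp
// Minimal sketch of the Step 3 algorithm. Pin choices and ranges are
// assumptions -- adjust them to match your own wiring.
#include <Servo.h>

const int TRIG_PIN  = 9;    // ultrasonic trigger
const int ECHO_PIN  = 10;   // ultrasonic echo
const int JOY_X_PIN = A0;   // joystick X axis
const int JOY_Y_PIN = A1;   // joystick Y axis (read but unused here)

Servo servo1;  // horizontal eye movement
Servo servo2;  // vertical eye movement

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  servo1.attach(5);
  servo2.attach(6);
  servo1.write(90);  // start both servos centered
  servo2.write(90);
}

void loop() {
  // a-c: read the joystick and drive the horizontal servo
  int joyX = analogRead(JOY_X_PIN);
  int joyY = analogRead(JOY_Y_PIN);
  servo1.write(map(joyX, 0, 1023, 0, 180));

  // d: 10-microsecond trigger pulse, then time the echo
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  long distanceCm = pulseIn(ECHO_PIN, HIGH) / 58;  // ~58 us per cm, round trip

  // e-f: map the distance (clamped to 0-50 cm) to the vertical servo
  servo2.write(map(constrain(distanceCm, 0, 50), 0, 50, 0, 180));

  // g: sweep the eye left and right when an object is within 10 cm
  if (distanceCm > 0 && distanceCm <= 10) {
    for (int a = 60; a <= 120; a += 5) { servo1.write(a); delay(15); }
    for (int a = 120; a >= 60; a -= 5) { servo1.write(a); delay(15); }
  }

  delay(20);
}
```

Note that pulseIn() returns 0 when no echo arrives within its timeout, which the distanceCm > 0 check filters out so a missed reading does not trigger the sweep.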

Step 4: PROGRAMMING

As a project team, we first tested the eye mechanism and jaw mechanism separately before integrating them. We followed an algorithm for the eye mechanism and coded it accordingly, and then did the same for the jaw mechanism. Once we had a clear understanding of both mechanisms, we integrated them and programmed them together.


The eye mechanism program was designed to read the joystick X and Y values and map them to a servo motor position, generate a 10-microsecond pulse on the TRIG pin, measure the duration of the echo on the ECHO pin to calculate the distance of the object in front, map that distance value to a servo position, and finally write the servo to its new position. If an object was detected within 10 cm, servo1 would activate and move the eye.


The jaw mechanism program used a push button to control the servo motor that would move the jaw up and down.
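As a sketch of that idea (the pin numbers and jaw angle below are assumptions, not the team's actual values), a push button wired to ground with the internal pull-up enabled can drive the jaw servo directly:

```cpp
// Stand-alone jaw test: hold the button to open the jaw, release to close.
// Pin choices and the 40-degree open angle are illustrative assumptions.
#include <Servo.h>

const int BUTTON_PIN = 2;   // push button to ground, using internal pull-up
Servo jawServo;

void setup() {
  pinMode(BUTTON_PIN, INPUT_PULLUP);
  jawServo.attach(3);
  jawServo.write(0);        // jaw closed
}

void loop() {
  if (digitalRead(BUTTON_PIN) == LOW) {   // LOW means the button is pressed
    jawServo.write(40);                   // open the jaw
  } else {
    jawServo.write(0);                    // close the jaw
  }
  delay(20);
}
```

Using INPUT_PULLUP avoids an external resistor: the pin reads HIGH when the button is released and LOW when pressed.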


Once both mechanisms were tested and working, they were integrated, and a single program was written to control both the eye and jaw movements. The final program allowed the eye mechanism to detect an object within 10 cm and activate the servo motor to move the eye while the jaw mechanism simultaneously moved the jaw up and down.
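The sketch below merges the two test programs to show one way the integrated behavior could look. As before, the pins, angles, and helper function are illustrative assumptions rather than the team's final code:

```cpp
// Integrated eye + jaw sketch. Pins, angles, and thresholds are
// assumptions -- adjust them to match your own build.
#include <Servo.h>

const int TRIG_PIN  = 9;    // ultrasonic trigger
const int ECHO_PIN  = 10;   // ultrasonic echo
const int JOY_X_PIN = A0;   // joystick X axis

Servo eyeServo;   // horizontal eye movement (servo1)
Servo jawServo;   // jaw open/close

// Fire the ultrasonic sensor once and return the distance in cm.
long readDistanceCm() {
  digitalWrite(TRIG_PIN, LOW);
  delayMicroseconds(2);
  digitalWrite(TRIG_PIN, HIGH);
  delayMicroseconds(10);
  digitalWrite(TRIG_PIN, LOW);
  return pulseIn(ECHO_PIN, HIGH) / 58;  // ~58 us per cm, round trip
}

void setup() {
  pinMode(TRIG_PIN, OUTPUT);
  pinMode(ECHO_PIN, INPUT);
  eyeServo.attach(5);
  jawServo.attach(3);
  eyeServo.write(90);  // eye centered
  jawServo.write(0);   // jaw closed
}

void loop() {
  // The joystick keeps control of the eye when nothing is nearby.
  eyeServo.write(map(analogRead(JOY_X_PIN), 0, 1023, 0, 180));

  // When an object comes within 10 cm, sweep the eye and move the jaw.
  long d = readDistanceCm();
  if (d > 0 && d <= 10) {
    jawServo.write(40);  // open the jaw
    for (int a = 60; a <= 120; a += 5) { eyeServo.write(a); delay(15); }
    for (int a = 120; a >= 60; a -= 5) { eyeServo.write(a); delay(15); }
    jawServo.write(0);   // close the jaw
  }

  delay(20);
}
```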


We developed separate programs for the eye and jaw mechanisms, tested them separately, integrated them, and finally developed a single program to control both mechanisms simultaneously.

Step 5: GIVING FACE TO THE PROJECT

The final step is to add a face to the animatronic project. One of my project partners volunteered his face as a reference for creating an animated version, which serves as the face of our animatronic project.

Step 6: FINAL OUTCOME

The completed project features an animatronic face that can interact with humans by making eye contact and moving its jaw. The mechanism for eye movement and jaw movement was successfully integrated, although audio integration was not achieved due to technical difficulties. Further time and attention are required to properly integrate audio with the jaw movement. Nonetheless, the eye and jaw mechanism work efficiently and create a human-like interaction.

Step 7: LESSONS LEARNT

Two important lessons that can be learned from working on a basic animatronics project are the significance of troubleshooting skills and the importance of collaboration.

Troubleshooting skills are critical for identifying and solving problems that arise during the project's development, including issues with individual components and final adjustments.

Collaboration is also a crucial aspect of animatronics projects, as they often require teamwork to ensure that all members are working together to achieve the same goal and to integrate each component seamlessly into the final product.

Step 8: FUTURE SCOPE AND POINTS OF IMPROVEMENT

The future scope of animatronics with computer vision is immense. Computer vision technology involves the use of cameras and image processing algorithms to interpret visual information and make decisions based on it. When integrated into animatronics, this technology can enable robots and other animatronic devices to interact with humans in more intelligent and intuitive ways.

One potential application of computer vision in animatronics is facial recognition. By using machine learning algorithms to analyze facial expressions and movements, animatronics can interpret human emotions and respond accordingly. This could have applications in entertainment, education, and therapy.

Computer vision can also be used to enable animatronics to navigate their environments more effectively. By using cameras and sensors to perceive their surroundings, animatronics can avoid obstacles and move through complex environments with ease.

Another potential application of computer vision in animatronics is in the field of robotics. By integrating computer vision technology with robotic systems, engineers can develop more intelligent and versatile robots that can perform a wider range of tasks in more complex environments.

Overall, the future scope of animatronics with computer vision is vast, and the technology is likely to continue evolving and advancing in the coming years. As these advancements are made, animatronics are likely to become even more integrated into our daily lives, providing new and innovative ways to interact with technology.