Introduction: Computer Vision Controlled Wheelchair With Mannequin

About: Neuro-integrated designer creating intelligences embedded in networks that make devices understandable and relatable to humans.

Project by AJ Sapala, Fanyun Peng, Kuldeep Gohel, Ray LC.
Instructable by AJ Sapala, Fanyun Peng, Ray LC.

We created a wheelchair whose wheels are controlled by an Arduino board, which is in turn controlled by a Raspberry Pi running OpenCV via Processing. When OpenCV detects a face, we drive the motors toward it, turning the wheelchair so it faces the person; the mannequin then takes a very scary picture through its mouth and shares it with the world. This is evil.

Step 1: Design, Prototype, and Schematics of the Wheelchair.

The initial concept was based on the idea that a movable piece could spy on unsuspecting classmates and take ugly pictures of them. We wanted to scare people by moving toward them, although we didn't anticipate how difficult the motors' mechanical problems would be. We considered features that would make the piece as engaging (in an evil way) as possible and decided to put a mannequin on a wheelchair that can move toward people using computer vision. A prototype of the result was made by AJ from wood and paper, while Ray and Rebecca got OpenCV running on a Raspberry Pi, making sure that faces can be detected reliably.

Step 2: Materials and Setup

1x wheelchair

2x scooter motors

2x Cytron motor boards

1x Arduino UNO R3

1x Raspberry Pi 3

1x Raspberry Pi Camera v2

1x 12 V rechargeable battery

rubber flooring

Step 3: Fabrication of Motor to Wheelchair Attachment and Mannequin Head

AJ fabricated an apparatus that fixes the two scooter motors to the bottom of the wheelchair and attached the pitch bracket to a custom-made rubber timing belt. Each motor is installed separately and drives its own wheel: two wheels, two motors. Each motor is then fed power and ground through a Cytron motor board, which is controlled by the Arduino, which is in turn controlled by the Raspberry Pi; all elements are powered by the 12 V rechargeable battery.
The motor apparatuses were created using plywood, L-brackets, square brackets, and wood fasteners. By creating a wooden brace around each motor, we made it much easier to install the motor in place on the bottom of the wheelchair, and the brace could be shifted to tighten the timing belt. The motor apparatuses were installed by drilling through the metal frame of the wheelchair and bolting the wood to the frame with L-brackets.

The timing belts were made from rubber flooring, which already had a pitch similar in size to the motor's spinning bracket. Each piece was trimmed to a width that fits the bracket. Each piece of cut rubber was fused into a "belt" by sanding both ends and applying a small amount of Barge glue to join them. Barge glue is hazardous: wear a mask while using it and work in a ventilated area. I created several sizes of timing belt: super tight, tight, and moderate. The belt then needed to be connected to the wheel. The wheel itself has only a small amount of surface area at its base to accommodate a belt, so this space was widened with a cardboard cylinder that had timing-belt rubber hot-glued to its surface. This way the timing belt could grip the wheel and spin it in sync with the scooter motor.

AJ also created a dummy head that integrates the Raspberry Pi's camera module. Ray took the dummy head and installed the Pi camera and board into the dummy's mouth region. Slots were created for the USB and HDMI interfaces, and a wooden rod is used to stabilize the camera. The camera is mounted on a custom 3D-printed piece with an attachment for 1/4-20 screws; the file is attached (adapted for fit by Ray from Thingiverse). AJ created the head using cardboard, duct tape, and a blonde wig with markers. All elements are still in the prototype stage. The dummy head was rigged to the body of a female mannequin and placed in the seat of the wheelchair, attached to the mannequin with a cardboard rod.

Step 4: Writing and Calibrating the Code

Rebecca and Ray first tried to install OpenCV directly on the Raspberry Pi with Python, but it didn't work reliably for live video. After many failed attempts to install OpenCV for Python, we decided to use Processing on the Pi, because the OpenCV library in Processing works quite well.
Note also that Processing works with the GPIO ports, which we can then use to control the Arduino using Serial communication.

Ray wrote the computer vision code, which relies on the attached XML file for detecting faces. Basically it checks whether the center of the face rectangle is to the right or left of the frame's center, and moves the motors in opposite directions so as to rotate the chair toward the face. If the face is close enough, the motors are stopped to take a picture. If no faces are detected, we also stop, in order not to cause unnecessary injury (you can change that functionality if you think it's not evil enough).
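The actual code is a Processing sketch, but the steering decision it makes can be sketched in Python. The function below is an illustration under our own assumptions (motor values in -1..1, a width threshold standing in for "close enough"), not the project's real code:

```python
# Sketch of the steering logic described above: turn toward the face,
# stop when it is close, stop when there is no face at all.
# face is an (x, y, w, h) rectangle from a detector, or None.

def steer(face, frame_w, close_w=200):
    """Return (left_motor, right_motor) in -1..1.

    Opposite signs spin the chair in place toward the face.
    close_w is an assumed width threshold for "face close enough".
    """
    if face is None:
        return (0.0, 0.0)          # no face: stop, to avoid injury
    x, y, w, h = face
    if w >= close_w:
        return (0.0, 0.0)          # close enough: stop and take the picture
    face_center = x + w / 2.0
    if face_center < frame_w / 2.0:
        return (-0.5, 0.5)         # face left of center: rotate left
    else:
        return (0.5, -0.5)         # face right of center: rotate right
```

In OpenCV the rectangle would come from a cascade classifier loaded from the XML file, e.g. `cv2.CascadeClassifier(...).detectMultiScale(gray)`, which returns such (x, y, w, h) rectangles.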

Rebecca wrote the Arduino code to interface with the motor board, using Serial communication with Processing on the Pi. The key steps are opening the USB serial port ACM0 to the Arduino, connecting the Raspberry Pi to the Arduino via a USB cable, and wiring the Arduino to the DC motor drivers so it can set each motor's speed and direction from commands sent by the Raspberry Pi. Basically Ray's Processing code tells the motors what speed to run at, while the Arduino makes a fair guess at the duration of the command.
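To make the Pi-to-Arduino link concrete, here is a minimal Python sketch of that kind of serial protocol. The one-letter-plus-speed command format is our own assumption for illustration; the project's real Processing code may encode commands differently:

```python
# Hypothetical command format: one direction letter (L/R/F/S for
# left, right, forward, stop) followed by a 3-digit speed, newline-terminated,
# so the Arduino can parse each command with a simple readline.

def make_command(direction, speed):
    """Pack a direction and a speed (clamped to 0-255) into one line of bytes."""
    if direction not in ("L", "R", "F", "S"):
        raise ValueError("unknown direction: %r" % direction)
    speed = max(0, min(255, int(speed)))
    return ("%s%03d\n" % (direction, speed)).encode("ascii")

# Usage on the Pi (requires pyserial and a connected Arduino UNO,
# which normally shows up as /dev/ttyACM0):
#   import serial
#   with serial.Serial("/dev/ttyACM0", 9600, timeout=1) as arduino:
#       arduino.write(make_command("L", 150))   # rotate left at moderate speed
#       arduino.write(make_command("S", 0))     # stop
```

On the Arduino side, the matching loop would read a line, switch on the first character to pick a direction, and write the speed to the motor boards as a PWM value.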

Step 5: Integrate the Wheelchair, Mannequin, and Code and Test.

Putting all the parts together, we found that the main issue was the connection of the motors to the wheels of the wheelchair: the timing belts would frequently slip off. Both motors were installed with the wheelchair upside down for easier installation. Both motors operated well while attached to the 12-volt battery. When the wheelchair was flipped upright, however, the motors had trouble moving the chair backwards and forwards due to the weight of the chair itself. We tried changing the timing belt widths, adding pegs to the sides of the belt, and increasing the driving force, but none of these worked reliably.
However, we were able to demonstrate clearly that when a face is to either side of the chair, the motors move in the appropriate opposite directions based on the Raspberry Pi's face detection, so the Processing and Arduino code works as intended and the motors can be controlled appropriately. The next steps are a more robust way of driving the wheels of the chair and making the mannequin stable.

Step 6: Enjoy Your New Evil Mannequin-Wheelchair

We learned a lot about fabricating motor mounts and drivers. We managed to run face detection on a small machine, the Raspberry Pi. We figured out how to control motors with motor boards and how motor power works. We made some cool mannequins, figures, and prototypes, and even put a camera in a mannequin's mouth. We had fun as a team making fun of other people. It was a rewarding experience.