Robot control with facial recognition project


Project Description

Despite the advances in technology, people suffering from partial paralysis or muscular dystrophy still have a hard time being completely independent. The goal of this project was to develop a proof-of-concept of a solution to help them have more control over their day-to-day life.

The program can use any webcam to detect basic facial expressions: mouth open/closed, eyes open/closed, eyebrows up/down, and head tilt. Each detected expression can then be used to control a movement of the robotic arm.
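As a rough illustration of how an expression like "mouth open" can be detected from face-tracker landmarks, a simple aspect-ratio threshold works: compare the vertical mouth opening to the mouth width. The landmark points and the 0.35 threshold below are assumptions for the sketch, not the project's actual values.

```cpp
#include <cassert>
#include <cmath>

// Hypothetical sketch: classify mouth open/closed from face-tracker
// landmarks using an aspect-ratio threshold. Points and the threshold
// are illustrative only.
struct Point { float x, y; };

float dist(const Point& a, const Point& b) {
    return std::hypot(a.x - b.x, a.y - b.y);
}

// Ratio of vertical mouth opening to mouth width.
float mouthAspectRatio(const Point& top, const Point& bottom,
                       const Point& left, const Point& right) {
    return dist(top, bottom) / dist(left, right);
}

bool isMouthOpen(const Point& top, const Point& bottom,
                 const Point& left, const Point& right,
                 float threshold = 0.35f) {
    return mouthAspectRatio(top, bottom, left, right) > threshold;
}
```

The same idea (a ratio of landmark distances compared to a threshold) extends to eyes open/closed and eyebrows up/down.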

Also, note that this work was done as part of an undergraduate research project back in 2016.
Technologies Used: C++/OpenFrameworks/Kinova robot arm

Demonstration

Fun fact: My accent has actually improved since!


Simple explanation

The webcam tracks the user's face.
The system tracks 3 facial expression pairs: eyebrows up/down, eyes open/closed, mouth open/closed.
The currently detected facial expression is highlighted in blue. When held for a predetermined period of time, the system highlights the decision in yellow and moves the arm accordingly.

There are 2 control modes: the first one uses pre-registered movements, such as fetching a bottle on the table. The second one assigns a different joint of the robot arm to each of the 6 main facial expressions. Then, by tilting the head to the side, the user can move every joint independently in one direction or the other.
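The second control mode can be sketched as a simple mapping: each of the six expressions selects one joint, and the sign of the head tilt picks the direction. The expression-to-joint assignment, the velocity value, and all names below are assumptions; the real arm would be driven through the Kinova SDK.

```cpp
#include <cassert>

// Illustrative sketch of the joint-control mode: one joint per
// expression, direction taken from the head-tilt sign.
enum class Expr { EyebrowsUp, EyebrowsDown, EyesOpen, EyesClosed,
                  MouthOpen, MouthClosed };

struct JointCommand {
    int joint;       // joint index 0..5 (assumed 6-DOF arm)
    float velocity;  // signed joint velocity; sign comes from head tilt
};

JointCommand commandFor(Expr e, float headTiltDeg, float speed = 10.0f) {
    int joint = static_cast<int>(e);                 // one joint per expression
    float dir = (headTiltDeg >= 0.0f) ? 1.0f : -1.0f;
    return {joint, dir * speed};
}
```

Tilting the head left or right flips the sign of the commanded velocity, so a single held expression is enough to drive its joint in either direction.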