Jules mimics human expressions by converting the incoming video image into digital commands that drive the robot's servos and motors to produce mirrored movements. It all happens in real time, with the robot interpreting the commands at 25 frames per second.
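The pipeline described above (camera frame, then digital commands, then mirrored servo movement at 25 frames per second) can be sketched roughly as follows. This is a hypothetical illustration, not the BRL team's actual software: the feature names, the 0.0-1.0 normalization, and the 0-180 degree servo range are all assumptions.

```python
# Hypothetical sketch of a mirror-expression pipeline: each video frame is
# assumed to yield normalized facial-feature measurements (0.0 to 1.0),
# which are mapped linearly to servo angles. Names and ranges are illustrative.

def features_to_servo_angles(features, servo_range=(0.0, 180.0)):
    """Map normalized feature values (0.0-1.0) to servo angles in degrees."""
    lo, hi = servo_range
    return {name: lo + value * (hi - lo) for name, value in features.items()}

FRAME_RATE = 25                  # frames per second, as stated in the article
FRAME_PERIOD = 1.0 / FRAME_RATE  # time budget per frame: 40 ms

# One frame's worth of (made-up) feature measurements:
frame_features = {"brow_raise": 0.5, "mouth_open": 0.25, "smile": 1.0}
angles = features_to_servo_angles(frame_features)
# angles == {"brow_raise": 90.0, "mouth_open": 45.0, "smile": 180.0}
```

In a real system, the per-frame work (feature extraction plus command output) would have to finish within the 40 ms frame period to keep the mirroring in real time.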
The project, called 'Human-Robot Interaction', was devised at the Bristol Robotics Laboratory (BRL), run by the University of the West of England and the University of Bristol.
A team of robotics engineers - Chris Melhuish, Neill Campbell and Peter Jaeckel - spent three and a half years developing the software behind this breakthrough in interaction between humans and artificial intelligence.
Original Text Source: dailymail.co.uk