Monday, 17 November 2008

AI 140 British Android ‘Jules’ Mimics Facial Expressions with Human-Like Qualities


Scientists at the Bristol Robotics Laboratory (BRL) have devised the first android robot that can perform the precise lip movements and facial expressions of human beings.

Named ‘Jules’, the robot automatically copies human facial movements by capturing them with its video-camera ‘eyes’ and mapping them onto small electronic motors in its skin.

The process allows Jules’ disembodied androgynous robotic head to grin and grimace, furrow its brow and 'speak'.

Jules mimics these expressions by converting the video images into digital commands that drive the robot's servos and motors to reproduce the movements in real time, interpreting the incoming video at 25 frames per second.
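The article only outlines this pipeline (camera frame in, digital servo commands out, 25 times per second), but the overall loop can be sketched in rough, illustrative Python. This is not the BRL/Jules software: extract_expression, expression_to_servo_targets, and the camera.read()/servos.move_to() interfaces are placeholder names assumed here purely to show the frame-by-frame structure.

```python
import time

FRAME_RATE = 25    # the article: Jules interprets commands at 25 frames per second
NUM_SERVOS = 34    # the article: Jules has 34 internal motors under its Frubber skin


def extract_expression(frame):
    """Hypothetical stand-in: reduce a video frame to normalised (0.0-1.0)
    expression features, e.g. via facial landmark tracking."""
    return {"brow_raise": 0.0, "mouth_open": 0.0, "smile": 0.0}


def expression_to_servo_targets(expression):
    """Hypothetical stand-in: map expression features onto target positions
    for each of the robot's servos."""
    return [0.0] * NUM_SERVOS


def run_mimicry_loop(camera, servos):
    """Grab frames from the camera 'eyes', convert them to servo commands,
    and drive the motors once per frame (a 40 ms budget at 25 fps)."""
    period = 1.0 / FRAME_RATE
    while True:
        start = time.monotonic()
        frame = camera.read()                           # video image in
        expression = extract_expression(frame)          # image -> digital description
        targets = expression_to_servo_targets(expression)
        servos.move_to(targets)                         # digital commands -> motors
        time.sleep(max(0.0, period - (time.monotonic() - start)))
```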

Robotics engineers Chris Melhuish, Neill Campbell and Peter Jaeckel spent three and a half years developing the groundbreaking software that enables this interaction between humans and artificial intelligence. Jules has 34 internal motors covered with flexible rubber ('Frubber') skin, which was commissioned from U.S. roboticist David Hanson for the project.

The quality of androids keeps improving as they become more human-like; other recent examples include:

• Repliee R-1 (refer to AI 101)

• "Intelligent female" android from Tokyo’s robotdreams.com (refer to AI 99)