This project will develop a multisensory search engine. It will first design and fabricate a robotic interface integrating visual, haptic and audio feedback. It will then use a predictive coding approach to analyse, interpret and generate tactile, audio and taste cues, giving users a virtual experience of various objects. Finally, it will implement systematic experiments to test and develop the system's capability to improve the speed and accuracy of object recognition.
Expected Results
Predictive coding modelling of haptic sensing integrating tactile and proprioceptive information.
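To make the expected result above concrete, here is a minimal, generic predictive-coding sketch. It is an illustration only, not the project's actual model: all names, the linear generative mapping, and the choice of two sensory channels (standing in for tactile and proprioceptive signals) are assumptions. A latent estimate is refined by minimising the precision-weighted error between top-down sensory predictions and observations.

```python
import numpy as np

def predictive_coding_step(mu, obs, W, precision, lr=0.1):
    """One gradient step on the prediction error of a linear generative model.

    mu        : current latent state estimate, shape (n,)
    obs       : observed sensory vector, shape (m,)
    W         : generative mapping latent -> sensory, shape (m, n)
    precision : inverse variance per sensory channel, shape (m,)
    """
    pred = W @ mu                    # top-down prediction of the senses
    err = precision * (obs - pred)   # precision-weighted prediction error
    mu = mu + lr * (W.T @ err)       # bottom-up correction of the estimate
    return mu, err

# Toy fusion of two hypothetical channels driven by one latent variable.
W = np.array([[1.0], [0.5]])         # "tactile" and "proprioceptive" gains
precision = np.array([4.0, 1.0])     # the first channel is trusted more
mu = np.zeros(1)
obs = np.array([2.0, 1.0])           # both consistent with latent value 2.0
for _ in range(200):
    mu, err = predictive_coding_step(mu, obs, W, precision)
# mu converges to ~2.0, the value that explains both channels
```

With consistent observations the estimate settles on the latent value that explains both channels; raising one channel's precision pulls the estimate toward that channel when the observations conflict.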
Placement
Host institution: Imperial College London
Enrolment (in Doctoral degree): Imperial College London
Supervisors
Etienne Burdet, Vincent Hayward
Presentation of ESR7
My name is Alexis Devillard. After a master's in robotics engineering (Polytech Sorbonne) and a master's in artificial intelligence and multi-agent systems (Sorbonne University), I worked at the ISIR lab as a research engineer. In 2020 I started my PhD at Imperial College London. I am passionate about all the different interactions between robotic systems and living beings.