Predictive coding of tactile information

ESR7

Objectives

This project will develop a multisensory search engine. It will first design and fabricate a robotic interface integrating visual, haptic and audio feedback. It will then use a predictive coding approach to analyse, interpret and generate tactile, audio and visual cues that give users a virtual experience of various objects. Finally, it will run systematic experiments to test and develop the system's capability to improve the speed and accuracy of object recognition.

Expected Results

Predictive coding modelling of haptic sensing integrating tactile and proprioceptive information.

Placement

Host institution: Imperial College London

Enrolment (in doctoral degree): Imperial College London

Supervisors

Etienne Burdet, Vincent Hayward

Presentation of ESR7

PhD defence: To be announced

My name is Alexis Devillard. After a master's in robotics engineering (Polytech Sorbonne) and a master's in artificial intelligence and multi-agent systems (Sorbonne University), I worked at the ISIR lab as a research engineer. In 2020 I started my PhD at Imperial College London. I am passionate about all the different interactions between robotic systems and living beings.

PhD goals

This project introduces a multi-sensory search engine concept, exploring the potential of augmenting traditional visual search interfaces, such as Google Images, with haptic feedback. This research could lead to two primary applications: tele-perception, where sensory recordings directly control sensory feedback devices, and automated sensory feedback generation based on the user’s virtual interaction with a selected material.

Results

Deliverable 3.3 Predictive coding model of haptic sensing
Abstract model of haptic sensing integrating tactile and proprioceptive information. Predictive coding modelling of tactile sensing in rodents. Human-like computational model of haptic sensing for robots.
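To illustrate the idea behind this deliverable, the core of a predictive coding model can be sketched as iteratively refining a latent state estimate so that it explains prediction errors from several sensory channels at once. The sketch below is a minimal, hypothetical example (the linear generative maps `W_tac` and `W_pro`, the learning rate, and the dimensions are all illustrative assumptions, not the project's actual model) showing how tactile and proprioceptive observations can jointly drive inference of a shared latent state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear generative model: each modality predicts its own
# observation from a shared latent state x (maps chosen at random here,
# purely for illustration).
W_tac = rng.normal(size=(8, 3))   # latent -> tactile observation
W_pro = rng.normal(size=(4, 3))   # latent -> proprioceptive observation

x_true = np.array([0.5, -1.0, 0.3])
s_tac = W_tac @ x_true            # simulated tactile measurement
s_pro = W_pro @ x_true            # simulated proprioceptive measurement

# Predictive coding inference: gradient descent on the latent estimate,
# driven by the prediction errors of both modalities simultaneously.
x = np.zeros(3)
lr = 0.01
for _ in range(1000):
    e_tac = s_tac - W_tac @ x     # tactile prediction error
    e_pro = s_pro - W_pro @ x     # proprioceptive prediction error
    x += lr * (W_tac.T @ e_tac + W_pro.T @ e_pro)

print(np.round(x, 3))  # estimate converges toward x_true
```

In a full model the generative maps would be nonlinear and hierarchical, and the two error terms would be precision-weighted so that the more reliable modality dominates the update; the fusion principle, however, is the one shown here.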

Conference Article
Devillard, A.; Ramasamy, A.; Faux, D.; Hayward, V.; Burdet, E.
Concurrent Haptic, Audio, and Visual Data Set During Bare Finger Interaction with Textured Surfaces
IEEE World Haptics Conference (WHC), 2023
DOI: 10.1109/WHC56415.2023.10224372