A Framework for Learning Visual and Tactile Correlation

Research output: Thesis › Master's Thesis

Abstract

Tactile data is an important source of information for applications in fields such as object manipulation and object recognition. However, gathering tactile data can be inconvenient and time-consuming. For example, a robotic manipulator would have to grasp an object once to gather the tactile information and then again to actually pick it up. This thesis proposes a way to overcome this issue by implementing a method that predicts what the tactile feedback sensors would measure when touching an object at a given position, based on two-dimensional visual data. To this end, visual-tactile data pairs were gathered to train a Convolutional Neural Network that takes images of objects with the positions of interest marked as input and produces the force vector as output. To improve performance, the edges of the input images were extracted using the Canny algorithm, a new architecture was developed, and the training process was optimised with Bayesian Optimisation. An evaluation strategy was developed and a test set built in order to compare the different models effectively. The result is a framework that is capable of understanding the spatial relationship between tactile sensors and surfaces but lacks accuracy as a result of noisy data. The noise is caused by inaccurate sensors and a sub-optimal acquisition strategy.
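The pipeline described above (edge-extracted image plus a marked contact position in, force vector out) can be sketched as a toy forward pass. This is a minimal, numpy-only illustration, not the thesis's actual architecture: the real work uses a trained CNN and the Canny algorithm, whereas here a simple gradient-magnitude threshold stands in for Canny, and the network is reduced to one convolution, global pooling, and a linear head. All function names, shapes, and weights below are illustrative assumptions.

```python
import numpy as np

def edge_map(img, thresh=0.2):
    # Simplified edge extraction: gradient-magnitude thresholding,
    # a stand-in for the Canny algorithm used in the thesis.
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    if mag.max() == 0:
        return mag
    return (mag > thresh * mag.max()).astype(float)

def conv2d(x, kernel):
    # Naive single-channel "valid" 2-D convolution.
    kh, kw = kernel.shape
    h, w = x.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * kernel)
    return out

def predict_force(img, marker, kernel, W, b):
    # Toy forward pass: edge image + marked position of interest
    # -> predicted 3-D force vector (Fx, Fy, Fz).
    edges = edge_map(img)
    feat = np.maximum(conv2d(edges, kernel), 0.0)   # conv + ReLU
    pooled = feat.mean()                            # global average pooling
    x = np.array([pooled, marker[0], marker[1]])    # append contact position
    return W @ x + b                                # linear head

# Random weights only demonstrate shapes; in the thesis these would be
# learned from the gathered visual-tactile data pairs.
rng = np.random.default_rng(0)
img = rng.random((16, 16))
kernel = rng.standard_normal((3, 3))
W = rng.standard_normal((3, 3))
b = np.zeros(3)
force = predict_force(img, (0.5, 0.25), kernel, W, b)
```

In the actual framework, the convolution weights, `W`, and `b` would be fitted on the visual-tactile training pairs, with the training hyperparameters tuned by Bayesian Optimisation.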

Details

Translated title of the contribution: Ein Framework für visuelle und taktile Korrelation
Original language: English
Qualification: Dipl.-Ing.
Awarding Institution
Supervisors/Advisors
Award date: 16 Dec 2022
DOIs
Publication status: Published - 2022