Works in Progress
Colonial Ballroom
Drinks will be served.
Can Humans Infer Haptic Surface Properties from Images?
Alex Burka and Katherine J. Kuchenbecker
(University of Pennsylvania, USA; Max Planck Institute for Intelligent Systems, Germany)
WIP Poster A4
Abstract: Human children typically experience their surroundings both visually and haptically, providing ample opportunities to learn rich cross-sensory associations. To thrive in human environments and interact with the real world, robots also need to build models of these cross-sensory associations; current advances in machine learning should make it possible to infer such models from large amounts of data. We previously built a visuo-haptic sensing device, the Proton Pack, and are using it to collect a large database of matched multimodal data from tool-surface interactions. As a benchmark to compare with machine learning performance, we conducted a human subject study (n = 84) on estimating haptic surface properties (here: hardness, roughness, friction, and warmness) from images. Using a 100-surface subset of our database, we showed images to study participants and collected 5635 ratings of the four haptic properties, which we compared with ratings made by the Proton Pack operator and with physical data recorded using motion, force, and vibration sensors. Preliminary results indicate a weak correlation between participant and operator ratings, but suggest potential for matching certain human ratings (particularly hardness and roughness) with features from the literature.
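The comparison between participant and operator ratings described above is typically done with a rank correlation, since such ratings are ordinal. The following is a minimal sketch of computing a Spearman rank correlation from scratch; the rating values are purely illustrative and not taken from the actual study, and the paper does not specify which correlation measure was used.

```python
def rank(values):
    """Assign 1-based ranks, averaging the ranks of tied values."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over any run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank of the tied run
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical hardness ratings for five surfaces (not study data).
participant = [2.0, 4.5, 3.0, 5.0, 1.5]
operator = [2.5, 3.5, 4.0, 5.0, 1.0]
print(round(spearman(participant, operator), 3))  # prints 0.9
```

In practice one would use `scipy.stats.spearmanr` for this; the hand-rolled version is shown only to make the tie-handling explicit.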
