
2023 IEEE World Haptics Conference (WHC), July 10–13, 2023, Delft, Netherlands

WHC 2023 – Preliminary Table of Contents



Title Page

Message from the Chairs




Wearable 3D Shape Display for Dynamic Interfaces Rendering
Bilige Yang, Benjamin Stephens-Fripp, Priyanshu Agarwal, Sonny Chan, Nathan Usevitch, Andrew Stanley, and Yatian Qu
(Meta, USA; Yale University, USA)
Recreating the feeling of touch is crucial for seamless interaction with objects in the virtual world. Many haptic solutions exist in the form of graspable, wearable, and touchable systems for recreating kinesthetic and tactile feedback. Yet, to the best of our knowledge, no wearable system to date can directly render dynamic shapes in a user's hand with drastic shape-rendering capabilities. We present a wearable 3D haptic display with drastic shape change and dynamic signal rendering. We explored direct physical and dynamic rendering of shapes in users' hands using a 3D lattice of pneumatic actuators, a direct-embodiment approach. We also conducted user studies to determine the efficacy of the shape and frequency rendering and found the results to be generally convincing.

Perception of and Response to a Haptic Device as a Function of Signal Complexity
Antonio Alvarez Valdivia and Laura H. Blumenschein
(Purdue University, USA)
Haptic devices have been developed in a wide range of form factors, actuation methods, and degrees of freedom, often with the goal of communicating information. While work has investigated the maximum rate and quantity of information that can be transferred through haptics, these measures often do not inform how humans will use the devices. In this work, we measure the differences between perception and use as they relate to signal complexity. Using an inflatable soft haptic display with four independently actuated pouches, we provide navigation directions to participants. The haptic device operates in three modalities, in increasing order of signal complexity: Cardinal, Ordinal, and Continuous. We first measure participants' accuracy in perceiving continuous signals generated by the device, showing average errors below 5°. Participants then used the haptic device in each operating mode to guide an object towards a target in a 2-dimensional plane. Our results indicate that humans' use of haptic signals often lags significantly behind the displayed signal and is less accurate than their static perception. Additionally, signal complexity was correlated with path efficiency but inversely correlated with movement speed, showing that even simple design changes create complex tradeoffs.

Performance Evaluation of Airborne Ultrasound Focus Measurement Using Thermal Imaging on the Surface of a Finger
Sota Iwabuchi, Ryoya Onishi, Shun Suzuki, Takaaki Kamigaki, Yasutoshi Makino, and Hiroyuki Shinoda
(University of Tokyo, Japan)
Mid-air haptics generated by ultrasound requires measurement and correction of the sound field on the surface of the user’s finger for optimal tactile presentation. Thermal images enable the rapid observation of sound pressure patterns without interfering with the acoustic field. In this study, we discuss the performance of ultrasound focus measurement using thermal images. Using a dummy finger with an embedded microphone, we confirmed that the temperature and sound pressure patterns of the ultrasound focus matched. Furthermore, we confirmed that the temperature change on the human finger surface determines the focal point position in 0.2 s with an accuracy of less than 1 mm. The results suggest that the temperature pattern directly reflects the position and shape of the focus and enables rapid and accurate calibration with a fixed finger by repeating the focus measurement and positional correction. In addition to increasing the reproducibility of tactile presentation, thermal measurement enhances the tactile experience by allowing the presentation of precisely controlled stimulation.

Fostering Social Empathy in VR through Physiologically Based Affective Haptic Feedback
Jeanne Hecquard, Justine Saint-Aubert, Ferran Argelaguet Sanz, Claudio Pacchierotti, Anatole Lécuyer, and Marc J.-M. Macé
(Inria, France; Inria Bretagne-Atlantique, France; CNRS, France; University of Rennes 1, France)
We study the promotion of positive social interactions in VR by fostering empathy with other users present in the virtual scene. For this purpose, we propose using affective haptic feedback to reinforce the connection with another user through the direct perception of their physiological state. We developed a virtual meeting scenario where a human user attends a presentation with several virtual agents. Throughout the meeting, the presenting virtual agent faces various difficulties that alter her stress level. The human user directly feels her stress via two physiologically based affective haptic interfaces: a compression belt and a vibrator, simulating the breathing and the heart rate of the presenter, respectively. We conducted a user study that compared the use of such a "sympathetic" haptic rendering versus an "indifferent" one that does not communicate the presenter's stress status, remaining constant and relaxed at all times. Results are rather mixed and user-dependent, but they show that sympathetic haptic feedback is globally preferred and can enhance empathy and perceived connection to the presenter. The results promote the use of affective haptics in social VR applications, in which fostering positive relationships plays an important role.

Handheld Haptic Device with Coupled Bidirectional Input
Megh Doshi, Michael Hagenow, Robert Radwin, Michael Gleicher, Bilge Mutlu, and Michael Zinn
(University of Wisconsin-Madison, USA)
Handheld kinesthetic haptic interfaces can provide greater mobility and richer tactile information as compared to traditional grounded devices. In this paper, we introduce a new handheld haptic interface which takes input using bidirectional coupled finger flexion. We present the device design motivation and design details and experimentally evaluate its performance in terms of transparency and rendering bandwidth using a handheld prototype device. In addition, we assess the device's functional performance through a user study comparing the proposed device to a commonly used grounded input device in a set of targeting and tracking tasks.

Modeling and Simulation of Thermal Grill Illusion Using Neurophysiological Theory
Subhankar Karmakar, Madhan Kumar Vasudevan, and Manivannan Muniyandi
(Indian Institute of Technology Madras, India)
The Thermal Grill Illusion (TGI) is a temperature-based perceptual illusion in which innocuous warm and cold stimuli evoke pain when applied simultaneously in a juxtaposed pattern. Based on neurophysiological and psychological findings, several theories have been proposed to explain the mechanisms behind TGI. However, the significance of an analytical model for TGI is not addressed in the literature. This study focuses on developing an analytical model based on the "disinhibition theory" to predict the intensity of TGI pain. A psychophysical experiment on perceived TGI pain was first conducted, and then an analytical model was developed. The model's objective is to predict the neuronal activity of pain-sensitive HPC (Heat-Pinch-Cold) nerve fibers by leveraging the existing popular models of warm and cold receptors. An experimental thermal grill setup was used to provide five temperature differences between warm and cold grills (each repeated three times). Participants rated the perceived TGI pain sensation on a Likert scale of one to ten. Both the experimental results and the simulation showed a monotonically increasing relationship between temperature differences and the perceived TGI intensity. The proposed model bridges the gap between neurophysiological and psychophysical knowledge of TGI, potentially aiding thermal display designs.

Neural Correlates of Cooperation during Interactive Visuomotor Task: An fNIRS Hyperscanning Study
Yilei Zheng, Shiyi Liu, Bohao Tian, Yuru Zhang, and Dangxiao Wang
(Beijing Information Science and Technology University, China; Beihang University, China; State Key Laboratory of Virtual Reality Technology and Systems, China)
Investigating neural correlates of cooperation is an important topic within the field of social cognition. The inter-brain synchronization in the leader-follower joint action paradigm and how neural synchrony is associated with the degree of cooperation remain unclear. Here, functional near-infrared spectroscopy (fNIRS)-based hyperscanning was employed to measure the neural activity of 14 dyads when they cooperated on a new interactive visuomotor task involving the leader-follower relationship. During this task, the participant who moved a handle of a haptic device along a specific path acted as the leader; meanwhile, the other participant produced a specified force with another haptic device acting as the follower. Using wavelet transform coherence analysis, we calculated the inter-brain coherence between the two participants based on their fNIRS signals from the prefrontal, left motor, and right temporoparietal brain regions. Correlation analysis between the neural coherence and the cooperation performance revealed that the increased synchrony in the right prefrontal and temporoparietal region was positively associated with the overall cooperation performance on the task. These findings suggested a vital role of the mentalizing network during the cooperative visuomotor task and the potential of inter-brain synchronization as a neural marker for assessing cooperation performance during haptic interactions.

Spatiotemporal Organization of Touch Information in Tactile Neuron Population Responses
Neeli Tummala, Yitian Shao, and Yon Visell
(University of California, Santa Barbara, USA; Technische Universität Dresden, Germany)
Manual touch interactions elicit widespread skin vibrations that excite spiking responses in tactile neurons distributed throughout the hand. The spatiotemporal structure of these population responses is not yet fully understood. Here, we evaluate how touch information is encoded in the spatiotemporal organization of simulated Pacinian corpuscle neuron (PC) population responses when driven by a vibrometry dataset of whole-hand skin motion during commonly performed gestures. We assess the amount of information preserved in these peripheral population responses at various spatiotemporal scales using several non-parametric classification methods. We find that retaining the spatial structure of the whole-hand population responses is important for encoding touch gestures while conserving the temporal structure becomes more consequential for gesture representation in the responses of PCs located in the palm. In addition, preserving spatial structure is more beneficial for capturing gestures involving single rather than multiple digits. This work contributes to further understanding the sense of touch by introducing novel measurement-driven computational methods for analyzing the population-level neural representations of natural touch gestures over multiple spatiotemporal scales.

A Wearable System Integrating Force Myography and Skin Stretch Feedback toward Force Skill Learning
Arata Horie, Yunao Zheng, and Masahiko Inami
(The University of Tokyo, Japan)
We propose a wearable system integrating force myography (FMG) and skin stretch feedback for force skill learning, and describe its basic evaluation. Motion learning systems are expected to improve the efficacy of learning sports, rehabilitation, and vocational skills. Force coordination skills in particular often require long-term experience to master because visual information may not be available. The system proposed in this study integrates body motion estimation by FMG and skin stretch haptic feedback to provide feedback cues from the current force output toward the target force output. After evaluating the performance of the system in terms of sensing and haptic presentation, an initial user study was conducted to assess the feasibility of force guidance in practice. The system was found to present force intensity relative to varied static target forces, indicating its efficacy. Based on the results of the experiments, we discuss future issues and potential applications.

Naturalistic Vibrotactile Feedback Could Facilitate Telerobotic Assembly on Construction Sites
Yijie Gong, Bernard Javot, Anja Patricia Regina Lauer, Oliver Sawodny, and Katherine J. Kuchenbecker
(Max Planck Institute for Intelligent Systems, Germany; University of Stuttgart, Germany)
Telerobotics is regularly used on construction sites to build large structures efficiently. A human operator remotely controls the construction robot under direct visual feedback, but visibility is often poor. Future construction robots that move autonomously will also require operator monitoring. Thus, we designed a wireless haptic feedback system to provide the operator with task-relevant mechanical information from a construction robot in real time. Our AiroTouch system uses an accelerometer to measure the robot end-effector's vibrations and uses off-the-shelf audio equipment and a voice-coil actuator to display them to the user with high fidelity. A study was conducted to evaluate how this type of naturalistic vibration feedback affects the observer's understanding of telerobotic assembly on a real construction site. Seven adults without construction experience observed a mix of manual and autonomous assembly processes both with and without naturalistic vibrotactile feedback. Qualitative analysis of their survey responses and interviews indicated that all participants had positive responses to this technology and believed it would be beneficial for construction activities.

Implementation and Evaluation of a Vibrotactile Assisted Monitoring and Correction System for Partial Weight-Bearing in Lower Extremities
Øystein Bjelland, William Gulliksen, Arkadiusz Damian Kwiatkowski, Martin Skavø, Håkon Isern, Mohammadamin Shayestehpour, Martin Steinert, Alf-Inge Hellevik, and Robin T. Bye
(Norwegian University of Science and Technology, Norway; NTNU TrollLABS, Norway; Ålesund General Hospital, Norway)
Accurate partial weight bearing in the foot during rehabilitation from musculoskeletal injuries in the lower extremities is important to ensure successful recovery. However, it is difficult for patients to know how much to load, and many are therefore reluctant to load at all. This study presents a novel footwear concept with strain-gauge-based force sensors and vibrotactile feedback that enables accurate foot loading and partial weight-bearing monitoring during rehabilitation from musculoskeletal injuries in the lower extremities. Four force sensors integrated in the foot-sole of a sandal measure ground reaction force, and two eccentric rotating mass motors provide vibrotactile feedback to the user when a predetermined force threshold is reached. Partial weight-bearing data from the force sensors are transferred wirelessly over Wi-Fi to a remote patient monitoring dashboard, enabling decision support for health personnel. We demonstrate the use of the prototype to reduce overloading in partial weight bearing in a validation experiment (N = 16). Findings showed that the prototype significantly reduced overloading in the right foot for healthy adults (p < 0.05), which indicates that closed-loop force-sensing footwear with vibrotactile feedback can be useful in aiding patients with musculoskeletal injuries in the lower extremities during rehabilitation.

Towards Differential Magnetic Force Sensing for Ultrasound Teleoperation
David Black, Amir Hossein Hadi Hosseinabadi, Nicholas Rangga Pradnyawira, Maxime Pol, Mika Nogami, and Tim Salcudean
(University of British Columbia, Canada; École Polytechnique, France)
Low-profile, low-cost force/torque sensing is important in many applications. It can enable haptic feedback, performance evaluation, training, data collection, and teleoperation of ultrasound procedures. In this paper we introduce a new concept of differential magnetic field based multi-axis force sensing. A magnet is separated from two adjacent Hall effect sensors by a flexible suspension. The differential signal from the two sensors allows precise deflection measurement, and combining several of these on a compliant structure enables multi-axis force sensing. The concept is motivated, described, simulated, and tested. In initial experiments, the best-case deflection resolution is found to be 856 nm, with full-scale range of 1.5 mm and a root-mean-square force/torque error of 10.37% compared to an off-the-shelf sensor. This paper demonstrates the feasibility and potential of this force sensing mechanism.

Interpersonal Vibrotactile Phantom Sensation between Hands via Actuated Bracelets
Kenta Ebina and Taku Hachisu
(University of Tsukuba, Japan)
Multiplayer video games are interactive electronic systems that mainly consist of controllers and monitors, in which two or more players engage in social interactions by competing or cooperating with one another. The use of interpersonal touch interaction as a controller can provide an opportunity to offer social behavior with minimal social distance and facilitate prosocial behavior. In this study, an experiment was conducted to demonstrate interpersonal vibrotactile phantom sensations between two parties holding hands via actuated bracelets. We adopted hands as the medium for vibration propagation and the illusory tactile sensation (phantom sensation) to localize the sensation. The amplitude ratio, vibration frequency, and actuator position were employed as explanatory variables, whereas the perceived location was used as the response variable. The results reveal that the perceived locations varied linearly according to the amplitude ratio, which demonstrates the effect of the interpersonal phantom sensation. Furthermore, the frequency of the vibration affected the linearity between the amplitude ratio and perceived location, whereas the actuator position had no effect. Thus, the use of the phantom sensation may be a promising approach for enhanced interpersonal touch interactions and improved immersion of game players.

Enabling Physical Interaction through the Wrist-Mounted Haptic Controller with Force Feedback
Minjae Jo, DongKyu Kwak, and Sang Ho Yoon
(Korea Advanced Institute of Science and Technology, South Korea)
For immersive Virtual Reality (VR), the need for effective and precise haptic feedback is increasing. This paper proposes a wrist-mounted haptic controller that provides direct feedback to the palm for rendering various physical properties. We conducted various experiments that demonstrated the effectiveness and accuracy of our device. First, we found that our device renders negligible force (0.02 N) to the user in the free-hand situation. Second, we confirm that our device renders physical properties with high accuracy (within 6%) using a force-sensing resistor (FSR). Finally, we present the minimum force feedback threshold and difference threshold of the proposed device. From the results of perceptual experiments, we suggest a design guideline for using the proposed prototype.

Transparent, High-Force, and High-Stiffness Control of Haptic Actuators with Backlash
Patrick Dills and Michael Zinn
(University of Wisconsin-Madison, USA)
Haptic actuators employing speed reductions display desirable increased force capability but have difficulty producing feelings of free space motion due to friction and inertia magnification implicit to actuator dynamics. This work describes a control topology that enables geared haptic actuators to produce highly transparent free space motion when combined with backlash nonlinearities. While the presence of backlash enables the proposed free space motion control, it is also a source of instability, limit cycles, and to some extent rendering distortion. We introduce a smoothed gain scheduling function to mitigate limit cycling and expand the range of stable impedances that can be rendered. The introduction of a design metric called the free space envelope provides a framework to evaluate the effectiveness of the free space controller. Together these two control approaches enable transparent free space, high-force, and stable haptic interactions in systems with backlash, a characteristic common in many speed reducers.

Information Transfer of Full-Body Vibrotactile Stimuli: An Initial Study with One to Three Sequential Vibrations
Jaejun Park, Junwoo Kim, Chaeyong Park, Sangyoon Han, Junseok Park, and Seungmoon Choi
(Pohang University of Science and Technology (POSTECH), South Korea; ETRI, South Korea)
Full-body vibrotactile stimuli are a promising means by which haptic interaction designers can substantially increase information transfer (IT). This paper reports the IT values of full-body sequential vibrotactile stimuli applied to ten body sites. We designed 10, 100, and 1,000 stimuli by increasing the number of vibrations in a sequence from one to three and estimated IT for each condition. We also obtained the lower and upper bounds of the IT estimates using several empirical methods to compensate for their biases. The IT estimates for the three conditions were 3.32, 6.55, and 8.05 bits, respectively, the last being the highest IT ever reported in the haptics literature. This study is the first attempt to quantify the IT of full-body vibrotactile sequences, providing guidelines for the design of effective full-body vibrotactile stimuli.

Perceptual Simultaneity Between Vibrotactile and Impact Stimuli
Chaeyong Park and Seungmoon Choi
(Pohang University of Science and Technology (POSTECH), South Korea)
Multimodal haptic rendering that simultaneously presents stimuli of multiple haptic modalities, such as vibration, impact, and thermal, has the potential to provide richer and more immersive haptic experiences than unimodal haptic rendering. For that, maintaining the synchrony between distinct haptic stimuli is crucial, as it significantly affects the overall quality of haptic sensations perceived by the user. In this paper, we investigate the perceptual sensitivity to the simultaneity between multimodal haptic stimuli combining impact and vibration. We design a handheld multimodal haptic device and conduct a perceptual experiment to measure the thresholds of perceived simultaneity between the two modalities while varying the vibration frequency and duration. Our results show that the order of two tactile stimuli for simultaneity perception is affected by both vibration frequency and duration. Specific results about the effects of vibration frequency and duration provide useful insights and design guidelines for eliciting synchronous sensations in multimodal impact and vibration rendering.

The Tactile Distance Aftereffect Transfers to Roughness Perception
Michaela Jeschke, Knut Drewing, and Elena Azañón
(Justus-Liebig University, Germany; Otto-von-Guericke University, Germany; Leibniz Institute for Neurobiology, Germany)
Touch is susceptible to various aftereffects. Recent findings on tactile distance perception demonstrate that when an area of the body is repeatedly touched at two points separated by a given distance, subsequently presented smaller distances are perceived as smaller and larger distances as larger. Here we investigate whether adaptation to a tactile distance transfers to the perception of coarse textures’ roughness. Additionally, we examine whether this transfer is orientation-specific, which is typical for low-level aftereffects. On each trial, the tip of the left index finger was adapted either 1) to a tactile two-point distance of 4 mm applied along the length of the finger, 2) the same distance applied across the width of the finger or 3) to single indentations. After adaptation to a two-point distance, participants systematically perceived subsequently presented gratings with smaller groove distances as being less rough—when the orientation of the adapted distance matched that of the texture. This reflects an aftereffect transfer for the orientation-congruent condition only. The results suggest that the processing of distance between two points on the skin is involved in the computation of texture, and that texture is a basic somatosensory feature computed at relatively early stages of sensory processing.

A Magnetic Soft Device for Tactile Haptic Actuation of the Fingertip
Sarah Costrell, Mahirah Alam, Roberta L. Klatzky, Michael E. McHenry, Lynn M. Walker, and Melisa Orta Martinez
(Carnegie Mellon University, USA)
In this work, we introduce a novel haptic device composed of a wearable fingertip sheath, fabricated using an oleogel loaded with magnetic particles, and an external electromagnet. The sheath is actuated using the external magnetic field provided by the electromagnet, which is equipped with a field-focusing pole piece. The oleogel composite used in this device has been optimized for the transfer of the magnetic force from the material to the skin to provide perceptible forces to the wearer. We compare our composite to composites created with materials commonly used in the literature and find the force transfer from our material, as measured by a force sensor, to be much greater when actuated under the same range of input voltages to the electromagnet. We also present a psychophysical user study that shows a linear relationship between this range of input voltages and perceptual magnitude. This result indicates that the device provides a range of tactile feedback that can be driven to a desired intensity of sensation through proportional voltage control.

Easy-to-Recognize Bump Shapes Using Only Lateral Force Cues for Real and Virtual Surfaces
Mirai Azechi and Shogo Okamoto
(Tokyo Metropolitan University, Japan)
Friction-variable surface texture displays can present macroscopic bumps and dents on flat touch panels. In this study, different shapes of bumps and dents were identified when only frictional or lateral resistance forces were available as cues. Shapes that are easy to recognize were then investigated. The tested surfaces varied in height and width, whereas their maximum gradients were the same; those with greater width exhibited greater height or depth. We experimented with these surfaces under a virtual condition using an electrostatic friction display (Experiment 1) and a real condition where actual bumps and dents were explored by way of a lateral force presenter (Experiment 2). The experimental results were not fully consistent between the virtual and real conditions. In the virtual condition, the bumps and dents with moderate height/depth and width were most likely to be recognized as bumps and dents. This suggests that adjusting the height and width of bumps and dents increases their perceptual clarity when the maximum applicable voltages are limited by safety regulations. In the real condition, we did not find significant differences among the different bump shapes. Investigating the incongruency between the virtual and real conditions will lead to further understanding of haptic perception of macroscopic surface shapes.

Enhancing Perceived Resistance and Propulsion by Combining Pseudo-haptics and Pulling Illusion
Tomohiro Kawagishi, Yuki Ban, Yusuke Ujitoko, and Shinichi Warisawa
(University of Tokyo, Japan; NTT Communication Science Laboratories, Japan)
Because haptic presentation using force-feedback devices has limitations for everyday use in terms of hardware size, cost, and installation requirements, pseudo-haptics, which can present a pseudo-feeling of touch, has attracted attention. However, the magnitude of the force that pseudo-haptics can make humans perceive is limited. Therefore, in this study, we propose a method that combines pseudo-haptics and the pulling illusion induced by asymmetric vibration to enhance the pseudo-force sensation that can be presented without a force-feedback device. To investigate the effectiveness of this method, we quantitatively and separately evaluated the magnitude of resistive and propulsive forces perceived by participants. In the case of resistive force presentation, we found that participants perceived stronger resistive forces when forces were presented by both pseudo-haptics and the pulling illusion than when only one of them was presented. However, regarding propulsive force presentation, the combination of pseudo-haptics and asymmetric vibration did not produce a stronger perceived propulsive force than asymmetric vibration alone. The results indicate that combining the pulling illusion and pseudo-haptics is effective in some cases.

Concurrent Haptic, Audio, and Visual Data Set During Bare Finger Interaction with Textured Surfaces
Alexis William Marcel Devillard, Aruna Ramasamy, Damien Faux, Vincent Hayward, and Etienne Burdet
(Imperial College London, UK; École Normale Supérieure, France; Actronika SAS, France; Institut des Systèmes Intelligents et de Robotique, France)
Perceptual processes are frequently multi-modal. This is the case for haptic perception. Data sets of visual and haptic sensory signals have been compiled in the past, especially for the exploration of textured surfaces. These data sets were intended to be used in natural and artificial perception studies and to provide training data for machine learning research. They were typically acquired with rigid probes or artificial robotic fingers. Here, we collected visual, auditory, and haptic signals acquired while a human finger explored textured surfaces. We assessed the data set via machine learning classification techniques. Interestingly, multi-modal classification performance could reach 97%, whereas haptic-only classification was around 80%.

Eyes-Free Fingertip Guidance Based on Tactile Cues, an Extension of the Steering Law
Quentin Agobert, Corentin Bernard, Balthazar Potet, and Nicolas Huloux
(Aflokkat, France; Aix Marseille Univ, CNRS, PRISM, France)
The use of a modern human-machine interface involves a large number of possible interactions. To allow users to navigate through a large number of available operations, interface designers often use drop-down menus that offer many options in a constrained area. These menus perform well for selecting quickly among a large number of choices; however, they require high visual attention, which is not always possible for the user. Here, we investigate whether one can navigate through paths made of orthogonal tunnels, simulating drop-down menus, relying only on tactile cues on a haptic touchscreen. We found that subjects were able to follow the path with a success rate of ∼90% for 1 tunnel, which decreased linearly to ∼40% for 5 tunnels. Four types of haptic feedback were tested and showed no major differences in terms of success rate; nevertheless, participants were slightly faster with slipping-path feedback. The user trajectories presented robust regularities that could be well described by the steering law model. Hence, we propose a novel definition of path difficulty for non-visual conditions based on path width, length, and number of orthogonal tunnels. These findings pave the way toward eyes-free guidance on surface haptic interfaces.

Haptic Rendering of Dynamic Hand Interaction for an Impedance-Controlled Glove
Qianqian Tong, Weipeng Shen, Dangxiao Wang, and Miguel A. Otaduy
(Peng Cheng Laboratory, China; Universidad Rey Juan Carlos, Spain; Beihang University, China)
Novel haptic gloves open the door to haptic simulation of virtual interactions directly with our hands. However, rendering of dynamic hand interactions poses many challenges, due to the many degrees of freedom of hand simulations, and the undersensed and underactuated nature of haptic gloves. In this work, we propose a haptic rendering method of dynamic hand interaction for an impedance-controlled undersensed and underactuated haptic glove, following an optimization strategy. In addition, to improve rendering transparency, we design a compensation strategy for the lag between hand tracking and hand simulation. We show that our haptic rendering algorithm is effective for dynamic sliding, rolling and grasping of virtual objects using a Dexmo glove, and we demonstrate its superiority over previous work.

Design and Evaluation of a Multimodal Haptic Vest
Bora Celebi, Müge Cavdan, and Knut Drewing
(Justus Liebig University, Germany)

Temporal Detection Threshold of Audio-Tactile Delays under Conditions of Active Touch with and without a Visual Cue
Detjon Brahimaj, Giulia Esposito, Arthur Courtin, André Mouraux, Frédéric Giraud, Betty Semail, and Olivier Collignon
(University of Lille, France; Université catholique de Louvain, Belgium)
While much research has been conducted on multisensory interactions in passive touch, research on how active touch influences how the senses interact remains scarce. Using a haptic surface based on ultrasonic vibrations, we investigated the perception of synchronization of audio-tactile stimuli in active touch. Tactile stimuli were delivered upon sliding the finger, and auditory stimuli followed with a delay ranging from 0 to 700 ms. In this simultaneity judgment task, two visual conditions were employed: (i) a visual cue showing the location of the active zone on the screen; (ii) a black picture on the screen. We also considered two sliding directions: (i) right-to-left (RTL); (ii) left-to-right (LTR).
We estimated the psychometric function (threshold and slope) of the ability to judge whether the auditory and tactile stimuli were temporally synchronous. We found thresholds of 201.26 ms (with visual cue) and 211.73 ms (without) for LTR, and 233.3 ms and 207.23 ms, respectively, for RTL. We translated temporal delays into distances (mm) using the finger sliding velocity measured for each trial. The results indicate that the simultaneity judgment was independent of sliding velocity.
Our results provide insights into participants’ sensitivity in perceiving simultaneous audio-tactile feedback generated during active touch exploration.

Realism of Visual, Auditory, and Haptic Cues in Phenomenal Causality
Elyse D. Z. Chase, Tobias Gerstenberg, and Sean Follmer
(Stanford University, USA)
Interacting in real environments, such as manipulating objects, involves multisensory information. However, little is known about how multisensory cue characteristics help us determine what has occurred in a scene, including whether two events were causally linked. In virtual environments, the number of sensory modalities present and the levels of realism often vary. In this work, we explore what role multisensory information and physical realism play in people's causal perception. Haptic cues in particular have rarely been studied in causal perception. Here, we combined visual, auditory, and haptic cues in a psychophysical study in which participants were asked to judge whether one billiard ball caused another to move. We manipulated the temporal delay between the cause and effect events, and the physical realism of each cue. While temporal delays generally decreased causal judgments, the number of multisensory cues and their physical realism increased causal judgments. We highlight the implications of this work for building immersive environments.

3D Shape Presentation by Combination of Force Feedback and Electro-tactile Stimulation
Yui Suga, Masahiro Miyakami, Izumi Mizoguchi, and Hiroyuki Kajimoto
(University of Electro-Communications, Japan)
The rapid and precise understanding of 3D objects in virtual reality environments is crucial for proficient manipulation of virtual objects. Relying solely on a force feedback device generally falls short of conveying intricate shapes, such as the edges of 3D objects, so it must be supplemented with appropriate cutaneous sensory inputs. Electro-tactile stimulation, owing to its compact and lightweight design, has the potential to provide high-resolution cutaneous sensory inputs and could be a viable method for presenting intricate shapes when incorporated with a force feedback device. In this research, we devised a system that concurrently presents cutaneous inputs along the object's edge through electrical stimulation and reactive force from the object through a force feedback device, and evaluated its impact on 3D shape perception under three scenarios: force feedback alone, cutaneous feedback alone, and combined sensory presentation. Results from experiments on identifying four types of column shapes in single-finger contact and two-finger grasping indicate that the combined presentation of force and electro-tactile sensation significantly shortens shape differentiation time and facilitates more efficient recognition of 3D objects.

Spatially Continuous Non-Contact Cold Sensation Presentation Based on Low-Temperature Airflows
Koyo Makino, Jiayi Xu, Akiko Kaneko, Naoto Ienaga, and Yoshihiro Kuroda
(University of Tsukuba, Japan)
Our perception of cold enriches our understanding of the world and allows us to interact with it. The presentation of cold sensations can therefore improve the sense of immersion and presence in virtual reality and the metaverse. This study proposes a novel method for spatially continuous cold sensation presentation based on low-temperature airflows. We define the shortest distance between two airflows perceived as distinct cold stimuli as the local cold stimulus group discrimination threshold (LCSGDT). By setting the distance between airflows within the LCSGDT, spatially continuous cold sensations can be achieved with an optimal number of cold airflows. We hypothesized that LCSGDTs are related to the heat-transfer capability of airflows and developed a model relating them. We investigated LCSGDTs at a flow rate of 25 L/min and presentation distances ranging from 10 to 50 mm. The results showed that under these conditions, the LCSGDT is 131.4 ± 1.9 mm, and the heat-transfer capacity of the airflow corresponding to this threshold is nearly constant at 0.92.

Optimized Time-Domain Control of Passive Haptic Teleoperation Systems for Multi-DoF Interaction
Gianni Bianchini, Davide Barcelli, Domenico Prattichizzo, and Claudio Pacchierotti
(University of Siena, Italy; CNRS, France)
This paper presents a time-domain passivity controller for multi-DoF haptic-enabled teleoperation systems aimed at improving performance in terms of transparency for a given task. By solving an online convex optimization problem, the proposed approach enhances transparency of interaction along specific directions of the environment space which are significant for the task at hand, while guaranteeing system stability. An experimental evaluation of the effectiveness of the proposed design is presented, enrolling twenty participants. We compared the performance of the proposed approach vs. those of a standard energy-bounding time-domain algorithm during the exploration of a virtual sphere. Results show that, as the communication delay between the local and remote agents grows, the proposed technique better preserves transparency along the directions that are more important for the task at hand.
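As background, the energy-bounding baseline the authors compare against builds on the standard time-domain passivity observer/controller. The 1-DoF sketch below is that textbook baseline under a power-into-the-port sign convention, not the paper's optimization-based multi-DoF method; the function name and thresholds are assumptions:

```python
# Minimal 1-DoF time-domain passivity observer/controller (sketch).
# This is the standard baseline, not the paper's convex-optimization
# method; tdpa_step and the 1e-9 velocity guard are assumptions.

def tdpa_step(energy, f, v, dt):
    """Update the energy observer and return (new_energy, damping).

    energy: energy accumulated so far (J); the port is passive while >= 0.
    f, v:   force (N) and velocity (m/s) at the haptic port.
    dt:     control period (s).
    """
    energy += f * v * dt                  # integrate port power one step
    if energy < 0.0 and abs(v) > 1e-9:    # activity detected: dissipate it
        damping = -energy / (v * v * dt)  # damper that restores passivity
        energy = 0.0
    else:
        damping = 0.0
    return energy, damping
```

Transparency suffers whenever damping is injected, which is why the paper instead distributes the correction along task-relevant directions via online convex optimization.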

An Exploration of Just Noticeable Differences in Mid-Air Haptics
Katarzyna Wojna, Orestis Georgiou, David Beattie, William Frier, Michael Wright, and Christof Lutteroth
(University of Bath, UK; Ultraleap, UK; Ultrahaptics, UK)
Mid-air haptic feedback technology produces tactile sensations that are felt without the need for physical interactions, wearables or controllers. When designing mid-air haptic stimuli, it is important that they are sufficiently different in terms of their perceived sensation. This paper presents the results of two user studies on mid-air haptic feedback technology, with a focus on the sensations of haptic strength and haptic roughness. More specifically, we used the acoustic pressure intensity and the rotation frequency of the mid-air haptic stimulus as proxies to the two sensations of interest and investigated their Just Noticeable Difference (JND) and Weber fractions. Our results indicate statistical significance in the JND for frequency, with a finer resolution compared to intensity. Moreover, correlations are observed in terms of participants’ sensitivity to small changes across the different stimuli presented. We conclude that frequency and intensity are mid-air haptic dimensions of depth 5 and 3, respectively, that we can use for the design of distinct stimuli that convey perceptually different tactile information to the user.
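The reported "depth" of a haptic dimension follows from stacking JND-sized steps across the usable stimulus range. A standard back-of-the-envelope relation is shown below; the symbols and the floor operation are illustrative, not the authors' exact analysis:

```latex
% Weber's law: the just noticeable change is proportional to intensity
\frac{\Delta I}{I} = k \quad \text{(Weber fraction)}
% Number of perceptually distinct levels across [I_{\min}, I_{\max}]:
n = \left\lfloor \frac{\ln\!\left(I_{\max}/I_{\min}\right)}{\ln(1 + k)} \right\rfloor
```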

Optical Measurements of the Skin Surface to Infer Bilateral Distinctions in Myofascial Tissue Stiffness
Anika Kao, Zackary Todd Landsman, M. Terry Loghmani, and Gregory John Gerling
(University of Virginia, USA; Indiana University, USA)
About half the U.S. adult population suffers from chronic neuromusculoskeletal pain. While its evaluation and treatment are widely addressed by therapies using soft tissue manipulation (STM), their efficacy is based upon clinician judgment. Robust biomarkers are needed to quantify the effects of STM on patient outcomes. Among noninvasive methods to quantify the mechanics of myofascial tissue, most are limited to small (<10 mm2), localized regions of interest. In contrast, we develop an approach that optically measures a larger (~100 cm2) field of deformation at the skin surface simultaneously. Biomarkers based on skin lateral mobility are derived to infer distinctions in myofascial tissue stiffness. Specifically, three cameras track ink speckles whose fields of deformation and stretch are resolved with digital image correlation. Their ability to differentiate bilateral distinctions of the cervicothoracic region is evaluated with four participants as a licensed clinician performs STM. The results indicate that the optically derived surface biomarkers can differentiate bilateral differences in skin mobility, with trend directions within a participant similar to measurements with an instrumented force probe. These findings preliminarily suggest that skin surface measurements are capable of inferring underlying myofascial tissue stiffness, although further confirmation will require a larger, more diverse group of participants.

Human-Delivered Brushstroke Characterization Using an Instrumented Brush Focused on Torque
Zackary Todd Landsman, Anika Kao, and Gregory John Gerling
(University of Virginia, USA)
Pleasant brush therapies may benefit those with autism, trauma, and anxiety. While studies monitor brushing velocity, hand-delivery of brush strokes introduces variability. Detailed measurements of human-delivered brushing physics may help understand such variability and its subsequent impact on receivers’ perceived pleasantness. Herein, we instrument a brush with multi-axis force and displacement sensors to measure the physics of strokes as 12 participants pleasantly stroke a receiver’s forearm. Algorithmic procedures identify skin contact and define four stages: arrival, stroke, departure, and airtime between strokes. Torque magnitude, rather than force, is evaluated as a metric to minimize inertial noise, as it registers brush bend and orientation. Overall, the results of the naturally delivered brushing experiments indicate force and velocity values in the range of 0.4 N and 3-10 cm/s, in alignment with prior work. However, we observe significant variance between brushers across velocity, force, torque, and brushstroke length. Upon further analysis, torque and force measures are correlated, yet torque provides distinct information from velocity. In evaluating the receiver’s response to individual differences between brushers in this preliminary case study, higher pleasantness is tied to lower mean torque and lower instantaneous variance over the stroke duration. Torque magnitude appears to complement velocity’s influence on perceived pleasantness.

Dynamic Feedback in Wave-Mediated Surface Haptics: A Modular Platform
Dustin Thomas Goetz, Gregory Reardon, Max Linnander, and Yon Visell
(University of California, Santa Barbara, USA)
Emerging surface haptic technologies exploit wave physics to create software-programmable two-dimensional haptic displays. However, designing such systems is challenging due to the complex dependence of wave propagation on the system's hardware arrangement, materials, and boundary conditions. We present a modular system for exploring design opportunities for wave-mediated haptic feedback via elastic surfaces. The system integrates an array of repositionable, custom electromagnetic actuators that excite shear waves which propagate in a reconfigurable elastic medium. We use optical vibrometry imaging to capture data that fully encode the transmission of waves in this system. We present methods that leverage the linearity of wave transport and the acquired data to efficiently implement and evaluate a variety of hardware configurations and software methods for displaying dynamic, spatially-resolved two-dimensional haptic feedback. These techniques can allow researchers to rapidly investigate methods for engineering software-programmable surface haptic displays based on wave excitation.

An Extended Virtual Proxy Haptic Algorithm for Dexterous Manipulation in Virtual Environments
Aldo Fabrizio Galvan, Job Donaldo Ramirez, Ashish Deshpande, and Ann Majewicz Fey
(University of Texas - Austin, USA)
With the evolution of hand-based haptic interfaces, novel forms of controlled force feedback arise allowing multi-point interaction between virtual objects and the hand's digits. With these advances, there must be an effective force display coupled with an intuitive visualization of the hand at its points of high manipulability. This is the basis for dexterous manipulation immersion in virtual environments. Still, there are challenges due to the complexity of force interaction, bandwidth limitations, and redundant visual configurations. In this paper, we present an extended proxy algorithm for digit-based interactions, which through configuration-based optimization, provides an efficient, robust, and visually plausible way to interact with virtual objects with a virtual hand. Additionally, we revisit a seldom-seen modality of haptic rendering, whole-hand kinesthetic feedback, with the Maestro exoskeleton in the implementation of our algorithm. We unify these methods and develop a CHAI3D module in a comprehensive visuo-haptic framework that was evaluated through demonstrations of joint-level haptic force data during interaction with static and dynamic objects alike. Our computationally-efficient approach sets the foundation for the visual display of in-hand virtual object manipulation with the effective rendering of stable haptic interactions under complex tasks.

Manipulation of Body Sway Interpretation through Kinesthetic Illusion Induced by Ankles Vibration
Eifu Narita, Shota Nakayama, Mitsuki Manabe, Keigo Ushiyama, Satoshi Tanaka, Izumi Mizoguchi, and Hiroyuki Kajimoto
(University of Electro-Communications, Japan)
Numerous studies have explored the body tilt and sway elicited by vibratory stimuli, which are thought to be related to reflex adjustments or kinesthetic illusions. However, prior studies have not thoroughly explored the conditions that change whether the sway is attributed to the self or to the environment. In the present study, a subjective body sway was induced through alternating vibrations applied to antagonist ankle muscles. Results indicated that a low switching frequency biased interpretation toward self-sway, while a high switching frequency favored interpretation of environmental sway.

A User Study of a Cable Haptic Interface with a Reconfigurable Structure
Bastien Jacques Étienne Poitrimol and Hiroshi Igarashi
(Tokyo Denki University, Japan)
Cable-based visuo-haptic interfaces usually adopt simple structures, either being wearable devices, like gloves or jackets, or grounded interfaces with fixed anchor points and many cables providing force feedback. In the latter type of interface, the force and visual feedback are often not co-located. While planar cable haptic interfaces with variable structures have been designed, the benefits of such structures have not been investigated. Likewise, co-located visual feedback using a head-mounted display (HMD) is usually discarded when using grounded cable interfaces. This paper presents a desktop-sized cable-driven parallel robot with a three-degree-of-freedom variable structure and six cables. Our objective is to evaluate the potential benefits or drawbacks of this structure and of using an HMD through user-based experiments. For this purpose, we designed a manipulation task, in which users push a sphere along a line, and a free exploration task. Participants' performance is evaluated in terms of trajectory accuracy and task execution time, and their experience is collected with a questionnaire. Results indicate that the variable configuration slightly reduces the perceived physical and cognitive load with minimal impact on performance, and that the HMD improves immersion and manipulation by providing stereoscopic cues.

The Effects of Movement Direction and Glove on Spatial Frequency Discrimination in Oriented Textures
Didem Katircilar and Knut Drewing
(Justus Liebig University Giessen, Germany)
Haptic perception is inherently active. People utilize different exploratory strategies that affect their perception. For example, people perceive small shapes more precisely when the finger explores them laterally rather than anteroposteriorly, and they adjust their exploratory direction in a corresponding task to increase perceptual performance. Here, we investigated how the prescribed movement direction of the finger affects texture perception and associated exploratory movements. Texture perception is based on spatial cues from static touch and temporal cues from active movement. We used stimuli that maximized the relevancy of movement-related temporal cues. The finger moved laterally or anteroposteriorly to the body, but always orthogonally to the texture orientation. In addition, one group of participants explored while wearing a glove that further reduced the availability of spatial cues; another group explored without a glove. Participants performed a two-interval forced choice task, choosing in each trial the stimulus with the higher spatial frequency. Participants applied higher force and stroked faster in the anteroposterior orientation than in the lateral orientation. Further, participants wearing gloves stroked the textures more slowly. Perceptual performance did not differ between conditions. We conclude that participants adapted their movement strategies to the respective exploratory constraints in ways that maintain good perception.

The Influence of Surface Roughness and Surface Size on Perceived Pleasantness
Lisa Pui Yee Lin, Müge Cavdan, Katja Doerschner, and Knut Drewing
(Justus Liebig University Giessen, Germany)
Objects’ material properties are essential not only in how we use and interact with them but also in eliciting affective responses when in contact with the body. Such affective experiences are of particular interest because they likely strongly impact our daily interactions with materials. We examined whether exploration time and surface size could influence affective responses to rough stimuli. Here, participants made pleasantness and arousal judgments after actively exploring sandpaper stimuli of different sizes with varying roughness levels under different time constraints. Findings confirm that increased surface roughness is associated with decreased perceived pleasantness; however, arousal did not systematically covary with roughness. We did not find an effect of exploration time on perceived pleasantness or arousal, but there were interactions between grit size and surface size. Overall, the direction of the effects of grit size on pleasantness was similar for both surface sizes. However, the slopes of increase in pleasantness relative to grit size varied depending on surface size. Effects on arousal were unrelated and small. We suggest that exploration time had little influence on the perceived magnitude of affective reactions to roughness. However, surface size may influence not only perceived roughness but also the perceived pleasantness of rough stimuli.

Active Haptic Exploration Based on Dual-Stage Perception for Object Recognition
Pakorn Uttayopas, Xiaoxiao Cheng, and Etienne Burdet
(Imperial College London, UK)
Haptic exploration in robotics is prone to sensing ambiguities. Actively selecting actions during exploration can provide crucial information to mitigate these ambiguities and improve object recognition. This study presents a dual-stage active haptic exploration technique that enables a robot to adapt its actions to optimise information acquisition for object recognition. In the initial stage of rough perception, the algorithm employs actions that maximise mutual information to swiftly identify the likely categories of an object. Subsequently, during the fine perception stage, it selects actions that maximise the Kullback-Leibler (KL) divergence between the most likely pair of ambiguous objects, thus facilitating their differentiation. To evaluate the performance of our algorithm, a robot with a sensorised finger collected tactile information from the interaction with ten objects using the primary actions of pressing, sliding, and tapping. In comparison with existing active exploration strategies that optimise a single information metric, our algorithm achieves superior recognition rates while requiring fewer exploration actions. By conducting only necessary comparisons between similar objects, it also reduces the computational cost. These results suggest that the proposed algorithm effectively diminishes ambiguities by adapting actions and enhancing the recognition outcomes in haptic exploration.
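The fine-perception stage described above can be sketched schematically: given a posterior over objects, pick the action whose observation models best separate the two most probable candidates by KL divergence. The action set, observation models, and symmetrized KL score below are illustrative assumptions, not the authors' implementation:

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """KL(p || q) for discrete distributions given as lists."""
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def select_fine_action(likelihoods, belief):
    """Pick the action maximizing KL divergence between the observation
    models of the two most probable (i.e., most ambiguous) objects.

    likelihoods[action][obj] -> discrete observation distribution (list)
    belief[obj]              -> current posterior probability
    """
    # two most likely objects under the current belief
    top2 = sorted(belief, key=belief.get, reverse=True)[:2]
    best_action, best_score = None, -1.0
    for action, models in likelihoods.items():
        p, q = models[top2[0]], models[top2[1]]
        score = kl_divergence(p, q) + kl_divergence(q, p)  # symmetrized
        if score > best_score:
            best_action, best_score = action, score
    return best_action
```

Because only the top pair is compared, the per-step cost stays constant in the number of candidate objects, which matches the paper's point about reduced computational cost.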

1-D Manual Tracing Based on a High Density Haptic Stimulation Grid: A Pilot Effort
Brendan Driscoll, Ming Liu, and Helen Huang
(North Carolina State University, USA)
Lower limb amputees lack the neurological pathways needed to perceive how their prosthetic limbs are interacting with the environment, leading to a lack of confidence in their devices and reduced balancing capabilities. Sensory substitution methods, such as vibrotactile and electrotactile feedback applied to unaffected body segments, offer a potential way to restore some of the lost information pathways. While high-resolution haptic stimulation grids have become commercially available, few studies have tried to use these devices to provide more intuitive sensory substitution. This study developed an encoding approach, based on the illusory “phantom actuator” phenomenon, to convey 1-D position information to a wearer through a bHaptics Tactsuit. Evaluating performance on a 1-D manual tracking task among 14 participants under the proposed approach and a traditional amplitude modulation approach, we demonstrated an improvement in velocity tracing accuracy (p=0.0375) with the proposed approach, although it did not lead to a significant improvement in position tracing accuracy.
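The phantom-actuator illusion is commonly rendered by splitting vibration energy between two adjacent actuators. The sketch below uses the well-known energy-model interpolation (A1 = sqrt(1-b)·A, A2 = sqrt(b)·A); the paper does not specify its exact encoding, so treat this as an assumed, illustrative implementation:

```python
import math

def phantom_amplitudes(x, positions, amp=1.0):
    """Map a continuous 1-D position x onto a line of discrete actuators
    using energy-model phantom-sensation interpolation. Illustrative,
    not the paper's exact encoding.

    positions: sorted actuator coordinates along the line.
    Returns one amplitude per actuator; at most two are non-zero.
    """
    out = [0.0] * len(positions)
    x = max(positions[0], min(positions[-1], x))  # clamp to actuator span
    for i in range(len(positions) - 1):
        lo, hi = positions[i], positions[i + 1]
        if lo <= x <= hi:
            b = (x - lo) / (hi - lo)              # normalized position
            out[i] = math.sqrt(1.0 - b) * amp     # energy-preserving split
            out[i + 1] = math.sqrt(b) * amp
            break
    return out
```

The square-root weighting keeps the summed vibration energy roughly constant as the phantom sweeps between actuators, which is what makes the illusory point feel like it moves smoothly.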

Shaping Human Movement via Bimanually-Dependent Haptic Force Feedback
Jacob R. Boehm, Ann Majewicz Fey, and Nicholas P. Fey
(The University of Texas at Austin, USA)
Haptic feedback can enhance training and performance of human operators; however, the design of haptic feedback for bimanual coordination in robot-assisted tasks (e.g., control of surgical robots) remains an open problem. In this study, we present four bimanually-dependent haptic force feedback conditions aimed at shaping bimanual movement according to geometric characteristics: the number of targets, direction, and symmetry. Haptic conditions include a virtual spring, damper, combination spring-damper, and dual springs placed between the hands. We evaluate the effects of these haptic conditions on trajectory shape, smoothness, and speed. We hypothesized that for subjects who perform worse with no haptic feedback (1) a spring will improve the shape of parallel trajectories, (2) a damper will improve the shape of point symmetric trajectories, (3) dual springs will improve the shape of trajectories with one target, and (4) a damper will improve smoothness for all trajectories. Hypotheses (1) and (2) were statistically supported at the p<0.001 level, but hypotheses (3) and (4) were not supported. Moreover, bimanually-dependent haptic feedback tended to improve shape accuracy for movements that subjects performed worse on under no haptic condition. Thus, bimanual haptic feedback based on geometric trajectory characteristics shows promise to improve performance in robot-assisted motor tasks.

Identifying Human Grasp Properties During Robot-to-Human Handovers
Paul Pacaud, Etienne Chassaing, Yilin Cai, Connor Yako, and Kenneth Salisbury
(Stanford University, USA; CentraleSupélec-Paris Saclay, France; Carnegie Mellon University, USA)
One of the most important challenges in Human-Robot Interaction (HRI) is the perception of the human state. When a robot physically engages with a human, such as during physical interaction and assistance, it is vital that the robot perceives as much information as possible about the human to properly guide its behavior. We examined the specific case of a robot handing a rigid object to a human and used only the robot’s force and motion sensors to determine when the human’s grasp was secure enough for the robot to safely release the object. From biomechanical reasoning, we assumed that safer grasps are stiffer grasps. We commanded our robot to impose small motions on the object being passed, measured the resulting force changes, and used system identification techniques to measure and visualize changes in the human's grasp stiffness. When subjects were instructed to grasp the object being passed more tightly, our robot could reliably detect increases in several measures of the multi-dimensional stiffness of the human's grasp. Our technique also enabled us to measure the increase in damping with tighter grasps. This preliminary work demonstrates promising value for active haptic perception during physical HRI.

Vibrotactile Display of Distance Information in a Virtual Object Exploration Task
Johannes Rueschen and Hong Z. Tan
(Purdue University, USA)
The present study investigates how different mapping strategies of distance information affect speed and accuracy in an object exploration task with a virtual teleoperated robot. The task was to locate a potentially explosive object inside a backpack using a virtual robotic gripper. A virtual proximity sensor tracked the distance between the tip of the gripper and the object surface. The distance information was conveyed through the amplitude and/or frequency variations of a vibration on the participant's index finger. The participants were instructed to locate the visually-occluded object by moving the tip of the gripper as quickly and as closely towards the object as possible without touching it. The change in distance was mapped to frequency only, intensity only, or both. The results indicate that the mapping strategy affects the accuracy (remaining distance between the gripper and object surface) but not the task completion time. Anecdotal feedback from the participants confirmed our design strategy that linear mappings provide information on the rate of approach, while non-linear mappings emphasize cues at short distances. In addition, experienced participants can selectively attend to and integrate co-varying frequency and intensity cues, and inexperienced participants prefer single parameter changes.
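The linear versus non-linear mappings contrasted above can be made concrete with two candidate distance-to-intensity functions. The specific functional forms and the decay constant k are assumptions for illustration, not the study's exact parameters:

```python
import math

def linear_map(d, d_max, v_min=0.0, v_max=1.0):
    """Linear mapping: constant resolution over the whole range, so a
    constant approach speed produces a constant rate of change."""
    d = max(0.0, min(d_max, d))
    return v_max - (v_max - v_min) * (d / d_max)

def exponential_map(d, d_max, k=4.0, v_min=0.0, v_max=1.0):
    """Non-linear (exponential) mapping: most of the dynamic range is
    concentrated near the object, emphasizing short-distance cues.
    The decay constant k is an assumed value."""
    d = max(0.0, min(d_max, d))
    return v_min + (v_max - v_min) * math.exp(-k * d / d_max)
```

Either output can then drive vibration frequency, intensity, or both, mirroring the three mapping conditions compared in the study.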

CatBoost for Haptic Modeling of Homogeneous Viscoelastic Deformable Objects
Gautam Kumar, Shashi Prakash, and Amit Bhardwaj
(Indian Institute of Technology Jodhpur, India; IIT Jodhpur, India)
This paper proposes an alternative data-driven haptic modeling method for homogeneous deformable objects based on CatBoost, a variant of the gradient boosting machine learning approach. In this approach, decision trees are trained sequentially to learn the mapping function required to model the objects. The model is trained on input feature vectors consisting of position, velocity, and filtered velocity samples to estimate the response force. Our approach is validated with a publicly available two-finger grasping dataset. The proposed approach can model unknown interactions with good accuracy (relative root mean squared error, absolute relative error, and maximum error less than 0.06, 0.18, and 0.76 N, respectively) when trained on just 20% of the training data. The CatBoost-based method outperforms existing data-driven methods in both prediction accuracy and modeling time when trained on training data of similar size.
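The input representation described above (position, velocity, filtered velocity per sample) can be sketched as follows. The first-order low-pass filter and its smoothing factor alpha are assumptions, since the paper's exact filter is not given here; the resulting rows would feed a regressor such as CatBoost's `CatBoostRegressor`:

```python
def build_features(positions, dt, alpha=0.2):
    """Assemble per-sample feature vectors [position, velocity,
    filtered velocity] for force regression. The exponential
    smoothing filter and alpha=0.2 are illustrative assumptions.

    positions: sampled tool/finger positions (m).
    dt:        sampling period (s).
    """
    feats, v_filt = [], 0.0
    for i, p in enumerate(positions):
        v = (p - positions[i - 1]) / dt if i > 0 else 0.0  # finite difference
        v_filt = alpha * v + (1.0 - alpha) * v_filt        # low-pass filtered
        feats.append([p, v, v_filt])
    return feats
```

Each row, paired with the measured response force as the target, forms one training sample for the gradient-boosted tree model.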

The Impact of Haptic Feedback During Sudden, Rapid Virtual Interactions
Nizamettin Taha Tanacar, Moaaz Hudhud Mughrabi, Anil Ufuk Batmaz, Daniele Leonardis, and Mine Sarac
(Kadir Has University, Turkey; Concordia University, Canada; Scuola Superiore Sant'Anna, Italy)
Haptic feedback is known to improve the realism and the performance of virtual tasks during manipulation or teleoperation. However, these benefits might depend on the nature of the virtual task or the intensity of haptic rendering. In this paper, we focus on the impact of the presence and intensity of the haptic stimulus during sudden, rapid virtual interactions, through a variation of an ISO 9241-411 task rather than calm, exploration-based interactions. We conducted a user study in which the haptic stimulus is rendered through a realistic 1-DoF fingertip haptic device at different intensity levels (full-strength, half-strength, and no-strength) while participants choose highlighted targets on a 6-by-5 grid as quickly and correctly as possible. Our results show that haptic feedback did not significantly affect user performance regarding time, throughput, or the nature of the selection behavior. However, participants made significantly more errors when haptic feedback was present at half-strength compared to the full-strength and no-strength conditions. In the post-experiment questionnaire, participants reported favoring full-strength haptic feedback in terms of perceived realism, enjoyment, and immersion.

Wireless Dual Mode Haptic Thimble based on Magnetoactive Rubber
Yong Hae Heo, Mohammad Shadman Hashem, Gyubin An, Hyun-Jeong Kim, Dong-Soo Choi, Seokhee Jeon, and Sang-Youn Kim
(Korea University of Technology and Education, South Korea; Kyung Hee University, South Korea; Kumoh National Institute of Technology, South Korea)
This paper proposes a wireless dual-mode haptic thimble based on magneto-active rubber (MAR) that provides kinesthetic and vibrotactile feedback without any electrical wiring. When an external magnetic field is applied to the MAR, ferromagnetic particles in the MAR polarize and attract each other, causing a magnetic attraction force between the MAR and the external magnetic field generator (e.g., a solenoid). By controlling the solenoid, we can easily provide both force and vibration to a user wearing the MAR thimble. To improve wearability and maximize haptic performance, the best ratio of ingredients for fabricating the MAR thimble was determined through extensive experiments, and the haptic behavior of the fabricated thimble was quantitatively investigated. The experimental results show that the haptic thimble can generate a sufficient magnitude of force or vibration to stimulate the user’s finger for various haptic effects. Moreover, a transparent force rendering algorithm is proposed to compensate for the non-linear force response. The potential of the system is further demonstrated in a distinctive virtual reality-based interaction scenario.

Portable Self-propelled Force Feedback Device
Ayaka Fukasawa, Riho Taniguchi, Takumi Sato, and Shoichi Hasegawa
(Tokyo Institute of Technology, Japan)
One purpose of force feedback devices for virtual reality is to prevent the user's body from penetrating virtual objects and/or to facilitate object manipulation. However, because such devices are fixed, their workspace is confined to their movable range. To address this limitation, we propose a lightweight, portable device that is grounded and capable of presenting force without workspace constraints. To evaluate the performance of the proposed device, we measured hand trajectories and presented forces while rendering objects in the shape of a wall, a table, and a sphere.

Human Recognition Performance of Simple Spatial Vibrotactile Patterns on the Torso
Junwoo Kim, Heeyeon Kim, Chaeyong Park, and Seungmoon Choi
(Pohang University of Science and Technology (POSTECH), South Korea)
This paper presents an experimental study on the fundamental human recognition performance of spatial tactile patterns applied on the torso. The patterns are made by either activating or deactivating four vibration actuators fastened to the body surface in a rectangular shape. Three body areas are considered for stimulation: the torso's front (chest and stomach), back, and waist. A perceptual experiment shows that the human recognition accuracy of the spatial tactile patterns is high on the torso, over 91%. The accuracy depends on the stimulated body area and the number of simultaneously activated vibrations. In addition, the estimated information transfer ranges from 3.48 bits to 3.66 bits, indicating that the torso can transmit 11--12 spatial patterns without errors. These results can contribute to designing spatial tactile patterns that are effective for recently emerging haptic vests.
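The information-transfer estimate quoted above (3.48 to 3.66 bits, i.e., 11-12 error-free patterns) follows from the standard mutual-information computation over a stimulus-response confusion matrix. A minimal sketch, using a hypothetical perfect-recognition confusion matrix for the 16 patterns formable from four on/off actuators (not the study's data):

```python
import numpy as np

def information_transfer(confusion):
    """Estimate information transfer (bits) from a stimulus-response
    confusion matrix: IT = sum over (s, r) of p(s,r) * log2(p(s,r) / (p(s)p(r)))."""
    joint = confusion / confusion.sum()          # joint probabilities p(s, r)
    p_s = joint.sum(axis=1, keepdims=True)       # stimulus marginals
    p_r = joint.sum(axis=0, keepdims=True)       # response marginals
    nz = joint > 0                               # skip empty cells (0 * log 0 = 0)
    return float((joint[nz] * np.log2(joint[nz] / (p_s @ p_r)[nz])).sum())

# Perfect recognition of 16 equiprobable patterns yields 4 bits,
# i.e., 2**IT = 16 perfectly distinguishable patterns.
it = information_transfer(np.eye(16) * 10)
print(round(it, 2))  # 4.0
```

The study's 3.48-3.66 bit estimates correspond to 2**IT of roughly 11-12, which is where the "11-12 spatial patterns without errors" figure comes from.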

Mediated Social Touching: Haptic Feedback Affects Social Experience of Touch Initiators
Martin Maunsbach, Kasper Hornbæk, and Hasti Seifi
(University of Copenhagen, Denmark; Arizona State University, USA)
Mediated social touch enables us to share hugs, handshakes, and caresses at a distance. Past work has focused on the experience of being touched by a remote person, but the touch initiator's experience is underexplored. We ask whether a variation in haptic feedback can influence the touch initiator's social experience of the interaction. In a user study participants stroked a remote person's hand in virtual reality while feeling no haptic feedback, ultrasonic stimulation, or passive feedback from a silicone hand. In each condition, they rated the pleasantness of the interaction, the friendliness of the remote person, and their sense of co-presence. We also captured the velocity of their stroking and asked for reflections on the interaction and mediated social touch as a whole. The results show significant effects of haptic feedback on co-presence, pleasantness, and stroking velocity. The qualitative responses suggest that these results are due to the familiarity of the solid silicone hand, and the participants' assumption that when they felt feedback, the remote person felt similar feedback.

Effects on Perception when Removing One Frequency Component from Two Harmonic Vibrations
Keisuke Tozuka and Hiroshi Igarashi
(Tokyo Denki University, Japan)
Textures with two-peak vibration spectral profiles complicate the digital filters necessary to reproduce their specificities. This also cascades into the complexity of texture reproduction models, which many researchers have been interested in simplifying. Thus, our work focused on the simplification of vibration spectral profiles. For this purpose, we needed to know which two-peak vibration spectral profiles could be reproduced using only one of the peaks without affecting texture perception. First, our main objective was to determine whether the higher or the lower frequency should be removed, as this had not been investigated in other works. The similarity of vibrations before and after the suppression of one peak decreased if the lower-frequency waves were removed, but not if the higher-frequency waves were removed. We then studied the influence on similarity reduction by varying the frequency and amplitude of these waves. Frequency had the greatest influence among the tested parameters, while the amplitude ratio between the lower- and higher-frequency waves also had a moderate influence. Lastly, we found that texture perception is not affected by removing the smaller peak if the frequencies of the two peaks are close and the amplitude ratio is large enough.
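The peak-removal manipulation studied here can be illustrated with a simple spectral edit. The frequencies and amplitudes below are hypothetical stand-ins, not the stimuli used in the study:

```python
import numpy as np

# Synthesize a two-peak vibration: a dominant low-frequency component plus
# a weaker high-frequency one (illustrative values).
fs = 10000                          # sample rate [Hz]
t = np.arange(0, 1.0, 1 / fs)
f_low, f_high = 80.0, 240.0         # hypothetical spectral peaks [Hz]
two_peak = 1.0 * np.sin(2 * np.pi * f_low * t) + 0.4 * np.sin(2 * np.pi * f_high * t)

# Suppress the higher peak in the spectrum -- the manipulation the study
# found least disruptive to perceived similarity.
spectrum = np.fft.rfft(two_peak)
freqs = np.fft.rfftfreq(len(two_peak), 1 / fs)
spectrum[np.abs(freqs - f_high) < 5.0] = 0.0
simplified = np.fft.irfft(spectrum, n=len(two_peak))

# What remains is essentially the low-frequency component alone.
residual = simplified - 1.0 * np.sin(2 * np.pi * f_low * t)
print(np.max(np.abs(residual)))
```

In practice the study asks the perceptual counterpart of this operation: whether `simplified` still feels like `two_peak`, which the results say holds when the peaks are close in frequency and the amplitude ratio is large.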

Flexos: A Portable, SEA-Based Shoulder Exoskeleton with Hyper-redundant Kinematics for Weight Lifting Assistance
Gianluca Rinaldi, Luca Tiseni, Michele Xiloyannis, Lorenzo Masia, Antonio Frisoli, and Domenico Chiaradia
(Sant'Anna School of Advanced Studies, Italy; Akina, Switzerland; Heidelberg University, Germany)
In this paper, we present Flexos, a fully portable shoulder exoskeleton developed for logistic and industrial tasks. The device features a simplified design with a single series-elastic actuated joint employing active torque control that assists the user in weight-lifting tasks. The device is lightweight and can be easily worn as a backpack. Ergonomically shaped human-robot interfaces were custom designed and 3D printed to achieve maximum comfort for the wearer. The exoskeleton has a hyper-redundant kinematic chain - a flexible link - connecting the backpack to the main series elastic actuator (SEA), which exerts torque on the upper arm. The SEA output torque is set by a closed-loop controller that compensates for the gravity torque due to the human arm and the weights lifted by the user. The device covers almost the entire shoulder range of motion: 89.2% of flexion/extension and 88.4% of internal/external rotation. Seven healthy participants were involved in a pilot study to assess the reduction of muscular activity (sEMG) in five targeted muscles during static and dynamic weight-lifting tasks. The assistance provided by the exoskeleton resulted in an average reduction of muscular effort of 31.6% in static and 24.6% in dynamic tasks.

Dynamic Pattern Recognition with Localised Surface Haptics and Apparent Motion
Mathilde Jeannin, Ayoub Ben Dhiab, Charles Hudin, and Sabrina Panëels
(LIX, France; Université Paris-Saclay, CEA, List, France)
Surface haptics is gaining increasing interest due to the tangibility it can add to standard touchscreen devices, through compliance effects, localised multitouch feedback, or textures. Dynamic illusions such as apparent motion provide further possibilities, for instance to indicate directions or other more complex motions. They could also help users feel the contours of a shape, which is particularly useful for accessible digital graphics. The confinement of vibrotactile stimuli is a recent method that not only provides localised multitouch feedback but also allows the whole hand to rest on the display. It thus has the potential to combine different interaction styles, from static to dynamic interaction and from a single finger to the whole hand. Yet, so far, there has been little work on tactile interactions, and particularly pattern recognition, with this method. Therefore, this paper presents a preliminary investigation into the feasibility of using confined vibrotactile stimuli to identify geometric shapes through apparent motion. The results show that participants could feel continuous movements and, with training, reached overall recognition rates of 72.1%.

Haptic Mushroom: A 3-DoF Shape-Changing Encounter-Type Haptic Device with Interchangeable End-Effectors
Lisheng Kuang, Francesco Chinello, Paolo Robuffo Giordano, Maud Marchal, and Claudio Pacchierotti
(CNRS, France; Aarhus University, Denmark; Univ. Rennes, INSA, IRISA, Inria, France; Institut Universitaire de France, France)
This paper presents the Haptic Mushroom, a grounded encounter-type haptic device with interchangeable end-effectors. It is composed of a three-leg parallel self-constrained mechanism connecting a lower static platform to an upper moving platform. The upper platform moves over the surface of a sphere centered on the lower one. The legs are attached to the moving platform through three joints, which are in turn driven through a spiral cam, finally actuating the chosen end-effector. As a representative example, we consider a soft end-effector, able to change its curvature, and a rigid origami-inspired end-effector, able to change its shape from flat to sharp. However, the device is designed to support a wide range of end-effectors. The paper describes the device and presents the kinematic characterisation of the two end-effectors. We carry out a perceptual experiment, enrolling 12 participants, evaluating the capability of the soft end-effector to render curvatures. Finally, we present a use case where the device is used as an encounter-type haptic interface during interactions with virtual objects.

Influence of Electrical Stimulation Intensity on the Perception of Piquancy
Masaki Ohno, Kazuma Aoyama, Tomohiro Amemiya, Hideaki Kuzuoka, Keigo Matsumoto, and Takuji Narumi
(University of Tokyo, Japan; Gunma University, Japan)
The interface designed to enhance the perceived piquancy is expected to overcome the trade-off problem between the positive and negative effects of consuming piquant substances on human health. Previous research has shown that anodal electrical stimulation of the tongue can enhance the perceived piquancy. However, a method for dynamically controlling the intensity of perceived piquancy is yet to be established. In this study, we focused on the relationship between the intensity of perceived piquancy and the amount of current generated via electrical stimulation. Psychophysical experiments were conducted to evaluate the relationship. The results demonstrate an enhancement in the intensity of the piquancy perception induced by chili peppers and wasabi through changes in the amount of current via electrical tongue stimulation.

Wearable Sensory Substitution for Proprioception via Deep Pressure
Sreela Kodali, Brian Vuong, Thomas Bulea, Alexander Chesler, Carsten Bönnemann, and Allison M. Okamura
(Stanford University, USA; National Institutes of Health, USA)
We propose a sensory substitution device that communicates one-degree-of-freedom proprioceptive feedback via deep pressure stimulation on the arm. The design is motivated by the need for a feedback modality detectable by individuals with a genetic condition known as PIEZO2 loss of function, which is characterized by absence of both proprioception and sense of light touch. We created a wearable and programmable prototype that applies up to 15 N of deep pressure stimulation to the forearm and includes an embedded force sensor. We conducted a study to evaluate the ability of participants without sensory impairment to control the position of a virtual arm to match a target angle communicated by deep pressure stimulation. A participant-specific calibration resulted in an average minimum detectable force of 0.41 N and maximum comfortable force of 6.42 N. We found that, after training, participants were able to significantly reduce angle error using the deep pressure haptic feedback compared to without it. Angle error increased only slightly with force, indicating that this sensory substitution method is a promising approach for individuals with PIEZO2 loss of function and other forms of sensory loss.

Controlling Human Perception of Haptic Profiles Using Contextual Cues
Derek Van Delden, Alison Jenkins, William Singhose, Franziska Schlagenhauf, and Kelly Dobson
(Georgia Tech, USA; Google, USA)
Haptics are commonly used to convey alerts or simple notifications in commercial applications. Expanding the use of haptics to more complex scenarios is a promising area of development. The haptic profiles in this study are customized patterns generated by linear resonant actuators (LRAs), intended to convey complex meanings to the test participants. The extent to which contextual cues determine participants' perception of the haptic profiles was investigated for various types of haptic profiles and contextual cues, including variations in specificity level and cue modality (picture, sound, video, etc.). The results indicate that participants associate meaning with haptic profiles based on the context in which they are presented, and that contextual cues can significantly impact the perception of haptics such that the cues effectively "control" human perception. These results lay the groundwork for controlling the perception of haptics using contextual cues.

Measurement of Rhythmical Movements in Street Dance for Quantifying Movement Timing Skills through Somatic Sensation
Ritsuko Kiso, Yuko Hashimoto, and Masashi Nakatani
(Keio University, Japan; Ochanomizu University, Japan)
This study aimed to measure the rhythms of three types of movements in street dance and to quantify differences depending on the level of proficiency. We developed an experimental paradigm to quantify the rhythms of street dancers' movements. Twenty street dancers (10 trained and 10 novices) participated in the experiment. The results suggested a difference between trained and novice dancers in the accuracy of maintaining consistent movement timing in the time and slow conditions. In addition, as in the HKB model of auditory-motor coordination, novices showed a phase transition phenomenon between the fast and slow conditions, while trained dancers maintained the coordination pattern. This suggests that street dance experts have mastered not only the skill of matching the rhythm of music accurately but also the skill of maintaining the rhythm of unstable movements that are out of sync with the timing of the sound. This experimental system and the quantified results can be used to quantitatively evaluate dancers' proficiency and to understand the learning status of body movements.

Physics Engine-Based Whole-Hand Haptic Rendering for Sensorimotor Neurorehabilitation
Raphael Rätz and Laura Marchal-Crespo
(University of Bern, Switzerland; Delft University of Technology, Netherlands)
Whole-hand haptic rendering could lead to more naturalistic and intuitive virtual hand-object interactions, which could be especially beneficial for applications such as sensorimotor robotic neurorehabilitation. However, the majority of previously proposed whole-hand haptic rendering algorithms rely on effortful custom implementations or are not suited to the grounded haptic devices often used in neurorehabilitation. We therefore propose a framework for whole-hand haptic rendering based on a readily available physics engine. We employ a bilateral position-position teleoperation scheme between a haptic rehabilitation device and a simulated hand avatar, with added exercise-specific haptic rendering. Moreover, in consideration of the needs of neurological patients, we introduce adaptive damping of the haptic device during hand-object interactions for increased stabilization of the patient's limb. We present first results on the feasibility of the proposed framework in a haptic rehabilitation exercise, and its practical application is currently being investigated in an ongoing clinical study.
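A bilateral position-position coupling with contact-adaptive damping of the kind described here can be sketched as below. The gains, function signature, and contact logic are illustrative assumptions for a one-dimensional case, not the authors' implementation:

```python
from dataclasses import dataclass

@dataclass
class CouplingGains:
    k: float = 500.0         # coupling stiffness [N/m] (assumed value)
    b_free: float = 2.0      # damping in free space [N*s/m] (assumed)
    b_contact: float = 12.0  # raised damping during contact (assumed)

def device_force(x_device, v_device, x_avatar, gains, in_contact):
    """Force commanded to the haptic device: a spring toward the simulated
    hand avatar's position, plus damping that is adaptively increased while
    the avatar touches a virtual object, stabilizing the patient's limb."""
    b = gains.b_contact if in_contact else gains.b_free
    return gains.k * (x_avatar - x_device) - b * v_device

g = CouplingGains()
# Avatar blocked by a virtual object 1 cm behind the device position while
# the device still moves forward at 0.1 m/s: the force pushes the hand back.
f = device_force(x_device=0.05, v_device=0.1, x_avatar=0.04, gains=g, in_contact=True)
print(f"commanded force: {f:.2f} N")  # about -6.2 N
```

In a position-position scheme the physics engine moves the avatar, the device tracks it through the spring term, and the damping term is the stabilization knob the abstract singles out for neurological patients.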

Exploring Human Response Times to Combinations of Audio, Haptic, and Visual Stimuli from a Mobile Device
Kyle T. Yoshida, Joel X. Kiernan, Allison M. Okamura, and Cara M. Nunez
(Stanford University, USA; Cornell University, USA)
Auditory, haptic, and visual stimuli provide alerts, notifications, and information for a wide variety of applications ranging from virtual reality to wearable and hand-held devices. Response times to these stimuli have been used to assess motor control and design human-computer interaction systems. In this study, we investigate human response times to 26 combinations of auditory, haptic, and visual stimuli at three levels (high, low, and off). We developed an iOS app that presents these stimuli in random intervals and records response times on an iPhone 11. We conducted a user study with 20 participants and found that response time decreased with more types and higher levels of stimuli. The low visual condition had the slowest mean response time (mean ± standard deviation, 528 ± 105 ms) and the condition with high levels of audio, haptic, and visual stimuli had the fastest mean response time (320 ± 43 ms). This work quantifies response times to multi-modal stimuli, identifies interactions between different stimuli types and levels, and introduces an app-based method that can be widely distributed to measure response time. Understanding preferences and response times for stimuli can provide insight into designing devices for human-machine interaction.

Determination of the Thermal-Tactile Simultaneity Window for Multisensory Cutaneous Displays
Takuya Jodai, Masahiko Terao, Lynette Jones, and Hsin-Ni Ho
(Kyushu University, Japan; Yamaguchi University, Japan; Massachusetts Institute of Technology, USA)
Multisensory cutaneous displays have been developed to enhance the realism of objects touched in virtual environments. However, when stimuli are presented concurrently, tactile stimuli can mask thermal perception, and so both modalities may not be available to convey information to the user. In this study, we aim to determine the simultaneity window using the Simultaneity Judgment Task. A device was created that could present both tactile and thermal stimuli to the thenar eminence of the participant’s left hand at various stimulus onset asynchronies (SOAs). The experimental results indicated that the simultaneity window width was 639 ms, ranging from -561 ms to 78 ms. The point of subjective simultaneity (PSS) was at -242 ms, indicating that participants perceived simultaneity best when the thermal stimulus preceded the tactile stimulus by 242 ms. These findings have implications for the design of stimulus presentation in multisensory cutaneous displays.

Humans Struggle to Perceive Congruence in Visuo-Haptic Textures
Jenna Fradin, Sinan D. Haliyo, and David Gueorguiev
(Institut des Systemes Intelligents et de Robotique, France; ISIR, France; CNRS, France; Sorbonne University, France)
Multisensory assessment of objects and textures is ubiquitous in human life. It has been shown that people can optimally integrate tactile and visual cues to extract sensory information. It is less clear, though, how the congruence between tactile and visual cues is perceived when people interact with everyday objects such as fabrics. In this study, we investigate human accuracy in detecting visuo-haptic discrepancies within common fabrics. Participants had to identify the congruent pair among two pairs of visuo-haptic textures, each consisting of a visual texture projected on a semi-mirror and a real texture underneath it, which could be touched but not seen. Overall, participants did not accurately detect congruent visuo-haptic pairs, and their performance correlated with the average similarity between the visual and tactile textures. Correlation analysis suggests a possible role of the coefficient of kinetic friction in detecting visuo-haptic discrepancies.

Communication is a Two-Way Street: Negotiating Driving Intent through a Shape-Changing Steering Wheel
Hannah Baez, Akshay Bhardwaj, Jean Costa, John Gideon, Sile O'Modhrain, Nadine Sarter, and Brent Gillespie
(University of Michigan, USA; Toyota Research Institute, USA)
In this information age, our machines have evolved from tools that process mechanical work into computerized devices that process information. A collateral outcome of this trend is a diminishing role for haptic feedback. If the benefits of haptic feedback, including those inherent in tool use, are to be preserved in information-processing machines, we require an improved understanding of the various ways in which haptic feedback supports embodied cognition and high-utility exchange of information. In this paper, we classify manual control interfaces as instrumental or semiotic and describe an exploratory study in which a steering wheel functions simultaneously to communicate tactical and operational features in semi-autonomous driving. A shape-changing interface (semiotic/tactical) in the grip axis complements haptic shared control (instrumental/operational) in the steering axis. Experimental results involving N=30 participants show that the addition of a semiotic interface improves human-automation team performance in a shared driving scenario with competing objectives and metered information sharing.

Spatiotemporal Perception of Single Overlapped Vibrotactile Stimulation to Multiple Body Locations
Takumi Kuhara, Kakagu Komazaki, Junji Watanabe, and Yoshihiro Tanaka
(Nagoya Institute of Technology, Japan; Nippon Telegraph and Telephone Corporation, Japan; NTT Communication Science Laboratories, Japan)
Haptic information is useful in various fields, and presenting stimulation at multiple points enriches the experience. Independent Vibrotactile Stimulation at multiple points (IVS) evokes tactile apparent motion. We propose Single Overlapped Vibrotactile Stimulation (SOVS), which combines the stimulation of IVS into a single overlapped signal and relies on imagination to produce spatiotemporal perception like that of IVS. We developed a pseudo-dribbling system that produces the feeling of an imaginary ball moving between the hand and the feet, where the same stimulation is given. In a first experiment varying the speed of the stimulation, SOVS produced a feeling similar to IVS, indicating that SOVS allows participants to match the perceived stimulation to the dribbling being imagined. A second experiment placed an obstacle between the hand and the feet to interrupt the imagined motion, with the interruption controlled via attention. Results showed that both IVS and SOVS overcame the obstacle and evoked a piercing feeling when attention was directed to the floor. When attention was directed to the obstacle, IVS still provided a piercing feeling but SOVS did not, suggesting that the perception mechanisms of SOVS and IVS may differ.

Tactile Feedback Involving Actual Operation for Motor Skill Learning
Hikari Yukawa, Moena Tsuruoka, Takayuki Kodama, Masashi Odagiri, Masayuki Sato, Michiya Takeda, Masahiko Kurachi, and Yoshihiro Tanaka
(Nagoya Institute of Technology, Japan; Kyoto Tachibana University, Japan; Konica Minolta, Japan; Konica Minolta INC., Japan)
Traditionally, physical skill acquisition has aimed at replicating an expert's movement. However, mere imitation of expert movements may yield different results owing to differences between the bodies of the expert and the learner. Hence, in this study, we focused on sensory feedback to develop a skill-learning method. By presenting tactile information from the expert's and learner's fingers at different body locations, tactile information could be utilized in real time while performing tasks. Our system had two functions: body movement in Expert mode and self-feedback in User mode. We aimed to clarify the learning effects of these functions. The experiment comprised a writing-skill imitation task and manipulated two variables: with/without motion in Expert mode (Motion/No-Motion condition) and with/without self-feedback in User mode (With-Feedback/Without-Feedback condition). We compared the learning effects under the four conditions. We found that the Expert mode involving the actual operation had a positive impact on acquiring proficiency with writing force, and that self-feedback in the User mode minimized inter-participant variability in learning.

Perception of Friction-Related Cues Induced by Temperature Variation on a Surface Display
Matej Mayet, Jean-Loïc Le Carrou, and David Gueorguiev
(Institut Jean Le Rond D'Alembert, France; CNRS, France; Sorbonne Université, France; ISIR, France)
Frictional cues enable us to perceive material properties and topography, which has prompted dynamic research on methods for friction modulation. Recently, a novel method based on local heating of a screen was proposed for modulating friction and creating the sensation of shape when exploring a surface. We built a setup that reproduces this method to investigate how three parameters (the width of the heater, its temperature, and the duration of pre-heating before the interaction) influence the perception of purely tactile cues. The results show that raising the temperature of the heater increases the perception of tactile cues, but that the width of the heater and the pre-heating duration have no effect on it. Surprisingly, the increase of the coefficient of friction at the heated location was impacted only by the width of the heater and the pre-heating duration. Our study confirmed that non-thermal tactile cues can be induced by local heating of a tactile display, but we did not observe a direct relationship between the variation of the coefficient of kinetic friction and perception.

A Miniature Direct-Drive Hydraulic Actuator for Wearable Haptic Devices based on Ferrofluid Magnetohydrodynamic Levitation
Daniele Leonardis, Domenico Chiaradia, and Antonio Frisoli
(Scuola Superiore Sant'Anna of Pisa, Italy)
Hydraulic and pneumatic actuators in haptics offer the advantage of soft and compliant interfaces, with the drawback of cumbersome driving devices and limited modulation capabilities. We propose a miniature hydraulic actuator based on a linear electromagnetic motor with an embedded ferrofluid seal. The solution has two main advantages: it exhibits no static friction due to the magnetohydrodynamic levitation effect of the ferrofluid, and the output force can be scaled (by varying the radius of the actuator) without introducing the noise and friction of mechanical reduction mechanisms. Moreover, soft and compliant interfaces in the form of actuated pouches can be obtained on wearable devices with embedded actuators. As a concept prototype, we present a compact, soft haptic thimble integrating the proposed actuator; experimental characterization on the bench and perception experiments with the final prototype evaluate the low-noise rendering capability of the method.

Fingertip Wearable High-resolution Electrohydraulic Interface for Multimodal Haptics
Purnendu, Jess Hartcher-O'Brien, Vatsal Mehta, Nicholas Colonnese, Aakar Gupta, Carson J. Bruns, and Priyanshu Agarwal
(Meta, USA; University of Colorado Boulder, USA)
Fingertips are among the most sensitive regions of the human body and provide a means to dexterously interact with the physical world. To recreate this sense of physical touch in virtual or augmented reality (VR/AR), high-resolution haptic interfaces that can render rich tactile information are needed. In this paper, we present a wearable electrohydraulic haptic interface that can produce high-fidelity multimodal haptic feedback at the fingertips. This novel hardware can generate intense, fine tactile pressure (up to 34 kPa) as well as a wide range of vibrations (up to 700 Hz) through 16 individually controlled electrohydraulic bubble actuators. To achieve such high-intensity multimodal haptic feedback at such a high density (16 bubbles/cm²) at the fingertip, we integrated a stretchable substrate with a novel dielectric film and developed a design architecture in which the dielectric fluid is stored at the back of the fingertip. We physically characterize the static and dynamic behavior of the device and conduct psychophysical characterization through a set of user studies. This electrohydraulic interface demonstrates a new way to design and develop high-resolution multimodal haptic systems at the fingertips for AR/VR environments.

Training to Understand Complex Haptic Phrases: A Longitudinal Investigation
Mauricio Fontana de Vargas, David Marino, Antoine Weill-Duflos, and Jeremy R. Cooperstock
(McGill University, Canada)
The ability to understand simple vocabulary, conveyed by haptics, has demonstrated encouraging results, even with limited training. However, the level of achievable accuracy on communication indicative of real-world conversations, typically characterized by long phrases formed by complex vocabulary, remains unknown. This work presents an in-depth case-study analysis of the learning path of one experienced participant over a two-month period of daily practice with our vibrotactile communication apparatus. By the end of the two months, this participant was able to correctly identify 97% of the words in phrases consisting of 6--8 words, delivered as a sequence of vibration patterns. To the best of our knowledge, this work demonstrates the most advanced haptic rendering of language through a wearable device to date.