ISS 2024 – Author Index
Abdullah, Sayeem Md. |
ISS Companion '24: "Effects of Increasing Command ..."
Effects of Increasing Command Capacity of Spatial Memory Menus in Tablets
Sayeem Md. Abdullah and Md. Sami Uddin (University of Regina, Canada) Spatially-stable touch menus, like FastTap, leverage users’ spatial memory to enable rapid command selection on tablets. Although these spatial tablet interfaces can aid in developing spatial memory of commands when the command set is small, it is unknown whether spatial memory remains beneficial as the number of commands grows. Therefore, we carried out a study to investigate spatial learning in four different sizes of single-tab FastTap Menus: Small, Medium, Large, and Extra-Large, with 16, 30, 42, and 56 items, respectively. Results indicated that people do develop spatial memory in all menus; however, there is a negative correlation between command capacity and spatial memory development in tablets. We contribute new knowledge on spatial memory development in touch tablets that can enhance the design of future spatial memory-based tablet interfaces. @InProceedings{ISS24p71, author = {Sayeem Md. Abdullah and Md. Sami Uddin}, title = {Effects of Increasing Command Capacity of Spatial Memory Menus in Tablets}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {71--74}, doi = {10.1145/3696762.3698055}, year = {2024}, } Publisher's Version |
|
An, Yijia |
ISS Companion '24: "A Two-Handed Ellipsoidal Device ..."
A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise
Bingjie Xu, Yijia An, Qinglei Bu, and Jie Sun (Suzhou Industrial Park Institute of Vocational Technology, China; Xi’an Jiaotong-Liverpool University, China) Many children with cerebral palsy (CP) need to do various exercises to restore motor control for specific functions such as hand grasping and gripping. During daily exercises, they need intensive support from either therapists or caregivers in setting tasks and providing feedback, which creates a heavy workload. Thus, we introduce a two-handed ellipsoidal device to control computer games for interactive grasping and gripping rehabilitation training. The ellipsoidal device is designed to house an ESP32 microcontroller, a Wheeltec N100 IMU and an SF15 flexible thin-film pressure sensor so as to monitor children’s grip strength and wrist rotation. The sensing data can be used to control the character's motion in computer games. Preliminary user trials supported the implementation of such devices in hospitals for the hand grasping and gripping exercise and the cognition and coordination exercise between eyes, ears and hands. @InProceedings{ISS24p39, author = {Bingjie Xu and Yijia An and Qinglei Bu and Jie Sun}, title = {A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {39--43}, doi = {10.1145/3696762.3698049}, year = {2024}, } Publisher's Version ISS Companion '24: "Customisable Lower Limb Rehabilitation ..." Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
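As a purely illustrative aside on the first entry above (the two-handed ellipsoidal device): the abstract describes mapping grip pressure and wrist rotation to in-game motion. The sketch below is a minimal, hypothetical version of that sensing-to-command mapping; the stubbed sensor readout, thresholds, and command names are assumptions, not the authors' implementation.

```python
# Illustrative sketch only: maps hypothetical grip-pressure and wrist-rotation
# readings to coarse game commands, in the spirit of the ellipsoidal device above.
# The sensor values are stubbed here; the real device reads an SF15 pressure
# sensor and a Wheeltec N100 IMU on an ESP32.

from dataclasses import dataclass
import random

@dataclass
class SensorSample:
    grip_pressure: float   # normalized 0..1 (assumed calibration)
    wrist_roll_deg: float  # roll angle in degrees from the IMU

def read_sensors() -> SensorSample:
    """Stub standing in for the device's pressure/IMU readout."""
    return SensorSample(grip_pressure=random.random(),
                        wrist_roll_deg=random.uniform(-90, 90))

def to_game_command(s: SensorSample,
                    grip_threshold: float = 0.6,
                    roll_deadzone_deg: float = 15.0) -> str:
    """Map one sample to a game command (hypothetical scheme)."""
    if s.grip_pressure >= grip_threshold:
        return "JUMP"                # a strong squeeze triggers an action
    if s.wrist_roll_deg > roll_deadzone_deg:
        return "MOVE_RIGHT"          # rotate the wrist right to steer right
    if s.wrist_roll_deg < -roll_deadzone_deg:
        return "MOVE_LEFT"
    return "IDLE"

if __name__ == "__main__":
    for _ in range(5):
        sample = read_sensors()
        print(sample, "->", to_game_command(sample))
```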
|
Autexier, Serge |
ISS Companion '24: "Co-designing a Tangible Communication ..."
Co-designing a Tangible Communication Device to Enrich Communication over Distance
Hannah Friederike Fischer, Anke Königschulte, Jana Koch, Serge Autexier, and Gesche Joost (German Research Center for Artificial Intelligence, Germany; C&S Computer and Software, Germany) The initial phase of innovative product design is marked by uncertainty and complexity. This paper examines the use of participatory workshops to navigate this phase within the ToCaro research project. The project aims to develop tactile and multisensory interfaces for remote communication to mitigate feelings of loneliness by promoting a sense of physical proximity. Fourteen co-design workshops were conducted with senior participants (age ≥ 65) to examine their communication behaviors, identify latent needs and evaluate physical sensations elicited by various materials and forms of interaction. The workshops included semi-structured interviews, sensory perception tests, interaction concept evaluations, and “quick-and-dirty” prototyping. This paper outlines the facilitators’ experiences, the challenges, and learnings. Results indicate that while participants exhibited varied levels of engagement, those with a perceived need for new communication devices contributed effectively to the creative process. @InProceedings{ISS24p60, author = {Hannah Friederike Fischer and Anke Königschulte and Jana Koch and Serge Autexier and Gesche Joost}, title = {Co-designing a Tangible Communication Device to Enrich Communication over Distance}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {60--64}, doi = {10.1145/3696762.3698053}, year = {2024}, } Publisher's Version |
|
Bu, Qinglei |
ISS Companion '24: "A Two-Handed Ellipsoidal Device ..."
A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise
Bingjie Xu, Yijia An, Qinglei Bu, and Jie Sun (Suzhou Industrial Park Institute of Vocational Technology, China; Xi’an Jiaotong-Liverpool University, China) Many children with cerebral palsy (CP) need to do various exercises to restore motor control for specific functions such as hand grasping and gripping. During daily exercises, they need intensive support from either therapists or caregivers in setting tasks and providing feedback, which creates a heavy workload. Thus, we introduce a two-handed ellipsoidal device to control computer games for interactive grasping and gripping rehabilitation training. The ellipsoidal device is designed to house an ESP32 microcontroller, a Wheeltec N100 IMU and an SF15 flexible thin-film pressure sensor so as to monitor children’s grip strength and wrist rotation. The sensing data can be used to control the character's motion in computer games. Preliminary user trials supported the implementation of such devices in hospitals for the hand grasping and gripping exercise and the cognition and coordination exercise between eyes, ears and hands. @InProceedings{ISS24p39, author = {Bingjie Xu and Yijia An and Qinglei Bu and Jie Sun}, title = {A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {39--43}, doi = {10.1145/3696762.3698049}, year = {2024}, } Publisher's Version ISS Companion '24: "Customisable Lower Limb Rehabilitation ..." Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
|
Burnah, Tavish M. |
ISS Companion '24: "Monocular Tracking of Passive ..."
Monocular Tracking of Passive Stylus on Passive Surface
Tavish M. Burnah (Massey University, New Zealand) This PhD project introduces a mixed reality passive stylus system designed for smartphones enabling digital ink creation on a surface. Traditional passive stylus systems face challenges in usability and accessibility. This research aims to overcome these limitations through a dual approach that integrates interactive design and robust machine learning, trained on extensive datasets. The research methodology is divided into phases of data collection, design, implementation and evaluation. The project develops an application to deliver an accessible digital tool viable for widespread use, particularly in developing regions. @InProceedings{ISS24p4, author = {Tavish M. Burnah}, title = {Monocular Tracking of Passive Stylus on Passive Surface}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {4--10}, doi = {10.1145/3696762.3698041}, year = {2024}, } Publisher's Version |
|
Carpendale, Sheelagh |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension
Maryam Rezaie, Anjali Khurana, Parmit Chilana, Sheelagh Carpendale, Melanie Tory, and Sydney Purdue (Simon Fraser University, Canada; Northeastern University, USA) Interfaces and visualizations often challenge comprehension, especially as they grow in complexity. Traditional methods—relying on standard inputs like touch, mouse, and keyboard—fall short in addressing the nuanced demands for explainability in complex systems. This workshop explores innovative interaction strategies to enhance self-enabled comprehension, focusing on the development and refinement of new devices and input modalities. We aim to gather researchers, designers, and practitioners to exchange ideas and explore interaction techniques that promote clearer understanding and transparency across diverse applications. This collaborative effort seeks to advance interactive systems that improve explanation and user comprehension in digital environments. @InProceedings{ISS24p29, author = {Maryam Rezaie and Anjali Khurana and Parmit Chilana and Sheelagh Carpendale and Melanie Tory and Sydney Purdue}, title = {Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {29--30}, doi = {10.1145/3696762.3698046}, year = {2024}, } Publisher's Version ISS Companion '24: "Summary of the Workshop on ..." Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction Zezhong Wang, Samuel Huron, Miriam Sturdee, and Sheelagh Carpendale (Simon Fraser University, Canada; Télécom Paris - Institut Polytechnique de Paris, France; University of St Andrews, United Kingdom) Visual methods have become increasingly vital in Human Computer Interaction (HCI) research, particularly as we analyze and interpret the complex visual data that emerges from various interaction modalities. However, the methodologies for analyzing this visual data remain underdeveloped compared to textual data analysis. This workshop seeks to unite HCI researchers who work with visual data — such as hand sketches, photographs, physical artifacts, UI screenshots, videos, and information visualizations — to identify, name, and categorize methods for analyzing visual data in HCI. @InProceedings{ISS24p31, author = {Zezhong Wang and Samuel Huron and Miriam Sturdee and Sheelagh Carpendale}, title = {Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {31--34}, doi = {10.1145/3696762.3698047}, year = {2024}, } Publisher's Version |
|
Chan, Daniel |
ISS Companion '24: "Exploring Effects of Interactive ..."
Exploring Effects of Interactive Virtual Reality Sensory Environment on Anxiety Reduction in Adolescents with Autism
Amaya E. Keys, Oyewole Oyekoya, and Daniel Chan (Howard University, USA; CUNY Hunter College, USA; CUNY, USA) Autism Spectrum Disorder (ASD) is a developmental disability often characterized by sensory processing difficulties that can lead to anxiety, particularly in children and adolescents. Previous research on virtual reality-based anxiety intervention tools focuses on using social skills training, exposure therapy, and meditative coaching to mitigate social and phobia related anxiety. However, minimal work has specifically evaluated the effects of virtual multi-sensory environments for people with ASD, often only testing feasibility. This pilot study aims to build on previous work by investigating how various auditory, visual, and interactive components contribute to user satisfaction and sensory-related anxiety reduction. The objective is to gain a better understanding of what features are significant towards developing a successful virtual anxiety intervention tool. Results suggest using interactive activities that promote fine motor skills can provide a healthy outlet for self-mediated stress relief. Future development aims to incorporate task-based activities, and enhance audio, visual, and lighting displays. The deployment of a full-scale study with a larger sample size and target participant pool is warranted to substantiate these initial findings. @InProceedings{ISS24p50, author = {Amaya E. Keys and Oyewole Oyekoya and Daniel Chan}, title = {Exploring Effects of Interactive Virtual Reality Sensory Environment on Anxiety Reduction in Adolescents with Autism}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {50--55}, doi = {10.1145/3696762.3698051}, year = {2024}, } Publisher's Version |
|
Chen, Siyuan |
ISS Companion '24: "Customisable Lower Limb Rehabilitation ..."
Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy
Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
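As an illustrative aside on the gait analysis mentioned in the entry above: the sketch below computes step length, stride length, and cadence from timestamped foot contacts, assuming the carpet reports (time, foot, x, y) events. The data format and function names are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch: step length, stride length, and cadence from a list of
# timestamped foot contacts, as a sensing carpet might report them.
# The (time, foot, x, y) format is an assumption for illustration only.

from math import hypot

# (time_s, foot, x_m, y_m): synthetic example data
contacts = [
    (0.0, "L", 0.00, 0.10),
    (0.6, "R", 0.35, -0.10),
    (1.2, "L", 0.70, 0.10),
    (1.8, "R", 1.05, -0.10),
    (2.4, "L", 1.40, 0.10),
]

def step_lengths(events):
    """Distance between successive contacts of opposite feet."""
    return [hypot(x2 - x1, y2 - y1)
            for (_, f1, x1, y1), (_, f2, x2, y2) in zip(events, events[1:])
            if f1 != f2]

def stride_lengths(events, foot):
    """Distance between successive contacts of the same foot."""
    same = [(x, y) for _, f, x, y in events if f == foot]
    return [hypot(x2 - x1, y2 - y1)
            for (x1, y1), (x2, y2) in zip(same, same[1:])]

def cadence_steps_per_min(events):
    """Number of steps divided by elapsed time, scaled to steps per minute."""
    duration = events[-1][0] - events[0][0]
    return (len(events) - 1) / duration * 60 if duration > 0 else 0.0

print("step lengths (m):", [round(s, 2) for s in step_lengths(contacts)])
print("left strides (m):", [round(s, 2) for s in stride_lengths(contacts, "L")])
print("cadence (steps/min):", round(cadence_steps_per_min(contacts), 1))
```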
|
Chilana, Parmit |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension
Maryam Rezaie, Anjali Khurana, Parmit Chilana, Sheelagh Carpendale, Melanie Tory, and Sydney Purdue (Simon Fraser University, Canada; Northeastern University, USA) Interfaces and visualizations often challenge comprehension, especially as they grow in complexity. Traditional methods—relying on standard inputs like touch, mouse, and keyboard—fall short in addressing the nuanced demands for explainability in complex systems. This workshop explores innovative interaction strategies to enhance self-enabled comprehension, focusing on the development and refinement of new devices and input modalities. We aim to gather researchers, designers, and practitioners to exchange ideas and explore interaction techniques that promote clearer understanding and transparency across diverse applications. This collaborative effort seeks to advance interactive systems that improve explanation and user comprehension in digital environments. @InProceedings{ISS24p29, author = {Maryam Rezaie and Anjali Khurana and Parmit Chilana and Sheelagh Carpendale and Melanie Tory and Sydney Purdue}, title = {Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {29--30}, doi = {10.1145/3696762.3698046}, year = {2024}, } Publisher's Version |
|
Cunningham, Andrew |
ISS Companion '24: "Once Upon a Data Story: A ..."
Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling
Radhika Pankaj Jain, Kadek Ananta Satriadi, Adam Drogemuller, Ross Smith, and Andrew Cunningham (University of South Australia, Australia; Monash University, Australia) Immersive data storytelling is an emerging field that combines narrative visualisation and immersive analytics to engage an audience. While there are existing design spaces for narrative visualisation on 2D displays, there are no guidelines for creating immersive data stories, making it difficult for practitioners and researchers to explore this space. In this paper, we present a preliminary design space for immersive data storytelling that is informed by current practices and multi-disciplinary views. We interviewed multi-disciplinary experts, including museum designers, architects, and game designers, to understand how they communicate stories in physical spaces and immersive mediums. We applied inductive thematic analysis to the interview responses to inform the dimensions of the design space and analysed a systematic selection of publicly available immersive stories. In the end, we had 13 dimensions in 7 categories. We present insights into this design space as common practice or areas for future research. @InProceedings{ISS24p65, author = {Radhika Pankaj Jain and Kadek Ananta Satriadi and Adam Drogemuller and Ross Smith and Andrew Cunningham}, title = {Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {65--70}, doi = {10.1145/3696762.3698054}, year = {2024}, } Publisher's Version |
|
Drogemuller, Adam |
ISS Companion '24: "Once Upon a Data Story: A ..."
Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling
Radhika Pankaj Jain, Kadek Ananta Satriadi, Adam Drogemuller, Ross Smith, and Andrew Cunningham (University of South Australia, Australia; Monash University, Australia) Immersive data storytelling is an emerging field that combines narrative visualisation and immersive analytics to engage an audience. While there are existing design spaces for narrative visualisation on 2D displays, there are no guidelines for creating immersive data stories, making it difficult for practitioners and researchers to explore this space. In this paper, we present a preliminary design space for immersive data storytelling that is informed by current practices and multi-disciplinary views. We interviewed multi-disciplinary experts, including museum designers, architects, and game designers, to understand how they communicate stories in physical spaces and immersive mediums. We applied inductive thematic analysis to the interview responses to inform the dimensions of the design space and analysed a systematic selection of publicly available immersive stories. In the end, we had 13 dimensions in 7 categories. We present insights into this design space as common practice or areas for future research. @InProceedings{ISS24p65, author = {Radhika Pankaj Jain and Kadek Ananta Satriadi and Adam Drogemuller and Ross Smith and Andrew Cunningham}, title = {Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {65--70}, doi = {10.1145/3696762.3698054}, year = {2024}, } Publisher's Version |
|
Feuchtner, Tiare |
ISS Companion '24: "Eye-Hand Movement of Objects ..."
Eye-Hand Movement of Objects in Near Space
Uta Wagner, Andreas Asferg Jacobsen, Tiare Feuchtner, Hans Gellersen, and Ken Pfeuffer (Aarhus University, Denmark; University of Konstanz, Germany; Lancaster University, United Kingdom) Hand-tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures, to pick, drag and drop objects in 3D. We explore the use of eye-tracking to reduce the effort involved in this interaction. As the eyes naturally look ahead to the target for a drag operation, the principal idea is to map the translation of the object in the image plane to gaze, such that the hand only needs to control the depth component of the operation. We demo several applications we built for 3D manipulations, including area selection, 3D path specification, and a "Bejeweled"-inspired game, showing potential for effortless drag-and-drop actions in 3D space. This demonstration includes the study apparatus and the applications from a paper that will be presented at UIST'24 with the same title (Wagner 2024, Proc. UIST). @InProceedings{ISS24p20, author = {Uta Wagner and Andreas Asferg Jacobsen and Tiare Feuchtner and Hans Gellersen and Ken Pfeuffer}, title = {Eye-Hand Movement of Objects in Near Space}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {20--23}, doi = {10.1145/3696762.3698044}, year = {2024}, } Publisher's Version |
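As an illustrative aside on the entry above: the abstract's core idea is that gaze handles the image-plane placement of a dragged object while the hand controls only its depth. The minimal sketch below shows one way such a decomposition could be computed, with the object placed along the current gaze ray at a hand-controlled distance; the vectors and update rule are assumptions, not the authors' implementation.

```python
# Illustrative sketch of the gaze-plus-hand decomposition described above:
# gaze chooses the ray (image-plane placement), the hand adjusts depth along it.
# Uses plain tuples to stay dependency-free; not the authors' implementation.

from math import sqrt

def normalize(v):
    n = sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

def place_object(eye_pos, gaze_dir, depth):
    """Object sits on the gaze ray at the given depth from the eye."""
    d = normalize(gaze_dir)
    return tuple(e + depth * c for e, c in zip(eye_pos, d))

# Example drag: the user looks toward the drop target while the hand pushes the
# object 0.4 m farther away; the gaze ray handles the lateral movement.
eye = (0.0, 1.6, 0.0)            # metres, y-up
gaze_before = (0.0, -0.1, 1.0)   # looking slightly downward, straight ahead
gaze_after = (0.3, -0.1, 1.0)    # eyes move to a target on the right
depth_before, hand_depth_delta = 0.8, 0.4

p0 = place_object(eye, gaze_before, depth_before)
p1 = place_object(eye, gaze_after, depth_before + hand_depth_delta)
print("object before drag:", tuple(round(c, 3) for c in p0))
print("object after drag: ", tuple(round(c, 3) for c in p1))
```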
|
Fischer, Hannah Friederike |
ISS Companion '24: "Co-designing a Tangible Communication ..."
Co-designing a Tangible Communication Device to Enrich Communication over Distance
Hannah Friederike Fischer, Anke Königschulte, Jana Koch, Serge Autexier, and Gesche Joost (German Research Center for Artificial Intelligence, Germany; C&S Computer and Software, Germany) The initial phase of innovative product design is marked by uncertainty and complexity. This paper examines the use of participatory workshops to navigate this phase within the ToCaro research project. The project aims to develop tactile and multisensory interfaces for remote communication to mitigate feelings of loneliness by promoting a sense of physical proximity. Fourteen co-design workshops were conducted with senior participants (age ≥ 65) to examine their communication behaviors, identify latent needs and evaluate physical sensations elicited by various materials and forms of interaction. The workshops included semi-structured interviews, sensory perception tests, interaction concept evaluations, and “quick-and-dirty” prototyping. This paper outlines the facilitators’ experiences, the challenges, and learnings. Results indicate that while participants exhibited varied levels of engagement, those with a perceived need for new communication devices contributed effectively to the creative process. @InProceedings{ISS24p60, author = {Hannah Friederike Fischer and Anke Königschulte and Jana Koch and Serge Autexier and Gesche Joost}, title = {Co-designing a Tangible Communication Device to Enrich Communication over Distance}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {60--64}, doi = {10.1145/3696762.3698053}, year = {2024}, } Publisher's Version |
|
Gellersen, Hans |
ISS Companion '24: "Eye-Hand Movement of Objects ..."
Eye-Hand Movement of Objects in Near Space
Uta Wagner, Andreas Asferg Jacobsen, Tiare Feuchtner, Hans Gellersen, and Ken Pfeuffer (Aarhus University, Denmark; University of Konstanz, Germany; Lancaster University, United Kingdom) Hand-tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures, to pick, drag and drop objects in 3D. We explore the use of eye-tracking to reduce the effort involved in this interaction. As the eyes naturally look ahead to the target for a drag operation, the principal idea is to map the translation of the object in the image plane to gaze, such that the hand only needs to control the depth component of the operation. We demo several applications we built for 3D manipulations, including area selection, 3D path specification, and a "Bejeweled"-inspired game, showing potential for effortless drag-and-drop actions in 3D space. This demonstration includes the study apparatus and the applications from a paper that will be presented at UIST'24 with the same title (Wagner 2024, Proc. UIST). @InProceedings{ISS24p20, author = {Uta Wagner and Andreas Asferg Jacobsen and Tiare Feuchtner and Hans Gellersen and Ken Pfeuffer}, title = {Eye-Hand Movement of Objects in Near Space}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {20--23}, doi = {10.1145/3696762.3698044}, year = {2024}, } Publisher's Version |
|
Horlyck-Romanovsky, Margrethe |
ISS Companion '24: "Gamification of Food Selection ..."
Gamification of Food Selection and Nutrition Education in Virtual Reality
Caroline A. Klein, Oyewole Oyekoya, and Margrethe Horlyck-Romanovsky (Vassar College, USA; CUNY Hunter College, USA; CUNY Brooklyn College, USA) The increasing global prevalence of obesity and related health issues underscores the need for innovative dietary interventions. This paper explores the potential of combining gamification and Virtual Reality (VR) to promote healthier eating habits among young adults. By creating an interactive VR food environment with engaging game elements, we aim to assess the impact of gamified VR intervention on nutritional knowledge and attitudes. Preliminary results show an increase in nutritional understanding and awareness, though further research is necessary for statistical validation. This study suggests that VR-based gamified interventions could be a promising tool for nutrition education, behavior modification, and virtual food selection. @InProceedings{ISS24p81, author = {Caroline A. Klein and Oyewole Oyekoya and Margrethe Horlyck-Romanovsky}, title = {Gamification of Food Selection and Nutrition Education in Virtual Reality}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {81--84}, doi = {10.1145/3696762.3698057}, year = {2024}, } Publisher's Version |
|
Huron, Samuel |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction
Zezhong Wang, Samuel Huron, Miriam Sturdee, and Sheelagh Carpendale (Simon Fraser University, Canada; Télécom Paris - Institut Polytechnique de Paris, France; University of St Andrews, United Kingdom) Visual methods have become increasingly vital in Human Computer Interaction (HCI) research, particularly as we analyze and interpret the complex visual data that emerges from various interaction modalities. However, the methodologies for analyzing this visual data remain underdeveloped compared to textual data analysis. This workshop seeks to unite HCI researchers who work with visual data — such as hand sketches, photographs, physical artifacts, UI screenshots, videos, and information visualizations — to identify, name, and categorize methods for analyzing visual data in HCI. @InProceedings{ISS24p31, author = {Zezhong Wang and Samuel Huron and Miriam Sturdee and Sheelagh Carpendale}, title = {Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {31--34}, doi = {10.1145/3696762.3698047}, year = {2024}, } Publisher's Version |
|
Jacobsen, Andreas Asferg |
ISS Companion '24: "Eye-Hand Movement of Objects ..."
Eye-Hand Movement of Objects in Near Space
Uta Wagner, Andreas Asferg Jacobsen, Tiare Feuchtner, Hans Gellersen, and Ken Pfeuffer (Aarhus University, Denmark; University of Konstanz, Germany; Lancaster University, United Kingdom) Hand-tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures, to pick, drag and drop objects in 3D. We explore the use of eye-tracking to reduce the effort involved in this interaction. As the eyes naturally look ahead to the target for a drag operation, the principal idea is to map the translation of the object in the image plane to gaze, such that the hand only needs to control the depth component of the operation. We demo several applications we built for 3D manipulations, including area selection, 3D path specification, and a "Bejeweled"-inspired game, showing potential for effortless drag-and-drop actions in 3D space. This demonstration includes the study apparatus and the applications from a paper that will be presented at UIST'24 with the same title (Wagner 2024, Proc. UIST). @InProceedings{ISS24p20, author = {Uta Wagner and Andreas Asferg Jacobsen and Tiare Feuchtner and Hans Gellersen and Ken Pfeuffer}, title = {Eye-Hand Movement of Objects in Near Space}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {20--23}, doi = {10.1145/3696762.3698044}, year = {2024}, } Publisher's Version |
|
Jain, Radhika Pankaj |
ISS Companion '24: "Once Upon a Data Story: A ..."
Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling
Radhika Pankaj Jain, Kadek Ananta Satriadi, Adam Drogemuller, Ross Smith, and Andrew Cunningham (University of South Australia, Australia; Monash University, Australia) Immersive data storytelling is an emerging field that combines narrative visualisation and immersive analytics to engage an audience. While there are existing design spaces for narrative visualisation on 2D displays, there are no guidelines for creating immersive data stories, making it difficult for practitioners and researchers to explore this space. In this paper, we present a preliminary design space for immersive data storytelling that is informed by current practices and multi-disciplinary views. We interviewed multi-disciplinary experts, including museum designers, architects, and game designers, to understand how they communicate stories in physical spaces and immersive mediums. We applied inductive thematic analysis to the interview responses to inform the dimensions of the design space and analysed a systematic selection of publicly available immersive stories. In the end, we had 13 dimensions in 7 categories. We present insights into this design space as common practice or areas for future research. @InProceedings{ISS24p65, author = {Radhika Pankaj Jain and Kadek Ananta Satriadi and Adam Drogemuller and Ross Smith and Andrew Cunningham}, title = {Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {65--70}, doi = {10.1145/3696762.3698054}, year = {2024}, } Publisher's Version |
|
Joost, Gesche |
ISS Companion '24: "Co-designing a Tangible Communication ..."
Co-designing a Tangible Communication Device to Enrich Communication over Distance
Hannah Friederike Fischer, Anke Königschulte, Jana Koch, Serge Autexier, and Gesche Joost (German Research Center for Artificial Intelligence, Germany; C&S Computer and Software, Germany) The initial phase of innovative product design is marked by uncertainty and complexity. This paper examines the use of participatory workshops to navigate this phase within the ToCaro research project. The project aims to develop tactile and multisensory interfaces for remote communication to mitigate feelings of loneliness by promoting a sense of physical proximity. Fourteen co-design workshops were conducted with senior participants (age ≥ 65) to examine their communication behaviors, identify latent needs and evaluate physical sensations elicited by various materials and forms of interaction. The workshops included semi-structured interviews, sensory perception tests, interaction concept evaluations, and “quick-and-dirty” prototyping. This paper outlines the facilitators’ experiences, the challenges, and learnings. Results indicate that while participants exhibited varied levels of engagement, those with a perceived need for new communication devices contributed effectively to the creative process. @InProceedings{ISS24p60, author = {Hannah Friederike Fischer and Anke Königschulte and Jana Koch and Serge Autexier and Gesche Joost}, title = {Co-designing a Tangible Communication Device to Enrich Communication over Distance}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {60--64}, doi = {10.1145/3696762.3698053}, year = {2024}, } Publisher's Version |
|
Kalloori, Saikishore |
ISS Companion '24: "Balancing Autonomy: Investigating ..."
Balancing Autonomy: Investigating User-Controlled vs Automated Guidance Systems for Sequential Tasks
Robin Wiethüchter, Saikishore Kalloori, and David Lindlbauer (ETH Zurich, Switzerland; Carnegie Mellon University, USA) Self-guided tutorials are popular resources for learning new tasks, but they lack important aspects of in-person guidance like feedback or personalized explanations. Adaptive guidance systems aim to overcome this challenge by reacting to users' performance and expertise and adapting instructions accordingly. We aim to understand the users' preferred balance of automation and control, what representation of instructions they prefer, and how human experts give instructions to match users' needs. We contribute an experiment where users perform different virtual tasks, guided by instructions that are controlled by experts using a wizard-of-oz paradigm. We employ different levels of automation to control instructions and alter their level of detail and step granularity to match the user's needs. Results indicate that while users preferred automated systems for convenience and instant feedback, they appreciated a degree of manual control since they felt less rushed. Experts relied on factors such as expected expertise, hesitation, errors, and their understanding of the current task state as main triggers to adapt instructions. @InProceedings{ISS24p75, author = {Robin Wiethüchter and Saikishore Kalloori and David Lindlbauer}, title = {Balancing Autonomy: Investigating User-Controlled vs Automated Guidance Systems for Sequential Tasks}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {75--80}, doi = {10.1145/3696762.3698056}, year = {2024}, } Publisher's Version |
|
Kang, Andrew |
ISS Companion '24: "PanoCoach: Enhancing Tactical ..."
PanoCoach: Enhancing Tactical Coaching and Communication in Soccer with Mixed-Reality Telepresence
Andrew Kang, Hanspeter Pfister, and Tica Lin (Harvard University, USA; Rice University, USA) Soccer, as a dynamic team sport, requires seamless coordination and integration of tactical strategies across all players. Adapting to new tactical systems is a critical but often challenging aspect of soccer at all professional levels. Even the best players can struggle with this process, primarily due to the complexities of conveying and internalizing intricate tactical patterns. Traditional communication methods like whiteboards, on-field instructions, and video analysis often present significant difficulties in perceiving spatial relationships, anticipating team movements, and facilitating live conversation during training sessions. These challenges can lead to inconsistent interpretations of the coach’s tactics among players, regardless of their skill level. To bridge the gap between tactical communication and physical execution, we propose a mixed-reality telepresence solution, PanoCoach, designed to support multi-view tactical explanations during practice. Our concept involves a multi-screen setup combining a tablet for coaches to annotate and demonstrate concepts in both 2D and 3D views, alongside VR to immerse athletes in a first-person perspective, allowing them to experience a sense of presence during coaching. In our preliminary study, we prototyped the cross-device functionality to implement the key steps of our approach: Step 1, where the coach uses a tablet to provide clear and dynamic tactical instructions, Step 2, where players engage with these instructions through an immersive VR experience, and Step 3, where the coach tracks players' movements and provides real time feedback. User evaluation with coaches at City Football Group, Harvard Soccer and Rice Soccer suggests this mixed-reality telepresence approach holds promising potential for improving tactical understanding and communication. Based on these findings, we outline future directions and discuss the research needed to expand this approach beyond controlled indoor environments, such as locker rooms, leveraging telepresence to enhance tactical comprehension and simulated training. @InProceedings{ISS24p15, author = {Andrew Kang and Hanspeter Pfister and Tica Lin}, title = {PanoCoach: Enhancing Tactical Coaching and Communication in Soccer with Mixed-Reality Telepresence}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {15--19}, doi = {10.1145/3696762.3698043}, year = {2024}, } Publisher's Version Video |
|
Keys, Amaya E. |
ISS Companion '24: "Exploring Effects of Interactive ..."
Exploring Effects of Interactive Virtual Reality Sensory Environment on Anxiety Reduction in Adolescents with Autism
Amaya E. Keys, Oyewole Oyekoya, and Daniel Chan (Howard University, USA; CUNY Hunter College, USA; CUNY, USA) Autism Spectrum Disorder (ASD) is a developmental disability often characterized by sensory processing difficulties that can lead to anxiety, particularly in children and adolescents. Previous research on virtual reality-based anxiety intervention tools focuses on using social skills training, exposure therapy, and meditative coaching to mitigate social and phobia related anxiety. However, minimal work has specifically evaluated the effects of virtual multi-sensory environments for people with ASD, often only testing feasibility. This pilot study aims to build on previous work by investigating how various auditory, visual, and interactive components contribute to user satisfaction and sensory-related anxiety reduction. The objective is to gain a better understanding of what features are significant towards developing a successful virtual anxiety intervention tool. Results suggest using interactive activities that promote fine motor skills can provide a healthy outlet for self-mediated stress relief. Future development aims to incorporate task-based activities, and enhance audio, visual, and lighting displays. The deployment of a full-scale study with a larger sample size and target participant pool is warranted to substantiate these initial findings. @InProceedings{ISS24p50, author = {Amaya E. Keys and Oyewole Oyekoya and Daniel Chan}, title = {Exploring Effects of Interactive Virtual Reality Sensory Environment on Anxiety Reduction in Adolescents with Autism}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {50--55}, doi = {10.1145/3696762.3698051}, year = {2024}, } Publisher's Version |
|
Khurana, Anjali |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension
Maryam Rezaie, Anjali Khurana, Parmit Chilana, Sheelagh Carpendale, Melanie Tory, and Sydney Purdue (Simon Fraser University, Canada; Northeastern University, USA) Interfaces and visualizations often challenge comprehension, especially as they grow in complexity. Traditional methods—relying on standard inputs like touch, mouse, and keyboard—fall short in addressing the nuanced demands for explainability in complex systems. This workshop explores innovative interaction strategies to enhance self-enabled comprehension, focusing on the development and refinement of new devices and input modalities. We aim to gather researchers, designers, and practitioners to exchange ideas and explore interaction techniques that promote clearer understanding and transparency across diverse applications. This collaborative effort seeks to advance interactive systems that improve explanation and user comprehension in digital environments. @InProceedings{ISS24p29, author = {Maryam Rezaie and Anjali Khurana and Parmit Chilana and Sheelagh Carpendale and Melanie Tory and Sydney Purdue}, title = {Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {29--30}, doi = {10.1145/3696762.3698046}, year = {2024}, } Publisher's Version |
|
Klein, Caroline A. |
ISS Companion '24: "Gamification of Food Selection ..."
Gamification of Food Selection and Nutrition Education in Virtual Reality
Caroline A. Klein, Oyewole Oyekoya, and Margrethe Horlyck-Romanovsky (Vassar College, USA; CUNY Hunter College, USA; CUNY Brooklyn College, USA) The increasing global prevalence of obesity and related health issues underscores the need for innovative dietary interventions. This paper explores the potential of combining gamification and Virtual Reality (VR) to promote healthier eating habits among young adults. By creating an interactive VR food environment with engaging game elements, we aim to assess the impact of gamified VR intervention on nutritional knowledge and attitudes. Preliminary results show an increase in nutritional understanding and awareness, though further research is necessary for statistical validation. This study suggests that VR-based gamified interventions could be a promising tool for nutrition education, behavior modification, and virtual food selection. @InProceedings{ISS24p81, author = {Caroline A. Klein and Oyewole Oyekoya and Margrethe Horlyck-Romanovsky}, title = {Gamification of Food Selection and Nutrition Education in Virtual Reality}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {81--84}, doi = {10.1145/3696762.3698057}, year = {2024}, } Publisher's Version |
|
Koch, Jana |
ISS Companion '24: "Co-designing a Tangible Communication ..."
Co-designing a Tangible Communication Device to Enrich Communication over Distance
Hannah Friederike Fischer, Anke Königschulte, Jana Koch, Serge Autexier, and Gesche Joost (German Research Center for Artificial Intelligence, Germany; C&S Computer and Software, Germany) The initial phase of innovative product design is marked by uncertainty and complexity. This paper examines the use of participatory workshops to navigate this phase within the ToCaro research project. The project aims to develop tactile and multisensory interfaces for remote communication to mitigate feelings of loneliness by promoting a sense of physical proximity. Fourteen co-design workshops were conducted with senior participants (age ≥ 65) to examine their communication behaviors, identify latent needs and evaluate physical sensations elicited by various materials and forms of interaction. The workshops included semi-structured interviews, sensory perception tests, interaction concept evaluations, and “quick-and-dirty” prototyping. This paper outlines the facilitators’ experiences, the challenges, and learnings. Results indicate that while participants exhibited varied levels of engagement, those with a perceived need for new communication devices contributed effectively to the creative process. @InProceedings{ISS24p60, author = {Hannah Friederike Fischer and Anke Königschulte and Jana Koch and Serge Autexier and Gesche Joost}, title = {Co-designing a Tangible Communication Device to Enrich Communication over Distance}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {60--64}, doi = {10.1145/3696762.3698053}, year = {2024}, } Publisher's Version |
|
Königschulte, Anke |
ISS Companion '24: "Co-designing a Tangible Communication ..."
Co-designing a Tangible Communication Device to Enrich Communication over Distance
Hannah Friederike Fischer, Anke Königschulte, Jana Koch, Serge Autexier, and Gesche Joost (German Research Center for Artificial Intelligence, Germany; C&S Computer and Software, Germany) The initial phase of innovative product design is marked by uncertainty and complexity. This paper examines the use of participatory workshops to navigate this phase within the ToCaro research project. The project aims to develop tactile and multisensory interfaces for remote communication to mitigate feelings of loneliness by promoting a sense of physical proximity. Fourteen co-design workshops were conducted with senior participants (age ≥ 65) to examine their communication behaviors, identify latent needs and evaluate physical sensations elicited by various materials and forms of interaction. The workshops included semi-structured interviews, sensory perception tests, interaction concept evaluations, and “quick-and-dirty” prototyping. This paper outlines the facilitators’ experiences, the challenges, and learnings. Results indicate that while participants exhibited varied levels of engagement, those with a perceived need for new communication devices contributed effectively to the creative process. @InProceedings{ISS24p60, author = {Hannah Friederike Fischer and Anke Königschulte and Jana Koch and Serge Autexier and Gesche Joost}, title = {Co-designing a Tangible Communication Device to Enrich Communication over Distance}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {60--64}, doi = {10.1145/3696762.3698053}, year = {2024}, } Publisher's Version |
|
Lin, Tica |
ISS Companion '24: "PanoCoach: Enhancing Tactical ..."
PanoCoach: Enhancing Tactical Coaching and Communication in Soccer with Mixed-Reality Telepresence
Andrew Kang, Hanspeter Pfister, and Tica Lin (Harvard University, USA; Rice University, USA) Soccer, as a dynamic team sport, requires seamless coordination and integration of tactical strategies across all players. Adapting to new tactical systems is a critical but often challenging aspect of soccer at all professional levels. Even the best players can struggle with this process, primarily due to the complexities of conveying and internalizing intricate tactical patterns. Traditional communication methods like whiteboards, on-field instructions, and video analysis often present significant difficulties in perceiving spatial relationships, anticipating team movements, and facilitating live conversation during training sessions. These challenges can lead to inconsistent interpretations of the coach’s tactics among players, regardless of their skill level. To bridge the gap between tactical communication and physical execution, we propose a mixed-reality telepresence solution, PanoCoach, designed to support multi-view tactical explanations during practice. Our concept involves a multi-screen setup combining a tablet for coaches to annotate and demonstrate concepts in both 2D and 3D views, alongside VR to immerse athletes in a first-person perspective, allowing them to experience a sense of presence during coaching. In our preliminary study, we prototyped the cross-device functionality to implement the key steps of our approach: Step 1, where the coach uses a tablet to provide clear and dynamic tactical instructions, Step 2, where players engage with these instructions through an immersive VR experience, and Step 3, where the coach tracks players' movements and provides real time feedback. User evaluation with coaches at City Football Group, Harvard Soccer and Rice Soccer suggests this mixed-reality telepresence approach holds promising potential for improving tactical understanding and communication. Based on these findings, we outline future directions and discuss the research needed to expand this approach beyond controlled indoor environments, such as locker rooms, leveraging telepresence to enhance tactical comprehension and simulated training. @InProceedings{ISS24p15, author = {Andrew Kang and Hanspeter Pfister and Tica Lin}, title = {PanoCoach: Enhancing Tactical Coaching and Communication in Soccer with Mixed-Reality Telepresence}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {15--19}, doi = {10.1145/3696762.3698043}, year = {2024}, } Publisher's Version Video |
|
Lindlbauer, David |
ISS Companion '24: "Balancing Autonomy: Investigating ..."
Balancing Autonomy: Investigating User-Controlled vs Automated Guidance Systems for Sequential Tasks
Robin Wiethüchter, Saikishore Kalloori, and David Lindlbauer (ETH Zurich, Switzerland; Carnegie Mellon University, USA) Self-guided tutorials are popular resources for learning new tasks, but they lack important aspects of in-person guidance like feedback or personalized explanations. Adaptive guidance systems aim to overcome this challenge by reacting to users' performance and expertise and adapting instructions accordingly. We aim to understand the users' preferred balance of automation and control, what representation of instructions they prefer, and how human experts give instructions to match users' needs. We contribute an experiment where users perform different virtual tasks, guided by instructions that are controlled by experts using a wizard-of-oz paradigm. We employ different levels of automation to control instructions and alter their level of detail and step granularity to match the user's needs. Results indicate that while users preferred automated systems for convenience and instant feedback, they appreciated a degree of manual control since they felt less rushed. Experts relied on factors such as expected expertise, hesitation, errors, and their understanding of the current task state as main triggers to adapt instructions. @InProceedings{ISS24p75, author = {Robin Wiethüchter and Saikishore Kalloori and David Lindlbauer}, title = {Balancing Autonomy: Investigating User-Controlled vs Automated Guidance Systems for Sequential Tasks}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {75--80}, doi = {10.1145/3696762.3698056}, year = {2024}, } Publisher's Version |
|
Liu, Yaxuan |
ISS Companion '24: "Customisable Lower Limb Rehabilitation ..."
Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy
Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
|
Mizukami, Hana |
ISS Companion '24: "Exploring the Impact of Size ..."
Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers
Hana Mizukami, Arinobu Niijima, Chanho Park, and Takefumi Ogawa (University of Tokyo, Japan; NTT Corporation, Japan) In balance training, such as ballet, observing one's posture in a mirror makes it easier to maintain balance. By using projectors, it is possible to show the user's posture from different angles, magnify specific body parts, and display the Center of Pressure (COP) trajectory in real time. This gives the user more visual feedback information than a mirror can provide, helping to improve balance. However, the appropriate projection position and size to enhance the effect of such visual feedback remains unclear. This study focuses on relevé in ballet and examines the effects of different types and positions of visual feedback on balance improvement. We conducted a user study and calculated balance metrics from COP data obtained from a balance board. The results indicate that only visual feedback projected directly at eye level in front of the user during relevé contributes to balance improvement. In contrast, visual feedback projected above eye level to the right did not show a clear effect on balance improvement. @InProceedings{ISS24p56, author = {Hana Mizukami and Arinobu Niijima and Chanho Park and Takefumi Ogawa}, title = {Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {56--59}, doi = {10.1145/3696762.3698052}, year = {2024}, } Publisher's Version |
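As an illustrative aside on the entry above: the study derives balance metrics from Center of Pressure (COP) data recorded by a balance board. The abstract does not list the exact metrics, so the sketch below computes two standard posturography measures (COP path length and RMS displacement) over synthetic samples, purely for illustration.

```python
# Illustrative sketch: two common balance metrics computed from COP samples
# (x, y positions over time), as a balance board might provide them. These are
# standard posturography measures, not necessarily the ones used in the paper.

from math import hypot, sqrt

# Synthetic COP trace in millimetres, assumed to be sampled at a fixed rate
cop = [(0.0, 0.0), (1.2, 0.4), (2.0, -0.3), (1.1, -1.0), (0.2, -0.2), (-0.5, 0.6)]

def path_length(samples):
    """Total distance travelled by the COP (larger values mean more sway)."""
    return sum(hypot(x2 - x1, y2 - y1)
               for (x1, y1), (x2, y2) in zip(samples, samples[1:]))

def rms_displacement(samples):
    """RMS distance of the COP from its mean position."""
    mx = sum(x for x, _ in samples) / len(samples)
    my = sum(y for _, y in samples) / len(samples)
    return sqrt(sum((x - mx) ** 2 + (y - my) ** 2 for x, y in samples) / len(samples))

print("COP path length (mm):", round(path_length(cop), 2))
print("RMS displacement (mm):", round(rms_displacement(cop), 2))
```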
|
Nakamura, Ayato |
ISS Companion '24: "Extracting Corneal Reflection ..."
Extracting Corneal Reflection of Screen by High-Speed Control of Polarization
Ayato Nakamura and Kentaro Takemura (Tokai University, Japan) Polarization has been considered to be a reference instead of near-infrared light sources in eye tracking because the light emitted from a liquid-crystal display is typically polarized. However, the degree of polarization depends on the display content. Thus, devising a novel method is crucial for stably extracting the display reflection from the corneal surface. Therefore, we propose an eye-tracking method that inserts a white background between the display contents using a high-speed display to extract the screen reflection on the cornea. A high-speed camera and polarization modulator are integrated to extract the polarized light emitted from a high-speed display, and then the point-of-gaze is estimated. We evaluated the accuracy of the estimated point-of-gaze under several conditions to compare the proposed method with conventional approaches. The results revealed that the proposed method improved eye gaze estimation. @InProceedings{ISS24p35, author = {Ayato Nakamura and Kentaro Takemura}, title = {Extracting Corneal Reflection of Screen by High-Speed Control of Polarization}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {35--38}, doi = {10.1145/3696762.3698048}, year = {2024}, } Publisher's Version |
|
Niijima, Arinobu |
ISS Companion '24: "Exploring the Impact of Size ..."
Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers
Hana Mizukami, Arinobu Niijima, Chanho Park, and Takefumi Ogawa (University of Tokyo, Japan; NTT Corporation, Japan) In balance training, such as in ballet, observing one's posture in a mirror makes it easier to maintain balance. By using projectors, it is possible to show the user's posture from different angles, magnify specific body parts, and display the Center of Pressure (COP) trajectory in real time. This gives the user more visual feedback information than a mirror can provide, helping to improve balance. However, the appropriate projection position and size to enhance the effect of such visual feedback remain unclear. This study focuses on relevé in ballet and examines the effects of different types and positions of visual feedback on balance improvement. We conducted a user study and calculated balance metrics from COP data obtained from a balance board. The results indicate that only visual feedback projected directly at eye level in front of the user during relevé contributed to balance improvement. In contrast, visual feedback projected above eye level to the right did not show a clear effect on balance improvement. @InProceedings{ISS24p56, author = {Hana Mizukami and Arinobu Niijima and Chanho Park and Takefumi Ogawa}, title = {Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {56--59}, doi = {10.1145/3696762.3698052}, year = {2024}, } Publisher's Version |
|
Ogawa, Takefumi |
ISS Companion '24: "Exploring the Impact of Size ..."
Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers
Hana Mizukami, Arinobu Niijima, Chanho Park, and Takefumi Ogawa (University of Tokyo, Japan; NTT Corporation, Japan) In balance training, such as in ballet, observing one's posture in a mirror makes it easier to maintain balance. By using projectors, it is possible to show the user's posture from different angles, magnify specific body parts, and display the Center of Pressure (COP) trajectory in real time. This gives the user more visual feedback information than a mirror can provide, helping to improve balance. However, the appropriate projection position and size to enhance the effect of such visual feedback remain unclear. This study focuses on relevé in ballet and examines the effects of different types and positions of visual feedback on balance improvement. We conducted a user study and calculated balance metrics from COP data obtained from a balance board. The results indicate that only visual feedback projected directly at eye level in front of the user during relevé contributed to balance improvement. In contrast, visual feedback projected above eye level to the right did not show a clear effect on balance improvement. @InProceedings{ISS24p56, author = {Hana Mizukami and Arinobu Niijima and Chanho Park and Takefumi Ogawa}, title = {Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {56--59}, doi = {10.1145/3696762.3698052}, year = {2024}, } Publisher's Version |
|
Oyekoya, Oyewole |
ISS Companion '24: "Exploring Effects of Interactive ..."
Exploring Effects of Interactive Virtual Reality Sensory Environment on Anxiety Reduction in Adolescents with Autism
Amaya E. Keys, Oyewole Oyekoya, and Daniel Chan (Howard University, USA; CUNY Hunter College, USA; CUNY, USA) Autism Spectrum Disorder (ASD) is a developmental disability often characterized by sensory processing difficulties that can lead to anxiety, particularly in children and adolescents. Previous research on virtual reality-based anxiety intervention tools focuses on using social skills training, exposure therapy, and meditative coaching to mitigate social and phobia-related anxiety. However, minimal work has specifically evaluated the effects of virtual multi-sensory environments for people with ASD, often only testing feasibility. This pilot study aims to build on previous work by investigating how various auditory, visual, and interactive components contribute to user satisfaction and sensory-related anxiety reduction. The objective is to gain a better understanding of which features are significant for developing a successful virtual anxiety intervention tool. Results suggest that using interactive activities that promote fine motor skills can provide a healthy outlet for self-mediated stress relief. Future development aims to incorporate task-based activities and enhance audio, visual, and lighting displays. The deployment of a full-scale study with a larger sample size and target participant pool is warranted to substantiate these initial findings. @InProceedings{ISS24p50, author = {Amaya E. Keys and Oyewole Oyekoya and Daniel Chan}, title = {Exploring Effects of Interactive Virtual Reality Sensory Environment on Anxiety Reduction in Adolescents with Autism}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {50--55}, doi = {10.1145/3696762.3698051}, year = {2024}, } Publisher's Version ISS Companion '24: "Gamification of Food Selection ..." Gamification of Food Selection and Nutrition Education in Virtual Reality Caroline A. Klein, Oyewole Oyekoya, and Margrethe Horlyck-Romanovsky (Vassar College, USA; CUNY Hunter College, USA; CUNY Brooklyn College, USA) The increasing global prevalence of obesity and related health issues underscores the need for innovative dietary interventions. This paper explores the potential of combining gamification and Virtual Reality (VR) to promote healthier eating habits among young adults. By creating an interactive VR food environment with engaging game elements, we aim to assess the impact of gamified VR intervention on nutritional knowledge and attitudes. Preliminary results show an increase in nutritional understanding and awareness, though further research is necessary for statistical validation. This study suggests that VR-based gamified interventions could be a promising tool for nutrition education, behavior modification, and virtual food selection. @InProceedings{ISS24p81, author = {Caroline A. Klein and Oyewole Oyekoya and Margrethe Horlyck-Romanovsky}, title = {Gamification of Food Selection and Nutrition Education in Virtual Reality}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {81--84}, doi = {10.1145/3696762.3698057}, year = {2024}, } Publisher's Version ISS Companion '24: "Enhancing Virtual Mobility ..."
Enhancing Virtual Mobility for Individuals Who Are Blind or Have Low Vision: A Stationary Exploration Method Hong Zhao, Oyewole Oyekoya, and Hao Tang (CUNY Borough of Manhattan Community College, USA; CUNY Hunter College, USA; CUNY New York, USA) Designing accessible locomotion methods for individuals who are blind or have low vision (BLV) is a complex challenge, particularly in mobile VR environments with limited interface options. In this paper, we propose a novel locomotion technique on mobile VR that enables users to control a virtual character's movement while staying stationary or within a small physical area. The technique utilizes the phone's gyroscope for movement control, while providing spatial audio and vibration feedback to enhance virtual exploration for BLV individuals. Our study examines how BLV individuals acquire spatial knowledge in mobile VR environments. A user study is conducted to assess the effectiveness of the proposed approach. @InProceedings{ISS24p85, author = {Hong Zhao and Oyewole Oyekoya and Hao Tang}, title = {Enhancing Virtual Mobility for Individuals Who Are Blind or Have Low Vision: A Stationary Exploration Method}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {85--89}, doi = {10.1145/3696762.3698058}, year = {2024}, } Publisher's Version |
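As a minimal sketch of the kind of stationary control the abstract describes (the actual mapping, feedback design, and thresholds are the authors'; the values below are assumptions), phone gyroscope yaw could steer the virtual character while forward tilt sets walking speed:

    # Minimal sketch with assumed parameters, not the authors' implementation:
    # steer a virtual character from phone orientation while the user stays put.
    import math
    from dataclasses import dataclass

    @dataclass
    class Character:
        x: float = 0.0
        y: float = 0.0
        heading: float = 0.0  # radians

    def update(ch, yaw_rate, pitch, dt, max_speed=1.2):
        """yaw_rate: rad/s from the gyroscope; pitch: forward tilt (rad)."""
        ch.heading += yaw_rate * dt                          # rotate in place
        speed = max_speed * max(0.0, min(1.0, pitch / 0.5))  # tilt -> speed
        ch.x += speed * math.cos(ch.heading) * dt
        ch.y += speed * math.sin(ch.heading) * dt
        # Spatial audio and vibration cues would be triggered here, e.g. when
        # the character approaches an obstacle or point of interest.
        return speed

    ch = Character()
    for _ in range(100):                                     # 1 s of input
        update(ch, yaw_rate=0.3, pitch=0.4, dt=0.01)
    print(round(ch.x, 2), round(ch.y, 2), round(ch.heading, 2))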
|
Park, Chanho |
ISS Companion '24: "Exploring the Impact of Size ..."
Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers
Hana Mizukami, Arinobu Niijima, Chanho Park, and Takefumi Ogawa (University of Tokyo, Japan; NTT Corporation, Japan) In balance training, such as in ballet, observing one's posture in a mirror makes it easier to maintain balance. By using projectors, it is possible to show the user's posture from different angles, magnify specific body parts, and display the Center of Pressure (COP) trajectory in real time. This gives the user more visual feedback information than a mirror can provide, helping to improve balance. However, the appropriate projection position and size to enhance the effect of such visual feedback remain unclear. This study focuses on relevé in ballet and examines the effects of different types and positions of visual feedback on balance improvement. We conducted a user study and calculated balance metrics from COP data obtained from a balance board. The results indicate that only visual feedback projected directly at eye level in front of the user during relevé contributed to balance improvement. In contrast, visual feedback projected above eye level to the right did not show a clear effect on balance improvement. @InProceedings{ISS24p56, author = {Hana Mizukami and Arinobu Niijima and Chanho Park and Takefumi Ogawa}, title = {Exploring the Impact of Size and Position on Visual Feedback Efficacy for Ballet Dancers}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {56--59}, doi = {10.1145/3696762.3698052}, year = {2024}, } Publisher's Version |
|
Pfeuffer, Ken |
ISS Companion '24: "Eye-Hand Movement of Objects ..."
Eye-Hand Movement of Objects in Near Space
Uta Wagner, Andreas Asferg Jacobsen, Tiare Feuchtner, Hans Gellersen, and Ken Pfeuffer (Aarhus University, Denmark; University of Konstanz, Germany; Lancaster University, United Kingdom) Hand-tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures, to pick, drag and drop objects in 3D. We explore the use of eye-tracking to reduce the effort involved in this interaction. As the eyes naturally look ahead to the target for a drag operation, the principal idea is to map the translation of the object in the image plane to gaze, such that the hand only needs to control the depth component of the operation. We demo several applications we built for 3D manipulation, including area selection, 3D path specification, and a "Bejeweled"-inspired game, showing potential for effortless drag-and-drop actions in 3D space. This demonstration includes the study apparatus and the applications from a paper that will be presented at UIST'24 with the same title (Wagner 2024, Proc. UIST). @InProceedings{ISS24p20, author = {Uta Wagner and Andreas Asferg Jacobsen and Tiare Feuchtner and Hans Gellersen and Ken Pfeuffer}, title = {Eye-Hand Movement of Objects in Near Space}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {20--23}, doi = {10.1145/3696762.3698044}, year = {2024}, } Publisher's Version |
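The decomposition described above, gaze for image-plane placement and hand for depth, can be written very compactly. The sketch below is an illustrative reconstruction under assumed names and units, not the demo's source code:

    # Illustrative reconstruction (assumed names and units, not the demo's
    # code): the dragged object sits on the gaze ray; hand motion only changes
    # how far along that ray it sits.
    import numpy as np

    def place_object(gaze_origin, gaze_dir, depth):
        gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)  # unit gaze direction
        return gaze_origin + depth * gaze_dir

    gaze_origin = np.array([0.0, 1.6, 0.0])   # eye position in metres
    depth = 0.6                               # initial object distance
    for gaze_dir, hand_push in [([0.1, -0.1, 1.0], 0.05),
                                ([0.3, -0.2, 1.0], 0.10)]:
        depth += hand_push                    # hand delta mapped to depth only
        print(np.round(place_object(gaze_origin, np.array(gaze_dir), depth), 2))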
|
Pfister, Hanspeter |
ISS Companion '24: "PanoCoach: Enhancing Tactical ..."
PanoCoach: Enhancing Tactical Coaching and Communication in Soccer with Mixed-Reality Telepresence
Andrew Kang, Hanspeter Pfister, and Tica Lin (Harvard University, USA; Rice University, USA) Soccer, as a dynamic team sport, requires seamless coordination and integration of tactical strategies across all players. Adapting to new tactical systems is a critical but often challenging aspect of soccer at all professional levels. Even the best players can struggle with this process, primarily due to the complexities of conveying and internalizing intricate tactical patterns. Traditional communication methods like whiteboards, on-field instructions, and video analysis often present significant difficulties in perceiving spatial relationships, anticipating team movements, and facilitating live conversation during training sessions. These challenges can lead to inconsistent interpretations of the coach’s tactics among players, regardless of their skill level. To bridge the gap between tactical communication and physical execution, we propose a mixed-reality telepresence solution, PanoCoach, designed to support multi-view tactical explanations during practice. Our concept involves a multi-screen setup combining a tablet for coaches to annotate and demonstrate concepts in both 2D and 3D views, alongside VR to immerse athletes in a first-person perspective, allowing them to experience a sense of presence during coaching. In our preliminary study, we prototyped the cross-device functionality to implement the key steps of our approach: Step 1, where the coach uses a tablet to provide clear and dynamic tactical instructions, Step 2, where players engage with these instructions through an immersive VR experience, and Step 3, where the coach tracks players' movements and provides real time feedback. User evaluation with coaches at City Football Group, Harvard Soccer and Rice Soccer suggests this mixed-reality telepresence approach holds promising potential for improving tactical understanding and communication. Based on these findings, we outline future directions and discuss the research needed to expand this approach beyond controlled indoor environments, such as locker rooms, leveraging telepresence to enhance tactical comprehension and simulated training. @InProceedings{ISS24p15, author = {Andrew Kang and Hanspeter Pfister and Tica Lin}, title = {PanoCoach: Enhancing Tactical Coaching and Communication in Soccer with Mixed-Reality Telepresence}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {15--19}, doi = {10.1145/3696762.3698043}, year = {2024}, } Publisher's Version Video |
|
Purdue, Sydney |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension
Maryam Rezaie, Anjali Khurana, Parmit Chilana, Sheelagh Carpendale, Melanie Tory, and Sydney Purdue (Simon Fraser University, Canada; Northeastern University, USA) Interfaces and visualizations often challenge comprehension, especially as they grow in complexity. Traditional methods—relying on standard inputs like touch, mouse, and keyboard—fall short in addressing the nuanced demands for explainability in complex systems. This workshop explores innovative interaction strategies to enhance self-enabled comprehension, focusing on the development and refinement of new devices and input modalities. We aim to gather researchers, designers, and practitioners to exchange ideas and explore interaction techniques that promote clearer understanding and transparency across diverse applications. This collaborative effort seeks to advance interactive systems that improve explanation and user comprehension in digital environments. @InProceedings{ISS24p29, author = {Maryam Rezaie and Anjali Khurana and Parmit Chilana and Sheelagh Carpendale and Melanie Tory and Sydney Purdue}, title = {Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {29--30}, doi = {10.1145/3696762.3698046}, year = {2024}, } Publisher's Version |
|
Rezaie, Maryam |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension
Maryam Rezaie, Anjali Khurana, Parmit Chilana, Sheelagh Carpendale, Melanie Tory, and Sydney Purdue (Simon Fraser University, Canada; Northeastern University, USA) Interfaces and visualizations often challenge comprehension, especially as they grow in complexity. Traditional methods—relying on standard inputs like touch, mouse, and keyboard—fall short in addressing the nuanced demands for explainability in complex systems. This workshop explores innovative interaction strategies to enhance self-enabled comprehension, focusing on the development and refinement of new devices and input modalities. We aim to gather researchers, designers, and practitioners to exchange ideas and explore interaction techniques that promote clearer understanding and transparency across diverse applications. This collaborative effort seeks to advance interactive systems that improve explanation and user comprehension in digital environments. @InProceedings{ISS24p29, author = {Maryam Rezaie and Anjali Khurana and Parmit Chilana and Sheelagh Carpendale and Melanie Tory and Sydney Purdue}, title = {Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {29--30}, doi = {10.1145/3696762.3698046}, year = {2024}, } Publisher's Version |
|
Sakaguchi, Saki |
ISS Companion '24: "Spot Shadow: A System for ..."
Spot Shadow: A System for Manipulating Shadows in Spatial Design
Saki Sakaguchi (Tokyo Metropolitan University, Japan) The design of the lighting environment is important for determining room specifications. Recently, systems that can change the lighting environment in a room by controlling the direction and intensity of light using a computer have been proposed. However, with such methods of controlling light, it is easy to specify the size and position of the brightly illuminated area, but not of the darkened area. In this study, we propose a method for controlling dark areas by manipulating shadows. We construct Spot Shadow, a system that can generate shadows of arbitrary sizes and shapes at positions specified by users. Our tabletop and large-scale prototypes can create shadow-generation areas of 1 m × 0.9 m and 7 m × 4.3 m, respectively. This study thus demonstrates new possibilities for creating interactive spaces using shadows. @InProceedings{ISS24p11, author = {Saki Sakaguchi}, title = {Spot Shadow: A System for Manipulating Shadows in Spatial Design}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {11--14}, doi = {10.1145/3696762.3698042}, year = {2024}, } Publisher's Version |
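At its core, producing a shadow with a projector amounts to leaving part of the projected frame dark; the calibration, optics, and interaction in Spot Shadow are of course the author's own. A toy sketch of such a frame, with an assumed circular region, follows.

    # Toy sketch only (Spot Shadow's calibration and projection pipeline are
    # the author's): a grayscale projector frame that lights everything except
    # a user-specified circular region, which is left dark to read as a shadow.
    import numpy as np

    def shadow_frame(width, height, center, radius, bright=255):
        ys, xs = np.mgrid[0:height, 0:width]
        inside = (xs - center[0]) ** 2 + (ys - center[1]) ** 2 <= radius ** 2
        frame = np.full((height, width), bright, dtype=np.uint8)
        frame[inside] = 0                      # no projected light -> shadow
        return frame

    frame = shadow_frame(640, 360, center=(320, 180), radius=60)
    print(frame.shape, int(frame.min()), int(frame.max()))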
|
Satriadi, Kadek Ananta |
ISS Companion '24: "Once Upon a Data Story: A ..."
Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling
Radhika Pankaj Jain, Kadek Ananta Satriadi, Adam Drogemuller, Ross Smith, and Andrew Cunningham (University of South Australia, Australia; Monash University, Australia) Immersive data storytelling is an emerging field that combines narrative visualisation and immersive analytics to engage an audience. While there are existing design spaces for narrative visualisation on 2D displays, there are no guidelines for creating immersive data stories, making it difficult for practitioners and researchers to explore this space. In this paper, we present a preliminary design space for immersive data storytelling that is informed by current practices and multi-disciplinary views. We interviewed multi-disciplinary experts, including museum designers, architects, and game designers, to understand how they communicate stories in physical spaces and immersive mediums. We applied inductive thematic analysis to the interview responses to inform the dimensions of the design space and analysed a systematic selection of publicly available immersive stories. The resulting design space comprises 13 dimensions across 7 categories. We present insights into this design space, highlighting common practices and areas for future research. @InProceedings{ISS24p65, author = {Radhika Pankaj Jain and Kadek Ananta Satriadi and Adam Drogemuller and Ross Smith and Andrew Cunningham}, title = {Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {65--70}, doi = {10.1145/3696762.3698054}, year = {2024}, } Publisher's Version |
|
Smith, Ross |
ISS Companion '24: "Once Upon a Data Story: A ..."
Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling
Radhika Pankaj Jain, Kadek Ananta Satriadi, Adam Drogemuller, Ross Smith, and Andrew Cunningham (University of South Australia, Australia; Monash University, Australia) Immersive data storytelling is an emerging field that combines narrative visualisation and immersive analytics to engage an audience. While there are existing design spaces for narrative visualisation on 2D displays, there are no guidelines for creating immersive data stories, making it difficult for practitioners and researchers to explore this space. In this paper, we present a preliminary design space for immersive data storytelling that is informed by current practices and multi-disciplinary views. We interviewed multi-disciplinary experts, including museum designers, architects, and game designers, to understand how they communicate stories in physical spaces and immersive mediums. We applied inductive thematic analysis to the interview responses to inform the dimensions of the design space and analysed a systematic selection of publicly available immersive stories. The resulting design space comprises 13 dimensions across 7 categories. We present insights into this design space, highlighting common practices and areas for future research. @InProceedings{ISS24p65, author = {Radhika Pankaj Jain and Kadek Ananta Satriadi and Adam Drogemuller and Ross Smith and Andrew Cunningham}, title = {Once Upon a Data Story: A Preliminary Design Space for Immersive Data Storytelling}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {65--70}, doi = {10.1145/3696762.3698054}, year = {2024}, } Publisher's Version |
|
Sturdee, Miriam |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction
Zezhong Wang, Samuel Huron, Miriam Sturdee, and Sheelagh Carpendale (Simon Fraser University, Canada; Télécom Paris - Institut Polytechnique de Paris, France; University of St Andrews, United Kingdom) Visual methods have become increasingly vital in Human Computer Interaction (HCI) research, particularly as we analyze and interpret the complex visual data that emerges from various interaction modalities. However, the methodologies for analyzing this visual data remain underdeveloped compared to textual data analysis. This workshop seeks to unite HCI researchers who work with visual data — such as hand sketches, photographs, physical artifacts, UI screenshots, videos, and information visualizations — to identify, name, and categorize methods for analyzing visual data in HCI. @InProceedings{ISS24p31, author = {Zezhong Wang and Samuel Huron and Miriam Sturdee and Sheelagh Carpendale}, title = {Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {31--34}, doi = {10.1145/3696762.3698047}, year = {2024}, } Publisher's Version |
|
Sun, Jie |
ISS Companion '24: "A Two-Handed Ellipsoidal Device ..."
A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise
Bingjie Xu, Yijia An, Qinglei Bu, and Jie Sun (Suzhou Industrial Park Institute of Vocational Technology, China; Xi’an Jiaotong-Liverpool University, China) Many children with cerebral palsy (CP) need to perform various exercises to restore motor control for specific functions such as hand grasping and gripping. During daily exercises, they need intensive support from either therapists or caregivers in setting tasks and providing feedback, which creates a heavy workload. Thus, we introduce a two-handed ellipsoidal device to control computer games for interactive grasping and gripping rehabilitation training. The ellipsoidal device is designed to house an ESP32 microcontroller, a Wheeltec N100 IMU and an SF15 flexible thin-film pressure sensor so as to monitor children’s grip strength and wrist rotation. The sensing data can be used to control the character's motion in computer games. Preliminary user trials supported the use of such devices in hospitals for hand grasping and gripping exercises and for exercises that train cognition and coordination between the eyes, ears, and hands. @InProceedings{ISS24p39, author = {Bingjie Xu and Yijia An and Qinglei Bu and Jie Sun}, title = {A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {39--43}, doi = {10.1145/3696762.3698049}, year = {2024}, } Publisher's Version ISS Companion '24: "Customisable Lower Limb Rehabilitation ..." Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
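To make the sensing-to-game link in the ellipsoidal-device entry concrete, the sketch below maps the two sensed quantities, grip pressure and wrist roll, to simple game commands; the thresholds, value ranges, and command names are assumptions, not the device's firmware.

    # Hedged sketch with assumed thresholds and ranges, not the device
    # firmware: map grip pressure (thin-film sensor, ADC counts) and wrist
    # roll (IMU, degrees) to simple game-character commands.
    def to_game_command(pressure_raw, roll_deg,
                        press_threshold=500, max_pressure=1023):
        gripping = pressure_raw >= press_threshold
        intensity = min(1.0, pressure_raw / max_pressure)   # 0..1 grip strength
        if roll_deg < -15:
            steer = "left"
        elif roll_deg > 15:
            steer = "right"
        else:
            steer = "straight"
        return {"action": "grab" if gripping else "idle",
                "intensity": round(intensity, 2),
                "steer": steer}

    print(to_game_command(pressure_raw=730, roll_deg=-22))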
|
Takemura, Kentaro |
ISS Companion '24: "Extracting Corneal Reflection ..."
Extracting Corneal Reflection of Screen by High-Speed Control of Polarization
Ayato Nakamura and Kentaro Takemura (Tokai University, Japan) Polarization has been considered to be a reference instead of near-infrared light sources in eye tracking because the light emitted from a liquid-crystal display is typically polarized. However, the degree of polarization depends on the display content. Thus, devising a novel method is crucial for stably extracting the display reflection from the corneal surface. Therefore, we propose an eye-tracking method that inserts a white background between the display contents using a high-speed display to extract the screen reflection on the cornea. A high-speed camera and polarization modulator are integrated to extract the polarized light emitted from a high-speed display, and then the point-of-gaze is estimated. We evaluated the accuracy of the estimated point-of-gaze under several conditions to compare the proposed method with conventional approaches. The results revealed that the proposed method improved eye gaze estimation. @InProceedings{ISS24p35, author = {Ayato Nakamura and Kentaro Takemura}, title = {Extracting Corneal Reflection of Screen by High-Speed Control of Polarization}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {35--38}, doi = {10.1145/3696762.3698048}, year = {2024}, } Publisher's Version |
|
Tang, Hao |
ISS Companion '24: "Enhancing Virtual Mobility ..."
Enhancing Virtual Mobility for Individuals Who Are Blind or Have Low Vision: A Stationary Exploration Method
Hong Zhao, Oyewole Oyekoya, and Hao Tang (CUNY Borough of Manhattan Community College, USA; CUNY Hunter College, USA; CUNY New York, USA) Designing accessible locomotion methods for individuals who are blind or have low vision (BLV) is a complex challenge, particularly in mobile VR environments with limited interface options. In this paper, we propose a novel locomotion technique on mobile VR that enables users to control a virtual character's movement while staying stationary or within a small physical area. The technique utilizes the phone's gyroscope for movement control, while providing spatial audio and vibration feedback to enhance virtual exploration for BLV individuals. Our study examines how BLV individuals acquire spatial knowledge in mobile VR environments. A user study is conducted to assess the effectiveness of the proposed approach. @InProceedings{ISS24p85, author = {Hong Zhao and Oyewole Oyekoya and Hao Tang}, title = {Enhancing Virtual Mobility for Individuals Who Are Blind or Have Low Vision: A Stationary Exploration Method}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {85--89}, doi = {10.1145/3696762.3698058}, year = {2024}, } Publisher's Version |
|
Ten Bhömer, Martijn |
ISS Companion '24: "Customisable Lower Limb Rehabilitation ..."
Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy
Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
|
Tory, Melanie |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension
Maryam Rezaie, Anjali Khurana, Parmit Chilana, Sheelagh Carpendale, Melanie Tory, and Sydney Purdue (Simon Fraser University, Canada; Northeastern University, USA) Interfaces and visualizations often challenge comprehension, especially as they grow in complexity. Traditional methods—relying on standard inputs like touch, mouse, and keyboard—fall short in addressing the nuanced demands for explainability in complex systems. This workshop explores innovative interaction strategies to enhance self-enabled comprehension, focusing on the development and refinement of new devices and input modalities. We aim to gather researchers, designers, and practitioners to exchange ideas and explore interaction techniques that promote clearer understanding and transparency across diverse applications. This collaborative effort seeks to advance interactive systems that improve explanation and user comprehension in digital environments. @InProceedings{ISS24p29, author = {Maryam Rezaie and Anjali Khurana and Parmit Chilana and Sheelagh Carpendale and Melanie Tory and Sydney Purdue}, title = {Summary of the Workshop on Interactions for Supporting Explanation and Promoting Comprehension}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {29--30}, doi = {10.1145/3696762.3698046}, year = {2024}, } Publisher's Version |
|
Uddin, Md. Sami |
ISS Companion '24: "Effects of Increasing Command ..."
Effects of Increasing Command Capacity of Spatial Memory Menus in Tablets
Sayeem Md. Abdullah and Md. Sami Uddin (University of Regina, Canada) Spatially-stable touch menus, like FastTap, leverage users’ spatial memory to enable rapid command selection on tablets. Although these spatial tablet interfaces can aid in developing spatial memory for a small command set, it is unknown whether spatial memory remains beneficial when the number of commands grows. Therefore, we carried out a study to investigate spatial learning in four different sizes of single-tab FastTap Menus: Small, Medium, Large, and Extra-Large, with 16, 30, 42, and 56 items, respectively. Results indicated that people do develop spatial memory in all menus; however, there is a negative correlation between command capacity and spatial memory development in tablets. We contribute new knowledge on spatial memory development in touch tablets that can enhance the design of future spatial memory-based tablet interfaces. @InProceedings{ISS24p71, author = {Sayeem Md. Abdullah and Md. Sami Uddin}, title = {Effects of Increasing Command Capacity of Spatial Memory Menus in Tablets}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {71--74}, doi = {10.1145/3696762.3698055}, year = {2024}, } Publisher's Version |
|
Vergara, Katherine |
ISS Companion '24: "A Physical Computing Workshop ..."
A Physical Computing Workshop to Engage Girls from Low-Income Backgrounds
Katherine Vergara (Pontificia Universidad Católica de Chile, Chile) The persistent gender gap in computer science, especially among women from low-income backgrounds, continues to limit diversity and innovation within the technological sector. This underrepresentation also restricts access to career paths that can enhance social mobility, particularly for women in developing countries. Physical computing offers a hands-on approach that can improve programming skills and computational thinking through interaction with tangible hardware. This research focuses on developing a short physical computing workshop tailored to young girls from low-income communities. The study combines education, tangible interfaces, and coding. Through a series of classroom-based studies and laboratory experiments, this PhD work will assess the impact of the workshop on self-efficacy and learning in programming and computational thinking. The anticipated contributions of this research include insights into the effectiveness of tangible, user-friendly physical computing workshops in increasing engagement among underrepresented groups in computer science. @InProceedings{ISS24p1, author = {Katherine Vergara}, title = {A Physical Computing Workshop to Engage Girls from Low-Income Backgrounds}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {1--3}, doi = {10.1145/3696762.3698039}, year = {2024}, } Publisher's Version |
|
Wagner, Uta |
ISS Companion '24: "Eye-Hand Movement of Objects ..."
Eye-Hand Movement of Objects in Near Space
Uta Wagner, Andreas Asferg Jacobsen, Tiare Feuchtner, Hans Gellersen, and Ken Pfeuffer (Aarhus University, Denmark; University of Konstanz, Germany; Lancaster University, United Kingdom) Hand-tracking in Extended Reality (XR) enables moving objects in near space with direct hand gestures, to pick, drag and drop objects in 3D. We explore the use of eye-tracking to reduce the effort involved in this interaction. As the eyes naturally look ahead to the target for a drag operation, the principal idea is to map the translation of the object in the image plane to gaze, such that the hand only needs to control the depth component of the operation. We demo several applications we built for 3D manipulation, including area selection, 3D path specification, and a "Bejeweled"-inspired game, showing potential for effortless drag-and-drop actions in 3D space. This demonstration includes the study apparatus and the applications from a paper that will be presented at UIST'24 with the same title (Wagner 2024, Proc. UIST). @InProceedings{ISS24p20, author = {Uta Wagner and Andreas Asferg Jacobsen and Tiare Feuchtner and Hans Gellersen and Ken Pfeuffer}, title = {Eye-Hand Movement of Objects in Near Space}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {20--23}, doi = {10.1145/3696762.3698044}, year = {2024}, } Publisher's Version |
|
Wang, Zezhong |
ISS Companion '24: "Summary of the Workshop on ..."
Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction
Zezhong Wang, Samuel Huron, Miriam Sturdee, and Sheelagh Carpendale (Simon Fraser University, Canada; Télécom Paris - Institut Polytechnique de Paris, France; University of St Andrews, United Kingdom) Visual methods have become increasingly vital in Human Computer Interaction (HCI) research, particularly as we analyze and interpret the complex visual data that emerges from various interaction modalities. However, the methodologies for analyzing this visual data remain underdeveloped compared to textual data analysis. This workshop seeks to unite HCI researchers who work with visual data — such as hand sketches, photographs, physical artifacts, UI screenshots, videos, and information visualizations — to identify, name, and categorize methods for analyzing visual data in HCI. @InProceedings{ISS24p31, author = {Zezhong Wang and Samuel Huron and Miriam Sturdee and Sheelagh Carpendale}, title = {Summary of the Workshop on Visual Methods and Analyzing Visual Data in Human Computer Interaction}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {31--34}, doi = {10.1145/3696762.3698047}, year = {2024}, } Publisher's Version |
|
Wiethüchter, Robin |
ISS Companion '24: "Balancing Autonomy: Investigating ..."
Balancing Autonomy: Investigating User-Controlled vs Automated Guidance Systems for Sequential Tasks
Robin Wiethüchter, Saikishore Kalloori, and David Lindlbauer (ETH Zurich, Switzerland; Carnegie Mellon University, USA) Self-guided tutorials are popular resources for learning new tasks, but they lack important aspects of in-person guidance like feedback or personalized explanations. Adaptive guidance systems aim to overcome this challenge by reacting to users' performance and expertise and adapting instructions accordingly. We aim to understand the users' preferred balance of automation and control, what representation of instructions they prefer, and how human experts give instructions to match users' needs. We contribute an experiment where users perform different virtual tasks, guided by instructions that are controlled by experts using a wizard-of-oz paradigm. We employ different levels of automation to control instructions and alter their level of detail and step granularity to match the user's needs. Results indicate that while users preferred automated systems for convenience and instant feedback, they appreciated a degree of manual control since they felt less rushed. Experts relied on factors such as expected expertise, hesitation, errors, and their understanding of the current task state as main triggers to adapt instructions. @InProceedings{ISS24p75, author = {Robin Wiethüchter and Saikishore Kalloori and David Lindlbauer}, title = {Balancing Autonomy: Investigating User-Controlled vs Automated Guidance Systems for Sequential Tasks}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {75--80}, doi = {10.1145/3696762.3698056}, year = {2024}, } Publisher's Version |
|
Xu, Bingjie |
ISS Companion '24: "A Two-Handed Ellipsoidal Device ..."
A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise
Bingjie Xu, Yijia An, Qinglei Bu, and Jie Sun (Suzhou Industrial Park Institute of Vocational Technology, China; Xi’an Jiaotong-Liverpool University, China) Many children with cerebral palsy (CP) need to perform various exercises to restore motor control for specific functions such as hand grasping and gripping. During daily exercises, they need intensive support from either therapists or caregivers in setting tasks and providing feedback, which creates a heavy workload. Thus, we introduce a two-handed ellipsoidal device to control computer games for interactive grasping and gripping rehabilitation training. The ellipsoidal device is designed to house an ESP32 microcontroller, a Wheeltec N100 IMU and an SF15 flexible thin-film pressure sensor so as to monitor children’s grip strength and wrist rotation. The sensing data can be used to control the character's motion in computer games. Preliminary user trials supported the use of such devices in hospitals for hand grasping and gripping exercises and for exercises that train cognition and coordination between the eyes, ears, and hands. @InProceedings{ISS24p39, author = {Bingjie Xu and Yijia An and Qinglei Bu and Jie Sun}, title = {A Two-Handed Ellipsoidal Device for Interactive Grasping and Gripping Rehabilitation Exercise}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {39--43}, doi = {10.1145/3696762.3698049}, year = {2024}, } Publisher's Version |
|
Yasmin, Shamima |
ISS Companion '24: "Have Fun with Math: Multimodal, ..."
Have Fun with Math: Multimodal, Interactive, and Immersive Exploration of Wave Functions with 3D Models
Shamima Yasmin (Eastern Washington University, USA) Postsecondary STEM courses include substantial mathematics and algorithms. Students need motivation to dig deep into the topic. Multisensory modeling allows users to explore objects with multiple senses, i.e., audio, visual, and touch. If augmented with virtual reality (VR), the overall experience could be more immersive and enjoyable. This research investigated students’ experience with unimodal versus multimodal visualization and exploration of wave functions with and without VR integration. Students interactively explored wave functions, i.e., sine, cosine, sawtooth, square, and triangular waves, with 3D models to concretize understanding of wave parameters, i.e., frequency, amplitude, phase, and vertical shifts. Initial findings showed that students preferred the audio-visual exploration of wave functions over the visual-only version in VR-enhanced and non-VR platforms. Overall, VR enriched their experience while interacting with the data. @InProceedings{ISS24p24, author = {Shamima Yasmin}, title = {Have Fun with Math: Multimodal, Interactive, and Immersive Exploration of Wave Functions with 3D Models}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {24--28}, doi = {10.1145/3696762.3698045}, year = {2024}, } Publisher's Version |
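The wave parameters listed above (amplitude A, frequency f, phase phi, vertical shift d) apply uniformly to all five wave families, e.g. a sine wave is y(t) = A*sin(2*pi*f*t + phi) + d. A small sketch generating the five waveforms is shown below; it only reproduces the mathematics, not the paper's 3D or audio rendering.

    # Small sketch of the five wave families with shared parameters A, f, phi,
    # d (the paper's VR/3D and audio rendering are not reproduced here).
    import numpy as np
    from scipy import signal

    def wave(kind, t, A=1.0, f=1.0, phi=0.0, d=0.0):
        arg = 2 * np.pi * f * t + phi
        base = {"sine": np.sin(arg),
                "cosine": np.cos(arg),
                "square": signal.square(arg),
                "sawtooth": signal.sawtooth(arg),
                "triangular": signal.sawtooth(arg, width=0.5)}[kind]
        return A * base + d                    # amplitude and vertical shift

    t = np.linspace(0, 2, 500)
    for kind in ("sine", "cosine", "square", "sawtooth", "triangular"):
        y = wave(kind, t, A=2, f=3, phi=np.pi / 4, d=0.5)
        print(kind, round(float(y.min()), 2), round(float(y.max()), 2))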
|
Zhang, Keming |
ISS Companion '24: "Customisable Lower Limb Rehabilitation ..."
Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy
Yaxuan Liu, Yijia An, Keming Zhang, Martijn Ten Bhömer, Qinglei Bu, Jie Sun, and Siyuan Chen (National University of Singapore (Suzhou) Research Institute, China; Xi’an Jiaotong-Liverpool University, China) Cerebral Palsy (CP) affects motor coordination, resulting in slow walking and irregular step and stride lengths. Effective rehabilitation exercises are essential for strengthening leg muscles and enhancing mobility in children with CP. However, traditional hospital rehabilitation programs often lack engagement, making it challenging for children to maintain consistent participation. Additionally, many advanced lower extremity rehabilitation systems remain largely inaccessible. This study introduces an interactive gaming carpet combined with intelligent gait pattern analysis to enhance rehabilitation efforts. The gaming carpet employs visual and auditory cues to train leg coordination and correct stepping patterns, making the exercises more engaging for children. Meanwhile, the intelligent gait analysis system provides therapists with objective data to assess conditions and develop personalized exercise plans. Initial tests indicate that this system effectively engages children and improves adherence to rehabilitation exercises, while also providing accurate progress monitoring. This innovative approach demonstrates significant potential for integrating game-based interventions and data analysis into CP rehabilitation, offering practical solutions for both clinical and home-based settings. @InProceedings{ISS24p44, author = {Yaxuan Liu and Yijia An and Keming Zhang and Martijn Ten Bhömer and Qinglei Bu and Jie Sun and Siyuan Chen}, title = {Customisable Lower Limb Rehabilitation Carpet for Children with Cerebral Palsy}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {44--49}, doi = {10.1145/3696762.3698050}, year = {2024}, } Publisher's Version |
|
Zhao, Hong |
ISS Companion '24: "Enhancing Virtual Mobility ..."
Enhancing Virtual Mobility for Individuals Who Are Blind or Have Low Vision: A Stationary Exploration Method
Hong Zhao, Oyewole Oyekoya, and Hao Tang (CUNY Borough of Manhattan Community College, USA; CUNY Hunter College, USA; CUNY New York, USA) Designing accessible locomotion methods for individuals who are blind or have low vision (BLV) is a complex challenge, particularly in mobile VR environments with limited interface options. In this paper, we propose a novel locomotion technique on mobile VR that enables users to control a virtual character's movement while staying stationary or within a small physical area. The technique utilizes the phone's gyroscope for movement control, while providing spatial audio and vibration feedback to enhance virtual exploration for BLV individuals. Our study examines how BLV individuals acquire spatial knowledge in mobile VR environments. A user study is conducted to assess the effectiveness of the proposed approach. @InProceedings{ISS24p85, author = {Hong Zhao and Oyewole Oyekoya and Hao Tang}, title = {Enhancing Virtual Mobility for Individuals Who Are Blind or Have Low Vision: A Stationary Exploration Method}, booktitle = {Proc.\ ISS}, publisher = {ACM}, pages = {85--89}, doi = {10.1145/3696762.3698058}, year = {2024}, } Publisher's Version |
63 authors