
2017 IEEE Symposium on 3D User Interfaces (3DUI), March 18-19, 2017, Los Angeles, CA, USA

3DUI 2017 – Proceedings


Contest
Sat, Mar 18, 17:15 - 17:45, Parkview

Augmented Reality Exhibits of Constructive Art: 8th Annual 3DUI Contest
Rongkai Guo, Ryan P. McMahan, and Benjamin Weyers
(Kennesaw State University, USA; University of Texas at Dallas, USA; RWTH Aachen University, Germany)
The 8th annual IEEE 3DUI Contest focuses on the development of a 3D User Interface (3DUI) for an Augmented Reality (AR) exhibit of constructive art. The 3DUI Contest is part of the 2017 IEEE Symposium on 3D User Interfaces held in Los Angeles, California. The contest was open to anyone interested in 3DUIs, from researchers to students, enthusiasts, and professionals. The purpose of the contest is to stimulate innovative and creative solutions to challenging 3DUI problems.

AACT: A Mobile Augmented Reality Application for Art Creation
Ayush Bhargava, Jeffrey Bertrand, and Sabarish V. Babu
(Clemson University, USA)
In this paper, we present the Augmented-Art Creation Tool (AACT) as our solution to the IEEE 3DUI 2017 challenge. Our solution employs multi-finger touch gestures along with the built-in camera and accelerometer on a mobile device for interaction in an Augmented Reality (AR) setup. We leverage a user's knowledge of touch gestures like pinching, swiping, etc., and physical device movement to interact with the environment, making the metaphor intuitive. The system helps prevent occlusion by using the accelerometer and by allowing touch gestures anywhere on the screen. The interaction metaphor allows for successful art piece creation and assembly.
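The abstract does not give implementation details; a minimal sketch of how such a gesture-to-transform mapping might look (all names and gain constants are hypothetical, not the authors' code):

```python
# Illustrative sketch (not the AACT implementation): dispatching two common
# touch gestures to object transforms in an AR scene.

def apply_gesture(obj, gesture):
    """Update a scene object's transform from a recognized touch gesture."""
    if gesture["type"] == "pinch":
        # Pinch ratio > 1 grows the object, < 1 shrinks it.
        obj["scale"] *= gesture["ratio"]
    elif gesture["type"] == "swipe":
        # Translate in the camera's image plane; 0.01 is an assumed
        # pixels-to-scene-units gain.
        dx, dy = gesture["delta"]
        obj["position"][0] += dx * 0.01
        obj["position"][1] -= dy * 0.01  # screen y points down
    return obj

obj = {"scale": 1.0, "position": [0.0, 0.0, 0.0]}
apply_gesture(obj, {"type": "pinch", "ratio": 2.0})
apply_gesture(obj, {"type": "swipe", "delta": (100, -50)})
```

Because the gestures only update a transform state, they can be recognized anywhere on the screen, which is consistent with the occlusion-avoidance point made in the abstract.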

HOT: Hold your Own Tools for AR-Based Constructive Art
Giuseppe Attanasio, Alberto Cannavò, Francesca Cibrario, Fabrizio Lamberti, Paolo Montuschi, and Gianluca Paravati
(Politecnico di Torino, Italy)
Using digital instruments to support artistic expression and creativity is a hot topic. In this work, we focused on the design of a suitable interface for Augmented Reality-based constructive art on handheld devices. Issues to be faced encompassed how to give artists a sense of spatial dimensions, how to provide them with different tools for realizing artworks, and how far to move away from ``the real'' towards ``the virtual''. Through a touch-capable device, such as a smartphone or a tablet, we offer artists a clean workspace, where they can decide when to introduce artworks and tools. In fact, besides exploiting the multi-touch functionality and the gyroscopes/accelerometers to manipulate artworks in six degrees of freedom (6DOF), the proposed solution exploits a set of printed markers that can be brought into the camera's field of view to make specific virtual tools appear in the augmented scene. With such tools, artists can control, e.g., manipulation speed, scale factor, and scene parameters, thus complementing functionalities that can be accessed via the device's screen.
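The marker-summoned-tool idea can be sketched as a simple registry from marker IDs to virtual tools; the IDs and tool names below are hypothetical, chosen only to mirror the examples in the abstract:

```python
# Hypothetical sketch of the HOT marker-to-tool idea: each printed marker ID
# summons one virtual tool when it enters the camera's field of view.

TOOL_REGISTRY = {
    7: "speed_dial",     # controls manipulation speed
    12: "scale_slider",  # controls scale factor
    23: "scene_panel",   # controls scene parameters
}

def visible_tools(detected_marker_ids):
    """Return the virtual tools to show for the markers currently in view."""
    return [TOOL_REGISTRY[m] for m in detected_marker_ids if m in TOOL_REGISTRY]

print(visible_tools([7, 23, 99]))  # marker 99 has no tool and is ignored
```

Removing a marker from the camera's view simply drops its tool from the returned list, which matches the "clean workspace until a tool is introduced" framing.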

Batmen Beyond: Natural 3D Manipulation with the BatWand
André Montes Rodrigues, Olavo Belloc, Eduardo Zilles Borba, Mario Nagamura, and Marcelo Knorich Zuffo
(University of São Paulo, Brazil)
In this work we present an interactive 3D object manipulation system using off-the-shelf mobile devices coupled with Augmented Reality (AR) technology that allows editing 3D objects by way of natural interactions based on tangible interface paradigms. The set-up consists of a mobile device, an interactive wand marker, and AR markers laid on a table. The system allows users to change viewpoint and execute operations on 3D objects - simultaneous translation and rotation, scaling, cloning, or deleting - through unconstrained natural interactions, leveraging users' proficiency in everyday object manipulation and speeding up such typical 3D manipulation operations. Depth perception was significantly enhanced with dynamic shadows, allowing fast alignment and accurate positioning of objects. The prototype presented here allows successful completion of the three challenges proposed by the 2017 3DUI Contest, as validated by a preliminary informal user study with participants from the target audience and also from the general public.
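A tangible wand tracked alongside table markers typically works by expressing the wand's camera-frame pose relative to the table frame, so manipulated objects stay put when the camera moves. A reduced 2D sketch of that relative-pose computation (the paper's actual pipeline is full 6-DOF; this is only illustrative):

```python
# Sketch of the tangible-wand pose idea (not the BatWand implementation):
# express the wand pose relative to the table markers, so objects attached
# to the wand are stable under camera motion. Poses are reduced to
# (x, y, yaw) here for brevity.
import math

def wand_in_table_frame(cam_to_table, cam_to_wand):
    """Both poses are (x, y, yaw) in the camera frame. Return the wand pose
    in the table frame: translate into the table origin, rotate by -yaw."""
    tx, ty, tyaw = cam_to_table
    wx, wy, wyaw = cam_to_wand
    dx, dy = wx - tx, wy - ty
    c, s = math.cos(-tyaw), math.sin(-tyaw)
    return (c * dx - s * dy, s * dx + c * dy, wyaw - tyaw)
```

For example, if the camera sees the table rotated by 90 degrees, a wand one unit "up" in camera coordinates still resolves to a consistent table-frame position regardless of how the camera itself is held.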

SculptAR: An Augmented Reality Interaction System
Vicenzo Abichequer Sangalli, Thomas Volpato de Oliveira, Leonardo Pavanatto Soares, and Márcio Sarroglia Pinho
(PUCRS, Brazil)
In this work, a 3D mobile interface to create sculptures in an augmented reality environment tracked by AR markers is presented. A raycasting technique was used to interact with the objects in the scene, along with 2D and 3D interfaces to manipulate and modify the objects. Users can move, delete, paint, and duplicate virtual objects using 6-DOF techniques.
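Raycast selection of scene objects is commonly implemented as a ray-versus-bounding-sphere test; a self-contained sketch of that test (illustrative only, not the SculptAR code):

```python
# Illustrative ray-casting test (not the authors' implementation): checking
# whether a screen ray hits an object's bounding sphere, and how far away.
import math

def ray_hits_sphere(origin, direction, center, radius):
    """Return the distance along the (normalized) ray to the sphere's
    surface, or None if the ray misses. Ray origin is the camera position."""
    ox, oy, oz = (center[i] - origin[i] for i in range(3))
    proj = ox * direction[0] + oy * direction[1] + oz * direction[2]
    d2 = ox * ox + oy * oy + oz * oz - proj * proj
    if d2 > radius * radius or proj < 0:
        return None  # ray passes outside the sphere, or sphere is behind
    return proj - math.sqrt(radius * radius - d2)

# Ray from the origin along +z hits a unit sphere centered at (0, 0, 5) at z = 4.
print(ray_hits_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # 4.0
```

Running this test against every object and keeping the smallest returned distance selects the nearest object under the touch point.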

Augmented Reality Digital Sculpture
Nathanael Harrell, Grayson Bonds, Xiaojia Wang, Sean Valent, Elham Ebrahimi, and Sabarish V. Babu
(Clemson University, USA)
We present our metaphor for object translation, rotation, and rescaling and particle parameter manipulation in an augmented reality environment using an Android smartphone or tablet for the 2017 3DUI Competition in Los Angeles, California. Our metaphor aims to map the three-dimensional interaction of objects in a real world space to the two-dimensional plane of a smartphone or tablet screen. Our final product is the result of experimentation with different metaphors for translation, rotation, rescaling, and particle parameter manipulation and was guided by the feedback of voluntary product testers. The result is an interaction technique between a mobile device and the virtual world which we believe to be intuitive.

Collaborative Manipulation of 3D Virtual Objects in Augmented Reality Scenarios using Mobile Devices
Jerônimo G. Grandi, Iago Berndt, Henrique G. Debarba, Luciana Nedel, and Anderson Maciel
(Federal University of Rio Grande do Sul, Brazil; Artanim Foundation, Switzerland)
Interaction in augmented reality environments may be very complex, depending on the degrees of freedom (DOFs) required for the task. In this work we present a 3D user interface for collaborative manipulation of virtual objects in augmented reality (AR) environments. It maps position -- acquired with a camera and fiducial markers -- and touchscreen input of a handheld device into gestures to select, move, rotate, and scale virtual objects. As these transformations require the control of multiple DOFs, collaboration is proposed as a solution to coordinate the modification of all available DOFs. Users are free to decide their own manipulation roles. All virtual elements are displayed directly on the mobile device as an overlay of the camera capture, providing an individual point of view of the AR environment to each user.
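One way to realize "users decide their own manipulation roles" is to let each user claim a subset of the DOFs and apply only inputs on claimed DOFs. The sketch below is a hypothetical illustration of that idea, not the authors' protocol:

```python
# Hedged sketch of collaborative DOF splitting (names hypothetical): each
# user claims some of the manipulation DOFs, and only inputs on claimed
# DOFs are merged into the object's update.

def merge_inputs(claims, inputs):
    """claims: {user: set of DOF names}; inputs: {user: {DOF: delta}}.
    Return combined per-DOF deltas, honoring each user's claimed DOFs."""
    combined = {}
    for user, deltas in inputs.items():
        for dof, delta in deltas.items():
            if dof in claims.get(user, set()):
                combined[dof] = combined.get(dof, 0.0) + delta
    return combined

claims = {"alice": {"tx", "ty", "tz"}, "bob": {"rx", "ry", "rz"}}
inputs = {"alice": {"tx": 0.1, "rx": 0.5}, "bob": {"rx": 0.2}}
print(merge_inputs(claims, inputs))  # alice's unclaimed rx input is dropped
```

Splitting translation and rotation between two users this way reduces each person's task to a lower-DOF problem while the combined manipulation still covers all DOFs.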

T4T: Tangible Interface for Tuning 3D Object Manipulation Tools
Alberto Cannavò, Fabio Cermelli, Vincenzo Chiaramida, Giovanni Ciccone, Fabrizio Lamberti, Paolo Montuschi, and Gianluca Paravati
(Politecnico di Torino, Italy)
A 3D User Interface for manipulating virtual objects in Augmented Reality scenarios on handheld devices is presented. The proposed solution takes advantage of two interaction techniques. The former (named ``cursor mode'') exploits a cursor whose position and movement are bound to the view of the device; the cursor allows the user to select objects and to perform coarse-grained manipulations by moving the device. The latter (referred to as ``tuning mode'') uses the physical affordances of a tangible interface to let the user refine objects in all their aspects (position, rotation, scale, color, and so forth) with fine-grained control.
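The coarse/fine division of labor between the two modes can be sketched as a simple mode-dispatched update with different gains; the function, mode names, and gain values below are assumptions for illustration, not the T4T implementation:

```python
# Sketch of the two-mode scheme (names and gains hypothetical): coarse
# "cursor mode" moves an object with the device pose; "tuning mode" applies
# small tangible-interface increments on top of the coarse placement.

def update_position(pos, mode, device_delta, tangible_delta,
                    coarse_gain=1.0, fine_gain=0.1):
    """Return the new object position for one frame of input."""
    if mode == "cursor":
        return [p + coarse_gain * d for p, d in zip(pos, device_delta)]
    elif mode == "tuning":
        return [p + fine_gain * d for p, d in zip(pos, tangible_delta)]
    return pos

pos = update_position([0, 0, 0], "cursor", [1.0, 0, 0], [0, 0, 0])
pos = update_position(pos, "tuning", [0, 0, 0], [0, 1.0, 0])
print(pos)  # [1.0, 0.1, 0.0]
```

The same pattern extends to rotation, scale, and color channels: the tangible controls reuse the coarse placement as a starting point and only nudge it.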
