FocalPoint is an augmented reality (AR) direct manipulation technique for picking up and selecting small, densely packed objects within reach, a fundamental yet challenging task due to the accuracy and precision required. A preliminary study of user selection behavior informs a design that uses adaptive ray casting to enhance target selection.
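To give a feel for the general idea of adaptive ray casting, the sketch below casts a selection cone from the hand and narrows it as more targets crowd into view, so that in dense regions only the target closest to the ray is picked. This is a hypothetical illustration, not FocalPoint's published algorithm; the Target class, the base_cone_deg parameter, and the density heuristic are all assumptions made for the example.

```python
# Hypothetical sketch of adaptive ray casting for dense-target selection:
# the selection cone's angular width shrinks as local target density grows.

import math
from dataclasses import dataclass

@dataclass
class Target:
    name: str
    position: tuple  # (x, y, z) in meters

def angle_to_ray(origin, direction, point) -> float:
    """Angle (radians) between the ray direction and the vector to the point."""
    v = tuple(p - o for p, o in zip(point, origin))
    dot = sum(a * b for a, b in zip(v, direction))
    norm = math.sqrt(sum(a * a for a in v)) * math.sqrt(sum(a * a for a in direction))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def select(origin, direction, targets, base_cone_deg=10.0):
    """Pick the target nearest the ray, with a cone that adapts to density."""
    angles = {t.name: angle_to_ray(origin, direction, t.position) for t in targets}
    # Adapt: more targets inside the base cone -> narrower effective cone.
    in_base = sum(1 for a in angles.values() if a < math.radians(base_cone_deg))
    cone = math.radians(base_cone_deg) / max(1, in_base)
    candidates = [t for t in targets if angles[t.name] <= cone]
    return min(candidates, key=lambda t: angles[t.name], default=None)

targets = [Target("cup", (0.0, 0.0, 1.0)), Target("pen", (0.05, 0.0, 1.0))]
print(select((0, 0, 0), (0, 0, 1), targets))  # picks the target closest to the ray
```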
Dually Noted is a smartphone-based AR annotation system that recognizes the layout of structural elements in a printed document for authoring and viewing annotations in real time. AR annotation is often unwieldy, but in a 12-user empirical study, our novel structural understanding component allowed Dually Noted to improve the accuracy of precise highlighting and annotation interactions by 13%, increase interaction speed by 42%, and significantly lower cognitive load compared to a baseline method without document layout understanding.
Throwing is a fundamental motor skill that is difficult to perform precisely with physical objects and even harder with virtual projectiles, which lack physical affordances. We explore the challenges of designing augmented reality (AR) throwing experiences by asking four participants to throw a virtual ball in an AR environment, finding individual differences in the release angles and speeds induced by each participant's throwing gesture. Informed by these findings, we aim to explore whether we can improve both the accuracy and the naturalness of virtual throwing for users.
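As a rough illustration of why those angle and speed differences matter, the back-of-the-envelope calculation below uses simple projectile motion to show that two throws which feel nearly identical at release can land roughly a third of a meter apart. This is not a result from the study; the release height and the specific speeds and angles are assumptions chosen for the example.

```python
# Illustrative projectile-motion calculation: small release differences
# in speed and angle noticeably shift where a virtual ball lands.

import math

G = 9.81  # gravity, m/s^2

def landing_distance(speed_mps: float, angle_deg: float, release_height_m: float = 1.5) -> float:
    """Horizontal distance traveled until the ball falls back to the floor."""
    a = math.radians(angle_deg)
    vx, vy = speed_mps * math.cos(a), speed_mps * math.sin(a)
    # Solve release_height + vy*t - 0.5*G*t^2 = 0 for the positive root.
    t = (vy + math.sqrt(vy * vy + 2 * G * release_height_m)) / G
    return vx * t

# Two throws whose release parameters differ only slightly still land
# roughly a third of a meter apart.
print(round(landing_distance(5.0, 30.0), 2))
print(round(landing_distance(5.3, 35.0), 2))
```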
Portal-ware (DIS 2021, full paper)
Jing Qian*, Tongyu Zhou*, Meredith Young-Ng*, Jiaju Ma, Xiangyu Li, Angel Cheung, Ian Gonsher, Jeff Huang.
Free-hand interaction enables users to sketch augmented reality (AR) content directly. In everyday mobile situations, we may consider the devices we routinely carry, such as smartphones or personal wearables, to support 3D free-hand sketching. To elicit design challenges and lay out the system design and implementation needed to achieve this, we developed a smartphone-wearable prototype that supports both bare-hand sketching and using the wearable as an AR sketching extension.
Jing Qian, Meredith Young-Ng, Xiangyu Li, Angel Cheung, Fumeng Yang, Jeff Huang.
Free-hand manipulation in smartphone augmented reality (AR) enables users to directly interact with virtual content using their hands. However, human hands can ergonomically move through a broader range than a smartphone's field of view (FOV) can capture, requiring users to stay aware of the limited usable interaction and viewing regions at all times...
Jing Qian, Jiaju Ma, Xiangyu Li, Benjamin Attal, Haoming Lai, James Tompkin, John F. Hughes, Jeff Huang.
What if we could play with augmented reality (AR) objects with our hands directly instead of tapping on the smartphone screen? Portal-ble enables this kind of AR experience by introducing a series of feedback mechanisms on top of our open-source mobile hand tracking application. The result is a more intuitive, easy-to-use hand interaction experience on your everyday mobile device!
Fumeng Yang, Jing Qian, Johannes Novotny, David Badre, Cullen D. Jackson, and David H. Laidlaw
We present an evaluation of virtual reality techniques for assisting research knowledge retrieval.
Jing Qian, David A. Shamma, Daniel Avrahami, Jacob Biehl
In this paper, we studied 15 participants' preferences, performance, and cognitive load on a set of common tasks performed on smartphones at two interaction depths (close-range and distant) with two touchless modalities (hand tracking and screen dwell)...
Jing Qian, Arielle Chapin, Alexandra Papoutsaki, Fumeng Yang, Klaas Nelissen, Jeff Huang
A large portion of remote user studies is conducted on mobile devices, and most focus on users' screen inputs. Remotion provides further contextual information by replaying user hand movements with our custom-made robotic holder. We found that motion cues reveal characteristics and engagement levels of participants that screen inputs alone do not.
Jing Qian, Laurent Denoue, Jacob Biehl, David A. Shamma
A smartphone AR application showcases the benefit of trading interaction linearity for real-time sound and voice recognition.
Xin Liu, Katia Vega, Jing Qian, Joseph Paradiso, and Pattie Maes
Fluxa is a compact wearable device that enables social display through hand movement. Unlike displaying messages on a screen over the internet, Fluxa supports in-person communication at a distance through the persistence-of-vision effect: as the wearer swings their arm, rapid light flashes are merged by the eye into a readable image.
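Below is a minimal sketch of how a persistence-of-vision display of this kind can be driven: each column of a bitmap is flashed in sequence as the arm sweeps, and the eye merges the columns into one image. This is illustrative only, not Fluxa's firmware; the bitmap, column count, and sweep duration are assumptions, and the hardware LED driver is replaced with a print statement.

```python
# Illustrative persistence-of-vision driver: flash one bitmap column at a time
# across a single arm sweep; the eye merges the columns into an image.

import time

# 8-LED column patterns for a tiny "HI" bitmap (1 = LED on), purely illustrative.
MESSAGE_COLUMNS = [
    0b11111111, 0b00011000, 0b00011000, 0b11111111,  # H
    0b00000000,
    0b10000001, 0b11111111, 0b10000001,              # I
]

def flash_column(pattern: int) -> None:
    """Stand-in for driving a real LED column; prints the pattern instead."""
    print("".join("#" if (pattern >> i) & 1 else "." for i in range(8)))

def render_sweep(sweep_duration_s: float = 0.25) -> None:
    """Flash each column at evenly spaced times across one arm sweep.

    A real device would trigger this from an accelerometer event at the
    start of the swing; here we simply sleep between columns.
    """
    dt = sweep_duration_s / len(MESSAGE_COLUMNS)
    for pattern in MESSAGE_COLUMNS:
        flash_column(pattern)
        time.sleep(dt)  # on hardware: hold the column for dt, then advance

if __name__ == "__main__":
    render_sweep()
```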
With Ian Gonsher and Ethan Mok
This project provides rigid feedback on otherwise intangible virtual objects. Through