
Annotating documents helps us note our thoughts and pass knowledge and insights to others. However, pen-and-paper annotations can be difficult to share or disseminate, and are generally constrained by a document's available margin space. Thus, prior work has explored the value of linking digital information to physical documents in augmented reality (AR). The smartphone is a commonly accessible digital device, but its small screen size and touchscreen interface make it ill-suited for annotating...


Jing Qian, Meredith Young-Ng, Xiangyu Li, Angel Cheung, Fumeng Yang, Jeff Huang.

 

Free-hand manipulation in smartphone augmented reality (AR) enables users to directly interact with virtual content using their hands. However, human hands can move ergonomically through a broader range than a smartphone's field of view (FOV) can capture, requiring users to remain aware of the limited usable interaction and viewing regions at all times...
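
To make the viewing-region constraint concrete, here is a minimal sketch (plain Python, with a hypothetical in_camera_fov helper and assumed FOV angles, not the system's actual code) that tests whether a tracked hand position in camera coordinates still falls inside a pinhole camera's frustum; a guidance cue could fire when this test fails.

```python
import math

def in_camera_fov(point_cam, h_fov_deg=60.0, v_fov_deg=45.0):
    """Return True if a 3D point (camera coordinates, +z forward)
    lies inside a symmetric pinhole-camera frustum.

    point_cam : (x, y, z) in meters, camera space
    h_fov_deg, v_fov_deg : assumed horizontal/vertical field of view
    """
    x, y, z = point_cam
    if z <= 0:                       # behind the camera
        return False
    half_h = math.radians(h_fov_deg) / 2
    half_v = math.radians(v_fov_deg) / 2
    # Visible if the angular offset from the optical axis is within
    # the half-FOV on both axes.
    return abs(math.atan2(x, z)) <= half_h and abs(math.atan2(y, z)) <= half_v

# Example: a hand 0.4 m to the right and 0.5 m in front of the camera
print(in_camera_fov((0.4, 0.0, 0.5)))   # False for a 60-degree horizontal FOV
```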


Jing Qian, Jiaju Ma, Xiangyu Li, Benjamin Attal, Haoming Lai, James Tompkin, John F. Hughes, Jeff Huang.

 

What if we could play with augmented reality (AR) objects directly with our hands instead of tapping on the smartphone screen? Portal-ble enables such an AR experience by introducing a series of feedback mechanisms on top of our open-source mobile hand-tracking application. The result is a more intuitive, easy-to-use hand interaction experience on everyday mobile devices!
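
As an illustration of how proximity-driven feedback can work, the sketch below (Python, with illustrative grab_radius and warn_radius thresholds that are assumptions, not Portal-ble's actual values) maps the hand-object distance to a coarse feedback state that could drive visual or haptic cues.

```python
def grab_feedback(hand_pos, object_pos, grab_radius=0.05, warn_radius=0.20):
    """Map the distance between a tracked hand and a virtual object to a
    coarse feedback state. Thresholds are illustrative, not Portal-ble's.
    Positions are (x, y, z) in meters in a shared world frame.
    """
    d = sum((h - o) ** 2 for h, o in zip(hand_pos, object_pos)) ** 0.5
    if d <= grab_radius:
        return "grabbable"   # e.g., highlight the object and vibrate once
    if d <= warn_radius:
        return "approaching" # e.g., show a glow that scales with 1 - d/warn_radius
    return "idle"

print(grab_feedback((0.0, 0.0, 0.3), (0.02, 0.0, 0.33)))  # "grabbable"
```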


A Virtual Reality “Memory Palace” Aids Knowledge Retrieval (TVCG 2020, Full Paper)

Fumeng Yang, Jing Qian, Johannes Novotny, David Badre, Cullen D. Jackson, and David H. Laidlaw

  

We present an evaluation of virtual reality techniques for assisting research knowledge retrieval.


Modality and Depth in Touchless Smartphone Augmented Reality Interactions (IMX 2020, Full Paper)

Jing Qian, David A. Shamma, Daniel Avrahami, Jacob Biehl

 

In this paper, we studied 15 participants' preferences, performance, and cognitive load on a set of common smartphone tasks at two interaction depths (close-range and distant) with two touchless modalities (hand tracking and screen dwell)...
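
For readers unfamiliar with screen dwell, the sketch below shows the basic idea in Python: a target is selected once the cursor rests on it for a fixed dwell time. The DwellSelector class and the one-second threshold are illustrative assumptions, not the implementation or parameters used in the study.

```python
import time

class DwellSelector:
    """Minimal screen-dwell selection: a target is selected when the cursor
    stays on it for `dwell_s` seconds."""
    def __init__(self, dwell_s=1.0):
        self.dwell_s = dwell_s
        self.current = None
        self.enter_time = None

    def update(self, hovered_target, now=None):
        now = time.monotonic() if now is None else now
        if hovered_target != self.current:         # cursor moved to a new target
            self.current, self.enter_time = hovered_target, now
            return None
        if self.current is not None and now - self.enter_time >= self.dwell_s:
            selected, self.current, self.enter_time = self.current, None, None
            return selected                         # fire once, then reset
        return None

sel = DwellSelector(dwell_s=1.0)
sel.update("button_a", now=0.0)
print(sel.update("button_a", now=1.2))  # "button_a"
```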


Jing Qian, Arielle Chapin, Alexandra Papoutsaki, Fumeng Yang, Klaas Nelissen, Jeff Huang

  

Many remote user studies are conducted on mobile devices, and most focus on users' screen inputs. Remotion provides further contextual information by replaying users' hand movements with our custom-made robotic holder. We found that these motion cues reveal characteristics and engagement levels of participants that screen inputs alone do not.
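
A minimal sketch of the capture-and-replay idea, assuming timestamped orientation samples and a move_holder callback as a stand-in for commands to the robotic holder (both hypothetical, not Remotion's actual interface):

```python
import time

def record_sample(log, pitch, roll, yaw):
    """Append a timestamped orientation sample (degrees) to the log.
    In a real capture pipeline these would come from the phone's IMU."""
    log.append((time.monotonic(), pitch, roll, yaw))

def replay(log, move_holder):
    """Replay logged orientations with their original timing by calling
    `move_holder(pitch, roll, yaw)` -- a stand-in for commands sent to a
    robotic phone holder (hypothetical interface)."""
    if not log:
        return
    start = log[0][0]
    replay_start = time.monotonic()
    for t, pitch, roll, yaw in log:
        # Wait until the same offset from the start of the recording.
        delay = (t - start) - (time.monotonic() - replay_start)
        if delay > 0:
            time.sleep(delay)
        move_holder(pitch, roll, yaw)

log = []
record_sample(log, 10.0, 0.0, 5.0)
replay(log, lambda p, r, y: print(f"pitch={p} roll={r} yaw={y}"))
```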


Jing Qian, Laurent Denoue, Jacob Biehl, David A. Shamma

  

A smartphone AR application that showcases the benefits of replacing linear interaction flows with real-time sound and voice recognition.


Xin Liu, Katia Vega, Jing Qian, Joseph Paradiso, and Pattie Maes

 

Fluxa is a compact wearable device that enables social display through hand movement. Instead of displaying messages on a screen over the internet, Fluxa supports in-person communication visible from a distance through the persistence-of-vision effect.
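
The persistence-of-vision principle can be sketched as a timing problem: during one arm swing, each column of a bitmap is flashed on the LED strip at its own time slot so the eye fuses the columns into a single pattern. The constant-swing-speed model and function below are illustrative assumptions, not Fluxa's firmware.

```python
def pov_column_schedule(bitmap, swing_period_s):
    """Compute when each column of a 1-bit image should be flashed on an
    LED strip during one arm swing, so the columns line up in space and
    persistence of vision merges them into a readable pattern.
    `bitmap` is a list of columns, each a list of 0/1 LED states.
    """
    n = len(bitmap)
    slot = swing_period_s / n
    return [(i * slot, column) for i, column in enumerate(bitmap)]

# A 4-column "stripe" pattern on a 3-LED strip, one swing every 0.4 s
pattern = [[1, 0, 0], [0, 1, 0], [0, 0, 1], [0, 1, 0]]
for t, column in pov_column_schedule(pattern, 0.4):
    print(f"t={t:.2f}s  LEDs={column}")
```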


With Ian Gonsher and Ethan Mok

 

This project provides rigid feedback for otherwise intangible virtual objects. Through...