Stabilized Annotations for Mobile Remote Assistance

Omid Fakourfar, Kevin Ta, Richard Tang, Scott Bateman, and Anthony Tang. (2016). Stabilized Annotations for Mobile Remote Assistance. In CHI 2016: Proceedings of the 2016 SIGCHI Conference on Human Factors in Computing Systems, 1548–1560. Acceptance: 23.4%.

Abstract

Recent mobile technology has provided new opportunities for creating remote assistance systems. However, mobile support systems present a particular challenge: both the camera and display are held by the user, leading to shaky video. When pointing or drawing annotations, this means that the desired target often moves, causing the gesture to lose its intended meaning. To address this problem, we investigate annotation stabilization techniques, which allow annotations to stick to their intended location. We studied two annotation systems, using three different forms of annotations, with both tablets and head-mounted displays. Our analysis suggests that stabilized annotations and head-mounted displays are only beneficial in certain situations. However, the simplest approach of automatically freezing video while drawing annotations was surprisingly effective in facilitating the completion of remote assistance tasks.

Materials

PDF File (http://hcitang.org/papers/2016-chi2016-annotations.pdf)
URL (http://chi2016.acm.org/)

BibTeX

@inproceedings{fakourfar2016annotations,
  author = {Fakourfar, Omid and Ta, Kevin and Tang, Richard and Bateman, Scott and Tang, Anthony},
  booktitle = {CHI 2016: Proceedings of the 2016 SIGCHI Conference on Human Factors in Computing Systems},
  title = {Stabilized Annotations for Mobile Remote Assistance},
  type = {conference},
  pdfurl = {http://hcitang.org/papers/2016-chi2016-annotations.pdf},
  url = {http://chi2016.acm.org/},
  year = {2016},
  acceptance = {23.4%},
  pages = {1548--1560}
}