Abstract:
Within mobile mixed reality experiences, we would like to engage the user's
head and hands for interaction. However, this requires the use of multiple
tracking systems. These must be aligned, both as part of initial system setup
and to counteract inter-tracking system drift that can accumulate over time.
Traditional approaches to alignment use obtrusive procedures that introduce
explicit constraints between the different tracking systems. These can be
highly disruptive to the user's experience. In this paper, we propose
another type of information which can be exploited to effect alignment: the
behaviour of the user. The crucial insight is that user behaviours such as
selection through pointing introduce implicit constraints between tracking
systems. These constraints can be used as the user continually interacts with
the system to infer alignment without the need for disruptive procedures. We
call this concept behaviour aware sensor fusion. We introduce two different
interaction techniques, the redirected pointing technique and the yaw fix
technique, to illustrate this concept. Pilot experiments show that behaviour
aware sensor fusion can increase ease of use and speed of interaction in
exemplar mixed-reality interaction tasks.