Perception of approach and reach in combined interaction tasks

Cathy Ennis, Arjan Egges

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Often in games, a virtual character is required to interact with objects in the surrounding environment. These interactions can occur in different locations, with different items, and often in combination with environment navigation tasks. This requires switching and blending between different motions to satisfy constraints imposed by the character's position and the interaction circumstances. In this paper, we conduct perceptual experiments to gain knowledge about such interactions and deduce important factors about their design for game animators. Our results identify at what point interaction information becomes obvious, and which body parts are most important to consider. We find that general information about target position is evident early in a combined navigation and manipulation task, and can be deduced from very few visual cues. We also learn that participants are highly sensitive to target positions during the interaction phase, relying mostly on indicators in the motion of the character's arm in the final steps.

Original language: English
Title of host publication: Proceedings - Motion in Games 2013, MIG 2013
Pages: 143-148
Number of pages: 6
DOIs
Publication status: Published - 2013
Externally published: Yes
Event: 6th International Conference on Motion in Games, MIG 2013 - Dublin, Ireland
Duration: 7 Nov 2013 - 9 Nov 2013

Publication series

Name: Proceedings - Motion in Games 2013, MIG 2013

Conference

Conference: 6th International Conference on Motion in Games, MIG 2013
Country/Territory: Ireland
City: Dublin
Period: 7/11/13 - 9/11/13

Keywords

  • animation
  • interaction
  • perception
