A Taxonomy of Freehand Grasping Patterns in Virtual Reality

Blaga, Andreea-Dalia (2023) A Taxonomy of Freehand Grasping Patterns in Virtual Reality. Doctoral thesis, Birmingham City University.

Andreea Dalia Blaga PhD Thesis published_Final version_Submitted Nov 2022_Final Award May 2023.pdf - Accepted Version



Grasping is one of the most natural and fundamental interaction paradigms people perform every day; it allows us to pick up and manipulate the objects around us, such as a cup of coffee to drink from or a pen to write with. Grasping has been studied extensively in real environments to understand and structure the way people grasp and interact with objects, producing categories, models and theories of grasping. Due to the complexity of the human hand, classifying grasping knowledge to provide meaningful insights is a challenging task, which has led researchers to develop grasp taxonomies that systematically provide guidelines for emerging grasping work in fields such as anthropology, robotics and hand surgery.

While this body of work exists for real grasping, how grasping transfers to virtual environments remains unexplored. The emergence of robust hand-tracking sensors for virtual reality devices now allows the development of grasp models that enable VR to simulate real grasping interactions. However, present work has not yet examined the differences and nuances between virtual and real object grasping, which means that virtual systems building grasping models on real-grasping knowledge may rest on unverified assumptions about how users intuitively grasp and interact with virtual objects.

To address this, this thesis presents the first user elicitation studies to explore grasping patterns directly in VR. The first study identifies the main similarities and differences between real and virtual object grasping; the second furthers this by exploring how virtual object shape influences grasping patterns; the third focuses on visual thermal cues and how they influence grasp metrics; and the fourth examines how other object characteristics, such as stability and complexity, influence grasps in VR. To provide structured insights into grasping interactions in VR, the results are synthesised into the first VR Taxonomy of Grasp Types, developed following current methods for building grasping and HCI taxonomies and iterated to present an updated and more complete taxonomy.

Results show that users appear to mimic real grasping behaviour in VR; however, they also have difficulty estimating object size and generally use a lower variability of grasp types. The taxonomy shows that only five grasps account for the majority of grasp data in VR, a finding that computer systems can exploit to achieve natural and intuitive interactions at lower computational cost. Further, findings show that virtual object characteristics such as shape, stability and complexity, as well as visual cues for temperature, influence grasp metrics such as aperture, category, type, location and dimension. These changes in grasping patterns, together with virtual object categorisation methods, can inform design decisions when developing intuitive interactions, virtual objects and environments, taking a step forward towards natural grasping interaction in VR.

Item Type: Thesis (Doctoral)
Date Submitted: 7 November 2022
Date Accepted: 9 May 2023
Uncontrolled Keywords: Taxonomy; virtual reality; human-computer interaction; grasping; freehand
Subjects: CAH11 - computing > CAH11-01 - computing > CAH11-01-06 - computer games and animation
Divisions: Doctoral Research College > Doctoral Theses Collection
Faculty of Computing, Engineering and the Built Environment > School of Computing and Digital Technology
Depositing User: Jaycie Carter
Date Deposited: 24 May 2023 11:18
Last Modified: 24 May 2023 11:18
URI: https://www.open-access.bcu.ac.uk/id/eprint/14405
