Hand-tracked 3D Data Selection of Point Clouds in XR

#masters-thesis #hand-tracking #unity #GPU

Master's Thesis

Selecting and interacting with 3D point cloud data remains a challenging and unsolved problem, especially in scientific visualization, medical imaging, and astronomy. Traditional selection methods are constrained by 2D interfaces, limiting precision and flexibility. My Master's thesis addresses these challenges by developing an XR-based hand-tracked 3D selection system that enables users to interact with point clouds naturally and efficiently.

The core of this research is a GPU-based selection approach using Signed Distance Fields (SDFs). This technique allows arbitrary 3D regions to be selected in real time, providing high precision and intuitive interaction in VR. Implemented in Unity, the system integrates novel and existing selection techniques and was validated in a user study with 28 participants. The results indicate that brush-based selection modes were preferred over direct selection, offering higher accuracy and efficiency.
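The core idea can be sketched in a few lines of C#: a selection volume is represented by a signed distance function that returns a negative value for points inside it, so selecting a point reduces to evaluating that function. The sphere SDF and the helper class below are illustrative assumptions, not the thesis code, which performs this test in a GPU compute shader.

```csharp
using UnityEngine;

// Minimal CPU-side sketch of SDF-based selection (illustrative only; the
// thesis performs this test on the GPU so it scales to millions of points).
public static class SdfSelectionSketch
{
    // Signed distance to a sphere: negative inside, zero on the surface,
    // positive outside. Any other SDF (box, capsule, convex hull, ...) can
    // be substituted here, which is what enables arbitrary selection volumes.
    static float SphereSdf(Vector3 p, Vector3 center, float radius)
    {
        return Vector3.Distance(p, center) - radius;
    }

    // Marks every point whose signed distance is <= 0 as selected.
    public static void SelectPoints(Vector3[] points, bool[] selected,
                                    Vector3 brushCenter, float brushRadius)
    {
        for (int i = 0; i < points.Length; i++)
        {
            if (SphereSdf(points[i], brushCenter, brushRadius) <= 0f)
                selected[i] = true; // additive brushing keeps earlier selections
        }
    }
}
```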

This research bridges the gap between traditional 2D tools and immersive XR environments, presenting an innovative solution that enhances data exploration, scientific analysis, and usability.

Selection Techniques

  • Brushing directly with the hands
  • Brushing with a sphere
  • Convex hull selection, with the hull drawn on the fly
  • Shape-based selection by spawning and manipulating a shape out of thin air
  • Point cloud transform using hand tracking and a handlebar metaphor (see the sketch after this list)
  • UI for technique selection
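As a rough illustration of the handlebar metaphor named above, the sketch below derives the point cloud's translation, rotation, and scale from the positions of two tracked hands: the midpoint drives translation, the hand-to-hand axis drives rotation, and the hand distance drives scale. The hand `Transform` references, the grab callbacks, and the class itself are hypothetical stand-ins for whatever the XR hand-tracking rig provides.

```csharp
using UnityEngine;

// Hypothetical sketch of a handlebar-metaphor transform driven by two hands.
// BeginGrab/UpdateGrab would be hooked to a two-handed pinch gesture.
public class HandlebarTransformSketch : MonoBehaviour
{
    public Transform leftHand;    // tracked left hand (assumed to exist in the rig)
    public Transform rightHand;   // tracked right hand
    public Transform pointCloud;  // root transform of the rendered point cloud

    Vector3 startMidpoint, startAxis, startPosition, startScale;
    Quaternion startRotation;
    float startDistance;

    // Call once when both hands start grabbing (e.g. both pinch).
    public void BeginGrab()
    {
        startMidpoint = (leftHand.position + rightHand.position) * 0.5f;
        startAxis     = rightHand.position - leftHand.position;
        startDistance = startAxis.magnitude;
        startPosition = pointCloud.position;
        startRotation = pointCloud.rotation;
        startScale    = pointCloud.localScale;
    }

    // Call every frame while both hands keep grabbing.
    public void UpdateGrab()
    {
        Vector3 midpoint = (leftHand.position + rightHand.position) * 0.5f;
        Vector3 axis     = rightHand.position - leftHand.position;

        Quaternion deltaRotation = Quaternion.FromToRotation(startAxis, axis);
        float scaleFactor = axis.magnitude / Mathf.Max(startDistance, 1e-4f);

        // Rotate and scale about the hand midpoint (the "handlebar" pivot),
        // then translate with the midpoint itself.
        pointCloud.rotation   = deltaRotation * startRotation;
        pointCloud.localScale = startScale * scaleFactor;
        pointCloud.position   = midpoint +
            deltaRotation * (startPosition - startMidpoint) * scaleFactor;
    }
}
```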

Key Features

  • Real-Time Hand-Tracked Selection

    • Users can interact directly with 3D point clouds using natural hand gestures
    • Implements four selection techniques, including shape-based, convex hull, and brush-based methods
    • Brush-based selection methods were found to be the most efficient in user tests
  • GPU-Accelerated Selection Using Signed Distance Fields (SDFs)

    • Leverages GPU compute shaders for high-speed selection processing (see the dispatch sketch after this list)
    • Allows arbitrary selection volumes, overcoming traditional bounding-box limitations
    • Efficiently handles large-scale point clouds with millions of points
  • XR-Based Implementation in Unity

    • Developed for VR headsets (Meta Quest Pro, Quest 3) using OpenXR
    • Fully controller-free interaction with hand tracking for a more natural experience
    • Multi-pass rendering and optimized shaders ensure smooth real-time performance
  • User Study & Validation

    • 28 participants tested different selection techniques in controlled experiments
    • Measured selection accuracy, speed, usability, and user satisfaction
    • Findings indicate higher efficiency and preference for brush-based selection methods
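To make the GPU-accelerated selection item above concrete, here is a minimal sketch of how such a pass could be dispatched from Unity C#. The `SelectInsideSdf` kernel and the buffer/property names are assumptions for illustration; the thesis' actual compute shaders and interfaces may differ.

```csharp
using UnityEngine;

// Sketch of driving a GPU selection pass from C#. The kernel name and the
// buffer/property names are assumed for illustration.
public class GpuSdfSelection : MonoBehaviour
{
    public ComputeShader selectionShader; // assumed to contain kernel "SelectInsideSdf"

    ComputeBuffer pointBuffer;            // one float3 per point
    ComputeBuffer selectionBuffer;        // one uint flag per point
    int kernel;
    int pointCount;

    public void Init(Vector3[] points)
    {
        pointCount = points.Length;
        pointBuffer = new ComputeBuffer(pointCount, sizeof(float) * 3);
        selectionBuffer = new ComputeBuffer(pointCount, sizeof(uint));
        pointBuffer.SetData(points);

        kernel = selectionShader.FindKernel("SelectInsideSdf");
        selectionShader.SetBuffer(kernel, "_Points", pointBuffer);
        selectionShader.SetBuffer(kernel, "_Selection", selectionBuffer);
    }

    // Called every frame while the user brushes with a tracked hand.
    public void Brush(Vector3 brushCenter, float brushRadius)
    {
        selectionShader.SetVector("_BrushCenter", brushCenter);
        selectionShader.SetFloat("_BrushRadius", brushRadius);
        selectionShader.SetInt("_PointCount", pointCount);

        // One thread per point, 64 threads per group; must match the kernel's
        // [numthreads(64,1,1)] attribute.
        int groups = Mathf.CeilToInt(pointCount / 64f);
        selectionShader.Dispatch(kernel, groups, 1, 1);
    }

    void OnDestroy()
    {
        pointBuffer?.Release();
        selectionBuffer?.Release();
    }
}
```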

Technology Stack

  • XR & Hand Tracking: Unity, OpenXR, Unity XR Interaction Toolkit
  • Rendering & Compute: GPU-based compute shaders, instanced rendering
  • Selection Processing: Signed Distance Fields (SDFs), real-time selection volume generation
  • User Study & Evaluation: System Usability Scale (SUS), task performance metrics, data visualization

Appendix & Download

🔗 Download Thesis: Master's Thesis
📲 GitHub Repository: [Will be made public soon]