ProXeek: Seeking and Leveraging Real-World Objects and Environments as Haptic Proxies for Virtual Reality through Multimodal Reasoning
Providing haptic feedback in Virtual Reality remains challenging: active haptic systems are often bulky and expensive, while passive haptic approaches typically require pre-fabricated objects or extensive environment setup. We present ProXeek, an LLM-based multi-agent system that opportunistically leverages real-world physical objects as haptic proxies for VR experiences. During VR development, designers can annotate virtual interactables with emphasized haptic characteristics using our node-based editing interface. On the user side, end users capture environmental snapshots via a commodity VR headset, and the system employs LLM-based multimodal reasoning and multi-objective optimization to match virtual objects with viable physical proxies, considering multiple haptic properties—inertia, interactivity, outline, texture, hardness, and temperature—as well as inter-object interactions. Our evaluation demonstrates that ProXeek effectively identifies viable proxies, with an ablation study confirming the essential roles of both proxy priority and spatial constraints in our pipeline. Our user study reveals that opportunistically selected physical objects deliver significantly higher haptic fidelity and enhanced presence compared to 3D-printed replicas and visual-only interaction, particularly when diverse physical properties beyond geometry are critical.
Authors: Haichen Gao, Tianrui Hu, Zeshui Li, and Kening Zhu (*).
