Integrating Real-World Distractions into Virtual Reality

This repository contains the hardware design files and VR scenes presented in the paper Integrating Real-World Distractions into Virtual Reality, which appeared at UIST 2022. In this work, we explore a new concept: directly integrating distracting stimuli from the user's physical surroundings into their virtual reality experience to enhance presence. This work was done by Yujie Tao and Pedro Lopes at the University of Chicago's Human Computer Integration Lab.

Paper | Video | UIST'22 Talk

Proof-of-Concept Prototype

In this paper, we propose a proof-of-concept device that detects a small set of distracting stimuli: wind, temperature changes, door-closing sounds, car-engine sounds, and talking sounds.

Prototype Components

  • Microphone - Link, connected to a laptop running the audio classification model introduced below.
  • Wind sensor (Wind Sensor Rev. C) - Link, connected to an Arduino Nano.
  • Temperature sensor (DS18B20) - Link, connected to an Arduino Nano.

Sensing
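
As a rough illustration of how the sensing side can feed the rest of the system, the sketch below reads wind and temperature values streamed by the Arduino Nano over serial and turns them into simple threshold-based distraction events on the laptop. The serial port, the comma-separated line format, and the thresholds are assumptions for illustration only, not the firmware or detection logic used in the paper.

```python
# Hypothetical laptop-side sketch: read "wind,temperature" lines streamed by
# the Arduino Nano over serial and flag simple threshold-based distraction
# events. Port, line format, and thresholds are placeholders.
import serial  # pyserial

PORT = "/dev/ttyUSB0"        # adjust to your Arduino Nano's serial port
BAUD = 9600
WIND_THRESHOLD = 1.5          # raw sensor units; tune for your Wind Sensor Rev. C setup
TEMP_DELTA_THRESHOLD = 1.0    # degrees C change treated as a "temperature change"

def main():
    last_temp = None
    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        while True:
            line = ser.readline().decode(errors="ignore").strip()
            if not line:
                continue
            try:
                wind, temp = (float(v) for v in line.split(","))
            except ValueError:
                continue  # skip malformed lines

            if wind > WIND_THRESHOLD:
                print("Detected distraction: wind")
            if last_temp is not None and abs(temp - last_temp) > TEMP_DELTA_THRESHOLD:
                print("Detected distraction: temperature change")
            last_temp = temp

if __name__ == "__main__":
    main()
```

Detected events can then be forwarded to the VR scene over OSC, as sketched in the next section.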

VR Demo Presented at UIST 2022

At UIST 2022, we demoed this project using a VR room-escape experience. In this demo, the audience, while searching for keys in a mysterious VR room, experienced four different distractions: wind, touch, temperature change, and drilling sound. In the scene, we mapped these four distractions to curtain movement, a bat appearing, fire, and falling debris, respectively. While we adopted a Wizard-of-Oz approach in this demo (i.e., both the distractions and their mappings were triggered manually) to ensure demo quality, we expect future iterations of the sensing system to robustly detect all of these distractions. Here we provide the VR scene used in this demo, along with sample scripts for connecting it to the sensor output via OSC (e.g., so that a mapping is triggered automatically when a certain distraction is detected).

The VR scene can be found at this link.
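
As a minimal sketch of that OSC connection (using the python-osc package), the snippet below sends a detected distraction name to the machine running the VR scene. The OSC address /distraction, the port 9000, and the event names are placeholders; match them to whatever the provided sample scripts actually listen for.

```python
# Hypothetical sketch: forward detected distractions to the VR scene over OSC.
# Address, port, and event names are assumptions, not the repository's exact API.
from pythonosc.udp_client import SimpleUDPClient

VR_HOST = "127.0.0.1"   # machine running the VR scene
VR_PORT = 9000          # port the scene's OSC receiver listens on (assumption)

client = SimpleUDPClient(VR_HOST, VR_PORT)

def send_distraction(name: str):
    """Notify the VR scene that a distraction was detected, e.g. 'wind'."""
    client.send_message("/distraction", name)

# Example: trigger the curtain-movement mapping when wind is sensed.
send_distraction("wind")
```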

Citation

When using or building upon this device/work in an academic publication, please consider citing as follows:

Tao, Y., & Lopes, P. (2022, October). Integrating Real-World Distractions into Virtual Reality. In Proceedings of the 35th Annual ACM Symposium on User Interface Software and Technology (pp. 1-16).

Contact

For questions or if something is wrong with this repository, contact [email protected].
