~ 4:30 Minute Read.
For me, the biggest thrill of this project was implementing a form of redirected walking. We wanted the user to walk through a wall, since the theme of the escape room was to subvert the expectations users bring with them from reality.
But at the same time, our model of the room matched the real room 1:1 in scale. And in virtual reality scale is super important, so we couldn’t simply scale down the room.
Instead, we sped up the user’s movement in one direction and slowed it down in the other, resulting in a mismatch between the real and virtual world. This worked really well because we positioned keys and puzzles so that the user had to cross the entire room multiple times, allowing us to use the full width of the room while knowing where the user would go next.
By moving the tracking origin according to the user’s relative movement, we were able to create a mismatch of around 2 meters in a roughly 5 meter long room, with the user walking three lengths.
From left to right: virtual (in blue) and real room combined, real room only, virtual room only. The green area is the HTC Vive tracking area.
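A minimal sketch of how such a tracking-origin shift could work. The function name, parameters, and the gain value here are my assumptions for illustration, not the project’s actual code: each frame, the head’s real movement along a chosen redirection axis is measured, and the tracking origin is shifted by the extra (gain − 1) portion, so the virtual head covers more ground than the real one.

```python
def update_tracking_origin(origin, prev_head, head, axis=(1.0, 0.0), gain=1.4):
    """Return the new tracking origin after one frame of movement.

    origin, prev_head, head: 2D (x, z) floor-plane positions.
    axis: unit vector of the redirected direction.
    gain: > 1 amplifies movement along the axis, < 1 slows it down.
    """
    dx = head[0] - prev_head[0]
    dz = head[1] - prev_head[1]
    # Component of this frame's real movement along the redirection axis.
    along = dx * axis[0] + dz * axis[1]
    # Shift the origin by the extra (gain - 1) portion, so the virtual
    # head moves roughly `gain` times the real distance along the axis.
    shift = (gain - 1.0) * along
    return (origin[0] + shift * axis[0], origin[1] + shift * axis[1])

# With a gain of 1.4, walking 5 m of real floor shifts the origin
# about 2 m forward, so the same walk reads as roughly 7 m in VR.
origin = update_tracking_origin((0.0, 0.0), (0.0, 0.0), (5.0, 0.0))
```

In practice you would apply this per frame with a much subtler gain, and only while the user is walking in the expected direction; the per-frame accumulation is what makes the drift imperceptible.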
If you know you are being redirected, you notice it, of course. It’s a bit like immersion in VR: you can’t fool your subconscious unless you are otherwise engaged, since you are constantly thinking “I know I’m in VR”. When engaged (e.g. by holding a key without accidentally moving their hands outside of the Leap Motion’s field of view), people would not realize what was happening.
The final effect was absolutely mind-blowing. Watching people believe they would be unable to walk through the wall and then overcome that belief is one of the most incredible things I have witnessed in quite a while.
Personally, I compare creating illusions like this in VR to magic: it has a similarly mind-blowing effect, and a lot of it is about hiding what you are actually doing from the player’s attention by keeping them distracted otherwise.
This feature was a team effort: Daniel, Roxanne and I had a lot of fun implementing it together on site and experimenting with how far we could push the effect without making people sick, keeping it just barely unnoticeable.
Simulator sickness, or motion sickness, is one of virtual reality’s biggest problems. Once you have a mismatch between visual and vestibular¹ input, your brain goes “oh my god, I might have eaten a poisonous mushroom and must be hallucinating”, resulting in your stomach being very unhappy with the situation.
We are indeed creating a visual/vestibular mismatch here, but if you don’t overdo the effect, it actually works without motion sickness. Why is that?
An effect that we’re also exploiting in an unannounced game at Vhite Rabbit: if your brain receives input about accelerating in one direction, it’s not actually picky about how much acceleration it sees, as long as the visual direction matches that vestibular input. You can literally speed up the player by a huge factor in the same direction without them getting sick.
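As a rough illustration of that idea (the function name and gain value are invented for this sketch, not taken from the actual game): amplify the player’s virtual velocity only while the real acceleration the inner ear feels points the same way as the movement being shown.

```python
def visual_velocity(real_vel, real_accel, gain=8.0):
    """Scale the velocity shown to the player while the real acceleration
    points the same way as the movement; otherwise pass it through.

    real_vel, real_accel: 2D (x, z) vectors from tracking data.
    gain: how strongly to exaggerate movement in the matching direction.
    """
    # A positive dot product means visual and vestibular direction agree.
    if real_vel[0] * real_accel[0] + real_vel[1] * real_accel[1] > 0:
        return (real_vel[0] * gain, real_vel[1] * gain)
    return real_vel
```

The key point is the direction check: the moment the shown movement deviates from the felt acceleration, the amplification has to stop, or the mismatch becomes the kind that makes stomachs unhappy.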
Tomorrow we’ll have a look at some rendering details for Unity, but also some general concepts that may be interesting to non-Unity users.
¹ The system in your inner ear that keeps track of orientation and acceleration.
Written in 60 minutes, edited in 15 minutes.