Thursday, August 15, 2013

Augmented Virtual Reality

One of the moments in recent gaming history that made the biggest impression on me was Heavy Rain's ARI, the Added Reality Interface.

When I read this Penny Arcade Report article about how the developers of City Quest implemented Oculus Rift support for their 2D game by creating a virtual basement for you to play it in, I was immediately impressed. Lots of developers are coming up with clever ways to use the Rift to create completely new and unique experiences, but I haven't seen much in the way of using the Rift to augment traditional gaming experiences. The Ibex Virtual Reality Desktop is one such project, but it remains a rare exception. In City Quest's Rift edition, you can look around and see the computer the game is running on, decorations from 80s nerd culture, the doorway leading out of the room, and even hands (though lacking virtual arms) using the mouse and keyboard:

While the dev's reddit thread announcing this feature calls it the "silliest way they could think of" to implement support for the Rift, I believe that the concept of using the Rift to create virtual spaces where we can interact with augmented reality-style interfaces has a lot of potential.

With only a Rift, you're limited to a traditional form of input like a keyboard and mouse or a game controller. Of course, while wearing the Rift, when you look down you see what's in the game, not what's on your desk. You're totally blind to where the real-world keyboard is, or where the keys are on it. If you want to move your hand from the mouse to the keyboard, or press a hard-to-reach key like F10, you have to grope around and rely on proprioceptive memory. However, if there were a way to project your real keyboard, mouse, and hands into the virtual space, those clumsy moments would be no worse than glancing down in real life.
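
To make the idea concrete, here's a toy sketch of the "project your desk into the game" concept - my own illustration, not anything City Quest actually does. It grabs a frame from a hypothetical downward-facing webcam, keeps roughly skin-colored pixels as a crude stand-in for proper hand tracking, and composites them over a placeholder game frame. The camera index and color thresholds are assumptions:

    import cv2
    import numpy as np

    cap = cv2.VideoCapture(0)  # assumption: downward-facing webcam at index 0

    while True:
        ok, desk = cap.read()
        if not ok:
            break

        # Stand-in for the game's rendered frame, matched to the camera size.
        game = np.zeros_like(desk)
        game[:] = (40, 20, 20)  # a dim "virtual basement" backdrop

        # Crude skin segmentation in YCrCb space; a real product would use
        # depth cameras or model-based tracking instead of color thresholds.
        ycrcb = cv2.cvtColor(desk, cv2.COLOR_BGR2YCrCb)
        mask = cv2.inRange(ycrcb, (0, 135, 85), (255, 180, 135))
        mask = cv2.medianBlur(mask, 7)

        # Composite the real hands (and whatever they're resting on) over
        # the virtual scene.
        out = np.where(mask[..., None] > 0, desk, game)
        cv2.imshow("augmented view", out)
        if cv2.waitKey(1) == 27:  # Esc to quit
            break

    cap.release()
    cv2.destroyAllWindows()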

3Gear Systems and Leap Motion are two of several companies already working on machine-vision products that could provide this solution. A camera looking down at your hands could conceivably track not only the position of your hands, but also the devices they interface with. I believe keyboard layouts are standardized enough - a main block of letters and numbers, function keys clustered in fours, and so on - that the positions of the printed key labels, combined with some intelligence (or some guided manual calibration), could automatically produce a virtual representation of any given keyboard. Mice would be harder to represent accurately, since there are so many variations in mouse contours, button sizes, and side buttons that would be hard to identify precisely with an overhead camera. However, I believe the main difficulty with using a mouse with the Rift would simply be locating it with your hand, so a simple representation (or perhaps one chosen from a selection of popular models) should suffice.
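
As a rough illustration of that calibration step (every name, coordinate, and pixel position below is made up for this sketch), suppose an OCR pass over the overhead camera image has located a few printed key labels. A least-squares similarity transform (scale, rotation, translation) can then map a canonical layout onto the observed keyboard, predicting the position of every key the camera never directly identified:

    import numpy as np

    # Canonical key positions (in key-widths) for a standard ANSI layout;
    # only a few anchor keys are listed here for brevity.
    CANONICAL = {
        "Q": (1.5, 1.0), "P": (10.5, 1.0),
        "A": (1.75, 2.0), "L": (9.75, 2.0),
        "Z": (2.25, 3.0), "M": (8.25, 3.0),
        "F10": (10.0, -1.0),  # function row sits above the number row
    }

    def fit_similarity(src, dst):
        """Least-squares scale + rotation + translation mapping src -> dst
        (Umeyama-style closed form; reflection handling omitted for brevity)."""
        src, dst = np.asarray(src, float), np.asarray(dst, float)
        mu_s, mu_d = src.mean(0), dst.mean(0)
        s, d = src - mu_s, dst - mu_d
        U, S, Vt = np.linalg.svd(d.T @ s / len(src))  # cross-covariance
        R = U @ Vt
        scale = S.sum() * len(src) / (s ** 2).sum()
        t = mu_d - scale * R @ mu_s
        return scale, R, t

    # Suppose the camera and OCR spotted these three printed letters (pixels):
    observed = {"Q": (212, 310), "P": (585, 318), "Z": (248, 392)}

    keys = list(observed)
    scale, R, t = fit_similarity([CANONICAL[k] for k in keys],
                                 [observed[k] for k in keys])

    # Now every key can be placed in the camera image - even that
    # hard-to-reach F10 the camera never saw directly.
    for name, pos in CANONICAL.items():
        px = scale * R @ np.asarray(pos) + t
        print(f"{name:>3}: ({px[0]:6.1f}, {px[1]:6.1f}) px")

With three or more detected labels the fit is overdetermined, so a few OCR misses or a slightly nonstandard layout wouldn't break the calibration.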

Once you're set up in your augmented reality Rift office, how could you take advantage of working in virtual reality? I have more ideas about that, which this blog post is too narrow to contain.

Tuesday, August 6, 2013

Augmented VR brainstorming

"It’s easy to mess with someone who is in virtual reality, although doing so is borderline cruel. Anything that increases the dissonance between what the person inside is seeing and hearing from the game and the stimulus from the “real” world will be distracting, and in many cases disturbing.

If your brain thinks it’s in an underwater ship, why can you hear people faintly talking? You look behind yourself, but you see only the back of your vessel, although that’s where you hear the voices. Being touched is nearly intolerable for some people after they’ve settled into a virtual world; it is incredibly difficult to feel someone’s hands on your body when you look around and see no one around you."

Why not take advantage of the dissonance? These sound like amazingly immersive tools for a psychological horror game. It sounds a little impractical, but with something like a second screen or tablet relaying instructions to a partner (using a timer or a third-person view to ensure the effects are synchronized with the game), you could combine in-game effects with physical sensations at key dramatic moments.
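
The synchronization itself could be as simple as this sketch (all the names and timings here are hypothetical, and console prints stand in for the tablet UI): the game schedules a physical cue a few seconds before the scripted scare, and the partner's second screen counts down so the real-world touch lands exactly on the in-game beat:

    import threading
    import time

    def schedule_cue(instruction: str, lead_seconds: float) -> None:
        """Show the partner an instruction and count down to the exact moment."""
        def countdown():
            deadline = time.monotonic() + lead_seconds
            print(f"PARTNER CUE: {instruction}")
            while (remaining := deadline - time.monotonic()) > 0:
                print(f"  ...in {remaining:4.1f}s", end="\r")
                time.sleep(0.1)
            print("\n  NOW - sync with the in-game effect")

        threading.Thread(target=countdown, daemon=True).start()

    # In-game scripted moment: the groundskeeper reappears in five seconds.
    schedule_cue("tap the player's right shoulder", lead_seconds=5.0)
    time.sleep(5.0)
    print("GAME: he's standing there, smiling, as if nothing happened")
    time.sleep(0.5)  # let the countdown thread finish printing

A shared third-person camera view of the player would serve the same purpose without the fixed timer, letting the partner react to the game rather than a countdown.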

Imagine seeing an elderly groundskeeper get snatched away by something lurking in the darkness around the edge of your lodge, then later searching his deserted room, and suddenly feeling a gentle tap on the back of your shoulder - you turn around and he's standing there, as if having appeared from thin air, smiling and talking to you like nothing strange has happened since the start of the game. Your character could be walking and suddenly hear a visceral squelching sound while you feel a tugging on your real feet, only to look down and see that the carpet has been replaced by a writhing mass of grabbing fleshy fingers. Or perhaps while running away from an otherworldly horror in-game, your partner could be lightly brushing or grabbing at the back of your shirt and pants...or even neck.

The partner would probably need to be careful and remain a safe distance away from the gamer while carrying these out.