I’ll admit it. I knew little about VR technology until the day I ordered a free Star Wars Google Cardboard (R2D2) for my 7-year-old son. Once I tried it, I was hooked by the elegant simplicity of VR interfaces.* All of the apps I checked out afford the user a 360-degree view of the environment, visible (as in the real world) by turning one’s head. Outside of some occasional visual squidginess around the straight-down view, the illusion is pretty convincing, particularly considering I’m using an iPhone 5S with a tiny screen. Each eye is looking at an image that’s smaller than 2” square. Yet strap this cardboard box to your face and suddenly you’re there (I highly recommend the SNL40 videos in the Vrse app).
The applications for VR are limitless: everything from marketing to education to entertainment to gaming. Or even marketing and education combined, as in this Excedrin VR campaign. Medical applications may also be possible: a relaxation or hypnosis tool, or therapy to treat phobias and social anxiety through ‘safe’ exposure. About half the apps I explored used real video; half were computer-generated (the real stuff is more fun). A couple of apps even let users insert their own videos into a 3-D environment, creating the illusion of staring at a giant, curved movie screen. As you might expect, the results there were less than impressive.
For my money, the most effective VR interfaces don’t require any clicking at all. In other words, VR has introduced an entirely new form of interaction. Let’s call it “navigate and wait.” It works like this: the user moves his head until the cursor hovers over a button (or other interactive element). Then the user sits on that button while a visual timer counts down (usually about two seconds). When the timer runs out, the button is ‘clicked’.
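The “navigate and wait” mechanic boils down to a small piece of per-frame state tracking. Here’s a minimal sketch of that dwell-to-click logic; the class name, the two-second default, and the progress-ring return value are my own illustrative assumptions, not any particular app’s real API:

```python
import time

DWELL_SECONDS = 2.0  # typical countdown before a hovered button "clicks"


class DwellButton:
    """Fires a click once the gaze cursor has hovered long enough.

    This is an illustrative sketch of the dwell-to-click pattern,
    not code from any actual VR framework.
    """

    def __init__(self, on_click, dwell=DWELL_SECONDS):
        self.on_click = on_click
        self.dwell = dwell
        self.hover_started = None
        self.fired = False

    def update(self, is_hovered, now=None):
        """Call once per frame. Returns dwell progress from 0.0 to 1.0."""
        now = time.monotonic() if now is None else now
        if not is_hovered:
            # Gaze left the button: reset the countdown and re-arm the click.
            self.hover_started = None
            self.fired = False
            return 0.0
        if self.hover_started is None:
            self.hover_started = now  # gaze just arrived; start the timer
        elapsed = now - self.hover_started
        if elapsed >= self.dwell and not self.fired:
            self.fired = True  # fire exactly once per continuous hover
            self.on_click()
        # The returned fraction is what would drive the visual countdown ring.
        return min(elapsed / self.dwell, 1.0)
```

In a render loop you would call `update()` every frame with a boolean saying whether the gaze cursor currently intersects the button; the returned fraction drives the on-screen countdown indicator.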
Among the dozen or so apps I looked at, I identified a few interaction variations, all revolving around the same basic set of principles. Here’s what I found:
1. The user presses the physical button on the VR device to trigger an action. No virtual interactive elements appear on the screen.
Example: Google Cardboard Demo – Explorer & Exhibit sections.
2. The user presses a physical button on the VR device to press a virtual button that appears on the screen. The user does not have to place a cursor on the button to press it; the button simply needs to be visible on the screen.
Example: Google Street View.
3. Pressing the physical button on the VR device causes virtual buttons to appear on-screen. The user then places a floating cursor over a button and presses the physical button again to ‘click’ the virtual button (another variation auto-clicks the virtual button after a couple of seconds).
4. The user aligns a fixed, floating cursor over virtual elements (by moving one’s head), then presses the physical button on the VR device to ‘click’ the button. Alternately, the cursor changes to indicate which objects are interactive. The user then presses the physical button to click the object.
Example: Google Cardboard Demo – Arctic Journey
5. The user aligns a fixed, floating cursor over virtual buttons (by moving one’s head) but then simply has to hover over a button for a couple of seconds to ‘click’ it. A visual timer indicates the time until activation.
Example: Fractal Flow, Jaunt
6. The buttons are placed in a fixed spot in the environment (often at the bottom, requiring the user to look down to see them). These buttons may be always present or may appear only when the user looks down.
7. Rotation-based interaction: the user rotates the viewer to perform an action (e.g., return to the main menu).
Example: Google Cardboard Demo, inVR. Also, the InMind VR app asks the user to nod his head to proceed. Pretty clever.
Though I prefer the new ‘navigate and wait’ style of interactivity for its universality and elegant simplicity, all modalities are equally valid. It all depends on what you’re going for and which devices you’re targeting. One big rule, though: never overcrowd your interface! As with any UI, too much of anything is always too much. But when everything is floating in front of your face and all the buttons are activated simply by moving your head the slightest bit, the result is downright nauseating! The Daydream.VR app literally assaults the user’s senses with way too many interactive elements, all activated by timed delay. This app will get you on the fast track to seizure-town!
If your company is interested in adding VR (or any other new technology) to your Enterprise Mobile Strategy, but isn’t sure how to go about it or where to focus your efforts, check out our Emerging Technologies Kickstart. https://www.anexinet.com/mobile-strategy/emerging-technologies/ Or just give us a call. Propelics’ team of strategists is available to answer all your mobile tech questions.
* (2 hours later I was nauseous and somewhat less hooked)
Content Strategy Lead at Anexinet