Strapping on an Oculus Rift


Last week, we were fortunate enough to have Iestyn Lloyd come by the studio to demo an Oculus Rift. He wasn’t trying to sell us one – Iestyn is a developer first and foremost as well as a part-time Unity3D/Oculus Rift evangelist – he’s just keen that plenty of people experience what it’s actually like to use one. We were certainly familiar with them; here at MB our Ben can often be found gazing at eBay listings where naughty people try to sell on their dev kits for inflated prices. “We should just get one; they’re awesome.” “Hmm. Not sure, Ben; looks like a bit of a gimmick.” I take that back now.

The sensation

The Oculus Rift is a headset that you strap over your eyes; it’s not too heavy and it’s not too bulky. Within the headset are a couple of lenses that point at your eyeballs. So far, so straightforward. No setup, no configuration – just pop it on. But as your eyes adjust, the experience is, well, unnerving. Move your head, and the environment responds accordingly. Look up, look down, tilt and pitch – there’s no lag; you’re simply in that environment. Your legs wobble a bit (I got a bit sweaty, but apparently that’s normal) as you’re guided around on rails. Use a controller like the Razer Hydra that was used for our demo and you’re able to move about rather than being stuck on a predetermined course. This is a screen – a display – with no seams or edges. This isn’t a large display that simply takes up a lot of your field of view; it is your view.


Old tech / new tech

Stereoscopic imaging (presenting your eyes with two offset left and right images that your brain processes as a single image with depth) isn’t new as such. Those inescapable, shitty 3D films employ circular-polarised lenses to generate the same effect, and consumer electronics companies want you to buy their TVs and active shutter specs so you can experience this at home. Lo-fidelity uses such as jitter GIFs and Victorian stereoscopy work on a similar principle, too. In the Museum of London, there’s an invitation to a fundraiser to complete Brunel Sr’s Thames Tunnel that’s a stereoscopic view of what the banquet, held in the half-finished tunnel itself, was going to be like. “Come to our party! We won’t just tell you what it’s like, we’ll show it to you.” Issued around 1834, this artefact (image above) is still utterly compelling. It draws you in; you want to head down there and join everyone for that meal under the Thames.

The anticipation

We have a Leap Motion controller kicking around in the studio. It’s unobtrusive, it’s very well made, and the experience from unboxing through to up-and-running is generally pretty exemplary. The ecosystem is rational and smart; there’s a bunch of nascent apps and there’s decent dev support (especially important when viewed alongside Microsoft’s token dev support for the Kinect, when they launched their non-commercial, Visual Studio-only ‘Kinect for Windows SDK’). But it’s still just a hand-wavy controller, albeit a very robust and sensitive one.

The Leap presents itself as a paradigm shift in interface controllers; although it’s fantastic, it doesn’t feel like a massive departure from how we as users interface with devices. The Oculus Rift presents itself as a far more staid proposition: “what do you want to use this for?” It isn’t just a different means of displaying content, it’s a different means of displaying different types of content. Thanks to Unreal Engine and Unity support, a lot of the demonstrations are either straight game ports or riffs on gaming metaphors (sort of – I’m being broad here). My point is this: people are going to start using this for all manner of weird and wonderful things that simply won’t translate to screens on tablets, phones, tellies or computers.

A year ago, this project was in its infancy. It’s not that it didn’t exist – after all, a Kickstarter campaign doesn’t appear from dust, and it’s also extremely apparent that the initial R&D sounds more labour of love than quick pitch and bootstrap (read more in this great post on Eurogamer). The next 12 months are going to see the displays upped to high definition, improved pipelines, increased adoption, as well as the effect that newly appointed CTO John Carmack will have. This is going to get exciting.