We recently hosted our second ITP Academy event all about Virtual and Augmented Reality, following our hugely successful first edition about the Internet of Things. If you want to know why we are so enthusiastic about VR and everything related to it, head over to our main blog and read Frederik’s article and slides from the event. Seriously, go read them now - they’re really good.
In short: at In The Pocket we think VR and AR form an exciting new medium that will let us experience and create content, and communicate and connect with each other, in ways we have never been able to before. The technology is already deeply impressive, and it will only keep improving. We are confident that in the coming years, VR and AR will start playing a vitally important role in our digital lives.
So how can you join in the fun? We’ve been playing around with the available hardware, platforms and SDKs for a while and have found some great resources to quickly get started with VR development. If you’re a developer and you’ve been itching to build virtual worlds of your own, keep reading.
Choosing a Platform
There are many types of VR headsets on the market right now, each with their own capabilities and limitations. We expect that most of them will eventually converge on a shared set of features and performance characteristics, just like smartphones did - but right now it’s a good idea to design your app with your target audience’s hardware in mind.
On one end of the spectrum are PC-powered, motion-tracked headsets like the Oculus Rift and HTC Vive. These are in the € 600 to € 800 range and require a beefy gaming rig that costs at least as much. In return, you get the amazing graphical horsepower of today’s GPUs and full motion tracking of the headset and controllers. Your user can turn and walk around, reach for objects or UI in the virtual space, and use their whole body to intuitively interact with your app. Interacting with virtual reality in this way can be deeply awesome.
On the other end of the spectrum are cheap, simple VR devices like Google Cardboard and commercially sold (mostly plastic) equivalents. Most of these are passive devices - glorified phone cases with lenses that offer a basic VR experience that can be powered by any smartphone. They are certainly your best bet if you want to reach as large a market as possible, but you have no guarantees about the processing power or display quality of your user’s hardware - and there are no good options for input.
A good compromise between these two are active phone-powered headsets like the Samsung GearVR and the recently announced Google Daydream hardware spec. The GearVR, for example, requires a high-end Samsung smartphone so you have a target to optimize your app’s performance against. It has good lenses, accurate sensors inside the headset for better head tracking, and a simple but effective input solution in the form of a touchpad on the side of the device. Plus it benefits from a software layer developed by Oculus that is deeply integrated with the Android platform, letting you squeeze every drop of performance (and if you’re not careful, battery life!) out of your user’s phone.
According to a tweet by Oculus founder Palmer Luckey, as of a few weeks ago more than one million smartphones have been plugged into a GearVR at some point. That makes the $100 headset the most widely adopted VR hardware to date, and thus a good target if you want to reach a large audience but don’t want to compromise too much on the quality of your app. So we’ll go with the GearVR for the rest of this article.
The Right Tools for the Job
Unity has been a popular game development platform for close to a decade, particularly for web and mobile games. The game engine and toolset are very approachable to beginning game developers, it offers a wide range of features and subsystems that integrate well with each other - and most importantly, it deploys to pretty much any platform you can think of. One Unity project can, with very minor configuration changes, deploy to Android, iOS, the web, Windows, Mac, all the major game consoles, Apple TV and others. And of course it now has broad support for all important VR and AR platforms as well.
Oculus has said that close to 90% of all apps created for the Rift and GearVR are built with Unity. Other good options exist: the Unreal Engine, for example, is at least as powerful and full-featured as Unity and has been shifting its focus to VR development as well. But as a toolset it has a much higher learning curve than Unity, and as such it’s mostly being used by large game studios for triple-A game production.
To get started with GearVR development using Unity, you’ll need the following:
- A GearVR and compatible Samsung phone
- Unity 5, Personal Edition (free)
- Oculus Utilities for Unity 5 to target the GearVR
- Oculus signing tool to deploy to your device
For our example project you’ll also need:
- Blender (free and open source) for modelling and importing 3D content
- The starter Unity project for our demo game (18MB)
Demo game: “Don’t Look Back!”
For our technical workshop at the ITP Academy event, we wanted to demo a short game experience specifically designed around the features and limitations of the GearVR. We were very impressed with Land’s End, a game by the creators of the very successful mobile game Monument Valley. In Land’s End the player explores a strangely beautiful landscape, navigating around and solving puzzles - all without using a controller of any kind. All interaction with the world and UI happens by looking at trigger points and using head movements to manipulate physics objects. We’re big fans of minimal UI and elegant input solutions at In The Pocket, so we decided to run with this idea!
We want to create a short level in some kind of moody cave that the player can move around in. As the player goes deeper into the cave, they need to solve a small “puzzle” by finding trigger points in the environment. Solving the puzzle opens a door to the end of the level; but it also lets loose a monster that chases the player. The game then becomes about navigating the rest of the level as quickly as possible, so the player can reach the finish before the monster gets them. Here’s an overview of the level:
Setting the Scene
If you download the Unity project for Don’t Look Back above, you’ll find an “App” scene that contains a cave-like 3D environment. The Unity Editor itself isn’t really a modelling tool, so we used a free version of the excellent ProBuilder plugin to construct most of the environment. More detailed parts like the glowing blue crystals were modeled using Blender. The environment is mostly textureless, relying purely on geometry and lighting to give shape to the world. This is becoming a trend in 3D graphics and gaming - not only because it looks nice and minimalistic, but also because it helps maintain good performance on relatively low-powered graphics hardware.
Solid performance is a must-have in all VR applications, but particularly on the GearVR. For a smooth and pleasant user experience, the phone’s GPU needs to render the game at a resolution of up to 1280x1440, 60 times per second, for each eye. Those are numbers even high-end PCs had trouble with a few years ago! If we want a smartphone to deliver them, we need to optimize our scene as much as we can. The best way to do this is to offload as much computation as possible from runtime to build-time; in other words, pre-process everything that doesn’t need to change during gameplay.
All level geometry in our scene is marked as static; this lets the individual pieces be combined into a single mesh and allows Unity to bake lighting into lightmaps. The effect of this is easy to spot by enabling the “Stats” overlay on the Game panel. If we take a look at the statistics while in editor mode and compare them to play mode, we see the number of Batches (also known as draw calls, meaning chunks of geometry that the CPU sends to the GPU to draw) drops from around 100 to around 30 at the most. Fewer draw calls means the CPU doesn’t waste time and power submitting batches of geometry to the GPU, and the GPU wastes less time waiting for the CPU to finish.
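Flagging geometry as static in the Inspector only helps objects that exist at build time; anything you spawn at runtime misses out on build-time batching. As a rough sketch (the `levelRoot` field is a hypothetical parent object for spawned geometry, not something from our project), Unity’s StaticBatchingUtility can combine such meshes once they are in place and will never move again:

```csharp
using UnityEngine;

// Sketch, not from the demo project: combines runtime-spawned level
// geometry into a single static batch once it is in its final position.
// Assumes the children of `levelRoot` share materials and never move again.
public class RuntimeBatcher : MonoBehaviour
{
    public GameObject levelRoot; // hypothetical parent of spawned geometry

    void Start()
    {
        // Merges all meshes under levelRoot so they render in far fewer
        // batches, much like editor-time static batching would.
        StaticBatchingUtility.Combine(levelRoot);
    }
}
```

For the level in Don’t Look Back this isn’t needed - everything is placed in the editor and simply marked static there - but it’s a useful trick for procedurally built scenes.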
Not everything can be precomputed, though: non-static objects, like the spinning “monster” model in the starting area or the moving doors later in the level, have animation and dynamic lighting applied to them so they need to be rendered as separate batches. Since these objects are fairly simple and don’t cast lights or shadows, we can get away with a small number of them on-screen just fine.
There is much more to be said about 3D graphics performance in the context of VR, and we can’t cover everything here. In general, if you’re seeing bad performance in your app it’s a good idea to follow one of our favorite mantras: always measure, never assume. Unity provides a very useful profiler that can show you frame-to-frame where the hardware is spending its time and where the bottlenecks in your app are. Debug builds will even stream profiling data from your test device to the Unity Editor on your Mac or PC in real time. In short, the profiler is your friend and you should always keep an eye on it when testing your game or app.
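Beyond the built-in per-frame overview, you can make your own code show up as named entries in the CPU profiler by wrapping it in profiler samples. A minimal sketch (the `UpdateChaseLogic` method is a hypothetical stand-in for your own gameplay code; note that in older Unity 5 releases the `Profiler` class lives directly in `UnityEngine` rather than `UnityEngine.Profiling`):

```csharp
using UnityEngine;
using UnityEngine.Profiling; // in early Unity 5 versions, Profiler is in UnityEngine

// Sketch: wrapping suspect gameplay code in named profiler samples so it
// appears as its own row in Unity's CPU profiler timeline.
public class MonsterController : MonoBehaviour
{
    void Update()
    {
        Profiler.BeginSample("Monster chase logic");
        UpdateChaseLogic();
        Profiler.EndSample();
    }

    void UpdateChaseLogic()
    {
        // hypothetical: pathfinding, chasing the player, etc.
    }
}
```

This makes it much easier to tell whether a frame spike comes from your own scripts or from rendering.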
Integrating the Oculus SDK
Right, so now we have a pretty nice game environment that’s optimized for fast performance on mobile. Let’s see it in glorious Virtual Reality!
Integrating the Oculus Utilities for Unity is extremely simple; just go to Assets > Import Package in Unity and select the .unitypackage you downloaded from the Oculus developer website. When you confirm the dialog that pops up, a bunch of scripts and prefabs will be added to the Unity project. The most important of these are the different VR-enabled player controllers:
- OVRCameraRig is a simple static VR point of view. When you add it to your scene and play the game on a headset, it will be the point you’re seeing the world from. It has no concept of the player’s size, physics or movement.
- OVRPlayerController is a fully dynamic player object that extends the basic setup of OVRCameraRig with a player bounding box for collisions with the environment, plus default controls for movement and jumping.
OVRPlayerController is a great drop-in solution that works in a lot of cases, but for Don’t Look Back we don’t need the functionality (and complexity) it adds. For our player, let’s drag an OVRCameraRig prefab into the scene and put it in the starting room. If you run the game inside the editor, you’ll now be able to look around with the mouse. Once we run it on a VR headset, the camera rig will create a camera for each eye and render in stereo; and orientation will of course no longer be set by your mouse but by the physical orientation of your head.
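Since all interaction in our game happens by looking at trigger points, the core input mechanic boils down to a raycast from the player’s point of view. Here’s a minimal sketch of the idea - the `GazeTrigger` component and its `OnGazed` callback are hypothetical names for illustration, not part of the Oculus Utilities:

```csharp
using UnityEngine;

// Hypothetical base component for objects the player can activate by
// looking at them (door switches, crystals, ...).
public class GazeTrigger : MonoBehaviour
{
    public virtual void OnGazed()
    {
        // e.g. open a door, light up a crystal, start the monster chase
    }
}

// Sketch of gaze-based input: every frame, cast a ray from the center
// of the player's view and notify any GazeTrigger it hits.
public class GazeCaster : MonoBehaviour
{
    public Transform eye;        // e.g. the CenterEyeAnchor of the OVRCameraRig
    public float maxDistance = 20f;

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(eye.position, eye.forward, out hit, maxDistance))
        {
            var trigger = hit.collider.GetComponent<GazeTrigger>();
            if (trigger != null)
                trigger.OnGazed();
        }
    }
}
```

In a real implementation you’d probably add a dwell timer (so the player has to look at a trigger for a moment) and some visual feedback, but the raycast above is the heart of it.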
Deploying to GearVR
Deploying the game to GearVR is incredibly simple, but you’ll need one crucial piece of data first: a signature file for your GearVR-connected smartphone. Unity needs this signature to sign the generated .apk at the end of the build process; otherwise your phone will refuse to run the app. Instructions for generating a signature file for your device can be found here; once you have the file, add it to your Unity project.
To build the app directly to your phone, connect it to your PC or Mac via USB. Open the File menu in Unity, go to Build Settings > Android and hit Build and Run. Unity will generate an .apk file for you, sign it for deployment on your device using the signature file we added, and copy it to your phone. Once the game is installed, detach the USB cable and plug the phone into the GearVR. And voilà, you’re inside your own virtual creation!
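If you find yourself doing this often, the same build-and-run step can be scripted. The sketch below assumes a scene at "Assets/App.unity" and an output path of our choosing - adjust both to your project; the script must live in an Editor folder since it uses UnityEditor APIs:

```csharp
using UnityEditor;

// Editor-only sketch: a menu item that triggers the same Android
// build-and-deploy as File > Build Settings > Build and Run.
// Scene and output paths below are assumptions; change them to match
// your project.
public static class GearVRBuilder
{
    [MenuItem("Tools/Build and Run on GearVR")]
    static void BuildAndRun()
    {
        BuildPipeline.BuildPlayer(
            new[] { "Assets/App.unity" },  // scenes to include in the build
            "Builds/DontLookBack.apk",     // where to write the signed .apk
            BuildTarget.Android,
            BuildOptions.AutoRunPlayer);   // install and launch on the connected device
    }
}
```

This is handy for iterating quickly, since you can rebind it to a keyboard shortcut and skip the Build Settings dialog entirely.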
To be continued
Thus far we’ve taken a look at the game environment, looked at some techniques Unity affords us to get maximum performance out of the graphics hardware, and added a VR-enabled player object so we can look around the scene on the GearVR.
In the next post, we’ll look at some custom script components to add player movement, puzzle solving and monster chase gameplay; and we’ll conclude by listing some excellent online resources for you to learn more about VR development.
Thanks for reading, and see you next time!