
If you’re a Unity developer looking to get into VR development, you’re in luck, because Unity is going all-in on the virtual reality front. With the announcements at Vision VR/AR Summit 2016, Unity has promised built-in support for all the major virtual reality platforms: Oculus Rift (including Gear VR), PlayStation VR, SteamVR, and Google Cardboard. Unity is also developing an experimental version of the Unity editor that can be used inside VR.

With a single game engine supporting most VR platforms, there has never been a better time to get your feet wet in VR development. So let’s do exactly that; let’s get started with VR using Unity. And don’t worry; I’ll be here to help you every step of the way.

A journey of a thousand miles begins with a single step, so let’s do some simple stuff first. There are two basic topics that we’re going to cover this time: head tracking and interaction in VR. While those sound pretty basic, they are the fundamental components of a virtual reality experience, so it’s best to understand them in the beginning.


There are a couple of requirements that need to be satisfied before we start, however. We’re going to need:

  • Unity 5.3 or above
  • Windows 7 or above
  • Oculus Rift device (optional)
  • Oculus Runtime (only if you have an Oculus Rift)

While it’s recommended to have an Oculus Rift ready when you’re developing a VR app, it is not required.

Starting out

All right then, let’s get started. I’m going to assume that you’re already familiar with programming and Unity in general, so we won’t go into too much detail with that.

Let’s make a new Unity 3D project. Once you’ve made a new project, you need to turn on VR support by going to Edit > Project Settings > Player. In the Other Settings section, check the Virtual Reality Supported box. Don’t forget to make sure that your selected build platform is PC, Mac & Linux Standalone.

The next step is to create a scene and add some objects there. Any objects will do, but in my case, I just added a cube at the center of the scene. Then we create an empty GameObject and add (or move) the main camera as its child object.

This part is important—the main camera has to be a child of another object because when you’re developing for VR, you cannot move the camera object by yourself. The VR system will track your movement and adjust the camera position accordingly, overriding any change you made to the camera position. So, to have your camera moving, you’ll have to modify the position of the camera’s parent object instead of the camera itself.
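To make that concrete, here’s a minimal sketch of a movement script you could attach to the camera’s parent object (the script name and the use of Unity’s default Horizontal/Vertical input axes are my own assumptions, not part of any official VR sample):

```csharp
using UnityEngine;

// Attach this to the camera's PARENT GameObject, never to the camera itself.
// The VR system owns the camera's local position and rotation.
public class CameraRigMover : MonoBehaviour
{
    public float speed = 2f;

    void Update()
    {
        // WASD / arrow keys move the whole rig; the tracked camera
        // keeps following the player's head inside the rig.
        Vector3 input = new Vector3(Input.GetAxis("Horizontal"), 0f,
                                    Input.GetAxis("Vertical"));
        transform.Translate(input * speed * Time.deltaTime);
    }
}
```

If you attached this to the camera directly, the VR runtime would overwrite your changes every frame, which is exactly why the parent object exists.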

Now, if you have an Oculus Rift on hand, you’re set and ready to go! All you need to do is connect the device to your PC, and when you press play, your head movement will automatically be tracked and applied to the in-game camera, letting you see your scene in VR on your Oculus Rift.

Don’t fret if there’s no Oculus Rift available for you to use. We’ll just have to simulate head movements using this script (by Peter Koch from talesfromtherift). Copy that script to your project and attach it to your main camera. Now, if you play the project in the editor, you can rotate the camera freely by holding the Alt button and moving your cursor around.
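If you’d rather not grab an external file, the core idea of that simulator is small enough to sketch here. The following is a minimal approximation of the same technique, not Peter Koch’s actual script (the class name and sensitivity value are my own):

```csharp
using UnityEngine;

// Editor-only stand-in for head tracking: hold Alt and move the mouse
// to rotate the camera, roughly mimicking what the HMD would do.
public class EditorHeadSimulator : MonoBehaviour
{
#if UNITY_EDITOR
    public float sensitivity = 2f;
    private float mYaw;
    private float mPitch;

    void Update()
    {
        if (Input.GetKey(KeyCode.LeftAlt))
        {
            mYaw += Input.GetAxis("Mouse X") * sensitivity;
            mPitch -= Input.GetAxis("Mouse Y") * sensitivity;
            mPitch = Mathf.Clamp(mPitch, -85f, 85f); // avoid flipping over
            transform.localRotation = Quaternion.Euler(mPitch, mYaw, 0f);
        }
    }
#endif
}
```

The `#if UNITY_EDITOR` guard keeps the fake rotation out of real builds, where the headset should be the only thing driving the camera.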


Okay, time to step things up a notch. It’s not really an app if you can’t interact with it, so let’s inject some interactivity into the project. We’ll add the most basic interaction in virtual reality—gazing.

To add gazing to our project, we’re going to need this script. Copy and attach it to the main camera. Now when you play the project, if you look at a GameObject that has a Collider, the object will slowly turn red. If you click while looking at an object, that object will be teleported to a random position. Interaction is fun, isn’t it?

Well, let’s dive deeper into the script that enables that gazing interaction. Basically, at every frame, the script will check whether you’re looking at an object by casting a ray forward from the camera and seeing whether it hits an object or not.

//Cast a ray forward from the camera, up to GAZE_LENGTH units away
RaycastHit hit;
if (Physics.Raycast(new Ray(transform.position, transform.forward), out hit, GAZE_LENGTH))

When the cast ray hits an object, it will check whether it’s the same object as the previous frame or not. If it’s the same object, a timer will be updated and the script will change the color of the object according to that timer.

//Increase gaze time
mGazeTime += Time.deltaTime;
//Fade the green and blue channels toward 0 as the timer fills, turning the object red
float color = (MAX_GAZE_DURATION - mGazeTime) / MAX_GAZE_DURATION;
ColorObject(mObject, new Color(1f, color, color));
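The same-object comparison itself isn’t shown above, so here’s a hedged sketch of how that branch might look. I’m reusing the names from the snippets in this post (`mObject`, `mGazeTime`, `ColorObject`); whether the linked script does exactly this is an assumption on my part:

```csharp
if (hit.collider.gameObject == mObject)
{
    //Still gazing at the same object: let the timer keep running
    mGazeTime += Time.deltaTime;
}
else
{
    //Gaze moved to a new object: restore the old one and restart the timer
    if (mObject != null)
        ColorObject(mObject, Color.white);
    mObject = hit.collider.gameObject;
    mGazeTime = 0f;
}
```

Resetting the timer on a gaze change is what prevents a quick glance across several objects from accidentally triggering one of them.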

Interacting via clicks is just as simple. After the script has detected that there’s an object, it checks whether the left mouse button is clicked or not. If the button is clicked, the script will then move the object to another position.

if (mObject != null && Input.GetMouseButtonUp(0)) {
	//Move object elsewhere
	float newX = Random.Range(-8f, 8f);
	float newY = Random.Range(-2f, 2f);
	float newZ = Random.Range(0f, 3f);
	mObject.transform.position = new Vector3(newX, newY, newZ);
}

Right now the script, while functional, is very basic. What if you want different objects to behave differently when they’re being looked at? Or what if you want the script to interact with only certain objects? Well, I figure I’ll leave all that to you as an exercise.
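As a hint for that exercise, one possible direction (entirely my own suggestion, not part of the gaze script) is to define an interface that each object implements, so the gaze script doesn’t need to know what any particular object does:

```csharp
using UnityEngine;

// Hypothetical interface: each object decides its own reaction to gaze.
public interface IGazeTarget
{
    void OnGazeComplete(); // called when the gaze timer fills up
}

// Example implementation: this object spins instead of teleporting.
public class SpinOnGaze : MonoBehaviour, IGazeTarget
{
    public void OnGazeComplete()
    {
        transform.Rotate(0f, 180f, 0f);
    }
}

// In the gaze script, instead of teleporting directly, you would do:
//   var target = hit.collider.GetComponent<IGazeTarget>();
//   if (target != null) target.OnGazeComplete();
```

Objects without an `IGazeTarget` component simply get ignored, which also answers the “only certain objects” question for free.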

This concludes the beginning of our journey into VR development. While we’ve only scratched the surface of the development process in this post, we’ve learned enough to actually make a functioning VR app or game. If you’re interested in working on more VR projects, check out the Unity VR samples. They have a bunch of reusable code that you can use in your own VR projects. Good luck!

About this author

Raka Mahesa is a game developer at Chocoarts who is interested in digital technology in general. Outside of work hours, he likes to work on his own projects, with Corridoom VR being his latest released game. Raka also regularly tweets as @legacy99.
