
Google Cardboard continues to be one of the primary ways that many creators and the general public are getting their first taste of virtual reality. In this three-part series for Make:, I have been exploring how people used to building things in the physical world can get involved with the exciting virtual one that is emerging. We have reached the last piece of the puzzle — allowing our users to interact with the virtual world they see!

If you are completely new to Unity, I’d recommend reading the first article on getting started with Unity. If you’ve already tinkered with a bit of Unity in the past, but haven’t tried out the Google Cardboard SDK, the second article on building for Google Cardboard in Unity is the place to start! If you’ve got those bases covered, this final article in the series will cover adding interactivity to your Google Cardboard virtual reality apps via gaze and touch input.

Adding a physics raycaster to the camera

In order to determine what object a user is looking at, we need to add a physics raycaster to our Cardboard SDK’s camera. Imagine a straight line that shoots out from the camera and intersects whatever the camera is facing — that’s the idea of a raycaster. We then use that to trigger either a touch or gaze input action, which we’ll explore further below.
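We'll add the component through the inspector below, but for the curious, the same step could also be done from a script. This is purely an illustrative sketch, and it assumes the Cardboard camera is the one tagged "MainCamera":

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Sketch: attaches a PhysicsRaycaster to the main camera at startup
// if one isn't already present.
public class RaycasterSetup : MonoBehaviour {
    void Start() {
        Camera cam = Camera.main;
        if (cam != null && cam.GetComponent<PhysicsRaycaster>() == null) {
            cam.gameObject.AddComponent<PhysicsRaycaster>();
        }
    }
}
```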

To add a physics raycaster to our camera, we go to our object hierarchy on the left and choose CardboardMain > Head > Main Camera. This is our camera object that follows the user’s headset viewing angle and direction. When you have this selected, go to the inspector column on the right, scroll to the bottom and click the “Add Component” button:

addingphysicsraycaster1

In the menu that appears, you will see a search box that will let you filter the objects inside it. Type in “physics” and then click “Physics Raycaster” (Note: we do not want “Physics 2D Raycaster,” so be careful with which one you choose here!).

addingphysicsraycaster2

When you click that, it should appear within your inspector like so:

addingphysicsraycaster3

Setting up objects so we can interact with them

Now that we have a physics raycaster, we need to ensure that the raycaster can detect the objects in our scene. To do this, we need to add a “collider” to these objects.

When you create some objects in Unity, they will already come with a collider by default. For example, adding a cylinder to a Unity project gives it a capsule collider component by default. This wraps tightly around the shape so that any time something touches this collider, we know for certain it is touching the object itself. The collider is visible as the green wireframe outline that appears when you click the object:

collider1

If we look at our collider’s settings inside the “Capsule Collider” component on the right, we can see quite a few things that are editable. The main ones that you’ll tend to change are:

  • The center point — This moves the collider around the object, which can be useful if you want only a part of the object to trigger events.
  • The radius — This makes the collider wider or narrower, giving it more/less buffer room around the object itself.
  • The height — This makes the collider taller or shorter, giving it more/less buffer room on the other axis.

For example, if we enlarge the cylinder collider’s radius and height, we get the following:

collider2
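Those same settings can also be tweaked from code if you prefer. A quick sketch, with arbitrary example values rather than ones used in this tutorial:

```csharp
using UnityEngine;

// Sketch: adjusts a capsule collider's center, radius and height in code.
// The values below are arbitrary examples.
public class ColliderTweaks : MonoBehaviour {
    void Start() {
        CapsuleCollider capsule = GetComponent<CapsuleCollider>();
        if (capsule == null) return;

        capsule.center = new Vector3(0f, 1f, 0f); // shift the collider upward
        capsule.radius = 2f;                      // widen the buffer around the object
        capsule.height = 5f;                      // extend it vertically
    }
}
```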

Chances are high that most of the time you won’t need to change the collider values for these sorts of objects. However, custom objects (which you are likely to be using in your projects!) do not have colliders by default. If I click on our Makey robot, you will see it doesn’t have any collider component at all. This is something many beginner developers miss, and it’s one of the more common problems people ask me about when they’re stuck getting started with Google Cardboard and Unity. Make sure that every object you want to interact with has a collider!

To add a collider to our custom object, we go to “Add Component” once again and type in “collider” to filter to the different collider types. There are a range of collider shapes to suit different objects. Feel free to experiment and choose the one that fits your object best.

collider3

Often, having a bit of a buffer around the object isn’t a bad thing — it can make it easier for your user to select or interact with the object. For the Makey robot, I’ve chosen a “Box Collider” because it was able to cover the robot’s overall physical space and a little bit extra in case the user wasn’t quite accurate enough with their glance.

When first creating a collider, you may struggle to actually see the collider itself! Colliders aren’t automatically sized to cover the object — you need to change the center point and size to cover it yourself. When a box collider first appears, it appears as a small 1×1×1 cube:

collider4

We then make it bigger, setting the size to 30×20×30 (adjust these values for your own custom object; you will see how well it fits by watching the green lines grow around it). You may also have to move the collider a little off center to cover the whole object — I had to move the robot’s collider up by 4:

collider5
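For reference, the same box collider setup (the 30×20×30 size and the 4-unit upward nudge from above) could be expressed in a script like this sketch:

```csharp
using UnityEngine;

// Sketch: adds and sizes a box collider to match the values chosen
// for the Makey robot above.
public class RobotCollider : MonoBehaviour {
    void Start() {
        BoxCollider box = GetComponent<BoxCollider>();
        if (box == null) {
            box = gameObject.AddComponent<BoxCollider>();
        }
        box.size = new Vector3(30f, 20f, 30f); // cover the robot's full volume
        box.center = new Vector3(0f, 4f, 0f); // nudge up to cover the whole model
    }
}
```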

Our interactivity script

To try out the capabilities of touch and gazing at objects, we need something to actually happen when we do these in the VR app. What we are going to do is make the bot move closer to us when we look at it and move further away when we click it. You could set up any number of other responses to object interactions — change its material so that its color changes, change the whole object into something else, change the background scene or other objects… the sky is the limit!

To set up our moving back and forth responses, we need to attach some basic coding to our Unity object (in my example, we will be adding it to the Maker Faire Bot). To attach a script to your object, start by clicking the object, going to the inspector window and clicking “Add Component” once more. Search for “script” and choose “New Script”:

script1

Name your script something that makes sense (e.g. “MakerFaireBot” or “MakerFaireBotController”), stick with “C Sharp” as the language (unless you’re already familiar with Unity and want to use UnityScript) and then click “Create and Add”:

script2

It’s much neater to have all of our assets in categorized folders, including scripts, so let’s right click on the “Assets” window and choose Create > Folder:

script3

From there, we call that new folder “Scripts” and then drag our new script, which you will see in the Assets window, into that folder. Unity will keep it linked correctly within your object without you needing to change anything, which is nice!

script4

The initial script will look like so:

using UnityEngine;
using System.Collections;

public class MakerFaireBot : MonoBehaviour {

    // Use this for initialization
    void Start () {
    
    }
    
    // Update is called once per frame
    void Update () {
    
    }
}

Let’s change it to this:

using UnityEngine;
using System.Collections;

public class MakerFaireBot : MonoBehaviour {
    public Vector3 lastPosition;

    // Use this for initialization
    void Start() {
        lastPosition = transform.position;
    }

    public void LookAtBot() {
        if (lastPosition.z > -8) {
            lastPosition = new Vector3(lastPosition.x, lastPosition.y, lastPosition.z - 0.5f);
            transform.position = lastPosition;
        }
    }
}

In our code changes, we have added a variable called lastPosition which remembers where the robot was last. Within the Start() function, we set this variable to be the current position of our robot via transform.position.

In a new function called LookAtBot(), we move our bot towards the camera by 0.5 on the Z axis each time it runs, until it reaches –8. We stop at –8 because our camera is positioned at –10 on the Z axis and we don’t want the bot going through us! Now, we just need a way to make this happen when the user gazes at the robot.
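As a side note, if you’d prefer the bot to glide smoothly while it’s being looked at rather than hop 0.5 units per event, something like this sketch could be swapped in. It assumes the same –8 stop point, and the LookAway() handler would be wired up to a “PointerExit” event:

```csharp
using UnityEngine;

// Sketch: moves the bot toward z = -8 every frame while it is being gazed at.
public class SmoothApproach : MonoBehaviour {
    public float speed = 2f;   // units per second; value is illustrative
    bool approaching = false;

    public void LookAtBot() { approaching = true; }  // wire to PointerEnter
    public void LookAway()  { approaching = false; } // wire to PointerExit

    void Update() {
        if (!approaching) return;
        Vector3 target = new Vector3(transform.position.x, transform.position.y, -8f);
        transform.position = Vector3.MoveTowards(transform.position, target, speed * Time.deltaTime);
    }
}
```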

Gaze input

For gaze functionality to work, we need to add one final component to our object — an event trigger. To do so, click “Add Component” once again while you have your custom object selected, find and select the “Event Trigger” component:

eventtrigger

Once it is added, you will see an option to “Add New Event Type”. Click that button and choose “PointerEnter”:

eventtrigger2

Click the “+” icon to add in a script that will be triggered any time that the “PointerEnter” event is fired. A series of options will appear:

eventtrigger3

Drag your custom object from your hierarchy on the left into the small object field just underneath the “Runtime Only” dropdown (bottom left of the Event Trigger component):

eventtrigger4

We can now select public functions within that object to be called as event triggers. So we choose MakerFaireBot and then find our LookAtBot() function (you will choose your own function name!):

eventtrigger5

We are almost at a point where our event triggers will run! Before Unity’s scene knows to look out for them, we need to add an event system. To do this, go to GameObject > UI > Event System:

eventsystem1

Within the event system, we can teach it to look out for gaze events triggered by the Google Cardboard SDK. To do so, click on “Add Component” while you have “EventSystem” selected and find the “GazeInputModule”. This is from the Google Cardboard SDK. Add that to the “EventSystem” object.

eventsystem2

Once you have the GazeInputModule, untick the “Standalone Input Module” within EventSystem too. If this is enabled, your Cardboard device’s click events will not trigger!

eventsystem3
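If you want a safety net, a small optional script like this sketch can warn at runtime when the scene’s event setup doesn’t match the steps above:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Optional sanity check: logs a warning if the EventSystem setup
// described above is missing or misconfigured.
public class GazeSetupCheck : MonoBehaviour {
    void Start() {
        EventSystem es = FindObjectOfType<EventSystem>();
        if (es == null) {
            Debug.LogError("No EventSystem found. Add one via GameObject > UI > Event System.");
            return;
        }

        StandaloneInputModule standalone = es.GetComponent<StandaloneInputModule>();
        if (standalone != null && standalone.enabled) {
            Debug.LogWarning("Standalone Input Module is still enabled and will block Cardboard clicks. Untick it.");
        }
    }
}
```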

If you run the app now in Unity’s test mode and keep looking at your object, it will move closer and closer towards you!

inaction1

Touch input

Let’s set up one final response for any time the user clicks the object via the Cardboard clicker (not all of the Cardboard headsets out there have one but quite a few do). We will update our code in the custom object to have another function for moving the robot away from us. Let’s change it to:

using UnityEngine;
using System.Collections;

public class MakerFaireBot : MonoBehaviour {
    public Vector3 lastPosition;

    // Use this for initialization
    void Start() {
        lastPosition = transform.position;
    }

    public void LookAtBot() {
        if (lastPosition.z > -8) {
            lastPosition = new Vector3(lastPosition.x, lastPosition.y, lastPosition.z - 0.5f);
            transform.position = lastPosition;
        }
    }

    public void ClickBot() {
        if (lastPosition.z < 5) {
            lastPosition = new Vector3(lastPosition.x, lastPosition.y, lastPosition.z + 2f);
            transform.position = lastPosition;
        }
    }
}

This adds a ClickBot() function which moves the object away from us as long as it is still less than 5 on our Unity scene’s Z axis (this limit is so the object doesn’t get beyond our reach!).
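As a variation, both limits could be expressed with a single Mathf.Clamp, which avoids relying on exact boundary checks. A sketch using the same –8 and 5 limits as the script above:

```csharp
using UnityEngine;

// Sketch: same behavior as LookAtBot()/ClickBot(), but with the Z range
// enforced by Mathf.Clamp instead of separate boundary checks.
public class MakerFaireBotClamped : MonoBehaviour {
    const float nearLimit = -8f; // closest the bot may come to the camera
    const float farLimit = 5f;   // furthest it may be pushed away

    public void LookAtBot() { MoveOnZ(-0.5f); } // wire to PointerEnter
    public void ClickBot()  { MoveOnZ(2f); }    // wire to PointerClick

    void MoveOnZ(float delta) {
        Vector3 p = transform.position;
        p.z = Mathf.Clamp(p.z + delta, nearLimit, farLimit);
        transform.position = p;
    }
}
```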

We then go back to our robot (or your custom object), go to the “Event Trigger” component and click “Add New Event Type” once more. This time, we choose “PointerClick”:

click1

It adds our LookAtBot() function as the click function automatically, which is lovely but not quite what we want. Click that function and select MakerFaireBot > ClickBot():

click2

If you play the scene again and click your custom object, it will now be pushed back, defending your personal space!

How can I tell if I’m looking at it?

We are missing one rather important thing — a way of telling the user where they are looking. It isn’t always clear whether you are looking at the right spot, especially when it comes to things like menus. We can add a crosshair (called a reticle within Google Cardboard’s SDK) by going to your Assets and finding Assets > Cardboard > Prefabs > UI > CardboardReticle. This is a prefab Google have provided that will give your experience a crosshair.

Drag this prefab onto your Main Camera within CardboardMain:

cardboardreticle

If we play our scene now, we have a nice circle which grows when you have an object within your sights:

inaction2

That’s all folks!

We now have all three aspects of the basics for building Google Cardboard experiences covered — understanding a bit of Unity, setting up a scene to be viewable in virtual reality and finally, setting up interactivity in the scene via Google Cardboard. With these basic concepts, there are a whole range of possibilities available to you!

Makers everywhere — there is no reason not to give this new medium a go. Who knows what your unique view of the world from the perspective of a maker could produce? Build something wonderful. Make someone smile. Make something that influences the world in a positive way. Give someone a thrill. Teach someone a valuable lesson or skill. The territory of VR is still so new and unexplored that it really is up to you. Go your own path. Try things.

If you have found this series useful, I’d love to see what you build and create! You can get in touch with me on Twitter at @thatpatrickguy. For those who are keen to develop in virtual reality, I have more insight into building for VR on my website — Dev Diner. I’ve also written other developer focused VR tutorials over at SitePoint that might help — check those out! The virtual world is your oyster! Please do not hesitate to get in touch if you need some advice on getting started, I’m here to help. Thank you to those who’ve come along for the ride in this series!

 

Patrick Catanzariti

Patrick is the founder of DevDiner.com, a site helping developers navigate the world of emerging tech. He also is a SitePoint editor focused on exploring the possibilities of new technology such as the Internet of Things, virtual/augmented reality and wearables. Alongside those roles, he is an instructor at SitePoint Premium and O'Reilly, a Meta Pioneer and a freelance web developer who loves every opportunity to tinker with something new in a tech demo.



