Introduction to Virtual Reality with Unity 3D


The term “virtual reality” was coined to describe an artificial world beyond the borders of the natural one: a world where everything is possible and the only limit is the imagination of its creator.

Beyond entertainment, the technology also makes it possible to build ideal environments for scientific observation and experimentation where a physical human presence would be impossible or prohibitively difficult.

Examples include studies of micro-organisms and bacteria, nanotechnology, space colony development, microsurgery, etc.

History of virtual reality

Looking back, we find that experimentation with virtual worlds began many decades ago, as Jonathan Linowes also notes.

However, due to its development cost, virtual reality did not spread to the general public and was used only for academic and research purposes, mainly military.

The first virtual reality equipment was built in 1966 by Ivan Sutherland; part of it was mounted on the ceiling of his laboratory.

Decades later, in 2012, Palmer Luckey (Oculus VR LLC) and renowned developer John Carmack (Doom, Wolfenstein 3D and Quake) launched the first campaign aimed at the general public (a Kickstarter project). They created the first development kit for virtual reality applications, the so-called Oculus Rift Development Kit 1 (DK1).

Technical evolution

Since then, the evolution of technology, and of computers in particular, has led to more realistic visual and audio effects that give users the impression that they are actually living in the virtual world unfolding in front of them.

Generally, a virtual reality application requires a headset with a stereoscopic projection screen. Through the headset's lenses the virtual world becomes alive and immersive.

The headset isolates the user from the real environment so as to keep distractions away. With built-in speakers or headphones, the user gets both a visual and an audio experience.

Hardware and software track the user's movement in space and adapt image and sound accordingly, producing whatever illusion the developer intends.

The user can interact with virtual objects, such as opening a door or finding a hidden object. Of course, the richer the desired experience, the more complex and expensive the required equipment.


Categories of Virtual Reality

Each virtual reality application can offer a different type of experience. According to Jonathan Linowes there are many types of virtual reality experiences; the six main categories are the following:

  • Diorama: the simplest form, in which the user watches the surrounding space as a third-person observer. He can look in any direction but remains still at the starting point.
  • First person experience: the user has the impression of living in the virtual environment. With software/hardware help he can move around.
  • Interactive environment: a similar experience to the first-person one, but with the possibility of interacting with the elements of the environment.
  • Movement on rails: the user has the impression of being on a roller coaster that moves at varying speed and altitude for a more realistic experience.
  • Images and video in 360°: the user views images and video projected on a surrounding dome; he stands at its center and can look around in any direction within a full 360° range.
  • Multi-user community: other users appear in the environment, and the communication enabled between them turns this type of experience into a social one.

Virtual Reality development tools

There are undoubtedly many tools for developing virtual reality applications, depending on the platform on which the application will run and the hardware to be used.

The main and most representative categories of these tools (hereinafter referred to as development kits) are the following:

  • Desktop VR (PCs, laptops)
  • Mobile VR (mobile devices, smart phones)

The more powerful the computer and its graphics card, the more realistic the experience in the virtual environment.

Desktop VR requires special equipment such as the Oculus Quest 2, a head-mounted display (HMD headset) that connects with cables to a high-end PC or laptop due to the graphics requirements.

The system may also include motion sensors (e.g. the Xbox One Kinect) to enhance the experience within the virtual environment.

Virtual reality for mobile devices

For mobile VR, HMDs are less demanding since no connection to a PC is required. They have a special slot for mounting a mobile device and special lenses that focus the stereoscopic image projected on the phone's screen, giving the impression of depth.

The technology of today's mobile devices allows very high definition images to be displayed, while the built-in gyroscope lets the device “understand” where the user is looking and adjust the projected image accordingly.

With the accelerometer they also “know” where the user is, further enhancing the virtual experience.

In both categories of VR applications the goal is the realistic display of an artificial 3D world and the deepest possible immersion of the viewer in that world.

Avoiding nausea and other side effects

The ideal rendering frequency for the virtual world's graphics is 50–60 Hertz (Ruth Aylett et al.), so that the scrolling and alternation of images and graphics remain smooth.

Frequencies below 10 Hertz (or 10 FPS) result in uneven, jerky scrolling that causes nausea and dizziness.

For this reason, especially for mobile VR, state-of-the-art devices with powerful processors and graphics cards (GPUs) must be selected so they can meet the high demands of virtual reality.

It is the developer's responsibility to warn the user about the consequences of running the application on a device that does not meet the specifications for virtual reality.
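As a rough illustration, the sketch below (not part of the Island VR sources; the class name, the warningText field and the 50 FPS threshold are illustrative assumptions) keeps a smoothed frame-rate estimate and shows a warning when the device stays below the comfortable range discussed above:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: estimate the frame rate and warn the user when it is low.
public class FrameRateWarning : MonoBehaviour
{
    public Text warningText;        // hypothetical UI Text used for the warning
    public float minimumFps = 50f;  // threshold suggested by the 50-60 Hz guideline

    private float smoothedDelta;

    void Update()
    {
        // Exponential smoothing avoids reacting to a single slow frame.
        smoothedDelta = Mathf.Lerp(smoothedDelta, Time.unscaledDeltaTime, 0.05f);
        float fps = 1f / Mathf.Max(smoothedDelta, 0.0001f);

        if (warningText != null)
        {
            warningText.text = fps < minimumFps
                ? "Warning: this device may be too slow for a comfortable VR experience."
                : string.Empty;
        }
    }
}
```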

High immersion results

To achieve maximum “immersion” in the virtual environment, the application must also cover the other human senses beyond sight, such as hearing and touch.

For hearing, audio effects are reproduced stereophonically, like the image, to give a sense of depth. Completely isolating the user from the stimuli of the external environment helps create the feeling of actually being in another reality as true as his own.

Touch can be provided by special equipment (data gloves) that allows the user to interact with the virtual environment.

In more advanced models, special sensors can even convey a sense of pressure or temperature (e.g. when the user touches something hot or cold).

Mobile Virtual Reality made by Unity

In mobile applications, such as the one analysed in the following paragraphs, immersion is achieved by satisfying only two of the five human senses: sight and hearing.

For this purpose we will use equipment relatively accessible to the average user: an isolation headset, a mobile phone with a gyroscope, an octa-core processor and the Android 6 operating system, and headphones connected to the device.

For this article we used Unity3D 2020.1.x Personal as the VR editing tool; the coding is done with Visual Studio Community.

In addition, we will use the Google Cardboard XR Plugin for Unity. The “Island VR” application is inspired by the mini-game from Will Goldstone's Unity Game Essentials.

Installing the plugin

By default, Unity from version 2017.1 onwards offers built-in support for VR application development for a wide variety of frameworks such as Cardboard, Oculus Rift, Daydream, Vuforia, etc.

For this article we shall use the Google XR plugin which is compatible with Unity 2020.1.x.

To install the plugin, open the Unity editor and select Window -> Package Manager. At the top left corner of the window, click the “+” (plus) icon and select “Add package from Git URL”.

Enter the following link:

https://github.com/googlevr/cardboard-xr-plugin.git

After installing the plugin, we also need to complete a few more settings as described in the Quick Start for Google Cardboard.

Additional Google VR plugins

Finally, we need to install another plugin that will allow us to “live in” our virtual world within Unity's scene editor.

  • With an Internet browser, visit https://github.com/googlevr/gvr-unity-sdk/releases/download/v1.120.0/ and download the GoogleVRForUnity_1.120.0.unitypackage file to a local folder.
  • From the Assets menu, select Import Package -> Custom Package and, in the dialog box, select the GoogleVRForUnity_1.120.0.unitypackage file from the previous step. In the import window of the Google VR SDK plugin, leave everything selected except Demos, iOS, x86 and x86_64, and press the Import button as shown below:

How to use the VR plugin

After importing the plugin, follow the steps described below:

  • In the Project pane select the GoogleVR -> Prefabs folder and drag the GvrEditorEmulator object into the Hierarchy pane. This object is responsible for connecting Unity's internal virtual reality mechanism with the plugin.
  • From the EventSystem subfolder drag the GvrEventSystem object into the Hierarchy pane. Through this object we receive events concerning the position of the user's head as well as the object on which the user focuses.
  • From the Cardboard subfolder, drag the GvrReticlePointer object onto the main camera so that the former becomes a child of the camera. With the camera object still selected, press the Add Component button (Inspector pane), type “physics” in the search box and select the GvrPointerPhysicsRaycaster component, as shown in the image below:

The “GvrPointerPhysicsRaycaster” casts a virtual ray in the direction of the user's gaze and provides information about any object it hits.
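Conceptually this is the same idea as Unity's standard Physics.Raycast call. The sketch below only illustrates that underlying mechanism; it is not the plugin's internal code:

```csharp
using UnityEngine;

// Minimal sketch: cast a ray from the camera along its forward (gaze)
// direction and report what it hits.
public class GazeRayExample : MonoBehaviour
{
    public Camera vrCamera;          // the player's main camera
    public float maxDistance = 10f;  // matches the default reticle range

    void Update()
    {
        RaycastHit hit;
        if (Physics.Raycast(vrCamera.transform.position,
                            vrCamera.transform.forward,
                            out hit, maxDistance))
        {
            // hit.collider / hit.point describe the object under the user's gaze.
            Debug.Log("Gazing at: " + hit.collider.name);
        }
    }
}
```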

A visible means of interaction

In collaboration with the “GvrReticlePointer” it provides visual feedback about the user's focus on an object. The object (e.g. a cube) must have a collider (box, capsule or mesh) and the event trigger script component attached.

The “GvrReticlePointer” has a Max Reticle Distance property through which the maximum distance between the user and the object is determined. By default it is 10 meters on the Unity scale.

The following images show the visual effect of the Reticle Pointer when the camera focuses on the red cube.

We may also observe the imaginary ray emitted by the camera via the GvrPointerPhysicsRaycaster component:

Interactive Object in action

Island VR project (preview)

The purpose of the Island VR project is to introduce virtual reality development with the Unity 3D engine. The user “lives” on a deserted exotic island, looking for a way to light the campfire outside his hut.

The environment of the island was constructed using the terrain editor of Unity and the surrounding sea comes from the free assets of Unity (Daylight Simple Water object).

The image below shows a view of the island in Unity's Scene View, the sea that surrounds it, and the sky dome, which was created in Blender 2.78.

The Island

By default Unity offers a very elegant object for the sky, the skybox, which consists of six different images. But its cost in GPU resources is high and it is not recommended for mobile applications.

After the user has completed the mini-game, a special object is placed randomly in one of five pre-selected positions. This object acts as a portal to a bonus scene where a small surprise (an Easter egg) awaits the user!

The user moves in any direction simply by tilting the head downward at an angle of less than 90°; however, angles of less than 20° should be avoided to prevent any unpleasant feeling while moving around.
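A minimal sketch of how such head-tilt locomotion can be implemented is shown below. It assumes a CharacterController on the player and a Head transform carrying the camera; the names and angle values are illustrative, not taken from the project's GrvPlayerHMove script:

```csharp
using UnityEngine;

// Minimal sketch of gaze-based locomotion: move forward while the head
// is tilted down between a minimum and a maximum pitch angle.
[RequireComponent(typeof(CharacterController))]
public class HeadTiltMove : MonoBehaviour
{
    public Transform head;        // the "Head" object carrying the camera
    public float speed = 2f;      // movement speed in meters per second
    public float minPitch = 20f;  // below this tilt the player stays still
    public float maxPitch = 90f;  // beyond this tilt movement stops again

    private CharacterController controller;

    void Start()
    {
        controller = GetComponent<CharacterController>();
    }

    void Update()
    {
        // Convert the head's pitch to a signed angle (looking down is positive).
        float pitch = head.eulerAngles.x;
        if (pitch > 180f) pitch -= 360f;

        if (pitch > minPitch && pitch < maxPitch)
        {
            // Move along the horizontal projection of the gaze direction.
            Vector3 direction = head.forward;
            direction.y = 0f;
            controller.SimpleMove(direction.normalized * speed);
        }
    }
}
```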

Project hierarchy and organization

The project is organized into logical groups of objects so that their hierarchy is more understandable. These groups are shown in the image below:

Project Island VR

We chose the mobile device platform because of the relatively low cost of mobile phones and their high performance.

The VR equipment that can be used with these devices is also low cost. A well-known example is the Google Cardboard, shown in the image below:

Google Cardboard

Island VR project (analysis)

Unity has a base class (MonoBehaviour) which is the parent class of the scripts we attach to game objects in order to connect them to the game engine.

Classes derived from MonoBehaviour, apart from various events and methods, also allow us to define properties that we can view and set in the object Inspector. These properties influence the behaviour of the object (e.g. its rotation speed).

The Island VR project has a total of ten classes, saved in the project's Scripts folder, which we will analyse in more detail below. Each class is associated with an object in the scene, and through them the whole flow of the program is controlled.

Analysis of game classes

  • CoconutCollision: attached directly to a target object; checks whether the target has been hit (collided with) by a Coconut object. If so, it triggers the “down” animation and the target falls backwards. After 5 seconds, the “up” animation is triggered and the target returns to its original position. When all three targets are knocked down within 5 seconds, the key that unlocks the hut door is revealed. It cooperates with the SceneLogic class, from which it receives the AI parameters.
  • CoconutThrow: attached to the Launcher object; creates a copy of the Coconut object and, using the physics engine (Rigidbody), applies an acceleration along the Z axis towards the target. Called by the EventLogic class when the user focuses (gazes) on a target object. A minimal sketch of this idea is shown after this list.
  • CoconutTidy: attached to the Coconut object; destroys the object 5 seconds after its creation. This is necessary because every Coconut object instantiated in the scene is not destroyed automatically and would keep consuming system resources.
  • DoorLogic: attached to the outpost object (the hut); controls the door object located in front of it. It responds to the “OpenTheDoor” and “GetMatches” events, which are called through the EventLogic class when the user focuses on the door and on the matches inside the hut, respectively. It updates the hasMatches property exposed through the SceneLogic class. It also controls the animation for opening and closing the door; the latter happens after 6 seconds so the user has time to walk inside the hut.
  • EventLogic: attached to any object we want the user to interact with via the GvrPointerPhysicsRaycaster component. It can call the “LookOnTarget” and “LookAwayTarget” events through the “EventTrigger” class, which must also be attached to each interactive object. “LookOnTarget” checks whether the user's gaze falls on an interactive object and executes the associated method; “LookAwayTarget” is called when the user looks away from the object.
  • GrvPlayerHMove: attached to the Player object; responsible for moving the user around the island. To move, the user's head should be tilted down by at least 20°. The user moves in the direction of his gaze, and movement stops when the user stands in front of an obstacle (a static object).
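Below is a minimal sketch of the CoconutThrow idea described above. The class name, field names and force value are illustrative, not the project's actual code:

```csharp
using UnityEngine;

// Minimal sketch: instantiate a coconut at the launcher and push it
// towards the target using the physics engine.
public class CoconutThrowSketch : MonoBehaviour
{
    public GameObject coconutPrefab;  // prefab with a Rigidbody attached
    public float throwForce = 20f;    // assumed launch speed

    // Called (for example) when the user gazes at a target.
    public void Throw(Transform target)
    {
        GameObject coconut = Instantiate(coconutPrefab,
                                         transform.position,
                                         Quaternion.identity);

        Rigidbody body = coconut.GetComponent<Rigidbody>();
        Vector3 direction = (target.position - transform.position).normalized;

        // Give the coconut an initial velocity towards the target.
        body.velocity = direction * throwForce;
    }
}
```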

Analysis of game classes (continued)

  • InteractiveObject: a class that can also be attached to any object. In combination with the “EventTrigger” script and the EventLogic class it turns an object into an interactive one. It can call the ENTER / EXIT methods of the “EventTrigger” which are declared in the EventLogic class.
  • RotateObject: attached to an object; creates a rotation effect on the object to which it is attached.
  • SceneLogic: attached to the object of the same name, which provides the artificial intelligence of the project. It can be connected with other objects (e.g. EventLogic) to bridge the internal communication of the project. Thus, when the user knocks down the 3 targets a key appears that unlocks the door of the hut, and when he gets the matches he can light the fire and eventually complete the goal of the game. SceneLogic starts the fire effect first and then the smoke, to achieve a more realistic result.
  • TextHints: attached to the Hud object below the Player's main camera. The Hud (head-up display) is a graphic object (canvas) used to display text to the user. It publishes the “ShowHint” method, which accepts a text to be shown on the user's stereoscopic screen, and uses MonoBehaviour's Update method to control how long the text remains visible. A minimal sketch follows this list.
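The sketch below illustrates the TextHints behaviour; the class name, field names and the 4-second duration are assumptions for illustration, not the project's actual values:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Minimal sketch: show a hint on a UI Text for a limited amount of time.
public class TextHintsSketch : MonoBehaviour
{
    public Text hintText;            // the Text object under the HUD canvas
    public float displayTime = 4f;   // assumed duration in seconds

    private float remaining;

    // Called by other scripts (e.g. scene logic) to show a message to the user.
    public void ShowHint(string message)
    {
        hintText.text = message;
        remaining = displayTime;
    }

    void Update()
    {
        if (remaining > 0f)
        {
            remaining -= Time.deltaTime;
            if (remaining <= 0f)
            {
                hintText.text = string.Empty;  // hide the hint when time is up
            }
        }
    }
}
```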

Analysis of objects on stage

The tree-like hierarchy of game objects allows the composition of complex models with autonomous movements, such as the legs and arms of a humanoid model or the opening and closing of a door. Moving or rotating a parent object also affects all child objects below it.

Island VR has game objects (Unity classes) that consist of several child objects. This is the case with the Player, which contains the objects Head -> Main Camera -> Reticle Pointer + Launcher + Hud -> Text.

In more detail, the objects that will be active are divided into:

  • mainScene: the current scene of the program. The scene object (or backdrop) is the parent of all Unity visual and non-visual objects. There can be no executable build without at least one scene in the project. It is saved in the project with a file name given by the user and the “.unity” extension.
  • Player: the user object (the player). In virtual reality applications the player has a first-person role, so a 3D model is usually not necessary. There are of course exceptions, depending on the realism we want to give the project (e.g. we may want part of the body to be visible when the user looks down or extends his arms to interact with an object). The Player also carries the camera through which the environment is viewed, and it is oriented through the special mechanisms of the Google VR SDK plugin. Locomotion within the scene is achieved with the “GrvPlayerHMove” script in combination with the default “CharacterController” component.
  • Outpost: the model of the hut. It consists of many smaller objects to which we can assign different behaviour, such as the door object outside and the matches inside the hut. It carries the Animation component through which we can play the door open/close animation:
Outpost model
  • Platform: the model of the podium with the targets. Each target object is interactive, and when “hit” by a Coconut object the target falls backwards. A second animation restores the target to its original position after 5 seconds. The user's goal is to knock down all three targets within this time window to get the key that unlocks the hut door. Below it is shown selected in the scene:
Platform model
  • Campfire: the model of the hearth and the final goal of the user. It interacts with the user's “gaze” once the latter has taken the matches from the hut, triggering the fire and smoke effects, which are children of this object. The image below shows it selected in the scene, along with the capsule collider that surrounds the object so that it is “detectable” by the user's GvrPointerPhysicsRaycaster (gaze):
Campfire model

More stage objects

  • Environment: an invisible object used to group static objects such as the Directional Light, the Sky Dome, the Terrain (island) and the Daylight Simple Water (the sea that surrounds the island). It provides no function beyond organising scene objects that are static and do not require continuous use of the mobile device's CPU and GPU.
  • SceneLogic: an invisible object. It hosts the script of the same name that controls the game's behaviour (AI) depending on the user's actions: it checks whether the user has knocked down all targets within the given time so he may enter the hut, whether he has taken the matches so the fire can be “lit”, and so on.
  • GvrEditorEmulator: an invisible object from the Google VR SDK that simulates the user's head motion inside the Unity editor. When the application runs on a mobile device this object has no effect. In the editor, the user can control the rotation of the Player's head with the ALT or CTRL keys.
  • GvrEventSystem: another invisible object from the Google VR SDK. It works with Unity's event system, monitoring input devices (e.g. gamepad, VR glove, etc.) and sending their events to the application objects: Enter, Exit, Down, Up, Click, Select, Deselect, UpdateSelected and GvrPointerHover.
  • Coconut: the coconut model, created when the user looks at a target. An acceleration (velocity) is applied to it in the direction of the target in order to hit it. The object has a self-destruction mechanism that removes it 5 seconds after its creation, freeing memory and CPU resources; a minimal sketch of this mechanism follows below.
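The self-destruction mechanism mentioned in the last item can be as simple as the sketch below (an illustration with assumed names, not the project's CoconutTidy code):

```csharp
using UnityEngine;

// Minimal sketch: remove the object a few seconds after it has been created
// to free memory and CPU resources.
public class SelfDestructSketch : MonoBehaviour
{
    public float lifeTime = 5f;  // seconds before the object is removed

    void Start()
    {
        // Unity's built-in overload destroys the object after a delay.
        Destroy(gameObject, lifeTime);
    }
}
```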

Island VR project (implementation)

This paragraph contains a brief analysis of the main features of the Island VR program and the “setting up” of the scene.

The whole project (including the code) is available for study here.

Stage presence

A project in Unity consists mainly of scenes, also called game levels, which we can compare to movie sets.

In a Unity project only one scene can be active at a time. Island VR uses a single scene, which is also our virtual world, as shown in the image below:

Island VR main scene

For simplicity, the main points of interest of the project have been placed relatively close together so as not to tire the user with too much movement, although it is worth wandering around the virtual world of Island VR.

Interactive objects

In a virtual environment like Island VR the user can interact with some objects in a variety of ways, such as touching them or looking at them for some time. This enhances the liveliness of the space.

With the appropriate equipment (eg VR gloves) the user can touch, push or open an object, such as a door or a lever or even shoot at a supposed target. 

Unfortunately such equipment is not available for mobile devices, so developers resort to other techniques to make an object interact with the user.

The most common technique, and the one applied in Island VR, is interaction of the object with the user's “gaze”.

The user can look at an object for a certain amount of time to trigger the associated event of the object.

Any object can become interactive after attaching some special components to it. These components are detected by the “ray” of the “GvrPointerPhysicsRaycaster” class and trigger events.

Implementing interactivity

The components that offer this interaction are those listed below:

  • Collider: “blocks” the scan ray and returns hits with information about the object (e.g. its identity and location). There is a complete reference in Unity's manual about Colliders.
  • InteractiveObject: receives events from the scan ray when it falls on an object or leaves it. In collaboration with the “EventTrigger” class it calls the “OnPointerEnter” and “OnPointerExit” events, which should be defined for this object. The developer can also set the time required to trigger the “OnPointerEnter”, “OnPointerClick” and “OnPointerDown” events.
  • EventTrigger: connects the object to the “OnPointerEnter”, “OnPointerClick” or “OnPointerExit” events. Each event can be associated with methods of the same or of other objects; we need to pass the object to the relevant parameter as a reference, as in the following image (a code sketch of the same wiring follows below):
Colliders
Events and Colliders
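For readers who prefer code over the Inspector, the sketch below wires the same pointer events programmatically using Unity's standard EventTrigger API; the class and method names are illustrative, not the project's own:

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: wire pointer events from code instead of the Inspector.
public class GazeEventWiring : MonoBehaviour
{
    void Start()
    {
        // Add an EventTrigger component to this object.
        EventTrigger trigger = gameObject.AddComponent<EventTrigger>();

        // Call a method when the gaze pointer enters the object.
        EventTrigger.Entry enter = new EventTrigger.Entry();
        enter.eventID = EventTriggerType.PointerEnter;
        enter.callback.AddListener((data) => OnGazeEnter());
        trigger.triggers.Add(enter);

        // Call a method when the gaze pointer leaves the object.
        EventTrigger.Entry exit = new EventTrigger.Entry();
        exit.eventID = EventTriggerType.PointerExit;
        exit.callback.AddListener((data) => OnGazeExit());
        trigger.triggers.Add(exit);
    }

    void OnGazeEnter() { Debug.Log("Pointer entered " + name); }
    void OnGazeExit()  { Debug.Log("Pointer left " + name); }
}
```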

User – player object

The user object (or player) is one of the most important game objects for any virtual reality application. 

It represents the person using the application and therefore “lives” in the virtual world. It usually has a first-person role (FPS), which means the object itself does not appear in the scene.

The user sees through the player object's camera as if he himself were in the scene. The player object may have no 3D model (mesh), which burdens the game engine less.

In some cases, a body model may be necessary for letting the user observe physical parts like feet or hands. This will increase the sense of “immersion” in the virtual world.

Unity’s built in objects

For Island VR, a ready-made asset from Unity's standard assets was used. This asset can be imported by selecting Assets -> Import Package -> Characters, as shown in the image below:

Character Asset

After importing the above package, select the First Person Controller object and drag it to the desired location in the scene. Remove the “MouseLook”, “CharacterMotor” and “FPSInputController” components and the “Graphics” sub-object, which displays the capsule model.

Then add the “GrvPlayerHMove” component for gaze movement and an “AudioSource” for playing audio effects. An empty game object named “Head” is added below the player's main game object. Drag the camera under the Head game object so that it becomes its child; this way we can rotate the camera by rotating the parent object.

The final structure of the player game object is illustrated in the image below:

Player

Head-up Display (HUD)

A graphic object (Canvas) that follows the camera and stays within the player's field of view (FOV). Its purpose is to display various graphic elements such as an informative text (Text) or an icon (e.g. a life bar).

Because it is within the player's field of view, it appears in front of the user's eyes. To achieve this effect we need to set its size and distance accordingly and select World Space as the render mode.
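As a rough sketch, the same setup could also be done from code as shown below; the project configures these values in the Inspector, and the component name, 2-meter distance and scale factor here are illustrative:

```csharp
using UnityEngine;

// Minimal sketch: configure a world-space HUD canvas in front of the camera.
public class HudSetupSketch : MonoBehaviour
{
    public Canvas hudCanvas;   // the Canvas used as the HUD
    public Camera vrCamera;    // the player's main camera

    void Start()
    {
        hudCanvas.renderMode = RenderMode.WorldSpace;

        // Parent the canvas to the camera so it follows the user's view,
        // and push it 2 meters away along the camera's forward (Z) axis.
        hudCanvas.transform.SetParent(vrCamera.transform, false);
        hudCanvas.transform.localPosition = new Vector3(0f, 0f, 2f);

        // Scale the canvas down so its pixels map to a sensible world size.
        hudCanvas.transform.localScale = Vector3.one * 0.005f;
    }
}
```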

GvrReticlePointer

A component that can be added as a child of the user's main camera. It provides a visual point showing the direction of the user's gaze. When the user is not focusing on an interactive object it stays small, like a small white dot; when he is, its size gradually increases until it becomes a complete and fairly clear circle.

It is controlled by the “GvrPointerInputModuleImpl” class, located in the folder “GoogleVR\Scripts\EventSystem\InputModule”, as shown below:

GvrReticlePointer class

By default “GvrReticlePointer” has a range of 10 meters. This is usually more than enough for a standard virtual reality application. 

Interactive objects tweaking

However, the GoogleVR SDK also provides the source code that controls the object, so we can modify it to make it work with the “Distance” property of the InteractiveObject class. We describe this modification in the following steps:

  • Double-click the “GvrPointerInputModuleImpl.cs” file to open it in the editor (usually Visual Studio).
  • Find line 104 and comment out the code of lines 104 and 105 (just below the line: CastRay();).
  • Enter the following code (a sketch of this modification appears after this list) and save the .cs file of the class:
GvrPointerInputModuleImpl class
  • Return to Unity, which will automatically recompile the code.
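Since the exact code is shown only as a screenshot, the following is a standalone reconstruction of the distance check it describes. InteractiveObjectSketch and its Distance field stand in for the project's InteractiveObject component; the real modification lives inside GvrPointerInputModuleImpl.cs:

```csharp
using UnityEngine;

// Sketch of the distance check described in this article (a reconstruction,
// not the exact code from the screenshot).
public class InteractiveObjectSketch : MonoBehaviour
{
    public float Distance = 3f;  // maximum gaze distance in meters

    // Returns true when the camera is close enough to interact with this object.
    public bool IsWithinReach(Transform cameraTransform)
    {
        Vector3 offset = transform.position - cameraTransform.position;

        // Compare squared distances to avoid the costly square root used by
        // Vector3.Distance (e.g. a 3 m limit becomes 9 in squared meters).
        return offset.sqrMagnitude <= Distance * Distance;
    }
}
```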

The code uses the “InteractiveObject” component to find the direction from the camera to the object the user focuses on, and then checks whether the latter is within the desired distance, expressed in squared meters (e.g. 9 m² for a 3 m limit).

Unity also provides the mathematical function Vector3.Distance(object_position, camera_position), which returns the distance between the positions of the two objects. Since it internally applies a square root (Sqrt), it is better to avoid it for speed reasons, especially in applications that run on mobile devices.

The following images show the reticle pointer next to the hearth (left image) and on it (right image). We can also see the informative text helping the user:

Reticle Pointer
Reticle pointer appearance

Display of information in the virtual space

In order to correctly display text to the user, we need a material with the UI/Default_OverlayNoZTest shader.

The code of this shader allows us to display information in front of any object, such as a closed door.

The following image shows how to connect Overlay material with the object Text:

Overlay Shader

After applying the overlay font material to the UI Text object, we set the distance of the HUD along the Z axis to 2 meters, so that it keeps a sensible distance from the user's eyes.

We also increase the Font Size of the UI Text to 42 pixels (or more) to make the information text readable enough for the user.

Building the application for Android devices

After checking the application for errors, we build it by following the steps below:

  • Select the menu File -> Build Settings to display the corresponding window for creating the application.
  • Make sure the Scenes In Build list contains our scene and that it is checked, as shown in the image below:
Build Settings

Otherwise click the “Add Open Scenes” button located at the bottom right of the list of scenes.

  • In the Platform list, select Android or iOS depending on your preference.
  • Check that the “Player Settings” match the screenshots below:
Other Settings
Publishing Settings
  • Press the “Build” button to build the application and create the APK file (Android).
  • Install the APK file on an Android device.

The application was tested on a Xiaomi Redmi mobile phone with an octa-core processor and 2 GB RAM. The device has a gyroscope to detect the user's head movement.

Changes for Unity 2020.2

For Unity 2020.2 there are some changes we need to make in order to be able to build the project, since some Unity classes have become obsolete.

First of all, open “GvrBuildProcessor.cs” from the \Assets\GoogleVR\Editor folder by double-clicking on it.

Next, replace the IPreprocessBuild interface with IPreprocessBuildWithReport; the former has become obsolete as of this writing. If you use Visual Studio, it will complain that the interface is not implemented: click the “bulb” (Quick Actions) and select the “Implement interface” option. A new method named “OnPreprocessBuild” will be added to the class body.

Replace the code found in this method with the following snippet:

OnPreprocessBuild(report.summary.platform, report.summary.outputPath);

This way we keep compatibility with the old Google VR SDK. If you notice a “Virtual Reality not supported” error message in the Unity console, you can ignore it and build the project normally. To eliminate the message, find the “IsVRSupportEnabled” method and replace “return PlayerSettings.virtualRealitySupported;” with “return true;”, since “PlayerSettings.virtualRealitySupported” has also become obsolete.
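For reference, the adjusted class might look roughly like the sketch below, assuming the original Google VR SDK file still contains a two-argument OnPreprocessBuild(BuildTarget, string) method (as the snippet above implies); this is a reconstruction, not the SDK's actual code:

```csharp
using UnityEditor;
using UnityEditor.Build;
using UnityEditor.Build.Reporting;

// Sketch of the adjusted build processor for Unity 2020.2.
class GvrBuildProcessorSketch : IPreprocessBuildWithReport
{
    public int callbackOrder { get { return 0; } }

    // New entry point required by IPreprocessBuildWithReport.
    public void OnPreprocessBuild(BuildReport report)
    {
        // Forward to the SDK's original method to keep compatibility.
        OnPreprocessBuild(report.summary.platform, report.summary.outputPath);
    }

    // Stand-in for the SDK's original two-argument implementation
    // (kept unchanged in the real GvrBuildProcessor.cs file).
    void OnPreprocessBuild(BuildTarget target, string path)
    {
        // ... original Google VR SDK checks run here ...
    }
}
```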

Island VR project (game play view)

Below are images from the mobile device in stereoscopic view:

Focus (gaze) on camp fire
Targets down
Matches for the fire
The goal has been accomplished

About the author

Bill Rigas is a Web Developer and Designer and co-founder of the EverFounders blog, with many years of professional experience building software for airline companies. He holds an MSc in Information Technology from the University of East London and several certifications from edX, and is also a certified Unity game developer from Walker Boys Studio.