VR Best Practice
25 Mins
Check out the Unity development team's best practices for developing Virtual Reality applications, including optimizing rendering, decreasing latency, and platform-specific recommendations.
Introduction to VR Best Practice
As achieving the target frame rate for your chosen platform is an essential part of ensuring users have a great, nausea-free VR experience, optimization is a critical part of VR development. Unlike some other platforms, it’s best to optimize early and often with VR, rather than leaving it to a later stage in development. Testing regularly on the target devices is also essential.
VR is computationally expensive compared to non-VR projects, mainly due to having to render everything once per eye, so make sure you’re familiar with the common issues that arise when creating a VR experience. If you’re aware of these issues beforehand, you can design your project around them, saving a lot of hard work later in your project’s lifecycle.
Mobile VR can be particularly demanding. Not only do you have the overhead of running a VR application, but mobile devices are only a fraction as powerful as desktop PCs, so optimization is of critical importance to your project.
All Unity best practices for performance carry over to VR. See these Unity guides for more suggestions on how to optimize your project:
  • Understanding Performance in Unity
  • Optimizing Shader Load Time
  • A Guide to Optimizing Memory

Rendering in VR
Rendering is one of the most recurring bottlenecks in VR projects. Optimizing rendering is essential to building a comfortable and enjoyable experience in VR.

Stereo Rendering Modes

Setting Stereo Rendering Method to Single Pass Instanced or Single Pass in the XR Settings section of Player Settings will allow for performance gains on the CPU and GPU. See the Single Pass Instanced Rendering manual page and the Single Pass Stereo Rendering manual page for more details. Additionally, refer to this blog post to read more about stereo rendering modes.
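Stereo rendering modes are normally chosen in the Player Settings UI, but they can also be set from an Editor script. A minimal sketch (Editor-only; assumes an XR-enabled project and a script placed in an Editor folder):

```csharp
using UnityEditor;

// Editor-only sketch: switch the project to Single Pass Instanced.
// StereoRenderingPath.SinglePass is the fallback for platforms that
// do not support instancing.
public static class StereoRenderingSetup
{
    [MenuItem("Tools/Use Single Pass Instanced")]
    static void UseSinglePassInstanced()
    {
        PlayerSettings.stereoRenderingPath = StereoRenderingPath.Instancing;
    }
}
```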


Batching

Batching is a good way to reduce work on the GPU by minimizing the cost of context switching between draw calls (i.e., uploading a new shader, its textures, and other data). Batching relies on objects sharing Materials and has some implications for memory and the CPU. Please refer to the Draw Call Batching manual page for more information about Static and Dynamic Batching.

GPU Instancing

Use GPU Instancing to draw (or render) multiple copies of the same mesh at once, using a smaller number of draw calls. It is useful for drawing objects such as buildings, trees and grass, or other things that appear repeatedly in a Scene. See the GPU Instancing manual page to learn how to enable GPU Instancing in your project.
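Besides the Material inspector checkbox, instancing can be enabled from script. A minimal sketch, assuming the attached Renderer's shader supports instancing:

```csharp
using UnityEngine;

// Sketch: enable GPU Instancing on a shared material at runtime.
// Equivalent to ticking "Enable GPU Instancing" on the Material.
public class EnableInstancing : MonoBehaviour
{
    void Start()
    {
        var rend = GetComponent<Renderer>();
        if (rend != null && rend.sharedMaterial != null)
        {
            // Identical meshes sharing this material can now be drawn
            // with a smaller number of instanced draw calls.
            rend.sharedMaterial.enableInstancing = true;
        }
    }
}
```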

Lighting Strategy

Every lighting strategy has its pros, cons, and implications. Using fully realtime lighting and realtime global illumination in your VR project is not suggested, as it has a significant impact on rendering performance. For most projects, we suggest that you favor non-directional lightmaps for static objects and light probes for dynamic objects instead of relying on realtime lighting.
The Lighting Strategy manual page clearly explains the different lighting strategies offered by Unity and identifies two strategies as being best suited for VR. The first suggestion is to bake all lights combined with light probes; the second is to use mixed lighting with Shadowmask combined with light probes. The most notable difference between these two strategies is that the latter allows real-time specular and shadows for Dynamic objects.


Cameras

There are several best practices to keep in mind when working with cameras in VR.
  • The camera’s orientation and position (for platforms supporting 6 degrees of freedom) should always respond to the user’s motion, no matter which camera viewpoint is used.
  • Actions that affect camera movement without user interaction can lead to simulation sickness. Avoid using camera effects similar to "Walking Bob" commonly found in first-person shooter games, camera zoom effects, camera shake events, and cinematic cameras.
  • Unity obtains the stereo projection matrices from the VR SDKs directly. Overriding the field of view manually is not allowed.
  • Depth of field or motion blur post-process effects affect a user's sight and often lead to simulation sickness.
  • Moving or rotating the horizon line or other large components of the environment can affect the user’s sense of stability and should be avoided.
  • Set the near clip plane of the first-person camera(s) to the minimal acceptable value for correct rendering of objects. Set your far clip plane to a value that optimizes frustum culling.
  • When using a Canvas, favor World Space render mode over Screen Space render modes, as it is very difficult for a user to focus on Screen Space UI.
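The clip-plane advice above can be expressed in a few lines of script. A minimal sketch; the values are illustrative, not recommendations for every project:

```csharp
using UnityEngine;

// Sketch: tune clip planes on the first-person VR camera.
public class VRCameraSetup : MonoBehaviour
{
    void Start()
    {
        var cam = GetComponent<Camera>();
        cam.nearClipPlane = 0.05f; // as small as still renders nearby objects correctly
        cam.farClipPlane = 300f;   // as small as the scene allows, to aid frustum culling
        // Note: cam.fieldOfView is ignored in VR; the HMD's projection
        // matrices come from the VR SDK and cannot be overridden.
    }
}
```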


Post-processing Effects

In VR, most image effects are expensive because the scene is rendered twice, once for each eye. As many post-process effects require full-screen draws, reducing the number of post-processing passes helps overall rendering performance. Full-frame post-process effects are very expensive and should be used cautiously, as they have significant implications for GPU fill rate.
As mentioned previously, the use of post-process effects such as depth of field or motion blur is not suggested in VR, as they may lead to simulation sickness.
If you need post-processing effects in your project, we suggest that you use the Post-Processing Stack, as it can merge several enabled effects into fewer passes. The Post-Processing Stack can also be extended to include your own custom effects.


Anti-aliasing

Anti-aliasing is highly desirable in VR as it helps to smooth the image, reduce jagged edges, and minimize specular aliasing.
Forward Rendering supplies MSAA which can be enabled in the Quality Settings.
When using Deferred Rendering, consider using an anti-aliasing post-processing effect. Unity’s Post-Processing Stack (mentioned above) offers the following:
  • FXAA - This is the cheapest solution for anti-aliasing and is recommended for low-end PCs.
  • TAA - This state-of-the-art technique will give better results than the other techniques at a higher GPU cost. We recommend the use of this technique on high-end PCs.
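When Forward Rendering is used, MSAA can also be set from script; the value mirrors the Anti Aliasing dropdown in Quality Settings. A minimal sketch:

```csharp
using UnityEngine;

// Sketch: enable 4x MSAA at runtime (Forward Rendering only).
// Valid values are 0 (off), 2, 4, or 8 samples per pixel.
public class SetMsaa : MonoBehaviour
{
    void Start()
    {
        QualitySettings.antiAliasing = 4;
    }
}
```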


Shaders

Optimizing the shaders used in your project is an important step in improving rendering performance. Always review shaders generated by visual shader editors, including those created with Shader Graph, as they may be unoptimized. See the Shader Performance guide for tips on shader optimization.
Graphics drivers do not actually prepare shaders until they are first needed. We suggest that you use a Shader Variant Collection and call its WarmUp function at an opportune time (e.g., while displaying a loading screen) in order to prepare essential shaders and avoid frame rate hiccups when a shader is used for the first time.
Some shaders, especially post-processing effects, may require modifications in order to function correctly with Single-Pass Stereo Rendering. Please refer to the Authoring and modifying Shaders to support Single-Pass Stereo rendering section of the Single-Pass Stereo Rendering manual page for more details.
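The warm-up step described above is a one-liner once a ShaderVariantCollection asset exists. A minimal sketch, with the collection assigned in the Inspector:

```csharp
using UnityEngine;

// Sketch: warm up a ShaderVariantCollection during a loading screen
// so first use of a shader does not cause a frame rate hiccup.
public class ShaderWarmup : MonoBehaviour
{
    public ShaderVariantCollection shaderVariants; // assign in the Inspector

    void Start()
    {
        if (shaderVariants != null && !shaderVariants.isWarmedUp)
        {
            shaderVariants.WarmUp(); // prepares all variants in the collection now
        }
    }
}
```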

Render Scale

Depending on the complexity of a VR scene and the hardware being used, you may want to adjust XRSettings.eyeTextureResolutionScale and XRSettings.renderViewportScale at opportune moments.
Setting XRSettings.eyeTextureResolutionScale to a value below 1.0 will decrease GPU usage at the cost of image quality, whereas setting this property to a value above 1.0 will produce a crisper image at the cost of GPU time and a larger memory footprint. It’s important to note that setting XRSettings.eyeTextureResolutionScale reallocates Render Textures, which will cause a hitch due to expensive memory allocation operations. Consider changing this property when starting the app, from an options screen with quality settings, or when entering scenes that require more or less GPU bandwidth.
Setting XRSettings.renderViewportScale, on the other hand, allows you to specify how much of the allocated eye textures to use. This property can be set without any frame rate hitches given that no textures are reallocated. This also implies that this setting accepts values between 0 and 1.0; 1.0 being the default value. Use this property to gain performance on the GPU in more demanding scenes of your project.
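A minimal sketch of the two scaling knobs described above, using the XRSettings API named in the text:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: expose both XR render scale controls.
// Changing eyeTextureResolutionScale reallocates the eye textures
// (expensive, do it only at safe moments); renderViewportScale just
// uses a sub-rect of the allocated textures (cheap, per frame is fine).
public class RenderScaleControl : MonoBehaviour
{
    public void SetResolutionScale(float resolutionScale)
    {
        // Expensive: reallocates eye textures; call at startup, from an
        // options screen, or on scene transitions.
        XRSettings.eyeTextureResolutionScale = resolutionScale;
    }

    public void SetViewportScale(float viewportScale)
    {
        // Cheap: accepts 0..1 only, 1.0 is the default.
        XRSettings.renderViewportScale = Mathf.Clamp01(viewportScale);
    }
}
```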

Asynchronous Reprojection

Most VR platforms have implemented a graphics technique called asynchronous reprojection, which helps VR applications hit target frame rates during more computation-heavy frames. Whenever an application drops a frame, asynchronous reprojection kicks in, rendering a new frame by applying the user’s latest orientation to the most recently rendered frame. Oculus, GearVR, Daydream, SteamVR, and PlayStation VR all have their own proprietary implementation of this technique. Although asynchronous reprojection is enabled by default on all applications that support it, it is important for the developer to understand this feature when it comes time to debug and optimize. Developers should never rely on this technique in place of optimization, as it does cause positional and animation judder.

Motion-to-Photon Latency
Minimizing motion-to-photon latency is key to giving the user the impression of presence, the feeling of being physically present in the virtual experience. It is recommended to target 20 milliseconds or less of motion-to-photon latency, so that any input made via VR hardware is promptly reflected on the HMD’s screen. This includes the handling of HMD rotation and position as well as VR controller rotation and position.
Avoid handling input in FixedUpdate as this callback does not necessarily get called once per frame (depending on Time Settings and performance) and may incur increased input latency.
Tracked poses may potentially be handled twice, once in Update() and again in Application.onBeforeRender(). Any additional handling that occurs in onBeforeRender() should be very lightweight, or it can result in serious input latency.
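The late-update pattern above can be sketched as follows, re-sampling the HMD pose just before rendering while keeping the callback minimal:

```csharp
using UnityEngine;
using UnityEngine.XR;

// Sketch: apply the freshest head pose in onBeforeRender to reduce
// motion-to-photon latency. Keep this callback lightweight.
public class LateHeadPose : MonoBehaviour
{
    void OnEnable()  { Application.onBeforeRender += UpdatePose; }
    void OnDisable() { Application.onBeforeRender -= UpdatePose; }

    void UpdatePose()
    {
        // Read the latest tracked pose and do nothing else here.
        transform.localRotation = InputTracking.GetLocalRotation(XRNode.Head);
        transform.localPosition = InputTracking.GetLocalPosition(XRNode.Head);
    }
}
```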

Mobile Considerations
Mobile VR can be particularly demanding, as devices are less powerful than modern desktop PCs. It is important to take mobile platform limitations into account when designing mobile VR experiences. Mobile VR experiences should ideally be built from the ground up, as porting a high-end PC or console VR experience to mobile is a trying task. The rule of thumb is to optimize your experience from the get-go and to tone down everything from visual effects to processing done on the CPU. Apart from what has already been mentioned above, please consider the following guidelines when developing your mobile VR application:
  • Minimize (or eliminate) the use of post-process effects
  • Keep geometry as simple as possible
  • Minimize draw calls
  • Avoid using Standard Shader or other computation-heavy shaders
  • Use Physics wisely as it is very computation-heavy. If possible, replace Physics colliders with distance-checking logic.
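The last guideline can be sketched as a squared-distance check replacing a trigger collider for simple proximity logic, avoiding the physics cost entirely:

```csharp
using UnityEngine;

// Sketch: distance-checking logic in place of a Physics trigger.
public class ProximityCheck : MonoBehaviour
{
    public Transform target;           // e.g., the player's head or hand
    public float triggerRadius = 0.5f;

    void Update()
    {
        // sqrMagnitude avoids the square root inside Vector3.Distance.
        float sqrDist = (target.position - transform.position).sqrMagnitude;
        if (sqrDist < triggerRadius * triggerRadius)
        {
            // React to proximity here (hypothetical handler).
        }
    }
}
```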

Thermal Throttling

When a device detects that it is running too hot, thermal throttling may occur: the device intentionally limits the amount of power your application can consume. This naturally results in slower computation and negatively affects frame rate.
Thermal throttling is an important factor to keep in mind when developing mobile VR applications. It is a common issue mainly due to the fact that mobile VR often pushes devices to their maximum capacity. Mobile devices being enclosed in headsets whereas they were designed to be handheld also contributes to increased thermal impact.
Optimizing CPU and GPU processing, limiting network usage and location service usage will all contribute to improving your application’s energy impact. Consider setting Sustained Performance Mode when a predictable and consistent level of device performance over longer periods of time, without thermal throttling, is required. Overall, performance might be lower when this setting is enabled. This setting is based on the Android Sustained Performance API.

Platform Specific Recommendations
Each VR platform has its pros, cons, and specificities. For example, certain platforms make use of advanced rendering techniques that are not available on other platforms. Additionally, several platform-specific tools exist to help you track down problematic issues your project may be having.

RenderDoc (PC and Android)

RenderDoc is a stand-alone graphics debugging tool that allows you to do frame captures of your application running on PC or on Android.

Oculus Debug Tool (Oculus on PC)

The Oculus Debug Tool (ODT) enables you to toggle Asynchronous Spacewarp (ASW), change your field of view, and view performance or debugging information within your application. One of the most useful views you can enable in ODT is the Oculus Performance Head-Up Display, which shows render timings, latency, and performance headroom in real time.

SteamVR Frame Timings (SteamVR)

The SteamVR Frame Timing window can be accessed through the Video tab of SteamVR’s Settings window. This view shows CPU and GPU timings associated with SteamVR and is a great tool to monitor which frames are heavier to render, as well as the implications of reprojection in your VR project. The tool offers an option to display a more detailed graph, including the option to show the graph in-game. Do note that the Developer tab of SteamVR’s Settings window allows you to toggle Asynchronous Reprojection and Interleaved Reprojection on and off.

Oculus Remote Monitor (GearVR and Oculus Go)

The Oculus Remote Monitor client is available for Windows and Mac OS X, and connects to applications running on remote mobile devices. It allows you to capture, store and display the streamed performance data from the device. It can be used to identify issues with tearing and missed frames.

Daydream Performance HUD (Daydream)

Similarly to the Oculus Performance Head-Up Display, the Daydream Performance HUD can be used to view rendering, Google VR, and other system metrics in real-time on device. Use this tool in order to investigate rendering issues concerning frame rate, asynchronous reprojection, and thermal throttling.

Fixed Foveated Rendering (Oculus Go)

Fixed Foveated Rendering (FFR) is a technique used on Oculus Go that makes use of tile-based rendering to render the edges of the eye texture at a lower resolution than the center. This is nearly imperceptible to the user as it only affects peripheral sections of the view and can significantly improve GPU fill performance. In Unity, Fixed Foveated Rendering can be enabled or disabled using the OVRManager.tiledMultiResLevel property. Read more about FFR in this blog post.
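A minimal sketch using the Oculus Utilities for Unity and the OVRManager property named above; the exact enum value names are an assumption and may differ between plugin versions, so check the version in your project:

```csharp
using UnityEngine;

// Sketch: enable a medium level of Fixed Foveated Rendering on Oculus Go.
// TiledMultiResLevel value names are assumptions based on the Oculus
// Utilities plugin; higher levels shrink the full-resolution center region.
public class FoveationSetup : MonoBehaviour
{
    void Start()
    {
        OVRManager.tiledMultiResLevel = OVRManager.TiledMultiResLevel.LMSMedium;
    }
}
```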

Asynchronous Spacewarp (Oculus Rift)

Asynchronous Spacewarp (ASW) is an extension of Oculus’ asynchronous reprojection implementation, which is dubbed Asynchronous Timewarp (ATW). Like ATW, ASW kicks in when the frame rate drops below minimum requirements. This advanced technique uses positional information from previous frames to extrapolate where tracked elements, such as the player’s head and controllers, should be. This technique, combined with ATW, allows accurate intermediate frames to be generated when necessary, leading to an enjoyable experience even when frames are computationally demanding and frame rate requirements cannot be met. This feature is enabled by default and requires no extra effort, but developers should know that ASW may cause visual artifacts in some cases. See this blog post by Oculus to read more about Asynchronous Spacewarp.

PlayStation VR

The Unity manual installed with the PS4 Platform Installer includes a section on PlayStation VR (PS VR) specifics. Unity also provides a PlayStation VR sample project (available from Sony's DevNet Unity forum) that shows how to use PS VR specific features. PlayStation developers should also consult Sony DevNet’s technical notes and forums for more information about optimizing PS VR applications.
While the PlayStation 4 console is fixed hardware, there are two tiers, comprising the base PS4 and the PS4 Pro. Always keep in mind that PS VR is supported on both of these models. Developers are required to maintain a frame rate of at least 60 FPS, and are encouraged to reach 90 or 120 FPS where achievable.