Other AR considerations

Tutorial · Beginner · 30 mins · Unity Technologies

The UFO game is now fully functional on mobile from a technical perspective, but are there any other design decisions you can make to create a better AR experience for your users?

In this tutorial you’ll do the following:

  • Review key design areas of the AR app
  • Make changes to the app based on your designs


1. Overview

The core functionality of the port to AR is complete! Now that everything is working, it is time to think about ways to improve the AR experience for your players. Because this game was originally created to be a desktop experience, all of the design decisions were made based on the strengths of that specific platform. There are many settings and design choices you can make to optimize and extend the experience for augmented reality.



In this tutorial, you’ll review key areas of the application that significantly differ between the desktop platform and mobile AR platforms and make decisions about what design changes you’d like to make for each of them.


2. Tell the story with lighting

The first part of the design you’re going to consider is lighting.


The game’s backstory has the sheep abduction occurring at night. To illustrate this fact, all of the scenes of the game feature a starry sky and blue-tinted lighting.



In the AR game, the skybox no longer appears in the scene, leaving only the lighting to show the user that the story takes place at night. What else can you do to help illustrate this nighttime setting? Are there any lighting adjustments you think you can make to better communicate this idea? Are there other ways that you can tell this part of the story?
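For example, if you decide to push the lighting further toward a nighttime look, a small script like the sketch below could adjust the directional light and the ambient color at runtime. This is only a sketch: the NightMood name, the serialized light reference, and the specific color and intensity values are placeholders for you to tune, and you can make the same adjustments directly in the Lighting window and on the Light component instead.

    using UnityEngine;

    // Hypothetical example: tint the scene toward a cool, dim nighttime look at runtime.
    // Attach to any GameObject and assign the scene's directional light in the Inspector.
    public class NightMood : MonoBehaviour
    {
        [SerializeField] private Light directionalLight; // the scene's main directional light (assumption)

        void Start()
        {
            if (directionalLight != null)
            {
                // A cool blue tint and reduced intensity read as moonlight rather than daylight.
                directionalLight.color = new Color(0.55f, 0.65f, 0.9f);
                directionalLight.intensity = 0.4f;
            }

            // Darken the flat ambient color as well, so unlit areas also read as nighttime.
            // (This only applies if the Environment Lighting Source is set to Color.)
            RenderSettings.ambientLight = new Color(0.1f, 0.12f, 0.2f);
        }
    }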


3. Perform a lighting technical review

As you tested the game on your phone, you might have noticed a difference in the farm’s appearance between what you see in Unity and what you see on your phone.


In the Unity Editor, the scene is a cool blue and appears evenly lit. However, in the built app, the prefab takes on a very different appearance:



In the AR app, much of the scene looks significantly darker, particularly the field and corn. Meanwhile, the sheep and the farmer appear unchanged. What’s happening?



The desktop version of the game makes use of baked lighting. With baked lighting, Unity calculates scene lighting information before runtime (that is, before users interact with your experience). This means that all Unity needs to do at runtime is apply the stored lighting data to the scene. This setup allows the scene to run faster and more efficiently, because Unity doesn’t need to recalculate the lighting every frame. To learn more about baked lighting, check out the Creative Core: Lighting mission.


The one limitation of baked lighting is that it doesn’t work for dynamic objects, that is, objects that animate or otherwise move in some way during the game. This limitation means the sheep and the farmer can’t make use of baked lighting. To account for this, the Mode of each directional light in the game has been set to Mixed. A Mixed light calculates lighting for dynamic objects at runtime and can adjust as things in the scene change, while Unity applies the pre-calculated baked lighting to objects that already have baked lighting data.



This explains why the sheep and farmer look the same on your mobile device while the rest of the scene looks darker: the sheep and farmer were already being lit at runtime. The farm, however, is now instantiated into the scene at runtime, so Unity treats it like any other dynamic object and its pre-calculated lighting data no longer applies. In other words, the baked lighting has broken.


Fix the broken lighting


There are two ways to repair the broken lighting in the scene. When considering which action to take, review other lighting design choices you might have made for your game, as well as how well each option runs on your phone:


  • Option 1: Add the directional light to your Gameplay prefab and bake new lighting data. If the light is kept with the objects that have associated lighting data, then the baked lighting won’t break. To learn how to bake lighting, refer to Bake a lightmap for your scene in the Creative Core pathway.

  • Option 2: Set the directional light to realtime and don’t work with baked light at all. This may seem like the easier option, but be sure to test the end result on your phone. As a reminder, realtime lighting is calculated every frame while the game runs, so it can slow down your application. A small Editor sketch for this option follows below.
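If you choose Option 2, you can simply change the Mode dropdown on each Light component in the Inspector. As an optional convenience, here is a minimal Editor sketch that does the same thing for every directional light in the open scene; the menu path and the LightingUtility name are placeholders, not part of the project.

    #if UNITY_EDITOR
    using UnityEditor;
    using UnityEngine;

    // Optional convenience sketch for Option 2: switch every directional light in the
    // open scene to Realtime mode. You can achieve the same result by changing the Mode
    // dropdown on each Light component in the Inspector; this menu item just automates it.
    public static class LightingUtility
    {
        [MenuItem("Tools/Set Directional Lights To Realtime")]
        private static void SetDirectionalLightsToRealtime()
        {
            foreach (Light sceneLight in Object.FindObjectsOfType<Light>())
            {
                if (sceneLight.type == LightType.Directional)
                {
                    Undo.RecordObject(sceneLight, "Set Light To Realtime");
                    sceneLight.lightmapBakeType = LightmapBakeType.Realtime; // Editor-only property
                }
            }
        }
    }
    #endif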

Lighting is an important feature of any game, not just for practical visibility purposes, but for design purposes as well. Take time to consider how you can use lighting to improve the overall user experience of the game.


4. Evaluate the object visibility

In the desktop version of the game, the user’s camera view is fixed, and every important gameplay element is framed to be centered in that field of view. When we created the GameObjects used in this experience, we prioritized making everything proportional, such as the size of the farmer in relation to the barn.


In your AR port, the camera is no longer fixed. It’s now possible for the user to fully move around the play area, which has many benefits, but this freedom also means that certain gameplay elements can now become obscured or too small and harder to see.



Given the dynamic camera in AR, what changes could you make to improve the user experience? Playtest the game and consider the following questions:


  • Are there any objects that become difficult to see from different angles in the game?

  • Are there objects that could be made larger or smaller to address this problem?

  • Are there other elements that you can add to objects to make them easier to see?

After you have completed your testing, address the issues that you identified by making adjustments to the Gameplay prefab. Publish to your mobile device and test again to see if the problems you saw previously were resolved with your changes. Continue this iteration cycle until you’re happy with the results.
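One common way to address visibility problems is to add an indicator, such as an icon floating above each sheep, that always rotates to face the AR camera. The sketch below is a minimal version of that idea; the BillboardIndicator name is a placeholder, and it assumes the AR Camera is tagged MainCamera, which is the AR Foundation default.

    using UnityEngine;

    // Hypothetical example: keep an indicator (such as an icon floating above a sheep)
    // rotated toward the AR camera so it stays readable as the player moves around.
    public class BillboardIndicator : MonoBehaviour
    {
        private Camera arCamera;

        void Start()
        {
            // Assumes the AR Camera is tagged MainCamera (the AR Foundation default).
            arCamera = Camera.main;
        }

        void LateUpdate()
        {
            if (arCamera == null) return;

            // Point the indicator's forward axis away from the camera so its front face
            // is always visible to the player.
            transform.rotation = Quaternion.LookRotation(transform.position - arCamera.transform.position);
        }
    }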


5. Integrate spatial audio

In the desktop version of the game, all audio was configured to sound best for the fixed camera position in each scene.



The result was a soundscape that sounded correct from the precise position of the camera; if the camera moved, the audio would need to be adjusted. Because the cameras didn’t move in the original version of the game, this wasn’t an issue. However, in the AR version, the camera in MainScene can move to any position.


The existing scene has a combination of 2D and 3D audio sources throughout. Playtest the game and pay close attention to what you’re hearing. Make note of what you hear from each of the following sources:


  • The farmer: snoring, waking up, shouting

  • The sheep

  • The UFO: ambient hum, tractor beam activation

  • The ambient cricket song

Certain audio sources may benefit from the spatial audio treatment, while you might wish for others to be audible at all times throughout the game. Think about how each sound is used to inform what’s happening as you play.


Once you have determined how you would like to adjust the audio, select each audio source and modify it as needed. If you need a refresher on how to work with spatial audio, refer to the tutorial from an earlier mission, Add spatial audio to your marker-based app.
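As a reference, here is a minimal sketch of what configuring a single source for spatial (3D) audio could look like from a script. The SpatialAudioSetup name and the distance values are placeholders, and you can set the same properties directly on the Audio Source component in the Inspector instead.

    using UnityEngine;

    // Hypothetical example: configure an Audio Source for spatial (3D) audio from code.
    // The same settings are available on the Audio Source component in the Inspector.
    [RequireComponent(typeof(AudioSource))]
    public class SpatialAudioSetup : MonoBehaviour
    {
        void Awake()
        {
            AudioSource source = GetComponent<AudioSource>();

            source.spatialBlend = 1f;                           // 1 = fully 3D, 0 = fully 2D
            source.rolloffMode = AudioRolloffMode.Logarithmic;  // natural-sounding falloff with distance
            source.minDistance = 0.5f;                          // full volume within half a meter (placeholder)
            source.maxDistance = 10f;                           // attenuation stops beyond ten meters (placeholder)
        }
    }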


Don’t feel limited to what already exists in the game! If you think the AR version of the game would be better served with fewer, more, or even different audio effects, go for it!


6. AR in other scenes

In total, the UFO game contains four scenes: StartScene, MainScene, LoseScene, and WinScene. Currently, the only scene with AR content is MainScene. It’s perfectly fine to keep the AR elements exclusive to the gameplay scene, but you can absolutely add AR to the other scenes as well. Consider the following challenges:


  • Buttons: In StartScene, LoseScene, and WinScene, there are several buttons. How would you adjust these buttons for AR? Would you try to make them world space UI components, as sketched at the end of this section, or keep them in screen space?

  • Camera angles: The LoseScene features a fixed camera angle showing the farmer pointing at the UFO as it flies away. This sequence tells the story of the farmer scaring the UFO away from his flock. How would you communicate this story without being able to rely on that camera angle?

  • Extra assets: The StartScene features the UFO hovering over a field of corn, but if you actually look at the scene, the field is really only a few meshes arranged specifically for the camera! How would you recreate the scene to preserve the overall mood of the original but have it work for AR?

Adding more AR elements to the game is an opportunity for you to apply your own design ideas. Try a few concepts and see how they work!
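If you experiment with the world space UI idea from the Buttons challenge, the sketch below shows one way you might convert a Canvas and place it in front of the AR camera. The PlaceCanvasInWorld name, the distance, and the scale value are placeholders to adjust for your own scene.

    using UnityEngine;

    // Hypothetical example: convert a screen space Canvas to world space and place it
    // in front of the AR camera so its buttons appear as part of the environment.
    public class PlaceCanvasInWorld : MonoBehaviour
    {
        [SerializeField] private Canvas canvas;          // the scene's UI Canvas (assigned in the Inspector)
        [SerializeField] private float distance = 1.5f;  // meters in front of the camera (placeholder)

        void Start()
        {
            Camera arCamera = Camera.main;
            if (canvas == null || arCamera == null) return;

            canvas.renderMode = RenderMode.WorldSpace;
            canvas.worldCamera = arCamera;

            // UI is authored in pixel units, so scale it down to a sensible physical size.
            canvas.transform.localScale = Vector3.one * 0.002f;

            // Place the canvas in front of the camera and rotate it to face the player.
            Transform cameraTransform = arCamera.transform;
            canvas.transform.position = cameraTransform.position + cameraTransform.forward * distance;
            canvas.transform.rotation = Quaternion.LookRotation(canvas.transform.position - cameraTransform.position);
        }
    }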


7. Next steps

By regularly experimenting with new concepts, you’ll continue to sharpen your AR design skills. As you’ve discovered throughout this tutorial, the key to improvement is continual testing and iterating on ideas. Keep this outlook in mind as you move on to your final challenge: creating an original app of your very own!

