Detect the user’s touch

Tutorial · Beginner · 60 mins · Unity Technologies


1. Overview

A single tap on the screen is one of the simplest and most intuitive interactions with a mobile device. Most users instinctively try to tap on elements they think they can interact with on the screen.

Your app already detects and displays planes in the environment. When the project is complete, tapping on one of those planes will spawn a portal to a virtual world. The first step towards that outcome is detecting when and where the user has touched the screen.

In this tutorial, you will do the following:

  • Create a new visual script.
  • Detect if the user has touched the screen.
  • Detect the position of the touch on the screen.
  • Add a user interface that displays information about the user’s touch.

2. Create a new script graph

First, follow these instructions to create a new script graph:

1. In the Hierarchy, right-click and select Create Empty.

2. Name the new empty GameObject “Visual Scripts”.

3. In the Inspector, select Add Component, then add a Script Machine component.

4. In the Script Machine component, find the Graph property and select New.

5. When the Save Graph dialog window appears, create a new folder inside the _ARPortal folder and name it “Visual Scripts”.

6. Name the new graph “TapToPlace” and select Save.

7. In the Script Machine component, update the graph text fields as follows:

  • Title: Tap to place
  • Summary: Detects touch on an AR plane and places a portal object

8. Select Edit Graph to open the TapToPlace script graph.

3. Add a debug UI to your app

Before you get started on your visual script, you should prepare to debug the app on your device.

Developing apps for a phone presents unique challenges for debugging. When the app is running on your device, there is no easy way to see errors in the console when they arise.

In situations like this, it can be helpful to implement a simple debugging user interface (UI) that displays directly on your device as the app is running. This can be a simple text box that you print messages to, confirming that the app is working as you expect.

We have provided a simple UI for you to use, but feel free to customize its appearance.

Implement the debug UI

Follow these instructions to add a debug UI to your app and log a message:

1. In the Project window, go to _ARPortal > Prefabs > Testing.

2. Click and drag the Debug UI prefab from the Project window into the Hierarchy.

3. In the Hierarchy, expand the debug UI and the Panel child GameObject, then select the Text GameObject.

4. In the Inspector, find the Text component. Try editing the Text property to change the text that appears in the Game view.

Note: The font size may differ from the screenshot above, depending on the aspect ratio and resolution of your Game view preview window. This size difference is expected behavior.

4. Save your scene.

5. Build and run the app to preview the debug UI on your device.

4. Create a new variable for the debug message

The debug UI is ready to display any message you want. Next, you need to make sure you can change the content of that message from a visual script while the app is running. You can do this using a variable.

Follow these instructions to set up your debug UI text as a variable:

1. In the Script Graph window, go to the Blackboard and select the Object tab.

2. In the (New Variable Name) field, enter “debugText” and select Add (+).

3. Set the Type property to Text.

4. Select the Value property object picker (circle) and choose Text.

5. Log a message to the debug UI

Now that you’ve set up the variable, you can change the debug message displayed at the start of your visual script.

Watch the video or follow the instructions below to see how it’s done.

Follow these instructions to change the debugText variable at the start of your visual script:

1. In the Graph Editor, use the fuzzy finder to add a Text: Set Text node. Connect the On Start node flow output to your new node.

2. Left-click the handle on the debugText variable and drag the variable into the Graph Editor, then connect it to the Text input of the Set Text node.

3. In the String input of the Set Text node, enter a custom message. For example, you could change it to “application started”.


4. Build and run the app to preview the new starting debug message.
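For reference, the visual script nodes in this tutorial map onto Unity's C# scripting API. The following is a rough sketch only, not part of the tutorial project: the class and field names are illustrative, with `debugText` mirroring the Object variable you created on the Blackboard.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Reference sketch: C# equivalent of the On Start → Set Text visual script.
public class TapToPlaceSketch : MonoBehaviour
{
    // Mirrors the "debugText" Object variable (type: Text) from the Blackboard.
    [SerializeField] private Text debugText;

    private void Start()
    {
        // Equivalent of the Set Text node connected to the On Start event.
        debugText.text = "application started";
    }
}
```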

6. Detect a touch

Next, you will add functionality to detect when the user has touched the device screen and send a message to the debug UI to verify that the touch detection is working correctly. The message should be something like, “touch detected.”

You’ll need to use the On Update event to detect a touch in your app. Since the On Update event runs every frame, it is well suited to this situation, where the user could touch the screen at any time.

Watch the video or follow the instructions below to detect a touch and log the event to the debug UI:

1. In the Graph Editor, use the fuzzy finder to add an Input Get Touch (Index) node. Connect the On Update event node flow output to your new node.

The Index identifies which of the simultaneous screen touches to read, counting from zero: the first finger to touch the screen is Index 0, a second simultaneous touch is Index 1, and so on.

2. From the On Start event node sequence you created earlier, select and duplicate the Set Text and Get Variable nodes. Connect the Get Touch node flow output to the flow input of the duplicated Set Text node.

3. In the duplicated Set Text node, change the String value message to something like “touch detected”.

4. Build and run the app to preview the new functionality. When you touch the screen, the debug UI should display “touch detected”.
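As a point of comparison, here is a hedged C# sketch of the same On Update sequence (illustrative only; `debugText` mirrors the Blackboard variable):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Reference sketch: C# equivalent of On Update → Get Touch → Set Text.
public class TouchDetectSketch : MonoBehaviour
{
    [SerializeField] private Text debugText; // mirrors the "debugText" variable

    private void Update()
    {
        // Input.GetTouch requires a valid index, so check touchCount first.
        if (Input.touchCount > 0)
        {
            Touch touch = Input.GetTouch(0); // index 0 = the first finger on the screen
            debugText.text = "touch detected";
        }
    }
}
```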

7. Get the touch position

In addition to checking whether or not a touch has occurred, you also need to determine the location of that touch on the screen. You can use the X and Y coordinates of the touch to determine whether or not the user successfully touched a horizontal or vertical plane.

The process to determine the touch position

You will use the Touch: Get Position node to get the touch location. This node outputs a Vector2, which includes the X and Y coordinates of the detected touch.

You’re going to display these coordinates in the debug UI’s text. However, the Set Text node will not accept a Vector2 input for its text — the input requires a String value.

In order to display the Vector2 coordinates from the Get Position node using the String input on the Set Text node, you first need to convert those Vector2 coordinates into a String. You can do this using a To String node, which converts data into a String that can be displayed as text.

Display the touch position coordinates

Watch the video or follow the instructions below to get the touch position and display those coordinates in the debug UI:

1. In the Graph Editor, add a Vector 2: To String node and insert it in the On Update sequence between the Get Touch and Set Text nodes.

2. Add a Touch: Get Position node, then connect it between the Get Touch and To String nodes.

3. Connect the String output from the To String node to the String input of the Set Text node.

4. Build and run the app to preview the new functionality. As you touch the screen, the debug UI should display the touch position’s X and Y coordinates.
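The full sequence again maps onto the C# API. This sketch is for reference only and is not part of the tutorial project; `Vector2.ToString()` plays the role of the To String node:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Reference sketch: C# equivalent of Get Touch → Get Position → To String → Set Text.
public class TouchPositionSketch : MonoBehaviour
{
    [SerializeField] private Text debugText; // mirrors the "debugText" variable

    private void Update()
    {
        if (Input.touchCount > 0)
        {
            // touch.position is a Vector2 in screen pixels; ToString() converts it
            // into a display string, like the To String node in the graph.
            Vector2 position = Input.GetTouch(0).position;
            debugText.text = position.ToString();
        }
    }
}
```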

8. More things to try

If you want to further develop your skills, explore new concepts, or improve your project, check out some of the optional activities below. Each activity is identified as being either Easy, Medium, or Difficult, so you know what you’re getting into!

These activities are entirely optional, so if you’re not interested, no problem – just skip this step.

We do recommend attempting at least one of them in order to get the most out of this learning experience. Good luck!

Easy: Display a helpful starting message

In this tutorial, you displayed the text, “application started” on the debug UI to verify the UI was working properly. Displaying text at the start of an AR experience is often very useful, since it is an opportunity to tell users how to interact with the application. In this app, you could tell the user, “move the camera around slowly to detect a vertical/horizontal surface.” Try implementing a more helpful message like this at the start of your application.

Difficult: Create a touch visualizer

In the video demos throughout this tutorial, you probably noticed the small white circle that followed the touch position.

The small circle can be helpful for debugging, since it confirms precisely when and where your finger makes contact with the screen.

Use the following guidance for help completing this challenge:

  • Use a screen space UI image in the shape of a circle for the visualizer.
  • Create a new Script Machine named “TouchVisualizer” to control the functionality. It is helpful to keep this kind of extra functionality separate from your main script.
  • Use the GameObject: Set Active node and the Transform: Set Position node to enable and position the UI circle.
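If you attempt the challenge, the guidance above can be sketched in C# as follows. This is one possible approach, not a definitive solution: `circle` is a hypothetical field for the screen space UI Image GameObject, and it assumes a Screen Space – Overlay canvas so the touch position can be used directly.

```csharp
using UnityEngine;

// Reference sketch for the touch visualizer challenge (one possible approach).
public class TouchVisualizerSketch : MonoBehaviour
{
    // Hypothetical field: the circle-shaped UI Image GameObject.
    [SerializeField] private GameObject circle;

    private void Update()
    {
        bool touching = Input.touchCount > 0;

        // GameObject: Set Active node — show the circle only while touching.
        circle.SetActive(touching);

        if (touching)
        {
            // Transform: Set Position node — with a Screen Space – Overlay canvas,
            // the screen space touch position can be assigned directly.
            circle.transform.position = Input.GetTouch(0).position;
        }
    }
}
```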

9. Next steps

In this tutorial you detected a touch event and logged the coordinates in a custom debug UI. You then used the coordinates to verify that your visual scripts were working as expected. In the next tutorial, you’ll detect whether or not the user has touched a valid horizontal or vertical plane.
