
Building and running a test scene

Before moving on to building an AR project, it is prudent to verify that your project has been set up properly so far by building and running it on your target device. To do this, we'll create a minimal AR scene and confirm that it satisfies the following checklist:

  • You can build the project for your target platform.
  • The app launches on your target device.
  • When the app starts, you see a video feed from its camera on the screen.
  • The app scans the room and renders depth points on your screen.
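Beyond this visual checklist, you can also check for AR support programmatically. The following is a minimal, optional sketch (the class name is my own) that uses AR Foundation's ARSession.state and ARSession.CheckAvailability() to log whether the device supports AR:

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Optional helper (illustrative): attach to any GameObject in the scene
// to log whether the device supports AR.
public class ARSupportChecker : MonoBehaviour
{
    IEnumerator Start()
    {
        // Ask the platform (ARCore/ARKit) whether AR is available
        if (ARSession.state == ARSessionState.None ||
            ARSession.state == ARSessionState.CheckingAvailability)
        {
            yield return ARSession.CheckAvailability();
        }

        if (ARSession.state == ARSessionState.Unsupported)
        {
            Debug.LogWarning("AR is not supported on this device.");
        }
        else
        {
            Debug.Log($"AR session state: {ARSession.state}");
        }
    }
}
```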

I'll walk you through this step by step. Don't worry if you don't understand everything; we will go through this in more detail together in Chapter 2, Your First AR Scene. Please do the following in your current project, which should be open in Unity:

  1. Create a new scene by selecting File | New Scene and choosing the Basic (Built-In) template. Then select File | Save As, navigate to your Scenes folder, name the scene BasicTest, and click Save.
  2. In the Hierarchy window, delete the default Main Camera (right-click and select Delete, or use the Del keyboard key).
  3. Add an AR Session object by selecting GameObject | XR | AR Session.
  4. Add an AR Session Origin object by selecting GameObject | XR | AR Session Origin.
  5. Add a point cloud manager to the AR Session Origin object by selecting it in the Hierarchy window and clicking Add Component in the Inspector window. Then, enter ar point in the search field and select AR Point Cloud Manager.
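If you'd like a quick sanity check that the scene now contains everything created in the preceding steps, here is a small optional sketch (the class name is my own) that looks for the expected AR Foundation components at startup and logs a warning if any are missing:

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Optional sanity check for the BasicTest scene setup (illustrative only).
public class BasicTestSceneCheck : MonoBehaviour
{
    void Start()
    {
        if (FindObjectOfType<ARSession>() == null)
            Debug.LogWarning("No AR Session found in the scene.");

        if (FindObjectOfType<ARSessionOrigin>() == null)
            Debug.LogWarning("No AR Session Origin found in the scene.");

        if (FindObjectOfType<ARPointCloudManager>() == null)
            Debug.LogWarning("No AR Point Cloud Manager found in the scene.");
    }
}
```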

You will notice that the Point Cloud Manager has an empty slot for a Point Cloud Prefab, which is used for visualizing the detected depth points. A prefab is a GameObject saved as a project asset that can be added to the scene (instantiated) at runtime. We'll create a prefab using a very simple Particle System. Again, if this is new to you, don't worry about it – just follow along:

  1. Create a Particle System by selecting GameObject | Effects | Particle System.
  2. In the Inspector window, rename it PointParticle.
  3. On the Particle System component, uncheck the Looping checkbox.
  4. Set its Start Size to 0.1.
  5. Uncheck the Play on Awake checkbox.
  6. Click Add Component, enter ar point in the search field, and select AR Point Cloud.
  7. Likewise, click Add Component and select AR Point Cloud Visualizer.
  8. Drag the PointParticle object from the Hierarchy window to the Prefabs folder in the Project window (create the folder first if necessary). This makes the GameObject into a prefab.
  9. Delete the PointParticle object from the Hierarchy window using right-click | Delete or the Del key.
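For reference, the checkboxes and values in steps 3 to 5 correspond to properties on the Particle System's main module, which could also be set from a script. This is just an illustrative sketch, not something the project needs:

```csharp
using UnityEngine;

// Illustrative: applies the same settings as steps 3 to 5 from code.
[RequireComponent(typeof(ParticleSystem))]
public class PointParticleSettings : MonoBehaviour
{
    void Awake()
    {
        var main = GetComponent<ParticleSystem>().main;
        main.loop = false;        // uncheck Looping
        main.startSize = 0.1f;    // Start Size = 0.1
        main.playOnAwake = false; // uncheck Play on Awake
    }
}
```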

The Inspector window of the PointParticle prefab should now look as follows:

Figure 1.16 – Inspector view of our PointParticle prefab with the settings we're using highlighted

We can now apply the PointParticle prefab to the AR Point Cloud Manager, as follows:

  1. In the Hierarchy window, select the AR Session Origin object.
  2. From the Project window, drag the PointParticle prefab into the AR Point Cloud Manager | Point Cloud Prefab slot. (Alternatively, click the "doughnut" icon to the right of the slot to open the Select GameObject window, select the Assets tab, and choose PointParticle).
  3. Save the scene using File | Save.

The resulting AR Session Origin should look as follows:

Figure 1.17 – Session Origin with a Point Cloud Manager component populated with the PointParticle prefab
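Dragging the prefab into the slot simply sets the manager's pointCloudPrefab property. If you ever need to assign it from a script instead, it might look something like this sketch (the class and field names are my own):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative: assigns the point cloud prefab from code
// instead of dragging it into the Inspector slot.
[RequireComponent(typeof(ARPointCloudManager))]
public class AssignPointCloudPrefab : MonoBehaviour
{
    [SerializeField] GameObject pointParticlePrefab;  // set in the Inspector

    void Awake()
    {
        GetComponent<ARPointCloudManager>().pointCloudPrefab = pointParticlePrefab;
    }
}
```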

Now, we are ready to build and run the scene. Perform the following steps:

  1. Open the Build Settings window using File | Build Settings.
  2. Click the Add Open Scenes button to add this scene to the build list.
  3. In the Scenes in Build list, uncheck all scenes except the BasicTest one.
  4. Ensure your device is connected to your computer via USB cable.
  5. Press the Build And Run button to build the project and install it on your device. It will prompt you for a save location; I like to create a folder in my project root named Builds/. Give it a filename (if required) and press Save. It may take a while to complete this task.
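For completeness, the Build And Run step can also be scripted using Unity's BuildPipeline API. The following editor-only sketch assumes an Android target and that the scene was saved at Assets/Scenes/BasicTest.unity; the menu path and output path are my own choices, and the script must live in an Editor folder:

```csharp
using UnityEditor;

// Editor-only sketch: builds the BasicTest scene for Android,
// installs it, and runs it on the connected device.
public static class BasicTestBuilder
{
    [MenuItem("Build/Build And Run BasicTest (Android)")]
    public static void BuildAndRun()
    {
        string[] scenes = { "Assets/Scenes/BasicTest.unity" };  // assumed scene path
        BuildPipeline.BuildPlayer(
            scenes,
            "Builds/BasicTest.apk",       // output location (the Builds/ folder mentioned above)
            BuildTarget.Android,
            BuildOptions.AutoRunPlayer);  // deploy and launch after building
    }
}
```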

If all goes well, the project will build, install on your device, and launch. You should see a camera video feed on your device's screen. Move the phone slowly in different directions; as it scans the environment, feature points will be detected and rendered on the screen. The following screen capture shows my office door with a point cloud rendered on my phone. Notice that the particles representing points closer to the camera appear larger than those further away, contributing to the user's perception of depth in the scene.

Figure 1.18 – Point cloud rendered on my phone using the BasicTest scene
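If you'd like to confirm programmatically that feature points are being detected, you can also subscribe to the AR Point Cloud Manager's pointCloudsChanged event. Here is a rough sketch, assuming AR Foundation 4.x (where ARPointCloud.positions is a nullable NativeSlice):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Illustrative: logs how many feature points each point cloud holds as it updates.
[RequireComponent(typeof(ARPointCloudManager))]
public class PointCloudLogger : MonoBehaviour
{
    ARPointCloudManager manager;

    void OnEnable()
    {
        manager = GetComponent<ARPointCloudManager>();
        manager.pointCloudsChanged += OnPointCloudsChanged;
    }

    void OnDisable()
    {
        manager.pointCloudsChanged -= OnPointCloudsChanged;
    }

    void OnPointCloudsChanged(ARPointCloudChangedEventArgs args)
    {
        foreach (var cloud in args.updated)
        {
            int count = cloud.positions.HasValue ? cloud.positions.Value.Length : 0;
            Debug.Log($"Point cloud {cloud.trackableId}: {count} points");
        }
    }
}
```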

If you encounter errors while building the project, look at the Console window in the Unity Editor for messages (in the default layout, it's a tab behind the Project window). Read the messages carefully, generally starting from the top. If that doesn't help, review each of the steps detailed in this chapter. If the fix is still not apparent, do an internet search for the message's text; you're almost certainly not the first person to have run into a similar problem!

Tip – Build Early and Build Often

It is important to get builds working as soon as possible in a project. If not now, then certainly do so before the end of the next chapter, as it does not make a lot of sense to be developing an AR application without having the confidence to build, run, and test it on a physical device.

With a successful build, you're now ready to build your own AR projects. Congratulations!
