AR Foundation Plane Detection
Previously, we saw how to set up our Unity project for AR and Android deployment. If you haven’t checked it out already, do that before continuing, as it’s a prerequisite for this blog. In this blog, we’ll learn about plane detection: we’ll scan the environment using our phone’s camera to detect horizontal and vertical planes. We’ll also learn to create a custom plane prefab and use it to visualize the detected planes.
Let’s start by setting up our scene for plane detection.
Create a new scene and get rid of the Main Camera GameObject; the AR Session Origin we add next comes with its own AR Camera.
Right-click on the Hierarchy window and select XR → AR Session Origin.
Once again, right-click on the Hierarchy window and select XR → AR Session.
Now, select the AR Session Origin GameObject and click on Add Component → add the AR Plane Manager component. This component will create a Plane Prefab for each detected plane in the environment. The Detection Mode parameter has four options; based on the option selected, the component will scan for horizontal planes, vertical planes, both, or none. For now, we’ll leave it at Everything, which will detect both horizontal and vertical planes.
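If you’d rather configure the detection mode from code instead of the Inspector, the AR Plane Manager exposes it through its API. Here is a minimal sketch (assuming AR Foundation 4.x; the script name `PlaneDetectionConfig` is my own):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Attach this to the AR Session Origin GameObject,
// next to the AR Plane Manager component.
public class PlaneDetectionConfig : MonoBehaviour
{
    void Start()
    {
        var planeManager = GetComponent<ARPlaneManager>();

        // PlaneDetectionMode is a flags enum, so modes can be combined.
        // This is equivalent to choosing "Everything" in the Inspector.
        planeManager.requestedDetectionMode =
            PlaneDetectionMode.Horizontal | PlaneDetectionMode.Vertical;
    }
}
```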
The component requires us to add a Plane Prefab. So, let’s see how to create and add a plane prefab.
Creating Plane Prefab
We can create a Plane Prefab in just two steps:
Right-click on the Hierarchy window and select XR → AR Default Plane. This will create a GameObject named AR Default Plane with the following components attached to it:
- AR Plane: it represents a plane detected by an AR device and contains all the data about that plane.
- ARPlaneMeshVisualizer: this component generates a Mesh from the plane’s boundary vertices and assigns it to the MeshCollider, MeshFilter, and LineRenderer components, if present. If those components are missing, we won’t be able to visualize the detected plane.
- MeshCollider: it takes a Mesh (generated by ARPlaneMeshVisualizer) and builds a Collider based on that Mesh. It is more accurate for collision detection than using primitives for complicated meshes.
- MeshFilter: it holds a reference to a Mesh. It works with a Mesh Renderer component on the same GameObject to visualize the mesh.
- Mesh Renderer: renders the mesh that the Mesh Filter references.
- LineRenderer: it’s used to draw anything from a simple straight line to a complex spiral.
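To see these pieces working together at runtime, you can subscribe to the AR Plane Manager’s planesChanged event and inspect each AR Plane as it is detected. A small sketch (assuming AR Foundation 4.x; the script name `PlaneLogger` is my own):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Attach next to the AR Plane Manager to log planes as they are detected.
public class PlaneLogger : MonoBehaviour
{
    ARPlaneManager planeManager;

    void OnEnable()
    {
        planeManager = GetComponent<ARPlaneManager>();
        planeManager.planesChanged += OnPlanesChanged;
    }

    void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        foreach (var plane in args.added)
        {
            // Each ARPlane carries the detected plane's data,
            // e.g. its alignment (horizontal/vertical) and size.
            Debug.Log($"New plane {plane.trackableId}: " +
                      $"{plane.alignment}, size {plane.size}");
        }
    }
}
```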
Next, drag and drop the AR Default Plane GameObject into the Project window to convert it into a prefab. Finally, delete it from the scene.
With that, we have created a Plane Prefab. But the line renderer has a default line color (black) and the mesh renderer has a default material (orange). If you are not a fan of those colors, we can change them in just a few steps.
Right-click on the Project window and select Create → Material. Name the material PlaneMaterial.
Select the PlaneMaterial, change its Rendering Mode to Transparent → select the Albedo parameter and reduce the alpha value. This way, we’ll be able to see the physical environment through the plane material.
Open the AR Default Plane prefab, find the Mesh Renderer component, and replace the DefaultPlane material with the PlaneMaterial.
On the same prefab, find the Line Renderer component and use the Color parameter to change the color of the line.
Before we build and test the application, there is one more step: referencing the Plane Prefab. To do that, select the AR Session Origin GameObject in the Hierarchy window → drag and drop the AR Default Plane prefab into the Plane Prefab parameter of the AR Plane Manager component.
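The same reference can also be assigned from code via the AR Plane Manager’s planePrefab property. A small sketch (the script name `PlanePrefabBinder` and the serialized field are my own; you would still drag the prefab onto the field in the Inspector):

```csharp
using UnityEngine;
using UnityEngine.XR.ARFoundation;

// Assigns the plane prefab from code instead of setting
// the Plane Prefab parameter directly on the AR Plane Manager.
public class PlanePrefabBinder : MonoBehaviour
{
    [SerializeField] GameObject planePrefabAsset;

    void Awake()
    {
        GetComponent<ARPlaneManager>().planePrefab = planePrefabAsset;
    }
}
```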
Now, let’s build and test the application. Go to File → Build Settings. This will open the Build Settings window.
Make sure that the device is connected and that the right scene and the Android platform are selected. Then click on Build and Run. Give an appropriate name for the application and click on Save.
Once it’s built, the application will run on the device and we can test it. As we scan the environment, it should detect horizontal and vertical planes and render the plane prefab on top of them.
What we saw in this blog was just the beginning; there is a lot more that can be done after a plane has been detected. We can calculate its size, place an object on top of it, make an augmented object interact with it, and more. In the next article, we’ll do something similar: we’ll create a Tap To Place application where we can spawn an object on a detected plane and move it to a different location on the plane based on the touch position.
Thanks for reading this blog post. 🧡 If you are interested in creating your own AR and VR apps, you can learn more about it here on immersive insiders. Also, if you have any questions, don't hesitate to reach out! We're always happy to help.