Markerless AR with AR Foundation in Unity

Markerless AR was a huge step forward for AR technology. Now, with AR Foundation in Unity, you can create AR experiences without needing any marker.

This approach gives you a lot of flexibility, as you don't rely on any physical item. Devices that support ARKit or ARCore have this functionality built into those solutions.

It works by detecting surfaces in the real world, on which you can later place any virtual object.

Want to learn how you can do that?

Project setup

As in the previous posts about Face Tracking and Image Tracking, we have to install the AR Foundation packages.

To do that, open the Package Manager and install the following packages: AR Foundation, ARCore XR Plugin (for Android support), and ARKit XR Plugin (for iOS support).

Installing AR packages from Package Manager.

With these installed, now we can start working on building the scene.

Build the AR scene

As in the previous posts, we need to start by creating an AR Session and an AR Session Origin.

You can create both from the Hierarchy via Create > XR > AR Session and Create > XR > AR Session Origin.

Creating AR Session and AR Session Origin.

Surface detection

AR Foundation has a few components that we can use to visualize detected surfaces.

First, I would recommend creating an AR Default Plane and turning it into a prefab.

Creating AR Default Plane.

After that, you can remove the prefab instance from the scene.

Now, select the AR Session Origin and add the AR Plane Manager component. Of course, don't forget to assign our plane prefab there!

AR Plane Manager in the Inspector.

If you want, you can build the project and test whether surface detection is working properly.
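If you also want a quick sanity check in the logs while testing, you can add a small helper component next to the AR Plane Manager. This is just an optional sketch (the PlaneDetectionLogger name is mine, not part of the original project); it assumes the planesChanged event exposed by ARPlaneManager in AR Foundation 4.x:

using UnityEngine;
using UnityEngine.XR.ARFoundation;

/// <summary>
/// Optional helper: logs how many planes have been detected so far.
/// </summary>
[RequireComponent(typeof(ARPlaneManager))]
public class PlaneDetectionLogger : MonoBehaviour
{
    // Reference to the AR Plane Manager on the same GameObject.
    private ARPlaneManager planeManager;

    private void Awake()
    {
        planeManager = GetComponent<ARPlaneManager>();
    }

    private void OnEnable()
    {
        // planesChanged fires whenever planes are added, updated, or removed.
        planeManager.planesChanged += OnPlanesChanged;
    }

    private void OnDisable()
    {
        planeManager.planesChanged -= OnPlanesChanged;
    }

    private void OnPlanesChanged(ARPlanesChangedEventArgs args)
    {
        if (args.added.Count > 0)
        {
            Debug.Log($"Detected {args.added.Count} new plane(s). Total tracked: {planeManager.trackables.count}");
        }
    }
}

Attach it to the same GameObject as the AR Plane Manager (the AR Session Origin) and watch the Console while scanning a surface.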

Placing virtual objects in the real world!

Now it’s time to place virtual content in our world!

We can do that with a simple script.

using System.Collections.Generic;
using System.Linq;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

/// <summary>
/// This class is responsible for placing and moving an instance of the prefab in the real world.
/// </summary>
[RequireComponent(typeof(ARRaycastManager))]
public class ARPlaceObject : MonoBehaviour
{
    // Reference to the AR Raycast Manager.
    private ARRaycastManager raycastManager;

    // Prefab which will be spawned in the real world.
    [SerializeField]
    private GameObject prefab;

    // Instance of the prefab.
    private GameObject prefabInstance;

    /// <summary>
    /// Unity method called before the first frame.
    /// </summary>
    private void Start()
    {
        raycastManager = GetComponent<ARRaycastManager>();

        // Spawn the instance and hide it until a surface is hit.
        prefabInstance = Instantiate(prefab);
        prefabInstance.SetActive(false);
    }

    /// <summary>
    /// Unity method called every frame.
    /// </summary>
    private void Update()
    {
        // List of the hit points in the real world.
        var hitList = new List<ARRaycastHit>();

        // Raycast from the center of the screen, which should hit only detected surfaces.
        if (raycastManager.Raycast(new Vector2(Screen.width / 2f, Screen.height / 2f), hitList, TrackableType.PlaneWithinBounds | TrackableType.PlaneWithinPolygon))
        {
            // If the instance is inactive, enable it.
            if (!prefabInstance.activeInHierarchy)
            {
                prefabInstance.SetActive(true);
            }

            // Sort the hit list based on distance to the camera.
            hitList = hitList.OrderBy(h => h.distance).ToList();
            var hitPoint = hitList[0];

            // Position the instance at the closest hit point.
            prefabInstance.transform.position = hitPoint.pose.position;
            prefabInstance.transform.up = hitPoint.pose.up;
        }
        else
        {
            // If the instance is active, disable it.
            if (prefabInstance.activeInHierarchy)
            {
                prefabInstance.SetActive(false);
            }
        }
    }
}
You can attach it to the AR Session Origin. It should automatically add the AR Raycast Manager component as well.

After that, you can create a prefab of your choice that will be displayed on the detected surface.

AR Place Object in the Inspector.

Build settings

As in the previous AR posts, you have to change the build settings depending on the platform you are building for.

If you are building for Android, you have to set the Minimum API Level to 24.

Android build settings.

And if you are building for iOS, you have to target at least version 11.0.

iOS build settings.

The result

As a result of our work, we should be able to see two things: the detected surfaces and the virtual object we added.

Cube in the real world without the marker!

You can continue to build on this project by adding more objects or placing them elsewhere in the world.

If you find this post useful, or you are building something based on it, let me know in the comment section below!

You can also share it with your friends! I would really appreciate that! ❤️

And if you are interested in getting an email when I release a new post, sign up for the newsletter!

You can also check out the whole project in my public repository.

See you next time!
