Apple ARKit with Xamarin.iOS – part 1

Short introduction

During WWDC 2017 Apple announced a new framework for Augmented Reality called ARKit, which enables developers to create AR apps with an easy-to-use API.

ARKit is also available to Xamarin developers! That is why I decided to create a blog post series about how to start creating augmented reality apps with Xamarin.iOS and ARKit. In this article you will find an introduction and a description of the most important components used in the demo application I prepared.


What do I need to start?

  1. Xcode 9 and iOS 11 or later
  2. A Mac computer
  3. Visual Studio for Mac
  4. An iPhone 6s or later


Let’s start

Before we start, I would like to describe the ARKit framework components that enable apps to support Augmented Reality.


ARKit introduction


ARKit is a framework for creating Augmented Reality apps for iOS. Below I would like to introduce some fundamentals before you start creating your own application.

ARSession – ARKit is session-based. This means that before you start playing with AR in your application you have to set up an ARSession. This is a shared object that manages the camera, the motion sensors, and the AR objects displayed in the current scene.

Each session has to be configured using a proper configuration class:

AROrientationTrackingConfiguration – a configuration for less capable devices. It uses the rear-facing camera and tracks only the device's orientation. Apple does not recommend this configuration because it limits the AR experience.

ARWorldTrackingConfiguration – a configuration for fully capable devices. It uses the rear-facing camera, tracks the device's orientation and position, and detects real-world flat surfaces.
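Which configuration a device supports can be checked at runtime. A minimal sketch, assuming the Xamarin.iOS binding exposes the IsSupported class property on each configuration subclass (as the native ARKit API does):

```csharp
// Pick the best configuration the current device supports.
// IsSupported is a class property exposed on each ARConfiguration subclass:
ARConfiguration configuration;
if (ARWorldTrackingConfiguration.IsSupported)
{
    // Full position-and-orientation tracking plus plane detection:
    configuration = new ARWorldTrackingConfiguration();
}
else
{
    // Orientation-only fallback for less capable devices:
    configuration = new AROrientationTrackingConfiguration();
}
```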

ARSessionDelegate – a protocol that provides methods to track the session state, for instance whether the camera is still tracking displayed objects, or to collect errors generated by the session.

ARFrame – an AR session continuously captures video frames from the device camera. Each such frame is called an ARFrame. It contains the captured image, detailed tracking data, and scene information such as the current tracking points and lighting conditions.
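The latest frame can be read from the session at any time. A short sketch, assuming a configured ARSCNView called sceneView; the property names follow the Xamarin bindings, so treat this as illustrative only:

```csharp
// Read the session's latest frame (it may be null before tracking starts).
// ARFrame wraps a native object, so dispose it when done:
using (var frame = sceneView.Session.CurrentFrame)
{
    if (frame != null)
    {
        // Estimated ambient light intensity for the captured image:
        var intensity = frame.LightEstimate?.AmbientIntensity;

        // Raw feature points the session is currently tracking:
        var featurePoints = frame.RawFeaturePoints?.Points;

        Console.WriteLine("Light: {0}, feature points: {1}",
            intensity, featurePoints?.Length);
    }
}
```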


ARKit is integrated with SceneKit, a high-level framework that enables developers to add 3D content to their apps and even create games using the provided engine.

Below you can find some common features:

ARSCNView – a view that displays Augmented Reality content combined with the live camera feed. It is based on SCNView.

ARSCNViewDelegate – a protocol that provides methods to integrate SceneKit content with an ARKit session.
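For example, when plane detection is enabled, the delegate is notified as SceneKit nodes are added for newly detected anchors. A minimal sketch (the class name is mine; the override follows the Xamarin binding of renderer:didAddNode:forAnchor:):

```csharp
public class SceneViewDelegate : ARSCNViewDelegate
{
    // Called when a SceneKit node is added for a newly detected anchor,
    // e.g. a horizontal plane found by plane detection:
    public override void DidAddNode(ISCNSceneRenderer renderer, SCNNode node, ARAnchor anchor)
    {
        Console.WriteLine("Node added for anchor: {0}", anchor.Identifier);
    }
}
```

An instance would be assigned to sceneView.Delegate before the session is run.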


That is enough for now. In this first blog post I want to cover only the basics and the most important information to make it easier to start with ARKit.


Create Xamarin.iOS application with ARKit


1. First of all we need to create a new Xamarin.iOS application. Open Visual Studio and select the blank Xamarin.iOS application project template:

2. Open the “Main.storyboard” file. Click the ViewController’s view and remove it. Now search for “ARKit Scene View” and place it on the ViewController. Please note the structure in the “Document Outline” window:

3. Click on the Scene View control and, in the “Properties” window, type “ARSCNView” in the “Class” field:

4. Now create a new folder in the solution root called “ArKitRelated”. In this folder we will keep the ARKit-related code:

5. Add a class called “ArSceneGenerator” to the folder created above. Below is the code with comments to make it easier to understand what is happening:

    public class ArSceneGenerator
    {
        // Method responsible for creating a new AR scene:
        public void GenerateArScene(ARSCNView sceneView)
        {
            // First we need to create a scene with the object loaded from a local file:
            var scene = SCNScene.FromFile("art.scnassets/ship");

            // Then we need to assign this scene to the SceneView from our ViewController:
            sceneView.Scene = scene;

            // The developer can also set debug options,
            // for instance to display the XYZ axes or the feature points detected in front of the camera:

            //sceneView.DebugOptions = ARSCNDebugOptions.ShowWorldOrigin
            //    | ARSCNDebugOptions.ShowFeaturePoints;

            // Here we are assigning a delegate to the session to get information about the current session state:
            sceneView.Session.Delegate = new ArSessionDelegate();
        }

        // Once we have created the scene we need to position our AR model in it:
        public void PositionSceneObject(ARSCNView sceneView)
        {
            // Each session has to be configured.
            // We will use ARWorldTrackingConfiguration to have full access to device orientation,
            // the rear camera, device position, and detection of real-world flat surfaces:
            var configuration = new ARWorldTrackingConfiguration
            {
                PlaneDetection = ARPlaneDetection.Horizontal,
                LightEstimationEnabled = true
            };

            // Once we have our configuration we need to run the session with it.
            // ResetTracking restarts the session's tracking from scratch:
            sceneView.Session.Run(configuration, ARSessionRunOptions.ResetTracking);

            // Next we need to find the main "node" in the .dae file. In this case it is called "ship"
            // (the node is already part of the scene loaded from the file above):
            var shipNode = sceneView.Scene.RootNode.FindChildNode("ship", true);

            // Then we have to set the position of the AR object - below it is placed in front of the camera:
            shipNode.Position = new SCNVector3(0.0f, 0.0f, -20f);

            // At the end we configure a simple repeating animation to rotate the ship object in front of the camera:
            shipNode.RunAction(SCNAction.RepeatActionForever(SCNAction.RotateBy(0f, 4f, 0f, 5)));
        }
    }

6. Now create a folder called “Delegates” inside the “ArKitRelated” folder:

7. Inside that folder add a class called “ArSessionDelegate”. The code with a description is pasted below:

    // This delegate provides information about the current AR session state.
    // If the object is displayed properly in front of the camera, the state changes to Normal.
    // Sometimes it is not possible to display the AR object properly and then the state changes to Limited,
    // with a reason such as InsufficientFeatures.
    // You can read more about the ARTrackingState enum here:
    public class ArSessionDelegate : ARSessionDelegate
    {
        public ArSessionDelegate()
        {
        }

        public override void CameraDidChangeTrackingState(ARSession session, ARCamera camera)
        {
            var state = string.Empty;
            var reason = string.Empty;

            switch (camera.TrackingState)
            {
                case ARTrackingState.NotAvailable:
                    state = "Tracking Not Available";
                    break;
                case ARTrackingState.Normal:
                    state = "Tracking Normal";
                    break;
                case ARTrackingState.Limited:
                    state = "Tracking Limited";
                    switch (camera.TrackingStateReason)
                    {
                        case ARTrackingStateReason.ExcessiveMotion:
                            reason = "because of excessive motion";
                            break;
                        case ARTrackingStateReason.Initializing:
                            reason = "because tracking is initializing";
                            break;
                        case ARTrackingStateReason.InsufficientFeatures:
                            reason = "because of insufficient features in the environment";
                            break;
                        case ARTrackingStateReason.None:
                            reason = "because of an unknown reason";
                            break;
                    }
                    break;
            }

            Console.WriteLine("{0} {1}", state, reason);
        }
    }

Great. Now we need to add some graphics files (AR objects). Create an “art.scnassets” folder in the solution root and add two files to it (you can download them here):

The “ship.dae” file contains the 3D model to display in front of the camera. The “texture.png” file contains the textures for our object (you can think of them as stickers).

8. Open the “ViewController.cs” file and replace the class code with the code below (description included in the comments):

    public partial class ViewController : UIViewController
    {
        private ArSceneGenerator _aRSceneGenerator;

        public ARSCNView SceneView
        {
            get { return View as ARSCNView; }
        }

        protected ViewController(IntPtr handle) : base(handle)
        {
            // Note: this .ctor should not contain any initialization logic.
        }

        public override void ViewDidLoad()
        {
            base.ViewDidLoad();

            // Here we create a new ArSceneGenerator object:
            _aRSceneGenerator = new ArSceneGenerator();
            // Then we need to set up the new scene:
            _aRSceneGenerator.GenerateArScene(SceneView);
        }

        public override void ViewWillAppear(bool animated)
        {
            base.ViewWillAppear(animated);

            // Before the view is displayed we want to place the AR object in the current scene:
            _aRSceneGenerator.PositionSceneObject(SceneView);
        }

        public override void ViewWillDisappear(bool animated)
        {
            base.ViewWillDisappear(animated);

            // Once the view is not visible to the user, pause the current AR session:
            SceneView.Session.Pause();
        }

        public override void DidReceiveMemoryWarning()
        {
            base.DidReceiveMemoryWarning();
        }

        public override bool ShouldAutorotate()
        {
            return true;
        }

        public override UIInterfaceOrientationMask GetSupportedInterfaceOrientations()
        {
            return UIInterfaceOrientationMask.All;
        }
    }


Sum up

Launch the application on a real device and see the result!

I hope this post helps you start playing with ARKit and Xamarin. In the next post I will present a more complex sample and more features.

The whole sample is available on my GitHub.