First Steps
No one can deny that WWDC 2017 was a smash hit! Apple introduced tons of new features and frameworks, causing developers everywhere to rejoice. One of our favorites is ARKit, and the community agrees: everywhere you look, people are talking about ARKit. That being said, let's jump right in and whip up some of our own augmented reality!
What is Augmented Reality?
Augmented Reality (AR) is not a new concept; it's been around for a while. The basic idea is to fuse virtual reality with reality reality (that's a thing, right?). Here's how it works: AR lets us view a real-world space through a camera, then introduces virtual objects into that space, giving us the impression that those virtual objects actually exist within that same space. The resulting effect is Augmented Reality.
Traditionally, AR uses tracking cards as anchor points within the real-world space. You point the camera at the tracking card, and once recognized, the tracking card serves as an anchor point, which in turn provides transform information about the real-world space. A 3D object can then be placed into the real-world by making use of this transform information. As the camera moves around the tracking card, the transform information gets updated, which updates the 3D object accordingly. This creates the impression that the 3D object is real. Everything is good… as long as you have that tracking card in view of the camera.
Games like Om Nom Candy Flick make use of tracking cards.
With the introduction of Microsoft's HoloLens, things got way cooler. The HoloLens is spatially aware, which means it can identify real-world surfaces and use those surfaces to track movement, so a tracking card is no longer required. The HoloLens is a fantastic specialized device, but it comes at a price. And it's not a small price!
At WWDC 2017, Apple announced ARKit and showed us some impressive demos. The first thing we noticed was how ARKit has the ability to turn your iPhone or iPad into a HoloLens! That's right! ARKit is also spatially aware, and it doesn't need tracking cards to track movement.
Introducing ARKit
The ARKit framework is a mobile augmented reality platform that you can use to develop your own AR experiences. It's a high-level API that provides a simple interface with powerful features. ARKit is supported on devices with an A9 chip or newer.
ARKit provides the following capabilities:
- Tracking: ARKit tracks your device's position in real-world space in real time. It uses visual-inertial odometry, which combines camera tracking with motion sensor data. As an added bonus, no tracking cards are required!
- Scene Understanding: ARKit is spatially aware and has the ability to identify surfaces in the real-world environment, such as floors, tables, and walls. This is also referred to as plane detection.
- Rendering: ARKit provides easy integration into SpriteKit, SceneKit, and Metal. As an added bonus, there's even support for it in Unity and Unreal Engine. Yeah baby! Now we're talking.
Basic ARKit Concepts
Now that you know what ARKit is all about, let's take a closer look at a few fundamental concepts you'll need in order to understand and make use of it.
- Session: ARKit is a session-based framework. To start a new ARSession, you provide it with an ARSessionConfiguration and then run the session. The session runs at 60 frames per second, gathering information from the camera and the motion sensors. You can either poll for information or make use of the available session update delegates.
- Configuration: ARKit provides tracking based on device capabilities. ARSessionConfiguration is used for less capable devices, and it only tracks device orientation. ARWorldTrackingSessionConfiguration is used for fully capable devices, and it tracks device orientation, relative position, and scene information. Additional AR session features can be enabled or disabled through these configuration classes.
- Frames: As the AR session runs, you can access the current ARFrame for information. It contains the captured image, detailed tracking data, and scene information, including the current tracking points and lighting conditions.
- Feature Points: A list of real-world tracking points that ARKit gathers over time, corresponding to trackable features within the real world. These points provide depth information and give ARKit its spatial-awareness capabilities.
- Anchors: An anchor is a position and orientation in real-world space. ARKit maintains that position and orientation for you as you move the camera around. You add anchors to an AR session.
- Planes: A plane is a detected surface, like a floor or a table surface. When plane detection is enabled, ARKit detects, maintains, and tracks available surfaces in the real-world space. You can then use these surfaces to place your content, which will appear on top of the surface.
- Lighting: ARKit also tracks the current lighting conditions, which can be used to light virtual content in a more believable fashion.
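To make these concepts concrete, here's a minimal configuration sketch that enables plane detection and light estimation before running a session. Class names follow the WWDC 2017 betas this tutorial is based on; later ARKit releases renamed ARWorldTrackingSessionConfiguration to ARWorldTrackingConfiguration.

```swift
import ARKit

// A minimal world-tracking setup sketch (WWDC 2017 beta class names)
let configuration = ARWorldTrackingSessionConfiguration()

// Ask ARKit to detect horizontal surfaces (floors, tables)
configuration.planeDetection = .horizontal

// Track lighting conditions so virtual content can match the scene
configuration.isLightEstimationEnabled = true

// Running the configuration on an ARSession starts tracking
let session = ARSession()
session.run(configuration)
```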
Using ARKit
Apple made it simple to work with ARKit. Let's take a closer look at the steps required to get ARKit up and running:
- STEP 1: Configuration
First, create an ARSessionConfiguration and set its properties. Also, make sure you choose a configuration based on the device's capabilities (not all devices support ARWorldTrackingSessionConfiguration):

```swift
if ARWorldTrackingSessionConfiguration.isSupported {
    configuration = ARWorldTrackingSessionConfiguration()
} else {
    configuration = ARSessionConfiguration()
}
```
- STEP 2: Create and Start the AR Session
Now that you have the configuration all sorted out, you need to create and start the ARSession. This will start the tracking process:

```swift
// Create an AR session
let arSession = ARSession()

// Nominate self for session delegation
arSession.delegate = self

// Run the session
arSession.run(configuration)
```
- STEP 3: Managing an AR Session
Once the session is up and running, you can manage it as follows:

```swift
// PAUSE: suspends all AR processing
session.pause()

// RESUME: resumes the AR session and continues processing
session.run(session.configuration)

// CHANGE: changes or updates the configuration, should you want to do so
session.run(newConfiguration)

// RESET: restarts the session tracking process from scratch
session.run(configuration, options: .resetTracking)
```
- STEP 4: Handle Session Updates
While the AR session is running, you can make use of the ARSessionDelegate protocol to react to various events:
- Frame Updates: func session(_: ARSession, didUpdate: ARFrame) provides a detailed frame-by-frame update. The ARFrame contains a captured image, tracking and scene information, feature points, and lighting conditions.
- Anchor Added: func session(_: ARSession, didAdd: [ARAnchor]) gets called when an anchor is added to the AR session. Anchors can be added by calling session.add(anchor: newAnchor).
- Anchor Updated: func session(_: ARSession, didUpdate: [ARAnchor]) gets called when an anchor requires an update.
- Session Tracking State Change: func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) provides the current state of the AR session. While the session is still being established, it will be in a Not Available state. When there are insufficient features available to track, it will be in a Limited state. Finally, it will be in a Normal state when the session is actively running without any issues.
- Session Failure: func session(_: ARSession, didFailWithError: Error) reports session errors that may occur, allowing you to present the user with useful information. Or, if you'd like, you can keep them guessing! Come on, you know that's fun to do!
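As a minimal sketch, the delegate methods above might be adopted like this; the print statements are placeholders for your own handling:

```swift
import ARKit

class SessionObserver: NSObject, ARSessionDelegate {
    // Called every frame, roughly 60 times per second
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // Inspect the captured image, feature points, or light estimate here
    }

    // Called when anchors are added via session.add(anchor:)
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        print("Added \(anchors.count) anchor(s)")
    }

    // Called when the tracking quality changes
    func session(_ session: ARSession, cameraDidChangeTrackingState camera: ARCamera) {
        print("Tracking state is now \(camera.trackingState)")
    }

    // Called when the session fails; surface this to the user
    func session(_ session: ARSession, didFailWithError error: Error) {
        print("AR session failed: \(error.localizedDescription)")
    }
}
```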
SceneKit Integration
ARKit seamlessly integrates with SceneKit. The best part: Apple made it so simple that even a cat can use it!
There are a few things you need to be aware of when using ARKit along with SceneKit. Just don't share these with the cat. We need to hold them off as long as possible!
- ARSCNView: Traditionally, for SceneKit you had to use an SCNView. There's now a new view available for you to use: ARSCNView. It includes everything you've been using in SceneKit, but combined with the ARKit features. It contains an ARSession and an SCNView. All you need to do is provide it with a configuration and kick off the ARSession.
- ARSCNViewDelegate: This protocol provides all the basic session delegates from ARSessionDelegate, along with a few delegates related to the SceneKit renderer. Let's take a closer look at some of the available delegates:
- Node Added: func renderer(_: SCNSceneRenderer, didAdd: SCNNode, for: ARAnchor). Take special note that every anchor added to the current AR session will not only create an ARAnchor, but will also automagically create an equivalent SCNNode within the SceneKit scene. This is all managed by ARKit internally. Simply add your new content to this new node... and you're done. Bazinga!
- Node Updated: func renderer(_: SCNSceneRenderer, didUpdate: SCNNode, for: ARAnchor) gets called when the SCNNode's properties were updated due to an update on the related ARAnchor.
- Node Removed: func renderer(_: SCNSceneRenderer, didRemove: SCNNode, for: ARAnchor) gets called when the SCNNode was removed due to removal of the related ARAnchor.
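Putting the pieces together, a minimal ARSCNView setup might look like this sketch. It assumes the sceneView outlet is wired up in a storyboard, and again uses the WWDC 2017 beta configuration class name:

```swift
import ARKit
import SceneKit

class ARViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!   // assumed to be connected in a storyboard

    override func viewDidLoad() {
        super.viewDidLoad()
        // ARSCNView wraps both an SCNView and an ARSession
        sceneView.delegate = self
        sceneView.scene = SCNScene()
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // Kick off the ARSession with a configuration
        let configuration = ARWorldTrackingSessionConfiguration()
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Suspend AR processing while the view is off screen
        sceneView.session.pause()
    }

    // Every new ARAnchor gets a matching SCNNode; attach your content here
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        // Add your own child nodes to `node`
    }
}
```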
With all that finally out of the way, it's safe to say: you now know ARKit!
ARKit Demo Project
To end things off, here’s a fun little demo project you can mess around with. You’ll find the project under the resources/projects/ARDragon folder.
Run the project on your iPhone, point the camera at a table, and then sway it around a bit so the AR tracking kicks in. Then, tap anywhere on the screen.
Press play to find out what happens:
Whoa! There’s a friggin’ FIRE BREATHING DRAGON on my desk! Be careful and try not to get burned!
When you’re looking through the project, take a look at these things:
- Hit-Testing Feature Points
When you tap anywhere in the scene, a hit-test is performed to find the closest feature point. Once found, that feature point's location in space is used to create a new anchor.
The code looks a little something like this:

```swift
// Get the first touch location on screen
if let touchLocation = touches.first?.location(in: sceneView) {
    // Perform a hit-test against all feature points
    if let hit = sceneView.hitTest(touchLocation, types: .featurePoint).first {
        // Add a new anchor to the AR session
        sceneView.session.add(anchor: ARAnchor(transform: hit.worldTransform))
    }
}
```
- New Anchors Create New Nodes
Remember, when you add a new anchor to the AR session, an equivalent SCNNode will be created. So all that needs to be done is to handle the delegate and add your own content to that SCNNode. The following code clones a dragon:

```swift
// Delegate called when an SCNNode is added for a new anchor
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    // Clone a new dragon
    let dragonClone = dragonNode.clone()
    dragonClone.position = SCNVector3Zero

    // Add the dragon as a child of the node
    node.addChildNode(dragonClone)
}
```
- Following the Camera
Did you notice the dragon always follows you around the room as you move? That's accomplished by adding a LookAt constraint to the SCNNode, using the camera's point of view as a target. Here's how you do that:

```swift
// Create a LookAt constraint pointed at the camera's POV
let constraint = SCNLookAtConstraint(target: sceneView.pointOfView)

// Keep the rotation on the horizon
constraint.isGimbalLockEnabled = true

// Slow the constraint down a bit
constraint.influenceFactor = 0.01

// Finally, add the constraint to the node
node.constraints = [constraint]
```
- Light Estimation
Did you notice that when the light is switched off, the dragon also dims to blend into the darker scene? There's a session configuration property available to control this: isLightEstimationEnabled. You don't have to enable it, because it's enabled by default. Just make sure you use PBR-based shading, and the rest will automagically work!
- Hit-Testing Nodes
When you tap on top of a dragon in the scene, a hit-test is performed to see if you tapped on an object. If so, that object is removed from the scene. See you later alligator... er, uh, dragon.
To do that, you can use the following bit of code:

```swift
// Get the first touch location on screen
if let touchLocation = touches.first?.location(in: sceneView) {
    // Perform a hit-test against all objects in the scene
    if let hit = sceneView.hitTest(touchLocation, options: nil).first {
        // Remove the node
        hit.node.removeFromParentNode()
    }
}
```
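As a small sketch of the light estimation covered earlier: each ARFrame carries a light estimate that you can apply to a light in your scene manually. The sceneView and lightNode properties here are assumptions, not part of the demo project:

```swift
// Read the current frame's light estimate and apply it to a scene light.
// `sceneView` is an ARSCNView and `lightNode` holds an SCNLight; both are assumed.
func updateLighting() {
    guard let lightEstimate = sceneView.session.currentFrame?.lightEstimate else {
        return
    }
    // ambientIntensity is in lumens; around 1000 corresponds to neutral lighting
    lightNode.light?.intensity = lightEstimate.ambientIntensity
}
```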
Next Steps
Well, folks, that brings us to the end of this tutorial. We hope you enjoyed it, and we can't wait to see what you create with ARKit and SceneKit. When you're done, share your creations with us here at DayOfTheIndie.com.