As software developers, we are constantly introduced to new features and frameworks. Everything evolves, and we have to keep learning. As an iOS developer, just when you feel you have played enough with, and gained some knowledge of, all the commonly used Apple frameworks, new ones appear. That is what happened at WWDC 2017, when Apple introduced ARKit (Augmented Reality Kit).
What is ARKit?
According to Apple's developer page, “ARKit blends digital objects and information with the environment around you, taking apps far beyond the screen and freeing them to interact with the real world in entirely new ways.” This sounds very interesting, because until now we were truly confined to our devices’ screens.
This framework has gained a lot of attention, and you can now find many tutorials on how to use it. I was interested in it as well, and I will share my experience getting to know this new framework.
Official documentation can be found here: https://developer.apple.com/documentation/arkit/
Where can I use ARKit?
First of all, ARKit is only available on iPhone 6s, SE, 7, 8 and X (including the Plus models), iPad Pro 9.7, 10.5 and 12.9, and iPad (2017). The device needs to be running at least iOS 11.0, and Xcode 9 is mandatory.
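Since not every device in the wild supports ARKit, it is worth checking for support at runtime before presenting any AR UI. A minimal sketch (the fallback behavior is up to your app):

```swift
import ARKit

// Check whether this device supports world tracking
// before showing any AR interface.
if ARWorldTrackingConfiguration.isSupported {
    // Safe to start an AR session.
} else {
    // Fall back to a non-AR experience on unsupported devices.
}
```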
Xcode 9 has a new ‘Augmented Reality App’ template that includes a single view controller which sets up an ARSCNView and presents a scene with a 3D spaceship model placed in front of you.
As a true believer in learning by doing, I wanted to build an app to get to know ARKit. So I built a simple shooter game that displays an area on a wall and spawns moving objects on it.
The goal of this post is to cover the ARKit part, because that is what’s new. So, let’s start.
How I built an app with ARKit
Instead of using the “Augmented Reality App” template we’ll use the “Single View App” template, because we want to start from scratch. Therefore, the first thing we need to do is delete the view located in the view controller scene and drop an ARKit SceneKit View in its place.
Then, in ViewController.swift, replace the UIKit import with ARKit.
import ARKit
And create an IBOutlet for our ARSCNView in the storyboard.
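Assuming the outlet is named sceneView (the name used throughout this post), the controller’s property would look like this:

```swift
import ARKit
import UIKit

class ViewController: UIViewController {
    // Connected to the ARKit SceneKit View in the storyboard.
    @IBOutlet var sceneView: ARSCNView!
}
```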
Then override the viewDidLoad() function and set the sceneView’s delegate. Optionally, you can set showsStatistics to true; this will display statistics such as fps and timing information.
override func viewDidLoad() {
    super.viewDidLoad()
    // Set the view's delegate
    sceneView.delegate = self
    // Show statistics such as fps and timing information
    sceneView.showsStatistics = true
    // Visualize detected feature points and the world origin
    sceneView.debugOptions = [
        ARSCNDebugOptions.showFeaturePoints,
        ARSCNDebugOptions.showWorldOrigin
    ]
}
Next, we need to make the ViewController class conform to the ARSCNViewDelegate protocol, so create an extension that adopts it. Then add the function that gets called when the ARSession fails.
extension ViewController: ARSCNViewDelegate {
    func session(_ session: ARSession, didFailWithError error: Error) {
        // Present an error message to the user
    }
}
Now we are ready to create an ARSession configuration and start our session.
override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    // Create an ARSession configuration
    let configuration = ARWorldTrackingConfiguration()
    // Detect vertical planes (requires iOS 11.3 / ARKit 1.5)
    configuration.planeDetection = [.vertical]
    // Run the view's session
    sceneView.session.run(configuration)
}
With this configuration we specify whether and how the session should automatically detect flat surfaces in the camera-captured image. In earlier ARKit releases (before ARKit 1.5) the only available option was horizontal, but since iOS 11.3 and ARKit 1.5 it is possible to detect vertical planes, too.
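Because vertical detection only exists from iOS 11.3 onward, an app with a lower deployment target needs an availability check. A hedged sketch:

```swift
import ARKit

let configuration = ARWorldTrackingConfiguration()
if #available(iOS 11.3, *) {
    // ARKit 1.5 (iOS 11.3+) can detect both orientations.
    configuration.planeDetection = [.horizontal, .vertical]
} else {
    // On iOS 11.0–11.2 only horizontal detection is available.
    configuration.planeDetection = .horizontal
}
```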
We also want to pause the session when the view disappears.
override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    sceneView.session.pause()
}
Then we need to provide a “Privacy – Camera Usage Description” message in Info.plist, and add the armv7 and arkit values under “Required device capabilities”.
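In the raw Info.plist XML, these entries look roughly like this (the usage string is just an example; write your own):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera for augmented reality.</string>
<key>UIRequiredDeviceCapabilities</key>
<array>
    <string>armv7</string>
    <string>arkit</string>
</array>
```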
Now we are successfully running an ARSession in our app! It sets up a world origin and, thanks to our debug options, draws dots at the feature points ARKit uses to detect surfaces and place ARAnchors.
The rest is just the execution of the game – we can create a 2D sprite-based game using Apple’s SpriteKit or a 3D game using SceneKit. The only difference from an ordinary 2D or 3D game is that we present the game scene in an ARSKView (for SpriteKit) or, as in our case, an ARSCNView (for SceneKit).
For example, the ARSCNViewDelegate protocol has a function
renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor)
that is called each time a node is added for a newly detected anchor, such as a detected plane.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let anchor = anchor as? ARPlaneAnchor else {
        print("Added anchor is not a plane anchor")
        return
    }
    /*
     Here we can read the detected surface's position from node.position
     and its size from anchor.extent (the x and z components hold the
     plane's dimensions; y is always 0 for a plane anchor)
     */
}
In our case, we would add a platform for our moving objects like this. Note that a plane anchor’s extent lies in its local x–z plane, so the SCNPlane’s height comes from extent.z, not extent.y.
let platform = SCNNode(geometry: SCNPlane(
    width: CGFloat(anchor.extent.x),
    height: CGFloat(anchor.extent.z)
))
And add it as a child of the node provided for the detected plane.
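Putting it together inside the delegate callback: SCNPlane geometry stands upright in its node’s x–y plane, while the anchor’s extent lies in x–z, so the platform node needs a −90° rotation around the x-axis before being attached. A sketch (the semi-transparent material is just for visualizing the detected surface):

```swift
// Inside renderer(_:didAdd:for:), after the ARPlaneAnchor cast.
let platform = SCNNode(geometry: SCNPlane(
    width: CGFloat(anchor.extent.x),
    height: CGFloat(anchor.extent.z)
))
// Center the platform on the anchor and lay it onto the surface:
// SCNPlane is vertical in local space, the anchor's plane is not.
platform.position = SCNVector3(anchor.center.x, 0, anchor.center.z)
platform.eulerAngles.x = -.pi / 2
// Tint the surface so we can see what was detected.
platform.geometry?.firstMaterial?.diffuse.contents =
    UIColor.cyan.withAlphaComponent(0.3)
node.addChildNode(platform)
```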
And that’s it! This is all we need from ARKit. Graphics rendering, collision detection, animation and other functionality are built into the game frameworks (SpriteKit, SceneKit or any other) – but that is a different tutorial.
Good luck and happy coding your ARKit project!
Conclusion
It is relatively easy to integrate and use ARKit features in our apps. The documentation covers it well, and you can find plenty of blog posts and tutorials. Just be careful with plane detection on textureless surfaces or in darker areas – ARKit relies on visual features in the camera image, so these conditions can cause problems.
We have all been given a good opportunity to make our mobile applications even better. But only time will tell how ARKit will evolve and how useful it becomes. For now, it looks useful mostly for entertainment. Still, the framework is relatively new, so let’s wait and see what functionality will be added and how Apple will improve ARKit!
One cool example of ARKit project
A developer reenacted a famous scene from ‘The Ring’ to bring horror movies to life in AR: https://9to5mac.com/2018/03/13/the-ring-arkit-unity-demo/