PLAYING AROUND WITH ARKit
We have been hearing about Augmented Reality for years now, but the truth is that the first truly successful application of this technology happened only recently.
Pokémon Go, released in 2016, has become a global phenomenon, reaching more than 500 million downloads. The game allows players to interact with a virtual gaming world via their phones while moving around in the real one. With their phone’s camera, players can “see” virtual objects around them and complete challenges. The activity of Pokémon Go fanatics became so disruptive in certain places that some governments started to regulate its use.
For developers, working with AR hasn’t been an easy task. It was difficult to build an app from scratch because there was no first-party framework available from Apple or Google. There were third-party frameworks such as Vuforia (http://www.vuforia.com/), but you had to pay to use them commercially. The release of iOS 11 has been a game changer for AR: Apple included the ARKit framework with the new operating system, and Google followed with ARCore for Android 7.0 and later. These specialized frameworks make coding augmented reality apps a much faster process, and they bring not only substantially more opportunities but also a paradigm shift.
Before ARKit and ARCore, displaying a virtual object in AR required a marker (a QR code or an image); now you can scan flat surfaces and place virtual objects wherever you want. For example, the IKEA app uses ARKit to let users place virtual furniture in their home, so they can check, say, whether they prefer a red chair or a black one.
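Getting started with plane detection takes only a few lines. Here is a minimal sketch, assuming a view controller with an ARSCNView outlet named sceneView (both names are mine, not taken from the IKEA app or any other product), that starts a world-tracking session and asks ARKit to look for horizontal surfaces:

import ARKit
import UIKit

// Minimal sketch: start a world-tracking AR session that detects
// horizontal planes (tables, carpets, the floor, and so on).
class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!   // assumed to be wired up in a storyboard

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal   // scan for flat, horizontal surfaces
        sceneView.delegate = self
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        sceneView.session.pause()   // stop tracking when the view goes away
    }
}

Once the session is running, ARKit reports every surface it finds as an ARPlaneAnchor.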
How does this work?
ARKit provides us with a very cool “world tracking” technology, which works using a technique called visual-inertial odometry. Although it is not as powerful as I would like, because it can’t build 3D models of the environment, it lets us scan flat surfaces, such as a table, a carpet, or even the floor, and place virtual objects on those surfaces. What makes this so interesting is that these virtual objects stay pinned to the exact position where you placed them for the rest of the session.
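Continuing the sketch above, and still as an illustration rather than production code, the ARSCNViewDelegate callback below fires each time ARKit detects a new plane. Attaching a virtual object to the node that ARKit creates for the anchor is what keeps it pinned to the same real-world spot:

import ARKit
import SceneKit
import UIKit

// Called by ARKit each time a new anchor (here, a detected plane) is added to the session.
extension ViewController {
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        // Only react to detected planes, not to other kinds of anchors.
        guard anchor is ARPlaneAnchor else { return }

        // A 10 cm red box, just as a stand-in for a real virtual object.
        let box = SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0)
        box.firstMaterial?.diffuse.contents = UIColor.red

        let boxNode = SCNNode(geometry: box)
        boxNode.position = SCNVector3(0, 0.05, 0)   // rest the box on top of the plane
        node.addChildNode(boxNode)                  // node is anchored to the real-world plane
    }
}

Because the box is a child of the anchor’s node, ARKit keeps adjusting its position as tracking improves, so the object appears glued to the table or floor for the duration of the session.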
Both Apple and Google are betting strongly on AR. They have invested heavily in R&D, and they claim that this is the beginning of a new tech revolution. I agree with them, but I think the true revolution will come from mixing AR with AI (Artificial Intelligence) for daily tasks, which I will write about in my next article. If you would like to read more about this now, check out the Vision framework and Core ML, both available in iOS 11.