At Apple’s last annual developer conference in June, one new tool captured developers’ attention perhaps more than any other product introduced that day. Called ARKit, the iOS 11 tool enables developers to create augmented reality applications that place digital objects in the real world.
With a new set of iPhones announced on Tuesday – iPhone 8, iPhone 8 Plus and iPhone X – Apple now offers the best hardware and software platform for developing AR applications.
Developers are jumping on board and spending a considerable amount of time, money and effort to take full advantage of ARKit. Ikea’s head of digital transformation, Michael Valdsgaard, said 70 employees at the retail giant spent nine and a half weeks toiling day and night on an ARKit app. The result, Ikea Place, allows customers to place digital versions of Ikea furniture in their homes before spending money on something that doesn’t work outside of a store setting.
Ikea started toying with an AR tool in 2013, Valdsgaard told Forbes. But earlier versions required dedicated AR headsets, which lack the ubiquity of iPhones. That severely limited rollout. The experience was also never very solid. ARKit handles the complicated task of measuring the room and its dimensions to accurately place objects.
“We never had the vehicle or platform to make AR really good,” Valdsgaard said. “This is what ARKit does.”
But placing furniture may not be ARKit’s most compelling use. The Machines, a multiplayer strategy game that developer Directive Games demonstrated at the Apple event, places a battlefield in the real world in which players direct combat by moving their iPhones.
Apple’s ARKit does have limitations. It can detect horizontal surfaces like floors and tables, but not vertical surfaces like walls. Valdsgaard predicts that Apple will soon add this function.
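In code, that limitation comes down to a single configuration option. A minimal sketch of an ARKit session that looks only for horizontal surfaces, using Apple’s ARKit framework APIs (the view setup here is illustrative):

```swift
import ARKit

// An ARSCNView ties ARKit's world tracking to a SceneKit scene
// so virtual objects can be rendered over the camera feed.
let sceneView = ARSCNView(frame: .zero)

let configuration = ARWorldTrackingConfiguration()
// At launch, ARKit can detect only horizontal planes (floors, tables);
// there is no vertical option yet for walls.
configuration.planeDetection = .horizontal

// Start tracking; ARKit reports an ARPlaneAnchor for each
// horizontal surface it finds as the user scans the room.
sceneView.session.run(configuration)
```

Apps like Ikea Place build on those detected plane anchors to know where a floor is before anchoring a piece of furniture to it.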
Although ARKit will support iPhones as old as the iPhone 6s, it looks like AR apps will really shine on the latest hardware announced on Tuesday. The iPhone 8 and 8 Plus will contain Apple’s latest chip, the A11 Bionic. It has a six-core central processing unit – two high-performance cores and four high-efficiency cores – Apple’s first custom graphics processing unit, and an image signal processor. The CPU cores will handle world tracking, the ISP will do real-time lighting estimation, and the GPU will generate digital images.
“It’s clear Apple is focused on AR as a platform and is building their phones with the hardware needed to enable great experiences,” said Scott Montgomerie, cofounder and CEO of Scope AR, an enterprise AR developer.
More important for AR hardware will likely be the iPhone X, Apple’s highest-end device. Inside the iPhone X is the A11 chip as well as a processor dedicated to neural network computing. The neural network models are supposed to more accurately track faces and the world around them, which could make for more compelling AR experiences.
Google, Microsoft and Facebook all have competing AR developer platforms. At one point, Google was trying to push its own hardware platform with Project Tango, a depth-sensing camera system it offered to third-party phone makers. Only two phone makers – Asus and Lenovo – have adopted Tango, but the resulting phones have been lackluster. Tango received little developer support.
Now Google is trying a similar approach to Apple’s ARKit with ARCore for Android. But like everything else in the Android ecosystem, Google has a hard time controlling what’s happening on the hardware side. Only the Google Pixel and Samsung Galaxy S8 will support ARCore at launch.
Valdsgaard says he is glad Google decided to move beyond Tango to support AR development, but Ikea is working only with Apple’s AR platform for now.
“We’ve been focusing just on Apple,” Valdsgaard said. “Ikea is for many people. How can we reach many people? We need to focus on the biggest AR platform in the world.”