WWDC has wrapped up, and now it is up to us in the world of iOS to make sense of all these newly unveiled features. Being a developer who often works in the enterprise realm, I tend to look at many of these features from a slightly different angle. When people hear about ARKit adding augmented reality to iOS, they probably think about the implications for games like Pokémon GO, or interior decorating apps that let you see a fake end table superimposed next to your couch. My first thoughts are about its possible uses in retail and warehousing, helping to highlight objects of interest or present additional information to the user. In this post, I'll go over some of the new features and SDKs in iOS 11 and how they relate to the enterprise and working world.
ARKit – Previously, developers had to do all of this by hand, and truth be told, it isn't easy. Superimposing graphics onto a flat image isn't very difficult, but superimposing them into a moving, 3-dimensional space with depth is extremely difficult. ARKit does the heavy lifting for developers by locating features and planes, and it lets you create 2D or 3D objects via SceneKit or SpriteKit to superimpose onto those features and planes. As stated earlier, AR can be useful for locating and highlighting items, but when combined with other technologies it can be so much more. For example, when paired with iOS 11's machine learning and Vision frameworks, a mechanic might point their device at an unknown engine part, have the app detect which part it is, and then superimpose the documentation for that part on half of the screen. Obviously, this technology has some limitations on a phone or tablet, but as smart glasses become more mature, they will be a fantastic candidate for augmented reality in the enterprise.
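To make the "locating features and planes" part concrete, here is a minimal sketch of how ARKit hands detected planes to your code so you can superimpose SceneKit content on them. The class and node setup are illustrative; this assumes an `ARSCNView` already placed in your view hierarchy and an ARKit-capable device.

```swift
import ARKit
import SceneKit

// Illustrative sketch: run a world-tracking session with horizontal plane
// detection, and highlight each detected plane with a translucent overlay.
class PlaneHighlighter: NSObject, ARSCNViewDelegate {

    func startSession(on sceneView: ARSCNView) {
        sceneView.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    // ARKit calls this when it adds an anchor (here, a detected plane).
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

        // Build a translucent SceneKit plane matching the detected extent,
        // so the user can see what ARKit has located.
        let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                             height: CGFloat(planeAnchor.extent.z))
        plane.firstMaterial?.diffuse.contents =
            UIColor.yellow.withAlphaComponent(0.3)

        let planeNode = SCNNode(geometry: plane)
        planeNode.position = SCNVector3(planeAnchor.center.x, 0,
                                        planeAnchor.center.z)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane is vertical by default
        node.addChildNode(planeNode)
    }
}
```

From here, replacing the flat highlight with any 2D or 3D object is just a matter of attaching different SceneKit geometry to the anchor's node.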
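The mechanic scenario hinges on the Vision framework driving a Core ML model. A hedged sketch of that step is below; `EnginePartClassifier` is a hypothetical model name standing in for any image-classification `.mlmodel` compiled into the app.

```swift
import Vision
import CoreML
import UIKit

// Illustrative sketch: classify a captured image with a Core ML model via
// Vision, returning the top label. "EnginePartClassifier" is hypothetical.
func identifyPart(in image: CGImage,
                  completion: @escaping (String?) -> Void) {
    guard let coreMLModel = try? VNCoreMLModel(for: EnginePartClassifier().model) else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: coreMLModel) { request, _ in
        // Each classification observation carries a label and a confidence;
        // results arrive sorted, so the first is the best guess.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion(best?.identifier)
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

Once the label comes back, the app could look up that part's documentation and render it alongside the ARKit view, which is the pairing described above.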