
9 Jun 2017

A Breakdown of What Apple’s New ARKit Can Do for iPhones & iPads

Using only the RGB camera and motion sensors built into iPhones, Apple has added world tracking and virtual object anchoring through a technique called visual-inertial odometry, which fuses data from the device's motion-sensing hardware with computer vision from the device's camera. The results are comparable to what some dedicated depth sensors deliver, and considering the lack of additional hardware, the tracking is really impressive.
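
For a sense of how little setup this requires, here is a minimal Swift sketch of starting a world-tracking session, assuming an ARSCNView named sceneView created elsewhere in the app (ARWorldTrackingConfiguration is the class name as it shipped in iOS 11):

    import ARKit

    // Sketch: start world tracking on an ARSCNView. The session fuses camera
    // frames with motion-sensor data (visual-inertial odometry) automatically.
    func startWorldTracking(in sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        sceneView.session.run(configuration)
    }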

In addition, through the use of hit-testing, world tracking can understand the scene it sees, at least on a basic level. Using plane detection, the iPhone can find flat surfaces in the camera view, which allows the user to place a virtual object on a table, for example.
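
As an illustrative sketch (function names like placeObject are assumptions, not Apple's sample code), enabling plane detection and hit-testing in Swift might look like this:

    import ARKit

    // Sketch assuming an ARSCNView named "sceneView" that is already on screen.
    func startPlaneDetection(in sceneView: ARSCNView) {
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal  // find flat, horizontal surfaces
        sceneView.session.run(configuration)
    }

    // Hit-test a screen point (e.g. from a tap) against detected planes and
    // drop an anchor where the ray meets the surface.
    func placeObject(at screenPoint: CGPoint, in sceneView: ARSCNView) {
        let results = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent)
        guard let hit = results.first else { return }
        // An ARSCNViewDelegate can attach geometry to this anchor when the
        // renderer calls back with the new node.
        let anchor = ARAnchor(transform: hit.worldTransform)
        sceneView.session.add(anchor: anchor)
    }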

ARKit also performs ambient light estimation, which allows an application to match its virtual lighting more accurately to the real-world lighting in the scene, providing a more seamless and immersive AR experience for the user.
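
A minimal sketch of how an app might apply that estimate, assuming a SceneKit light named ambientLight driving the virtual scene (updateLighting is a hypothetical helper, not an ARKit API):

    import ARKit
    import SceneKit

    // Read the per-frame light estimate and apply it to a SceneKit light so
    // virtual objects pick up the real-world illumination.
    func updateLighting(from session: ARSession, ambientLight: SCNLight) {
        guard let estimate = session.currentFrame?.lightEstimate else { return }
        // ambientIntensity is in lumens; 1000 corresponds to neutral lighting.
        ambientLight.intensity = estimate.ambientIntensity
        // Estimated color temperature in Kelvin (6500 is pure white).
        ambientLight.temperature = estimate.ambientColorTemperature
    }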

Along with all of these additions to its SDK, Apple has added Unity and Unreal Engine support. Unity has a Bitbucket repository up with a plugin that exposes the added ARKit functionality to Unity applications for iOS.

So while the revelations presented at WWDC may not be the AR wearable some of us were hoping for, and a few expected, Apple has in a single move created the world's largest AR platform.

See the full post here: https://next.reality.news/news/dev-report-breakdown-what-apples-new-arkit-can-do-for-iphones-ipads-0177958/
