On Monday, The New York Times published its first augmented reality feature. The article, written by John Branch, includes four AR moments and is a preview of the Winter Olympics in Pyeongchang, South Korea.
Readers can meet world-class Olympic competitors — the figure skater Nathan Chen, the big-air snowboarder Anna Gasser, the short-track speed skater J.R. Celski and the hockey goalie Alex Rigsby — midperformance. Through the phone's camera, the room around them looks just as it does, except the athlete is in it with them.
Another advantage is the mode of interaction we provide. Instead of the abstractions of pinch-to-zoom, swipe or click, we simply ask readers to treat the graphic as a physical object. To see the form from another angle, walk around to that side. To see a detail up close, lean in. News becomes something you can see, literally, from all sides.
For the AR experience, we placed these scans into context — for example, placing Nathan Chen at the 20-inch height off the ground he would be midquad, based on photo reference and sometimes motion capture. In your space, that is truly a distance of 20 inches, because everything is rendered true to scale.
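To make the "true to scale" idea concrete, here is a minimal sketch of the kind of arithmetic involved: AR frameworks typically work in meters, while the photo reference above gives a height in inches, so the model's vertical offset must be converted before placement. The function and parameter names here are hypothetical illustrations, not the Times's actual pipeline.

```python
# Hedged sketch: positioning a scanned model true to scale in an AR scene.
# AR scene units are assumed to be meters; source measurements are in inches.

INCHES_PER_METER = 39.3701

def inches_to_meters(inches: float) -> float:
    """Convert a real-world measurement in inches to scene meters."""
    return inches / INCHES_PER_METER

def place_model(floor_y: float, height_m: float) -> tuple:
    """Return an (x, y, z) scene position: centered on the anchor,
    offset above the detected floor plane by the true height."""
    return (0.0, floor_y + height_m, 0.0)

# Nathan Chen midquad: 20 inches off the ground, per the photo reference.
jump_height_m = inches_to_meters(20)
position = place_model(floor_y=0.0, height_m=jump_height_m)
print(round(position[1], 3))  # ~0.508 meters above the floor
```

Because the conversion preserves real-world distance, the 20-inch gap between the floor and the skater's blades in your room matches the gap in the original scan.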
Launching John Branch’s article was not just about this single piece of journalism. It was also about exploring what visual journalism may look like in the near future. We are extending stories beyond the inches of a screen — and in so doing, envisioning a world in which devices begin to disappear and the spaces around us become the information surfaces themselves.
See the full story here: https://www.nytimes.com/2018/02/08/insider/olympics-immersive-journalism-augmented-reality.html