The startup Occipital created its Structure Sensor iPad accessory specifically to take detailed, real-time scans of a room. It occurred to the creators that if they built the sensor into a VR headset, that headset would have room-scale VR powers along the lines of the HTC Vive, where virtual objects are not only mixed with the real world, but can also interact with the viewer and the environment on the fly.
Enter the Bridge. Then you see the oblong sensor array protruding from the front. That, of course, is the Structure Sensor, which powers the augmented-reality experience. I got a chance to try out the Bridge at Mashable's offices, and it was definitely the best AR/VR headset demo I've ever seen on an iPhone.
Occipital has created a robot character named Bridget that can interact with you and the room. Tell Bridget to fetch a virtual object and she'll go get it, avoiding real-world furniture along the way. You can tell her to seek out a power outlet for recharging (she doesn't really recharge, of course), and she'll get sad if physical objects get in her way.
One of the more thrilling things to do with the Bridge is go full virtual. If you click on Bridget, she'll display a menu of five 3D objects, and if you pick the one that looks like a person escaping, a huge virtual doorway will open in front of you. Step through it and you'll suddenly be in a fully virtual environment; the one I was in looked like a futuristic observation deck.
The power and potential of AR was perhaps epitomized by the debut of Pokémon Go, a classic Japanese video game reimagined as an AR experience on a smartphone screen, sending legions of gamers around the world trekking through parks and alleys — real ones! — for imaginary monsters. Now imagine those imaginary monsters replaced by floating, three-dimensional historical figures embedded with factoids for tourists, or by coupons for deals at local Starbucks or Best Buy stores, and the potential of AR over VR becomes obvious.
A less faddish example of AR's potential is Snapchat's Lenses feature, which debuted in 2015. Allowing anyone to instantly have a new, expressive face mapped onto their own, one that moves as they move in the real world, Lenses are just the tip of the AR iceberg.
Meanwhile, in the background, the massively funded and super-secretive Magic Leap continues to lurk, with few having sampled what the company describes as a kind of mix of VR and AR. But despite the funding and the skunkworks hype, until the public meets the product, it's just a cool idea that has yet to prove its worth.
VR and AR are still infant technologies, but the vision behind them is clearer: Reality will slowly melt away and give way to "mixed reality," a new, tech-powered Disney-like world in which nearly every surface, object and location becomes an animated interface delivering digital magic in just about every corner of the planet.
Snap recently posted a job listing seeking "an expert in producing 3D character and props animation," signaling that Snapchat, aided by wearable cameras in the form of Spectacles, may soon be populated by a vast array of virtual characters layered on top of real life.
Virtual and augmented reality games such as Pokémon Go exploded in popularity earlier this year, leading to some tense moments between players and residents near popular parks. Milwaukee County Supervisor Sheldon Wasserman said game developers who want to use public parks as part of their game play should need a permit from the county.
"We need change," Wasserman said. "We need it now. We can't go through another year like that because games are coming, communities are going to be disturbed and players are going to be taken advantage of. We just need a logical framework for everybody to be happy and for a win for all involved."
Wasserman said developers must be held accountable for increased traffic in public areas, but that players of the game won't be affected.
He said the costs of obtaining and enforcing the permits have yet to be decided.
If passed, the law would take effect in January.
Live sports are considered an ideal environment for a virtual reality experience, because they’re popular, visually arresting, and, with exceptions like a December Jets game, a desirable place to be. If VR can put you courtside at the U.S. Open or on the baseline at Warriors-Cavaliers, there is no shortage of people who will want to be there. Geography isn't a barrier. That is a big reason the NBA is going gangbusters on VR—basketball has a growing worldwide audience, with a huge number of fans who are unlikely ever to see a game in the flesh. VR is a chance to approximate that experience, putting you a few feet from Steph Curry when you’re really 6,000 miles away.
A couple of weeks ago, the league lent me a Samsung phone and a set of goggles to check it out. (For now, you need Samsung equipment plus an NBA “league pass” subscription to watch in VR, but it will open up to other devices later in the season.)
If you’re wondering if the NBA on VR delivers a “Holy Mackerel” moment, I’m here to tell you: Yes, there is a bit of a Holy Mackerel moment, right at the beginning. Instead of looking at the game, you are enveloped in it. The action is much closer, fluid, mesmerizing. It’s utterly unlike a conventional TV broadcast (for this reason, the NBA uses a different announcer team for VR, since the product is so different).
The view is currently 180 degrees rather than a full 360, so that as much bandwidth as possible goes to delivering the game, but it still makes you aware of the stadium environment to your right and left. The score appears at the bottom when you look downward. All of this quickly becomes second nature.
Here are the hitches, in my opinion: It still looks a tad unreal, like it’s 85% real life, 15% a videogame. The players seem weirdly smaller than they actually are. The video definition can improve (and will). You also can’t watch it for a long time. At least I can’t. The VR environment is intoxicating, but best in small doses, as a supplementary experience to the standard broadcast.
In Henry’s enthusiasm, it’s easy to see the future. Virtual reality is a work in progress, and on its heels is augmented reality, a potentially superlative experience in which computer-generated imagery will be placed in your environment (imagine a hologram of J.R. Smith in your TV room, and you get the idea). Sports will be heavily invested in all of this. There will probably need to be breakthroughs to simplify it and take away necessities like goggles—let’s face it, the goggles are neat, but antisocial. A huge part of the sports-viewing experience is doing it with friends. (But what if you could virtually watch the game with friends…?)
A few weeks after F8, I chatted with Eric Romo, the founder and CEO of AltspaceVR. I was in New York wearing an HTC Vive headset and holding a pair of motion controllers. Romo was in California doing the same. A pair of laser-emitting boxes affixed to the walls mapped the rooms we were in and tracked our motion, allowing Romo and me to walk, wave, and interact as if we were in the same space, and not 3,000 miles apart. If we got too close to one another in the virtual space, I felt legitimately uncomfortable. It was easier to avoid talking over one another, because hand gestures and head position provided conversational cues.
Compared with Facebook’s avatars, our body language was limited. We were like children’s action figures, given just enough mobility to spark the imagination. Our bodies had arms and hands, but they stayed locked at our sides. Our actual hand position was represented by a pair of controllers floating in front of our avatars. We had legs, but we couldn’t walk around. Instead we used a cursor to teleport from one position to another. Our heads moved, but our gaze did not. Our mouths didn’t attempt to track the sounds of our words.
As more accurate tracking technology becomes available, AltspaceVR plans to add it in. Right now, for example, anyone with a Leap Motion controller can have highly detailed hands, complete with flexible fingers, in place of two floating game controllers. When comedians perform for big crowds in AltspaceVR, they can wear a full body suit, allowing them to act out physical gags.
The first ground rule of avatar design for Oculus is simple: "Leave it to the human brain, don’t fake it."
Of course, with convincing body language now a matter of computer code, the person across from you may not be what they seem. Imagine you’re in a boring meeting with a lot of people: in real life it would be rude to walk out the door. In VR you can sneak away, while your avatar stays sitting, nodding intelligently, and paying attention. "That autopilot version of a person I’m sure everyone would love to have," says Sheikh.
After my visit to the Panoptic Studio, I wanted to try the cutting edge of social VR for myself. So a few months later I traveled to Facebook’s headquarters in Menlo Park, where I got a demo of its Toybox software. Inside a windowless room optimized for virtual reality, I donned the Oculus headset and an assistant placed the Touch controllers in my hand. Now I was across a table from my partner, a pair of floating blue hands and a head.
In Toybox, I had the visceral sensation of being with another human being, but the world had none of the constraints or consequences you find in your living room. We could smash a vase with our slingshot, then snap a finger and it would be back. At one point we lowered the effects of gravity, and sent our pile of blocks floating in every direction. They scattered to the four corners of our playspace, and a few minutes later, reappeared neatly where they had begun.
Gnomes and Goblins is being developed exclusively for HTC’s Vive VR system at this point.
“We made a decision to only port to platforms that have the same level of sophistication,” Favreau told Digital Trends. “So we’re missing out on a larger potential audience, but we’re able to push the edge of what the technology has to offer, which is what’s more interesting to us right now.”
What separates this fantasy world from so many others are its inhabitants, which interact directly with the user in ways that traditional Hollywood entertainment simply can’t.
That’s part of the challenge for Favreau and Wevr CEO Neville Spiteri, who is working with the director on the project as part of a 10-person team that includes Oscar-winner Andy Jones (Avatar) and Jake Rowell, who developed Call of Duty games at Infinity Ward for five years.
“We digitally built these characters from scratch with the AI and animation work being developed hand-in-hand,” Spiteri told Digital Trends. “There are layers of key-frame animation, but most is procedural animation derived by a complex AI system.”
Spiteri said Gnomes and Goblins borrows from both the storytelling of filmmaking and the interactivity of video games.
“Part of what’s important for this next wave of VR is rewarding the end user for coming back and re-engaging with the content,” Favreau said. “I’ve noticed that, with a lot of VR experiences, it’s a wonderful thing to experience and maybe I’ll go through it a few times to really understand everything that’s been laid in there, and then I’ll want to share it with other people who’ve never done it before, and then you leave it on the shelf. You may revisit it later, but unless it’s been updated, you tend to want to look for new experiences.”
As much as the visual design and the task of building believable characters were important in an effort to bring this lucid-dreamlike world to life, Favreau says the team also put a lot of focus on audio to not only help guide people through the experience, but also to make it more immersive.
“With Gnomes and Goblins, where a tremendous amount of effort was put into the sound design, we find that the first time through the experience, many people miss a lot of what we had made in there.”
With advances in virtual reality across multiple platforms, including Facebook’s upcoming Oculus Touch, PlayStation VR, and the launch of Microsoft’s Windows 10 mixed reality headsets in 2017, Gnomes and Goblins may well be positioned to expand its footprint down the road.
You can get a first look at Gnomes and Goblins at the Steam store now.
The Global Virtual Reality Association, or GVRA, was officially announced on Wednesday. This group is groundbreaking in that its founding members include many of the major competing VR headset manufacturers who have been the focus of so much press attention over the last year: Acer Starbreeze, Google, HTC Vive, Facebook's Oculus, Samsung, and Sony Interactive Entertainment. GVRA states on its website that "while seeking to educate consumers, governments, and industry about VR's potential, the association wants to get ahead of challenges with developing and deploying the technology responsibly." The group plans to organize working groups around important topics for the industry, and to develop and share best practices. Noticeably absent from the list of founding members are VR experience production companies that are not subsidiaries of hardware manufacturers. This may be a short-term shortfall, or it may indicate that the group will focus on technical issues and not work to advance the language of the VR experience. According to the ETC's Phil Lelyveld, both technical and language factors contribute to simulator sickness in VR.
Website here: https://www.gvra.com/about-us/
Here is the presentation I gave on Nov. 30, 2016 at the IP Conference in Guangzhou, China. 161130v2-china-deck
The slide-by-slide script that I prepared for the translator is here. 161130-china-deck-script
Here, we introduce an optical tool, the holographic optical element (HOE), which can be applied in AR applications. HOEs are holographic volume gratings, which have transparency and distinctive optical characteristics. HOEs can perform the roles of optical elements by transforming an incident wavefront into a predefined wavefront via diffraction [Close 1975]. Since HOEs are usually recorded in thin films such as photopolymers, they have various advantages in weight, cost, and thickness. Thus, there have been several applications using HOEs, including see-through light field displays [Hong et al. 2014] and head-mounted displays [Kasai et al. 2001].
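To make the diffraction claim a bit more concrete: the wavefront transformation the authors describe follows from phase matching in a volume grating. The equations below are a sketch of the standard textbook picture (coupled-wave theory), not formulas taken from the paper itself.

```latex
% Phase-matching condition: the grating vector K couples the incident
% wavevector k_in to the diffracted wavevector k_out, where Lambda is
% the grating period recorded in the photopolymer.
\[
  \mathbf{k}_{\mathrm{out}} = \mathbf{k}_{\mathrm{in}} + \mathbf{K},
  \qquad |\mathbf{K}| = \frac{2\pi}{\Lambda}
\]
% Equivalently, for first-order diffraction at the Bragg angle
% \theta_B inside a medium of refractive index n, with free-space
% wavelength \lambda:
\[
  2\, n \Lambda \sin\theta_B = \lambda
\]
```

Because this condition is selective in both angle and wavelength, a volume HOE diffracts the narrowband display light strongly while off-Bragg ambient light passes through nearly undisturbed, which is why these gratings can serve as transparent, see-through combiners.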
See the full academic paper here: http://delivery.acm.org/10.1145/3000000/2992142/a3-lee.pdf?ip=220.127.116.11&id=2992142&acc=OPEN&key=4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E4D4702B0C3E38B35%2E6D218144511F3437&CFID=872315753&CFTOKEN=14780357&__acm__=1480962935_093f77cf615d493d57ee0be8f177c6af