Augmented reality startup Magic Leap just demoed its AR shopping capabilities at a conference in China. The demonstration makes shopping from your living room look as immersive and engaging as you might hope, and the voice controls made browsing the Chinese apps nearly effortless.
Unfortunately, the conference was held in China, so the speakers elaborated on the tech only in Chinese,...
There are more useful applications of augmented reality, as [vijayvictory]’s Hackaday Prize entry shows us. He’s built an augmented reality helmet for firefighters that detects temperature, gases, smoke, and the user’s own vital signs, displaying the readings on a heads-up display.
The core of the build is a Particle Photon, a WiFi-enabled microcontroller that also gives this helmet the ability to relay data back to a base station, ostensibly one that’s not on fire. To this, [vijayvictory] has added an accelerometer, gas sensor, and a beautiful OLED display mounted just behind a prism. This display overlays the relevant data to the firefighter without obstructing their field of vision.
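The helmet’s actual firmware runs on the Particle Photon (which is programmed in C++), but the data-relay idea can be sketched in Python. The field names, thresholds, and payload format below are illustrative assumptions, not details from [vijayvictory]’s project:

```python
import json

def build_telemetry(temp_c, gas_ppm, smoke_opacity, heart_rate_bpm):
    """Package one round of helmet sensor readings for the base station.

    All field names and alert thresholds here are hypothetical examples.
    """
    reading = {
        "temp_c": temp_c,
        "gas_ppm": gas_ppm,
        "smoke_opacity": smoke_opacity,
        "heart_rate_bpm": heart_rate_bpm,
        # Flag any reading that crosses an (assumed) danger threshold.
        "alert": temp_c > 60 or gas_ppm > 400 or heart_rate_bpm > 180,
    }
    return json.dumps(reading)

# A hot environment trips the alert flag for the base station.
payload = build_telemetry(temp_c=72.5, gas_ppm=120,
                          smoke_opacity=0.3, heart_rate_bpm=145)
```

On the real hardware, a payload like this would be pushed over WiFi via the Photon’s cloud publish mechanism rather than returned as a string.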
Emirates airlines has announced that it has launched the world’s first interactive amenity kit in economy class. The bags, which include necessities like a toothbrush, earplugs and eye mask, use augmented reality technology to unlock immersive content on travellers’ phones.
By using an app called Blippar, customers can scan the bag and unlock activities, music or health tips.
What stood out was that despite the lines, attendees largely didn’t care about virtual reality, augmented reality, or any other ballyhooed technology that was being utilized. They simply wanted to take a further step into their favorite fictional worlds — to the point where most at the Mr. Robot site called out the activation’s recreation of several sets as the draw, not the VR experience. "I just wanted to see how well they replicated the set, and get a real experience of how it would feel in Mr. Robot’s office," said 48-year-old Ben Kam. Down the line, O’Toole agreed. "We did the virtual reality Conan thing. So I’ve done virtual reality before," she said.
Comic-Con is all about promotion, with the idea that every piece of news or footage shown will soon ricochet around the world...
The irony is that the better the experience is, the more attention it will get, so there is an incentive to create works that are actually interesting unto themselves. Filmmaker Kevin Cornish, who directed the Teen Wolf VR experience for MTV, explained that on his project, corporate sponsor AT&T was extremely pleased because the experience went beyond static, 360-degree video captures — low-hanging swipes at VR that are already starting to feel old thanks to the proliferation of 360-degree video on Facebook.
In the end, the state of VR at Comic-Con is a lot like the state of Comic-Con itself: noble at times, catering to the pure of heart, but always looking to use the faithful to springboard into ever-higher orbits of hype.
Distribution has been slow because the music industry is waiting for the virtual reality ecosystem to catch up, according to Ben Lang, a co-founder and executive editor of Road to VR, a publication that covers the virtual reality industry. “A lot of these deals are strategic announcements to ward off other companies and show market leadership,” he said.
One setback has been logistical: It is still a challenge to get virtual reality viewers into consumers’ hands.
Although its competitors, like NextVR, give away content, Vrtify is experimenting with subscription and pay-per-view models. Facundo Diaz, its co-founder and chief executive, said 70 percent of the revenue went to the owner of the content, and the company received 30 percent, an arrangement similar to the music-streaming business, which has become a significant part of the music industry.
“This kind of revenue-sharing model gives us access to music labels and artists,” Mr. Diaz said. “It has opened a lot of doors for us.”
Virtual Reality Studio Kite & Lightning Raises a $2.5 Million Seed Round to Create Cinematic Social Gaming Universe
Kite & Lightning, the technology startup known for pioneering work in cinematic virtual reality (VR), today announced it raised a $2.5 million seed round led by Raine Ventures. The funding will enable Kite & Lightning to expand its team to create Bebylon Battle Royale, a world built around competitive spectator gaming. The hybrid film and gaming studio will continue to expand the universe around its three core VR pillars: gaming, social, and story.
“This marks a pivotal moment for us as we transition to expanding beyond our two-person team,” said Ikrima Elhassan, co-founder and CEO of Kite & Lightning. “We’re elated to have world-class investors that share our vision and believe in our ability to pioneer groundbreaking VR experiences melding cinematic storytelling with interactive gaming.”
New Tools Open Up Virtual Reality to Journalists
I got really interested in the development of WebVR technology and the impact it could have in journalism. This led me to explore a variety of tools for making VR easier to develop, both for journalists and for developers without graphics programming knowledge.
See the full write-up here: http://gijn.org/2016/07/26/new-tools-open-up-virtual-reality-to-journalists/
The Veeso headset not only tracks the user’s mouth and jaw but also has infrared eye-tracking sensors built into the headset to track the user’s line of sight. This is especially effective for social VR, saving you from weird interactions with avatars with dead eyes.
Interactive video game technology is being developed to improve the quality of life for people living with dementia, and a new crowdfunding project aims to raise $90,000 to help bring this tranquil virtual environment to aged care facilities around the world.
As for production software, we were torn between our familiarity with Unity and the rendering potential of Unreal. We stuck with Unity for now, and hope to dig into Unreal in future projects.
Step 1: Becoming Pick Up Artists
You can’t explore weight without first nailing down the basic physics of how objects can be picked up and moved around, and we quickly found that these concepts were one and the same.
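One common way to couple pickup mechanics with weight — and roughly the “loose link” idea the conclusions below refer to — is to have a held object chase the controller at a mass-limited speed instead of snapping to it. A minimal one-axis sketch (the team worked in Unity; this Python version and its tuning constant are illustrative assumptions, not their implementation):

```python
def follow_controller(obj_pos, controller_pos, mass, dt, strength=40.0):
    """Move a held object toward the controller; heavier objects lag more.

    Returns the object's new position along one axis. `strength` is an
    illustrative tuning constant, not a value from the article.
    """
    max_speed = strength / mass                    # heavy objects move slower
    delta = controller_pos - obj_pos
    step = max(-max_speed * dt, min(max_speed * dt, delta))
    return obj_pos + step

# At 90 fps, a light object closes the gap in a single frame,
# while a heavy one visibly trails behind the controller.
light = follow_controller(obj_pos=0.0, controller_pos=0.5, mass=0.5, dt=1/90)
heavy = follow_controller(obj_pos=0.0, controller_pos=0.5, mass=20.0, dt=1/90)
```

The lag itself is what reads as “weight”: the player’s hand moves freely, but the object’s response is throttled by its mass.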
Step 2: Exploring Other Sensory Cues
In addition to the mechanics of lifting and grabbing, we felt it was important to explore other forms of feedback that could help convey an object’s weight. This manifested in two forms: visual and haptic feedback. In both cases, we tried to reinforce the amount of “strain” the user would feel as their controller approaches the thresholds of tension for a given object.
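That “strain” signal can be sketched as the ratio of the link’s current stretch to its breaking threshold, fed to both the haptic and visual channels. The function and response curves below are illustrative assumptions, not the team’s Unity code:

```python
def strain_feedback(stretch, break_threshold):
    """Map how far the object lags behind the controller to feedback intensity.

    Returns (haptic_amplitude, visual_tension), both in [0, 1]. The link is
    considered broken once stretch reaches the threshold.
    """
    strain = min(stretch / break_threshold, 1.0)
    haptic_amplitude = strain ** 2   # ramp up late so light grips stay quiet
    visual_tension = strain          # e.g. tint or stretch the tether linearly
    return haptic_amplitude, visual_tension

# Halfway to breaking: gentle rumble, clearly visible tension.
haptic, visual = strain_feedback(stretch=0.15, break_threshold=0.30)
```

Using different curves for the two channels is a design choice: a quadratic haptic ramp keeps the controller quiet for light objects while still buzzing urgently as the link nears its breaking point.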
Step 3: Testing and Tweaking and Testing
With all of these factors, we had a seemingly endless list of permutations to explore. So to test and compare, we started working with multiple implementations in the same environment.
Most of the primary differences weren’t even based on the physics — instead, they focused on the impact of secondary visual and haptic feedback on the physical interaction.
We created two scenes for user testing. In both, we arranged seven stations around the user. Our first scene had a single object per station, so testers could directly compare how it felt to pick something up using each method. As they worked through the stations, we asked a series of questions: Which object felt heaviest? Which method would you prefer if you had to pick up many objects? Which method felt the most natural?
While we saw a few high-level trends in our tests, no single implementation was a consensus top pick. This wasn’t unexpected; going into this exercise, we didn’t think there would ever be a ‘right’ answer. But we were able to draw plenty of conclusions from the input we received.
Loose Links Work Best When the Connection Can Be Broken
But Unless It’s Vital to the Interaction, Weight Is an Annoyance
More Feedback Is Welcome — and Don’t Forget Sound!
When we chose to add sound to this exercise, we didn’t think much of it. But it turned out to have a huge impact on how our testers perceived weight. The thuds of heavy objects hitting the ground or rubbing against each other reinforced the differences in mass.
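A standard way to get those mass-reinforcing thuds is to scale impact-sound volume with collision energy. A rough sketch — the energy curve and normalization constant are assumptions, not details from the article:

```python
def impact_volume(mass, impact_speed, ref_energy=50.0):
    """Scale a thud's volume (0..1) with kinetic energy at impact.

    Heavier or faster objects hit harder; `ref_energy` is an illustrative
    normalization constant for the loudest expected impact.
    """
    energy = 0.5 * mass * impact_speed ** 2
    return min(energy / ref_energy, 1.0)

# Dropped at the same speed: the heavy crate thuds, the mug barely taps.
crate = impact_volume(mass=20.0, impact_speed=2.0)   # energy 40 -> volume 0.8
mug = impact_volume(mass=0.5, impact_speed=2.0)      # energy 1  -> volume 0.02
```

Because the same mass value drives both the pickup lag and the collision sound, the two cues stay consistent — which is exactly why testers reported that audio reinforced the perceived differences in weight.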