castAR from Technical Illusions is a unique augmented reality gaming system that uses device-mounted projectors and retro-reflective surfaces to achieve a holographic-style experience for users. It was born in Valve’s R&D labs and had a hugely successful Kickstarter campaign last year, pulling in over $1M in backer support.
The ‘boxing’ video shows the contents that early-tier backers can expect to receive: CastAR Glasses, Video Interface / Breakout box, Power Supply, IR Tracking Marker, Video cables, some Sweets and (as a reward for taking the ‘not quite final’ glasses and assisting with testing) a personalised letter from Jeri and Rick – all in a neat carry case. Interestingly, early adopters don’t receive one retro-reflective sheet but three. Again, the team really wants to hear feedback from early users on which material they think works best.
The early units do not include the still-in-development VR clip-on attachments, which supposedly provide a high-FOV virtual reality experience by reflecting the projected light back to the user. This aspect of the castAR is still a relative unknown, but if it works well, it rounds out a quite compelling immersive entertainment package.
A new short film called “Zero Point,” released by a different cinematic VR company called Condition One, briefly achieved presence for me in a seemingly pastoral scene of bison walking through a field. My head naturally followed the bison along their path, but I noticed they were turning to look back at the rest of their herd — so I turned, too, only to get a face full of curious beast.
“‘Oh! The bison is right there!’” Condition One CEO Danfung Dennis quoted another viewer as having said. “Everyone has that experience.”
In reality, of course, the animal was inspecting Dennis’s 3-D camera rig, which probably looked out of place in the middle of a field. But as a viewer, taking the place of that camera, I instinctively leaned away to try and get some distance from a thing that wasn’t there.
So, if it’s possible to get presence, what are the problems?
“The screen resolution is really perfect now if you hold it at arm’s length,” Christensen said of his Nexus 5, a premium Android phone. “When you bring it up to your eyes, with lenses, suddenly you need more.”
Another big aid to presence that you won’t find in the first round of cinematic VR content is positional tracking. What that means in layman’s terms: You can turn your head to see more of a scene happening all around you, but you can’t currently move your head or body to inspect different facets of a 3-D object. When the bison came up to inspect me in “Zero Point,” I leaned away out of instinct but my perceived distance from it stayed the same.
Christensen and Dennis said positional tracking in live-action video is possible, to an extent. Dennis speculated that the best way to achieve this is to use special depth-sensing cameras, similar to the Xbox’s Kinect, to collect data about how far away objects are from the various normal-camera lenses. This information could make the 3-D effect of a video more dynamic, changing as users move their heads around, though it still wouldn’t get a bison out of my face.
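The idea Dennis speculates about can be sketched in a few lines: if a depth sensor gives you the distance to each object in the frame, a small head translation shifts near objects across the screen more than far ones, which is exactly the dynamic 3-D effect he describes. The function name, focal length, and numbers below are illustrative assumptions, not anyone's actual implementation.

```python
# Hypothetical sketch of depth-aware parallax for live-action VR video:
# per-pixel depth (e.g. from a Kinect-style sensor) lets each point shift
# by a different amount when the viewer's head translates sideways.

def parallax_shift(depth_m, head_offset_m, focal_px=700.0):
    """Screen-space shift (in pixels) for a point at depth_m metres
    when the head moves head_offset_m metres sideways."""
    if depth_m <= 0:
        raise ValueError("depth must be positive")
    return focal_px * head_offset_m / depth_m

# A 5 cm head movement shifts a nearby object far more than a distant one:
near = parallax_shift(depth_m=1.0, head_offset_m=0.05)   # 35.0 px
far = parallax_shift(depth_m=20.0, head_offset_m=0.05)   # 1.75 px
```

The steep falloff with distance is why the effect adds realism for close objects while leaving the far background essentially unchanged.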
When the camera does need to move, slow movement may be the worst kind for motion sickness.
“In traditional filmmaking, you ramp the camera up to speed and ramp it slowly down to stop,” he said. “In VR, you want to go instantly from zero to 60. It minimizes the mismatch between your inner ear and what your mind is seeing.”
And some kinds of movement are just unworkable. Dennis said a scene in “Zero Point,” shot from the perspective of someone slowly riding down an escalator, had to be cut because the artificial altitude change made a lot of viewers sick.
Jules Urbach, the CEO of Los Angeles-based rendering company OTOY, said VR is a “very easy sell” in Hollywood at the moment.
“VR is something that all six major studios and other content creators take very seriously,” Urbach said. “At this point, everyone kind of gets it. They remember Cinerama: Three screens were more immersive than one. Movie studios understand that greater immersion is a big deal.”
The BBC reports that Magic Leap are to join forces with Framestore, the special effects house responsible for the hit motion picture Gravity, for an event forming part of the Manchester International Festival in the UK. The event, titled ‘The Age of Starlight’, is a live event put together with the help of UK TV physicist and all-round science celebrity Professor Brian Cox.
Professor Cox recently journeyed to Miami to see Magic Leap’s tech for himself and came away suitably impressed. “It’s the premiere of a technology that allows you to put digital images into your field of vision directly,” he said. “I saw the prototype in Miami a few months ago and it’s stunning.” Of the UK event itself, Cox says “I want people to stagger out and have to have a sit-down for a long time before they go home.”
If the schedule for this event holds, it means that Magic Leap are preparing to start showing their technology in public very soon. ‘The Age of Starlight’ is set to open alongside the Manchester International Festival in July 2015, which means the technology will have to work well enough for public use.
[Philip Lelyveld comment: what, exactly, IS the point of art? How does this differ from a badly designed psychology experiment?] So we were a little impressed and a little concerned when we heard about Seeing I, an art project in which London-based performance artist Mark Farid pledges to wear a VR headset and headphones for 28 days straight. During that time, his sight and sound will be replaced by those of a stranger, known simply as "The Other," who will be recording his everyday life through a binaural microphone and 180-degree stereo camera setup attached to his glasses (applications to be The Other are open, if the idea of having your entire life virtually broadcast to another person for a month appeals to you).
EyeEm has also developed its own computer vision software. The startup partnered with talented photographers to create a large collection of stock photography, and it needed image recognition technology to search and categorize the images. Its newest algorithms assess the quality of each photo, so that the company can show users the best of its archives.
The scoring system looks at several aesthetic features, like the blurriness of certain objects or the placement of objects in each third of the photo, in addition to factors like human engagement.
[Philip Lelyveld comment: VR allows you to break out of the limitations of the computer display 'frame' and be surrounded by data. This is an example of immersive data visualization. Productivity tools, even more than entertainment, may be VR's future.]
In Fidelity’s prototype virtual environment—which it says is the first financial services app written for Oculus—stocks are represented as office towers and lumped together in sector “neighborhoods.” The buildings’ footprints are shaped by trading volume and their rooftops are red or green depending on changes in price.
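The data-to-geometry mapping Fidelity describes can be sketched very simply: trading volume drives the building footprint and the sign of the price change picks the rooftop colour. The function and scaling constant below are invented for illustration; Fidelity has not published its actual mapping.

```python
import math

# Hypothetical StockCity-style mapping: one stock -> one building.
# Square-root scaling keeps heavy-volume stocks from dwarfing the city.

def stock_to_building(volume, price_change_pct):
    """Map a stock to (footprint_side, roof_colour)."""
    footprint = math.sqrt(volume) / 100.0
    roof = "green" if price_change_pct >= 0 else "red"
    return footprint, roof

stock_to_building(volume=1_000_000, price_change_pct=-1.2)  # (10.0, 'red')
```

Grouping such buildings into sector “neighborhoods” is then just a layout pass over the resulting geometry.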
Fidelity is not claiming to have solved any actual problems with the app. But with $2 trillion under management, it wants to get ahead of how new interfaces might be used. “We have a hypothesis that virtual reality will take off in the consumer set in the next three to five years, so therefore we want to understand the technology,” says Hadley Stern, vice president at Fidelity Labs, a research wing of the brokerage company. “We want to get their feedback on this and start to think: how would active traders and other investors use virtual worlds to understand data?”
The company is unveiling the app, called StockCity, this week at a trade show for stock traders in Las Vegas.
But for now, Fidelity isn’t letting anyone access actual brokerage accounts until it works out security and user-authentication protocols for goggle-wearers.
See the full story here: http://www.technologyreview.com/news/532676/fidelitys-oculus-app-lets-you-fly-through-your-investments/?utm_campaign=newsletters&utm_source=newsletter-daily-all&utm_medium=email&utm_content=20141120
At Samsung’s 2014 developer conference last week, Warren Mayoss, Head of Technology Product Development at DreamWorks Animation, took to the stage during the second day keynote to talk VR. Mayoss shared some of the initial work that DreamWorks has done with virtual reality, including five insights for new VR developers.
Mayoss, who worked previously with EA for many years, was crucial to the launch of DreamWorks’ virtual reality initiative which seeks to extend the company’s storytelling franchises into new immersive experiences. That initiative manifests in the ‘DreamLab’, led by Brad Herman, where the company is experimenting with virtual reality.
1. Develop for the Medium “Don’t just assume that a flat screen will work.”
2. Real World Phobias May Translate to VR Because VR is so immersive, developers have a responsibility to not only make experiences comfortable, but also not traumatic (unless of course that’s what the user wants).
3. VR is a New Medium “What you don’t want to do is present them with a lot of menus and instructions to get them into the game. Get them into the experience and let them get comfortable, and then start to tell them what to do… guide them with what you want them to do,” rather than bombarding them with text and info.
4. Start Prototyping
VR is new, and like the early days of any medium, best practices are still being defined. Methods for everything from functional UI design to avoiding sim sickness are still being uncovered. Time and time again we hear stories of researchers and developers deploying what seem like obvious solutions only to have their expectations crumpled up and tossed into the trash bin.
“Start prototyping and developing. The only way we’re going to understand what works in VR and what doesn’t is to actually just start developing,” Mayoss said.
“We worry about motion sickness in these kinds of things so we’ve been experimenting with low-frequency vibrations and audio to try to make people more comfortable, it interrupts your processing…”
“For mobile VR, the high framerates and low latency requirements are very critical…
5. Business of VR “Gear VR’s launch will help us in the mass market consumer proposition and so we think that app monetization will be similar to mobile,” he said. “You need to ask yourself what you want to do, what your goals are, and what you would do to stand out…”
Watch the 3 minute overview video here: http://www.forbes.com/video/3895912616001?curator=MediaREDEF
In an effort to develop universal standards and best practices for high-scale Internet video services, 17 content companies, service providers and technology vendors have gathered to create the Streaming Video Alliance. Together, these companies hope to improve the online video experience. SVA will initially focus on open architecture, quality of experience, and interoperability. Formation of the group comes as net neutrality remains a complex and controversial issue.
The Streaming Video Alliance has three tiers of membership: sponsor/founding member ($25,000 annually); full member ($12,500); and supporting member ($5,500).
The group will focus on three initial areas: open architecture, to define specifications for network and cloud-based streaming and caching infrastructure; quality of experience, to create a common approach to defining, measuring, optimizing and reporting quality of the video streaming; and interoperability, to create standards for streaming video.
The group consists of the following companies: Alcatel-Lucent, Charter Communications, Cisco Systems, Comcast, Epix, Fox Networks Group, Korea Telecom, Level 3 Communications, Liberty Global, Limelight Networks, Major League Baseball Advanced Media, Qwilt, Telecom Italia, Telstra, Ustream, Wowza Media Systems and Yahoo.
Interestingly, SVA was founded without Netflix or YouTube, which are currently the greatest sources of Internet bandwidth usage, and have both invested heavily in their own distribution infrastructures.
Founding members of the Streaming Video Alliance (streamingvideoalliance.org) will meet together at least twice per year in person, and committees will have regular calls and meetings to create specs. The SVA is not a standards body, Rayburn said; rather, it plans to propose technical specs to relevant standards bodies.
[Augmented Reality technology has tremendous potential for helping sensory-impaired people. Adding motion-capture (Kinect, Leap Motion, ...) technology to headsets could allow visually impaired people to hear their surroundings. This Microsoft approach, though, seems like a commercial development with an ADA pitch tagged onto it. You'd have to put beacons on everything around the person to make this useful for the visually impaired.]
Microsoft is piloting a project in the UK to investigate the potential benefits of its 3D audio technology on those with visual impairments.
The computing giant partnered with the Guide Dogs charity, alongside Network Rail, Reading Borough Council, Reading Buses, Future Cities Catapult (an urban planning company) and supermarket powerhouse Tesco.
Using a set of bone-conducting headphones placed around the back of the wearer’s skull, plus a mini network of indoor and outdoor beacons working in tandem with a smartphone, the user is able to ‘hear’ their way around an area.
The Bluetooth beacons are fixed to physical objects and communicate information back to the walker, effectively creating what Microsoft calls a “sensor-boosted physical environment”: a 3D soundscape with verbal cues, whether GPS navigation, bus arrival times, or even tourist information.
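The beacon-to-cue logic might look something like the sketch below, assuming each beacon advertises an ID mapped to a spoken message and signal strength stands in for distance. The beacon IDs, cue texts, and RSSI threshold are all invented for illustration; they are not from Microsoft's pilot.

```python
# Hypothetical cue selection for a beacon-guided soundscape: play the
# verbal cue attached to the nearest (strongest-signal) known beacon.

BEACON_CUES = {
    "bus-stop-17": "Bus stop: the number 17 arrives in four minutes.",
    "tesco-aisle-3": "Aisle three: bakery on your left.",
}

def cue_for_strongest_beacon(sightings, rssi_threshold=-70):
    """Pick the cue for the strongest in-range beacon, else None.
    sightings: list of (beacon_id, rssi_dbm) tuples from a scan."""
    in_range = [(rssi, bid) for bid, rssi in sightings
                if rssi >= rssi_threshold and bid in BEACON_CUES]
    if not in_range:
        return None
    _, nearest = max(in_range)          # higher RSSI ≈ closer beacon
    return BEACON_CUES[nearest]

cue_for_strongest_beacon([("bus-stop-17", -62), ("tesco-aisle-3", -80)])
# → "Bus stop: the number 17 arrives in four minutes."
```

In a real deployment the cue would also be spatialised in the 3D soundscape so it appears to come from the beacon's direction, rather than simply being read aloud.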
You may be wondering why Tesco’s help has been sought here. Well, for the pilot, users can be guided to the aisle they’re looking for.