The winners — “Inside the Galactica” and “Catch Novela” — each won $2,500 in prize money, using the Watchwith API to tackle the theme “New Forms of Storytelling Driven by Technology.” While $1,500 of the prize money came from the companies that presented the challenges, Watchwith awarded the creators an additional $1,000 each.
Watchwith, which provides in-program audience engagement and ad solutions for TV networks, had asked hackers to develop apps that used the Watchwith API to enhance the consumer viewing experience, with special consideration given to projects that helped viewers identify cast members, discover music, and access other related content.
See the full story here: http://www.2ndscreensociety.com/blog/2014/11/25/watchwith-api-scores-at-nbc-universal-hackathon/?utm_source=2nd+Screen+Authority&utm_campaign=f55de23682-my_google_analytics_key&utm_medium=email&utm_term=0_41ed43776d-f55de23682-87094753
In its current state, Lightpaper is manufactured by mixing ink and tiny LEDs together and printing them out on a conductive layer. That object is then sandwiched between two other layers and sealed. The tiny diodes are about the size of a red blood cell, and randomly dispersed on the material. When current runs through the diodes, they light up.
Rohinni isn't interested in the entrenched TV market. The company would rather put the technology to use where it can make a big difference soon: everything from illuminating a logo on a mobile phone to providing headlights for a car. A few companies are already working on Lightpaper implementations, but Smoot wouldn't name any.
Consumers should start to see Lightpaper in the wild around the middle of 2015. But Rohinni won't be aiming at the home hobbyist market until after it takes hold in the commercial and industrial space.
The big problem with the product's current version is how it places the LEDs when printed. Right now, they aren't distributed evenly on the printed surface, which can cause a shimmering, or "starry night," effect. Smoot explained that for a lot of applications this won't matter, but the challenge currently being worked on is getting specific placement of the diodes to produce completely even light. The task isn't insurmountable; a second version of Lightpaper is likely a few months out.
"The magical thing about this solution is it's brighter, it's thinner, it's flexible, it's addressable, and programmable. You can address the sections of the diodes, which is a whole other space when you start thinking about solutions of light that you can address sections of."
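The addressability Smoot describes can be pictured as a grid of diodes whose rectangular sections are switched independently. The toy model below is my own illustration (the `LightSheet` class and its methods are hypothetical; Rohinni has not published its control scheme):

```python
import numpy as np

class LightSheet:
    """Toy model of an addressable light sheet: a grid of diodes whose
    rectangular sections can be switched on and off independently."""

    def __init__(self, rows, cols):
        self.state = np.zeros((rows, cols), dtype=bool)  # all diodes off

    def set_section(self, r0, r1, c0, c1, on=True):
        # Address one rectangular section of diodes at once.
        self.state[r0:r1, c0:c1] = on

    def lit_fraction(self):
        # Share of diodes currently lit, e.g. for a power estimate.
        return float(self.state.mean())
```

Lighting the top-left quarter of a 4×4 sheet, for instance, leaves exactly a quarter of the diodes on, and that section can later be switched off without touching the rest.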
Seattle's Cinerama will reopen on November 20 for the premiere of The Hunger Games: Mockingjay - Part 1. The theater, owned by philanthropist and entrepreneur Paul G. Allen, has been upgraded to a world-class theater with the best sight and sound offerings available.
The Christie 6P laser projector at Cinerama is the first to be installed in a commercial theater. Featuring a scalable laser light source and a dual-head system architecture, the projector achieves the best resolution, color and brightness levels on the largest premium screens.
Specifically designed for 3D, the Christie dual-head system delivers both left and right eye images simultaneously to the viewer, reducing the eye strain and fatigue that some people experience when viewing 3D content from a single-head projector. Cinerama maintains state-of-the-art equipment for showing movies in all formats, from the latest 2D and 3D blockbusters to classic 35mm, 70mm, and original 3-strip Cinerama films.
Dolby Atmos and Meyer Sound technology have also been incorporated into the theater. Cinerama's debut film, The Hunger Games: Mockingjay - Part 1, is mixed in Dolby Atmos. Surround-sound loudspeakers provided by Meyer Sound will array the walls and ceiling of the theater and five Meyer Sound Acheron main screen channel loudspeakers will be used to enhance the acoustic experience.
Cinerama's new Matt Plus screen by Harkness has a white surface and low gain to increase picture uniformity, color accuracy and contrast. The signature Cinerama curved screen will remain functional and stay stored behind the main screen.
castAR from Technical Illusions is a unique augmented reality gaming system that uses device-mounted projectors and retro-reflective surfaces to achieve a holographic-style experience for users. It was born in the bowels of Valve’s R&D labs and had a hugely successful Kickstarter campaign last year, pulling in over $1M in backer support.
The ‘boxing’ video shows the contents that early-tier backers can expect to receive: castAR glasses, video interface/breakout box, power supply, IR tracking marker, video cables, some sweets and (as a reward for taking the ‘not quite final’ glasses and assisting with testing) a personalised letter from Jeri and Rick, all in a neat carry case. Interestingly, early adopters don’t receive one retro-reflective sheet but three. Again, the team really wants feedback from early users on which one they think is best.
The early units do not include the still-in-development VR clip-on attachments, which supposedly provide a high-FOV virtual reality experience by reflecting the projected light back to the user. This aspect of the castAR is still a relative unknown, but if it works well, it rounds out a quite compelling immersive entertainment package.
A new short film called “Zero Point,” released by a different cinematic VR company called Condition One, briefly achieved presence for me in a seemingly pastoral scene of bison walking through a field. My head naturally followed the bison along their path, but I noticed they were turning to look back at the rest of their herd — so I turned, too, only to get a face full of curious beast.
“‘Oh! The bison is right there!’” Condition One CEO Danfung Dennis quoted another viewer as having said. “Everyone has that experience.”
In reality, of course, the animal was inspecting Dennis’s 3-D camera rig, which probably looked out of place in the middle of a field. But as a viewer, taking the place of that camera, I instinctively leaned away to try and get some distance from a thing that wasn’t there.
So, if it’s possible to get presence, what are the problems?
“The screen resolution is really perfect now if you hold it at arm’s length,” Christensen said of his Nexus 5, a premium Android phone. “When you bring it up to your eyes, with lenses, suddenly you need more.”
Another big aid to presence that you won’t find in the first round of cinematic VR content is positional tracking. What that means in layman’s terms: you can turn your head to see more of a scene happening all around you, but you can’t currently move your head or body to inspect different facets of a 3-D object. When the bison came up to inspect me in “Zero Point,” I leaned away out of instinct, but my perceived distance from it stayed the same.
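The distinction can be sketched with a toy 2-D model (my own illustration, not either company's code): rotation-only tracking applies head turns but ignores head translation, so the perceived distance to an object never changes, while positional tracking moves the scene relative to the head first.

```python
import numpy as np

def view_rotation_only(point, yaw):
    # Rotation-only (3DoF-style) tracking: head turns rotate the view,
    # but leaning or moving is ignored, so object distances stay fixed.
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s], [s, c]]) @ point

def view_positional(point, yaw, head_pos):
    # Positional (6DoF-style) tracking: subtract the head's position
    # first, so moving your head genuinely changes your distance.
    return view_rotation_only(point - head_pos, yaw)

bison = np.array([0.0, 2.0])       # 2 m straight ahead of the viewer
lean_away = np.array([0.0, -0.5])  # viewer leans 0.5 m backwards
```

With rotation-only tracking the bison stays 2 m away no matter how the viewer leans; with positional tracking, leaning back puts it 2.5 m away, which is exactly the cue that was missing in the headset.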
Christensen and Dennis said positional tracking in live-action video is possible, to an extent. Dennis speculated that the best way to achieve this is to use special depth-sensing cameras, similar to the Xbox’s Kinect, to collect data about how far away objects are from the various normal-camera lenses. This information could make the 3-D effect of a video more dynamic, changing as users move their heads around, though it still wouldn’t get a bison out of my face.
When the camera does need to move, slow movement may be the worst kind for motion sickness.
“In traditional filmmaking, you ramp the camera up to speed and ramp it slowly down to stop,” he said. “In VR, you want to go instantly from zero to 60. It minimizes the mismatch between your inner ear and what your mind is seeing.”
And some kinds of movement are just unworkable. Dennis said a scene in “Zero Point,” shot from the perspective of someone slowly riding down an escalator, had to be cut because the artificial altitude change made a lot of viewers sick.
Jules Urbach, the CEO of Los Angeles-based rendering company OTOY, said VR is a “very easy sell” in Hollywood at the moment.
“VR is something that all six major studios and other content creators take very seriously,” Urbach said. “At this point, everyone kind of gets it. They remember Cinerama: Three screens were more immersive than one. Movie studios understand that greater immersion is a big deal.”
The BBC reports that Magic Leap are to join forces with Framestore, the special effects house responsible for the hit motion picture Gravity, for an event forming part of the Manchester International Festival in the UK. The event, titled ‘The Age of Starlight’, is a live production put together with the help of UK TV physicist and all-round science celebrity Professor Brian Cox.
Professor Cox recently journeyed to Miami to see Magic Leap’s tech for himself and came away suitably impressed. “It’s the premiere of a technology that allows you to put digital images into your field of vision directly,” he said. “I saw the prototype in Miami a few months ago and it’s stunning.” Of the UK event itself, Cox says, “I want people to stagger out and have to have a sit-down for a long time before they go home.”
If the schedule for this event holds, it means that Magic Leap are preparing to start showing their technology in public very soon. ‘The Age of Starlight’ is set to open alongside the Manchester International Festival in July 2015, which means the technology will have to work well enough for public use.
[Philip Lelyveld comment: what, exactly, IS the point of art? How does this differ from a badly designed psychology experiment?] So we were a little impressed and a little concerned when we heard about Seeing I, an art project where London-based performance artist Mark Farid pledges to wear a VR headset and headphones for 28 days straight. During that time, his sight and sound will be replaced by those of a stranger, simply called "The Other," who will be recording his everyday life through a binaural microphone and 180-degree stereo camera setup attached to his glasses (applications to be The Other are open, if the idea of having your entire life virtually broadcast to another person for a month appeals to you).
EyeEm has also developed its own computer vision software. The startup partnered with talented photographers to create a large collection of stock photography, and it needed image recognition technology to search and categorize the images. Its newest algorithms assess the quality of each photo, so that the company can show users the best of its archive.
The scoring system looks at several aesthetic features, like the blurriness of certain objects or the placement of objects in each third of the photo, in addition to factors like human engagement.
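EyeEm hasn't published its model, but the cues described above (blur and rule-of-thirds placement) can be approximated with simple heuristics. Everything below is my own assumption, not EyeEm's code: the function names, the normalization constant, and the equal weights are all illustrative.

```python
import numpy as np

def sharpness_score(gray):
    # Variance of a discrete Laplacian: blurry images score near zero,
    # sharp, high-detail images score high.
    lap = (-4 * gray[1:-1, 1:-1]
           + gray[:-2, 1:-1] + gray[2:, 1:-1]
           + gray[1:-1, :-2] + gray[1:-1, 2:])
    return float(lap.var())

def thirds_score(subject_xy, width, height):
    # 1.0 when the subject sits exactly on a rule-of-thirds "power
    # point", falling toward 0.0 as it drifts away from all four.
    points = [(width * i / 3, height * j / 3) for i in (1, 2) for j in (1, 2)]
    x, y = subject_xy
    d = min(np.hypot(x - px, y - py) for px, py in points)
    return 1.0 - d / np.hypot(width, height)

def aesthetic_score(gray, subject_xy, w_sharp=0.5, w_thirds=0.5):
    # Blend the two cues; a production system would learn such weights
    # (and many more features) from human engagement data.
    sharp = min(sharpness_score(gray) / 1000.0, 1.0)  # crude normalization
    h, w = gray.shape
    return w_sharp * sharp + w_thirds * thirds_score(subject_xy, w, h)
```

A flat gray frame scores zero on sharpness, while any textured frame scores higher; a subject placed exactly on a thirds intersection maxes out the placement term.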
[Philip Lelyveld comment: VR allows you to break out of the limitations of the computer display 'frame' and be surrounded by data. This is an example of immersive data visualization. Productivity tools, even more than entertainment, may be VR's future.]
In Fidelity’s prototype virtual environment—which it says is the first financial services app written for Oculus—stocks are represented as office towers and lumped together in sector “neighborhoods.” The buildings’ footprints are shaped by trading volume and their rooftops are red or green depending on changes in price.
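The mapping described above can be sketched directly. This is my own guess at the shape of the transform; the names and the volume scale factor are assumptions, not Fidelity's API:

```python
from dataclasses import dataclass

@dataclass
class Building:
    ticker: str
    footprint_side: float  # side length of the virtual footprint
    roof_color: str        # "green" for a gain, "red" for a loss

def stock_to_building(ticker, volume, price_change, scale=1e-6):
    # Footprint area tracks trading volume, so the side length is its
    # square root; the rooftop simply encodes the price direction.
    area = max(volume * scale, 1.0)  # floor so thin stocks stay visible
    color = "green" if price_change >= 0 else "red"
    return Building(ticker, area ** 0.5, color)
```

A heavily traded stock that closed up becomes a wide green-roofed tower; a thinly traded stock that closed down becomes a minimum-size red-roofed one.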
Fidelity is not claiming to have solved any actual problems with the app. But with $2 trillion under management, it wants to get ahead of how new interfaces might be used. “We have a hypothesis that virtual reality will take off in the consumer set in the next three to five years, so therefore we want to understand the technology,” says Hadley Stern, vice president at Fidelity Labs, a research wing of the brokerage company. “We want to get their feedback on this and start to think: how would active traders and other investors use virtual worlds to understand data?”
The company is unveiling the app, called StockCity, this week at a trade show for stock traders in Las Vegas.
But for now, Fidelity isn’t letting anyone access actual brokerage accounts until it works out security and user-authentication protocols for goggle-wearers.
See the full story here: http://www.technologyreview.com/news/532676/fidelitys-oculus-app-lets-you-fly-through-your-investments/?utm_campaign=newsletters&utm_source=newsletter-daily-all&utm_medium=email&utm_content=20141120
At Samsung’s 2014 developer conference last week, Warren Mayoss, Head of Technology Product Development at DreamWorks Animation, took to the stage during the second day keynote to talk VR. Mayoss shared some of the initial work that DreamWorks has done with virtual reality, including five insights for new VR developers.
Mayoss, who worked previously with EA for many years, was crucial to the launch of DreamWorks’ virtual reality initiative which seeks to extend the company’s storytelling franchises into new immersive experiences. That initiative manifests in the ‘DreamLab’, led by Brad Herman, where the company is experimenting with virtual reality.
1. Develop for the Medium “Don’t just assume that a flat screen will work.”
2. Real World Phobias May Translate to VR Because VR is so immersive, developers have a responsibility to not only make experiences comfortable, but also not traumatic (unless of course that’s what the user wants).
3. VR is a New Medium “What you don’t want to do is present them with a lot of menus and instructions to get them into the game. Let them get into the experience and let them get comfortable, and then start to tell them what to do… guide them with what you want them to do,” rather than bombarding them with text and info.
4. Start Prototyping
VR is new, and like the early days of any medium, best practices are still being defined. Methods for everything from functional UI design to avoiding sim sickness are still being uncovered. Time and time again we hear stories of researchers and developers deploying what seem like obvious solutions, only to have their expectations crumpled up and tossed into the trash bin.
“Start prototyping and developing. The only way we’re going to understand what works in VR and what doesn’t is to actually just start developing,” Mayoss said.
“We worry about motion sickness in these kinds of things so we’ve been experimenting with low-frequency vibrations and audio to try to make people more comfortable, it interrupts your processing…”
“For mobile VR, the high framerates and low latency requirements are very critical…