Philip Lelyveld – The World of Entertainment Technology

26Mar/15

Developer of Augmented and Virtual Reality Solutions for Enterprise and Business, Augmented Pixels Raises $1 Million in Seed Financing

Augmented Pixels, a leading developer of interactive augmented and virtual reality solutions, announced that it has closed $1 million in seed funding led by The Hive, a co-creation studio focused on building data-driven applications.

The investment will allow the company to concentrate on developing new technologies for 3D scanning and for creating inexpensive, high-quality virtual reality content.

See the full story here: http://www.itbusinessnet.com/article/Developer-of-Augmented-and-Virtual-Reality-Solutions-for-Enterprise-and-Business-Augmented-Pixels-Raises-1-Million-in-Seed-Financing-3812545

25Mar/15

Leap Motion will find its way into OSVR virtual reality headsets this June

A snap-on faceplate will add 3D hand-tracking capabilities to the open-source VR headset's design, but you'll have to pay extra.

Hand-sensing input technology company Leap Motion is now an official launch partner of OSVR, the open-source VR headset consortium backed by Razer, Sensics, and many others. Leap Motion has been working on integrating its hand-tracking technology with virtual reality over the last year, and even demonstrated the capability with OSVR at CES in Las Vegas this past January. For OSVR, Leap Motion's tracker will come as an attachable faceplate, offered as a separately purchased accessory or bundled with the Hacker Dev Kit VR headset this June.

HTC and Valve's upcoming Vive VR platform made a splash with its unique use of room-scanning technology and innovative controller inputs. The open-source OSVR platform will work with a variety of input hardware, but Leap Motion will be there at launch to offer hand-tracking input right off the bat.

Read the full story here: http://www.cnet.com/news/leap-motion-hand-tracking-will-be-embedded-into-osvr-virtual-reality-headsets-this-june/

25Mar/15

Improbable: London virtual reality firm lands $20m investment from Facebook backer

Founded at Cambridge University by computer scientists Herman Narula and Rob Whitehead, Improbable provides a platform for developers to create large-scale simulations, such as the environment of a virtual reality game. Such games and other VR experiences are then viewed through headsets like the Oculus Rift or Samsung Gear VR.

For VR developers, Improbable's product simplifies issues such as managing load across servers, handling scale, and writing code. The obvious reason for Andreessen Horowitz to invest in Improbable is its VR capability, given the rise of companies like Oculus, which Facebook purchased in 2014 for $2bn.

"Beyond gaming, Improbable is useful in any field that models complex systems — biology, economics, defense, urban planning, transportation, disease prevention, etc. Think of simulations as the flip side to 'big data'.

See the full story here: http://www.ibtimes.co.uk/improbable-london-virtual-reality-firm-lands-20m-investment-facebook-backer-1493366

24Mar/15

Stamps using augmented reality technology released to celebrate Irish animation

An Post has released a four-stamp set celebrating the Irish animation industry, using augmented reality technology for the first time to bring stamps to life.

When a stamp is scanned by a smartphone with the CEE app installed, it plays a specially produced film featuring Roy, Give Up Yer Aul Sins, The Secret of Kells, Nelly & Nora and other animated works.

See the full story here: http://www.businessandleadership.com/marketing/item/49977-stamps-using-augmented/

24Mar/15

Google Developers Show Impressive Future Of Augmented Reality On Project Tango Tablet

Project Tango is currently a development-platform tablet powered by NVIDIA's Tegra K1 system-on-a-chip and equipped with a number of cameras and motion sensors, so that it can track and map its surroundings in real time. It is specifically capable of area learning, motion tracking and depth perception, all on the fly.
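
To make "track and map its surroundings in real time" concrete, here is a minimal sketch of the underlying idea (not Google's actual Tango API; the intrinsics, pose values and function names are illustrative assumptions): depth perception back-projects each depth pixel into 3D, and the motion-tracking pose places those points into a shared world map.

import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    # Back-project a depth image (meters) into 3D points in the camera frame.
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]          # drop pixels with no depth reading

def to_world(points_cam, R, t):
    # Transform camera-frame points into the world frame using the tracked pose (R, t).
    return points_cam @ R.T + t

# Toy example: a flat wall 2 m in front of a camera that has moved 0.5 m to the right.
depth = np.full((4, 4), 2.0)                     # stand-in for a depth frame
R, t = np.eye(3), np.array([0.5, 0.0, 0.0])      # stand-in for the motion-tracking pose
cloud = to_world(depth_to_points(depth, fx=2.0, fy=2.0, cx=2.0, cy=2.0), R, t)
print(cloud.shape)                               # (16, 3) world-space points a map could accumulate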

See the full story here: http://www.forbes.com/sites/davealtavilla/2015/03/23/google-developers-show-impressive-future-of-augmented-reality-on-project-tango-tablet/ 

24Mar/15

Apple Patents A Light-Splitting iPhone Camera Sensor System

[Philip Lelyveld comment: This is the digital version of the three-strip color process used to create color movies in the 1930s. I wonder if anyone in Hollywood will try to invalidate elements of this patent as obvious and pre-existing technology.]

Apple has secured a new patent (via AppleInsider) for a special three-sensor camera designed for thin, wireless devices like the iPhone. The three sensors would each capture a separate color component, divided out by a special light-splitting cube that splits light entering the camera into red, green and blue wavelengths (or another color set).

Why? Better resolution and lower noise, since the sensors don't need special filters or algorithms to separate out the color information of a captured image on a pixel-by-pixel basis. Using this tech, Apple could potentially boost the image quality of its mobile cameras, especially in video capture scenarios.
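
The gain is easiest to see side by side. Here is a purely illustrative sketch (not Apple's design or code) contrasting a conventional single sensor behind a Bayer color-filter mosaic, which records only one color per pixel and must interpolate the other two, with three sensors that each record a complete color plane.

import numpy as np

h, w = 4, 4
scene = np.random.rand(h, w, 3)            # "true" red, green, blue values at every pixel

# Three-sensor approach (the patent's premise): each sensor records a full color plane.
three_sensor = scene.copy()                # complete RGB at every pixel, no interpolation needed

# Conventional single sensor with a Bayer filter: only one color sample per pixel.
bayer = np.zeros((h, w))
bayer[0::2, 0::2] = scene[0::2, 0::2, 1]   # green sites
bayer[0::2, 1::2] = scene[0::2, 1::2, 0]   # red sites
bayer[1::2, 0::2] = scene[1::2, 0::2, 2]   # blue sites
bayer[1::2, 1::2] = scene[1::2, 1::2, 1]   # green sites

# Recovering full RGB from `bayer` requires demosaicing (per-pixel interpolation), which is
# where resolution loss and color noise creep in; the three-plane capture skips that step,
# which is the improvement the patent describes.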

See the full story here: http://techcrunch.com/2015/03/24/apple-patents-a-light-splitting-iphone-camera-sensor-system/?ncid=tcdaily

23Mar/15

‘I was blown away’: Welcome to football’s quarterback revolution

Twenty-nine-year-old Derek Belch, a former kicker and now quality control assistant for the Cardinal, led all the meetings. Belch could've told his audience some stats, like how Kevin Hogan went from completing 64 percent of his passes to 76 percent after the Stanford quarterback started using this headset regularly for about 20 minutes before games. Or that the Cardinal went from averaging 24 points a game to 38 in those final three games. Or that the team finished the year scoring on every one of its last 27 trips to the red zone when its first two units were on the field, which seems even more jaw-dropping when you consider the team was scoring only about 50 percent of the time inside the 20-yard line before that.

Instead, Belch used subtlety to drive home just what their technological breakthrough has done for Stanford football. He revealed a detail that never shows up on a quarterback's stat line and is lost on most in the stadium but is exactly the type of thing coaches love from their QBs. Belch knew one play — a 35-yard handoff to Remound Wright in the Foster Farms Bowl against Maryland with the Cardinal up 28-7 on their first drive of the third quarter — that would resonate inside the NFL world.

The initial play call, "95 Bama," was designed as a strong-side run with one of the key blocks being the wide receiver picking up the strong safety. Problem was, Maryland's SAM linebacker was on the line and the Terps' safety was creeping up as their free safety shifted over. Hogan knew if the Cardinal ran the play as called, the receiver would have no shot to make the block, and Maryland's safety would snuff it out for a 4-yard loss. So Hogan "killed" the call and audibled to another run play where Stanford's guard was designated to kick out that safety and Wright dashed through a clear path of turf. Moments like this get to the essence of elite quarterback play every bit as much as fitting a pass between two defenders or extending a play by dodging a free blitzer. The decision had become second nature for Hogan because he'd seen it, done it, so many times in just that 20-minute session with the headset before the game.

"The first day they filmed, it looked pretty funky," Bloomgren said. "It was like eight GoPros rubber-banded together and I'm thinking, 'What in the heck are we doing?'"

The first two months of the season, Belch and his crew did a lot of experimenting. They had to gauge exactly where to put the tripod. What depth worked best? Snapping the football proved to be problematic. Where should the actual QB stand? They tried having him kneel down in front of the tripod, but that didn't look right. Worse still, Belch says the stitching of the video wasn't clean, so it was blurry and might make it look like Stanford had two right tackles.

The world's leading authority on virtual reality and how the brain functions in a virtual environment, Stanford professor Jeremy Bailenson has been visited by Facebook founder Mark Zuckerberg and Google founder Larry Page, is on Samsung's advisory board, and works with Navy SEALs to help figure out how frogmen can make better decisions underwater. Bailenson and Belch, who once booted the game-winning PAT to cap Stanford's monumental upset of USC as a 42-point underdog, had spent years kicking around the idea of how virtual reality might benefit the football world. They agreed that should be Belch's thesis.

See the full story here: http://www.foxsports.com/college-football/story/stanford-cardinal-nfl-virtual-reality-qb-training-031115

 

23Mar/15

Virtual training could be NFL preparation’s sea change

In the next year or two, an NFL quarterback will step onto the practice field, put on a headset and run through the game plan against a holographic defense that looks, moves and thinks like the upcoming opponent.

At least that's the prediction of Brendan Reilly, the 28-year-old CEO of Eon Sports and a relative football novice who spent most of a year touring the country to show coaches beta versions of his virtual reality software and ask: How do we make this better?

"They all know how to make (players) bigger, faster, stronger," Reilly told USA TODAY Sports recently, sitting in the cramped room inside a shared office space that serves as company headquarters. "It's, how do we take that next leap in understanding? How do we apply fighter pilot training to quarterbacks?

Others are looking to enter the football space, too. Los Angeles-based augmented reality developer Daqri enlisted the help of former NFL punter and noted tech geek Chris Kluwe to get a conference call with the NFL about its Smart Helmet, which was built for the industrial workplace but has a sensor package the company says could be applied to training and practice quickly.

For instance, Reilly said, Philadelphia Eagles coach Chip Kelly wanted his offensive linemen to be able to see and process the game from a three-point stance. So developers integrated various types of motion tracking to follow head movement. Carolina Panthers quarterback Cam Newton wanted an improved field of vision, because turning his head in a real game would tip the defense.

See the full story here: http://www.usatoday.com/story/sports/nfl/2014/12/12/virtual-training-reality-practice-eon-sports-brendan-reilly/20284053/

20Mar/15

Reality Check: Comparing HoloLens and Magic Leap

The most impressive part of the HoloLens demos was the use of sensors to track where I was looking and gesturing, as well as what I was saying. My gaze was effectively a mouse, accurately highlighting what I was looking at. An up-and-down motion with my index finger—dubbed an “air tap” by the HoloLens crew—functioned as the mouse click to do things like paint a fish or place a flag in a certain spot on Mars. (I screwed this up a number of times, mostly because I wasn’t holding my finger up high enough.) Simple voice commands like “copy” and “rotate” worked well, too.
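
To make that interaction model concrete, here is a hypothetical sketch (not Microsoft's HoloLens API; every name below is an illustrative assumption) in which the gazed-at object plays the role of the cursor position, an air tap plays the role of the mouse click, and recognized voice commands act on the current gaze target.

from typing import Callable, Dict, Optional

def handle_input(gaze_target: Optional[str], air_tap: bool, voice: Optional[str],
                 actions: Dict[str, Callable[[], None]]) -> None:
    # Map gaze + gesture + voice to actions, the way a desktop maps hover + click.
    if air_tap and gaze_target in actions:       # the air tap stands in for the mouse click...
        actions[gaze_target]()                   # ...applied to whatever the gaze is resting on
    if voice in ("copy", "rotate"):              # simple voice commands act on the gaze target
        print(f"{voice} applied to {gaze_target}")

# Example frame: the user gazes at a spot on the Mars terrain and air-taps to place a flag.
handle_input(gaze_target="mars_flag_site", air_tap=True, voice=None,
             actions={"mars_flag_site": lambda: print("flag placed")})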

HoloLens is also really good at having virtual objects follow the user around. As I chatted with a Microsoft employee over Skype, the simple diagram he drew about how to connect a light switch hovered in the air near the electrical box on the wall, while his video-chat window remained in my field of view, even as I moved about. This fits neatly with the idea that augmented reality could help employees in the field make repairs to things like air conditioners (see “Augmented Reality Gets to Work”).

It’s clearly incredibly hard to make this kind of stuff work in a convincing way on a headset—once you’ve figured out how to make good-looking virtual images, there’s the task of cramming all of the necessary computer hardware into a wearable device, making sure it looks good as the wearer is walking around, and figuring out a way to power it. This raises big questions about how good augmented reality can really get, and how useful it will be in the near future. If it doesn’t wow you, both in form and function, why would you buy it?

[Philip Lelyveld comment: Apple has recently hired a number of people from the fashion industry.   It is thought that they are working on the design of Apple's VR and AR products.]

See the full story here: http://www.technologyreview.com/news/535806/reality-check-comparing-hololens-and-magic-leap/?utm_campaign=newsletters&utm_source=newsletter-daily-all&utm_medium=email&utm_content=20150320

19Mar/15

WEVR Launches $1 Million VR Grant Program, Up to $50,000 Available for VR Storytellers

[Philip Lelyveld comment: One of the initiatives that the ETC member companies, including the major studios, have asked the ETC to pursue is working with students and faculty to help build out the language of VR storytelling and to experiment with what works. This WEVR effort complements that as a side benefit of WEVR's goal of building out its library of content.]

WEVR, the Venice, Calif. company formerly known as WemoLab, today announced OnWEVR, a new $1 million grant program that could net prospective VR content producers anywhere between $5,000 and $50,000 in project funds.

“There really aren’t many funding opportunities for VR storytellers, so we wanted to create something very lightweight that helps producers, helps us, and helps the VR community.”

The proposed project must be a VR experience a few minutes in length and use WEVR’s VR media player software.

See the full story here: http://www.roadtovr.com/1million-vr-grant-program-wevr-5000-50000-available-vr-storytellers/