Crossrail is Europe's largest infrastructure project. Stretching from Reading and Heathrow in the west, across to Shenfield and Abbey Wood in the east, the new railway will cover over 100km of track including 21km of new twin-bore rail tunnels and ten new stations.
Crossrail and its partners are already pioneering the use of Augmented Reality (AR), and the production of 3D Building Information Modelling (BIM) models is a handover requirement from each of its construction contracts, supporting visualisation of project delivery. This means that Crossrail already has access to rich environments to support AR or Virtual Reality (VR) applications.
In this context, successful applicants might consider how:
● AR could be used for daily site briefings, safety briefings, and to show progress against plan.
● AR / VR could be used to plan testing, allowing visualisation of systems and assets as they are integrated to an operational level.
● AR / VR might be used to improve visualisation of intended operating parameters, and comparison with actual asset operation.
● AR / VR could be used by the operator to assure its own readiness through staff training.
● AR / VR could be used to plan maintenance activities, giving consideration to the safety of the working space and the complexity of the maintenance activity.
Successful applicants should also consider how their solution could be commercialised for other engineering and construction projects.
Company founder Chris White released the first book in his Animal Kingdom series last fall. Since then, he's sold more than 15,000 copies at $29.95 each.
He expects to push out four more Animal Kingdom books this year, with $650,000 in funding from private investors. The new books will feature marine animals, insects, dinosaurs and household pets.
A tech consultant with friends at Pixar animation studios, Chris came up with the idea of using augmented reality to offer what he calls "tomorrow's learning."
White's first book does more than make a pink flamingo rise from the page near a pond. It lets readers feed and pet the animals.
Readers can also see facts about each featured animal, such as its class, habitat, size, diet and population.
The digital images readers see are generated using software called Unity, the same engine used to build video games for PCs, consoles, websites and mobile devices, White said.
Carbon 3D’s process is a variation on a method called stereolithography, which uses projected patterns of ultraviolet light to catalyze the formation of solid polymers from a pool of resin.
In Carbon 3D’s version, the pool of liquid resin sits in a vessel with a window at the bottom. The window is permeable like a contact lens, so it lets in not only light but also oxygen—which inhibits the chemical reaction just enough to prevent the polymer from solidifying on the bottom. That allows Carbon 3D to continuously print one layer on top of the next, which makes the process much faster and the resulting materials stronger, says DeSimone. “It looks like something growing out of a puddle,” he says.
DeSimone says that while most commercial 3-D printing systems have been designed by mechanical engineers, his chemistry focus sets Carbon 3D apart. “We want to offer materials properties that haven’t been seen before,” he says.
Stanford University’s Graduate School of Business in May launched an online certificate program that features customizable avatars for students who attend classes in a virtual space resembling the GSB campus. The Bay Area school joins the Massachusetts Institute of Technology’s Sloan School of Management, which uses similar technology in its executive-education programs.
A handful of other educational institutions and businesses have adopted the virtual-reality technology AvayaLive Engage, according to software company Avaya Inc.
Still, schools using the technology have found they are able to re-create the essence of the on-campus learning experience for students and faculty scattered across the globe.
...the technology made its in-class debut during Hurricane Sandy in October 2012. ... a “perfect opportunity to try using this technology to enable people who couldn’t attend because of the storm to attend virtually.”
The pilot was so successful, Mr. Hirst said, that it was incorporated into several of Sloan’s executive-education courses, including ones on big data and general management, where it caught the attention of Stanford’s GSB executive-education team.
Students and professors say the software, which allows avatars to gesture, jump and run, facilitates social interactions key to group projects and networking that would otherwise be absent from an online-learning experience.
Still, students and faculty say there’s no substitute—virtual or otherwise—for face-to-face interaction. “One drawback is not having the opportunity to really see facial expressions,” said Peter DeMarzo, professor of finance at Stanford’s GSB and faculty director of its remote-certificate program, though he added that a webcam shot of someone’s face could be placed on a wall or elsewhere in the virtual space.
I can focus on the different parts of the images at different depths as I would when gazing at something in real life, so when I look at, say, the chess pieces up close, those in the background look fuzzy, and vice versa when I focus on the pieces in the distance. And I don’t feel nauseous or dizzy like I sometimes do when I’m playing around with virtual reality, especially when looking at objects that are close to my face.
This conflict is what Wetzstein, an assistant professor of electrical engineering, and other researchers at Stanford are trying to solve with the headset I tried on, which they call a light field stereoscope—essentially, a device that uses a stack of two LCDs to show each eye a “light field” that makes virtual images look more natural than they typically do.
In hopes of making the stereoscopic virtual-reality experience more like what you see in real life, the Stanford researchers built a headset that contains two LCDs placed one in front of the other, with a backlight behind them, a spacer between them, and lenses in front of them. It’s connected to a computer that runs software necessary for the system to work.
The computer starts with a 3-D model, which the researchers’ software renders for each eye as a light field—in this case, Wetzstein says, it’s a five-by-five grid of slightly different 2-D images of the model, so 25 images in total for each individual eye. An algorithm uses the light fields to generate two images for each eye, and, for each eye, one of these images is shown on the rear LCD in the headset, while the other is shown on the front LCD. The images enter your pupils and are projected on your retinas.
What you see, Wetzstein says, is an approximation of the light field that’s being optically generated; your eyes can move around freely and focus wherever they want in the virtual space.
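The decomposition described above — a light field per eye, reduced to one image on the rear LCD and one on the front — can be sketched in miniature. The toy below is a hypothetical flatland version (1D views, a fixed per-view shift between layers, and a simple alternating least-squares fit under multiplicative attenuation); it is not the Stanford team's actual algorithm, and the layer geometry, shift model, and update rule are all assumptions for illustration.

```python
import numpy as np

# Flatland sketch: a ray from view u through front pixel x hits the
# rear layer at x + shift(u) (the shift models the spacer between the
# two LCDs).  Perceived intensity is the product of the two
# attenuation values, since stacked LCDs attenuate the backlight.

rng = np.random.default_rng(0)
n_views, n_pix = 5, 64               # 5 views: 1D analogue of the 5x5 grid
shifts = np.arange(n_views) - 2      # per-view disparity between layers

# A synthetic target light field L[u, x] with values in [0, 1].
target = rng.random((n_views, n_pix))

front = np.ones(n_pix)
rear = np.ones(n_pix + n_views)      # padded so x + shift stays in range

def rear_index(x, u):
    return x + shifts[u] + 2         # offset so indices start at 0

for _ in range(50):                  # alternating least-squares updates
    # Update the front layer, holding the rear layer fixed.
    for x in range(n_pix):
        r = np.array([rear[rear_index(x, u)] for u in range(n_views)])
        front[x] = np.clip(target[:, x] @ r / (r @ r + 1e-9), 0.0, 1.0)
    # Update the rear layer, holding the front layer fixed.
    num = np.zeros_like(rear)
    den = np.zeros_like(rear)
    for u in range(n_views):
        for x in range(n_pix):
            i = rear_index(x, u)
            num[i] += target[u, x] * front[x]
            den[i] += front[x] ** 2
    rear = np.clip(num / (den + 1e-9), 0.0, 1.0)

# Reconstruct the light field the two layers would optically produce.
recon = np.array([[front[x] * rear[rear_index(x, u)] for x in range(n_pix)]
                  for u in range(n_views)])
err = float(np.sqrt(np.mean((recon - target) ** 2)))
print(f"RMS reconstruction error: {err:.3f}")
```

A random light field cannot be represented exactly by two layers, so some residual error remains; real scenes, with their smooth depth structure, factor far better, which is what makes the two-LCD design practical.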
A paper about the work will be presented in August at the Siggraph computer graphics and interaction conference in Los Angeles.
See the Siggraph paper here: http://www.computationalimaging.org/publications/the-light-field-stereoscope/
Inside the Box of Kurios is a brand new VR experience from Felix & Paul Studios which gives you a brilliant view of a Cirque du Soleil performance that’s been specially tailored for virtual reality. If you’ve got Gear VR, you’ve got a free ticket to this world-class performance. ... it puts you directly on stage, where you are the focal point of the entire cast of this spectacular performance.
The 10-minute experience was captured in 360-degree audio and stereoscopic video for virtual reality by Paul Raphael and Félix Lajeunesse of Felix & Paul Studios, the same duo behind one of my favorite VR experiences, Strangers with Patrick Watson (also available on Gear VR), and several other live-action VR experiences.
Lauzon says that the experience’s longest segment, captured as a single shot with no edits, is a whopping eight minutes. Quite an impressive feat given that the production eschewed the tools of standard filmmaking—like cuts, camera changes, and camera movement.
See the full story with video here: http://www.roadtovr.com/inside-the-box-of-kurios-cirque-du-soleil-felix-paul-studios/
“We have doubled down on our concentration of virtual fitness,” said Shannon Fable, corporate director of programming. “It’s an electronic way to deliver reliable, affordable group fitness for free.”
Fable, who is based in Boulder, Colorado, said virtual fitness is also popular in personal training, and that kiosks provide classes during off-peak afternoon hours.
Virtual exercise in the form of instructors and classes projected onto a screen has already penetrated some 3,000 clubs worldwide, according to IHRSA (International Health, Racquet & Sportsclub Association).
The DiVE is used by faculty and students from fields ranging from chemistry and biology to history and art. Recent projects have virtually reconstructed ancient Roman communities as well as medieval towns.
Heather Liu, a rising senior double majoring in biomedical engineering and electrical and computer engineering, described her experience as new and unusual.
“It’s an application of all the things you learn in class,” said Liu. “It’s good to see the application of engineering.”
DiVE is open to public use Tuesdays from 4 to 5 p.m. Space is limited. The DiVE welcomes students who are interested in getting involved in virtual reality development and research. Interested students should contact DiVE director Regis Kopper to learn more.
See the full story here: http://today.duke.edu/2015/06/divevisit
Jaunt has announced NEO, a professional-grade 360-degree VR camera system. Jaunt spent two and a half years on R&D to overcome the shortcomings of off-the-shelf VR camera solutions, including those built into previous Jaunt camera rigs. In its press release the company says that “NEO is a flexible system that offers turn-on-and-shoot simplicity for capturing spontaneous events, while also providing full manual control for creatives to set up the perfect shot.” NEO system features include a fully synchronized global-shutter sensor array, HDR, large-format sensors for superior low-light performance, and time-lapse and high-frame-rate capture.

“Every aspect of the NEO design… was engineered with cinematic VR capture in mind,” said Koji Gardiner, Director of Hardware Engineering at Jaunt. "The camera is a very bespoke camera, and it's going to be produced at low volume," says CTO Arthur van Hoff.

In addition to the NEO camera series, Jaunt has also engineered important improvements to its creative workflow. Jaunt Studios partners can edit VR content using tools such as Avid, Premiere, Final Cut Pro X, Nuke, RV, Shotgun, Maya, 3ds Max, After Effects, DaVinci Resolve, Scratch, and Lustre. Jaunt will lease the NEO to Jaunt Studio partners starting in August.
The material for this story came from two URLs:
A new game show pits in-studio contestants against viewers armed with mobile devices as we inch closer to The Running Man.
The idea is that the game play will be a competition between handpicked, in-studio players geared up in motion-capture suits. They will be doing things like navigating an obstacle course, walking a balance beam, or chucking giant eggs into space in what the Future Group describes as an "Angry Birds-style" game—all while people around the world are playing the same game optimized for their stand-alone devices.
"Imagine people in the studio are driving Mario Karts through the streets [of a virtual] New York," Kasin says. "[The studio] audience would see them driving in Mario Karts. The at-home audience could be doing the same task [on their mobile devices], and if you do better or worse than them, you can see [the] people in the studio driving past you."
The Future Group’s idea is that each production company would be able to make choices about how the game is presented. Among them is whether mobile players would be going head-to-head with each other, or only against the studio contestants. Kasin says the TV networks will be able to choose, for example, if they want to create competitions between mobile players by geographical region or other criteria.