philip lelyveld The world of entertainment technology

15 Feb 2019

Maryland iSchool researchers link virtual reality with sight and smell to better understand data

University of Maryland researchers in the Human-Computer Interaction Lab (HCIL) are linking virtual reality with sight and smell to help people better process information. HCIL is jointly supported by the UMD iSchool and the UMD Institute for Advanced Computer Studies.

HCIM student Biswaksen Patnaik and PhD student Andrea Batch are exploring ways to convey information with scent as a complement to the visual representation of data sets.

See the full story here: https://ischools.org/blog/2019/02/14/maryland-ischool-researchers-link-virtual-reality-with-sight-and-smell-to-better-understand-data/

The paper, "Information Olfactation: Harnessing Scent to Convey Data," lays out theories on the different ways people may perceive scent, including intensity, direction, and combined fragrances. The theories are based on benchmarks traditionally used in data visualization and on previous literature in perceptual psychology, the researchers say.
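To make the idea of olfactory encoding channels concrete, here is a minimal sketch of how a numeric data value might be mapped onto scent intensity, direction, and a fragrance blend, loosely mirroring how visual encodings map data to position, size, and color. The channel names, value ranges, and fragrances below are illustrative assumptions, not the paper's actual scheme or API.

```python
# Hypothetical sketch: map a data value onto olfactory "channels".
# Channel names, ranges, and fragrances are assumptions for illustration only.
from dataclasses import dataclass

@dataclass
class OlfactoryEncoding:
    intensity: float        # 0.0-1.0, e.g. diffuser output strength
    direction: float        # angle (degrees) the scent is emitted from
    blend: dict             # fragrance name -> proportion, summing to 1.0

def encode(value: float, vmin: float, vmax: float, angle: float) -> OlfactoryEncoding:
    """Encode a numeric value as scent intensity, with direction and a
    two-fragrance blend carrying additional or redundant cues."""
    t = (value - vmin) / (vmax - vmin)              # normalize to [0, 1]
    return OlfactoryEncoding(
        intensity=t,                                # stronger scent = larger value
        direction=angle,                            # spatial cue, e.g. for category
        blend={"lavender": t, "citrus": 1 - t},     # ratio redundantly encodes value
    )

print(encode(42.0, vmin=0.0, vmax=100.0, angle=90.0))
```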

The paper describes two prototypes the team has built—one for a desktop computer and one for a virtual reality headset—that disperse essential oils through diffusers. They also designed three different types of graph layouts in which smell can assist in conveying data visually to the user.
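As a rough illustration of how a graph layout could be paired with scent output, the sketch below pulses a diffuser when the user points at a node, encoding the node's value as intensity and its degree as pulse duration. The Diffuser class and its behavior are invented for illustration; the story does not document the actual control interface of the team's essential-oil prototypes.

```python
# Hypothetical sketch: drive a (stand-in) scent diffuser from graph node data.
class Diffuser:
    """Stand-in for a real scent diffuser; here it just prints the command
    a physical device would receive."""
    def emit(self, fragrance: str, intensity: float, duration_s: float) -> None:
        print(f"emit {fragrance} at {intensity:.2f} for {duration_s:.1f}s")

# A tiny graph: node id -> (x, y, value); edges as pairs of node ids.
nodes = {"A": (0.0, 0.0, 0.2), "B": (1.0, 0.5, 0.9), "C": (0.5, 1.0, 0.5)}
edges = [("A", "B"), ("B", "C")]

def on_hover(node_id: str, diffuser: Diffuser) -> None:
    """When the user points at a node, map its value to scent intensity
    and its degree (edge count) to pulse duration."""
    _, _, value = nodes[node_id]
    degree = sum(node_id in e for e in edges)
    diffuser.emit("peppermint", intensity=value, duration_s=1.0 + degree)

on_hover("B", Diffuser())
```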

So far the researchers have been experimenting on themselves with ways to convey data via scent using the prototypes. They plan to recruit students for a user study to test how effectively their proposed olfactory encodings convey data through various scents.

See the full story here: http://www.umiacs.umd.edu/about-us/news/hcil-team-combines-smell-sight-better-understand-data
