The Next Privacy Crisis
But most of these AR devices promise a uniquely powerful combination of three features:
- Their hardware is wearable, hands-free, and potentially always on — you don’t have to grab a device and put it away when you’re done using it
- Their images and audio can blend with or compensate for normal sensory perception of the world, rather than being confined to a discrete, self-contained screen
- Their sensors and software can collect and analyze huge amounts of information about their surroundings — through geolocation and depth sensing, computer vision programs, or intimate biometric technology like eye-tracking cameras
...Writer and researcher Erica Neely says that laws and social norms aren’t prepared for how AR could affect physical space....
Augmentation doesn’t just mean adding things to a wearer’s surroundings. It also means letting a computing platform capture and analyze those surroundings without other people’s consent.
Take facial recognition — a looming crisis at the heart of AR. ...
It’s also not clear how AR systems will make money, or what kinds of behaviors the resulting business models will encourage. ...
Even basic AR applications, like mapping an apartment to place a virtual screen, could gather a huge amount of information. (What’s the size of your living space? Which books are on your shelves? How healthy are the snacks on your kitchen counter?) Without robust privacy protections, it will be incredibly tempting to use that data for ads....
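To make that concern concrete, here is a minimal TypeScript sketch of the kind of scene metadata a room-mapping feature could expose, and how little code it would take to turn it into an ad-targeting profile. Every type and function name here (DetectedPlane, RecognizedObject, RoomScan, adProfileFrom) is a hypothetical illustration, not the API of any real AR SDK.

```ts
// Hypothetical sketch only: illustrates what a "place a virtual screen" feature
// could learn about a home. These types are assumptions, not a real AR SDK API.

interface DetectedPlane {
  widthMeters: number;
  heightMeters: number;
  label: "wall" | "floor" | "table" | "unknown";
}

interface RecognizedObject {
  label: string;      // e.g. "bookshelf" or "snack box", as output by a vision model
  confidence: number; // 0..1
}

interface RoomScan {
  planes: DetectedPlane[];
  objects: RecognizedObject[];
}

// Turning a scan into an advertising profile takes only a few lines.
function adProfileFrom(scan: RoomScan): Record<string, string | number> {
  // Estimate living area from detected floor planes.
  const floorArea = scan.planes
    .filter((p) => p.label === "floor")
    .reduce((sum, p) => sum + p.widthMeters * p.heightMeters, 0);

  // Treat confidently recognized household objects as "interests".
  const interests = scan.objects
    .filter((o) => o.confidence > 0.8)
    .map((o) => o.label);

  return {
    estimatedLivingAreaSqM: Math.round(floorArea),
    inferredInterests: interests.join(","),
  };
}
```

The point of the sketch is not that any vendor does this today, but that once the raw scan exists on a platform, the step from "room geometry for rendering" to "household profile for advertising" is trivial unless policy forbids it.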
Facebook Reality Labs announced a set of Responsible Innovation Principles designed to allay fears about trust, privacy, and consent that have dogged the company. It also awarded a series of academic grants to study specific issues in AR, selecting proposals like “Social Tensions with Always-available AR for Accessibility” and “Anticipating Virtual Gossip — What are (Un)Intentional Dark Patterns in a Ubiquitously Augmented Reality?” ...
“It’s very typical that developers of a technology have one kind of idea about what its uptake in society might be, and then the actual uptake turns out to be something quite different,” says University of Washington professor and human-computer interaction expert Batya Friedman. A good system is flexible enough to adapt to these unexpected uses. ...
See the full story here: https://www.theverge.com/22746078/ar-privacy-crisis-rethink-computing