“NeuralDisplay” could make AR less squinty, blurry, and nausea-inducing
... Microdisplay manufacturer Kopin, of Westborough, Mass., working in partnership with MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), may have a solution: the NeuralDisplay. It combines eye tracking with machine learning to compensate for a user’s vision on the fly, without additional optics. ...
The first NeuralDisplay is a 1.5-inch square micro-OLED with a resolution of 3,840 x 3,840 pixels and a maximum brightness of 10,000 candelas per square meter (nits). These specifications place it in league with other leading micro-OLEDs, such as Sony’s 1.3-inch 4K micro-OLED. It also has an unusual quad-pixel arrangement that places red, green, and blue sub-pixels alongside a fourth sub-pixel containing a pixel imager. ...
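Kopin hasn’t published how the panel is addressed, but as a rough mental model the quad-pixel layout can be pictured as a four-channel grid in which the fourth channel stores what the imager senses rather than what the display emits. The array shape, 8-bit depth, and variable names in this sketch are assumptions for illustration only:

    import numpy as np

    # Hypothetical in-memory model of the quad-pixel layout: each cell holds
    # red, green, and blue drive levels plus one reading from the co-located
    # pixel imager. Resolution and 8-bit depth are assumed for illustration.
    PANEL_RES = 3840
    panel = np.zeros((PANEL_RES, PANEL_RES, 4), dtype=np.uint8)

    rgb_subpixels = panel[..., :3]   # emissive sub-pixels driven each frame
    imager_readings = panel[..., 3]  # light sensed back from the viewer's eye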
The pixel imager ... has a different task: to measure the light reflected by the user’s eyes. ... to deduce details about a user’s eyes, including the direction of their gaze, their eye position in relation to the screen, and the dilation of their pupils. ...
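The article doesn’t describe how those details are extracted from the imager readings. The sketch below shows one plausible, deliberately crude way to turn a frame of sensed light into gaze and pupil proxies; the function name, thresholds, and metrics are invented for illustration and are not Kopin’s actual signal processing:

    import numpy as np

    def eye_features_from_imager(imager_frame):
        # Normalize the per-pixel light readings to 0..1.
        norm = imager_frame.astype(np.float32) / 255.0

        # Gaze proxy: intensity-weighted centroid of the reflected light.
        ys, xs = np.mgrid[0:norm.shape[0], 0:norm.shape[1]]
        total = norm.sum() + 1e-6
        gaze_y = float((ys * norm).sum() / total)
        gaze_x = float((xs * norm).sum() / total)

        # Pupil-dilation proxy: fraction of the frame reflecting little light.
        pupil_fraction = float((norm < 0.1).mean())

        return {"gaze_x": gaze_x, "gaze_y": gaze_y,
                "pupil_fraction": pupil_fraction}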
Those measurements feed into an AI model that learns to compensate for the quirks of each user’s vision by adjusting the brightness and contrast of the display. ...
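Kopin hasn’t disclosed the model itself; the sketch below only illustrates the shape of such a closed loop, with a stand-in “model” mapping eye measurements to a brightness gain and contrast factor applied to each frame. Every name and number here is assumed:

    import numpy as np

    def apply_compensation(rgb_frame, gain, contrast):
        # Global brightness gain and contrast stretch around mid-gray.
        out = (rgb_frame.astype(np.float32) - 128.0) * contrast + 128.0
        return np.clip(out * gain, 0.0, 255.0).astype(np.uint8)

    def toy_model(features):
        # Stand-in for the learned model: brighten slightly as the pupil
        # dilates, leave contrast alone. The real model is not public.
        gain = 1.0 + 0.5 * features["pupil_fraction"]
        return gain, 1.0

    # One tick of the closed loop: sense -> infer corrections -> display.
    features = {"gaze_x": 1920.0, "gaze_y": 1920.0, "pupil_fraction": 0.3}
    frame = np.full((1080, 1080, 3), 128, dtype=np.uint8)
    gain, contrast = toy_model(features)
    adjusted = apply_compensation(frame, gain, contrast)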
The NeuralDisplay isn’t a panacea, its makers concede, and its ability to compensate for near- or farsightedness is uncertain. ...
See the full story here: https://spectrum.ieee.org/augmented-reality-display-adapts