iPad and Glasses

The idea is to anticipate how your eyes will naturally distort whatever’s onscreen.

Thanks to new display technology that corrects for vision problems, those of us who need glasses to see a TV or laptop screen clearly could one day ditch the eyeglasses.

The technology uses algorithms to alter an image based on a person’s glasses prescription, combined with a light filter set in front of the display. The algorithm alters the light from each individual pixel so that, when passed through the tiny holes in the plastic filter, rays of light reach the retina in a way that re-creates a sharp image. Researchers say the idea is to anticipate how your eyes will naturally distort whatever’s onscreen — something glasses or contacts typically correct — and adjust it beforehand so that what you see appears clear.
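To make the “undo the blur in advance” idea concrete, here is a minimal sketch in Python. It is not the Berkeley/MIT light-field algorithm described in the paper; it illustrates only the simpler pre-correction idea (closer to the earlier deconvolution-based attempts the researchers improved on), using a Wiener-style inverse filter with an assumed disk-shaped defocus blur. The function names, kernel size, and noise term are all illustrative.

```python
# Sketch: prewarp an image so that a known defocus blur (a stand-in for the
# viewer's uncorrected eye) approximately cancels out. Assumes the blur can
# be modeled as a simple disk PSF derived from a hypothetical prescription.
import numpy as np

def defocus_psf(radius_px: int) -> np.ndarray:
    """Disk-shaped point spread function approximating defocus blur."""
    y, x = np.mgrid[-radius_px:radius_px + 1, -radius_px:radius_px + 1]
    psf = (x**2 + y**2 <= radius_px**2).astype(float)
    return psf / psf.sum()

def prewarp(image: np.ndarray, psf: np.ndarray, noise: float = 1e-2) -> np.ndarray:
    """Wiener-style inverse filter: sharpen the image in advance so that
    blurring by `psf` (the eye) yields roughly the original image."""
    # Pad the PSF to the image size and center it at the origin so the
    # frequency-domain filter does not shift the result.
    padded = np.zeros(image.shape, dtype=float)
    padded[:psf.shape[0], :psf.shape[1]] = psf
    padded = np.roll(padded, (-(psf.shape[0] // 2), -(psf.shape[1] // 2)), axis=(0, 1))
    H = np.fft.fft2(padded)
    G = np.conj(H) / (np.abs(H) ** 2 + noise)   # regularized inverse filter
    out = np.real(np.fft.ifft2(np.fft.fft2(image) * G))
    return np.clip(out, 0.0, 1.0)

# Example: prewarp a grayscale image for an eye whose blur is roughly a
# 5-pixel defocus disk (an assumed value, not from the paper).
image = np.random.rand(256, 256)                # stand-in for a photo
corrected = prewarp(image, defocus_psf(5))
```

The contrast loss the article mentions in earlier approaches comes from exactly this kind of naive inverse filtering; the new method instead prefilters a full light field through the pinhole mask, which is what preserves contrast.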

Brian A. Barsky, a University of California, Berkeley, computer science professor and affiliate professor of optometry and vision science who coauthored the paper, says it’s like undoing what the optics in your eyes are about to do. The technology is being developed in collaboration with researchers at MIT and Microsoft.

In addition to making it easier for people with simple vision problems to use all kinds of displays without glasses, the technique may help those with more serious vision problems caused by physical defects that can’t be corrected with glasses or contacts, researchers say. This includes spherical aberration, which causes different parts of the lens to refract light differently.

While similar methods have been tried before, the new approach produces a sharper, higher-contrast image. A paper on the research will be presented at the annual International Conference and Exhibition on Computer Graphics and Interactive Techniques, also known as Siggraph, in Vancouver, Canada, in August.

For the paper, researchers took images of things like a rainbow-colored hot-air balloon and a detail of a Vincent van Gogh self-portrait and applied algorithms that warped each image according to the specific eye condition being corrected. They then showed the images on an iPod Touch, to whose display they had affixed an acrylic slab topped with a plastic screen pierced with thousands of tiny, evenly spaced holes.

Gordon Wetzstein, who coauthored the paper while a research scientist at MIT’s Media Lab, says the screen allows a regular two-dimensional display to work as what’s known as a “light field display.” This means the screen controls the way individual light rays emanate from the display, leading to a sharper image without degrading contrast.
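A rough way to see why a pinhole mask does this: each display pixel viewed through a given pinhole sends light out along one specific direction, so setting per-pixel values amounts to setting per-ray values. The sketch below works through that geometry with made-up dimensions; only the 326-ppi pixel density comes from the article, and it ignores refraction in the acrylic, the finite hole size, and the viewer’s exact eye position.

```python
# Sketch of pinhole-mask geometry: which ray direction each pixel behind a
# single pinhole controls. Pinhole pitch and gap are illustrative guesses.
import numpy as np

pixel_pitch_mm = 25.4 / 326   # ~326 ppi display, as cited in the article
pinhole_pitch_mm = 0.39       # spacing between mask holes (assumed)
gap_mm = 5.0                  # acrylic spacer between screen and mask (assumed)

def ray_angle_deg(pixel_offset_mm: float) -> float:
    """Direction of the ray leaving a pinhole, for a pixel offset from the
    point directly behind that pinhole."""
    return float(np.degrees(np.arctan2(pixel_offset_mm, gap_mm)))

# Pixels covered by one pinhole, and the ray direction each one controls.
pixels_per_pinhole = int(round(pinhole_pitch_mm / pixel_pitch_mm))
for i in range(pixels_per_pinhole):
    offset = (i - pixels_per_pinhole // 2) * pixel_pitch_mm
    print(f"pixel {i}: ray at {ray_angle_deg(offset):+.2f} degrees")
```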

The researchers tested their device using a Canon DSLR camera with its focus set to simulate vision problems like farsightedness.

Wetzstein says the next step is to build prototype displays that people can use in the real world — something he expects could take a few years.

There are still challenges to work out. For instance, the correction is calculated for a particular eye position, so the setup the researchers tested requires whoever’s using it to keep their eyes still, or for software to track head movement and adjust the image accordingly. Barsky expects this won’t be much of a problem, though, saying that when we look at a display that doesn’t look right, we tend to move around naturally to bring it into focus.

And while the technology can be adjusted for different viewers, it won’t currently work simultaneously for several people with different vision needs. However, Ramesh Raskar, an associate professor at the MIT Media Lab who coauthored the paper, says that if researchers used a display with a high enough resolution (about double the 326 pixels per inch of the iPod Touch used in the paper), the technology could serve more than one viewer at once.
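A back-of-the-envelope check on that figure, using the same assumed pinhole pitch as the sketch above: doubling the pixel density roughly doubles the number of distinct rays each pinhole can emit, which is what would, in principle, let those rays be divided between two viewers with different prescriptions.

```python
# Rough consistency check of the "about double the resolution" claim.
# Pinhole pitch is the same illustrative value as above, not from the paper.
pinhole_pitch_mm = 0.39

def rays_per_pinhole(ppi: float) -> int:
    pixel_pitch_mm = 25.4 / ppi
    return int(round(pinhole_pitch_mm / pixel_pitch_mm))

print(rays_per_pinhole(326))   # ~5 rays per pinhole -> one corrected viewer
print(rays_per_pinhole(652))   # ~10 rays per pinhole -> could split between two
```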

Via Mashable