Researchers at the University of Maryland have developed a technique that reconstructs a person's surroundings from the reflections in their eyes. Using Neural Radiance Fields (NeRF), an AI method that builds 3D scenes from 2D photos, the team analyzed the subtle reflections of light captured in human eyes to recover the environment in front of the subject. While the results showed promise in a controlled setting, the technology is still far from practical use: challenges such as noise from the cornea and limited sensor resolution must be addressed before it can work in real-world scenarios. Even so, the researchers believe their work can inspire future advances in 3D scene reconstruction.
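The article mentions NeRF only in passing. As a point of reference for how NeRF-style reconstruction works in general (not the Maryland team's specific pipeline, which is not described here), the sketch below shows the core volume-rendering step: compositing a color along a camera ray from densities and colors predicted at sample points. The function name `composite_along_ray` and the toy density field are illustrative assumptions, not code from the paper.

```python
import numpy as np

def composite_along_ray(sigmas, colors, deltas):
    """Discrete NeRF-style volume rendering along a single ray.

    sigmas: (N,) densities predicted at N sample points along the ray
    colors: (N, 3) RGB colors predicted at those points
    deltas: (N,) distances between consecutive samples
    Returns the composited RGB color seen along the ray.
    """
    # Opacity contributed by each segment: alpha_i = 1 - exp(-sigma_i * delta_i)
    alphas = 1.0 - np.exp(-sigmas * deltas)
    # Transmittance: fraction of light that reaches sample i unoccluded
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alphas[:-1]]))
    # Per-sample blending weights, then weighted sum of colors
    weights = trans * alphas
    return (weights[:, None] * colors).sum(axis=0)

# Toy example: 64 samples along one ray through a made-up density field
n = 64
sigmas = np.linspace(0.0, 2.0, n)           # density ramps up with depth
colors = np.tile([0.2, 0.5, 0.8], (n, 1))   # constant bluish color
deltas = np.full(n, 0.05)                    # uniform sample spacing
print(composite_along_ray(sigmas, colors, deltas))
```

In a full NeRF, the densities and colors come from a neural network queried at each sample point, and many such rays (here, rays reflected off the cornea) are rendered and compared against the input photos to optimize the scene.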