Simulated human eye movements aim to train metaverse platforms

Engineers develop “virtual eyes” that closely mimic the behavior of the human eye.

Computer engineers at Duke University, funded by the US National Science Foundation, have developed virtual eyes that simulate how humans see the world. The virtual eyes are accurate enough for companies to use them to train virtual reality and augmented reality applications.

"virtual eyes" Replicate how human eyes track and react to stimuli.

“Virtual eyes” replicate how human eyes track and respond to stimuli. Image credits: Pxhere, CC0 Public Domain

“The project aims to capture additional information for better mobile augmented reality using the Internet of Things, and to make mobile augmented reality more reliable and accessible for real-world applications,” said Prabhakaran Balakrishnan, a program director in NSF’s Division of Information and Intelligent Systems.

The program, EyeSyn, will help developers build applications for the rapidly expanding metaverse while protecting user data. The results of the study will be presented at the upcoming International Conference on Information Processing in Sensor Networks.

“If you’re interested in finding out whether someone is reading a comic book or advanced literature just by looking at their eyes, you can do that,” said Maria Gorlatova, one of the study’s authors.

“But training that kind of algorithm requires data from hundreds of people wearing headsets at a time. We wanted to develop software that not only addresses the privacy concerns that come with collecting that kind of data, but also allows smaller companies that don’t have those levels of resources to get into the metaverse game.”

Eye movements contain data that reveal information about responses to stimuli, emotional state, and concentration. The team of computer engineers developed virtual eyes that were trained by artificial intelligence to mimic the movements of human eyes reacting to various stimuli.
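The article does not spell out EyeSyn’s internal model, but eye movements of this kind are commonly described as alternating fixations (the gaze holding on a point) and saccades (rapid jumps between points). The following is a minimal Python sketch of that idea, assuming a simple fixation/saccade generator; the function name, sampling rate, and all distribution parameters are illustrative inventions, not EyeSyn’s actual values.

```python
import numpy as np

def synthetic_gaze_trace(n_events=50, screen=(1920, 1080), rng=None):
    """Toy gaze trace built from alternating fixations and saccades.

    Fixation durations follow a log-normal distribution and the gaze
    point jumps to a new random target between fixations. Every
    parameter here is illustrative, not taken from EyeSyn.
    """
    rng = rng or np.random.default_rng()
    samples = []
    point = rng.uniform([0, 0], screen)                 # initial gaze position
    for _ in range(n_events):
        duration = rng.lognormal(mean=-1.3, sigma=0.5)  # ~0.27 s median fixation
        n = max(1, int(duration * 100))                 # assume 100 Hz sampling
        jitter = rng.normal(0, 1.5, size=(n, 2))        # small in-fixation drift
        samples.append(point + jitter)
        point = rng.uniform([0, 0], screen)             # saccade to a new target
    return np.vstack(samples)  # (N, 2) array of x, y gaze positions

trace = synthetic_gaze_trace(rng=np.random.default_rng(0))
print(trace.shape)
```

A real generator would condition these statistics on the content being viewed, which is the part the AI training described above would supply.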

The virtual eyes could serve as a blueprint for using AI to train metaverse platforms and software, potentially leading to algorithms tailored to a specific individual. They could also be used to guide content design by measuring engagement responses.

“If you give EyeSyn several different inputs and run it long enough, you will produce a data set of synthetic eye movements that is large enough to train a [machine learning] classifier for a new program,” said Gorlatova.
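To make Gorlatova’s point concrete, here is a hedged sketch of how a small company might train a classifier on such a synthetic dataset; the two activities, the features, and all class-conditional statistics are invented for illustration, and the code uses scikit-learn rather than anything from an EyeSyn release.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

def synthetic_features(n, mean_fix, mean_sacc):
    """Per-trial features: mean fixation duration (s) and mean saccade
    amplitude (px). The class-conditional means are invented."""
    return np.column_stack([
        rng.normal(mean_fix, 0.05, n),
        rng.normal(mean_sacc, 20.0, n),
    ])

# Assumed (not measured): dense text yields longer fixations and
# shorter saccades than comic reading.
X = np.vstack([synthetic_features(500, 0.30, 120),   # label 0: literature
               synthetic_features(500, 0.22, 180)])  # label 1: comics
y = np.array([0] * 500 + [1] * 500)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```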

When testing the accuracy of the virtual eyes, the engineers compared their behavior to that of human eyes viewing the same stimuli. The results demonstrated that the virtual eyes closely simulated the movements of human eyes.
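The article does not say which metrics the engineers used for this comparison. One plausible check, sketched below under that assumption, is to compare the distribution of fixation durations from a human recording and a virtual one with a two-sample Kolmogorov-Smirnov test; the duration samples here are random placeholders, not real measurements.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(1)

# Placeholder samples standing in for measured fixation durations (s);
# in a real evaluation these would come from an eye tracker and EyeSyn.
human_fixations = rng.lognormal(-1.3, 0.5, 400)
virtual_fixations = rng.lognormal(-1.25, 0.5, 400)

# The two-sample KS test asks whether both sets of durations could
# plausibly come from the same underlying distribution.
stat, p = ks_2samp(human_fixations, virtual_fixations)
print(f"KS statistic = {stat:.3f}, p = {p:.3f}")
```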

“The synthetic data alone are not perfect, but they are a good starting point,” Gorlatova said. “Smaller companies can use them instead of spending the time and money trying to build their own real-world datasets [with human subjects]. And since the algorithms can be personalized on the local system, people don’t have to worry about their private eye movement data becoming part of a larger database.”
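The quote’s last point, personalizing algorithms on the local system, could work as incremental fine-tuning of a model pretrained on synthetic data. Below is a speculative sketch of that pattern using scikit-learn’s partial_fit; the feature dimensions and data are fabricated stand-ins, and nothing here reflects EyeSyn’s actual pipeline.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

rng = np.random.default_rng(2)

# Pretrain on bulk synthetic data (a stand-in for EyeSyn output).
X_syn = rng.normal(0.0, 1.0, (2000, 4))
y_syn = (X_syn[:, 0] > 0).astype(int)
clf = SGDClassifier(loss="log_loss", random_state=0)
clf.partial_fit(X_syn, y_syn, classes=[0, 1])

# Personalize on a handful of the user's own samples; the raw data
# never leaves the device, only the local model weights change.
X_user = rng.normal(0.3, 1.0, (50, 4))
y_user = (X_user[:, 0] > 0.3).astype(int)
clf.partial_fit(X_user, y_user)
print("updated coefficients:", clf.coef_.round(2))
```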

Source: NSF

