MIT.nano has announced its next round of seed grants to support hardware and software research related to sensors, 3D/4D interaction and analysis, augmented and virtual reality (AR/VR), and gaming. The grants are awarded through the MIT.nano Immersion Lab Gaming Program, a four-year collaboration between MIT.nano and NCSOFT, a digital entertainment company and founding member of the MIT.nano Consortium.
“We are delighted to be able to continue to support research at the intersection of the physical and the digital thanks to this collaboration with NCSOFT,” says MIT.nano Associate Director Brian W. Anthony, who is also a principal research scientist in mechanical engineering and the Institute for Medical Engineering and Science. “These projects are just a few examples of ways researchers at MIT are exploring how new technologies can change how humans interact with the world and with each other.”
The MIT.nano Immersion Lab is a two-story immersive space dedicated to observing, understanding, and interacting with big data and synthetic environments. Outfitted with hardware and software tools for motion capture, photogrammetry, and 4D experiences, and supported by expert technical staff, this open-access facility is available for use by any MIT student, faculty member, or researcher, as well as outside users.
This year, three projects have been selected to receive seed grants:
Ian Condry: Innovations in spatial audio and immersive sound
Professor of Japanese culture and media studies Ian Condry is exploring spatial sound research and technology for video gaming. Specifically, Condry and co-investigator Philip Tan, research scientist and creative director at the MIT Game Lab, hope to develop software that links the “roar of the crowd” to online gameplay and e-sports, so that players and spectators can hear, and participate in, the sound.
Condry and Tan will use the MIT Spatial Sound Lab’s object-based mixing technology, combined with the Immersion Lab’s tracking and playback capabilities, to collect data and compare different approaches to immersive audio. Both see the project as likely to enhance fully immersive “in real life” gaming experiences with 360-degree video, as well as mixed gaming, in which online and in-person players can attend the same event and interact with one another.
Robert Hupp: Immersive athlete-training techniques and data-driven coaching support in fencing
Seeking to improve the athlete training, practice, and coaching experience to maximize learning while minimizing injury risk, MIT assistant fencing coach Robert Hupp aims to advance fencing pedagogy through extended reality (XR) technology and biomechanical data.
Hupp, who is working with MIT.nano Immersion Lab staff, says preliminary data suggest that technology-assisted self-motion exercises can make a fencer’s movements more compact, and that practicing in an immersive environment can improve reactive techniques. He spoke about data-driven coaching support and athlete training at the MIT.nano IMMERSED seminar in September 2021.
With this seed grant, Hupp plans to develop an immersive training system for self-paced athlete learning, build biofeedback systems to support coaches, conduct scientific studies to track athletes’ progress, and advance the current understanding of opponent interactions. He envisions the work having implications for athletics, biomechanics, and physical therapy, and the use of XR technology for training could expand to other sports.
Jeehwan Kim: Next-generation human/computer interfaces for advanced AR/VR gaming
The most widely used user-interaction methods for AR/VR gaming are gaze and motion tracking. However, according to Jeehwan Kim, associate professor of mechanical engineering, current state-of-the-art devices fail to deliver a truly immersive AR/VR experience because of their limitations in size, power consumption, intelligibility, and reliability.
Kim, who is also an associate professor of materials science and engineering, has proposed a microLED/pupillary dilation (PD)-based gaze tracker and an electronic-skin-based, controller-free motion tracker for next-generation AR/VR human-computer interfaces. Kim’s gaze tracker is more compact and consumes less energy than traditional trackers, and it can be integrated into see-through displays and used to develop compact AR glasses. The e-skin motion tracker can conform to human skin and accurately detect human motion, which Kim says will facilitate more natural human interaction with AR/VR.
This is the third year of seed grant awards from the MIT.nano Immersion Lab Gaming Program. In the program’s first two calls for proposals, in 2019 and 2020, 12 projects from five departments were awarded a combined $1.5 million in research funding. The joint proposal-selection process by MIT.nano and NCSOFT ensures that awarded projects pursue industrially impactful advances, and that MIT researchers stay in contact with technical partners at NCSOFT during the seed grant period.