Researchers are developing a new technology that uses hand gestures to perform commands on a computer.
The prototype, dubbed "Typealike", works via a regular laptop webcam fitted with a small affixed mirror. The program recognizes the user's hands on or near the keyboard and triggers operations based on different hand positions.
For example, a user can place their right hand, thumb up, beside the keyboard, and the program will recognize this as a signal to raise the volume. Different gestures and different combinations of gestures can be programmed to perform a variety of operations.
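The mapping from recognized hand postures to commands can be sketched as a simple dispatch table. This is an illustrative sketch only: the gesture names and actions below are hypothetical, not Typealike's actual gesture set.

```python
# Hypothetical sketch: dispatching recognized gesture labels to commands.
# Gesture names and actions are illustrative, not Typealike's real set.
from typing import Callable, Dict

def raise_volume() -> str:
    return "volume +10%"

def mute() -> str:
    return "muted"

def take_screenshot() -> str:
    return "screenshot saved"

# Map each recognized hand posture to a command handler.
GESTURE_COMMANDS: Dict[str, Callable[[], str]] = {
    "right_thumb_up_beside_keyboard": raise_volume,
    "flat_palm_on_desk": mute,
    "two_fingers_beside_keyboard": take_screenshot,
}

def dispatch(gesture: str) -> str:
    """Run the command bound to a recognized gesture, if any."""
    handler = GESTURE_COMMANDS.get(gesture)
    return handler() if handler else "no action"

print(dispatch("right_thumb_up_beside_keyboard"))  # volume +10%
```

Because the mapping is just data, users (or researchers) can rebind gestures to different operations without touching the recognition code.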
Innovation in the field of human-computer interaction aims to make the user experience faster and easier, with less reliance on keyboard shortcuts, a mouse, or a trackpad.
"It started with a simple idea about new ways to use a webcam," said Nalin Chhibber, a recent master's graduate from the University of Waterloo's Cheriton School of Computer Science. "The webcam is pointed at your face, but most interactions that happen on a computer are around your hands. So we thought, what can we do if a webcam can pick up on hand gestures?"
Early insights led to the development of a small mechanical attachment that redirects the webcam downwards toward the hands. The team then created a software program capable of understanding different hand gestures under variable conditions and for different users. The team used machine learning techniques to train the Typealike program.
"It's a neural network, so you need to show the algorithm examples of what you're trying to detect," said Fabrice Matulic, senior researcher at Preferred Networks Inc. and a former postdoctoral researcher at Waterloo. "Some people make gestures a little bit differently, and hands vary in size, so you have to collect a lot of data from different people with different lighting conditions."
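The training process described above can be illustrated with a minimal sketch: fit a classifier to labeled examples of gesture features. This is not the authors' actual model; it uses synthetic stand-in data and a plain softmax classifier in NumPy, where a real system would train a neural network on webcam frames collected from many users under varied lighting.

```python
# Illustrative sketch (not Typealike's actual model): training a tiny
# softmax classifier on synthetic "hand feature" data with NumPy.
import numpy as np

rng = np.random.default_rng(0)

n_classes, n_features = 3, 16           # e.g. 3 gestures, 16-dim hand features
X = rng.normal(size=(300, n_features))  # stand-in for extracted hand features
true_W = rng.normal(size=(n_features, n_classes))
y = (X @ true_W).argmax(axis=1)         # synthetic gesture labels

W = np.zeros((n_features, n_classes))
lr = 0.1
for _ in range(500):                    # gradient descent on cross-entropy loss
    logits = X @ W
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)
    onehot = np.eye(n_classes)[y]
    grad = X.T @ (probs - onehot) / len(X)
    W -= lr * grad

accuracy = ((X @ W).argmax(axis=1) == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```

The point Matulic makes applies directly here: the classifier only generalizes to gestures performed differently, by different hands, in different lighting, if the training data covers that variation.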
The team recorded a database of hand gestures with dozens of research volunteers. To help the team make the program as functional and versatile as possible, the volunteers also completed tests and surveys.
"We're always ready to make things that people can easily use," said Daniel Vogel, an associate professor of computer science at Waterloo. "People see something like Typealike, or other new technology in the field of human-computer interaction, and they say it makes sense. That's what we want. We want to make technology that's intuitive and straightforward, but sometimes it takes a lot of complicated research and sophisticated software to do this."
The researchers say there are further applications for the Typealike approach, such as in virtual reality, where it could eliminate the need for hand-held controllers.
The study, "Typealike: Near-Keyboard Hand Postures for Extended Laptop Interaction", authored by Chhibber, Matulic, Vogel and team member Hemant Bhaskar Surale, was recently published in the journal Proceedings of the ACM on Human-Computer Interaction.
Nalin Chhibber et al, Typealike: Near-Keyboard Hand Postures for Extended Laptop Interaction, Proceedings of the ACM on Human-Computer Interaction (2021). DOI: 10.1145/3486952
Provided by University of Waterloo
Citation: System recognizes hand gestures to extend computer input on a keyboard (2022, January 5). Retrieved on 30 March 2022 from https://techxplore.com/news/2022-01-gestures-keyboard.html
This document is subject to copyright. No part may be reproduced without written permission, except for any fair use for the purpose of personal study or research. The content is provided for information purposes only.