One of the open problems in augmented reality is designing a user interface suited to headset environments. This report presents a hands-free UI for an AR headset that uses the wearer's facial expressions to convey user intentions. The wearer's facial movements are detected by a custom-built sensor that monitors skin deformation via the infrared diffusion characteristics of human skin. A deep neural network classifier then maps the skin-deformation data to the user's intended gestures, which serve as input commands.
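The pipeline described above (sensor readings in, gesture command out) can be sketched as a small feed-forward classifier. This is a minimal illustration only: the gesture labels, sensor channel count, and layer sizes are assumptions, and the random weights stand in for a trained model, not the paper's actual network.

```python
import numpy as np

# Hypothetical sketch of the gesture-classification step: map one frame of
# skin-deformation sensor readings to a gesture command with a tiny MLP.
# Labels, channel count, and sizes below are illustrative assumptions.

GESTURES = ["neutral", "smile", "frown", "jaw_clench"]  # assumed gesture set
N_CHANNELS = 8  # assumed number of IR skin-deformation channels

rng = np.random.default_rng(0)

# Randomly initialised weights stand in for a trained model.
W1 = rng.standard_normal((N_CHANNELS, 16)) * 0.1
b1 = np.zeros(16)
W2 = rng.standard_normal((16, len(GESTURES))) * 0.1
b2 = np.zeros(len(GESTURES))

def classify(reading: np.ndarray) -> str:
    """Map one vector of skin-deformation readings to a gesture label."""
    h = np.maximum(reading @ W1 + b1, 0.0)  # ReLU hidden layer
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()                    # softmax over gesture classes
    return GESTURES[int(np.argmax(probs))]

# Simulated sensor frame (e.g. normalised IR diffusion intensities).
frame = rng.uniform(0.0, 1.0, N_CHANNELS)
print(classify(frame))
```

In a real system the weights would come from training on labelled recordings of the wearer's facial movements, and the predicted label would be dispatched as a UI command.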
Source link: https://doi.org/10.3390/s19204441