US researchers have presented a new way of operating smartphones using eye contact and gestures, which could make the devices easier to use, particularly for owners of large smartphones. A kind of cursor is moved across the screen by eye tracking. To trigger actions, the user swings the device to the left or right or pulls it toward their body. In this way, one-handed operation becomes possible again, as it used to be with smaller smartphones.
Realized with standard hardware
The special thing about EyeMU, developed by the Future Interfaces Group at the Human-Computer Interaction Institute (HCII) of Carnegie Mellon University in Pittsburgh, is that the prototype was realized with commercially available smartphones.
According to the researchers, the front-facing camera and the built-in motion sensors proved to be sufficiently powerful. The software that controls the on-screen cursor was developed by the team itself, after Google’s Face Mesh tool had first been used to study users’ gaze patterns.
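The article does not describe how the gaze estimates are turned into cursor motion. As an illustration only: raw gaze estimates from a front camera are noisy, so a gaze-driven cursor is typically stabilized with a filter before being drawn. The exponential moving average below is one common choice; the class name, parameters, and smoothing factor are assumptions for this sketch, not details of EyeMU itself.

```python
# Illustrative sketch: smoothing noisy, normalized gaze estimates into
# stable on-screen cursor coordinates. Not EyeMU's actual filtering,
# which the article does not specify.

class GazeCursor:
    def __init__(self, width, height, alpha=0.3):
        self.width, self.height = width, height  # screen size in pixels
        self.alpha = alpha  # smoothing factor: lower = smoother but laggier
        self.x = self.y = None

    def update(self, gaze_x, gaze_y):
        """Feed a normalized gaze estimate (0..1 each axis); return pixel coords."""
        px, py = gaze_x * self.width, gaze_y * self.height
        if self.x is None:
            # First sample initializes the cursor directly.
            self.x, self.y = px, py
        else:
            # Exponential moving average toward the new estimate.
            self.x += self.alpha * (px - self.x)
            self.y += self.alpha * (py - self.y)
        return round(self.x), round(self.y)
```

With `alpha=0.5`, a cursor at the screen center moves only halfway toward a sudden gaze jump per frame, which damps the jitter inherent in camera-based gaze estimation.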
“We asked ourselves: Is there a more natural mechanism for interacting with the phone?”, says Karan Ahuja, a PhD student in human-computer interaction, describing how EyeMU came about. Touch and voice, today’s common ways of controlling smartphones, are already considered natural interfaces. But before people touch something, they look at it, the researchers concluded.
Why eye control has failed so far
Many attempts at eye control fail because of the so-called Midas touch problem, says Professor Chris Harrison. If an action were triggered everywhere the user looked, too many applications would open at the same time.
To avoid this, EyeMU uses the eyes only for selection; the action itself is triggered by a hand gesture. For example, the user selects a photo with their gaze and enlarges it by pulling the smartphone toward their body.
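The division of labor described above can be sketched as a small state machine: gaze only updates which target is selected, and nothing fires until a deliberate motion gesture arrives. All names, coordinates, and thresholds below are invented for illustration; they are not taken from the actual EyeMU implementation.

```python
# Hypothetical sketch of the gaze-select / gesture-trigger pattern that
# sidesteps the Midas touch problem: looking at a target highlights it,
# but only an explicit pull-toward-body gesture triggers the action.

from dataclasses import dataclass

@dataclass
class Target:
    name: str
    x: float          # target center, normalized screen coords (0..1)
    y: float
    half: float = 0.1 # half-width of the square hit box

class GazeGestureController:
    def __init__(self, targets):
        self.targets = targets
        self.selected = None

    def on_gaze(self, gx, gy):
        """Update the selection from an estimated gaze point.

        Looking at a target only selects it -- nothing fires yet."""
        self.selected = None
        for t in self.targets:
            if abs(gx - t.x) <= t.half and abs(gy - t.y) <= t.half:
                self.selected = t
                break
        return self.selected

    def on_motion(self, accel_z, threshold=2.0):
        """Fire only on a deliberate pull gesture, modeled here as a
        strong acceleration along the z axis (toward the body)."""
        if self.selected is not None and accel_z > threshold:
            return f"zoom:{self.selected.name}"
        return None
```

In this model, merely glancing across several targets never opens anything: each glance just moves the selection, and the gesture threshold acts as the explicit "click" that eye-only interfaces lack.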