
Android eye tracking

Earlier this week, Google introduced Look to Speak, an experimental Android app that brings input access to those who can't speak or physically type. Look to Speak uses the camera of an Android phone to track the eye gaze of a user, allowing them to type or select a button by interpreting where they're looking. Based on the video demonstration, it appears amazing.
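Google hasn't published the internals of Look to Speak, but the basic selection loop is easy to picture. Here's a minimal Kotlin sketch using ML Kit's face detector, with head pose standing in for true eye-gaze estimation; the GazeSelector class, the thresholds, and the callback are my own inventions for illustration.

    import com.google.mlkit.vision.common.InputImage
    import com.google.mlkit.vision.face.FaceDetection
    import com.google.mlkit.vision.face.FaceDetectorOptions

    // Rough stand-in for gaze-based selection: a sustained glance to one
    // side of the screen picks the option on that side.
    class GazeSelector(private val onSelect: (String) -> Unit) {

        private val detector = FaceDetection.getClient(
            FaceDetectorOptions.Builder()
                .setPerformanceMode(FaceDetectorOptions.PERFORMANCE_MODE_FAST)
                .build()
        )

        // Head yaw (degrees) treated as a deliberate glance; which sign maps
        // to which side of the screen is a detail you'd calibrate per device.
        private val yawThresholdDegrees = 20f
        private val framesToConfirm = 5  // debounce: require a held glance
        private var heldFrames = 0
        private var lastSide: String? = null

        fun processFrame(image: InputImage) {
            detector.process(image).addOnSuccessListener { faces ->
                val face = faces.firstOrNull() ?: return@addOnSuccessListener
                val yaw = face.headEulerAngleY
                when {
                    yaw > yawThresholdDegrees -> hold("left")
                    yaw < -yawThresholdDegrees -> hold("right")
                    else -> { heldFrames = 0; lastSide = null }
                }
            }
        }

        private fun hold(side: String) {
            if (side != lastSide) { lastSide = side; heldFrames = 0 }
            if (++heldFrames >= framesToConfirm) {
                heldFrames = 0
                onSelect(side)  // e.g., speak the phrase shown on that side
            }
        }
    }

A held glance across several consecutive frames acts as the debounce, so a quick look around the room doesn't trigger a selection.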

And while I don't want to discount the positive accessibility aspect for the intended use, I can't help but think of how useful this could be in the smart home. I've written about various user interfaces several times prior, and my take is: the more the better for an improved smart home experience.

Voice is a great invisible interface, for example, letting us command the devices in our homes whether we use Alexa, the Google Assistant, or Siri. But there are times and situations where voice isn't ideal. I don't want to speak to a smart speaker, for example, when my wife has fallen asleep next to me on the couch. Using a smartphone for such controls is always an option, provided that the phone is within reach or in a pocket. I look at using phone apps for smart home control as an immature and non-optimal experience, though; phone apps were the first smart home interface, at least until digital assistants came along. And while gestures are a potential option for controlling devices in the future, the technology isn't quite mature enough yet.

So when I see something like eye-tracking for input, I get excited. Implementing a Look to Speak-like experience isn't much of a stretch for the smart displays of today. Many already have cameras for video calls, and some are even advanced enough to use software that tracks you around a room. The Google Nest Hub Max has a huge display, facial recognition, and costs $229. This year, Google's smart displays even gained the ability to recognize which family member is looking at the hardware. That provides useful context to show information relevant specifically to that person. Indeed, at our Level Up the Smarthome event in October, Jake Sprouse, Head of Technology at Synapse, shared thoughts on such context and the potential smart interfaces that could use it.
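To make that per-person context concrete, here's a toy Kotlin sketch of what it could look like in code; the member names and tiles are invented, and a real display would pull calendars, routines, and device states instead.

    data class Tile(val title: String)

    // Invented per-member content keyed by whoever the display recognizes.
    val tilesByMember = mapOf(
        "alice" to listOf(Tile("Your calendar"), Tile("Morning run playlist")),
        "bob" to listOf(Tile("Commute traffic"), Tile("Cooktop controls"))
    )

    val defaultTiles = listOf(Tile("Weather"), Tile("Shared timers"))

    // null means "nobody recognized", so fall back to shared content.
    fun tilesFor(recognizedMember: String?): List<Tile> =
        tilesByMember[recognizedMember] ?: defaultTiles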

If the system knows where I am and knows that I'm in front of the cooktop… it knows that I'm cooking. Then when I say "Turn the burner on high," I don't need to say a wake word. My Google Nest device already surfaces several contextual touch tiles when it sees me walk up to it. That type of context is what I'm envisioning if Google's eye-gaze input technology could be adopted for current or future smart displays.
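As a back-of-the-napkin illustration of that gating logic, here's a hypothetical Kotlin sketch: only route an utterance to the assistant without a wake word when presence and gaze both say the user is engaged. Every name here (ContextualVoiceGate, HomeContext, the zones) is invented; nothing like this is a published Google API.

    enum class Zone { COOKTOP, COUCH, UNKNOWN }

    // Hypothetical bundle of context signals a smart display could combine.
    data class HomeContext(
        val userIsLookingAtDisplay: Boolean,  // e.g., from camera-based gaze detection
        val zone: Zone                        // e.g., from room-level presence sensing
    )

    class ContextualVoiceGate(private val assistant: (String) -> Unit) {

        fun onUtterance(text: String, ctx: HomeContext) {
            val engaged = ctx.userIsLookingAtDisplay && ctx.zone != Zone.UNKNOWN
            when {
                // Engaged user: treat speech as a command, no wake word needed.
                engaged -> assistant(text)
                // Otherwise fall back to today's behavior: require the wake word.
                text.startsWith("ok google", ignoreCase = true) ->
                    assistant(text.substring("ok google".length).trim())
                // Anything else is background conversation: ignore it.
            }
        }
    }

    fun main() {
        val gate = ContextualVoiceGate { cmd -> println("Assistant runs: $cmd") }
        // Standing at the cooktop, looking at the display: no wake word required.
        gate.onUtterance(
            "Turn the burner on high",
            HomeContext(userIsLookingAtDisplay = true, zone = Zone.COOKTOP)
        )
    }

The interesting design choice is the fallback: without the extra context, the gate degrades gracefully to today's behavior of requiring the wake word.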







