A system called EyeDrop uses a head-mounted eye tracker that simultaneously records your field of view, so it knows where you are looking on the screen. Gazing at an object (a photo, say) and then pressing a key selects that object. It can then be moved from the screen to a tablet or smartphone just by glancing at the second device, as long as the two are connected wirelessly.
"The beauty of using gaze to support this is that our eyes naturally focus on content that we want to acquire," says Jayson Turner, who developed the system with colleagues at Lancaster University, UK.
Turner believes EyeDrop would be useful for transferring an interactive map or contact information from a public display to a smartphone, or for sharing photos.
"A button needs to be used to select the object you are looking at; otherwise you end up with the 'Midas touch' effect, whereby everything you look at gets selected by your gaze," says Turner. "Imagine if your mouse clicked on everything it pointed at," he says.
Christian Holz, a researcher in human-computer interaction at Yahoo Labs in Sunnyvale, California, says the system is a nice take on getting round this fundamental problem of using gaze tracking to interact. "EyeDrop solves this in a slick way by combining it with input on the touch devices we carry with us most of the time anyway, and using touch input as a clutching mechanism," he says. "This now allows users to seamlessly interact across devices far and close in a very natural manner."
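The clutch idea described above can be illustrated in code. The sketch below is a minimal, hypothetical model (none of these names come from EyeDrop itself): a selection is returned only while a button is held down, so a wandering gaze on its own never triggers the "Midas touch" effect.

```python
from dataclasses import dataclass

@dataclass
class ScreenObject:
    """A hypothetical selectable item with a circular hit area."""
    name: str
    x: float
    y: float
    radius: float = 50.0

    def contains(self, gx: float, gy: float) -> bool:
        # The object counts as "gazed at" when the gaze point
        # falls within its radius.
        return (gx - self.x) ** 2 + (gy - self.y) ** 2 <= self.radius ** 2

def select_object(objects, gaze_x, gaze_y, button_pressed):
    """Return the gazed-at object only while the clutch button is held.

    Without the button_pressed check, every fixation would trigger a
    selection -- the Midas touch problem the article describes.
    """
    if not button_pressed:
        return None
    for obj in objects:
        if obj.contains(gaze_x, gaze_y):
            return obj
    return None

objects = [ScreenObject("photo", 100, 100)]
print(select_object(objects, 110, 95, button_pressed=True))   # the photo
print(select_object(objects, 110, 95, button_pressed=False))  # None
```

The button acts like a mouse click layered on top of gaze pointing: gaze supplies the "where", and the explicit press supplies the "when".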
From the 2014 Professional Title English Examination (Science and Engineering, Level B): past paper and answers.