NUI Central NY & NJ's Meetup

On Monday, April 15, 2013, OLC attended Natural User Interface Central NY & NJ’s meetup, For Your Eyes Only—Eye Tracking Explained and More, at WeWork Labs. The event, hosted by Ken Lonyai and Debra Benkler, delved into the world of eye-tracking technology. Over Skype, the hosts brought in Tore Meyer, CEO of 4tiitoo, who shared his experiences with eye tracking and discussed his new product, the eyeCharm, which had just completed a successful Kickstarter campaign.

To open the event, Ken Lonyai gave four quick examples of eye-tracking software currently on the market. The first was Tobii, which Lonyai described as perfect for people with disabilities, though he noted that typing on its keyboard is painfully slow and cumbersome. Next, he showed a video of accurate eye-center localization, which uses a webcam and is “very good at staying focused on the eyeball,” though Lonyai conceded he had no idea how well it handles control. Debra Benkler added that natural user interface work, especially in eye tracking, is further along in Europe and Asia than in the United States. Lonyai then showed Umoove, demonstrated on a Samsung Galaxy S4. “It uses an ordinary camera to track eyes,” Lonyai said.

Tore Meyer ran the successful Kickstarter campaign that funded the eyeCharm in Germany. He comes from a business background, but turned his attention to computing seven years ago. Meyer developed tablet-operating interfaces, which gained some traction in the market. He then focused on desktop computers and on how people would come to interact with them; combining eye tracking with everyday computing brought him to where he is now.

“Eye tracking is our main focus,” Meyer said. “It’s a very unique modality. It’s accessing conscious input and unconscious input. That’s something that’s not been accessed before. When you search for something, you tell the computer what you want to search for, but with eye tracking, your eye jumps to what you’re looking for.”

“One thing about webcams is that they’re not as robust as IR [infrared] cameras at tracking a person looking at the computer. With eye tracking, you’re creating reflections on your eyes. What we’re doing with IR light is, we know the source of the light, so we can define where the user’s eyes are. A webcam is just checking where your eyes are; it’s measuring the pupil’s relation to the screen. We need the light source to really identify and separate the movement of the eyes,” Meyer said.
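Meyer’s distinction between webcam tracking and IR tracking matches the standard pupil-center/corneal-reflection (PCCR) approach: because the IR source’s position is known, its glint on the cornea gives a stable reference point, and the pupil’s offset from that glint varies with gaze direction. The following is a purely illustrative sketch with a simple linear per-user calibration, not 4tiitoo’s actual pipeline:

```python
from dataclasses import dataclass

@dataclass
class EyeFrame:
    pupil: tuple  # pupil center in camera pixels (x, y)
    glint: tuple  # corneal reflection of the known IR source (x, y)

def gaze_point(frame, cal_scale, cal_offset):
    """Map the pupil-glint vector to screen coordinates.

    The glint anchors the measurement to the known IR source, so the
    pupil's offset from it tracks gaze direction. A per-user
    calibration step supplies the (scale, offset) parameters.
    """
    dx = frame.pupil[0] - frame.glint[0]
    dy = frame.pupil[1] - frame.glint[1]
    return (cal_scale[0] * dx + cal_offset[0],
            cal_scale[1] * dy + cal_offset[1])
```

A real system would also need to locate the pupil and glint in the camera image and use a richer (typically polynomial) calibration model; this sketch only shows why the known light source matters.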

“The point of building on Microsoft’s Kinect for the Xbox 360 was to make the eyeCharm readily available to the consumer market. We also wanted developers working on the eyeCharm, so we created two approaches to it: an SDK to dive into the heart of the programming, but at the same time a simple, rule-based language so you can add eye-tracking abilities to any app you want,” he added.
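The rule-based layer Meyer describes wasn’t demonstrated at the event, but conceptually it amounts to binding gaze events on UI regions to actions. A purely hypothetical sketch (all names and the rule format are invented here, not the eyeCharm SDK’s real syntax):

```python
from typing import Optional

# Hypothetical rule table: (gaze event, UI region) -> action name.
RULES = {
    ("gaze_enter", "save_button"): "highlight",
    ("gaze_dwell", "save_button"): "click",
    ("gaze_exit",  "save_button"): "unhighlight",
}

def dispatch(event: str, region: str) -> Optional[str]:
    """Look up the action bound to a gaze event on a UI region."""
    return RULES.get((event, region))
```

The appeal of such a layer is that an app developer declares bindings instead of touching the tracking pipeline, which is what makes it possible to “add eye-tracking abilities to any app.”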

“We use a physical keyboard to separate gaze from the clicking action, but if it were purely gaze, the user would have to stare at the button for maybe half a second to tell the software that it’s a clicking action. The keyboard makes for a faster experience.

“You’re using a keyboard anyway—you can use speech too to activate actions,” Meyer said. “We experimented with blinking, but in the long run it gets pretty annoying. I believe it can be solved, just not with the interfaces we know today. We believe that the interface of the future will be very different from today’s. It’s hard to imagine, but it will be very different. Maybe it’ll be a totally immersive experience, or the human mind utilizing geospatial space. That kind of processing ability is trained from birth, though. You can do other things with gaze as your base input. If you get rid of a secondary input—but of course, there will be a need for confirmation input in a subtle way—the software will be interpreting what users do,” Meyer said.

“We believe that a few years from now, eye-tracking software will be in every device. There’s no reason why it shouldn’t be in a webcam. The convenience of having eye tracking in a device is just incredible. It’s so much easier to look and click than to find the mouse cursor, scroll to a button and click on the object. Eye tracking today can’t track certain people’s eyes, but it will be able to overcome that. Glasses are a problem for eye-tracking software because they mask the eyes and the reflection. It’s hard to compensate for, but as the technology matures, these obstacles can be overcome,” he concluded.

With this, the floor opened for audience questions.

“Is there any other tracking software that tries to keep eyes located for people with glasses?” an audience member asked. “We are keeping track of the head, but the problem we’re having is that we can move our eyes left and right independently of the head. When the eyes move separately, we won’t be able to find the pupils, because we aren’t sure where you’re looking; we don’t know where the eyes are,” Meyer said.

“Where, when and how do you see eye-tracking software converging with mobile?” an audience member asked. “The reason we’re not going to tablets or mobile is touch, and power consumption issues—the IR camera uses a lot of power. I believe Google Glass would be a very strong competitor to mobile technology. You have to understand where the eyes are, and I think the software will merge together as Glass moves forward,” Meyer said.

“Are there any safety issues regarding IR lights?” a member asked. “No, absolutely not,” Meyer said. “IR light is similar to heat. Sunlight has a lot of ultraviolet radiation and high levels of IR, but we’re using very low-energy IR.”

“Will driving a user interface with the eyes mean more cognitive load for the user?” an audience member asked. “Actually, it’s quite the opposite,” Meyer answered. “You can test this by looking away from the computer screen and trying to use the mouse; you never click on something without looking at it first. New users actually focus too hard when they first come in contact with eye-tracking software, when a normal gaze is enough. But it’s very simple for the software to compensate for that,” he said.