Our team from Hull University has created a combination of hardware and software that tracks a user's eye to work out where they're looking on screen.

We're using an infrared camera mounted on a pair of glasses to capture images of the user's eye, with the infrared light producing "dark pupil" illumination, in which the pupil shows up as the darkest region of the image. We analyse each image to find the centre of the pupil and track it over time. Using the expected position of the pupil, we can also detect when the user blinks for a prolonged period, and trigger commands (such as mouse clicks) on these events.
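To illustrate the idea, here's a minimal sketch of the two steps described above: finding the pupil centre by thresholding the dark-pupil image, and treating a long run of frames with no detected pupil as a prolonged blink. The threshold values and frame counts are illustrative placeholders, not the tuned values from our software.

```python
import numpy as np

def find_pupil_centre(frame, threshold=50, min_pixels=20):
    """Locate the pupil centre in a greyscale eye image.

    Under dark-pupil illumination the pupil is the darkest region,
    so we threshold and take the centroid of the dark pixels.
    Returns (row, col) or None if no pupil-sized blob is found
    (e.g. the eye is closed). Thresholds here are illustrative.
    """
    dark = frame < threshold
    if dark.sum() < min_pixels:   # too few dark pixels: eye likely closed
        return None
    rows, cols = np.nonzero(dark)
    return rows.mean(), cols.mean()

def detect_long_blink(centres, min_frames=15):
    """Treat a run of frames with no detected pupil as a prolonged blink.

    `centres` is the per-frame output of find_pupil_centre;
    `min_frames` sets how long the eye must stay closed to count.
    """
    run = 0
    for c in centres:
        run = run + 1 if c is None else 0
        if run >= min_frames:
            return True
    return False

# Synthetic example: a bright 100x100 frame with a dark 20x20 "pupil".
frame = np.full((100, 100), 200, dtype=np.uint8)
frame[30:50, 50:70] = 10
centre = find_pupil_centre(frame)   # roughly (39.5, 59.5)
```

A real pipeline would of course work on live camera frames and use a more robust detector (contour fitting, ellipse fitting), but the centroid-of-dark-pixels approach captures the core of the technique.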

Here is an example of the hardware we’re using:

Above: My colleague Dave models our fashionable eyewear

Here are two YouTube videos of the software in action:

We've also programmed an on-screen keyboard that can be operated with just up/down/left/right eye gestures:
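To give a feel for how a four-gesture keyboard can work, here's a small sketch: a cursor moves over a letter grid in response to the four gestures, and a separate select action (such as the prolonged blink described above) types the highlighted character. The grid layout and class names are hypothetical, not the layout our software uses.

```python
class GestureKeyboard:
    """An on-screen grid keyboard driven by four eye gestures plus select.

    The layout below is illustrative; '_' stands for space and the
    trailing dots simply pad the last row to a uniform width.
    """
    LAYOUT = [
        list("ABCDEF"),
        list("GHIJKL"),
        list("MNOPQR"),
        list("STUVWX"),
        list("YZ_..."),
    ]

    MOVES = {"up": (-1, 0), "down": (1, 0),
             "left": (0, -1), "right": (0, 1)}

    def __init__(self):
        self.row, self.col = 0, 0
        self.typed = []

    def move(self, direction):
        """Shift the highlighted cell; wraps at the grid edges."""
        dr, dc = self.MOVES[direction]
        self.row = (self.row + dr) % len(self.LAYOUT)
        self.col = (self.col + dc) % len(self.LAYOUT[0])

    def select(self):
        """Type the currently highlighted character."""
        self.typed.append(self.LAYOUT[self.row][self.col])

    def text(self):
        return "".join(self.typed).replace("_", " ")

# Usage: start at 'A', move right to 'B', then down to 'H'.
kb = GestureKeyboard()
kb.select()
kb.move("right"); kb.select()
kb.move("down"); kb.select()
```

Wrapping at the edges keeps every key reachable with a small number of gestures, which matters when each gesture costs the user deliberate effort.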