SAGE: Synchronized Attention Grant Enabler

Developing techniques for monitoring attention over multiple screens.

Project summary

The aim of SAGE is to develop research techniques that will enable us to understand how people interact with multiple devices, with a particular focus on TV and ‘companion screens’.  Work so far has focused on using eye tracking and logging techniques to determine which device has the viewer’s attention. We’re now working on modelling the factors that influence the orientation of attention, with the ultimate goal of developing TV production-support software.

Publications

Further Information

Full scientific details: http://www.cs.manchester.ac.uk/our-research/groups/interaction-analysis-and-modelling/areas-and-projects/sage
Code repository: https://bitbucket.org/IAMLab/
Data repository: http://iam-data.cs.manchester.ac.uk/investigations
Technical reports: http://iam-data.cs.manchester.ac.uk/investigations

Funded by: EPSRC IAA Relationship Incubator
This project is in progress

Expanded Details

This project builds upon the MSO and SASWAT projects. Where MSO transferred our SASWAT models of visual attention in complex Web applications to digital television, this project aims to move closer towards understanding the synchronisation of multiple devices, e.g. tablet computers or mobile phones, with TV content. At present, no methodology exists that enables us to investigate just where attention is focused on each device in combination. This project therefore aims to develop a research methodology that will allow us to monitor people’s attention as they watch TV and interact with a second screen.

Our long-term objective is to produce a software application that will assist program providers and broadcasters in deciding when to deliver additional content to a personal mobile device (smartphone or tablet), and that describes how attention will be synchronized between these devices. However, no current methodology combines visual attention on tablets and smartphones with visual attention on television displays; gaze detection across these combinations of devices is not reliable; and the viewing environment is not taken into account. These problems exist because it is difficult to perform eye tracking on tablets and smartphones at the same time as tracking on a digital television. Further, the television sits much further from the user than conventional eye-tracking technologies can handle. Finally, it is difficult to understand the methodological implications of interactions (possibly confounding interactions) between the different devices and a user’s attention. We believe gaze detection may be the answer to these problems.

In practical terms, then, this project will explore how multiple eye-trackers can be used simultaneously to monitor attention over a TV and a tablet computer. We will explore how to maximise data quality while also keeping the experiments as naturalistic as possible for the participants.
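To make the idea of monitoring attention with simultaneous per-device trackers concrete, the sketch below merges timestamped gaze samples from two eye-trackers (one watching the TV, one watching the tablet) into a single attention timeline and computes a crude dwell estimate per device. The sample format, field names, and the assumption of a shared synchronized clock are all illustrative; they are not the project's actual logging schema.

```python
def merge_attention(tv_samples, tablet_samples):
    """Combine per-device gaze samples into one chronological timeline.

    Each input is a list of (timestamp_seconds, gaze_valid) tuples from a
    tracker watching one device; both trackers are assumed to share a
    synchronized clock (an assumption, not a given in practice).
    A sample with gaze_valid == True means that tracker detected the
    viewer looking at its device at that instant.
    """
    events = [(t, "TV") for t, valid in tv_samples if valid]
    events += [(t, "tablet") for t, valid in tablet_samples if valid]
    return sorted(events)  # chronological (timestamp, device) pairs


def dwell_proportions(timeline):
    """Fraction of valid samples attributed to each device.

    A very rough proxy for dwell time; real analysis would also handle
    sampling-rate differences and gaps where neither tracker has a fix.
    """
    counts = {}
    for _, device in timeline:
        counts[device] = counts.get(device, 0) + 1
    total = sum(counts.values()) or 1
    return {device: n / total for device, n in counts.items()}
```

For example, feeding in three TV samples (two valid) and two tablet samples (one valid) yields an interleaved timeline from which dwell proportions of 2/3 TV and 1/3 tablet can be read off. The design choice of merging at the sample level, rather than per-tracker summaries, preserves the ordering information needed to study attention switches between devices.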

Associated information

Completion information

Final Report Summary: 
Completion Date: 
Final Report: Pending
