The next frontier of application context-awareness: mood
- 28 June, 2013 12:59
Mobile devices such as smartphones bristle with an array of increasingly sophisticated sensors, ranging from gyroscopes and light sensors to hygrometers and thermometers. These sensors can provide context to applications and to a device's operating system, adapting the user experience to the surrounding environment (a simple example being altering screen brightness and keyboard backlighting as ambient light changes).
Now a study produced at Microsoft Research Asia has floated the possibility of adding a new form of context detection to devices, one that assesses not the state of a device's environment but the state of the user: mood sensing.
Robert LiKamWa and Lin Zhong from Rice University, together with Yunxin Liu and Nicholas D. Lane from Microsoft Research Asia, have built a prototype mood-sensing software system that runs on smartphones. Out of the box, the system can infer a user's mood with 66 per cent accuracy. When the underlying model was tailored to a single individual, based on the user correcting inaccurate mood detections, accuracy rose to 93 per cent after two months of training. A "hybrid" model that incorporated data from other users reached 72 per cent accuracy after only 10 days.
Automated mood detection could be used to enhance recommendation systems employed by services such as Netflix and to add an extra dimension to social networking and communications.
In addition, "[p]rivacy concerns aside, these moods would enhance social networks by allowing users to share mood states automatically," the researchers write in their paper, presented at the 11th International Conference on Mobile Systems, Applications, and Services (MobiSys 2013) in Taipei, Taiwan.
"Users would be able to know better how and when to communicate with others... When text messaging an upset boss, a user could be cautious of speaking brashly. Mood sensing can enable users to digitally communicate closer to the way they would in real life. For mood sharing, an automatic mood sensor will not only improve the usability but also more importantly, lower the social barrier for a user to share their mood: we do not directly tell others our mood very often, but we do not try to conceal our mood very often either."
Rather than building new hardware to detect physiological signs of a person's mood, the researchers built a lightweight software system they dubbed MoodScope. To build the system, 32 people were recruited for a two-month field study that involved structured, periodic self-reporting of their mood based on a circumplex model, which places each mood on two axes: pleasure and activeness.
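The circumplex model can be pictured as a point on two axes. As a rough illustration (the quadrant labels below are common examples from circumplex-style mood charts, not the paper's exact wording), a self-report might be reduced to a label like this:

```python
# Illustrative sketch of the circumplex model of mood: a mood is a point
# on two axes, pleasure (valence) and activeness (arousal). The quadrant
# labels are invented examples for illustration.

def circumplex_label(pleasure: float, activeness: float) -> str:
    """Map a (pleasure, activeness) point in [-1, 1]^2 to a quadrant label."""
    if pleasure >= 0:
        return "excited" if activeness >= 0 else "calm"
    return "upset" if activeness >= 0 else "bored"

print(circumplex_label(0.7, 0.6))    # high pleasure, high activeness
print(circumplex_label(-0.5, -0.4))  # low pleasure, low activeness
```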
During the field study the researchers collected device data, including application usage, phone calls, email messages, text messages, Web browsing history and location; personal data, such as contact identifiers, was anonymised. Tens of thousands of data points were gathered and categorised (for example, applications were grouped by category: entertainment, finance and so on).
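One simple way to turn such categorised usage data into a mood estimate is a linear regression from per-category usage features to the two circumplex dimensions. The sketch below is an invented illustration of that idea, with made-up feature names and data, not the paper's actual model, features or figures:

```python
import numpy as np

# Hypothetical illustration: map daily usage features (hours per app
# category) to the two circumplex dimensions of mood, pleasure and
# activeness. All numbers and category names here are invented.

CATEGORIES = ["entertainment", "finance", "social", "productivity"]

# Each row: one day's usage, in hours per category.
usage = np.array([
    [2.0, 0.1, 1.5, 0.5],
    [0.5, 1.0, 0.3, 2.0],
    [1.8, 0.2, 1.2, 0.6],
    [0.4, 0.9, 0.5, 1.8],
    [1.0, 0.5, 1.0, 1.0],
])

# Self-reported circumplex scores for the same days, scaled to [-1, 1]:
# columns are (pleasure, activeness).
moods = np.array([
    [0.8, 0.4],
    [-0.5, 0.2],
    [0.6, 0.3],
    [-0.3, 0.1],
    [0.2, 0.2],
])

# Fit one least-squares linear model per mood dimension, with a bias term.
X = np.hstack([usage, np.ones((len(usage), 1))])
weights, *_ = np.linalg.lstsq(X, moods, rcond=None)

def infer_mood(day_usage):
    """Predict (pleasure, activeness) for one day's usage hours."""
    x = np.append(day_usage, 1.0)
    return x @ weights

print(infer_mood([1.9, 0.1, 1.4, 0.5]))
```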
Following the field study, the researchers developed a background MoodScope service for both iOS and Android devices. The service consumes only 3.4 milliwatt-hours per day, which the researchers estimated was equivalent to losing only 20 minutes of standby time from a device's battery life. As part of the study, they also developed an API that third-party applications could hook into to monitor a user's mood and react accordingly.
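The paper does not spell out the API in the passage above, but a mood-monitoring interface for third-party apps would plausibly follow a publish/subscribe shape. A minimal sketch, with all class and method names invented for illustration:

```python
# Hypothetical sketch of a MoodScope-style client API. The names below
# (Mood, MoodService, subscribe, publish) are invented, not the paper's.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Mood:
    pleasure: float    # valence on the circumplex, in [-1, 1]
    activeness: float  # arousal on the circumplex, in [-1, 1]

class MoodService:
    """Background service that periodically infers mood and notifies apps."""
    def __init__(self) -> None:
        self._listeners: List[Callable[[Mood], None]] = []

    def subscribe(self, listener: Callable[[Mood], None]) -> None:
        self._listeners.append(listener)

    def publish(self, mood: Mood) -> None:
        # Called by the inference engine when a fresh estimate is ready.
        for listener in self._listeners:
            listener(mood)

# Example: a music app adjusting its playlist to the inferred mood.
service = MoodService()
service.subscribe(
    lambda m: print("calm playlist" if m.activeness < 0 else "upbeat playlist"))
service.publish(Mood(pleasure=0.4, activeness=-0.2))  # prints "calm playlist"
```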
The paper notes the limitations of the study's scope, such as the small number and narrow composition of participants in the field study. Future work might reduce the frequency of mood input needed to train the system, making it less intrusive. And as it stands, the system lacks the privacy protections needed for general release.
The researchers also noted that MoodScope can't capture every factor that affects mood: "The aim of our study is to investigate the relationship between smartphone usage patterns and user moods. We acknowledge that some external factors can go undetected with this approach. Similarly, user smartphone behaviour can change in ways that suggest dramatic shifts in mood – even when the cause is in fact unrelated to mood. Example situations include travel or critical work deadlines."
However, despite these limitations, the researchers see a promising future for MoodScope and similar systems. Detecting mood is a "vital next step for application context-awareness," they wrote.