PART 1: The Future of UX Research – Knowing What Your User Feels / by Gavin Lau

In order for UX design to continue to progress, the research methods we employ to gather the data we need to develop our approach also need to evolve.

At Pomegranate, our approach to UX design is based on our own unique methodology – something we like to call “Emotional Ignition™”. We believe that you have to fuse function and emotion to create the most engaging user journeys and, through this approach, we can create experiences with a clear focus on positive stimulation, targeting human behaviours by tending to both the heuristic needs of autonomy and the algorithmic needs of task fulfilment.

By combining functional necessity with behavioural understanding, we can design digital experiences that simultaneously stimulate both sides of the brain to create stronger levels of engagement and brand loyalty. With both emotion and mood influencing behaviour, we need to understand emotion, and know how our users are feeling.

 

But just how do we go about achieving this understanding of deeper emotional behaviours?

At present, many mainstream methods only look at user experiences retrospectively, using self-report (questionnaires or discussion), or approach problems from a purely functional point of view. Both approaches rely heavily on participants’ interpretation and recollection of their experience, which does not necessarily match what actually happened. Recent years have seen new methods emerge, such as eye tracking and physiological measurement, and although these represent real progress, they are still far from ideal.

In this article, we take a look at the future of the rapidly growing field of affective computing, a market expected to grow from USD 9.35 billion in 2015 to USD 42.51 billion by 2020 at a compound annual growth rate of 35.4% (Source), and review some of the techniques presently being used.


Identified Emotion-Tracking Cues

One concept currently on the rise is the usability lab. In these ‘labs’, UX and marketing agencies test their products on wired-up participants, analysing various visual and physiological changes that have been found to co-occur with emotion. Analysts are typically looking for reactions within two areas: Traditional/Physiological cues and Behavioural cues.

Traditional/Physiological Cues


The sensors used here aim to measure changes in heart rate, galvanic skin response, muscle tension, breathing rate, or electrical activity in the brain. But as we try to reach deeper into our behavioural tendencies, obtaining information from these kinds of signals is not as straightforward as it sounds, or at least not in a non-intrusive way.

 

When undergoing testing, participants are attached to a plethora of wires, which are difficult to ignore when trying to carry out everyday tasks naturally. Furthermore, participants have to come into the lab to test the interfaces, so they are not in their usual setting and may not be using products at the time of day they would normally interact with them.

Whilst the data captured may initially seem insightful, we have to ask: do results obtained in this way accurately reflect reality? Some would argue their accuracy is doubtful, and hence we need to find a way to track emotions more remotely, one that users can be completely unaware of. This can mean either finding new, better ways to take physiological measurements or finding different, easier-to-measure cues.

In light of these potential flaws, we are starting to see a move away from these lab settings, and technology such as smartphones is starting to incorporate elements that could come in handy for seamless user testing. Indeed, some Samsung phones now have a built-in heart rate monitor, which could be used for heart rate variability measurements and provide information on the user’s emotional state.
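As a rough illustration of how such a sensor reading could be turned into something emotionally meaningful, the sketch below computes RMSSD, a standard heart rate variability metric, from a series of inter-beat intervals. The interval values are invented for illustration, not real measurements.

```python
# A minimal sketch of turning a heart-rate sensor's inter-beat intervals into
# a variability metric. RMSSD = root mean square of successive differences;
# the interval values below are illustrative, not real data.
import math

def rmssd(ibi_ms):
    """RMSSD over a list of inter-beat intervals given in milliseconds."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Hypothetical inter-beat intervals (ms) from a wrist or finger sensor.
baseline = [812, 790, 805, 821, 798, 810, 795]
stressed = [702, 698, 705, 700, 699, 703, 701]

print(rmssd(baseline))  # higher variability is typically read as a calmer state
print(rmssd(stressed))  # lower variability often accompanies stress or arousal
```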


Behavioural Cues

In this instance, computers mimic how humans infer emotions: facial expressions, body postures and speech intonations are all measured. There are a number of behavioural cues that we are keen to understand. These include:

Image reference: affectiva.com

Facial Expressions


Huge progress has been made in recent years, and we have seen quite a few start-ups surfing the facial expression recognition wave, e.g. Affectiva, Emotient, or RealEyes. Facial expression monitoring techniques can also be used to code head gestures and infer mental states, working out whether users are agreeing, disagreeing, concentrating, interested, unsure, etc.

Most successful of all is Affectiva, which has been building what is now the largest emotion data repository out there, with over 3 million faces analysed at the time of writing. There has been notable success in applying this data too: the company achieved 73% accuracy in predicting voter preference in the 2012 US presidential election by analysing the facial expressions of people watching a presidential debate.

 

That said, one issue with facial expression monitoring is that the face cannot be obstructed during analysis, or the expressions cannot be tracked. This matters because a lot of hand-over-face gestures take place during interactions. Hence, analysts are now working on tracking and interpreting hand-over-face gestures, which themselves indicate specific emotions, to enable even more accurate expression tracking.
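To make the general pipeline concrete, here is a minimal sketch of the usual approach: find the face in a frame, crop it, and pass the crop to an expression classifier. It uses OpenCV’s bundled Haar cascade for detection; classify_expression is a hypothetical placeholder for a trained model, not any vendor’s actual API.

```python
# A minimal facial-expression pipeline sketch: detect the face, crop it,
# hand the crop to a classifier. The classifier here is a stand-in.
import cv2

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_expression(face_crop):
    # Placeholder: a real system would run a trained CNN or action-unit coder here.
    return "neutral"

cap = cv2.VideoCapture(0)          # webcam feed
ok, frame = cap.read()
if ok:
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        label = classify_expression(gray[y:y + h, x:x + w])
        print(f"face at ({x}, {y}), size {w}x{h}: {label}")
cap.release()
```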

Image reference: wikipedia.org/wiki/Kinect

 


Body Posture

Body posture is also a useful indicator, especially when it comes to interest/engagement vs. boredom. The Microsoft Kinect depth-camera, although only allowing easy body posture tracking for standing positions, is nonetheless paving the way for body posture coding and detection. For sitting positions, however, researchers have tended to prefer posture-sensing chairs, meaning posture analysis is currently stuck in the lab.
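As an illustration of the kind of coding involved, the sketch below estimates torso lean from two skeleton joints of the sort a depth camera reports, on the common assumption that leaning in signals engagement and leaning back signals boredom. The joint names, coordinates, and threshold-free interpretation are illustrative assumptions, not the Kinect API.

```python
# A minimal posture-coding sketch, assuming 3-D skeleton joints (x, y, z in
# metres) from a depth camera. Positive lean angle = leaning toward the screen.
import math

def torso_lean_degrees(shoulder_center, hip_center):
    """Angle of the hip-to-shoulder vector from vertical, in the depth plane."""
    dy = shoulder_center[1] - hip_center[1]   # upward component
    dz = hip_center[2] - shoulder_center[2]   # component toward the sensor
    return math.degrees(math.atan2(dz, dy))

# Hypothetical joint positions for two sitting postures.
engaged = torso_lean_degrees(shoulder_center=(0.0, 1.35, 1.80),
                             hip_center=(0.0, 0.95, 1.95))
slumped = torso_lean_degrees(shoulder_center=(0.0, 1.30, 2.10),
                             hip_center=(0.0, 0.95, 1.95))
print(round(engaged, 1), "degrees (leaning in)")
print(round(slumped, 1), "degrees (leaning back)")
```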


Speech Analysis

Speech analysis also provides a lot of information on the emotional state and engagement of the speaker, and typical speech monitoring systems base their analysis on the relative volume of speech, the structure of pauses, changes in rhythm and sound, etc. BeyondVerbal, for example, is able to decode vocal intonations into emotions and attitudes in real time.
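A rough sketch of two of those cues, relative volume and pause structure, is given below. It assumes a mono waveform already loaded as a NumPy array; the frame length and silence threshold are illustrative values, not tuned parameters from any production system.

```python
# A rough sketch of two speech cues: relative volume and pause structure.
# Assumes a mono waveform as a NumPy array (e.g. loaded with scipy/soundfile).
import numpy as np

def frame_rms(signal, frame_len):
    """RMS energy per non-overlapping frame."""
    n_frames = len(signal) // frame_len
    frames = signal[: n_frames * frame_len].reshape(n_frames, frame_len)
    return np.sqrt((frames ** 2).mean(axis=1))

def speech_features(signal, sample_rate, frame_ms=25, silence_thresh=0.02):
    rms = frame_rms(signal, int(sample_rate * frame_ms / 1000))
    return {
        "mean_volume": float(rms.mean()),
        "volume_variation": float(rms.std()),          # flat vs. lively delivery
        "pause_fraction": float((rms < silence_thresh).mean()),  # share of quiet frames
    }

# Illustrative use with a synthetic one-second signal at 16 kHz.
sr = 16000
t = np.linspace(0, 1, sr, endpoint=False)
signal = 0.1 * np.sin(2 * np.pi * 220 * t)
signal[sr // 2:] = 0.0                                  # artificial half-second pause
print(speech_features(signal, sr))
```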

Text-pattern analysis


Finally, text-pattern analysis is also on the rise and, in the age of information (emails, text messages, social media posts, etc.), being able to infer mood from users’ written words can be a real game-changer. Successful examples include IBM Tone Analyzer, which can detect emotional tones, social propensities, and writing styles in written communication.
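To show the basic idea, the toy sketch below scores a piece of text against small, hand-made tone lexicons. Real services such as IBM Tone Analyzer use far richer linguistic models; the word lists and tone names here are illustrative assumptions only.

```python
# A toy text-pattern analysis sketch: score a message against small tone
# lexicons. The word lists are illustrative, not any real service's lexicon.
import re
from collections import Counter

TONE_LEXICON = {
    "joy": {"great", "love", "happy", "delighted", "thanks"},
    "frustration": {"annoying", "broken", "waiting", "useless", "again"},
    "confidence": {"certain", "definitely", "will", "guaranteed"},
}

def tone_scores(text):
    words = Counter(re.findall(r"[a-z']+", text.lower()))
    total = sum(words.values()) or 1
    return {tone: sum(words[w] for w in vocab) / total
            for tone, vocab in TONE_LEXICON.items()}

print(tone_scores("Thanks, I love the new flow - checkout is great!"))
print(tone_scores("The form is broken again and I'm still waiting. Useless."))
```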


Essentially, whilst important progress is being made in automating the interpretation of known behavioural and verbal cues for emotion, there are still a few glitches that need to be ironed out in how we collect the data. That said, with our determination to better understand how our emotions affect our behaviour, there is little doubt we will continue to make huge leaps forward in this rapidly developing field.



Source: http://www.pomegranate.co.uk/the-future-of...