Quantified Smiles: Physiological Feedback and User Experience


Shan Lakhmani | July 14, 2012

When we create something, be it a text editor, a game, or even a plunger, we also create a user's reaction to it. If you create a plunger with a three-inch handle, you are also creating a frustrating user experience. One must not only try to create a positive user experience; one must also test to confirm that the desired experience actually occurs. So, how do we do that? Through quantitative analysis, of course!

This will probably end poorly

There are a number of methods by which a user's experience can be analyzed. Surveys are a particularly popular method, but their results can be imprecise: because surveys are taken after the experience is over, people's answers are subject to the vagaries of memory [1]. Audio-visual recording, user interviews, and think-aloud protocols allow for real-time responses to the experience, but they tend to be time-consuming and somewhat subjective [1].

Physiological feedback, however, is not only an objective measurement but also one temporally close to the specific events and features that stimulate a response. What does this mean for game development and game research? It means we have another tool in our arsenal for understanding games' effects on people, and for using those effects to create a better experience for the end user. I will use facial electromyography (EMG) as an example.

Facial electromyography measures tiny muscle movements in the face by detecting the electrical activity of the facial muscles. Activity in the zygomatic muscle, which controls smiling, is associated with positive affect [2]. Smiling users are probably having a good time. The corrugator muscles, on the other hand, control the lowering of the brow. Their movement is associated with exerted mental effort, and it increases as a user's perception of obstacles to a goal increases [2]. Consequently, when users perceive an obstacle to goal completion, corrugator EMG levels track their level of frustration.
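That two-channel logic can be summarized in a few lines of Python. This is only a sketch of the interpretation described above: the function name, the baseline values, and the "twice baseline" activity threshold are all hypothetical placeholders, not values from the EMG literature.

```python
def interpret_facial_emg(zygomatic_uv, corrugator_uv,
                         zygomatic_baseline=1.0, corrugator_baseline=1.0):
    """Crude affect label from two facial EMG channel amplitudes (microvolts).

    A channel counts as "active" when its amplitude exceeds twice its
    resting baseline -- a placeholder threshold, not a validated one.
    """
    smiling = zygomatic_uv > 2 * zygomatic_baseline
    frowning = corrugator_uv > 2 * corrugator_baseline
    if smiling and not frowning:
        return "positive affect (likely enjoying the task)"
    if frowning and not smiling:
        return "mental effort / possible frustration"
    if smiling and frowning:
        return "mixed activity (inspect the raw signal)"
    return "near baseline (no strong response)"

# Strong cheek activity, quiet brow: probably a happy user
print(interpret_facial_emg(zygomatic_uv=4.5, corrugator_uv=1.1))
```

In a real study one would, of course, estimate each participant's baselines from a resting recording rather than hard-coding them.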

The corrugator supercilii (brow, left) and zygomatic (cheek, right) muscles. [3]

So, if one wants to examine a user's software experience in terms of enjoyment and frustration, facial EMG is a very useful tool. To implement this methodology, one needs hardware on which users can run the software under examination, hardware that detects facial muscle activity, sensors to place on the face, and a computer on which to view the results. BIOPAC Systems, Inc. produces hardware capable of reading facial EMG data [4].

Biopac's modular hardware set [5]

Once everything is set up, one needs to place the sensors on the user. Placement of the sensors on the face is, unfortunately, a bit of an awkward proposition. Because weak electrical activity is being detected at the skin, impediments on the skin must be minimized. To do so, one must first clean the areas being examined, in this instance the brow and the cheek, and scrub them vigorously to remove make-up and dead skin cells. After the areas are clean, one applies electrically conductive gel to the sensors; this gel allows a clearer signal to be transmitted from facial movement. One attaches sticky "collars" to the sensors, and then attaches the sensors to the proper areas of the face. People's faces can be remarkably animated, so taping the wires down is suggested.

EMG sensors attached to the face. Image courtesy of Liu et al. (2008) [6]

Once everything is in place, you are ready to receive the data. With the proper software, you can detect even tiny facial movements. By measuring the muscle activity of the cheek and brow, in microvolts, one obtains an index of emotional valence for different software features, allowing those features to be compared. One can also pinpoint the specific parts of the software that frustrated the user, which can then be redesigned to reduce that frustration.
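As a rough illustration of that comparison, here is a minimal Python sketch that segments a timestamped recording by software feature and averages the rectified amplitude in each segment. The sample values, feature names, and function signature are all hypothetical; real analysis pipelines also filter and smooth the raw signal first.

```python
from statistics import mean

def mean_amplitude_by_feature(samples, segments):
    """Average rectified EMG amplitude (microvolts) per software feature.

    samples:  list of (timestamp_s, amplitude_uv) pairs from one channel
    segments: list of (feature_name, start_s, end_s) marking when the
              user was interacting with each software feature
    """
    results = {}
    for feature, start, end in segments:
        # Rectify (absolute value) and average the samples in this window
        window = [abs(uv) for t, uv in samples if start <= t < end]
        if window:
            results[feature] = mean(window)
    return results

# Hypothetical zygomatic recording: more cheek activity during "reward screen"
samples = [(0.0, 1.2), (0.5, 1.1), (1.0, 4.8), (1.5, 5.2), (2.0, 1.3)]
segments = [("menu", 0.0, 1.0), ("reward screen", 1.0, 2.0)]
print(mean_amplitude_by_feature(samples, segments))
```

A feature whose corrugator average stands well above the rest would be a candidate for redesign.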

If this kind of utility can come from facial EMG alone, imagine the utility of heart rate, skin conductance, or gaze-tracking data. Physiological feedback can bring user experience testing to a whole new level, if you are willing to put in the effort!



References

[1]
Nacke, L., Niesenhaus, J., Engl, S., Canossa, A., Kuikkaniemi, K., & Immich, T. (2010). Bringing digital games to user research and user experience. Proceedings of the Entertainment Interfaces Track (EI-2010) at Interaktive Kulturen 2010, Duisburg, Germany. http://ceur-ws.org/Vol-634/Entertainment-Interfaces-Proceedings05.pdf

[2]
Hazlett, R. L., & Benedek, J. (2007). Measuring emotional valence to understand the user’s experience of software. International Journal of Human-Computer Studies, 65(4), 306–314.

[3]
These images are reproductions from Gray's Anatomy, edited by Uwe Gille. The files are from the Wikimedia Commons.

[4]
Biopac Electromyography

[5]
Image courtesy of the Body and Brain Interfaces Wiki

[6]
Liu, C., Conn, K., Sarkar, N., & Stone, W. (2008). Physiology-based affect recognition for computer-assisted intervention of children with Autism Spectrum Disorder. International Journal of Human-Computer Studies, 66(9), 662–677.

