May 25th, 2011

Abstract: The aim of this artist presentation is to discuss the building of the Chameleon Project, built over two years (2008-2010) and through ten prototypes by a cross-disciplinary group comprising an artist, a social neuroscientist, an emotion neuroscientist, affective computing scientists, technologists, human-computer interaction scientists and a curator. The project investigates the scientific foundations of emotional contagion, drawing attention to how we innately and continuously synchronize with the facial expressions, voices and postures of others, unconsciously infecting each other with our emotions. It both follows and critiques the scientific method, producing scientific and artistic research: new models to be used in scientific experiments and new ways to experience art. The presentation is structured as follows: the project is introduced and other relevant research and artwork discussed; the technical parts of the project are then described, including the building of the video databases, the face-reading technology, the video engine and the “emotional algorithms” that drive it. A discussion of the audience’s experience follows, drawing on observations and on the feedback and evaluation sessions, and the presentation ends with a discussion of the collaboration.

Sensing You

Recent developments in laboratory science and visualisation technology have been essential to revealing the biological basis of emotions, leading the cognitive and affective sciences to a more detailed understanding of how human beings use empathetic and emotional responses to facilitate interpersonal transactions (Hatfield 1992, Frith 2003). This knowledge is beginning to be embedded into technology, in a field known as ‘affective computing’ (Kaliouby 2005, Pantic 2007, Bianchi-Berthouze 2003). Artists are now harnessing this research, exploring and building ‘emotionally’ interactive art installations able to engage in an emotional interaction loop with their audience (Wright 2007, Khut 2008). The Chameleon Project investigates the social role played by emotional expressions and the transfer mechanism of emotions that mediates social interaction. Research has suggested that over eighty per cent of human communication is encoded in facial expression, body movements and tone of voice, and that these non-verbal channels are particularly important for communicating emotions and feelings (Hess 2001). These channels are at the basis of the emotional contagion phenomenon, that is, ‘the tendency to automatically mimic and synchronize facial expressions, vocalization, postures, and movements with those of another person and, consequently, to converge emotionally’ (Hatfield 1992). This phenomenon is complex, involving rationality, instinct and conditioned reactions (Frith 2003). Various studies have shown that emotional contagion often takes place in normal daily life, highlighting our susceptibility to it (Lanzetta 1989, Tamietto 2008, Hess 2001, Sato 2007, Neumann 2000, Mihalinec 2008, Bourgeois 2008). A likely neurological structure at the basis of this phenomenon is the mirror neuron system, the system underlying imitation (van der Gaad 2007). However, humans do not always respond to an emotional expression with an exactly equivalent expression; other evolutionary principles may guide social interaction. An expression of anger, for example, may unconsciously trigger an expression of sadness as a form of empathy. Understanding the dynamics at the basis of this emotional loop, and harnessing them to create emotional awareness, is one of the aims of the project.

Creating an emotional visual database

The project interacts with its audience by displaying videos from a database of “emotional expression portraits” created by the artist, Tina Gonsalves (www.tinagonsalves.com). Many current scientific experiments exploring expression and emotion use Paul Ekman’s 1970s visual database of static facial expressions representing emotional states (Ekman 2010). One aim of the new corpus of facial emotion expression videos used in the Chameleon Project was to be more dynamic and aesthetic, more engaging and more emotionally probing. Bruno Averbeck, a neuroscientific collaborator on the Chameleon Project, agrees that this is important:

Most work which has been done on understanding emotions has been static. For example, much research has been done on how the brain responds to particular emotional expressions. The Chameleon Project is interesting because it brings in a dynamic element. This brings it one step closer to real social interactions. Experience with this project may very well lead to ideas which can be incorporated into our own work. 1

Figure 1: Tina Gonsalves, collage of past facial emotion expression databases used in scientific emotion studies, including Ekman, Karolinska, Darwin, Duchaine

Over the last year and a half, Tina Gonsalves has asked volunteers around the world (everyday people, actors, visual artists) to be filmed expressing emotions. She induced these emotional states using various techniques, including creating scenarios in the studio where the emotions were reactions to staged events, and employing classical psychoanalytical techniques such as encouraging the volunteers to imagine different personal emotional scenarios from their past and to re-enact them as if in the present. Subjects were shot in a studio space against a neutral black background with simple lighting.

Figure 2: Portrait 05, video stills of sad expression channel from the facial emotion expression database of the Chameleon Project.

Tina Gonsalves has now shot thirty portraits, and intends to shoot still more. After the shoot, the artist writes to the subjects to discuss the experience.

Figure 3: Portrait 08, video stills of angry expression channel from the facial emotion expression database of the Chameleon Project.

One of the subjects reported: ‘The whole process, from entering the darkened studio to being confronted with a camera that stood only half a meter away from my face, I instantly thought would promote a perhaps less sincere and maybe uncomfortable response; but I was amazed to find how easily my recalling of these particular emotionally significant events unraveled, and how moments of embarrassment were soon ousted.’ 2

Another participant reported on the experience in the studio: ‘…in the attempt to recreate the emotion, the feelings flooded back. I felt quite moved… This intimate relationship with a camera lens was a new experience for me and I found its scrutiny a great challenge. I learned that while it becomes possible to represent a range of emotions, the sudden and immediate proximity of some, particularly sadness and fear, was potent and very real.’ 3

By the end of the project the artist aims to shoot in Africa, Asia, Europe and the Americas. She wrote in her studio notes in March 2008, while working in Canada: ‘…It’s taken a while to get comfortable asking people to evoke emotions. It’s been exhausting, because it feels so personal. I have sourced some artists, some actors, and some people off the street! It’s been a varied response, ranging from deep deep crying for half an hour to more laughter and very light expression. It’s been hard to watch people cry and stand over the other side of the camera documenting it. For some, sadness has been very close to the surface, and recent events such as loss make sadness the easiest to access.’ 4

Her notes from April 2009, working in Paris, read: ‘It’s harder to coax Parisians to reveal emotions… I need to spend more time getting to know participants, to develop a more trusting relationship. I am asking them to give a lot, and it’s hard without a closeness. The studio time needs to be much longer than it was in Canada…’ 5

Testing the emotional contagion power of the database

The emotional facial expressions of the audience are used as agency in the Chameleon Project, driving its interactivity. It was therefore important that the digital video portraits created by the artist elicit an automatic, visible emotional facial expression in the participating audience through the mechanism of emotional contagion (Hatfield 1992). This was tested in the lab by Nadia Berthouze and her research student, Matt Iacobini, from the University College London Interaction Centre. At the time of the first evaluation, the Chameleon Project was not yet fully functional, so a “Wizard of Oz” scenario (Maulsby 1993) was used to investigate these questions.

Figure 4: Mock up of Wizard of Oz set up for Chameleon Project.

 

Eleven volunteer observers were recruited to view the video portraits. The observers rated the emotional intensity of each video portrait, and their facial expressions were filmed while they watched. In a separate space, a rater observed and rated the facial expressions of each observer as they watched the portraits. The system kept a record of the list of videos presented to the observer, of the observer’s emotional responses (i.e., his or her facial expressions) and of the rater’s labels. After the experiment, all the observers were given a multiple-choice form about their experience. The questionnaire results showed that six observers felt the stimulus emotions; eight observers felt the desire to respond to the stimulus emotions; and five observers felt that they were interacting with, or looking at, a person. According to the rater’s classification, ten observers reacted visibly during the experiment by displaying a series of different emotional expressions in response to the video portraits.
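The record-keeping described above can be pictured as a simple per-trial log. The following is a minimal sketch, not the project’s actual code; the field names and the 1-5 intensity scale are assumptions for illustration.

```python
# Minimal sketch (assumed names and scales, not the project's actual code) of
# the per-trial record the Wizard of Oz system is described as keeping: the
# videos shown, the observer's rating, and the hidden rater's label.
from dataclasses import dataclass, field

EMOTIONS = ["happy", "neutral", "sad", "angry", "disgusted", "surprised"]

@dataclass
class Trial:
    video_id: str          # which portrait video was presented to the observer
    intensity_rating: int  # observer's rating of the portrait's intensity (1-5 assumed)
    rater_label: str       # emotion the hidden rater read on the observer's face

@dataclass
class Session:
    observer_id: int
    trials: list = field(default_factory=list)

    def log(self, video_id: str, intensity_rating: int, rater_label: str) -> None:
        assert rater_label in EMOTIONS
        self.trials.append(Trial(video_id, intensity_rating, rater_label))

# Example: observer 1 watches a sad portrait, rates it 4, and visibly mirrors it.
session = Session(observer_id=1)
session.log("portrait_05_sad", intensity_rating=4, rater_label="sad")
```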

These results showed that the visual database created by the artist was able to trigger emotional contagion in the audience. This brought us to the next stage: integrating and testing the emotion recognition technology.

Sensing the Audience’s Emotions

The facial emotion reading technology that monitors the audience is being developed by collaborators Rana El Kaliouby (http://web.media.mit.edu/~kaliouby/), Youssef Kashef and Abdelrahman Mahmoud, led by Rosalind Picard (El Kaliouby 2005). The system classifies the emotional state of the audience into one of six emotions (happy, neutral, sad, angry, disgusted, surprised). These emotional states are known as “universal emotions” and are used because they are the ones most scientific researchers have categorized. An added, continual challenge has been to develop the technology to work in darker lighting scenarios, with moving faces and at varying distances.
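One common way to cope with the noise introduced by moving, variably lit faces is to smooth the classifier’s per-frame output over a short window before it drives anything downstream. The sketch below is hypothetical: FaceSense’s internals are not described here, and the class names and window length are assumptions.

```python
# Hypothetical smoothing stage (not FaceSense's documented pipeline): average
# the per-frame emotion probabilities over a short sliding window and report
# the most likely of the six universal emotions.
from collections import deque

EMOTIONS = ["happy", "neutral", "sad", "angry", "disgusted", "surprised"]

class EmotionSmoother:
    def __init__(self, window: int = 15):  # roughly half a second at 30 fps (assumed)
        self.history = deque(maxlen=window)

    def update(self, frame_probs: list) -> str:
        """frame_probs: one probability per emotion for the current frame."""
        self.history.append(frame_probs)
        # Mean probability per emotion across the window, then argmax.
        means = [sum(p[i] for p in self.history) / len(self.history)
                 for i in range(len(EMOTIONS))]
        return EMOTIONS[max(range(len(EMOTIONS)), key=means.__getitem__)]

smoother = EmotionSmoother()
label = smoother.update([0.1, 0.2, 0.6, 0.05, 0.03, 0.02])  # -> "sad"
```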

Figure 5: Frame grab of Tina Gonsalves interacting with the facial emotion reading interface (FaceSense) of the Chameleon Project, developed by Rana El Kaliouby at the MIT Media Lab Affective Computing group, version 2, March 2009

Figure 6: Frame grab of the video engine of the Chameleon Project, developed in Max MSP, prototype 06, March 2008

Figure 7: Tina Gonsalves, initial technical mock-up of the Chameleon Project, prototype 10, 2007

The facial emotion reading system sends this information to the video engine, built by Tina Gonsalves, Jeff Man, Evan Raskob and Christian Topfner. The video engine implements an “emotional algorithm”, led by social neuroscientist Chris Frith, so that the video portrait’s response to the audience is empathically appropriate. Frith hypothesized about how emotional expressions are exchanged within social groups, building this into an algorithm that networks video engines, face reading technology and audience members. Frith’s model predicts the probability of how a person will respond to a stimulus within the group of six universal emotions. Tina Gonsalves, Nadia Berthouze and Matt Iacobini tested Frith’s emotional transference hypothesis in the laboratory. For “happy”, “sad” and “neutral” stimuli, observers tended to react with the same emotion; “sad” stimuli elicited a sadness response sixty per cent of the time. “Angry” stimuli elicited mainly a neutral reaction. This is in accordance with previous studies (Bourgeois 2008) showing that, in the case of negative emotion, a mimicry response can be inhibited, or can elicit a counter-expression, when observer and expresser do not belong to the same social group. “Disgust” and “surprise” stimuli elicited not only neutral but also happy expressions. Throughout the experiment there was a high frequency of happy expressions, probably due to amusement rather than emotional contagion. These findings became the baseline for the coded response model (sketched below). Bruno Averbeck notes:

Although we use the mind reading technology to classify the expression, we don’t have any information on the intensity or subtlety of the expression, or the context of the expression. For example, someone can be crying because they are happy or sad. At some level it can be modeled into a code, but we are still working with simple models at this time. We are not able to capture the full range of emotions. 6
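The laboratory findings above suggest the shape of such a baseline model: a table of response probabilities per stimulus emotion, sampled each time a portrait must react. The sketch below is illustrative only; apart from the reported sixty per cent sadness-to-sadness figure, the numbers are invented placeholders consistent with the tendencies described, not the project’s actual parameters.

```python
# A minimal sketch of a baseline contagion model: P(audience response |
# portrait stimulus) over the six universal emotions, sampled to choose the
# portrait's next expression. Placeholder probabilities, not project data.
import random

EMOTIONS = ["happy", "neutral", "sad", "angry", "disgusted", "surprised"]

RESPONSE_MODEL = {
    #            happy neutral  sad  angry disg. surpr.
    "happy":     [0.60, 0.25, 0.05, 0.03, 0.03, 0.04],  # mimicry
    "neutral":   [0.15, 0.60, 0.10, 0.05, 0.05, 0.05],  # mimicry
    "sad":       [0.10, 0.20, 0.60, 0.04, 0.03, 0.03],  # 60% sad (reported)
    "angry":     [0.10, 0.55, 0.10, 0.15, 0.05, 0.05],  # mainly neutral
    "disgusted": [0.30, 0.45, 0.05, 0.05, 0.10, 0.05],  # neutral or happy
    "surprised": [0.30, 0.45, 0.05, 0.05, 0.05, 0.10],  # neutral or happy
}

def predict_response(stimulus: str) -> str:
    """Sample the emotion an observer is likely to show to a given stimulus."""
    return random.choices(EMOTIONS, weights=RESPONSE_MODEL[stimulus])[0]

print(predict_response("sad"))  # most often "sad"
```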

Working with Averbeck, we are now trying to extend this, building memory and a desire for mimicry into the algorithm. By modelling memory into the algorithm, each digital portrait in the visual database experiences “moods” and, over longer exhibition times, develops a “temperament”, both of which constantly affect the way the system responds. The system’s continual learning always influences the video engine. For example, on a bright sunny day in Brighton, most people entered the space quite happy; this happiness spread to the portraits of Chameleon, and most of the portraits developed a happy mood and temperament. The emotional architecture of Chameleon is thus constantly changing and shifting, much like human interrelations: Chameleon has the desire to be contagious, but is always shaped by the memory of past interactions. The models trigger the video portraits in a way that is representative of how emotional contagion works in day-to-day life. For science, this engine has the potential to become a live data-capturing tool for analysing the transfer of emotions between people.
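One plausible way to realise these moods and temperaments, sketched here under stated assumptions rather than taken from the project’s actual Max MSP patch, is to keep two exponential moving averages of the detected audience emotions, one fast (mood) and one slow (temperament), and let them bias the baseline response distribution. The decay rates and blending weights below are invented for illustration.

```python
# A sketch of layered emotional memory (assumed design, not the project's
# code): a fast moving average ("mood") and a slow one ("temperament") over
# detected audience emotions, used to bias the baseline response model above.
EMOTIONS = ["happy", "neutral", "sad", "angry", "disgusted", "surprised"]

class PortraitState:
    def __init__(self):
        self.mood = [1.0 / 6] * 6          # fast memory: minutes of interaction
        self.temperament = [1.0 / 6] * 6   # slow memory: the whole exhibition

    def observe(self, detected: str) -> None:
        """Fold one detected audience emotion into both memories."""
        i = EMOTIONS.index(detected)
        for memory, alpha in ((self.mood, 0.10), (self.temperament, 0.005)):
            for j in range(6):
                target = 1.0 if j == i else 0.0
                memory[j] += alpha * (target - memory[j])

    def bias(self, baseline: list) -> list:
        """Blend the baseline contagion response with mood and temperament."""
        mixed = [0.6 * b + 0.3 * m + 0.1 * t
                 for b, m, t in zip(baseline, self.mood, self.temperament)]
        total = sum(mixed)
        return [x / total for x in mixed]
```

On the sunny Brighton day described above, repeated “happy” detections would quickly lift the fast-moving mood average and, over weeks of exhibition, slowly tilt the temperament, so the portraits would come to greet visitors more happily.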

The experience of interacting with the Chameleon Project

The Chameleon Project attempts to build an empathic dialogue with the audience, as the audience becomes increasingly aware of the consequences of their emotional expressions.

The visitor’s empathy becomes a powerful form of agency, whereby they become increasingly aware of and sensitive to the consequences of their interactions. This engenders a problematic of ethics, in that the visitor to the gallery must consider the degree to which they feel responsible for the subject’s changing emotional states… (Tofts 2008)

Gonsalves worked with curator Helen Sloan to curate the Chameleon Project. Gonsalves built the work in ten progressions in order to understand the complexities of the project, build a stronger collaborative dialogue, create a visual language and recurring milestones for all collaborators, and provide opportunities for frequent tests of the interactivity and of the robustness of the technology. The work has been exhibited often, with most exhibition venues providing a chance to evaluate audience interaction. For Sloan, finding venues that can embrace this process has proved difficult at times:

Art spaces tend to commission clean, finished work, and science spaces like robust interactive and interpretive work. Gonsalves’ well-presented iterations of a process fall between the gaps, and it is no coincidence that digital galleries and spaces are the ones that support the project. 7

Figure 8: The Chameleon Project, installation view of prototype 03: Mimicking Emotional Contagion, Lighthouse UK, 2009

So far, the project has been placed in waiting rooms such as the foyer of University College London Hospital, café/bar venues such as the London Science Museum’s Dana Center and London’s ICA bar, galleries such as Lighthouse and Fabrica in Brighton, and museums including London’s Natural History Museum (http://www.tinagonsalves.com/camvideo.html). The artist sees these spaces as somewhat controlled, yet allowing more freedom than the scientific laboratory, bringing the scientific study of social emotions one step closer to real social interactions.

Figure 9: The Chameleon Project, Installation view of prototype 06: Mimicking Emotional Contagion in Social Groups. After Darwin: Contemporary Expressions, Natural History Museum, UK, 2009

At venues such as Lighthouse, evaluation sessions were designed by Tina Gonsalves and human-computer interaction scientists Nadia Berthouze, Matt Iacobini and Kim Byers. Audiences were invited to interact with the work and were then interviewed to find out more about their experience of it. They were also asked to fill out an “emotional contagion scale” questionnaire. The video interviews were transcribed and matched to the emotional contagion scale questionnaires. The team used Grounded Theory, a technique in which key points are extracted from the transcribed interviews and grouped into similar concepts in order to make them more workable. From these concepts, categories were formed which allowed us insight into similar concerns and responses. Through the evaluation sessions we learned the factors that shape the next iteration of the work. Finer details, such as those that built or broke a feeling of engagement, immersion, connection and emotional contagion, were revealed, informing the narrative and display of the next prototype of the Chameleon Project.

Figure 10: The Chameleon Project, Installation view of prototype 06, Mimicking Emotional Contagion in Social Groups. University College of London Hospital Foyer, UK, 2009

Figure 11: The Chameleon Project, photos of audience interacting with the first iteration of prototype 07: Integration of Facial Emotion Recognition Technology at the Dana Center, Science Museum, London, 2009

Feedback suggests that the Chameleon Project provoked the audience to think about emotional interaction.

Participant 01:’I suppose it is quite an intense experience and I will probably think about it later…’

Participant 05: ‘…he (the video portrait) was being quite flirtatious. The feeling I had inside was like having a connection with someone that you had met in a bar or something….’

Participant 08: ‘…I was close to the character. He was quite up front and in my face. And talking quite low and quite intimately.’

Participant 09: ‘I was thinking of some sad things that happened to me; when [the digital portrait] was sad for a while, it felt like a long time, and it reminded me of some things.’

Participant 10: ‘I didn’t like it when he looked sad and I didn’t know why’.8

A visitor wrote on a blog: ‘If you laugh, then the person laughs back. I was impressed by one of the interactions, as the software recognised, as expressed by the portrait, that there had been a recent death affecting the participant. This was in fact true, and that was a very unnerving moment…’ 9

Feedback also suggests that when audiences interact with the system, they are clearly introspecting and thinking about how to behave, as they might in a more traditional social context.

Participant 07: ‘I was changing my facial expression from very sad and angry to laughing. It was interesting because sometimes when they were sad and telling you how disappointed they were, you were changing your expression to happy, so it was quite strange, laughing when they were sad or angry.’

Participant 09: ‘If someone was crying in a room, you wouldn’t really laugh at them. So, I was doing something I wouldn’t normally do. You normally empathize, so your face would be equally sad. Playing against it was quite intriguing as it’s something you just don’t do.’

Participant 12: “…I was looking at him and he turned around, it was strange, I asked myself why he would do that, it never happened to me, so I was looking at him to understand”. 10

 

Figure 12: The Chameleon Project, Installation view of prototype 09: Implementing Mood and Temperaments. Fabrica, UK, 2009 (photo: Phillip Carr)

 

Figure 13: The Chameleon Project, Installation view of prototype 09: Implementing mood and temperaments. Fabrica, UK, 2009 (photo: Phillip Carr)

 

Figure 14 : The Chameleon Project, photos of audience member interacting with the second iteration of prototype 07: Integration of Facial Emotion Recognition Technology at Lighthouse, Brighton, UK 2009

 

Figure 15: The Chameleon Project, photos of audience member interacting with the third iteration of prototype 07: Integration of Facial Emotion Recognition Technology, Superhuman Exhibition at RMIT gallery, Melbourne, Australia 2009 (Photo: Mark Ashkanasy, Copyright RMIT Gallery)

The collaboration of the Chameleon Project

Tina Gonsalves believes that the emotions expressed and monitored in laboratories often do not correlate with the emotions that form the fabric of our everyday lives. Her past work has been a sustained investigation of how to better understand emotion and how to create tools that allow us to explore emotions in more naturalistic scenarios, with more potent stimuli.

Studio notes, March 2008: ‘There seem to be limited emotions being explored, visually underwhelming databases being used, and non-ecological settings such as the lab used to test responses… using small groups of subjects with narrow representation, what does the knowledge that science is building about emotions actually mean?’ 11

Figure 16: The Chameleon Project, photos of audience member interacting with the first iteration of prototype 09: Implementing mood and temperaments. Fabrica, UK, 2009

Because of the freedom introduced by the involvement of an artist, the scientific collaborators on the Chameleon Project are liberated to experiment in ways beyond the highly controlled experiments to which they are usually restricted. Tina Gonsalves sees the model of the Chameleon Project as adaptable for science experiments. Much research into emotional responses takes place in the laboratory, an unfamiliar environment that may cause anxiety in the subject. With Chameleon, the aim is to create a more ecological and engaging experimental tool for monitoring emotions, so that participants can experience emotional exchanges more naturally, closer to our day-to-day social interactions. Much research into emotional responses also uses underwhelming imagery; the Chameleon Project provides a stimulus set that captures more authentic emotions, creating new moving-image databases for the sciences. The scientific collaborators agree that this will be informative in generating novel research ideas and in devising focal studies, with the aim of creating a tool to help people who do not naturally understand emotional interactions, such as people with autism, depression or alexithymia, a condition in which the person is unable to describe emotions in words. Chameleon is blurring the boundaries of the science/art collaboration. Sloan writes: ‘This is incredibly exciting, at a point when some scientists still, in spite of claims to the contrary, have a tendency to see art as a visualisation or interpretive representation of their work.’

Chris Frith sees the cross disciplinary collaboration as liberating: ‘This project has developed far beyond what I would dare to do in the carefully controlled experiments that we are restricted to. But the end result will provide us with marvelous tools for doing new experiments.’

Hugo Critchley, who has been working with the artist for the last five years, writes: ‘As the installation develops, the feedback from audience and one’s own subjective experience will be informative in generating research ideas and focal studies. The most compelling aspect of this is the capacity to engage’. He also sees the artist as a conduit for more cross-disciplinary collaborative opportunities: ‘The interdisciplinary collaboration has enriched my interest and understanding of emotional research beyond neuroscience and psychophysiology, developing links across disciplines (arts, humanities, engineering etc.) that are still active, and developing as research collaborations.’

Rosalind Picard, who is mentoring the building of the face reading technology, is interested in its use for people with autism: ‘…As I watch (the Chameleon Project), my mind fills with ways we can help people to learn from the interactions portrayed. These are scripts with naked emotion, uncovered, and whether ugly or beautiful, they are hard to turn the eyes from. Here is an engaging palate for helping people who don’t naturally understand emotional interactions, and who want to deepen their ability to do so’.

Nadia Berthouze agrees: ‘I see Chameleon as a source of ideas for the creation of digital environments conducive to patients becoming aware of their emotional states’. 12

The Chameleon Project presents a genuine and rare collaboration across the boundaries of art and science, providing challenging and revealing new models for experimentation through art installations, research papers and novel, more dynamic approaches to scientific research that incrementally reveal emotional exchange, mimicry and contagion across social groups. The final exhibition of the work will take place in 2010.

Acknowledgements

The work has been core-supported by the Wellcome Trust, The Australia Council for the Arts, The Australian Network for Art and Technology Synapse Residency, Arts Council England, Lighthouse Brighton, Banff New Media Institute, University College London, MIT Media Lab, Solent University Rapid Prototyping Lab, Brighton and Sussex Medical School, Fabrica, SCAN and Dana Center @ the Science Museum.

Endnotes

1. Email excerpts between Tina Gonsalves and collaborators, March 2009-August 2009, for the development of the catalogue After Darwin: Contemporary Expressions at the Natural History Museum.
2. Email excerpts between Tina Gonsalves and subject.
3. Email excerpts between Tina Gonsalves and subject.
4. Excerpts taken from the artist’s journal.
5. Excerpts taken from the artist’s journal.
6. Email excerpts between Tina Gonsalves and collaborator.
7. Email excerpts between Tina Gonsalves and collaborator.
8. Taken from video transcripts of subjects taking part in the evaluation of prototype 07 at Lighthouse, Brighton, March 2007.
9. Taken from a blog, http://transjuice.org/page20.htm
10. Taken from video transcripts of subjects taking part in the evaluation of prototype 07 at Lighthouse, Brighton, March 2007 (http://www.tinagonsalves.com/chamselectframe02.htm, select ‘evaluation’).
11. Taken from the artist’s studio notes, March 2009.
12. Email excerpts between Tina Gonsalves and collaborators, March 2009-August 2009, for the development of the catalogue After Darwin: Contemporary Expressions at the Natural History Museum.

Biographies

For over a decade, Tina Gonsalves has been using the fluid and malleable medium of video to explore complex emotional landscapes. Gonsalves is currently working with world leaders in psychology, neuroscience and affective computing to research and produce moving-image artworks that respond to the viewer’s emotions: poetic video installations and mobile and wearable technology works that respond to pulse, sweat, voice and emotional expressions. She has been awarded numerous artist-in-residence programs, including The Banff New Media Institute in Canada, the Centre for Contemporary Art in Prague, the Asialink Residency in Bangkok, Thailand, the (Pro)duction residency at Artsway, UK, the Advanced Institute of Media Arts and Sciences residency in Japan and Arts Council England’s International Fellowship based in London. She is currently honorary artist in residence at the Institute of Neurology at UCL in London, visiting artist at the Media Lab at the Massachusetts Institute of Technology in Boston, USA, and artist in residence at Nokia Research Labs, Finland. Works by Gonsalves have been screened and exhibited extensively. URL: http://www.tinagonsalves.com

The premise of Nadia Berthouze’s research is that affect, emotion and subjective experience should be factored into the design of interactive technology. Indeed, for technology to be truly effective in our social network, it should be able to adapt to the affective needs of each user group or even each individual. The aim of Berthouze’s research is to create systems and software that can sense the affective state of their users and use that information to tailor the interaction process. Body movement appears to be a promising medium for this goal: it supports cognitive processes, regulates emotions, and mediates affective and social communication. Berthouze is currently pursuing three lines of research, looking at body movement as a medium to induce, recognize and measure the quality of experience of humans, in particular of humans interacting and engaging through and with technology. She is trying to identify the various factors that affect the recognition process, including cross-cultural differences and task context. Finally, she is looking into the existence of dialects in affective body movement communication, including avatar-specific dialects. Berthouze was awarded a two-year International Marie Curie Reintegration Grant, started in 2006, to investigate these issues in the clinical context and in the gaming industry.

Matt Iacobini’s background is in machine vision and artificial intelligence software engineering, studied at the University of Edinburgh and applied in a multidisciplinary context to microfluidics and nanotechnology at the Consiglio Nazionale delle Ricerche in Bologna. He moved his research focus in a more humanistic direction by studying Human Computer Interaction in the MSc course at the UCL Interaction Centre, writing a research-based thesis within the Chameleon Project under the supervision of Dr Nadia Bianchi-Berthouze and Tina Gonsalves. He currently divides his time between research projects with scientists from UCL, MIT and Microsoft Research, and applying human-computer interaction techniques to developing applications in the private sector.

References

Bianchi-Berthouze, N., Kleinsmith, A. “A categorical approach to affective gesture recognition”, Connection Science, 15(4), 259-269, 2003.

Bourgeois, P., Hess, U. “The impact of social context on mimicry”, Biological Psychology, 77(3), 343-352, 2008.

Camurri, A., Ricchetti, M., Trocca, M. “EyesWeb: Toward Gesture and Affect Recognition in Dance/Music Interactive Systems”, IEEE International Conference on Multimedia Computing and Systems, 1999.

Ekman, P. “Paul Ekman Group LLC”, http://www.paulekman.com/, last accessed March 2010.

Frith, C., Wolpert, D. “The Neuroscience of Social Interaction: Decoding, Imitating, and Influencing the Actions of Others”, Oxford University Press, 2003.

van der Gaad, C., Minderaa, R.B., Keysers, C. “Facial expressions: What the mirror neuron system can and cannot tell us”, Social Neuroscience, 2(3-4), 179-222, 2007.

Gilroy, S.W., Cavazza, M., Chaignon, R., Mäkelä, S.-M., Niranen, M., André, E., Vogt, T., Urbain, J., Billinghurst, M., Seichter, H., Benayoun, M. “E-Tree: Emotionally Driven Augmented Reality Art”, Proceedings of the 16th ACM International Conference on Multimedia, 945-948, 2008.

Hatfield, E., Cacioppo, J., Rapson, R. “Emotional Contagion”, Review of Personality and Social Psychology, 14: Emotion and Social Behavior, 151-177, Newbury Park, CA: Sage, 1992.

Hess, U., Blairy, S. “Facial mimicry and emotional contagion to dynamic emotional facial expressions and their influence on decoding accuracy”, International Journal of Psychophysiology, 40(2), 129-141, 2001.

el Kaliouby, R., Robinson, P. “Real-time Inference of Complex Mental States from Facial Expressions and Head Gestures”, in Real-Time Vision for HCI, 181-200, Springer-Verlag, 2005.

Khut, G. “Interactive Art, Embodiment and Participation”, http://georgekhut.com/, last accessed March 2010.

Lanzetta, J.T., Englis, B.G. “Expectations of Cooperation and Competition and Their Effects on Observers’ Vicarious Emotional Responses”, Journal of Personality and Social Psychology, 56(4), 543-554, 1989.

Lundqvist, L.O. “Facial EMG reactions to facial expressions: a case of facial emotional contagion?”, Scandinavian Journal of Psychology, 36, 130-141, 1995.

Maulsby, D., Greenberg, S., Mander, R. “Prototyping an Intelligent Agent through Wizard of Oz”, Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems, ACM Press, 277-284, 1993.

Mihalinec, R., Stevanovic, K. “Emotion Recognition and its Aesthetic Interpretation”, IEEE 50th International Symposium ELMAR, 2, 491-494, 2008.

Neumann, R., Strack, F. “Mood contagion: The automatic transfer of mood between persons”, Journal of Personality and Social Psychology, 79(2), 211-223, 2000.

Pantic, M., Pentland, A., Nijholt, A., Huang, T.S. “Human Computing and Machine Understanding of Human Behavior: A Survey”, Artificial Intelligence for Human Computing, 47-71, 2007.

Sato, W., Yoshikawa, S. “Spontaneous facial mimicry in response to dynamic facial expressions”, Cognition, 104(1), 1-18, 2007.

Tamietto, M., de Gelder, B. “Emotional Contagion for Unseen Bodily Expressions: Evidence from Facial EMG”, Proceedings of the FG 2008 Meeting, Amsterdam, 2008.

Tofts, D. “Tina Gonsalves: Unleashing Emotion”, Artlink, 28(2), 2008.

Wright, A., Shinkle, E., Linney, A. “Alter Ego: Computer Reflections of Human Emotions”, Proceedings of the 6th Digital Art Conference, 2005.
