Media Technology
Research Centre
University of Bath
  Animating Virtual Humans
  Autonomous avatars and virtual humans are the subject of much research aimed at making them act in a human way. The architecture of these virtual characters can be very complex, including modules that simulate cognitive processes, emotions, personality, vision, hearing, speech and so on.

Our work is concerned with the expressiveness of virtual humans. Even with the most complex architecture, a virtual character will never be believable unless real humans can sense feeling in its actions. Emotions play a very important role in human behaviour: they influence perception, decision making and memory. They also influence how actions are carried out; this is what we call the expressiveness of a virtual character.

We have developed a Dynamic Emotion Representation (DER) which is used to control the expressiveness of virtual characters. Our model represents three types of emotion, each operating over a different timescale. Primary Emotions give reactivity and adaptivity by triggering emotional reaction patterns in less than a second. Secondary Emotions give an emotional value to objects involved in the cognitive processes, modify the perception of the world and change how actions are carried out; they may last up to a few minutes. Finally, the Mood Influence takes into consideration the general state of the character over hours or even days.
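As a rough illustration of this layered idea, the Python sketch below models three emotion layers that decay over very different half-lives. It is a toy example rather than the DER itself: the class, the blending weights and the half-life values are all assumptions chosen only to match the timescales described above.

```python
class LayeredEmotionState:
    """Toy sketch of a layered emotion representation (not the DER itself).

    Each layer holds one signed intensity in [-1, 1] (negative for sad,
    positive for happy) that decays exponentially with its own half-life,
    so a stimulus fades from the primary layer in under a second, from the
    secondary layer over minutes, and from the mood layer over hours.
    """

    # Half-lives in seconds, chosen only to match the timescales in the text.
    HALF_LIVES = {"primary": 0.5, "secondary": 120.0, "mood": 6 * 3600.0}

    def __init__(self):
        self.levels = {name: 0.0 for name in self.HALF_LIVES}

    def stimulate(self, strength):
        """Add a signed emotional stimulus to every layer, clamped to [-1, 1]."""
        for name in self.levels:
            self.levels[name] = max(-1.0, min(1.0, self.levels[name] + strength))

    def update(self, dt):
        """Decay each layer towards neutral according to its own half-life."""
        for name, half_life in self.HALF_LIVES.items():
            self.levels[name] *= 0.5 ** (dt / half_life)

    def expressiveness(self):
        """Blend the layers into one value used to bias how an action looks."""
        return (0.6 * self.levels["primary"]
                + 0.3 * self.levels["secondary"]
                + 0.1 * self.levels["mood"])
```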

pictures of synthesised happy and sad faces
Our emotionally reactive talking head takes tagged text as its input, and its DER controls the expressiveness of the virtual character. Both of the above sequences are produced from the same piece of tagged text: in the upper one the character starts very happy, while in the lower one he starts very sad. The responses unfold in very different ways, and of course the DER is itself updated by these new inputs, so a different response again will occur subsequently.
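To illustrate how the same tagged input can unfold differently, the fragment below reuses the toy LayeredEmotionState sketch above to replay one sequence of emotion tags from two different starting moods. The tag format, strengths and starting values are invented for the example and are not the mark-up actually used by the talking head.

```python
def replay(tagged_text, initial_mood):
    """Replay the same tagged text from a given starting mood."""
    state = LayeredEmotionState()          # the toy sketch defined above
    state.levels["mood"] = initial_mood
    trace = []
    for tag, strength in tagged_text:
        state.stimulate(strength if tag == "happy" else -strength)
        state.update(dt=1.0)               # say one second per utterance
        trace.append(round(state.expressiveness(), 2))
    return trace

script = [("happy", 0.4), ("sad", 0.6), ("happy", 0.3)]
print(replay(script, initial_mood=0.9))    # character starts very happy
print(replay(script, initial_mood=-0.9))   # character starts very sad
```

The two traces diverge because the slowly decaying mood layer keeps colouring every subsequent expression, which is the behaviour the two image sequences above are meant to show.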
  Researchers
 
Joanna Bryson, Emmanuel Tanguy, Phil Willis
  Research Results
  The software is complete and can demonstrate a real-time face which responds differently according to its history. It can be linked to a real-time chatbot, talk to you and display its emotions visually.

See also Emmanuel Tanguy's home page.

Virtual Human Papers

Our papers are available for download, listed by name or by year.
  • Emotions as Durative Dynamic State for Action Selection. Emmanuel Tanguy, Philip Willis, Joanna Bryson, Proc. 20th International Joint Conference on Artificial Intelligence (IJCAI), 2007, pp. 1537-1542.
  • A Dynamic Emotion Representation Model within a Facial Animation System. E Tanguy, P J Willis, J J Bryson, International Journal of Humanoid Robotics, Vol. 3, No. 3, pp. 293-300, Sept 2006.
  • Emotions: the Art of Communication Applied to Virtual Actors, E. A. R. Tanguy, PhD thesis, Department of Computer Science, University of Bath, June 2006. CSBU-2006-06. (ISSN 1740-9497)
  • The Role of Emotions in Modular Intelligent Control, Joanna J Bryson, Emmanuel Tanguy and Philip Willis; AISB Quarterly, Number 117, pages 1 and 6, Summer 2004.
  • A Layered Dynamic Emotion Representation for the Creation of Complex Facial Expressions, Emmanuel Tanguy, Philip Willis and Joanna J. Bryson; 4th International Workshop on Intelligent Virtual Agents (IVA 2003), Kloster Irsee, Germany, 15-17 September 2003, pages 101-105 (Springer Verlag LNCS series).
  • FACES: The Facial Animation, Construction and Editing System, Manjula Patel and Philip J. Willis, Proc. Eurographics '91, pp. 33-45, Sept 91.
  • Colouration Issues in Computer Generated Facial Animation, Manjula Patel, Computer Graphics Forum 14(2), June 1995, pp. 117-126.
  • Emotional Posturing: A Method Towards Achieving Emotional Figure Animation, Daniel J Densley and Philip Willis, Computer Animation '97, Geneva, Conference Proceedings (IEEE June 1997), pp. 8-14. ISBN 0-8186-7984-0
  Funding Agency
  The recent work is funded by the Department of Computer Science, University of Bath. Earlier work was funded by EPSRC.
 