- Full Description
Affective information processing gives computers human-like capabilities for observing, interpreting and generating affective features. It is an important topic for harmonious human-computer interaction, as it improves both the quality of human-computer communication and the intelligence of the computer. Surveying the state of the art of research in affective information processing, this book summarises key technologies such as facial expression recognition, face animation, emotional speech synthesis, intelligent agents and virtual reality. The detailed discussion covers a wide range of topics, including open problems that challenge and extend current research. Written to give scientists, engineers and graduate students an opportunity to learn about the problems, solutions and technologies in this area, the book offers insight and will prove a valuable reference tool.
- Table of Contents
- Affect and Emotions in Intelligent Agents: Why and How?
- Cognitive Emotion Modeling in Natural Language Communication.
- A Linguistic Interpretation of the OCC Emotion Model for Affect Sensing from Text.
- Affective Agents for Education Against Bullying.
- Emotion Perception and Recognition from Speech.
- Expressive Speech Synthesis: Past, Present and Possible Futures.
- Emotional Speech Generation by Using Statistic Prosody Conversion Methods.
- Why the Same Expression may not Mean the Same When Shown on Different Faces or Seen by Different People.
- Automatic Facial Action Unit Recognition by Modeling their Semantic and Dynamic Relationships.
- Face Animation Based on Large Audio-Visual Database.
- Affect in Multimodal Information.
- Non-verbal Feedback in Interactions.
- Emotion Recognition Based on Multimodal Information.
- A Multimodal Corpus Approach for the Study of Spontaneous Emotions.
- Physiological Sensing for Affective Computing.
- Evolutionary Expression of Emotions in Virtual Humans using Lights and Pixels.