Artificial Emotions
Project Proposal
Possible Topics
- Artificial Emotions
- Brain Control
- Wireless Gaming
- Artificial Intelligence
Our group has decided on the topic "Artificial Emotions".
Possible Thesis Statements
- What is the noticeable trend in artificial emotions research for the short- and long-term future? Does this mean robots will eventually take over most of our jobs by, say, 2060? What are the advantages and disadvantages of artificial emotions research? Will it ultimately help us or harm us?
- Research in artificial emotion is still in its infancy, with multiple aspects of the field under study around the globe; when these studies mature, they have strong potential to improve human-to-Artificial-Intelligence interaction and entertainment.
- Artificial emotions have evolved over recent years; what current trends will make better use of these techniques, now and in the future?
- In what ways could we make better use of artificial emotion techniques in the modern world?
Top Two Selected Thesis Statements
- What is the noticeable trend in artificial emotions research for the short- and long-term future? Does this mean robots will eventually take over most of our jobs by, say, 2060? What are the advantages and disadvantages of artificial emotions research? Will it ultimately help us or harm us?
- Research in artificial emotion is still in its infancy, with multiple aspects of the field under study around the globe; when these studies mature, they have strong potential to improve human-to-Artificial-Intelligence interaction and entertainment.
Selected Thesis Statement
- Research in artificial emotion is still in its infancy, with multiple aspects of the field under study around the globe; when these studies mature, they have strong potential to improve human-to-Artificial-Intelligence interaction and entertainment.
Rough Topic Outline Based Upon Thesis
- Intro
- Current Research/Tools and Techniques
- Modern Examples
- Future Development/Projects
- Conclusion
Initial Research Notes On Selected Thesis
- It has been observed that the actions and dynamism of characters are controlled not only by external stimuli but also by their own individual emotions, personality, mood and attitude [1]. Darwin laid the pioneering foundation of the fundamental theory of emotion (comprising anger, contempt, disgust, fear, joy, sadness and surprise) through the analysis of its expression in human faces. Ekman's study revealed that some emotions are not culturally determined [2].
- Artificial emotion (AE) produces two fundamental components as output: gestures and actions. Actions are a general category and are dependent upon the context of the situation in which the character exists. A simulation's control system uses AE to select and/or modify an action. When selecting an action, AE indicates which actions are appropriate to the character's personality and current mood; a timid character, for example, is unlikely to do anything aggressive. When modifying an action, AE can help to determine how an action is carried out [11].
[1] Mark Ingebretsen, "Toward More Human Video Game Characters", IEEE Intelligent Systems, 1541-1672/08, July-August 2008, pp. 4-7.
[2] P. Ekman, "Facial Expressions and Emotion", American Psychologist, 48(4), 1993, pp. 384-392.
[11] Seungwon Oh and Minsoo Hahn, "Proposal of an Artificial Emotion Expression System Based on External Stimulus and Emotional Elements", Lecture Notes in Computer Science, Springer Berlin/Heidelberg, Volume 5208, 2008, pp. 526-527.
- Characters that display emotion are critical to a rich and believable simulated environment, especially when those characters interact with real people possessing real emotions. Traditionally, animators have painstakingly created these behaviours for pre-rendered animations. This approach, however, is not possible when we wish to use autonomous, interactive characters that possess their own unique personalities and moods. Truly interactive characters must generate their behaviour autonomously through techniques based upon artificial emotion.
http://www.waset.org/journals/waset/v53/v53-151.pdf
- Underlying the investigation of artificial emotions is a well-known cognitive theory, the Two-Factor Theory of Emotion, by psychologists Stanley Schachter and Jerome Singer. The theory states that in order for an emotion to be felt, two factors must be present:
· Physiological change: The person feels an elevated heart rate, sweaty skin and other elements of physiological arousal.
· Cognitive label of the physiological change: Based on the context of the situation, the person assigns a label to the physiological change.
Simply put, when your body reacts physically to some stimulus and your mind assigns meaning to your physical state, you synthesize an emotional response.
- Researchers spent much of their effort on figuring out how to dissect the component aspects of emotion and reassemble them into new emotions of their choice. In effect, they figured out how to reconstitute artificial emotions within their subjects.
- It turns out that the physiological changes that accompany many emotions, such as fear and lust, are remarkably similar. A wide range of stimuli, including loud noises, intense memories or even a fear of heights, activates the sympathetic nervous system, prepping the body for action in the face of stress. Your heart rate elevates. Your palms become sweaty. Your alertness increases and body hair stands on end. Different stimuli, same response.
- Computer reaction to user emotion: When confronted with a (possible) emotional reaction, a respondent has to decide how to react. To a great extent, the reaction will depend on a number of variables, including the confidence the respondent has in the inference that an emotional reaction has occurred, the relationship between the respondent and protagonist, and the subject under discussion. In fact, there is an enormous range of possibilities that can occur in human-to-human interactions, and we must, of course, greatly simplify them for the purposes of a human-computer model. Consequently, we suggest three core courses of action a system might take when confronted with an emotional reaction (Table 3: computer reactions to emotion):
[1] Ignore, proceed as before: the system has for a variety of reasons (explored below) determined that there has been an emotional reaction, but chooses to ignore it. Previous questions may be reiterated, previous processes reviewed, or the next question may be asked with a flag to return later. The emotional reaction may, for example, be to something the system deems peripheral to the core discussion, so rather than pursue it, it may be wise to choose a different tack. If, however, there are continued emotional reactions, then the system can always come back to the issue.
[2] Acknowledge directly: the system notes that an emotional reaction has occurred, and inquires of the user if this is correct. The system might ask why there has been an emotional reaction, or speculate in some way or other as to its significance. The point is that the system deals directly by putting, as it were, the emotional cards on the table.
[3] Acknowledge indirectly: the system notes that an emotional reaction has occurred, and changes strategy accordingly in order to investigate the degree, nature, or source of the reaction. In this case, the system is assuming, for example, that the reaction holds a clue to the underlying grounds for the user's position.
- Practical applications of the topic in game programming: http://www.gamasutra.com/view/feature/1992/constructing_artificial_emotions_.php
- Current development projects, Kismet and Actroid: http://en.wikipedia.org/wiki/Actroid
Initial Research Findings
- Mark Ingebretsen, “Toward More Human Video Game Characters”, IEEE Intelligent Systems, 1541-1672/08, July - August 2008, pp. 4-7.
- P. Ekman, “Facial Expressions and Emotion”, American Psychologist, 48(4), 1993, pp. 384-392.
- Seungwon Oh and Minsoo Hahn, “Proposal of an Artificial Emotion Expression System Based on External Stimulus and Emotional Elements”, Lecture Notes in Computer Science, Springer Berlin/ Heidelberg Volume 5208, 2008, pp. 526-527.
- "Analyzing Artificial Emotion in Game Characters Using Soft Computing" PDF Link: http://www.waset.org/journals/waset/v53/v53-151.pdf
- Practical Applications in Game Programming WebsiteLink: http://www.gamasutra.com/view/feature/1992/constructing_artificial_emotions_.php
- Current Development,"Project: Kismet Actriod, WebsiteLink: http://en.wikipedia.org/wiki/Actroid
- Other Practical Example of Game Programming PDFLink: http://www.cp.eng.chula.ac.th/~vishnu/gameResearch/AI_november_2005/gamasutra.pdf
- C.P. Lee-Johnson and D.A. Carnegie, "Mobile Robot Navigation Modulated by Artificial Emotions," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 40, no. 2, pp. 469-480, April 2010. doi: 10.1109/TSMCB.2009.2026826. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5282573&isnumber=5431005 (discusses what is needed for future development in AE)
- A. Camurri and A. Coglio, "An architecture for emotional agents," IEEE Multimedia, vol. 5, no. 4, pp. 24-33, Oct-Dec 1998. doi: 10.1109/93.735866. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=735866&isnumber=15844 (discusses techniques of modelling AE)
Rough Draft
Artificial Emotion
Emotion is a mental and physiological state associated with a wide variety of feelings, behaviours and thoughts. Emotions are subjective experiences, felt from an individual point of view, and they are part of what sets individuals apart from one another. People display emotions according to their mood, disposition, temperament and personality at the time of interaction. It has been observed that the actions and dynamism of characters are controlled not only by external stimuli but also by their own individual emotions, personality, mood and attitude. Darwin laid the pioneering foundation of the fundamental theory of emotion (comprising anger, contempt, disgust, fear, joy, sadness and surprise) through the analysis of its expression in human faces. Ekman's study revealed that some emotions are not culturally determined.
Artificial emotion produces two fundamental components as output: gestures and actions. Actions are a general category and are dependent upon the context of the situation in which the character exists. A simulation's control system uses artificial emotion to select and/or modify an action. When selecting an action, artificial emotion indicates which actions are appropriate to the character's personality and current mood; a timid character, for example, is unlikely to do anything aggressive. When modifying an action, artificial emotion can help to determine how an action is carried out.
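To make the select/modify distinction concrete, here is a minimal Python sketch; all action names, traits and thresholds are invented for illustration and are not drawn from any cited system. Personality gates what is possible, while mood only shapes how the chosen action is performed.

```python
import random

# Hypothetical sketch of AE-driven action selection and modification.
# Personality traits and mood are scored in [0.0, 1.0].

ACTIONS = {
    "attack": {"aggression": 0.8},  # minimum trait level required
    "flee":   {"timidity": 0.4},
    "greet":  {},                   # no trait requirement
}

def select_actions(personality):
    """Personality gates WHAT is possible: keep only compatible actions."""
    return [name for name, reqs in ACTIONS.items()
            if all(personality.get(trait, 0.0) >= level
                   for trait, level in reqs.items())]

def modify_action(action, mood):
    """Mood shapes HOW the action is performed, not what is done."""
    return {"action": action, "speed": 0.5 + 0.5 * mood, "intensity": mood}

timid = {"timidity": 0.9, "aggression": 0.1}
options = select_actions(timid)             # "attack" is filtered out
print(modify_action(random.choice(options), mood=0.3))
```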
Tools & Techniques
Characters that display emotion are critical to a rich and believable simulated environment, especially when those characters interact with real people possessing real emotions. Traditionally, animators have painstakingly created these behaviours for pre-rendered animations. This approach, however, is not possible when we wish to use autonomous, interactive characters that possess their own unique personalities and moods. Truly interactive characters must generate their behaviour autonomously through techniques based upon artificial emotion.
Psychologists Stanley Schachter and Jerome Singer investigated this territory through a well-known cognitive theory called the Two-Factor Theory of Emotion. The theory states that in order for an emotion to be felt, two factors must be present:
· Physiological change: The person feels an elevated heart rate, sweaty skin and other elements of physiological arousal.
· Cognitive label of the physiological change: Based on the context of the situation, the person assigns a label to the physiological change.
In other words, when your body reacts physically to some stimulus and your mind assigns meaning to your physical state, you synthesize an emotional response.
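As a toy illustration of the theory (the contexts, labels and the 0.5 threshold below are assumptions made for this example, not values from Schachter and Singer's work), the same level of arousal can yield different emotions depending on the cognitive label supplied by context:

```python
# Toy model of the Two-Factor Theory: identical physiological arousal
# yields different emotions depending on the cognitive label assigned
# from context. Labels and the threshold are illustrative only.

def cognitive_label(context):
    """Factor 2: assign a meaning to the arousal based on the situation."""
    return {"dark alley": "fear",
            "rollercoaster": "excitement",
            "first date": "attraction"}.get(context, "unspecified")

def synthesize_emotion(arousal, context):
    # Factor 1: a physiological change must actually be present.
    if arousal < 0.5:
        return "no strong emotion"
    return cognitive_label(context)

# Same arousal, different contexts -> different emotions.
for situation in ("dark alley", "rollercoaster", "first date"):
    print(situation, "->", synthesize_emotion(0.8, situation))
```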
Researchers spent much of their effort figuring out how to dissect the component aspects of emotion and reassemble them into new emotions of their choice; in effect, they figured out how to reconstitute artificial emotions within their subjects. It turns out that the physiological changes that accompany many emotions, such as fear and lust, are remarkably similar. A wide range of stimuli, including loud noises, intense memories or even a fear of heights, activates the sympathetic nervous system, prepping the body for action in the face of stress. Your heart rate elevates. Your palms become sweaty. Your alertness increases and body hair stands on end. Different stimuli, same response.
Another area engineers have been investigating is how a computer should react to user emotion. When confronted with a possible emotional reaction, a respondent has to decide how to react. To a great extent, the reaction will depend on a number of variables, including the confidence the respondent has in the inference that an emotional reaction has occurred, the relationship between the respondent and protagonist, and the subject under discussion. In fact, there is an enormous range of possibilities that can occur in human-to-human interactions, and we must, of course, greatly simplify them for the purposes of a human-computer model. Consequently, we suggest three core courses of action a system might take when confronted with an emotional reaction (a small sketch of this decision logic follows the list):
1. Ignore, proceed as before – the system has, for a variety of reasons, determined that there has been an emotional reaction, but chooses to ignore it. Previous questions may be reiterated, previous processes reviewed, or the next question may be asked with a flag to return later. The emotional reaction may, for example, be to something the system deems peripheral to the core discussion, so rather than pursue it, it may be wise to choose a different tack. If, however, there are continued emotional reactions, then the system can always come back to the issue.
2. Acknowledge directly – the system notes that an emotional reaction has occurred, and inquires of the user if this is correct. The system might ask why there has been an emotional reaction, or speculate in some way or other as to its significance. The point is that the system deals directly by putting, as it were, the emotional cards on the table.
3. Acknowledge indirectly – the system notes that an emotional reaction has occurred, and changes strategy accordingly in order to investigate the degree, nature, or source of the reaction. In this case, the system is assuming, for example, that the reaction holds a clue to the underlying grounds for the user’s position.
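A minimal sketch of this three-way choice (the confidence thresholds and function names are hypothetical, chosen only to show the shape of the dispatch):

```python
from enum import Enum, auto

class Reaction(Enum):
    IGNORE = auto()                  # [1] proceed as before, flag for later
    ACKNOWLEDGE_DIRECTLY = auto()    # [2] put the emotional cards on the table
    ACKNOWLEDGE_INDIRECTLY = auto()  # [3] change strategy and probe

def choose_reaction(confidence, topic_is_core):
    """Pick a strategy from the confidence that an emotional reaction
    occurred and from whether the topic is central to the discussion."""
    if confidence < 0.3 or not topic_is_core:
        return Reaction.IGNORE
    if confidence > 0.8:
        return Reaction.ACKNOWLEDGE_DIRECTLY
    return Reaction.ACKNOWLEDGE_INDIRECTLY

print(choose_reaction(confidence=0.9, topic_is_core=True))
```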
Current Development
Kismet
One of the more successful projects in the research of artificial emotion is an MIT project, Kismet. Kismet is a robotic head that uses vocalizations and human facial expressions to convey emotions. It integrates theories and concepts from social development, psychology, ethology and evolution to enter into natural social interaction with a human caregiver and learn from them, patterned on the parent-infant relationship (Sociable Machines, MIT). Kismet perceives stimuli from visual and auditory channels, and delivers social signals to the human through gaze direction, facial expression, body posture, and vocalization. It uses various systems in its framework to control certain aspects of its actions. These include the attention system, which picks up low-level stimuli and directs the robot's attention to them; the behaviour system, which organizes the robot's behavioural patterns into a task-based coherent structure; the motor system, which controls the robot's movement; and more. In addition, there is the motivation system, which consists of the robot's drives and needs. When a need is met, the intensity level of the corresponding drive is reduced; as a drive intensifies over time, the robot becomes motivated to reduce it (Sociable Machines, MIT). There are videos that show Kismet interacting with humans and conveying emotions. The project is still very much a work in progress, but its success gives a glimpse of the potential of artificial emotion.
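The motivation system can be sketched as a set of homeostatic drives. The following Python fragment is a loose illustration of the idea only; the drive names, rates and ranges are assumptions made for this example, not values from Kismet's actual implementation:

```python
# Loose illustration of a homeostatic motivation system: each drive
# drifts upward over time and is pushed back down when its need is met.
# Names, rates and ranges are assumptions, not Kismet's actual values.

class Drive:
    def __init__(self, name, growth=0.05):
        self.name = name
        self.intensity = 0.0   # 0 = fully satisfied, 1 = urgent
        self.growth = growth

    def tick(self, need_met):
        if need_met:
            self.intensity = max(0.0, self.intensity - 0.2)
        else:
            self.intensity = min(1.0, self.intensity + self.growth)

drives = [Drive("social"), Drive("stimulation"), Drive("fatigue")]
for _ in range(30):
    for d in drives:
        d.tick(need_met=(d.name == "social"))  # only the social need is met

# The most intense drive directs the robot's behaviour this cycle.
urgent = max(drives, key=lambda d: d.intensity)
print(urgent.name, round(urgent.intensity, 2))
```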
Albert Hubo
Albert Hubo is a unique humanoid robot developed by Dr. David Hanson and the Korea Advanced Institute of Science and Technology (KAIST). It is the first walking robot with realistic, humanlike expressions. During development, Dr. Hanson drew on materials science and used a material called "Frubber". Frubber is an elastic polymer that makes realistic facial movement and speech possible while being stronger, more elastic, and using only a fraction of the power required by existing materials. Frubber has the potential to serve future applications in prosthetics, facial reconstruction and many other uses.
Zeno
Aimed at a younger audience, Zeno is Hanson Robotics' smaller, child-friendly expressive robot, applying the same Frubber facial technology on a more affordable scale.
Actroid
Osaka University and Kokoro Company Ltd. developed a humanoid robot with a strong visual human likeness called the Actroid. The robot's appearance is modelled after an average young woman of Japanese descent. The Actroid closely resembles a science-fiction android, with lifelike functions such as blinking, speaking and breathing sounds. Some models are interactive robots with the ability to recognize and process speech and respond in kind. The Actroid has 47 points of articulation. Internal sensors allow the robot to react with a natural appearance by way of air actuators placed at the points of articulation in the upper body. The Actroid can also imitate human-like behaviour with slight shifts in position, head and eye movements and the appearance of breathing in its chest. It can communicate on a rudimentary level with humans by speaking, and it can also interact non-verbally. When addressed, the Actroid uses a combination of "floor sensors and omni-directional vision sensors" to maintain eye contact with the speaker. In addition, the robot can respond in limited ways to body language and tone of voice by changing its own facial expressions, stance and vocal inflection.
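As a rough illustration of the eye-contact mechanism (the sensor interfaces and the fusion weight below are invented for this sketch; the Actroid's real control software is not published in this form), a bearing estimated from floor sensors can be blended with a bearing from the vision system to aim the head:

```python
import math

# Invented sketch of gaze maintenance: fuse a speaker position from the
# floor sensors with a bearing from omni-directional vision to aim the
# head. The sensor interfaces and fusion weight are assumptions.

def floor_bearing(robot_xy, speaker_xy):
    """Angle (radians) from robot to speaker, from floor-sensor positions."""
    dx = speaker_xy[0] - robot_xy[0]
    dy = speaker_xy[1] - robot_xy[1]
    return math.atan2(dy, dx)

def fuse_bearings(floor_b, vision_b, vision_weight=0.7):
    """Weighted blend; vision is trusted more when the speaker is visible."""
    return vision_weight * vision_b + (1 - vision_weight) * floor_b

head_angle = fuse_bearings(floor_bearing((0, 0), (1.0, 0.5)), vision_b=0.5)
print(round(head_angle, 3))  # command this angle to the neck actuators
```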
Use of AE in robotics
Artificial emotions are composed of a sequence of actions and gestures. Actions are a general category and are dependent upon the context of the situation in which the character exists. The two main applications of artificial emotions are robots and video games.
The fundamental objective of the robot's software architecture is to combine various properties associated with 'intelligence', like reactivity, emergence, situatedness, planning, deliberation and motivation. To summarize the approach, the architecture is based on behaviour-producing modules that are dynamically selected using modules that monitor specific conditions. The goals of the robot are managed using internal activation variables called motives. A behaviour-producing module that is selected may or may not be used to control the robot, according to the sensory conditions it monitors and the arbitration mechanism used to coordinate the robot's behaviours. By observing how behaviour-producing modules are used over time, the architecture can infer important information about the overall behaviour of the robot in the environment. We believe this to be essential for the implementation of artificial emotions.
The emotional capability is incorporated into the control architecture as a global background state, allowing conditions to influence and be influenced by all of the architecture's modules. The artificial emotions used for the task are related to temporality, i.e. they take into consideration the limited duration of an individual's life. Sadness, distress and joy contribute to solving this adaptation problem. In our implementation, distress is used to detect external conflicts (like being trapped somewhere, stalling, etc.) or internal conflicts (like the simultaneous activation of too many goals at once). Sadness and joy are used to prioritize the goals of the robot according to what is experienced in the world. These emotions are also used to express the robot's state, and the evolution of these states over time is memorized for the robot's presentation.
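A compact sketch of this arrangement follows; all names, thresholds and update rules are illustrative assumptions, not the published architecture. Motives carry activation levels, distress flags conflicts, and joy/sadness re-rank the robot's goals:

```python
# Illustrative sketch (names, thresholds and update rules assumed) of the
# architecture above: motives are internal activation variables, distress
# flags conflicts, and joy/sadness re-rank the robot's goals.

class Motive:
    def __init__(self, name, activation):
        self.name = name
        self.activation = activation   # internal activation variable

def detect_distress(active_goals, progress):
    """Distress: internal conflict (too many simultaneous goals) or
    external conflict (no progress, e.g. trapped or stalling)."""
    return active_goals > 3 or progress < 0.05

def reprioritize(motives, joy, sadness):
    """Joy reinforces current goals; sadness demotes them."""
    for m in motives:
        m.activation = max(0.0, min(1.0, m.activation + 0.1 * (joy - sadness)))
    return sorted(motives, key=lambda m: m.activation, reverse=True)

motives = [Motive("deliver-message", 0.7), Motive("recharge", 0.4)]
if detect_distress(active_goals=2, progress=0.5):
    print("distress: replan or abandon the current goal")
for m in reprioritize(motives, joy=0.6, sadness=0.1):
    print(m.name, round(m.activation, 2))
```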
Future
As life expectancy grows, people are expected to live longer and longer. Looking back through history, we can see that life has become more appreciated over the centuries and that human rights are more valued.
One line of artificial emotion work that has been in production for quite some time, and has undergone considerable developmental research, is that of the SWORDS units (Special Weapons Observation Remote Direct Action System), the first patrolling armed robots. They were deployed in Iraq during the war.
The artificial emotions of these machines have been in constant development. They are radio-controlled by professional soldiers and therefore cannot discern when or at whom to shoot, so they have, as yet, not fired a shot. The future of these machines would be to make their emotions smarter and enable them to operate as independent fighters.
Edited Draft
http://matrix.senecac.on.ca/~mmistry1/Artifical%20Emotions.doc
Final Draft
http://matrix.senecac.on.ca/~mmistry1/Final_Artifical_Emotions.doc
Thesis Statement
Thesis
- Research in artificial emotion is still in its infancy, with multiple aspects of the field under study around the globe; when these studies mature, they have strong potential to improve human-to-Artificial-Intelligence interaction and entertainment.
Keywords
artificial emotion, enhance, human-to-AI interaction, entertainment.
Bibliography
- Mark Ingebretsen, “Toward More Human Video Game Characters”, IEEE Intelligent Systems, 1541-1672/08, July - August 2008, pp. 4-7.
- P. Ekman, “Facial Expressions and Emotion”, American Psychologist, 48(4), 1993, pp. 384-392.
- Seungwon Oh and Minsoo Hahn, “Proposal of an Artificial Emotion Expression System Based on External Stimulus and Emotional Elements”, Lecture Notes in Computer Science, Springer Berlin/ Heidelberg Volume 5208, 2008, pp. 526-527.
- "Analyzing Artificial Emotion in Game Characters Using Soft Computing" PDF Link: http://www.waset.org/journals/waset/v53/v53-151.pdf
- Practical Applications in Game Programming WebsiteLink: http://www.gamasutra.com/view/feature/1992/constructing_artificial_emotions_.php
- Current Development,"Project: Kismet Actriod, WebsiteLink: http://en.wikipedia.org/wiki/Actroid
- Other Practical Example of Game Programming PDFLink: http://www.cp.eng.chula.ac.th/~vishnu/gameResearch/AI_november_2005/gamasutra.pdf
- C.P. Lee-Johnson and D.A. Carnegie, "Mobile Robot Navigation Modulated by Artificial Emotions," IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 40, no. 2, pp. 469-480, April 2010. doi: 10.1109/TSMCB.2009.2026826. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=5282573&isnumber=5431005 (discusses what is needed for future development in AE)
- A. Camurri and A. Coglio, "An architecture for emotional agents," IEEE Multimedia, vol. 5, no. 4, pp. 24-33, Oct-Dec 1998. doi: 10.1109/93.735866. URL: http://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=735866&isnumber=15844 (discusses techniques of modelling AE)