Volume 82, Issue 1, 2016, Pages 101-133

A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI

Author keywords

Affect classification models; Automated affect detection; Body language; Facial expressions; Human robot interactions; Multi modal; Physiological signals; Voice

Indexed keywords

AUTOMATION; COMPUTATIONAL LINGUISTICS; FACE RECOGNITION; HUMAN COMPUTER INTERACTION; HUMAN ROBOT INTERACTION; INTELLIGENT ROBOTS; MAN MACHINE SYSTEMS; PHYSIOLOGICAL MODELS; PHYSIOLOGY; ROBOTICS; ROBOTS; SIGNAL DETECTION; SPEECH; SPEECH RECOGNITION; SURVEYS;

EID: 84961181969     PISSN: 0921-0296     EISSN: 1573-0409     Source Type: Journal
DOI: 10.1007/s10846-015-0259-2     Document Type: Article
Times cited: 93

References (214)
  • 2. Valero, A., Randelli, G., Botta, F.: Operator performance in exploration robotics. J. Intell. Robot. Syst. 64(3-4), 365–385 (2011)
  • 3. Rosenthal, S., Veloso, M.: Is someone in this office available to help me? J. Intell. Robot. Syst. 66(2), 205–221 (2011)
  • 4. Swangnetr, M., Kaber, D.: Emotional state classification in patient–robot interaction using wavelet analysis and statistics-based feature selection. IEEE Trans. Human-Machine Syst. 43(1), 63–75 (2013)
  • 5. McColl, D., Nejat, G.: Determining the affective body language of older adults during socially assistive HRI. In: Proceedings of the IEEE Int. Conf. on Intell. Robots Syst., pp. 2633–2638 (2014)
  • 6. Liu, C., Conn, K., Sarkar, N., Stone, W.: Online affect detection and robot behavior adaptation for intervention of children with autism. IEEE Trans. Robot. 24(4), 883–896 (2008)
  • 9. Keltner, D., Haidt, J.: Social functions of emotions at four levels of analysis. Cogn. Emot. 13(5), 505–521 (1999)
  • 11. Picard, R.: Affective computing. MIT Press (2000)
  • 12. Sorbello, R., Chella, A., Calí, C.: Telenoid android robot as an embodied perceptual social regulation medium engaging natural human–humanoid interaction. Robot. Auton. Syst. 62(9), 1329–1341 (2014)
  • 14. McColl, D., Nejat, G.: Meal-time with a socially assistive robot and older adults at a long-term care facility. J. Human-Robot Interaction 2(1), 152–171 (2013)
  • 15. Kanda, T., Ishiguro, H., Imai, M., Ono, T.: Development and evaluation of interactive humanoid robots. Proc. IEEE 92(11), 1839–1850 (2004)
  • 17. Hinds, P.J., Roberts, T.L., Jones, H.: Whose Job Is It Anyway? A Study of Human-Robot Interaction in a Collaborative Task. J. Human-Computer Interaction 19(1), 151–181 (2004)
  • 18. Längle, T., Wörn, H.: Human–Robot Cooperation Using Multi-Agent-Systems. J. Intell. Robot. Syst. 32(2), 143–160 (2001)
  • 21. Ettelt, E., Furtwängler, R.: Design issues of a semi-autonomous robotic assistant for the health care environment. J. Intell. Robot. Syst. 22(3-4), 191–209 (1998)
  • 22. Conn, K., Liu, C., Sarkar, N.: Towards affect-sensitive assistive intervention technologies for children with autism. In: Affective Computing: Focus on Emotion Expression, Synthesis and Recognition, pp. 365–390 (2008)
  • 24. Breazeal, C., Scassellati, B.: Robots that imitate humans. Trends Cogn. Sci. 6(11), 481–487 (2002)
  • 25. Bourgeois, P., Hess, U.: The impact of social context on mimicry. Biol. Psychol. 77(3), 343–352 (2008)
  • 26. Fasel, B., Luettin, J.: Automatic facial expression analysis: a survey. Pattern Recogn. 36(1), 259–275 (2003)
  • 27. Zeng, Z., Pantic, M.: A survey of affect recognition methods: Audio, visual, and spontaneous expressions. IEEE Trans. Pattern Anal. Mach. Intell. 31(1), 39–58 (2009)
  • 28. Calvo, R., D’Mello, S.: Affect detection: An interdisciplinary review of models, methods, and their applications. IEEE Trans. Affective Comput. 1(1), 18–37 (2010)
  • 29. Kleinsmith, A., Bianchi-Berthouze, N.: Affective body expression perception and recognition: A survey. IEEE Trans. Affective Comput. 4(1), 15–33 (2013)
  • 31. Rajruangrabin, J., Popa, D.: Robot head motion control with an emphasis on realism of neck–eye coordination during object tracking. J. Intell. Robot. Syst. 63(2), 163–190 (2011)
  • 32. Park, J., Lee, H., Chung, M.: Generation of realistic robot facial expressions for human robot interaction. J. Intell. Robot. Syst. (2014). doi:10.1007/s10846-014-0066-1
  • 34. Shinozawa, K., Naya, F., Yamato, J., Kogure, K.: Differences in effect of robot and screen agent recommendations on human decision-making. Int. J. Human Comput. Stud. 62(2), 267–279 (2005)
  • 36. Yan, H., Ang, M.H., Poo, A.N.: A Survey on Perception Methods for Human–Robot Interaction in Social Robots. Int. J. Soc. Robot. 6(1), 85–119 (2014)
  • 37. Martinez, A., Du, S.: A Model of the Perception of Facial Expressions of Emotion by Humans: Research Overview and Perspectives. J. Mach. Learn. Res. 13(1), 1589–1608 (2012)
  • 38. Niedenthal, P.M., Halberstadt, J.B., Setterlund, M.B.: Being happy and seeing “happy”: emotional state mediates visual word recognition. Cogn. Emot. 11(4), 403–432 (1997)
  • 40. Darwin, C.: The expression of the emotions in man and animals. Amer. J. Med. Sci. 232(4), 477 (1956)
  • 41. Tomkins, S.: Affect, imagery, consciousness: vol. I. The positive affects. Oxford, England (1962)
  • 42. Tomkins, S.: Affect, imagery, consciousness: vol. II. The negative affects. Oxford, England (1963)
  • 43. Ekman, P., Friesen, W.V.: Constants across cultures in the face and emotion. J. Pers. Soc. Psychol. 17(2), 124–129 (1971)
  • 44. Ekman, P., Friesen, W., Ellsworth, P.: Emotion in the human face: Guidelines for research and an integration of findings. Pergamon Press (1972)
  • 45. Bann, E.Y., Bryson, J.J.: The Conceptualisation of Emotion Qualia: Semantic Clustering of Emotional Tweets. Prog. Neural Process. 21, 249–263 (2012)
  • 48. Schlosberg, H.: Three dimensions of emotion. Psychol. Rev. 61(2), 81–88 (1954)
  • 49. Trnka, R., Balcar, K., Kuska, M.: Re-constructing Emotional Spaces: From Experience to Regulation. Prague Psychosocial Press (2011)
  • 50. Plutchik, R., Conte, H.: Circumplex models of personality and emotions. Washington, DC (1997)
  • 51. Mehrabian, A.: Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr. Psychol. 14(4), 261–292 (1996)
  • 52. Russell, J.A.: A circumplex model of affect. J. Pers. Soc. Psychol. 39(6), 1161–1178 (1980)
  • 54. Rubin, D.C., Talarico, J.M.: A comparison of dimensional models of emotion: evidence from emotions, prototypical events, autobiographical memories, and words. Memory 17(8), 802–808 (2009)
  • 55. Watson, D., Tellegen, A.: Toward a consensual structure of mood. Psychol. Bull. 98(2), 219–235 (1985)
  • 56. Barrett, L.F.: Discrete emotions or dimensions? The role of valence focus and arousal focus. Cogn. Emot. 12(4), 579–599 (1998)
  • 58. Wimmer, M., MacDonald, B.A., Jayamuni, D., Yadav, A.: Facial Expression Recognition for Human-robot Interaction–A Prototype. Robot Vision 4931, 139–152 (2008)
  • 62. Barakova, E., Lourens, T.: Expressing and interpreting emotional movements in social games with robots. Personal Ubiquitous Comput. 14(5), 457–467 (2010)
  • 65. Kim, H.R., Kwon, D.S.: Computational model of emotion generation for human–robot interaction based on the cognitive appraisal theory. J. Intell. Robot. Syst. 60(2), 263–283 (2010)
  • 67. Hyun, K.H., Kim, E.H., Kwak, Y.K.: Emotional feature extraction method based on the concentration of phoneme influence for human–robot interaction. Adv. Robot. 24(1-2), 47–67 (2010)
  • 68. Yun, S., Yoo, C.D.: Speech emotion recognition via a max-margin framework incorporating a loss function based on the Watson and Tellegen’s emotion model. In: Proceedings of the IEEE Int. Conf. on Acoustics, Speech and Signal Processing, pp. 4169–4172 (2009)
  • 69. Kim, E.H., Hyun, K.H., Kim, S.H.: Improved emotion recognition with a novel speaker-independent feature. IEEE/ASME Trans. Mechatron. 14(3), 317–325 (2009)
  • 71. Rani, P., Liu, C., Sarkar, N., Vanman, E.: An empirical study of machine learning techniques for affect recognition in human–robot interaction. Pattern Anal. Applicat. 9(1), 58–69 (2006)
  • 73. Lazzeri, N., Mazzei, D., De Rossi, D.: Development and Testing of a Multimodal Acquisition Platform for Human-Robot Interaction Affective Studies. J. Human-Robot Interaction 3(2), 1–24 (2014)
  • 74. Paleari, M., Chellali, R., Huet, B.: Bimodal emotion recognition. Soc. Robot. 6414, 305–314 (2010)
  • 75. Cid, F., Prado, J.A., Bustos, P., Nunez, P.: A real time and robust facial expression recognition and imitation approach for affective human-robot interaction using Gabor filtering. In: Proceedings of the IEEE/RSJ Int. Conf. on Intell. Robots and Syst., pp. 2188–2193 (2013)
  • 76. Schacter, D., Wang, C., Nejat, G., Benhabib, B.: A two-dimensional facial-affect estimation system for human–robot interaction using facial expression parameters. Adv. Robot. 27(4), 259–273 (2013)
  • 82. Tahon, M.: Usual voice quality features and glottal features for emotional valence detection. In: Proceedings of Speech Prosody, pp. 1–8 (2012)
  • 83. Kulic, D., Croft, E.A.: Affective State Estimation for Human-Robot Interaction. IEEE Trans. Robot. 23(5), 991–1000 (2007)
  • 86. Broadbent, E., Lee, Y.I., Stafford, R.Q., Kuo, I.H.: Mental schemas of robots as more human-like are associated with higher blood pressure and negative emotions in a human-robot interaction. Int. J. Soc. Robot. 3(3), 291–297 (2011)
  • 91. Lim, A., Ogata, T., Okuno, H.G.: Towards expressive musical robots: a cross-modal framework for emotional gesture, voice and music. EURASIP J. Audio, Speech, and Music Processing 2012(1), 1–12 (2012)
  • 94. Fridlund, A.J., Ekman, P., Oster, H.: Facial expressions of emotion. In: Nonverbal Behavior and Communication (2nd ed.), pp. 143–223 (1987)
  • 95. Fridlund, A.J.: Human facial expression: An evolutionary view. Academic Press (1994)
  • 97. Yang, Y., Ge, S.S., Lee, T.H., Wang, C.: Facial expression recognition and tracking for intelligent human-robot interaction. J. Intell. Serv. Robot. 1(2), 143–157 (2008)
  • 98. Tapus, A., Matarić, M., Scassellati, B.: The grand challenges in socially assistive robotics. IEEE Robot. Autom. Mag. 14(1), 1–7 (2007)
  • 101
  • 105. Boucenna, S., Gaussier, P., Andry, P., Hafemeister, L.: A robot learns the facial expressions recognition and face/non-face discrimination through an imitation game. Int. J. Soc. Robot. 6(4), 633–652 (2014)
  • 107. García Bueno, J., González-Fierro, M., Moreno, L., Balaguer, C.: Facial emotion recognition and adaptative postural reaction by a humanoid based on neural evolution. Int. J. Adv. Comput. Sci. 3(10), 481–493 (2013)
  • 109. Seeing Machines: faceAPI. http://www.seeingmachines.com/product/faceapi/ (2009)
  • 113. Kanade, T., Cohn, J.F., Tian, Y.: Comprehensive database for facial expression analysis. In: Proceedings of the IEEE Int. Conf. on Automatic Face and Gesture Recognition, pp. 46–53 (2000)
  • 114. Schapire, R.E., Singer, Y.: Improved boosting algorithms using confidence-rated predictions. Mach. Learn. 37(3), 297–336 (1999)
  • 116. Dornaika, F., Raducanu, B.: Efficient Facial Expression Recognition for Human Robot Interaction. In: Computational and Ambient Intelligence, pp. 700–708 (2007)
  • 118. Sim, T., Baker, S., Bsat, M.: The CMU pose, illumination, and expression database. IEEE Trans. Pattern Anal. Mach. Intell. 25(12), 1615–1618 (2003)
  • 120. Lee, D.D., Seung, H.S.: Learning the parts of objects by non-negative matrix factorization. Nature 401(6755), 788–791 (1999)
  • 121. Fausett, L.: Fundamentals of Neural Networks: Architectures, Algorithms, and Applications. Prentice-Hall Int. (1994)
  • 126. Nagamachi, M.: Kansei engineering: a new ergonomic consumer-oriented technology for product development. Int. J. Ind. Ergon. 15(1), 3–11 (1995)
  • 128. Yamada, H.: Models of perceptual judgment of emotion from facial expressions. Japanese Psychol. Rev. (2000)
  • 129. Viola, P., Jones, M.J.: Robust real-time face detection. Int. J. Comput. Vis. 57(2), 137–154 (2004)
  • 132. Strupp, S., Schmitz, N., Berns, K.: Visual-based emotion detection for natural man-machine interaction. In: Advanced Artificial Intell., pp. 356–363 (2008)
  • 136. Cattinelli, I., Goldwurm, M., Borghese, N.A.: Interacting with an artificial partner: modeling the role of emotional aspects. Biol. Cybern. 99(6), 473–489 (2008)
  • 138. Anderson, K., McOwan, P.W.: A real-time automated system for the recognition of human facial expressions. IEEE Trans. Syst. Man Cybern. Part B 36(1), 96–105 (2006)
  • 143. Lee, Y.B., Moon, S.B., Kim, Y.G.: Face and Facial Expression Recognition with an Embedded System for Human-Robot Interaction. Affective Computing and Intell. Interaction 3784, 271–278 (2005)
  • 145. Mehrabian, A.: Significance of posture and position in the communication of attitude and status relationships. Psychol. Bull. 71(5), 359–372 (1969)
  • 146. Montepare, J., Koff, E., Zaitchik, D., Albert, M.: The use of body movements and gestures as cues to emotions in younger and older adults. J. Nonverbal Behav. 23(2), 133–152 (1999)
  • 147. Wallbott, H.: Bodily expression of emotion. Eur. J. Soc. Psychol. 28(6), 879–896 (1998)
  • 149. Gross, M., Crane, E., Fredrickson, B.: Effort-shape and kinematic assessment of bodily expression of emotion during gait. Human Movement Sci. 31(1), 202–221 (2012)
  • 150. Xu, D., Wu, X., Chen, Y., Xu, Y.: Online dynamic gesture recognition for human robot interaction. J. Intell. Robot. Syst. (2014). doi:10.1007/s10846-014-0039-4
  • 151. Hasanuzzaman, M.: Gesture-based human-robot interaction using a knowledge-based software platform. Industrial Robot: An Int. J. 33(1), 37–49 (2006)
  • 152. Yan, R., Tee, K., Chua, Y.: Gesture Recognition Based on Localist Attractor Networks with Application to Robot Control. IEEE Computational Intell. Mag., pp. 64–74 (2012)
  • 153. Suryawanshi, D., Khandelwal, C.: An integrated color and hand gesture recognition control for wireless robot. Int. J. Adv. Eng. Tech. 3(1), 427–435 (2012)
  • 154. Obaid, M., Kistler, F., Häring, M.: A framework for user-defined body gestures to control a humanoid robot. Int. J. Soc. Robot. 6(3), 383–396 (2014)
  • 156. Waldherr, S., Romero, R., Thrun, S.: A gesture based interface for human-robot interaction. Auton. Robot. 9(2), 151–173 (2000)
  • 159. Burger, B., Ferrané, I., Lerasle, F.: Multimodal interaction abilities for a robot companion. Comput. Vis. Syst. 5008, 549–558 (2008)
  • 161. Becker, M., Kefalea, E., Maël, E.: GripSee: A gesture-controlled robot for object perception and manipulation. Auton. Robot. 6(2), 203–221 (1999)
  • 162. Gerlich, L., Parsons, B., White, A.: Gesture recognition for control of rehabilitation robots. Cogn. Tech. Work 9(4), 189–207 (2007)
  • 165. Davis, M., Hadiks, D.: Non-verbal aspects of therapist attunement. J. Clin. Psychol. 50(3), 393–405 (1994)
  • 167. Microsoft: Kinect for Windows Programming Guide. http://msdn.microsoft.com/en-us/library/hh855348.aspx (2014)
  • 168. Lourens, T., van Berkel, R., Barakova, E.: Communicating emotions and mental states to robots in a real time parallel framework using Laban movement analysis. Robot. Auton. Syst. 58(12), 1256–1265 (2010)
  • 169. Bianchi-Berthouze, N., Kleinsmith, A.: A categorical approach to affective gesture recognition. Connect. Sci. 15(4), 259–269 (2003)
  • 170. Samadani, A., Kubica, E., Gorbet, R., Kulic, D.: Perception and generation of affective hand movements. Int. J. Soc. Robot. 5(1), 35–51 (2013)
  • 171. Glowinski, D., Dael, N., Camurri, A.: Toward a minimal representation of affective gestures. IEEE Trans. Affective Comput. 2(2), 106–118 (2011)
  • 173. Scherer, K.: Expression of emotion in voice and music. J. Voice 9(3), 235–248 (1995)
  • 174. Johnstone, T.: The effect of emotion on voice production and speech acoustics. University of Western Australia (2001)
  • 175. Scherer, K., Bänziger, T.: Emotional expression in prosody: a review and an agenda for future research. In: Proceedings of the Speech Prosody, pp. 359–366 (2004)
  • 176. Johnstone, T., Scherer, K.: Vocal communication of emotion. In: Handbook of Emotion, pp. 220–235. Guilford, New York (2000)
  • 177. Goudbeek, M., Scherer, K.: Beyond arousal: Valence and potency/control cues in the vocal expression of emotion. J. Acoust. Soc. 128, 1322 (2010)
  • 179. Song, K., Han, M., Wang, S.: Speech signal-based emotion recognition and its application to entertainment robots. J. Chinese Inst. Eng. 37(1), 14–25 (2014)
  • 180. Burkhardt, F., Paeschke, A., Rolfes, M.: A database of German emotional speech. Interspeech 5, 1517–1520 (2005)
  • 181. Park, J., Kim, J., Oh, Y.: Feature vector classification based speech emotion recognition for service robots. IEEE Trans. Consum. Electron. 55(3), 1590–1596 (2009)
  • 185. Kim, E., Hyun, K.: Speech emotion recognition separately from voiced and unvoiced sound for emotional interaction robot. In: Proceedings of the Int. Conf. Control, Automation, and Syst., pp. 2014–2019 (2008)
  • 186. Liu, H., Zhang, W.: Mandarin emotion recognition based on multifractal theory towards human-robot interaction. In: Proceedings of the IEEE Int. Conf. Robot. Biomimetics, pp. 593–598 (2013)
  • 187. Roh, Y., Kim, D., Lee, W., Hong, K.: Novel acoustic features for speech emotion recognition. Sci. China Series E: J. Techn. Sci. 52(7), 1838–1848 (2009)
  • 189. Kreibig, S.: Autonomic nervous system activity in emotion: A review. Biol. Psychol. 84(3), 394–421 (2010)
  • 190. Rani, P., Sarkar, N., Smith, C., Kirby, L.: Anxiety detecting robotic system–towards implicit human-robot collaboration. Robotica 22(1), 85–95 (2004)
  • 191. Smith, C.: Dimensions of appraisal and physiological response in emotion. J. Pers. Soc. Psychol. 56(3), 339–353 (1989)
  • 192
  • 193. Kulic, D., Croft, E.: Affective state estimation for human–robot interaction. IEEE Trans. Robot. 23(5), 991–1000 (2007)
  • 194. Watson, D., Clark, L., Tellegen, A.: Development and validation of brief measures of positive and negative affect: the PANAS scales. J. Pers. Soc. Psychol. 54(6), 1063–1070 (1988)
  • 195. Savitzky, A., Golay, M.J.: Smoothing and differentiation of data by simplified least squares procedures. Anal. Chem. 36(8), 1627–1639 (1964)
  • 196. Paradiso, R., Loriga, G., Taccini, N.: A wearable health care system based on knitted integrated sensors. IEEE Trans. Inf. Technol. Biomed. 9(3), 337–344 (2005)
  • 198. Valenza, G., Lanata, A., Scilingo, E.P.: The role of nonlinear dynamics in affective valence and arousal recognition. IEEE Trans. Affective Comput. 3(2), 237–249 (2012)
  • 199. Cid, F., Moreno, J., Bustos, P., Núñez, P.: Muecas: a multi-sensor robotic head for affective human robot interaction and imitation. Sensors 14(5), 7711–7737 (2014)
  • 200. Alonso-Martín, F., Malfaz, M., Sequeira, J., Gorostiza, J.F., Salichs, M.A.: A multimodal emotion detection system during human-robot interaction. Sensors 13(11), 15549–15581 (2013)
  • 203. Lim, A., Okuno, H.G.: The MEI Robot: Towards Using Motherese to Develop Multimodal Emotional Intelligence. IEEE Trans. Auton. Ment. Dev. 6(2), 126–138 (2014)
  • 204. Ma, Y., Paterson, H., Pollick, F.: A motion capture library for the study of identity, gender, and emotion perception from biological motion. Behav. Res. Methods 38(1), 134–141 (2006)
  • 205. Prado, J.A., Simplício, C., Lori, N.F., Dias, J.: Visuo-auditory multimodal emotional structure to improve human-robot-interaction. Int. J. Soc. Robot. 4(1), 29–51 (2011)
  • 206. Boersma, P.: Praat, a system for doing phonetics by computer. Glot Int. 5(9-10), 341–345 (2001)
  • 208. Rabie, A., Handmann, U.: Fusion of audio- and visual cues for real-life emotional human robot interaction. In: Pattern Recognition, vol. 6835, pp. 346–355 (2011)
  • 210. Castrillón, M., Déniz, O., Guerra, C., Hernández, M.: ENCARA2: Real-time detection of multiple faces at different resolutions in video streams. J. Vis. Commun. Image Represent. 18(2), 130–140 (2007)
  • 211. Vogt, T., André, E., Bee, N.: EmoVoice—A framework for online recognition of emotions from voice. In: Perception in Multimodal Dialogue Syst., pp. 188–199 (2008)
  • 213. Strait, M., Scheutz, M.: Measuring users’ responses to humans, robots, and human-like robots with functional near infrared spectroscopy. In: Proceedings of the IEEE Int. Symp. Robot and Human Interactive Communication, pp. 1128–1133 (2014)
  • 214. Hareli, S., Parkinson, B.: What’s social about social emotions? J. Theory Soc. Behav. 38(2), 131–156 (2008)


* This information was extracted and analyzed by KISTI from Elsevier's SCOPUS database.