|
Volume 14, Issue 1, 2002, Pages 99-105
|
Facial expressions modulate the time course of long latency auditory brain potentials
|
Author keywords
Audio visual integration; Cross modal bias; Evoked potential; Facial expression; Multisensory object recognition; P2b
|
Indexed keywords
ADULT; AUDITORY STIMULATION; BEHAVIOR; CINGULATE GYRUS; CONFERENCE PAPER; CONTROLLED STUDY; DIPOLE; ELECTROENCEPHALOGRAM; ELECTROPHYSIOLOGY; EMOTION; EVOKED AUDITORY RESPONSE; FACIAL EXPRESSION; HUMAN; HUMAN EXPERIMENT; LATENT PERIOD; NORMAL HUMAN; PRIORITY JOURNAL; REACTION TIME; SPEECH ARTICULATION; STIMULUS RESPONSE; TASK PERFORMANCE; VOICE; WAVEFORM; ANALYSIS OF VARIANCE; ARTICLE; COMPARATIVE STUDY; METHODOLOGY; PHYSIOLOGY; TIME; ACOUSTIC STIMULATION; EMOTIONS; EVOKED POTENTIALS, AUDITORY; TIME FACTORS; HUMANS
|
EID: 0036594804
PISSN: 0926-6410
EISSN: None
Source Type: Journal
DOI: 10.1016/S0926-6410(02)00064-2
Document Type: Conference Paper
|
Times cited: 56
|
References (30)
|