2012, Pages 4085-4088

Revisiting recurrent neural networks for robust ASR

Author keywords

Automatic Speech Recognition; Deep Learning; Recurrent Neural Networks

Indexed keywords

AUTOMATIC SPEECH RECOGNITION; CLEAN SPEECH; DEEP LEARNING; HIDDEN STATE; HIGH NOISE; MARKOVIAN DYNAMICS; MULTI-LAYER PERCEPTRONS; NETWORK STRUCTURES; NOISE CONDITIONS; NONLINEAR FUNCTIONS; OPTIMIZATION TECHNIQUES; PRE-TRAINING; ROBUST ASR; SECOND ORDERS;

EID: 84867626068     PISSN: 15206149     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/ICASSP.2012.6288816     Document Type: Conference Paper
Times cited : (130)

References (18)
  • 1. H. Hermansky and S. Sharma, "Temporal patterns (TRAPS) in ASR of noisy speech," in ICASSP, 1999.
  • 2. O. Vinyals and S. V. Ravuri, "Comparing multilayer perceptron to Deep Belief Network Tandem features for robust ASR," in ICASSP, 2011.
  • 3. A. Mohamed, D. Yu, and L. Deng, "Investigation of Full-Sequence Training of Deep Belief Networks for Speech Recognition," in Interspeech, 2010.
  • 5. T. Robinson, M. Hochberg, and S. Renals, "IPA: Improved phone modelling with recurrent neural networks," in ICASSP. IEEE, 1994.
  • 6. H. Hermansky, D. P. W. Ellis, and S. Sharma, "Tandem Connectionist Feature Extraction for Conventional HMM Systems," in ICASSP, 2000.
  • 7. S. Sharma, D. Ellis, S. Kajarekar, P. Jain, and H. Hermansky, "Feature Extraction Using Non-Linear Transformation for Robust Speech Recognition on the Aurora Database," in ICASSP, 2000.
  • 8. G. Hinton, S. Osindero, and Y. Teh, "A fast learning algorithm for deep belief nets," Neural Comput., vol. 18, pp. 1527-1554, 2006.
  • 9. G. Hinton and R. Salakhutdinov, "Reducing the dimensionality of data with neural networks," Science, vol. 313, pp. 504-507, 2006.
  • 11. F. Seide, G. Li, and D. Yu, "Conversational Speech Transcription Using Context-Dependent Deep Neural Networks," in Interspeech, 2011.
  • 14. I. Sutskever, J. Martens, and G. Hinton, "Generating text with recurrent neural networks," in ICML, 2011.
  • 16. Y. Bengio, P. Simard, and P. Frasconi, "Learning long-term dependencies with gradient descent is difficult," IEEE Transactions on Neural Networks, vol. 5, no. 2, pp. 157-166, 1994.
  • 17. O. Vinyals and D. Povey, "Krylov Subspace Descent for Deep Learning," in AISTATS, 2012.
  • 18. H. G. Hirsch and D. Pearce, "The Aurora experimental framework for the performance evaluation of speech recognition systems under noisy conditions," in ISCA ITRW ASR: Challenges for the Next Millennium, 2000.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.