Neural Computation, Volume 12, Issue 12, 2000, Pages 2941-2964

A quantitative study of fault tolerance, noise immunity, and generalization ability of MLPs

Author keywords

[No Author keywords available]

Indexed keywords


EID: 0039178084     PISSN: 0899-7667     EISSN: None     Source Type: Journal
DOI: 10.1162/089976600300014782     Document Type: Article
Times cited: 60

References (24)
  • 1. Alippi, C., Piuri, V., & Sami, M. (1994). Sensitivity to errors in artificial neural networks: A behavioral approach. In Proc. IEEE Int. Symp. on Circuits & Systems (pp. 459-462).
  • 2. Bernier, J. L., Ortega, J., Rodriguez, M. M., Rojas, I., & Prieto, A. (1999a). An accurate measure for multilayer perceptron tolerance to weight deviations. Neural Processing Letters, 10(2), 121-130.
  • 3. Bernier, J. L., Ortega, J., Rojas, I., & Prieto, A. (2000). Improving the tolerance of multilayer perceptrons by minimizing the statistical sensitivity to weight deviations. Neurocomputing, 31, 87-103.
  • 5. Bishop, C. (1995a). Training with noise is equivalent to Tikhonov regularization. Neural Computation, 7(1), 108-116.
  • 8. Choi, J. Y., & Choi, C. (1992). Sensitivity analysis of multilayer perceptron with differentiable activation functions. IEEE Trans. on Neural Networks, 3(1), 101-107.
  • 9. Edwards, P. J., & Murray, A. F. (1995). Can deterministic penalty terms model the effects of synaptic weight noise on network fault-tolerance? Int. Journal of Neural Systems, 6(4), 401-416.
  • 10. Edwards, P. J., & Murray, A. F. (1998a). Towards optimally distributed computation. Neural Computation, 10, 997-1015.
  • 15. Mao, J., & Jain, A. K. (1993). Regularization techniques in artificial neural networks. In World Congress on Neural Networks (pp. 75-79).
  • 16. Minnix, J. I. (1992). Fault tolerance of the backpropagation neural network trained on noisy inputs. In Int. Joint Conference on Neural Networks (pp. 75-79).
  • 17. Murray, A. F., & Edwards, P. J. (1994). Enhanced MLP performance and fault tolerance resulting from synaptic weight noise during training. IEEE Transactions on Neural Networks, 5(5), 792-802.
  • 18. Neti, C., Schneider, M. H., & Young, E. D. (1992). Maximally fault tolerant neural networks. IEEE Transactions on Neural Networks, 3(1), 14-23.
  • 19. Pathak, D. S., & Koren, I. (1995). Complete and partial fault tolerance of feed-forward neural nets. IEEE Transactions on Neural Networks, 6(2), 446-456.
  • 20. Prechelt, L. (1994). PROBEN1 - A set of neural network benchmark problems and benchmarking rules (Tech. Rep. No. 21/94). Universität Karlsruhe, Germany.
  • 21. Segee, B. E., & Carter, M. J. (1994). Comparative fault tolerance of parallel distributed processing networks. IEEE Trans. on Computers, 43(11), 1323-1329.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.