Volume 2, 1997, Pages 943-947

Penalty terms for fault tolerance

Author keywords

[No Author keywords available]

Indexed keywords

DISTRIBUTED COMPUTATIONS; FAULT MODEL; HARDWARE ERROR; PENALTY TERM; REAL-WORLD TASK; REALISTIC MODEL; ROUGHNESS PENALTY; SOLUTION LOCUS;

EID: 0030702546     PISSN: 10987576     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/ICNN.1997.616152     Document Type: Conference Paper
Times cited : (15)

References (14)
  • 1. C. Bishop. Curvature-driven smoothing in backpropagation neural networks. In Proc. International Joint Conference on Neural Networks, volume 2, pages 749-752, 1990.
  • 2. C. Bishop. Training with noise is equivalent to Tikhonov regularization. Neural Computation, 7(1):108-116, 1995.
  • 3. M. Carter. The 'illusion' of fault tolerance in neural networks for pattern recognition and signal processing. In Proc. Technical Session on Fault-Tolerant Integrated Systems, University of New Hampshire, Durham NH, 1988.
  • 4. P. Edwards and A. Murray. Can deterministic penalty terms model the effects of synaptic weight noise on network fault-tolerance? International Journal of Neural Systems, 6(4):401-416, 1995.
  • 5. P. Edwards and A. Murray. Analogue Imprecision in MLP Training. World Scientific, August 1996.
  • 6. P. Edwards and A. Murray. Modelling weight- and input-noise in MLP learning. In Proc. International Conference on Neural Networks, volume 1, pages 78-83, Washington D.C., June 1996.
  • 9. A. Murray and P. Edwards. Synaptic weight noise during MLP training: Enhanced MLP performance and fault tolerance resulting from synaptic weight noise during training. IEEE Trans. Neural Networks, 5(5):792-802, Sept. 1994.
  • 12. D. Ruck, S. Rogers, M. Kabrisky, M. Oxley, and B. Suter. The multilayer perceptron as an approximation to a Bayes optimal discriminant function. IEEE Trans. Neural Networks, 1(4):296-298, 1990.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.