Volume 1, 1997, Pages 221-225

Regularization and error bars for the mixture of experts network

Author keywords

[No Author keywords available]

Indexed keywords

CRITICAL APPLICATIONS; DATA DISTRIBUTION; FUNCTION APPROXIMATION; GENERALIZATION PERFORMANCE; MIXTURE OF EXPERTS; MIXTURE OF EXPERTS NETWORK; MODULAR APPROACH; NETWORK PREDICTION;

EID: 0030703285     PISSN: 10987576     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1109/ICNN.1997.611668     Document Type: Conference Paper
Times cited: 12

References (13)
  • 1
    • 0005631525
    • Locally weighted learning. Technical report
    • May
    • C. G. Atkeson, A. W. Moore, and S. Schaal. Locally weighted learning. Technical report, submitted to Artificial Intelligence Review, May 1996. http://www.cc.gatech.edu/fac/Chris.Atkeson.
    • (1996) Artificial Intelligence Review
    • Atkeson, C.G.1    Moore, A.W.2    Schaal, S.3
  • 3
    • 0000588294
    • Improving the generalization properties of radial basis function neural networks
    • C. M. Bishop. Improving the generalization properties of radial basis function neural networks. Neural Computation, 3(4):579-588, 1991.
    • (1991) Neural Computation , vol.3 , Issue.4 , pp. 579-588
    • Bishop, C.M.1
  • 4
    • 0011847141
    • Transforming neural-net output levels to probability distributions
    • In R. P. Lippmann, J. E. Moody, and D. S. Touretzky, editors. Morgan Kaufmann
    • J. S. Denker and Y. LeCun. Transforming neural-net output levels to probability distributions. In R. P. Lippmann, J. E. Moody, and D. S. Touretzky, editors, Advances in Neural Information Processing Systems 3, pages 853-859. Morgan Kaufmann, 1991.
    • (1991) Advances in Neural Information Processing Systems , vol.3 , pp. 853-859
    • Denker, J.S.1    LeCun, Y.2
  • 5
  • 8
    • 0026899193
    • Using radial basis functions to approximate a function and its error bounds
    • July
    • J. Leonard, M. Kramer, and L. Ungar. Using radial basis functions to approximate a function and its error bounds. IEEE Transactions on Neural Networks, 3(4):624-627, July 1992.
    • (1992) IEEE Transactions on Neural Networks , vol.3 , Issue.4 , pp. 624-627
    • Leonard, J.1    Kramer, M.2    Ungar, L.3
  • 9
    • 0002704818
    • A practical framework for backpropagation networks
    • May
    • D. J. C. MacKay. A practical framework for backpropagation networks. Neural Computation, 4(3):448-472, May 1992.
    • (1992) Neural Computation , vol.4 , Issue.3 , pp. 448-472
    • Mackay, D.J.C.1
  • 10
    • 0029725906
    • Advances in using hierarchical mixture of experts for signal classification
    • V. Ramamurti and J. Ghosh. Advances in using hierarchical mixture of experts for signal classification. In Proceedings of ICASSP 96, pages 3569-3572, 1996.
    • (1996) Proceedings of ICASSP 96 , pp. 3569-3572
    • Ramamurti, V.1    Ghosh, J.2
  • 11
    • 84898770368
    • Structural adaptation in mixture of experts
    • Track D
    • V. Ramamurti and J. Ghosh. Structural adaptation in mixture of experts. In Proceedings of ICPR 96, Track D, pages 704-708, 1996.
    • (1996) Proceedings of ICPR 96 , pp. 704-708
    • Ramamurti, V.1    Ghosh, J.2
  • 13
    • 85140116568
    • An alternative model for mixture of experts
    • In G. Tesauro, D. S. Touretzky, and T. K. Leen, editors. The MIT Press
    • L. Xu, M. I. Jordan, and G. E. Hinton. An alternative model for mixture of experts. In G. Tesauro, D. S. Touretzky, and T. K. Leen, editors, Advances in Neural Information Processing Systems 7, pages 633-640. The MIT Press, 1995.
    • (1995) Advances in Neural Information Processing Systems 7 , pp. 633-640
    • Xu, L.1    Jordan, M.I.2    Hinton, G.E.3


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.