Volume 1501, 1998, Pages 375-384

On the sample complexity for neural trees

Author keywords

[No Author keywords available]

Indexed keywords

COMPLEX NETWORKS; FEEDFORWARD NEURAL NETWORKS; FORESTRY; LEARNING ALGORITHMS

EID: 84961363381     PISSN: 0302-9743     EISSN: 1611-3349     Source Type: Book Series
DOI: 10.1007/3-540-49730-7_26     Document Type: Conference Paper
Times cited: 1

References (22)
  • 2
    • M. Anthony and N. Biggs. Computational Learning Theory. Cambridge Tracts in Theoretical Computer Science. Cambridge University Press, Cambridge, 1992.
  • 3
    • E. B. Baum and D. Haussler. What size net gives valid generalization? Neural Computation, 1:151-160, 1989.
  • 5
    • T. M. Cover. Geometrical and statistical properties of systems of linear inequalities with applications in pattern recognition. IEEE Transactions on Electronic Computers, 14:326-334, 1965.
  • 6
    • T. M. Cover. Capacity problems for linear machines. In L. N. Kanal, editor, Pattern Recognition, pages 283-289. Thompson Book Co., Washington, 1968.
  • 7
    • P. W. Goldberg and M. R. Jerrum. Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers. Machine Learning, 18:131-148, 1995.
  • 8
    • M. Golea, M. Marchand, and T. R. Hancock. On learning μ-Perceptron networks with binary weights. In S. J. Hanson, J. D. Cowan, and C. L. Giles, editors, Advances in Neural Information Processing Systems 5, pages 591-598. Morgan Kaufmann, San Mateo, CA, 1993.
  • 9
    • M. Golea, M. Marchand, and T. R. Hancock. On learning μ-Perceptron networks on the uniform distribution. Neural Networks, 9:67-82, 1996.
  • 10
    • T. R. Hancock, M. Golea, and M. Marchand. Learning nonoverlapping Perceptron networks from examples and membership queries. Machine Learning, 16:161-183, 1994.
  • 11
    • D. Haussler. Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation, 100:78-150, 1992.
  • 12
    • M. Karpinski and A. Macintyre. Polynomial bounds for VC dimension of sigmoidal and general Pfaffian neural networks. Journal of Computer and System Sciences, 54:169-176, 1997.
  • 14
    • N. Littlestone. Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning, 2:285-318, 1988.
  • 15
    • W. Maass. Neural nets with superlinear VC-dimension. Neural Computation, 6:877-884, 1994.
  • 16
    • W. Maass. Vapnik-Chervonenkis dimension of neural nets. In M. A. Arbib, editor, The Handbook of Brain Theory and Neural Networks, pages 1000-1003. MIT Press, Cambridge, Mass., 1995.
  • 17
    • W. Maass, G. Schnitger, and E. D. Sontag. A comparison of the computational power of sigmoid and Boolean threshold circuits. In V. Roychowdhury, K.-Y. Siu, and A. Orlitsky, editors, Theoretical Advances in Neural Computation and Learning, pages 127-151. Kluwer, Boston, 1994.
  • 18
    • W. Maass and G. Turan. Lower bound methods and separation results for on-line learning models. Machine Learning, 9:107-145, 1992.
  • 20
    • L. Schläfli. Theorie der vielfachen Kontinuität. Zürcher & Furrer, Zürich, 1901. Reprinted in: L. Schläfli, Gesammelte Mathematische Abhandlungen, Band I, Birkhäuser, Basel, 1950.
  • 21
    • J. Shawe-Taylor. Sample sizes for threshold networks with equivalences. Information and Computation, 118:65-72, 1995.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.