



Volume , Issue , 2008, Pages 45-54

Algorithms for subset selection in linear regression

Author keywords

Algorithms; Theory

Indexed keywords

ALGORITHMS; APPROXIMATION THEORY; COMPUTATION THEORY; COVARIANCE MATRIX; GENETIC ALGORITHMS; TREES (MATHEMATICS);

EID: 57049122980     PISSN: 07378017     EISSN: None     Source Type: Conference Proceeding    
DOI: 10.1145/1374376.1374384     Document Type: Conference Paper
Times cited : (200)

References (39)
  • 5
    • W. Cochran. Some effects of errors of measurement on multiple correlation. Journal of the American Statistical Association, 65(329):22-34, 1970.
  • 6
    • J. Cohen and P. Cohen. Applied multiple regression/correlation analysis for the behavioral sciences. Lawrence Erlbaum Associates, 2003.
  • 8
    • C. Couvreur and Y. Bressler. On the optimality of the backward greedy algorithm for the subset selection problem. SIAM Journal on Matrix Analysis and Applications, 21(3):797-808, 2000.
  • 12
    • D. Donoho. For most large underdetermined systems of linear equations, the minimal ℓ1-norm near-solution approximates the sparsest near-solution. Communications on Pure and Applied Mathematics, 59:1207-1223, 2005.
  • 14
    • V. F. Flack and P. C. Chang. Frequency of selecting noise variables in subset regression analysis: A simulation study. The American Statistician, 41(1):84-86, 1987.
  • 18
    • F. Hwang, S. Onn, and U. Rothblum. A polynomial time algorithm for shaped partition problems. SIAM Journal on Optimization, 10(1):70-81, 1999.
  • 23
    • B. Natarajan. Sparse approximate solutions to linear systems. SIAM Journal on Computing, 24:227-234, 1995.
  • 24
    • G. Nemhauser, L. Wolsey, and M. Fisher. An analysis of approximations for maximizing submodular set functions. Mathematical Programming, 14:265-294, 1978.
  • 25
    • S. Onn and L. Schulman. The vector partition problem for convex objective functions. Mathematics of Operations Research, 26(3):583-590, 2001.
  • 26
    • M. H. Pesaran and R. J. Smith. A generalized R2 criterion for regression models estimated by the instrumental variables method. Econometrica, 62(3):705-710, 1994.
  • 27
    • J. Saxe. Dynamic programming algorithms for recognizing small bandwidth graphs in polynomial time. SIAM Journal on Algebraic and Discrete Methods, 1(4):363-369, 1980.
  • 29
    • V. Temlyakov. Greedy algorithms and m-term approximation with regard to redundant dictionaries. Journal of Approximation Theory, 98:117-145, 1999.
  • 31
    • R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B, 58:267-288, 1996.
  • 32
    • J. Tropp. Greed is good: Algorithmic results for sparse approximation. IEEE Transactions on Information Theory, 50:2231-2242, 2004.
  • 34
    • J. Tropp. Just relax: Convex programming methods for identifying sparse signals. IEEE Transactions on Information Theory, 51:1030-1051, 2006.
  • 36
    • W. F. Velicer. Suppressor variables and the semipartial correlation coefficient. Educational and Psychological Measurement, 38:953-958, 1978.
  • 37
    • M. Wainwright. Sharp thresholds for noisy and high-dimensional recovery of sparsity using ℓ1-constrained quadratic programming. In Proc. Allerton Conference on Communication, Control, and Computing, 2006.
  • 38
    • D. A. Walker. Suppressor variable(s) importance within a regression model. Journal of College Student Development, 44:127-133, 2003.
  • 39
    • H. Zou and T. Hastie. Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society, Series B, 67(2):301-320, 2005.


* This information was extracted by KISTI through analysis of Elsevier's SCOPUS database.