1. Saeys Y, Inza I, Larrañaga P. A review of feature selection techniques in bioinformatics. Bioinformatics. 2007; 23(19): 2507-2517.
3. Bühlmann P, Hothorn T. Boosting algorithms: regularization, prediction and model fitting. Stat Sci. 2007; 22(4): 477-505.
4. Mayr A, Binder H, Gefeller O, Schmid M. The evolution of boosting algorithms. From machine learning to statistical modelling. Meth Inf Med. 2014; 53(6): 419-427.
5. Friedman J, Hastie T, Tibshirani R. Additive logistic regression: a statistical view of boosting (with discussion and a rejoinder by the authors). Ann Statist. 2000; 28(2): 337-407.
7. Hothorn T. Boosting - an unusual yet attractive optimiser. Meth Inf Med. 2014; 53(6): 417-418.
9. Bühlmann P, Gertheiss J, Hieke S, Kneib T, Ma S, Schumacher M, et al. Discussion of "The evolution of boosting algorithms" and "Extending statistical boosting". Meth Inf Med. 2014; 53(6): 436-445.
10. Hoerl AE, Kennard RW. Ridge regression: applications to nonorthogonal problems. Technometrics. 1970; 12: 69-82.
11. Breiman L. Better subset regression using the nonnegative garrote. Technometrics. 1995; 37(4): 373-384.
12. Knight K, Fu W. Asymptotics for lasso-type estimators. Ann Statist. 2000; 28(5): 1356-1378.
13. Greenshtein E, Ritov Y. Persistence in high-dimensional linear predictor selection and the virtue of overparametrization. Bernoulli. 2004; 10(6): 971-988.
15. van de Geer SA. High-dimensional generalized linear models and the lasso. Ann Statist. 2008; 36(2): 614-645.
16. Meinshausen N, Bühlmann P. High-dimensional graphs and variable selection with the Lasso. Ann Statist. 2006; 34(3): 1436-1462.
17. Zhao P, Yu B. On Model Selection Consistency of Lasso. J Mach Learn Res. 2006; 7: 2541-2563.
18. Zou H. The Adaptive Lasso and Its Oracle Properties. J Am Stat Assoc. 2006; 101(476): 1418-1429.
19. Wainwright MJ. Sharp Thresholds for High-Dimensional and Noisy Sparsity Recovery Using ℓ1-Constrained Quadratic Programming (Lasso). IEEE Trans Inf Theory. 2009; 55(5): 2183-2202.
20. Zhang CH, Huang J. The sparsity and bias of the Lasso selection in high-dimensional linear regression. Ann Statist. 2008; 36(4): 1567-1594.
21. Meinshausen N, Yu B. Lasso-type recovery of sparse representations for high-dimensional data. Ann Statist. 2009; 37(1): 246-270.
22. Zhang T, Yu B. Boosting with early stopping: convergence and consistency. Ann Statist. 2005; 33(4): 1538-1579.
23. Bühlmann P. Boosting for high-dimensional linear models. Ann Statist. 2006; 34(2): 559-583.
24. Mayr A, Hofner B, Schmid M. The importance of knowing when to stop - a sequential stopping rule for component-wise gradient boosting. Meth Inf Med. 2012; 51(2): 178-186.
26. Meinshausen N, Rocha G, Yu B. Discussion: a tale of three cousins: Lasso, L2Boosting and Dantzig. Ann Statist. 2007; 35(6): 2373-2384.
27. Duan J, Soussen C, Brie D, Idier J, Wang YP. On LARS/homotopy equivalence conditions for over-determined Lasso. IEEE Signal Process Lett. 2012; 19(12): 894-897.
29. Binder H, Schumacher M. Adapting Prediction Error Estimates for Biased Complexity Selection in High-Dimensional Bootstrap Samples. Stat Appl Genet Mol Biol. 2008; 7(1): 1-28.
31. Friedman J, Hastie T, Tibshirani R. Regularization Paths for Generalized Linear Models via Coordinate Descent. J Stat Softw. 2010; 33(1): 1-22.
33. Scheipl F, Kneib T, Fahrmeir L. Penalized likelihood and Bayesian function selection in regression models. Advances in Statistical Analysis. 2013; 97(4): 349-385.
34. Wang Z, Wang C. Buckley-James Boosting for Survival Analysis with High-Dimensional Biomarker Data. Stat Appl Genet Mol Biol. 2010; 9(1): 1-33.
35. Friedman J. Greedy function approximation: a gradient boosting machine. Ann Statist. 2001; 29(5): 1189-1232.
39. Fenske N, Kneib T, Hothorn T. Identifying risk factors for severe childhood malnutrition by boosting additive quantile regression. J Am Stat Assoc. 2011; 106(494): 494-510.
40. Ma S, Huang J. Regularized ROC method for disease classification and biomarker selection with microarray data. Bioinformatics. 2005; 21(24): 4356-4362.
41. Schmid M, Hothorn T. Boosting additive models using component-wise P-splines. Comput Stat Data Anal. 2008; 53: 298-311.
42. Sobotka F, Kneib T. Geoadditive expectile regression. Comput Stat Data Anal. 2012; 56: 755-767.
43. Hofner B, Kneib T, Hothorn T. A unified framework of constrained regression. Stat Comput. 2014; 26(1): 1-14.
44. Kneib T, Hothorn T, Tutz G. Variable Selection and Model Choice in Geoadditive Regression Models. Biometrics. 2009; 65(2): 626-634.
45. Tutz G, Binder H. Generalized Additive Modeling with Implicit Variable Selection by Likelihood-Based Boosting. Biometrics. 2006; 62: 961-971.
46. Yuan M, Lin Y. Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society (Series B). 2006; 68(1): 49-67.
48. Candes E, Tao T. The Dantzig selector: statistical estimation when p is much larger than n. Ann Statist. 2007; 35(6): 2313-2351.
49. Friedman J, Hastie T, Tibshirani R. Sparse inverse covariance estimation with the graphical lasso. Biostatistics. 2008; 9(3): 432-441.
50. Gertheiss J, Hogger S, Oberhauser C, Tutz G. Selection of Ordinally Scaled Independent Variables with Applications to International Classification of Functioning Core Sets. Applied Statistics. 2010; 60(3): 377-395.
51. Tutz G, Gertheiss J. Feature Extraction in Signal Regression: A Boosting Technique for Functional Data Regression. J Comput Graph Stat. 2010; 19: 154-174.
52. Wang Z. HingeBoost: ROC-Based Boost for Classification and Variable Selection. The International Journal of Biostatistics. 2011; 7(1): 1-30.
53. Mayr A, Binder H, Gefeller O, Schmid M. Extending statistical boosting: an overview of recent methodological developments. Meth Inf Med. 2014; 53(6): 428-435.