Volume: None     Issue: None     Year: 2010     Pages: None

Learning networks of stochastic differential equations

Author keywords

Gaussian processes; Graphical models; Model selection and structure learning; Sparsity and feature selection

Indexed keywords

DEGREES OF FREEDOM (MECHANICS); DIFFERENTIAL EQUATIONS; STOCHASTIC MODELS; STOCHASTIC SYSTEMS;

EID: 85162004913     PISSN: None     EISSN: None     Source Type: Conference Proceeding    
DOI: None     Document Type: Conference Paper
Times cited: 78

References (19)
  • 2
    • D. Higham. Modeling and simulating chemical reactions. SIAM Review, 50:347-368, 2008.
  • 7
    • D.L. Donoho. For most large underdetermined systems of equations, the minimal l1-norm near-solution approximates the sparsest near-solution. Communications on Pure and Applied Mathematics, 59(7):907-934, 2006.
  • 8
    • D.L. Donoho. For most large underdetermined systems of linear equations the minimal l1-norm solution is also the sparsest solution. Communications on Pure and Applied Mathematics, 59(6):797-829, 2006.
  • 9
    • T. Zhang. Some sharp performance bounds for least squares regression with L1 regularization. Annals of Statistics, 37:2109-2144, 2009.
  • 10
    • M.J. Wainwright. Sharp thresholds for high-dimensional and noisy sparsity recovery using l1-constrained quadratic programming (Lasso). IEEE Trans. Information Theory, 55:2183-2202, 2009.
  • 13
    • J. Friedman, T. Hastie, and R. Tibshirani. Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9(3):432, 2008.
  • 15
    • G.A. Pavliotis and A.M. Stuart. Parameter estimation for multiscale diffusions. J. Stat. Phys., 127:741-781, 2007.
  • 19
    • P. Ravikumar, M.J. Wainwright, and J. Lafferty. High-dimensional Ising model selection using l1-regularized logistic regression. Annals of Statistics, 2008.


* This information was analyzed and extracted by KISTI from Elsevier's SCOPUS database.