2. J. Aczel and J. Pfanzagl. Remarks on the measurement of subjective probability and information. Metrika, 11(1):91-105, December 1967.
5. A. Banerjee, S. Merugu, I. S. Dhillon, and J. Ghosh. Clustering with Bregman divergences. The Journal of Machine Learning Research, 6:1705-1749, 2005.
6. P. L. Bartlett, B. Schölkopf, D. Schuurmans, and A. J. Smola, editors. Advances in Large-Margin Classifiers. MIT Press, 2000.
7. P. L. Bartlett and A. Tewari. Sparseness vs estimating conditional probabilities: Some asymptotic results. The Journal of Machine Learning Research, 8:775-790, 2007.
8. P. L. Bartlett, M. I. Jordan, and J. D. McAuliffe. Convexity, classification, and risk bounds. Journal of the American Statistical Association, 101(473):138-156, March 2006.
9. H. H. Bauschke and J. M. Borwein. Joint and separate convexity of the Bregman distance. In Dan Butnariu, Yair Censor, and Simeon Reich, editors, Inherently Parallel Algorithms in Feasibility and Optimization and their Applications, volume 8 of Studies in Computational Mathematics, pages 23-36. North-Holland, 2001.
10. A. Beygelzimer, J. Langford, and B. Zadrozny. Machine learning techniques - reductions between prediction quality metrics. In Z. Liu and C. H. Xia, editors, Performance Modeling and Engineering, pages 3-28. Springer US, April 2008. URL http://hunch.net/~jl/projects/reductions/tutorial/paper/chapter.pdf.
11. A. Buja, W. Stuetzle, and Y. Shen. Loss functions for binary class probability estimation and classification: Structure and applications. Technical report, University of Pennsylvania, November 2005.
12. P. F. Christoffersen and F. X. Diebold. Optimal prediction under asymmetric loss. Econometric Theory, 13(6):808-817, 1997.
13. I. Cohen and M. Goldszmidt. Properties and benefits of calibrated classifiers. Technical Report HPL-2004-22(R.1), HP Laboratories, Palo Alto, July 2004.
14. C. Cortes and V. Vapnik. Support-vector networks. Machine Learning, 20(3):273-297, 1995.
16. S. Fidler, D. Skocaj, and A. Leonardis. Combining reconstructive and discriminative subspace methods for robust classification and regression by subsampling. IEEE Transactions on Pattern Analysis and Machine Intelligence, 28(3):337-350, 2006.
17. Y. Freund. A more robust boosting algorithm. arXiv:0905.2138v1 [stat.ML], May 2009. URL http://arxiv.org/abs/0905.2138.
19. T. Gneiting and A. E. Raftery. Strictly proper scoring rules, prediction, and estimation. Journal of the American Statistical Association, 102(477):359-378, March 2007.
20. C. W. J. Granger and M. J. Machina. Forecasting and decision theory. In G. Elliot, C. W. J. Granger, and A. Timmermann, editors, Handbook of Economic Forecasting, volume 1, pages 82-98. North-Holland, Amsterdam, 2006.
21. P. D. Grünwald and A. P. Dawid. Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory. The Annals of Statistics, 32(4):1367-1433, 2004.
23. D. J. Hand and V. Vinciotti. Local versus global models for classification problems: Fitting models where it matters. The American Statistician, 57(2):124-131, 2003.
27. Y. Kalnishkan, V. Vovk, and M. V. Vyugin. Loss functions, complexities, and the Legendre transformation. Theoretical Computer Science, 313(2):195-207, 2004.
28. Y. Kalnishkan, V. Vovk, and M. V. Vyugin. Generalised entropy and asymptotic complexities of languages. In Learning Theory, volume 4539 of Lecture Notes in Computer Science, pages 293-307. Springer, 2007.
29. M. Kearns. Efficient noise-tolerant learning from statistical queries. Journal of the ACM, 45(6):983-1006, November 1998.
30. J. Kivinen and M. K. Warmuth. Relative loss bounds for multidimensional regression problems. Machine Learning, 45:301-329, 2001.
34. Y. Lin. A note on margin-based loss functions in classification. Technical Report 1044, Department of Statistics, University of Wisconsin, Madison, February 2002.
35. P. M. Long and R. A. Servedio. Random classification noise defeats all convex potential boosters. In William W. Cohen, Andrew McCallum, and Sam T. Roweis, editors, ICML, pages 608-615, 2008. doi: 10.1145/1390156.1390233.
36. H. Masnadi-Shirazi and N. Vasconcelos. On the design of loss functions for classification: Theory, robustness to outliers, and SavageBoost. In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, Advances in Neural Information Processing Systems 21, pages 1049-1056, 2009.
39. R. Nock and F. Nielsen. On the efficient minimization of classification calibrated surrogates. In D. Koller, D. Schuurmans, Y. Bengio, and L. Bottou, editors, Advances in Neural Information Processing Systems 21, pages 1201-1208. MIT Press, 2009b.
40. J. Platt. Probabilities for SV machines. In A. Smola, P. Bartlett, B. Schölkopf, and D. Schuurmans, editors, Advances in Large Margin Classifiers, pages 61-71. MIT Press, 2000.
41. F. Provost and T. Fawcett. Robust classification for imprecise environments. Machine Learning, 42(3):203-231, 2001.
45. L. J. Savage. Elicitation of personal probabilities and expectations. Journal of the American Statistical Association, 66(336):783-801, 1971.
46. M. J. Schervish. A general method for comparing probability assessors. The Annals of Statistics, 17(4):1856-1879, 1989.
47. B. Schölkopf, A. Smola, R. C. Williamson, and P. L. Bartlett. New support vector algorithms. Neural Computation, 12:1207-1245, 2000.
49. E. Shuford, A. Albert, and H. E. Massengill. Admissible probability measurement procedures. Psychometrika, 31(2):125-145, June 1966.
51. I. Steinwart. How to compare different loss functions and their risks. Constructive Approximation, 26(2):225-287, August 2007.
52. I. Steinwart. Two oracle inequalities for regularized boosting classifiers. Statistics and Its Interface, 2:271-284, 2009.
54. T. B. Trafalis and R. C. Gilbert. Robust classification and regression using support vector machines. European Journal of Operational Research, 173(3):893-909, 2006.
55. A. Zellner. Bayesian estimation and prediction using asymmetric loss functions. Journal of the American Statistical Association, 81(394):446-451, June 1986.
56. J. Zhang. Divergence function, duality, and convex analysis. Neural Computation, 16(1):159-195, 2004a.
57. T. Zhang. Statistical behaviour and consistency of classification methods based on convex risk minimization. The Annals of Statistics, 32:56-134, 2004b.