Citation Relationships

Kanamori T, Takenouchi T, Eguchi S, Murata N (2007) Robust loss functions for boosting. Neural Comput 19:2183-2244 [PubMed]

References and models cited by this paper

Amari S, Nagaoka H (2000) Methods of information geometry
Bartlett PL, Jordan MI, McAuliffe JD (2003) Convexity, classification, and risk bounds. Unpublished manuscript
Bertsekas DP (1999) Nonlinear programming (2nd ed)
Blanchard G, Schafer C, Rozenholc Y, Muller KR (2005) Optimal dyadic decision trees. Tech. rep., Berlin: Fraunhofer FIRST. Available online
Breiman L (1994) Bagging predictors. Tech. rep. 421, Berkeley: Statistics Department, University of California, Berkeley
Breiman L, Friedman JH, Olshen RA, Stone CJ (1984) Classification and regression trees
Copas J (1988) Binary regression models for contaminated data. J Royal Statist Soc B 50:225-265
Cortes C, Vapnik V (1995) Support-vector networks Mach Learn 20:273-297
Demiriz A, Bennett KP, Shawe-Taylor J (2002) Linear programming boosting via column generation. Mach Learn 46:225-254
Domingo C, Watanabe O (2000) MadaBoost: A modification of AdaBoost Proceedings of the 13th Conference on Computational Learning Theory
Freund Y, Schapire R (1997) A decision-theoretic generalization of on-line learning and an application to boosting J Comput Sys Sci 55:119-139
Friedman J, Hastie T, Tibshirani R (2000) Additive logistic regression: A statistical view of boosting. Ann Stat 28:337-374
Grunwald PD, Dawid AP (2004) Game theory, maximum entropy, minimum discrepancy, and robust Bayesian decision theory Ann Stat 32:1367-1433
Halmos PR (1974) Measure theory
Hampel FR, Rousseeuw PJ, Ronchetti EM, Stahel WA (1986) Robust statistics: The approach based on influence functions
Kalai A, Servedio RA (2003) Boosting in the presence of noise. STOC '03: Proceedings of the Thirty-Fifth Annual ACM Symposium on Theory of Computing
Kanamori T, Takenouchi T, Eguchi S, Murata N (2004) The most robust loss function for boosting. Neural Information Processing: 11th International Conference, pp. 496-501
Lebanon G, Lafferty J (2002) Boosting and maximum likelihood for exponential models. Advances in neural information processing systems, Dietterich TG, Becker S, Ghahramani Z, eds., pp. 447
McCullagh P, Nelder JA (1989) Generalized linear models
Mason L, Baxter J, Bartlett PL, Frean M (1999) Boosting algorithms as gradient descent. Advances in neural information processing systems, Kearns MS, Solla SA, Cohn DA, eds.
Mclachlan G (1992) Discriminant analysis and statistical pattern recognition
Meir R, Ratsch G (2003) An introduction to boosting and leveraging. Advanced lectures on machine learning, Mendelson S, Smola A, eds., pp. 119. Available online
Murata N, Takenouchi T, Kanamori T, Eguchi S (2004) Information geometry of U-Boost and Bregman divergence. Neural Comput 16:1437-81 [Journal] [PubMed]
Ratsch G (2001) Robust boosting via convex optimization Unpublished doctoral dissertation
Ratsch G, Demiriz A, Bennett K (2002) Sparse regression ensembles in infinite and finite hypothesis spaces Mach Learn 48:193-221
Ratsch G, Onoda T, Muller KR (2001) Soft margins for AdaBoost Mach Learn 42:287-320
Rosset S (2005) Robust boosting and its relation to bagging. Proc 11th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 249-255
Schapire R, Freund Y, Bartlett P, Lee W (1998) Boosting the margin: A new explanation for the effectiveness of voting methods. Ann Stat 26:1651-1686
Scholkopf B, Smola AJ (2001) Learning with kernels: Support vector machines, regularization, optimization, and beyond
Servedio R (2003) Smooth boosting and learning with malicious noise J Mach Learn Res 4:633-648
Takenouchi T, Eguchi S (2004) Robustifying AdaBoost by adding the naive error rate. Neural Comput 16:767-87 [Journal] [PubMed]
van der Vaart A (1998) Asymptotic statistics
Vapnik V (1998) Statistical Learning Theory
Victoria-Feser MP (2002) Robust inference with binary data. Psychometrika 67:21-32
(35 refs)