Citation Relationships

Murata N, Takenouchi T, Kanamori T, Eguchi S (2004) Information geometry of U-Boost and Bregman divergence. Neural Comput 16:1437-81 [PubMed]

References and models cited by this paper

References and models that cite this paper

Amari S (1995) Information geometry of the EM and em algorithms for neural networks Neural Netw 8:1379-1408

Amari S, Nagaoka H (2000) Methods of information geometry

Amari SI (1985) Differential-geometrical methods in statistics

Barron AR (1993) Universal approximation bounds for superpositions of a sigmoidal function IEEE Trans Inform Theory 39:930-945

Bishop C (1995) Neural Networks For Pattern Recognition

Collins M, Schapire RE, Singer Y (2000) Logistic regression, AdaBoost and Bregman distances Proc 13th Ann Conf Comput Learn Theory :158-169

Domingo C, Watanabe O (2000) MadaBoost: A modification of AdaBoost Proc 13th Ann Conf Comput Learn Theory

Eguchi S, Copas J (2001) Recent developments in discriminant analysis from an information geometric point of view J Korean Statist Soc 30:247-264

Eguchi S, Copas J (2002) A class of logistic type discriminant functions Biometrika 89:1-22

Eguchi S, Kano Y (2001) Robustifying maximum likelihood estimation (Research memorandum 802)

Freund Y (1995) Boosting a weak learning algorithm by majority Inform Comput 121:256-285

Freund Y, Schapire R (1996) Experiments with a new boosting algorithm Proc 13th Intl Conf on Machine Learning :148-156

Freund Y, Schapire R (1997) A decision-theoretic generalization of on-line learning and an application to boosting J Comput Sys Sci 55:119-139

Friedman J, Hastie T, Tibshirani R (2000) Additive logistic regression: A statistical view of boosting Ann Stat 28:337-374

Hampel FR, Rousseeuw PJ, Ronchetti EM, Stahel WA (1986) Robust statistics: The approach based on influence functions

Hastie T, Tibshirani R, Friedman J (2001) The elements of statistical learning

Kearns M, Valiant LG (1988) Learning Boolean formulae or finite automata is as hard as factoring Tech Rep TR-14-88 Harvard University Aiken Computation Laboratory

Kivinen J, Warmuth MK (1999) Boosting as entropy projection Proc 12th Ann Conf Comput Learn Theory :134-144

Lebanon G, Lafferty J (2001) Boosting and maximum likelihood for exponential models Tech Rep CMU-CS-01-144 School of Computer Science, Carnegie Mellon University

McLachlan GJ (1992) Discriminant analysis and statistical pattern recognition

Mihoko M, Eguchi S (2002) Robust blind source separation by beta divergence. Neural Comput 14:1859-86 [Journal] [PubMed]

Murata N (1996) An Integral Representation of Functions Using Three-layered Networks and Their Approximation Bounds. Neural Netw 9:947-956 [PubMed]

Murata N, Yoshizawa S, Amari S (1994) Network information criterion-determining the number of hidden units for an artificial neural network model. IEEE Trans Neural Netw 5:865-72 [Journal] [PubMed]

Schapire RE (1990) The strength of weak learnability Machine Learning 5:197-227

Schapire RE, Freund Y, Bartlett P, Lee WS (1998) Boosting the margin: A new explanation for the effectiveness of voting methods Ann Stat 26:1651-1686

Takenouchi T, Eguchi S (2004) Robustifying AdaBoost by adding the naive error rate. Neural Comput 16:767-87 [Journal] [PubMed]

Vapnik V (1995) The Nature of Statistical Learning Theory

References and models that cite this paper

Amari S (2007) Integration of stochastic models by minimizing alpha-divergence. Neural Comput 19:2780-96 [Journal] [PubMed]

Kanamori T, Takenouchi T, Eguchi S, Murata N (2007) Robust loss functions for boosting. Neural Comput 19:2183-244 [Journal] [PubMed]

(29 refs)