Citation Relationships

Erdogmus D, Hild II KE, Rao YN, Principe JC (2004) Minimax mutual information approach for independent component analysis. Neural Comput 16:1235-1252 [PubMed]

References and models cited by this paper


Almeida LB (2000) Linear and nonlinear ICA based on mutual information Proc AS-SPCC 00:117-122
Amari SI (1985) Differential-geometrical methods in statistics
Amari SI (1997) Neural learning in structured parameter spaces - natural Riemannian gradient Advances in neural information processing systems, Mozer M:Jordan M:Petsche T, ed. pp.127
Amari SI, Cichocki A, Yang HH (1996) A new learning algorithm for blind signal separation Advances in Neural Information Processing Systems, Touretzky D:Mozer M:Hasselmo M, ed. pp.757
Barndorff-Nielsen OE (1978) Information and exponential families in statistical theory
Bell AJ, Sejnowski TJ (1995) An information-maximization approach to blind separation and blind deconvolution. Neural Comput 7:1129-59 [PubMed]
Cardoso JF (1994) On the performance of orthogonal source separation algorithms Proc EUSIPCO 94:776-779
Cardoso JF (1998) Blind source separation: Statistical principles Proc IEEE 86:2009-2025
Cardoso JF (1999) High-order contrasts for independent component analysis. Neural Comput 11:157-92 [PubMed]
Cardoso JF, Souloumiac A (1993) Blind beamforming for non-Gaussian signals IEE Proceedings F 140:362-370
Comon P (1994) Independent component analysis, a new concept? Signal Processing 36:287-314
Comon P (1996) Contrasts for multichannel blind deconvolution IEEE Signal Processing Letters 3:209-211
Cover TM, Thomas JA (1991) Elements of Information Theory
Crain BR (1974) Estimation of distributions using orthogonal expansions Ann Stat 2:454-463
Erdogmus D, Hild II KE, Principe JC (2001) Independent component analysis using Renyi's mutual information and Legendre density estimation Proc IJCNN 01:2762-2767
Erdogmus D, Hild II KE, Rao YN, Principe JC (2003) Independent component analysis using Jaynes' maximum entropy principle Proc ICA (available online: http://www.kecl.ntt.co.jp/icl/signal/ica2003/cdrom/index.htm) :385-390
Girolami M (1997) Symmetric adaptive maximum likelihood estimation for noise cancellation and signal estimation Electronics Letters 33:1437-1438
Girolami M (2002) Orthogonal series density estimation and the kernel eigenvalue problem. Neural Comput 14:669-88 [Journal] [PubMed]
Girolami M, Fyfe C (1997) Kurtosis extrema and identification of independent components: A neural network approach Proc ICASSP :3329-3332
Golub GH, Van Loan CF (1996) Matrix computations
Hild II KE, Erdogmus D, Principe JC (2001) Blind source separation using Renyi's mutual information IEEE Signal Processing Letters 8:174-176
Hyvärinen A (1998) New approximations of differential entropy for independent component analysis and projection pursuit Advances in neural information processing systems, Kearns M:Jordan M:Solla S, ed. pp.273
Hyvärinen A (1999) Survey on independent component analysis Neural Computing Surveys 2:94-128
Hyvärinen A (1999) Fast and robust fixed-point algorithms for independent component analysis. IEEE Trans Neural Netw 10:626-34 [Journal] [PubMed]
Jaynes ET (1957) Information theory and statistical mechanics Phys Rev 106:620-630
Kapur J, Kesavan H (1992) Entropy optimization principles and applications
Karvanen J, Eriksson J, Koivunen V (2000) Maximum likelihood estimation of ICA model for wide class of source distributions Proc NNSP 00:445-454
Oja E (1999) The nonlinear PCA learning rule in independent component analysis Proc ICA :143-148
Parra L, Spence C (2000) Convolutive blind source separation of nonstationary sources IEEE Trans Speech Audio Process 8:320-327
Parzen E (1962) On the estimation of a probability density function and mode Ann Math Stat 33:1064-1076
Pham DT (1996) Blind separation of instantaneous mixture of sources via an independent component analysis IEEE Trans Signal Processing 44:2768-2779
Pham DT (2001) Blind separation of instantaneous mixture of sources via the Gaussian mutual information criterion Signal Processing 81:855-870
Principe JC, Xu D (1999) Information theoretic learning using Renyi's quadratic entropy Proc ICA :407-412
Renyi A (1970) Probability theory
Shore JE, Johnson RW (1980) Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy IEEE Trans Information Theory 26:26-37
Simon C, Loubaton P, Vignat C, Jutten C, d'Urso G (1998) Blind source separation of convolutive mixtures by maximization of fourth-order cumulants: The non-IID case Proc ICASSP 98:1584-1588
Sun X, Douglas SC (2001) Adaptive paraunitary filter banks for contrast-based multichannel blind deconvolution Proc ICASSP 01:2753-2756
Torkkola K (1996) Blind separation of delayed sources based on information maximization Proc NNSP :1-10
Torkkola K (1999) Blind separation for audio signals - are we there yet? Proc ICA :239-244
Weinstein E, Feder M, Oppenheim A (1993) Multi-channel signal separation by decorrelation IEEE Trans Speech Audio Processing 1:405-413
Wu HC, Principe JC (1997) A unifying criterion for blind source separation and decorrelation: Simultaneous diagonalization of correlation matrices Proc NNSP :465-505
Wu HC, Principe JC (1999) A Gaussianity measure for blind source separation insensitive to the sign of kurtosis Proc NNSP :58-66
Wu HC, Principe JC (1999) Generalized anti-Hebbian learning for source separation Proc ICASSP :1073-1076
Xu D, Principe JC, Fisher J, Wu HC (1998) A novel measure for independent component analysis Proc ICASSP :1161-1164
Yang HH, Amari S (1997) Adaptive on-line learning algorithms for blind separation: Maximum entropy and minimum mutual information Neural Comput 9:457-482
(45 refs)