
Robust Independent Component Analysis Using Quadratic Negentropy

  • Conference paper
Independent Component Analysis and Signal Separation (ICA 2007)

Part of the book series: Lecture Notes in Computer Science (LNISA, volume 4666)


Abstract

We present a robust algorithm for independent component analysis (ICA) that uses the sum of marginal quadratic negentropies as its dependence measure. The algorithm handles arbitrary source density functions through kernel density estimation, yet remains robust when few samples are available because it avoids empirical expectations and instead evaluates the integrals of quadratic densities in closed form. It is also scalable: the gradient of the contrast function can be computed in O(LN) time with the fast Gauss transform, where L is the number of sources and N is the number of samples. In our experiments, we evaluated the algorithm on a variety of source distributions and compared it with other well-known ICA algorithms. The results show that the proposed algorithm consistently outperforms the others. Moreover, it is extremely robust to outliers and is particularly effective when the number of observed samples is small and the number of mixed sources is large.
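As an illustrative sketch of the closed-form evaluation the abstract alludes to, the snippet below estimates a one-dimensional quadratic negentropy as the integrated squared distance between a Gaussian-KDE marginal density and a standard normal. Every term of the integral reduces to sums of Gaussian evaluations, so no empirical expectation and no numerical integration is needed. The function names, the bandwidth parameter `sigma`, and the distance-to-Gaussian formulation are assumptions for illustration, not the paper's exact contrast function (and this direct double sum is O(N²) per marginal; the paper's fast Gauss transform reduces the corresponding gradient computation to linear time).

```python
import numpy as np

def gauss(x, var):
    """Zero-mean Gaussian density with variance `var`, evaluated at x."""
    return np.exp(-x**2 / (2.0 * var)) / np.sqrt(2.0 * np.pi * var)

def quadratic_negentropy(y, sigma=0.5):
    """Integrated squared distance between a Gaussian-KDE estimate of the
    standardized marginal density and the standard normal density.

    All three terms of  int (p_hat(x) - phi(x))^2 dx  reduce to sums of
    Gaussian evaluations:
      * int p_hat^2   -> pairwise kernels with doubled variance,
      * int p_hat*phi -> the kernel convolved with the standard normal,
      * int phi^2     -> the constant 1 / (2 sqrt(pi)).
    """
    y = np.asarray(y, dtype=float)
    y = (y - y.mean()) / y.std()          # zero mean, unit variance
    n = len(y)
    cross = gauss(y[:, None] - y[None, :], 2.0 * sigma**2).sum() / n**2
    mixed = gauss(y, sigma**2 + 1.0).sum() / n
    norm2 = 1.0 / (2.0 * np.sqrt(np.pi))
    return cross - 2.0 * mixed + norm2
```

Because the quantity is a squared distance between two genuine densities, it is non-negative, near zero for Gaussian samples (up to KDE smoothing bias), and larger for non-Gaussian marginals such as Laplacian sources.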




Editor information

Mike E. Davies, Christopher J. James, Samer A. Abdallah, Mark D. Plumbley


Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lee, J., Kim, T., Lee, SY. (2007). Robust Independent Component Analysis Using Quadratic Negentropy. In: Davies, M.E., James, C.J., Abdallah, S.A., Plumbley, M.D. (eds) Independent Component Analysis and Signal Separation. ICA 2007. Lecture Notes in Computer Science, vol 4666. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74494-8_29


  • DOI: https://doi.org/10.1007/978-3-540-74494-8_29

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74493-1

  • Online ISBN: 978-3-540-74494-8

  • eBook Packages: Computer Science (R0)
