
Advances in independent component analysis and nonnegative matrix factorization



dc.contributor Aalto-yliopisto fi
dc.contributor Aalto University en
dc.contributor.author Yuan, Zhijian
dc.date.accessioned 2012-08-21T12:42:09Z
dc.date.available 2012-08-21T12:42:09Z
dc.date.issued 2009
dc.identifier.isbn 978-951-22-9831-0 (electronic)
dc.identifier.isbn 978-951-22-9830-3 (printed)
dc.identifier.issn 1797-5069
dc.identifier.uri https://aaltodoc.aalto.fi/handle/123456789/4610
dc.description.abstract A fundamental problem in machine learning research, as well as in many other disciplines, is finding a suitable representation of multivariate data, i.e. random vectors. For reasons of computational and conceptual simplicity, the representation is often sought as a linear transformation of the original data; in other words, each component of the representation is a linear combination of the original variables. Well-known linear transformation methods include principal component analysis (PCA), factor analysis, and projection pursuit. In this thesis, we consider two popular and widely used techniques: independent component analysis (ICA) and nonnegative matrix factorization (NMF). ICA is a statistical method whose goal is to find a linear representation of nongaussian data such that the components are statistically independent, or as independent as possible. Such a representation appears to capture the essential structure of the data in many applications, including feature extraction and signal separation. Starting from ICA, several methods for estimating the latent structure in different problem settings are derived and presented in this thesis. FastICA, one of the most efficient and popular ICA algorithms, is reviewed and discussed, and its local and global convergence and statistical behavior are studied further. A nonnegative FastICA algorithm is also given in this thesis. Nonnegative matrix factorization is a recently developed technique for finding parts-based, linear representations of nonnegative data. It is a method for dimensionality reduction that respects the nonnegativity of the input data while constructing a low-dimensional approximation. The nonnegativity constraints make the representation purely additive (allowing no subtractions), in contrast to many other linear representations such as principal component analysis and independent component analysis.
A literature survey of nonnegative matrix factorization is given in this thesis, and a novel method called projective nonnegative matrix factorization (P-NMF) is presented together with its applications. en
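The FastICA estimation described in the abstract can be illustrated with a minimal one-unit sketch: whiten a mixture of two nongaussian sources, then run the fixed-point iteration with the tanh nonlinearity. This is a generic sketch of the standard algorithm, not the thesis's specific convergence analysis or its nonnegative variant; all variable names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Two nongaussian sources: sub-gaussian (uniform) and super-gaussian (Laplacian)
S = np.vstack([rng.uniform(-1, 1, n), rng.laplace(0, 1, n)])
A = np.array([[2.0, 1.0], [1.0, 2.0]])  # mixing matrix
X = A @ S                               # observed mixtures

# Whitening: zero-mean, identity covariance
X = X - X.mean(axis=1, keepdims=True)
d, E = np.linalg.eigh(X @ X.T / n)
Z = E @ np.diag(d ** -0.5) @ E.T @ X

# One-unit FastICA fixed-point iteration: w <- E[z g(w'z)] - E[g'(w'z)] w
w = rng.standard_normal(2)
w /= np.linalg.norm(w)
for _ in range(100):
    wz = w @ Z
    w = (Z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
    w /= np.linalg.norm(w)

y = w @ Z  # one estimated source, recovered up to sign and scale
c0 = abs(np.corrcoef(y, S[0])[0, 1])
c1 = abs(np.corrcoef(y, S[1])[0, 1])
```

After convergence, the projection `y` matches one of the original sources up to sign and scale, so one of the two absolute correlations is close to 1.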
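The purely additive factorization described in the abstract can likewise be sketched with the standard multiplicative updates of Lee and Seung. This is plain NMF for illustration, not the P-NMF method introduced in the thesis; the function name and parameters are assumptions of this sketch.

```python
import numpy as np

def nmf(V, r, n_iter=200, eps=1e-9, seed=0):
    """Approximate a nonnegative V (m x n) as W @ H with W (m x r), H (r x n)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + eps
    H = rng.random((r, n)) + eps
    for _ in range(n_iter):
        # Multiplicative updates keep W and H nonnegative throughout
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

V = rng_data = np.random.default_rng(1).random((20, 15))  # nonnegative data
W, H = nmf(V, r=5)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

Because the updates only rescale entries by nonnegative ratios, the factors stay nonnegative, giving the additive, parts-based representation that distinguishes NMF from PCA and ICA.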
dc.format.extent Electronic book (759 KB, 90 p.)
dc.format.mimetype application/pdf
dc.language.iso en en
dc.publisher Teknillinen korkeakoulu en
dc.relation.ispartofseries Dissertations in information and computer science / Helsinki University of Technology, 13 en
dc.relation.haspart [Publication 1]: Zhijian Yuan and Erkki Oja. 2004. A FastICA algorithm for non-negative independent component analysis. In: Carlos G. Puntonet and Alberto Prieto (editors). Proceedings of the 5th International Conference on Independent Component Analysis and Blind Signal Separation (ICA 2004). Granada, Spain. 22-24 September 2004. Springer. Lecture Notes in Computer Science, volume 3195, pages 1-8. en
dc.relation.haspart [Publication 2]: Scott C. Douglas, Zhijian Yuan, and Erkki Oja. 2006. Average convergence behavior of the FastICA algorithm for blind source separation. In: Justinian Rosca, Deniz Erdogmus, José C. Príncipe, and Simon Haykin (editors). Proceedings of the 6th International Conference on Independent Component Analysis and Blind Signal Separation (ICA 2006). Charleston, SC, USA. 5-8 March 2006. Springer. Lecture Notes in Computer Science, volume 3889, pages 790-798. en
dc.relation.haspart [Publication 3]: Erkki Oja and Zhijian Yuan. 2006. The FastICA algorithm revisited: convergence analysis. IEEE Transactions on Neural Networks, volume 17, number 6, pages 1370-1381. en
dc.relation.haspart [Publication 4]: Zhijian Yuan and Erkki Oja. 2005. Projective nonnegative matrix factorization for image compression and feature extraction. In: Heikki Kalviainen, Jussi Parkkinen, and Arto Kaarna (editors). Proceedings of the 14th Scandinavian Conference on Image Analysis (SCIA 2005). Joensuu, Finland. 19-22 June 2005. Springer. Lecture Notes in Computer Science, volume 3540, pages 333-342. en
dc.relation.haspart [Publication 5]: Zhirong Yang, Zhijian Yuan, and Jorma Laaksonen. 2007. Projective non-negative matrix factorization with applications to facial image processing. International Journal of Pattern Recognition and Artificial Intelligence, volume 21, number 8, pages 1353-1362. en
dc.relation.haspart [Publication 6]: Zhijian Yuan and Erkki Oja. 2007. A family of modified projective nonnegative matrix factorization algorithms. In: Mohammed Al-Mualla (editor). Proceedings of the 9th International Symposium on Signal Processing and Its Applications (ISSPA 2007). Sharjah, United Arab Emirates. 12-15 February 2007, pages 1-4. en
dc.relation.haspart [Publication 7]: Zhijian Yuan, Zhirong Yang, and Erkki Oja. Projective nonnegative matrix factorization: sparseness, orthogonality, and clustering. Submitted to a journal. en
dc.subject.other Computer science en
dc.title Advances in independent component analysis and nonnegative matrix factorization en
dc.type G5 Artikkeliväitöskirja fi
dc.contributor.department Tietojenkäsittelytieteen laitos fi
dc.subject.keyword independent component analysis en
dc.subject.keyword FastICA algorithms en
dc.subject.keyword nonnegative matrix factorization en
dc.identifier.urn URN:ISBN:978-951-22-9831-0
dc.type.dcmitype text en
dc.type.ontasot Väitöskirja (artikkeli) fi
dc.type.ontasot Doctoral dissertation (article-based) en
local.aalto.digifolder Aalto_68091
local.aalto.digiauth ask


