Title:
|
Independent component analysis with applications to blind source separation
|
In statistics, data analysis and signal processing, a commonly encountered and well-established technique is to decompose signals into additive uncorrelated components. This often gives a more insightful view of the nature of the problem and facilitates analysis. One of the more recent decomposition approaches is independent component analysis (ICA). It can be seen as a generalisation of principal component analysis (PCA), which coincides with ICA under the classical assumption that the joint statistics are well captured by Gaussian distributions. ICA, by contrast, allows sources with arbitrary statistics and has numerous application areas, among them 'blind equalisation' and 'blind source separation of instantaneous mixtures', which are investigated in this thesis. Perhaps surprisingly, the principles deduced here are applicable in both ICA and neural computing; this connection is emphasised by the extensive use of cross-references throughout the thesis.

In order to build an ICA foundation, the minimum mutual information principle is introduced (stated formally below). From there, entropy and negentropy optimisation techniques are found to provide approximate but implementable solutions. Related to the maximisation of the approximate negentropy, a meticulous method, the maximum sum of squared cumulants criterion, has been reasoned. It allows closed-form ICA solutions with a high analytical potential in a field where burdensome nonlinear optimisation techniques otherwise dominate. Because of the computational demands of the latter, numerous adaptive solutions, among them subspace tracking techniques, Gram-Schmidt processes and stochastic gradient descent methods, are suggested to contain the otherwise exploding numerical complexity. The performance and the benefits of the novel methods are illustrated by comparative experiments.

The contributions of this thesis can be summarised as follows:
- Many of the hitherto known methods for independence projection are unified using the concise minimum mutual information approach. The minimum mutual information principle and the negentropy and entropy optimisation criteria descending from it are derived from the most elementary definition of independence.
- Instead of the generally applied approaches based on approximations of the maximum likelihood function, the meticulous maximum squared cumulant principle has been developed. From it, closed-form estimators are deduced and their statistical behaviour is modelled:
  - the powerful MaSSFOC and MaSSTOC (for asymmetric distributions),
  - the SKSE, with an improved estimation performance, in the standard and reduced (similar to 'ML' [62], 'EML' [95, 129-134], 'AML' [55]) form,
  - the SKDE in the standard and reduced (similar to 'AEML' [128], 'AML' [55]) form,
  - and the hybrid SKSE/DE.
- Entropy optimising methods are developed, and their global convergence (convexity) and stability with several (maximum likelihood) score functions are demonstrated.
- The Cramér-Rao bound for independent component analysis is determined for certain source distributions and universally for sample (cumulant) statistics.
- It is shown that ICA is applicable to mixtures of any source statistics.
- The hierarchical projection pursuit technique EDA, which applies approximate subspace diagonalisation, is developed.
- The Godard (constant modulus) blind source separation is implemented (a generic sketch is given below).
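To state the minimum mutual information principle concretely, a standard formulation (textbook material, not specific to this thesis) measures the dependence of the separator outputs y = Wx by their mutual information, which ICA drives towards zero:

    I(y_1, \ldots, y_n) \;=\; \sum_{i=1}^{n} H(y_i) \;-\; H(\mathbf{y}) \;\ge\; 0,

with equality exactly when the components are independent. On prewhitened data with orthonormal W, the joint entropy H(y) is constant, so minimising the mutual information reduces to minimising the sum of marginal entropies, or equivalently to maximising the marginal negentropies J(y_i) = H(y_i^{Gauss}) - H(y_i); this is the link between the mutual information, entropy and negentropy criteria named above.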
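The maximum sum of squared cumulants criterion can likewise be motivated by the classical cumulant-based (Gram-Charlier/Edgeworth) approximation of negentropy, again a standard result quoted here only for orientation:

    J(y_i) \;\approx\; \tfrac{1}{12}\,\kappa_3^2(y_i) \;+\; \tfrac{1}{48}\,\kappa_4^2(y_i),

where \kappa_3 and \kappa_4 are the third- and fourth-order cumulants of the unit-variance output y_i. Maximising the sum of squared cumulants over orthonormal W therefore approximately maximises the summed negentropy, which is the sense in which the closed-form estimators listed above relate to negentropy maximisation.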
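As an illustration of how a squared-cumulant criterion, a stochastic gradient step and a Gram-Schmidt re-orthonormalisation fit together, the following minimal Python sketch separates prewhitened mixtures by gradient ascent on the sum of squared fourth-order cumulants. It is a generic routine under the assumptions just stated; the names whiten and ica_sum_squared_kurtosis are hypothetical, and it is not the MaSSFOC/MaSSTOC or SKSE/SKDE estimators of the thesis:

    import numpy as np

    def whiten(x):
        # Centre and whiten the mixtures x (array of shape sensors x samples).
        x = x - x.mean(axis=1, keepdims=True)
        d, e = np.linalg.eigh(np.cov(x))
        return (e @ np.diag(d ** -0.5) @ e.T) @ x

    def ica_sum_squared_kurtosis(x, n_iter=200, mu=0.1, seed=0):
        # Gradient ascent on sum_i k4(y_i)^2 over orthonormal W, on whitened data.
        z = whiten(x)
        n, t = z.shape
        rng = np.random.default_rng(seed)
        w, _ = np.linalg.qr(rng.standard_normal((n, n)))    # random orthonormal start
        for _ in range(n_iter):
            y = w @ z
            k4 = (y ** 4).mean(axis=1) - 3.0                # fourth-order cumulants (unit variance)
            grad = (2.0 * k4)[:, None] * ((y ** 3) @ z.T) / t   # d/dW of sum k4^2, up to a constant
            w, _ = np.linalg.qr(w + mu * grad)              # Gram-Schmidt style re-orthonormalisation
        return w @ z, w

A call such as y, w = ica_sum_squared_kurtosis(x) on an n x T array of instantaneous mixtures returns source estimates up to the usual ICA permutation and scaling ambiguity.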
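Finally, the Godard (constant modulus) criterion mentioned in the last contribution penalises deviations of the equaliser output modulus from a constant, J = E[(|y|^2 - R_2)^2]. The following is a minimal stochastic gradient sketch of the generic Godard p = 2 algorithm, not the implementation developed in the thesis; all names and parameter values are illustrative:

    import numpy as np

    def cma_equalise(x, n_taps=11, mu=1e-3, r2=1.0):
        # Godard p=2 (constant modulus) equaliser for a complex baseband signal x.
        w = np.zeros(n_taps, dtype=complex)
        w[n_taps // 2] = 1.0                       # centre-spike initialisation
        y = np.zeros(len(x), dtype=complex)
        for k in range(n_taps - 1, len(x)):
            u = x[k - n_taps + 1:k + 1][::-1]      # regressor, most recent sample first
            y[k] = np.vdot(w, u)                   # output y = w^H u
            e = (np.abs(y[k]) ** 2 - r2) * y[k]    # Godard error term
            w = w - mu * np.conj(e) * u            # stochastic gradient step
        return y, w

Because the criterion ignores the carrier phase, the recovered constellation is typically rotated, so a phase correction stage usually follows in practice.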
This thesis provides numerous 'open-ended' ideas and solutions at an early stage of development, which could serve as starting points for independent implementations to suit the particular needs of applications more specific than the blind source separation and blind equalisation considered here.
|