# Independent Component Analysis

**A tutorial introduction**

## About this book

Independent component analysis (ICA) is becoming an increasingly important tool for analysing large data sets. In essence, ICA separates an observed set of signal mixtures into a set of statistically independent component signals, or source signals.

In so doing, this powerful method can extract the relatively small amount of useful information typically found in large data sets. Applications of ICA range from speech processing, brain imaging, and electrical brain signals to telecommunications and stock market prediction.
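As a concrete illustration of the separation described above (my own minimal Python sketch, not the book's MATLAB demonstrations, and assuming NumPy and scikit-learn are available), the following mixes two independent, non-Gaussian sources with a known matrix and then recovers them from the mixtures alone using scikit-learn's `FastICA`:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n = 5000
t = np.linspace(0, 8, n)

# Two statistically independent, non-Gaussian source signals.
s1 = np.sign(np.sin(3 * t))    # square wave (sub-Gaussian)
s2 = rng.laplace(size=n)       # Laplacian noise (super-Gaussian)
S = np.c_[s1, s2]

# Mix the sources with a mixing matrix that the algorithm never sees.
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])
X = S @ A.T                    # the observed signal mixtures

# Recover statistically independent components from the mixtures alone.
ica = FastICA(n_components=2, random_state=0)
S_hat = ica.fit_transform(X)

# Each recovered component should correlate strongly with one source
# (up to the ordering and scaling ambiguities inherent in ICA).
corr = np.abs(np.corrcoef(S.T, S_hat.T))[:2, 2:]
print(corr.max(axis=1))
```

Note that the recovered components come back in arbitrary order and with arbitrary sign and scale; this is a fundamental ambiguity of blind source separation, not a defect of the algorithm.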

In *Independent Component Analysis*, Jim Stone presents the essentials of ICA and related techniques (projection pursuit and complexity pursuit) in a tutorial style, using intuitive examples described in simple geometric terms.

The treatment fills the need for a basic primer on ICA that can be used by readers of varying levels of mathematical sophistication, including engineers, cognitive scientists, and neuroscientists who need to know the essentials of this evolving method.

An overview establishes the strategy implicit in ICA in terms of its essentially physical underpinnings and describes how ICA is based on the key observation that different physical processes generate outputs that are statistically independent of each other. The book then describes what Stone calls "the mathematical nuts and bolts" of how ICA works. Presenting only essential mathematical proofs, Stone guides the reader through an exploration of the fundamental characteristics of ICA.
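One of those nuts and bolts, exploited by the projection pursuit method of Chapter 6, is a consequence of the central limit theorem (Section 5.4): a mixture of independent signals is more Gaussian than its sources, so kurtosis (which is zero for a Gaussian) shrinks toward zero under mixing. A minimal numerical check, using NumPy (my own illustration, not code from the book):

```python
import numpy as np

def excess_kurtosis(x):
    """Fourth standardized moment minus 3 (zero for a Gaussian signal)."""
    x = x - x.mean()
    return np.mean(x**4) / np.mean(x**2)**2 - 3.0

rng = np.random.default_rng(1)

# Eight independent sub-Gaussian (uniform) source signals.
sources = rng.uniform(-1, 1, size=(8, 100_000))
mixture = sources.sum(axis=0)   # an equal-weight mixture of all eight

k_src = excess_kurtosis(sources[0])   # markedly negative for a uniform signal
k_mix = excess_kurtosis(mixture)      # much closer to zero: more Gaussian
print(k_src, k_mix)
```

This is why an unmixing algorithm can search for weight vectors that make the extracted signal as non-Gaussian as possible: maximal non-Gaussianity indicates a recovered source rather than a mixture.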

Topics covered include the geometry of mixing and unmixing; methods for blind source separation; and applications of ICA, including voice mixtures, EEG, fMRI, and fetal heart monitoring. The appendices provide a vector matrix tutorial, plus basic demonstration computer code that allows the reader to see how each mathematical method described in the text translates into working MATLAB computer code.

Published September 2004 by MIT Press

ISBN: 0262693151

Download Chapter 1 (PDF, 152KB)

## Contents

### Part I – Independent component analysis and blind source separation

**Chapter 1: Overview of independent component analysis**

1.1: Introduction

1.2: Independent component analysis: what is it?

1.3: How independent component analysis works

1.4: Independent component analysis and perception

1.5: Principal component analysis and factor analysis

1.6: Independent component analysis: what is it good for?

**Chapter 2: Strategies for blind source separation**

2.1: Introduction

2.2: Mixing signals

2.3: Unmixing signals

2.4: The number of sources and mixtures

2.5: Comparing strategies

2.6: Summary

### Part II – The geometry of mixtures

**Chapter 3: Mixing and unmixing**

3.1: Introduction

3.2: Signals, variables, and scalars

3.3: The geometry of signals

3.4: Summary

**Chapter 4: Unmixing using the inner product**

4.1: Introduction

4.2: Unmixing coefficients as weight vectors

4.3: The inner product

4.4: Matrices as geometric transformations

4.5: The mixing matrix transforms source signal axes

4.6: Summary

**Chapter 5: Independence and probability density functions**

5.1: Introduction

5.2: Histograms

5.3: Histograms and probability density functions

5.4: The central limit theorem

5.5: Cumulative density functions

5.6: Moments: mean, variance, skewness and kurtosis

5.7: Independence and correlation

5.8: Uncorrelated pendulums

5.9: Summary

### Part III – Methods for blind source separation

**Chapter 6: Projection pursuit**

6.1: Introduction

6.2: Mixtures are gaussian

6.3: Gaussian signals: good news, bad news

6.4: Kurtosis as a measure of non-normality

6.5: Weight vector angle and kurtosis

6.6: Using kurtosis to recover multiple source signals

6.7: Projection pursuit and ICA extract the same signals

6.8: When to stop extracting signals

6.9: Summary

**Chapter 7: Independent component analysis**

7.1: Introduction

7.2: Independence of joint and marginal distributions

7.3: Infomax: independence and entropy

7.4: Maximum likelihood ICA

7.5: Maximum likelihood and infomax equivalence

7.6: Extracting source signals using gradient ascent

7.7: Temporal and spatial ICA

7.8: Summary

**Chapter 8: Complexity pursuit**

8.1: Introduction

8.2: Predictability and complexity

8.3: Measuring complexity using signal predictability

8.4: Extracting signals by maximizing predictability

8.5: Summary

**Chapter 9: Gradient ascent**

9.1: Introduction

9.2: Gradient ascent on a line

9.3: Gradient ascent on a hill

9.4: Second order methods

9.5: The natural gradient

9.6: Global and local maxima

9.7: Summary

**Chapter 10: Principal component analysis and factor analysis**

10.1: Introduction

10.2: ICA and PCA

10.3: Eigenvectors and eigenvalues

10.4: PCA applied to speech signal mixtures

10.5: Factor analysis

10.6: Summary

### Part IV – Applications

**Chapter 11: Applications of ICA**

11.1: Introduction

11.2: Temporal ICA of voice mixtures

11.3: Temporal ICA of electroencephalograms

11.4: Spatial ICA of fMRI data

11.5: Spatial ICA for color MRI data

11.6: Complexity pursuit for fetal heart monitoring

11.7: Complexity pursuit for learning stereo disparity

### Part V – Appendices

Appendix A: A vector matrix tutorial

Appendix B: Projection pursuit gradient ascent

Appendix C: Projection pursuit: stepwise separation of sources

Appendix D: ICA gradient ascent

Appendix E: Complexity pursuit gradient ascent

Appendix F: Principal component analysis for pre-processing data

Appendix G: Independent component analysis resources

Appendix H: Recommended reading

## Reviews

Book review by Simon Parsons in *Knowledge Engineering Review* (PDF, 29KB)

"This monograph provides a delightful tour, through the foothills of linear algebra to the higher echelons of independent components analysis, in a graceful and deceptively simple way. Its careful construction, introducing concepts as they are needed, discloses the fundamentals of source separation in a remarkably accessible and comprehensive fashion."

**Karl J. Friston, University College London**

"This fantastic book provides a broad introduction to both the theory and applications of independent component analysis. I recommend it to any student interested in exploring this emerging field."

**Martin J. McKeown, Associate Professor of Medicine (Neurology), University of British Columbia**

"Independent component analysis is a recent and powerful addition to the methods that scientists and engineers have available to explore large data sets in high-dimensional spaces. This book is a clearly written introduction to the foundations of ICA and the practical issues that arise in applying it to a wide range of problems."

**Terrence J. Sejnowski, Howard Hughes Medical Institute, Salk Institute for Biological Studies, and University of California, San Diego**

## MATLAB code

Projection pursuit code from Appendix B (M, 4KB)

ICA code from Appendix D (M, 3KB)

## Corrections

If you spot an error, email me at j.v.stone@sheffield.ac.uk.

p. 57, Equation 5.12 should have g(x0) instead of g(x) on the left-hand side. Thanks to Guillem Serra.

p. 126, the text:

"First, the direction of steepest descent does not necessarily point directly at the maximum... "

Should be:

"First, the direction of steepest ascent does not necessarily point directly at the maximum... "

p. 146, no reference for figure 11.6 is given. This work was published as:

Muraki, S., Nakai, T., Kita, Y. and Tsuda, K. (2001) 'An attempt for coloring multichannel MR imaging data', *IEEE Transactions on Visualization and Computer Graphics*, 7(3), July, pp. 265–274. doi: http://dx.doi.org/10.1109/2945.942694