Independent Component Analysis (ICA) is a method quite similar to Principal Component Analysis (PCA). PCA aims to create a transformation that maximizes the variance of the resulting variables while making them uncorrelated. ICA, on the other hand, aims to create variables that are statistically independent. Note that the ICA components are not assumed to be uncorrelated or orthogonal.
This allows ICA to pull out stronger signals in your data. It also doesn't assume that the data is Gaussian.
One way to think about the difference between PCA and ICA is that PCA is more effective as a data compression technique, while ICA helps uncover and separate the structure in the data itself.
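This difference can be seen in a small sketch, assuming the fastICA package is installed. The two mixed signals below are illustrative examples, not data from this book: two independent, non-Gaussian sources are mixed linearly, and ICA tries to recover the original sources while PCA only finds uncorrelated directions of maximal variance.

```r
library(fastICA)

set.seed(1234)

# Two independent, non-Gaussian source signals
n <- 1000
s1 <- sin(seq(0, 20, length.out = n))  # smooth wave
s2 <- runif(n, min = -1, max = 1)      # uniform noise
S <- cbind(s1, s2)

# Mix the sources linearly to create the observed data
A <- matrix(c(0.6, 0.4, 0.4, 0.6), nrow = 2)
X <- S %*% A

# PCA finds uncorrelated directions of maximal variance
pca_res <- prcomp(X, center = TRUE, scale. = TRUE)

# ICA instead searches for statistically independent components
ica_res <- fastICA(X, n.comp = 2)

# The estimated sources live in ica_res$S; correlating them with the
# true sources shows how well ICA unmixes the signals
cor(ica_res$S, S)
```

Because ICA is defined only up to sign and ordering of the components, the recovered sources may come back flipped or permuted relative to the originals.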
ICA is often considered a dimensionality reduction method because the commonly used fastICA implementation extracts components incrementally, so you can compute only as many components as you need.
ICA, much like PCA, requires that your data be normalized before it is applied.
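A short sketch of this normalization step, again assuming the fastICA package; the matrix here is made-up data whose columns are put on very different scales before being centered and scaled:

```r
library(fastICA)

set.seed(1234)
X <- matrix(rnorm(200 * 4), ncol = 4)

# Put the columns on wildly different scales
X <- sweep(X, 2, c(1, 10, 100, 1000), `*`)

# Center and scale each column before applying ICA
X_norm <- scale(X)

ica_res <- fastICA(X_norm, n.comp = 2)
dim(ica_res$S)
```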
Below is an example of the independent components in action. We took the MNIST data set and performed ICA on the pixel values as predictors. First, we apply it to the entire data set.
We clearly see some effects here. Remember that it isn't important whether a value is positive or negative, just that it differs from the other values. The signals aren't especially strong, but it appears that we are capturing 7-ness in the first IC and 6-ness in the third IC. We notice that each IC appears
70.2 Pros and Cons
70.2.1 Pros
Can identify stronger signals
70.2.2 Cons
Sensitive to noise and outliers
Computationally intensive
70.3 R Examples
We will be using the ames data set for these examples.
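As a minimal sketch of what such a preprocessing pipeline might look like, assuming the recipes and modeldata packages (`step_ica()` also needs the fastICA package installed), we can normalize the numeric predictors and then extract a handful of independent components:

```r
library(recipes)

data(ames, package = "modeldata")

# Normalize first, then extract 3 independent components
ica_rec <- recipe(Sale_Price ~ ., data = ames) |>
  step_normalize(all_numeric_predictors()) |>
  step_ica(all_numeric_predictors(), num_comp = 3)

ica_prepped <- prep(ica_rec)
ica_data <- bake(ica_prepped, new_data = NULL)

# The resulting components replace the numeric predictors
names(ica_data)[grepl("^IC", names(ica_data))]
```

The `num_comp` argument controls how many components are extracted; the component columns are named with the `IC` prefix by default.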