LDA (Linear Discriminant Analysis) is a dimensionality reduction technique (and a form of supervised learning) that is used for classification.

LDA takes multi-dimensional data, makes use of prior class information (hence supervised learning), and represents the data in a form that maximizes the distance between different classes.

You may wonder, "How does it do this?"

It basically works with the covariance of each class of data, the mean of the entire data set, the mean of each class, and the prior probabilities of the classes. LDA measures the scatter within each class and the scatter between classes, and finds the projection that best separates the classes of data.

I wrote a small library that does this with OpenCV, since there was no C++ class available to perform Linear Discriminant Analysis.

Before doing this with LDA, make sure that you have the Eigen libraries installed on your system, and then install OpenCV (compiling OpenCV with Eigen support enabled). This is important because OpenCV relies on the Eigen library to calculate generalized eigenvalues and eigenvectors.

For dimensionality reduction:

Y = W . X

where W is the weight matrix, Y is the projected vector (in the lower-dimensional space), and X is the original feature vector (in the higher-dimensional space).

**NOTE**: The dimensionality of Y is at most **(number of classes - 1)**.

So, if you have 3 classes (**C = 3**) of data (where each feature vector is of length **N**) and you want to project it to lower dimensions using LDA, the resultant projected vector will be of length 2 (i.e. C - 1 = 3 - 1 = 2).

To install the Eigen library, run the following command:
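On Debian/Ubuntu systems, a typical way to install Eigen is via the package manager (the package name is an assumption here and may vary by distribution):

```shell
# Installs the Eigen 3 headers system-wide (Debian/Ubuntu package name).
sudo apt-get install libeigen3-dev
```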

This ensures that the Eigen library is installed on your PC. If you are not able to install it using the above command, download the .deb file from here.

To get a complete overview of LDA, see these links:

Download the following git repo and build it. If you are not sure how to do that, you can follow these steps:
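A typical clone-and-build sequence for a CMake-based repo looks like the following; `<repo-url>` and `<repo-dir>` are placeholders for the repository linked above, not real values:

```shell
# Clone the repository and do an out-of-source CMake build.
git clone <repo-url>
cd <repo-dir>
mkdir build && cd build
cmake ..
make
```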