ENN for Pattern Recognition:

[Online Resources]


Overview:

This website includes the supplementary document, source code implementation, and numerous demos of the Extended Nearest Neighbor (ENN) method for pattern recognition, as originally proposed in our paper [1]. Briefly speaking, ENN predicts the class label of an unknown test sample based on the maximum gain of intra-class coherence. Unlike the classic K-nearest neighbor (KNN) method, in which only the nearest neighbors of a test sample are used to estimate class membership, the ENN method makes a prediction in a "two-way communication" style: it not only considers which training samples are the nearest neighbors of the test sample, but also which training samples consider the test sample as one of their nearest neighbors. By exploiting the generalized class-wise statistics computed from all training data, ENN is able to learn from the global distribution, thereby improving pattern recognition performance and providing a powerful technique for a wide range of data processing applications.
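
For readers who would like a concrete picture, the following is a minimal MATLAB sketch of a class-wise intra-class coherence statistic in the spirit of the description above: for each class, it measures how often the K nearest neighbors of that class's training samples also come from the same class. The function name classWiseStat and its exact form are illustrative only; the precise definition of the generalized class-wise statistic is given in [1].

    % Illustrative sketch (not the downloadable implementation): a class-wise
    % intra-class coherence statistic, as described in the Overview.
    % Data:  m-by-d matrix of training samples (one sample per row)
    % Label: m-by-1 vector of class labels
    % K:     number of nearest neighbors
    function T = classWiseStat(Data, Label, K)
        classes = unique(Label);
        m = size(Data, 1);
        D = squareform(pdist(Data));     % pairwise distances (Statistics Toolbox)
        D(1:m+1:end) = inf;              % a sample is never its own neighbor
        T = zeros(numel(classes), 1);
        for i = 1:numel(classes)
            members = find(Label == classes(i));
            hits = 0;
            for s = members.'            % loop over the samples of class i
                [~, order] = sort(D(s, :));
                hits = hits + sum(Label(order(1:K)) == classes(i));
            end
            T(i) = hits / (numel(members) * K);  % fraction of within-class neighbor relations
        end
    end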


Supplementary document:

The supplementary document of our ENN paper can be found here [Link]; it includes three sections:

[Section #1]: An example of the calculation of the generalized class-wise statistic;

[Section #2]: Detailed classification performance of the ENN method in comparison with KNN and the Maximum A Posteriori (MAP) classifier for four Gaussian data models with different parameters;

[Section #3]: Detailed performance analysis and significance testing of ENN against KNN over 20 datasets from the UCI Machine Learning Repository.

 


ENN Algorithm:

There are three versions of the ENN method:


ENN Classifier:
The ENN classification decision rule assigns a test sample to the class that yields the maximum gain of intra-class coherence; the formal decision rule is given in our paper [1].
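
For illustration, a brute-force MATLAB sketch of such a decision is given below, using the illustrative classWiseStat helper from the Overview. It assumes the test sample is assigned to the class whose tentative assignment maximizes the sum of the class-wise statistics over all classes; the formal rule and notation are given in [1].

    % Illustrative brute-force sketch of an ENN-style decision for one test
    % sample z (a 1-by-d row vector). TrainingLabel is a column vector.
    % Assumption: the class assignment that maximizes the summed class-wise
    % statistics is chosen; see [1] for the exact decision rule.
    function label = ennPredictOne(TrainingData, TrainingLabel, z, K)
        classes = unique(TrainingLabel);
        coherence = zeros(numel(classes), 1);
        for j = 1:numel(classes)
            % Tentatively assign z to class j and recompute every statistic.
            coherence(j) = sum(classWiseStat([TrainingData; z], ...
                                             [TrainingLabel; classes(j)], K));
        end
        [~, best] = max(coherence);
        label = classes(best);
    end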


ENN.V1 Classifier:

The ENN.V1 decision rule is given in our paper [1]. There, we have proved that ENN.V1 is identical to the original ENN classifier but does not require recalculating the generalized class-wise statistic for every test sample. From a computational-cost perspective, ENN.V1 is therefore more efficient, and we recommend this version for practical applications.
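
The ENN.V1 rule itself is not reproduced here; the MATLAB fragment below only illustrates the kind of reuse that makes such a reformulation efficient. If each training sample's distance to its K-th nearest neighbor is cached once, then for any test sample one can immediately identify which training samples would admit it into their K-neighborhood (the "two-way" part of ENN) without recomputing the statistics from scratch. The helper name neighborsGained and the variable kthDist are ours, for illustration only.

    % Illustration only (not the ENN.V1 rule): with kthDist(i) precomputed once
    % as the distance from training sample i to its K-th nearest neighbor,
    % find the training samples whose K-neighborhood a test sample z would enter.
    function affected = neighborsGained(TrainingData, kthDist, z)
        dz = sqrt(sum(bsxfun(@minus, TrainingData, z).^2, 2));  % distances to z
        affected = find(dz < kthDist);   % samples that would consider z a neighbor
    end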


ENN.V2 Classifier:

Under certain conditions, the ENN method can be approximated by a simple expression, named ENN.V2. The detailed derivation of the ENN.V2 classifier, along with the conditions under which the approximation holds, is given in our paper [1].



Note: Among the three versions, ENN is the most straightforward to calculate and implement; ENN.V1 is more computationally efficient because it does not require recalculating the generalized class-wise statistic for every test sample; and ENN.V2 is an approximation whose underlying assumptions are generally satisfied (or at least loosely satisfied) in many practical situations.


Source code and demo implementations:

Demo 1: Given training data with class labels, predict the class labels of test data.

Function:    [PredictionLabel] = ENN(TrainingData, TrainingLabel, TestingData, K)

This function returns the predicted class labels of TestingData using the ENN method. TrainingData and TestingData must be matrices with the same number of columns, i.e., the same number of features. TrainingLabel is a vector of class labels for TrainingData. K is an integer specifying the number of nearest neighbors (a parameter of ENN).

(You can download the source code of demo1 here)
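
For reference, a call to this function might look like the following. The data here are synthetic and for illustration only; the example assumes the downloaded ENN function is on the MATLAB path and that TrainingLabel is a column vector.

    % Two synthetic 2-D Gaussian classes, shifted relative to each other.
    TrainingData  = [randn(50, 2); randn(50, 2) + 2];
    TrainingLabel = [ones(50, 1); 2 * ones(50, 1)];
    TestingData   = [randn(10, 2); randn(10, 2) + 2];
    K = 5;                                  % number of nearest neighbors
    PredictionLabel = ENN(TrainingData, TrainingLabel, TestingData, K);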

Note: The algorithm implementation given in the source code is based on ENN.V1, for reduced computational cost.

 

Demo 2: Given a dataset, evaluate and compare the classification performance of the KNN and ENN decision rules using N-fold cross-validation.

Function:   [ACC_KNN, ACC_ENN] = ENNTest(Data, Label, K, NFold)

This function returns the classification accuracy of the classic KNN rule (represented by ACC_KNN) and our proposed ENN rule (represented by ACC_ENN) using the N-fold cross-validation approach. Data and Label represent the input dataset and the corresponding class labels, respectively. K is an integer specifying the number of nearest neighbors (a parameter of both KNN and ENN). NFold is an integer denoting the number of folds used in cross-validation.

(You can download the source code of demo2 here)
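
For reference, a call to this function might look like the following (synthetic two-class data, for illustration only; assumes the downloaded ENNTest function is on the MATLAB path):

    % Two synthetic 3-D Gaussian classes.
    Data  = [randn(100, 3); randn(100, 3) + 1.5];
    Label = [ones(100, 1); 2 * ones(100, 1)];
    K = 5;         % number of nearest neighbors for both KNN and ENN
    NFold = 10;    % 10-fold cross-validation
    [ACC_KNN, ACC_ENN] = ENNTest(Data, Label, K, NFold);
    fprintf('KNN accuracy: %.3f, ENN accuracy: %.3f\n', ACC_KNN, ACC_ENN);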

Note: The algorithm implementation given in the source code is based on ENN.V1, for reduced computational cost.

 

Demo 3: Similar to Demo 2, but using ENN.V2 as the implementation. Please note that ENN.V2 requires the data to satisfy the two conditions specified in the ENN.V2 algorithm. In practical applications, however, we have noticed that even when the two conditions are not strictly satisfied for real data sets, ENN.V2 can still provide competitive performance in many situations.

Function:   [ACC_KNN, ACC_ENN] = ENNV2Test(Data, Label, K, NFold)

(You can download the source code of demo3 here)
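
The call is analogous to Demo 2 (synthetic data, for illustration only; assumes the downloaded ENNV2Test function is on the MATLAB path):

    Data  = [randn(100, 3); randn(100, 3) + 1.5];
    Label = [ones(100, 1); 2 * ones(100, 1)];
    [ACC_KNN, ACC_ENN] = ENNV2Test(Data, Label, 5, 10);   % K = 5, 10-fold CV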

Note: This function is similar to the ENNTest function shown in Demo 2, but uses the ENN.V2 algorithm as the implementation.

 


Reference:

[1] B. Tang and H. He, "ENN: Extended Nearest Neighbor Method for Pattern Recognition," IEEE Computational Intelligence Magazine, 2015 (in press)