
A Classification Approach Using Multi-Layer Perceptron with Back-Propagation Algorithm

Affiliations

  • Department of Computer Science, Bharathiar University, Coimbatore – 641046, Tamil Nadu, India
  • Department of MCA, MAM College of Engineering, Tiruchirappalli – 621105, Tamil Nadu, India

Abstract


In data mining, feature subset selection is a preprocessing step for classification that reduces dimensionality, eliminates irrelevant data, increases accuracy, and improves comprehensibility. The next step in classification is to generate a large number of rules from the reduced feature set, from which high-quality rules are chosen to build an effective classifier. In this paper, Information Gain (IG) is used to rank the features, and a Multi-Layer Perceptron (MLP) trained with back-propagation reduces the feature set to achieve higher classification accuracy. An Artificial Neural Network (ANN) classifier is used for classification. Continuous-valued features are discretized by dividing their range of values into a limited number of intervals. The Wine Recognition data set from the UCI machine learning repository is used for testing. All 13 original features are first used for classification; they are then reduced to five features. Experimental results show an accuracy of 98.62% on the training data set and 96.06% on the validation data set. The accuracy difference between the 13-feature and 5-feature models is 5.54% on the training data and 2.00% on the validation data. We then build a Decision Tree and focus on discovering significant rules from the reduced data set that provide better classification.
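The pipeline the abstract describes (IG ranking, discretization, MLP classification, decision-tree rule extraction) can be sketched end to end. The snippet below is a minimal illustration only: it assumes scikit-learn's mutual-information estimator as the Information Gain ranking, equal-width binning with five intervals for discretization, and a single-hidden-layer MLP; the bin count, network shape, and train/validation split are illustrative assumptions, not the paper's reported configuration. (scikit-learn's wine loader mirrors the UCI Wine Recognition data set used here.)

    # Hedged sketch of the abstract's pipeline; parameter choices are
    # illustrative, not the authors' exact settings.
    import numpy as np
    from sklearn.datasets import load_wine          # UCI Wine Recognition data
    from sklearn.feature_selection import mutual_info_classif
    from sklearn.preprocessing import KBinsDiscretizer
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier, export_text

    data = load_wine()                              # 13 features, 3 classes
    X, y, names = data.data, data.target, data.feature_names

    # 1. Rank features by Information Gain (mutual information with the class)
    #    and keep the five best-ranked, as in the paper's reduced subset.
    ig = mutual_info_classif(X, y, random_state=0)
    top5 = np.argsort(ig)[::-1][:5]
    X5 = X[:, top5]

    # 2. Discretize continuous values into a limited number of intervals
    #    (equal-width binning is an assumption; the paper only says the
    #    range of values is divided into subsections).
    X5 = KBinsDiscretizer(n_bins=5, encode="ordinal",
                          strategy="uniform").fit_transform(X5)

    X_tr, X_va, y_tr, y_va = train_test_split(
        X5, y, test_size=0.3, stratify=y, random_state=0)

    # 3. Train an MLP with back-propagation on the reduced feature set.
    mlp = MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000,
                        random_state=0).fit(X_tr, y_tr)
    print("training accuracy:  ", mlp.score(X_tr, y_tr))
    print("validation accuracy:", mlp.score(X_va, y_va))

    # 4. Build a decision tree on the same subset and read off its rules.
    tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
    print(export_text(tree, feature_names=[names[i] for i in top5]))

Repeating the fit on all 13 features and comparing the two accuracy scores reproduces the kind of 13-feature versus 5-feature comparison reported in the abstract, though exact numbers will depend on the split and discretization chosen.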

Keywords

Back-Propagation, Classification, Decision Tree, Feature Subset Selection, Multi-Layer Perceptron



This work is licensed under a Creative Commons Attribution 3.0 License.