ISSN: 2320-9801 (Online), 2320-9798 (Print)

International Journal of Innovative Research in Computer and Communication Engineering
Open Access


Research Article

DECISION TREE LEARNING WITH ERROR CORRECTED INTERVAL VALUES OF NUMERICAL ATTRIBUTES IN TRAINING DATA SETS

C. SUDARSANA REDDY¹, S. AQUTER BABU² and Dr. V. VASU³
  1. Department of Computer Science and Engineering, S.V. University College of Engineering, S.V. University, Tirupati, Andhra Pradesh, India
  2. Assistant Professor of Computer Science, Department of Computer Science, Dravidian University, Kuppam - 517425, Chittoor District, Andhra Pradesh, India
  3. Department of Mathematics, S.V. University, Tirupati, Andhra Pradesh, India
 

Abstract

Classification is one of the most important techniques in data mining, and the decision tree is among the most widely used classifiers in machine learning and data mining. Data measurement errors are common in any data collection process, particularly when training data sets contain numerical attributes. We extend classical decision tree construction algorithms to handle training data sets whose numerical attributes contain measurement errors. We find that the classification accuracy of a classical decision tree classifier can be improved substantially if the measurement errors in the values of numerical (continuous) attributes are corrected appropriately. This study proposes a new algorithm for constructing decision tree classifiers, named the Interval Decision Tree (IDT) classifier construction algorithm. An interval is constructed around each value of each numerical attribute in the training data set; within that interval the best error-corrected value is approximated, and entropy is then calculated. Extensive experiments show that the resulting IDT classifiers are more accurate and efficient than classical decision tree classifiers.
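The interval-based split selection is described above only at a high level. The following is a minimal illustrative sketch in Python of that idea, assuming a single numeric attribute with a known per-value error bound; the helper names (entropy, split_entropy, best_corrected_threshold) and the uniform sampling of candidates inside each interval are assumptions for illustration, not the authors' published implementation. The sketch examines candidate error-corrected values inside the interval [v - error, v + error] around each observed value and selects the candidate split that minimises the weighted entropy (i.e. maximises information gain).

# Illustrative sketch only: names and sampling strategy are assumptions,
# not the authors' IDT implementation.
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    total = len(labels)
    if total == 0:
        return 0.0
    counts = Counter(labels)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

def split_entropy(values, labels, threshold):
    """Weighted entropy after splitting the numeric attribute at `threshold`."""
    left = [y for x, y in zip(values, labels) if x <= threshold]
    right = [y for x, y in zip(values, labels) if x > threshold]
    n = len(labels)
    return (len(left) / n) * entropy(left) + (len(right) / n) * entropy(right)

def best_corrected_threshold(values, labels, error, steps=5):
    """For each observed value v, sample candidate error-corrected values inside
    the interval [v - error, v + error] and return the candidate threshold with
    the lowest weighted split entropy."""
    best_t, best_e = None, float("inf")
    for v in set(values):
        candidates = [v - error + i * (2 * error / steps) for i in range(steps + 1)]
        for t in candidates:
            e = split_entropy(values, labels, t)
            if e < best_e:
                best_t, best_e = t, e
    return best_t, best_e

if __name__ == "__main__":
    # Toy numeric attribute with measurement error bound 0.2.
    xs = [2.3, 2.5, 3.1, 3.3, 4.8, 5.0]
    ys = ["A", "A", "A", "B", "B", "B"]
    t, e = best_corrected_threshold(xs, ys, error=0.2)
    print(f"chosen threshold={t:.2f}, weighted entropy={e:.3f}")

In a full decision tree builder, this interval-adjusted split search would replace the usual candidate-threshold scan at each node; the rest of the recursive tree construction would be unchanged.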

