Unsupervised Splitting Rules for Neural Tree Classifiers

Michael P. Perrone, Nathan Intrator

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

This paper presents two unsupervised neural network splitting rules for use with CART-like neural tree algorithms in high-dimensional data spaces. These splitting rules use an adaptive variance estimate to avoid some local minima that can arise in unsupervised methods. We explain when these unsupervised splitting rules outperform supervised neural network splitting rules and when they outperform the standard node-impurity splitting rules of CART. Using these unsupervised splitting rules leads to a nonparametric classifier for high-dimensional spaces that extracts local features in an optimized way.
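The paper's actual adaptive-variance neural splitting rules are not reproduced in this listing. As a rough, hedged illustration of the general idea of an unsupervised split in a CART-like tree, the Python sketch below partitions a node's samples by projecting onto the direction of maximal variance and thresholding at the projected mean. The function name unsupervised_split and every detail of this sketch are illustrative assumptions, not the authors' method.

import numpy as np

def unsupervised_split(X):
    """Split samples X (n_samples, n_features) along the direction of
    maximal variance, thresholding at the projected mean.

    Returns a boolean mask selecting the 'left' child. This is a simple
    unsupervised splitting rule for illustration only; it is NOT the
    adaptive-variance neural rule described in the paper."""
    # Center the data and take the leading principal direction as the
    # split direction (a standard unsupervised choice).
    Xc = X - X.mean(axis=0)
    _, _, vt = np.linalg.svd(Xc, full_matrices=False)
    direction = vt[0]                 # direction of maximal variance
    projections = Xc @ direction      # 1-D projection of every sample
    threshold = projections.mean()    # split at the projected mean
    return projections <= threshold   # mask for the left child

# Usage: recursively partition unlabeled data with this rule, then fit
# simple classifiers at the leaves using whatever labels are available.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 10))
    left_mask = unsupervised_split(X)
    print(left_mask.sum(), (~left_mask).sum())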

Original language: English
Title of host publication: Proceedings - 1992 International Joint Conference on Neural Networks, IJCNN 1992
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 820-825
Number of pages: 6
ISBN (Electronic): 0780305590
DOIs
State: Published - 1992
Externally published: Yes
Event: 1992 International Joint Conference on Neural Networks, IJCNN 1992 - Baltimore, United States
Duration: 7 Jun 1992 - 11 Jun 1992

Publication series

Name: Proceedings of the International Joint Conference on Neural Networks
Volume: 4

Conference

Conference: 1992 International Joint Conference on Neural Networks, IJCNN 1992
Country/Territory: United States
City: Baltimore
Period: 7/06/92 - 11/06/92
