This paper proposes a new pruning method based on merging neurons with similar functional behavior, where a neuron's functional behavior is defined by its internal representations over the entire training set. Classifying neurons by their functional behavior with respect to the input vectors provides a powerful tool for pruning neurons and connections, thereby reducing network complexity and improving generalization. The most remarkable property of this pruning scheme is its ability to preserve network functionality by transferring the role of each removed neuron to the best-matched surviving neuron through a unique merging and compensation procedure. The implementation of the proposed method is demonstrated with a detailed numerical example, and its performance is assessed by a statistical measure obtained by repeating the training procedure several times. The influence of parameter selection on pruning performance and generalization ability is discussed and illustrated with the statistical results.
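The merging-and-compensation idea described above can be illustrated with a minimal sketch. This is not the paper's algorithm; it is a hypothetical single-hidden-layer example in which two hidden neurons whose activation vectors over the training set are most strongly correlated are merged, and the removed neuron's outgoing weights are folded into the surviving neuron via a least-squares fit. All function and variable names (`merge_most_similar`, `W_in`, `W_out`, etc.) are illustrative assumptions.

```python
import numpy as np

def merge_most_similar(W_in, b_in, W_out, b_out, X):
    """One merge step: hypothetical sketch of activation-similarity pruning.

    Assumes a single hidden layer with tanh activations:
        y = tanh(X @ W_in + b_in) @ W_out + b_out
    """
    A = np.tanh(X @ W_in + b_in)            # hidden activations, shape (n_samples, n_hidden)
    n_hidden = A.shape[1]

    # Find the pair (i, j) whose activation vectors are most strongly correlated.
    C = np.abs(np.corrcoef(A.T))
    np.fill_diagonal(C, 0.0)                # ignore self-correlation
    i, j = np.unravel_index(np.argmax(C), C.shape)

    # Least-squares fit a_j ≈ alpha * a_i + beta, so neuron i can stand in for j.
    alpha, beta = np.polyfit(A[:, i], A[:, j], 1)

    # Compensation: fold neuron j's outgoing weights into neuron i and the output bias,
    # so the network output is (approximately) unchanged by the removal.
    W_out_new = W_out.copy()
    b_out_new = b_out + beta * W_out[j]
    W_out_new[i] += alpha * W_out[j]

    # Remove neuron j from the hidden layer.
    keep = [k for k in range(n_hidden) if k != j]
    return W_in[:, keep], b_in[keep], W_out_new[keep], b_out_new
```

When two hidden neurons compute identical activations, the fit gives `alpha = 1`, `beta = 0`, and the merge leaves the network output exactly unchanged; for merely similar neurons the compensation keeps the output approximately intact, which mirrors the functionality-preserving property claimed above.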