W. Toplak (Austria)
Data Reduction, Neural Networks, Chaos Theory, Performance Evaluation, Prediction, System Characteristics
The reduction of data patterns for modelling and calibration is increasingly important as databases grow in every branch of science. The aim is to shrink the number of patterns used while preserving the original characteristics represented by a large set of time-series data. This makes it possible to train feed-forward Neural Networks (NN) more quickly and to reduce the knowledge bases used for statistical pattern recognition. The advantages are that both the computation time for NN training and the recall time for statistical pattern recognition are reduced. However, data reduction techniques may show individual weaknesses in mapping the original characteristics. In this work, two techniques are compared on the basis of traffic patterns. The first is Kohonen's well-known Self-Organizing Map (SOM) using the Euclidean distance measure; the second is a newly proposed method called Aspects of Lyapunov, Entropy and Variance (ALEV). This selection heuristic is based on indicators drawn from statistics and nonlinear dynamics (chaos theory). It is shown that ALEV outperforms the SOM in computation time as well as in its ability to preserve the characteristics of the observed traffic system represented by the collected data.
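The abstract does not spell out how ALEV combines its indicators, so the following is only a minimal sketch of the general idea: score each time-series pattern by variance, entropy, and a divergence-rate indicator, then greedily keep a subset whose scores span the full range of the data. All function names (shannon_entropy, divergence_rate, select_patterns) and the farthest-point selection rule are illustrative assumptions, not the paper's method.

```python
# Hypothetical sketch, not the paper's ALEV algorithm: rank time-series
# patterns by simple statistical / nonlinear indicators and keep a small,
# characteristic-preserving subset.
import numpy as np

def shannon_entropy(x, bins=16):
    """Histogram-based Shannon entropy of a 1-D pattern."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def divergence_rate(x):
    """Naive stand-in for a largest-Lyapunov-style indicator:
    mean log growth ratio of successive absolute increments."""
    d = np.abs(np.diff(x))
    d = d[d > 0]
    if len(d) < 2:
        return 0.0
    return float(np.mean(np.log(d[1:] / d[:-1])))

def select_patterns(patterns, k):
    """Keep k patterns whose (variance, entropy, divergence) scores
    cover the score space as evenly as possible (greedy farthest-point)."""
    scores = np.array([[np.var(p), shannon_entropy(p), divergence_rate(p)]
                       for p in patterns])
    # normalise each indicator to [0, 1] so none dominates the distance
    scores = (scores - scores.min(0)) / (scores.max(0) - scores.min(0) + 1e-12)
    chosen = [int(np.argmax(scores[:, 0]))]   # start from the max-variance pattern
    for _ in range(k - 1):
        dists = np.min(
            np.linalg.norm(scores[:, None] - scores[chosen][None], axis=2), axis=1)
        chosen.append(int(np.argmax(dists)))  # farthest remaining pattern
    return chosen

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # 200 synthetic daily traffic profiles with 96 quarter-hour values each
    patterns = [rng.normal(0.0, 1.0 + i / 50.0, size=96) for i in range(200)]
    keep = select_patterns(patterns, k=20)
    print("reduced pattern set:", sorted(keep))
```

The reduced subset returned by such a heuristic can then be used to train a feed-forward NN or to build a smaller knowledge base, which is the use case the abstract describes.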