Ryotaro Kamimura and Ryozo Kitajima
maximum information, SOM, deep learning, semi-supervised learning, active learning, information-theoretic method
Information-theoretic methods have been extensively used in neural networks since Linsker stated the maximum information preservation principle for multi-layered neural networks. Though several attempts have been made to show the effectiveness of the principle, they have been limited to simple models. The present paper tries to show that the information maximization principle, combined with the self-organizing map (SOM), can be effective in training multi-layered neural networks. Additionally, the method aims to address the shortage of labeled data that semi-supervised and active learning are designed to handle. The method was applied to artificial data with two classes. Experimental results showed that the present method with information maximization gave the best generalization performance. Knowledge acquired by the SOM proved effective in training multi-layered neural networks when the maximum information principle was applied.
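The sketch below is only one plausible reading of the pipeline described in the abstract, not the authors' exact procedure: a SOM is trained on (mostly unlabeled) data, and each input's SOM response pattern is then fed to a small multi-layered classifier fitted on a few labeled samples, mimicking the semi-supervised setting. All sizes, hyper-parameters, and the feature mapping are assumptions made for illustration; the information maximization step itself is not reproduced here.

```python
# Illustrative sketch only; NOT the authors' method or hyper-parameters.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Two-class artificial data, loosely mirroring the paper's experiment (details assumed).
X, y = make_classification(n_samples=600, n_features=10, n_informative=5,
                           n_classes=2, random_state=0)
labeled = rng.choice(len(X), size=60, replace=False)     # small labeled subset

# --- plain numpy SOM, trained on all (unlabeled) inputs ---------------------
grid_h, grid_w = 6, 6
W = rng.normal(size=(grid_h * grid_w, X.shape[1]))        # code vectors
coords = np.array([(i, j) for i in range(grid_h) for j in range(grid_w)])

for t in range(2000):                                     # online SOM updates
    x = X[rng.integers(len(X))]
    bmu = np.argmin(((W - x) ** 2).sum(axis=1))           # best matching unit
    sigma = 3.0 * np.exp(-t / 1000)                       # shrinking radius
    lr = 0.5 * np.exp(-t / 1000)                          # decaying step size
    d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
    h = np.exp(-d2 / (2 * sigma ** 2))                    # neighborhood kernel
    W += lr * h[:, None] * (x - W)

def som_features(X):
    """Negative distances to all SOM units, used as a SOM-derived representation."""
    return -np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2)

# --- multi-layered classifier trained only on the labeled subset ------------
clf = MLPClassifier(hidden_layer_sizes=(20,), max_iter=2000, random_state=0)
clf.fit(som_features(X[labeled]), y[labeled])
print("accuracy on all data:", clf.score(som_features(X), y))
```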