R. Kamimura (Japan)
Mutual information, free energy, entropy, competitive learning, minimum information
In this paper, we propose free energy-based competitive learning and its computational method, called minimum information production learning. The free energy is introduced to overcome a fundamental problem of information-theoretic competitive learning, namely, fidelity to input patterns. Mutual information maximization, as so far developed for competitive learning, is unconstrained maximization, which means that final connection weights are not always faithful to input patterns. The free energy, with its built-in cost functions, has proved very useful for obtaining faithful representations. However, with the free energy there are cases where mutual information degrades in the later stages of learning. The new computational method of minimum information production learning is introduced to stabilize learning in these later stages. We applied the method to the well-known Iris problem and a student survey. In both cases, we succeeded in improving performance in terms of training and generalization errors. In addition, we found that when mutual information could not be increased, minimum information production learning made it possible to stabilize the learning process.
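To make the quantities discussed in the abstract concrete, the following minimal sketch computes firing probabilities of competitive units, the mutual information between input patterns and units, and a Gaussian-width free energy. The squared-distance activation, the variable names (sigma, weights, patterns), and the specific free-energy form are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch (assumed formulation): information-theoretic quantities for a
# competitive network. Not the paper's exact method.
import numpy as np

def firing_probabilities(patterns, weights, sigma=1.0):
    """p(j|s): probability that competitive unit j fires for input pattern s."""
    # Squared Euclidean distances between each pattern and each weight vector.
    dists = ((patterns[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    logits = -dists / (2.0 * sigma ** 2)
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)

def mutual_information(p_j_given_s):
    """I = H(p(j)) - mean_s H(p(j|s)): how sharply competition selects units."""
    eps = 1e-12
    p_j = p_j_given_s.mean(axis=0)                # marginal firing rates
    h_marginal = -(p_j * np.log(p_j + eps)).sum()
    h_conditional = -(p_j_given_s * np.log(p_j_given_s + eps)).sum(axis=1).mean()
    return h_marginal - h_conditional

def free_energy(patterns, weights, sigma=1.0):
    """Negative log partition function averaged over patterns (assumed form)."""
    dists = ((patterns[:, None, :] - weights[None, :, :]) ** 2).sum(axis=2)
    return -sigma * np.log(np.exp(-dists / sigma).sum(axis=1)).mean()

# Example: 4 two-dimensional input patterns, 3 competitive units.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 2))
W = rng.normal(size=(3, 2))
P = firing_probabilities(X, W, sigma=0.5)
print("mutual information:", mutual_information(P))
print("free energy:", free_energy(X, W, sigma=0.5))
```

In this reading, unconstrained mutual information maximization sharpens p(j|s) without regard to how close the weights stay to the inputs, whereas the free-energy term penalizes distance to the data, which is one way to interpret the fidelity constraint described above.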