The 8-3-8 Encoding Problem
The 8-3-8 encoding problem is a classic among the test problems for training multilayer perceptrons. In our MLP we have an input layer with eight neurons i1, i2, ..., i8, an output layer with eight neurons Ω1, Ω2, ..., Ω8, and one hidden layer with three neurons. Thus, this network represents a function f: {0, 1}^8 → {0, 1}^8. The training task is that an input of value 1 into the neuron ij should lead to an output of value 1 from the neuron Ωj (and an output of 0 from all other output neurons).
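As a concrete illustration, here is a minimal sketch of the training patterns and of one forward pass through such an 8-3-8 network, written in Python with NumPy. The sigmoid activation function, the weight initialization, and all numerical values are assumptions made for the example; the text itself does not prescribe them.

    import numpy as np

    # The eight training patterns: a 1 into input neuron i_j and 0 elsewhere;
    # the target is the same pattern at the output neurons Omega_1..Omega_8.
    X = np.eye(8)      # one row per pattern
    Y = X              # identity mapping: output j should fire for input j

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Randomly initialized weights of an 8-3-8 MLP (layer sizes follow the text;
    # sigmoid activations and the initialization scale are assumptions).
    rng = np.random.default_rng(42)
    W1, b1 = rng.normal(scale=0.5, size=(8, 3)), np.zeros(3)
    W2, b2 = rng.normal(scale=0.5, size=(3, 8)), np.zeros(8)

    H = sigmoid(X @ W1 + b1)   # three hidden activations: the learned encoding
    O = sigmoid(H @ W2 + b2)   # eight outputs: the decoded pattern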
The network with the three hidden neurons represents a kind of binary encoding: our network is a machine in which the input is first encoded and afterwards decoded again. Analogously, we can train a 1024-10-1024 encoding problem. But is it possible to improve the efficiency of this procedure? Could there be, for example, a 1024-9-1024 or an 8-2-8 encoding network? Yes, even that is possible, since the network does not depend on binary encodings: the hidden neurons may take intermediate, real-valued activations. Thus, an 8-2-8 network is sufficient for our problem, but the encoding the network finds is far more difficult to understand, and training the network requires a lot more time.
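To make this comparison tangible, the following sketch trains the encoder with plain backpropagation of a squared error, with the hidden-layer width as a parameter, so the same code covers both the 8-3-8 and the 8-2-8 case. Learning rate, epoch count, loss function, and initialization are again assumptions, not prescriptions from the text; in line with the remark above, convergence of the 8-2-8 variant is noticeably slower and more sensitive to the random start.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def train_encoder(n_hidden, epochs=20000, lr=1.0, seed=0):
        """Train an 8-n_hidden-8 sigmoid MLP on the one-hot encoding task."""
        rng = np.random.default_rng(seed)
        X = np.eye(8)                     # inputs: one-hot patterns
        Y = X                             # targets: reproduce the input
        W1 = rng.normal(scale=0.5, size=(8, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(scale=0.5, size=(n_hidden, 8)); b2 = np.zeros(8)

        for _ in range(epochs):
            # forward pass
            H = sigmoid(X @ W1 + b1)      # hidden activations (the code)
            O = sigmoid(H @ W2 + b2)      # decoded outputs
            # backward pass for squared error; sigmoid derivative is s*(1-s)
            dO = (O - Y) * O * (1 - O)
            dH = (dO @ W2.T) * H * (1 - H)
            W2 -= lr * H.T @ dO;  b2 -= lr * dO.sum(axis=0)
            W1 -= lr * X.T @ dH;  b1 -= lr * dH.sum(axis=0)
        return W1, b1, W2, b2, H

    # An 8-2-8 network: two real-valued hidden activations can still give each
    # of the eight patterns a distinct code, though it need not be binary.
    *_, codes = train_encoder(n_hidden=2)
    print(np.round(codes, 2))   # eight points in the 2-dimensional hidden space

Inspecting the printed hidden activations shows eight distinct points in the two-dimensional hidden space rather than a clean binary code, which is one way to see why the learned encoding is harder to interpret.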
An 8-1-8 network, however, does not work: it is essential that the output of one hidden neuron can be compensated by another, and if there is only one hidden neuron, there is certainly no compensating neuron available.