Tilinglike learning in the parity machine

Michael Biehl, Manfred Opper

Research output: Contribution to journal › Article › Academic › peer-review



An algorithm for training multilayered feedforward neural networks is presented. The strategy is very similar to the well-known tiling algorithm, yet the resulting architecture is completely different. New hidden units are added to a single layer only, each one correcting the errors of the previous ones, so that standard perceptron learning can be applied. The output of the network is given by the product of these k (±1) hidden neurons (parity machine). In the special case of two hidden units, the capacity αc and the stability of the network can be derived exactly by means of a replica-symmetric calculation; correlations between the two sets of couplings vanish exactly. For arbitrary k, estimates of αc are given. Within this estimate, the asymptotic capacity per input neuron of a network trained according to the proposed algorithm is found to behave as αc ~ k ln k for k → ∞. This is in agreement with recent analytic results for the algorithm-independent capacity of the parity machine.
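The constructive strategy described in the abstract can be sketched in a few lines: hidden units are added one at a time, each trained by standard perceptron learning to flip the sign of exactly those patterns the current product output still gets wrong. The sketch below is a minimal illustration under stated assumptions, not the authors' original implementation; the function names, learning rate, and epoch budget are choices of this example, and ±1 patterns are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def sgn(z):
    # Sign convention with sgn(0) = +1, so outputs are always ±1.
    return np.where(z >= 0, 1.0, -1.0)

def perceptron_train(X, t, epochs=200, lr=0.1):
    """Standard perceptron learning on ±1 targets t; returns weights."""
    w = rng.normal(size=X.shape[1]) * 0.01
    for _ in range(epochs):
        mistakes = 0
        for x, tt in zip(X, t):
            if sgn(w @ x) != tt:
                w += lr * tt * x  # classic perceptron update
                mistakes += 1
        if mistakes == 0:
            break
    return w

def train_parity_machine(X, y, max_units=5):
    """Add hidden units to one layer; the network output is the
    product of the units' ±1 outputs (parity machine)."""
    units = []
    for _ in range(max_units):
        # Current network output: product over existing hidden units.
        if units:
            out = np.prod([sgn(X @ w) for w in units], axis=0)
        else:
            out = np.ones(len(y))
        if np.all(out == y):
            break
        # Target for the new unit: +1 where the current output is
        # already correct, -1 where it must be flipped; then
        # out * unit_output = y on patterns the unit learns.
        t = y * out
        units.append(perceptron_train(X, t))
    return units

def predict(units, X):
    return np.prod([sgn(X @ w) for w in units], axis=0)
```

On a linearly separable task a single hidden unit suffices and the loop stops early; on harder tasks each additional unit is trained only on the residual errors of the product of its predecessors, which is the tiling-like aspect of the construction.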
Original language: English
Pages (from-to): 6888-6894
Number of pages: 7
Journal: Physical Review A
Issue number: 10
Publication status: Published - 1991
Externally published: Yes
