On the learnability of recursive data

Author(s): Hammer, B.
Keywords: Automation & Control Systems; COMPLEXITY; computational learning theory; Engineering; Engineering, Electrical & Electronic; folding networks; Mathematics; Mathematics, Interdisciplinary Applications; NEURAL NETWORKS; PAC learning; recurrent neural networks; VC dimension
Publication date: 1999
Publisher: SPRINGER LONDON LTD
Journal: MATHEMATICS OF CONTROL SIGNALS AND SYSTEMS
Volume: 12
Issue: 1
Start page: 62
End page: 79
Abstract:
We establish several general results concerning PAC learning. We characterize the property that every consistent algorithm is PAC, and we show that the shrinking width property is equivalent to PUAC learnability; a counterexample shows that PAC and PUAC learning are distinct concepts. We give conditions ensuring that any nearly consistent algorithm is PAC or PUAC, respectively. The VC dimension of recurrent neural networks and folding networks is infinite; for restricted inputs, however, bounds exist, and these bounds are transferred to folding networks. We give conditions on the probability of the input space ensuring polynomial learnability: the probability of sequences or trees must converge to zero sufficiently fast with increasing length or height. Finally, we give an example of a concept class that requires exponentially growing sample sizes for accurate generalization.
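The abstract's learnability results hinge on relating the VC dimension of a concept class to the number of samples needed for PAC learning. As a purely illustrative sketch (not taken from the paper), the following function computes one standard sufficient sample size for a class of finite VC dimension `d`, using the classic Blumer–Ehrenfeucht–Haussler–Warmuth bound; the function name and constants are assumptions for illustration:

```python
import math

def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
    """Sufficient sample size to PAC-learn a class of VC dimension vc_dim
    to accuracy epsilon with confidence 1 - delta, via the classic
    Blumer et al. bound: m >= max((4/eps) log2(2/delta),
    (8 d / eps) log2(13/eps)). Illustrative only, not the paper's bound."""
    term_conf = (4.0 / epsilon) * math.log2(2.0 / delta)
    term_dim = (8.0 * vc_dim / epsilon) * math.log2(13.0 / epsilon)
    return math.ceil(max(term_conf, term_dim))
```

The bound grows linearly in the VC dimension and only polynomially in 1/epsilon, which is why an infinite VC dimension (as for unrestricted recurrent or folding networks) rules out such distribution-free guarantees, and why restricting the inputs to recover finite bounds matters.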
ISSN: 0932-4194
DOI: 10.1007/PL00009845
