On the generalization ability of recurrent networks

Author(s): Hammer, B.
Editor(s): Dorffner, G.; Bischof, H.; Hornik, K.
Keywords: Computer Science; Computer Science, Artificial Intelligence; Computer Science, Theory & Methods
Publication date: 2001
Publisher: Springer-Verlag Berlin
Journal: Artificial Neural Networks - ICANN 2001, Proceedings (Lecture Notes in Computer Science)
Volume: 2130
Start page: 731
End page: 736
Abstract:
The generalization ability of discrete-time partially recurrent networks is examined. It is well known that the VC dimension of recurrent networks is infinite in most interesting cases, so the standard VC analysis cannot be applied directly. We derive guarantees for specific situations in which the transition function forms a contraction or the probability of long inputs is restricted. For the general case, we obtain posterior bounds which take the input data into account; they follow from a generalization of the luckiness framework to the agnostic setting. The general formalism allows one to focus on representative parts of the data as well as on more general situations such as long-term prediction.
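As an illustrative sketch (not part of the original record), the model class described in the abstract can be written in standard notation; the symbols f, g, h_t, x_t and the constant \lambda below are assumed and not taken from the paper itself.

% Assumed formalization of a discrete-time partially recurrent network:
% the hidden state h_t is updated by a transition function f over the
% input sequence x_1, ..., x_T, and the output is read off the final state by g.
\[
  h_t = f(h_{t-1}, x_t), \qquad h_0 \text{ fixed}, \qquad y = g(h_T).
\]
% The contraction case mentioned in the abstract: f is Lipschitz in its
% state argument with some constant \lambda < 1, uniformly over the inputs.
\[
  \exists\, \lambda < 1:\quad \lVert f(h, x) - f(h', x) \rVert \le \lambda\, \lVert h - h' \rVert \quad \text{for all } h,\, h',\, x.
\]

Under such a contraction, differences in early states are damped geometrically along the sequence; this is one of the specific situations for which the abstract reports generalization guarantees.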
Description:
International Conference on Artificial Neural Networks (ICANN 2001), Vienna University of Technology, Vienna, Austria, August 21-25, 2001
ISBN: 978-3-540-42486-4
ISSN: 0302-9743
