Even on finite test sets smaller nets may perform better

Author(s): Elsken, T
Keywords: Computer Science; Computer Science, Artificial Intelligence; FEEDFORWARD; feedforward neural nets; identification from input/output relation; NEURAL NETWORKS; Neurosciences; Neurosciences & Neurology
Publication date: 1997
Publisher: PERGAMON-ELSEVIER SCIENCE LTD
Journal: NEURAL NETWORKS
Volume: 10
Issue: 2
Start page: 369
End page: 385
Abstract: 
For feedforward multilayered neural nets, we state conditions on the transfer function f under which such nets are uniquely defined by their mappings (up to trivial manipulations). More importantly, we give sufficient conditions on f such that, for two arbitrary structures having different numbers of layers, there is a finite test set S on which the optimal smaller net performs better. That is, there exist weights and thresholds for the smaller structure such that the resulting net has an error (with respect to S) which is less than that of the bigger net, no matter how the weights and thresholds are chosen for the latter. (C) 1997 Elsevier Science Ltd. All Rights Reserved.
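The comparison criterion in the abstract is the error of a net with respect to a finite test set S. A minimal sketch of that quantity for a one-hidden-layer feedforward net follows; the transfer function (tanh), the toy test set, and all weight values are illustrative assumptions, not taken from the paper.

```python
import math

def f(x):
    # Illustrative transfer function; the paper states conditions on f
    # rather than fixing a particular choice.
    return math.tanh(x)

def net(x, weights, thresholds, out_weights, out_threshold):
    # One-hidden-layer feedforward net: hidden unit j computes
    # f(w_j * x - t_j); the output is an affine combination of hidden units.
    hidden = [f(w * x - t) for w, t in zip(weights, thresholds)]
    return sum(v * h for v, h in zip(out_weights, hidden)) - out_threshold

def error_on(S, *params):
    # Squared error of the net with respect to the finite test set S.
    return sum((net(x, *params) - y) ** 2 for x, y in S)

# Toy finite test set S of (input, target) pairs.
S = [(0.0, 0.0), (1.0, 0.5), (-1.0, -0.5)]

# Two structures of different sizes (hypothetical weights/thresholds).
small = ([1.0], [0.0], [0.66], 0.0)                  # 1 hidden unit
big = ([1.0, -1.0], [0.0, 0.0], [0.4, -0.26], 0.0)   # 2 hidden units

print(error_on(S, *small), error_on(S, *big))
```

The paper's result concerns the optimum over all weight choices: for a suitable S, the best error achievable by the smaller structure undercuts every weight choice for the bigger one, which this sketch does not attempt to demonstrate.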
ISSN: 0893-6080
DOI: 10.1016/S0893-6080(96)00068-8