Even on finite test sets smaller nets may perform better

DC Field: Value
dc.contributor.author: Elsken, T
dc.date.accessioned: 2021-12-23T15:59:34Z
dc.date.available: 2021-12-23T15:59:34Z
dc.date.issued: 1997
dc.identifier.issn: 0893-6080
dc.identifier.uri: https://osnascholar.ub.uni-osnabrueck.de/handle/unios/4005
dc.description.abstract: For feedforward multilayered neural nets we state conditions on the transfer function f under which such nets are uniquely defined by their mappings (up to trivial manipulations). More important, we give sufficient conditions on f such that for two arbitrary structures having different numbers of layers there is a finite test set S on which the optimal smaller net performs better. That is, there exist weights and thresholds for the smaller structure such that the resulting net has an error (with respect to S) which is less than that of the bigger net, no matter how the weights and thresholds are chosen for the latter. (C) 1997 Elsevier Science Ltd. All Rights Reserved.
dc.language.iso: en
dc.publisher: PERGAMON-ELSEVIER SCIENCE LTD
dc.relation.ispartof: NEURAL NETWORKS
dc.subject: Computer Science
dc.subject: Computer Science, Artificial Intelligence
dc.subject: FEEDFORWARD
dc.subject: feedforward neural nets
dc.subject: identification from input/output relation
dc.subject: NEURAL NETWORKS
dc.subject: Neurosciences
dc.subject: Neurosciences & Neurology
dc.title: Even on finite test sets smaller nets may perform better
dc.type: journal article
dc.identifier.doi: 10.1016/S0893-6080(96)00068-8
dc.identifier.isi: ISI:A1997WN91300014
dc.description.volume: 10
dc.description.issue: 2
dc.description.startpage: 369
dc.description.endpage: 385
dc.publisher.place: THE BOULEVARD, LANGFORD LANE, KIDLINGTON, OXFORD, ENGLAND OX5 1GB
dcterms.isPartOf.abbreviation: Neural Netw.
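The abstract's central claim, that on some finite test set a well-chosen smaller net can have strictly lower error than a larger one, can be illustrated with a toy sketch. All weights and the test set S below are hypothetical choices for illustration; the sketch shows only the error comparison for one fixed weight setting of the larger net, not the paper's construction, which holds no matter how the larger net's weights are chosen.

```python
import math

# Hypothetical finite test set S of (input, target) pairs.
S = [(-1.0, -0.76), (0.0, 0.0), (1.0, 0.76)]

def small_net(x):
    # One hidden tanh unit; weights chosen (by hand) to fit S well.
    return 1.0 * math.tanh(1.0 * x)

def big_net(x):
    # Two hidden tanh units with one arbitrary fixed weight setting;
    # for these particular weights it fits S worse than small_net.
    return 0.5 * math.tanh(2.0 * x) + 0.5 * math.tanh(0.5 * x)

def error(net, test_set):
    # Sum-of-squares error with respect to the finite test set.
    return sum((net(x) - t) ** 2 for x, t in test_set)

e_small = error(small_net, S)
e_big = error(big_net, S)
print(e_small < e_big)  # the smaller net wins on this S
```

This only makes the statement concrete; the theorem's strength lies in the quantifier it adds, namely that the smaller structure beats the bigger one on S for every choice of the bigger net's weights and thresholds.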