On-line learning with minimized change of the global mapping: Optimized local learning by incremental risk minimization

Author(s): Buschermoehle, Andreas; Brockmann, Werner
Keywords: Big data; CLASSIFICATION; Computer Science; Computer Science, Artificial Intelligence; GRADIENT DESCENT; LEAST-SQUARES ALGORITHM; MODEL; Nonstationary environments; On-line learning; PERCEPTRON; Real-time; REGRESSION
Publication date: 2015
Publisher: Springer Heidelberg
Journal: Evolving Systems
Volume: 6
Issue: 2 (special issue)
Pages: 131–151
Abstract:
On-line learning regression has been studied extensively because it allows continuous adaptation to nonstationary environments, handles big data, and has a fixed, low demand on computation and memory. Most research deals with direct linear regression, but the influence of a nonlinear transformation of the inputs through a fixed model structure is still an open problem. We present an on-line learning approach that can deal with all kinds of nonlinear model structures. Its emphasis is on minimizing the effect of local training examples on changes of the global mapping. It thus behaves robustly, preventing overfitting on sparse data as well as fatal forgetting. This paper first presents a first-order version, called the incremental risk minimization algorithm (IRMA), in detail. It then extends this approach to a second-order version of IRMA, which continuously adapts the learning process itself to the data at hand. For both versions it is proven that every learning step minimizes the worst-case loss. Finally, we demonstrate the effectiveness in a series of experiments with synthetic and real data sets.
ISSN: 1868-6478
DOI: 10.1007/s12530-014-9118-9
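
To give some intuition for the trade-off the abstract describes, between fitting a new training example and limiting the change of the global mapping, here is a minimal Python sketch. It is not the IRMA update derived in the paper: it assumes a linear-in-parameters model with a hypothetical Gaussian basis phi, and the class name, the trade-off weight lam, and the toy data stream are all illustrative.

```python
import numpy as np

# Hypothetical fixed nonlinear model structure: Gaussian basis functions.
# Any fixed feature map could stand in here; this choice is illustrative.
CENTERS = np.linspace(-1.0, 1.0, 10)

def phi(x):
    """Fixed nonlinear transformation of the scalar input x."""
    return np.exp(-((x - CENTERS) ** 2) / 0.1)

class ProximalOnlineLearner:
    """Sketch of the general idea: each incremental step minimizes
        lam * ||w - w_old||^2 + (y - w . phi(x))^2,
    i.e. the loss on the new example plus a penalty on the change of
    the global mapping. For a linear-in-parameters model this has the
    closed-form (implicit/proximal) update used below. This is not the
    paper's IRMA update rule, only an illustration of the trade-off."""

    def __init__(self, dim, lam=0.5):
        self.w = np.zeros(dim)  # parameters of the global mapping
        self.lam = lam          # weight on the change of the mapping

    def predict(self, x):
        return self.w @ phi(x)

    def update(self, x, y):
        f = phi(x)
        err = y - self.w @ f
        # Setting the gradient of the objective above to zero yields:
        self.w += f * err / (self.lam + f @ f)

# Usage: learn y = sin(pi * x) from a stream of noisy examples.
rng = np.random.default_rng(0)
learner = ProximalOnlineLearner(dim=CENTERS.size)
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)
    learner.update(x, np.sin(np.pi * x) + 0.05 * rng.normal())
print(learner.predict(0.5))  # approaches sin(pi/2) = 1 as data accumulates
```

The closed-form step above is a standard implicit (proximal) online update; IRMA's actual first- and second-order rules, and the proof that each step minimizes the worst-case loss, are developed in the article itself.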
