On-line learning with minimized change of the global mapping: Optimized local learning by incremental risk minimization
Big data; Classification; Computer Science; Computer Science, Artificial Intelligence; Gradient descent; Least-squares algorithm; Model; Nonstationary environments; On-line learning; Perceptron; Real-time; Regression
On-line learning for regression has been studied extensively because it allows continuous adaptation to nonstationary environments, handles big data, and has a fixed, low demand on computation and memory. Most research deals with direct linear regression, but the influence of a nonlinear transformation of the inputs through a fixed model structure remains an open problem. We present an on-line learning approach that can deal with all kinds of nonlinear model structures. Its emphasis is on minimizing the effect of local training examples on changes of the global mapping; it thus behaves robustly, preventing both overfitting on sparse data and fatal forgetting. This paper first presents a first-order version, the incremental risk minimization algorithm (IRMA), in detail. It then extends this approach to a second-order version of IRMA, which continuously adapts the learning process itself to the data at hand. For both versions it is proven that every learning step minimizes the worst-case loss. Finally, we demonstrate the effectiveness of the approach in a series of experiments with synthetic and real data sets.
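The flavor of such an update can be sketched as follows. This is an illustrative toy, not the paper's actual IRMA: each step minimizes the squared error on the newest example plus a penalty `lam * ||w_new - w||^2` standing in for the change of the global mapping, which has a closed-form solution. The feature map `phi` (polynomial features) and the parameter `lam` are assumptions for the sketch.

```python
import numpy as np

def phi(x):
    # Hypothetical fixed nonlinear model structure: polynomial features of x.
    return np.array([1.0, x, x ** 2])

def regularized_online_step(w, x, y, lam=1.0):
    """One on-line update minimizing
        (phi(x)^T w_new - y)^2 + lam * ||w_new - w||^2,
    i.e. the local loss plus a proxy penalty on changing the mapping.
    Setting the gradient to zero gives the closed-form (NLMS-style) step
        w_new = w + phi(x) * err / (lam + ||phi(x)||^2)."""
    f = phi(x)
    err = y - f @ w
    return w + f * err / (lam + f @ f)

# Usage: learn y = x^2 from a stream of examples.
rng = np.random.default_rng(0)
w = np.zeros(3)
for _ in range(2000):
    x = rng.uniform(-1.0, 1.0)
    w = regularized_online_step(w, x, x ** 2, lam=0.1)
```

A small `lam` lets each example move the model freely; a large `lam` keeps the global mapping nearly unchanged per step, trading adaptation speed for stability.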