
Tolerating Concept and Sampling Shift in Lazy Learning Using Prediction Error Context Switching



Artificial Intelligence Review, Volume 11 (5) – Oct 15, 2004



Publisher
Springer Journals
Copyright
Copyright © 1997 by Kluwer Academic Publishers
Subject
Computer Science; Artificial Intelligence (incl. Robotics); Computer Science, general
ISSN
0269-2821
eISSN
1573-7462
DOI
10.1023/A:1006515405170

Abstract

In their unmodified form, lazy-learning algorithms may have difficulty learning and tracking time-varying input/output function maps such as those that occur under concept shift. Extensions of these algorithms, such as Time-Windowed Forgetting (TWF), can permit learning of time-varying mappings by deleting older exemplars, but have decreased classification accuracy when the input-space sampling distribution of the learning set is time-varying. Additionally, TWF suffers from lower asymptotic classification accuracy than equivalent non-forgetting algorithms when the input sampling distributions are stationary. Other shift-sensitive algorithms, such as Locally-Weighted Forgetting (LWF), avoid the negative effects of time-varying sampling distributions, but still have lower asymptotic classification accuracy in non-varying cases. We introduce Prediction Error Context Switching (PECS), which allows lazy-learning algorithms to achieve good classification accuracy under time-varying function mappings and input sampling distributions, while still maintaining their asymptotic classification accuracy in static tasks. PECS works by selecting and re-activating previously stored instances based on their most recent consistency record. The classification accuracy and active learning set sizes of the above algorithms are compared in a set of learning tasks that illustrate the differing time-varying conditions described above. The results show that the PECS algorithm has the best overall classification accuracy across these differing time-varying conditions, while still having asymptotic classification accuracy competitive with unmodified lazy learners intended for static environments.
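
The abstract only describes PECS at a high level, so the following Python sketch is an illustrative guess at the mechanism it outlines: each stored instance keeps a short record of how well it has agreed with recently observed outputs in its neighborhood, and is deactivated or re-activated based on that record rather than deleted. The class name, window length, thresholds, and neighborhood radius are all assumptions made for illustration, not details taken from the paper.

```python
from collections import deque
import math


class PECSLazyLearner:
    """Hypothetical 1-NN lazy learner with PECS-style instance (de)activation.

    The window length, thresholds, and neighborhood radius are illustrative
    assumptions, not parameters reported in the paper.
    """

    def __init__(self, window=5, theta_lo=0.4, theta_hi=0.6, radius=0.1):
        self.window = window      # length of each instance's consistency record
        self.theta_lo = theta_lo  # deactivate when recent consistency falls below this
        self.theta_hi = theta_hi  # re-activate when it recovers to at least this
        self.radius = radius      # neighborhood used to judge consistency
        self.store = []           # entries of the form [x, y, record, active]

    def _dist(self, a, b):
        return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

    def predict(self, x):
        """Classify with the nearest *active* stored instance (plain 1-NN)."""
        active = [(self._dist(x, xi), yi) for xi, yi, _, act in self.store if act]
        return min(active)[1] if active else None

    def learn(self, x, y):
        """Store (x, y) and update consistency records of nearby instances."""
        for entry in self.store:
            xi, yi, rec, act = entry
            if self._dist(x, xi) <= self.radius:
                # An instance is "consistent" if its label matches the new observation.
                rec.append(1 if yi == y else 0)
                rate = sum(rec) / len(rec)
                if act and rate < self.theta_lo:
                    entry[3] = False   # move to the inactive set; nothing is deleted
                elif not act and rate >= self.theta_hi:
                    entry[3] = True    # re-activate: it agrees with recent data again
        self.store.append([x, y, deque(maxlen=self.window), True])


# Toy usage: the concept over [0, 1] flips halfway through the stream.
learner = PECSLazyLearner(radius=0.15)
stream = [((i / 100,), int(i / 100 > 0.5)) for i in range(100)]    # original concept
stream += [((i / 100,), int(i / 100 <= 0.5)) for i in range(100)]  # shifted concept
for x, y in stream:
    guess = learner.predict(x)
    learner.learn(x, y)
```

Under this reading, instances stored before a concept shift stop agreeing with new observations in their neighborhood, fall below the deactivation threshold, and drop out of prediction; if the old concept later returns, their consistency records recover and they are re-activated, which is what would let such a scheme avoid the permanent information loss of TWF-style deletion.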

Journal

Artificial Intelligence Review, Springer Journals

Published: Oct 15, 2004
