Please use this identifier to cite or link to this item: https://hdl.handle.net/10316/44320
Title: Adaptive learning for dynamic environments: A comparative approach
Authors: Costa, Joana 
Silva, Catarina 
Antunes, Mário 
Ribeiro, Bernardete 
Keywords: Dynamic environments; Ensembles; Learn++.NSE; Twitter
Issue Date: 2017
Journal: Engineering Applications of Artificial Intelligence
Volume: 65
Abstract: Nowadays most learning problems demand adaptive solutions. Current challenges include temporal data streams, drift, and non-stationary scenarios, often with text data, whether in social networks or in business systems. Various efforts have been pursued in machine learning settings to learn in such environments, especially because of their non-trivial nature, since changes occur between the data distribution used to define the model and the current environment. In this work we present the Drift Adaptive Retain Knowledge (DARK) framework to tackle adaptive learning in dynamic environments based on recent and retained knowledge. DARK handles an ensemble of multiple Support Vector Machine (SVM) models that are dynamically weighted and have distinct training window sizes. A comparative study with benchmark solutions in the field, namely the Learn++.NSE algorithm, is also presented. Experimental results revealed that DARK outperforms Learn++.NSE with two different base classifiers, an SVM and a Classification and Regression Tree (CART).
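The abstract's core idea, an ensemble whose members are trained on sliding windows of different sizes and combined by dynamically updated weights, can be sketched as follows. This is a minimal illustrative sketch, not the authors' DARK implementation: the class names, the exponential weight update, and the nearest-centroid base learner (a stand-in for the paper's SVMs) are all assumptions made for brevity.

```python
from collections import deque
from statistics import mean

class WindowedEnsemble:
    """Toy ensemble: one base learner per sliding window size,
    combined by weights tracking each member's recent accuracy."""

    def __init__(self, window_sizes, make_learner):
        self.windows = [deque(maxlen=w) for w in window_sizes]
        self.make_learner = make_learner
        self.models = [None] * len(window_sizes)
        self.weights = [1.0] * len(window_sizes)

    def partial_fit(self, x, y):
        # Before absorbing the new example, score each member on it and
        # update its weight (exponential smoothing: an illustrative choice).
        for i, model in enumerate(self.models):
            if model is not None:
                hit = 1.0 if model(x) == y else 0.0
                self.weights[i] = 0.9 * self.weights[i] + 0.1 * hit
        # Append to every window (shorter windows forget older data sooner)
        # and retrain each member on its own window.
        for i, window in enumerate(self.windows):
            window.append((x, y))
            self.models[i] = self.make_learner(list(window))

    def predict(self, x):
        # Weighted vote across all trained members.
        votes = {}
        for model, weight in zip(self.models, self.weights):
            if model is not None:
                label = model(x)
                votes[label] = votes.get(label, 0.0) + weight
        return max(votes, key=votes.get)

def nearest_centroid_learner(data):
    """Stand-in 1-D base learner (the paper uses SVMs):
    classify by distance to the per-class mean."""
    groups = {}
    for x, y in data:
        groups.setdefault(y, []).append(x)
    centroids = {y: mean(xs) for y, xs in groups.items()}
    return lambda x: min(centroids, key=lambda y: abs(x - centroids[y]))
```

A short usage example: a small ensemble with a 5-example and a 20-example window, fed a simple stream where negative points are class 0 and positive points are class 1.

```python
ens = WindowedEnsemble([5, 20], nearest_centroid_learner)
for x, y in [(-1.0, 0), (1.0, 1), (-2.0, 0), (2.0, 1), (-1.5, 0), (1.5, 1)]:
    ens.partial_fit(x, y)
print(ens.predict(-1.2))  # → 0
print(ens.predict(1.2))   # → 1
```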
URI: https://hdl.handle.net/10316/44320
DOI: 10.1016/j.engappai.2017.08.004
Rights: openAccess
Appears in Collections:I&D CISUC - Artigos em Revistas Internacionais

Files in This Item:
File: 07807338.pdf (1.5 MB, Adobe PDF)

SCOPUS™ Citations: 8 (checked on Oct 28, 2024)
Web of Science™ Citations: 5 (checked on May 2, 2023)

Page view(s): 1,402 (checked on Nov 6, 2024)
Download(s): 370 (checked on Nov 6, 2024)



Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.