An online learning algorithmic framework with forgetting property
Citation: SUN Boliang, LI Guohui. An online learning algorithmic framework with forgetting property[J]. Journal of National University of Defense Technology, 2014, 36(4): 188-194.
Authors: SUN Boliang, LI Guohui
Affiliation: College of Information System and Management, National University of Defense Technology, Changsha 410073, China
Abstract: Based on the notion of duality in convex optimization, a novel online learning algorithmic framework with a forgetting property is proposed. The Fenchel conjugate of the hinge loss is the key to transferring the underlying learning problem from the batch setting to the online setting. New online learning algorithms are derived from different procedures for ascending the constrained dual problem: (1) gradient ascent; (2) greedy ascent. Earlier related work is reviewed, and its connections to and differences from the proposed framework are pointed out. Experiments on synthetic and real-world datasets verify the effectiveness of the framework. The derived algorithms handle data streams in which the target hypothesis is not fixed but drifts with the sequence of examples, offering a new way to design and analyze online learning algorithms.

Keywords: online learning; Fenchel conjugate; gradient ascent; greedy ascent
Received: 2013-11-12
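The abstract turns on the Fenchel conjugate of the hinge loss. For reference only (a standard convex-analysis computation, not reproduced from the paper), writing the per-example hinge loss as a function of the margin a = y⟨w, x⟩ gives

g(a) = \max(0,\; 1 - a),
\qquad
g^{*}(\lambda) = \sup_{a}\bigl(\lambda a - g(a)\bigr) =
\begin{cases}
\lambda, & -1 \le \lambda \le 0,\\
+\infty, & \text{otherwise},
\end{cases}

so the dual variables live in a box, which matches the abstract's description of ascending a dual problem with constrained variables.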

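As a minimal sketch of the behaviour the abstract describes (an online learner whose old hypothesis is gradually forgotten so that it can track a drifting classification boundary), the following Python snippet may help fix ideas. It is an illustration under assumptions, not the paper's dual-ascent algorithm: the class name, the subgradient update, the step size, and the multiplicative forgetting factor are all choices made here for the example.

import numpy as np

class ForgettingOnlineLearner:
    """Toy online linear classifier: hinge-loss subgradient steps plus forgetting."""

    def __init__(self, dim, step=0.1, forget=0.99):
        self.w = np.zeros(dim)   # current hypothesis (weight vector)
        self.step = step         # step size for the subgradient update (assumed)
        self.forget = forget     # multiplicative forgetting factor in (0, 1] (assumed)

    def update(self, x, y):
        """One online round: suffer the hinge loss on (x, y), then update w."""
        margin = y * float(self.w @ x)
        loss = max(0.0, 1.0 - margin)
        self.w *= self.forget                # shrink the old hypothesis (forgetting)
        if loss > 0.0:                       # positive hinge loss: step along its subgradient
            self.w += self.step * y * x
        return loss

# Toy drifting stream: the true separator flips sign halfway through.
rng = np.random.default_rng(0)
learner = ForgettingOnlineLearner(dim=2)
mistakes = 0
for t in range(2000):
    x = rng.normal(size=2)
    true_w = np.array([1.0, -1.0]) if t < 1000 else np.array([-1.0, 1.0])
    y = 1.0 if true_w @ x > 0 else -1.0
    pred = 1.0 if learner.w @ x > 0 else -1.0
    if pred != y:                            # count mistakes before updating
        mistakes += 1
    learner.update(x, y)
print("mistakes over the drifting stream:", mistakes)

Setting forget=1.0 recovers a plain hinge-loss subgradient learner, which typically adapts more slowly after the drift because pre-drift updates keep their full weight.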
This article is indexed by CNKI and other databases.