Computer Science ›› 2018, Vol. 45 ›› Issue (6): 251-258. doi: 10.11896/j.issn.1002-137X.2018.06.045

• Artificial Intelligence •

Unsupervised Active Learning Based on Adaptive Sparse Neighbors Reconstruction

LV Ju-jian 1,2, ZHAO Hui-min 1,2, CHEN Rong-jun 1, LI Jian-hong 3

  1. Guangdong Polytechnic Normal University, Guangzhou 510665, China;
  2. Key Laboratory of Guangzhou Digital Content Processing and Security Technology, Guangzhou 510665, China;
  3. Language Engineering and Computing Laboratory, Guangdong University of Foreign Studies, Guangzhou 510006, China
  • Received: 2017-01-11  Online: 2018-06-15  Published: 2018-07-24

Abstract: In many information processing tasks, it is easy to collect large amounts of unlabeled data, but labeling these data is time-consuming and usually expensive. As an important method in the field of machine learning, active learning reduces the labeling cost by selecting the most informative data points to label. However, most existing active learning algorithms are supervised methods built around a classifier, and are therefore not suitable for sample selection when no label information is available at all. To address this problem, a novel unsupervised active learning algorithm, called active learning based on adaptive sparse neighbors reconstruction, was proposed by drawing on optimal experimental design and combining it with adaptive sparse neighbors reconstruction. The proposed algorithm adaptively selects the neighborhood scale according to the local distribution of the dataset, searches for sparse neighbors and computes the reconstruction coefficients simultaneously, and can thus choose the data points that best represent the distribution structure of the dataset without any label information. Empirical results on both synthetic and real-world datasets show that, under the same labeling cost, the proposed algorithm achieves high classification accuracy and robustness.
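
The full optimization details are given in the paper itself; as a rough illustration of the idea sketched in the abstract, the following minimal Python sketch reconstructs every sample from the remaining samples with an L1-penalized least-squares fit and then selects the points that contribute most to reconstructing the others as the ones to label. The penalty weight alpha, the budget n_select, the use of scikit-learn's Lasso solver, and the simple score-summing rule are all illustrative assumptions, not the paper's exact formulation; in particular, the adaptive choice of the neighborhood scale is omitted here.

# Hypothetical sketch (not the paper's algorithm): pick representative
# points by sparsely reconstructing each sample from all the others.
import numpy as np
from sklearn.linear_model import Lasso

def sparse_reconstruction_scores(X, alpha=0.05):
    """For each sample x_i, fit an L1-penalized least-squares reconstruction
    of x_i from the remaining samples and accumulate the absolute coefficients
    as a score of how useful each point is as a 'sparse neighbor'."""
    n = X.shape[0]
    scores = np.zeros(n)
    for i in range(n):
        others = np.delete(np.arange(n), i)
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=5000)
        lasso.fit(X[others].T, X[i])          # dictionary: d x (n-1), target: d-dim x_i
        scores[others] += np.abs(lasso.coef_)
    return scores

def select_representatives(X, n_select=10, alpha=0.05):
    """Return the indices of the points that contribute most to reconstructing
    the rest of the dataset; these are the candidates sent for labeling."""
    scores = sparse_reconstruction_scores(X, alpha=alpha)
    return np.argsort(scores)[::-1][:n_select]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))             # toy unlabeled dataset
    print("indices to label:", select_representatives(X, n_select=10))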

Key words: Active learning, Local linear reconstruction, Optimal experimental design, Sparse reconstruction, Transductive experimental design

CLC Number: TP181