
DOI: 10.1145/1401890.1401928 · Research article

Knowledge transfer via multiple model local structure mapping

Published: 24 August 2008

Abstract

The effectiveness of knowledge transfer using classification algorithms depends on the difference between the distribution that generates the training examples and the one from which test examples are to be drawn. The task can be especially difficult when the training examples come from one or several domains different from the test domain. In this paper, we propose a locally weighted ensemble framework that combines multiple models for transfer learning, where the weights are dynamically assigned according to each model's predictive power on each test example. It can integrate the advantages of various learning algorithms and the labeled information from multiple training domains into one unified classification model, which can then be applied to a different domain. Importantly, unlike many previously proposed methods, none of the base learning methods is required to be specifically designed for transfer learning. We show the optimality of a locally weighted ensemble framework as a general approach to combining multiple models for domain transfer. We then propose an implementation of the local weight assignments that maps the structures of a model onto the structures of the test domain, weighting each model locally according to its consistency with the neighborhood structure around the test example. Experimental results on text classification, spam filtering, and intrusion detection data sets demonstrate significant improvements in classification accuracy gained by the framework. On a transfer learning task of newsgroup message categorization, the proposed locally weighted ensemble framework achieves 97% accuracy when the best single model predicts correctly on only 73% of the test examples. In summary, the improvement in accuracy is over 10% and up to 30% across different problems.
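The locally weighted ensemble idea in the abstract can be sketched roughly as follows. This is a simplified illustration, not the paper's exact algorithm: the paper weights each model by mapping its structure onto the clustering structure of the test domain, whereas this sketch uses a cruder proxy (how consistently a model labels a test point's nearest neighbours in the unlabeled test set) to assign per-example weights. All names (`nearest_centroid_model`, `local_weights`, `lwe_predict`) and the synthetic data are illustrative assumptions.

```python
import numpy as np

def nearest_centroid_model(X_train, y_train):
    """A trivial nearest-centroid classifier; stands in for any base model
    trained on one source domain (the framework is model-agnostic)."""
    classes = np.unique(y_train)
    centroids = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    def predict(X):
        d = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=-1)
        return classes[d.argmin(axis=1)]
    return predict

def local_weights(models, X_test, k=5):
    """Per-example weights: a model gets more weight at a test point when its
    predictions over the point's k nearest test-set neighbours agree with its
    prediction at the point itself -- a crude stand-in for the paper's
    neighbourhood-structure consistency."""
    preds = [m(X_test) for m in models]
    d = ((X_test[:, None, :] - X_test[None, :, :]) ** 2).sum(axis=-1)
    nn = d.argsort(axis=1)[:, 1:k + 1]             # k nearest neighbours, self excluded
    W = np.zeros((len(models), len(X_test)))
    for i, p in enumerate(preds):
        W[i] = (p[nn] == p[:, None]).mean(axis=1)  # neighbourhood agreement in [0, 1]
    W = W / (W.sum(axis=0, keepdims=True) + 1e-12) # normalise per test example
    return preds, W

def lwe_predict(models, X_test, k=5):
    """Locally weighted ensemble: per-example weighted vote over base models."""
    preds, W = local_weights(models, X_test, k)
    score = sum(w * p for w, p in zip(W, preds))   # binary labels in {0, 1}
    return (score >= 0.5).astype(int)

# Two source domains with shifted distributions, applied to a third (test) domain.
rng = np.random.default_rng(0)
Xa = rng.normal(0.0, 1.0, (40, 2)); ya = (Xa[:, 0] > 0.0).astype(int)
Xb = rng.normal(0.5, 1.0, (40, 2)); yb = (Xb[:, 0] > 0.5).astype(int)
models = [nearest_centroid_model(Xa, ya), nearest_centroid_model(Xb, yb)]
X_test = rng.normal(0.2, 1.0, (30, 2))
y_hat = lwe_predict(models, X_test)
```

Note the key property claimed in the abstract: the base learners need no transfer-learning machinery of their own; all adaptation happens in the per-example weighting step.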





    Published In

KDD '08: Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
August 2008, 1116 pages
ISBN: 9781605581934
DOI: 10.1145/1401890
General Chair: Ying Li; Program Chairs: Bing Liu, Sunita Sarawagi

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. classification
    2. ensemble
    3. semi-supervised learning
    4. transfer learning


    Conference

    KDD08

    Acceptance Rates

    KDD '08 Paper Acceptance Rate 118 of 593 submissions, 20%;
    Overall Acceptance Rate 1,133 of 8,635 submissions, 13%

    Article Metrics

• Downloads (last 12 months): 71
• Downloads (last 6 weeks): 4
Reflects downloads up to 24 Nov 2024

    Cited By

• (2024) Cross-domain decision method based on instance transfer and model transfer for fault diagnosis. Advances in Mechanical Engineering, 16:4. DOI: 10.1177/16878132241245836. Online publication date: 20-Apr-2024.
• (2024) Communication-efficient Multi-service Mobile Traffic Prediction by Leveraging Cross-service Correlations. Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining, 794-805. DOI: 10.1145/3637528.3671730. Online publication date: 25-Aug-2024.
• (2024) A Fast Parametric and Structural Transfer Leaky Integrator Echo State Network for Reservoir Computing. IEEE Transactions on Systems, Man, and Cybernetics: Systems, 54:5, 3257-3269. DOI: 10.1109/TSMC.2024.3358567. Online publication date: May-2024.
• (2024) Pre-Trained Transformer-Based Parallel Multi-Channel Adaptive Image Sequence Interpolation Network. IEEE Transactions on Circuits and Systems for Video Technology, 34:10, 10464-10478. DOI: 10.1109/TCSVT.2024.3409395. Online publication date: Oct-2024.
• (2024) Adaptive Fine-Tuning in Degradation-Time-Series Forecasting via Generating Source Domain. IEEE Access, 12, 15093-15104. DOI: 10.1109/ACCESS.2023.3341159. Online publication date: 2024.
• (2024) Automated patch correctness predicting to fix software defect. Expert Systems with Applications, 256:C. DOI: 10.1016/j.eswa.2024.124877. Online publication date: 5-Dec-2024.
• (2024) Transfer learning with convolutional neural networks for hydrological streamline delineation. Environmental Modelling & Software, 181:C. DOI: 10.1016/j.envsoft.2024.106165. Online publication date: 18-Nov-2024.
• (2023) Multi-modal cognitive computing. SCIENTIA SINICA Informationis, 53:1. DOI: 10.1360/SSI-2022-0226. Online publication date: 11-Jan-2023.
• (2023) An Efficient Transfer Learning Method with Auxiliary Information. ACM Transactions on Knowledge Discovery from Data, 18:1, 1-23. DOI: 10.1145/3612930. Online publication date: 6-Sep-2023.
• (2023) A Transfer Learning Framework for Predictive Energy-Related Scenarios in Smart Buildings. IEEE Transactions on Industry Applications, 59:1, 26-37. DOI: 10.1109/TIA.2022.3179222. Online publication date: Jan-2023.
