Abstract
Real-world problems often exhibit dual-heterogeneity: each task in the problem has features from multiple views, and multiple tasks are related to one another through one or more shared views. To solve such multi-task problems with multiple views, we propose a shared structure learning framework, which learns shared predictive structures on common views across related tasks and exploits the consistency among different views to improve performance. An alternating optimization algorithm is derived to solve the proposed framework. Moreover, during optimization the computational load can be handled locally within each task, with only summary statistics shared between tasks, which significantly reduces both time and space complexity. Experimental studies on four real-world data sets demonstrate that our framework significantly outperforms state-of-the-art baselines.
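To make the alternating scheme described in the abstract concrete, the following is a minimal illustrative sketch in the spirit of shared-structure learning (cf. Ando and Zhang, 2005), not the paper's actual formulation: each task solves its own regularized regression locally given the current shared subspace `Theta`, and `Theta` is then updated from the stacked task weights via an SVD, so only the weight vectors (statistics), never raw data, cross task boundaries. All names, the regularizer `lam*I - gamma*Theta^T Theta`, and the hyperparameter values are hypothetical choices for illustration.

```python
import numpy as np

def fit_shared_structure(tasks, h=2, lam=0.1, gamma=0.05, n_iter=20, seed=0):
    """Alternating-optimization sketch for multi-task shared structure learning.

    tasks : list of (X, y) pairs, one per task, X of shape (n_t, d).
    h     : dimension of the shared predictive subspace.
    lam, gamma : ridge / shared-subspace reward weights (lam > gamma keeps
                 the per-task systems positive definite).
    Returns the per-task weights W (T x d) and the shared basis Theta (h x d).
    """
    d = tasks[0][0].shape[1]
    rng = np.random.default_rng(seed)
    # Initialize Theta with orthonormal rows.
    Theta = np.linalg.qr(rng.standard_normal((d, h)))[0][:, :h].T
    W = np.zeros((len(tasks), d))
    for _ in range(n_iter):
        # Step 1: each task fits its weights locally; only Theta is needed.
        # The regularizer penalizes ||w||^2 but rewards the component of w
        # lying in the shared subspace spanned by Theta's rows.
        for t, (X, y) in enumerate(tasks):
            A = X.T @ X + lam * np.eye(d) - gamma * (Theta.T @ Theta)
            W[t] = np.linalg.solve(A, X.T @ y)
        # Step 2: update the shared structure from the stacked task weights.
        # Only these weight vectors (summary statistics) are shared.
        U, _, _ = np.linalg.svd(W.T, full_matrices=False)
        Theta = U[:, :h].T
    return W, Theta
```

The key property mirrored here is the one the abstract emphasizes: step 1 is embarrassingly parallel across tasks, and step 2 consumes only a T-by-d matrix of statistics rather than any task's data.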
© 2013 Springer-Verlag Berlin Heidelberg
Cite this paper
Jin, X., Zhuang, F., Wang, S., He, Q., Shi, Z. (2013). Shared Structure Learning for Multiple Tasks with Multiple Views. In: Blockeel, H., Kersting, K., Nijssen, S., Železný, F. (eds) Machine Learning and Knowledge Discovery in Databases. ECML PKDD 2013. Lecture Notes in Computer Science, vol 8189. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-40991-2_23
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-40990-5
Online ISBN: 978-3-642-40991-2
eBook Packages: Computer Science (R0)