DOI: 10.1145/3637528.3671721
Research Article · Open Access

POND: Multi-Source Time Series Domain Adaptation with Information-Aware Prompt Tuning

Published: 24 August 2024

Abstract

Time series domain adaptation is a pivotal and intricate challenge with diverse applications, including human activity recognition, sleep stage classification, and machine fault diagnosis. Although numerous domain adaptation techniques have been proposed for this problem, they focus primarily on adaptation from a single source domain. Investigating adaptation from multiple source domains is more important, however, because it offers the potential for greater improvement. Realizing that potential requires overcoming three challenges: 1) the lack of methods that exploit domain-specific information for domain adaptation; 2) the difficulty of learning domain-specific information that changes over time; and 3) the difficulty of evaluating the learned domain-specific information. To tackle these challenges simultaneously, we introduce PrOmpt-based domaiN Discrimination (POND), the first framework to use prompts for time series domain adaptation. Specifically, to address Challenge 1, we extend prompt tuning to time series analysis and learn prompts that capture common and domain-specific information from all source domains. To handle Challenge 2, we introduce a conditional module for each source domain that generates prompts from the time series input data. For Challenge 3, we propose two criteria for selecting good prompts, which are used to choose the most suitable source domain for domain adaptation. The efficacy and robustness of POND are extensively validated through experiments across 50 scenarios spanning four datasets. Experimental results demonstrate that POND outperforms all state-of-the-art comparison methods by up to 66% in F1-score.
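
To make the prompt-tuning idea concrete, below is a minimal PyTorch sketch of the two ingredients the abstract describes: a shared prompt learned directly to capture information common to all source domains, and a per-domain conditional module that generates an instance-dependent prompt from the time series input. Everything here (module names such as ConditionalPromptGenerator, the transformer backbone, and all shapes and hyperparameters) is an illustrative assumption, not the authors' implementation.

```python
# Hedged sketch of conditional prompt tuning for multi-source time series
# domain adaptation. Names, shapes, and hyperparameters are hypothetical.
import torch
import torch.nn as nn


class ConditionalPromptGenerator(nn.Module):
    """Generates an instance-dependent prompt from a time series input.

    One generator per source domain: because the prompt is a function of the
    input, it can track domain-specific information that changes over time.
    """

    def __init__(self, n_channels: int, seq_len: int, prompt_len: int, d_model: int):
        super().__init__()
        # Summarize the raw series, then map the summary to prompt tokens.
        self.encoder = nn.Sequential(
            nn.Flatten(),                               # (B, C, T) -> (B, C*T)
            nn.Linear(n_channels * seq_len, d_model),
            nn.ReLU(),
            nn.Linear(d_model, prompt_len * d_model),
        )
        self.prompt_len, self.d_model = prompt_len, d_model

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, time) -> prompt: (batch, prompt_len, d_model)
        return self.encoder(x).view(-1, self.prompt_len, self.d_model)


class PromptedTimeSeriesClassifier(nn.Module):
    """Frozen backbone + shared prompt + per-domain conditional prompts."""

    def __init__(self, n_channels, seq_len, n_domains, n_classes,
                 prompt_len=8, d_model=64):
        super().__init__()
        # Shared (common) prompt: learned directly, identical for all domains.
        self.shared_prompt = nn.Parameter(torch.randn(prompt_len, d_model) * 0.02)
        # One conditional generator per source domain (domain-specific prompts).
        self.domain_generators = nn.ModuleList(
            ConditionalPromptGenerator(n_channels, seq_len, prompt_len, d_model)
            for _ in range(n_domains)
        )
        self.embed = nn.Linear(n_channels, d_model)     # per-timestep embedding
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.backbone = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Prompt tuning keeps the backbone fixed (here it is random; in practice
        # it would be pretrained) and trains only prompts, embedding, and head.
        for p in self.backbone.parameters():
            p.requires_grad = False
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x: torch.Tensor, domain: int) -> torch.Tensor:
        B = x.size(0)
        tokens = self.embed(x.transpose(1, 2))          # (B, T, d_model)
        shared = self.shared_prompt.expand(B, -1, -1)   # (B, P, d_model)
        specific = self.domain_generators[domain](x)    # (B, P, d_model)
        z = torch.cat([shared, specific, tokens], dim=1)
        return self.head(self.backbone(z).mean(dim=1))  # pooled logits


if __name__ == "__main__":
    model = PromptedTimeSeriesClassifier(n_channels=9, seq_len=128,
                                         n_domains=3, n_classes=6)
    x = torch.randn(4, 9, 128)                          # batch of 4 series
    print(model(x, domain=0).shape)                     # torch.Size([4, 6])
```

Under a setup like this, the paper's prompt-selection criteria would correspond to scoring each source domain's generated prompts (for instance, with a mutual-information-style estimate) and adapting from the highest-scoring domain.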

Supplemental Material

MP4 File - Promotional Video
Promotional Video of the paper "POND: Multi-Source Time Series Domain Adaptation with Information-Aware Prompt Tuning"



Published In

KDD '24: Proceedings of the 30th ACM SIGKDD Conference on Knowledge Discovery and Data Mining
August 2024
6901 pages
ISBN:9798400704901
DOI:10.1145/3637528
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.

Publisher

Association for Computing Machinery

New York, NY, United States

Author Tags

  1. domain adaptation
  2. information bottleneck
  3. prompt tuning
  4. time series

Qualifiers

  • Research-article

Conference

KDD '24

Acceptance Rates

Overall Acceptance Rate 1,133 of 8,635 submissions, 13%

Article Metrics

  • Total Citations: 0
  • Total Downloads: 156
  • Downloads (last 12 months): 156
  • Downloads (last 6 weeks): 156
Reflects downloads up to 24 Sep 2024
