DOI: 10.1145/3640457.3688171 (RecSys Conference Proceedings · short paper)

GLAMOR: Graph-based LAnguage MOdel embedding for citation Recommendation

Published: 08 October 2024

Abstract

Digital publishing’s exponential growth has created vast scholarly collections. Guiding researchers to relevant resources is crucial, and knowledge graphs (KGs) are key tools for unlocking hidden knowledge. However, current methods focus on external links between concepts and ignore the rich information within individual papers. Challenges such as insufficient multi-relational data, name ambiguity, and cold-start issues further limit existing KG-based methods, which fail to capture the intricate attributes of diverse entities. To address these issues, we propose GLAMOR, a robust KG framework encompassing entities (e.g., authors, papers, fields of study, and concepts) along with their semantic interconnections. GLAMOR generates text from the KG using a novel random walk-based method and then fine-tunes a language model on the generated text. The resulting context-preserving embeddings enable superior top-k predictions. Evaluation results on two public benchmark datasets demonstrate GLAMOR’s superiority over state-of-the-art methods, especially in solving the cold-start problem.
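The pipeline the abstract describes (verbalize random walks over an attributed KG, embed the resulting text, rank candidates by similarity) can be illustrated with a minimal sketch. Everything below is invented for illustration: the toy graph, node names, and attributes are not from the paper, and the bag-of-words "embedding" is a stand-in for GLAMOR's actual fine-tuned language-model embeddings.

```python
import random
from collections import Counter
from math import sqrt

# Toy attributed citation graph (hypothetical nodes and edges).
# Each node carries a short text attribute that walks will verbalize.
nodes = {
    "paper:A": "graph embeddings for recommender systems",
    "paper:B": "language models for scientific text",
    "author:X": "researcher in recommender systems",
    "field:RecSys": "recommender systems",
}
edges = {
    "paper:A": ["author:X", "field:RecSys"],
    "paper:B": ["author:X"],
    "author:X": ["paper:A", "paper:B"],
    "field:RecSys": ["paper:A"],
}

def random_walk_to_text(start, length, rng):
    """Verbalize a random walk as a sequence of node attribute texts."""
    walk, node = [], start
    for _ in range(length):
        walk.append(nodes[node])
        nbrs = edges.get(node, [])
        if not nbrs:
            break
        node = rng.choice(nbrs)
    return " ; ".join(walk)

def embed(text):
    """Stand-in embedding: bag-of-words term counts, not a fine-tuned LM."""
    return Counter(text.split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def top_k(query_node, candidates, k=2):
    """Rank candidate papers by embedding similarity to the query node."""
    q = embed(nodes[query_node])
    scored = [(cosine(q, embed(nodes[c])), c) for c in candidates]
    return [c for _, c in sorted(scored, reverse=True)[:k]]

rng = random.Random(0)
# A tiny "training corpus" of verbalized walks (in GLAMOR, this text
# would be used to fine-tune the language model).
corpus = [random_walk_to_text("paper:A", 4, rng) for _ in range(3)]
print(top_k("field:RecSys", ["paper:A", "paper:B"], k=1))  # → ['paper:A']
```

In the actual method, the verbalized walks fine-tune a pretrained language model, and its learned embeddings, rather than term counts, drive the top-k ranking; the sketch only shows the data flow.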



    Published In

    RecSys '24: Proceedings of the 18th ACM Conference on Recommender Systems
    October 2024
    1438 pages

    Publisher

    Association for Computing Machinery

    New York, NY, United States


    Author Tags

    1. Attributed Graph Embedding
    2. Citation Recommendation
    3. Cold-start
    4. GLAMOR
    5. Large Language Model
    6. Recommender Systems

    Qualifiers

    • Short-paper
    • Research
    • Refereed limited

    Acceptance Rates

    Overall Acceptance Rate 254 of 1,295 submissions, 20%

