DOI: 10.1145/3639478.3643533
F-CodeLLM: A Federated Learning Framework for Adapting Large Language Models to Practical Software Development

Published: 23 May 2024

Abstract

Large Language Models (LLMs) have revolutionized code intelligence tasks, but their performance in specific software development tasks often requires fine-tuning with task-specific data. However, acquiring such data is challenging due to privacy concerns. We introduce F-CodeLLM, a novel federated learning framework for adapting LLMs to software development tasks while preserving code data privacy. Leveraging federated learning and LoRA-based efficient fine-tuning, F-CodeLLM allows organizations to collaboratively improve LLMs without sharing sensitive data. Our experiments demonstrate that F-CodeLLM achieves comparable results to centralized fine-tuning methods and excels in multi-language environments, marking a significant advancement in the application of LLMs for software engineering.
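The paper itself is not reproduced on this page, but the core mechanism the abstract describes, federated averaging applied only to the small LoRA adapter matrices rather than the full model, can be illustrated with a minimal sketch. All names, shapes, and the plain FedAvg weighting below are assumptions for illustration, not the authors' implementation:

```python
import numpy as np

def fedavg_lora(client_adapters, client_sizes):
    """Weighted FedAvg over LoRA adapter matrices only.

    client_adapters: list of dicts {"A": (r, d) array, "B": (d, r) array},
        one per client; the frozen base model never leaves the server.
    client_sizes: number of local training samples per client, used as
        aggregation weights (standard FedAvg weighting).
    """
    total = sum(client_sizes)
    agg = {}
    for key in ("A", "B"):
        agg[key] = sum(
            (n / total) * adapters[key]
            for adapters, n in zip(client_adapters, client_sizes)
        )
    return agg

# Toy round: three clients, rank-4 adapters for a 16-dim layer.
rng = np.random.default_rng(0)
clients = [{"A": rng.normal(size=(4, 16)), "B": rng.normal(size=(16, 4))}
           for _ in range(3)]
global_adapter = fedavg_lora(clients, client_sizes=[100, 50, 50])

# In LoRA, the effective weight update is B @ A (scaled by alpha / r),
# so only these low-rank factors need to be communicated each round.
delta_w = global_adapter["B"] @ global_adapter["A"]
print(delta_w.shape)  # (16, 16)
```

Communicating only the rank-r factors keeps per-round traffic proportional to r(d_in + d_out) instead of d_in * d_out, which is what makes federated fine-tuning of a large model practical in this setting.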



      Published In

      ICSE-Companion '24: Proceedings of the 2024 IEEE/ACM 46th International Conference on Software Engineering: Companion Proceedings
      April 2024
      531 pages
ISBN: 9798400705021
DOI: 10.1145/3639478
      Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for third-party components of this work must be honored. For all other uses, contact the owner/author(s).


      In-Cooperation

      • Faculty of Engineering of University of Porto

      Publisher

      Association for Computing Machinery

      New York, NY, United States


      Author Tags

      1. code intelligence
      2. federated fine-tuning
      3. large language model
      4. software development

      Qualifiers

      • Poster


      Conference

      ICSE-Companion '24

      Acceptance Rates

      Overall Acceptance Rate 276 of 1,856 submissions, 15%

