DOI: 10.1145/3643795.3648394
Research Article

Learn to Code Sustainably: An Empirical Study on Green Code Generation

Published: 10 September 2024

Abstract

The increasing use of information technology has made data centers a significant source of energy consumption and carbon emissions, and their share is expected to rise with the growing demand for big data analytics, increasing digitization, and the development of large artificial intelligence (AI) models. The need to address the environmental impact of software development has spurred interest in green (sustainable) coding and prompted claims that AI models can deliver energy-efficiency gains. Here, we present an empirical study of green code, together with an overview of green coding practices and of metrics used to quantify the sustainability awareness of AI models. Within this framework, we evaluate the sustainability of code auto-generated by three commercial generative AI language models: GitHub Copilot, OpenAI ChatGPT-3, and Amazon CodeWhisperer. To quantify the sustainability awareness of these models, we propose a definition of a code's "green capacity" based on a set of sustainability metrics, and we compare the performance and green capacity of human-written code against code generated by the three models in response to problem statements ranging from easy to hard. Our findings shed light on the current capacity of AI models to contribute to sustainable software development.
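The abstract does not spell out how "green capacity" is computed, so the Python sketch below is an illustrative assumption rather than the authors' method: it profiles a candidate solution's runtime, peak memory, and CPU package energy (read from the Linux Intel RAPL powercap counter, which requires a supported CPU and read permission on the sysfs file), then scores the candidate against a human-written baseline with equal, hypothetical weights. The names profile, green_capacity, and two_sum, the scoring formula, and the baseline numbers are all ours, not the paper's.

    # Illustrative sketch only: the scoring formula and weights below are
    # assumptions, not the paper's definition of "green capacity".
    import time
    import tracemalloc

    # Linux/Intel-specific cumulative energy counter (microjoules); reading
    # it may need elevated permissions, and the counter wraps on long runs.
    RAPL_ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"

    def read_rapl_uj():
        with open(RAPL_ENERGY_FILE) as f:
            return int(f.read().strip())

    def profile(func, *args, **kwargs):
        """Run func once; return (result, seconds, peak_bytes, joules)."""
        tracemalloc.start()
        e0 = read_rapl_uj()
        t0 = time.perf_counter()
        result = func(*args, **kwargs)
        elapsed = time.perf_counter() - t0
        joules = (read_rapl_uj() - e0) / 1e6
        _, peak = tracemalloc.get_traced_memory()
        tracemalloc.stop()
        return result, elapsed, peak, joules

    def green_capacity(elapsed, peak, joules, baseline):
        """Hypothetical score: mean ratio of baseline cost to measured cost.
        Values above 1 mean the candidate is greener than the baseline."""
        b_elapsed, b_peak, b_joules = baseline
        return (b_elapsed / elapsed + b_peak / peak + b_joules / joules) / 3

    # Stand-in for an auto-generated LeetCode-style solution.
    def two_sum(nums, target):
        seen = {}
        for i, x in enumerate(nums):
            if target - x in seen:
                return [seen[target - x], i]
            seen[x] = i

    _, t, m, j = profile(two_sum, list(range(100_000)), 199_997)
    baseline = (0.05, 9_000_000, 0.5)  # made-up human-baseline measurements
    print(f"{t:.4f} s, {m} B peak, {j:.3f} J")
    print(f"green capacity vs. baseline: {green_capacity(t, m, j, baseline):.2f}")

Measuring software energy through RAPL counters is a common approach in the green-software literature; a production harness would repeat runs, subtract idle power, and guard against counter wraparound.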


Cited By

(2024) Carbon Footprint Evaluation of Code Generation through LLM as a Service. 2024 Stuttgart International Symposium on Automotive and Engine Technology, pp. 230-241. DOI: 10.1007/978-3-658-45010-6_15. Online publication date: 30 June 2024.


Published In

LLM4Code '24: Proceedings of the 1st International Workshop on Large Language Models for Code
April 2024, 144 pages
ISBN: 9798400705793
DOI: 10.1145/3643795

In-Cooperation: Faculty of Engineering of University of Porto

Publisher: Association for Computing Machinery, New York, NY, United States
