Abstract
Graph Neural Networks (GNNs) are neural networks that operate on graph-structured data, exploiting the connections between nodes to learn and process information. Their performance depends strongly on the choice of loss function and optimizer, and selecting these automatically lets a model adapt to the characteristics of each dataset, saving time and reducing the need for domain-specific expertise. In this study, we trained Graph Convolutional Network (GCN) and Graph Attention Network (GAT) models for node classification on three benchmark datasets, using performance metrics to automatically select the best loss function and optimizer. A learning rate scheduler that adjusts the learning rate based on the model's performance further improved the results. We evaluated the models with multiple metrics and report the best loss function and performance metric for each, enabling users to compare the models against alternatives. Our approach achieved state-of-the-art results, underscoring the importance of selecting appropriate loss and optimizer functions. In addition, we developed a real-time visualization of the GCN model during training, giving users detailed insight into the model's behavior. Overall, this study provides a comprehensive understanding of GNNs and their application to graph-structured data, with a particular focus on real-time visualization of GNN behavior during training.
This research was supported by the research training group “Dataninja” (Trustworthy AI for Seamless Problem Solving: Next Generation Intelligence Joins Robust Data Analysis) funded by the German federal state of North Rhine-Westphalia and the project SAIL. SAIL is funded by the Ministry of Culture and Science of the State of North Rhine-Westphalia under grant no. NW21-059B.
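The metric-driven selection described in the abstract can be sketched as a simple search over candidate loss/optimizer pairs, keeping whichever pair yields the best validation score. The sketch below is illustrative only: `select_best`, the candidate names, and the mock evaluation function are hypothetical stand-ins, not the paper's actual implementation (in practice the callback would train a GCN/GAT and return its validation accuracy).

```python
def select_best(candidates, train_and_evaluate):
    """Try every (loss, optimizer) pair and keep the one with the
    highest validation score reported by train_and_evaluate."""
    best = None
    for loss_name in candidates["losses"]:
        for opt_name in candidates["optimizers"]:
            score = train_and_evaluate(loss_name, opt_name)
            if best is None or score > best[0]:
                best = (score, loss_name, opt_name)
    return best

# Toy stand-in for training a model and reporting validation accuracy;
# a real implementation would run a full training loop per pair.
MOCK_SCORES = {
    ("cross_entropy", "adam"): 0.81,
    ("cross_entropy", "sgd"): 0.74,
    ("nll", "adam"): 0.79,
    ("nll", "sgd"): 0.70,
}

def mock_train_and_evaluate(loss_name, opt_name):
    return MOCK_SCORES[(loss_name, opt_name)]

candidates = {"losses": ["cross_entropy", "nll"],
              "optimizers": ["adam", "sgd"]}
score, loss_name, opt_name = select_best(candidates, mock_train_and_evaluate)
print(loss_name, opt_name, score)  # → cross_entropy adam 0.81
```

The same loop extends naturally to a performance-based learning rate scheduler: the per-pair training routine can lower the learning rate whenever the validation metric plateaus, as the paper does.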
References
Zhou, J., et al.: Graph neural networks: a review of methods and applications. AI Open 1, 57–81 (2020)
Zheng, L., Zhou, J., Chen, C., Wu, B., Wang, L., Zhang, B.: ASFGNN: automated separated-federated graph neural network. Peer-to-Peer Netw. Appl. 14(3), 1692–1704 (2021)
Niknam, T., Narimani, M., Aghaei, J., Azizipanah-Abarghooee, R.: Improved particle swarm optimisation for multi-objective optimal power flow considering the cost, loss, emission and voltage stability index. IET Gener. Transm. Distrib. 6(6), 515–527 (2012)
Zhang, S., Tong, H., Xu, J., Maciejewski, R.: Graph convolutional networks: a comprehensive review. Comput. Social Netw. 6(1), 1–23 (2019). https://doi.org/10.1186/s40649-019-0069-y
Li, Q., Han, Z., Wu, X.M.: Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI Conference on Artificial Intelligence, vol. 32 (2018)
Danel, T., et al.: Spatial graph convolutional networks. In: Yang, H., Pasupa, K., Leung, A.C.-S., Kwok, J.T., Chan, J.H., King, I. (eds.) ICONIP 2020. CCIS, vol. 1333, pp. 668–675. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-63823-8_76
Wang, X., Zhu, M., Bo, D., Cui, P., Shi, C., Pei, J.: AM-GCN: adaptive multi-channel graph convolutional networks. In: Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pp. 1243–1253 (2020)
Veličković, P., Cucurull, G., Casanova, A., Romero, A., Liò, P., Bengio, Y.: Graph Attention Networks. In: International Conference on Learning Representations (2018)
Sanaullah, Baig, H., Madsen, J., Lee, J.A.: A parallel approach to perform threshold value and propagation delay analyses of genetic logic circuit models. ACS Synth. Biol. 9(12), 3422–3428 (2020)
Sanaullah, Koravuna, S., Rückert, U., Jungeblut, T.: SNNs model analyzing and visualizing experimentation using RAVSim. In: Engineering Applications of Neural Networks: 23rd International Conference, EAAAI/EANN 2022, Chersonissos, Crete, Greece, June 17–20, 2022, Proceedings, pp. 40–51. Springer (2022)
Yan, W., Culp, C., Graf, R.: Integrating BIM and gaming for real-time interactive architectural visualization. Autom. Constr. 20(4), 446–458 (2011)
Sen, P., Namata, G., Bilgic, M., Getoor, L., Galligher, B., Eliassi-Rad, T.: Collective classification in network data. AI Mag. 29(3), 93–93 (2008)
Gabruseva, T., Poplavskiy, D., Kalinin, A.: Deep learning for automatic pneumonia detection. In: Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition Workshops, pp. 350–351 (2020)
McCloskey, D.N.: The loss function has been mislaid: the rhetoric of significance tests. Am. Econ. Rev. 75(2), 201–205 (1985)
Code availability: https://github.com/Rao-Sanaullah/GNN-Classification-with-Automatic-Loss-Function-and-Optimizer-Selection. Accessed Apr 2023
Availability
In this study, we have made the code used in our experiments publicly available on GitHub [15], allowing other researchers to replicate our experiments and build upon our work. To ensure the reproducibility of our results, all test cases were generated from publicly available datasets.
Copyright information
© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Sanaullah, Koravuna, S., Rückert, U., Jungeblut, T. (2023). Streamlined Training of GCN for Node Classification with Automatic Loss Function and Optimizer Selection. In: Iliadis, L., Maglogiannis, I., Alonso, S., Jayne, C., Pimenidis, E. (eds) Engineering Applications of Neural Networks. EANN 2023. Communications in Computer and Information Science, vol 1826. Springer, Cham. https://doi.org/10.1007/978-3-031-34204-2_17
DOI: https://doi.org/10.1007/978-3-031-34204-2_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-34203-5
Online ISBN: 978-3-031-34204-2