Knowledge Graphs: State-of-the-Art and Applications

A special issue of Applied Sciences (ISSN 2076-3417). This special issue belongs to the section "Computing and Artificial Intelligence".

Deadline for manuscript submissions: 20 February 2025 | Viewed by 6797

Special Issue Editors


Dr. Cristiano Fugazza
Guest Editor
Institute for Electromagnetic Sensing of the Environment, National Research Council of Italy (CNR-IREA), Via Bassini 15, I-20133 Milan, Italy
Interests: semantic web; open data; FAIR technologies; ontologies; spatial data infrastructures; health informatics

Dr. Gloria Bordogna
Guest Editor
CNR IREA, Via Bassini 15, 20133 Milan, Italy
Interests: fuzzy logic and soft computing for the representation and management of imprecision and uncertainty of textual and geographic information; volunteered geographic information user-driven quality assessment in citizen science; crowdsourced information spatiotemporal analytics; information retrieval on the web; flexible query languages for information retrieval and geographic information systems; ill-defined environmental knowledge representation and management; multisource geographic information fusion and synthesis

Special Issue Information

Dear Colleagues,

It has taken more than 50 years since the inception of the relational model for a novel paradigm to gain momentum, driven by the increasingly interconnected and networked nature of modern data, the need for incremental modeling of data structures, the ease of integration with the richer semantics of ontologies, and the promise of explainable artificial intelligence. Knowledge graphs provide the theoretical underpinning for such data representation strategies and the link to more expressive knowledge representation formalisms. This Special Issue aims to portray the status of and outlook for this technology through both seminal contributions on its underlying techniques and a selected range of case studies.

Relevant topics include, but are not limited to, the following:

  • Automatic KG definition obtained from collections of unstructured heterogeneous documents and data,
  • Flexible KG querying,
  • KG interactive visualization,
  • KG hierarchical summarization,
  • KGs and ontology integration,
  • KGs and geospatial data management,
  • KGs as a means to support XAI.

Dr. Cristiano Fugazza
Dr. Gloria Bordogna
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Applied Sciences is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • automatic KG definition obtained from collections of unstructured heterogeneous documents and data
  • flexible KG querying
  • KG interactive visualization
  • KG hierarchical summarization
  • KGs and ontology integration
  • KGs and geospatial data management
  • KGs as a means to support XAI

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (4 papers)


Research


22 pages, 4268 KiB  
Article
Multi-Head Self-Attention-Enhanced Prototype Network with Contrastive–Center Loss for Few-Shot Relation Extraction
by Jiangtao Ma, Jia Cheng, Yonggang Chen, Kunlin Li, Fan Zhang and Zhanlei Shang
Appl. Sci. 2024, 14(1), 103; https://doi.org/10.3390/app14010103 - 21 Dec 2023
Cited by 2 | Viewed by 1319
Abstract
Few-shot relation extraction (FSRE) is a critical task in natural language processing (NLP) that involves learning relationship characteristics from limited instances to enable the accurate classification of new relations. Existing research primarily concentrates on using prototype networks for FSRE and enhancing their performance by incorporating external knowledge. However, these methods disregard the potential interactions among different prototype networks, and each prototype network can only learn and infer from its limited instances, which may limit the robustness and reliability of the prototype representations. To tackle these concerns, this paper introduces a novel prototype network called SACT (multi-head self-attention and contrastive–center loss), aimed at obtaining more comprehensive and precise interaction information from other prototype networks to bolster the reliability of the prototype network. First, SACT employs a multi-head self-attention mechanism to capture interaction information among the prototypes of traditional prototype networks, reducing, through information aggregation, the noise introduced by unknown categories with few samples. Furthermore, SACT introduces a new loss function, the contrastive–center loss, which tightly clusters samples from the same relationship category around their class center in the feature space while dispersing samples from different relationship categories. Through extensive experiments on FSRE datasets, this paper demonstrates the outstanding performance of SACT, providing strong evidence for its effectiveness and practicality.
(This article belongs to the Special Issue Knowledge Graphs: State-of-the-Art and Applications)
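The contrastive–center idea described in the abstract can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch-style formulation based only on the abstract (pull each sample toward its own class prototype, push it away from the other prototypes); the function name, margin parameter, and exact formulation are assumptions for illustration, not the authors' code.

    # Hypothetical sketch of a contrastive-center style loss, based only on the
    # abstract's description: pull each sample toward its class center (prototype)
    # while pushing it away from the centers of other classes.
    import torch

    def contrastive_center_loss(embeddings, labels, centers, margin=1.0):
        # embeddings: (B, D) sample representations; labels: (B,) class indices;
        # centers: (C, D) class prototypes
        dists = torch.cdist(embeddings, centers)            # (B, C) Euclidean distances
        pos = dists.gather(1, labels.unsqueeze(1))          # distance to own class center
        # mask out the positive column, average distances to all other centers
        neg_mask = torch.ones_like(dists).scatter_(1, labels.unsqueeze(1), 0.0)
        neg = (dists * neg_mask).sum(1, keepdim=True) / neg_mask.sum(1, keepdim=True)
        # encourage small positive distance and negative distance above the margin
        return (pos + torch.clamp(margin - neg, min=0.0)).mean()
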
Figures:
Figure 1: A demonstration of the 3-way 1-shot scenario. Words with underscores signify entity mentions. The model is trained on support set instances to predict the relationship between the two known entities in the query set.
Figure 2: An illustration of the impact of prototype interaction information on query instances. Gray spheres represent prototype networks, while spheres of other colors represent representations of support instances. The green spheres with question marks represent representations of query instances. (a) Originally, the representation of the query instance closely resembles the blue prototype. (b) After interacting with information from different prototypes, the position of the query instance representation changes, thereby modifying the prototypes.
Figure 3: The architecture of the proposed SACT for the FSRE task. SACT first feeds the input relation information, support set, and query set into a BERT encoder to obtain the relation information representation (upper part of the sentence encoder module) and the sentence representations (lower part). The prototype network is then enhanced through a multi-head self-attention mechanism and optimized using the contrastive–center loss function. In the diagram, relation information is represented by triangles, the support set by circles, the query set by circles with question marks, and prototype representations by pentagrams.
Figure 4: An example of a sentence representation generated by the sentence encoder, illustrating how an input sentence is transformed into a numerical representation that can be used for further processing.
Figure 5: Schematic diagram of the prototype enhancement process. The initial prototypes of the N categories are fed into a multi-head self-attention mechanism to obtain enhanced prototypes, which are then combined with the initial prototypes to form the final prototypes.
Figure 6: Diagram of the contrastive–center loss. The upper-left corner shows the basic prototype representations. Under the center loss, the distance between positive samples is significantly reduced; under the contrastive loss, the distance between class centers and negative samples increases somewhat. The contrastive–center loss used by SACT both reduces the distance between positive samples and increases the distance between centers and negative samples.
Figure 7: Comparison of three CNN encoder-based models with SACT on the FewRel 1.0 dataset.
Figure 8: Comparison between HCPR and SACT in the 1-shot setting.
Figure 9: Comparison of SACT with other BERT-based models on the FewRel 1.0 dataset.
Figure 10: Comparison of SACT with other CP-based models on the FewRel 1.0 dataset.
Figure 11: Comparison of SACT with other prototype network models on the FewRel 1.0 dataset.
Figure 12: Comparison of SACT with other models on the FewRel 2.0 dataset.

19 pages, 2313 KiB  
Article
Few-Shot Knowledge Graph Completion Model Based on Relation Learning
by Weijun Li, Jianlai Gu, Ang Li, Yuxiao Gao and Xinyong Zhang
Appl. Sci. 2023, 13(17), 9513; https://doi.org/10.3390/app13179513 - 22 Aug 2023
Viewed by 1501
Abstract
Considering the complexity of entity pair relations and the information contained in the target neighborhood of few-shot knowledge graphs (KGs), existing few-shot KG completion methods generally suffer from insufficient relation representation learning capabilities and neglect the contextual semantics of entities. To tackle these problems, we propose a Few-shot Relation Learning-based Knowledge Graph Completion model (FRL-KGC). First, a gating mechanism is introduced during the aggregation of higher-order neighborhood information of entities, enriching the central entity representation while reducing the adverse effects of noisy neighbors. Second, during the relation representation learning stage, a more accurate relation representation is learned by exploiting the correlation between entity pairs in the reference set. Finally, an LSTM structure is incorporated into the Transformer learner to enhance its ability to learn the contextual semantics of entities and relations and to predict new factual knowledge. We conducted comparative experiments on the publicly available NELL-One and Wiki-One datasets, comparing FRL-KGC with six few-shot knowledge graph completion models and five traditional knowledge graph completion models on five-shot link prediction. The results show that FRL-KGC outperforms all comparison models in terms of the MRR, Hits@10, Hits@5, and Hits@1 metrics.
(This article belongs to the Special Issue Knowledge Graphs: State-of-the-Art and Applications)
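As a rough illustration of the gated neighborhood aggregation mentioned in the abstract, a minimal sketch follows. The class name, tensor shapes, and the simple mean aggregation are assumptions made for illustration; this does not reproduce the FRL-KGC encoder.

    # Hypothetical sketch of gated neighbor aggregation: a gate decides how much
    # aggregated neighborhood information enriches the central entity embedding.
    import torch
    import torch.nn as nn

    class GatedNeighborAggregator(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.gate = nn.Linear(2 * dim, dim)

        def forward(self, entity, neighbors):
            # entity: (B, D); neighbors: (B, N, D) embeddings of (higher-order) neighbors
            agg = neighbors.mean(dim=1)                   # simple mean aggregation
            g = torch.sigmoid(self.gate(torch.cat([entity, agg], dim=-1)))
            # gate value near 0 suppresses noisy neighborhoods, near 1 admits them
            return g * agg + (1 - g) * entity
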
Figures:
Figure 1: An example of a five-shot KGC task.
Figure 2: Overview of the FRL-KGC framework.
Figure 3: The main structure of the high-order neighborhood entity encoder based on the gating mechanism.
Figure 4: The main structure of the relation representation encoder.
Figure 5: The main structure of a learning framework composed of a Transformer and an LSTM.
Figure 6: Schematic diagram of the matching process calculation.
Figure 7: Impact of the few-shot size K on the performance of FKGC methods on the Wiki-One dataset.

20 pages, 2114 KiB  
Article
Sparse Subgraph Prediction Based on Adaptive Attention
by Weijun Li, Yuxiao Gao, Ang Li, Xinyong Zhang, Jianlai Gu and Jintong Liu
Appl. Sci. 2023, 13(14), 8166; https://doi.org/10.3390/app13148166 - 13 Jul 2023
Cited by 2 | Viewed by 1545
Abstract
Link prediction is a crucial problem in the analysis of graph-structured data, and graph neural networks (GNNs) have proven to be effective in addressing it. However, the computational and temporal costs associated with large-scale graphs remain a concern. This study introduces a novel method for link prediction called Sparse Subgraph Prediction Based on Adaptive Attention (SSP-AA). The method generates sparse subgraphs and uses Graph SAmple and aggreGatE (GraphSAGE) for prediction, aiming to reduce computation and time costs while providing a foundation for future exploration of large-scale graphs. Certain key issues in GraphSAGE are addressed by integrating an adaptive attention mechanism and a jumping knowledge module into the model. To address the issue of adaptive weight distribution in GraphSAGE, an aggregation function based on the attention mechanism is employed. This modification enables the model to distribute weights adaptively among neighboring nodes, significantly improving its ability to capture node relationships. Furthermore, to tackle the common issue of over-smoothing in GNNs, a jumping knowledge module is integrated, enabling information sharing across different layers and providing the model with the flexibility to select the appropriate representation depth for the situation at hand. By enhancing the quality of node representations, SSP-AA further boosts the performance of GraphSAGE in various prediction tasks involving graph-structured data.
(This article belongs to the Special Issue Knowledge Graphs: State-of-the-Art and Applications)
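To make the adaptive attention aggregation and jumping knowledge ideas concrete, here is a minimal sketch under assumed shapes and names; it mirrors the general ideas in the abstract rather than the SSP-AA implementation itself.

    # Hypothetical sketch: attention-weighted neighbor aggregation (adaptive
    # weights over neighbors) plus a jumping-knowledge style readout that
    # concatenates per-layer representations.
    import torch
    import torch.nn as nn

    class AttentiveSAGELayer(nn.Module):
        def __init__(self, dim):
            super().__init__()
            self.score = nn.Linear(2 * dim, 1)
            self.update = nn.Linear(2 * dim, dim)

        def forward(self, h, neighbors):
            # h: (B, D) node states; neighbors: (B, N, D) neighbor states
            q = h.unsqueeze(1).expand_as(neighbors)
            alpha = torch.softmax(self.score(torch.cat([q, neighbors], dim=-1)), dim=1)
            agg = (alpha * neighbors).sum(dim=1)          # adaptive weighted sum
            return torch.relu(self.update(torch.cat([h, agg], dim=-1)))

    def jumping_knowledge(layer_outputs):
        # concatenating all layer outputs lets a downstream predictor pick the
        # effective representation depth, mitigating over-smoothing
        return torch.cat(layer_outputs, dim=-1)
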
Figures:
Figure 1: (a) Wandering from the target node. (b) Closed enclosing subgraph with labels.
Figure 2: Weight normalization.
Figure 3: Jumping knowledge connection.
Figure 4: LSTM structure diagram.
Figure 5: LSTM attention aggregation.
Figure 6: Model framework diagram.
Figure 7: Performance of different models on multiple datasets.
Figure 8: The AUC values of the SSP-AA model under different high-order neighbor parameters.


Review


21 pages, 6613 KiB  
Review
Event Knowledge Graph: A Review Based on Scientometric Analysis
by Shishuo Xu, Sirui Liu, Changfeng Jing and Songnian Li
Appl. Sci. 2023, 13(22), 12338; https://doi.org/10.3390/app132212338 - 15 Nov 2023
Cited by 1 | Viewed by 1473
Abstract
In the last decade, the event knowledge graph field has received significant attention from both the academic and industry communities, leading to a proliferation of scientific papers across diverse journals, countries, and disciplines. However, a comprehensive and systematic survey of the recent literature that traces how the event knowledge graph field has evolved over time is lacking. To address this gap, we performed scientometric analyses using the CiteSpace software package (version 6.2.R4) to extract and analyze data from the Web of Science database, including information about authors, journals, countries, and keywords. We then constructed four networks: the author co-citation network, the journal co-citation network, the collaborative country network, and the keyword co-occurrence network. Analyzing these networks allowed us to identify core authors, research hotspots, landmark journals, and national collaborations, as well as emerging trends, by assessing the central nodes and the nodes with strong citation bursts. Our contribution mainly lies in providing a scientometric way to quantitatively capture the research patterns of the last decade in the event knowledge graph field. Our work provides not only a structured view of the state-of-the-art literature but also insights into future trends in the event knowledge graph field, aiding researchers in conducting further research in this area.
(This article belongs to the Special Issue Knowledge Graphs: State-of-the-Art and Applications)
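As a small illustration of the kind of keyword co-occurrence network analyzed in the review, the sketch below builds one with networkx rather than CiteSpace; the records, keywords, and hotspot heuristic are purely illustrative assumptions, not data or methodology from the paper.

    # Hypothetical sketch: build a keyword co-occurrence network from per-paper
    # keyword lists and rank keywords by weighted degree as a rough hotspot proxy.
    import itertools
    import networkx as nx

    records = [
        ["knowledge graph", "event extraction", "NLP"],
        ["knowledge graph", "scientometrics"],
    ]

    G = nx.Graph()
    for keywords in records:
        for a, b in itertools.combinations(sorted(set(keywords)), 2):
            # edge weight counts how often two keywords appear in the same paper
            w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)

    # higher weighted degree roughly indicates a more central (hotter) keyword
    hotspots = sorted(G.degree(weight="weight"), key=lambda kv: kv[1], reverse=True)
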
Figures:
Figure 1: A typhoon event knowledge graph.
Figure 2: An overall workflow for constructing an event knowledge graph.
Figure 3: An overall framework for conducting the scientometric survey in the event knowledge graph field.
Figure 4: Statistics on the number of papers published each year.
Figure 5: Visualization of the merged author co-citation network for the years 2012 to 2022.
Figure 6: Cluster analysis results of the author co-citation network for the years 2012 to 2022.
Figure 7: Visualization of the journal co-citation network for the years 2012 to 2022.
Figure 8: Clusters of the journal co-citation network for the years 2012 to 2022.
Figure 9: Visualization of the collaborative country network for the years 2012 to 2022.
Figure 10: The citation burst history of a country over the timespan 2012 to 2022.
Figure 11: Visualization of the keyword co-occurrence network for the years 2012 to 2022.