AHD-SLE: Anomalous Hyperedge Detection on Hypergraph Symmetric Line Expansion
Figure 1. Hypergraph expansions. (a) A hypergraph with 7 nodes and 3 hyperedges. (b) Star expansion. (c) Clique expansion. (d) Line expansion. (e) Symmetric line expansion.
Figure 2. The bijection between node–hyperedge pairs and SLE graph nodes.
Figure 3. Anomalous Hyperedge Detection method on hypergraph Symmetric Line Expansion (AHD-SLE).
Figure 4. Impact of different anomalous proportions on anomaly detection. (a) Impact on AUC; (b) impact on R@k.
Figure 5. Impact of different w_v and w_e on detection performance. The x-axis is the value of w_v; accordingly, w_e = 2 − w_v. The y-axis in (a) is the AUC value and in (b) the R@k value under different w_v.
Figure 6. Impact of different GCN hidden sizes on detection performance. (a) Impact on AUC; (b) impact on R@k.
Abstract
1. Introduction
- (1)
- We construct a novel hyperedge representation learning framework based on SLE graphs, which can preserve the higher-order information of hypergraphs.
- (2)
- We propose an Anomalous Hyperedge Detection method on hypergraph Symmetric Line Expansion (AHD-SLE). AHD-SLE uses the hyperedge embedding representations learned through SLE to detect anomalous hyperedges.
- (3)
- Experiments on five different types of real-world hypergraph datasets show that AHD-SLE clearly outperforms the baseline algorithms in both AUC and Recall metrics.
2. Related Works
2.1. Anomaly Edge and Hyperedge Detection
2.2. Hypergraph Representation Learning
3. Preliminaries
3.1. Problem Statement
3.2. Symmetric Line Expansion
- Each incident node–hyperedge pair in the hypergraph is mapped to a “line node” in SLE.
- “Line nodes” are connected if they have the same node or hyperedge.
- The incident node–hyperedge pair is bijective with the “line node” of the SLE. As a result, the hypergraph can also be mapped back from its corresponding SLE without the loss of higher-order information. That is, the SLE can preserve the higher-order information of the hypergraph.
- In SLE, the nodes are isomorphic, while the edges are heterogeneous. There are two types of edges: edges formed by common nodes and edges formed by common hyperedges. Due to the previously mentioned symmetric structure of the hypergraph, these two types of edges in SLE can be treated as isomorphic for information aggregation.
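As a concrete illustration, the two construction rules above can be sketched in a few lines of Python (a minimal sketch with names of our own choosing, not the authors' implementation):

```python
from itertools import combinations

def build_sle(hyperedges):
    """Build the Symmetric Line Expansion (SLE) of a hypergraph.

    hyperedges: dict mapping a hyperedge id to its member nodes.
    Each incident (node, hyperedge) pair becomes one "line node";
    two line nodes are adjacent when they share a node or a hyperedge.
    """
    line_nodes = [(v, e) for e, nodes in hyperedges.items() for v in nodes]
    edges = set()
    for (i, (v1, e1)), (j, (v2, e2)) in combinations(enumerate(line_nodes), 2):
        if v1 == v2 or e1 == e2:  # common node or common hyperedge
            edges.add((i, j))
    return line_nodes, sorted(edges)

# Toy hypergraph: e1 = {1, 2, 3}, e2 = {3, 4} -> 5 line nodes;
# (3, "e1") and (3, "e2") are linked because they share node 3.
line_nodes, sle_edges = build_sle({"e1": [1, 2, 3], "e2": [3, 4]})
```

Because the pair-to-line-node mapping is bijective, the original hypergraph can be recovered by grouping line nodes by their hyperedge component, which is exactly the lossless property claimed above.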
4. Method
- (1)
- Map the hypergraph into its SLE graph and construct the feature mapping matrices; then calculate the SLE node feature matrix using the node map matrix;
- (2)
- Feed the SLE’s feature matrix and adjacency matrix into the GCN network for node feature aggregation, and then calculate the hyperedge embedding vector using the hyperedge backmapping matrix;
- (3)
- Design the anomaly scoring function to calculate the anomaly score for each hyperedge embedding vector; hyperedges with lower scores are considered anomalous.
4.1. Hypergraph Expansion
- Node map matrix: This maps the hypergraph node features to SLE nodes. If the node part of an SLE node is v, the features of v are used directly as the features of that SLE node. The SLE node feature matrix derived from the hypergraph nodes can then be calculated by multiplying the node map matrix with the hypergraph node feature matrix.
- Node backmap matrix: This maps SLE node features back to hypergraph nodes. Since a hypergraph node v may be converted to multiple SLE nodes, all SLE nodes whose node portion is v need to be aggregated in the backmapping. Because each such SLE node also contains the hyperedge in which v is located, the inverse of the hyperedge degree is used as the aggregation weight.
- Hyperedge map matrix: This maps the hyperedge features to SLE nodes. If the edge part of an SLE node is e, the features of e are used directly as the features of that SLE node. The SLE node feature matrix derived from the hyperedges can then be calculated by multiplying the hyperedge map matrix with the hyperedge feature matrix.
- Hyperedge backmap matrix: This maps SLE node features back to hyperedges. Similar to the node backmapping, a hyperedge e may be mapped to multiple SLE nodes, and the inverse of the hypergraph node degree is used as the feature aggregation weight.
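Under the definitions above, the four matrices can be sketched as follows (our own minimal NumPy rendering, assuming nodes are numbered 0..n-1; the weights follow the inverse-degree rule described in the text, with no extra normalization):

```python
import numpy as np

def mapping_matrices(hyperedges, num_nodes):
    """Sketch of the four SLE mapping matrices.

    Line nodes are incident (node, hyperedge) pairs. The backmapping
    weights are the inverse hyperedge degree |e| (node backmap) and the
    inverse node degree d(v) (hyperedge backmap), as described above.
    """
    edge_ids = list(hyperedges)
    pairs = [(v, e) for e in edge_ids for v in hyperedges[e]]
    n_line, n_edge = len(pairs), len(edge_ids)
    # node degree d(v): number of hyperedges incident to v
    d = {v: sum(v in hyperedges[e] for e in edge_ids) for v in range(num_nodes)}

    Pv = np.zeros((n_line, num_nodes))       # node map: X_sle = Pv @ X_nodes
    Pv_back = np.zeros((num_nodes, n_line))  # node backmap
    Pe = np.zeros((n_line, n_edge))          # hyperedge map
    Pe_back = np.zeros((n_edge, n_line))     # hyperedge backmap
    for i, (v, e) in enumerate(pairs):
        j = edge_ids.index(e)
        Pv[i, v] = 1.0
        Pv_back[v, i] = 1.0 / len(hyperedges[e])  # inverse hyperedge degree
        Pe[i, j] = 1.0
        Pe_back[j, i] = 1.0 / d[v]                # inverse node degree
    return Pv, Pv_back, Pe, Pe_back
```

For the toy hypergraph {e1: [0, 1], e2: [1, 2]}, node 1 appears in two line nodes, and both its backmap weights are 1/2 because each containing hyperedge has degree 2.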
4.2. Hyperedge Representation Learning
- (1)
- Node feature mapping: Map the node features of the hypergraph to the node features of the SLE.
- (2)
- Node information aggregation: In SLE there are two types of neighbors: those based on common nodes and those based on common hyperedges. To aggregate information more effectively, different aggregation weights are assigned to these two types of neighbors, and graph convolution operations are then used to aggregate neighborhood features. Self-loops of both edge types are added to the nodes during information aggregation, and the convolution is applied in matrix form.
- (3)
- Hyperedge feature back-mapping: Backmap the line node features to the hyperedges using the hyperedge backmap matrix.
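A single aggregation step of this scheme might look as follows. This is a sketch under our own simplifications: `w_v` and `w_e` weight the common-node and common-hyperedge neighbors (the paper constrains w_e = 2 − w_v), and the symmetric normalization is a standard GCN choice, not necessarily the authors' exact formula:

```python
import numpy as np

def sle_gcn_layer(A_node, A_edge, X, W, w_v=1.0, w_e=1.0):
    """One weighted GCN layer on the SLE graph (illustrative sketch).

    A_node / A_edge: adjacency matrices of the common-node and
    common-hyperedge line edges; X: line-node features; W: layer weights.
    """
    n = len(X)
    A = w_v * A_node + w_e * A_edge + np.eye(n)  # weighted adjacency + self-loops
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    H = D_inv_sqrt @ A @ D_inv_sqrt @ X @ W      # normalized neighborhood aggregation
    return np.maximum(H, 0.0)                    # ReLU activation
```

The hyperedge embeddings are then obtained by left-multiplying the layer output with the hyperedge backmap matrix from Section 4.1.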
4.3. Hyperedge Anomaly Score
4.4. Loss Function
5. Experiments
5.1. Datasets
5.2. Anomaly Injection
5.3. Baselines
- DeepWalk (Code available at https://github.com/shenweichen/graphembedding, (accessed on 4 June 2024)) [24]: This method obtains a node sequence through n-step random walks and then uses word2vec’s skip-gram algorithm to obtain the embedded representation of the nodes.
- LINE (Code available at https://github.com/shenweichen/graphembedding, (accessed on 4 June 2024)) [26]: This method defines the first-order and second-order node similarity and then concatenates them to obtain the embedded representation of the nodes.
- Node2Vec (Code available at https://github.com/shenweichen/graphembedding, (accessed on 4 June 2024)) [25]: This method balances the random walk strategy with hyperparameters during the sampling process and inputs the obtained node sequence into the skip-gram to learn node representations.
- HGNN (Code available at https://github.com/iMoonLab/HGNN, (accessed on 4 June 2024)) [32]: This method extends graph convolution operations to hypergraphs and designs hypergraph convolution operations to learn embedding representations of high-order data structures.
- HyperGCN (Code available at https://github.com/malllabiisc/HyperGCN, (accessed on 4 June 2024)) [35]: This method uses a nonlinear Laplace operator to define the GCN on a hypergraph and then aggregates neighborhood information to obtain the embedded representation of nodes.
- Hyper-SAGNN (Code available at https://github.com/ma-compbio/Hyper-SAGNN, (accessed on 4 June 2024)) [49]: This method combines two graph embedding strategies, random walk and encoder, to design a new self-attention mechanism and obtain embedding representations of nonuniform hypergraphs.
- UniGNN (Code available at https://github.com/OneForward/UniGNN, (accessed on 4 June 2024)) [42]: This method proposes a unified framework to characterize the message-passing process in GNNs and HyperGNNs by defining the update process of the GNN as a two-stage aggregation process.
- NHP-U (Code available at https://drive.google.com/file/d/1z7XwXo5ohTudUTjyS4KfQbqwze24Ry-o/view, (accessed on 4 June 2024)) [48]: This method obtains hyperedge embeddings through a hyperedge-aware GCN layer that aggregates information about the nodes in each hyperedge, together with a max–min-based scoring function.
5.4. Metrics
- The AUC metric is widely used to measure the accuracy of detection methods; here its value falls in the range [0.5, 1]. We randomly pick an anomalous hyperedge and a normal hyperedge and compare their scores. If, among n independent comparisons, the anomalous hyperedge has the higher score n′ times and the two scores are equal n″ times, the AUC value is defined as AUC = (n′ + 0.5·n″)/n.
- Recall evaluates how comprehensive a detection algorithm is, i.e., how many anomalous hyperedges are correctly detected. Here, we focus only on the top-ranked results: we sort the anomaly scores from high to low and select the top k results (k is half the number of anomalous hyperedges) to calculate Recall@k (denoted R@k).
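Both metrics are straightforward to compute. The sketch below is our own illustration; it assumes higher scores mean "more anomalous" and that R@k is measured against the total number of anomalies, and it uses the sampled-comparison AUC described above:

```python
import random

def auc_by_sampling(anom_scores, norm_scores, n=10000, seed=0):
    """AUC = (n' + 0.5 * n'') / n over n random pairwise comparisons,
    where n' counts comparisons in which the anomalous hyperedge scores
    higher and n'' counts ties."""
    rng = random.Random(seed)
    higher = ties = 0
    for _ in range(n):
        a, b = rng.choice(anom_scores), rng.choice(norm_scores)
        higher += a > b
        ties += a == b
    return (higher + 0.5 * ties) / n

def recall_at_k(scores, labels, k):
    """R@k: fraction of all anomalous hyperedges (labels: 1 = anomalous)
    ranked within the top-k anomaly scores."""
    top = sorted(range(len(scores)), key=lambda i: -scores[i])[:k]
    hits = sum(labels[i] for i in top)
    return hits / sum(labels)
```

With perfectly separated scores, `auc_by_sampling` returns 1.0 regardless of the sampled pairs, which is a quick sanity check for an implementation.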
5.5. Implementation Details
5.6. Results
5.6.1. Detection Performance
5.6.2. Algorithmic Robustness
5.6.3. Parameter Sensitivity
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Ding, Q.; Katenka, N.; Barford, P.; Kolaczyk, E.; Crovella, M. Intrusion as (anti) social communication: Characterization and detection. In Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Beijing, China, 12–16 August 2012; pp. 886–894.
- Ott, M.; Cardie, C.; Hancock, J. Estimating the prevalence of deception in online review communities. In Proceedings of the 21st International Conference on World Wide Web, Lyon, France, 16–20 April 2012; pp. 201–210.
- Hu, X.; Li, B.; Zhang, Y.; Zhou, C.; Ma, H. Detecting compromised email accounts from the perspective of graph topology. In Proceedings of the 11th International Conference on Future Internet Technologies, Nanjing, China, 15–17 June 2016; pp. 76–82.
- Milojević, S. Principles of scientific research team formation and evolution. Proc. Natl. Acad. Sci. USA 2014, 111, 3984–3989.
- Pearcy, N.; Crofts, J.J.; Chuzhanova, N. Hypergraph models of metabolism. Int. J. Biol. Vet. Agric. Food Eng. 2014, 8, 752–756.
- Wolf, M.M.; Klinvex, A.M.; Dunlavy, D.M. Advantages to modeling relational data using hypergraphs versus graphs. In Proceedings of the 2016 IEEE High Performance Extreme Computing Conference (HPEC), Waltham, MA, USA, 13–15 September 2016; pp. 1–7.
- Mayfield, M.M.; Stouffer, D.B. Higher-order interactions capture unexplained complexity in diverse communities. Nat. Ecol. Evol. 2017, 1, 0062.
- Yoon, S.E.; Song, H.; Shin, K.; Yi, Y. How much and when do we need higher-order information in hypergraphs? A case study on hyperedge prediction. In Proceedings of the Web Conference 2020, Taipei, Taiwan, 20–24 April 2020; pp. 2627–2633.
- Gao, Y.; Zhang, Z.; Lin, H.; Zhao, X.; Du, S.; Zou, C. Hypergraph learning: Methods and practices. IEEE Trans. Pattern Anal. Mach. Intell. 2020, 44, 2548–2566.
- Krishnagopal, S.; Bianconi, G. Topology and dynamics of higher-order multiplex networks. Chaos Solitons Fractals 2023, 177, 114296.
- Bick, C.; Gross, E.; Harrington, H.A.; Schaub, M.T. What are higher-order networks? SIAM Rev. 2023, 65, 686–731.
- Tian, H.; Zafarani, R. Higher-Order Networks Representation and Learning: A Survey. arXiv 2024, arXiv:2402.19414.
- Antelmi, A.; Cordasco, G.; Polato, M.; Scarano, V.; Spagnuolo, C.; Yang, D. A Survey on Hypergraph Representation Learning. ACM Comput. Surv. 2023, 56, 1–38.
- Zien, J.Y.; Schlag, M.D.; Chan, P.K. Multilevel spectral hypergraph partitioning with arbitrary vertex sizes. IEEE Trans. Comput.-Aided Des. Integr. Circuits Syst. 1999, 18, 1389–1399.
- Sun, L.; Ji, S.; Ye, J. Hypergraph spectral learning for multi-label classification. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Las Vegas, NV, USA, 24–27 August 2008; pp. 668–676.
- Bandyopadhyay, S.; Das, K.; Murty, M.N. Line hypergraph convolution network: Applying graph convolution for hypergraphs. arXiv 2020, arXiv:2002.03392.
- Yang, C.; Wang, R.; Yao, S.; Abdelzaher, T. Semi-supervised hypergraph node classification on hypergraph line expansion. In Proceedings of the 31st ACM International Conference on Information & Knowledge Management, Atlanta, GA, USA, 17–21 October 2022; pp. 2352–2361.
- Akoglu, L.; McGlohon, M.; Faloutsos, C. Oddball: Spotting anomalies in weighted graphs. In Advances in Knowledge Discovery and Data Mining: 14th Pacific-Asia Conference, PAKDD 2010, Hyderabad, India, 21–24 June 2010; pp. 410–421.
- Hooi, B.; Song, H.A.; Beutel, A.; Shah, N.; Shin, K.; Faloutsos, C. Fraudar: Bounding graph fraud in the face of camouflage. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 895–904.
- Yu, W.; Cheng, W.; Aggarwal, C.C.; Zhang, K.; Chen, H.; Wang, W. NetWalk: A flexible deep embedding approach for anomaly detection in dynamic networks. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, London, UK, 19–23 August 2018; pp. 2672–2681.
- Teng, X.; Yan, M.; Ertugrul, A.M.; Lin, Y.R. Deep into hypersphere: Robust and unsupervised anomaly discovery in dynamic networks. In Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence, Stockholm, Sweden, 13–19 July 2018.
- Ding, K.; Li, J.; Bhanushali, R.; Liu, H. Deep anomaly detection on attributed networks. In Proceedings of the 2019 SIAM International Conference on Data Mining, Calgary, AB, Canada, 2–4 May 2019; pp. 594–602.
- Peng, Z.; Luo, M.; Li, J.; Xue, L.; Zheng, Q. A deep multi-view framework for anomaly detection on attributed networks. IEEE Trans. Knowl. Data Eng. 2020, 34, 2539–2552.
- Perozzi, B.; Al-Rfou, R.; Skiena, S. DeepWalk: Online learning of social representations. In Proceedings of the 20th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, New York, NY, USA, 24–27 August 2014; pp. 701–710.
- Grover, A.; Leskovec, J. node2vec: Scalable feature learning for networks. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 855–864.
- Tang, J.; Qu, M.; Wang, M.; Zhang, M.; Yan, J.; Mei, Q. LINE: Large-scale information network embedding. In Proceedings of the 24th International Conference on World Wide Web, Florence, Italy, 18–22 May 2015; pp. 1067–1077.
- Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Wang, L.; Li, C.; Sun, M. Graph neural networks: A review of methods and applications. AI Open 2020, 1, 57–81.
- Xu, L.; Wei, X.; Cao, J.; Yu, P.S. ICANE: Interaction content-aware network embedding via co-embedding of nodes and edges. Int. J. Data Sci. Anal. 2020, 9, 401–414.
- Lee, G.; Choe, M.; Shin, K. BOURNE: Bootstrapped Self-supervised Learning Framework for Unified Graph Anomaly Detection. In Proceedings of the 40th International Conference on Data Engineering (ICDE 2024), Utrecht, The Netherlands, 13–17 May 2024.
- Zhang, M.; Cui, Z.; Jiang, S.; Chen, Y. Beyond link prediction: Predicting hyperlinks in adjacency space. In Proceedings of the AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, 2–7 February 2018.
- Kumar, T.; Darwin, K.; Parthasarathy, S.; Ravindran, B. HPRA: Hyperedge prediction using resource allocation. In Proceedings of the 12th ACM Conference on Web Science, Southampton, UK, 6–10 July 2020; pp. 135–143.
- Feng, Y.; You, H.; Zhang, Z.; Ji, R.; Gao, Y. Hypergraph neural networks. In Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA, 27 January–1 February 2019; pp. 3558–3565.
- Arya, D.; Gupta, D.K.; Rudinac, S.; Worring, M. HyperSAGE: Generalizing inductive representation learning on hypergraphs. arXiv 2020, arXiv:2010.04558.
- Bai, S.; Zhang, F.; Torr, P.H. Hypergraph convolution and hypergraph attention. Pattern Recognit. 2021, 110, 107637.
- Yadati, N.; Nimishakavi, M.; Yadav, P.; Nitin, V.; Louis, A.; Talukdar, P. HyperGCN: A new method of training graph convolutional networks on hypergraphs. In Proceedings of the 33rd International Conference on Neural Information Processing Systems, Vancouver, BC, Canada, 8–14 December 2019; pp. 1511–1522.
- Ranshous, S.; Chaudhary, M.; Samatova, N.F. Efficient outlier detection in hyperedge streams using minhash and locality-sensitive hashing. In Complex Networks & Their Applications VI: Proceedings of the Sixth International Conference on Complex Networks and Their Applications, Lyon, France, 29 November–1 December 2017; pp. 105–116.
- Lee, G.; Choe, M.; Shin, K. HashNWalk: Hash and Random Walk Based Anomaly Detection in Hyperedge Streams. In Proceedings of the Thirty-First International Joint Conference on Artificial Intelligence, Vienna, Austria, 23–29 July 2022; pp. 2129–2137.
- Chun, J.; Lee, G.; Shin, K.; Jung, J. Random walk with restart on hypergraphs: Fast computation and an application to anomaly detection. Data Min. Knowl. Discov. 2023, 38, 1222–1257.
- Su, H.; Yang, Y.; Zhang, X.; Zhao, C. Anomalous social network event detection based on higher-order networks. In Proceedings of the 2022 8th International Conference on Big Data and Information Analytics (BigDIA), Guiyang, China, 24–25 August 2022; pp. 324–329.
- Agarwal, S.; Branson, K.; Belongie, S. Higher order learning with graphs. In Proceedings of the 23rd International Conference on Machine Learning, Pittsburgh, PA, USA, 25–29 June 2006; pp. 17–24.
- Liu, Z.; Tang, B.; Ye, Z.; Dong, X.; Chen, S.; Wang, Y. Hypergraph Transformer for Semi-Supervised Classification. arXiv 2023, arXiv:2312.11385.
- Huang, J.; Yang, J. UniGNN: A Unified Framework for Graph and Hypergraph Neural Networks. In Proceedings of the Thirtieth International Joint Conference on Artificial Intelligence, Virtual Event, 19–27 August 2021; pp. 2563–2569.
- Srinivasan, B.; Zheng, D.; Karypis, G. Learning over families of sets-hypergraph representation learning for higher order tasks. In Proceedings of the 2021 SIAM International Conference on Data Mining (SDM), Virtual Event, 29 April–1 May 2021; pp. 756–764.
- Thonet, T.; Renders, J.M.; Choi, M.; Kim, J. Joint Personalized Search and Recommendation with Hypergraph Convolutional Networks. In Proceedings of the European Conference on Information Retrieval, Stavanger, Norway, 10–14 April 2022; pp. 443–456.
- Jiang, J.; Wei, Y.; Feng, Y.; Cao, J.; Gao, Y. Dynamic Hypergraph Neural Networks. In Proceedings of the 28th International Joint Conference on Artificial Intelligence, Macao, China, 10–16 August 2019; pp. 2635–2641.
- Tan, Y.; Ma, Z.; Zhan, Y.; Mao, Q. Hypergraph induced graph convolutional network for multi-label image recognition. In Proceedings of the 2020 International Conference on Internet of Things and Intelligent Applications (ITIA), Zhenjiang, China, 27–29 November 2020; pp. 1–5.
- Bandyopadhyay, S.; Das, K.; Murty, M.N. Hypergraph attention isomorphism network by learning line graph expansion. In Proceedings of the 2020 IEEE International Conference on Big Data (Big Data), Atlanta, GA, USA, 10–13 December 2020; pp. 669–678.
- Yadati, N.; Nitin, V.; Nimishakavi, M.; Yadav, P.; Louis, A.; Talukdar, P. NHP: Neural hypergraph link prediction. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management, Virtual Event, 19–23 October 2020; pp. 1705–1714.
- Zhang, R.; Zou, Y.; Ma, J. Hyper-SAGNN: A self-attention based graph neural network for hypergraphs. In Proceedings of the 8th International Conference on Learning Representations (ICLR 2020), Addis Ababa, Ethiopia, 26–30 April 2020.
- Kingma, D.P.; Ba, J. Adam: A method for stochastic optimization. arXiv 2014, arXiv:1412.6980.
| Anomaly Type | Category | Method | Key Idea |
|---|---|---|---|
| Anomalous node detection | Traditional method | OddBall [18], Fraudar [19] | Transform graph anomaly detection into traditional anomaly detection. |
| | Network representation learning | NetWalk [20], DeepSphere [21] | Encode the graph structure into an embedding space and identify anomalous nodes. |
| | GNN-based representation learning | DOMINANT [22], ALARM [23] | Encode graph structure and attribute information into an embedding space to identify anomalous nodes. |
| Anomalous edge detection | Heuristic metrics | CN, AA, RA, Katz, Jaccard [16] | Design heuristic node-similarity metrics to measure the degree of edge anomaly. |
| | Network representation learning | DeepWalk [24], Node2Vec [25], LINE [26] | Encode graph structure into low-dimensional vectors whose distance measures node similarity. |
| | GNN-based representation learning | GCN, GAT, GraphSAGE [27] | Encode node attributes and graph structure into low-dimensional vectors whose distance measures node similarity. |
| | Direct method | ICANE [28] | Directly performs edge representation learning for anomalous edge detection. |
| | | BOURNE [29] | Employs self-supervised learning to compare pairs of target nodes and edges to derive anomaly scores. |
| Anomalous hyperedge detection | Heuristic metrics | HCN [30], HKatz [30], HRA [31] | Extend similarity metrics used in ordinary graphs to hypergraphs. |
| | Hypergraph representation learning | HGNN [32], HyperSAGE [33], Hyper-Atten [34], HyperGCN [35] | Encode hypergraph node attributes and structure into low-dimensional vectors whose distance measures node similarity. |
| | Direct method | LSH [36] | Designs hyperedge similarity metrics using locality-sensitive hashing. |
| | | HashNWalk [37] | Designs similarity metrics based on hash functions and random-walk similarity. |
| | | ARCHER [38] | Develops random walk with restart (RWR) on hypergraphs for hypergraph embedding. |
| | | AHIN [39] | Proposes a hypergraph contrastive learning method to capture abnormal event patterns. |
| Category | Method | Task |
|---|---|---|
| Clique expansion | HyperGT [41] | Node classification |
| | HyperSAGE [33] | Node classification |
| | UniGNN [42] | Node classification |
| | FamilySet [43] | Hyperedge classification, hyperedge expansion |
| | HyperSaR [44] | Search and recommendation tasks |
| Star expansion | HGNN [32] | Node classification |
| | DHGNN [45] | Node classification, sentiment prediction |
| | Hyper-Atten [34] | Node classification |
| | HyperGCN [35] | Node classification |
| | HI-GCN [46] | Multilabel image recognition |
| Line expansion | LHCN [16] | Node classification |
| | HAIN [47] | Node classification |
| | LEGCN [17] | Node classification |
| Dataset | Hypergraph Type | Number of Nodes | Number of Hyperedges | Feature Size |
|---|---|---|---|---|
| iAF1260b | Metabolic reactions | 1668 | 2084 | 26 |
| iJO1366 | Metabolic reactions | 1805 | 2253 | 26 |
| USPTO | Organic reactions | 16,293 | 11,433 | 298 |
| dblp | Coauthorship | 20,685 | 30,956 | 3763 |
| Reverb45k | Knowledge graph | 28,798 | 66,914 | 382 |
| Model | iAF1260b AUC | iAF1260b R@k | iJO1366 AUC | iJO1366 R@k | USPTO AUC | USPTO R@k | DBLP AUC | DBLP R@k | Reverb45k AUC | Reverb45k R@k |
|---|---|---|---|---|---|---|---|---|---|---|
| DeepWalk | 0.50 | 0.22 | 0.62 | 0.32 | 0.57 | 0.29 | 0.57 | 0.30 | 0.68 | 0.40 |
| LINE | 0.54 | 0.26 | 0.53 | 0.24 | 0.56 | 0.30 | 0.53 | 0.28 | 0.59 | 0.37 |
| Node2Vec | 0.56 | 0.26 | 0.60 | 0.28 | 0.57 | 0.29 | 0.59 | 0.31 | 0.73 | 0.37 |
| HGNN | 0.63 | 0.28 | 0.61 | 0.31 | 0.68 | 0.27 | 0.66 | 0.35 | 0.64 | 0.33 |
| HyperGCN | 0.62 | 0.28 | 0.59 | 0.32 | 0.69 | 0.30 | 0.67 | 0.37 | 0.67 | 0.32 |
| Hyper-SAGNN | 0.61 | 0.26 | 0.57 | 0.31 | 0.66 | 0.28 | 0.64 | 0.34 | 0.66 | 0.33 |
| UniGNN | 0.60 | 0.30 | 0.58 | 0.30 | 0.54 | 0.24 | 0.63 | 0.32 | 0.56 | 0.30 |
| NHP-U | 0.64 | 0.31 | 0.63 | 0.32 | 0.74 | 0.37 | 0.69 | 0.38 | 0.75 | 0.44 |
| AHD-SLE | 0.69 | 0.36 | 0.67 | 0.35 | 0.77 | 0.38 | 0.69 | 0.39 | 0.78 | 0.45 |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Li, Y.; Yu, H.; Li, H.; Pan, F.; Liu, S. AHD-SLE: Anomalous Hyperedge Detection on Hypergraph Symmetric Line Expansion. Axioms 2024, 13, 387. https://doi.org/10.3390/axioms13060387