Granular neural networks with a reference frame

Y Chen, X Zhang, Y Zhuang, B Yao, B Lin - Knowledge-Based Systems, 2023 - Elsevier
Abstract
Neural networks have strong nonlinear fitting ability, which is a cornerstone of deep learning. However, they suffer from high complexity, many iterations, and long training times, and in deep networks with many hidden layers the gradient tends to vanish. Considering these inherent defects of neural networks, we propose a new neural network model: the Granular Neural network Classifier (GNC). First, a reference frame is constructed by random sampling. Within this reference frame, each training sample is granulated on its features by feature similarity to form conditional granules, and these granules are combined into a conditional granular vector. Meanwhile, the sample's decision feature is expanded into a decision granule. Measures and operations on granules are then defined, several granular activation functions are constructed, and on this basis the GNC is proposed. Furthermore, a loss function for the GNC is presented and its derivative form is proved, from which a gradient descent algorithm for the GNC is designed. Because granules have good structural characteristics, the GNC can be computed in parallel. Finally, several UCI datasets and image datasets are used to evaluate the GNC with respect to the number of network layers, the influence of reference frames, and classification accuracy. The experimental results show that the GNC is correct and effective. In addition, the GNC with fewer hidden layers achieves results comparable to a traditional multi-layer neural network, which alleviates the vanishing gradient problem.
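To make the granulation step described above concrete, the following Python sketch builds a reference frame by random sampling and maps one training sample to a conditional granular vector by collecting, for each feature, the reference values most similar to the sample's value. This is a minimal illustration under stated assumptions, not the paper's implementation: the abstract does not specify the similarity measure, the granule size, or the sampling scheme, so `feature_similarity`, the granule size `k`, and all other names below are hypothetical.

```python
import numpy as np

def build_reference_frame(X, m, seed=0):
    """Randomly sample m rows of X as the reference frame (assumption: plain
    uniform sampling without replacement stands in for the paper's method)."""
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    return X[idx]

def feature_similarity(x_j, r_j):
    """Hypothetical similarity between a sample's j-th feature value and reference
    values, here 1 - |x - r| on features scaled to [0, 1]; the paper's measure may differ."""
    return 1.0 - np.abs(x_j - r_j)

def conditional_granular_vector(x, R, k=3):
    """Granulate sample x feature by feature: the j-th conditional granule collects
    the k reference-frame values most similar to x[j]; the granules are then stacked
    into a conditional granular vector (one granule per feature)."""
    granules = []
    for j in range(len(x)):
        sims = feature_similarity(x[j], R[:, j])
        top = np.argsort(sims)[::-1][:k]   # indices of the k most similar references
        granules.append(R[top, j])         # the j-th conditional granule
    return np.stack(granules)              # shape: (n_features, k)

# Usage on toy data (features assumed scaled to [0, 1]):
X = np.random.default_rng(1).random((100, 4))
R = build_reference_frame(X, m=10)
g = conditional_granular_vector(X[0], R, k=3)
print(g.shape)  # (4, 3): one granule of size 3 per feature
```

The decision granule, the granular activation functions, and the GNC loss and gradient build on such vectors, but their exact definitions require the full paper.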