
CN110427967A - Zero-shot image classification method based on embedded-feature-selection semantic autoencoder - Google Patents

Zero-shot image classification method based on embedded-feature-selection semantic autoencoder

Info

Publication number
CN110427967A
CN110427967A (application CN201910566369.1A)
Authority
CN
China
Prior art keywords
semantic autoencoder
sample
relative attribute
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910566369.1A
Other languages
Chinese (zh)
Inventor
芦楠楠
周丙
张欣茹
胡小忠
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Mining and Technology CUMT
Original Assignee
China University of Mining and Technology CUMT
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Mining and Technology (CUMT)
Priority to CN201910566369.1A
Publication of CN110427967A
Legal status: Pending (current)


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/24 Classification techniques

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Compression Or Coding Systems Of Tv Signals (AREA)

Abstract

The invention discloses a zero-shot image classification method based on an embedded-feature-selection semantic autoencoder. The objective function of the semantic autoencoder is optimized with embedded feature selection so that the mapping matrix becomes as sparse as possible, thereby selecting the low-level image features that match the semantic attributes. In the test phase, the obtained sparse mapping matrix is used to map low-level features to semantic attributes, which automatically suppresses the feature dimensions that have a negative effect and strengthens those that have a positive effect, achieving matched feature selection and improving the accuracy of zero-shot image classification.

Description

Zero-shot image classification method based on embedded-feature-selection semantic autoencoder
Technical field
The invention belongs to the field of pattern recognition, and in particular relates to a zero-shot image classification method.
Background technique
In the field of pattern recognition, zero-shot learning has long been an active research topic. Because manually labelled samples are scarce, the labelled classes cannot cover all object classes, so the zero-shot setting matches real applications better than ordinary multi-class classification. Current zero-shot image classification methods mainly include attribute-based methods, text-based methods, methods based on class similarity, and methods that combine different intermediate layers; among these, the attribute-based methods are currently the most effective.
Attribute-based zero-shot learning uses attributes as an intermediate layer to transfer knowledge from seen classes to unseen classes. Semantic attributes are vectors of descriptive properties such as "has fur", "has a tail" or "has four legs"; because they are shared by seen and unseen classes, they can serve as the intermediate layer of zero-shot learning. Attribute learning currently covers binary attributes and relative attributes. A binary attribute takes the value "0" or "1", indicating the absence or presence of the attribute in a class, whereas a relative attribute takes a continuous value indicating the relative strength of the attribute. The concept and design of relative attributes match human cognition and real conditions more closely, and under the same conditions they yield better classification accuracy than binary attributes; models based on relative attributes therefore work better for zero-shot classification.
Attribute-based zero-shot learning is mainly implemented with two models: the direct attribute prediction model (DAP) and the indirect attribute prediction model (IAP). In DAP the class label is predicted directly by attribute classifiers, whereas in IAP it is obtained indirectly. The key difference between DAP and IAP lies in the classifiers that are learned: IAP needs to learn a multi-class classifier and its test samples can only be assigned to unseen classes, while DAP only needs to learn a group of attribute classifiers and its test samples can be predicted as either seen or unseen classes without restriction.
The difficulty of zero-shot learning is that the test classes do not intersect with the training classes, so the classification results of conventional methods are often biased towards the training labels, which causes the strong-bias problem of zero-shot learning. The semantic autoencoder takes semantic attributes as the intermediate layer, takes low-level image features as input, and reconstructs the input at the output, that is, the encoded data should be recoverable to the original data under the original coding rule as far as possible; this alleviates the strong-bias problem to some extent. However, because images contain both global and local features, and in some scenes also noise, not every dimension of the low-level features contributes positively to learning a given attribute. The interfering feature dimensions reduce the accuracy of attribute learning and therefore degrade zero-shot image classification performance.
Summary of the invention
To solve the technical problems raised in the background art above, the invention proposes a zero-shot image classification method based on an embedded-feature-selection semantic autoencoder.
To achieve the above technical purpose, the technical solution of the present invention is as follows:
A zero-shot image classification method based on an embedded-feature-selection semantic autoencoder comprises the following steps:
(1) Merge the matched selection of features with the training of the semantic autoencoder to obtain the objective function of the embedded-feature-selection semantic autoencoder:
$$\min_{W}\ \tfrac{1}{2}\|X-W^{T}S\|_{F}^{2}+\tfrac{\lambda}{2}\|WX-S\|_{F}^{2}+\gamma\|W\|_{1}$$
where W is the mapping matrix, the superscript T denotes transposition, X is the matrix of low-level sample features, S is the matrix of relative attribute vectors, λ is the weight parameter of the encoding part, and γ is the regularization parameter;
(2) Optimize the objective function obtained in step (1) with the proximal gradient descent method to obtain the optimized embedded-feature-selection semantic autoencoder model;
(3) In the training stage of zero-shot image classification, input the low-level features of the training samples and the corresponding relative attribute vectors into the optimized embedded-feature-selection semantic autoencoder model to obtain the sparse mapping matrix W;
(4) In the test stage, input the low-level features of the test samples and the sparse mapping matrix obtained in step (3), and predict the relative attribute vectors of the test samples;
(5) Assign class labels according to the relative attribute vectors obtained in step (4).
Further, in step (2), the iterative equation of each step is as follows:
$$W_{k+1}^{i}=\begin{cases}Z^{i}-\gamma/L, & Z^{i}>\gamma/L\\ 0, & |Z^{i}|\le\gamma/L\\ Z^{i}+\gamma/L, & Z^{i}<-\gamma/L\end{cases}$$
In the above formula, W_k is the k-th iterate of W; letting Z = W_k − (1/L)·f′(W_k), f′(W_k) is the first derivative of f(W) at step k, L = −SXᵀ + SSᵀW₀ + λW₀XXᵀ − λSXᵀ, and W₀ is the all-ones matrix with the same dimensions as W.
Further, in step (3), cross-validation is first used to randomly select the training-stage sample classes from the image set, and the remaining classes of the image set serve as the test set; then the low-level features of the training classes and the corresponding relative attribute vectors are fed into the optimized embedded-feature-selection semantic autoencoder for training, yielding the sparse mapping matrix.
Further, the relative attribute vectors of the training samples and of the test samples are statistically modelled as Gaussian distributions.
Further, in step (5), the mean and variance of the relative attribute vectors of the training samples and the mean and variance of the relative attribute vectors of the test samples are computed separately, and the class labels are then assigned by maximum a posteriori probability.
The above technical scheme brings the following beneficial effects:
The present invention addresses the problem that the semantic autoencoder cannot actively select the low-level features to be learned in zero-shot classification. By combining embedded feature selection, i.e. adding an L1-norm regularization term to the objective function, it proposes the embedded-feature-selection semantic autoencoder and optimizes it with proximal gradient descent, which improves the accuracy of relative attribute learning and of the final zero-shot classification, and thereby the overall performance. The invention can be used in zero-shot image classification scenarios involving domain shift and strong bias, and also in zero-shot classification of images with noisy backgrounds.
Brief description of the drawings
Fig. 1 is the overall flow chart of the invention;
Fig. 2 is the structure diagram of the embedded-feature-selection semantic autoencoder in the invention.
Specific embodiment
The technical solution of the present invention is described in detail below with reference to the drawings.
As shown in Figure 1, the present invention designs a zero-shot image classification method based on an embedded-feature-selection semantic autoencoder, with the following steps:
Step 1: Add an L1-norm regularization term to the objective function of the semantic autoencoder, merging the matched selection of features with the training of the semantic autoencoder to form the embedded-feature-selection semantic autoencoder. The objective function is:
$$\min_{W}\ \tfrac{1}{2}\|X-W^{T}S\|_{F}^{2}+\tfrac{\lambda}{2}\|WX-S\|_{F}^{2}+\gamma\|W\|_{1}$$
where W is the mapping matrix, the superscript T denotes transposition, X is the matrix of low-level sample features, S is the matrix of relative attribute vectors, λ is the weight parameter of the encoding part, and γ is the regularization parameter.
The semantic autoencoder has three layers: an input layer, an intermediate layer and an output layer. The input layer is the low-level features of the image; the intermediate layer is the semantic attribute layer, i.e. the semantic attributes serve as the intermediate layer; the output layer is obtained by decoding the semantic attribute layer. The output is forced to be as close as possible to the input, which alleviates the strong-bias problem of zero-shot learning to some extent.
As shown in Fig. 2, an L1-norm regularization term is added to the structure of the original semantic autoencoder to constrain the training of the mapping matrix W, yielding a sparse mapping matrix W. Because the mathematical meaning of the L1 norm is the sum of the absolute values of all elements of the matrix, minimizing it keeps the number of non-zero elements in the matrix as small as possible, which produces the sparse mapping matrix W.
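For concreteness, the objective and the gradient of its smooth part can be written out as below. This is a minimal NumPy sketch under the notation above, assuming X is a d×N matrix of low-level features, S a k×N matrix of relative attribute vectors and W a k×d mapping matrix; the names `lam` and `gamma` stand for the encoding weight λ and the L1 regularization weight and are illustrative, not taken from the original text.

```python
import numpy as np

def sae_l1_objective(W, X, S, lam, gamma):
    """Embedded-feature-selection SAE objective:
    0.5*||X - W^T S||_F^2 + 0.5*lam*||W X - S||_F^2 + gamma*||W||_1."""
    decode = 0.5 * np.linalg.norm(X - W.T @ S, "fro") ** 2        # decoder (reconstruction) term
    encode = 0.5 * lam * np.linalg.norm(W @ X - S, "fro") ** 2    # encoder term
    sparsity = gamma * np.abs(W).sum()                            # L1 term that sparsifies W
    return decode + encode + sparsity

def sae_l1_gradient(W, X, S, lam):
    """Gradient of the smooth part f(W), matching -SX^T + SS^T W + lam*(W X X^T - S X^T)."""
    return -S @ X.T + S @ S.T @ W + lam * (W @ X @ X.T - S @ X.T)
```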
Step 2: Optimize the objective function obtained in Step 1 with the proximal gradient descent method.
For the first two terms of the objective function, let
$$f(W)=\tfrac{1}{2}\|X-W^{T}S\|_{F}^{2}+\tfrac{\lambda}{2}\|WX-S\|_{F}^{2}.$$
Its first derivative f′(W) then satisfies the L-Lipschitz condition:
$$\|f'(W_{2})-f'(W_{1})\|\le L\,\|W_{2}-W_{1}\|$$
which can also be written as
$$\|f'(W_{2})-f'(W_{1})\|=\|f''(W)\,(W_{2}-W_{1})\|\le L\,\|W_{2}-W_{1}\|$$
Therefore L is the maximum value of the second derivative of f(W), and this derivative is
$$f''(W)=-SX^{T}+SS^{T}W+\lambda WXX^{T}-\lambda SX^{T}$$
Since every element of the matrix W lies between 0 and 1, the maximum is attained when W is the all-ones matrix. Letting W₀ denote the all-ones matrix with the same dimensions as W, the value of L is therefore
$$L=-SX^{T}+SS^{T}W_{0}+\lambda W_{0}XX^{T}-\lambda SX^{T}$$
Near W_k, f(W) can be approximated by its second-order Taylor expansion:
$$\hat{f}(W)\approx f(W_{k})+\langle f'(W_{k}),\,W-W_{k}\rangle+\tfrac{L}{2}\|W-W_{k}\|_{F}^{2}=\tfrac{L}{2}\Big\|W-\big(W_{k}-\tfrac{1}{L}f'(W_{k})\big)\Big\|_{F}^{2}+\mathrm{const}$$
where const is a constant independent of W and ⟨·,·⟩ denotes the inner product. The minimum is obviously attained at the following W_{k+1}:
$$W_{k+1}=W_{k}-\tfrac{1}{L}f'(W_{k})$$
Taking the L1-norm term into account, the iteration of each step is therefore
$$W_{k+1}=\arg\min_{W}\ \tfrac{L}{2}\Big\|W-\big(W_{k}-\tfrac{1}{L}f'(W_{k})\big)\Big\|_{F}^{2}+\gamma\|W\|_{1}$$
Letting Z = W_k − (1/L)·f′(W_k), this becomes
$$W_{k+1}=\arg\min_{W}\ \tfrac{L}{2}\|W-Z\|_{F}^{2}+\gamma\|W\|_{1}$$
Simplifying and rearranging gives
$$W_{k+1}^{i}=\begin{cases}Z^{i}-\gamma/L, & Z^{i}>\gamma/L\\ 0, & |Z^{i}|\le\gamma/L\\ Z^{i}+\gamma/L, & Z^{i}<-\gamma/L\end{cases}$$
where the superscript i denotes the i-th component of W_{k+1} and Z.
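This proximal-gradient iteration amounts to a gradient step on the smooth part followed by component-wise soft thresholding. The sketch below illustrates it, reusing `sae_l1_gradient` from the previous sketch; treating the step size 1/L as a user-supplied scalar is a simplifying assumption, since the text derives L from the all-ones matrix W₀.

```python
def soft_threshold(Z, tau):
    """Component-wise soft thresholding: the proximal operator of tau*||.||_1."""
    return np.sign(Z) * np.maximum(np.abs(Z) - tau, 0.0)

def train_efs_sae(X, S, lam, gamma, L, n_iter=500):
    """Proximal gradient descent on f(W) + gamma*||W||_1; returns a sparse mapping matrix W."""
    W = np.zeros((S.shape[0], X.shape[0]))            # k x d mapping matrix
    for _ in range(n_iter):
        Z = W - sae_l1_gradient(W, X, S, lam) / L     # gradient step on the smooth part f(W)
        W = soft_threshold(Z, gamma / L)              # proximal step for the L1 term
    return W
```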
Step 3: In the training stage of zero-shot image classification, feed the low-level features of the training samples and the corresponding relative attribute vectors into the model constituted by the optimization result of Step 2, and obtain the sparse mapping matrix W.
First, cross-validation is used to randomly select the training-stage sample classes from the image set, and the remaining classes of the image set serve as the test set; then the low-level features of the training classes and the corresponding relative attribute vectors are fed into the optimized embedded-feature-selection semantic autoencoder for training, which yields the sparse mapping matrix.
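A minimal sketch of this class-level split follows; the names `labels`, `n_seen_classes` and the helper itself are illustrative rather than taken from the patent, and the split is over classes, not over individual samples.

```python
def split_seen_unseen(labels, n_seen_classes, seed=None):
    """Randomly choose seen (training) classes; the remaining classes form the unseen test set."""
    rng = np.random.default_rng(seed)
    classes = rng.permutation(np.unique(labels))
    seen, unseen = classes[:n_seen_classes], classes[n_seen_classes:]
    train_mask = np.isin(labels, seen)
    return train_mask, ~train_mask, seen, unseen
```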
Step 4: In the test stage, input the low-level features of the test samples together with the sparse mapping matrix obtained in Step 3, and predict the relative attribute vectors of the test samples.
Step 5: Assign class labels according to the relative attribute vectors obtained in Step 4.
The relative attribute vectors with label priors are modelled statistically as Gaussian distributions and their means and variances are computed; the predicted attribute vectors of the test samples are likewise modelled as Gaussian distributions and their means and variances are computed; class labels are then assigned by maximum a posteriori probability according to the obtained means and variances.
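The sketch below illustrates Steps 4 and 5 together: relative attributes are predicted with the sparse W, per-class Gaussians are fitted on relative-attribute vectors, and each test sample receives the class with the highest log-posterior. A uniform class prior and independent (diagonal-Gaussian) attribute dimensions are assumptions, and the text does not specify how the statistics of unseen classes are obtained, so `fit_class_gaussians` simply takes whatever attribute vectors and labels are supplied for the candidate classes.

```python
def predict_attributes(W, X_test):
    """Map low-level test features to relative-attribute vectors with the sparse W."""
    return W @ X_test                                  # shape: (n_attributes, n_test_samples)

def fit_class_gaussians(S, y, classes, eps=1e-6):
    """Per-class mean and variance of relative-attribute vectors (columns of S)."""
    return {c: (S[:, y == c].mean(axis=1), S[:, y == c].var(axis=1) + eps) for c in classes}

def map_assign(S_pred, gaussians):
    """Assign each test sample the class with the highest Gaussian log-posterior
    (uniform prior and independent attribute dimensions assumed)."""
    labels, scores = list(gaussians), []
    for c in labels:
        mu, var = gaussians[c]
        loglik = -0.5 * (((S_pred - mu[:, None]) ** 2) / var[:, None]
                         + np.log(2 * np.pi * var[:, None])).sum(axis=0)
        scores.append(loglik)
    return np.array(labels)[np.argmax(np.stack(scores), axis=0)]
```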
The embodiment merely illustrates the technical idea of the invention and does not limit its scope of protection; any change made on the basis of the technical scheme in accordance with the technical idea proposed by the invention falls within the scope of protection of the invention.

Claims (5)

1. A zero-shot image classification method based on an embedded-feature-selection semantic autoencoder, characterized by comprising the following steps:
(1) merging the matched selection of features with the training of the semantic autoencoder to obtain the objective function of the embedded-feature-selection semantic autoencoder:
$$\min_{W}\ \tfrac{1}{2}\|X-W^{T}S\|_{F}^{2}+\tfrac{\lambda}{2}\|WX-S\|_{F}^{2}+\gamma\|W\|_{1}$$
wherein W is the mapping matrix, the superscript T denotes transposition, X is the matrix of low-level sample features, S is the matrix of relative attribute vectors, λ is the weight parameter of the encoding part, and γ is the regularization parameter;
(2) optimizing the objective function obtained in step (1) with the proximal gradient descent method to obtain the optimized embedded-feature-selection semantic autoencoder model;
(3) in the training stage of zero-shot image classification, inputting the low-level features of the training samples and the corresponding relative attribute vectors into the optimized embedded-feature-selection semantic autoencoder model to obtain the sparse mapping matrix W;
(4) in the test stage, inputting the low-level features of the test samples and the sparse mapping matrix obtained in step (3), and predicting the relative attribute vectors of the test samples;
(5) assigning class labels according to the relative attribute vectors obtained in step (4).
2. The zero-shot image classification method based on an embedded-feature-selection semantic autoencoder according to claim 1, characterized in that, in step (2), the iterative equation of each step is as follows:
$$W_{k+1}^{i}=\begin{cases}Z^{i}-\gamma/L, & Z^{i}>\gamma/L\\ 0, & |Z^{i}|\le\gamma/L\\ Z^{i}+\gamma/L, & Z^{i}<-\gamma/L\end{cases}$$
in the above formula, W_k is the k-th iterate of W; letting Z = W_k − (1/L)·f′(W_k), f′(W_k) is the first derivative of f(W) at step k, L = −SXᵀ + SSᵀW₀ + λW₀XXᵀ − λSXᵀ, and W₀ is the all-ones matrix with the same dimensions as W.
3. The zero-shot image classification method based on an embedded-feature-selection semantic autoencoder according to claim 1, characterized in that, in step (3), cross-validation is first used to randomly select the training-stage sample classes from the image set, and the remaining classes of the image set serve as the test set; then the low-level features of the training classes and the corresponding relative attribute vectors are fed into the optimized embedded-feature-selection semantic autoencoder for training, yielding the sparse mapping matrix.
4. The zero-shot image classification method based on an embedded-feature-selection semantic autoencoder according to claim 1, characterized in that the relative attribute vectors of the training samples and of the test samples are statistically modelled as Gaussian distributions.
5. The zero-shot image classification method based on an embedded-feature-selection semantic autoencoder according to claim 4, characterized in that, in step (5), the mean and variance of the relative attribute vectors of the training samples and the mean and variance of the relative attribute vectors of the test samples are computed separately, and the class labels are then assigned by maximum a posteriori probability.
CN201910566369.1A 2019-06-27 2019-06-27 Zero-shot image classification method based on embedded-feature-selection semantic autoencoder Pending CN110427967A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910566369.1A CN110427967A (en) 2019-06-27 2019-06-27 Zero-shot image classification method based on embedded-feature-selection semantic autoencoder

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910566369.1A CN110427967A (en) 2019-06-27 2019-06-27 Zero-shot image classification method based on embedded-feature-selection semantic autoencoder

Publications (1)

Publication Number Publication Date
CN110427967A true CN110427967A (en) 2019-11-08

Family

ID=68409698

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910566369.1A Pending CN110427967A (en) 2019-06-27 2019-06-27 Zero-shot image classification method based on embedded-feature-selection semantic autoencoder

Country Status (1)

Country Link
CN (1) CN110427967A (en)


Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150310862A1 (en) * 2014-04-24 2015-10-29 Microsoft Corporation Deep learning for semantic parsing including semantic utterance classification
US20170127016A1 (en) * 2015-10-29 2017-05-04 Baidu Usa Llc Systems and methods for video paragraph captioning using hierarchical recurrent neural networks
CN106203472A (en) * 2016-06-27 2016-12-07 中国矿业大学 A kind of zero sample image sorting technique based on the direct forecast model of mixed attributes
US20180165554A1 (en) * 2016-12-09 2018-06-14 The Research Foundation For The State University Of New York Semisupervised autoencoder for sentiment analysis
CN108564121A (en) * 2018-04-09 2018-09-21 南京邮电大学 A kind of unknown classification image tag prediction technique based on self-encoding encoder
CN108921226A (en) * 2018-07-11 2018-11-30 广东工业大学 A kind of zero sample classification method based on low-rank representation and manifold regularization
CN109492662A (en) * 2018-09-27 2019-03-19 天津大学 A kind of zero sample classification method based on confrontation self-encoding encoder model
CN109829299A (en) * 2018-11-29 2019-05-31 电子科技大学 A kind of unknown attack recognition methods based on depth self-encoding encoder

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
CLVSIT: "特征选择-嵌入式选择" (Feature selection: embedded selection), CSDN *
ELYOR KODIROV ET AL.: "Semantic Autoencoder for Zero-Shot Learning", arXiv *
ELYOR KODIROV ET AL.: "Unsupervised Domain Adaptation for Zero-Shot Learning", ICCV 2015 *
TEN_YN: "近端梯度下降算法 (Proximal Gradient Algorithm)", https://blog.csdn.net/qq_38290475/article/details/81052206 *
YANG LIU ET AL.: "Zero Shot Learning via Low-rank Embedded Semantic AutoEncoder", Proceedings of the Twenty-Seventh International Joint Conference on Artificial Intelligence *
冯兴东 (Feng Xingdong): 《分布式统计计算》 (Distributed Statistical Computing), Shanghai University of Finance and Economics Press, 30 April 2018 *
巩萍等 (Gong Ping et al.): "基于属性关系图正则化特征选择的零样本分类" (Zero-shot classification based on attribute-relation-graph regularized feature selection), Journal of China University of Mining & Technology *
褚宝增等 (Chu Baozeng et al.): 《现代数学地质》 (Modern Mathematical Geology), China Science and Technology Press, 31 August 2014 *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111914872A (en) * 2020-06-04 2020-11-10 西安理工大学 Zero sample image classification method with label and semantic self-coding fusion
CN111914872B (en) * 2020-06-04 2024-02-02 西安理工大学 Zero sample image classification method with label and semantic self-coding fused
CN113221814A (en) * 2021-05-26 2021-08-06 华瑞新智科技(北京)有限公司 Road traffic sign identification method, equipment and storage medium
CN114005005A (en) * 2021-12-30 2022-02-01 深圳佑驾创新科技有限公司 Double-batch standardized zero-instance image classification method
CN114005005B (en) * 2021-12-30 2022-03-22 深圳佑驾创新科技有限公司 Double-batch standardized zero-instance image classification method

Similar Documents

Publication Publication Date Title
CN113378632B (en) Pseudo-label optimization-based unsupervised domain adaptive pedestrian re-identification method
CN109086700B (en) Radar one-dimensional range profile target identification method based on deep convolutional neural network
CN107563428B (en) Based on the Classification of Polarimetric SAR Image method for generating confrontation network
CN111242841B (en) Image background style migration method based on semantic segmentation and deep learning
CN107145830B (en) Hyperspectral image classification method based on spatial information enhancing and deepness belief network
CN110427967A (en) The zero sample image classification method based on embedded feature selecting semanteme self-encoding encoder
CN109886238A (en) Unmanned plane Image Change Detection algorithm based on semantic segmentation
CN105184298B (en) A kind of image classification method of quick local restriction low-rank coding
CN106651884B (en) Mean field variation Bayes&#39;s SAR image segmentation method based on sketch structure
CN106611422B (en) Stochastic gradient Bayes&#39;s SAR image segmentation method based on sketch structure
CN107944483B (en) Multispectral image classification method based on dual-channel DCGAN and feature fusion
CN106683102B (en) SAR image segmentation method based on ridge ripple filter and convolutional coding structure learning model
CN106611423B (en) SAR image segmentation method based on ridge ripple filter and deconvolution structural model
CN102542302A (en) Automatic complicated target identification method based on hierarchical object semantic graph
CN108681689B (en) Frame rate enhanced gait recognition method and device based on generation of confrontation network
CN104298999B (en) EO-1 hyperion feature learning method based on recurrence autocoding
CN107403434A (en) SAR image semantic segmentation method based on two-phase analyzing method
CN108460391A (en) Based on the unsupervised feature extracting method of high spectrum image for generating confrontation network
CN109145832A (en) Polarimetric SAR image semisupervised classification method based on DSFNN Yu non local decision
CN104346814B (en) Based on the SAR image segmentation method that level vision is semantic
CN104408731B (en) Region graph and statistic similarity coding-based SAR (synthetic aperture radar) image segmentation method
CN105740917B (en) The semi-supervised multiple view feature selection approach of remote sensing images with label study
Jing et al. AutoRSISC: Automatic design of neural architecture for remote sensing image scene classification
Xing et al. Diffsketcher: Text guided vector sketch synthesis through latent diffusion models
CN105160666A (en) SAR (synthetic aperture radar) image change detection method based on non-stationary analysis and conditional random field

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination