
CN111914915A - Data classifier integration method and device based on support vector machine and storage medium - Google Patents


Info

Publication number
CN111914915A
Authority
CN
China
Prior art keywords
model
sample
current
support vector machine
Prior art date
Legal status
Pending
Application number
CN202010704744.7A
Other languages
Chinese (zh)
Inventor
邹斌
覃一默
徐婕
沈知飞
Current Assignee
Hubei University
Original Assignee
Hubei University
Priority date
Filing date
Publication date
Application filed by Hubei University
Priority to CN202010704744.7A
Publication of CN111914915A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G06F18/241 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F18/2411 Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches based on the proximity to a decision surface, e.g. support vector machines
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/21 Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F18/214 Generating training patterns; Bootstrap methods, e.g. bagging or boosting

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention relates to a data classifier integration method, device and storage medium based on a support vector machine, wherein the method comprises the following steps: acquiring a training set; starting from a known initial model at an initial moment, looping a model generation step a calibrated number of times to obtain a plurality of models and the weight of each model, wherein the model generation step comprises: selectively sampling in the training set according to the known model at the previous moment to obtain a current sample set, training the current sample set by adopting a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule; integrating all the generated models with their corresponding weights to determine a final model, and determining a data classifier according to the final model; and inputting the data to be classified into the data classifier to classify it. This technical scheme reduces the complexity of model training in the data classification process and increases the training speed.

Description

Data classifier integration method and device based on support vector machine and storage medium
Technical Field
The invention relates to the technical field of artificial intelligence, in particular to a data classifier integration method and device based on a support vector machine and a storage medium.
Background
Support Vector Machines (SVMs) are generalized linear classifiers that classify data in a supervised learning manner. Based on the idea of an optimal margin, an SVM classifies data by constructing a linear or nonlinear separating interface and applying that interface to new samples. SVMs are widely used in both theoretical research and practice.
At present, when classifying data using a support vector machine with sample capacity n, the algorithm complexity is about O(n³). When n is large, for example greater than 10000, the algorithm complexity is therefore high and building the model requires a long training time, which is unfavorable for processing big data.
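For orientation, the following is a minimal sketch of training and applying a single SVM classifier; scikit-learn's SVC is an illustrative choice of implementation and is not referenced by the patent itself.

```python
import numpy as np
from sklearn.svm import SVC

# Four labeled points; labels are in {-1, +1}, matching the patent's convention.
X = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.5], [3.0, 3.0]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear").fit(X, y)   # constructs a linear separating interface
print(clf.predict([[2.5, 2.5]]))       # classifies a new point -> [1]
```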
Disclosure of Invention
In order to reduce the algorithm complexity of a training model in the data classification process and improve the training speed when the model is established, the invention provides a data classifier integration method and device based on a support vector machine and a storage medium.
The technical scheme for solving the technical problems is as follows:
in a first aspect, the present invention provides a data classifier integration method based on a support vector machine, including:
a training set is obtained.
Starting from a known initial model at an initial moment, looping a model generation step a calibrated number of times to obtain a plurality of models and the weight of each model; wherein the model generation step comprises: selectively sampling in the training set according to a known model at the previous moment to obtain a current sample set, training the current sample set by adopting a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule.
And integrating according to all the generated models and the corresponding weights, determining a final model, and determining a data classifier according to the final model.
Inputting the data to be classified into the data classifier, and classifying the data to be classified.
In a second aspect, the present invention provides an integrated device of a data classifier based on a support vector machine, including:
and the acquisition module is used for acquiring the training set.
The model generation module is used for looping the model generation step a calibrated number of times according to the known initial model at the initial moment to obtain a plurality of models and the weight of each model; wherein the model generation step comprises: selectively sampling in the training set according to a known model at the previous moment to obtain a current sample set, training the current sample set by adopting a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule.
And the classifier generating module is used for integrating all the generated models and corresponding weights, determining a final model and determining a data classifier according to the final model.
And the classification module is used for inputting the data to be classified into the data classifier and classifying the data to be classified.
In a third aspect, the invention provides a data classifier integration device based on a support vector machine, which comprises a memory and a processor.
The memory is used for storing the computer program.
The processor is configured to implement the support vector machine-based data classifier integration method as described above when executing the computer program.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, implements the method for integrating a data classifier based on a support vector machine as described above.
The support vector machine-based data classifier integration method, device and storage medium have the following beneficial effects: by selectively sampling in the training set according to the previously known model, a small number of training samples can be extracted as needed to train each model, which reduces the complexity of the training process and increases the training speed. The weights of the models are continuously updated in an iterative manner, the models are integrated according to their corresponding weights to determine the final model, the data classifier is determined from the final model, and the data to be classified is then classified. This can significantly improve the classification capability of the data classifier and reduce model overfitting.
Drawings
FIG. 1 is a schematic flowchart of a data classifier integration method based on a support vector machine according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an integrated device of a data classifier based on a support vector machine according to an embodiment of the present invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
As shown in fig. 1, an embodiment of the present invention provides a data classifier integration method based on a support vector machine, including:
100, obtaining a training set, denoted D_train;
200, starting from the known initial model f_0 at the initial moment, looping the model generation step a calibrated number of times to obtain a plurality of models and the weight of each model; wherein the model generation step comprises: selectively sampling in the training set according to the known model at the previous moment to obtain a current sample set D_t, training the current sample set D_t by adopting a support vector machine to obtain a current model f_t, and determining the weight w_t of the current model f_t according to a preset rule;
300, integrating according to all the generated models and the corresponding weights to determine the final model F_T, and determining a data classifier according to the final model F_T;
and 400, inputting the data to be classified into the data classifier, and classifying the data to be classified.
In the embodiment, selective sampling is performed in the training set according to the known previous model, so that a small number of training samples can be extracted as required to train the model, the complexity of the training process is reduced, and the training speed is increased. And the weights of the models are continuously updated in an iterative mode, the models are integrated according to the weights corresponding to the models, the final model is determined, the data classifier is determined according to the final model, the data to be classified is classified, the classification capability of the data classifier can be remarkably improved, and the situation of model overfitting can be reduced.
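As a concrete illustration of steps 100 to 400, the following Python sketch reproduces the overall loop under stated assumptions: scikit-learn's SVC stands in for the base support vector machine, plain random sampling stands in for the selective sampling detailed below, and the weight rule w = ½ ln((1 − e)/e) follows the reconstruction of the third and fourth formulas; all function names are illustrative, not from the patent.

```python
import numpy as np
from sklearn.svm import SVC

def model_weight(model, X, y):
    # Error rate e_t = p(y_i != sign(f_t(x_i))) on the full training set D_train.
    e = np.clip(np.mean(model.predict(X) != y), 1e-12, 1 - 1e-12)
    return 0.5 * np.log((1 - e) / e)   # assumed AdaBoost-style weight rule

def ensemble_fit(X, y, T, sample_size, seed=0):
    rng = np.random.default_rng(seed)
    models, weights = [], []
    for t in range(T + 1):             # t = 0 trains the initial model f_0
        # Step 200: the selective sampling would go here; random sampling is a stand-in.
        idx = rng.choice(len(X), size=sample_size, replace=False)
        f_t = SVC(kernel="linear").fit(X[idx], y[idx])
        models.append(f_t)
        weights.append(model_weight(f_t, X, y))
    return models, weights

def ensemble_predict(models, weights, X):
    # Steps 300-400: weighted sum F_T of all models, then the classifier sign(F_T).
    scores = sum(w * m.decision_function(X) for m, w in zip(models, weights))
    return np.sign(scores)
```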
Preferably, before the model generation step is looped the calibrated number of times according to the known initial model at the initial moment, the method further includes:

randomly extracting an initial sample set D_0 from the training set D_train, training the initial sample set D_0 by adopting a support vector machine to obtain an initial model f_0, and determining the weight w_0 of the initial model f_0 according to the preset rule.
Specifically, an initial sample set is randomly extracted at the initial moment, and the initial model obtained by training on it serves as the starting model for the subsequent iteration.
Preferably, assuming the training set D_train comprises n samples, randomly extracting the initial sample set D_0 from the training set D_train and training the initial sample set D_0 by adopting a support vector machine to obtain the initial model f_0 specifically comprises:

randomly extracting N samples from the training set D_train to form the initial sample set D_0, wherein n = (T+1)N, T is a preset constant, and T ∈ N+;
training the initial sample set D_0 by adopting a support vector machine, the samples in the initial sample set D_0 being divided into a feature set X = (x_1, x_2, …, x_N) and a label set Y = (y_1, y_2, …, y_N), wherein the feature set comprises the features of each sample and the label set comprises the label of each sample;
substituting the feature set X = (x_1, x_2, …, x_N) and the label set Y = (y_1, y_2, …, y_N) into a first formula, and solving by adopting a quadratic programming algorithm to obtain the intermediate parameters, wherein the first formula comprises:

$$\max_{\alpha}\ \sum_{i=1}^{N}\alpha_i-\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N}\alpha_i\alpha_j y_i y_j\,(x_i\cdot x_j)\quad \text{s.t.}\ \sum_{i=1}^{N}\alpha_i y_i=0,\ \alpha_i\ge 0,\ i=1,\dots,N,$$

wherein α = (α_1, α_2, …, α_N) is the set of intermediate parameters, α_i is the intermediate parameter of the i-th sample, α_j is the intermediate parameter of the j-th sample, y_i is the label of the i-th sample, y_j is the label of the j-th sample, x_i is the feature of the i-th sample, and x_j is the feature of the j-th sample;
substituting the intermediate parameters into a second formula, and solving to obtain a solution (ω, b) of an original expression of the support vector machine, wherein the second formula comprises:

$$\omega=\sum_{i=1}^{N}\alpha_i y_i x_i,\qquad b=y_s-\sum_{i=1}^{N}\alpha_i y_i\,(x_i\cdot x_s),$$

wherein (x_s, y_s) is any support vector of the support vector machine;
and constructing the initial model f_0(x) = ω·x + b of the support vector machine according to the solution (ω, b) of the original expression.
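The first and second formulas can be solved numerically as a quadratic program. The sketch below uses cvxpy as one possible QP solver (the patent does not prescribe one); the soft-margin bound C and the small ridge on the Gram matrix are added assumptions for numerical stability.

```python
import numpy as np
import cvxpy as cp

def svm_dual(X, y, C=1.0):
    N = len(y)
    # Gram matrix scaled by labels; a tiny ridge keeps it numerically PSD.
    K = (X @ X.T) * np.outer(y, y) + 1e-9 * np.eye(N)
    alpha = cp.Variable(N)
    objective = cp.Maximize(cp.sum(alpha) - 0.5 * cp.quad_form(alpha, K))
    constraints = [alpha >= 0, alpha <= C, y @ alpha == 0]
    cp.Problem(objective, constraints).solve()
    a = alpha.value
    # Second formula: recover (omega, b) from a support vector (x_s, y_s).
    omega = (a * y) @ X
    s = int(np.argmax(a))              # index of one support vector
    b = y[s] - X[s] @ omega
    return omega, b                    # initial model: f_0(x) = omega . x + b
```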
Preferably, the determining the weight w_0 of the initial model f_0 according to the preset rule comprises:

determining the weight w_0 of the initial model f_0 according to a third formula, wherein the third formula comprises:

$$w_0=\frac{1}{2}\ln\frac{1-e_0}{e_0},$$

wherein w_0 is the weight of the initial model, and e_0 is the error rate of the initial model f_0 in the training set D_train, determined by p(y_i ≠ sign(f_0(x_i))), wherein x_i and y_i are respectively the feature and the label of the i-th sample, and sign(·) is expressed as

$$\operatorname{sign}(z)=\begin{cases}1,& z\ge 0\\-1,& z<0\end{cases}$$
Preferably, the selectively sampling in the training set according to the known model at the previous moment to obtain the current sample set D_t, training the current sample set D_t by adopting a support vector machine to obtain the current model f_t, and determining the weight w_t of the current model f_t according to the preset rule comprises:
for the current moment, extracting samples according to the known model at the previous moment, calculating the acceptance probability of each sample in the training set D_train according to a preset decision model, determining the samples used as training data according to the acceptance probability, and obtaining the current sample set D_t;
training the current sample set D_t by adopting a support vector machine to obtain the current model f_t;
determining the weight w_t of the current model f_t according to a fourth formula, wherein the fourth formula comprises:

$$w_t=\frac{1}{2}\ln\frac{1-e_t}{e_t},$$

wherein w_t is the weight of the current model f_t, and e_t is the error rate of the current model f_t in the training set D_train, determined by p(y_i ≠ sign(f_t(x_i))), wherein x_i and y_i are respectively the feature and the label of sample i, and sign(·) is expressed as above.
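As a quick numerical check of the third and fourth formulas under this reconstruction: a model with error rate e_t = 0.2 receives weight w_t = ½ ln(0.8/0.2) = ½ ln 4 ≈ 0.693, while a model at chance level, e_t = 0.5, receives w_t = ½ ln 1 = 0 and therefore contributes nothing to the final model F_T; a model worse than chance receives a negative weight.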
Preferably, the extracting samples according to the known model at the previous moment, calculating the acceptance probability of each sample in the training set D_train according to the preset decision model, determining the samples used as training data according to the acceptance probability, and obtaining the current sample set D_t comprises:
randomly extracting two samples from the training set D_train according to the model at the previous moment, taken in turn as the current sample z_α and the candidate sample z_β;
determining a transfer parameter λ between the current sample z_α and the candidate sample z_β according to a fifth formula, wherein the fifth formula comprises:

$$\lambda=\exp\!\big(-l(f_{t-1},z_\beta)\big)\,/\,\exp\!\big(-l(f_{t-1},z_\alpha)\big),$$

wherein λ is the transfer parameter between the current sample z_α and the candidate sample z_β, l(f_{t−1}, z_β) is the loss function of the model at the previous moment on the extracted candidate sample z_β, l(f_{t−1}, z_α) is the loss function of the model at the previous moment on the current sample z_α, and f_{t−1} is the model at the previous moment;
determining the acceptance probability of the candidate sample z_β according to the transfer parameter, and deciding according to the acceptance probability whether to accept the candidate sample z_β into the current sample set D_t;
If λ = 1 and y_α y_β = 1, the acceptance probability is p_1 = min{1, exp(−y_β f_{t−1}(x_β)) / exp(y_α f_{t−1}(x_α))}, and the candidate sample z_β is stored into the current sample set D_t with this probability; y_α is the label of the current sample z_α, and y_β is the label of the candidate sample z_β.

If λ = 1 and y_α y_β = −1, or λ < 1, the acceptance probability is p_2 = min{1, λ}, and the candidate sample z_β is stored into the current sample set D_t with this probability.

If λ > 1, the candidate sample z_β is stored directly into the current sample set D_t.

If n_2 consecutive samples are rejected from the current sample set D_t, the acceptance probability is set to min{1, q}, and the (n_2+1)-th sample is stored into the current sample set D_t accordingly; wherein n_2 and q are preset constants.
Samples are repeatedly extracted and a decision is made on whether to store them into the current sample set D_t until the sample capacity of the current sample set D_t reaches a preset target capacity.
Specifically, sampling from the training set D_train then continues: the accepted candidate sample z_β is taken as the new current sample, another sample is extracted from D_train, and the above steps are repeated to decide whether the newly extracted sample is stored into the current sample set D_t. Through this selective procedure, the current sample set D_t is extracted from the training set.
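The following sketch implements this selective sampling step as reconstructed above, i.e. a Metropolis-style accept/reject scheme. The hinge loss stands in for the unspecified loss function l(·,·), and the acceptance probabilities follow the formulas as printed; all names and default constants are illustrative.

```python
import numpy as np

def hinge(f_prev, x, y):
    # Loss of the previous model on one sample; the patent leaves l(.,.) unspecified.
    return max(0.0, 1.0 - y * f_prev(x))

def selective_sample(X, y, f_prev, target_size, n2=10, q=0.1, seed=0):
    rng = np.random.default_rng(seed)
    chosen, rejected = [], 0
    while len(chosen) < target_size:
        a, b = rng.choice(len(X), size=2, replace=False)  # current z_alpha, candidate z_beta
        lam = np.exp(-hinge(f_prev, X[b], y[b])) / np.exp(-hinge(f_prev, X[a], y[a]))
        if rejected >= n2:                 # after n2 straight rejections, force min{1, q}
            p = min(1.0, q)
        elif lam > 1:
            p = 1.0                        # candidate accepted directly
        elif lam == 1 and y[a] * y[b] == 1:
            p = min(1.0, np.exp(-y[b] * f_prev(X[b])) / np.exp(y[a] * f_prev(X[a])))
        else:                              # lam == 1 with opposite labels, or lam < 1
            p = min(1.0, lam)
        if rng.random() < p:
            chosen.append(b)
            rejected = 0
        else:
            rejected += 1
    idx = np.array(chosen)
    return X[idx], y[idx]                  # the current sample set D_t
```

Here f_prev is assumed to map a single feature vector to the previous model's real-valued output, for example a wrapper around the previous SVM's decision_function.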
Preferably, the integrating according to all the generated models and the corresponding weights to determine the final model F_T comprises:

performing a weighted summation over all the models and the corresponding weights to obtain the final model F_T.

Specifically,

$$F_T=\sum_{t=0}^{T}w_t f_t,$$

wherein F_T is the final model, and the data classifier sign(F_T) is output from the final model F_T.
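Putting the pieces together, a usage sketch on synthetic data, reusing ensemble_fit and ensemble_predict from the earlier sketch; the dataset and parameters are illustrative:

```python
import numpy as np
from sklearn.datasets import make_classification

X, y = make_classification(n_samples=1200, n_features=10, random_state=0)
y = 2 * y - 1                                  # map {0, 1} labels to {-1, +1}

models, weights = ensemble_fit(X, y, T=5, sample_size=200)   # 6 small SVMs
pred = ensemble_predict(models, weights, X)    # the classifier sign(F_T)
print("training accuracy:", np.mean(pred == y))
```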
As shown in fig. 2, an embodiment of the present invention provides an integrated apparatus of a data classifier based on a support vector machine, including:
the acquisition module is used for acquiring a training set;
the model generation module is used for calibrating the step number in the step of generating the loop iteration model according to the known initial model at the initial moment to obtain a plurality of models and the weight of each model; wherein the model generating step comprises: selectively sampling in the training set according to a known model at the previous moment to obtain a current sample set, training the current sample set by adopting a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule;
the classifier generating module is used for integrating all the generated models and corresponding weights, determining a final model and determining a data classifier according to the final model;
and the classification module is used for inputting the data to be classified into the data classifier and classifying the data to be classified.
Another embodiment of the present invention provides a data classifier integrated device based on a support vector machine, including a memory and a processor; the memory for storing a computer program; the processor is configured to implement the support vector machine-based data classifier integration method as described above when executing the computer program. The device may be a server, a computer, or the like.
Yet another embodiment of the present invention provides a computer-readable storage medium having stored thereon a computer program, which, when being executed by a processor, implements the method for integrating a data classifier based on a support vector machine as described above.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like. In this application, the units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention. In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A data classifier integration method based on a support vector machine is characterized by comprising the following steps:
acquiring a training set;
looping a model generation step a calibrated number of times according to a known initial model at an initial moment, and obtaining a plurality of models and the weight of each model; wherein the model generation step comprises: selectively sampling in the training set according to a known model at the previous moment to obtain a current sample set, training the current sample set by adopting a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule;
integrating according to all the generated models and corresponding weights to determine a final model, and determining a data classifier according to the final model;
inputting the data to be classified into the data classifier, and classifying the data to be classified.
2. The support vector machine-based data classifier integration method according to claim 1, wherein before the model generation step is looped the calibrated number of times according to the known initial model at the initial moment, the method further comprises:
and randomly extracting an initial sample set from the training set, training the initial sample set by adopting a support vector machine to obtain an initial model, and determining the weight of the initial model according to the preset rule.
3. The support vector machine-based data classifier integration method according to claim 2, wherein assuming that the training set includes n samples, randomly extracting an initial sample set from the training set, training the initial sample set with a support vector machine, and obtaining an initial model specifically includes:
randomly extracting N mutually independent samples from the training set to form the initial sample set, wherein n and N satisfy the relation n = (T+1)N, and T is a preset constant;
training the initial sample set by adopting a support vector machine, and dividing samples in the initial sample set into a feature set and a label set, wherein the feature set comprises the features of all samples, and the label set comprises the labels of all samples;
substituting the feature set and the label set into a first formula, and solving by adopting a quadratic programming algorithm to obtain intermediate parameters, wherein the first formula comprises:

$$\max_{\alpha}\ \sum_{i=1}^{N}\alpha_i-\frac{1}{2}\sum_{i=1}^{N}\sum_{j=1}^{N}\alpha_i\alpha_j y_i y_j\,(x_i\cdot x_j)\quad \text{s.t.}\ \sum_{i=1}^{N}\alpha_i y_i=0,\ \alpha_i\ge 0,\ i=1,\dots,N,$$

wherein α = (α_1, α_2, …, α_N) is the set of intermediate parameters, α_i is the intermediate parameter of the i-th sample, α_j is the intermediate parameter of the j-th sample, y_i is the label of the i-th sample, y_j is the label of the j-th sample, x_i is the feature of the i-th sample, x_j is the feature of the j-th sample, and T is a preset constant;
substituting the intermediate parameters into a second formula, and solving to obtain a solution (ω, b) of an original expression of the support vector machine, wherein the second formula comprises:

$$\omega=\sum_{i=1}^{N}\alpha_i y_i x_i,\qquad b=y_s-\sum_{i=1}^{N}\alpha_i y_i\,(x_i\cdot x_s),$$

wherein (x_s, y_s) is any support vector of the support vector machine;
and constructing the initial model of the support vector machine according to the solution of the original expression.
4. The support vector machine-based data classifier integration method according to claim 3, wherein the determining the weight of the initial model according to the preset rule comprises:
determining the weight of the initial model according to a third formula, the third formula comprising:

$$w_0=\frac{1}{2}\ln\frac{1-e_0}{e_0},$$

wherein w_0 is the weight of the initial model, and e_0 is the error rate of the initial model in the training set.
5. The support vector machine-based data classifier integration method according to any one of claims 1 to 4, wherein the selectively sampling in the training set according to the known model at the previous moment to obtain a current sample set, training the current sample set by using a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule comprises:
for the current moment, extracting samples according to a known model of the previous moment, calculating the acceptance probability of each sample in the training set according to a preset decision model, determining the sample used as training data according to the acceptance probability, and obtaining the current sample set;
training the current sample set by adopting a support vector machine to obtain the current model;
determining a weight of the current model according to a fourth formula, the fourth formula comprising:

$$w_t=\frac{1}{2}\ln\frac{1-e_t}{e_t},$$

wherein w_t is the weight of the current model, and e_t is the error rate of the current model in the training set.
6. The support vector machine-based data classifier integration method according to claim 5, wherein the extracting samples according to a known model of a previous time, calculating an acceptance probability of each sample in the training set according to a preset decision model, determining samples used as training data according to the acceptance probability, and obtaining the current sample set comprises:
randomly and repeatedly extracting two samples from the training set according to the model at the previous moment, taken in turn as the current sample z_α and the candidate sample z_β;
determining a transfer parameter λ between the current sample z_α and the candidate sample z_β according to a fifth formula, the fifth formula comprising:

$$\lambda=\exp\!\big(-l(f_{t-1},z_\beta)\big)\,/\,\exp\!\big(-l(f_{t-1},z_\alpha)\big),$$

wherein λ is the transfer parameter between the current sample z_α and the candidate sample z_β, l(f_{t−1}, z_β) is the loss function of the model at the previous moment on the extracted candidate sample z_β, l(f_{t−1}, z_α) is the loss function of the model at the previous moment on the current sample z_α, and f_{t−1} is the model at the previous moment;
determining the acceptance probability of the candidate sample z_β according to the transfer parameter, and deciding according to the acceptance probability whether to accept the candidate sample z_β into the current sample set;
if λ = 1 and y_α y_β = 1, the acceptance probability is p_1 = min{1, exp(−y_β f_{t−1}(x_β)) / exp(y_α f_{t−1}(x_α))}, and the candidate sample z_β is stored into the current sample set with this probability, wherein y_α is the label of the current sample z_α and y_β is the label of the candidate sample z_β;

if λ = 1 and y_α y_β = −1, or λ < 1, the acceptance probability is p_2 = min{1, λ}, and the candidate sample z_β is stored into the current sample set with this probability;

if λ > 1, the candidate sample z_β is stored directly into the current sample set;

if n_2 consecutive samples are rejected from the current sample set, the acceptance probability is set to min{1, q}, and the (n_2+1)-th sample is stored into the current sample set accordingly; wherein n_2 and q are preset constants;
and repeatedly extracting samples and determining whether the samples are stored in the current sample set until the sample capacity of the current sample set reaches a preset target capacity.
7. The support vector machine-based data classifier integration method according to claim 6, wherein the integrating according to all the generated models and the corresponding weights, and the determining the final model comprises:
and carrying out weighted summation according to all the models and the corresponding weights to obtain the final model.
8. A support vector machine-based data classifier integration apparatus, comprising:
the acquisition module is used for acquiring a training set;
the model generation module is used for calibrating the step number in the step of generating the loop iteration model according to the known initial model at the initial moment to obtain a plurality of models and the weight of each model; wherein the model generating step comprises: selectively sampling in the training set according to a known model at the previous moment to obtain a current sample set, training the current sample set by adopting a support vector machine to obtain a current model, and determining the weight of the current model according to a preset rule;
the classifier generating module is used for integrating all the generated models and corresponding weights, determining a final model and determining a data classifier according to the final model;
and the classification module is used for inputting the data to be classified into the data classifier and classifying the data to be classified.
9. A data classifier integrated device based on a support vector machine is characterized by comprising a memory and a processor;
the memory for storing a computer program;
the processor, when executing the computer program, is configured to implement the support vector machine-based data classifier integration method according to any one of claims 1 to 7.
10. A computer-readable storage medium, characterized in that the storage medium has stored thereon a computer program which, when being executed by a processor, carries out the support vector machine-based data classifier integration method according to any one of claims 1 to 7.
CN202010704744.7A 2020-07-21 2020-07-21 Data classifier integration method and device based on support vector machine and storage medium Pending CN111914915A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010704744.7A CN111914915A (en) 2020-07-21 2020-07-21 Data classifier integration method and device based on support vector machine and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010704744.7A CN111914915A (en) 2020-07-21 2020-07-21 Data classifier integration method and device based on support vector machine and storage medium

Publications (1)

Publication Number Publication Date
CN111914915A true CN111914915A (en) 2020-11-10

Family

ID=73280177

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010704744.7A Pending CN111914915A (en) 2020-07-21 2020-07-21 Data classifier integration method and device based on support vector machine and storage medium

Country Status (1)

Country Link
CN (1) CN111914915A (en)


Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113642623A (en) * 2021-08-05 2021-11-12 深圳大学 Complex support vector machine classification method based on unitary space multi-feature fusion
CN113642623B (en) * 2021-08-05 2023-08-18 深圳大学 Complex support vector machine classification method based on unitary space multi-feature fusion

Similar Documents

Publication Publication Date Title
CN111079639B (en) Method, device, equipment and storage medium for constructing garbage image classification model
CN111523621B (en) Image recognition method and device, computer equipment and storage medium
US11049011B2 (en) Neural network classifier
CN110033281B (en) Method and device for converting intelligent customer service into manual customer service
JP4697670B2 (en) Identification data learning system, learning device, identification device, and learning method
CN116635866A (en) Method and system for mining minority class data samples to train a neural network
CN114841257B (en) Small sample target detection method based on self-supervision comparison constraint
CN110188195B (en) Text intention recognition method, device and equipment based on deep learning
CN109871885A (en) A kind of plants identification method based on deep learning and Plant Taxonomy
KR20230133854A (en) Cross-domain adaptive learning
CN110968725B (en) Image content description information generation method, electronic device and storage medium
CN113806580B (en) Cross-modal hash retrieval method based on hierarchical semantic structure
CN106529490B (en) Based on the sparse system and method for realizing writer verification from coding code book
CN112632984A (en) Graph model mobile application classification method based on description text word frequency
CN112270334B (en) Few-sample image classification method and system based on abnormal point exposure
CN114398935A (en) Deep learning-based medical image report multi-label classification method
CN114998602A (en) Domain adaptive learning method and system based on low confidence sample contrast loss
CN109101984B (en) Image identification method and device based on convolutional neural network
CN113987188B (en) Short text classification method and device and electronic equipment
CN111694954A (en) Image classification method and device and electronic equipment
CN111914915A (en) Data classifier integration method and device based on support vector machine and storage medium
CN116630816B (en) SAR target recognition method, device, equipment and medium based on prototype comparison learning
CN117390454A (en) Data labeling method and system based on multi-domain self-adaptive data closed loop
CN116226747A (en) Training method of data classification model, data classification method and electronic equipment
CN115063374A (en) Model training method, face image quality scoring method, electronic device and storage medium

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
RJ01: Rejection of invention patent application after publication (application publication date: 20201110)