
CN110309512A - Chinese grammatical error correction method based on a generative adversarial network - Google Patents

Chinese grammatical error correction method based on a generative adversarial network

Info

Publication number
CN110309512A
Authority
CN
China
Prior art keywords
network
sentence
word
text
correction
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910606372.1A
Other languages
Chinese (zh)
Inventor
赵建博
李思
孙忆南
梁景贵
朱勇杰
吕游伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications filed Critical Beijing University of Posts and Telecommunications
Priority to CN201910606372.1A priority Critical patent/CN110309512A/en
Publication of CN110309512A publication Critical patent/CN110309512A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00 Handling natural language data
    • G06F40/20 Natural language analysis
    • G06F40/253 Grammatical analysis; Style critique
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/045 Combinations of networks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Machine Translation (AREA)

Abstract

The invention discloses a Chinese grammatical error correction method based on a generative adversarial network, belonging to the field of information processing. The method comprises: first generating a corrected sentence with a generator network; computing a loss function with a discriminator network and optimizing the generator network; judging the source of the corrected sentence with the discriminator network; optimizing the discriminator network; and iteratively optimizing the generator network and the discriminator network. By means of the generative adversarial network, the present invention improves the effect of Chinese grammatical error correction and has great practical value.

Description

Chinese grammatical error correction method based on a generative adversarial network
Technical field
The present invention relates to the field of information processing, and in particular to a neural-network-based Chinese grammatical error correction method.
Background art
Chinese grammatical error correction is a relatively new task in Chinese natural language processing. Its objective is to judge whether a sentence written by a non-native speaker of Chinese contains grammatical errors, and to propose corrections for the erroneous parts.
There are currently two common approaches to Chinese grammatical error correction. One first detects errors with a Chinese grammatical error detection model and then uses an N-gram dictionary to compute the co-occurrence frequency of words and obtain the corrected sentence. The other builds an end-to-end Chinese grammatical error correction model with a sequence-to-sequence architecture; this model treats grammatical error correction as a translation task, translating a sentence containing grammatical errors into a correct sentence. Both approaches have shortcomings: the former depends on a large-scale dictionary, while the latter depends on a high-quality training data set. Recently, large amounts of training data have become available, so more and more researchers apply sequence-to-sequence models to the task of Chinese grammatical error correction. However, most of this work uses only the parallel corpus and does not adequately address the problem that the sentences produced by the model often fail to conform to Chinese usage. To solve this problem, the present invention uses a generative adversarial network to obtain a better generator model and thereby achieve a better grammatical error correction effect.
Summary of the invention
In order to solve the above technical problem, the present invention provides a Chinese grammatical error correction method based on a generative adversarial network. The scheme is as follows:
Step 1: the input sentence containing grammatical errors is processed; the generator network captures the word information and contextual information in the sentence and generates the sentence with the grammatical errors corrected.
Step 2: the sentence containing grammatical errors and the sentence corrected by the generator network are fed into the discriminator network, and the generator network is optimized using the loss function.
Step 3: the sentence containing grammatical errors, the manually annotated corrected sentence, and the sentence corrected by the generator network are fed into the discriminator network, which computes the probability that a corrected sentence comes from the manual annotation or from the generator network.
Step 4: the loss function is computed from the probability that the corrected sentence comes from the manual annotation or from the generator network, and the discriminator network is optimized.
Step 5: return to step 1 and continue to optimize the generator network and the discriminator network.
Description of the drawings
Fig. 1 is a schematic flow chart of the steps of the Chinese grammatical error correction method based on a generative adversarial network provided by the present invention.
Fig. 2 is a structural diagram of the convolutional-neural-network-based sequence-to-sequence text generation network.
Specific embodiment
The embodiments of the present invention will next be described in more detail.
Fig. 1 is a schematic flow chart of the steps of the Chinese grammatical error correction method based on a generative adversarial network provided by the present invention, comprising:
Step S1: the generator network generates a corrected sentence;
Step S2: optimize the generator network;
Step S3: the discriminator network judges the source of the corrected sentence;
Step S4: optimize the discriminator network;
Step S5: iterative optimization;
Each step is described in detail below:
Step S1: the generator network generates a corrected sentence. The present invention first builds the mapping dictionaries from words to word numbers for the encoding layer and the decoding layer of the generator network, and maps each word in the text to its corresponding word number. It then builds the word-vector tables of the encoding layer and the decoding layer of the generator network, in which each row corresponds to one word number and represents one word vector. Through the word-vector table, each word number is mapped to its original word vector. Each original word vector is added to the corresponding position vector to form an input word vector. Connecting the input word vectors of the words in the text forms the text matrix input to the encoding layer and the text matrix input to the decoding layer. Assuming there are N Chinese words in total, the word-vector matrix can be expressed as an N*d matrix, where d denotes the dimension of the word vectors. An input word vector can be expressed as:
x = v + p
where v is the original word vector of a word in the text and p is the position vector corresponding to the word v.
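As an illustration only, a minimal PyTorch sketch of this word-number-to-vector lookup followed by the addition of position vectors (the vocabulary size, dimension d and maximum length are hypothetical values, not taken from the patent):

```python
import torch
import torch.nn as nn

class InputEmbedding(nn.Module):
    """Maps each word number to its word vector and adds the position vector: x = v + p."""
    def __init__(self, vocab_size=50000, d=256, max_len=128):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, d)   # the N*d word-vector table
        self.pos_emb = nn.Embedding(max_len, d)       # position vectors p

    def forward(self, word_ids):
        # word_ids: (batch, seq_len) tensor of word numbers
        positions = torch.arange(word_ids.size(1), device=word_ids.device)
        v = self.word_emb(word_ids)   # original word vectors v
        p = self.pos_emb(positions)   # position vectors p (broadcast over the batch)
        return v + p                  # input word vectors x forming the text matrix
```

Calling the module on a (batch, seq_len) tensor of word numbers yields the (batch, seq_len, d) text matrix that is fed to the encoding layer or the decoding layer.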
The resulting encoding-layer text matrix is input into the encoding layer of the generator network. The encoding layer of the generator network is composed of multiple layers of convolutional neural networks. Each convolutional layer consists of a one-dimensional convolution and a non-linearity, and the layers are connected through residual connections. The computation of one convolutional layer can be expressed as:
[A B] = conv(X)
Y = A ⊙ σ(B)
where X is the text matrix after word vectorization, A and B are the two halves into which the result of the convolution operation is split, σ is the non-linear function, and ⊙ denotes element-wise multiplication.
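A sketch of one such gated convolutional layer with a residual connection (PyTorch; the hidden dimension and kernel size are illustrative assumptions, and the causal padding needed for the decoding layer is omitted):

```python
import torch
import torch.nn as nn

class GatedConvLayer(nn.Module):
    """One encoder/decoder layer: 1-D convolution, gating non-linearity, residual connection."""
    def __init__(self, d=256, kernel_size=3):
        super().__init__()
        # output 2*d channels so the result can be split into the two halves A and B
        self.conv = nn.Conv1d(d, 2 * d, kernel_size, padding=kernel_size // 2)

    def forward(self, x):                  # x: (batch, seq_len, d)
        h = self.conv(x.transpose(1, 2))   # (batch, 2d, seq_len)
        a, b = h.chunk(2, dim=1)           # [A B] = conv(X)
        y = a * torch.sigmoid(b)           # Y = A ⊙ σ(B)
        return y.transpose(1, 2) + x       # residual connection between layers
```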
Like the encoding layer, the decoding layer is composed of multiple layers of convolutional neural networks. Each convolutional layer consists of a one-dimensional convolution and a non-linearity, and the layers are connected through residual connections.
The hidden-layer output vectors of the encoder and the hidden-layer output vectors of the decoder are combined to compute the attention weights, from which the text vector is obtained. The computation of the text vector c_i^l at the i-th time step of layer l can be expressed as:
d_i^l = W_d^l h_i^l + b_d^l + e_i
a_ij^l = exp(d_i^l · z_j^u) / Σ_t exp(d_i^l · z_t^u)
c_i^l = Σ_j a_ij^l (z_j^u + p_j)
where h_i^l is the hidden-layer output of the decoding layer at the i-th time step of layer l, W_d^l and b_d^l are its corresponding weight and bias, e_i is the target word vector of the previous time step, z_j^u is the hidden-layer output of the encoding layer at time step j of layer u, and p_j is the position vector.
The predicted word is used as the input of the next time step, and the predicted word of the next time step is computed, until the prediction of the sentence after grammatical error correction is complete. Fig. 2 shows the structure of the convolutional-neural-network-based sequence-to-sequence text generation network. The prediction of the next word can be expressed as:
p(y_{i+1} | y_1, ..., y_i) = softmax(W_o h_i + b_o)
where p is the probability of the next-word prediction, W_o and b_o are the output weight and bias, and y_i is the word at time step i.
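A sketch of a single attention and prediction step corresponding to the formulas above (PyTorch; the helper name decode_step and the tensor shapes are illustrative assumptions, with W_d and W_o standing for nn.Linear(d, d) and nn.Linear(d, vocab_size) modules):

```python
import torch
import torch.nn.functional as F

def decode_step(h_dec, e_prev, z_enc, x_enc, W_d, W_o):
    """One attention + next-word prediction step.
    h_dec : (batch, d)          decoder hidden output h_i^l
    e_prev: (batch, d)          previous target word vector e_i
    z_enc : (batch, src_len, d) encoder hidden outputs z_j^u
    x_enc : (batch, src_len, d) encoder hidden outputs plus position vectors, z_j^u + p_j
    """
    d_i = W_d(h_dec) + e_prev                         # d_i^l = W_d^l h_i^l + b_d^l + e_i
    scores = torch.einsum("bd,bsd->bs", d_i, z_enc)   # d_i^l . z_j^u
    a = F.softmax(scores, dim=-1)                     # attention weights a_ij^l
    c_i = torch.einsum("bs,bsd->bd", a, x_enc)        # c_i^l = sum_j a_ij^l (z_j^u + p_j)
    logits = W_o(h_dec + c_i)                         # output projection W_o(...) + b_o
    return F.softmax(logits, dim=-1)                  # probability of the next word
```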
Step S2: optimize the generator network. First, using the mapping dictionary from words to word numbers of the discriminator network, each word in the text is mapped to its corresponding word number. Then, using the word-vector table of the discriminator network, each word in the sentence containing grammatical errors and in the sentence corrected by the generator network is mapped to its corresponding word vector, and the vectorized words are connected into the text matrix containing grammatical errors and the text matrix of the generator-network correction respectively. The discriminator network is a two-class classification network that judges the source of a corrected sentence. The text matrix containing grammatical errors and the text matrix of the generator-network correction first form a sentence pair [src, tgt_p], where src and tgt_p are the text matrix containing grammatical errors and the text matrix of the generator-network correction respectively. Each is passed through a recurrent neural network or a convolutional neural network to produce a text representation vector; the text representation vectors are then processed to obtain the probability with which the discriminator network judges the input corrected sentence to come from the manual annotation or from the generator network. The probability computation can be expressed as:
v_s = M_s(W_s·src + b_s)
v_tp = M_t(W_t·tgt_p + b_t)
v_d = [v_s, v_tp]
p(l_tgt | src, tgt_p) = softmax(W_d·v_d + b_d)
where M_s and M_t denote the neural networks through which the text matrix containing grammatical errors and the text matrix of the generator-network correction are passed, W_s and b_s are the weight and bias for computing the representation vector of the text containing grammatical errors, W_t and b_t are the weight and bias for computing the representation vector of the corrected text, v_s and v_tp are the representation vectors of the text containing grammatical errors and of the corrected text respectively, and p is the probability of the label l_tgt.
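A sketch of such a two-class discriminator (PyTorch; a GRU is chosen here as one of the recurrent options the description allows, and the hidden sizes and the class-index convention are illustrative assumptions):

```python
import torch
import torch.nn as nn

class Discriminator(nn.Module):
    """Judges whether a corrected sentence comes from manual annotation or from the generator.
    Class index convention (an assumption for this sketch): 0 = generated, 1 = manually annotated.
    """
    def __init__(self, vocab_size=50000, d=256, hidden=256):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, d)
        self.enc_src = nn.GRU(d, hidden, batch_first=True)   # M_s: encodes src
        self.enc_tgt = nn.GRU(d, hidden, batch_first=True)   # M_t: encodes tgt_p
        self.out = nn.Linear(2 * hidden, 2)                  # softmax(W_d v_d + b_d)

    def forward(self, src_ids, tgt_ids):
        _, v_s = self.enc_src(self.emb(src_ids))              # v_s: representation of src
        _, v_tp = self.enc_tgt(self.emb(tgt_ids))             # v_tp: representation of tgt_p
        v_d = torch.cat([v_s[-1], v_tp[-1]], dim=-1)          # v_d = [v_s, v_tp]
        return torch.softmax(self.out(v_d), dim=-1)           # p(l_tgt | src, tgt_p)
```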
The generator loss function can be expressed as:
loss_G = log(1 - D(G(z)))
where D and G denote the discriminator network and the generator network respectively, and z denotes the input of the generator network.
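A sketch of the corresponding generator update (the non-saturating -log D(G(z)) form is used here as a common stand-in for log(1 - D(G(z))), and the optimizer is an assumption; note that backpropagating through discrete generated words would in practice require a policy-gradient or Gumbel-softmax technique, which this sketch glosses over):

```python
import torch

def update_generator(generator, discriminator, g_optimizer, src_ids):
    """Optimize the generator so that its corrections look manually annotated."""
    corrected_ids = generator(src_ids)                      # G(z): generated corrected sentence
    p_manual = discriminator(src_ids, corrected_ids)[:, 1]  # D(src, G(z)): prob. of "manual" class
    # Non-saturating surrogate for log(1 - D(G(z))); the small constant avoids log(0).
    # NOTE: with discrete word ids no gradient reaches the generator parameters here;
    # a real implementation would need a policy-gradient or soft-relaxation trick.
    loss_g = -torch.log(p_manual + 1e-8).mean()
    g_optimizer.zero_grad()
    loss_g.backward()
    g_optimizer.step()
    return loss_g.item()
```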
Step S3: the discriminator network judges the source of the corrected sentence. First, using the mapping dictionary from words to word numbers of the discriminator network, each word in the text is mapped to its corresponding word number. Then, using the word-vector table of the discriminator network, each word in the sentence containing grammatical errors, in the manually annotated corrected sentence, and in the sentence corrected by the generator network is mapped to its corresponding word vector, and the vectorized words are connected into the text matrix containing grammatical errors, the text matrix of the manually annotated correction, and the text matrix of the generator-network correction respectively. The text matrix containing grammatical errors and the text matrix of the manually annotated correction form the sentence pair [src, tgt_g], and the text matrix containing grammatical errors and the text matrix of the generator-network correction form the sentence pair [src, tgt_p], where src, tgt_g and tgt_p are the text matrix containing grammatical errors, the text matrix of the manually annotated correction, and the text matrix of the generator-network correction respectively. The text matrix containing grammatical errors and each correction text matrix are passed through a recurrent neural network or a convolutional neural network to produce text representation vectors, which are then processed to obtain the probability with which the discriminator network judges the input corrected sentence to come from the manual annotation or from the generator network. The probability computation is the same as in step S2.
Step S4: optimize the discriminator network. Step S3 yields the probability with which the discriminator network judges an input corrected sentence to come from the manual annotation or from the generator network; the loss function is computed from this probability and the discriminator network is optimized. The loss function can be expressed as:
loss_D = -[log D(x) + log(1 - D(G(z)))]
where D(x) denotes the discriminator's judgement on real data, namely the manually annotated corrected sentence, and G(z) denotes the corrected sentence produced by the generator network.
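A sketch of the corresponding discriminator update, reusing the Discriminator sketch above (the optimizer and the small constant inside the logarithms are illustrative assumptions):

```python
import torch

def update_discriminator(discriminator, d_optimizer, src_ids, gold_ids, fake_ids):
    """Optimize the discriminator to separate manual corrections from generated ones."""
    p_gold = discriminator(src_ids, gold_ids)[:, 1]    # D(x): manually annotated pair
    p_fake = discriminator(src_ids, fake_ids)[:, 1]    # D(G(z)): generator-corrected pair
    # loss_D = -[log D(x) + log(1 - D(G(z)))], averaged over the batch
    loss_d = -(torch.log(p_gold + 1e-8) + torch.log(1.0 - p_fake + 1e-8)).mean()
    d_optimizer.zero_grad()
    loss_d.backward()
    d_optimizer.step()
    return loss_d.item()
```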
Step S5: iterative optimization. Return to step S1 and continue to iteratively optimize the generator network and the discriminator network; when the loss functions no longer decrease and remain stable, stop the optimization and obtain the trained generator model.
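Putting the pieces together, a sketch of the alternating optimization of steps S1-S5, reusing the update_discriminator and update_generator sketches above (the data iterator, epoch count and stopping criterion are illustrative assumptions):

```python
def train_gan(generator, discriminator, g_optimizer, d_optimizer, data_loader, epochs=10):
    """Alternately optimize the generator and the discriminator (steps S1-S5)."""
    for epoch in range(epochs):
        for src_ids, gold_ids in data_loader:   # erroneous sentence, manual correction
            # Step S1: the generator produces a corrected sentence
            fake_ids = generator(src_ids)
            # Steps S3-S4: optimize the discriminator on manual vs. generated corrections
            d_loss = update_discriminator(discriminator, d_optimizer,
                                          src_ids, gold_ids, fake_ids)
            # Step S2: optimize the generator against the current discriminator
            g_loss = update_generator(generator, discriminator, g_optimizer, src_ids)
        # Step S5: in practice, stop once d_loss and g_loss no longer decrease
        print(f"epoch {epoch}: d_loss={d_loss:.4f}  g_loss={g_loss:.4f}")
```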
The specific embodiments of the proposed Chinese grammatical error correction method based on a generative adversarial network and of its modules have been described above with reference to the accompanying drawings. From the description of the above embodiments, those of ordinary skill in the art can clearly understand that the present invention can be implemented by means of software plus the necessary general-purpose hardware platform.
According to the idea of the present invention, there may be changes in the specific implementation and the scope of application. In conclusion, the contents of this description should not be construed as limiting the invention.
The embodiments described above do not limit the protection scope of the invention. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the invention shall be included in the protection scope of the present invention.

Claims (10)

1. A Chinese grammatical error correction method based on a generative adversarial network, characterized in that the method comprises the following steps:
(1) the generator network generates a corrected sentence: the sentence containing grammatical errors is processed, the generator network is used to capture the information of each word in the sentence and its contextual information, and the sentence with the grammatical errors corrected is generated;
(2) the generator network is optimized: the sentence containing grammatical errors and the corrected sentence obtained in step (1) are input into the discriminator network, and the generator network is optimized using the loss function;
(3) the discriminator network judges the source of the corrected sentence: the sentence containing grammatical errors, the manually annotated corrected sentence and the corrected sentence obtained in step (1) are processed, and the probability that a corrected sentence comes from the manual annotation or from the generator network is computed;
(4) the discriminator network is optimized: the probability obtained in step (3) that the corrected sentence comes from the manual annotation or from the generator network is used to compute the loss function, and the discriminator network is optimized;
(5) iterative optimization: return to step (1) and continue to optimize the generator network and the discriminator network.
2. The method according to claim 1, characterized in that step (1) specifically comprises:
(1.1) word-vector representation of the generator-network text: using the word-vector tables of the encoding layer and the decoding layer of the generator network, each word in the sentence containing grammatical errors is mapped to its original word vector; each original word vector is added to its corresponding position vector to form an input word vector; the input word vectors of the words are connected into the text matrix input to the encoding layer and the text matrix input to the decoding layer;
(1.2) generating the corrected sentence: the text matrices for the encoding-layer input and the decoding-layer input obtained in step (1.1) are input into the generator network, which captures the word information and contextual information and generates the sentence after grammatical error correction.
3. The method according to claim 1, characterized in that the generator network of step (1) is a sequence-to-sequence text generation network based on convolutional neural networks.
4. The method according to claim 2, characterized in that the vectorized representation of step (1.1) is a word-level vectorized representation or a character-level vectorized representation.
5. The method according to claim 2, characterized in that step (1.2) specifically comprises:
(1.2.1) the encoding-layer text matrix obtained in step (1.1) is input into the encoding layer of the generator network to obtain the encoding-layer hidden-layer output vectors;
(1.2.2) the decoding-layer text matrix obtained in step (1.1) is input into the decoding layer of the generator network to obtain the decoding-layer hidden-layer output vectors;
(1.2.3) the attention mechanism operation is performed on the encoding-layer hidden-layer output vectors obtained in step (1.2.1) and the decoding-layer hidden-layer output vectors obtained in step (1.2.2) to obtain the attention weights;
(1.2.4) a weighted sum of the encoding-layer hidden-layer output vectors obtained in step (1.2.1), weighted by the attention weights obtained in step (1.2.3), is computed to obtain the text vector;
(1.2.5) the text vector obtained in step (1.2.4) and the decoding-layer hidden-layer output vector obtained in step (1.2.2) are processed to obtain the predicted word of the next time step;
(1.2.6) the predicted word obtained in step (1.2.5) is used as the new decoding-layer input and the procedure returns to step (1.2.1), until the end-of-sentence mark is output, obtaining the sentence after grammatical error correction.
6. The method according to claim 5, characterized in that the encoding layer of the generator network in step (1.2.1) and the decoding layer of the generator network in step (1.2.2) are multi-layer convolutional networks.
7. The method according to claim 1, characterized in that step (2) specifically comprises:
(2.1) word-vector representation of the discriminator-network text: using the word-vector table of the discriminator network, each word in the sentence containing grammatical errors and in the corrected sentence obtained in step (1) is mapped to its corresponding word vector, and the vectorized words are connected into the text matrix containing grammatical errors and the text matrix of the generator-network correction respectively;
(2.2) optimizing the generator network using the loss function: the text matrix containing grammatical errors and the text matrix of the generator-network correction obtained in step (2.1) form a sentence pair, the sentence pair is input into the discriminator network, the loss function is computed, and the generator network is optimized.
8. The method according to claim 7, characterized in that the vectorized representation of step (2.1) is a word-level vectorized representation or a character-level vectorized representation.
9. The method according to claim 1, characterized in that step (3) specifically comprises:
(3.1) word-vector representation of the discriminator-network text: using the word-vector table of the discriminator network, each word in the sentence containing grammatical errors, in the manually annotated corrected sentence, and in the corrected sentence obtained in step (1) is mapped to its corresponding word vector; the vectorized words of each sentence are connected separately to obtain the text matrix containing grammatical errors, the text matrix of the manually annotated correction, and the text matrix of the generator-network correction;
(3.2) judging the source of the corrected sentence: the text matrix containing grammatical errors and the text matrix of the manually annotated correction obtained in step (3.1) form one sentence pair, the text matrix containing grammatical errors and the text matrix of the generator-network correction obtained in step (3.1) form another sentence pair, the sentence pairs are fed into the discriminator network, and the probability that a corrected sentence comes from the manual annotation or from the generator network is computed.
10. The method according to claim 1, characterized in that the discriminator network of steps (2), (3) and (4) is a classification network based on a convolutional neural network or a recurrent neural network.
CN201910606372.1A 2019-07-05 2019-07-05 Chinese grammatical error correction method based on a generative adversarial network Pending CN110309512A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910606372.1A CN110309512A (en) 2019-07-05 2019-07-05 Chinese grammatical error correction method based on a generative adversarial network

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910606372.1A CN110309512A (en) 2019-07-05 2019-07-05 Chinese grammatical error correction method based on a generative adversarial network

Publications (1)

Publication Number Publication Date
CN110309512A true CN110309512A (en) 2019-10-08

Family

ID=68078446

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910606372.1A Pending CN110309512A (en) Chinese grammatical error correction method based on a generative adversarial network

Country Status (1)

Country Link
CN (1) CN110309512A (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106354710A (en) * 2016-08-18 2017-01-25 清华大学 Neural network relation extracting method
CN107368475A (en) * 2017-07-18 2017-11-21 中译语通科技(北京)有限公司 A kind of machine translation method and system based on generation confrontation neutral net
WO2019024050A1 (en) * 2017-08-03 2019-02-07 Lingochamp Information Technology (Shanghai) Co., Ltd. Deep context-based grammatical error correction using artificial neural networks
CN108897740A (en) * 2018-05-07 2018-11-27 内蒙古工业大学 A kind of illiteracy Chinese machine translation method based on confrontation neural network
CN108984525A (en) * 2018-07-06 2018-12-11 北京邮电大学 A kind of Chinese grammer error-detecting method based on the term vector that text information is added
CN109657251A (en) * 2018-12-17 2019-04-19 北京百度网讯科技有限公司 Method and apparatus for translating sentence
CN109948152A (en) * 2019-03-06 2019-06-28 北京工商大学 A kind of Chinese text grammer error correcting model method based on LSTM

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HOOGLE: "Attention Mechanisms in Neural Networks" (神经网络中的注意力机制), Zhihu (《知乎 知识蒸馏》) *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111008277A (en) * 2019-10-30 2020-04-14 创意信息技术股份有限公司 Automatic text summarization method
CN111126059A (en) * 2019-12-24 2020-05-08 上海风秩科技有限公司 Method and device for generating short text and readable storage medium
CN111767718A (en) * 2020-07-03 2020-10-13 北京邮电大学 Chinese grammar error correction method based on weakened grammar error feature representation
CN111767718B (en) * 2020-07-03 2021-12-07 北京邮电大学 Chinese grammar error correction method based on weakened grammar error feature representation
CN111985219A (en) * 2020-07-30 2020-11-24 哈尔滨工业大学 Text grammar error correction method fusing monolingual data
CN111985218A (en) * 2020-07-30 2020-11-24 哈尔滨工业大学 Automatic judicial document proofreading method based on a generative adversarial network
CN112364631A (en) * 2020-09-21 2021-02-12 山东财经大学 Chinese grammar error detection method and system based on hierarchical multitask learning
CN112364631B (en) * 2020-09-21 2022-08-02 山东财经大学 Chinese grammar error detection method and system based on hierarchical multitask learning
CN113743110A (en) * 2021-11-08 2021-12-03 京华信息科技股份有限公司 Missing-word detection method and system based on a fine-tuned generative adversarial network model

Similar Documents

Publication Publication Date Title
CN110309512A (en) Chinese grammatical error correction method based on a generative adversarial network
CN109062907B (en) Neural machine translation method integrating dependency relationship
CN110765966B (en) One-stage automatic recognition and translation method for handwritten characters
CN109190131B (en) Neural machine translation-based joint prediction method for English words and their case
CN109635124B (en) Remote supervision relation extraction method combined with background knowledge
CN108052512B (en) Image description generation method based on depth attention mechanism
CN106202153B (en) Spelling error correction method and system for an ES search engine
CN109492202A (en) Chinese error correction encoding and decoding model based on pinyin
CN109933808B (en) Neural machine translation method based on dynamic configuration decoding
CN106547735A (en) Construction and use of context-aware dynamic word or character vectors based on deep learning
CN111767718B (en) Chinese grammar error correction method based on weakened grammar error feature representation
CN108897740A (en) Mongolian-Chinese machine translation method based on an adversarial neural network
CN107836000A (en) Improved artificial neural networks for language modelling and prediction
CN115879546A (en) Method and system for constructing composite neural network psychology medicine knowledge map
CN109598002A (en) Neural machine translation method and system based on a bidirectional recurrent neural network
CN113157919A (en) Sentence text aspect level emotion classification method and system
Zhang et al. Learning sentiment-inherent word embedding for word-level and sentence-level sentiment analysis
CN110543566A (en) Intention classification method based on self-attention neighbor relation coding
CN108959260A (en) Chinese grammatical error detection method based on textual word vectors
CN117151084B (en) Chinese spelling and grammar error correction method, storage medium and equipment
WO2020040255A1 (en) Word coding device, analysis device, language model learning device, method, and program
CN110175330A (en) Named entity recognition method based on an attention mechanism
CN114169447A (en) Event detection method based on self-attention convolution bidirectional gating cyclic unit network
CN110888944B (en) Attention convolutional neural network entity relation extraction method based on multi-convolutional window size
CN108762523A (en) Input-method output character prediction method based on capsule networks

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20191008