
CN106354710A - Neural network relation extracting method - Google Patents

Neural network relation extracting method

Info

Publication number
CN106354710A
Authority
CN
China
Prior art keywords
sentence
vector
entity
pair
relation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610685532.2A
Other languages
Chinese (zh)
Inventor
孙茂松
林衍凯
刘知远
栾焕博
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201610685532.2A priority Critical patent/CN106354710A/en
Publication of CN106354710A publication Critical patent/CN106354710A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F40/00Handling natural language data
    • G06F40/20Natural language analysis
    • G06F40/279Recognition of textual entities
    • G06F40/289Phrasal analysis, e.g. finite state techniques or chunking
    • G06F40/295Named entity recognition
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/045Combinations of networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/08Learning methods

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • General Physics & Mathematics (AREA)
  • Computational Linguistics (AREA)
  • General Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • Evolutionary Computation (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Biophysics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Audiology, Speech & Language Pathology (AREA)
  • Machine Translation (AREA)

Abstract

The invention discloses a neural network relation extraction method based on a sentence-level attention mechanism. The method comprises the following specific steps: for each sentence and its associated pair of entities, building the sentence vector representation of the entity pair with a convolutional neural network; selecting, with a configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair, and obtaining the comprehensive sentence vector representation of the entity pair; and predicting the relation between the entity pair from the comprehensive sentence vector representation. In this way, the method reduces the interference of noise in distantly supervised data during neural network relation extraction, takes the information of different sentences into account simultaneously, improves model stability, and has good practicality.

Description

A neural network relation extraction method
Technical field
The present invention relates to the field of natural language processing, and in particular to a neural network relation extraction method.
Background art
With the rapid development of society, we have entered an era of information explosion in which vast amounts of new data are produced every day. As currently the most convenient information-acquisition platform, the Internet has made users' demand for filtering and summarizing useful information increasingly urgent, and how to extract useful information from massive data has become a difficult problem. To solve this problem, the concept of the knowledge graph was proposed. A knowledge graph represents the specific terms and things in massive data as entities — specific terms being persons, names, titles, organizations, and so on — and represents the internal connections between entities as relations, with the aim of expressing massive data as relational triples that use relations as bridges between entities. For example, the statement "Beijing is the capital of China" is expressed in a knowledge graph as the triple: (Beijing, is the capital of, China).
Although existing knowledge graphs already contain hundreds of millions of facts, compared with the endless stream of data they are still far from complete. Many techniques have been employed to continuously and automatically enrich knowledge graphs. One of them is relation extraction, a technique that can automatically extract structured data from natural-language text.
At present, relation extraction usually adopts techniques based on supervised learning, which require a large amount of manually annotated training data and are very time- and labor-consuming. To address this problem, relation extraction based on distant supervision was proposed: it automatically generates training data by aligning plain text with an existing knowledge base. Distant supervision assumes that if two entities appear together in a sentence, the sentence expresses the relation between them; under this assumption, all sentences in the corpus containing the two entities from the configured knowledge base can be used as training data for the relation between those two entities. However, the training data produced by distant supervision has a serious problem: it is very noisy, because not every sentence containing the two entities actually reflects the relation between them. To reduce the noise of the training data, probabilistic graphical models have been applied when generating it, solving for the relation by jointly modeling the sentences and the two entities.
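The distant-supervision labeling described above can be illustrated with a minimal sketch, assuming a toy knowledge base of (head, tail) → relation entries and whitespace-tokenized sentences; the corpus, triple, and helper name are illustrative, not from the patent:

```python
# Distant supervision: every sentence mentioning both entities of a KB
# triple is (noisily) labeled with that triple's relation.
kb = {("Beijing", "China"): "capital_of"}  # toy knowledge base

corpus = [
    "Beijing is the capital of China .",
    "Beijing hosted a summit attended by China 's leaders .",  # noisy match
]

def distant_label(sentences, kb):
    """Pair each sentence with a relation label whenever it contains
    both entities of some knowledge-base triple."""
    labeled = []
    for sent in sentences:
        tokens = sent.split()
        for (head, tail), relation in kb.items():
            if head in tokens and tail in tokens:
                labeled.append((sent, head, tail, relation))
    return labeled

data = distant_label(corpus, kb)
# Both sentences get labeled "capital_of", although only the first one
# actually expresses the relation -- this is exactly the noise the
# sentence-level attention mechanism is designed to down-weight.
```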
Furthermore, non-neural relation extraction methods rely on natural-language-processing tools to extract relational features, which inevitably introduces additional noise. Neural network models have now been adopted for relation extraction, but they still lack an effective way to handle the noise produced by distantly supervised data. How to apply neural network methods to distantly supervised data is therefore a highly important problem.
Summary of the invention
In view of this, embodiments of the present invention provide a neural network relation extraction method that handles the noise in distantly supervised data and improves relation extraction performance.
To achieve the above objective, the present invention is realized as follows:
A neural network relation extraction method, the method comprising:
for each sentence and its associated pair of entities, building the sentence vector representation of the entity pair with a convolutional neural network;
selecting, with a configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair, and obtaining the comprehensive sentence vector representation of the entity pair;
predicting the relation between the entity pair from the comprehensive sentence vector representation of the entity pair, obtaining a predicted value.
Preferably, building the sentence vector representation of the entity pair comprises:
building word vectors for the sentence with respect to the entity pair, each word vector comprising the concatenation of a word-sense vector and position vectors;
converting the word vectors into the sentence vector representation through convolution, pooling, and nonlinear operations.
Preferably, converting the word vectors into the sentence vector representation through convolution, pooling, and nonlinear operations comprises:
First, performing a convolution operation between the word vector sequence w and the convolution matrix W; the convolution operation extracts local features with a sliding window of length l. Define q_i as the concatenation of all word vectors inside the i-th window:
q_i = w_{i-l+1:i}  (1 ≤ i ≤ m + l − 1);
Second, treating the word vectors of all positions beyond the sentence boundary as zero vectors;
Then, defining the i-th dimension of the feature obtained by convolution as: p_i = [W q + b]_i, where b is a bias vector;
Next, obtaining the i-th dimension of the sentence vector representation by pooling: [x]_i = max(p_i); or, with piecewise pooling, splitting each feature p_i produced by convolution into three segments (p_i1, p_i2, p_i3) by the head entity and the tail entity, and pooling each segment separately: [x]_ij = max(p_ij);
Finally, defining [x]_i as the concatenation of the [x]_ij, and applying a nonlinearity to the vector x to obtain the final sentence vector representation.
Preferably, selecting, with the configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair and obtaining the comprehensive sentence vector representation of the entity pair comprises:
defining the combined vector representation of all sentences;
defining the weight of each sentence vector by averaging;
defining the weight of each sentence vector representation with the configured sentence-level attention mechanism.
Preferably, defining the combined vector representation of all sentences comprises:
assuming the representation vectors of all sentences are x_1, x_2, …, x_n, defining the combined sentence vector as the weighted sum of all sentence representation vectors:
s = Σ_i α_i x_i,
where α_i is the weight of each sentence vector x_i.
Preferably, defining the weight of each sentence vector by averaging comprises:
assuming every sentence vector representation contributes equally to the final comprehensive sentence vector of the entity pair, the comprehensive sentence vector representation of the entity pair being the mean of the sentence vector representations, as in the following formula:
s = Σ_i (1/n) x_i.
Preferably, defining the weight of each sentence vector representation with the configured sentence-level attention mechanism comprises:
measuring, with a selected bilinear function, the degree of correlation between the vector representation x_i of each sentence and the relation r to be predicted between the entity pair, as in the following formula:
e_i = x_i A r;
where A is a diagonal parameter matrix and r is the vector representation of the query's target relation r; the weight of each sentence vector representation defined by the selective attention mechanism is:
α_i = exp(e_i) / Σ_k exp(e_k).
Preferably, predicting the relation between the entity pair from the comprehensive sentence vector representation of the entity pair and obtaining the predicted value comprises:
First, defining the probability of the finally predicted relation as:
p(r | S, θ) = exp(o_r) / Σ_{k=1}^{n_r} exp(o_k)
where n_r is the number of relation types and o is the final output of the neural network, defined as:
o = M s + d
where d is a bias vector and M is the matrix of all relation vector representations;
Second, learning and updating all parameters with stochastic gradient descent, comprising: learning all parameters by maximizing the log-likelihood objective
J(θ) = Σ_{i=1}^{|S|} log p(r_i | S_i, θ)
where |S| is the number of training sentence sets and θ denotes all model parameters; parameter optimization is performed with the stochastic gradient descent algorithm.
Finally, preventing overfitting during training with dropout, comprising: with the dropout mechanism, further defining the final output o as o = M(s ∘ h) + d, where h is a vector whose dimensions follow a Bernoulli distribution with probability p; at prediction time, each final sentence representation vector is multiplied by p to obtain ŝ = p s.
As can be seen from the above scheme, embodiments of the present invention propose a neural relation extraction method based on a sentence-level selective attention mechanism, specifically: for each sentence and its associated pair of entities, building the sentence vector representation of the entity pair with a convolutional neural network; selecting, with the configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair, and obtaining the comprehensive sentence vector representation of the entity pair; and predicting the relation between the entity pair from the comprehensive sentence vector representation. In this way, embodiments of the invention not only reduce the interference of noise in distantly supervised data during neural network relation extraction, but also take the information of different sentences into account simultaneously, improving model stability and offering good practicality.
Brief description of the drawings
Fig. 1 is a flowchart of the neural network relation extraction method provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram, provided by an embodiment of the present invention, of obtaining the comprehensive sentence vector representation of the entity pair after selecting, with the configured sentence-level attention mechanism, the sentence vector representations expressing the relation between the entity pair;
Fig. 3 is a schematic diagram of a specific example of the method of Fig. 1 provided by an embodiment of the present invention.
Detailed description of the embodiments
To make the objectives, technical solutions, and advantages of the present invention clearer, the invention is described in further detail below with reference to the drawings and embodiments.
Fig. 1 is a flowchart of the neural network relation extraction method provided by an embodiment of the present invention; its specific steps are:
Step 101: for each sentence and its associated pair of entities, build the sentence vector representation of the entity pair with a convolutional neural network;
Step 102: select, with the configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair, obtaining the comprehensive sentence vector representation of the entity pair;
Step 103: predict the relation between the entity pair from the comprehensive sentence vector representation of the entity pair, obtaining a predicted value.
In the method, the specific process of step 101 is:
Step 1011: represent the input words of the sentence, obtaining word vectors;
In this step, the input to the convolutional neural network is all words of the sentence. All words in the sentence are first converted into continuous vector representations; here, each input word is mapped to a vector in a word-embedding matrix.
Further, this step distinguishes the positions of the two entities with position vectors. The word embedding identifies the syntactic and semantic information of each word and is learned with the word2vec model; the position vector represents the positional information of the entities and is defined as the vector representation of each word's relative offsets to the head entity and the tail entity. The final word vector is defined as the concatenation of the word-sense vector learned by word2vec and the position vectors.
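A minimal numpy sketch of the input representation of step 1011: each word vector is the concatenation of a word embedding and two position embeddings for the offsets to the head and tail entity. The dimensions, toy vocabulary, and random initialization are illustrative assumptions; in the patent the word part would be pre-trained with word2vec:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab = {"Beijing": 0, "is": 1, "the": 2, "capital": 3, "of": 4, "China": 5}
d_word, d_pos, max_offset = 8, 2, 10

word_emb = rng.normal(size=(len(vocab), d_word))        # word2vec-style lookup table
pos_emb = rng.normal(size=(2 * max_offset + 1, d_pos))  # one row per clipped offset

def encode(tokens, head_idx, tail_idx):
    """Each word vector = [word embedding ; offset-to-head ; offset-to-tail]."""
    rows = []
    for i, tok in enumerate(tokens):
        off_h = np.clip(i - head_idx, -max_offset, max_offset) + max_offset
        off_t = np.clip(i - tail_idx, -max_offset, max_offset) + max_offset
        rows.append(np.concatenate([word_emb[vocab[tok]],
                                    pos_emb[off_h], pos_emb[off_t]]))
    return np.stack(rows)   # shape: (sentence_length, d_word + 2 * d_pos)

W = encode(["Beijing", "is", "the", "capital", "of", "China"], 0, 5)
```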
Step 1012: convert the input word representations into the sentence vector representation through convolution, pooling, and nonlinear operations;
In this step, the convolution operation is defined between the word vector sequence w and the convolution matrix W; it extracts local features with a sliding window of length l. First define q_i as the concatenation of all word vectors inside the i-th window:
q_i = w_{i-l+1:i}  (1 ≤ i ≤ m + l − 1)
Because the window may slide beyond the sentence boundary, blank words are padded at the borders of the sentence; that is, the word vectors of all positions beyond the boundary are treated as zero vectors. The i-th dimension of the feature obtained by convolution is then defined as:
p_i = [W q + b]_i
where b is a bias vector;
Further, the i-th dimension of the final sentence representation is obtained by pooling:
[x]_i = max(p_i)
In addition, taking the positions of the two entities into account in the relation extraction task, pooling can be further refined into piecewise pooling: each feature p_i produced by convolution is split by the head entity and the tail entity into three segments (p_i1, p_i2, p_i3), and each segment is pooled separately:
[x]_ij = max(p_ij)
Then [x]_i is defined as the concatenation of the [x]_ij.
At the end of this step, a nonlinearity such as the tanh function is applied to the vector x, yielding the final sentence vector representation.
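Step 1012 can be sketched as follows, assuming random toy weights; the zero padding, piecewise max pooling, and tanh follow the description above, while splitting the convolution windows at the entity token positions is a simplification of the patent's segmentation:

```python
import numpy as np

rng = np.random.default_rng(1)
m, d, l, n_filters = 6, 12, 3, 4          # sentence length, input dim, window, filters
X = rng.normal(size=(m, d))               # word vectors from step 1011
Wc = rng.normal(size=(n_filters, l * d))  # convolution matrix W
b = np.zeros(n_filters)                   # bias vector b

# Zero-pad so windows sliding past the sentence boundary see zero vectors.
pad = np.zeros((l - 1, d))
Xp = np.vstack([pad, X, pad])
# Column i of P holds [W q_i + b], q_i being window i's concatenated word vectors.
P = np.stack([Wc @ Xp[i:i + l].reshape(-1) + b for i in range(m + l - 1)], axis=1)

# Piecewise max pooling: split feature positions at the head (0) and tail (m-1)
# entity, pool each segment, then concatenate and apply tanh.
head, tail = 0, m - 1
segments = [P[:, :head + 1], P[:, head + 1:tail + 1], P[:, tail + 1:]]
pooled = [seg.max(axis=1) for seg in segments if seg.size]
x = np.tanh(np.concatenate(pooled))       # final sentence vector representation
```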
In the method, step 102 specifically comprises:
Step 1021: define the combined vector representation of all sentences;
Intuitively, assume the representation vectors of all sentences are x_1, x_2, …, x_n; the combined sentence vector is defined as the weighted sum of all sentence representation vectors:
s = Σ_i α_i x_i,
where α_i is the weight of each sentence vector x_i;
Step 1022: define the weight of each sentence vector by averaging;
In this step, assume every sentence vector representation contributes equally to the final comprehensive sentence vector of the entity pair; the comprehensive sentence vector representation of the entity pair is then the mean of the sentence vector representations, as in the following formula:
s = Σ_i (1/n) x_i;
Step 1023: define the weight of each sentence vector representation with the configured sentence-level attention mechanism;
In this step, a query-based function measures the degree of correlation between the vector representation x_i of each sentence and the relation r to be predicted between the entity pair. Here a bilinear function is selected:
e_i = x_i A r;
where A is a diagonal parameter matrix and r is the vector representation of the query's target relation r. The weight of each sentence vector representation can then be defined by the selective attention mechanism as:
α_i = exp(e_i) / Σ_k exp(e_k);
Because the information of the relation r to be predicted between the entity pair is taken into account, selecting the weight of each sentence vector with the configured sentence-level attention mechanism effectively reduces the final weight of noisy sentences.
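The selective attention of steps 1021–1023 can be sketched with numpy as follows, using random toy parameters; the max-shift inside the softmax is a standard numerical-stability detail not spelled out in the patent:

```python
import numpy as np

rng = np.random.default_rng(2)
n, d = 3, 12                      # sentences for one entity pair, vector dim
X = rng.normal(size=(n, d))       # sentence vectors x_1..x_n from step 101
A = np.diag(rng.normal(size=d))   # diagonal parameter matrix A
r = rng.normal(size=d)            # query vector of the target relation r

e = X @ A @ r                     # bilinear scores e_i = x_i A r
alpha = np.exp(e - e.max())       # softmax numerator (shifted for stability)
alpha /= alpha.sum()              # attention weights alpha_i, summing to 1
s = alpha @ X                     # comprehensive sentence vector s = sum_i alpha_i x_i
```

Noisy sentences whose vectors correlate poorly with the relation query r receive small e_i and hence small weights, which is exactly how the mechanism down-weights distant-supervision noise.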
In the method, the further steps of step 103 are:
Step 1031: define the probability of the finally predicted relation;
In this step, given the set of all sentences and the entity pair, the probability of each finally predicted relation is defined as:
p(r | S, θ) = exp(o_r) / Σ_{k=1}^{n_r} exp(o_k)
where n_r is the number of relation types and o is the final output of the neural network, defined as:
o = M s + d
where d is a bias vector and M is the matrix of all relation vector representations;
Step 1032: learn and update all parameters with stochastic gradient descent;
Specifically, all parameters are learned by maximizing the log-likelihood objective:
J(θ) = Σ_{i=1}^{|S|} log p(r_i | S_i, θ)
where |S| is the number of training sentence sets and θ denotes all model parameters; parameter optimization is performed with the stochastic gradient descent algorithm.
Step 1033: prevent overfitting during training with dropout;
In this step, to prevent overfitting during training, the existing dropout mechanism is employed, and the final output o is further defined as:
o = M(s ∘ h) + d
where h is a vector whose dimensions follow a Bernoulli distribution with probability p. At final prediction time, each final sentence representation vector is multiplied by p, i.e. ŝ = p s.
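Steps 1031–1033 can be sketched as follows, with random toy parameters and an illustrative gold-relation index; train-time masking and test-time scaling by p follow the dropout description above:

```python
import numpy as np

rng = np.random.default_rng(3)
n_r, d, p = 5, 12, 0.5            # relation types, vector dim, keep probability
M = rng.normal(size=(n_r, d))     # relation representation matrix M
dvec = np.zeros(n_r)              # bias vector d
s = rng.normal(size=d)            # comprehensive sentence vector from step 102

def predict_proba(s):
    o = M @ s + dvec              # o = M s + d
    z = np.exp(o - o.max())       # shifted softmax for numerical stability
    return z / z.sum()            # p(r | S, theta)

# Training-time dropout masks s with a Bernoulli vector h;
# at prediction time s is scaled by p instead.
h = rng.binomial(1, p, size=d)
probs_train = predict_proba(s * h)
probs_test = predict_proba(p * s)

# Per-example log-likelihood term of J(theta) for a gold relation r_i
# (index 2 here is illustrative).
r_i = 2
loglik = np.log(predict_proba(s)[r_i])
```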
Fig. 2 is a schematic diagram, provided by an embodiment of the present invention, of obtaining the comprehensive sentence vector representation of the entity pair after selecting, with the configured sentence-level attention mechanism, the sentence vector representations expressing the relation between the entity pair; in the figure, m_1, m_2, …, m_c are the sentence vector representations and r_1, r_2, …, r_c represent the relations between the entity pairs. After the two are associated, the selective attention mechanism weights each sentence vector representation, yielding the final r.
Fig. 3 is a schematic diagram of a specific example of the method of Fig. 1 provided by an embodiment of the present invention; as shown in the figure, from bottom to top, a pair of entities in a sentence is processed layer by layer according to the method of Fig. 1 and finally yields the predicted value of the relation between the entity pair.
As can be seen from the above scheme, embodiments of the present invention propose a neural relation extraction method based on a sentence-level selective attention mechanism, which not only reduces the interference of noise in distantly supervised data during neural network relation extraction, but also takes the information of different sentences into account simultaneously, improving model stability and offering good practicality.
The objectives, technical solutions, and advantages of the present invention have been further described by the preferred embodiments above. It should be understood that the foregoing is only a preferred embodiment of the present invention and is not intended to limit it; any modifications, equivalent substitutions, and improvements made within the spirit and principles of the present invention shall be included within its scope of protection.

Claims (8)

1. A neural network relation extraction method, characterized in that the method comprises:
for each sentence and its associated pair of entities, building the sentence vector representation of the entity pair with a convolutional neural network;
selecting, with a configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair, and obtaining the comprehensive sentence vector representation of the entity pair;
predicting the relation between the entity pair from the comprehensive sentence vector representation of the entity pair, obtaining a predicted value.
2. The method of claim 1, characterized in that building the sentence vector representation of the entity pair comprises:
building word vectors for the sentence with respect to the entity pair, each word vector comprising the concatenation of a word-sense vector and position vectors;
converting the word vectors into the sentence vector representation through convolution, pooling, and nonlinear operations.
3. The method of claim 1, characterized in that converting the word vectors into the sentence vector representation through convolution, pooling, and nonlinear operations comprises:
first, performing a convolution operation between the word vector sequence w and the convolution matrix W, the convolution operation extracting local features with a sliding window of length l, and defining q_i as the concatenation of all word vectors inside the i-th window:
q_i = w_{i-l+1:i}  (1 ≤ i ≤ m + l − 1);
second, treating the word vectors of all positions beyond the sentence boundary as zero vectors;
then, defining the i-th dimension of the feature obtained by convolution as: p_i = [W q + b]_i, where b is a bias vector;
next, obtaining the i-th dimension of the sentence vector representation by pooling: [x]_i = max(p_i); or, with piecewise pooling, splitting each feature p_i produced by convolution into three segments (p_i1, p_i2, p_i3) by the head entity and the tail entity, and pooling each segment separately: [x]_ij = max(p_ij);
finally, defining [x]_i as the concatenation of the [x]_ij, and applying a nonlinearity to the vector x to obtain the final sentence vector representation.
4. The method of claim 1, characterized in that selecting, with the configured sentence-level attention mechanism, the sentence vector representations that express the relation between the entity pair and obtaining the comprehensive sentence vector representation of the entity pair comprises:
defining the combined vector representation of all sentences;
defining the weight of each sentence vector by averaging;
defining the weight of each sentence vector representation with the configured sentence-level attention mechanism.
5. The method of claim 4, characterized in that defining the combined vector representation of all sentences comprises:
assuming the representation vectors of all sentences are x_1, x_2, …, x_n, defining the combined sentence vector as the weighted sum of all sentence representation vectors:
s = Σ_i α_i x_i,
where α_i is the weight of each sentence vector x_i.
6. The method of claim 4, characterized in that defining the weight of each sentence vector by averaging comprises:
assuming every sentence vector representation contributes equally to the final comprehensive sentence vector of the entity pair, the comprehensive sentence vector representation of the entity pair being the mean of the sentence vector representations, as in the following formula:
s = Σ_i (1/n) x_i.
7. The method of claim 4, characterized in that defining the weight of each sentence vector representation with the configured sentence-level attention mechanism comprises:
measuring, with a selected bilinear function, the degree of correlation between the vector representation x_i of each sentence and the relation r to be predicted between the entity pair, as in the following formula:
e_i = x_i A r;
where A is a diagonal parameter matrix and r is the vector representation of the query's target relation r; and defining the weight of each sentence vector representation by the selective attention mechanism as:
α_i = exp(e_i) / Σ_k exp(e_k).
8. The method of claim 1, characterized in that predicting the relation between the entity pair from the comprehensive sentence vector representation of the entity pair and obtaining the predicted value comprises:
first, defining the probability of the finally predicted relation as:
p(r | S, θ) = exp(o_r) / Σ_{k=1}^{n_r} exp(o_k)
where n_r is the number of relation types and o is the final output of the neural network, defined as:
o = M s + d
where d is a bias vector and M is the matrix of all relation vector representations;
second, learning and updating all parameters with stochastic gradient descent, comprising: learning all parameters by maximizing the log-likelihood objective
J(θ) = Σ_{i=1}^{|S|} log p(r_i | S_i, θ)
where |S| is the number of training sentence sets and θ denotes all model parameters, parameter optimization being performed with the stochastic gradient descent algorithm;
finally, preventing overfitting during training with dropout, comprising: with the dropout mechanism, further defining the final output o as o = M(s ∘ h) + d, where h is a vector whose dimensions follow a Bernoulli distribution with probability p, and multiplying each final sentence representation vector by p to obtain ŝ = p s.
CN201610685532.2A 2016-08-18 2016-08-18 Neural network relation extracting method Pending CN106354710A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610685532.2A CN106354710A (en) 2016-08-18 2016-08-18 Neural network relation extracting method


Publications (1)

Publication Number Publication Date
CN106354710A true CN106354710A (en) 2017-01-25

Family

ID=57843370

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610685532.2A Pending CN106354710A (en) 2016-08-18 2016-08-18 Neural network relation extracting method

Country Status (1)

Country Link
CN (1) CN106354710A (en)

Cited By (50)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106844738A (en) * 2017-02-14 2017-06-13 华南理工大学 The sorting technique of Junker relation between food materials based on neutral net
CN106970981A (en) * 2017-03-28 2017-07-21 北京大学 A kind of method that Relation extraction model is built based on transfer matrix
CN107133211A (en) * 2017-04-26 2017-09-05 中国人民大学 A kind of composition methods of marking based on notice mechanism
CN107194422A (en) * 2017-06-19 2017-09-22 中国人民解放军国防科学技术大学 A kind of convolutional neural networks relation sorting technique of the forward and reverse example of combination
CN107220237A (en) * 2017-05-24 2017-09-29 南京大学 A kind of method of business entity's Relation extraction based on convolutional neural networks
CN107239446A (en) * 2017-05-27 2017-10-10 中国矿业大学 A kind of intelligence relationship extracting method based on neutral net Yu notice mechanism
CN107256228A (en) * 2017-05-02 2017-10-17 清华大学 Answer selection system and method based on structuring notice mechanism
CN107273800A (en) * 2017-05-17 2017-10-20 大连理工大学 A kind of action identification method of the convolution recurrent neural network based on attention mechanism
CN107273349A (en) * 2017-05-09 2017-10-20 清华大学 A kind of entity relation extraction method and server based on multilingual
CN107392229A (en) * 2017-06-21 2017-11-24 清华大学 A kind of network representation method based on the Relation extraction that most gears to the needs of the society
CN107766506A (en) * 2017-10-20 2018-03-06 哈尔滨工业大学 Multi-turn dialogue model construction method based on a hierarchical attention mechanism
CN107944559A (en) * 2017-11-24 2018-04-20 国家计算机网络与信息安全管理中心 A kind of entity relationship automatic identifying method and system
CN108304911A (en) * 2018-01-09 2018-07-20 中国科学院自动化研究所 Knowledge extraction method, system and device based on memory neural networks
CN108399180A (en) * 2017-02-08 2018-08-14 腾讯科技(深圳)有限公司 Knowledge graph construction method, device and server
CN108491680A (en) * 2018-03-07 2018-09-04 安庆师范大学 Drug relation extraction method based on residual networks and an attention mechanism
CN108536754A (en) * 2018-03-14 2018-09-14 四川大学 Electronic health record entity relation extraction method based on BLSTM and attention mechanism
CN108563653A (en) * 2017-12-21 2018-09-21 清华大学 Construction method and system for a knowledge acquisition model in knowledge graphs
CN108733792A (en) * 2018-05-14 2018-11-02 北京大学深圳研究生院 A kind of entity relation extraction method
CN108875809A (en) * 2018-06-01 2018-11-23 大连理工大学 The biomedical entity relationship classification method of joint attention mechanism and neural network
CN109062897A (en) * 2018-07-26 2018-12-21 苏州大学 Sentence alignment method based on deep neural network
CN109376864A (en) * 2018-09-06 2019-02-22 电子科技大学 Knowledge graph relation inference algorithm based on stacked neural networks
CN109522920A (en) * 2018-09-18 2019-03-26 义语智能科技(上海)有限公司 Training method and device for a synonymy discrimination model combining semantic features
CN109522557A (en) * 2018-11-16 2019-03-26 中山大学 Training method and device for a text relation extraction model, and readable storage medium
CN109597894A (en) * 2018-09-30 2019-04-09 阿里巴巴集团控股有限公司 Correlation model generation method and device, and data correlation method and device
CN109635124A (en) * 2018-11-30 2019-04-16 北京大学 Remote supervision relation extraction method combining background knowledge
CN109670050A (en) * 2018-12-12 2019-04-23 科大讯飞股份有限公司 A kind of entity relationship prediction technique and device
CN109902171A (en) * 2019-01-30 2019-06-18 中国地质大学(武汉) Text relation extraction method and system based on a hierarchical knowledge graph attention model
CN109992629A (en) * 2019-02-28 2019-07-09 中国科学院计算技术研究所 Neural network relation extraction method and system fusing entity type constraints
CN110196978A (en) * 2019-06-04 2019-09-03 重庆大学 Entity relation extraction method focusing on associated words
CN110210016A (en) * 2019-04-25 2019-09-06 中国科学院计算技术研究所 Style-guided bilinear neural network fake news detection method and system
CN110245292A (en) * 2019-05-28 2019-09-17 华东师范大学 A kind of natural language Relation extraction method based on neural network filtering noise characteristic
CN110263019A (en) * 2019-06-18 2019-09-20 中南民族大学 Construction method, device and the storage medium of entity relation extraction model
CN110275960A (en) * 2019-06-11 2019-09-24 中国电子科技集团公司电子科学研究院 Representation method and system for knowledge graphs and text information based on mention sentences
CN110309512A (en) * 2019-07-05 2019-10-08 北京邮电大学 Chinese grammar error correction method based on generative adversarial networks
CN110309516A (en) * 2019-05-30 2019-10-08 清华大学 Training method, device and the electronic equipment of Machine Translation Model
CN110580340A (en) * 2019-08-29 2019-12-17 桂林电子科技大学 Neural network relation extraction method based on a multi-attention mechanism
CN110852066A (en) * 2018-07-25 2020-02-28 清华大学 Multi-language entity relation extraction method and system based on an adversarial training mechanism
CN111091007A (en) * 2020-03-23 2020-05-01 杭州有数金融信息服务有限公司 Method for identifying relationships among multiple enterprises based on public sentiment and enterprise portrait
CN111192692A (en) * 2020-01-02 2020-05-22 上海联影智能医疗科技有限公司 Entity relationship determination method and device, electronic equipment and storage medium
CN111191461A (en) * 2019-06-06 2020-05-22 北京理工大学 Remote supervision relation extraction method based on curriculum learning
CN111274812A (en) * 2018-12-03 2020-06-12 阿里巴巴集团控股有限公司 Character relation recognition method, device and storage medium
CN111831783A (en) * 2020-07-07 2020-10-27 北京北大软件工程股份有限公司 Chapter-level relation extraction method
CN112328784A (en) * 2019-08-05 2021-02-05 上海智臻智能网络科技股份有限公司 Data information classification method and device
CN112347263A (en) * 2019-08-06 2021-02-09 上海智臻智能网络科技股份有限公司 Knowledge graph construction method
CN112347265A (en) * 2019-08-06 2021-02-09 上海智臻智能网络科技股份有限公司 Knowledge graph construction method
CN112347196A (en) * 2019-08-06 2021-02-09 上海智臻智能网络科技股份有限公司 Entity relation extraction method and device based on neural network
CN112463982A (en) * 2020-11-27 2021-03-09 华东师范大学 Relationship extraction method based on explicit and implicit entity constraint
CN113011161A (en) * 2020-12-29 2021-06-22 中国航天科工集团第二研究院 Method for extracting human and pattern association relation based on deep learning and pattern matching
CN114218956A (en) * 2022-01-24 2022-03-22 平安科技(深圳)有限公司 Relation extraction method and system based on neural network and remote supervision
CN114757179A (en) * 2022-04-13 2022-07-15 成都信息工程大学 Entity relationship joint extraction method and device

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009017464A1 (en) * 2007-07-31 2009-02-05 Agency For Science, Technology And Research Relation extraction system
CN104021115A (en) * 2014-06-13 2014-09-03 北京理工大学 Chinese comparative sentence recognizing method and device based on neural network

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2009017464A1 (en) * 2007-07-31 2009-02-05 Agency For Science, Technology And Research Relation extraction system
CN104021115A (en) * 2014-06-13 2014-09-03 北京理工大学 Chinese comparative sentence recognizing method and device based on neural network

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
YANKAI LIN ET AL.: "Neural Relation Extraction with Selective Attention over Instances", 《PROCEEDINGS OF THE 54TH ANNUAL MEETING OF THE ASSOCIATION FOR COMPUTATIONAL LINGUISTICS》 *
LIU LIJIA ET AL.: "Domain Concept Entity-Attribute Relation Extraction Based on the LM Algorithm", 《JOURNAL OF CHINESE INFORMATION PROCESSING》 *

Cited By (72)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108399180A (en) * 2017-02-08 2018-08-14 腾讯科技(深圳)有限公司 A kind of knowledge mapping construction method, device and server
CN108399180B (en) * 2017-02-08 2021-11-26 腾讯科技(深圳)有限公司 Knowledge graph construction method and device and server
CN106844738A (en) * 2017-02-14 2017-06-13 华南理工大学 Classification method for Junker relations between food ingredients based on neural networks
CN106844738B (en) * 2017-02-14 2019-07-16 华南理工大学 Classification method for Junker relations between food ingredients based on neural networks
CN106970981A (en) * 2017-03-28 2017-07-21 北京大学 A kind of method that Relation extraction model is built based on transfer matrix
CN106970981B (en) * 2017-03-28 2021-01-19 北京大学 Method for constructing relation extraction model based on transfer matrix
CN107133211A (en) * 2017-04-26 2017-09-05 中国人民大学 Essay scoring method based on an attention mechanism
CN107256228A (en) * 2017-05-02 2017-10-17 清华大学 Answer selection system and method based on a structured attention mechanism
CN107273349B (en) * 2017-05-09 2019-11-22 清华大学 Entity relation extraction method and server based on multiple languages
CN107273349A (en) * 2017-05-09 2017-10-20 清华大学 Entity relation extraction method and server based on multiple languages
CN107273800A (en) * 2017-05-17 2017-10-20 大连理工大学 Action recognition method using an attention-based convolutional recurrent neural network
CN107220237A (en) * 2017-05-24 2017-09-29 南京大学 A kind of method of business entity's Relation extraction based on convolutional neural networks
CN107239446A (en) * 2017-05-27 2017-10-10 中国矿业大学 Intelligent relation extraction method based on neural networks and an attention mechanism
WO2018218707A1 (en) * 2017-05-27 2018-12-06 中国矿业大学 Neural network and attention mechanism-based information relation extraction method
CN107194422A (en) * 2017-06-19 2017-09-22 中国人民解放军国防科学技术大学 Convolutional neural network relation classification method combining forward and reverse instances
CN107392229A (en) * 2017-06-21 2017-11-24 清华大学 A kind of network representation method based on the Relation extraction that most gears to the needs of the society
CN107766506A (en) * 2017-10-20 2018-03-06 哈尔滨工业大学 Multi-turn dialogue model construction method based on a hierarchical attention mechanism
CN107944559A (en) * 2017-11-24 2018-04-20 国家计算机网络与信息安全管理中心 A kind of entity relationship automatic identifying method and system
CN108563653A (en) * 2017-12-21 2018-09-21 清华大学 Construction method and system for a knowledge acquisition model in knowledge graphs
CN108563653B (en) * 2017-12-21 2020-07-31 清华大学 Method and system for constructing knowledge acquisition model in knowledge graph
CN108304911A (en) * 2018-01-09 2018-07-20 中国科学院自动化研究所 Knowledge extraction method, system and device based on memory neural networks
CN108491680A (en) * 2018-03-07 2018-09-04 安庆师范大学 Drug relation extraction method based on residual networks and an attention mechanism
CN108536754A (en) * 2018-03-14 2018-09-14 四川大学 Electronic health record entity relation extraction method based on BLSTM and attention mechanism
CN108733792B (en) * 2018-05-14 2020-12-01 北京大学深圳研究生院 Entity relation extraction method
CN108733792A (en) * 2018-05-14 2018-11-02 北京大学深圳研究生院 A kind of entity relation extraction method
CN108875809A (en) * 2018-06-01 2018-11-23 大连理工大学 The biomedical entity relationship classification method of joint attention mechanism and neural network
CN110852066B (en) * 2018-07-25 2021-06-01 清华大学 Multi-language entity relation extraction method and system based on an adversarial training mechanism
CN110852066A (en) * 2018-07-25 2020-02-28 清华大学 Multi-language entity relation extraction method and system based on an adversarial training mechanism
CN109062897A (en) * 2018-07-26 2018-12-21 苏州大学 Sentence alignment method based on deep neural network
CN109376864A (en) * 2018-09-06 2019-02-22 电子科技大学 Knowledge graph relation inference algorithm based on stacked neural networks
CN109522920A (en) * 2018-09-18 2019-03-26 义语智能科技(上海)有限公司 Training method and device for a synonymy discrimination model combining semantic features
CN109522920B (en) * 2018-09-18 2020-10-13 义语智能科技(上海)有限公司 Training method and device of synonymy discriminant model based on combination of semantic features
CN109597894B (en) * 2018-09-30 2023-10-03 创新先进技术有限公司 Correlation model generation method and device, and data correlation method and device
CN109597894A (en) * 2018-09-30 2019-04-09 阿里巴巴集团控股有限公司 Correlation model generation method and device, and data correlation method and device
CN109522557B (en) * 2018-11-16 2021-07-16 中山大学 Training method and device of text relation extraction model and readable storage medium
CN109522557A (en) * 2018-11-16 2019-03-26 中山大学 Training method and device for a text relation extraction model, and readable storage medium
CN109635124A (en) * 2018-11-30 2019-04-16 北京大学 Remote supervision relation extraction method combining background knowledge
CN111274812B (en) * 2018-12-03 2023-04-18 阿里巴巴集团控股有限公司 Character relation recognition method, device and storage medium
CN111274812A (en) * 2018-12-03 2020-06-12 阿里巴巴集团控股有限公司 Character relation recognition method, device and storage medium
CN109670050A (en) * 2018-12-12 2019-04-23 科大讯飞股份有限公司 A kind of entity relationship prediction technique and device
CN109902171A (en) * 2019-01-30 2019-06-18 中国地质大学(武汉) Text relation extraction method and system based on a hierarchical knowledge graph attention model
CN109992629A (en) * 2019-02-28 2019-07-09 中国科学院计算技术研究所 Neural network relation extraction method and system fusing entity type constraints
CN109992629B (en) * 2019-02-28 2021-08-06 中国科学院计算技术研究所 Neural network relation extraction method and system fusing entity type constraints
CN110210016A (en) * 2019-04-25 2019-09-06 中国科学院计算技术研究所 Style-guided bilinear neural network fake news detection method and system
CN110245292A (en) * 2019-05-28 2019-09-17 华东师范大学 A kind of natural language Relation extraction method based on neural network filtering noise characteristic
CN110309516A (en) * 2019-05-30 2019-10-08 清华大学 Training method, device and the electronic equipment of Machine Translation Model
CN110196978A (en) * 2019-06-04 2019-09-03 重庆大学 Entity relation extraction method focusing on associated words
CN111191461B (en) * 2019-06-06 2021-08-03 北京理工大学 Remote supervision relation extraction method based on curriculum learning
CN111191461A (en) * 2019-06-06 2020-05-22 北京理工大学 Remote supervision relation extraction method based on curriculum learning
CN110275960A (en) * 2019-06-11 2019-09-24 中国电子科技集团公司电子科学研究院 Representation method and system for knowledge graphs and text information based on mention sentences
CN110275960B (en) * 2019-06-11 2021-09-14 中国电子科技集团公司电子科学研究院 Representation method and system for knowledge graphs and text information based on mention sentences
CN110263019A (en) * 2019-06-18 2019-09-20 中南民族大学 Construction method, device and the storage medium of entity relation extraction model
CN110263019B (en) * 2019-06-18 2021-08-31 中南民族大学 Method and device for constructing entity relationship extraction model and storage medium
CN110309512A (en) * 2019-07-05 2019-10-08 北京邮电大学 Chinese grammar error correction method based on generative adversarial networks
CN112328784A (en) * 2019-08-05 2021-02-05 上海智臻智能网络科技股份有限公司 Data information classification method and device
CN112328784B (en) * 2019-08-05 2023-04-18 上海智臻智能网络科技股份有限公司 Data information classification method and device
CN112347196A (en) * 2019-08-06 2021-02-09 上海智臻智能网络科技股份有限公司 Entity relation extraction method and device based on neural network
CN112347196B (en) * 2019-08-06 2023-05-23 上海智臻智能网络科技股份有限公司 Entity relation extraction method and device based on neural network
CN112347263B (en) * 2019-08-06 2023-04-14 上海智臻智能网络科技股份有限公司 Knowledge graph construction method
CN112347265B (en) * 2019-08-06 2023-04-14 上海智臻智能网络科技股份有限公司 Knowledge graph construction method
CN112347265A (en) * 2019-08-06 2021-02-09 上海智臻智能网络科技股份有限公司 Knowledge graph construction method
CN112347263A (en) * 2019-08-06 2021-02-09 上海智臻智能网络科技股份有限公司 Knowledge graph construction method
CN110580340A (en) * 2019-08-29 2019-12-17 桂林电子科技大学 Neural network relation extraction method based on a multi-attention mechanism
CN111192692B (en) * 2020-01-02 2023-12-08 上海联影智能医疗科技有限公司 Entity relationship determination method and device, electronic equipment and storage medium
CN111192692A (en) * 2020-01-02 2020-05-22 上海联影智能医疗科技有限公司 Entity relationship determination method and device, electronic equipment and storage medium
CN111091007A (en) * 2020-03-23 2020-05-01 杭州有数金融信息服务有限公司 Method for identifying relationships among multiple enterprises based on public sentiment and enterprise portrait
CN111831783B (en) * 2020-07-07 2023-12-08 北京北大软件工程股份有限公司 Method for extracting chapter-level relation
CN111831783A (en) * 2020-07-07 2020-10-27 北京北大软件工程股份有限公司 Chapter-level relation extraction method
CN112463982A (en) * 2020-11-27 2021-03-09 华东师范大学 Relationship extraction method based on explicit and implicit entity constraint
CN113011161A (en) * 2020-12-29 2021-06-22 中国航天科工集团第二研究院 Method for extracting human and pattern association relation based on deep learning and pattern matching
CN114218956A (en) * 2022-01-24 2022-03-22 平安科技(深圳)有限公司 Relation extraction method and system based on neural network and remote supervision
CN114757179A (en) * 2022-04-13 2022-07-15 成都信息工程大学 Entity relationship joint extraction method and device

Similar Documents

Publication Publication Date Title
CN106354710A (en) Neural network relation extracting method
CN110083705B (en) Multi-hop attention depth model, method, storage medium and terminal for target emotion classification
CN106484664B (en) Similarity calculation method between short texts
CN113239186B (en) Graph convolution network relation extraction method based on multi-dependency relation representation mechanism
CN111259987B (en) Method for extracting event main body by multi-model fusion based on BERT
CN106547735A (en) Construction and usage method of context-aware dynamic character or word vectors based on deep learning
CN108664632A (en) Text sentiment classification algorithm based on convolutional neural networks and an attention mechanism
CN112507699B (en) Remote supervision relation extraction method based on graph convolution network
CN109543722A (en) Sentiment trend prediction method based on a sentiment analysis model
CN109558487A (en) Document classification method based on hierarchical multi-attention networks
CN110245229A (en) Deep learning topic sentiment classification method based on data augmentation
CN106650725A (en) Full convolutional neural network-based candidate text box generation and text detection method
CN107133214A (en) Method for mining product demand preference profiles from review information and evaluating their quality
CN105468713A (en) Multi-model fused short text classification method
CN107153642A (en) Analysis method for recognizing the sentiment orientation of text comments based on neural networks
CN111079409B (en) Emotion classification method utilizing context and aspect memory information
CN108197294A (en) A kind of text automatic generation method based on deep learning
CN107943784A (en) Relation extraction method based on generative adversarial networks
CN108536801A (en) Civil aviation microblog safety public opinion sentiment analysis method based on deep learning
CN113761893B (en) Relation extraction method based on mode pre-training
CN108549658A (en) Deep learning video question answering method and system based on an attention mechanism over syntactic parse trees
CN110825850B (en) Natural language theme classification method and device
CN112559734A (en) Presentation generation method and device, electronic equipment and computer readable storage medium
CN108875034A (en) Chinese text classification method based on hierarchical long short-term memory networks
CN114462409A (en) Named entity recognition method for the audit field based on adversarial training

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication
Application publication date: 20170125