
CN108268643A - A deep semantic matching entity linking method based on multi-granularity LSTM networks - Google Patents

A deep semantic matching entity linking method based on multi-granularity LSTM networks Download PDF

Info

Publication number
CN108268643A
CN108268643A
Authority
CN
China
Prior art keywords
entity
candidate
feature vector
lstm
mention
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201810058399.7A
Other languages
Chinese (zh)
Inventor
高升
罗安根
王新怡
徐雅静
李思
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Posts and Telecommunications
Original Assignee
Beijing University of Posts and Telecommunications
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications
Priority to CN201810058399.7A priority Critical patent/CN108268643A/en
Publication of CN108268643A publication Critical patent/CN108268643A/en
Pending legal-status Critical Current

Links

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/30Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
    • G06F16/36Creation of semantic tools, e.g. ontology or thesauri

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

The invention discloses a deep semantic matching entity linking method based on multi-granularity LSTM networks, belonging to the field of information processing. The method is characterized as follows: first, character-level bidirectional LSTM networks extract surface-form feature representations of the entity mention and of the candidate entities; next, a word-level bidirectional LSTM network encodes the sentence containing the entity mention, and its output is taken as the mention's contextual semantic feature vector, while the contextual semantic feature vectors of the candidate entities are learned from the structured information of the knowledge graph; finally, from the surface-form and contextual semantic feature vectors of the entity mention and the candidate entities, surface-form and semantic similarity scores are computed separately and combined into the final score of each mention-candidate pair. By combining multi-granularity LSTM networks with knowledge representation learning, the invention improves the effectiveness of entity linking.

Description

A deep semantic matching entity linking method based on multi-granularity LSTM networks
Technical field
The present invention relates to the field of information processing, and in particular to a Deep Semantic Match Model (DSMM) entity linking method based on multi-granularity LSTM networks.
Background technology
Entity linking is a basic component of every application field of natural language processing. Its goal is to link an entity mention in free text to the corresponding entity in a target knowledge graph, thereby resolving ambiguity between entities. The core problem in entity linking research is how to rank the set of candidate entities so as to pick out the correct mapping entity. The quality of entity linking directly affects downstream tasks such as information retrieval and automatic question answering.
Most traditional entity linking algorithms use only unstructured knowledge-graph information and manually extract feature vectors for the entity mention and the candidate entities from their context text. However, feature vectors obtained in this way cannot represent the intrinsic semantic information of words or entities, and they lack adaptability to different scenarios. To solve the above problem, the present invention employs two bidirectional LSTM networks, at character granularity and at word granularity, together with the knowledge representation learning method TransE; it extracts features at two levels, surface form and contextual semantics, and combines them with the structured information of the knowledge graph, obtaining good entity linking performance.
Summary of the invention
To solve the problems existing in the prior art, the present invention provides a deep semantic matching entity linking method based on multi-granularity LSTM networks. The scheme is as follows:
Step 1: Use character-level bidirectional LSTM networks (char-LSTM) to extract the surface-form feature representations of the entity mention and the candidate entities; these representations also capture, to some extent, the semantic information of the words themselves.
Step 2: Use a word-level bidirectional LSTM network (word-LSTM) to encode the sentence containing the entity mention, and take its output as the mention's contextual semantic feature vector. Using the "structured context" in the knowledge graph, learn the contextual semantic feature vectors of the candidate entities.
Step 3: For the surface-form and contextual semantic feature vectors of the entity mention and the candidate entities, compute the surface-form and semantic matching similarity scores separately, and combine the surface-form matching similarity and the contextual semantic matching similarity into the final matching similarity score of the mention-candidate pair.
Description of the drawings
Fig. 1 is the network structure diagram of the deep semantic matching (DSMM) entity linking system based on multi-granularity LSTM networks provided by the invention.
Fig. 2 is the structure diagram of the char/word-LSTM used in the DSMM algorithm provided by the invention.
Fig. 3 is the recurrent-unit structure diagram of the LSTM network.
Specific embodiment
The implementation of the present invention is described in more detail below.
Fig. 1 shows the network structure diagram of the deep semantic matching entity linking system based on multi-granularity LSTM networks provided by the invention, comprising:
Step S1: surface-form matching
Step S2: contextual semantic matching
Step S3: similarity measurement
Fig. 2 shows the structure diagram of the char/word-LSTM.
Each step is described in detail below:
Step S1: surface-form matching. Since entity mentions and candidate entities are generally very short, the present invention uses character-level bidirectional LSTM networks (char-LSTM) to extract the surface-form feature representations of both. The char-LSTM is more robust: it tolerates character errors caused by typography, tense, or other spelling variations, while also capturing, to some extent, the semantic information of the words themselves.
Fig. 3 shows the cell structure of an LSTM unit. At time t, an LSTM unit can be described as:
i_t = σ(W_i · x_t + U_i · h_{t-1} + b_i)
f_t = σ(W_f · x_t + U_f · h_{t-1} + b_f)
o_t = σ(W_o · x_t + U_o · h_{t-1} + b_o)
C̃_t = tanh(W_c · x_t + U_c · h_{t-1} + b_c)
C_t = f_t ⊙ C_{t-1} + i_t ⊙ C̃_t
h_t = o_t ⊙ tanh(C_t)
where x is the input, C is the memory cell state, and i, f, and o are the input gate, forget gate, and output gate respectively; σ and tanh are the logistic sigmoid and hyperbolic tangent functions; ⊙ is element-wise multiplication; W, U, and b are weight matrices and bias terms; C̃_t is the computed candidate memory cell state. Under the control of the input gate and the forget gate, the memory cell state C is updated from the candidate memory cell state and the memory cell state of the previous time step, while the output gate controls the output of the memory cell state.
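The gate equations above amount to one update step per time t. A minimal NumPy sketch follows; the parameter packing (a dict of per-gate weight triples) and the function name are assumptions of this sketch, not notation from the patent:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x_t, h_prev, C_prev, params):
    """One LSTM recurrent-unit update following the gate equations above."""
    Wi, Ui, bi = params["i"]
    Wf, Uf, bf = params["f"]
    Wo, Uo, bo = params["o"]
    Wc, Uc, bc = params["c"]
    i_t = sigmoid(Wi @ x_t + Ui @ h_prev + bi)      # input gate
    f_t = sigmoid(Wf @ x_t + Uf @ h_prev + bf)      # forget gate
    o_t = sigmoid(Wo @ x_t + Uo @ h_prev + bo)      # output gate
    C_tilde = np.tanh(Wc @ x_t + Uc @ h_prev + bc)  # candidate memory cell state
    C_t = f_t * C_prev + i_t * C_tilde              # memory-cell update
    h_t = o_t * np.tanh(C_t)                        # hidden output
    return h_t, C_t
```

Because the gates are sigmoid-bounded and the output passes through tanh, every component of h_t lies strictly inside (-1, 1).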
T character M={ c is included for given1,c2,…,cTEntity censure m, input is the character style M of m ={ c1,c2,…,cT};The character of each input unit is converted into corresponding character vector by character style by term vector layer ei c
ei c=Wcharic
Wherein, Wchar∈Rdc×|Vc|It is character vector matrix, dcFor the dimension of character vector, VcIt is to include all different words The dictionary of symbol, icIt is a solely heat vector, i.e., it is 1 that it, which is value on except i-th dimension, and the value in remaining dimension is all 0.
Input LSTM layers two-way is the character vector obtained after convertingTake two-way LSTM layers it is last Layer state is hidden as output, i.e. entity censures the format surface feature vector Loc of mm.For candidate entity e, by same Char-LSTM can obtain the feature vector Loc of entity ee
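A minimal sketch of this char-LSTM surface-form encoder: embed the characters via the lookup matrix, run forward and backward LSTM passes, and concatenate the two last hidden states as Loc_m. Packing the four gate weight matrices into single arrays, and the function names, are conventions of this sketch, not the patent's:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_run(X, W, U, b, d):
    """Run an LSTM over a sequence X (T x d_in); the four gates are packed in W, U, b."""
    h, C = np.zeros(d), np.zeros(d)
    for x in X:
        z = W @ x + U @ h + b                         # pre-activations of i, f, o, candidate
        i, f, o = (sigmoid(z[k * d:(k + 1) * d]) for k in range(3))
        C = f * C + i * np.tanh(z[3 * d:])            # memory-cell update
        h = o * np.tanh(C)
    return h                                          # last hidden state

def char_surface_vector(mention, alphabet, W_char, fw, bw, d):
    """Loc_m: concatenation of the last hidden states of forward and backward passes."""
    idx = [alphabet.index(c) for c in mention]
    E = W_char[:, idx].T                              # rows are e_i^c = W_char · i_c
    return np.concatenate([lstm_run(E, *fw, d=d),     # left-to-right pass
                           lstm_run(E[::-1], *bw, d=d)])  # right-to-left pass
```

The same encoder, applied to the candidate entity string, gives Loc_e.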
Step S2: contextual semantic matching. Since entity linking based solely on surface-form feature vectors suffers from various problems, the present invention uses a word-level bidirectional LSTM network (word-LSTM) to encode the sentence containing the entity mention, and takes its output as the mention's contextual semantic feature vector.
Given an entity mention m and the sentence sen containing it, each word of sen is converted into the corresponding word vector. For the i-th word, its word vector and position vector are concatenated as the input to the LSTM layers, i.e.
x_i = [e_i^w ; e_i^p]
where e_i^w = W_word · i_w is the word vector of the i-th word and e_i^p = W_pos · i_p is its position vector. W_word ∈ R^{d_w × |V_w|} and W_pos ∈ R^{d_p × |V_p|} are the lookup matrices for word vectors and position vectors respectively, d_w and d_p are the dimensions of the word and position vectors, and V_w and V_p are the word dictionary and the position dictionary. i_w and i_p are one-hot vectors with value 1 only in the w-th and p-th dimensions respectively and 0 in all other dimensions. For a word in sen, its position coordinate is its relative distance to the entity mention within the sentence.
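The word-LSTM input construction (word embedding concatenated with a relative-position embedding) can be sketched as follows; clipping distant offsets into the position dictionary, and all names, are assumptions of this sketch:

```python
import numpy as np

def word_lstm_inputs(sentence, mention_idx, W_word, vocab, W_pos, max_dist):
    """Build x_i = [e_i^w ; e_i^p] for every token of the sentence."""
    xs = []
    for i, w in enumerate(sentence):
        e_w = W_word[:, vocab.index(w)]                        # e_i^w = W_word · i_w
        rel = int(np.clip(i - mention_idx, -max_dist, max_dist))
        e_p = W_pos[:, rel + max_dist]                         # e_i^p indexed by relative distance
        xs.append(np.concatenate([e_w, e_p]))
    return np.stack(xs)                                        # shape: T x (d_w + d_p)
```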
Then the outputs of all units of the bidirectional LSTM layers, H = [h_1, h_2, …, h_T], are fed into an attention layer; the contextual semantic feature vector of the entity mention m is obtained as a weighted sum of the components of H followed by a nonlinear transformation:
M = tanh(H)
α = softmax(w^T · M)
r = H · α^T
Glo_m = tanh(r)
For the semantic feature vector Glo_e of a candidate entity, the knowledge representation learning method TransE is adopted. In TransE, for each triple (s, r, o) ∈ K, embedded representations of entities and relations are trained by making E(s) + E(r) ≈ E(o). The representation of the target entity is learned using the entity's structured "context", yielding the contextual feature vector Glo_e of the candidate entity.
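TransE's training objective pushes the distance ||E(s) + E(r) - E(o)|| below that of a corrupted triple by a margin; a minimal sketch of the per-triple margin loss (function and variable names are this sketch's, not the patent's):

```python
import numpy as np

def transe_margin_loss(E_ent, E_rel, pos, neg, gamma=1.0):
    """Hinge loss pushing d(s, r, o) below d(s', r, o') by at least gamma."""
    def d(s, r, o):
        # translation distance ||E(s) + E(r) - E(o)||; small for true triples
        return np.linalg.norm(E_ent[s] + E_rel[r] - E_ent[o])
    return max(0.0, gamma + d(*pos) - d(*neg))
```

Summing this loss over the knowledge-graph triples, with gradient updates on the embedding tables, yields the entity representations used as structured context.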
Step S3: similarity measurement. Since both the surface-form matching similarity and the contextual semantic matching similarity provide important information for entity linking, the present invention combines the two into the final matching similarity score of the mention-candidate pair (m, e).
The matching similarity scores at the two levels, surface form and semantics, are computed separately:
m_l = cosine(Loc_m, Loc_e)
m_g = cosine(Glo_m, Glo_e)
Then the surface-form matching similarity and the contextual semantic matching similarity are combined into the final matching similarity score of the mention-candidate pair (m, e):
Score(m, e) = m_l + m_g
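The scoring step is two cosine similarities and a sum; a direct sketch (names are illustrative):

```python
import numpy as np

def cosine(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def match_score(Loc_m, Loc_e, Glo_m, Glo_e):
    """Score(m, e) = m_l + m_g: surface-form plus semantic cosine similarity."""
    return cosine(Loc_m, Loc_e) + cosine(Glo_m, Glo_e)
```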
In the training stage of the algorithm, the present invention adopts a hinge loss with negative sampling as the final cost function:
L = Σ max(0, γ - Score(m, e) + Score(m, e′))
where γ is a preset hyperparameter representing the margin between the mention's similarity to the correct entity and its similarity to incorrect entities, e is the correct mapping entity, and e′ is a wrongly linked entity randomly sampled from all entities of the reference knowledge base. The meaning of the loss function is that the similarity score of the correctly linked entity should exceed that of a randomly selected wrongly linked entity by at least the margin γ.
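Given the scores of the correct entity and of the sampled negatives, the training loss reduces to a one-line sum; a sketch with illustrative names:

```python
def linking_hinge_loss(score_pos, score_negs, gamma=0.5):
    """Sum of max(0, gamma - Score(m, e) + Score(m, e')) over sampled negatives e'."""
    return sum(max(0.0, gamma - score_pos + s) for s in score_negs)
```

The loss is zero exactly when the correct entity outscores every sampled negative by at least gamma.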
Finally, at test time, the system only needs to compute the matching similarity score between each candidate entity and the entity mention, and the highest-scoring entity is chosen as the final result.
Specific embodiments of the deep semantic matching (Deep Semantic Match Model, DSMM) entity linking system based on multi-granularity LSTM networks, and of each of its modules, have been expounded above with reference to the accompanying drawings. From the description of the above embodiments, those of ordinary skill in the art can clearly understand that the present invention may be implemented by software plus a necessary general-purpose hardware platform, or, of course, by hardware, though the former is the preferred mode of implementation. Based on this understanding, the part of the technical solution of the present invention that contributes over the prior art may be embodied in the form of a computer software product stored in a storage medium and including instructions that cause one or more computer devices to perform the methods described in the embodiments of the present invention.
According to the idea of the present invention, changes are possible in specific implementations and application scopes. In conclusion, the content of this description should not be construed as limiting the invention.
The embodiments of the invention described above do not limit the scope of protection of the invention. Any modifications, equivalent substitutions, improvements, etc. made within the spirit and principles of the present invention shall be included within the protection scope of the present invention.

Claims (4)

  1. A deep semantic matching entity linking method based on multi-granularity LSTM networks, characterized in that the entity linking method comprises the following structure and steps:
    (1) surface-form matching: using character-level bidirectional LSTM networks (char-LSTM), extract the surface-form feature representations of the entity mention and the candidate entities, which also capture, to some extent, the semantic information of the words themselves;
    (2) contextual semantic matching: using a word-level bidirectional LSTM network (word-LSTM), encode the sentence containing the entity mention and take its output as the mention's contextual semantic feature vector; using the "structured context" of entities in the knowledge graph, learn the contextual semantic feature vectors of the candidate entities;
    (3) similarity measurement: for the surface-form feature vectors and contextual semantic feature vectors of the entity mention and the candidate entities, compute the surface-form and semantic matching similarity scores separately, and combine the surface-form matching similarity and the contextual semantic matching similarity into the final matching similarity score of the mention-candidate pair.
  2. The method as claimed in claim 1, characterized in that step (1) specifically comprises:
    (1.1) for a given entity mention m containing T characters, the input is the character form of m;
    (1.2) a character-embedding layer converts the character of each input unit into the corresponding character vector;
    (1.3) the character vectors obtained after conversion are the input to the bidirectional LSTM layers;
    (1.4) the last hidden-layer states of the bidirectional LSTM layers are taken as the output, i.e. the surface-form feature vector of the entity mention m.
  3. The method as claimed in claim 1, characterized in that step (2) specifically comprises:
    (2.1) given an entity mention m and the sentence sen containing it, convert each word of sen into the corresponding word vector;
    (2.2) for the i-th word, concatenate its word vector and position vector as the input to the LSTM layers;
    (2.3) feed the outputs of all units of the bidirectional LSTM layers into an attention layer, where the components are weighted, summed, and nonlinearly transformed;
    (2.4) adopt the knowledge representation learning method TransE, learn the representation of the target entity using the entity's structured "context", and obtain the contextual feature vector of the candidate entity.
  4. The method as claimed in claim 1, characterized in that step (3) specifically comprises:
    (3.1) for the surface-form feature vectors and contextual semantic feature vectors of the entity mention and the candidate entities, compute the similarity scores at the two levels, surface form and semantics, separately;
    (3.2) combine the surface-form matching similarity and the contextual semantic matching similarity into the final similarity matching score of the mention-candidate pair;
    (3.3) compute the similarity score between each candidate entity and the entity mention, and choose the highest-scoring entity as the final answer.
CN201810058399.7A 2018-01-22 2018-01-22 A deep semantic matching entity linking method based on multi-granularity LSTM networks Pending CN108268643A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810058399.7A CN108268643A (en) 2018-01-22 2018-01-22 A deep semantic matching entity linking method based on multi-granularity LSTM networks

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810058399.7A CN108268643A (en) 2018-01-22 2018-01-22 A deep semantic matching entity linking method based on multi-granularity LSTM networks

Publications (1)

Publication Number Publication Date
CN108268643A true CN108268643A (en) 2018-07-10

Family

ID=62776302

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810058399.7A Pending CN108268643A (en) A deep semantic matching entity linking method based on multi-granularity LSTM networks

Country Status (1)

Country Link
CN (1) CN108268643A (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106295796A (en) * 2016-07-22 2017-01-04 浙江大学 Entity link method based on degree of depth study
CN106569998A (en) * 2016-10-27 2017-04-19 浙江大学 Text named entity recognition method based on Bi-LSTM, CNN and CRF
US20170109355A1 (en) * 2015-10-16 2017-04-20 Baidu Usa Llc Systems and methods for human inspired simple question answering (hisqa)
CN106909655A (en) * 2017-02-27 2017-06-30 中国科学院电子学研究所 Found and link method based on the knowledge mapping entity that production alias is excavated
CN106934020A (en) * 2017-03-10 2017-07-07 东南大学 A kind of entity link method based on multiple domain entity index


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
REN YUANFANG等: "Relation classification via sequence features and bi-directional LSTMs", 《WUHAN UNIVERSITY JOURNAL OF NATURAL SCIENCES》 *
张子睿等: "基于BI-LSTM-CRF模型的中文分词法", 《长春理工大学学报(自然科学版)》 *

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109189862A (en) * 2018-07-12 2019-01-11 哈尔滨工程大学 A kind of construction of knowledge base method towards scientific and technological information analysis
CN109063021A (en) * 2018-07-12 2018-12-21 浙江大学 A kind of knowledge mapping distribution representation method for capableing of encoding relation semanteme Diversity structure
CN109241294A (en) * 2018-08-29 2019-01-18 国信优易数据有限公司 A kind of entity link method and device
CN109344911A (en) * 2018-10-31 2019-02-15 北京国信云服科技有限公司 A kind of parallel processing classification method based on multilayer LSTM model
CN109344911B (en) * 2018-10-31 2022-04-12 北京国信云服科技有限公司 Parallel processing classification method based on multilayer LSTM model
WO2020124959A1 (en) * 2018-12-21 2020-06-25 平安科技(深圳)有限公司 Semantic similarity matching method based on cross attention mechanism, and apparatus therefor
CN109800879A (en) * 2018-12-21 2019-05-24 科大讯飞股份有限公司 Construction of knowledge base method and apparatus
CN109800879B (en) * 2018-12-21 2022-02-01 科大讯飞股份有限公司 Knowledge base construction method and device
WO2020135337A1 (en) * 2018-12-29 2020-07-02 新华三大数据技术有限公司 Entity semantics relationship classification
WO2020147369A1 (en) * 2019-01-18 2020-07-23 华为技术有限公司 Natural language processing method, training method, and data processing device
CN110059160B (en) * 2019-04-17 2021-02-09 东南大学 End-to-end context-based knowledge base question-answering method and device
CN110059160A (en) * 2019-04-17 2019-07-26 东南大学 A kind of knowledge base answering method and device based on context end to end
CN110147401A (en) * 2019-05-22 2019-08-20 苏州大学 Merge the knowledge base abstracting method of priori knowledge and context-sensitive degree
CN111310438A (en) * 2020-02-20 2020-06-19 齐鲁工业大学 Chinese sentence semantic intelligent matching method and device based on multi-granularity fusion model
CN111414765A (en) * 2020-03-20 2020-07-14 北京百度网讯科技有限公司 Sentence consistency determination method and device, electronic equipment and readable storage medium
CN111597820A (en) * 2020-05-11 2020-08-28 北京理工大学 ICT supply chain bid item and enterprise product entity matching method
CN111738012A (en) * 2020-05-14 2020-10-02 平安国际智慧城市科技股份有限公司 Method and device for extracting semantic alignment features, computer equipment and storage medium
CN111738012B (en) * 2020-05-14 2023-08-18 平安国际智慧城市科技股份有限公司 Method, device, computer equipment and storage medium for extracting semantic alignment features
CN111882124B (en) * 2020-07-20 2022-06-07 武汉理工大学 Homogeneous platform development effect prediction method based on generation confrontation simulation learning
CN111882124A (en) * 2020-07-20 2020-11-03 武汉理工大学 Homogeneous platform development effect prediction method based on generation confrontation simulation learning
CN114462379A (en) * 2020-11-09 2022-05-10 中国科学院信息工程研究所 Improved script learning method and device based on event evolution diagram
CN113220899A (en) * 2021-05-10 2021-08-06 上海博亦信息科技有限公司 Intellectual property identity identification method based on academic talent information intellectual map
CN114781471A (en) * 2021-06-02 2022-07-22 清华大学 Entity record matching method and system
CN114781471B (en) * 2021-06-02 2022-12-27 清华大学 Entity record matching method and system
CN113378018A (en) * 2021-08-16 2021-09-10 南京烽火星空通信发展有限公司 Header list entity relationship matching method based on deep learning multi-head selection model
CN113535986A (en) * 2021-09-02 2021-10-22 中国医学科学院医学信息研究所 Data fusion method and device applied to medical knowledge graph
CN113535986B (en) * 2021-09-02 2023-05-05 中国医学科学院医学信息研究所 Data fusion method and device applied to medical knowledge graph
CN113761208A (en) * 2021-09-17 2021-12-07 福州数据技术研究院有限公司 Scientific and technological innovation information classification method and storage device based on knowledge graph

Similar Documents

Publication Publication Date Title
CN108268643A (en) A deep semantic matching entity linking method based on multi-granularity LSTM networks
CN110765775B (en) Self-adaptive method for named entity recognition field fusing semantics and label differences
CN110795543B (en) Unstructured data extraction method, device and storage medium based on deep learning
CN109543180B (en) Text emotion analysis method based on attention mechanism
CN106295796B (en) entity link method based on deep learning
CN110377903B (en) Sentence-level entity and relation combined extraction method
CN110083710B (en) Word definition generation method based on cyclic neural network and latent variable structure
CN112347268A (en) Text-enhanced knowledge graph joint representation learning method and device
CN108733792A (en) A kind of entity relation extraction method
CN113312500A (en) Method for constructing event map for safe operation of dam
CN109471895A (en) The extraction of electronic health record phenotype, phenotype name authority method and system
CN110222163A (en) A kind of intelligent answer method and system merging CNN and two-way LSTM
CN111274800A (en) Inference type reading understanding method based on relational graph convolution network
CN110059160A (en) A kind of knowledge base answering method and device based on context end to end
CN113095415A (en) Cross-modal hashing method and system based on multi-modal attention mechanism
CN110569355B (en) Viewpoint target extraction and target emotion classification combined method and system based on word blocks
CN113761890A (en) BERT context sensing-based multi-level semantic information retrieval method
CN113553440A (en) Medical entity relationship extraction method based on hierarchical reasoning
CN113627550A (en) Image-text emotion analysis method based on multi-mode fusion
CN115587594A (en) Network security unstructured text data extraction model training method and system
CN111145914A (en) Method and device for determining lung cancer clinical disease library text entity
CN116561272A (en) Open domain visual language question-answering method and device, electronic equipment and storage medium
CN114780723B (en) Portrayal generation method, system and medium based on guide network text classification
CN111914553A (en) Financial information negative subject judgment method based on machine learning
CN113516094B (en) System and method for matching and evaluating expert for document

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20180710