WO2023273337A1 - A dense target detection method for remote sensing images based on representative features
- Publication number: WO2023273337A1
- Application: PCT/CN2022/074542 (CN2022074542W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- feature
- similarity
- network
- feature map
- category
Classifications
- G06F18/22: Pattern recognition; analysing; matching criteria, e.g. proximity measures
- G06F18/241: Pattern recognition; classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
- G06N3/045: Computing arrangements based on biological models; neural networks; combinations of networks
- G06N3/08: Computing arrangements based on biological models; neural networks; learning methods
Definitions
- The invention relates to target detection, and in particular to a dense target detection method for remote sensing images based on representative features.
- Remote sensing is a rapidly developing high technology; the information network it forms provides a large amount of scientific data and dynamic information.
- Remote sensing image detection is a benchmark problem of target detection, with great application value in many fields such as agriculture, meteorological surveying and mapping, and environmental protection.
- In deep-learning-based target detection, a sample whose target category matches the ground-truth label is a positive sample, and a positive sample whose predicted classification confidence deviates greatly from the ground-truth label is a hard sample.
- Existing detection models can detect most objects in an image but often miss the hard positive samples that are more difficult to detect.
- When the classification confidence predicted for a positive sample falls below the set confidence threshold, the hard positive sample is filtered out in the post-processing stage, degrading the model's detection performance; conversely, manually lowering the confidence threshold in post-processing costs the model its ability to suppress low-confidence negative samples. Accurately detecting multiple densely arranged objects in remote sensing images is therefore particularly challenging.
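The trade-off just described can be illustrated with a toy sketch (all confidence values and labels are hypothetical, not taken from the patent): a fixed post-processing threshold either drops the hard positive or stops suppressing the negative.

```python
# Toy illustration of the post-processing dilemma: confidences are made up.

def filter_by_confidence(detections, threshold):
    """Standard post-processing: keep detections at or above the threshold."""
    return [d for d in detections if d["confidence"] >= threshold]

detections = [
    {"label": "easy positive", "confidence": 0.92},
    {"label": "hard positive", "confidence": 0.45},  # true object, low confidence
    {"label": "negative",      "confidence": 0.35},  # background clutter
]

kept_strict  = filter_by_confidence(detections, 0.60)  # hard positive is filtered out
kept_relaxed = filter_by_confidence(detections, 0.30)  # negative is no longer suppressed

print([d["label"] for d in kept_strict])   # ['easy positive']
print(len(kept_relaxed))                   # 3
```

Neither threshold is satisfactory, which motivates raising the hard positive's confidence instead of moving the threshold.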
- The purpose of the present invention is to provide a dense target detection method for remote sensing images based on representative features that adaptively raises the classification confidence of hard positive samples, so that multiple densely arranged objects of the same category in remote sensing images can be detected accurately.
- The dense target detection method for remote sensing images based on representative features of the present invention comprises the following steps:
- (1) Construct four network modules, including a feature extraction network, a feature pyramid network, a preliminary prediction network, and a final prediction network; input the remote sensing image to be detected into the feature extraction network and the feature pyramid network in turn, and output a preliminary feature map;
- (2) Input the preliminary feature map into the preliminary prediction network and, over all categories in the dataset, select a representative feature of each category's semantic information and each category's representative confidence over the whole feature map;
- (3) Input the feature map output by the preliminary prediction network into the final prediction network to obtain the final feature map, and compute the similarity between each category's representative feature and the feature vector at the same position of the final feature map;
- (4) With the similarity obtained in step 3 as the weight, adaptively raise the classification confidence on top of the hard positive sample's classification confidence, yielding the hard positive sample's final classification confidence.
- The process of obtaining the highest classification confidence and the representative features in step 2 is described below.
- The similarity in step 3 includes feature semantic similarity and feature spatial similarity.
- The feature semantic similarity calculation process includes:
- The embedded Gaussian similarity measurement function is: Sim_Embedded_Gaussian(RF_k, F_hw) = exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i) / N(φ(RF))
- RF_k denotes the representative feature RepFeature_k of the k-th category, and F_hw denotes the feature vector in row h, column w of the feature map FM_ODM output by the final prediction network; both feature vectors are 1×1×n-dimensional.
- i indexes the i-th of the n dimensions.
- A linear embedding space is used, φ(RF_k) = W_φ · RF_k and θ(F_hw) = W_θ · F_hw, where W_φ and W_θ are learned weight matrices; φ(RF_k)_i and θ(F_hw)_i denote the values of the two embedded feature vectors in each dimension.
- N(φ(RF)) is the normalization factor, computed by summing the similarities between the feature vector F_hw in row h, column w of the final prediction network and the K valid representative features RF_k, where K is the number of categories in the dataset; it normalizes the embedded Gaussian similarity into the range 0 to 1 to avoid the gradient explosion caused by excessively large similarities.
- The normalization factor is calculated as: N(φ(RF)) = Σ_{k=1}^{K} exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i)
- The feature spatial similarity calculation includes the following steps:
- Spatial_i indicates that RF_k and F_hw are taken from the i-th bottom-up layer of the feature pyramid network, and α is a scale parameter.
- The final classification confidence of a hard positive sample in step 4 is obtained by adding the representative confidence of category k, weighted by the similarity, to the confidence for category k at position (h, w) of the final prediction network's feature map: FinalConfidence_k(h, w) = Confidence_k(h, w) + Similarity(RF_k, F_hw) · RepConfidence_k
- The feature semantic similarity may be measured with any one of Euclidean similarity, cosine similarity, or Gaussian similarity.
- The feature extraction network uses convolutional layers to reduce the size of the original image, and the extracted effective features are input to the feature pyramid network; ResNet or HRNet is selected as the feature extraction convolutional neural network.
- The preliminary prediction network uses the feature alignment module of the S²A-NET model to make a preliminary prediction of the objects' category and location information.
- The final prediction network uses the rotation detection module of the S²A-NET model to predict the objects' final category and location information.
- The present invention uses representative features and representative confidences to adaptively raise the classification confidence of hard positive samples, improving their classification in dense remote sensing scenes;
- Fig. 1 is a schematic diagram of the representative feature acquisition process of the present invention
- Fig. 2 is the flow chart of computing similarity of the present invention
- Fig. 3 is a schematic diagram of improving classification confidence for difficult positive samples in the present invention.
- (1) Construct four network modules, including a feature extraction network, a feature pyramid network, a preliminary prediction network, and a final prediction network. Input the remote sensing image to be detected into the feature extraction network, which uses convolutional layers to reduce the size of the original image and inputs the extracted effective features into the feature pyramid network, which then outputs the preliminary feature map FM_FAM.
- ResNet or HRNet is selected as the feature extraction convolutional neural network; the feature alignment module FAM of the S²A-NET model is selected as the preliminary prediction network; the rotation detection module ODM of the S²A-NET model is selected as the final prediction network.
- The classification confidence threshold is set to 0.6: only when the representative confidence of category k exceeds 0.6 is the representative feature of category k a valid representative feature. When RepConfidence_k is low, e.g. 0.3 or 0.4, the probability that RepFeature_k itself belongs to category k is also low, and it cannot serve as a valid representative feature.
- The feature semantic similarity may be measured with any one of Euclidean similarity, cosine similarity, or Gaussian similarity.
- Gaussian similarity is used here to calculate the feature semantic similarity, as follows:
- The embedded Gaussian similarity measurement function is used to calculate the similarity of the features' semantic information, and the measure is normalized.
- The embedded Gaussian similarity measurement function is: Sim_Embedded_Gaussian(RF_k, F_hw) = exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i) / N(φ(RF))
- RF_k denotes the representative feature RepFeature_k of the k-th category, and F_hw denotes the feature vector in row h, column w of the feature map FM_ODM output by the final prediction network; both feature vectors are 1×1×n-dimensional.
- In this embodiment the feature vector dimensionality n is 256, and i indexes the i-th of the n dimensions.
- A linear embedding space is used, φ(RF_k) = W_φ · RF_k and θ(F_hw) = W_θ · F_hw, where W_φ and W_θ are learned weight matrices; φ(RF_k)_i and θ(F_hw)_i denote the values of the two embedded feature vectors in each dimension.
- N(φ(RF)) is the normalization factor.
- The feature spatial similarity calculation includes the following steps:
- Spatial_i indicates that RF_k and F_hw are taken from the i-th bottom-up layer of the feature pyramid network, and α is a scale parameter.
- α is set to 1/64 so that two features that are closer together have higher spatial location correlation.
- With the similarity obtained in step 3 as the weight, adaptively raise the classification confidence on top of the hard positive sample's classification confidence to obtain its final classification confidence, as shown in Figure 3.
- The final classification confidence of the hard positive sample is obtained by adding the representative confidence of category k, weighted by the similarity, to the confidence for category k at position (h, w) of the final prediction network's feature map: FinalConfidence_k(h, w) = Confidence_k(h, w) + Similarity(RF_k, F_hw) · RepConfidence_k
Abstract
A dense target detection method for remote sensing images based on representative features, comprising: constructing a feature extraction network, a feature pyramid network, a preliminary prediction network, and a final prediction network, inputting the remote sensing image to be detected into the feature extraction network and the feature pyramid network in turn, and outputting a preliminary feature map; inputting the preliminary feature map into the preliminary prediction network and, over all categories in the dataset, selecting a representative feature of each category's semantic information and each category's representative confidence over the whole feature map; inputting the feature map output by the preliminary prediction network into the final prediction network to obtain the final feature map, and computing the similarity between each category's representative feature and the feature vector at the same position of the final feature map; and, with this similarity as the weight, adaptively raising the classification confidence on top of the hard positive sample's classification confidence.
Description
The present invention relates to target detection, and in particular to a dense target detection method for remote sensing images based on representative features.
Remote sensing is a rapidly developing high technology; the information network it forms provides a large amount of scientific data and dynamic information. Remote sensing image detection is a benchmark problem of target detection, with great application value in many fields such as agriculture, meteorological surveying and mapping, and environmental protection.
With the great success of deep learning algorithms in computer vision, they have come to be regarded as the method of choice for remote sensing image processing. Because remote sensing images are captured from a bird's-eye view with a large spatial field, they contain more dense scenes and large numbers of densely arranged objects. In deep-learning-based target detection, a sample whose target category matches the ground-truth label is a positive sample, and a positive sample whose predicted classification confidence deviates greatly from the ground-truth label is a hard sample. Existing detection models can detect most objects in an image but often miss the hard positive samples that are more difficult to detect. When the classification confidence predicted for a positive sample falls below the set confidence threshold, the hard positive sample is filtered out in the post-processing stage, degrading the model's detection performance; conversely, manually lowering the confidence threshold in post-processing costs the model its ability to suppress low-confidence negative samples. Accurately detecting multiple densely arranged objects in remote sensing images is therefore particularly challenging.
Summary of the Invention
Purpose of the invention: in view of the above problems, the purpose of the present invention is to provide a dense target detection method for remote sensing images based on representative features that adaptively raises the classification confidence of hard positive samples, so that multiple densely arranged objects of the same category in remote sensing images can be detected accurately.
Technical solution: the dense target detection method for remote sensing images based on representative features of the present invention comprises the following steps:
(1) Construct four network modules, including a feature extraction network, a feature pyramid network, a preliminary prediction network, and a final prediction network; input the remote sensing image to be detected into the feature extraction network and the feature pyramid network in turn, and output a preliminary feature map;
(2) Input the preliminary feature map into the preliminary prediction network and, over all categories in the dataset, select a representative feature of each category's semantic information and each category's representative confidence over the whole feature map;
(3) Input the feature map output by the preliminary prediction network into the final prediction network to obtain the final feature map, and compute the similarity between each category's representative feature and the feature vector at the same position of the final feature map;
(4) With the similarity obtained in step 3 as the weight, adaptively raise the classification confidence on top of the hard positive sample's classification confidence, yielding the hard positive sample's final classification confidence.
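The four steps above can be sketched as a single data flow. Every callable below is a stand-in placeholder (plain numbers instead of tensors), not the patent's actual networks, so this only shows how the modules connect:

```python
def detect(image, backbone, fpn, preliminary_net, final_net, fuse):
    """Data flow of steps (1)-(4); each argument is a stand-in for a module."""
    fm_fam = fpn(backbone(image))                           # step (1): preliminary feature map
    rep_features, rep_confidence = preliminary_net(fm_fam)  # step (2): representatives + confidences
    fm_odm, cls_confidence = final_net(fm_fam)              # step (3): final feature map + scores
    # steps (3)-(4): similarity-weighted boost of the classification confidence
    return fuse(cls_confidence, rep_features, rep_confidence, fm_odm)

# Smoke test with trivial stand-ins (scalars instead of feature maps):
boosted = detect(
    image=1.0,
    backbone=lambda x: x * 2.0,
    fpn=lambda x: x + 1.0,
    preliminary_net=lambda fm: (fm, 0.9),        # (representative feature, representative confidence)
    final_net=lambda fm: (fm, 0.45),             # (final feature map, classification confidence)
    fuse=lambda conf, rf, rc, fm: conf + 0.5 * rc,  # similarity weight fixed at 0.5 here
)
print(boosted)  # 0.9
```

The real similarity weight is computed in step 3 from semantic and spatial similarity, as detailed below.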
Further, the process of obtaining the highest classification confidence and the representative features in step 2 is:
(204) Set a classification confidence threshold; only when the representative confidence of category k exceeds this threshold is the representative feature of category k a valid representative feature.
Further, the similarity in step 3 includes feature semantic similarity and feature spatial similarity. The feature semantic similarity is calculated as follows.

An embedded Gaussian similarity measurement function is used to calculate the similarity of the features' semantic information, and the measure is normalized:

Sim_Embedded_Gaussian(RF_k, F_hw) = exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i) / N(φ(RF))

where RF_k denotes the representative feature RepFeature_k of the k-th category, and F_hw denotes the feature vector in row h, column w of the feature map FM_ODM output by the final prediction network; both feature vectors are 1×1×n-dimensional, and i indexes the i-th of the n dimensions.

A linear embedding space is used:

φ(RF_k) = W_φ · RF_k
θ(F_hw) = W_θ · F_hw

where W_φ and W_θ are learned weight matrices, and φ(RF_k)_i and θ(F_hw)_i denote the values of the two embedded feature vectors in each dimension.

N(φ(RF)) is the normalization factor, obtained by summing the similarities between the feature vector F_hw in row h, column w of the final prediction network and the K valid representative features RF_k, where K is the number of categories in the dataset. It normalizes the embedded Gaussian similarity into the range 0 to 1, avoiding the gradient explosion caused by excessively large similarities:

N(φ(RF)) = Σ_{k=1}^{K} exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i)
Further, the feature spatial similarity calculation includes the following steps:

(302) Multiply the feature-map distance dis(RF_k, F_hw) by the stride stride_i of the corresponding feature map to obtain the spatial distance of the two feature vectors on the original image, and from it the spatial correlation:

Corr_Spatial_i(RF_k, F_hw) = exp(-α · stride_i · dis(RF_k, F_hw))

where Spatial_i indicates that RF_k and F_hw are taken from the i-th bottom-up layer of the feature pyramid network, and α is a scale parameter.

The similarity in step 3 is therefore expressed as:

Similarity(RF_k, F_hw) = Sim_Embedded_Gaussian(RF_k, F_hw) + Corr_Spatial_i(RF_k, F_hw)
Further, the feature semantic similarity may be measured with any one of Euclidean similarity, cosine similarity, or Gaussian similarity.
Further, the feature extraction network uses convolutional layers to reduce the size of the original image and inputs the extracted effective features into the feature pyramid network; ResNet or HRNet is selected as the feature extraction convolutional neural network.
Further, the preliminary prediction network uses the feature alignment module of the S²A-NET model to make a preliminary prediction of the objects' category and location information.
Further, the final prediction network uses the rotation detection module of the S²A-NET model to predict the objects' final category and location information.
Beneficial effects: compared with the prior art, the significant advantages of the present invention are:
1. The invention uses representative features and representative confidences to adaptively raise the classification confidence of hard positive samples, improving their classification in dense remote sensing scenes;
2. Sharing the classification-branch parameters of the two stages keeps the similarity computation consistent while reducing the complexity and parameter count of the detection model.
Figure 1 is a schematic diagram of the representative feature acquisition process of the present invention;
Figure 2 is a flow chart of the similarity computation of the present invention;
Figure 3 is a schematic diagram of raising the classification confidence of hard positive samples in the present invention.
The dense target detection method for remote sensing images based on representative features described in this embodiment comprises the following steps:
(1) Construct four network modules, including a feature extraction network, a feature pyramid network, a preliminary prediction network, and a final prediction network. Input the remote sensing image to be detected into the feature extraction network, which uses convolutional layers to reduce the size of the original image and inputs the extracted effective features into the feature pyramid network, which then outputs the preliminary feature map FM_FAM.
ResNet or HRNet is selected as the feature extraction convolutional neural network; the feature alignment module FAM of the S²A-NET model is selected as the preliminary prediction network; and the rotation detection module ODM of the S²A-NET model is selected as the final prediction network.
(2) Input the preliminary feature map FM_FAM into the feature alignment module FAM and, over all categories in the dataset, select a representative feature of each category's semantic information and each category's representative confidence over the whole feature map, as shown in Figure 1. The process is:

(203) Extract the feature vector in row h, column w of the preliminary feature map FM_FAM to represent the representative feature RepFeature_k of category k, where FM_FAM is the previous-layer feature map shared by the classification branch and the regression branch of the preliminary prediction network. FM_FAM simultaneously contains features related to both object category and location and is used for the subsequent similarity computation between features. The feature map FM_FAM is H×W×C-dimensional, where H and W are the height and width of the feature map and C is its number of channels; in this embodiment C is 256.
(204) Set a classification confidence threshold to strike the best balance between the reliability of a representative feature and the difficulty of becoming one. In this embodiment the threshold is set to 0.6: only when the representative confidence of category k exceeds 0.6 is the representative feature of category k a valid representative feature. When RepConfidence_k is low, e.g. 0.3 or 0.4, the probability that RepFeature_k itself belongs to category k is also low, and it cannot serve as a valid representative feature.
(3) Input the feature map output by the preliminary prediction network into the final prediction network to obtain the final feature map, and compute the similarity between each category's representative feature and the feature vector at the same position of the final feature map; the flow is shown in Figure 2. The similarity includes feature semantic similarity and feature spatial similarity.

The feature semantic similarity may be measured with any one of Euclidean similarity, cosine similarity, or Gaussian similarity. This embodiment uses Gaussian similarity, as follows:
An embedded Gaussian similarity measurement function is used to calculate the similarity of the features' semantic information, and the measure is normalized:

Sim_Embedded_Gaussian(RF_k, F_hw) = exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i) / N(φ(RF))

where RF_k denotes the representative feature RepFeature_k of the k-th category, and F_hw denotes the feature vector in row h, column w of the feature map FM_ODM output by the final prediction network. Both feature vectors are 1×1×n-dimensional; in this embodiment the dimensionality n is 256, and i indexes the i-th of the n dimensions.

A linear embedding space is used:

φ(RF_k) = W_φ · RF_k
θ(F_hw) = W_θ · F_hw

where W_φ and W_θ are learned weight matrices, and φ(RF_k)_i and θ(F_hw)_i denote the values of the two embedded feature vectors in each dimension.

N(φ(RF)) is the normalization factor, obtained by summing the similarities between the feature vector F_hw in row h, column w of the final prediction network and the 15 valid representative features RF_k. It normalizes the embedded Gaussian similarity into the range 0 to 1, avoiding the gradient explosion caused by excessively large similarities:

N(φ(RF)) = Σ_{k=1}^{15} exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i)
The feature spatial similarity calculation includes the following steps:

(301) Compute the distance between the two feature vectors on the feature map from their coordinates:

dis(RF_k, F_hw) = sqrt((x_RF - x_F)² + (y_RF - y_F)²)

where (x_RF, y_RF) are the horizontal and vertical coordinates of the feature vector RepFeature_k in the feature map and (x_F, y_F) are those of F_hw. When training the model, the five bottom-up feature maps of the feature pyramid network are used for prediction; their strides stride_i are 8, 16, 32, 64, and 128 respectively.
(302) Multiply dis(RF_k, F_hw) by the stride stride_i of the corresponding feature map to obtain the spatial distance of the two feature vectors on the original image, and from it the spatial correlation:

Corr_Spatial_i(RF_k, F_hw) = exp(-α · stride_i · dis(RF_k, F_hw))

where Spatial_i indicates that RF_k and F_hw are taken from the i-th bottom-up layer of the feature pyramid network, and α is a scale parameter. In this embodiment α is set to 1/64 so that two features that are closer together have higher spatial location correlation.
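The spatial term can be sketched as follows. The published formula images are not reproduced in this text, so the exponential decay below is an assumption chosen to be consistent with the prose (α = 1/64, closer features score higher); the grid distance and the stride scaling follow step (301):

```python
import math

def spatial_correlation(rf_pos, f_pos, stride, alpha=1.0 / 64):
    """Spatial correlation of RF_k and F_hw from a pyramid level with the given
    stride. dis() is the grid distance of step (301); multiplying by the stride
    maps it onto the original image. The exp(-alpha * d) decay is an assumed
    form, picked so that closer features correlate more strongly."""
    dis = math.hypot(rf_pos[0] - f_pos[0], rf_pos[1] - f_pos[1])
    return math.exp(-alpha * stride * dis)

same_cell = spatial_correlation((4, 4), (4, 4), stride=8)   # identical position -> 1.0
near      = spatial_correlation((4, 4), (4, 6), stride=8)   # 2 cells apart
far       = spatial_correlation((4, 4), (4, 30), stride=8)  # 26 cells apart
```

With α = 1/64, correlations stay in (0, 1] and fall off smoothly with distance, so the term can be added directly to the 0-to-1 semantic similarity.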
The similarity expression is:

Similarity(RF_k, F_hw) = Sim_Embedded_Gaussian(RF_k, F_hw) + Corr_Spatial_i(RF_k, F_hw)
(4) With the similarity obtained in step 3 as the weight, adaptively raise the classification confidence on top of the hard positive sample's classification confidence to obtain its final classification confidence, as shown schematically in Figure 3.
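Step (4) reduces to a single update per position. The additive form below is reconstructed from the prose (the formula image is not reproduced here), and clipping to 1.0 is an extra assumption to keep the result a valid confidence; all numbers are hypothetical:

```python
def boost_confidence(cls_conf, similarity, rep_confidence):
    """Final confidence for category k at (h, w): the predicted confidence plus
    the representative confidence weighted by the similarity (assumed additive
    form; the min() clip to 1.0 is an added assumption)."""
    return min(1.0, cls_conf + similarity * rep_confidence)

# A hard positive at 0.45 with similarity 0.30 to a 0.90-confidence
# representative now clears a 0.6 post-processing threshold:
final = boost_confidence(0.45, 0.30, 0.90)  # about 0.72
```

Because the boost is weighted by similarity, positions that are semantically and spatially far from any valid representative receive little extra confidence, so low-confidence negatives remain suppressed.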
Claims (9)
- 1. A dense target detection method for remote sensing images based on representative features, characterized by comprising the following steps: (1) construct four network modules, including a feature extraction network, a feature pyramid network, a preliminary prediction network, and a final prediction network; input the remote sensing image to be detected into the feature extraction network and the feature pyramid network in turn, and output a preliminary feature map; (2) input the preliminary feature map into the preliminary prediction network and, over all categories in the dataset, select a representative feature of each category's semantic information and each category's representative confidence over the whole feature map; (3) input the feature map output by the preliminary prediction network into the final prediction network to obtain the final feature map, and compute the similarity between each category's representative feature and the feature vector at the same position of the final feature map; (4) with the similarity obtained in step 3 as the weight, adaptively raise the classification confidence on top of the hard positive sample's classification confidence, yielding the hard positive sample's final classification confidence.
- 2. The dense target detection method according to claim 1, characterized in that the process of obtaining the highest classification confidence and the representative features in step 2 is: (204) set a classification confidence threshold; only when the representative confidence of category k exceeds this threshold is the representative feature of category k a valid representative feature.
- 3. The dense target detection method according to claim 2, characterized in that the similarity in step 3 includes feature semantic similarity and feature spatial similarity, the feature semantic similarity calculation comprising: using an embedded Gaussian similarity measurement function to calculate the similarity of the features' semantic information, with the measure normalized: Sim_Embedded_Gaussian(RF_k, F_hw) = exp(Σ_{i=1}^{n} φ(RF_k)_i · θ(F_hw)_i) / N(φ(RF)), where RF_k denotes the representative feature RepFeature_k of the k-th category, F_hw denotes the feature vector in row h, column w of the feature map FM_ODM output by the final prediction network, both feature vectors are 1×1×n-dimensional, and i indexes the i-th of the n dimensions; a linear embedding space is used, φ(RF_k) = W_φ · RF_k and θ(F_hw) = W_θ · F_hw, where W_φ and W_θ are learned weight matrices and φ(RF_k)_i, θ(F_hw)_i denote the values of the two embedded feature vectors in each dimension; N(φ(RF)) is the normalization factor, obtained by summing the similarities between F_hw and the K valid representative features RF_k, K being the number of categories in the dataset, which normalizes the embedded Gaussian similarity into the range 0 to 1 and avoids the gradient explosion caused by excessively large similarities.
- 4. The dense target detection method according to claim 3, characterized in that the feature spatial similarity calculation comprises: (302) multiplying dis(RF_k, F_hw) by the stride stride_i of each feature map to obtain the spatial distance of the two feature vectors on the original image and the spatial correlation Corr_Spatial_i(RF_k, F_hw), where Spatial_i indicates that RF_k and F_hw are taken from the i-th bottom-up layer of the feature pyramid network and α is a scale parameter; the similarity in step 3 is therefore Similarity(RF_k, F_hw) = Sim_Embedded_Gaussian(RF_k, F_hw) + Corr_Spatial_i(RF_k, F_hw).
- 5. The dense target detection method according to claim 3, characterized in that the feature semantic similarity is measured with any one of Euclidean similarity, cosine similarity, or Gaussian similarity.
- 6. The dense target detection method according to claim 1, characterized in that the feature extraction network uses convolutional layers to reduce the size of the original image and inputs the extracted effective features into the feature pyramid network; ResNet or HRNet is selected as the feature extraction convolutional neural network.
- 7. The dense target detection method according to claim 1, characterized in that the preliminary prediction network uses the feature alignment module of the S²A-NET model.
- 8. The dense target detection method according to claim 1, characterized in that the final prediction network uses the rotation detection module of the S²A-NET model.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202110725564.1A | 2021-06-29 | 2021-06-29 | A dense target detection method for remote sensing images based on representative features |
| CN202110725564.1 | 2021-06-29 | | |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2023273337A1 | 2023-01-05 |
Family (ID=78097103)

Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2022/074542 | A dense target detection method for remote sensing images based on representative features | 2021-06-29 | 2022-01-28 |

Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN113536986B |
| WO (1) | WO2023273337A1 |
Cited By (2)
| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| CN116935226A | 2023-08-01 | 2023-10-24 | Xidian University | An improved HRNet-based road extraction method, system, device, and medium for remote sensing images |
| CN117746314A | 2023-11-20 | 2024-03-22 | Jiangsu Xingtu Intelligent Technology Co., Ltd. | A method, device, and medium for determining OOD objects based on multi-level joint decisions |
Families Citing this family (1)
| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| CN113536986B | 2021-06-29 | 2024-06-14 | Nanjing Yizhi Cyberspace Technology Innovation Research Institute Co., Ltd. | A dense target detection method for remote sensing images based on representative features |
Citations (4)
| Publication Number | Priority Date | Publication Date | Assignee | Title |
|---|---|---|---|---|
| US20200097771A1 | 2018-09-25 | 2020-03-26 | NEC Laboratories America, Inc. | Deep group disentangled embedding and network weight generation for visual inspection |
| CN111259758A | 2020-01-13 | 2020-06-09 | China University of Mining and Technology | A two-stage remote sensing image target detection method for dense regions |
| CN112818777A | 2021-01-21 | 2021-05-18 | Shanghai University of Electric Power | A remote sensing image target detection method based on dense connection and feature enhancement |
| CN113536986A | 2021-06-29 | 2021-10-22 | Nanjing Yizhi Cyberspace Technology Innovation Research Institute Co., Ltd. | A dense target detection method for remote sensing images based on representative features |
Non-Patent Citations (2)
- Li Peng, Ren Peng, Zhang Xiaoyu, Wang Qian, Zhu Xiaobin, Wang Lei: "Region-Wise Deep Feature Representation for Remote Sensing Images", Remote Sensing, vol. 10, no. 6, 1 January 2018, pp. 1-14, XP093020230, DOI: 10.3390/rs10060871
- Xin-Yi Tong, Gui-Song Xia, Fan Hu, Yanfei Zhong, Mihai Datcu, Liangpei Zhang: "Exploiting Deep Features for Remote Sensing Image Retrieval: A Systematic Investigation", arXiv.org, 23 July 2017, pp. 1-26, XP081560840
Also Published As
| Publication Number | Publication Date |
|---|---|
| CN113536986A | 2021-10-22 |
| CN113536986B | 2024-06-14 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: PCT application non-entry in European phase | Ref document number: 22831171; Country of ref document: EP; Kind code of ref document: A1 |