
CN117953400A - Green micro-reconstruction system for old urban residential communities - Google Patents

Green micro-reconstruction system for old urban residential communities

Info

Publication number
CN117953400A
CN117953400A (application number CN202410199648.XA)
Authority
CN
China
Prior art keywords
image
areas
old
shooting
aerial vehicle
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202410199648.XA
Other languages
Chinese (zh)
Other versions
CN117953400B (en)
Inventor
张楠
张伟
刘光鹏
施国良
黄伟明
陈蕾
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Prefabricated Architectural Design Institute Co ltd
Guangzhou Municipal Construction Group Co ltd
Guangzhou Municipal Group Co ltd
Gz Municipal Group Design Institute Co ltd
Original Assignee
Guangdong Prefabricated Architectural Design Institute Co ltd
Guangzhou Municipal Construction Group Co ltd
Guangzhou Municipal Group Co ltd
Gz Municipal Group Design Institute Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Prefabricated Architectural Design Institute Co ltd, Guangzhou Municipal Construction Group Co ltd, Guangzhou Municipal Group Co ltd, Gz Municipal Group Design Institute Co ltd
Priority to CN202410199648.XA
Publication of CN117953400A
Application granted
Publication of CN117953400B
Legal status: Active


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00: Scenes; Scene-specific elements
    • G06V20/10: Terrestrial scenes
    • G06V20/17: Terrestrial scenes taken from planes or by drones
    • G06V10/00: Arrangements for image or video recognition or understanding
    • G06V10/10: Image acquisition
    • G06V10/16: Image acquisition using multiple overlapping images; Image stitching
    • G06V10/20: Image preprocessing
    • G06V10/26: Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02A: TECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A30/00: Adapting or protecting infrastructure or their operation
    • Y02A30/60: Planning or developing urban green infrastructure

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Remote Sensing (AREA)
  • Image Processing (AREA)

Abstract

The invention belongs to the field of image recognition and discloses a green micro-reconstruction system for old urban residential communities, comprising a shooting-coordinate determination module, a flight control module, an unmanned aerial vehicle, and an image recognition module. The shooting-coordinate determination module determines the shooting coordinates at which the unmanned aerial vehicle photographs the old community from the air; the flight control module directs the unmanned aerial vehicle to each shooting coordinate, where it takes a nadir (vertically downward) photograph of the community; the unmanned aerial vehicle sends the nadir images to the image recognition module; and the image recognition module stitches all nadir images into a stitched image, performs image recognition on it to obtain the areas present, and merges adjacent areas of the same type into single areas, yielding a distribution map of each type of area in the old community. The system can rapidly determine how the different areas of an old community are distributed and produce the corresponding distribution maps.

Description

Green micro-reconstruction system for old urban residential communities
Technical Field
The invention relates to the field of image recognition, and in particular to a green micro-reconstruction system for old urban residential communities.
Background
When an old urban residential community is micro-reconstructed, a distribution map of the community's various areas is usually needed first, so that designers can renovate purposefully according to how areas with different functions are distributed within the community. For example, internal roads may be rerouted to make it more convenient for residents to enter and leave.
Because old communities were built long ago, the original plan drawings have often been lost. How to quickly obtain a distribution map of the different ground areas of an old community has therefore become a technical problem to be solved.
Disclosure of Invention
The invention aims to disclose a green micro-reconstruction system for old urban residential communities, solving the problem of how to quickly obtain distribution maps of the different ground areas of such communities.
In order to achieve the above purpose, the present invention provides the following technical solutions:
the invention provides a green micro-reconstruction system for old urban residential communities, comprising a shooting-coordinate determination module, a flight control module, an unmanned aerial vehicle, and an image recognition module;
the shooting-coordinate determination module is used for determining the shooting coordinates at which the unmanned aerial vehicle photographs the old community from the air;
the flight control module is used for directing the unmanned aerial vehicle to each shooting coordinate in turn and, once it arrives, having it take a nadir (vertically downward) photograph of the old community to obtain a nadir image;
the unmanned aerial vehicle is used for sending the nadir images to the image recognition module;
the image recognition module is used for stitching all nadir images into a stitched image, performing image recognition on the stitched image to obtain the areas present in it, and merging adjacent areas of the same type into single areas, thereby obtaining a distribution map of each type of area in the old community.
Optionally, the flight control module is further configured to fly the unmanned aerial vehicle to a first height above the old community and have it shoot the community vertically downward from that height, obtaining an aerial image.
Optionally, determining the shooting coordinates for aerial photography of the old community includes:
cutting the aerial image according to the longitude and latitude range of the old community on the map to obtain a cut image;
partitioning the cut image into a plurality of local areas;
obtaining the longitude and latitude of the center of each local area and using them as the shooting coordinates.
Optionally, directing the unmanned aerial vehicle to each shooting coordinate includes:
flying the unmanned aerial vehicle to a second height directly above each shooting coordinate in turn, the second height being lower than the first height.
Optionally, partitioning the cut image into a plurality of local areas includes:
S1, obtaining the minimum bounding rectangle of the cut image;
S2, cutting the image formed by the pixels within the minimum bounding rectangle into a plurality of sub-regions of the same size;
S3, locally merging the sub-regions to obtain a plurality of local areas.
Optionally, cutting the image formed by the pixels within the minimum bounding rectangle into sub-regions of the same size includes:
computing, from the image P formed by the pixels within the minimum bounding rectangle, the number N of sub-regions;
cutting P into N sub-regions of equal area.
Optionally, locally merging the sub-regions to obtain a plurality of local areas includes:
S10, storing all sub-regions into a set K;
S20, randomly selecting a sub-region from K as the processing region;
S30, saving the processing region into a set Q;
S40, computing a merge probability value UAB for each sub-region that is adjacent to the processing region and belongs to K;
S50, judging whether the maximum UAB is not less than a set merge-probability threshold; if so, storing the sub-region with the maximum UAB into Q, taking it as the new processing region, and going to S60; if not, going to S70;
S60, deleting the sub-regions in Q from K, then judging whether the number of sub-regions in Q exceeds a set size threshold; if not, returning to S40; if so, going to S70;
S70, deleting the sub-regions in Q from K, forming a local area from all sub-regions in Q, and going to S80;
S80, resetting Q to the empty set, and going to S90;
S90, judging whether K is empty; if so, outputting all local areas; if not, returning to S20.
Optionally, performing image recognition on the stitched image to obtain the areas present in it includes:
performing image recognition on the stitched image with a pre-trained neural network to obtain the areas present in the stitched image.
Optionally, the types of areas include residential areas, commercial areas, sports areas, road areas, and greening areas.
Beneficial effects:
the invention uses the unmanned aerial vehicle to shoot at different positions above the old community, stitches the resulting nadir images into a stitched image, and then performs image recognition on the stitched image, so that the distribution of the different areas within the old community is rapidly determined and a distribution map of those areas is obtained.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed for the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a schematic diagram of the green micro-reconstruction system for old urban residential communities according to the present invention.
Fig. 2 is a schematic illustration of the process of partitioning a cut image into a plurality of local areas according to the present invention.
Detailed Description
The following describes the technical solutions in the embodiments of the present invention clearly and completely with reference to the accompanying drawings. The described embodiments are obviously only some, not all, of the embodiments of the invention. All other embodiments obtained by a person of ordinary skill in the art from the described embodiments without inventive effort fall within the scope of protection of the present invention.
As shown in Fig. 1, an embodiment of the invention provides a green micro-reconstruction system for old urban residential communities, comprising a shooting-coordinate determination module, a flight control module, an unmanned aerial vehicle, and an image recognition module;
the shooting-coordinate determination module is used for determining the shooting coordinates at which the unmanned aerial vehicle photographs the old community from the air;
the flight control module is used for directing the unmanned aerial vehicle to each shooting coordinate in turn and, once it arrives, having it take a nadir (vertically downward) photograph of the old community to obtain a nadir image;
the unmanned aerial vehicle is used for sending the nadir images to the image recognition module;
the image recognition module is used for stitching all nadir images into a stitched image, performing image recognition on the stitched image to obtain the areas present in it, and merging adjacent areas of the same type into single areas, thereby obtaining a distribution map of each type of area in the old community.
The invention uses the unmanned aerial vehicle to shoot at different positions above the old community, stitches the resulting nadir images into a stitched image, and then performs image recognition on the stitched image, so that the distribution of the different areas within the old community is rapidly determined and a distribution map of those areas is obtained.
Specifically, during nadir shooting the lens of the unmanned aerial vehicle points perpendicular to the ground plane.
Optionally, stitching all nadir images into a stitched image includes:
first round of stitching:
randomly selecting one nadir image as the reference image, then stitching to it, one by one, the other nadir images whose shooting coordinates are adjacent to those of the reference image, which completes the first round of stitching;
second round of stitching:
taking the image obtained by the first round as the reference image of the second round, and again stitching to it the other nadir images whose shooting coordinates are adjacent to those of the reference image, which completes the second round;
and so on, until all nadir images are stitched into one stitched image.
Specifically, within one round, the nadir images adjacent to the reference image's shooting coordinates can be stitched to the reference image one after another in clockwise or counterclockwise order. Each stitch yields a new, larger reference image for that round's next stitch, so the reference image grows as the number of stitches increases.
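The round-by-round order above amounts to growing the mosaic in breadth-first layers over the grid of shooting coordinates. The sketch below reproduces only that ordering; the actual pixel-level stitching (feature matching, blending) is outside its scope, and all names are illustrative:

```python
def stitch_order(coords, seed):
    """Group tiles by stitching round: round r contains all tiles
    grid-adjacent to the mosaic built up in rounds 0..r-1."""
    coords = set(coords)
    assert seed in coords
    mosaic = {seed}
    rounds = [[seed]]
    frontier = [seed]
    while len(mosaic) < len(coords):
        nxt = []
        for (x, y) in frontier:
            for n in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if n in coords and n not in mosaic:
                    mosaic.add(n)
                    nxt.append(n)
        if not nxt:
            break  # remaining tiles are not grid-adjacent to the mosaic
        rounds.append(sorted(nxt))
        frontier = nxt
    return rounds
```

On a 3×3 grid seeded at the center, round 1 contains the four edge-adjacent tiles and round 2 the four corners, matching the "reference image grows each round" description.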
Optionally, merging adjacent areas of the same type into a single area includes:
for an area c of type A, merging into c every adjacent area of the same type as c, giving the area obtained by the first merge;
for the area obtained by the first merge, merging into it every adjacent area of the same type, giving the area obtained by the second merge;
and so on, until none of the areas adjacent to the merged area share its type, which yields the merged result for one area of that type.
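Repeated merging of adjacent same-type areas until no merge remains is equivalent to connected-component labelling over the recognized type map. A minimal sketch under that reading, with the grid representation and all names being assumptions for illustration:

```python
def merge_same_type(type_grid):
    """Merge 4-adjacent cells of the same type into regions,
    i.e. connected-component labelling per type."""
    rows, cols = len(type_grid), len(type_grid[0])
    region_id = [[None] * cols for _ in range(rows)]
    regions = []
    for r in range(rows):
        for c in range(cols):
            if region_id[r][c] is not None:
                continue
            t = type_grid[r][c]
            stack = [(r, c)]
            cells = []
            region_id[r][c] = len(regions)
            while stack:  # flood fill = repeated adjacent merging
                y, x = stack.pop()
                cells.append((y, x))
                for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                    if (0 <= ny < rows and 0 <= nx < cols
                            and region_id[ny][nx] is None
                            and type_grid[ny][nx] == t):
                        region_id[ny][nx] = len(regions)
                        stack.append((ny, nx))
            regions.append((t, cells))
    return regions
```

Each returned `(type, cells)` pair corresponds to one fully merged area on the distribution map.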
Optionally, on the distribution map of each type of area of the old community, the same area is shown in a single color, and the type of each area is marked inside it with text or letters.
Optionally, the flight control module is further configured to fly the unmanned aerial vehicle to a first height above the old community and have it shoot the community vertically downward from that height, obtaining an aerial image.
Specifically, when the unmanned aerial vehicle shoots from the first height, all areas of the old community fall within the captured image.
Optionally, determining the shooting coordinates for aerial photography of the old community includes:
cutting the aerial image according to the longitude and latitude range of the old community on the map to obtain a cut image;
partitioning the cut image into a plurality of local areas;
obtaining the longitude and latitude of the center of each local area and using them as the shooting coordinates.
Because the aerial image is shot from a great height, the features in it are not sufficient to distinguish the individual areas of the old community. The invention therefore cuts the aerial image to obtain a cut image and partitions the cut image into a plurality of local areas, so that each local area can subsequently be photographed from a lower height and the photographs stitched into a clearer image of the old community. This raises the probability that image recognition will successfully identify the different areas of the community.
Taking the center point of each local area as a shooting coordinate avoids photographing the same area repeatedly during shooting, and thus avoids producing an excessive number of images for stitching.
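Taking the center of a local area as a shooting coordinate can be sketched as a centroid computation under an assumed locally linear pixel-to-latitude/longitude mapping. The function name, the mapping, and its parameters are illustrative, not from the patent:

```python
def shooting_coordinate(area_pixels, origin_latlon, deg_per_px):
    """Centroid of a local area (list of (row, col) pixels), converted
    to (lat, lon). Assumes a locally linear mapping in which latitude
    decreases with row index and longitude increases with column index."""
    n = len(area_pixels)
    cy = sum(p[0] for p in area_pixels) / n  # mean row
    cx = sum(p[1] for p in area_pixels) / n  # mean column
    lat0, lon0 = origin_latlon
    dlat, dlon = deg_per_px
    return (lat0 - cy * dlat, lon0 + cx * dlon)
```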
Specifically, cutting the aerial image according to the longitude and latitude range of the old community on the map to obtain a cut image includes:
first obtaining the set B of the longitude and latitude of every point on the edge of the region enclosed by the old community on the map, then obtaining the longitude and latitude of every pixel in the aerial image, and storing into a set C the pixels of the aerial image whose longitude and latitude belong to B;
in the aerial image, connecting every pair of adjacent pixels in C, thereby obtaining a closed region;
taking the pixels inside the closed region as the pixels of the cut image.
Cutting the aerial image by the longitude and latitude range removes the parts of the aerial image that do not belong to the old community, reducing the number of pixels that must be processed in later computations and so producing the distribution map faster.
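One plausible way to realize the boundary-based cut is an even-odd ray-casting point-in-polygon test over per-pixel coordinates, masking out pixels that fall outside the community boundary. Function names and the `None` masking convention are assumptions for illustration:

```python
def inside(poly, pt):
    """Even-odd ray casting: is pt inside the closed polygon
    given as a list of (x, y) vertices?"""
    x, y = pt
    n = len(poly)
    hit = False
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge crosses the horizontal through pt
            xc = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if xc > x:
                hit = not hit
    return hit

def cut_image(pixels, pixel_latlon, boundary):
    """Keep only pixels whose (lat, lon) falls inside the community
    boundary polygon; pixels outside are masked out as None."""
    return [[pixels[r][c] if inside(boundary, pixel_latlon(r, c)) else None
             for c in range(len(pixels[0]))] for r in range(len(pixels))]
```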
Optionally, directing the unmanned aerial vehicle to each shooting coordinate includes:
flying the unmanned aerial vehicle to a second height directly above each shooting coordinate in turn, the second height being lower than the first height.
Specifically, and by way of example, the first height is 500 meters and the second height is 100 meters.
Optionally, as shown in Fig. 2, partitioning the cut image into a plurality of local areas includes:
S1, obtaining the minimum bounding rectangle of the cut image;
S2, cutting the image formed by the pixels within the minimum bounding rectangle into a plurality of sub-regions of the same size;
S3, locally merging the sub-regions to obtain a plurality of local areas.
Conventional cutting is usually performed at the level of individual pixels, which yields a very accurate result. The present invention does not need such accuracy; only a rough cutting result is required.
Optionally, cutting the image formed by the pixels within the minimum bounding rectangle into sub-regions of the same size includes:
computing, from the image P formed by the pixels within the minimum bounding rectangle, the number N of sub-regions;
cutting P into N sub-regions of equal area.
The number of sub-regions is not fixed in advance, because different old communities yield aerial images whose pixel-area distributions and gray-value distributions differ. If a preset number were used, then when the pixel areas and gray values are distributed uniformly, too many sub-regions would be produced, and the excessive number of sub-regions to be locally merged would hurt the efficiency of obtaining local areas. Conversely, when the distributions are uneven, too few sub-regions would be produced; the pixels within a sub-region would then differ too much, the resulting local areas would be more likely to span multiple area types of the old community, areas of the same type would be less likely to be captured within a single nadir image during shooting, and the accuracy of the subsequent image stitching would suffer.
Optionally, computing the number N of sub-regions from the image P formed by the pixels within the minimum bounding rectangle includes:
obtaining the gray image GP corresponding to P;
dividing GP into 81 sub-images of the same size;
computing the number N of sub-regions from the following quantities:
bsNum, a preset standard number of sub-regions; GPSet, the set of all sub-images; vasf_i, the variance of the gray values of the pixels in sub-image i; vasfmin, the minimum of those variances over GPSet; segthre_GP, the segmentation threshold computed for GP by a single-threshold segmentation algorithm; graymid_GP, the median gray value of the pixels in GP; α, a preset gray-distribution-uniformity weight; and β, a preset gray-concentration weight.
When computing the number of sub-regions, GP is divided into 81 sub-images of the same size, and the number of sub-regions is then computed adaptively from the variances of the gray values of the pixels in the sub-images together with the segmentation threshold of GP. The larger the differences between the sub-image variances, and the larger the absolute difference between the segmentation threshold and the median gray value, the more uneven the area and gray-value distributions of the pixels in P are; the smaller those differences, the more the invention adaptively reduces the number of sub-regions, improving the efficiency of the subsequent local merging. The invention thereby strikes a good balance between the accuracy and the efficiency of local merging.
Optionally, the preset gray-distribution-uniformity weight is 0.55, and the preset gray-concentration weight is 0.45.
Optionally, the number of subregions is 26244.
Optionally, dividing GP into 9×9 sub-images of the same size includes:
letting GPA and GPB respectively denote the number of rows and columns of pixels in GP, so that each sub-image has GPA/9 rows and GPB/9 columns of pixels;
dividing GP into 81 sub-images of size (GPA/9)×(GPB/9).
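The combining formula for N is not reproduced in this text, so the sketch below computes only the quantities it is defined over: the per-sub-image gray variances (vasf_i and their minimum) from a k×k tile split, and a single-threshold segmentation level (here Otsu's method, one common single-threshold algorithm; the patent does not name which one it uses). This is an illustrative reading, not the patented formula:

```python
import statistics

def tile_variances(gray, k=9):
    """Split a gray image (list of rows of ints) into k x k tiles and
    return each tile's population variance of gray values (vasf_i)."""
    h, w = len(gray), len(gray[0])
    th, tw = h // k, w // k
    out = []
    for i in range(k):
        for j in range(k):
            vals = [gray[r][c]
                    for r in range(i * th, (i + 1) * th)
                    for c in range(j * tw, (j + 1) * tw)]
            out.append(statistics.pvariance(vals))
    return out

def otsu_threshold(vals):
    """Single segmentation threshold for 8-bit gray values,
    maximising the between-class variance (Otsu's method)."""
    hist = [0] * 256
    for v in vals:
        hist[v] += 1
    total = len(vals)
    total_sum = sum(i * hist[i] for i in range(256))
    best_t, best_var, wb, sb = 0, -1.0, 0, 0.0
    for t in range(256):
        wb += hist[t]          # background pixel count
        if wb == 0 or wb == total:
            continue
        sb += t * hist[t]      # background gray sum
        mb = sb / wb
        mf = (total_sum - sb) / (total - wb)
        var = wb * (total - wb) * (mb - mf) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

The remaining ingredient, graymid_GP, is simply `statistics.median` over all gray values.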
Optionally, cutting P into N sub-regions of the same area includes:
expressing N as N = N1×N2, where N1 and N2 are the integer factor pair of N with the smallest absolute difference |N1-N2|;
so that each sub-region has GPA/N1 rows and GPB/N2 columns of pixels;
cutting P into N sub-regions of size (GPA/N1)×(GPB/N2).
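The balanced factorization and the resulting tile shape can be sketched directly; function names are illustrative, and the sketch assumes (as the equal-area wording implies) that N1 and N2 divide the image dimensions evenly:

```python
def balanced_factors(n):
    """Write n = n1 * n2 with integer factors whose difference
    |n1 - n2| is minimal, returning (n1, n2) with n1 <= n2."""
    best = (1, n)
    f = 1
    while f * f <= n:
        if n % f == 0 and (n // f - f) < (best[1] - best[0]):
            best = (f, n // f)
        f += 1
    return best

def tile_shape(rows, cols, n):
    """Rows and columns per sub-region when cutting a rows x cols
    image into n equal-area sub-regions arranged n1 x n2."""
    n1, n2 = balanced_factors(n)
    return rows // n1, cols // n2
```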
Optionally, locally merging the sub-regions to obtain a plurality of local areas includes:
S10, storing all sub-regions into a set K;
S20, randomly selecting a sub-region from K as the processing region;
S30, saving the processing region into a set Q;
S40, computing a merge probability value UAB for each sub-region that is adjacent to the processing region and belongs to K;
S50, judging whether the maximum UAB is not less than a set merge-probability threshold; if so, storing the sub-region with the maximum UAB into Q, taking it as the new processing region, and going to S60; if not, going to S70;
S60, deleting the sub-regions in Q from K, then judging whether the number of sub-regions in Q exceeds a set size threshold; if not, returning to S40; if so, going to S70;
S70, deleting the sub-regions in Q from K, forming a local area from all sub-regions in Q, and going to S80;
S80, resetting Q to the empty set, and going to S90;
S90, judging whether K is empty; if so, outputting all local areas; if not, returning to S20.
During local merging, the sub-regions already assigned to a local area are continually deleted from the set K, which avoids repeatedly re-merging sub-regions whose local area is already determined and so improves the efficiency of local merging. In addition, the invention sets a size threshold to keep any single local area from growing too large: if a local area were too large, the gap between the largest and the smallest of the final local areas could become excessive, and the unmanned aerial vehicle would be unable to capture the largest local area within a single nadir image from its shooting height.
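Read conventionally (absorb a neighbour when its merge probability clears the threshold; stop when no neighbour qualifies or the region reaches the size limit), the steps above can be sketched as the greedy loop below. The merge-probability function is pluggable, the bookkeeping is simplified (a sub-region leaves K as soon as it is absorbed), and all names are assumptions:

```python
import random

def local_merge(subregions, neighbors, merge_prob,
                prob_thre=0.7, size_thre=9, seed=0):
    """Grow local areas greedily: start from a randomly chosen seed
    sub-region, repeatedly absorb the adjacent sub-region with the
    highest merge probability while it clears prob_thre, stop once the
    region exceeds size_thre, then start a fresh region."""
    rng = random.Random(seed)
    K = set(subregions)          # sub-regions not yet assigned
    local_areas = []
    while K:
        cur = rng.choice(sorted(K))   # S20: random seed
        Q = [cur]                     # S30: growing region
        K.discard(cur)
        while len(Q) <= size_thre:
            cands = [k for k in neighbors(cur) if k in K]  # S40
            if not cands:
                break
            best = max(cands, key=lambda k: merge_prob(cur, k))
            if merge_prob(cur, best) < prob_thre:          # S50
                break
            Q.append(best)                                 # absorb
            K.discard(best)                                # S60
            cur = best
        local_areas.append(Q)                              # S70
    return local_areas                                     # S90
```

With a constant zero probability every sub-region stays its own local area; with a constant high probability adjacent sub-regions merge.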
Optionally, the computation of the merge probability value UAB involves the following:
denoting by k a sub-region that is adjacent to the processing region and belongs to the set K, the merge probability value UAB between k and the processing region is computed from:
distmid_k, the distance between the center of k and the center of the first sub-region stored in the set Q; distaim, a preset distance comparison value; avegray_k, the mean gray value of the pixels in k; avegray_cal, the mean gray value of the pixels in the processing region; gradaim_k, the gradient direction shared by the largest number of pixels in k; gradaim_cal, the gradient direction shared by the largest number of pixels in the processing region; λ1, a preset distance weight; λ2, a preset gray weight; and λ3, a preset direction weight.
When computing the merge probability value, the invention does not judge from gray values alone, because sub-regions are merged locally and a single gray value cannot fully represent the merge probability between a sub-region and the processing region. Two further factors are therefore included: distance and gradient direction. The distance term keeps the locally merged region growing roughly around the first sub-region stored in the set Q rather than stretching into an elongated strip; a strip-shaped local area would be unfavorable for photographing from its center, since the unmanned aerial vehicle could not capture the whole local area within a single nadir image. The gradient-direction term represents the degree of association between k and the processing region more accurately: the smaller the absolute difference in gradient direction and the smaller the absolute difference in mean gray value, the more likely k and the processing region belong to the same type of area in the old community. This effectively raises the probability that areas of the same type are captured within the same nadir image, which in turn improves the accuracy of image stitching.
Optionally, the preset distance comparison value is 3.
Optionally, the preset distance weight is 0.6, the preset gray weight is 0.3, and the preset direction weight is 0.1.
Optionally, the set size threshold is 9.
Optionally, the merge probability value threshold is 0.7.
Optionally, obtaining the gradient direction shared by the largest number of pixels includes:
counting, for each gradient-direction value, the number of pixels having that direction;
taking the gradient direction with the largest count.
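The mode-of-directions procedure above can be sketched with central differences and a quantized angle histogram. The patent does not specify the gradient operator or the number of direction bins, so both are assumptions here:

```python
import math
from collections import Counter

def dominant_gradient_direction(gray, bins=8):
    """Quantized gradient direction (0..bins-1) shared by the largest
    number of interior pixels, using central differences."""
    h, w = len(gray), len(gray[0])
    counts = Counter()
    for r in range(1, h - 1):
        for c in range(1, w - 1):
            gy = gray[r + 1][c] - gray[r - 1][c]
            gx = gray[r][c + 1] - gray[r][c - 1]
            ang = math.atan2(gy, gx) % (2 * math.pi)
            counts[int(ang / (2 * math.pi) * bins) % bins] += 1
    return counts.most_common(1)[0][0]
```

A horizontal gray ramp yields bin 0 (gradient pointing along +x); a vertical ramp yields the bin a quarter turn away.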
Optionally, performing image recognition on the stitched image to obtain the areas present in it includes:
performing image recognition on the stitched image with a pre-trained neural network to obtain the areas present in the stitched image.
Optionally, the types of areas include residential areas, commercial areas, sports areas, road areas, and greening areas.
Specifically, residential areas include multi-storey apartment areas and detached-villa areas. Commercial areas include restaurants, hotels, supermarkets, and the like. Sports areas are areas where sports equipment is placed, including basketball courts, table-tennis courts, and the like. Greening areas are the areas of the community occupied by greenery.
In the several embodiments provided by the present invention, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative: the division of units is merely a logical functional division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the couplings, direct couplings or communication connections shown or discussed may be indirect couplings or communication connections via interfaces, devices or units, and may be electrical, mechanical or in other forms.
The units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in the embodiments of the present invention may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it; although the invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the technical solutions of the embodiments of the present invention.

Claims (9)

1. A green micro-reconstruction system for old urban communities, characterized by comprising a shooting coordinate determination module, a flight control module, an unmanned aerial vehicle and an image recognition module;
the shooting coordinate determination module is used for determining the shooting coordinates of the unmanned aerial vehicle for aerial photography of the old community;
the flight control module is used for controlling the unmanned aerial vehicle to fly to each shooting coordinate and, after the unmanned aerial vehicle reaches a shooting coordinate, controlling the unmanned aerial vehicle to photograph the old community vertically downwards to obtain a top-down image;
the unmanned aerial vehicle is used for sending the top-down images obtained by shooting to the image recognition module;
the image recognition module is used for stitching all the top-down images to obtain a stitched image, performing image recognition on the stitched image to obtain the areas present in the stitched image, and merging areas that are of the same type and adjacent to each other into the same area, so as to obtain a distribution map of each type of area in the old community.
2. The green micro-reconstruction system for old urban communities according to claim 1, wherein the flight control module is further used for controlling the unmanned aerial vehicle to fly to a first height above the old community and controlling the unmanned aerial vehicle to photograph the old community vertically downwards at the first height to obtain an aerial image.
3. The green micro-reconstruction system for old urban communities according to claim 2, wherein determining the shooting coordinates of the unmanned aerial vehicle for aerial photography of the old community comprises:
cutting the aerial image according to the longitude and latitude range of the old community on the map to obtain a cut image;
partitioning the cut image into a plurality of local areas;
and respectively acquiring the longitude and latitude of the centre of each local area as the shooting coordinates.
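A minimal sketch of the last step, mapping the centre of each local area of the cut image's longitude/latitude bounding box to a shooting coordinate; the rectangular grid shape and the north-at-top axis orientation are illustrative assumptions:

```python
def tile_center_coordinates(lon_min, lon_max, lat_min, lat_max, rows, cols):
    """Return (longitude, latitude) of the centre of each cell of a
    rows x cols grid laid over the bounding box, row 0 at the north edge."""
    coords = []
    dlon = (lon_max - lon_min) / cols
    dlat = (lat_max - lat_min) / rows
    for r in range(rows):
        for c in range(cols):
            coords.append((lon_min + (c + 0.5) * dlon,
                           lat_max - (r + 0.5) * dlat))
    return coords
```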
4. The green micro-reconstruction system for old urban communities according to claim 3, wherein controlling the unmanned aerial vehicle to fly to each shooting coordinate comprises:
controlling the unmanned aerial vehicle to fly to a second height directly above each shooting coordinate, the second height being smaller than the first height.
5. The green micro-reconstruction system for old urban communities according to claim 3, wherein partitioning the cut image into a plurality of local areas comprises:
S1, acquiring the minimum circumscribed rectangle corresponding to the cut image;
S2, cutting the image formed by the pixel points within the minimum circumscribed rectangle to obtain a plurality of sub-areas of the same size;
S3, locally merging the sub-areas to obtain a plurality of local areas.
6. The green micro-reconstruction system for old urban communities according to claim 5, wherein cutting the image formed by the pixel points within the minimum circumscribed rectangle to obtain a plurality of sub-areas of the same size comprises:
calculating, for the image P formed by the pixel points within the minimum circumscribed rectangle, the number N of sub-areas;
and cutting P into N sub-areas of equal area.
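The equal-area cut can be sketched as a simple grid split. The formula that derives N from the image P is given elsewhere in the specification, so the grid shape is passed in here as an assumption:

```python
def cut_into_subregions(width, height, n_cols, n_rows):
    """Cut a width x height image into n_cols * n_rows equally sized
    sub-areas, returned as (x, y, w, h) boxes in row-major order."""
    w, h = width // n_cols, height // n_rows
    return [(c * w, r * h, w, h)
            for r in range(n_rows) for c in range(n_cols)]
```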
7. The green micro-reconstruction system for old urban communities according to claim 5, wherein locally merging the sub-areas to obtain a plurality of local areas comprises:
S10, storing all the sub-areas into a set K;
S20, randomly selecting a sub-area from the set K as the processing area;
S30, storing the processing area into a set Q;
S40, respectively calculating a merge probability value UAB for each sub-area that is adjacent to the processing area and belongs to the set K;
S50, judging whether the maximum value of UAB is smaller than the set merge probability value threshold; if so, storing the sub-area corresponding to the maximum value of UAB into the set Q, taking that sub-area as the new processing area, and proceeding to S60; if not, proceeding to S70;
S60, deleting the sub-areas in the set Q from the set K, and judging whether the number of sub-areas in the set Q is larger than the set number threshold; if not, returning to S40; if so, proceeding to S70;
S70, forming a local area from all the sub-areas in the set Q, and proceeding to S80;
S80, resetting the set Q to an empty set, and proceeding to S90;
S90, judging whether the number of sub-areas in the set K is 1; if so, outputting all the local areas; if not, returning to S20.
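The S10-S90 procedure can be sketched as follows. This is illustrative only: the adjacency relation and the UAB function are passed in as parameters, a deterministic choice replaces the random selection of S20 so the sketch is reproducible, and the merge condition follows the claim as stated (merge while the maximum UAB is below the threshold).

```python
def merge_subregions(subregions, neighbors, merge_prob,
                     prob_threshold=0.7, size_threshold=9):
    """Sketch of steps S10-S90. subregions: iterable of sub-area ids;
    neighbors: id -> set of adjacent ids; merge_prob: (a, b) -> UAB."""
    K = set(subregions)                   # S10
    local_areas = []
    while len(K) > 1:                     # S90: stop when one sub-area remains
        current = min(K)                  # S20 is random; min() keeps this reproducible
        K.discard(current)
        Q = {current}                     # S30 (Q is reset each outer pass, per S80)
        while len(Q) <= size_threshold:   # S60: size check against the threshold (9)
            candidates = [(merge_prob(current, b), b)
                          for b in neighbors.get(current, set()) if b in K]  # S40
            if not candidates:
                break
            u, best = max(candidates)     # neighbour with the maximum UAB
            if u >= prob_threshold:       # S50: merge only while max UAB < threshold
                break
            Q.add(best)                   # S50: absorb it and make it the new processing area
            K.discard(best)               # S60: merged sub-areas leave K
            current = best
        local_areas.append(frozenset(Q))  # S70
    return local_areas, K                 # K holds at most one unmerged sub-area
```

With a uniformly low UAB the whole neighbourhood collapses into one local area; with a uniformly high UAB every sub-area stays on its own.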
8. The green micro-reconstruction system for old urban communities according to claim 1, wherein performing image recognition on the stitched image to obtain the areas present in the stitched image comprises:
performing image recognition on the stitched image by means of a pre-trained neural network algorithm to obtain the areas present in the stitched image.
9. The green micro-reconstruction system for old urban communities according to claim 1, wherein the types of areas comprise residential areas, commercial areas, sports areas, road areas and greening areas.
CN202410199648.XA 2024-02-23 2024-02-23 Green micro-reconstruction system for old and old urban communities Active CN117953400B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410199648.XA CN117953400B (en) 2024-02-23 2024-02-23 Green micro-reconstruction system for old and old urban communities

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410199648.XA CN117953400B (en) 2024-02-23 2024-02-23 Green micro-reconstruction system for old and old urban communities

Publications (2)

Publication Number Publication Date
CN117953400A true CN117953400A (en) 2024-04-30
CN117953400B CN117953400B (en) 2024-08-23

Family

ID=90796256

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410199648.XA Active CN117953400B (en) 2024-02-23 2024-02-23 Green micro-reconstruction system for old and old urban communities

Country Status (1)

Country Link
CN (1) CN117953400B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118627862A (en) * 2024-08-12 2024-09-10 浙江省城乡规划设计研究院 Old cell update-oriented environmental data acquisition and update space analysis method

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180157911A1 (en) * 2016-12-02 2018-06-07 GEOSAT Aerospace & Technology Methods and systems for automatic object detection from aerial imagery
CN115147746A (en) * 2022-09-02 2022-10-04 广东容祺智能科技有限公司 Saline-alkali geological identification method based on unmanned aerial vehicle remote sensing image
CN116805396A (en) * 2023-08-24 2023-09-26 杭州稻道农业科技有限公司 Satellite remote sensing-based farmland weed precise identification method and device
CN116958528A (en) * 2023-07-18 2023-10-27 中邮通建设咨询有限公司 Multi-unmanned aerial vehicle cooperative target detection method based on image fusion
CN117288168A (en) * 2023-11-24 2023-12-26 山东中宇航空科技发展有限公司 Unmanned aerial vehicle city building system of taking photo by plane of low-power consumption
CN117474935A (en) * 2023-10-31 2024-01-30 深圳银星智能集团股份有限公司 Map partitioning method, robot and computer readable storage medium


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
HAN Tao; HUANG Rujin: "Application of UAV aerial survey in land development and consolidation projects" (无人机航测在土地开发整理项目中的应用), 城市地理 (Urban Geography), no. 04, 25 February 2018 (2018-02-25), pages 122-123 *




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant