
CN113566807A - Automatic navigation method, navigation device and storage medium - Google Patents


Info

Publication number
CN113566807A
CN113566807A
Authority
CN
China
Prior art keywords
image
guided vehicle
automatic
color
world coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010352432.4A
Other languages
Chinese (zh)
Inventor
曹鹏蕊
谢恺
罗为
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Longhua New Generation Communication And Intelligent Computing Research Institute
Fuhuake Precision Industry Shenzhen Co ltd
Original Assignee
Shenzhen Longhua New Generation Communication And Intelligent Computing Research Institute
Fuhuake Precision Industry Shenzhen Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Longhua New Generation Communication And Intelligent Computing Research Institute, Fuhuake Precision Industry Shenzhen Co ltd filed Critical Shenzhen Longhua New Generation Communication And Intelligent Computing Research Institute
Priority to CN202010352432.4A priority Critical patent/CN113566807A/en
Publication of CN113566807A publication Critical patent/CN113566807A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20 Instruments for performing navigational calculations
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformations in the plane of the image
    • G06T3/04 Context-preserving transformations, e.g. by using an importance map
    • G06T3/047 Fisheye or wide-angle transformations

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Theoretical Computer Science (AREA)
  • Navigation (AREA)

Abstract

The invention provides an automatic navigation method, which includes: controlling a camera unit of a navigation device to shoot an image of the running path of an automatic guided vehicle; extracting color areas from the shot image and performing binarization processing on the image from which the color areas are extracted; converting the binarized image into a top view; acquiring the world coordinates of the central point of each color area in the top view; determining the linear equations of the lines on the two sides from the acquired world coordinates of the central points of the color areas on the two sides; determining the pose relationship between the navigation device and the two side lines from the linear equations and their geometric relationship; and navigating the automatic guided vehicle according to the pose relationship. The invention also provides a navigation device and a storage medium. The invention reduces the cost of an automatic guided vehicle navigation system, is easy to deploy, makes path changes and extensions simple, and simplifies later maintenance.

Description

Automatic navigation method, navigation device and storage medium
Technical Field
The present invention relates to the field of navigation, and in particular, to an automatic navigation method, a navigation apparatus, and a storage medium.
Background
Automated Guided Vehicles (AGVs) have become an important infrastructure in intelligent logistics, intelligent manufacturing, and digital factories. AGVs facilitate in-factory transport and improve a factory's production and operating efficiency. Common AGV navigation methods include magnetic stripe navigation, laser navigation, and two-dimensional code navigation. A magnetic stripe navigation AGV system runs reliably and stably, but magnetic stripes must additionally be laid in the application scene. Laying the stripes is labor-intensive and requires modifying the application site; if the navigation path changes, the stripes must be laid again, and the adhesive on the underside of the stripes weakens with long-term use, which hinders reuse. An AGV based on laser navigation constructs a complete indoor map through a laser radar to obtain full information about the surrounding environment, acquires that information in real time through laser scanning, and realizes navigation through an algorithm. Although a laser navigation AGV needs no fixed track laid on the application site, it may be positioned incorrectly when it encounters a large object or a wall and may then depart from the preset path. AGV navigation systems in the prior art are therefore generally costly, hard to deploy, and difficult to maintain.
Disclosure of Invention
In view of the above, it is necessary to provide an automatic navigation method, a navigation device, and a storage medium that locate and navigate an automatic guided vehicle using lines set on the travel path of the automatic guided vehicle.
A first aspect of the present invention provides an automatic navigation method including:
controlling a camera unit of a navigation device to shoot an image of a running path of an automatic guided vehicle, wherein the image comprises lines on two sides of the running path, and the lines comprise color areas which are arranged at intervals;
extracting the color area from the shot image, and carrying out binarization processing on the image with the extracted color area;
converting the image subjected to binarization processing into a top view;
acquiring world coordinates of a central point of each color area in the top view;
respectively determining linear equations of lines on two sides according to the acquired world coordinates of a plurality of central points of the color areas on two sides;
determining the pose relationship between the navigation device and the two side lines according to the linear equation and the geometric relationship; and
navigating the automatic guided vehicle according to the pose relationship.
Preferably, the image is a fisheye image, and the method further includes:
performing distortion correction on the image shot by the camera unit.
Preferably, the extracting the color region from the captured image and subjecting the image from which the color region is extracted to binarization processing includes:
converting the image from the rgb channel to the hsv channel; and
performing binarization processing on the converted hsv image by setting a filtered color range.
Preferably, the converting the binarized image into a top view includes:
determining world coordinates of four end points of a preset rectangular area in a shooting range of the camera shooting unit;
converting world coordinates of the four endpoints into image pixel coordinates; and
performing affine transformation on the image according to the image pixel coordinates of the four end points to form the top view corresponding to the image.
Preferably, the acquiring the world coordinates of the central point of each color region in the top view comprises:
segmenting the top view into two images, wherein each image comprises a line;
determining the central point of each color area in the two images through contour searching; and
determining the world coordinates of each center point.
Preferably, the determining the linear equations of the two side lines according to the acquired world coordinates of the central points of the two side color regions respectively includes:
performing straight line fitting respectively on the world coordinates of the plurality of central points of the color areas in the left image and the right image to determine the straight line equations of the two side lines.
Preferably, the pose relationship is a distance between a center point of the automatic guided vehicle and the two side lines.
Preferably, the navigating the automatic guided vehicle according to the pose relationship comprises:
controlling, during travel of the automatic guided vehicle, the driving route of the automatic guided vehicle so as to keep the current distance between the center point of the automatic guided vehicle and the two side lines the same as the preset initial distance.
A second aspect of the present invention provides a navigation device comprising:
a processor; and
a memory, in which a plurality of program modules are stored, the plurality of program modules being loaded by the processor and executing the automatic navigation method.
A third aspect of the present invention provides a storage medium having stored thereon at least one computer instruction, the instruction being loaded by a processor to perform the automatic navigation method described above.
The automatic navigation method, the navigation device and the storage medium can position and navigate the automatic guided vehicle by shooting the lines arranged on the running path of the automatic guided vehicle, reduce the cost required by the navigation system of the automatic guided vehicle, are easy to deploy, and are simple in path change and expansion and simple in later maintenance.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art are briefly described below. It is obvious that the drawings in the following description are only embodiments of the present invention, and that those skilled in the art can obtain other drawings from the provided drawings without creative effort.
FIG. 1 is a schematic diagram of an application environment architecture of an automatic navigation method according to a preferred embodiment of the present invention.
Fig. 2 is a schematic structural diagram of a navigation device according to a preferred embodiment of the present invention.
FIG. 3 is a schematic structural diagram of an automatic navigation system according to a preferred embodiment of the present invention.
Fig. 4 is an image captured by the image capturing unit according to the preferred embodiment of the present invention.
Fig. 5 is an image of the image of fig. 4 subjected to binarization processing.
FIG. 6 is a diagram illustrating a shooting range of the image capturing unit according to the preferred embodiment of the present invention.
Fig. 7 is an image of the image of fig. 5 formed by segmentation.
FIG. 8 is a schematic view of the geometry of the automated guided vehicle and the two side lines in accordance with the preferred embodiment of the present invention.
FIG. 9 is a flowchart illustrating an automatic navigation method according to a preferred embodiment of the present invention.
Description of the main elements
Navigation device 1
Processor 10
Automatic navigation system 100
Camera module 101
Rectification module 102
Extraction module 103
Conversion module 104
Acquisition module 105
Determination module 106
Navigation module 107
Memory 20
Computer program 30
Image pickup unit 40
Automatic guided vehicle 2
Steps S901-S908
The following detailed description will further illustrate the invention in conjunction with the above-described figures.
Detailed Description
In order that the above objects, features and advantages of the present invention can be more clearly understood, a detailed description of the present invention will be given below with reference to the accompanying drawings and specific embodiments. It should be noted that the embodiments and features of the embodiments of the present application may be combined with each other without conflict.
In the following description, numerous specific details are set forth to provide a thorough understanding of the present invention; the described embodiments are merely a subset of the embodiments of the present invention, not all of them. All other embodiments obtained by a person skilled in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs. The terminology used in the description of the invention herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the invention.
Fig. 1 is a schematic view of an application environment architecture of an automatic navigation method according to a preferred embodiment of the present invention.
The automatic navigation method is applied to a navigation device 1, and the navigation device 1 and an automatic guided vehicle 2 are in communication connection. The communication connection may be a wired or wireless connection, such as radio, Wireless Fidelity (WIFI), or cellular.
In the present embodiment, the navigation device 1 may be an electronic device, such as a personal computer or a smart phone, in which an automatic navigation program is installed. The navigation device 1 may be mounted on the automatic guided vehicle 2.
The automatic guided vehicle 2 can automatically travel under the navigation of the navigation device 1 for loading and shipping.
Fig. 2 is a schematic structural diagram of a navigation device according to a preferred embodiment of the invention.
The navigation device 1 comprises, but is not limited to, a processor 10, a memory 20, a computer program 30 stored in the memory 20 and executable on the processor 10, and a camera unit 40. The computer program 30 is, for example, an automatic navigation program. The processor 10 implements the steps in the automatic navigation method, such as steps S901 to S908 shown in fig. 9, when executing the computer program 30. Alternatively, the processor 10 implements the functions of each module/unit in the automatic navigation system when executing the computer program 30, such as modules 101 to 107 in fig. 3.
Illustratively, the computer program 30 may be partitioned into one or more modules/units that are stored in the memory 20 and executed by the processor 10 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 30 in the navigation device 1. For example, the computer program 30 may be divided into a camera module 101, a rectification module 102, an extraction module 103, a transformation module 104, an acquisition module 105, a determination module 106 and a navigation module 107 in fig. 3. The specific functions of the modules refer to the functions of the modules in the automatic navigation system embodiment.
It will be appreciated by a person skilled in the art that the schematic diagram is merely an example of the navigation device 1 and does not constitute a limitation of the navigation device 1; the navigation device 1 may include more or fewer components than shown, combine some components, or include different components. For example, the navigation device 1 may further include input and output devices, network access devices, a bus, and the like.
The Processor 10 may be a Central Processing Unit (CPU), another general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor 10 may be any conventional processor. The processor 10 is the control center of the navigation device 1, with various interfaces and lines connecting the parts of the entire navigation device 1.
The memory 20 may be used to store the computer program 30 and/or modules/units, and the processor 10 implements the various functions of the navigation device 1 by running or executing the computer program and/or modules/units stored in the memory 20 and calling data stored in the memory 20. The memory 20 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function or an image playing function), and the like; the storage data area may store data created according to the use of the navigation device 1, and the like. In addition, the memory 20 may include high-speed random access memory, and may also include non-volatile memory, such as a hard disk, a memory, a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash Card, at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
In the present embodiment, the image capturing unit 40 is a fisheye camera including at least a fisheye lens, that is, a lens with a focal length of 16mm and a viewing angle close or equal to 180 degrees. In the present embodiment, the memory 20 stores the internal reference (intrinsic) matrix K and the distortion coefficients D, the external reference rotation matrix R, and the translation vector t of the imaging unit 40.
Please refer to fig. 3, which is a functional block diagram of an automatic navigation system according to a preferred embodiment of the present invention.
In some embodiments, the automatic navigation system 100 is operated in the navigation device 1. The automatic navigation system 100 may include a plurality of functional modules comprised of program code segments. Program codes of respective program segments in the automatic navigation system 100 may be stored in the memory 20 of the navigation device 1 and executed by the at least one processor 10 to implement an automatic navigation function.
In the present embodiment, the automatic navigation system 100 may be divided into a plurality of functional modules according to the functions to be executed. Referring to fig. 3, the functional modules may include a camera module 101, a rectification module 102, an extraction module 103, a conversion module 104, an acquisition module 105, a determination module 106, and a navigation module 107. The module referred to in the present invention refers to a series of computer program segments capable of being executed by at least one processor and performing a fixed function, which are stored in the memory 20. It will be appreciated that in other embodiments the modules may also be program instructions or firmware (firmware) that are fixed in the processor 10.
The camera module 101 is configured to control the camera unit 40 to capture an image of a traveling path of the automated guided vehicle 2.
In the present embodiment, the imaging unit 40 captures a fisheye image (image A in fig. 4) of the travel path ahead of the automated guided vehicle 2. The image comprises the lines on two sides of the driving path, and the lines comprise color areas arranged at intervals. Preferably, the lines are yellow-and-black zebra stripes and the color areas are the yellow areas, that is, the lines include yellow areas arranged at intervals.
The correction module 102 is configured to perform distortion correction on the image captured by the image capturing unit 40.
Referring to fig. 4, in the present embodiment, the rectification module 102 calls an OpenCV (Open Source Computer Vision Library) function and performs distortion rectification on the image A using the internal reference parameters K and D of the image capturing unit 40, so as to obtain a rectified image B.
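By way of illustration, this rectification step can be sketched with OpenCV's fisheye module as follows; the K and D values and the image path are placeholders standing in for the calibrated parameters stored in the memory 20, not values from the disclosure:

```python
import cv2
import numpy as np

# Placeholder calibration values; the real intrinsic matrix K and fisheye
# distortion coefficients D come from calibrating the image capturing unit 40.
K = np.array([[400.0, 0.0, 640.0],
              [0.0, 400.0, 360.0],
              [0.0, 0.0, 1.0]])
D = np.array([[-0.05], [0.01], [0.0], [0.0]])

image_a = cv2.imread("image_a.jpg")  # fisheye image A (placeholder path)
# Undistort; Knew=K keeps the original focal length in the rectified image B.
image_b = cv2.fisheye.undistortImage(image_a, K, D, Knew=K)
```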
The extraction module 103 is configured to extract the color region from the captured image, and perform binarization processing on the image with the extracted color region.
Referring to fig. 5, in this embodiment, the extraction module 103 converts the corrected image B from rgb channel to hsv channel, and performs binarization processing on the converted hsv image by setting a filtered color range to obtain an image C.
Specifically, the calculation formula for converting the rgb channel into the hsv channel by the extraction module 103 includes:
h = 0, if max = min;
h = 60 * (g - b)/(max - min), if max = r and g >= b;
h = 60 * (g - b)/(max - min) + 360, if max = r and g < b;
h = 60 * (b - r)/(max - min) + 120, if max = g;
h = 60 * (r - g)/(max - min) + 240, if max = b (1)
s = 0 if max = 0, otherwise s = (max - min)/max (2)
v = max (3)
wherein max is the maximum value of r, g and b, and min is the minimum value of r, g and b.
In this embodiment, the filtered color range is defined by low_range(h_low, s_low, v_low) and high_range(h_high, s_high, v_high), and the calculation formula for the extraction module 103 to binarize the hsv channel image is as follows:
dst(u, v) = 255, if low_range <= hsv(u, v) <= high_range component-wise in h, s and v; dst(u, v) = 0, otherwise (4)
where 0 represents black and 255 represents white.
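By way of illustration, a minimal OpenCV sketch of this extraction step follows, assuming yellow stripes; the threshold values are illustrative rather than taken from the disclosure (OpenCV loads images in bgr order and maps h to the range 0-179):

```python
import cv2
import numpy as np

image_b = cv2.imread("image_b.jpg")             # rectified image B (placeholder path)
hsv = cv2.cvtColor(image_b, cv2.COLOR_BGR2HSV)  # rgb/bgr channel -> hsv channel

# Illustrative yellow range (h_low, s_low, v_low) .. (h_high, s_high, v_high);
# the actual bounds are tuned to the stripe color and the site lighting.
low_range = np.array([20, 100, 100])
high_range = np.array([34, 255, 255])

# Formula (4): pixels inside the range become 255 (white), all others 0 (black).
image_c = cv2.inRange(hsv, low_range, high_range)
```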
The conversion module 104 is configured to convert the image subjected to the binarization processing into a top view.
In the present embodiment, the conversion module 104 determines world coordinates of four end points of a preset rectangular area within the shooting range of the imaging unit 40. Referring to fig. 6, the predetermined rectangular area is a dashed rectangle in the figure, and the four endpoints are a, b, c, and d, respectively.
Specifically, the conversion module 104 establishes a three-dimensional rectangular coordinate system with the central point of the automatic guided vehicle 2 as the origin, the central axis of the vehicle as the y-axis, the direction perpendicular to the central axis as the x-axis, and the vertical direction of the automatic guided vehicle 2 as the z-axis. Assuming that the automatic guided vehicle 2 has length L, width W, and height H, and that the preset rectangular area is 60cm long and 240cm wide, the world coordinate of the end point a is (-120, L/2, 0), of the end point b is (-120, 60 + L/2, 0), of the end point c is (120, 60 + L/2, 0), and of the end point d is (120, L/2, 0).
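In code form, these endpoint coordinates read as follows (a minimal sketch; the value of L is a placeholder):

```python
# World coordinates (in cm) of the preset rectangle's endpoints; the vehicle
# length L is a placeholder value.
L = 80.0
a_world = (-120.0, L / 2, 0.0)
b_world = (-120.0, 60.0 + L / 2, 0.0)
c_world = (120.0, 60.0 + L / 2, 0.0)
d_world = (120.0, L / 2, 0.0)
```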
Further, the conversion module 104 converts world coordinates of the four endpoints into image pixel coordinates. Specifically, the conversion module 104 converts the world coordinates of the four endpoints into image pixel coordinates through the internal reference matrix K, the external reference rotation matrix R and the translation vector t of the camera unit 40, and the calculation formula includes:
[x_c, y_c, z_c]^T = R * [x_w, y_w, z_w]^T + t (5)
z_c * [u, v, 1]^T = K * [x_c, y_c, z_c]^T (6)
wherein the calculated image pixel coordinates of the four endpoints a, b, c and d are a_pixel(u_a, v_a), b_pixel(u_b, v_b), c_pixel(u_c, v_c) and d_pixel(u_d, v_d), respectively.
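By way of illustration, formulas (5) and (6) can be sketched as a small projection helper; R, t and K are the stored extrinsic and intrinsic parameters:

```python
import numpy as np

def world_to_pixel(p_world, K, R, t):
    """Project a world point to image pixel coordinates.

    Formula (5): camera coordinates p_cam = R @ p_world + t.
    Formula (6): homogeneous pixel coordinates z_c * [u, v, 1]^T = K @ p_cam.
    """
    p_cam = R @ np.asarray(p_world, dtype=float) + np.ravel(t)
    uvw = K @ p_cam
    return uvw[0] / uvw[2], uvw[1] / uvw[2]

# For example: a_pixel = world_to_pixel(a_world, K, R, t)
```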
Further, the conversion module 104 affine-transforms the image according to the image pixel coordinates of the four end points to form a top view corresponding to the image.
Specifically, the conversion module 104 maps the image pixel coordinate a_pixel(u_a, v_a) to a_floor(0, 600), b_pixel(u_b, v_b) to b_floor(0, 0), c_pixel(u_c, v_c) to c_floor(1200, 0), and d_pixel(u_d, v_d) to d_floor(1200, 600). A mapping matrix M and its inverse M_inv are obtained through the getPerspectiveTransform function in OpenCV, and the warpPerspective function is applied with the mapping matrix M to obtain a top view of the area 60cm deep and 240cm wide in front of the camera (such as image C1 in fig. 5).
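A minimal sketch of this top-view construction with the named OpenCV functions; the source pixel coordinates below are placeholders for the projected endpoints:

```python
import cv2
import numpy as np

# Binarized image C from the extraction step (placeholder path).
image_c = cv2.imread("image_c.png", cv2.IMREAD_GRAYSCALE)

# Pixel coordinates of the endpoints a, b, c, d (placeholder values; in
# practice they are computed by projecting the world coordinates).
src = np.float32([[310, 540], [420, 300], [860, 300], [970, 540]])
# Target corners a_floor, b_floor, c_floor, d_floor: 1200 x 600 pixels for
# the 240cm x 60cm rectangle, i.e. 5 pixels per centimetre.
dst = np.float32([[0, 600], [0, 0], [1200, 0], [1200, 600]])

M = cv2.getPerspectiveTransform(src, dst)
M_inv = cv2.getPerspectiveTransform(dst, src)
top_view = cv2.warpPerspective(image_c, M, (1200, 600))  # image C1
```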
The obtaining module 105 is configured to obtain world coordinates of a center point of each color region in the top view.
In this embodiment, the acquisition module 105 divides the top view into two images (e.g., images C2 and C3 in fig. 7), each of which contains one line. The acquisition module 105 determines the center point of each color region in each image by contour finding.
Specifically, the obtaining module 105 obtains the contours in the images C2 and C3 by using the findContours function in OpenCV and filters the contours by conditions. The conditions include: the area of the contour is greater than a minimum value, such as 50; the ratio of the contour area to the area of its minimum bounding rectangle is greater than a first preset value, such as 0.4; and the aspect ratio of the minimum bounding rectangle of the contour is greater than a second preset value, such as 0.5. The obtaining module 105 records the center-point image pixel coordinates of the contours satisfying the above conditions as contours_center.
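By way of illustration, the contour search and filtering can be sketched as follows; the exact form of the area-ratio test is an assumption based on the description above:

```python
import cv2

def find_stripe_centers(binary_half):
    """Return the center-point pixel coordinates of the valid color areas
    in one half of the binarized top view (OpenCV 4.x findContours)."""
    contours, _ = cv2.findContours(binary_half, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    contours_center = []
    for cnt in contours:
        area = cv2.contourArea(cnt)
        if area <= 50:                        # condition 1: minimum area
            continue
        (cx, cy), (w, h), _ = cv2.minAreaRect(cnt)
        if w == 0 or h == 0:
            continue
        if area / (w * h) <= 0.4:             # condition 2: area ratio
            continue
        if min(w, h) / max(w, h) <= 0.5:      # condition 3: aspect ratio
            continue
        contours_center.append((cx, cy))
    return contours_center
```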
Further, the acquisition module 105 determines world coordinates for each center point. In this embodiment, the acquisition module 105 converts the image pixel coordinates of each center point into corresponding world coordinates.
Specifically, the obtaining module 105 collects all valid contour center points in the image C2 into a set contours_center_pixel_left_array and all valid contour center points in the image C3 into a set contours_center_pixel_right_array, and converts the image pixel coordinates in these sets into the world coordinate sets contours_center_world_left_array and contours_center_world_right_array. The calculation formula adopted by the obtaining module 105 to convert the image pixel coordinates of a contour center point into world coordinates includes:
[x_c, y_c, z_c]^T = z_c * K^(-1) * [u, v, 1]^T (7)
[x_w, y_w, z_w]^T = R^(-1) * ([x_c, y_c, z_c]^T - t) (8)
wherein R is an external reference rotation matrix of the image capturing unit 40, t is a translation vector, and K is an internal reference matrix.
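By way of illustration, because the contour centers lie on the ground plane (z_w = 0), the unknown depth z_c in formula (7) can be eliminated; the sketch below does so through the ground-plane homography implied by formulas (5) and (6):

```python
import numpy as np

def pixel_to_world(u, v, K, R, t):
    """Back-project an image pixel onto the ground plane z_w = 0.

    With z_w = 0, formulas (5) and (6) reduce to a homography
    H = K @ [r1 r2 t] between the ground plane and the image, so the
    world point is recovered by inverting H instead of estimating z_c.
    """
    H = K @ np.column_stack((R[:, 0], R[:, 1], np.ravel(t)))
    xyw = np.linalg.inv(H) @ np.array([u, v, 1.0])
    return xyw[0] / xyw[2], xyw[1] / xyw[2]  # (x_w, y_w)
```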
The determining module 106 is configured to determine linear equations of lines on two sides according to the acquired world coordinates of the central points of the color regions on two sides.
In this embodiment, the determining module 106 performs straight-line fitting on the world coordinates of the central points obtained from the images C2 and C3, respectively, using the polyfit function in numpy (an open-source scientific computing library for Python), to obtain the straight-line equations of the lines in the images C2 and C3, that is, the straight-line equations of the lines on the two sides of the travel path of the automated guided vehicle 2.
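By way of illustration, a minimal numpy sketch of this fitting step follows; the center-point coordinates are placeholders, and for stripes running nearly parallel to the y-axis one might instead fit x as a function of y (the y = kx + b form below matches the notation of fig. 8):

```python
import numpy as np

# World coordinates of the left-side stripe centers (placeholder values for
# contours_center_world_left_array).
xs = np.array([-120.0, -118.5, -117.0, -115.5])
ys = np.array([30.0, 70.0, 110.0, 150.0])

k_l, b_l = np.polyfit(xs, ys, 1)  # straight line y = k_l * x + b_l
a_l = np.arctan(k_l)              # included angle with the x-axis
```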
The determining module 106 is further configured to determine a pose relationship between the automatic guided vehicle 2 and the two side lines according to the linear equation and the geometric relationship.
In the present embodiment, the pose relationship is a distance between the center point of the automatic guided vehicle 2 and the two side lines.
Referring to fig. 8, r is the straight line fitted from the image C3, l is the straight line fitted from the image C2, and the origin of the coordinate axes is the center point AGV_center of the automatic guided vehicle 2. The distance between the intersection of the straight line r with the y-axis and the origin is b_r, the distance between the intersection of the straight line l with the y-axis and the origin is b_l, the included angle between the straight line r and the x-axis is a_r, and the included angle between the straight line l and the x-axis is a_l. According to the geometric relationship, namely the properties of a right triangle, the distance delta_left_dist between the center point AGV_center of the automatic guided vehicle 2 and the straight line l and the distance delta_right_dist between AGV_center and the straight line r are calculated as follows:
delta_left_dist = b_l * cos(a_l), delta_right_dist = b_r * cos(a_r) (9)
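The same relation in code form, a sketch assuming the slopes and intercepts produced by the fitting step above:

```python
import numpy as np

def center_line_distances(k_l, b_l, k_r, b_r):
    """Distances from AGV_center (the origin) to the fitted lines l and r.

    For a line y = tan(a) * x + b, the perpendicular distance from the
    origin is |b| * cos(a), which is formula (9).
    """
    delta_left_dist = abs(b_l) * np.cos(np.arctan(k_l))
    delta_right_dist = abs(b_r) * np.cos(np.arctan(k_r))
    return delta_left_dist, delta_right_dist
```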
the navigation module 107 is configured to navigate the automatic guided vehicle 2 according to the pose relationship.
In the present embodiment, while the automatic guided vehicle 2 is traveling, the navigation module 107 controls its driving route so as to keep the current distance between the center point of the automatic guided vehicle 2 and the two side lines the same as the preset initial distance, thereby preventing the automatic guided vehicle 2 from deviating from the preset driving route.
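By way of illustration only, such a control policy can be sketched as a simple proportional correction; the steering sign convention and gain below are assumptions and are not part of the disclosure:

```python
def steering_correction(delta_left, delta_right, init_left, init_right,
                        gain=0.02):
    """Proportional steering command keeping the AGV on its preset route.

    Positive output is assumed to steer right; the gain is illustrative.
    """
    # Drifting toward the left line makes delta_left < init_left and
    # delta_right > init_right, so the error (and the command) turn positive.
    error = (init_left - delta_left) - (init_right - delta_right)
    return gain * error
```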
Please refer to fig. 9, which is a flowchart illustrating an automatic navigation method according to the present invention. The order of the steps in the flow chart may be changed and some steps may be omitted according to different needs.
In step S901, the imaging unit 40 is controlled to capture an image of the travel route of the automatic guided vehicle 2.
In step S902, distortion correction is performed on the image captured by the imaging unit 40.
Step S903 of extracting the color region from the captured image, and performing binarization processing on the image from which the color region is extracted.
Step S904, the image subjected to the binarization processing is converted into a top view.
Step S905, a world coordinate of a center point of each color region in the top view is obtained.
Step S906, linear equations of the lines on the two sides are respectively determined according to the acquired world coordinates of the central points of the color areas on the two sides.
And step S907, determining the pose relationship between the automatic guided vehicle 2 and the two side lines according to the linear equation and the geometric relationship.
Step S908, navigating the automated guided vehicle 2 according to the pose relationship.
The integrated modules/units of the navigation device 1 may be stored in a computer-readable storage medium if they are implemented in the form of software functional units and sold or used as separate products. Based on such understanding, all or part of the flow of the method according to the embodiments of the present invention may also be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed by a processor, implements the steps of the above-described method embodiments. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as required by legislation and patent practice in a jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals.
The automatic navigation method, the navigation device and the storage medium provided by the invention can be used for positioning and navigating the automatic guided vehicle by shooting the lines arranged on the driving path of the automatic guided vehicle, so that the cost required by a navigation system is reduced, the automatic navigation method, the navigation device and the storage medium are easy to deploy, the path is simple to change and expand, and the later maintenance is simple.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential attributes thereof. The present embodiments are therefore to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein. Any reference sign in a claim should not be construed as limiting the claim concerned. Furthermore, it is obvious that the word "comprising" does not exclude other elements or steps, and the singular does not exclude the plural. Several units or means recited in the apparatus claims may also be embodied by one and the same item or means in software or hardware. The terms first, second, etc. are used to denote names, but not any particular order.
Finally, it should be noted that the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention is described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention.

Claims (10)

1. An automated navigation method, the method comprising:
controlling a camera unit of a navigation device to shoot an image of a running path of an automatic guided vehicle, wherein the image comprises lines on two sides of the running path, and the lines comprise color areas which are arranged at intervals;
extracting the color area from the shot image, and carrying out binarization processing on the image with the extracted color area;
converting the image subjected to binarization processing into a top view;
acquiring world coordinates of a central point of each color area in the top view;
respectively determining linear equations of lines on two sides according to the acquired world coordinates of a plurality of central points of the color areas on two sides;
determining the pose relationship between the automatic guided vehicle and the two side lines according to the linear equation and the geometric relationship; and
navigating the automatic guided vehicle according to the pose relationship.
2. The automated navigation method of claim 1, wherein the image is a fisheye image, the method further comprising:
performing distortion correction on the image shot by the camera unit.
3. The automatic navigation method according to claim 1, wherein the extracting the color region from the captured image and subjecting the image from which the color region is extracted to binarization processing includes:
converting the image from the rgb channel to the hsv channel; and
performing binarization processing on the converted hsv image by setting a filtered color range.
4. The automatic navigation method according to claim 1, wherein the converting the binarized image into a top view comprises:
determining world coordinates of four end points of a preset rectangular area in a shooting range of the camera shooting unit;
converting world coordinates of the four endpoints into image pixel coordinates; and
performing affine transformation on the image according to the image pixel coordinates of the four end points to form the top view corresponding to the image.
5. The automated navigation method of claim 4, wherein the obtaining world coordinates of a center point of each color region in the top view comprises:
segmenting the top view into two images, wherein each image comprises a line;
determining the central point of each color area in the two images through contour searching; and
determining the world coordinates of each center point.
6. The automatic navigation method of claim 5, wherein the determining the linear equations of the two side lines according to the acquired world coordinates of the central points of the two side color regions respectively comprises:
performing straight line fitting respectively on the world coordinates of the plurality of central points of the color areas in the two images to determine the straight line equations of the two side lines.
7. The automated navigation method according to claim 1, wherein the pose relationship is a distance between a center point of the automated guided vehicle and the two side lines.
8. The automated navigation method of claim 7, wherein the navigating the automated guided vehicle according to the pose relationship comprises:
controlling, during travel of the automatic guided vehicle, the driving route of the automatic guided vehicle so as to keep the current distance between the center point of the automatic guided vehicle and the two side lines the same as the preset initial distance.
9. A navigation device, wherein the navigation device comprises:
a processor; and
a memory having stored therein a plurality of program modules that are loaded by the processor and execute the automated navigation method of any one of claims 1-8.
10. A storage medium having stored thereon at least one computer instruction, wherein the instruction is loaded by a processor and performs the automated navigation method of any one of claims 1-8.
CN202010352432.4A 2020-04-28 2020-04-28 Automatic navigation method, navigation device and storage medium Pending CN113566807A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010352432.4A CN113566807A (en) 2020-04-28 2020-04-28 Automatic navigation method, navigation device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010352432.4A CN113566807A (en) 2020-04-28 2020-04-28 Automatic navigation method, navigation device and storage medium

Publications (1)

Publication Number Publication Date
CN113566807A 2021-10-29

Family

ID=78158194

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010352432.4A Pending CN113566807A (en) 2020-04-28 2020-04-28 Automatic navigation method, navigation device and storage medium

Country Status (1)

Country Link
CN (1) CN113566807A (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20080024776A (en) * 2006-09-14 2008-03-19 주식회사 만도 Method and apparatus for recognizing parking slot marking by using hough transformation and parking assist system using same
JP2010107348A (en) * 2008-10-30 2010-05-13 Panasonic Corp Calibration target and in-vehicle calibration system using it
US20100268452A1 (en) * 2007-06-12 2010-10-21 Tsuyoshi Kindo Navigation device, navigation method, and navigation program
CN102506852A (en) * 2011-11-01 2012-06-20 丁幼春 Visual navigation system and navigation method thereof for agricultural vehicle
CN107392103A (en) * 2017-06-21 2017-11-24 海信集团有限公司 The detection method and device of road surface lane line, electronic equipment
JP2018169947A (en) * 2017-03-30 2018-11-01 株式会社日立情報通信エンジニアリング Lane recognition apparatus and lane recognition program
CN109784250A (en) * 2019-01-04 2019-05-21 广州广电研究院有限公司 The localization method and device of automatically guiding trolley
CN110246177A (en) * 2019-06-25 2019-09-17 上海大学 Automatic wave measuring method based on vision
WO2020024234A1 (en) * 2018-08-02 2020-02-06 深圳前海达闼云端智能科技有限公司 Route navigation method, related device, and computer readable storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20211029