WO2018180175A1 - Mobile body, signal processing device, and computer program
- Publication number: WO2018180175A1 (PCT/JP2018/007787)
- Authority: WIPO (PCT)
- Prior art keywords: map, sensor data, sensor, maps, intermediate maps
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
Definitions
- the present disclosure relates to a mobile object, a signal processing device, and a computer program.
- an "environment map" is a map representing the geometric shapes of objects present in a space.
- an autonomous mobile robot measures the surrounding shapes using an external sensor such as a laser distance sensor, and geometrically matches the measured shapes against the environment map to estimate (identify) its own current position and orientation. As a result, the autonomous mobile robot can move along a route created on the environment map.
- JP 2012-93811 discloses a technique for updating an environmental map.
- updating the environment map is a process of aligning new measurement data with the shapes in the past environment map and overwriting the past environment map with the new measurement data. Since errors contained in the past map shapes affect the accuracy with which the new measurement data is aligned, the updated environment map may contain new errors. Therefore, when the update process is repeated, errors accumulate and the accuracy of the environment map may deteriorate.
- an attribute indicating that the shape is invariable is set for an area on the environment map whose actual shape does not change (an invariable area).
- the environment map is not updated for invariable areas having this attribute.
- as a result, no error accumulates in the invariable area, and its accuracy can be maintained, unlike for shapes outside the invariable area.
- the environmental map created from the measurement data reflects the position and size of the objects that existed in the space at the time of measurement. If the object is a movable object such as a box and is moved after the measurement data is acquired, inconsistency occurs between the arrangement of the objects in the actual space and the environment map.
- an autonomous mobile robot has to identify its own position with reference to an existing environment map. In this case, there is a possibility that the self-position is erroneously identified.
- an exemplary embodiment of the present disclosure provides technology for generating a map that enables accurate identification of a self-position, by using a plurality of intermediate maps generated before and after an object moves, even when the object does move.
- the moving body of the present disclosure includes a motor, a driving device that moves the moving body by controlling the motor, a sensor that senses the surrounding space and outputs sensor data, and a storage device that stores data of a plurality of intermediate maps.
- each of the plurality of intermediate maps is generated from sensor data of the surrounding space sensed by the sensor at time intervals.
- the moving body further includes a processing circuit that generates one map from the plurality of intermediate maps stored in the storage device. The one map includes a plurality of feature points in an identifiable manner, and each of the plurality of feature points indicates a position whose degree of coincidence among the plurality of intermediate maps is greater than a predetermined value.
- the processing circuit performs processing for identifying the self-position by comparing the one map with sensor data newly output from the sensor.
- a mobile system includes a mobile body and a signal processing device.
- the moving body includes a motor, a driving device that moves the moving body by controlling the motor, a sensor that senses the surrounding space and outputs sensor data, a storage device that stores data of a plurality of intermediate maps, and a control circuit.
- each of the plurality of intermediate maps is generated from sensor data of the surrounding space sensed by the sensor at time intervals.
- the signal processing device includes a processing circuit that generates one map from a plurality of intermediate maps stored in the storage device of the moving body.
- the one map includes a plurality of feature points in an identifiable manner, and each of the plurality of feature points is a point having a degree of matching greater than a predetermined value between the plurality of intermediate maps.
- the control circuit of the moving body performs processing for identifying the self position by comparing one map generated by the processing circuit of the signal processing device with the sensor data newly output from the sensor. Thereby, the moving body can identify its own position with higher accuracy.
- FIG. 1 is a diagram illustrating an exemplary AGV 10 that travels in a passage 1 in a factory.
- FIG. 2 is a diagram illustrating an outline of an exemplary management system 100 that manages the traveling of the AGV 10.
- FIG. 3 is a diagram illustrating an example of the target positions set in the travel route of the AGV 10.
- FIG. 4A is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 4B is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 4C is a diagram illustrating an example of a movement route of the AGV 10 that continuously moves.
- FIG. 5 is an external view of an exemplary AGV 10.
- FIG. 6 is a diagram illustrating a hardware configuration of the AGV 10.
- FIG. 7 is a diagram showing the AGV 10 that scans the surrounding space using the laser range finder 15 while moving.
- FIG. 8 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 9 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 10 is a diagram illustrating an AGV 10 that generates a map while moving.
- FIG. 11 is a diagram illustrating the AGV 10 that generates a map while moving.
- FIG. 12 is a diagram schematically showing the completed intermediate map 30.
- FIG. 13 is a diagram schematically illustrating a general position identification process.
- FIG. 14 is a diagram schematically illustrating a general position identification process.
- FIG. 15 is a diagram schematically illustrating a general position identification process.
- FIG. 16 is a diagram illustrating an example in which one or a plurality of obstacles 38 are placed in a part of a space where the AGV 10 travels.
- FIG. 17 is a diagram illustrating an example of sensor data 40 that reflects the position of the obstacle 38.
- FIG. 18 is a schematic diagram of a matching process between the reference map 30 and the sensor data 40.
- FIG. 19 is a diagram showing an area 34e including a set 6 of a plurality of points erroneously determined to match as a result of the matching process.
- FIG. 20 is a diagram schematically illustrating the procedure of the second process performed by the positioning device 14e.
- FIG. 21 is a diagram showing one map 50b that includes distinguishable feature points and non-feature points.
- FIG. 22 is a schematic diagram of the matching process between the map 50a and the sensor data 40.
- FIG. 23 is a diagram illustrating points that are determined to match as a result of matching.
- FIG. 24 is a flowchart illustrating a processing procedure for generating a new map.
- FIG. 25 is a diagram illustrating a hardware configuration of the travel management device 20 and a configuration of the mobile system 60.
- an automatic guided vehicle is mentioned as an example of a moving body.
- the automated guided vehicle is also called AGV (Automated Guided Vehicle), and is also described as “AGV” in this specification.
- FIG. 1 shows an AGV 10 that travels in a passage 1 in a factory, for example.
- FIG. 2 shows an overview of a management system 100 that manages the running of the AGV 10 according to this example.
- the AGV 10 has map data and travels while recognizing its current position.
- the travel route of the AGV 10 follows a command from the travel management device 20.
- the AGV 10 moves by rotating its wheels with a plurality of built-in motors in accordance with a command.
- the command is transmitted from the traveling management device 20 to the AGV 10 by radio. Communication between the AGV 10 and the travel management device 20 is performed using wireless access points 2a, 2b, etc. provided near the ceiling of the factory.
- the communication conforms to, for example, the Wi-Fi (registered trademark) standard.
- in FIG. 1, a plurality of AGVs 10 may travel.
- the traveling of each of the plurality of AGVs 10 may or may not be managed by the traveling management device 20.
- the outline of the operation of the AGV 10 and the travel management device 20 included in the management system 100 is as follows.
- the AGV 10 moves from the n-th position toward the (n + 1)-th position (hereinafter, "position M n + 1 ") as the target position, in accordance with a command (the n-th command, where n is a positive integer) from the travel management device 20.
- the target position can be determined by the administrator for each AGV 10, for example.
- when the AGV 10 reaches the target position M n + 1 , it transmits an arrival notification (hereinafter referred to as a "notification") to the travel management device 20.
- the notification is sent to the travel management device 20 via the wireless access point 2a.
- the AGV 10 collates the output of a sensor that senses its surroundings with the map data to identify its self-position, and may then determine whether or not the self-position matches the position M n + 1 .
- when the notification is received, the travel management device 20 generates the next command (the (n + 1)-th command) for moving the AGV 10 from the position M n + 1 to the position M n + 2 .
- the (n + 1) th command includes the position coordinates of the position M n + 2 and may further include numerical values such as acceleration time and moving speed during constant speed traveling.
- the traveling management device 20 transmits the (n + 1) th command to the AGV 10.
- the AGV 10 analyzes the (n + 1) th command and performs a preprocessing calculation necessary for the movement from the position M n + 1 to the position M n + 2 .
- the preprocessing calculation is, for example, calculation for determining the rotation speed, rotation time, etc. of each motor for driving each wheel of the AGV 10.
- FIG. 3 shows an example of the target positions set in the travel route of the AGV 10.
- the interval between two adjacent target positions does not have to be a fixed value and can be determined by an administrator.
- the AGV 10 can move in various directions according to commands from the travel management device 20.
- 4A to 4C show examples of movement paths of the AGV 10 that moves continuously.
- FIG. 4A shows a moving path of the AGV 10 when traveling straight. After reaching the position M n + 1 , the AGV 10 can perform a pre-processing calculation, operate each motor according to the calculation result, and continue to move linearly to the next position M n + 2 .
- FIG. 4B shows a moving path of the AGV 10 that makes a left turn at the position M n + 1 and moves toward the position M n + 2 .
- the AGV 10 performs a preprocessing calculation after reaching the position M n + 1 and rotates at least one motor located on the right side in the traveling direction according to the calculation result.
- after the AGV 10 rotates counterclockwise by an angle θ on the spot, all the motors rotate at a constant speed and the AGV goes straight toward the position M n + 2 .
- FIG. 4C shows a movement path of the AGV 10 when moving in a circular arc shape from the position M n + 1 to the position M n + 2 .
- the AGV 10 performs a preprocessing calculation after reaching the position M n + 1, and increases the rotational speed of the outer peripheral motor relative to the inner peripheral motor in accordance with the calculation result. As a result, the AGV 10 can continue to move along an arcuate path toward the next position M n + 2 .
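The arc motion above follows from differential-drive kinematics: on an arc of radius R, a wheel offset laterally by ±d/2 from the vehicle center travels at a speed scaled by (R ± d/2)/R, which is why the outer motor is sped up relative to the inner one. A minimal sketch of this relation (function name and parameters are illustrative assumptions, not from the patent):

```python
def wheel_speeds(v_center: float, radius: float, track: float):
    """Inner/outer wheel speeds for a differential-drive vehicle moving
    along an arc of the given radius at center speed v_center.
    track is the lateral distance between the wheels."""
    v_inner = v_center * (radius - track / 2.0) / radius
    v_outer = v_center * (radius + track / 2.0) / radius
    return v_inner, v_outer

# 0.5 m/s along a 2 m radius arc with a 0.4 m track:
# the outer wheel must run faster than the inner wheel
inner, outer = wheel_speeds(0.5, 2.0, 0.4)
```

The same formula degenerates to equal speeds as the radius grows, matching the straight-line case of FIG. 4A.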
- FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment.
- FIG. 6 shows the hardware configuration of the AGV 10.
- the AGV 10 includes four wheels 11a to 11d, a frame 12, a transfer table 13, a travel control device 14, and a laser range finder 15.
- the front wheel 11 a, the rear wheel 11 b, and the rear wheel 11 c are shown, but the front wheel 11 d is not clearly shown because it is hidden behind the frame 12.
- the traveling control device 14 is a device that controls the operation of the AGV 10.
- the traveling control device 14 mainly includes a plurality of integrated circuits including a microcomputer (described later), a plurality of electronic components, and a substrate on which the plurality of integrated circuits and the plurality of electronic components are mounted.
- the travel control device 14 performs data transmission / reception with the travel management device 20 and pre-processing calculation described above.
- the laser range finder 15 is an optical device that measures the distance to a target by, for example, irradiating the target with infrared laser light 15a and detecting the reflected light of the laser light 15a.
- the laser range finder 15 of the AGV 10 emits a pulsed laser beam 15a while changing its direction every 0.25 degrees over a range of 135 degrees to the left and right of the front of the AGV 10 (270 degrees in total), for example, and detects the reflected light of each laser beam 15a. This yields distance data to the reflection point in each of a total of 1080 directions determined in 0.25-degree steps.
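The scan geometry just described (±135 degrees in 0.25-degree steps, 1080 directions in total) can be illustrated by converting each (angle, distance) pair into a reflection point in the sensor's coordinate frame. The function below is an illustrative sketch, not code from the patent:

```python
import math

def scan_to_points(distances, start_deg=-135.0, step_deg=0.25):
    """Convert one laser scan (one distance per step angle) into (x, y)
    reflection points in the sensor frame, with x pointing forward."""
    points = []
    for i, d in enumerate(distances):
        theta = math.radians(start_deg + i * step_deg)
        points.append((d * math.cos(theta), d * math.sin(theta)))
    return points

# 1080 distance readings cover the 270-degree field of view
points = scan_to_points([1.0] * 1080)
```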
- the arrangement of objects around the AGV can be obtained from the position and orientation of the AGV 10 and the scan result of the laser range finder 15.
- the position and posture of a moving object are called poses.
- the position and orientation of the moving body in a two-dimensional plane are expressed by position coordinates (x, y) in an XY orthogonal coordinate system and an angle θ with respect to the X axis.
- the position and posture of the AGV 10, that is, the pose (x, y, θ), may be simply referred to as the "position" hereinafter.
- the positioning device described later collates (matches) local map data created from the scan results of the laser range finder 15 with wider-range environment map data, thereby identifying the self-position (x, y, θ) on the environment map.
- the position of the reflection point seen from the radiation position of the laser beam 15a can be expressed using polar coordinates determined by the angle and the distance.
- the laser range finder 15 outputs sensor data expressed in polar coordinates.
- the laser range finder 15 may convert the position expressed in polar coordinates into orthogonal coordinates and output the result.
- Examples of objects that can be detected by the laser range finder 15 are people, luggage, shelves, and walls.
- the laser range finder 15 is an example of an external sensor for sensing the surrounding space and acquiring sensor data.
- Other examples of such an external sensor include an image sensor and an ultrasonic sensor.
- the "sensor data" output from the laser range finder 15 is a plurality of sets of vector data, each set consisting of an angle θ and a distance L.
- the angle θ changes in 0.25-degree steps within a range of -135 degrees to +135 degrees.
- the angle may be expressed with the right side as positive and the left side as negative with respect to the front of the AGV 10.
- the distance L is the distance to the object measured at each angle θ.
- the distance L is obtained by multiplying half of the difference between the emission time of the infrared laser beam 15a and the reception time of the reflected light (that is, half the time required for the round trip of the laser beam) by the speed of light.
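The time-of-flight relation above reduces to L = c · Δt / 2, where Δt is the round-trip time of the pulse. A minimal sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_distance(t_emit: float, t_receive: float) -> float:
    """Distance to the reflection point: half the round-trip time of the
    laser pulse multiplied by the speed of light."""
    return (t_receive - t_emit) / 2.0 * SPEED_OF_LIGHT

# a round trip of 20 ns corresponds to roughly 3 m
distance = tof_distance(0.0, 2.0e-8)
```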
- FIG. 6 also shows a specific configuration of the traveling control device 14 of the AGV 10.
- the AGV 10 includes a travel control device 14, a laser range finder 15, four motors 16 a to 16 d, and a drive device 17.
- the traveling control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e.
- the microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with each other.
- the laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits measurement data as measurement results to the microcomputer 14a, the positioning device 14e and / or the memory 14b.
- the microcomputer 14a is a control circuit (computer) that performs calculations for controlling the entire AGV 10 including the travel control device 14.
- the microcomputer 14a is a semiconductor integrated circuit.
- the microcomputer 14a transmits a PWM (Pulse Width Modulation) signal to the driving device 17 to control the driving device 17 and adjust the current flowing through the motor. As a result, each of the motors 16a to 16d rotates at a desired rotation speed.
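One simple way a microcomputer could translate a target rotation speed into a PWM command is proportional duty-cycle adjustment. The patent does not specify the control law, so the sketch below, with its gain value and naming, is only an assumed illustration:

```python
def update_duty(duty: float, target_rpm: float, measured_rpm: float,
                gain: float = 0.0005) -> float:
    """One step of a simple proportional speed controller: nudge the PWM
    duty cycle toward the target rotation speed, clamped to [0, 1]."""
    duty += gain * (target_rpm - measured_rpm)
    return max(0.0, min(1.0, duty))

# motor running below target -> the duty cycle (and motor current) increases
duty = update_duty(0.5, 1500.0, 1100.0)
```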
- the memory 14b is a volatile storage device that stores a computer program executed by the microcomputer 14a.
- the memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
- the storage device 14c is a non-volatile semiconductor memory device that stores map data.
- the storage device 14c may instead be a magnetic recording medium typified by a hard disk, or an optical recording medium typified by an optical disc, together with a head device for writing and/or reading data on the recording medium and a controller for that head device. Details of the map data will be described later.
- the communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Wi-Fi (registered trademark) standard.
- the positioning device 14e receives sensor data from the laser range finder 15 and reads map data stored in the storage device 14c.
- the positioning device 14e performs a process of comparing the sensor data and the map data to identify the self position. The specific operation of the positioning device 14e will be described later.
- in FIG. 6, the microcomputer 14a and the positioning device 14e are separate components, but this is only an example; a single-chip semiconductor integrated circuit capable of independently performing the operations of the microcomputer 14a and the positioning device 14e may be used instead.
- FIG. 6 shows a chip circuit 14g including the microcomputer 14a and the positioning device 14e.
- the microcomputer 14a, the positioning device 14e, and / or the chip circuit 14g may be referred to as a computer or a processing circuit.
- an example in which the microcomputer 14a and the positioning device 14e are separately provided will be described.
- the four motors 16a to 16d are attached to the four wheels 11a to 11d, respectively, and rotate each wheel.
- the number of motors is an example; two or three motors may be sufficient, and five or more may also be used.
- the drive device 17 has motor drive circuits 17a to 17d for adjusting the current flowing through each of the four motors 16a to 16d.
- Each of the motor drive circuits 17a to 17d is a so-called inverter circuit, and the current flowing to each motor is turned on or off by the PWM signal transmitted from the microcomputer 14a, thereby adjusting the current flowing to the motor.
- the process of generating a map can be divided into a first process and a second process.
- the first process is a process for generating a plurality of intermediate maps.
- the second process is a process of determining a plurality of feature points from the plurality of intermediate maps and generating one map.
- the first process is realized by SLAM (Simultaneous Localization and Mapping) technology as an example.
- the AGV 10 scans the surrounding space by operating the laser range finder 15 while actually traveling in a factory where the AGV 10 is used, and generates a map while estimating its own position.
- the AGV 10 may travel on a specific route while being controlled by the administrator, and generate a map from the sensor data acquired by the laser range finder 15.
- FIGS. 7 to 11 each show the AGV 10 generating a map while moving.
- FIG. 7 shows the AGV 10 scanning the surrounding space using the laser range finder 15. Laser light is emitted at every predetermined step angle, and a scan is performed.
- the positions of the reflection points of the laser beam are indicated by a plurality of points represented by the symbol "•", such as point 4 in FIG. 7.
- the positioning device 14e accumulates the position of the black spot 4 obtained as a result of traveling, for example, in the memory 14b.
- the map is gradually completed by continuously performing scans while the AGV 10 travels. In FIGS. 8 to 11, only the scan range is shown for the sake of simplicity.
- the scan range is also an example, and is different from the above-described example of 270 degrees in total.
- FIG. 12 schematically shows the completed intermediate map 30.
- the positioning device 14e accumulates the data of the intermediate map 30 in the memory 14b or the storage device 14c.
- the number or density of black spots shown in the figure is an example.
- the AGV 10 generates a plurality of intermediate maps by running a plurality of times at intervals, and stores them in the memory 14b or the storage device 14c.
- the AGV 10 runs at a time interval that can be appropriately determined by those skilled in the art, such as several days, hours, tens of minutes, minutes, etc., and generates a plurality of intermediate maps.
- the reason the AGV 10 generates a plurality of intermediate maps is so that objects that have not moved (referred to as "fixed objects") and objects that have moved (referred to as "movable objects") can be distinguished. Reflection points existing at the same position across the plurality of intermediate maps can be regarded as representing a fixed object. On the other hand, a reflection point whose position changes between the intermediate maps means that an existing movable object has been removed or a new movable object has been placed. In the present specification, an object that can actually be moved but was not moved during the generation of the plurality of intermediate maps is also referred to as a "fixed object". Conversely, even an object generally considered to have few opportunities to move, such as a wall, is called a "movable object" if it was moved during the generation of the plurality of intermediate maps.
- in this way, the AGV can identify its own position.
- an example of processing for identifying a self-location using a single map will be described.
- FIG. 13 to FIG. 15 schematically show a procedure for general position identification processing.
- the AGV has acquired in advance a map (hereinafter referred to as the "reference map 30") corresponding to the intermediate map 30 of FIG. 12.
- the AGV acquires the sensor data 32 shown in FIG. 13 at a predetermined time interval or at all times during traveling, and executes a process for identifying the self position.
- the AGV sets various regions (for example, regions 34 a, 34 b, 34 c) whose positions and angles are changed on the reference map 30, and collates the plurality of reflection points included in each region with the reflection points included in the sensor data 32.
- FIG. 14 marks the points determined to match as a result of the collation (for example, point 5). If the ratio of the number of matching points is larger than a predetermined reference value, the AGV determines that the sensor data 32 coincides with the plurality of black points in the region 34d.
- the AGV then determines the emission position of the laser beam from which each black spot included in the region 34d would be obtained, that is, its self-position.
- the identified self-position 36 is represented by the symbol “X”.
- AGV can identify its own position.
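The collation procedure of FIGS. 13 to 15 can be sketched as follows: transform the scan points by a candidate pose, count how many fall onto (near) reference-map points, and accept a pose whose ratio of matching points exceeds the reference value. This is an illustrative sketch with assumed names and a simple grid-cell tolerance, not the patent's implementation:

```python
import math

def match_ratio(map_points, scan_points, pose, tol=0.1):
    """Fraction of scan points that coincide (within tol) with some map
    point after transforming the scan by the candidate pose (x, y, theta)."""
    x, y, theta = pose
    cell = lambda px, py: (round(px / tol), round(py / tol))
    occupied = {cell(px, py) for px, py in map_points}
    hits = 0
    for sx, sy in scan_points:
        wx = x + sx * math.cos(theta) - sy * math.sin(theta)
        wy = y + sx * math.sin(theta) + sy * math.cos(theta)
        if cell(wx, wy) in occupied:
            hits += 1
    return hits / len(scan_points)

def identify_pose(map_points, scan_points, candidate_poses, reference=0.8):
    """Return the first candidate pose whose match ratio exceeds the
    reference value, or None if no candidate matches well enough."""
    for pose in candidate_poses:
        if match_ratio(map_points, scan_points, pose) > reference:
            return pose
    return None
```

As the following passage notes, a high-scoring pose is not guaranteed to be the true one; a different region of the map can accidentally exceed the reference value.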
- the AGV cannot always correctly identify the self position, and a position different from the actual position may be erroneously identified as the self position.
- FIG. 16 shows an example in which one or more obstacles 38 are placed in a part of the space where the AGV 10 travels.
- the sensor data acquired by the AGV differs depending on the presence or absence of the obstacle 38.
- FIG. 17 shows an example of the sensor data 40 reflecting the position of the obstacle 38.
- FIG. 18 is a schematic diagram of a matching process between the reference map 30 and the sensor data 40.
- AGV sets various regions 34 a, 34 b, 34 c and the like, and collates a plurality of reflection points included in each of the regions with sensor data 40.
- FIG. 19 shows a region 34e including a set 6 of a plurality of points erroneously determined to match as a result of the matching process.
- the first process is performed a plurality of times to obtain a plurality of intermediate maps, and then the second process is further performed.
- FIG. 20 schematically shows the procedure of the second process performed by the positioning device 14e.
- the positioning device 14e generates one map 50a from the six intermediate maps 30a to 30f acquired with a time interval.
- the number of intermediate maps may be any number as long as it is plural. For example, the number may be 2 to 5, or 7 or more.
- Each black dot constituting the six intermediate maps 30a to 30f indicates the position of the reflection point of the laser beam.
- the positioning device 14e determines feature points from the intermediate maps 30a to 30f.
- a "feature point" is a point having a relatively high "degree of coincidence" among the plurality of intermediate maps acquired at multiple times; more specifically, it means a virtual coordinate point whose degree of coincidence is larger than a predetermined value.
- Each virtual coordinate point is one unit for determining “free space” through which laser light can pass or “occupied space” through which laser light cannot pass.
- an example of the “occupied space” is the surface of the object existing in the space and the inside of the object.
- the degree of coincidence can be calculated by various methods. For example, when the “degree of coincidence” is expressed by the degree of coincidence regarding the positions of the black dots constituting the intermediate maps 30a to 30f, it can be calculated by the following method.
- the positioning device 14e adjusts the angles of the intermediate maps 30a to 30f on the basis of one or a plurality of feature amounts that are commonly included in the intermediate maps 30a to 30f, and superimposes the intermediate maps 30a to 30f.
- the “feature amount” is represented by a positional relationship (arrangement) of a plurality of black spots representing a position where a movable object is considered to be substantially not placed, for example, a staircase or a lifting device.
- the positioning device 14e determines, for each position, whether four or more of the intermediate maps have a black spot at that position. When four or more intermediate maps have a black spot at the same position, the black spot at that position is reflected as a "feature point" in the newly generated map. On the other hand, when three or fewer intermediate maps have a black spot at the same position, the black spot at that position is excluded from the "feature points". Note that "four" is an example of a threshold value. A feature point only needs to be included in common on at least two intermediate maps; in that case, the threshold value is "two".
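The threshold rule just described can be sketched directly: count, for each (aligned) black-spot position, how many intermediate maps contain a spot there, and keep the positions that reach the threshold. A sketch using grid-cell positions, with assumed names and maps assumed to be already aligned:

```python
from collections import Counter

def feature_points(intermediate_maps, threshold=4):
    """Keep as feature points only the positions that carry a black spot
    in at least `threshold` of the aligned intermediate maps."""
    counts = Counter()
    for spots in intermediate_maps:
        counts.update(set(spots))  # count each position once per map
    return {pos for pos, n in counts.items() if n >= threshold}

maps = [
    {(0, 0), (1, 0)},  # a wall at (0, 0), a box at (1, 0)
    {(0, 0), (1, 0)},
    {(0, 0), (1, 0)},
    {(0, 0), (2, 3)},  # the box was moved before the later runs
    {(0, 0), (2, 3)},
    {(0, 0), (2, 3)},
]
fixed = feature_points(maps)  # only the wall reaches the threshold of four
```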
- the positioning device 14e sets a virtual coordinate position, determines a black spot closest to the coordinate position for each intermediate map, obtains a deviation amount from the coordinate position, and further obtains a sum of deviation amounts.
- the positioning device 14e may obtain the difference or ratio between the obtained sum and a predetermined allowable value as the degree of coincidence. When the degree of coincidence is obtained for each virtual coordinate position on the generated map, the positioning device 14e can determine the feature point.
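The deviation-sum variant above can be sketched as follows: for a virtual coordinate, take the distance to the nearest black spot in each intermediate map, sum those deviations, and express the degree of coincidence as the difference from an allowable value (the text also allows a ratio form; names here are assumed):

```python
import math

def coincidence_degree(coord, intermediate_maps, allowance):
    """Allowance minus the summed distance from `coord` to the nearest
    black spot of each intermediate map; larger means closer agreement."""
    total = sum(min(math.dist(coord, p) for p in spots)
                for spots in intermediate_maps)
    return allowance - total

# a coordinate whose degree exceeds some predetermined value would then
# be adopted as a feature point
degree = coincidence_degree((0.0, 0.0), [[(0.0, 0.0)], [(0.1, 0.0)]], 1.0)
```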
- the coordinate position determined to be each feature point represents the surface of a fixed object that exists in common on a plurality of maps.
- the coordinate position that has not been determined to be a feature point represents the position where the surface of the removed movable object exists or the position where the surface of the newly placed movable object exists.
- the positioning device 14e of the AGV 10 generates one map 50a in which the feature points are determined. As shown in FIG. 20, the map 50a shows the positions of a plurality of feature points 51, but the position 52 where the feature points 51 do not exist is blank. The map 50a is a map in which only each feature point is identified.
- FIG. 21 shows one map 50b that includes distinguishable feature points and non-feature points.
- a plurality of feature points 51 and a plurality of non-feature points 53 are shown to be identifiable.
- in the map 50b, each point is assigned a rate, and during collation each point is used to determine whether it coincides with a point acquired as sensor data, to a degree corresponding to the value of its rate. That is, the "rate" represents the "weight" used for calculation during the position identification process.
- a black point having a weight of 1, which is always included in the calculation during the position identification process, may be referred to as a “feature point”, and a black point having a weight less than 1 may be referred to as a non-feature point.
- a black point having a weight greater than or equal to a threshold (for example, 0.6) may be referred to as a “feature point”, and a black point having a weight less than the threshold may be referred to as a non-feature point.
- a black point with a weight greater than 0 may be referred to as a “feature point”, and a black point with a weight of 0 may be referred to as a non-feature point.
- a black point whose weight is greater than (or greater than or equal to) an arbitrary threshold value may be referred to as a feature point, and a black point whose weight is less than (or less than or equal to) the threshold value may be referred to as a non-feature point.
- a black point with a weight set may be referred to as a “feature point”, and a black point with no weight set may be referred to as a “non-feature point”.
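All of the naming conventions above reduce to thresholding the weight. A minimal sketch, assuming the weights are stored in a dict keyed by grid position (the data layout is an assumption):

```python
def classify_points(weights, threshold=0.6):
    # weights: mapping from (x, y) black point to its weight.
    # Points at or above the threshold are "feature points";
    # the rest are "non-feature points" (0.6 is the example
    # threshold value given in the text).
    feature = {p for p, w in weights.items() if w >= threshold}
    return feature, set(weights) - feature

feature, non_feature = classify_points({(0, 0): 1.0, (3, 1): 0.4, (5, 2): 0.6})
print(sorted(feature))  # [(0, 0), (5, 2)]
```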
- the positioning device 14e performs a process of identifying the self-position by comparing the sensor data with the plurality of feature points included in the map 50a or 50b. Below, the process of matching the sensor data 40 shown in FIG. 17 against the map 50a is described as an example.
- FIG. 22 is a schematic diagram of the matching process between the map 50a and the sensor data 40.
- the positioning device 14e sets various regions 34a, 34b, 34c, etc., and collates a plurality of feature points included in each region with the sensor data 40 by the method described with reference to FIG.
- FIG. 23 shows each point determined to be a match as a result of the collation (for example, point 7) by the symbol “○”.
- the example of FIG. 23 differs from the example of FIG. 19 in that, in FIG. 23, the positions (or regions) 52 where no feature point 51 exists are not used for the collation processing. Since the positioning device 14e performs the matching process using the feature points, which have a high degree of coincidence, and the sensor data, a more reliable matching result can be obtained. As a result, the positioning device 14e can correctly identify the actual position 42 as its self-position.
- the positioning device 14e limits the range in which the area is set in order to reduce the amount of calculation.
- the AGV 10 travels by receiving, from the travel management device 20, an instruction about the next target position Mn+1. Therefore, a region may be set so as to include the travel route between the positions Mn and Mn+1, and the plurality of feature points in each such region may be compared with the sensor data 40. Thereby, the comparison calculation can be omitted for areas far away from the current travel route.
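A minimal sketch of this route-based restriction, assuming a rectangular region around the segment from Mn to Mn+1 (the exact region shape is not specified in the text):

```python
def route_region_points(feature_points, m_n, m_n1, margin=2.0):
    # Keep only feature points inside the bounding box of the route
    # segment M_n -> M_(n+1), expanded by `margin`; matching is then
    # skipped for feature points far from the current travel route.
    lo_x, hi_x = sorted((m_n[0], m_n1[0]))
    lo_y, hi_y = sorted((m_n[1], m_n1[1]))
    return {p for p in feature_points
            if lo_x - margin <= p[0] <= hi_x + margin
            and lo_y - margin <= p[1] <= hi_y + margin}

pts = {(1, 1), (9, 9), (40, 40)}
print(sorted(route_region_points(pts, (0, 0), (10, 10))))  # [(1, 1), (9, 9)]
```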
- the positioning device 14e may perform a process of matching the map 50b with the sensor data 40 by giving different weights to the feature points 51 and the non-feature points 53.
- the weight of each feature point 51 is set larger than the weight of the non-feature point 53.
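A brute-force sketch of such weighted matching; the distance tolerance and the additive scoring rule are assumptions, not the patent's formula:

```python
import math

def weighted_match_score(sensor_points, weighted_map, tol=0.5):
    # Each map point that has a sensor point within `tol` contributes
    # its weight to the score, so feature points (large weight)
    # dominate over non-feature points (small weight). Brute force;
    # a grid index or k-d tree would be used in practice.
    return sum(w for p, w in weighted_map.items()
               if any(math.dist(p, s) <= tol for s in sensor_points))

weighted_map = {(0.0, 0.0): 1.0, (5.0, 5.0): 0.2}   # feature / non-feature
sensor = [(0.1, 0.0), (5.0, 5.1)]
print(weighted_match_score(sensor, weighted_map))
```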
- FIG. 24 is a flowchart illustrating a processing procedure for generating a new map.
- in step S10, the positioning device 14e of the AGV 10 senses the surrounding space while the AGV travels in the factory and acquires a plurality of intermediate maps.
- in step S11, the positioning device 14e determines the feature points whose degree of coincidence between the plurality of intermediate maps is greater than a predetermined value. The method for calculating the degree of coincidence and the method for determining the feature points are as described above.
- in step S12, the positioning device 14e generates a map that includes the feature points in an identifiable manner.
- the map generated in step S12 may be the map 50a shown in FIG. 20 or the map 50b shown in FIG.
- the above-described processing may be performed by the microcomputer 14a instead of the positioning device 14e, or may be performed by the chip circuit 14g in which the microcomputer 14a and the positioning device 14e are integrated. That is, the “processing circuit” may perform the above processing.
- although the above-described process uses all of the plurality of intermediate maps, it is sufficient to use at least two intermediate maps instead of all of them.
- the positioning device 14e of the AGV 10 generates a plurality of intermediate maps from the sensor data, and further generates one map from the plurality of intermediate maps.
- an external signal processing device other than the AGV 10, for example the travel management device 20, may perform the process of generating one final map from the sensor data.
- a system including the AGV 10 and the signal processing device is referred to as a “map generation system”.
- a system in which the AGV 10 performs a process of identifying its own position using one map generated from a plurality of intermediate maps is called a “mobile system”.
- traveling management device 20 will be exemplified as an external signal processing device.
- the AGV 10 makes a round of the factory, acquires the group of sensor data necessary for the intermediate maps, and writes it to, for example, a removable recording medium.
- the administrator removes the recording medium and causes the traveling management device 20 to read the written group of sensor data.
- the AGV 10 may wirelessly transmit the obtained sensor data to the travel management device 20 for each scan or several scans while traveling in the factory.
- the travel management device 20 can acquire a group of sensor data necessary for the intermediate map.
- the intermediate map is created not by the AGV 10 but by the travel management device 20.
- FIG. 25 shows the configuration of the mobile system 60.
- FIG. 25 also shows a hardware configuration of the travel management device 20.
- the travel management device 20 includes a CPU 21, a memory 22, a storage device 23, a communication circuit 24, and an image processing circuit 25.
- the CPU 21, the memory 22, the storage device 23, the communication circuit 24, and the image processing circuit 25 are connected by a communication bus 27 and can exchange data with each other.
- the CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20.
- the CPU 21 is a semiconductor integrated circuit.
- the memory 22 is a volatile storage device that stores a computer program executed by the CPU 21.
- the memory 22 can also be used as a work memory when the CPU 21 performs calculations.
- the storage device 23 stores a group of sensor data received from the AGV 10.
- the storage device 23 may store data of a plurality of intermediate maps generated from a group of sensor data, and may further store data of one map generated from a plurality of intermediate maps.
- the storage device 23 may be a nonvolatile semiconductor memory, a magnetic recording medium represented by a hard disk, or an optical recording medium represented by an optical disk.
- the storage device 23 can also store position data, necessary for functioning as the travel management device 20, indicating each position that can be a destination of the AGV 10.
- the position data can be represented by coordinates virtually set in the factory by an administrator, for example.
- the position data is determined by the administrator.
- the communication circuit 24 performs wired communication based on, for example, the Ethernet (registered trademark) standard.
- the communication circuit 24 is connected to the wireless access points 2a, 2b and the like by wire, and can communicate with the AGV 10 via the wireless access points 2a, 2b and the like.
- the travel management device 20 receives individual data of a plurality of intermediate maps or a group of data from the AGV 10 via the communication circuit 24, and transmits the generated data of one map to the AGV 10.
- the communication circuit 24 receives the data of the position to which the AGV 10 should go from the CPU 21 via the bus 27 and transmits it to the AGV 10.
- the communication circuit 24 transmits data (for example, notification) received from the AGV 10 to the CPU 21 and / or the memory 22 via the bus 27.
- the image processing circuit 25 is a circuit that generates video data to be displayed on the external monitor 29.
- the image processing circuit 25 operates only when the administrator operates the travel management device 20; further detailed explanation is omitted in the present embodiment.
- the monitor 29 may be integrated with the travel management device 20. Further, the CPU 21 may perform the processing of the image processing circuit 25.
- the CPU 21 or the image processing circuit 25 of the travel management device 20 reads a group of sensor data with reference to the storage device 23.
- each piece of sensor data is, for example, vector data including a set of the position and orientation of the AGV 10 when the sensor data was acquired, the angle at which the laser light was emitted, and the distance from the emission point to the reflection point.
- the CPU 21 or the image processing circuit 25 converts each piece of sensor data into the positions of coordinate points in an orthogonal coordinate system. By converting all the sensor data, a plurality of intermediate maps are obtained.
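The conversion described above can be sketched as follows, assuming each sensor datum holds the AGV pose, the beam's emission angle relative to the AGV's front, and the measured distance. The field layout and names are assumptions; the text only lists the contents of the vector data.

```python
import math

def scan_to_points(pose, scan):
    # pose: (x, y, theta_deg) of the AGV when the scan was taken.
    # scan: list of (beam_angle_deg, distance) pairs.
    # Each polar measurement is rotated by the AGV's heading and
    # translated by its position to get a global (x, y) coordinate.
    x, y, theta = pose
    points = []
    for angle, dist in scan:
        a = math.radians(theta + angle)
        points.append((x + dist * math.cos(a), y + dist * math.sin(a)))
    return points

pts = scan_to_points((0.0, 0.0, 90.0), [(0.0, 2.0)])
print([(round(px, 6), round(py, 6)) for px, py in pts])  # [(0.0, 2.0)]
```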
- the CPU 21 or the image processing circuit 25 generates one map from a plurality of intermediate maps. That is, in the process of FIG. 24, the travel management device 20 may perform the above-described steps S11 and S12.
- the generated map data is temporarily stored in the memory 22 or the storage device 23.
- the generated map data is then transmitted to the AGV 10 that generated the intermediate map data, either wirelessly via the access points 2a and 2b or via a removable recording medium.
- the generated map data may also be transmitted to another AGV 10.
- the travel management device 20 generates one map from a group of sensor data.
- the travel management device 20 is merely an example. Any signal processing device that includes a signal processing circuit (or computer) corresponding to the CPU 21, a memory, and a communication circuit can generate one map from a group of sensor data. In that case, the functions and configurations needed only for operating as the travel management device 20 can be omitted from the signal processing device.
- Modification 2 is an application of Modification 1.
- the entire factory may be divided into a plurality of smaller zones, and an intermediate map may be created using one AGV 10 for each zone. For example, a factory of 150 m × 100 m is divided into six zones of 50 m × 50 m each.
- the AGV 10 in each zone acquires a group of sensor data, and an external signal processing device other than the AGV 10, such as the travel management device 20, acquires the group of sensor data by any of the methods described in Modification 1.
- the travel management device 20 generates a plurality of intermediate maps for each zone by the above-described processing, and further generates one zone map from the plurality of intermediate maps. Then, the travel management device 20 creates an integrated map that combines the zone maps into one.
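A minimal sketch of combining the zone maps into one integrated map, assuming each zone's origin offset in the factory frame is known from the fixed zone layout (this bookkeeping is an assumption; the text only says the zone maps are combined into one):

```python
def integrate_zone_maps(zone_maps):
    # zone_maps: list of (origin, points) pairs, where `origin` is the
    # zone's offset in the factory frame and `points` are the zone
    # map's local coordinates. Shift each zone into the common frame
    # and take the union.
    integrated = set()
    for (ox, oy), points in zone_maps:
        integrated |= {(px + ox, py + oy) for px, py in points}
    return integrated

zones = [((0, 0), {(10, 10)}), ((50, 0), {(10, 10)})]
print(sorted(integrate_zone_maps(zones)))  # [(10, 10), (60, 10)]
```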
- the travel management device 20 may transmit the obtained integrated map to all the AGVs 10, or may cut out and transmit only the corresponding zone to each AGV 10 whose movement zone is determined in advance.
- the AGV 10 generates one map from a plurality of intermediate maps.
- the positioning device 14e of the AGV 10 obtains a plurality of intermediate maps, then collates the sensor data with the data of each intermediate map, and determines, for each intermediate map, a plurality of feature points whose degree of coincidence is greater than a predetermined value.
- the positioning device 14e collates the sensor data with the data of the first intermediate map, and determines a plurality of first feature points for the first intermediate map.
- the positioning device 14e collates the sensor data with the data of the second intermediate map, and determines a plurality of second feature points for the second intermediate map.
- the positioning device 14e determines a plurality of feature points (referred to as “common feature points”) included in common between the plurality of first feature points and the plurality of second feature points.
- the common feature point corresponds to a feature point included in the map 50a or 50b in the above-described embodiment.
- the positioning device 14e may determine, for each position, the ratio of the number of intermediate maps having a feature point at that position to the total number of intermediate maps. As in the above-described embodiment, the positioning device 14e may determine that the position is a common feature point when the ratio is greater than (or greater than or equal to) a threshold value.
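The ratio test can be sketched as follows (whether the comparison is strict or non-strict is left open in the text; a non-strict comparison is used here):

```python
from collections import Counter

def common_feature_points(feature_sets, ratio_threshold=0.5):
    # feature_sets: one set of feature-point positions per intermediate
    # map. A position becomes a common feature point when the fraction
    # of maps containing it reaches the threshold.
    counts = Counter(p for s in feature_sets for p in s)
    n = len(feature_sets)
    return {p for p, c in counts.items() if c / n >= ratio_threshold}

sets = [{(0, 0), (1, 1)}, {(0, 0)}, {(0, 0), (2, 2)}, {(1, 1)}]
print(sorted(common_feature_points(sets, ratio_threshold=0.5)))  # [(0, 0), (1, 1)]
```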
- the positioning device 14e identifies the self-position by using the collation result between the common feature point data and the sensor data.
- the “matching result” may be the result of matching the common feature point data against the sensor data by the positioning device 14e, or it may be the result of processing performed before the common feature points were determined.
- Modification 4: in the examples described above, a map of the two-dimensional space in which the AGV 10 travels is assumed. However, a three-dimensional space map may be generated using a laser range finder that can also scan the space in the height direction.
- the AGV 10 generates a plurality of intermediate maps including not only the plane direction but also the height direction. Then, taking the height-direction component into account as well, points whose degree of coincidence between the plurality of intermediate maps is greater than a predetermined value are determined as the plurality of feature points. Using a single map that includes the plurality of feature points in an identifiable manner, a process of identifying the self-position using feature amounts in the height direction can be performed.
- the technology of the present disclosure can be widely used in a mobile body that performs processing for identifying a self-position, a travel management device that controls the mobile body, and a management system that includes the mobile body and the travel management device.
- 2a, 2b wireless access point, 10 automated guided vehicle (AGV), 14 travel control device, 14a microcomputer, 14b memory, 14c storage device, 14d communication circuit, 14e positioning device, 15 laser range finder, 16a-16d motor, 17 drive device, 17a-17d motor drive circuit, 20 travel management device
Landscapes
- Engineering & Computer Science (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
Abstract
This mobile body (10) comprises: motors (16a-16d); a drive device (17) for the motors; a sensor (15) which senses surrounding spaces and outputs sensor data; a storage device (14c) which stores data for a plurality of interim maps, each of the plurality of interim maps being generated from the sensor data on surrounding spaces which are each sensed by the sensor at time intervals; and processing circuits (14a, 14e, 14g) which generate a single map from the plurality of interim maps. The single map includes a plurality of characteristic points (51) in an identifiable manner. Each of the plurality of characteristic points indicates a position where the degree of correspondence among the plurality of interim maps exceeds a prescribed value. The processing circuits carry out a process of matching the single map against the sensor data newly outputted from the sensor and identifying the self-position.
Description
The present disclosure relates to a mobile object, a signal processing device, and a computer program.
There are autonomous mobile robots that move along routes created on an environment map. An “environment map” is a map showing the geometric state of the objects present in a space. An autonomous mobile robot measures the shape of its surroundings using an external sensor such as a laser distance sensor, and geometrically matches the surrounding shape against the environment map to estimate (identify) its own current position and orientation. As a result, the autonomous mobile robot can move along the routes created on the environment map.
Since the arrangement and shape of objects in the space can change, it is necessary to update the environment map so that the actual state of the object matches the environment map.
JP 2012-93811 discloses a technique for updating an environmental map. The update of the environment map is a process of combining new measurement data with the shape of the past environment map and overwriting the past environment map with the new measurement data. Since the error included in the past environment map shape affects the accuracy when the new measurement data is combined, the updated environment map may include a new error. Therefore, when the update process is repeated, the error is accumulated, and the accuracy of the environment map can be lowered.
In Japanese Unexamined Patent Application Publication No. 2012-93811, an attribute that the shape is invariable is set in an area on the environment map where the actual shape is invariable (invariable area). The environmental map is not updated for the invariant area having the attribute. As a result, no error is accumulated in the invariant region, and accuracy can be maintained for shapes other than the invariant region.
The environmental map created from the measurement data reflects the position and size of the objects that existed in the space at the time of measurement. If the object is a movable object such as a box and is moved after the measurement data is acquired, inconsistency occurs between the arrangement of the objects in the actual space and the environment map. However, with the technology disclosed in Japanese Patent Application Laid-Open No. 2012-93811, an autonomous mobile robot has to identify its own position with reference to an existing environment map. In this case, there is a possibility that the self-position is erroneously identified.
An exemplary embodiment of the present disclosure provides a technique for generating a map that makes it possible to identify the self-position accurately, even when an object has moved, by using a plurality of intermediate maps generated before and after the object moved.
In an exemplary embodiment, the mobile body of the present disclosure includes: a motor; a driving device that controls the motor to move the mobile body; a sensor that senses a surrounding space and outputs sensor data; a storage device that stores data of a plurality of intermediate maps, each of the plurality of intermediate maps being generated from sensor data of the surrounding space sensed by the sensor at time intervals; and a processing circuit that generates one map from the plurality of intermediate maps stored in the storage device, the one map including a plurality of feature points in an identifiable manner, each of the plurality of feature points indicating a position where the degree of coincidence among the plurality of intermediate maps is greater than a predetermined value. The processing circuit performs a process of identifying the self-position by matching the one map against sensor data newly output from the sensor.
A mobile system according to an exemplary embodiment of the present invention includes a mobile body and a signal processing device. The mobile body includes a motor, a driving device that controls the motor to move the mobile body, a sensor that senses the surrounding space and outputs sensor data, a storage device that stores data of a plurality of intermediate maps, and a control circuit. Each of the plurality of intermediate maps is generated from sensor data of the surrounding space sensed by the sensor at time intervals. The signal processing device includes a processing circuit that generates one map from the plurality of intermediate maps stored in the storage device of the mobile body. The one map includes a plurality of feature points in an identifiable manner, and each of the plurality of feature points is a point whose degree of coincidence among the plurality of intermediate maps is greater than a predetermined value. The control circuit of the mobile body performs a process of identifying the self-position by matching the one map generated by the processing circuit of the signal processing device against sensor data newly output from the sensor. This allows the mobile body to identify its own position more accurately.
Hereinafter, an example of a mobile body according to the present disclosure, and of a management system including the mobile body and a travel management device, will be described with reference to the accompanying drawings. In this specification, an automated guided vehicle is taken as an example of a mobile body. An automated guided vehicle is also called an AGV (Automated Guided Vehicle), and the term “AGV” is used in this specification as well.
FIG. 1 shows an AGV 10 that travels in a passage 1 in, for example, a factory. FIG. 2 shows an overview of a management system 100 that manages the travel of the AGV 10 according to this example. In the illustrated example, the AGV 10 has map data and travels while recognizing its current position. The travel route of the AGV 10 follows commands from the travel management device 20. The AGV 10 moves by driving a plurality of built-in motors in accordance with a command and thereby rotating its wheels. Commands are sent wirelessly from the travel management device 20 to the AGV 10. Communication between the AGV 10 and the travel management device 20 is performed via wireless access points 2a, 2b, etc. provided near the ceiling of the factory, and conforms to, for example, the Wi-Fi (registered trademark) standard. Although only one AGV 10 is shown in FIG. 1, a plurality of AGVs 10 may travel, and the travel of each of them may or may not be managed by the travel management device 20.
The outline of the operation of the AGV 10 and the travel management device 20 included in the management system 100 is as follows.
Suppose the AGV 10 is moving from the nth position toward the (n+1)th target position (hereinafter written as “position Mn+1”) in accordance with a command from the travel management device 20 (the nth command, where n is a positive integer). The target position can be determined by the administrator for each AGV 10, for example.
When the AGV 10 reaches the target position Mn+1, it transmits an arrival notification (hereinafter “notification”) addressed to the travel management device 20. The notification is sent to the travel management device 20 via the wireless access point 2a. In this embodiment, the AGV 10 identifies its own position by matching the output of a sensor that senses its surroundings against map data, and then determines whether the identified position coincides with the position Mn+1.
When the notification is received, the travel management device 20 generates the next command (the (n+1)th command) for moving the AGV 10 from the position Mn+1 to the position Mn+2. The (n+1)th command includes the position coordinates of the position Mn+2, and may further include numerical values such as the acceleration time and the moving speed during constant-speed travel. The travel management device 20 transmits the (n+1)th command to the AGV 10.
When the (n + 1) th command is received, the AGV 10 analyzes the (n + 1) th command and performs a preprocessing calculation necessary for the movement from the position M n + 1 to the position M n + 2 . The preprocessing calculation is, for example, calculation for determining the rotation speed, rotation time, etc. of each motor for driving each wheel of the AGV 10.
FIG. 3 shows an example of each target position (▲) set in the travel route of AGV10. The interval between two adjacent target positions does not have to be a fixed value and can be determined by an administrator.
The AGV 10 can move in various directions according to commands from the travel management device 20. 4A to 4C show examples of movement paths of the AGV 10 that moves continuously.
FIG. 4A shows a moving path of the AGV 10 when traveling straight. After reaching the position M n + 1 , the AGV 10 can perform a pre-processing calculation, operate each motor according to the calculation result, and continue to move linearly to the next position M n + 2 .
FIG. 4B shows the movement path of the AGV 10 when it turns left at the position Mn+1 and moves toward the position Mn+2. After reaching the position Mn+1, the AGV 10 performs the preprocessing calculation and, according to the calculation result, rotates at least one motor located on the right side in the traveling direction. After turning counterclockwise by the angle θ on the spot, the AGV 10 rotates all of its motors at equal speed and travels straight toward the position Mn+2.
FIG. 4C shows the movement path of the AGV 10 when it moves in a circular arc from the position Mn+1 to the position Mn+2. After reaching the position Mn+1, the AGV 10 performs the preprocessing calculation and, according to the calculation result, makes the motors on the outer side of the turn rotate faster than those on the inner side. As a result, the AGV 10 can continue to move along an arcuate path toward the next position Mn+2.
FIG. 5 is an external view of an exemplary AGV 10 according to the present embodiment. FIG. 6 shows the hardware configuration of the AGV 10.
The AGV 10 includes four wheels 11a to 11d, a frame 12, a transfer table 13, a travel control device 14, and a laser range finder 15. In FIG. 5, the front wheel 11 a, the rear wheel 11 b, and the rear wheel 11 c are shown, but the front wheel 11 d is not clearly shown because it is hidden behind the frame 12.
The traveling control device 14 is a device that controls the operation of the AGV 10. The traveling control device 14 mainly includes a plurality of integrated circuits including a microcomputer (described later), a plurality of electronic components, and a substrate on which the plurality of integrated circuits and the plurality of electronic components are mounted. The travel control device 14 performs data transmission / reception with the travel management device 20 and pre-processing calculation described above.
The laser range finder 15 is an optical device that measures the distance to a target by, for example, irradiating the target with infrared laser light 15a and detecting the reflected light. In the present embodiment, the laser range finder 15 of the AGV 10 emits pulsed laser light 15a into a space spanning 135 degrees to the left and right of the AGV's front (270 degrees in total), changing the direction in 0.25-degree steps, and detects the reflected light of each beam. This yields data on the distance to the reflection point in each direction determined by the angle, for a total of 1080 steps at 0.25-degree intervals.
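The scan geometry stated above (270 degrees in 0.25-degree steps, 1080 steps in total) can be reproduced as a quick check. Whether the sweep starts at exactly -135.0 degrees or ends at exactly +135.0 degrees is an assumption; the text only fixes the range, the step, and the step count.

```python
def scan_angles(fov_deg=270.0, step_deg=0.25):
    # Emission directions of one scan, measured from the AGV's front:
    # 135 degrees to each side in 0.25-degree steps, 1080 steps total.
    n = int(round(fov_deg / step_deg))        # 1080
    return [-fov_deg / 2.0 + i * step_deg for i in range(n)]

angles = scan_angles()
print(len(angles), angles[0], angles[-1])  # 1080 -135.0 134.75
```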
The arrangement of objects around the AGV can be obtained from the position and orientation of the AGV 10 and the scan result of the laser range finder 15. In general, the position and posture of a moving object are called poses. The position and orientation of the moving body in the two-dimensional plane are expressed by position coordinates (x, y) in the XY orthogonal coordinate system and an angle θ with respect to the X axis. The position and posture of the AGV 10, that is, the pose (x, y, θ) may be simply referred to as “position” hereinafter.
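As an illustration of the pose convention above, the following minimal Python sketch (not part of the embodiment; the type and function names are hypothetical) represents a pose (x, y, θ) and transforms a point measured in the robot frame into the map frame:

```python
import math
from typing import NamedTuple


class Pose(NamedTuple):
    """Position and orientation (x, y, theta) in the 2-D map frame."""
    x: float
    y: float
    theta: float  # radians, measured from the X axis


def sensor_to_world(pose: Pose, px: float, py: float):
    """Rotate a robot-frame point by theta and translate it by (x, y)."""
    c, s = math.cos(pose.theta), math.sin(pose.theta)
    return (pose.x + c * px - s * py,
            pose.y + s * px + c * py)
```

A robot at (1, 2) facing along +Y (θ = π/2) sees a point one meter ahead at map coordinates (1, 3).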
The positioning device described later matches local map data created from the scan result of the laser range finder 15 against wider-range environmental map data, and can thereby identify the self-position (x, y, θ) on the environmental map.
The position of a reflection point viewed from the emission position of the laser light 15a can be expressed in polar coordinates determined by the angle and the distance. In the present embodiment, the laser range finder 15 outputs sensor data expressed in polar coordinates. However, the laser range finder 15 may convert the positions expressed in polar coordinates into orthogonal coordinates before outputting them.
Since the structure and operating principle of a laser range finder are well known, further detailed description is omitted in this specification. Examples of objects that can be detected by the laser range finder 15 include people, luggage, shelves, and walls.
The laser range finder 15 is an example of an external sensor that senses the surrounding space and acquires sensor data. Other examples of such external sensors include image sensors and ultrasonic sensors.
In this specification, data output from a sensor is referred to as "sensor data". The "sensor data" output from the laser range finder 15 is a plurality of vector data pairs, each pair consisting of an angle θ and a distance L. The angle θ varies, for example, from −135 degrees to +135 degrees in 0.25-degree increments; angles may be expressed as positive to the right and negative to the left of the front of the AGV 10. The distance L is the distance to the object measured at each angle θ. It is obtained by multiplying half of the difference between the emission time of the infrared laser light 15a and the reception time of its reflected light (that is, half of the round-trip time of the laser light) by the speed of light.
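The sensor-data format above can be sketched in Python as follows. This is an illustrative reconstruction, not code from the embodiment; the function names are hypothetical. It converts a round-trip time into the distance L and the polar (θ, L) pairs into orthogonal coordinates:

```python
import math

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def range_from_round_trip(round_trip_s: float) -> float:
    """Distance L = (round-trip time / 2) * speed of light."""
    return (round_trip_s / 2.0) * SPEED_OF_LIGHT

def polar_to_cartesian(scan):
    """Convert (angle_deg, distance) pairs into (x, y) points.

    Convention as in the text: 0 degrees is straight ahead, positive
    angles to the right; here +x is forward and +y is to the right.
    """
    return [(L * math.cos(math.radians(a)), L * math.sin(math.radians(a)))
            for a, L in scan]

# One full scan covers -135 to +135 degrees in 0.25-degree steps:
scan_angles = [-135.0 + 0.25 * i for i in range(1081)]
```

A round trip of 2/c seconds, for example, corresponds to a range of one meter.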
Refer to FIG. 6. FIG. 6 also shows a specific configuration of the travel control device 14 of the AGV 10.
The AGV 10 includes the travel control device 14, the laser range finder 15, four motors 16a to 16d, and a drive device 17.
The travel control device 14 includes a microcomputer 14a, a memory 14b, a storage device 14c, a communication circuit 14d, and a positioning device 14e. The microcomputer 14a, the memory 14b, the storage device 14c, the communication circuit 14d, and the positioning device 14e are connected by a communication bus 14f and can exchange data with one another. The laser range finder 15 is also connected to the communication bus 14f via a communication interface (not shown), and transmits its measurement data to the microcomputer 14a, the positioning device 14e, and/or the memory 14b.
The microcomputer 14a is a control circuit (computer) that performs the calculations for controlling the entire AGV 10, including the travel control device 14. Typically, the microcomputer 14a is a semiconductor integrated circuit. The microcomputer 14a transmits a PWM (Pulse Width Modulation) signal to the drive device 17 to control it and adjust the current flowing through each motor, so that each of the motors 16a to 16d rotates at a desired rotational speed.
The memory 14b is a volatile storage device that stores the computer program executed by the microcomputer 14a. The memory 14b can also be used as a work memory when the microcomputer 14a and the positioning device 14e perform calculations.
The storage device 14c is a nonvolatile semiconductor memory device that stores map data. However, the storage device 14c may instead be a magnetic recording medium, typified by a hard disk, or an optical recording medium, typified by an optical disc, and may further include a head device for writing data to and/or reading data from such a recording medium, together with a controller for the head device. Details of the map data will be described later with reference to FIG. 20.
The communication circuit 14d is a wireless communication circuit that performs wireless communication conforming to, for example, the Wi-Fi (registered trademark) standard.
The positioning device 14e receives sensor data from the laser range finder 15 and reads the map data stored in the storage device 14c. The positioning device 14e identifies the self-position by matching the sensor data against the map data. The specific operation of the positioning device 14e will be described later.
In the present embodiment, the microcomputer 14a and the positioning device 14e are described as separate components, but this is only an example. They may instead be realized as a single chip circuit or semiconductor integrated circuit capable of independently performing the operations of both. FIG. 6 shows a chip circuit 14g encompassing the microcomputer 14a and the positioning device 14e. In this specification, the microcomputer 14a, the positioning device 14e, and/or the chip circuit 14g may be referred to as a computer or a processing circuit. In the following, an example in which the microcomputer 14a and the positioning device 14e are provided separately and independently will be described.
The four motors 16a to 16d are attached to the four wheels 11a to 11d, respectively, and rotate them. The number of motors is an example; there may be two, three, or five or more.
The drive device 17 has motor drive circuits 17a to 17d for adjusting the current flowing through each of the four motors 16a to 16d. Each of the motor drive circuits 17a to 17d is a so-called inverter circuit that switches the current to its motor on and off according to the PWM signal transmitted from the microcomputer 14a, thereby adjusting the current flowing through the motor.
Next, an example of processing in which the AGV 10 according to the present disclosure generates a single map will be described.
The map generation can be divided into a first process and a second process. The first process generates a plurality of intermediate maps. The second process determines a plurality of feature points from the intermediate maps and generates a single map from them.
First, the first process will be described.
In the present embodiment, the first process is realized by, for example, SLAM (Simultaneous Localization and Mapping) technology. The AGV 10 operates the laser range finder 15 to scan the surrounding space while actually traveling in, for example, the factory where it is used, generating a map while estimating its own position. Alternatively, the AGV 10 may travel along a specific route under the control of an administrator and generate a map from the sensor data acquired by the laser range finder 15.
FIGS. 7 to 11 each show the AGV 10 generating a map while moving. FIG. 7 shows the AGV 10 scanning the surrounding space using the laser range finder 15. Laser light is emitted at every predetermined step angle to perform the scan.
In each of FIGS. 7 to 11, the positions of the reflection points of the laser light are indicated by a plurality of points represented by the symbol "•", such as point 4 in FIG. 7. The positioning device 14e accumulates the positions of the black points 4 obtained during travel in, for example, the memory 14b. By continuing to scan while the AGV 10 travels, the map is gradually completed. For simplicity, FIGS. 8 to 11 show only the scan range; that scan range is also illustrative and differs from the 270-degree total described above.
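The accumulation of reflection points during travel can be sketched as follows. This is an illustrative Python sketch, not code from the embodiment; the assumption that points are quantized to grid cells (so that repeated scans of the same surface land on the same map entry), the 5-cm cell size, and all names are hypothetical:

```python
import math

def accumulate_scan(map_points: set, pose, scan, cell=0.05):
    """Add one scan's reflection points to the map point set.

    `pose` is (x, y, theta); `scan` is a list of (angle_rad, distance)
    pairs in the robot frame. Each world-frame point is quantized to a
    grid cell of side `cell` meters before being stored.
    """
    x, y, theta = pose
    for angle, dist in scan:
        wx = x + dist * math.cos(theta + angle)
        wy = y + dist * math.sin(theta + angle)
        map_points.add((round(wx / cell), round(wy / cell)))
    return map_points
```

Calling this once per scan as the vehicle moves gradually fills in the map, as in FIGS. 7 to 11.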
FIG. 12 schematically shows a completed intermediate map 30. The positioning device 14e stores the data of the intermediate map 30 in the memory 14b or the storage device 14c. The number and density of the black points shown in the figure are illustrative.
The AGV 10 travels a plurality of times at intervals, generates a plurality of intermediate maps, and stores them in the memory 14b or the storage device 14c. The intervals between runs, for example several days, several hours, several tens of minutes, or several minutes, can be determined as appropriate by those skilled in the art.
The reason the AGV 10 generates a plurality of intermediate maps is that doing so makes it possible to distinguish objects that have not been moved (referred to as "fixed objects") from objects that have been moved (referred to as "movable objects"). Reflection points that exist at the same position across a plurality of intermediate maps can be regarded as representing fixed objects. Conversely, a reflection point whose position changes between intermediate maps means that an existing movable object was removed or that a movable object was newly placed. Note that, in this specification, an object that could in reality be moved but was not moved during the generation of the intermediate maps is also called a "fixed object". Conversely, even an object such as a wall, which would generally seldom be moved, is called a "movable object" if it was moved during the generation of the intermediate maps.
In general, if a single map such as the one shown in FIG. 12 is acquired using SLAM technology and that map is then matched against sensor data obtained while actually traveling, the AGV can identify its own position. An example of identifying the self-position using a single map is described below.
FIGS. 13 to 15 schematically show the procedure of a typical position identification process. The AGV has acquired in advance, by SLAM technology, a map corresponding to the intermediate map 30 of FIG. 13 (hereinafter referred to as the "reference map 30"). While traveling, the AGV acquires the sensor data 32 shown in FIG. 13 at predetermined time intervals, or continuously, and executes the process of identifying its self-position.
First, the AGV sets various regions on the reference map 30 whose positions and angles are varied (for example, regions 34a, 34b, and 34c), and matches the plurality of reflection points contained in each region against the reflection points contained in the sensor data 32. In FIG. 14, points determined to match as a result of this comparison (for example, point 5) are indicated by the symbol "■". When the proportion of matching points exceeds a predetermined reference value, the AGV determines that the sensor data 32 matches the plurality of black points in the region 34d.
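The acceptance test above, in which the proportion of matched points is compared with a reference value, might look like the following in Python. This is an illustrative sketch; the grid-cell representation of points and the 0.8 default threshold are assumptions, not values from the embodiment:

```python
def match_ratio(map_cells: set, sensor_cells) -> float:
    """Fraction of sensor points that fall on an occupied map cell."""
    if not sensor_cells:
        return 0.0
    hits = sum(1 for c in sensor_cells if c in map_cells)
    return hits / len(sensor_cells)

def is_match(map_cells: set, sensor_cells, threshold=0.8) -> bool:
    """Accept a candidate region when the hit ratio exceeds the threshold."""
    return match_ratio(map_cells, sensor_cells) > threshold
```

The search repeats this test for each candidate region (position and angle) and keeps the region whose ratio passes.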
The AGV then determines the laser emission position from which each black point contained in the region 34d could have been obtained, that is, its self-position. In FIG. 15, the identified self-position 36 is represented by the symbol "X".
With the method described above, the AGV can identify its self-position. However, with this method the AGV cannot always identify its self-position correctly, and may erroneously identify a position different from its actual position as its self-position.
FIG. 16 shows an example in which one or more obstacles 38 have been placed in part of the space where the AGV 10 travels. The sensor data acquired by the AGV differs depending on whether the obstacles 38 are present. FIG. 17 shows an example of sensor data 40 reflecting the positions of the obstacles 38.
FIG. 18 is a schematic diagram of the matching process between the reference map 30 and the sensor data 40. As described in the example of FIG. 13, the AGV sets various regions 34a, 34b, 34c, and so on, and matches the plurality of reflection points contained in each region against the sensor data 40.
FIG. 19 shows a region 34e containing a set 6 of points erroneously determined to match as a result of the matching process. When such an erroneous determination is made, the AGV erroneously identifies the position 44, offset from its actual position 42, as its self-position.
A method capable of avoiding such erroneous position identification is described below. In the method developed by the present inventor, as described above, the first process is performed a plurality of times to acquire a plurality of intermediate maps, after which the second process is performed.
FIG. 20 schematically shows the procedure of the second process performed by the positioning device 14e. In the second process, the positioning device 14e generates a single map 50a from six intermediate maps 30a to 30f acquired at intervals. Any plural number of intermediate maps may be used: for example, from two to five, or seven or more.
Each black point constituting the six intermediate maps 30a to 30f indicates the position of a reflection point of the laser light. The positioning device 14e determines feature points from the intermediate maps 30a to 30f. In the present embodiment, a "feature point" is a point whose "degree of coincidence" across the intermediate maps acquired over the plurality of runs is relatively high; more specifically, it is a virtual coordinate point whose degree of coincidence is greater than a predetermined value. Each virtual coordinate point is one unit for determining whether a location is "free space", through which laser light can pass, or "occupied space", through which it cannot. In the present embodiment, examples of "occupied space" are the surfaces and interiors of objects present in the space.
The "degree of coincidence" can be calculated in various ways. For example, when the "degree of coincidence" is expressed as the degree of agreement between the positions of the black points constituting the intermediate maps 30a to 30f, it can be calculated by the following method.
The positioning device 14e adjusts the angle of each of the intermediate maps 30a to 30f with reference to one or more feature amounts contained in common in the intermediate maps 30a to 30f, and superimposes them. A "feature amount" is represented by the positional relationship (arrangement) of a plurality of black points representing a location on which a movable object is essentially never placed, such as a staircase or an elevator.
After determining the angle at which the intermediate maps 30a to 30f are superimposed, the positioning device 14e determines, for each position, whether four or more intermediate maps have a black point at that position. When four or more intermediate maps have a black point at the same position, the black point at that position is reflected in the newly generated map as a "feature point". Conversely, when fewer than four intermediate maps have a black point at that position, the black point there is excluded from the "feature points". Note that "four" is an example of a threshold value. A feature point need only be contained in common in at least two intermediate maps, in which case the threshold is "two".
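This voting rule, in which a black point survives as a feature point only when at least four of the six intermediate maps contain it, can be sketched in Python. Representing each aligned intermediate map as a set of occupied grid cells is an assumption for illustration; the function name is hypothetical:

```python
def feature_points(intermediate_maps, threshold=4):
    """Keep cells that appear in at least `threshold` intermediate maps.

    Each intermediate map is a set of occupied grid cells, already
    superimposed in a common frame. With six maps and threshold=4,
    only cells seen on four or more runs survive as feature points.
    """
    counts = {}
    for m in intermediate_maps:
        for cell in m:
            counts[cell] = counts.get(cell, 0) + 1
    return {cell for cell, n in counts.items() if n >= threshold}
```

Cells produced by movable objects, which appear in only a few runs, are voted out automatically.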
Alternatively, the positioning device 14e may set a virtual coordinate position, determine for each intermediate map the black point closest to that coordinate position, obtain the deviation from the coordinate position, and then obtain the sum of the deviations. The positioning device 14e may use the difference or ratio between the obtained sum and a predetermined allowable value as the degree of coincidence. By obtaining the degree of coincidence for each virtual coordinate position on the map to be generated, the positioning device 14e can determine the feature points.
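This alternative measure, summing each intermediate map's nearest-black-point deviation from a virtual coordinate and comparing the sum with an allowable value, might be sketched as follows. An illustrative sketch only; the names and the "sum within allowance" acceptance form are assumptions:

```python
import math

def deviation_sum(coord, intermediate_maps):
    """Sum over maps of the distance from `coord` to that map's
    nearest black point (one term per intermediate map)."""
    return sum(min(math.dist(coord, p) for p in points)
               for points in intermediate_maps)

def is_feature(coord, intermediate_maps, allowance):
    """Treat the virtual coordinate as a feature point when the
    deviation sum stays within the predetermined allowable value."""
    return deviation_sum(coord, intermediate_maps) <= allowance
```

A coordinate that every run saw a black point near accumulates a small sum and qualifies; a coordinate seen in only one or two runs accumulates large nearest-point distances from the other maps and is rejected.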
A coordinate position determined to be a feature point can be said to represent the surface of a fixed object present in common across the plurality of maps. Conversely, a coordinate position not determined to be a feature point can be said to represent a position where the surface of a removed movable object existed, or where the surface of a newly placed movable object exists.
The positioning device 14e of the AGV 10 generates a single map 50a in which the feature points have been determined. As shown in FIG. 20, the map 50a shows the positions of a plurality of feature points 51, while positions 52 where no feature point 51 exists are blank. The map 50a is a map in which only the feature points are shown identifiably.
Note that the representation format of the map 50a is an example; the AGV 10 may generate maps in other representation formats. For example, FIG. 21 shows a single map 50b that contains feature points and non-feature points in a mutually distinguishable manner. In the map 50b, a plurality of feature points 51 and a plurality of non-feature points 53 are each shown identifiably.
The map 50b may also be generated by another method. After determining the angle at which the intermediate maps 30a to 30f are superimposed, the positioning device 14e determines, for each position, the proportion of intermediate maps that have a black point there in common. For example, if three of the six intermediate maps have a black point in common at a certain position, the proportion for the black point at that position is set to 0.5 (= 3/6). For example, each of the plurality of black points 51 shown in FIG. 21 may represent a position with a weight of 0.5 or more, and each of the plurality of X points 53 a position with a weight of less than 0.5. The set proportion indicates the degree to which the point is reflected in the calculation during the position identification process that uses the finally generated map. During position identification using the map 50b, each point is used, in accordance with the value of its proportion, in the determination of whether it matches a point acquired as sensor data. In other words, the "proportion" represents a "weight" used in the calculation during position identification. Black points with a weight of 1, which are always included in the calculation during position identification, may be called "feature points", and black points with a weight of less than 1 may be called non-feature points. Alternatively, black points whose weight is at least a threshold, for example 0.6 or more, may be called "feature points" and black points with a weight of less than 0.6 non-feature points; or black points with a weight greater than 0 may be called "feature points" and black points with a weight of 0 non-feature points. In general, black points whose weight is at least, or greater than, an arbitrary threshold may be called feature points, and black points whose weight is below, or at most, that threshold may be called non-feature points. Alternatively, black points for which a weight has been set may be called "feature points", and black points for which no weight has been set "non-feature points".
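The weight assignment described here, where a cell's weight is the fraction of intermediate maps containing it (for example 3/6 = 0.5), can be sketched as follows. The set-of-cells representation and the function name are assumptions for illustration:

```python
def cell_weights(intermediate_maps):
    """Weight of each cell = fraction of intermediate maps containing it.

    Each intermediate map is a set of occupied grid cells in a common
    frame; the result maps each observed cell to a weight in (0, 1].
    """
    n = len(intermediate_maps)
    counts = {}
    for m in intermediate_maps:
        for cell in m:
            counts[cell] = counts.get(cell, 0) + 1
    return {cell: k / n for cell, k in counts.items()}
```

Thresholding this dictionary at, say, 0.5 reproduces the feature/non-feature split shown in FIG. 21.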
The positioning device 14e identifies the self-position by matching the sensor data against the plurality of feature points contained in the map 50a or 50b. Below, as an example, the process of matching the sensor data 40 shown in FIG. 17 against the map 50a is described.
FIG. 22 is a schematic diagram of the matching process between the map 50a and the sensor data 40. The positioning device 14e sets various regions 34a, 34b, 34c, and so on, and matches the plurality of feature points contained in each region against the sensor data 40 by the method described with reference to FIG. 14.
In FIG. 23, points determined to match as a result of the comparison (for example, point 7) are indicated by the symbol "■". The example of FIG. 23 differs from that of FIG. 19 in that, in FIG. 23, positions (or regions) 52 where no feature point 51 exists are not used in the matching process. Since the positioning device 14e performs the matching using sensor data and feature points with a high degree of coincidence, it can obtain a more reliable matching result. As a result, the positioning device 14e can correctly identify the actual position 42 as its self-position.
Note that, in the matching process, setting regions of every conceivable size and angle and comparing the plurality of feature points in each region against the sensor data 40 is not realistic, because the amount of computation would be enormous. The positioning device 14e therefore limits the range over which regions are set in order to reduce the amount of computation. When the AGV 10 reaches a target position Mn, it receives from the travel management device 20 an instruction specifying the next target position Mn+1 and travels toward it. It therefore suffices to set regions so as to include the travel route between positions Mn and Mn+1, and to compare the plurality of feature points in each such region against the sensor data 40. The comparison calculation can thereby be omitted for regions containing positions far from the current travel route.
When matching the map 50b shown in FIG. 21 against the sensor data 40, the same method as in the example of the map 50a can be adopted using only the plurality of feature points 51. Alternatively, the plurality of non-feature points 53 may also be used. In that case, the positioning device 14e may assign different weights to the feature points 51 and the non-feature points 53 when matching the map 50b against the sensor data 40, for example making the weight of each feature point 51 larger than that of a non-feature point 53. As a result, if the deviation between the position of each feature point 51 in a given region and the positions of the reflection points contained in the sensor data 40 is relatively small, that feature point 51 can be judged to match the sensor data for the region.
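Weighted matching against the map 50b might then score a candidate region as follows. An illustrative sketch under the assumption that the map is a dictionary from occupied cells to weights, as in the weight-assignment description above; cells absent from the map contribute zero:

```python
def weighted_match_score(weights: dict, sensor_cells) -> float:
    """Average per-cell weight collected by the sensor points.

    `weights` maps occupied cells to weights in (0, 1]. Feature points
    (high weight) dominate the score, so a cluster of movable-object
    points cannot outvote the fixed structure of the environment.
    """
    if not sensor_cells:
        return 0.0
    return sum(weights.get(c, 0.0) for c in sensor_cells) / len(sensor_cells)
```

The candidate region with the highest score is then taken as the matched region, as in the unweighted case.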
As described above, the AGV 10 generates, from a plurality of intermediate maps, a new map that identifiably contains the feature points representing fixed objects. FIG. 24 is a flowchart showing the procedure for generating the new map.
In step S10, the positioning device 14e of the AGV 10 senses the surrounding space while traveling in the factory and acquires a plurality of intermediate maps. In step S11, the positioning device 14e determines the feature points whose degree of coincidence across the intermediate maps is greater than a predetermined value; the methods of calculating the degree of coincidence and of determining the feature points are as described above. In step S12, the positioning device 14e generates a map that identifiably contains the feature points. The map generated in step S12 may be the map 50a shown in FIG. 20 or the map 50b shown in FIG. 21.
The above processing may be performed by the microcomputer 14a instead of the positioning device 14e, or by the chip circuit 14g in which the microcomputer 14a and the positioning device 14e are integrated; that is, it suffices for a "processing circuit" to perform it. Also, although the above processing uses all of the intermediate maps, it is sufficient to use at least two of them rather than all.
Hereinafter, modifications of the above-described embodiment will be described.
(Modification 1)
In the above example, the positioning device 14e of the AGV 10 generates a plurality of intermediate maps from the sensor data and further generates a single map from the plurality of intermediate maps. However, the process of generating the final single map from the sensor data may instead be performed by an external signal processing device other than the AGV 10, for example the travel management device 20. A system including the AGV 10 and such a signal processing device is referred to as a "map generation system". Furthermore, a system in which the AGV 10 identifies its own position using the single map generated from the plurality of intermediate maps is referred to as a "mobile body system".
In the following, the travel management device 20 is taken as an example of the external signal processing device.
The AGV 10 makes a round of the factory, acquires the group of sensor data necessary for the intermediate maps, and writes it to, for example, a removable recording medium. The administrator removes the recording medium and loads the written group of sensor data into the travel management device 20. Alternatively, while traveling through the factory, the AGV 10 may wirelessly transmit the obtained sensor data to the travel management device 20 every scan or every few scans. After the round of the factory is completed, the travel management device 20 has acquired the group of sensor data necessary for the intermediate maps. In this modification, the intermediate maps are created not by the AGV 10 but by the travel management device 20.
FIG. 25 shows the configuration of the mobile body system 60 and also the hardware configuration of the travel management device 20. The travel management device 20 includes a CPU 21, a memory 22, a storage device 23, a communication circuit 24, and an image processing circuit 25, which are connected by a communication bus 27 and can exchange data with one another.
The CPU 21 is a signal processing circuit (computer) that controls the operation of the travel management device 20. Typically, the CPU 21 is a semiconductor integrated circuit.
The memory 22 is a volatile storage device that stores the computer program executed by the CPU 21. The memory 22 may also be used as a work memory when the CPU 21 performs calculations.
The storage device 23 stores the group of sensor data received from the AGV 10. The storage device 23 may also store the data of the plurality of intermediate maps generated from the group of sensor data, and may further store the data of the single map generated from the plurality of intermediate maps. The storage device 23 may be a nonvolatile semiconductor memory, a magnetic recording medium typified by a hard disk, or an optical recording medium typified by an optical disc.
The storage device 23 may also store position data indicating each position that can serve as a destination of the AGV 10, which is necessary for functioning as the travel management device 20. The position data may be represented, for example, by coordinates virtually set within the factory, and is determined by the administrator.
The communication circuit 24 performs wired communication compliant with, for example, the Ethernet (registered trademark) standard. The communication circuit 24 is connected by wire to the wireless access points 2a, 2b, and so on, and can communicate with the AGV 10 via those access points. Via the communication circuit 24, the travel management device 20 receives from the AGV 10 the data of the plurality of intermediate maps, either individually or as a batch, and transmits the data of the generated single map to the AGV 10.
The communication circuit 24 also receives from the CPU 21, via the bus 27, data on the position to which the AGV 10 should travel, and transmits it to the AGV 10. The communication circuit 24 transmits data received from the AGV 10 (for example, notifications) to the CPU 21 and/or the memory 22 via the bus 27.
The image processing circuit 25 is a circuit that generates video data to be displayed on an external monitor 29. The image processing circuit 25 operates mainly when the administrator operates the travel management device 20; further details are omitted in this embodiment. The monitor 29 may be integrated with the travel management device 20, and the processing of the image processing circuit 25 may instead be performed by the CPU 21.
The CPU 21 or the image processing circuit 25 of the travel management device 20 reads the group of sensor data from the storage device 23. Each item of sensor data is, for example, vector data consisting of the position and orientation of the AGV 10 at the time the data was acquired, the angle at which the laser beam was emitted, and the distance from the emission point to the reflection point. The CPU 21 or the image processing circuit 25 converts each item of sensor data into the position of a coordinate point in an orthogonal coordinate system. By converting all of the sensor data, the plurality of intermediate maps is obtained.
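The conversion of one such vector datum into an orthogonal-coordinate point can be sketched as follows. The exact field layout of the vector data is an assumption for illustration:

```python
import math

def sensor_datum_to_point(x, y, theta, beam_angle, distance):
    """Convert one laser measurement to a point in orthogonal coordinates.

    (x, y, theta): position and heading of the AGV when the datum was
    acquired; beam_angle: emission angle relative to the heading;
    distance: range from the emission point to the reflection point.
    The tuple layout is illustrative, not taken from the specification.
    """
    a = theta + beam_angle                      # beam direction in world frame
    return (x + distance * math.cos(a),
            y + distance * math.sin(a))
```

Applying this to every datum of a scan session yields the coordinate points of one intermediate map.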
The CPU 21 or the image processing circuit 25 generates a single map from the plurality of intermediate maps. That is, in the process of FIG. 24, the travel management device 20 need only perform steps S11 and S12 described above.
When a single map has been generated from the plurality of intermediate maps by the above processing, the data of the generated map is temporarily stored in the memory 22 or the storage device 23. The generated map data is then transmitted, wirelessly via the access points 2a and 2b or via a removable recording medium or the like, to the AGV 10 that generated the intermediate map data. The generated map data may, however, also be transmitted to other AGVs 10.
In the description above, the travel management device 20 generates a single map from the group of sensor data, but the travel management device 20 is only an example. Any signal processing device that has a signal processing circuit (or computer) corresponding to the CPU 21, a memory, and a communication circuit can generate a single map from a group of sensor data. In that case, the functions and components needed to act as the travel management device 20 can be omitted from the signal processing device.
(Modification 2)
Modification 2 is an application of Modification 1.
For example, in a very large factory, it takes considerable labor and time to perform sensing multiple times with the sensor while a single AGV 10 travels the whole site. The entire factory may therefore be divided into a plurality of smaller zones, and one AGV 10 may be used per zone to create the intermediate maps. For example, a 150 m x 100 m factory is divided into six zones of 50 m x 50 m each.
The AGV 10 in each zone acquires a group of sensor data and causes the external signal processing device other than the AGV 10, for example the travel management device 20, to acquire it by any of the methods described in Modification 1. By the processing described above, the travel management device 20 generates a plurality of intermediate maps for each zone and then creates one zone map per zone from those intermediate maps. The travel management device 20 then creates an integrated map that combines the zone maps into one. By operating a plurality of AGVs 10 in parallel, the scan of the entire factory is completed in roughly the time needed to sense a single zone, and one integrated map is obtained.
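Combining the zone maps into one integrated map can be sketched as placing each zone's grid at its known offset in a global grid. The grid representation and the zone-indexing scheme are assumptions for illustration:

```python
import numpy as np

def stitch_zone_maps(zone_maps, zone_shape, layout):
    """Combine per-zone occupancy grids into one integrated map.

    zone_maps: dict mapping (row, col) zone indices to equal-shaped
    grids; zone_shape: (h, w) of one zone grid; layout: (rows, cols)
    of the zone arrangement, e.g. (2, 3) for the 150 m x 100 m example
    with six 50 m x 50 m zones. Placement by grid index is illustrative.
    """
    h, w = zone_shape
    rows, cols = layout
    integrated = np.zeros((rows * h, cols * w), dtype=int)
    for (r, c), grid in zone_maps.items():
        # paste each zone map into its slot of the global grid
        integrated[r * h:(r + 1) * h, c * w:(c + 1) * w] = grid
    return integrated
```

In practice the zone boundaries would also need alignment against each other (e.g. by matching overlapping scan data), which this sketch omits.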
The travel management device 20 may transmit the resulting integrated map to all of the AGVs 10, or, for an AGV 10 whose zones of movement are determined in advance, it may cut out and transmit only the relevant zones.
(Modification 3)
In the above-described embodiment, the AGV 10 generates a single map from a plurality of intermediate maps. In this modification, after acquiring the plurality of intermediate maps, the positioning device 14e of the AGV 10 collates the sensor data against the data of each intermediate map and determines, for each intermediate map, a plurality of feature points whose degree of coincidence is greater than a predetermined value. For example, the positioning device 14e collates the sensor data against the data of a first intermediate map and determines a plurality of first feature points for the first intermediate map. Similarly, the positioning device 14e collates the sensor data against the data of a second intermediate map and determines a plurality of second feature points for the second intermediate map. The positioning device 14e then determines the feature points that are common to the plurality of first feature points and the plurality of second feature points (referred to as "common feature points"). The common feature points correspond to the feature points included in the map 50a or 50b of the above-described embodiment.
The positioning device 14e may also determine, for each position, whether the ratio of the number of intermediate maps having a feature point at that position to the total number of intermediate maps is greater than, or at least equal to, a threshold. As in the above-described embodiment, the positioning device 14e may judge the position to be a common feature point when the ratio is greater than, or at least equal to, the threshold.
The positioning device 14e identifies its own position using the result of collating the common feature point data against the sensor data. The "collation result" here may be the result of the positioning device 14e newly collating the common feature point data against the sensor data, or it may be the result of the processing performed before the common feature points were determined.
(Modification 4)
All of the examples described above assume a map of the two-dimensional space in which the AGV 10 travels. However, a map of a three-dimensional space may be generated using a laser range finder that can also scan in the height direction.
The AGV 10 generates a plurality of intermediate maps that cover not only the horizontal plane but also the height direction. Then, taking the height component into account as well, points whose degree of coincidence among the plurality of intermediate maps is greater than a predetermined value are determined as the plurality of feature points. Using a single map that includes these feature points in an identifiable manner, the process of identifying the AGV's own position can exploit feature quantities in the height direction.
The technology of the present disclosure can be widely used in mobile bodies that identify their own position, in travel management devices that control such mobile bodies, and in management systems encompassing both.
2a, 2b wireless access point; 10 automated guided vehicle (AGV); 14 travel control device; 14a microcomputer; 14b memory; 14c storage device; 14d communication circuit; 14e positioning device; 15 laser range finder; 16a-16d motor; 17 drive device; 17a-17d motor drive circuit; 20 travel management device
Claims (13)

1. A mobile body comprising:
a motor;
a drive device that controls the motor to move the mobile body;
a sensor that senses a surrounding space and outputs sensor data;
a storage device that stores data of a plurality of intermediate maps, each of the plurality of intermediate maps being generated from sensor data of the surrounding space sensed by the sensor at intervals of time; and
a processing circuit that generates a single map from the plurality of intermediate maps stored in the storage device, the single map identifiably including a plurality of feature points, each of which indicates a position at which the degree of coincidence among the plurality of intermediate maps is greater than a predetermined value,
wherein the processing circuit identifies the mobile body's own position by collating the single map against sensor data newly output from the sensor.

2. The mobile body according to claim 1, wherein the processing circuit calculates the degree of coincidence between at least two of the plurality of intermediate maps to determine the plurality of feature points, and generates the single map identifiably including the determined plurality of feature points.

3. The mobile body according to claim 2, wherein each of the plurality of feature points is determined from sensor data present at a common position on at least two of the plurality of intermediate maps.

4. The mobile body according to claim 2, wherein each of the plurality of feature points represents that the sensor data is present at a common position on at least a threshold number of the plurality of intermediate maps.

5. The mobile body according to claim 3 or 4, wherein each of the plurality of feature points represents the position of an object present in the surrounding space sensed by the sensor.

6. The mobile body according to claim 2, wherein the processing circuit judges, for each position on each of the plurality of intermediate maps, whether the sensor data is present, and generates the single map having as its feature points the positions at which the ratio of the number of intermediate maps in which the sensor data is present to the number of the plurality of intermediate maps is greater than, or at least equal to, a threshold.

7. The mobile body according to claim 5, wherein the processing circuit judges, for each position and using the ratio, whether each of the plurality of feature points on the single map matches the sensor data newly output from the sensor.

8. The mobile body according to any one of claims 3 to 7, wherein the storage device of the mobile body stores the single map generated by the processing circuit of the signal processing device.

9. The mobile body according to any one of claims 1 to 8, wherein the processing circuit superimposes the plurality of intermediate maps with reference to a feature quantity commonly included in the plurality of intermediate maps, and calculates the degree of coincidence according to the magnitude of the deviation among the plurality of maps when superimposed.

10. The mobile body according to any one of claims 1 to 9, wherein the single map generated from the plurality of intermediate maps identifiably includes the plurality of feature points and a plurality of non-feature points other than the plurality of feature points.

11. A signal processing device that generates a map referenced by a mobile body to identify its own position,
the mobile body including:
a motor;
a drive device that controls the motor to move the mobile body;
a sensor that senses a surrounding space and outputs sensor data;
a storage device that stores data of a plurality of intermediate maps, each of the plurality of intermediate maps being generated from sensor data of the surrounding space sensed by the sensor at intervals of time; and
a control circuit,
the signal processing device comprising:
a processing circuit that generates a single map from the plurality of intermediate maps stored in the storage device by determining a plurality of feature points whose degree of coincidence among the plurality of intermediate maps is greater than a predetermined value and generating the single map identifiably including the plurality of feature points; and
a storage device that stores the generated single map.

12. A mobile body comprising:
a motor;
a drive device that controls the motor to move the mobile body;
a sensor that senses a surrounding space and outputs sensor data;
a storage device that stores data of a plurality of intermediate maps, each of the plurality of intermediate maps being generated from sensor data of the surrounding space sensed by the sensor at intervals of time; and
a processing circuit,
wherein the sensor outputs the sensor data while the mobile body is moving, and
the processing circuit:
compares the sensor data output from the sensor during movement with each of the plurality of intermediate maps stored in the storage device and determines, for each position on each intermediate map, feature points whose degree of coincidence between the sensor data and that intermediate map is greater than a predetermined value;
determines, for each position determined to have a feature point, whether the ratio of the number of intermediate maps determined to have the feature point to the number of the plurality of intermediate maps is greater than, or at least equal to, a threshold;
determines each position at which the ratio is greater than, or at least equal to, the threshold as a common feature point; and
identifies the mobile body's own position using the result of collating the common feature points against the sensor data.

13. A computer program executed by a computer mounted on a signal processing device that generates a map referenced by a mobile body to identify its own position,
the mobile body including:
a motor;
a drive device that controls the motor to move the mobile body;
a sensor that senses a surrounding space and outputs sensor data; and
a storage device that stores data of a plurality of intermediate maps, each of the plurality of intermediate maps being generated from sensor data of the surrounding space sensed by the sensor at intervals of time,
the computer program causing the computer to:
receive the data of the plurality of intermediate maps from the mobile body;
determine a plurality of feature points whose degree of coincidence among the plurality of intermediate maps is greater than a predetermined value;
generate a single map from the plurality of intermediate maps by generating the single map identifiably including the plurality of feature points; and
transmit the generated single map to the mobile body so that it is stored in the storage device.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2019509044A JPWO2018180175A1 (en) | 2017-03-27 | 2018-03-01 | Moving object, signal processing device, and computer program |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-061668 | 2017-03-27 | ||
JP2017061668 | 2017-03-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018180175A1 true WO2018180175A1 (en) | 2018-10-04 |
Family
ID=63676989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/007787 WO2018180175A1 (en) | 2017-03-27 | 2018-03-01 | Mobile body, signal processing device, and computer program |
Country Status (2)
Country | Link |
---|---|
JP (1) | JPWO2018180175A1 (en) |
WO (1) | WO2018180175A1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109724612A (en) * | 2019-01-14 | 2019-05-07 | 浙江大华技术股份有限公司 | A kind of AGV paths planning method and equipment based on topological map |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005326944A (en) * | 2004-05-12 | 2005-11-24 | Hitachi Ltd | Device and method for generating map image by laser measurement |
JP2010277548A (en) * | 2009-06-01 | 2010-12-09 | Hitachi Ltd | Robot management system, robot management terminal, method for managing robot, and program |
JP2012093811A (en) * | 2010-10-25 | 2012-05-17 | Hitachi Ltd | Robot system and map update method |
JP2015111336A (en) * | 2013-12-06 | 2015-06-18 | トヨタ自動車株式会社 | Mobile robot |
WO2015193941A1 (en) * | 2014-06-16 | 2015-12-23 | 株式会社日立製作所 | Map generation system and map generation method |
JP2016045874A (en) * | 2014-08-26 | 2016-04-04 | ソニー株式会社 | Information processor, method for information processing, and program |
- 2018-03-01: WO application PCT/JP2018/007787 filed (WO2018180175A1, active, Application Filing)
- 2018-03-01: JP application JP2019509044A filed (JPWO2018180175A1, active, Pending)
Also Published As
Publication number | Publication date |
---|---|
JPWO2018180175A1 (en) | 2020-02-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10866587B2 (en) | System, method, and computer program for mobile body management | |
US10650270B2 (en) | Methods and systems for simultaneous localization and calibration | |
JP6825712B2 (en) | Mobiles, position estimators, and computer programs | |
US7899618B2 (en) | Optical laser guidance system and method | |
US9239580B2 (en) | Autonomous mobile robot, self position estimation method, environmental map generation method, environmental map generation apparatus, and data structure for environmental map | |
JP6711138B2 (en) | Self-position estimating device and self-position estimating method | |
US20200264616A1 (en) | Location estimation system and mobile body comprising location estimation system | |
US11537140B2 (en) | Mobile body, location estimation device, and computer program | |
JP7138538B2 (en) | Laser scanner calibration method, material handling machine | |
WO2016067640A1 (en) | Autonomous moving device | |
WO2018179960A1 (en) | Mobile body and local position estimation device | |
WO2019054209A1 (en) | Map creation system and map creation device | |
JP7255676B2 (en) | Carrier system, carrier, and control method | |
JP2019079171A (en) | Movable body | |
WO2018180175A1 (en) | Mobile body, signal processing device, and computer program | |
JP2020166702A (en) | Mobile body system, map creation system, route creation program and map creation program | |
JP7396353B2 (en) | Map creation system, signal processing circuit, mobile object and map creation method | |
WO2021049227A1 (en) | Information processing system, information processing device, and information processing program | |
JP2021056764A (en) | Movable body | |
JP6795730B6 (en) | Mobile management system, mobile, travel management device and computer program | |
WO2021220331A1 (en) | Mobile body system | |
JP6687313B1 (en) | Transport system | |
KR20240096070A (en) | SLAM NAVIGATON SYSTEM for Autonomous Forklift Truck based on ROS | |
JP2021149420A (en) | Estimation system and method | |
JP2020107116A (en) | Autonomous mobile body |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 18776353 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2019509044 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
122 | Ep: pct application non-entry in european phase |
Ref document number: 18776353 Country of ref document: EP Kind code of ref document: A1 |