
CN115072626B - Transfer robot, transfer system, and method for generating presentation information - Google Patents

Transfer robot, transfer system, and method for generating presentation information Download PDF

Info

Publication number
CN115072626B
Authority
CN
China
Prior art keywords
machine body
transfer robot
information
target object
visual
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202110272328.9A
Other languages
Chinese (zh)
Other versions
CN115072626A (en)
Inventor
张硕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lingdong Technology Beijing Co Ltd
Original Assignee
Lingdong Technology Beijing Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lingdong Technology Beijing Co Ltd filed Critical Lingdong Technology Beijing Co Ltd
Priority to CN202110272328.9A priority Critical patent/CN115072626B/en
Priority to PCT/CN2021/138720 priority patent/WO2022188497A1/en
Publication of CN115072626A publication Critical patent/CN115072626A/en
Application granted granted Critical
Publication of CN115072626B publication Critical patent/CN115072626B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 - Constructional features or details
    • B66F9/07504 - Accessories, e.g. for towing, charging, locking
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F17/00 - Safety devices, e.g. for limiting or indicating lifting force
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F17/00 - Safety devices, e.g. for limiting or indicating lifting force
    • B66F17/003 - Safety devices, e.g. for limiting or indicating lifting force for fork-lift trucks
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B66 - HOISTING; LIFTING; HAULING
    • B66F - HOISTING, LIFTING, HAULING OR PUSHING, NOT OTHERWISE PROVIDED FOR, e.g. DEVICES WHICH APPLY A LIFTING OR PUSHING FORCE DIRECTLY TO THE SURFACE OF A LOAD
    • B66F9/00 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes
    • B66F9/06 - Devices for lifting or lowering bulky or heavy goods for loading or unloading purposes movable, with their loads, on wheels or the like, e.g. fork-lift trucks
    • B66F9/075 - Constructional features or details
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 - Control of position or course in two dimensions

Landscapes

  • Engineering & Computer Science (AREA)
  • Structural Engineering (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Geology (AREA)
  • Mechanical Engineering (AREA)
  • Transportation (AREA)
  • Civil Engineering (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Manipulator (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

An embodiment of the present application provides a transfer robot, a transfer system, and a method for generating prompt information. The transfer robot includes: a machine body; a sensing device arranged on the machine body for sensing environmental information around the machine body; a controller electrically connected with the sensing device, which acquires the environmental information through the sensing device when the transfer robot is in a state of waiting for loading and outputs a prompt instruction when it determines, based on the environmental information, that a target object is present within a set range around the machine body; and a prompting device electrically connected with the controller, which outputs human-perceivable prompt information according to the prompt instruction. The prompt information reflects the position of the machine body so as to provide a reference for the target object to perform the loading action. The technical solution provided by the embodiments of the present application can reduce the incidence of collisions that damage the transfer robot, and the transfer robot has a simple structure and low cost.

Description

Transfer robot, transfer system, and method for generating presentation information
Technical Field
The application relates to the technical field of robots, in particular to a transfer robot, a transfer system and a prompt information generation method.
Background
A transfer robot is an industrial robot that can perform automated handling work. Handling means holding or supporting an object with a piece of equipment and moving it from one location to another. For example, in a warehouse scenario that combines a transfer robot with a forklift, a driver operates the forklift to lift a pallet supporting an object and load it onto the transfer robot, and the transfer robot then carries the object to a designated location.
However, while the forklift is handling an object, the driver's view is often blocked by the forked load, making it hard to see the relatively low transfer robot (typically about 20-40 cm high). In actual operation there is therefore a risk that the forklift collides with and damages the transfer robot, and the driver may also be unable to judge whether the forklift's current position is suitable for loading the object onto the transfer robot.
Disclosure of Invention
The present application provides a transfer robot, a transfer system, and a method for generating prompt information that solve, or at least partially solve, the above problems.
In one embodiment of the present application, a transfer robot is provided. The transfer robot includes:
a machine body;
a sensing device arranged on the machine body and used for sensing environmental information around the machine body;
a controller electrically connected with the sensing device, used for acquiring the environmental information through the sensing device when the transfer robot is in a state of waiting for loading, and for outputting a prompt instruction when it is determined, based on the environmental information, that a target object is present within a set range around the machine body;
a prompting device electrically connected with the controller, used for outputting human-perceivable prompt information according to the prompt instruction;
wherein the prompt information reflects the position of the machine body so as to provide a reference for the target object to perform the loading action.
In one embodiment of the present application, a transfer system is provided. The system comprises:
the transfer robot described above, which travels autonomously to transfer objects;
a manually driven transport vehicle, which loads objects onto the transfer robot;
wherein, when the transfer robot is in a state of waiting for loading and has sensed the manually driven transport vehicle, it outputs human-perceivable prompt information to guide the driver in driving the manually driven transport vehicle to complete the object loading action.
In an embodiment of the present application, a method for generating prompt information is provided, which is applied to a transfer robot. The method comprises the following steps:
acquiring environmental information around the machine body of the transfer robot when the transfer robot is in a state of waiting for loading;
determining, according to the environmental information, whether a target object is present within a set range around the machine body;
when the target object is determined to be present, controlling a prompting device on the machine body to output human-perceivable prompt information;
wherein the prompt information reflects the position of the transfer robot so as to provide a reference for the target object to perform the loading action.
In the technical solutions provided in the embodiments of the present application, the sensing device on the machine body gives the transfer robot the ability to sense environmental information around the body. When the transfer robot is in a state of waiting for loading, the controller electrically connected with the sensing device can acquire the environmental information through the sensing device and output a prompt instruction when it determines, based on that information, that a target object is present within a set range around the machine body. The prompting device electrically connected with the controller can then output, according to the prompt instruction, human-perceivable prompt information that reflects the position of the machine body, providing a reference for the target object's loading action. This reduces the incidence of collisions that damage the transfer robot, and the solution has a simple structure and low cost.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that other drawings can be obtained from them by a person skilled in the art without inventive effort.
Fig. 1 is a schematic structural diagram of a transfer robot according to an embodiment of the present disclosure;
Fig. 2 is a schematic diagram of a transfer system according to an embodiment of the present disclosure;
fig. 3 is a schematic diagram of a visual signal output unit in a transfer robot according to an embodiment of the present disclosure outputting two light beams projected on the ground;
fig. 4 is a schematic diagram of a visual signal output unit in a transfer robot according to an embodiment of the present disclosure outputting two rows of arrow patterns projected on the ground;
fig. 5 is a flowchart of a method for generating prompt information according to an embodiment of the present application.
Detailed Description
In order to enable those skilled in the art to better understand the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings of those embodiments.
Some of the flows described in the specification, claims, and drawings above include a plurality of operations that occur in a particular order; however, those operations may be performed out of the order in which they appear, or in parallel. Sequence numbers such as 101 and 102 are merely used to distinguish the operations and do not by themselves represent any order of execution. In addition, the flows may include more or fewer operations, which may be performed sequentially or in parallel. It should be noted that the term "and/or" in this application merely describes an association between objects and covers three cases, for example: A and/or B means A alone, A and B together, or B alone; the character "/" in this application generally indicates an "or" relationship between the associated objects. Furthermore, the embodiments described below are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments herein without inventive effort fall within the scope of the present application.
Fig. 1 shows a schematic structural diagram of a transfer robot according to an embodiment of the present application. As shown in fig. 1, the transfer robot includes: a machine body 10, a sensing device 20, a controller (not shown in the figure), and a prompting device 30. Wherein:
The sensing device 20 is disposed on the machine body 10 and senses environmental information around the machine body 10.
The controller is electrically connected with the sensing device 20; it acquires the environmental information through the sensing device 20 when the transfer robot is in a state of waiting for loading, and outputs a prompt instruction when it determines, based on the environmental information, that a target object is present within a set range around the machine body.
The prompting device 30 is electrically connected with the controller and outputs prompt information according to the prompt instruction.
The prompt information reflects the position of the machine body so as to provide a reference for the target object to perform the loading action.
With respect to the traveling direction of the transfer robot, the surroundings of the machine body 10 can be divided into a front side, a rear side, a left side, and a right side; they also include the space above the top of the machine body 10. In practice, the position of the sensing device 20 on the machine body 10 can be designed according to the actual operating requirements. For example, the sensing device 20 may be provided on the side walls and the top of the machine body 10.
For convenience of description, the side walls of the machine body 10 can accordingly be referred to as the front side wall, rear side wall, left side wall, and right side wall. If, for example, the transfer robot stops at a fixed position and remains stationary in a fixed posture while waiting for loading, a sensing device 20 only needs to be arranged on one side wall, such as the front side wall; of course, sensing devices 20 may also be mounted on both the front and rear side walls of the machine body 10. If, instead, the transfer robot moves toward the target object (such as a forklift) to a position convenient for loading while waiting for loading, it needs to acquire the surrounding environmental information during that movement; in this case sensing devices may be provided on the front, rear, left, and right side walls of the machine body 10.
The sensing device 20 provided on the top of the machine body 10 is used when the object lifted by the target object is loaded onto the transfer robot. If the sensing device is a distance sensor, for example, it can sense the distance between the object and the top of the machine body while the loading action is performed.
In the embodiment shown in fig. 1, the sensing device 20 is provided on the front side wall, the rear side wall, the right side wall and the left side wall of the machine body 10.
One or more sensing devices 20 may be provided on the top of the machine body: a sensing device 20 may be arranged at the center of the top of the machine body 10, and/or at the center of at least one edge of the top. For example, as shown in fig. 1, a sensing device 20 is disposed at the central position O of the top of the machine body 10, sensing devices 20 are disposed at the central positions B1 and B2 of two opposite edges of the top, and further sensing devices 20 are disposed at the central positions C1 and C2 of the other two opposite edges. It should be noted that the number of sensing devices on the top and/or side walls of the machine body 10 shown in fig. 1 is merely illustrative and does not represent the actual number.
In some embodiments of the present application, the sensing device 20 is a distance sensor for sensing environmental information around the machine body 10, in particular the distance of a target object around the machine body 10 relative to the machine body 10. In some embodiments, a distance sensor measures the distance of a target object relative to the machine body 10 by transmitting a signal and receiving its reflection. The distance sensor may be any sensor capable of measuring the distance to an object, such as, but not limited to, an acoustic wave sensor, an infrared sensor, or a lidar.
Besides a distance sensor, the sensing device 20 may also be a vision sensor: image information is acquired by the vision sensor, and the distance and/or orientation of the target object relative to the machine body 10 is then obtained through image analysis and calculation.
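As an illustration only (not part of the patent disclosure), the following Python sketch shows how such raw sensing data could be turned into a distance and a bearing; the function names, the sound-speed constant, and the linear pixel-to-angle mapping are assumptions of this sketch rather than requirements of the embodiments.

SPEED_OF_SOUND_M_S = 343.0  # ultrasonic example; a lidar would use the speed of light instead

def distance_from_echo(round_trip_time_s: float) -> float:
    """Distance to the reflecting target: half of the round-trip path length."""
    return SPEED_OF_SOUND_M_S * round_trip_time_s / 2.0

def bearing_from_image(target_pixel_x: float, image_width_px: int,
                       horizontal_fov_deg: float) -> float:
    """Rough horizontal bearing of the target relative to the camera axis,
    assuming a simple linear mapping of pixels across the field of view."""
    normalized_offset = (target_pixel_x - image_width_px / 2.0) / image_width_px
    return normalized_offset * horizontal_fov_deg

# Example: a 12 ms ultrasonic echo corresponds to roughly 2.06 m.
print(round(distance_from_echo(0.012), 2))

An ultrasonic time-of-flight example is used here; the same structure would apply to an infrared or laser ranging sensor with a different propagation speed.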
The controller in this embodiment may be a central processing unit (CPU), a single-chip microcomputer, or another unit with data processing and computing capabilities, and may be disposed at any position on the machine body 10 according to the actual situation, which is not limited here. The controller is electrically connected with the sensing device 20 and, when the transfer robot is in a state of waiting for loading, can acquire the environmental information around the machine body 10 through the sensing device 20, so as to output a corresponding prompt instruction when it determines, based on that information, that a target object is present within the set range around the machine body 10. In some embodiments, the target object may be a manually driven transport vehicle, such as a forklift, that loads objects onto the transfer robot; the state of waiting for loading may be a state in which the transfer robot waits for the target object that is to load an object onto it, or a state in which the transfer robot moves closer to that target object, which is not limited here.
After the controller outputs the prompt instruction, the prompting device 30, which is electrically connected with the controller, can output human-perceivable prompt information according to the prompt instruction, such as prompt speech or visual information. The visual information may include, but is not limited to, at least one of: visible light, a visible light curtain, and a visible projected pattern.
The prompt information can reflect the position of the machine body 10 so as to provide a reference for the target object to perform the loading action. The loading action may include the target object moving toward the machine body 10 to a suitable loading position and the target object loading an object onto the machine body 10. Besides the position of the machine body 10, the prompt information can also reflect the posture of the machine body 10.
In the transfer robot provided by this embodiment, the sensing device on the machine body gives the robot the ability to sense environmental information around the body. When the transfer robot is in a state of waiting for loading, the controller electrically connected with the sensing device can acquire the environmental information through the sensing device and output a prompt instruction when it determines, based on that information, that a target object is present within a set range around the machine body. The prompting device electrically connected with the controller can then output, according to the prompt instruction, human-perceivable prompt information that reflects the position of the machine body, providing a reference for the target object's loading action. The transfer robot thus has a position prompting function, which reduces the incidence of collisions that damage the robot, and it has a simple structure and low cost.
Further, with continued reference to fig. 1, the prompting device 30 in this embodiment includes a speaker (not shown) and/or a visual signal output unit (not explicitly shown). There may be one or more speakers, and their positions on the machine body 10 can be set flexibly according to the actual situation; neither the number of speakers nor their positions on the machine body is limited here. The visual signal output unit is used for outputting visual information, which may include at least one of: visible light (such as the light 301 indicated by thick black lines: the upward light 301 in fig. 2 and the light 301 projected on the ground in fig. 3), a visible light curtain (not shown in the drawings), and a visible projected pattern (such as the arrow pattern 302 shown in fig. 4).
Accordingly, in some embodiments, the visual signal output unit may be, but is not limited to, one or a combination of a laser lamp, a projection device, and a visible light curtain emitter. In a specific implementation, if the visual signal output unit is a laser lamp, the visual information it outputs is visible laser light, which may take the form of a line, a dot, or any other shape, which is not limited here; if the visual signal output unit is a projection device, such as a projector, the visual information it outputs is a visible projected pattern, which may be, but is not limited to, an arrow-type pattern; if the visual signal output unit is a visible light curtain emitter, the visual information it outputs is a visible light curtain, i.e. a curtain composed of a plurality of parallel visible rays (such as visible infrared rays).
In this embodiment, the visual signal output unit may be provided on the top and/or side walls of the machine body 10. Specifically, in one possible technical solution, at least one visual signal output unit is disposed on the top of the machine body 10 to output first visual information in an upward direction; the first visual information may guide the target object in loading an object onto the machine body 10, or guide the target object as it moves closer to the machine body 10 toward a suitable loading position. And/or, at least one visual signal output unit is arranged on a side wall or a top edge of the machine body 10 to output second visual information projected on the ground; the second visual information can guide the target object as it moves closer to the machine body 10 toward a suitable loading position.
The transfer robot may be provided both with a visual signal output unit that outputs the upward light 301 from the top of the machine body shown in fig. 2 and with a visual signal output unit that projects the light 301 shown in fig. 3 or the arrow pattern 302 shown in fig. 4 onto the ground. The upward light 301 can prompt the target object to perform the action of loading an object onto the transfer robot, while the light or pattern projected on the ground can indicate the direction in which the target object should travel toward the transfer robot.
In a specific implementation, a visual signal output unit may be disposed at the central position of the top of the machine body 10, for example at the central position O shown in fig. 1. And/or, a visual signal output unit may be disposed at the central position of at least one edge of the top of the machine body 10, for example at the central positions B1 and B2 of two opposite edges of the top shown in fig. 1 and at the central positions C1 and C2 of the other two opposite edges. And/or, visual signal output units may be disposed at at least one corner of the top of the machine body 10, for example one at each of the positions D1, D2, D3, and D4 corresponding to the four corners of the top shown in fig. 1.
It should be noted that the number of visual signal output units provided on the machine body 10 shown in fig. 1 is merely illustrative and does not represent the actual number.
As in the examples of fig. 3 and fig. 4, the visual signal output units disposed at the central positions of two opposite edges of the top of the machine body 10 project second visual information (such as the two light rays 301 in fig. 3 or the two rows of sequentially arranged arrow patterns 302 in fig. 4) on the ground on the corresponding two sides of the machine body 10, axisymmetrically about the machine body 10, and the width D between the two pieces of projected second visual information is larger than the width of the target object.
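Purely as an illustrative aside (not part of the claimed subject matter), the geometry behind this arrangement can be sketched as follows: an emitter on a top edge of the machine body is tilted outward so that its line lands half the desired lane width from the body's centerline, and the lane width D is chosen larger than the target object's width. The function name and the example numbers below are assumptions of this sketch.

import math

def emitter_tilt_deg(body_width_m: float, mount_height_m: float,
                     lane_width_m: float) -> float:
    """Outward tilt from vertical for a top-edge emitter so that its projected
    line lands lane_width_m / 2 from the centerline of the machine body."""
    lateral_offset = lane_width_m / 2.0 - body_width_m / 2.0
    return math.degrees(math.atan2(lateral_offset, mount_height_m))

# Example: a 0.8 m wide body with emitters 0.35 m above the ground, and a
# 1.6 m lane so that a 1.2 m wide forklift fits between the two ground lines.
print(round(emitter_tilt_deg(0.8, 0.35, 1.6), 1))  # about 48.8 degrees outward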
Based on the foregoing, in one implementable technical solution, the controller in this embodiment is configured to determine the distance and/or orientation of the target object relative to the machine body 10 according to the environmental information, and to output the corresponding prompt instruction according to that distance and/or orientation, so as to control the prompting device 30 to output at least one of the following pieces of prompt information as the distance and/or orientation changes: alert tones of different frequencies and/or volumes, prompt speech with different content, visible light of different brightness and/or colors, visible light curtains of different brightness and/or colors, and visible projected patterns of different brightness and/or shapes, thereby providing a reference for the target object's loading action.
For example, if the visual signal output unit in the prompting device 30 is a laser lamp: when the controller determines that the target object is far from the machine body 10, it may control the speaker in the prompting device 30 to periodically output a low-volume alert tone and/or control the visual signal output unit to output low-brightness, light-colored light; when the controller determines that the target object is relatively close to the machine body 10, it may control the speaker to output an alert tone of progressively higher frequency and/or volume (for example, similar to a reversing-radar alarm) and/or control the visual signal output unit to output high-brightness, dark-colored light; and when the controller determines that the distance of the target object from the machine body 10 has reached a set distance suitable for loading an object onto the machine body 10, it may control the speaker to output an alert such as a long continuous tone, or a corresponding voice announcement, and/or control the visual signal output unit on the top of the machine body to output high-brightness, dark-colored light, and so on.
The following describes in detail, with reference to a specific application scenario, how the prompt information provides a reference for the target object's loading action.
For example, assume the target object is a forklift. As shown in fig. 2 and fig. 3, after the forklift lifts an object, the driver drives it toward the position of the transfer robot to reach a suitable loading position. While the forklift approaches, the sensing devices 20 on the side walls of the machine body 10 acquire the environmental information around the machine body 10 and send it to the controller; the controller determines the distance and/or orientation of the forklift relative to the machine body 10 from that information, and, when it determines that the forklift is not yet at the loading position, controls the speaker to output different alert tones and/or the visual signal output units on the top and/or side walls of the machine body 10 to output different visual information as the distance and/or orientation changes, so as to guide the forklift to the loading position. Taking the speaker as an example: if the distance between the forklift and the machine body 10 is greater than a first distance (e.g. 10 m or 5 m), the controller may output a first alert tone with low frequency and low volume; on hearing it, the driver knows that the forklift is still relatively far from the machine body 10 and can drive forward at a relatively high speed. If the distance is smaller than the first distance but greater than a second distance (e.g. 2 m or 1 m), the controller may output a second alert tone of progressively higher frequency and/or volume; the driver then knows the forklift is relatively close and, to avoid colliding with the transfer robot, drives forward at a relatively slow speed. If the distance reaches a third distance (e.g. 0.5 m), the controller may output a third alert tone that is loud and continuous; the driver then knows that the forklift has reached a position suitable for loading the object onto the machine body 10, stops moving forward, and begins the action of loading the object onto the machine body 10.
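A minimal sketch of this three-band guidance logic is given below for illustration; it is not the patent's implementation, and the concrete thresholds, tone frequencies, and volumes are assumptions chosen only to mirror the example above.

from dataclasses import dataclass

@dataclass
class AlertTone:
    frequency_hz: float
    volume: float      # 0.0 .. 1.0
    continuous: bool   # True for the long "stop and load" tone

FIRST_DISTANCE_M = 5.0   # farther than this: low, quiet tone, drive quickly
THIRD_DISTANCE_M = 0.5   # at or inside this: loading position reached

def approach_alert(distance_m: float) -> AlertTone:
    if distance_m > FIRST_DISTANCE_M:
        return AlertTone(frequency_hz=400.0, volume=0.2, continuous=False)
    if distance_m > THIRD_DISTANCE_M:
        # frequency and volume rise as the forklift closes in, reversing-radar style
        closeness = (FIRST_DISTANCE_M - distance_m) / (FIRST_DISTANCE_M - THIRD_DISTANCE_M)
        return AlertTone(frequency_hz=400.0 + 800.0 * closeness,
                         volume=0.2 + 0.6 * closeness, continuous=False)
    # loading position reached: loud, long tone telling the driver to stop and load
    return AlertTone(frequency_hz=1200.0, volume=1.0, continuous=True)

print(approach_alert(7.0), approach_alert(2.5), approach_alert(0.4), sep="\n")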
When the driver then operates the forklift to load the object onto the machine body 10, the controller can acquire the environmental information above the top of the machine body 10 through the sensing device on the top, determine the distance between the object and the machine body 10 based on that information, and output a corresponding prompt instruction to control the speaker to output a corresponding alert tone, thereby guiding the driver in lowering the object onto the machine body 10; the specific guidance process is analogous to the above-described process of guiding the forklift to a suitable loading position and is not repeated here. Where the controller controls the speaker to output prompt speech rather than tones according to the distance and orientation, the speech can carry different content, for example announcements along the lines of 'about 1 m remaining' or 'offset slightly to the left'.
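By way of a hedged illustration only, the vertical guidance during loading could look like the sketch below; read_top_distance and say stand in for the top sensor reading and the speaker output, and the clearance thresholds and phrases are assumptions of this sketch.

SEATED_CLEARANCE_M = 0.03     # treat the load as seated once the gap is this small
SLOW_DOWN_CLEARANCE_M = 0.20  # below this, ask the driver to lower slowly

def loading_prompt(clearance_m: float) -> str:
    """Prompt speech for the current gap between the load and the body top."""
    if clearance_m <= SEATED_CLEARANCE_M:
        return "Load seated, loading complete"
    if clearance_m <= SLOW_DOWN_CLEARANCE_M:
        return f"Lower slowly, about {clearance_m:.2f} m remaining"
    return f"About {clearance_m:.2f} m above the body"

def guide_loading(read_top_distance, say):
    """Poll the top sensor and announce prompts until the load is seated."""
    while True:
        clearance = read_top_distance()
        say(loading_prompt(clearance))
        if clearance <= SEATED_CLEARANCE_M:
            break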
Similarly, the visual signal output unit can be controlled to output corresponding visual information as the distance and/or orientation changes, so as to guide the driver in moving the forklift to the loading position and/or in loading the object onto the machine body; the guidance principle is the same as in the alert-tone based process above and is not described again here.
In summary, the transfer robot provided by this embodiment has a position prompting function implemented through hearing and/or vision, which can effectively reduce the incidence of collisions that damage the transfer robot, and the robot has a simple structure and low cost.
Fig. 5 is a flowchart of a method for generating prompt information according to an embodiment of the present application. The method is applied to a transfer robot. As shown in fig. 5, the prompt information generation method includes:
101. acquiring environmental information around the machine body of the transfer robot when the transfer robot is in a state of waiting for loading;
102. determining, according to the environmental information, whether a target object is present within a set range around the machine body;
103. when the target object is determined to be present, controlling a prompting device on the machine body to output human-perceivable prompt information;
wherein the prompt information reflects the position of the transfer robot so as to provide a reference for the target object to perform the loading action.
In step 101 above, the specific structure and functions of the transfer robot can be found in the description relating to fig. 1 and are not repeated here. The target object may be a manually driven transport vehicle 40 (shown in fig. 2, 3, and 4), such as a forklift, that loads objects onto the transfer robot. In practice, loading an object onto the transfer robot generally involves one of two cases: either the transfer robot is stationary and the target object, after lifting the object, moves toward the position of the transfer robot and loads the object onto it once a suitable position is reached; or the target object lifts the object and waits while the transfer robot moves toward and approaches it, loading the object onto the transfer robot once the robot has reached a suitable position. Accordingly, the state of waiting for loading may be a state in which the transfer robot waits for the target object that is to load the object onto it, or a state in which the transfer robot moves toward that target object, which is not limited here. While the transfer robot is in this state, environmental information around its machine body can be acquired through the sensing device provided on the machine body.
In step 102, the set range around the machine body is related to the sensing range of the sensing device disposed on the machine body.
In step 103, "controlling the prompting device on the machine body to output human-perceivable prompt information" may specifically include the following steps:
1031. determining the distance and/or orientation of the target object relative to the machine body according to the environmental information;
1032. controlling the prompting device to output corresponding human-perceivable prompt information according to the distance and/or orientation.
In a specific implementation, the prompting device includes a visual signal output unit; accordingly, step 1032, "controlling the prompting device to output corresponding human-perceivable prompt information according to the distance and/or orientation", may be implemented through the following steps:
A11. determining whether the target object is at the loading position according to the distance and/or orientation;
A12. when the target object is not at the loading position, controlling the visual signal output units on the top and/or side walls of the machine body to output visual information so as to guide the target object to move to the loading position;
A13. when the target object is at the loading position, controlling the visual signal output unit on the top of the machine body to output visual information so as to guide the target object to load an object onto the machine body.
In a specific implementation, the visual signal output unit may be, but is not limited to, a laser lamp, a projection device, or a visible light curtain emitter. Accordingly, the visual information may be, but is not limited to, at least one of: visible light, a visible light curtain, and a visible projected pattern, where the visible light curtain is composed of a plurality of parallel visible rays (such as visible infrared rays). When the target object is not at the loading position, the visual signal output units on the top and/or side walls of the machine body can be controlled to output at least one of the following as the distance and/or orientation changes: visible light of different brightness and/or colors, visible light curtains of different brightness and/or colors, and visible projected patterns of different brightness and/or shapes, so as to guide the target object to the loading position. Likewise, when the target object is at the loading position, the visual signal output unit on the top of the machine body can be controlled to output visual information in the manner described above to guide the target object in loading an object onto the machine body.
In another specific implementation, the prompting device further includes a speaker. In this case the implementation of step 1032, "controlling the prompting device to output corresponding human-perceivable prompt information according to the distance and/or orientation", can refer to the steps described above for the visual signal output unit and is not repeated here. The difference is that the speaker outputs alert tones: when the target object is not at the loading position, the speaker can be controlled to output, according to the distance and/or orientation, at least one of alert tones of different frequencies and/or volumes and prompt speech with different content.
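For illustration, the overall flow of steps 101-103 and A11-A13 might be structured as in the sketch below; the interfaces for sensing, the loading-position test, and the output devices are passed in as plain callables whose names are assumptions of this sketch rather than part of the method.

def generate_prompts(robot_state, sense_environment, locate_target,
                     at_loading_position, top_lights, side_lights, speaker):
    # 101: act only while the transfer robot is waiting to be loaded
    if robot_state != "waiting_for_loading":
        return

    # 102: is a target object present within the set range around the body?
    environment = sense_environment()
    target = locate_target(environment)       # None, or a (distance, orientation) pair
    if target is None:
        return

    # 103 with A11-A13: pick the prompt from the target's distance and orientation
    distance, orientation = target
    if at_loading_position(distance, orientation):
        # A13: the top-mounted unit (and the speaker) guide the loading action itself
        top_lights(distance, orientation)
        speaker(distance, orientation)
    else:
        # A12: the top and/or side units (and the speaker) guide the approach
        side_lights(distance, orientation)
        top_lights(distance, orientation)
        speaker(distance, orientation)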
According to the technical solution provided by this embodiment, when the transfer robot is in a state of waiting for loading and it is determined, based on the acquired environmental information around the machine body, that a target object is present within the set range around the body, the prompting device on the body is controlled to output human-perceivable prompt information that reflects the position of the transfer robot, thereby providing a reference for the target object's loading action. This effectively prevents the target object from colliding with and damaging the transfer robot, and the solution is simple and low-cost.
What needs to be explained here is: the main execution body of each step in the method provided in this embodiment may be a controller on the transfer robot. In addition, details of each step that are not described in detail in the foregoing embodiments may be referred to as corresponding details in each embodiment, which are not described herein. In addition, the method provided in this embodiment may further include other part or all of the steps in the foregoing embodiments, and specific reference may be made to the corresponding content of the foregoing embodiments, which is not repeated herein.
Yet another embodiment of the present application provides a transfer system. Referring to fig. 2, 3, and 4, the transfer system comprises:
a transfer robot (shown in fig. 2 with its machine body 10), which travels autonomously to transfer objects;
a manually driven transport vehicle 40, which loads objects onto the transfer robot;
wherein, when the transfer robot is in a state of waiting for loading and has sensed the manually driven transport vehicle 40, it outputs human-perceivable prompt information to guide the driver in driving the manually driven transport vehicle 40 to complete the object loading action.
The specific structure and functions of the transfer robot can be found in the description relating to fig. 1 and are not repeated here. The manually driven transport vehicle 40 may be a forklift, but may also be another type of transport vehicle, which is not limited here.
It should be noted that details of the transfer system provided in this embodiment can be found in the corresponding content of the foregoing embodiments and are not repeated here. In addition, the transfer system provided in this embodiment may further include some or all of the other features described in the foregoing embodiments, for which reference is made to the corresponding content above.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present application, not to limit it. Although the present application has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art will understand that the technical solutions described in the foregoing embodiments can still be modified, or some of their technical features replaced by equivalents, and that such modifications and substitutions do not depart from the spirit and scope of the corresponding technical solutions.

Claims (15)

1. A transfer robot, comprising:
a machine body;
a sensing device arranged on the machine body and used for sensing environmental information around the machine body;
a controller electrically connected with the sensing device, used for acquiring the environmental information through the sensing device when the transfer robot is in a state of waiting for loading, and for outputting a prompt instruction when it is determined, based on the environmental information, that a target object is present within a set range around the machine body;
a prompting device electrically connected with the controller, used for outputting human-perceivable prompt information according to the prompt instruction;
wherein the prompt information reflects the position and the posture of the machine body so as to provide a reference for guiding the corresponding personnel to control the target object to perform the loading action.
2. The transfer robot according to claim 1, wherein the prompting device includes a speaker and/or a visual signal output unit; wherein:
the speaker is used for outputting alert tones;
the visual signal output unit is used for outputting visual information;
the visual information includes at least one of: visible light, a visible light curtain, and a visible projected pattern.
3. The transfer robot according to claim 2, wherein,
the controller is used for determining the distance and/or orientation of the target object relative to the machine body according to the environmental information, and for outputting the corresponding prompt instruction according to the distance and/or orientation, so as to control the prompting device to output at least one of the following as the distance and/or orientation changes:
alert tones of different frequencies and/or volumes;
prompt speech with different content;
visible light of different brightness and/or colors;
visible light curtains of different brightness and/or colors;
visible projected patterns of different brightness and/or shapes.
4. The transfer robot according to claim 2, wherein,
at least one visual signal output unit is arranged at the top of the machine body so as to output first visual information in an upward direction; and/or
at least one visual signal output unit is arranged on a side wall or a top edge of the machine body so as to output second visual information projected on the ground.
5. The transfer robot according to claim 4, wherein,
the visual signal output unit is arranged at the center of the top of the machine body; and/or
the visual signal output unit is arranged at the central position of at least one edge of the top of the machine body; and/or
the visual signal output unit is arranged at the position of at least one corner of the top of the machine body.
6. The transfer robot according to claim 5, wherein the visual signal output units provided at the central positions of two opposite edges of the top of the machine body project second visual information on the ground on the respective sides of the machine body, axisymmetrically about the machine body, and the width between the two pieces of projected second visual information is larger than the width of the target object.
7. The transfer robot according to any one of claims 2 to 6, wherein the visual signal output unit is a laser lamp, a projection device, or a visible light curtain emitter.
8. The transfer robot according to any one of claims 1 to 6, characterized in that the top and side walls of the machine body are provided with the sensing device.
9. The transfer robot according to claim 8, wherein the side walls of the body include a front side wall, a rear side wall, a right side wall, and a left side wall in a traveling direction of the transfer robot;
the front side wall of the machine body is provided with the sensing device; or alternatively
The sensing devices are arranged on the front side wall and the rear side wall of the machine body; or alternatively
The sensing devices are arranged on the front side wall, the rear side wall, the left side wall and the right side wall of the machine body.
10. The transfer robot according to claim 8, wherein the sensing device is provided at a center position of the top of the body; and/or
the central position of at least one edge of the top of the machine body is provided with the sensing device.
11. The transfer robot of claim 1, wherein the sensing device is a distance sensor.
12. A transfer system, comprising:
the transfer robot according to any one of claims 1 to 11, for autonomous travel to transfer an object;
a manually driven transport vehicle for loading the transfer robot with objects;
wherein, when the transfer robot is in a state of waiting for loading and has sensed the manually driven transport vehicle, it outputs human-perceivable prompt information so as to guide a driver to drive the manually driven transport vehicle to complete the object loading action.
13. A method for generating prompt information, applied to a transfer robot, characterized by comprising the following steps:
acquiring environmental information around the machine body of the transfer robot when the transfer robot is in a state of waiting for loading;
determining, according to the environmental information, whether a target object is present within a set range around the machine body;
when the target object is determined to be present, controlling a prompting device on the machine body to output human-perceivable prompt information;
wherein the prompt information reflects the position of the transfer robot and the posture of the machine body so as to provide a reference for the corresponding personnel to control the target object to perform the loading action.
14. The method according to claim 13, wherein controlling the prompting device on the machine body to output human-perceivable prompt information comprises:
determining the distance and/or orientation of the target object relative to the machine body according to the environmental information;
controlling the prompting device to output corresponding human-perceivable prompt information according to the distance and/or orientation.
15. The method of claim 14, wherein the prompting device comprises a visual signal output unit; and
controlling the prompting device to output corresponding human-perceivable prompt information according to the distance and/or orientation comprises:
determining whether the target object is at the loading position according to the distance and/or orientation;
when the target object is not at the loading position, controlling the visual signal output unit on the top and/or side walls of the machine body to output visual information so as to guide the target object to move to the loading position;
when the target object is at the loading position, controlling the visual signal output unit on the top of the machine body to output visual information so as to guide the target object to load an object onto the machine body.
CN202110272328.9A 2021-03-12 2021-03-12 Transfer robot, transfer system, and method for generating presentation information Active CN115072626B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202110272328.9A CN115072626B (en) 2021-03-12 2021-03-12 Transfer robot, transfer system, and method for generating presentation information
PCT/CN2021/138720 WO2022188497A1 (en) 2021-03-12 2021-12-16 Transfer robot, transfer system, and prompt information generation method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110272328.9A CN115072626B (en) 2021-03-12 2021-03-12 Transfer robot, transfer system, and method for generating presentation information

Publications (2)

Publication Number Publication Date
CN115072626A CN115072626A (en) 2022-09-20
CN115072626B true CN115072626B (en) 2023-07-18

Family

ID=83226306

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110272328.9A Active CN115072626B (en) 2021-03-12 2021-03-12 Transfer robot, transfer system, and method for generating presentation information

Country Status (2)

Country Link
CN (1) CN115072626B (en)
WO (1) WO2022188497A1 (en)

Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10236476A (en) * 1996-12-27 1998-09-08 Yoshiko Fujisawa Pallet, its production, forklift, load-carrying system using them, and partition plate used in the system
JP2001063446A (en) * 1999-08-27 2001-03-13 Tokyu Car Corp Pallet conveying vehicle
JP2006111415A (en) * 2004-10-15 2006-04-27 Toyota Industries Corp Location indicator, and location management system
KR20130099596A (en) * 2012-02-29 2013-09-06 부산대학교 산학협력단 Apparatus and method of unmanned forklift for autonomous loading and unloading
JP2015157682A (en) * 2014-02-24 2015-09-03 株式会社岡村製作所 Transportation dolly
CN206536514U (en) * 2017-01-16 2017-10-03 山东华力机电有限公司 Transfer robot safety guard
CN108502434A (en) * 2017-02-28 2018-09-07 广东利保美投资有限公司 Pallet machine people
CN109129389A (en) * 2017-06-27 2019-01-04 京东方科技集团股份有限公司 A kind of robot and its joining method, robot splicing system
CN208715752U (en) * 2018-07-20 2019-04-09 中南林业科技大学 A kind of retrospective bamboo-wood composite tray
KR102018765B1 (en) * 2018-06-20 2019-09-04 회명정보통신(주) Device for detecting position of fork for forklift truck
CN111361917A (en) * 2020-03-16 2020-07-03 福建通力达实业有限公司 Method and system for measuring, calculating and correcting position of mobile shelf
CN111533051A (en) * 2020-05-08 2020-08-14 三一机器人科技有限公司 Tray pose detection method and device, forklift and freight system

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003281653A (en) * 2002-03-26 2003-10-03 Victor Co Of Japan Ltd Autonomous mobile robot
JP2006048308A (en) * 2004-08-03 2006-02-16 Funai Electric Co Ltd Self-propelled cleaner
US20150202770A1 (en) * 2014-01-17 2015-07-23 Anthony Patron Sidewalk messaging of an autonomous robot
DE102016012313A1 (en) * 2016-10-15 2018-04-19 Man Truck & Bus Ag Device for assisting a driver of a vehicle with a projection device
CN206219148U (en) * 2016-11-25 2017-06-06 广州顶牛汽车用品有限公司 Fork truck intelligent integrated radar
WO2018150998A1 (en) * 2017-02-17 2018-08-23 北陽電機株式会社 Object capturing device, capture target, and object capturing system
JP2018139020A (en) * 2017-02-24 2018-09-06 シーオス株式会社 Autonomous mobile device and reflecting member
JP6940969B2 (en) * 2017-03-29 2021-09-29 パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America Vehicle control device, vehicle control method and program
JP2018169894A (en) * 2017-03-30 2018-11-01 村田機械株式会社 Singular part detection device, autonomous mobile device, and singular part detection method
CN108303972B (en) * 2017-10-31 2020-01-17 腾讯科技(深圳)有限公司 Interaction method and device of mobile robot
CN208796109U (en) * 2018-09-21 2019-04-26 东莞市开胜电子有限公司 A kind of back carried automated guided vehicle of multisensor perception
CN109844674B (en) * 2018-10-15 2023-02-03 灵动科技(北京)有限公司 Logistics robot with controllable camera and indicator and operation method
CN209980437U (en) * 2019-01-29 2020-01-21 浙江瑞华康源科技有限公司 Arrival reminding device
KR102508073B1 (en) * 2019-03-25 2023-03-08 엘지전자 주식회사 A moving-robot and control method thereof
CN110045739A (en) * 2019-05-10 2019-07-23 湖北汽车工业学院 A kind of intelligent storage material robot, control system and control method
CN110860057A (en) * 2019-11-18 2020-03-06 燕山大学 Fire-fighting reconnaissance robot and reconnaissance method
CN110844496A (en) * 2019-11-25 2020-02-28 威海职业学院 Intelligent electromechanical automatic feeding control system and method
CN112171663A (en) * 2020-09-03 2021-01-05 上海姜歌机器人有限公司 Robot state prompting system, method and device and electronic equipment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Structural optimization design of an oilfield heavy-duty stacker based on ANSYS; Liu Wenbo; Zhou Yu; Mechanical Engineer (Issue 12); 175-176 *

Also Published As

Publication number Publication date
WO2022188497A1 (en) 2022-09-15
CN115072626A (en) 2022-09-20

Similar Documents

Publication Publication Date Title
EP3382489B1 (en) Tour guide robot and moving area calibration method, computer readable storage medium
JP6883898B1 (en) Robot status notification system, robot status notification method, robot status notification device, electronic device and storage medium
US11358844B2 (en) Industrial vehicle remote operation system, industrial vehicle, computer-readable storage medium storing industrial vehicle remote operation program, and industrial vehicle remote operation method
EP3557361B1 (en) Charging station identifying method and device
US20170212517A1 (en) Vehicle positioning and object avoidance
WO2019233315A1 (en) Method and apparatus for controlling automated guided vehicle, and storage medium
US11544937B2 (en) Control device, control method, and computer readable medium
US11247669B2 (en) Method and system for collision avoidance in one hazardous area of a goods logistics facility
JP7300646B2 (en) Conveyor
US20210319701A1 (en) Information providing device for vehicle, and vehicle
JP2014002739A (en) System and method for guiding mobile device
CN115072626B (en) Transfer robot, transfer system, and method for generating presentation information
JP2014157051A (en) Position detection device
US20190050959A1 (en) Machine surround view system and method for generating 3-dimensional composite surround view using same
CA3230002A1 (en) Conveyor belt thickness measurement systems and methods for detecting changes in conveyor belt thicknesses
JP2021033701A (en) Work support system
KR20170115188A (en) Transport Robot For Industry Place
KR102666772B1 (en) The around monitoring apparatus ofo the image base
JPH07165387A (en) Collision preventing device for moving body
Gao et al. A high-speed color-based object detection algorithm for quayside crane operator assistance system
JP6556468B2 (en) Parking equipment
CN116654842B (en) Conveying equipment, control method and device of conveying equipment and storage medium
JP2006111415A (en) Location indicator, and location management system
JPH10315843A (en) Cargo collapse monitoring device
EP4372644A1 (en) Loading operation monitoring apparatus and method of using the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant