CN107272704A - Intelligent vehicle and robot outdoor transport method integrating UAV guidance - Google Patents
Intelligent vehicle and robot outdoor transport method integrating UAV guidance
- Publication number
- CN107272704A (application CN201710643174.3A)
- Authority
- CN
- China
- Prior art keywords
- UAV
- transport robot
- Kinect
- robot
- intrusion
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0251—using optical position detecting means using a video camera in combination with image processing means extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
- G05D1/0214—with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
- G05D1/0276—using signals provided by a source external to the vehicle
- G05D1/0278—using satellite positioning signals, e.g. GPS
Abstract
The invention discloses an outdoor transport method for an intelligent vehicle and robot that integrates UAV guidance, comprising the following steps. Step 1: send a transport task instruction to a designated transport robot and its matched UAV. Step 2: the UAV flies by GPS navigation to the building entrance corresponding to the transport task's start point. Step 3: the transport robot recognizes and matches the UAV. Step 4: after a successful match, the UAV and the transport robot hold a stable motion formation, remaining relatively stationary. Step 5: judge whether the motion formation is stable; if so, the UAV guides the transport robot forward. Step 6: on reaching the end point, the formation lock is released and the transport robot proceeds to the transport task's final position by reading the building locator markers, completing the task. Using the UAV as a standalone navigation module keeps cost low and gives high compatibility with transport robots; constructing a no-intrusion zone between the UAV and the transport robot enhances the safety of the guidance process.
Description
Technical field
The invention belongs to the field of robot control, and in particular relates to an outdoor transport method for an intelligent vehicle and robot that integrates UAV guidance.
Background technology
Robots now have a history of more than 50 years and intervene ever more widely in industrial production, medical services, and other aspects of social life. Most current robots are industrial robots; compared with them, transport robots are low-cost, simple to control, and compact, and can complete transport tasks flexibly. In recent years, therefore, more and more transport robots have been used for transport tasks in complex indoor environments. Practice has proven that transport robots are feasible and efficient at providing precise multi-floor transport services. They can save staff time on repetitive tasks, ensure the safety of transported goods, reduce training demands, reduce errors, and improve productivity and experimental precision.
To give transport robots a broader prospect, they must move not only within buildings but also between them. Proposing a reasonable, practical cross-building transport method would allow transport robots to be used more efficiently.
Chinese patent CN104181926A discloses a robot navigation control method based on road-surface markers, comprising the following steps: 1. periodically acquire the road-surface image in front of the transport robot; 2. each time an image is acquired, detect whether it contains a road-surface marker; if it does, compute the transport robot's position and orientation in the coordinate system established from the image; if not, compute them by dead reckoning; 3. the transport robot controls its travel direction and speed according to the computed position and orientation.
That patent has the following problems: 1. in a complex environment the road-surface markers may be occluded, or rendered unrecognizable by rain and snow, greatly reducing the navigation's practicality; 2. without real-time obstacle avoidance, the transport robot cannot respond to emergencies and risks collision damage.
Content of the invention
The invention provides an outdoor transport method for an intelligent vehicle and robot that integrates UAV guidance, aiming to overcome the above problems in the prior art. Using the UAV as a standalone navigation module takes full advantage of the UAV's mobility and flexibility: it can effectively detect obstacle regions and re-plan the navigation route, ensuring the safety of the guidance process.
A cross-building transport method for an intelligent vehicle and robot integrating a UAV comprises the following steps:
Step 1: send a transport task instruction to a designated transport robot and its matched UAV.
The transport robot and its matched UAV communicate for matching using a light-color code protocol; each robot-UAV pair has its own unique light-color code.
For the mutual identification of UAV and transport robot, locking can be completed using a light-color code. Specifically: one group of colored lamps is mounted directly in front of the transport robot and another at the tail of the UAV; each group consists of K tri-color lamps, so 3^K light-color combinations can be formed. A UAV and a transport robot in the same system must use the same light-color code; they match each other through their image sensors and complete identification. The scheme has high recognition efficiency and is easy to implement.
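As a rough illustration of the light-color code scheme described above, the following Python sketch enumerates the 3^K codes and checks a detected lamp sequence against a stored one. The color labels, code length, and exact-equality comparison rule are assumptions for illustration, not details taken from the patent:

```python
from itertools import product

PRIMARIES = ("R", "G", "B")  # the three primary lamp colors

def all_codes(k):
    """Enumerate every light-color code for K tri-color lamps: 3**K in total."""
    return list(product(PRIMARIES, repeat=k))

def is_match(detected, stored):
    """A UAV and a robot pair up only when the detected lamp colors
    equal the robot's pre-stored code exactly."""
    return tuple(detected) == tuple(stored)

codes = all_codes(3)
assert len(codes) == 3 ** 3  # 27 distinct codes for K = 3
assert is_match(["R", "G", "B"], ("R", "G", "B"))
assert not is_match(["R", "G", "R"], ("R", "G", "B"))
```

In practice the detected sequence would come from the image sensor's color classification rather than a literal list of labels.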
Step 2: obtain the start and end coordinates from the received transport task instruction; the transport robot moves, guided by locator markers, to the building entrance corresponding to the start point, and the UAV flies there by GPS navigation.
Step 3: the transport robot recognizes and matches the UAV.
The UAV that has arrived at the building entrance corresponding to the start point emits light signals of different colors according to the agreed communication protocol; the transport robot identifies these signals, and if the sequence is identical to its pre-stored light-color sequence, the two are considered matched and begin communicating.
Step 4: after the transport robot and the UAV match successfully, the UAV and the transport robot hold a stable motion formation, remaining relatively stationary with respect to each other.
Step 5: judge whether the motion formation between the UAV and the transport robot is stable; if it is, the UAV generates a guidance path from the transport task and guides the transport robot forward.
The stable motion formation means that the UAV and the transport robot run at a fixed distance and fixed attitude angles, advancing while remaining relatively stationary.
Step 6: when the UAV reaches the transport task's final position, it recognizes building information from the locator markers outside the building; when the recognized building information matches the transport task's end point, the formation lock between transport robot and UAV is released, and the transport robot proceeds to the final position by the building locator markers, completing the transport task.
Further, the attitude angles of the UAV and the transport robot are measured by the Kinect sensors each of them carries.
Further, the motion formation of the transport robot and the UAV is judged by a formation-stability model based on a deep-learning neural network.
The model takes the relative-pose time series of transport robot and UAV as the input of the network and the formation-stability result as its output, and is obtained by training.
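A minimal sketch of how such a model consumes the relative-pose time series: each time step contributes a (distance, α, β) triple, the sequence is flattened into one feature vector, and a single learned layer maps it to a stability probability. This is a stand-in for the patent's (unspecified) network architecture; the weights here are illustrative placeholders, not trained values:

```python
import math

def stability_score(pose_seq, weights, bias):
    """Forward pass of a one-layer stand-in for the formation-stability
    model: flatten the (l, alpha, beta) time series and apply a sigmoid.
    Returns a value in (0, 1): the probability the formation is stable."""
    x = [v for pose in pose_seq for v in pose]
    z = sum(w * xi for w, xi in zip(weights, x)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# Three time steps of (distance l, attitude angle alpha, attitude angle beta)
seq = [(1.0, 90.0, 90.0), (1.0, 90.0, 90.0), (1.0, 90.0, 90.0)]
w = [0.01] * 9  # placeholder weights, one per flattened feature
score = stability_score(seq, w, bias=-2.0)
assert 0.0 < score < 1.0
```

A real implementation would train a deeper network on labeled pose sequences; the point here is only the input/output shape: pose time series in, stability result out.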
Further, the poses of the UAV and the transport robot are adjusted according to the following calibration strategy so that the two keep a stable motion formation; in the stable formation the fixed attitude angles are α0 = 90°, β0 = 90°:
1) when l = L, α = α0 and β = β0, the transport robot runs at the same speed as the UAV;
2) when l > L, α = α0 and β = β0, the transport robot follows the UAV at 1.5 times the UAV's speed;
3) when l = L, and α ≠ α0 or β ≠ β0, the transport robot's wheels calibrate in real time at a rotation speed of 0.1-0.25 r/s;
4) when l > L, and α ≠ α0 or β ≠ β0, the transport robot runs along the track at 1.5 times the UAV's speed.
Here l and L are, respectively, the real-time distance between the UAV's and the transport robot's Kinects and the fixed distance under the stable formation;
α and α0 are, respectively, the real-time attitude angle formed between the UAV-to-robot Kinect line and the UAV Kinect's horizontal line, and its fixed value under the stable formation;
β and β0 are, respectively, the real-time attitude angle formed between the UAV-to-robot Kinect line and the transport robot Kinect's horizontal line, and its fixed value under the stable formation.
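The four-case calibration strategy can be sketched as a decision function. The original condition formulas were rendered as figures and are not recoverable exactly, so the comparisons below (real-time distance l against fixed distance L, attitude angles against their fixed values, with a tolerance eps) are a plausible reading rather than the patent's precise inequalities:

```python
def calibrate(l, L, alpha, beta, alpha0=90.0, beta0=90.0, eps=1e-6):
    """Assumed reconstruction of the four-case pose calibration strategy."""
    dist_ok = abs(l - L) <= eps
    angles_ok = abs(alpha - alpha0) <= eps and abs(beta - beta0) <= eps
    if dist_ok and angles_ok:
        return "hold UAV speed"                   # case 1: same speed
    if not dist_ok and angles_ok:
        return "follow at 1.5x UAV speed"         # case 2: close the gap
    if dist_ok and not angles_ok:
        return "wheel calibration 0.1-0.25 r/s"   # case 3: realign heading
    return "track at 1.5x UAV speed"              # case 4: gap and heading off

assert calibrate(1.0, 1.0, 90.0, 90.0) == "hold UAV speed"
assert calibrate(1.4, 1.0, 90.0, 90.0) == "follow at 1.5x UAV speed"
assert calibrate(1.0, 1.0, 80.0, 90.0) == "wheel calibration 0.1-0.25 r/s"
assert calibrate(1.4, 1.0, 90.0, 80.0) == "track at 1.5x UAV speed"
```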
Further, a no-intrusion zone is constructed around the stable motion formation of UAV and transport robot. While no other object has entered the zone, the UAV and the transport robot run normally; when another object is inside the zone, the UAV uses its onboard Kinect sensor to acquire the intrusion point's position and performs an arc movement centered on the transport robot until the intruding object lies outside the zone.
The no-intrusion zone of the stable motion formation is constructed as follows:
a) determine the transport robot's square boundary: detect the transport robot's image edges and its dimensions, build the square boundary, and through its vertices construct an arc boundary C with radius R, placing the transport robot's Kinect at the arc's center;
b) take the constructed arc boundary as the directrix and lines parallel to the line between the UAV's Kinect and the transport robot's Kinect as generatrices; the cylinder-like spatial region formed by this directrix and these generatrices is the no-intrusion zone.
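Step a) above can be sketched numerically: the arc radius R is the distance from the robot's Kinect to the square boundary's vertices. Taking the farthest vertex (an assumption, so the whole robot fits inside the arc):

```python
import math

def arc_radius(kinect_xy, square_vertices):
    """Radius R of the arc boundary C: the circle centered at the robot's
    Kinect passing through the robot's square-boundary vertices; the
    farthest vertex is used so the whole robot lies within the arc."""
    kx, ky = kinect_xy
    return max(math.hypot(vx - kx, vy - ky) for vx, vy in square_vertices)

# Unit square with the Kinect at its center: R = sqrt(2)/2
verts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
assert abs(arc_radius((0.5, 0.5), verts) - math.sqrt(2) / 2) < 1e-9
```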
Further, whether an intruder is present in the no-intrusion zone is judged as follows:
First, build the zone's three-dimensional coordinate system: with the transport robot's Kinect as origin, the ray from the robot's Kinect to the UAV's Kinect is the x-axis, the ray perpendicular to the ground and pointing upward is the z-axis, and the y-axis follows from the right-hand rule.
Second, compute the distance H between a potential intruding object in the zone and the x-axis of this coordinate system, where a is the distance from the transport robot's Kinect to the potential intruder m; b is the distance from the UAV's Kinect to m; l is the distance between the two Kinects (the Kinect line also carries the arc center); and H is the height, over the baseline l, of the triangle formed by the robot's Kinect, the UAV's Kinect, and the intruder. With s = (a + b + l)/2, Heron's formula gives the triangle's area S = √(s(s − a)(s − b)(s − l)), so H = 2S/l.
Finally, if H ≤ R, an object has intruded into the zone; if H > R, none has.
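The intrusion test reduces to a triangle-height computation from the three measured distances. A self-contained sketch (Heron's formula for the area, then H = 2S/l, compared against the arc radius R):

```python
import math

def intrusion_height(a, b, l):
    """Height H of the triangle formed by the robot Kinect, the UAV
    Kinect, and a potential intruder m, taken over the baseline l
    (robot-to-UAV distance). Area S comes from Heron's formula."""
    s = (a + b + l) / 2.0
    S = math.sqrt(max(s * (s - a) * (s - b) * (s - l), 0.0))
    return 2.0 * S / l

def intrudes(a, b, l, R):
    """An object has entered the no-intrusion zone when H <= R."""
    return intrusion_height(a, b, l) <= R

# 3-4-5 right triangle with baseline l = 5: area 6, so H = 2*6/5 = 2.4
assert abs(intrusion_height(3.0, 4.0, 5.0) - 2.4) < 1e-9
assert intrudes(3.0, 4.0, 5.0, R=2.5)
assert not intrudes(3.0, 4.0, 5.0, R=2.0)
```

Note that H measures distance to the x-axis line, matching the cylinder-like zone geometry: an object closer than R to the Kinect-to-Kinect axis falls inside the zone.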
Beneficial effects
The invention provides an outdoor transport method for an intelligent vehicle and robot that integrates UAV guidance. The method uses the UAV as an external navigation module so that the transport robot follows the UAV under its guidance. During guidance, a pose adjustment-decision strategy locks the UAV and the transport robot into a running formation. In the decision stage, the pose time series is passed to a data-analysis module for deep neural-network learning, which judges whether the formation is locked. During obstacle avoidance, the Kinect on the transport robot detects the robot's image edges in real time, and a no-intrusion zone is constructed. Only while no other object has entered this zone does guidance of the transport robot by the UAV proceed normally; when the zone is intruded, the UAV performs an arc movement centered on the transport robot, based on the intrusion point's position, until the intruding object lies outside the zone. This approach takes full advantage of the UAV's high flexibility and good mobility; by constructing the no-intrusion zone it creates a safe running environment and improves guidance safety.
Brief description of the drawings
Fig. 1 is a schematic diagram of measuring the UAV pose angle α;
Fig. 2 is a schematic diagram of measuring the transport robot pose angle β;
Fig. 3 is a schematic diagram of the deep-learning neural network model;
Fig. 4 is a schematic diagram of the transport robot's no-intrusion zone boundary;
Fig. 5 is a schematic diagram of the UAV-transport robot no-intrusion zone construction and intruder identification;
Fig. 6 is a schematic diagram of the transport robot's cross-building operation process.
Embodiment
The present invention is described further below with reference to the drawings and an embodiment.
As shown in Fig. 6, a cross-building transport method for an intelligent vehicle and robot integrating a UAV includes the following steps:
Step 1: send a transport task instruction to a designated transport robot and its matched UAV.
For the mutual identification of UAV and transport robot, locking is completed using a light-color code: one group of colored lamps is mounted directly in front of the transport robot and another at the tail of the UAV; each group consists of K tri-color lamps, so 3^K light-color combinations can be formed. A UAV and a transport robot in the same system must use the same light-color code; they match each other through their image sensors and complete identification, with high recognition efficiency and easy implementation.
Step 2: obtain the start and end coordinates from the received transport task instruction; the transport robot moves, guided by locator markers, to the building entrance corresponding to the start point, and the UAV flies there by GPS navigation.
Step 3: the transport robot recognizes and matches the UAV.
The UAV that has arrived at the building entrance corresponding to the start point emits light signals of different colors according to the agreed communication protocol; the transport robot identifies them, and if the sequence is identical to its pre-stored light-color sequence the two are considered matched and begin communicating.
Step 4: after the transport robot and the UAV match successfully, the UAV and the transport robot hold a stable motion formation, remaining relatively stationary.
The attitude angles of the UAV and the transport robot are measured by the Kinect sensors each of them carries. As shown in Figs. 1 and 2, α is the acute angle formed between the UAV-to-robot Kinect line and the UAV Kinect's horizontal line, and β is the acute angle formed between the same Kinect line and the transport robot Kinect's horizontal line.
L1 and L2 are the distances from the transport robot's Kinect to the UAV's midpoint and edge, and L3 is the distance from the UAV's midpoint to its edge; the three sides form a triangle, from which α can be determined.
L4 and L5 are the distances from the UAV's Kinect to the transport robot's midpoint and edge, and L6 is the distance from the robot's midpoint to its edge; the three sides form a triangle, from which β can be determined.
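Given the three triangle sides just described, the pose angle follows from the law of cosines: the angle at the vehicle's midpoint between the Kinect line and the vehicle's own horizontal line. The function name and argument naming are illustrative; the patent only states that the triangle determines the angle:

```python
import math

def attitude_angle(d_mid, d_edge, half_width):
    """Angle (degrees) at the vehicle midpoint between the Kinect-to-midpoint
    line and the vehicle's horizontal line, via the law of cosines on the
    triangle with sides d_mid (e.g. L1), d_edge (L2), half_width (L3)."""
    cos_a = (d_mid**2 + half_width**2 - d_edge**2) / (2 * d_mid * half_width)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

# If the Kinect line is perpendicular to the vehicle body, the sides satisfy
# d_edge**2 = d_mid**2 + half_width**2 and the angle is exactly 90 degrees:
assert abs(attitude_angle(4.0, 5.0, 3.0) - 90.0) < 1e-9
```

The same computation with L4, L5, L6 yields β.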
Step 5: judge whether the motion formation between the UAV and the transport robot is stable; if it is, the UAV generates a guidance path from the transport task and guides the transport robot forward.
The stable motion formation means that the UAV and the transport robot run at a fixed distance and fixed attitude angles, advancing while remaining relatively stationary.
The motion formation of transport robot and UAV is judged by a formation-stability model based on a deep-learning neural network: the model takes the relative-pose time series of robot and UAV as the network's input and the formation-stability result as its output, and is obtained by training.
The deep-learning neural network model is shown in Fig. 3.
The poses of the UAV and the transport robot are adjusted according to the following calibration strategy so that the two keep a stable motion formation; in the stable formation the fixed attitude angles are α0 = 90°, β0 = 90°:
1) when l = L, α = α0 and β = β0, the transport robot runs at the same speed as the UAV;
2) when l > L, α = α0 and β = β0, the transport robot follows the UAV at 1.5 times the UAV's speed;
3) when l = L, and α ≠ α0 or β ≠ β0, the transport robot's wheels calibrate in real time at a rotation speed of 0.1-0.25 r/s;
4) when l > L, and α ≠ α0 or β ≠ β0, the transport robot runs along the track at 1.5 times the UAV's speed.
Here l and L are, respectively, the real-time distance between the UAV's and the transport robot's Kinects and the fixed distance under the stable formation; α and α0 are the real-time and fixed angles between the Kinect line and the UAV Kinect's horizontal line; β and β0 are the real-time and fixed angles between the Kinect line and the transport robot Kinect's horizontal line.
A no-intrusion zone is constructed around the stable motion formation of UAV and transport robot. While no other object has entered the zone, the UAV and the transport robot run normally; when another object is inside the zone, the UAV uses its onboard Kinect sensor to acquire the intrusion point's position and performs an arc movement centered on the transport robot until the intruding object lies outside the zone.
As shown in Figs. 4 and 5, the no-intrusion zone of the stable motion formation is constructed as follows:
a) determine the transport robot's square boundary: detect the transport robot's image edges and its dimensions, build the square boundary, and through its vertices construct an arc boundary C with radius R, placing the transport robot's Kinect at the arc's center;
b) take the constructed arc boundary as the directrix and lines parallel to the line between the UAV's Kinect and the transport robot's Kinect as generatrices; the cylinder-like spatial region formed by this directrix and these generatrices is the no-intrusion zone.
Whether an intruder is present in the no-intrusion zone is judged as follows:
First, build the zone's three-dimensional coordinate system: with the transport robot's Kinect as origin, the ray from the robot's Kinect to the UAV's Kinect is the x-axis, the ray perpendicular to the ground and pointing upward is the z-axis, and the y-axis follows from the right-hand rule.
Second, compute the distance H between a potential intruding object in the zone and the x-axis, where a is the distance from the transport robot's Kinect to the potential intruder m; b is the distance from the UAV's Kinect to m; l is the distance between the two Kinects (the Kinect line also carries the arc center); and H is the height, over the baseline l, of the triangle formed by the robot's Kinect, the UAV's Kinect, and the intruder.
Finally, if H ≤ R, an object has intruded into the zone; if H > R, none has.
Step 6: when the UAV reaches the transport task's final position, it recognizes building information from the locator markers outside the building; when the recognized building information matches the transport task's end point, the formation lock between transport robot and UAV is released, and the transport robot proceeds to the final position by the building locator markers, completing the transport task.
The specific embodiment described herein merely illustrates the spirit of the invention. Those skilled in the art to which the invention belongs may make various modifications or supplements to the described embodiment, or substitute for it in a similar way, without departing from the spirit of the invention or exceeding the scope defined by the appended claims.
Claims (6)
1. A cross-building transport method for an intelligent vehicle and robot integrating a UAV, characterized by comprising the following steps:
Step 1: send a transport task instruction to a designated transport robot and its matched UAV; the transport robot and its matched UAV communicate for matching using a light-color code protocol;
Step 2: obtain the start and end coordinates from the received transport task instruction; the transport robot moves, guided by locator markers, to the building entrance corresponding to the start point, and the UAV flies there by GPS navigation;
Step 3: the transport robot recognizes and matches the UAV: the UAV that has arrived at the building entrance corresponding to the start point emits light signals of different colors according to the agreed communication protocol; the transport robot identifies them, and if the sequence is identical to its pre-stored light-color sequence the two are considered matched and begin communicating;
Step 4: after the transport robot and the UAV match successfully, the UAV and the transport robot hold a stable motion formation, remaining relatively stationary;
Step 5: judge whether the motion formation between the UAV and the transport robot is stable; if it is, the UAV generates a guidance path from the transport task and guides the transport robot forward; the stable motion formation means that the UAV and the transport robot run at a fixed distance and fixed attitude angles, advancing while remaining relatively stationary;
Step 6: when the UAV reaches the transport task's final position, it recognizes building information from the locator markers outside the building; when the recognized building information matches the transport task's end point, the formation lock between transport robot and UAV is released, and the transport robot proceeds to the final position by the building locator markers, completing the transport task.
2. The method according to claim 1, characterized in that the attitude angles of the UAV and the transport robot are measured by the Kinect sensors each of them carries.
3. The method according to claim 2, characterized in that the motion formation of transport robot and UAV is judged by a formation-stability model based on a deep-learning neural network; the model takes the relative-pose time series of robot and UAV as the network's input and the formation-stability result as its output, and is obtained by training.
4. The method according to claim 3, characterized in that the poses of the unmanned plane and the carrying robot are adjusted according to the following pose-calibration strategy so that both keep the stable motion form; in the stable motion form the fixed attitude angles are α0 = 90°, β0 = 90°:
1) when … and …, the carrying robot runs at the same speed as the unmanned plane;
2) when … and …, the carrying robot follows the unmanned plane at 1.5 times the unmanned plane's speed;
3) when … and … or …, the carrying robot's wheels calibrate in real time at a rotation speed of 0.1-0.25 r/s;
4) when … and … or …, the carrying robot runs along the track at 1.5 times the unmanned plane's speed;
where l and L are respectively the real-time distance between the Kinects of the unmanned plane and the carrying robot and the fixed distance in the stable running form;
α and α0 respectively denote the real-time attitude angle formed by the unmanned-plane-to-carrying-robot Kinect line with the horizontal line at the unmanned plane's Kinect, and the fixed attitude angle in the stable running form;
β and β0 respectively denote the real-time attitude angle formed by the unmanned-plane-to-carrying-robot Kinect line with the horizontal line at the carrying robot's Kinect, and the fixed attitude angle in the stable running form.
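The four rules of this strategy dispatch on whether the distance l and the angles α, β match their fixed values. The exact inequality conditions appeared as inline formulas that are missing here, so the sketch below uses assumed tolerances (DIST_TOL, ANG_TOL); only the four resulting actions are taken from the claim:

```python
def calibrate(l, alpha, beta, L=2.0, ALPHA0=90.0, BETA0=90.0,
              DIST_TOL=0.1, ANG_TOL=5.0):
    """Dispatch the four pose-calibration rules of the claim.

    The tolerances DIST_TOL (distance units) and ANG_TOL (degrees) are
    assumptions standing in for the claim's missing inequality conditions.
    """
    dist_ok = abs(l - L) <= DIST_TOL
    angles_ok = abs(alpha - ALPHA0) <= ANG_TOL and abs(beta - BETA0) <= ANG_TOL
    if dist_ok and angles_ok:
        return "match UAV speed"                 # rule 1: formation is stable
    if angles_ok:
        return "follow at 1.5x UAV speed"        # rule 2: distance is off
    if dist_ok:
        return "wheel calibration 0.1-0.25 r/s"  # rule 3: angles are off
    return "track at 1.5x UAV speed"             # rule 4: both are off
```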
5. The method according to any one of claims 1-3, characterized in that a no-intrusion zone is built for the stable motion form of the unmanned plane and the carrying robot; when no other object has invaded the zone, the unmanned plane and the carrying robot run normally;
when another object is present in the zone, the unmanned plane collects the intruder's position with its on-board Kinect sensor and performs an arc movement centered on the carrying robot until the intruding object lies outside the zone;
the no-intrusion zone for the stable motion form of the unmanned plane and the carrying robot is built as follows:
a) determine the carrying robot's square boundary: detect the carrying robot's image edges and the carrying robot's size, and build the square boundary; through the vertices of the square boundary, build an arc boundary C of radius R, with the carrying robot's Kinect placed at the center of the arc C;
b) with the built arc boundary as the directrix, and lines parallel to the line between the unmanned plane's Kinect and the carrying robot's Kinect as generatrices, take the cylinder-like space region formed by the directrix and generatrices as the no-intrusion zone.
6. The method according to claim 5, characterized in that whether an intruder is present in the no-intrusion zone is judged by the following procedure:
First, build the three-dimensional coordinate system of the no-intrusion zone:
with the carrying robot's Kinect position as the origin, the ray from the carrying robot's Kinect to the unmanned plane's Kinect as the x-axis, the ray perpendicular to the ground and pointing upward as the z-axis, and the y-axis direction determined by the right-hand rule, build the three-dimensional coordinate system;
Secondly, calculate the distance H between a potential intruder inside the zone and the x-axis of the constructed coordinate system, starting from the semi-perimeter

s = (a + b + L) / 2

from which, by Heron's formula, the height of the triangle follows as

H = 2·√(s(s − a)(s − b)(s − L)) / L

where a denotes the distance from the carrying robot's Kinect to the potential intruder m; b denotes the distance from the unmanned plane's Kinect to the potential intruder m; L denotes the distance between the carrying robot's Kinect and the unmanned plane's Kinect (the Kinect line between the unmanned plane and the carrying robot is also the line through the arc center); H denotes the height of the triangle formed by the carrying robot's Kinect, the unmanned plane's Kinect, and the intruder;
Finally, if H ≤ R, an object has invaded the no-intrusion zone; if H > R, no object has invaded the no-intrusion zone.
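The test of this claim, semi-perimeter s, triangle height H via Heron's formula, then comparison with R, can be sketched directly; the function and argument names are illustrative:

```python
import math

def intrusion_detected(a, b, L, R):
    """Does a potential intruder m breach the no-intrusion zone?

    a -- distance from the carrying robot's Kinect to m
    b -- distance from the unmanned plane's Kinect to m
    L -- distance between the two Kinects (the triangle's base)
    R -- radius of the zone's arc boundary
    """
    s = (a + b + L) / 2.0                                    # semi-perimeter
    area = math.sqrt(max(s * (s - a) * (s - b) * (s - L), 0.0))
    H = 2.0 * area / L                                       # height over base L
    return H <= R                                            # inside iff H <= R
```

For example, with a = b = 5 and L = 8 the triangle height is H = 3, so an arc radius of R = 3 just detects the intruder while R = 2.9 does not.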
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710643174.3A CN107272704B (en) | 2017-08-01 | 2017-08-01 | A kind of outer means of delivery of the intelligent vehicle for merging unmanned machine travel and robot chamber |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107272704A true CN107272704A (en) | 2017-10-20 |
CN107272704B CN107272704B (en) | 2018-02-23 |
Family
ID=60075488
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201710643174.3A Active CN107272704B (en) | 2017-08-01 | 2017-08-01 | A kind of outer means of delivery of the intelligent vehicle for merging unmanned machine travel and robot chamber |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107272704B (en) |
Citations (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6739556B1 (en) * | 2002-11-20 | 2004-05-25 | Raytheon Company | Method and apparatus for providing an aircraft emergency safety control system |
CN103914076A (en) * | 2014-03-28 | 2014-07-09 | 浙江吉利控股集团有限公司 | Cargo transferring system and method based on unmanned aerial vehicle |
CN104616419A (en) * | 2015-01-18 | 2015-05-13 | 南京森林警察学院 | Sky, ground and air integrated remaining fire monitoring method for forest fire |
CN104699102A (en) * | 2015-02-06 | 2015-06-10 | 东北大学 | System and method for collaboratively navigating, investigating and monitoring unmanned aerial vehicle and intelligent vehicle |
CN105096662A (en) * | 2015-07-24 | 2015-11-25 | 陶文英 | Design method of cooperative driving aircraft system and the system |
CN204956929U (en) * | 2015-07-30 | 2016-01-13 | 云南天质网络科技有限公司 | Continuous -pesticide -feeding crop -dusting unmanned aerial vehicle |
CN105303899A (en) * | 2015-11-12 | 2016-02-03 | 范云生 | Child-mother type robot cooperation system of combination of unmanned surface vessel and unmanned aerial vehicle |
CN205375196U (en) * | 2016-03-01 | 2016-07-06 | 河北工业大学 | Group robot control device for wind power plant inspection |
CN106444803A (en) * | 2016-09-14 | 2017-02-22 | 江苏师范大学 | UAV (Unmanned Aerial Vehicle) navigation system and method used for positioning of pipeline robot |
CN106774221A (en) * | 2017-01-22 | 2017-05-31 | 江苏中科院智能科学技术应用研究院 | A kind of unmanned plane cooperates patrol system and method with unmanned vehicle |
CN106828264A (en) * | 2017-01-17 | 2017-06-13 | 斑马信息科技有限公司 | Unmanned plane Vehicular system and its management method |
Non-Patent Citations (1)
Title |
---|
REN, Tao et al.: "Design of a Cooperative Navigation System for UAV and Intelligent Vehicle", Journal of Shenyang University (Natural Science Edition) *
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN108008448A (en) * | 2017-11-02 | 2018-05-08 | 中国科学院地质与地球物理研究所 | Measuring point method, apparatus and system are arranged in ground electromagnetic instrument field work |
CN108008448B (en) * | 2017-11-02 | 2019-11-19 | 中国科学院地质与地球物理研究所 | Measuring point method, apparatus and system are arranged in ground electromagnetic instrument field work |
US10488544B2 (en) | 2017-11-02 | 2019-11-26 | Institute Of Geology And Geophysics, Chinese Academy Of Sciences | Method, apparatus and system for arranging survey points in field operation with ground electromagnetic instrument |
CN113226024A (en) * | 2019-01-16 | 2021-08-06 | 株式会社尼罗沃克 | Unmanned aerial vehicle system, unmanned aerial vehicle, moving body, partition member, control method for unmanned aerial vehicle system, and unmanned aerial vehicle system control program |
Also Published As
Publication number | Publication date |
---|---|
CN107272704B (en) | 2018-02-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
Mohsan et al. | Unmanned aerial vehicles (UAVs): Practical aspects, applications, open challenges, security issues, and future trends | |
US12130636B2 (en) | Methods and system for autonomous landing | |
Luo et al. | A survey of intelligent transmission line inspection based on unmanned aerial vehicle | |
CN109901580A (en) | A kind of unmanned plane cooperates with unmanned ground robot follows diameter obstacle avoidance system and its method | |
CN104407615B (en) | AGV robot guide deviation correction method | |
McGee et al. | Obstacle detection for small autonomous aircraft using sky segmentation | |
Bachrach et al. | RANGE–Robust autonomous navigation in GPS‐denied environments | |
CN109556615A (en) | The driving map generation method of Multi-sensor Fusion cognition based on automatic Pilot | |
CN109920246A (en) | It is a kind of that local paths planning method is cooperateed with binocular vision based on V2X communication | |
CN106054896A (en) | Intelligent navigation robot dolly system | |
CN108832997A (en) | A kind of unmanned aerial vehicle group searching rescue method and system | |
CN104298248A (en) | Accurate visual positioning and orienting method for rotor wing unmanned aerial vehicle | |
CN106289186A (en) | The airborne visual detection of rotor wing unmanned aerial vehicle and multi-target positioning system and implementation method | |
CN108132675A (en) | Unmanned plane is maked an inspection tour from main path cruise and intelligent barrier avoiding method by a kind of factory | |
CN105318888A (en) | Unmanned perception based unmanned aerial vehicle route planning method | |
CN206411514U (en) | A kind of intelligent storage mobile-robot system positioned based on Quick Response Code | |
US20210278523A1 (en) | Systems and Methods for Integrating Radar Data for Improved Object Detection in Autonomous Vehicles | |
CN102944224A (en) | Automatic environmental perception system for remotely piloted vehicle and work method for automatic environmental perception system | |
CN114474061A (en) | Robot multi-sensor fusion positioning navigation system and method based on cloud service | |
CN109164825A (en) | A kind of independent navigation barrier-avoiding method and device for multi-rotor unmanned aerial vehicle | |
CN107783119A (en) | Apply the Decision fusion method in obstacle avoidance system | |
CN107783547A (en) | Post disaster relief rotor wing unmanned aerial vehicle obstacle avoidance system and method | |
CN107272704B (en) | A kind of outer means of delivery of the intelligent vehicle for merging unmanned machine travel and robot chamber | |
CN113791627B (en) | Robot navigation method, equipment, medium and product | |
CN109839118A (en) | Paths planning method, system, robot and computer readable storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||