
CN111798514A - Intelligent moving target tracking and monitoring method and system for marine ranching - Google Patents

Intelligent moving target tracking and monitoring method and system for marine ranching

Info

Publication number
CN111798514A
CN111798514A (application CN202010603465.1A / CN202010603465A)
Authority
CN
China
Prior art keywords
coordinate system
target
camera
center point
monitoring
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010603465.1A
Other languages
Chinese (zh)
Other versions
CN111798514B (en)
Inventor
卢国梁
超越
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Rizhao Institute Of Intelligent Manufacturing Shandong University
Original Assignee
Rizhao Institute Of Intelligent Manufacturing Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Rizhao Institute Of Intelligent Manufacturing Shandong University filed Critical Rizhao Institute Of Intelligent Manufacturing Shandong University
Priority to CN202010603465.1A priority Critical patent/CN111798514B/en
Publication of CN111798514A publication Critical patent/CN111798514A/en
Application granted granted Critical
Publication of CN111798514B publication Critical patent/CN111798514B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00: Television systems
    • H04N7/18: Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30244: Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Closed-Circuit Television Systems (AREA)

Abstract

The invention relates to an intelligent moving target tracking and monitoring method and system for marine ranching, which controls the actions of a pan-tilt that has a plurality of joints and is connected with a camera, so as to track and detect a moving target. The method comprises: solving a transformation matrix according to the coordinates of the moved target center point in the current camera coordinate system, wherein the transformation matrix changes the pose of the current camera coordinate system into a target camera coordinate system such that the moved target center point lies on the coordinate axis of the target camera coordinate system along the lens axis; calculating the pose information of the origin of the target camera coordinate system in the base coordinate system according to the transformation matrix; and calculating the action information of each joint of the pan-tilt from the obtained pose information. The method can track the monitored target in real time, reduces the labor intensity of monitoring personnel, and can be well applied to ocean detection scenarios in marine ranches.

Description

Intelligent moving target tracking and monitoring method and system for marine ranching
Technical Field
The invention relates to the technical field of monitoring, and in particular to an intelligent moving target tracking and monitoring method and system for marine ranching.
Background
The statements herein merely provide background information related to the present disclosure and may not necessarily constitute prior art.
The marine ranch is a new artificial fishery system proposed at the end of the 20th century. First, an environment suitable for the growth and reproduction of marine organisms is created by artificial means and built in the marine environment; it can attract part of the natural marine organisms, which, together with artificially stocked organisms, form the biological part of the marine ranch. Combined with systematic fishery facilities and artificial management measures, the advantages of both the natural environment and artificial control can be exploited, achieving benefits such as stable aquatic resources and sustainable development.
The inventor has found that traditional video monitoring systems in the field of marine ranches rely mainly on manual observation: events occurring in the video are analysed and judged by human vision, suspicious targets are tracked, dangerous situations are reported, and corresponding measures are taken. Moreover, most existing video monitoring systems work around the clock, so monitoring personnel must constantly face a display. This easily causes physical fatigue and psychological boredom, and hence negligence, leaving potential safety hazards when abnormal situations occur, especially when one person faces several monitoring scenes at once. In addition, current monitoring systems cannot continuously track a moving target.
Disclosure of Invention
The invention aims to overcome the defects of the prior art, and provides an intelligent moving target tracking and monitoring method for a marine ranch, which can track and monitor targets in real time, reduces the labor intensity of monitoring personnel, and is suitable for detecting abnormal conditions of the marine ranch.
In order to achieve the purpose, the invention adopts the following technical scheme:
In a first aspect, an embodiment of the present invention provides an intelligent moving target tracking and monitoring method for a marine ranch, which controls the actions of a pan-tilt having a plurality of joints and connected with a camera so as to track and monitor the moving target, the method comprising:
solving a transformation matrix according to the coordinates of the center point of the target after the movement in the current camera coordinate system, wherein the transformation matrix can enable the current camera coordinate system to carry out pose change to obtain a target camera coordinate system, and the center point of the target after the movement is located on a coordinate axis of the target camera coordinate system along the lens axis;
calculating the pose information of the origin of the target camera coordinate system in the base coordinate system according to the transformation matrix;
and calculating the action information of each joint of the holder according to the obtained pose information.
In a second aspect, an embodiment of the invention provides an intelligent moving target tracking and monitoring system for a marine ranch, comprising a pan-tilt and a camera. The pan-tilt comprises a first driving piece whose axis is vertical; the first driving piece is connected with one end of an L-shaped first swing arm, and a second driving piece is fixed at the other end of the first swing arm, with its axis arranged along a first direction in the horizontal plane. The second driving piece is connected with one end of an L-shaped second swing arm, and a third driving piece is fixed at the other end of the second swing arm, with its axis arranged along a second direction, in the horizontal plane and perpendicular to the first direction; the third driving piece is connected with the camera.
In a third aspect, an embodiment of the present invention provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor executes the computer program to implement the method for tracking and monitoring an intelligent moving object in a marine ranch.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor, implements the method for tracking and monitoring an intelligent moving object in a marine ranch.
The invention has the beneficial effects that:
With the intelligent moving target tracking and monitoring method for marine ranches, the coordinates of the center point of a moving target in the camera coordinate system are detected and calculated, and a transformation matrix is solved. The transformation matrix changes the pose of the current camera coordinate system into a target camera coordinate system so that the center point of the moving target falls on the coordinate axis of the target camera coordinate system along the lens axis. Using the transformation matrix, the coordinates of the origin of the transformed camera coordinate system in the base coordinate system are obtained, and the motion information of each joint of the pan-tilt is then obtained through a kinematic inverse analysis algorithm. In this way the center point of the moving target always remains on the coordinate axis of the camera coordinate system along the lens axis, real-time tracking and monitoring of the moving target is realized, the labor intensity of monitoring personnel is reduced, the monitoring effect is ensured, and the method is suitable for ocean monitoring.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application.
FIG. 1 is a schematic flow chart of a method according to example 1 of the present invention;
FIG. 2 is a schematic view of an assembly of a pan/tilt head and a camera and a schematic view of a joint coordinate system and a base coordinate system according to embodiment 1 of the present invention;
FIG. 3 is a schematic diagram illustrating a transformation from an image pixel coordinate system to an image physical coordinate system according to embodiment 1 of the present invention;
FIG. 4 is a schematic perspective view of a physical coordinate system of an image and a coordinate system of a camera in accordance with embodiment 1 of the present invention;
FIG. 5 is a schematic view of an assembly of a yaw motor and a planetary reducer according to embodiment 2 of the present invention;
fig. 6 is a schematic structural view of a speed reducer bracket according to embodiment 2 of the present invention;
the system comprises a yaw motor 1, a speed reducer 2, a first swing arm 3, a roll motor 4, a second swing arm 5, a pitch motor 6, a camera 7, a foundation 8, a yaw motor base 9, a base 10 and a speed reducer support 11.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to limit example embodiments according to the present application. As used herein, the singular forms are intended to include the plural forms as well unless the context clearly indicates otherwise, and it should be further understood that the terms "comprises" and/or "comprising", when used in this specification, specify the presence of the stated features, steps, operations, devices, components, and/or combinations thereof.
For convenience of description, the words "up", "down", "left" and "right" merely indicate directions consistent with those of the drawings themselves. They are used only to facilitate and simplify the description of the invention, and do not indicate or imply that the device or element referred to must have a particular orientation or be constructed and operated in a particular orientation; they therefore should not be construed as limiting the invention.
As introduced in the background art, the traditional marine ranch video monitoring system mainly relies on a manual observation mode for monitoring, is high in labor intensity and poor in monitoring effect, and cannot realize continuous tracking and monitoring.
In embodiment 1 of a typical implementation of the present application, a method for tracking and monitoring an intelligent moving target in a marine ranch, as shown in fig. 1, controls a pan-tilt that has a plurality of joints and is connected with a camera so as to track and detect a moving target. Specifically, the method comprises: detecting and calculating the coordinates of the moved target center point in the current camera coordinate system and solving a transformation matrix, where the transformation matrix changes the pose of the current camera coordinate system into a target camera coordinate system such that the moved target center point falls on the coordinate axis of the target camera coordinate system along the lens axis; calculating the pose information of the origin of the target camera coordinate system in the base coordinate system according to the transformation matrix; and calculating the motion information of each joint of the pan-tilt connected with the camera according to the obtained pose information.
Taking a three-joint pan-tilt as an example, as shown in fig. 2, the pan-tilt includes a yaw motor 1 connected with the foundation, whose axis is vertically arranged. The output shaft of the yaw motor is connected through a speed reducer 2 with one end of an L-shaped first swing arm 3, and a roll motor 4 is fixed at the other end of the first swing arm. The output shaft of the roll motor is connected with one end of an L-shaped second swing arm 5, and the axis of the output shaft of the roll motor is arranged along a first direction in the horizontal plane. A pitch motor 6 is fixedly connected at the other end of the second swing arm; the output shaft of the pitch motor is connected with a camera 7, and the axis of the output shaft of the pitch motor is arranged along a second direction, in the horizontal plane and perpendicular to the first direction.
The yaw motor, the roll motor and the pitch motor form the three joints of the pan-tilt. A joint coordinate system, a camera coordinate system and a base coordinate system are established according to the following principles:
For a rotating joint, the direction perpendicular to the direction of rotation is taken as the z-axis, the direction perpendicular both to this z-axis and to the z-axis of the next joint is taken as the x-axis, and once the x-axis and z-axis are determined, the y-axis is determined by the right-hand rule.
The directions are not unique; either the positive or the negative direction may be chosen.
The directions of the coordinate axes of the base coordinate system coincide with those of the first joint coordinate system at the yaw motor in the initial state.
The camera coordinate system at the end takes the direction perpendicular to the outward direction of the camera lens as its z-axis, takes the direction perpendicular both to the z-axis of the third joint coordinate system at the pitch motor and to the z-axis of the camera coordinate system as its x-axis, and determines the y-axis by the right-hand rule.
A base coordinate system X0-Y0-Z0, formed by the X0, Y0 and Z0 axes, is established at the connection between the yaw motor and the foundation 8; Z0 is arranged along the vertical direction.
A first joint coordinate system X1-Y1-Z1 is established at the connection between the yaw motor and the first swing arm, where Z1 is parallel to Z0, X1 is parallel to X0 and Y1 is parallel to Y0; Z1 is arranged along the vertical direction.
A second joint coordinate system X2-Y2-Z2 is established at the connection between the roll motor and the second swing arm, where X2 is parallel to Z1, Y2 is parallel to X1 and Z2 is parallel to Y1; Z2 is arranged along the first direction.
A third joint coordinate system X3-Y3-Z3 is established at the connection between the pitch motor and the camera, where X3 is parallel to X2, Y3 is parallel to Z2 and Z3 is parallel to Y2; Z3 is arranged along the second direction.
A camera coordinate system X4-Y4-Z4 is established at the camera, where X4 is parallel to X3, Y4 is parallel to Z3 and Z4 is parallel to Y3.
Establishing a cradle head parameter table:
TABLE 1 Table of parameters of the Pan-Tilt head
Figure BDA0002559982700000061
α_{i-1}: the angle between z_{i-1} and z_i, viewed along the direction of x_{i-1}; counterclockwise is positive and clockwise is negative.
a_{i-1}: the distance between z_{i-1} and z_i, viewed along the direction of x_{i-1}; its sign is taken according to the position of the origin of coordinate system i along the corresponding direction of coordinate system i-1.
d_i: the distance between x_{i-1} and x_i, viewed along the direction of z_i; its sign is taken according to the position of the origin of coordinate system i along the corresponding direction of coordinate system i-1.
θ_i: the angle between x_{i-1} and x_i, viewed along the direction of z_i; counterclockwise is positive and clockwise is negative.
b_i: the distance between x_{i-1} and x_i, viewed along the direction of y_i; its sign is taken according to the position of the origin of coordinate system i along the corresponding direction of coordinate system i-1.
where i = 1, 2, 3, 4.
In Table 1, θ1, θ2 and θ3 are variables and the remaining parameters are known quantities.
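The numeric entries of Table 1 and the explicit adjacent-frame matrices (formulas (14)-(17) below) appear only as images in the published text. Reading the parameter definitions above together with the transform construction described later in step 3, each adjacent-frame transform appears to be composed as follows; this is a reconstruction under that reading, not the patent's authoritative form:

```latex
% One plausible reading of the adjacent-frame transform implied by Table 1 and
% the step-3 description (the patent's explicit matrices are shown only as images):
{}^{i-1}T_{i} = \mathrm{Rot}_{x}(\alpha_{i-1})\,
               \mathrm{Trans}_{x}(a_{i})\,
               \mathrm{Rot}_{z}(\theta_{i})\,
               \mathrm{Trans}_{y}(b_{i})\,
               \mathrm{Trans}_{z}(d_{i}),
\qquad i = 1,\dots,4
```

According to the step-3 description, only the θ1 rotation and the d1 translation are present for the first link, and the last link (third joint frame to camera frame) carries no joint rotation.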
The specific method for controlling the holder to track the moving target comprises the following steps:
step 1: the camera collects CCD images of a moving target in a current posture, determines coordinate values of a moving target center point under an image pixel coordinate system (u-v coordinate system), and then converts the coordinate values of the moving target center point under the image pixel coordinate system into coordinate values under a current camera coordinate system.
The specific working method comprises the following steps: firstly, converting the coordinate value of the target center point after the movement under an image pixel coordinate system into the coordinate value of the target center point after the movement under an image physical coordinate system (x-y coordinate system), and then converting the coordinate value of the target center point after the movement under the image physical coordinate system into the coordinate value of the target center point after the movement under a camera coordinate system.
The specific steps are as follows. First, Zhang's calibration method is adopted to obtain the intrinsic (internal reference) matrix of the camera, as in formula (1):
Figure BDA0002559982700000071
Then, converting an image pixel coordinate system into an image physical coordinate system:
the relationship between the image pixel coordinate system and the image physical coordinate system is shown in fig. 3, wherein the u-v coordinate system is the image pixel coordinate system, and the origin thereof is the upper left corner of the CCD image plane. The x-y coordinate system is an image physical coordinate system, the origin point of the image physical coordinate system is the center of the CCD image plane, and d is used respectivelyx,dyRepresents the physical size of each pixel in the x and y directions, in (u)0,v0) The coordinates of the origin of the image physical coordinate system on the image pixel coordinate system can be known as u0And v0Which is half the total pixel value in both directions of the image, respectively.
From the above relationship, the transformation between the image pixel coordinate system and the image physical coordinate system is shown in the following equations 2 and 3:
Figure BDA0002559982700000081
Figure BDA0002559982700000082
where:
u: abscissa of the target center point in the image pixel coordinate system;
v: ordinate of the target center point in the image pixel coordinate system;
x: abscissa of the target center point in the image physical coordinate system;
y: ordinate of the target center point in the image physical coordinate system;
dx: camera intrinsic parameter, the physical size of a pixel in the x direction of the image pixel coordinate system;
dy: camera intrinsic parameter, the physical size of a pixel in the y direction of the image pixel coordinate system;
u0: abscissa of the origin of the image physical coordinate system in the image pixel coordinate system, taken as 1544 in this design;
v0: ordinate of the origin of the image physical coordinate system in the image pixel coordinate system, taken as 1032 in this design.
Converting equations 2 and 3 into a matrix form as shown in equation (4):
Figure BDA0002559982700000083
by the formula (4), the coordinate value of the target center point after the movement in the image pixel coordinate system can be used to solve the coordinate value in the image physical coordinate system.
Conversion between camera coordinate system and image physical coordinate system
The relationship between the image physical coordinate system and the camera coordinate system is shown in fig. 4, where the x-y coordinate system is the image physical coordinate system and the Xc-Yc-Zc coordinate system is the camera coordinate system, whose origin is the optical center of the camera, understood here as the geometric center of the camera lens. From the similarity of triangles and with reference to fig. 4, the relation between the two coordinate systems is obtained as shown in equation (5):
Figure BDA0002559982700000091
The expression 5 is expressed in a matrix form, and the following expression 6 is obtained, namely, the relationship between the camera coordinate system and the image physical coordinate system is:
Figure BDA0002559982700000092
Xc, Yc and Zc are the coordinate values of the target center point along the three coordinate axes of the camera coordinate system respectively, where Zc, the distance between the target and the center of the camera lens, is the working distance of the camera.
The relationship between the coordinates of the image pixels and the camera coordinate system can be found by matrix multiplication of equations 4 and 6 as shown in equation 7:
Figure BDA0002559982700000093
In equation (7), Zc is the distance of the moving target from the center of the camera lens, i.e. the working distance of the camera;
Figure BDA0002559982700000094
u0 and v0 are both camera parameters, and the calibrated formula (1) can be used in this embodiment:
Figure BDA0002559982700000095
Substituting the above parameters into equation (7) yields equation (8); from the matrix of equation (8), the coordinates (Xc, Yc, Zc) in the current camera coordinate system can be obtained from the coordinates (u, v) of the moved target center point in the camera image pixel coordinate system.
Figure BDA0002559982700000096
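As a concrete illustration of the pinhole relation behind equations (7) and (8), the following Python sketch recovers camera-frame coordinates from a pixel coordinate when the working distance Zc is known. Only u0 = 1544 and v0 = 1032 are taken from the text; the focal-length terms FX = f/dx and FY = f/dy are assumed placeholder values, since the calibrated matrix of formula (1) appears only as an image.

```python
import numpy as np

# u0, v0 come from the description; FX and FY are assumed placeholders standing
# in for the calibrated f/dx and f/dy of formula (1), which is not reproduced
# in the published text.
U0, V0 = 1544.0, 1032.0
FX, FY = 2500.0, 2500.0          # assumed focal lengths, in pixels

K = np.array([[FX, 0.0, U0],
              [0.0, FY, V0],
              [0.0, 0.0, 1.0]])  # intrinsic matrix in the standard pinhole form


def pixel_to_camera(u, v, z_c):
    """Map a pixel (u, v) to camera-frame coordinates (Xc, Yc, Zc), given the
    working distance z_c, using Zc * [u, v, 1]^T = K * [Xc, Yc, Zc]^T."""
    return z_c * np.linalg.solve(K, np.array([u, v, 1.0]))


if __name__ == "__main__":
    # a target imaged at pixel (1700, 900), five metres from the lens centre
    print(pixel_to_camera(1700.0, 900.0, 5.0))
```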
In this embodiment, the coordinate values of the moved target center point in the image pixel coordinate system are obtained using an existing three-frame difference algorithm: the first frame is differenced with the second frame, the second frame is differenced with the third frame, and the two difference results are combined with an OR operation; the resulting image is the three-frame difference result. Because two consecutive differences are taken, the hole effect in the result is smaller than with a single frame difference.
To further reduce the influence of the hole effect and to highlight the moving target in the frame-difference result, image enhancement operations such as the morphological closing operation and median filtering are also performed in this embodiment.
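A minimal OpenCV sketch of the three-frame differencing and the morphological post-processing described above. The binarization threshold, structuring-element size and median-filter aperture are assumptions of the sketch, not values given in the patent.

```python
import cv2


def detect_center(frame1, frame2, frame3, thresh=25):
    """Three-frame difference: OR the two inter-frame differences, close holes,
    median-filter the mask, then return the centroid (u, v) of the moving region."""
    g1, g2, g3 = (cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in (frame1, frame2, frame3))
    d12 = cv2.absdiff(g1, g2)
    d23 = cv2.absdiff(g2, g3)
    _, b12 = cv2.threshold(d12, thresh, 255, cv2.THRESH_BINARY)
    _, b23 = cv2.threshold(d23, thresh, 255, cv2.THRESH_BINARY)
    motion = cv2.bitwise_or(b12, b23)                            # OR of the two differences

    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (7, 7))
    motion = cv2.morphologyEx(motion, cv2.MORPH_CLOSE, kernel)   # closing: fill holes
    motion = cv2.medianBlur(motion, 5)                           # suppress speckle noise

    m = cv2.moments(motion, binaryImage=True)
    if m["m00"] == 0:
        return None                                              # no motion detected
    return m["m10"] / m["m00"], m["m01"] / m["m00"]              # centre point (u, v)
```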
Step 2: solving a transformation matrix according to the coordinates of the central point of the target after the movement in a camera coordinate system
Figure BDA0002559982700000101
The transformation matrix changes the pose of the current camera coordinate system into a target camera coordinate system, so that the moving target center point falls on a set coordinate axis of the changed target camera coordinate system; this coordinate axis is chosen to be the axis along the camera lens. The coordinates of the origin of the target camera coordinate system in the current camera coordinate system are then solved according to the transformation matrix.
The coordinates of the moved target center point in the current camera coordinate system are (Xc, Yc, Zc), and its coordinates in the target camera coordinate system are (0, 0, Zc); the transformation matrix between the current camera coordinate system and the target camera coordinate system is
Figure BDA0002559982700000102
Namely:
Figure BDA0002559982700000103
wherein:
Figure BDA0002559982700000104
in the formula:
Figure BDA0002559982700000105
-coordinates of the origin of the target camera coordinate system in the current camera coordinate system
R-rotation matrix representing rotational movement of camera coordinate system
In one embodiment of the present application, in order to reduce the amount of calculation, the pitch motor is set to be inactive; that is, when the current camera coordinate system is changed to the target camera coordinate system, the camera coordinate system first rotates about X4 by θ4 and then performs the movement.
Figure BDA0002559982700000111
from which the following can be solved:
Figure BDA0002559982700000112
where Xc, Yc and Zc are the coordinate values of the moved target center point in the current camera coordinate system, obtained according to step 1, and θ4 is the difference between the absolute yaw angle θ1 of the next pan-tilt attitude and the absolute yaw angle θ1' of the current attitude; during the solution it can be expressed through θ1 as:
θ4 = θ1 - θ1'
In this embodiment, the absolute yaw angle is the rotation angle of the yaw motor relative to its original state after the yaw motor has rotated to a given attitude.
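The matrices used in this step are shown only as images in the published text, so the sketch below merely illustrates the underlying geometry: in the imaging convention of equations (5)-(8), where Zc lies along the lens axis, the angular corrections that bring the target onto the optical axis follow from simple arctangents. How these corrections map onto θ4 and the individual joints depends on the frame assignments of Table 1, so this is an illustration of the idea rather than the patent's exact formulation.

```python
import math


def axis_alignment_angles(xc, yc, zc):
    """Angular corrections, about the camera's y- and x-axes respectively, that
    rotate the optical axis (the Zc axis) onto the line of sight to the target at
    (xc, yc, zc). Signs depend on the chosen positive rotation directions."""
    yaw_correction = math.atan2(xc, zc)                       # lateral offset
    pitch_correction = math.atan2(yc, math.hypot(xc, zc))     # vertical offset
    return yaw_correction, pitch_correction


# Example: a target 0.4 m to the side of and 0.1 m off the optical axis, 5 m away
print(axis_alignment_angles(0.4, 0.1, 5.0))
```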
Step 3: solving the coordinates of the origin of the target camera coordinate system in the base coordinate system according to the coordinate system transformation matrices between adjacent joint coordinate systems and the coordinate system transformation matrix between the base coordinate system and its adjacent joint coordinate system.
Figure BDA0002559982700000113
Wherein,
Figure BDA0002559982700000114
Figure BDA0002559982700000115
is the coordinate system transformation matrix from the base coordinate system to the first joint coordinate system; specifically, under the joint coordinate system setup of this embodiment, it first rotates counterclockwise about the Z0 axis by θ1 and then translates along the Z0 axis by d1.
Figure BDA0002559982700000116
T12 is the coordinate system transformation matrix from the first joint coordinate system to the second joint coordinate system; specifically, under the joint coordinate system setup of this embodiment, it first rotates counterclockwise about the x1 axis by α1, then translates along the x1 axis by a2, rotates about the z1 axis by θ2, translates along the y1 axis by b2, and finally translates along the z1 axis by d2.
Figure BDA0002559982700000121
Figure BDA0002559982700000122
The coordinate system transformation matrix from the second joint coordinate system to the third joint coordinate system is the same as the transformation from the first joint coordinate system to the second joint coordinate system in the case of the joint coordinate system setup according to the present embodiment, and the description will not be repeated here.
Figure BDA0002559982700000123
Figure BDA0002559982700000124
is the coordinate system transformation matrix from the third joint coordinate system to the camera coordinate system; specifically, under the joint coordinate system setup of this embodiment, it first rotates counterclockwise about the x3 axis by α3, then translates along the x3 axis by a4, along the y3 axis by b4, and finally along the z3 axis by d4.
Let
Figure BDA0002559982700000125
be (xw, yw, zw).
Substituting formulas (14), (15), (16) and (17) into formula (13) gives (xw, yw, zw). In formulas (14), (15), (16) and (17), θ1, θ2 and θ3 are the pose information reflected by the current pan-tilt attitude and are known quantities, and the other parameters are also known quantities, so that the following can be obtained:
Figure BDA0002559982700000126
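A Python sketch of step 3 under one plausible reading of the transform construction described above: each link transform is a rotation about x by α, a translation a along x, a rotation about z by θ, then translations b along y and d along z. Table 1 itself is only available as an image, so the constants below (and the function and parameter names) are placeholders of this sketch, not the patent's values.

```python
import numpy as np


def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]], dtype=float)


def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]], dtype=float)


def trans(x=0.0, y=0.0, z=0.0):
    T = np.eye(4)
    T[:3, 3] = [x, y, z]
    return T


def link_transform(alpha, a, theta, b, d):
    """Adjacent-frame transform read from the descriptions of formulas (14)-(17):
    Rot_x(alpha) * Trans_x(a) * Rot_z(theta) * Trans_y(b) * Trans_z(d)."""
    return rot_x(alpha) @ trans(x=a) @ rot_z(theta) @ trans(y=b) @ trans(z=d)


def camera_origin_in_base(thetas, params):
    """Chain T01*T12*T23*T34 and return the camera-frame origin (xw, yw, zw) in
    the base frame; `params` holds the constant (alpha, a, b, d) of each link and
    `thetas` the joint angles (the fourth link has no joint rotation, so theta=0)."""
    T = np.eye(4)
    for (alpha, a, b, d), theta in zip(params, thetas):
        T = T @ link_transform(alpha, a, theta, b, d)
    return T[:3, 3]


# Placeholder (alpha, a, b, d) constants per link; the real values are in Table 1,
# which the published text shows only as an image.
PARAMS = [(0.0, 0.0, 0.0, 0.10),
          (np.pi / 2, 0.05, 0.02, 0.03),
          (np.pi / 2, 0.05, 0.02, 0.03),
          (np.pi / 2, 0.0, 0.02, 0.02)]

print(camera_origin_in_base([0.1, 0.2, 0.0, 0.0], PARAMS))
```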
Step 4: according to the obtained
Figure BDA0002559982700000127
the pose parameters θ1, θ2 and θ3 in the adjacent-joint coordinate system transformation matrices are solved by a kinematic inverse analysis algorithm, and the rotation angles of the yaw motor, the roll motor and the pitch motor are obtained.
In particular,
Figure BDA0002559982700000128
and formulas (14), (15), (16) and (17) are substituted into formula (18), where θ1, θ2 and θ3 in formulas (14), (15), (16) and (17) are the unknown quantities, i.e. the angles to be solved.
For convenience of solution, the algebraic expressions f(θ3) and g(θ2, θ3) are defined in advance. The variables g1(θ2, θ3), g2(θ2, θ3) and g3(θ2, θ3) are denoted by g1, g2 and g3 respectively, and f1(θ3), f2(θ3) and f3(θ3) are denoted by f1, f2 and f3 respectively.
Figure BDA0002559982700000131
f1(θ3) = a2 + a3*cos(θ3) - d4*sin(θ3)    (20)
f2(θ3) = a3*sin(θ3)*cos(α2) + b3*cos(α2) - b4*sin(α2) - d3*sin(α2) + d4*cos(α2)*cos(θ3)    (21)
f3(θ3) = a3*sin(α2)*sin(θ3) + b3*sin(α2) + b4*cos(α2) + d3*cos(α2) + d4*sin(α2)*cos(θ3)    (22)
g1(θ2, θ3) = a1 + f1(θ3)*cos(θ2) - f2(θ3)*sin(θ2)    (23)
g2(θ2, θ3) = b2*cos(α1) - d2*sin(α1) + f1(θ3)*sin(θ2)*cos(α1) - f3(θ3)*sin(α1) + f2(θ3)*cos(α1)*cos(θ2)    (24)
g3(θ2, θ3) = b2*sin(α1) + d2*cos(α1) + f1(θ3)*sin(α1)*sin(θ2) + f2(θ3)*cos(θ2)*sin(α1) + f3(θ3)*cos(α1)    (25)
(cos(θ1)*g1 - sin(θ1)*g2)^2 + (g2*cos(θ1) + sin(θ1)*g1)^2 = g1^2 + g2^2    (26)
Let the distance from the origin of the target camera coordinate system to the origin of the base coordinate system be r; combining with formula (26), the following is obtained:
r = xw^2 + yw^2 + zw^2 = g1^2 + g2^2 + (g3 + d1)^2
  = f1^2 + f2^2 + f3^2 + a1^2 + d1^2 + d2^2 + 2*d1*(sin(α1)*b2 + d2*cos(α1)) + 2*f3*(d1*cos(α1) + d2) + 2*a1*(f1*cos(θ2) - f2*sin(θ2)) + (2*b2 + 2*d1*sin(α1))*(f2*cos(θ2) + f1*sin(θ2))    (27)
setting a parameter k1
k1=f1 2+f2 2+f3 2+a1 2+d1 2+d2 2+2d1*(sin(α1)*b2+d2*cos(α1))+ 2f3*(d1*cos(α1)+d2) (28)
k1Containing only the parameter theta3
According to the matrix corresponding relation, the following steps are carried out:
xw=cos(θ1)*g12,θ3)-sin(θ1)*g22,θ3) (29)
yw=g22,θ3)*cos(θ1)+sin(θ1)*g12,θ3) (30)
zw=g32,θ3)+d1=b2*sin(α1)+d2*cos(α1)+sin(α1)*(f1*sin(θ2)+ +f2*cos(θ2))+f3*cos(α1)+d1(31)
setting a parameter k2
k2=b2*sin(α1)+d2*cos(α1)+f3*cos(α1)+d1(32)
Simultaneous equations (27) and (31) yields:
Figure BDA0002559982700000141
In the present embodiment, the pitch motor is set not to operate, i.e. θ3 is a known quantity and xw, yw and zw are expressed in terms of θ1 and θ2; equations (29) and (30) are therefore combined directly, and the following equation set is obtained:
Figure BDA0002559982700000142
the following is obtained by substituting the formula of half angle:
Figure BDA0002559982700000143
Figure BDA0002559982700000144
where u = tan(θ2/2) and v = tan(θ1/2).
θ1 and θ2 are calculated from this equation set, and the rotation angles of the yaw motor and the roll motor are then obtained.
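The closed-form half-angle route above needs the Table-1 constants, which are not reproduced in the text. As a complementary sketch, the same two unknowns can also be recovered numerically from the forward-kinematics position constraint, reusing camera_origin_in_base and PARAMS from the step-3 sketch (the block below runs as-is once those definitions are in scope). The use of scipy.optimize.fsolve, the initial guess, and the choice of the x- and y-component equations are assumptions of this sketch, not the patent's procedure.

```python
import numpy as np
from scipy.optimize import fsolve

# camera_origin_in_base(...) and PARAMS are the placeholder helpers defined in the
# forward-kinematics sketch after step 3; theta3 is held fixed here, matching the
# "pitch motor inactive" case of this embodiment.


def solve_yaw_roll(target_xyz, theta3, guess=(0.1, 0.1)):
    """Numerically recover theta1 (yaw) and theta2 (roll) so that the camera-frame
    origin reaches target_xyz = (xw, yw, zw), with theta3 (pitch) held fixed."""
    def residual(q):
        xw, yw, _ = camera_origin_in_base([q[0], q[1], theta3, 0.0], PARAMS)
        return [xw - target_xyz[0], yw - target_xyz[1]]       # equations (29)-(30)

    theta1, theta2 = fsolve(residual, np.asarray(guess))
    return theta1, theta2


# Example: drive the camera origin towards an (assumed) base-frame point
print(solve_yaw_roll((0.12, 0.05, 0.16), theta3=0.0))
```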
In another embodiment of the application, the yaw motor is set not to rotate, i.e. θ1 is known; when the current camera coordinate system moves to the target camera coordinate system, it first rotates about the Y4 axis by θ4 and then performs the movement.
At this time:
Figure BDA0002559982700000151
from which the following is calculated:
Figure BDA0002559982700000152
In the formula, θ4 is the difference between the absolute pitch angle θ3 of the next pan-tilt attitude and the absolute pitch angle θ3' of the current attitude; during the solution it can be expressed through θ3 as:
θ4 = θ3 - θ3'
in this embodiment, the absolute rotation angle of the pitch angle is a rotation angle value relative to an original state after the pitch motor rotates to a certain attitude.
In this case, xw, yw and zw all contain θ3. In the three cases below, θ3 can be solved using equation (34), (35) or (36); knowing θ1 and θ3, θ2 can then be solved from equation (30).
If a1 = 0:
r = k1 + (2*b2 + 2*d1*sin(α1))*(f2*cos(θ2) + f1*sin(θ2))
zw = k2 + sin(α1)*(f1*sin(θ2) + f2*cos(θ2))
from which the following can be obtained:
Figure BDA0002559982700000153
if sin (alpha)1)=0
zw=k2. (35)
If α is1And sin (alpha)1) Are not all 0:
Figure BDA0002559982700000154
According to equation (30), combining the solved θ3 with the known θ1, θ2 is solved, and the rotation angles of the pitch motor and the roll motor can then be obtained.
Example 2:
The embodiment discloses an intelligent moving target tracking and monitoring system for a marine ranch, comprising a pan-tilt and a camera. The pan-tilt comprises a first driving piece whose axis is vertical; the first driving piece is connected with one end of an L-shaped first swing arm, and a second driving piece is fixed at the other end of the first swing arm, with its axis arranged along a first direction in the horizontal plane. The second driving piece is connected with one end of an L-shaped second swing arm, and a third driving piece is fixed at the other end of the second swing arm, with its axis arranged along a second direction, in the horizontal plane and perpendicular to the first direction; the third driving piece is connected with the camera.
In this embodiment, the first driving piece is a yaw motor 1 whose output shaft axis is vertical. The motor housing of the yaw motor is fixed on a yaw motor base 9, and the yaw motor base is fixed by bolts on a base 10 serving as the foundation. Speed reducer supports 11 are arranged on both sides of the yaw motor and carry a speed reducer, which is a planetary reducer; the output shaft of the yaw motor serves as the input shaft of the planetary reducer, and the output shaft of the planetary reducer is fixedly connected with one end of the first swing arm, so that it can drive the first swing arm about the vertical axis. Since the planetary reducer is fixed on the speed reducer supports, it does not add load to the yaw motor.
The second driving piece adopts a roll motor, an output shaft of the roll motor is fixedly connected with the end part of the second swing arm, and the axis of the roll motor is arranged along the first direction in the horizontal plane.
The third driving piece is a pitch motor whose output shaft is connected with the camera; the output shaft of the pitch motor is arranged along a second direction, in the horizontal plane and perpendicular to the first direction.
The camera, the yaw motor, the roll motor and the pitch motor are all connected with a control system, and the control system receives images transmitted by the camera and controls the yaw motor, the roll motor and the pitch motor to work according to the method in the embodiment 1.
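Putting embodiments 1 and 2 together, the data flow of the control system is: capture an image, detect the moving target's center point, convert it into camera coordinates, and command the joints. The simplified loop below strings the earlier sketches together (detect_center, pixel_to_camera and axis_alignment_angles) and servos the yaw and pitch joints directly on the measured offset instead of going through the full base-frame inverse kinematics, so it illustrates the data flow rather than the patented method itself; cv2.VideoCapture and send_joint_angles stand in for the actual camera and motor interfaces, which the patent does not specify.

```python
import cv2


def send_joint_angles(yaw, roll, pitch):
    """Hypothetical placeholder for commanding the yaw, roll and pitch motors."""
    print(f"yaw={yaw:.3f}  roll={roll:.3f}  pitch={pitch:.3f}  (rad)")


def tracking_loop(camera_index=0, working_distance=5.0):
    """Capture -> detect -> convert -> correct -> command, repeated per frame."""
    yaw, roll, pitch = 0.0, 0.0, 0.0
    cap = cv2.VideoCapture(camera_index)
    frames = []
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        frames = (frames + [frame])[-3:]           # keep the last three frames
        if len(frames) < 3:
            continue
        center = detect_center(*frames)            # three-frame difference sketch
        if center is None:
            continue
        xc, yc, zc = pixel_to_camera(*center, working_distance)   # step-1 sketch
        d_yaw, d_pitch = axis_alignment_angles(xc, yc, zc)         # alignment sketch
        yaw, pitch = yaw + d_yaw, pitch + d_pitch
        send_joint_angles(yaw, roll, pitch)
    cap.release()
```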
Example 3:
the embodiment provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the program, the method for tracking and monitoring an intelligent moving target in a marine ranch according to embodiment 1 is implemented.
Example 4:
the present embodiment provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the method for tracking and monitoring an intelligent moving object in a marine ranch as described in embodiment 1.
"computer-readable storage medium" shall be taken to include a single medium or multiple media containing one or more sets of instructions; it should also be understood to include any medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor and that cause the processor to perform any one of the methods of the present invention.
It will be appreciated by those skilled in the art that the steps of the invention described above may be implemented using general purpose computer means, or alternatively they may be implemented using program code executable by computing means, whereby the steps may be stored in memory means for execution by the computing means, or may be implemented as separate integrated circuit modules, or may have a plurality of modules or steps implemented as a single integrated circuit module. The present invention is not limited to any specific combination of hardware and software.
Although the embodiments of the present invention have been described with reference to the accompanying drawings, they are not intended to limit the scope of the present invention, and it should be understood that various modifications and variations can be made by those skilled in the art without inventive effort.

Claims (10)

1. An intelligent moving target tracking and monitoring method for marine ranching, which controls the actions of a pan-tilt that has a plurality of joints and is connected with a camera so as to track and detect a moving target, characterized by comprising:
solving a transformation matrix according to the coordinates of the center point of the target after movement in the current camera coordinate system, wherein the transformation matrix can enable the current camera coordinate system to carry out pose change to obtain a target camera coordinate system, and the center point of the target after movement is located on a coordinate axis of the target camera coordinate system along the lens axis;
calculating the pose information of the origin of the target camera coordinate system in the base coordinate system according to the transformation matrix;
and calculating the action information of each joint of the holder according to the obtained pose information.
2. The method according to claim 1, wherein the method comprises receiving an image of the moving target collected by the camera, obtaining coordinate values of the center point of the moving target in the image pixel coordinate system, and obtaining the coordinate values of the center point of the moving target in the current camera coordinate system according to the coordinate values of the center point of the moving target in the image pixel coordinate system.
3. The method according to claim 2, wherein the origin of the image pixel coordinate system is an angular point of the image, the coordinate values of the moved target center point in the image physical coordinate system are obtained according to the coordinate values of the moved target center point in the image pixel coordinate system, the origin of the image physical coordinate system is a center point of the image, and the coordinate values of the moved target center point in the current camera coordinate system are obtained according to the coordinate values of the moved target center point in the image physical coordinate system.
4. The method according to claim 2, wherein a coordinate value of the center point of the moving target in the image pixel coordinate system is obtained by using a three-frame difference algorithm.
5. The method for tracking and monitoring the intelligent moving target in the marine ranch as claimed in claim 1, wherein the pose information of the origin of the coordinate system of the target camera in the current coordinate system of the camera is obtained according to the transformation matrix, and the pose information of the origin of the coordinate system of the target camera in the base coordinate system is obtained through calculation according to the transformation matrix of the coordinate system between the coordinate systems of the adjacent joints of the holder, wherein the pose parameters in the transformation matrix of the coordinate system are the pose parameters of each joint in the current posture of the holder.
6. The method for tracking and monitoring the intelligent moving target in the marine ranching as claimed in claim 5, wherein the pan-tilt is a three-joint pan-tilt, the joint of the pan-tilt connected with the camera is set to be not moved, and the pose parameters in the coordinate system transformation matrix are reversely solved according to the pose information of the origin of the coordinate system of the target camera in the base coordinate system, so as to obtain the motion information of other joints of the pan-tilt.
7. The method for tracking and monitoring the intelligent moving target in the marine ranching as claimed in claim 5, wherein the pan-tilt is a three-joint pan-tilt, the joints of the pan-tilt for connecting with an external foundation are set to be non-moving, and the pose parameters in the coordinate system transformation matrix are reversely solved according to the pose information of the origin of the target camera coordinate system in the base coordinate system, so as to obtain the motion information of other joints of the pan-tilt.
8. An intelligent moving target tracking and monitoring system for marine ranching, characterized by comprising a pan-tilt and a camera, wherein the pan-tilt comprises a first driving piece whose axis is arranged vertically; the first driving piece is connected with one end of an L-shaped first swing arm; a second driving piece is fixed at the other end of the first swing arm, and the axis of the second driving piece is arranged along a first direction in the horizontal plane; the second driving piece is connected with one end of an L-shaped second swing arm; a third driving piece is fixed at the other end of the second swing arm, and the axis of the third driving piece is arranged along a second direction, in the horizontal plane and perpendicular to the first direction; and the third driving piece is connected with the camera.
9. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor when executing the program implements the method for intelligent moving object tracking and monitoring of marine ranch as claimed in any of claims 1-7.
10. A computer-readable storage medium, on which a computer program is stored, wherein the program, when executed by a processor, implements the method for tracking and monitoring an intelligent moving object in a marine ranch.
CN202010603465.1A 2020-06-29 2020-06-29 Intelligent moving target tracking and monitoring method and system for ocean pasture Active CN111798514B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010603465.1A CN111798514B (en) 2020-06-29 2020-06-29 Intelligent moving target tracking and monitoring method and system for ocean pasture

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010603465.1A CN111798514B (en) 2020-06-29 2020-06-29 Intelligent moving target tracking and monitoring method and system for ocean pasture

Publications (2)

Publication Number Publication Date
CN111798514A true CN111798514A (en) 2020-10-20
CN111798514B CN111798514B (en) 2024-08-20

Family

ID=72803929

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010603465.1A Active CN111798514B (en) 2020-06-29 2020-06-29 Intelligent moving target tracking and monitoring method and system for ocean pasture

Country Status (1)

Country Link
CN (1) CN111798514B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113506340A (en) * 2021-06-15 2021-10-15 浙江大华技术股份有限公司 Method and equipment for predicting cloud deck pose and computer readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105242689A (en) * 2015-09-23 2016-01-13 浙江大学 Holder tracking visual system based on optical reflection
CN105321174A (en) * 2015-09-23 2016-02-10 浙江大学 Multi-plane-mirror reflection tracking pan-tilt system calibration method
CN105353772A (en) * 2015-11-16 2016-02-24 中国航天时代电子公司 Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking
CN105676865A (en) * 2016-04-12 2016-06-15 北京博瑞爱飞科技发展有限公司 Target tracking method, device and system
CN107992099A (en) * 2017-12-13 2018-05-04 福州大学 A kind of target sport video tracking and system based on improvement frame difference method
CN108958296A (en) * 2018-06-05 2018-12-07 西安工业大学 A kind of unmanned plane method for autonomous tracking
CN109933096A (en) * 2019-03-15 2019-06-25 山东鲁能智能技术有限公司 A kind of holder method of servo-controlling and system
CN109992014A (en) * 2019-04-24 2019-07-09 哈尔滨理工大学 The vision guided navigation cradle head device of climbing robot and control under a kind of slope pavement
US20190371003A1 (en) * 2018-05-30 2019-12-05 Baidu Online Network Technology (Beijing) Co., Ltd . Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105242689A (en) * 2015-09-23 2016-01-13 浙江大学 Holder tracking visual system based on optical reflection
CN105321174A (en) * 2015-09-23 2016-02-10 浙江大学 Multi-plane-mirror reflection tracking pan-tilt system calibration method
CN105353772A (en) * 2015-11-16 2016-02-24 中国航天时代电子公司 Visual servo control method for unmanned aerial vehicle maneuvering target locating and tracking
CN105676865A (en) * 2016-04-12 2016-06-15 北京博瑞爱飞科技发展有限公司 Target tracking method, device and system
WO2017177542A1 (en) * 2016-04-12 2017-10-19 高鹏 Object tracking method, device and system
CN107992099A (en) * 2017-12-13 2018-05-04 福州大学 A kind of target sport video tracking and system based on improvement frame difference method
US20190371003A1 (en) * 2018-05-30 2019-12-05 Baidu Online Network Technology (Beijing) Co., Ltd . Monocular vision tracking method, apparatus and non-volatile computer-readable storage medium
CN108958296A (en) * 2018-06-05 2018-12-07 西安工业大学 A kind of unmanned plane method for autonomous tracking
CN109933096A (en) * 2019-03-15 2019-06-25 山东鲁能智能技术有限公司 A kind of holder method of servo-controlling and system
CN109992014A (en) * 2019-04-24 2019-07-09 哈尔滨理工大学 The vision guided navigation cradle head device of climbing robot and control under a kind of slope pavement

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
RITA CUCCHIARA et al.: "Advanced video surveillance with Pan Tilt Zoom Cameras", 《SEMANTIC SCHOLAR》, 31 December 2006 (2006-12-31), pages 1 - 8 *
冯健业 et al.: "Design of an intelligent moving target tracking and monitoring system based on a pan-tilt", 《韶关学院学报·自然科学》, vol. 39, no. 9, 30 September 2018 (2018-09-30), pages 52 - 56 *
卢国梁 et al.: "Displacement measurement method for a micro-motion platform based on computer microscopic vision", 《传感器与微系统》, vol. 38, no. 5, 31 December 2019 (2019-12-31), pages 17 - 19 *
辛哲奎 et al.: "Adaptive tracking control of the airborne pan-tilt in a small UAV ground target tracking system", 《控制理论与应用》, vol. 27, no. 8, 31 August 2010 (2010-08-31), pages 1001 - 1006 *

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113506340A (en) * 2021-06-15 2021-10-15 浙江大华技术股份有限公司 Method and equipment for predicting cloud deck pose and computer readable storage medium

Also Published As

Publication number Publication date
CN111798514B (en) 2024-08-20

Similar Documents

Publication Publication Date Title
CN111801198B (en) Hand-eye calibration method, system and computer storage medium
CN110480634B (en) Arm guide motion control method for mechanical arm motion control
CN100373394C (en) Petoscope based on bionic oculus and method thereof
CN106407882A (en) Method and apparatus for realizing head rotation of robot by face detection
CN103971375B (en) A kind of panorama based on image mosaic stares camera space scaling method
CN102842117B (en) Method for correcting kinematic errors in microscopic vision system
CN105652872B (en) The automatic method for tracking and positioning of substation's laser navigation crusing robot intelligent console
CN104835117A (en) Spherical panorama generating method based on overlapping way
WO2020063058A1 (en) Calibration method for multi-degree-of-freedom movable vision system
CN109840508A (en) One robot vision control method searched for automatically based on the depth network architecture, equipment and storage medium
CN109442171A (en) A kind of single eye stereo vision system and its application method
CN116872216B (en) Robot vision servo operation method based on finite time control
CN110954555A (en) WDT 3D vision detection system
CN111798514A (en) Intelligent moving target tracking and monitoring method and system for marine ranching
CN207326366U (en) A kind of narrow space bolt location and installation machine people
CN115205286A (en) Mechanical arm bolt identification and positioning method for tower-climbing robot, storage medium and terminal
CN110060295A (en) Object localization method and device, control device follow equipment and storage medium
Gao et al. Kinect-based motion recognition tracking robotic arm platform
CN102263893A (en) Multi-axis linear motor-driven bionic imaging platform
CN209648746U (en) A kind of vision robot's remote monitoring system
CN104751455A (en) Crop image dense matching method and system
CN206741179U (en) A kind of industrial camera pose adjusting apparatus
CN106990647A (en) A kind of industrial camera pose adjusting apparatus
Yao et al. Camshift algorithm-based moving target recognition and tracking system
CN111832510A (en) Method and system for intelligently finding pole tower

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Country or region after: China

Address after: 276801 No.177, Gaoxin 6th Road, Rizhao City, Shandong Province

Applicant after: Shandong University Rizhao Research Institute

Address before: 276801 No.177, Gaoxin 6th Road, Rizhao City, Shandong Province

Applicant before: Rizhao Institute of Intelligent Manufacturing, Shandong University

Country or region before: China

GR01 Patent grant
GR01 Patent grant