Article

Binocular-Vision-Based Obstacle Avoidance Design and Experiments Verification for Underwater Quadrocopter Vehicle

Meiyan Zhang, Wenyu Cai, Qinan Xie and Shenyang Xu
1 College of Electrical Engineering, Zhejiang University of Water Resources and Electric Power, Hangzhou 310018, China
2 College of Electronics and Information, Hangzhou Dianzi University, Hangzhou 310018, China
3 CETC (Ningbo) Marine Electronics Research Institute, Jiaxing 314000, China
4 The 36th Research Institute of China Electronics Technology Group Corporation, Jiaxing 314000, China
* Author to whom correspondence should be addressed.
J. Mar. Sci. Eng. 2022, 10(8), 1050; https://doi.org/10.3390/jmse10081050
Submission received: 13 June 2022 / Revised: 22 July 2022 / Accepted: 27 July 2022 / Published: 30 July 2022
(This article belongs to the Special Issue Advances in Marine Vehicles, Automation and Robotics)

Abstract

For autonomous robots working in complex underwater regions, obstacle avoidance plays an important role in accomplishing underwater tasks. In this paper, a binocular-vision-based underwater obstacle avoidance mechanism is discussed and verified with our self-made Underwater Quadrocopter Vehicle (UQV for short). The UQV, which operates like a quadrocopter drone underwater, is a new kind of Autonomous Underwater Vehicle (AUV): four propellers arranged along the vertical direction of the robotic body adjust its posture, and two propellers mounted on the sides of the body provide propulsive and turning force. Moreover, an underwater binocular-vision-based obstacle positioning method is studied to measure an underwater spherical obstacle’s radius and its distance from the UQV. Owing to its full freedom of underwater motion, the proposed UQV has obvious advantages over existing torpedo-shaped AUVs, such as a zero turning radius. Therefore, a semicircle-curve-based obstacle avoidance path is planned on the basis of an obstacle’s coordinates. Practical pool experiments show that the proposed binocular vision system can locate an underwater obstacle accurately and that the designed UQV can effectively avoid multiple obstacles along the predefined trajectory.

1. Introduction

The ocean is rich in medical, mineral and marine living resources. To facilitate its investigation, a large variety of underwater robots such as the Autonomous Underwater Vehicle (AUV) have been designed [1,2]. The AUV is an intelligent marine robot with strong autonomy and high maneuverability, so it has gradually become an important platform for performing various underwater tasks. Nowadays, AUVs have a wide range of applications in the ocean, such as sea rescue, underwater search, pipeline inspection and other fields [3,4]. Since conventional AUVs have torpedo-shaped hulls, they suffer from inherent drawbacks such as underactuation and a large turning radius. Many researchers have therefore devoted themselves to new types of underwater vehicles, and several underwater robots with new architectures have been developed recently [5,6,7]. A new type of Autonomous Underwater Helicopter (AUH), which has a unique dish shape and can realize the “four arbitrary” functions of full turning, fixed hovering, precise landing and free landing, was designed by Zhejiang University [5,6]. In addition, a new type of underwater snake robot, which mimics the motion of eels, was used to test a set-based guidance strategy for path-following with obstacle avoidance [7]. New multilink gliding robotic fish can swim flexibly and glide energy-efficiently in three dimensions [8,9]. In summary, underwater vehicles with new structures can make up for the defects of torpedo-shaped AUVs. In this paper, we design a new-style AUV, named the underwater quadrocopter vehicle, to verify the proposed obstacle avoidance mechanism.
Due to complex and changeable underwater environments, autonomous underwater robots must be able to avoid underwater obstacles such as cliffs, wrecks and seafloor fluctuations [10]. The primary goal of obstacle avoidance is to maintain robot safety in a mostly unknown environment during fully autonomous operation [11]. Increasing the autonomy of AUVs has the potential to make underwater operations safer, more efficient, more cost-effective and more environmentally friendly. Unfortunately, although there are many studies on obstacle avoidance strategies for unmanned aerial vehicles, obstacle avoidance research on autonomous underwater robots is still neither common nor mature. At present, underwater obstacle detection mainly depends on sonar and underwater vision. Due to the special working environment, sonar is traditionally the primary sensor for underwater robots [12,13,14]. Underwater sonar can provide low-resolution images of underwater obstacles to estimate an obstacle’s position and movement trend, but such estimates are inaccurate.
Petillot et al. [12] propose a novel obstacle avoidance and path planning framework for underwater vehicles based on multi-beam forward looking sonar. Braginsky et al. [13] provide extensive simulation and experimental results to demonstrate that the proposed three horizontal and one vertical obstacle avoidance approaches enable AUVs to navigate safely through obstacles. Belcher et al. [14] use a Dual-Frequency Identification Sonar (DIDSON) as an obstacle avoidance sonar to perform autonomous obstacle identification.
On the other hand, there are some obstacle avoidance applications based on underwater machine vision [15,16,17,18]. A vision-based obstacle detection technique using optical flow is proposed for collision avoidance in autonomous quadrotor navigation in [15]. Barrett et al. [16] apply structured light laser imaging to underwater obstacle avoidance and navigation. Leonardi et al. [17] provide a proof of concept through a series of experiments investigating stereo vision for underwater obstacle avoidance and position estimation. Drews et al. [18] propose a new vision-based obstacle avoidance strategy using the Underwater Dark Channel Prior (UDCP) algorithm that can be applied to any underwater robot with a simple monocular camera and minimal on-board processing capabilities. Evans et al. [19] outline the sonar and video sensor processing techniques used for real-time control of the Intervention-AUV to perform tracking and 3D pose reconstruction. Xue et al. [20] propose a bio-inspired collision risk-assessment method based on stereo vision for ensuring safe USV operation.
Moreover, an underwater obstacle avoidance mechanism is often studied in tandem with path planning [21,22,23,24,25]. Path planning for ambulatory data gathering is discussed in [21], where an autonomous underwater vehicle traverses only location points in the constructed covering set with a hierarchical grid-based obstacle avoidance strategy. To solve the problem of autonomous obstacle avoidance in trajectory tracking, the model predictive control algorithm is applied to design an obstacle avoidance controller from the point of view of trajectory re-planning [22]. On the premise of realizing dynamic target obstacle avoidance, an adaptive generic model controller based on Radial Basis Function (RBF) neural networks is provided in [23]. A novel three-dimensional obstacle avoidance algorithm for an autonomous underwater robot with a sphere cross-section method is proposed in our previous work [24]. Moreover, we utilize ocean current characteristics simplified as a stream function to design an optimal 3D trajectory with an obstacle avoidance function [25]. In conclusion, these obstacle-avoidance-based path planning methods consider only path design based on obstacles whose locations are known in advance, but ignore how to obtain an obstacle’s shape and its distance from the underwater robot. It is worth emphasizing that, to the best of our knowledge, the location and shape of an underwater obstacle are assumed to be known in advance in almost all obstacle avoidance path design algorithms.
In this paper, we propose a new type of underwater robot, named the underwater quadrocopter vehicle, which is equipped with an underwater binocular vision system on its head, to verify the obstacle avoidance effect. Due to the impact of long-term water flow, convex underwater obstacles gradually become smooth and rounded, so it is assumed that most underwater obstacles can be enveloped by spheres of various radii. Herein, for the sake of simplicity, we only discuss application scenarios with spherical obstacles. Moreover, an underwater binocular-vision-based obstacle positioning algorithm is proposed to measure a spherical obstacle’s shape and its distance from the UQV. Subsequently, practical pool experiments verify the performance of the proposed binocular vision system. Finally, simple semicircle-curve-based trajectory planning for the UQV is introduced to avoid multiple spherical obstacles effectively.
The main contributions of this paper are:
(1)
A novel binocular-vision-based obstacle ranging and recognition method for underwater applications is proposed in this paper, and the radii of spherical obstacles and their distances from the vehicle are calculated and used to design the obstacle avoidance trajectory.
(2)
Different from existing methods, this paper not only considers an obstacle-avoidance-based path planning algorithm, but also studies an underwater obstacle recognition and processing method simultaneously.
(3)
This paper proposes a new type of underwater vehicle named the underwater quadrocopter vehicle, and its kinematic and dynamic models are investigated to verify its full freedom of movement and obstacle avoidance ability.
The rest of this paper is organized as follows. Section 2 describes the detailed kinematic and dynamic model of the proposed underwater quadrocopter vehicle. Section 3 presents the underwater binocular-vision-based obstacle positioning method in detail. Section 4 illustrates the spherical-obstacle-avoidance-based continuous path design method. Section 5 shows simulation results and actual pool tests of the proposed obstacle-avoidance-based path design. Section 6 concludes the paper.

2. The Underwater Quadrocopter Vehicle Model

Different from a traditional torpedo-shaped AUV with one thruster and several steering engines, the well-designed UQV is equipped with four rotors along the vertical direction of the robotic body and two propellers mounted on the sides of the body. The mechanical structure of our self-made UQV is described in Figure 1. Traditional torpedo-shaped AUVs often have inherent defects such as large turning radii and the inability to move vertically. In contrast, the proposed UQV has a much smaller radius of gyration and can even turn in place, i.e., with a zero turning radius. In conclusion, owing to its full freedom of underwater movement, the proposed UQV can perform cruise control, fixed-depth hovering, and vertical floating and diving.
The detailed hardware structure of our self-made UQV is illustrated in Figure 2. The hardware system is mainly composed of three parts: a core processor module, a perception module and a power supply module. The core processor module, built around a Jetson Nano board, is responsible for driving the six propellers in fully autonomous motion. The perception module takes charge of underwater data acquisition and analysis with binocular cameras, an inertial navigation system (INS), a depth sensor, an ultra-short baseline positioning system (USBL), differential GPS (DGPS), etc. Therefore, the proposed UQV can obtain its position in real time. The whole UQV is powered by a 24 V DC lithium battery. The six propellers are supplied by isolated power sources; among them, the two horizontal propellers mainly provide forward and turning force, and the four vertical propellers are controlled to maintain a specific posture for the UQV.
To design an obstacle avoidance path, the kinematic and dynamic model of the UQV with unknown dynamics and disturbances is discussed. Figure 3 illustrates the dynamic model of the UQV in the earth coordinate system (W-XYZ) and the body coordinate system (B-XYZ), respectively. The four rotors provide lift in the body coordinate system, directed vertically upward and parallel to the Z-axis. To counteract the reverse torque and gyroscopic effect, the four rotors are divided into two groups: the left-up and right-down rotors belong to one group, and the left-down and right-up rotors belong to the other. To ensure that the yaw angle does not change when hovering, the two groups of rotors have opposite rotation directions, i.e., the rotation directions within the same group are the same and those in different groups are opposite. This mechanical structure is the same as in a traditional unmanned aerial vehicle [26].
For the convenience of formula description, the main notations used in this paper are defined in Table 1.
With the symbols defined in Table 1, we have $[\dot{x}, \dot{y}, \dot{z}]^T = R_B^W [u, v, w]^T$ and $[\dot{\varphi}, \dot{\theta}, \dot{\psi}]^T = T_B^W [p, q, r]^T$, where the two rotation matrices $R_B^W$ and $T_B^W$ from the body coordinate system to the earth coordinate system can be expressed as:

$$R_B^W = \begin{bmatrix} \cos\psi\cos\theta & -\cos\varphi\sin\psi + \sin\varphi\cos\psi\sin\theta & \sin\varphi\sin\psi + \cos\varphi\cos\psi\sin\theta \\ \sin\psi\cos\theta & \cos\varphi\cos\psi + \sin\varphi\sin\psi\sin\theta & -\sin\varphi\cos\psi + \cos\varphi\sin\psi\sin\theta \\ -\sin\theta & \sin\varphi\cos\theta & \cos\varphi\cos\theta \end{bmatrix}$$

$$T_B^W = \begin{bmatrix} 1 & \sin\varphi\tan\theta & \cos\varphi\tan\theta \\ 0 & \cos\varphi & -\sin\varphi \\ 0 & \sin\varphi/\cos\theta & \cos\varphi/\cos\theta \end{bmatrix}$$
Let $V = [u, v, w, p, q, r]^T$ denote the velocity vector and $\eta = [x, y, z, \varphi, \theta, \psi]^T$ denote the state vector. The complete kinematic model of the proposed UQV is then expressed as

$$\dot{\eta} = J\,V$$

where $J = \begin{bmatrix} R_B^W & 0_{3\times 3} \\ 0_{3\times 3} & T_B^W \end{bmatrix}$.
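As an illustration of this kinematic model, the following minimal sketch (not the authors' code) assembles $R_B^W$, $T_B^W$ and $J$ and maps a body-frame velocity into the earth frame.

```python
# Minimal sketch: kinematic transform of the UQV, eta_dot = J(eta) V.
import numpy as np

def rotation_body_to_earth(phi, theta, psi):
    """R_B^W for the yaw-pitch-roll (ZYX) convention of the matrix above."""
    cphi, sphi = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cpsi, spsi = np.cos(psi), np.sin(psi)
    return np.array([
        [cpsi * cth, -cphi * spsi + sphi * cpsi * sth,  sphi * spsi + cphi * cpsi * sth],
        [spsi * cth,  cphi * cpsi + sphi * spsi * sth, -sphi * cpsi + cphi * spsi * sth],
        [-sth,        sphi * cth,                       cphi * cth],
    ])

def angular_transform(phi, theta):
    """T_B^W mapping body rates (p, q, r) to Euler-angle rates."""
    return np.array([
        [1.0, np.sin(phi) * np.tan(theta), np.cos(phi) * np.tan(theta)],
        [0.0, np.cos(phi),                -np.sin(phi)],
        [0.0, np.sin(phi) / np.cos(theta), np.cos(phi) / np.cos(theta)],
    ])

def kinematics(eta, V):
    """eta = [x, y, z, phi, theta, psi], V = [u, v, w, p, q, r]; returns eta_dot."""
    phi, theta, psi = eta[3:]
    J = np.zeros((6, 6))
    J[:3, :3] = rotation_body_to_earth(phi, theta, psi)
    J[3:, 3:] = angular_transform(phi, theta)
    return J @ V

# Example: pure surge velocity with a 30-degree yaw offset.
eta = np.array([0, 0, 0, 0.0, 0.0, np.deg2rad(30)])
V = np.array([1.0, 0, 0, 0, 0, 0])
print(kinematics(eta, V))  # earth-frame velocity [cos30, sin30, 0, 0, 0, 0]
```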
The essence of modeling the underwater quadrocopter vehicle is to apply Newton’s second law and Euler’s equations to analyze the forces acting on it. Therefore, the complete dynamic model of the proposed UQV is expressed as [25]

$$M\dot{V} + C(V)V + D(V)V + G(V) = \tau$$

where $M$ is the inertia coefficient matrix, $G(V)$ is the static restoring force, and $C(V)$ and $D(V)$ denote the Coriolis force–centripetal force matrix and the damping coefficient matrix, respectively.
For simplicity, the well-designed UQV’s compound movement can be mainly divided into two separated motion modes: translational motion and rotational motion. The following subsections will analyze the two motion modes in detail.

2.1. Translational Motion

For the translational motion of the underwater quadrocopter vehicle, the force equation of the UQV in the earth coordinate system can be obtained according to Newton’s second law, that is,

$$F_{all} = m \begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix}$$

where $m$ denotes the mass of the underwater quadrocopter vehicle and $a_x$, $a_y$ and $a_z$ denote the accelerations along the $X$/$Y$/$Z$-axis of the earth coordinate system, respectively.
The resultant force on the proposed UQV is mainly composed of four parts: its own gravity, the buoyancy from the water, the force generated by the two propellers and the lift provided by the four rotors. The forces $F_1$ and $F_2$ from the two horizontally placed propellers act in the $X$/$Y$ plane of the body coordinate system. The lifting forces $F_3$, $F_4$, $F_5$, $F_6$ are generated by the four rotors, and their directions are along the $Z$-axis of the body coordinate system ($B$-$XYZ$). The directions of gravity and buoyancy are along the $Z$-axis of the earth coordinate system ($W$-$XYZ$); hence, it is necessary to convert the forces in the body coordinate system into the earth coordinate system. Through the predefined rotation matrix $R_B^W$, we obtain

$$\begin{bmatrix} 0 \\ 0 \\ \rho g V - m g \end{bmatrix} + R_B^W \begin{bmatrix} f_1 \\ 0 \\ f_2 \end{bmatrix} = m \begin{bmatrix} a_x \\ a_y \\ a_z \end{bmatrix}$$

where $f_1 = F_1 + F_2$ and $f_2 = F_3 + F_4 + F_5 + F_6$. Therefore, Equation (4) can be expanded to

$$\begin{aligned} m a_x &= f_1 \cos\psi\cos\theta + f_2\left(\sin\varphi\sin\psi + \cos\varphi\cos\psi\sin\theta\right) \\ m a_y &= f_1 \sin\psi\cos\theta + f_2\left(-\sin\varphi\cos\psi + \cos\varphi\sin\psi\sin\theta\right) \\ m a_z &= -f_1 \sin\theta + f_2 \cos\varphi\cos\theta + \rho g V - m g \end{aligned}$$
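The following minimal sketch evaluates the expanded translational equations above. It reuses rotation_body_to_earth from the previous sketch and takes the mass from Table 2; the water density, displaced volume and thrust values are illustrative assumptions, not values from the paper.

```python
# Minimal sketch: earth-frame accelerations from gravity, buoyancy, propeller
# thrust f1 (along body X) and total rotor lift f2 (along body Z).
import numpy as np

def translational_acceleration(f1, f2, phi, theta, psi,
                               m=50.0, rho=1000.0, vol=0.05, g=9.81):
    """Return [ax, ay, az] in the earth frame; rho and vol are assumed values."""
    thrust_body = np.array([f1, 0.0, f2])
    R = rotation_body_to_earth(phi, theta, psi)           # from the kinematics sketch
    restoring = np.array([0.0, 0.0, rho * g * vol - m * g])  # buoyancy minus gravity
    return (restoring + R @ thrust_body) / m

# Level attitude with (assumed) neutral buoyancy: f1 drives surge, f2 drives heave.
print(translational_acceleration(f1=10.0, f2=5.0, phi=0.0, theta=0.0, psi=0.0))
```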

2.2. Rotational Motion

The rotational motion of the UQV is mainly caused by torque imbalance, which arises from the different forces generated by the four rotors. As Figure 4 shows, the rotating direction of each rotor is specified with respect to the given reference directions of the $X$-axis and $Y$-axis in the body coordinate system. The reference direction of the $Z$-axis points out of the $X$/$Y$ plane according to the right-hand rule. Consequently, the rotational motion can be analyzed as follows:
The torque calculation equation is defined as:
$$T = L \times F$$

where $L$ denotes the position vector from the axis of rotation to the point where the force is applied and $F$ denotes the force vector applied to the UQV. Through the torque formula, the following equations can be established in the body coordinate system:

$$\begin{aligned} T_x &= F_1 L_{1y} - F_2 L_{2y} - F_3 L_{3y} + F_4 L_{4y} \\ T_y &= -F_1 L_{1x} - F_2 L_{2x} + F_3 L_{3x} + F_4 L_{4x} \end{aligned}$$

where $F_1$ to $F_4$ here denote the lifting forces of rotors No.1 to No.4, and each $L_{ix}$ or $L_{iy}$ denotes the moment arm associated with the corresponding force.
The torque about the $Z$-axis is mainly generated by the imbalance of the reaction moments produced when the two groups of rotors rotate. A rotor rotating clockwise produces a reaction torque that makes the rigid body rotate counterclockwise, and vice versa. Therefore, in Figure 4, the reaction torque generated by the No.1 and No.3 rotors is along the positive $Z$-axis, and that generated by the No.2 and No.4 rotors is along the negative $Z$-axis.
Through the rigid body rotation theorem and the Euler equation, there is:

$$F = M\beta = M\dot{w} + w \times (M w)$$

where $F = [F_x, F_y, F_z]^T$ is the total external moment of force about the body axes, $M$ is the inertia matrix, $\beta$ is the angular acceleration, $w = [p, q, r]^T$, and

$$M = \begin{bmatrix} M_{xx} & M_{xy} & M_{xz} \\ M_{yx} & M_{yy} & M_{yz} \\ M_{zx} & M_{zy} & M_{zz} \end{bmatrix}$$
The rotation equations under six degrees of freedom can be obtained as

$$\begin{aligned} F_x ={}& M_{xx}\dot{p} + (M_{zz} - M_{yy})qr + M_{xy}(pr - \dot{q}) - M_{yz}(q^2 - r^2) - M_{xz}(pq + \dot{r}) \\ &+ m\left[ y_G(\dot{w} - uq + vp) - z_G(\dot{v} - wp + ur) \right] \end{aligned}$$

$$\begin{aligned} F_y ={}& M_{yy}\dot{q} + (M_{xx} - M_{zz})rp - M_{xy}(qr + \dot{p}) + M_{yz}(pq - \dot{r}) + M_{xz}(p^2 - r^2) \\ &+ m\left[ z_G(\dot{u} - vr + wq) - x_G(\dot{w} - uq + vp) \right] \end{aligned}$$

$$\begin{aligned} F_z ={}& M_{zz}\dot{r} + (M_{yy} - M_{xx})pq - M_{xy}(p^2 - q^2) - M_{yz}(pr + \dot{q}) + M_{xz}(qr - \dot{p}) \\ &+ m\left[ x_G(\dot{v} - wp + ur) - y_G(\dot{u} - vr + wq) \right] \end{aligned}$$

where $r_G = [x_G, y_G, z_G]^T$ is the coordinate of the centre of gravity. Since the structure of the UQV is symmetrical, the centre of gravity is approximately at the body origin ($r_G \approx [0, 0, 0]^T$) and the inertia matrix can be rewritten as a diagonal matrix, i.e., $M_{ij} = 0$ for $i \neq j$. Hence, the above equations reduce to

$$\begin{aligned} F_x &= M_{xx}\dot{p} + qr(M_{zz} - M_{yy}) \\ F_y &= M_{yy}\dot{q} + rp(M_{xx} - M_{zz}) \\ F_z &= M_{zz}\dot{r} \end{aligned}$$
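As a small worked example, the reduced equations can be inverted to give the body-frame angular accelerations from the applied moments; the numerical values below are purely illustrative and not from the paper.

```python
# Minimal sketch: angular accelerations for a diagonal inertia matrix.
def angular_accelerations(Fx, Fy, Fz, p, q, r, Mxx, Myy, Mzz):
    """Return (p_dot, q_dot, r_dot) from the reduced Euler equations above."""
    p_dot = (Fx - q * r * (Mzz - Myy)) / Mxx
    q_dot = (Fy - r * p * (Mxx - Mzz)) / Myy
    r_dot = Fz / Mzz
    return p_dot, q_dot, r_dot

# Example with assumed inertia values (kg*m^2) and small body rates (rad/s).
print(angular_accelerations(Fx=0.2, Fy=0.0, Fz=0.1, p=0.0, q=0.1, r=0.05,
                            Mxx=2.0, Myy=2.0, Mzz=3.0))
```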
Let $X_u$, $Y_v$, $N_r$ denote the linear damping coefficients in surge, sway and yaw, respectively; $X_{\dot{u}}$, $Y_{\dot{v}}$, $N_{\dot{r}}$ denote the corresponding added-mass coefficients; and $X_{u|u|}$, $Y_{v|v|}$, $N_{r|r|}$ denote the corresponding quadratic damping coefficients. Then the inertia coefficient matrix $M$, the Coriolis force–centripetal force matrix $C(V)$ and the hydrodynamic damping matrix $D(V)$ can be derived as follows:

$$M_{xx} = m - X_{\dot{u}}, \quad M_{yy} = m - Y_{\dot{v}}, \quad M_{zz} = I_z - N_{\dot{r}}$$

where $I_z$ is the moment of inertia about the $Z$-axis.
$$C(V) = \begin{bmatrix} 0 & 0 & -M_{yy}v \\ 0 & 0 & M_{xx}u \\ M_{yy}v & -M_{xx}u & 0 \end{bmatrix}$$
$$D(V) = \begin{bmatrix} X_u + X_{u|u|}|u| & 0 & 0 \\ 0 & Y_v + Y_{v|v|}|v| & 0 \\ 0 & 0 & N_r + N_{r|r|}|r| \end{bmatrix}$$
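The following minimal sketch assembles $M$, $C(V)$ and $D(V)$ for the surge-sway-yaw model exactly as written above, using the coefficients of Table 2. The yaw moment of inertia $I_z$ is an assumed placeholder (it is not listed in the paper), and the sign conventions of the coefficients are taken as given in Table 2.

```python
# Minimal sketch: horizontal-plane matrices of the UQV dynamic model.
import numpy as np

m, Iz = 50.0, 10.0                       # Iz is an assumed placeholder value
Xud, Yvd, Nrd = 0.1, 10.0, 5.0           # added-mass coefficients (Table 2)
Xu, Yv, Nr = -1.5, -40.0, -8.9           # linear damping coefficients (Table 2)
Xuu, Yvv, Nrr = 8.0, 200.0, 15.0         # quadratic damping coefficients (Table 2)

Mxx, Myy, Mzz = m - Xud, m - Yvd, Iz - Nrd
M = np.diag([Mxx, Myy, Mzz])

def C(u, v):
    """Coriolis-centripetal matrix C(V) as reconstructed above (skew-symmetric)."""
    return np.array([[0.0, 0.0, -Myy * v],
                     [0.0, 0.0,  Mxx * u],
                     [Myy * v, -Mxx * u, 0.0]])

def D(u, v, r):
    """Diagonal hydrodynamic damping with linear and quadratic terms."""
    return np.diag([Xu + Xuu * abs(u), Yv + Yvv * abs(v), Nr + Nrr * abs(r)])

V = np.array([0.5, 0.1, 0.05])           # [u, v, r]
tau = C(*V[:2]) @ V + D(*V) @ V          # force/moment to hold V at zero acceleration
print(M, tau, sep="\n")
```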

3. Binocular-Vision-Based Underwater Obstacle Positioning

Binocular stereo vision uses two parallel cameras to capture an object of interest, calculates the disparity between the two similar images, and finally obtains depth from the correspondence between features. Nowadays, binocular stereo vision is widely used thanks to its simple equipment, low cost and limited need for human intervention [27,28]. Herein, the binocular-vision-based underwater obstacle positioning principle is described in Figure 5. Two parallel underwater cameras mounted on the head of the UQV capture imaging information ahead of the vehicle. Using the underwater binocular-vision-based obstacle positioning method, an underwater spherical obstacle’s size and its distance from the UQV can be calculated in a simple manner.
Traditionally, the binocular vision processing procedure is divided into three main stages. First, owing to manufacturing differences between the left and right cameras, the captured images are distorted, so the first important step is camera calibration using the derived distortion parameters. Second, a stereo matching algorithm is applied to calculate the parallax between the left and right images. To improve the accuracy of the disparity calculation in the binocular image matching process, the two cameras’ external parameters are applied to correct the target’s position deviation on the left and right imaging planes. Third, the two cameras’ internal parameters are used to correct the pixel deviation; consequently, the actual target position is recovered from the parallax of the target between the left and right images. Accordingly, the proposed binocular-vision-based underwater obstacle positioning method is divided into three steps.

3.1. Step 1: Binocular Camera Calibration

According to the pinhole imaging model, the target $P(x, y, z)$ is projected onto the imaging plane. For an ideal pinhole camera, as shown in Figure 6, the straight-line distance from the pinhole to the projection plane is defined as the focal length $f$. Ideally, the principal point and the projection center should lie on the same optical axis. However, due to limitations of the camera manufacturing process, they usually do not, so it is necessary to correct the cameras’ offset with a calibration method. Zhang’s calibration method [29] is applied in this section, using chessboard calibration images taken from different angles as shown in Figure 7. In this paper, 14 chessboard calibration images with different angles are collected for camera calibration. The chessboard images collected by the left and right cameras are modeled and analyzed in MATLAB to obtain the correction parameters. The spatial position relationship between the binocular camera and the calibrated chessboard, together with the distribution of the reprojection error of each image, is described in Figure 8. Differently colored points in the error diagram represent the errors of different chessboard corners. Figure 8 shows that the reprojection errors are mostly close to zero, indicating a good calibration. The average reprojection error over all images, indicated by the red line, is 0.032489 pixels; therefore, the camera calibration results are considered feasible.
Finally, the internal and external parameters of the used underwater cameras can be obtained following a traditional treatment process [27]. For the sake of space, these traditional processing methods are outside the scope of this article.
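As a reference for this calibration step, the following is a minimal sketch using OpenCV's implementation of Zhang's method rather than the MATLAB toolbox used in the paper; the file names, board geometry and square size are placeholder assumptions.

```python
# Minimal sketch: chessboard-based intrinsic and stereo extrinsic calibration.
import glob
import cv2
import numpy as np

BOARD = (9, 6)          # inner corners per row/column (assumed board layout)
SQUARE = 0.025          # square edge length in metres (assumed)

objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2) * SQUARE

obj_pts, left_pts, right_pts, img_size = [], [], [], None
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    gl = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    gr = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    okl, cl = cv2.findChessboardCorners(gl, BOARD)
    okr, cr = cv2.findChessboardCorners(gr, BOARD)
    if okl and okr:
        obj_pts.append(objp)
        left_pts.append(cl)
        right_pts.append(cr)
        img_size = gl.shape[::-1]

# Intrinsics of each camera, then the stereo extrinsics (R, T) between them.
_, Kl, dl, _, _ = cv2.calibrateCamera(obj_pts, left_pts, img_size, None, None)
_, Kr, dr, _, _ = cv2.calibrateCamera(obj_pts, right_pts, img_size, None, None)
rms, Kl, dl, Kr, dr, R, T, _, _ = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, Kl, dl, Kr, dr, img_size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("stereo reprojection error:", rms)
```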

3.2. Step 2: Underwater Binocular Matching

As we know, the imaging planes of the left and right cameras usually do not coincide in binocular vision, so it is difficult to locate the common region directly in the stereo matching algorithm. Therefore, it is necessary to apply the external correction parameters to align the epipolar lines. In Figure 9, taking the center of the imaging plane of the left camera as the origin of coordinate system $O_l$, the projection of the observed target $P$ is $P_l$. Similarly, taking the center of the imaging plane of the right camera as the origin of coordinate system $O_r$, the projection of the observed target $P$ is $P_r$. Following the relationship between pixel coordinates and physical coordinates, we have

$$P_l = R_l P + T_l, \qquad P_r = R_r P + T_r, \qquad P_l = R^T (P_r - T)$$

where $R$ and $T$ are the rotation matrix and the translation vector shown in Figure 9.
As a conventional method, the relationship between pixel coordinates and physical coordinates in the left and right imaging planes is described in Ref. [30].
So far, matching points can be found in the region where the two camera views overlap. Once the parallax between corresponding points in the two views is known, the target’s depth can be obtained using the triangle similarity principle. The well-known block-matching stereo algorithm with a small Sum of Absolute Differences (SAD) window [31] is applied to find the matching points between the two images, as Figure 10 shows. The block-matching algorithm is implemented in three steps. First, the binocular images are pre-processed to normalize brightness, reduce brightness distortion, eliminate noise and enhance texture. Second, the SAD window is used to search for the optimal match of each feature between the left image and the right image. Third, the best matching point is determined as the one that minimizes the SAD cost over the matching window. After block stereo matching, any conventional roundness extraction method can be used to identify the target; for the sake of space, we do not describe these traditional processing methods here. After these steps, if the obstacle is spherical, its radius $R$ can be obtained in the image pixel plane.
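A minimal sketch of the SAD block-matching step using OpenCV's StereoBM is given below. It assumes already rectified image pairs with placeholder file names; the matcher parameters and pixel focal length are illustrative values rather than the paper's, while the 6 cm baseline matches Table 2.

```python
# Minimal sketch: SAD block matching and disparity-to-depth conversion.
import cv2
import numpy as np

left = cv2.imread("left_rectified.png", cv2.IMREAD_GRAYSCALE)    # placeholder file
right = cv2.imread("right_rectified.png", cv2.IMREAD_GRAYSCALE)  # placeholder file

# numDisparities must be a multiple of 16; blockSize is the SAD window size.
bm = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = bm.compute(left, right).astype(np.float32) / 16.0    # fixed-point output

# With focal length f (in pixels, assumed) and baseline b (metres), depth = f*b/disparity.
f_px, baseline = 700.0, 0.06
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f_px * baseline / disparity[valid]
print("median depth of valid pixels (m):", np.median(depth[valid]))
```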

3.3. Step 3: Obstacle Depth Computing

In this paper, the two deployed cameras are identical, with focal length $f$, and they are mounted completely parallel along the UQV’s body. It is assumed that the baseline length between the left camera and the right camera is $b$, the coordinate of the target point is $P(x, y, z)$, and the target point’s imaging coordinates in the left and right cameras are $(x_l, y_l)$ and $(x_r, y_r)$, respectively. According to the triangle similarity principle between the blue triangle and the yellow triangle in Figure 11, we have

$$\frac{d}{f} = \frac{x}{x_l} = \frac{x - b}{x_r} = \frac{y}{y_l} = \frac{y}{y_r}$$

where $d$ is the distance between the UQV and the spherical obstacle, which is the key parameter for obstacle avoidance path planning. Hence, we can obtain the actual coordinates of the target point:

$$x = \frac{x_l\, b}{x_l - x_r}, \qquad y = \frac{y_l\, b}{x_l - x_r}, \qquad d = \frac{f\, b}{x_l - x_r}$$

Finally, the distance $d$ between the underwater obstacle and the UQV and the radius $R$ of the spherical obstacle can be obtained with the binocular-vision-based underwater obstacle positioning method. It is worth noting that $d > R$ in our application scenario.
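The following minimal sketch evaluates these triangulation formulas for one matched pixel pair; the baseline b = 6 cm is taken from Table 2, while the pixel focal length and pixel coordinates are illustrative assumptions.

```python
# Minimal sketch: recover (x, y, d) of the target from one matched pixel pair.
def triangulate(xl, yl, xr, b, f):
    """xl, yl, xr are pixel coordinates; b is baseline (m); f is focal length (pixels)."""
    disparity = xl - xr
    if disparity <= 0:
        raise ValueError("disparity must be positive for a point in front of the cameras")
    x = xl * b / disparity
    y = yl * b / disparity
    d = f * b / disparity
    return x, y, d

# Example: 40-pixel disparity, 6 cm baseline, 700-pixel focal length -> d about 1.05 m.
print(triangulate(xl=120.0, yl=35.0, xr=80.0, b=0.06, f=700.0))
```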

4. Obstacle-Avoidance-Based Continuous Path Design

So far, the relative distance $d$ between the obstacle and the UQV and the radius of the spherical obstacle can be determined with the binocular visual positioning method introduced above. In this section, we discuss how to design an obstacle-free trajectory.
The detailed framework of obstacle avoidance path planning is described in Figure 12. It can be divided into four main parts: binocular visual positioning, the obstacle avoidance coordinate framework, obstacle avoidance path planning and obstacle avoidance path tracking. The binocular visual positioning module is used to obtain the obstacle’s location, as demonstrated in the previous sections.
The obstacle avoidance framework determines how to design the obstacle avoidance path given the obstacle information. The coordinate system constructed by the self-designed UQV and the spherical obstacle is shown in Figure 13. Although the proposed obstacle avoidance framework is three-dimensional, it is assumed that the UQV and the underwater obstacle are at the same depth, i.e., they have the same $Y$ coordinate. Hence, we only consider an obstacle-free trajectory in a two-dimensional plane.
Specifically, it is assumed that the center of the spherical obstacle $O$ is $(x_1, y_1, z_1)$ and its radius is $R$. The original heading of the UQV is $\theta_0$, and the front center of the UQV is $(x_0, y_0, z_0)$. The investigated obstacle avoidance path planning problem is to choose one smooth trajectory $\mathcal{L}$ such that each point $P_i(x_i, y_i, z_i, \theta_i)$, $i \in [0, 1]$, on the planned trajectory satisfies

$$\sqrt{(x_i - x_1)^2 + (y_i - y_1)^2 + (z_i - z_1)^2} > R$$

subject to the starting heading being approximately equal to the ending heading: $\theta_1 \approx \theta_0$.
The obstacle avoidance path planning is derived from the kinematic model of the UQV. To realize the obstacle avoidance mechanism in the embedded system, a semicircular path with radius $R + d$ is used in this paper. In Figure 13, the obstacle avoidance path is chosen as a semicircle with radius $R + d$, i.e., the starting point and the ending point are $(0, 0, 0)$ and $(0, 0, 2(R + d))$, respectively. The variable $d$ is the relative distance between the UQV and the spherical obstacle’s center, which has been determined by the binocular visual positioning method above. In actual execution, once an underwater obstacle is identified ahead, the above obstacle avoidance path calculation starts. Furthermore, it is worth mentioning that the obstacle avoidance trajectory is designed only in the $XOZ$ plane, and motion along the $Y$-axis is ignored.
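The following minimal sketch, which is not the authors' planner, samples such a semicircular avoidance path in the XOZ plane from the measured distance d and the obstacle radius R.

```python
# Minimal sketch: waypoints of the semicircular avoidance path in the XOZ plane.
import numpy as np

def semicircle_path(R, d, n=50):
    """Waypoints from (0, 0, 0) to (0, 0, 2*(R + d)) with lateral radius R + d."""
    r = R + d
    angles = np.linspace(-np.pi / 2, np.pi / 2, n)   # sweep half a circle
    x = r * np.cos(angles)                           # lateral offset, peaks at r
    z = r + r * np.sin(angles)                       # forward progress: 0 .. 2r
    y = np.zeros(n)                                  # depth held constant
    return np.stack([x, y, z], axis=1)

path = semicircle_path(R=2.0, d=3.0)
print(path[0], path[-1])   # approximately [0, 0, 0] -> [0, 0, 10] for R=2, d=3
```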
The last module, obstacle avoidance path tracking, is governed by the dynamic model of the UQV. Owing to its full freedom of movement, the introduced UQV can perform various maneuvers, such as turning by any angle in place, while following the planned trajectory. Therefore, any smooth curve can be used by the motion controller as long as the heading angle constraint of the obstacle avoidance model above is satisfied.

5. Model Verification and Experimental Results

5.1. Motion Model Simulation

In this section, trajectory tracking along a special path in a three-dimensional underwater region is carried out to verify the proposed dynamic and kinematic models. The parameters used are listed in Table 2; some important parameters are taken from Ref. [32].
The spiral path simulation results are illustrated in Figure 14, where the yellow curve denotes the designed spiral path in three-dimensional space and the green curve denotes the trajectory generated with the proposed UQV model. It is obvious that the path followed by the UQV is consistent with the designed spiral path. Figure 15 shows the projections of the curves onto the $X$-axis, $Y$-axis and $Z$-axis, respectively. Since the designed trajectory is a spiral, the $X$-axis and $Y$-axis projections are periodic curves, and the $Z$-axis projection is linear in time. Due to control delay, there is a small lag between the actual and ideal curves along the $X$-axis and $Y$-axis. In conclusion, these trajectory tracking results show that the difference between the actual path and the ideal path is quite small, which verifies the proposed dynamic and kinematic models of the UQV. Hence, this path-tracking performance is sufficient to realize any obstacle avoidance curve.

5.2. Hardware Implementation of the UQV

To verify the effect of the proposed obstacle avoidance mechanism, we developed an underwater quadrocopter vehicle ourselves. A photograph of the UQV is shown in Figure 16. Furthermore, a lightweight web server is embedded in the Jetson Nano processor (quad-core ARM CPU at 1.43 GHz) to perform underwater data acquisition and motion control. The web pages illustrated in Figure 17 provide specific functions such as manual operation, trajectory tracking and video surveillance. Due to the limitation of the thrusters used, the maximum speed of our UQV is only 1 knot. Therefore, a distance of a few meters ahead of an obstacle is enough to complete the obstacle position calculation.

5.3. Obstacle-Free Trajectory

In this section, the obstacle-free trajectory planned with our proposed method is verified. Figure 18 shows the obstacle-avoidance-based continuous path in three-dimensional and two-dimensional space, respectively. In this simulation, an obstacle with radius 2 is located at center coordinates (10, 10, 1.5), and the starting point of the UQV is (10, 0, 1.5). The initial operating distance between the spherical obstacle and the UQV is set to 3; hence, the radius of the planned semicircular path is 5. In Figure 18, the purple line and the blue line denote the planned trajectory and the actual trajectory, respectively. The designed curves along the $X$-axis and $Y$-axis are illustrated separately in Figure 19, and it is obvious that the actual curves are very close to the theoretical curves.

5.4. Experimental Results

To verify the aforementioned method, we conducted practical experiments in a pool at our college, which is shown in Figure 20. The measured rotation matrix is

$$R = \begin{bmatrix} 0.57675 & 0.69956 & 0.42186 \\ 0.75455 & 0.65408 & 0.05304 \\ 0.31304 & 0.28772 & 0.90511 \end{bmatrix}$$

and the measured translation vector is $T = [53.42609, 17.06029, 227.97609]^T$.
In our experimental platform, three spherical underwater obstacles with different radii and colors were deployed in the pool, as shown in Figure 21. Figure 22 presents the binocular vision processing results for the captured images of the three obstacles. The images in the upper part are the original images, and those in the lower part are the results after applying the proposed image processing method. Although the captured underwater images are not very clear, the approximately circular shapes of the three obstacles are recognized by the proposed binocular-vision-based obstacle detection method. The distance measurement accuracy of the proposed underwater ranging method is summarized in Table 3. The ranging results show that the measurement error stays within roughly 10% of the measured range. Furthermore, the radius measured from the pixel plane is close to the actual radius, so it is suitable for obstacle recognition. It is worth mentioning that the elapsed time for underwater binocular vision processing was less than 1 s, so a 3 m distance is enough to calculate the obstacle avoidance path for our low-speed UQV.
Finally, we verify the effect of the obstacle-avoidance-based path in actual pool tests. Three underwater obstacles are distributed along the original path from the starting point to the ending point; in other words, the UQV would collide with these obstacles along the original path if the obstacle avoidance function were not considered. However, with the obstacle avoidance path designed from the binocular vision measurements, no collision occurred in the experiments, since the UQV is able to steer clear of the obstacles. The key steps of the obstacle avoidance trajectory are illustrated in Figure 23. To summarize, the final trajectory shows that the proposed binocular-vision-based obstacle avoidance mechanism can meet the needs of realistic obstacle avoidance in an underwater environment.

6. Conclusions

In this paper, we propose a kind of underwater quadrocopter vehicle, which is equipped with four thrusters along the vertical direction of the robotic body and two horizontal thrusters on the sides of the body. Its kinematic and dynamic characteristics verify that the proposed UQV has advantages over existing autonomous underwater vehicles. Moreover, the detailed mechanical structure, hardware composition and kinematic and dynamic models of the proposed UQV are analyzed. Furthermore, we study and verify the obstacle avoidance mechanism using our self-made UQV. Underwater binocular-vision-based obstacle positioning is introduced to measure an underwater spherical obstacle’s radius and its distance from the UQV, and a simple semicircle-curve-based obstacle avoidance path is then planned on the basis of the obstacle’s coordinates. The final practical pool experiments show that the planned trajectory allows the UQV to avoid multiple obstacles effectively.

Author Contributions

Conceptualization, M.Z. and W.C.; methodology, M.Z.; software, Q.X.; validation, M.Z., S.X. and W.C.; formal analysis, W.C.; investigation, M.Z.; resources, M.Z.; writing—original draft preparation, W.C.; writing—review and editing, M.Z. and W.C.; visualization, M.Z.; supervision, W.C.; project administration, M.Z.; funding acquisition, M.Z. and S.X. All authors have read and agreed to the published version of the manuscript.

Funding

This research has been partially supported by Natural Science Foundation of Zhejiang Province (No.LZJWY22E090001 and LZ22F010004), National Natural Science Foundation of China (No.61871163 and No.61801431), Scientific research foundation of Zhejiang University of Water Resources and Electric Power (xky2022033), Zhejiang Public Welfare Technology Research Project (LGF20F010005) and the Stable Supporting Fund of Acoustics Science and Technology Laboratory.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors thank the anonymous reviewers for their contributions to improving the quality of this paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Yang, Y.; Xiao, Y.; Li, T. A Survey of Autonomous Underwater Vehicle Formation: Performance, Formation Control, and Communication Capability. IEEE Commun. Surv. Tutorials 2021, 23, 815–841. [Google Scholar] [CrossRef]
  2. Cheng, M.; Guan, Q.; Ji, F.; Cheng, J.; Chen, Y. Dynamic Detection Based Trajectory Planning for Autonomous Underwater Vehicle to Collect Data From Underwater Sensors. IEEE Internet Things J. 2022, 9, 13168–13178. [Google Scholar] [CrossRef]
  3. Cai, W.; Zhang, M.; Zheng, Y.R. Task Assignment and Path Planning for Multiple Autonomous Underwater Vehicles Using 3D Dubins Curves. Sensors 2017, 17, 1607. [Google Scholar] [CrossRef]
  4. Wen, J.; Yang, J.; Wang, T. Path Planning for Autonomous Underwater Vehicles Under the Influence of Ocean Currents Based on a Fusion Heuristic Algorithm. IEEE Trans. Veh. Technol. 2021, 70, 8529–8544. [Google Scholar] [CrossRef]
  5. Liu, X.; Wang, Z.; Guo, Y.; Wu, Y.; Wu, G.; Xu, J.; Chen, Y. The design of control system based on autonomous underwater helicopter. In Proceedings of the OCEANS 2018 MTS/IEEE Conference, Charleston, SC, USA, 22–25 October 2018; pp. 1–4. [Google Scholar] [CrossRef]
  6. Chen, C.; Huang, C.; Dai, X.; Huang, H.; Chen, Y. Motion and control simulation of a dished autonomous underwater helicopter. In Proceedings of the OCEANS 2017, Anchorage, AK, USA, 22–25 October 2017; pp. 1–6. [Google Scholar]
  7. Kohl, M.; Moe, S.; Kelasidi, E.; Pettersen, K.Y.; Gravdahl, J.T. Set-based path following and obstacle avoidance for underwater snake robots. In Proceedings of the 2017 IEEE International Conference on Robotics and Biomimetics (ROBIO), Macau, Macao, 5–8 December 2017; pp. 1206–1213. [Google Scholar] [CrossRef]
  8. Zhong, Y.; Chen, Y.; Wang, C.; Wang, Q.; Yang, J. Research on Target Tracking for Robotic Fish Based on Low-cost Scarce Sensing Information Fusion. IEEE Robot. Autom. Lett. 2022, 7, 6044–6051. [Google Scholar] [CrossRef]
  9. Wang, C.; Lu, J.; Ding, X.; Jiang, C.; Yang, J.; Shen, J. Design, Modeling, Control, and Experiments for a Fish-Robot-Based IoT Platform to Enable Smart Ocean. IEEE Internet Things J. 2021, 8, 9317–9329. [Google Scholar] [CrossRef]
  10. Antonelli, G.; Chiaverini, S.; Finotello, R.; Schiavon, R. Real-time path planning and obstacle avoidance for RAIS: An autonomous underwater vehicle. IEEE J. Ocean. Eng. 2001, 26, 216–227. [Google Scholar] [CrossRef]
  11. Fanelli, F.; Fenucci, D.; Marlow, R.; Pebody, M.; Phillips, A.B. Development of a Multi-Platform Obstacle Avoidance System for Autonomous Underwater Vehicles. In Proceedings of the 2020 IEEE/OES Autonomous Underwater Vehicles Symposium (AUV), St. Johns, NL, Canada, 30 September–2 October 2020; pp. 1–6. [Google Scholar] [CrossRef]
  12. Petillot, Y.; Ruiz, I.T.; Lane, D.M. Underwater vehicle obstacle avoidance and path planning using a multi-beam forward looking sonar. IEEE J. Ocean. Eng. 2001, 26, 240–251. [Google Scholar] [CrossRef]
  13. Braginsky, B.; Guterman, H. Obstacle Avoidance Approaches for Autonomous Underwater Vehicle: Simulation and Experimental Results. IEEE J. Ocean. Eng. 2016, 41, 882–892. [Google Scholar] [CrossRef]
  14. Belcher, E.O.; Fox, W.L.; Hanot, W.H. Dual-frequency acoustic camera: A candidate for an obstacle avoidance, gap-filler, and identification sensor for untethered underwater vehicles. In Proceedings of the OCEANS’02 MTS/IEEE, Biloxi, MI, USA, 29–31 October 2002. [Google Scholar]
  15. Lin, H.Y.; Peng, X.Z. Autonomous Quadrotor Navigation with Vision Based Obstacle Avoidance and Path Planning. IEEE Access 2021, 9, 102450–102459. [Google Scholar] [CrossRef]
  16. Barrett, D.; Vandor, I.; Kohler, E. Applying Structured Light Laser Imaging to Underwater Obstacle Avoidance and Navigation. In Proceedings of the OCEANS 2018 MTS/IEEE, Charleston, SC, USA, 22–25 October 2018; pp. 1–6. [Google Scholar] [CrossRef]
  17. Leonardi, M.; Stahl, A.; Gazzea, M.; Ludvigsen, M.; Rist-Christensen, I.; Nornes, S.M. Vision based obstacle avoidance and motion tracking for autonomous behaviors in underwater vehicles. In Proceedings of the OCEANS, Aberdeen, UK, 19–22 June 2017; pp. 1–10. [Google Scholar] [CrossRef]
  18. Drews, P.; Hernández, E.; Elfes, A.; Nascimento, E.R.; Campos, M. Real-time monocular obstacle avoidance using Underwater Dark Channel Prior. In Proceedings of the 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, 9–14 October 2016; pp. 4672–4677. [Google Scholar] [CrossRef]
  19. Evans, J.; Redmond, P.; Plakas, C.; Hamilton, K.; Lane, D. Autonomous docking for intervention-auvs using sonar and video-based real-time 3d pose estimation. In Proceedings of the OCEANS 2003, San Diego, CA, USA, 22–26 September 2003; Volume 4, pp. 2201–2210. [Google Scholar]
  20. Xue, K.; Liu, J.; Xiao, N.; Ji, X.; Qian, H.A. A Bio-inspired Simultaneous Surface and Underwater Risk Assessment Method Based on Stereo Vision for USVs in Nearshore Clean Waters. IEEE Robot. Autom. Lett. 2022. [Google Scholar] [CrossRef]
  21. Han, G.; Wang, H.; Li, S.; Jiang, J.; Zhang, W. Probabilistic Neighborhood Location-Point Covering Set-Based Data Collection Algorithm With Obstacle Avoidance for Three-Dimensional Underwater Acoustic Sensor Networks. IEEE Access 2017, 5, 24785–24796. [Google Scholar] [CrossRef]
  22. Sun, B.; Zhang, W.; Song, A.; Zhu, X.; Zhu, D. Trajectory Tracking and Obstacle Avoidance Control of Unmanned Underwater Vehicles Based on MPC. In Proceedings of the 2018 IEEE 8th International Conference on Underwater System Technology: Theory and Applications (USYS), Wuhan, China, 1–3 December 2018; pp. 1–6. [Google Scholar] [CrossRef]
  23. Chu, Z.; Zhu, D. Obstacle Avoidance Trajectory Planning and Trajectory Tracking Control for Autonomous Underwater Vehicles. In Proceedings of the 2018 13th World Congress on Intelligent Control and Automation (WCICA), Changsha, China, 4–8 July 2018; pp. 450–454. [Google Scholar] [CrossRef]
  24. Cai, W.; Wu, Y.; Zhang, M. Three-Dimensional Obstacle Avoidance for Autonomous Underwater Robot. IEEE Sens. Lett. 2020, 4, 7004004. [Google Scholar] [CrossRef]
  25. Cai, W.; Xie, Q.; Zhang, M.; Lv, S.; Yang, J. Stream-Function Based 3D Obstacle Avoidance Mechanism for Mobile AUVs in the Internet of Underwater Things. IEEE Access 2021, 9, 142997–143012. [Google Scholar] [CrossRef]
  26. Zuo, Z.; Liu, C.; Han, Q.L.; Song, J. Unmanned Aerial Vehicles: Control Methods and Future Challenges. IEEE/CAA J. Autom. Sin. 2022, 9, 601–614. [Google Scholar] [CrossRef]
  27. Guo, S.; Chen, S.; Liu, F.; Ye, X.; Yang, H. Binocular vision-based underwater ranging methods. In Proceedings of the 2017 IEEE International Conference on Mechatronics and Automation (ICMA), Kagawa, Japan, 6–9 August 2017; pp. 1058–1063. [Google Scholar] [CrossRef]
  28. Deng, Y.; Wang, H. Underwater Circular Object Positioning System Based on Monocular Vision. In Proceedings of the 2019 IEEE International Symposium on Signal Processing and Information Technology (ISSPIT), Ajman, United Arab Emirates, 10–12 December 2019; pp. 1–5. [Google Scholar] [CrossRef]
  29. Zhang, Z. A Flexible New Technique for Camera Calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334. [Google Scholar] [CrossRef]
  30. Lin, S.; Li, W.; Wang, C.; Tang, Y. Distance Measurement of Underwater Target Based on Stereo Vision. In Proceedings of the 2017 IEEE 7th Annual International Conference on CYBER Technology in Automation, Control, and Intelligent Systems (CYBER), Waikiki, HI, USA, 31 July–4 August 2017; pp. 97–102. [Google Scholar] [CrossRef]
  31. Liu, J.; Liu, W.; Li-e, G.; Le, L. Detection and localization of underwater targets based on monocular vision. In Proceedings of the 2017 2nd International Conference on Advanced Robotics and Mechatronics (ICARM), Tai’an, China, 27–31 August 2017; pp. 100–105. [Google Scholar] [CrossRef]
  32. Liu, J.H. Research on Multi-UUV Coordinated Formation Control Method Based on Virtual Pilot. Master’s Thesis, Harbin Engineering University, Harbin, China, 2017. [Google Scholar]
Figure 1. The mechanical structure of the underwater quadrocopter vehicle.
Figure 2. Hardware diagram.
Figure 3. Dynamic model of the UQV.
Figure 4. Rotational model in the body coordinate system.
Figure 5. Underwater binocular-vision-based obstacle positioning.
Figure 6. Camera imaging model.
Figure 7. Camera calibration images.
Figure 8. Calibration modeling results.
Figure 9. Imaging pixels of the target point between the two cameras.
Figure 10. Block stereo-matching algorithm.
Figure 11. Obstacle depth computing model.
Figure 12. Obstacle-avoidance-based path planning framework.
Figure 13. Obstacle avoidance path-planning framework.
Figure 14. Actual and ideal path.
Figure 15. Projected paths along the X-axis, the Y-axis and the Z-axis, respectively.
Figure 16. Photograph of the UQV.
Figure 17. Web page control interfaces.
Figure 18. Obstacle-free trajectory.
Figure 19. X-axis and Y-axis curves of the obstacle-free trajectory.
Figure 20. Experimental environment.
Figure 21. The distribution of three spherical obstacles.
Figure 22. Underwater obstacle image processing results.
Figure 23. Key steps of the obstacle avoidance trajectory.
Table 1. Symbol definitions.
m: Mass of the UQV
ρ: Water density
V: Volume of the UQV
g: Gravitational acceleration
φ, ψ, θ: Roll angle, yaw angle, pitch angle
x/y/z: X/Y/Z-axis coordinates in the earth coordinate system
a_x/a_y/a_z: X/Y/Z-axis accelerations in the earth coordinate system
u/v/w: Linear velocities along the X/Y/Z axes in the body coordinate system
p/q/r: Angular velocities about the X/Y/Z axes in the body coordinate system, w = [p, q, r]^T
F_1: Force provided by propeller 1
F_2: Force provided by propeller 2
F_3: Force provided by rotor No.1
F_4: Force provided by rotor No.2
F_5: Force provided by rotor No.3
F_6: Force provided by rotor No.4
f_1: Resultant force provided by the two propellers
f_2: Resultant force provided by the four rotors
F_all: Resultant force on the UQV
R_B^W, T_B^W: Rotation matrices
M: Inertia matrix
V: Velocity vector, V = [u, v, w, p, q, r]^T
η: State vector
C(V): Coriolis force–centripetal force matrix
D(V): Damping coefficient matrix
G(V): Static restoring force
τ: Force and moment vector
d: Distance between the obstacle and the UQV
R: Radius of the spherical obstacle
Table 2. Simulation parameters.
m = 50 kg
f = 2.6 mm
b = 6 cm
X_u = −1.5
X_u̇ = 0.1
X_u|u| = 8.0
Y_v = −40
Y_v̇ = 10
Y_v|v| = 200
N_r = −8.9
N_ṙ = 5
N_r|r| = 15
Table 3. Measured distances with the binocular-vision-based underwater ranging method.
Real Distance (cm) | Measured Distance (cm) | Measurement Error (cm)
50  | 49.5  | 0.5
70  | 71.5  | 1.5
90  | 90.5  | 0.5
110 | 110   | 0
130 | 134   | 4
150 | 155   | 5
170 | 176   | 6
190 | 202   | 12
210 | 216   | 6
230 | 242   | 12
250 | 273   | 23
270 | 295   | 25
290 | 323   | 33
310 | 343   | 33
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

