Article

A Novel Position and Orientation Sensor for Indoor Navigation Based on Linear CCDs

Chuang Wang, Li Xing and Xiaowei Tu
1 School of Mechatronics Engineering and Automation, Shanghai University, Shanghai 200444, China
2 School of Information Engineering, Xuchang University, Xuchang 461000, China
* Author to whom correspondence should be addressed.
Sensors 2020, 20(3), 748; https://doi.org/10.3390/s20030748
Submission received: 11 December 2019 / Revised: 15 January 2020 / Accepted: 22 January 2020 / Published: 29 January 2020

Abstract

The position and orientation of a mobile agent, such as a robot or drone, must be estimated in a timely manner during operation in a structured indoor environment, so as to ensure the security and efficiency of task execution. Because in off-the-shelf methods the position and orientation are often estimated separately by different kinds of sensors, we design a novel position orientation sensor (POS). The POS consists of four pairs of linear charge-coupled devices (CCDs) and cylindrical lenses, and it can estimate the 3D coordinates of an anchor in the POS's field of view. After detecting at least three anchors in its field of view sequentially, the Rodrigues coordinate transformation algorithm is used to estimate the position and orientation of the POS simultaneously. Because the position and orientation are estimated at the receiver side, there is no privacy concern associated with this system. The architecture of the proposed POS is symmetrical and redundant: even if one of the linear CCDs or cylindrical lenses malfunctions, the whole system can still work normally. The proposed method is cost-effective and easily extends to a wide range. The numerical simulation demonstrates the feasibility and high accuracy of the proposed method, and it outperforms the off-the-shelf methods.

1. Introduction

Indoor mobile agents are widely used in various industries, such as logistics environments [1], military applications [2], automated manufacturing [3], commerce [4], etc. One of the most crucial functions of the indoor mobile agent is to accomplish user-specified tasks. Therefore, highly accurate position and orientation information about the mobile agent is a basis for subsequent precise motion control and path planning in the navigation process.
In the past decades, many methods have been proposed for indoor navigation, and they can be roughly divided into two categories: relative positioning and absolute positioning [5]. Relative positioning methods estimate the current state by measuring the distance and orientation of the robot relative to its initial state. Dead reckoning is a classical relative positioning method. It uses an inertial measurement unit (IMU) or odometer to realize navigation and does not need to transmit or receive any external information. The dead reckoning method can work independently and continuously to provide positioning services for mobile agents. However, the drift errors and noise of the IMU significantly degrade the navigation accuracy over long periods of time [6], so this method only meets state estimation requirements for a short time and cannot support long-duration navigation tasks in a complex environment. Although many improvements have been made to reduce drift errors and noise, dead reckoning cannot be applied alone for a long period without a correction strategy. Moreover, it needs to know the initial position and orientation, which is difficult to obtain in actual scenes. Therefore, the estimation of position and orientation tends to rely more on absolute positioning methods.
Absolute positioning has attracted much attention for a long time and has made significant progress. It requires multiple reference points to determine the location of the moving agent. A typical representative of outdoor absolute positioning technology is the global navigation satellite system (GNSS). Outdoor positioning technology has developed rapidly since the GNSS was put into service; we can observe its convenience when driving a car or sharing our location using the GNSS module embedded in a smartphone. However, compared with the satellite channel of the GNSS outdoors, indoor positioning faces terrestrial channel problems, which make it more complicated. In urban areas or inside buildings, walls and roofs block the electromagnetic waves from satellites, which makes GNSS impractical for positioning in the indoor environment. Since GNSS does not work normally inside buildings, various studies have focused on indoor absolute positioning, and indoor absolute positioning has entered a new period of great changes and developments [7].
Indoor absolute positioning is a hot topic and has been intensively researched for decades due to its importance and difficulty [8]. Many methods have been proposed to realize indoor positioning, such as Wi-Fi [9], ultra-wideband (UWB) [10], ultrasound-assisted [11], and Bluetooth [12] approaches. Some of these methods, such as UWB, can achieve cm-level positioning accuracy. However, these positioning techniques often suffer from non-line-of-sight signal propagation and multipath fading effects. Moreover, they cannot provide the three-dimensional (3D) position and orientation simultaneously, which is a significant defect in modern navigation applications of indoor mobile agents.
Photoelectric measurement technology has developed rapidly and has been used in a variety of indoor environments, and many feasible methods have been proposed. Laser radar mainly uses an infrared laser beam to scan the surrounding environment with a radial field of view. It is widely used in indoor mobile agent navigation because of its flexibility and real-time measurement capability. However, in some cases, such as the assembly and docking of automobile and large aircraft components, it is difficult for laser radar to estimate the position and orientation [13,14]. In [15], a laser-based photoelectric scanning method is used to measure the 3D coordinates of anchors and then estimate the position and orientation with a coordinate transformation algorithm. Nevertheless, this method needs a sophisticated rotating laser scanning instrument, which is costly and not convenient for large-scale use when there are multiple indoor mobile agents.
The vision sensor based on a charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) adopts a non-contact measurement method, which can estimate the motion parameters of indoor moving objects without disturbing the system. In [16], a method using an area CCD is presented to estimate the position and orientation. However, in such a system the 2D and 3D situations must be considered separately, and the required number of anchors differs between them. In [17], a robot equipped with a camera is controlled to observe a rectangular object constraint and then obtain its position and pose. Nevertheless, this system requires that the optical axis of the camera always point to the target center point, which makes the method complex and clumsy. In [18], a CMOS camera based on the Wiimote is used to capture infrared LEDs (light emitting diodes) for indoor mobile robot tracking, and the position and orientation of the Wiimote are then estimated by coordinate transformation, but this method is only suitable for 2D scenarios. The ToF (time-of-flight) camera can also be used for indoor positioning [19,20], but it cannot work alone and needs the assistance of other equipment; it also cannot estimate the position and orientation simultaneously.
Currently, there are some off-the-shelf commercial devices that measure the position and orientation of an indoor mobile target using computer vision, such as the VICON system [21]. These systems seem to be the right solution for the navigation of indoor mobile agents, but they are not suitable for changing indoor backgrounds, as they fail to work in dark or untextured areas. Meanwhile, these systems are expensive and cannot be easily extended on a large scale: a broader measurement space requires more camera equipment, which leads to higher cost.
The existing visual measurement methods for indoor position and orientation mainly use area CCDs or CMOS sensors, and seldom use linear CCDs to measure the position and orientation of indoor mobile agents. In [22], a linear CCD is used to measure the position and orientation of moving objects, but this method needs multiple sets of linear CCD measuring equipment, and the position and attitude can only be estimated in a small range of indoor space. In [23], a 3D motion tracking system using multiple linear optical sensor arrays, supplemented by an IMU, achieves good position and orientation measurement performance. However, this method requires the linear CCDs and the IMU to estimate position and attitude, respectively, and it can also only achieve position and orientation measurements in a small range of indoor space.
Currently, there is still a lack of a cost-effective and satisfactory solution for realizing indoor positioning and attitude estimation concurrently. The key to achieving high-accuracy indoor position and orientation estimation with an affordable, wide-range system is to use the appropriate sensor(s). Motivated by the specific requirements of the indoor position and orientation problem in navigation applications, we design a novel position orientation sensor (POS). This POS creates a 3D coordinate measurement system based on the intersection of four planes by using four pairs of linear CCDs and cylindrical lenses. We name the 3D coordinate system created by the POS the position orientation sensor coordinate system (POSCS). Compared with traditional indoor sensors, which estimate the distance or angle between the anchors and the mobile target, our method estimates the 3D coordinates of an anchor in the field of view (FOV) of the POS. Then, given at least three anchors with known coordinates in the indoor coordinate system (ICS) and their estimated coordinates in the POSCS, we use the Rodrigues coordinate transformation algorithm to calculate the rotation matrix and translation vector of the POS. The translation vector contains the position of the POS in the ICS, and the rotation matrix contains the orientation of the POS in the ICS. Finally, we can estimate the position and orientation of the indoor mobile agent simultaneously.
The rest of the paper is organized as follows: In Section 2, the system components and the working principle of POS are introduced, and then the FOV of the designed POS is simulated. In Section 3, the Rodrigues coordinate transformation algorithm is presented to estimate position and orientation. In Section 4, the simulation is performed to demonstrate the feasibility and high accuracy of the proposed method. In Section 5, a brief overview and advantages of the proposed method are discussed. Finally, the main conclusions of the whole work and future improvements are summarized in Section 6.

2. Principle of Measurement

2.1. System Component and Working Principle

The linear sensor we use consists of two major components: a linear CCD and a cylindrical lens. The cylindrical lens forms the projection mapping from 3D space to a 1D image. According to the basic principles of optics, the rays from an object point on one side of the cylindrical lens form a linear image on the other side [24].
As shown in Figure 1, a linear CCD and a cylindrical lens constitute a one-dimensional imaging unit (ODIU); the light from a light-emitting diode (LED) marker passing through the cylindrical lens is focused to a line that is coplanar with the marker and the optical axis of the lens [25]. To detect the change of this plane as the LED marker moves, a linear CCD is placed at the focal plane of the cylindrical lens, perpendicular to its optical axis, so that it registers the change of the projection line. The center of the linear CCD is aligned with the optical axis, and their distance is the focal length of the cylindrical lens.
The LED marker is projected to form a narrow spot on the linear CCD, and the center of the spot is taken as the image position of the marker. If the spot position on the linear CCD and the optical axis of the cylindrical lens are known, then the plane supported by them can be calculated, and the LED marker lies on that plane.
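In practice, the spot covers several pixels of the linear CCD, and a sub-pixel estimate of its center can be obtained by weighting. The following minimal sketch (Python/NumPy) illustrates one common choice, an intensity-weighted centroid; the background-subtraction strategy and the pixel pitch value are illustrative assumptions, not values taken from this paper.

```python
import numpy as np

def spot_center(intensity, pixel_pitch_mm=0.008):
    """Sub-pixel spot position along a linear CCD (mm from the first pixel).

    intensity: 1D array of pixel readings for one exposure.
    The mean level is subtracted as a crude background estimate (an assumed
    strategy) and the remaining signal weights the centroid.
    Returns None when no spot stands out above the background.
    """
    v = np.clip(np.asarray(intensity, float) - np.mean(intensity), 0.0, None)
    if v.sum() == 0.0:
        return None
    idx = np.arange(v.size)
    return float((idx * v).sum() / v.sum()) * pixel_pitch_mm
```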
As we know, a plane can be defined by a line and a point not on that line. In order to determine the 3D coordinates of an LED marker in the coordinate system, we need at least three ODIUs to register the marker's 1D image positions and then reconstruct the 3D spatial coordinates of the anchor [26]. However, a system composed of only three ODIUs, although able to reconstruct the 3D coordinates of one marker, is vulnerable if any ODIU breaks. To make full use of the existing symmetrical structure of the mobile agent, such as a quadrotor drone with four arms, we use four ODIUs to determine the 3D coordinates of an anchor. With four ODIUs the system is redundant: even if one ODIU malfunctions, the whole system can still work normally.
As shown in Figure 2, four ODIUs are arranged so that the 3D coordinates of the anchor can be reconstructed by the intersection of four planes. When one LED marker (the anchor) is located in the FOV of the POS, it forms four projection lines crossing the four linear CCDs after passing through the respective cylindrical lenses, and the projection lines intersect the four linear CCDs on the focal planes of the respective ODIUs. Once the position of the projection line in every ODIU is determined, the plane that contains the anchor is also determined. Thus, the 3D coordinates of the anchor in the POSCS can be uniquely resolved from the planes determined by these ODIUs.
Meanwhile, we set the intersection of the four linear CCDs' extension lines as the origin of the POSCS; the photoelectric detection area of every linear CCD lies on the XY plane of the POSCS. Linear CCD1 and CCD2 are on the positive half-axes of the X-axis and Y-axis, while CCD3 and CCD4 are on the negative half-axes of the X-axis and Y-axis. With the POSCS created from the four ODIUs, once the anchor is located in the FOV of the POS, we can estimate the 3D coordinates of the anchor in the POSCS.
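To make the intersection idea concrete, the sketch below reconstructs an anchor's 3D coordinates in the POSCS from the spot positions on the CCDs. It assumes the lens/CCD layout given later in Table 1 (lens optical axes at ±75 mm, CCDs on the z = 0 plane, focal length 50 mm); the spot values in the example are those of anchor A1 at position P1 (Section 4, Table 3), and the helper names are ours.

```python
import numpy as np

# ODIU geometry in POSCS (mm), following Table 1: each entry is a point on the
# lens optical axis and the axis direction; the matching CCD lies on z = 0.
LENS_AXES = [
    (np.array([ 75.0,   0.0, 50.0]), np.array([0.0, 1.0, 0.0])),  # lens 1 / CCD1 (+x)
    (np.array([  0.0,  75.0, 50.0]), np.array([1.0, 0.0, 0.0])),  # lens 2 / CCD2 (+y)
    (np.array([-75.0,   0.0, 50.0]), np.array([0.0, 1.0, 0.0])),  # lens 3 / CCD3 (-x)
    (np.array([  0.0, -75.0, 50.0]), np.array([1.0, 0.0, 0.0])),  # lens 4 / CCD4 (-y)
]

def reconstruct_anchor(spots):
    """Least-squares intersection of the planes defined by the ODIUs.

    spots: four 3D spot positions on the CCDs (use None for a missing ODIU).
    Each valid ODIU contributes one plane containing the lens axis, the spot
    and the anchor; three planes suffice, the fourth adds redundancy.
    """
    rows, rhs = [], []
    for spot, (q, d) in zip(spots, LENS_AXES):
        if spot is None:
            continue
        n = np.cross(d, np.asarray(spot, float) - q)   # normal of that plane
        rows.append(n)
        rhs.append(n @ q)
    anchor, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return anchor

# Spot positions of anchor A1 at P1 (Table 3); the result is close to the
# Table 4 value [-110.75, -40.31, ~2500]; small deviations come from rounding.
m = [np.array([78.79, 0.0, 0.0]), np.array([0.0, 77.35, 0.0]),
     np.array([-74.27, 0.0, 0.0]), np.array([0.0, -75.70, 0.0])]
print(reconstruct_anchor(m))
```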

2.2. Field of Vision of the POS

According to the above description, the POS can only measure an anchor's coordinates when the anchor is located inside its FOV. The FOV is very important for the layout design of anchors. Therefore, it is necessary to carry out an FOV simulation for the designed POS.
The structure of the designed POS on the XY plane in POSCS is shown in Figure 3. The sensing length of the linear CCD’s photo-element is 30 mm. The linear CCD1 and CCD3 are on the positive half-axis and negative half-axis of the X-axis, and they are symmetrically placed relative to the origin point, while the CCD2 and CCD4 are on the positive half-axis and negative half-axis of the Y-axis, and they are also symmetrically placed relative to the origin point.
CCD1_In, CCD2_In, CCD3_In, and CCD4_In are the inner margins of the linear CCDs, and their distances d to the origin are 60 mm; CCD1_Out, CCD2_Out, CCD3_Out, and CCD4_Out are the outer margins, and their distances to the origin are 90 mm. Points a1 and b1, a2 and b2, a3 and b3, and a4 and b4 are the two extreme points on the optical axes of cylindrical lenses 1, 2, 3, and 4, respectively. The focal length of the cylindrical lens is 50 mm. The values of the above parameters are summarized in Table 1.
According to Table 1, we sample the photosensitive area of each of the four linear CCDs every 1 mm to choose projection lines, which yields many spatial intersections. Here we consider only intersections whose calculated z value is below 4000 mm. Figure 4 shows the simulated FOV of the POS according to the parameters in Table 1.
In Figure 4, the measurable area of every layer is a square, and the closest measurement distance of the POS is 300 mm; that is, an anchor must be at least 300 mm from the POS for the POS to detect it. Moreover, the greater the height, the broader the measurement scope of the POS.
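The FOV can also be checked analytically: for the geometry of Table 1, the plane through an anchor and a lens optical axis meets the corresponding CCD at a spot position that follows from similar triangles. The sketch below computes the four spot positions for an anchor given in the POSCS and tests whether enough of them fall in the 60–90 mm sensing range; the three-spot criterion mirrors the redundancy argument of Section 2.1 and is our reading, not a formula from the paper.

```python
F = 50.0                      # focal length of the cylindrical lenses (mm)
CCD_IN, CCD_OUT = 60.0, 90.0  # inner/outer margins of the CCD sensing area (mm)
AXIS = 75.0                   # distance of each lens optical axis from the origin (mm)

def spot_positions(x, y, z):
    """Spot coordinates on CCD1..CCD4 for an anchor (x, y, z) in POSCS, z > F."""
    s1 = AXIS + F * (AXIS - x) / (z - F)     # CCD1 along +x
    s2 = AXIS + F * (AXIS - y) / (z - F)     # CCD2 along +y
    s3 = -AXIS + F * (-AXIS - x) / (z - F)   # CCD3 along -x
    s4 = -AXIS + F * (-AXIS - y) / (z - F)   # CCD4 along -y
    return s1, s2, s3, s4

def in_fov(x, y, z):
    """Anchor is measurable when at least three spots land on their CCDs."""
    hits = sum(CCD_IN <= abs(s) <= CCD_OUT for s in spot_positions(x, y, z))
    return hits >= 3

print(spot_positions(-110.75, -40.31, 2500.0))  # ~ (78.79, 77.35, -74.27, -75.70)
print(in_fov(-110.75, -40.31, 2500.0))          # True
```

For an anchor on the z-axis, the spot position reduces to 75 + 3750/(z − 50) mm, which reaches the 90 mm outer margin exactly at z = 300 mm, matching the closest measurement distance read from Figure 4.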

3. Position and Orientation Measurement Algorithm

In our indoor positioning and orientation measurement system, there are two different coordinate systems. One is the ICS, whose origin is set at a corner of the room; the other is the POSCS, whose origin is set at the POS, as shown in Figure 2. Each POS creates a POSCS on its body and then estimates the 3D coordinates of the anchors in its FOV.

3.1. Mathematical Model

Suppose three anchors A1, A2, A3 are fixed on the ceiling, and their coordinates in the ICS are A1(X1, Y1, Z1), A2(X2, Y2, Z2), and A3(X3, Y3, Z3). After the POS acquires the three anchors, their coordinates in the POSCS are C1(xc1, yc1, zc1), C2(xc2, yc2, zc2), and C3(xc3, yc3, zc3). According to the 3D coordinates of the three anchors in the POSCS and the ICS, three coordinate transformation equations realize the transformation from POSCS to ICS, as shown in Equations (1)–(3):
$$\begin{bmatrix} X_1 \\ Y_1 \\ Z_1 \end{bmatrix} = R \cdot \begin{bmatrix} x_{c1} \\ y_{c1} \\ z_{c1} \end{bmatrix} + \begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix}, \tag{1}$$
$$\begin{bmatrix} X_2 \\ Y_2 \\ Z_2 \end{bmatrix} = R \cdot \begin{bmatrix} x_{c2} \\ y_{c2} \\ z_{c2} \end{bmatrix} + \begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix}, \tag{2}$$
$$\begin{bmatrix} X_3 \\ Y_3 \\ Z_3 \end{bmatrix} = R \cdot \begin{bmatrix} x_{c3} \\ y_{c3} \\ z_{c3} \end{bmatrix} + \begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix}, \tag{3}$$
In (1)–(3), we have established the relationship between the two coordinate systems for the three anchors, where R is the 3 × 3 rotation matrix and T = [x0, y0, z0]^T is the translation vector. In the coordinate transformation equations, the translation vector T contains the origin position of the POSCS in the ICS, and the rotation matrix R contains the rotation angles of the POSCS in the ICS, which are exactly what we want.
Suppose the rotation angles of the POS about the x, y, and z axes of the ICS are α, β, and γ, respectively. We take α as the pitch angle, β as the roll angle, and γ as the yaw angle, and we define clockwise rotation angles as positive and counterclockwise rotation angles as negative. According to these rotation angles, the rotation matrices about the x, y, and z axes are given in Equations (4)–(6):
$$R_x(\alpha) = \begin{bmatrix} 1 & 0 & 0 \\ 0 & \cos\alpha & \sin\alpha \\ 0 & -\sin\alpha & \cos\alpha \end{bmatrix}, \tag{4}$$
$$R_y(\beta) = \begin{bmatrix} \cos\beta & 0 & -\sin\beta \\ 0 & 1 & 0 \\ \sin\beta & 0 & \cos\beta \end{bmatrix}, \tag{5}$$
$$R_z(\gamma) = \begin{bmatrix} \cos\gamma & \sin\gamma & 0 \\ -\sin\gamma & \cos\gamma & 0 \\ 0 & 0 & 1 \end{bmatrix}, \tag{6}$$
We define the rotation matrix R = Ry(β)·Rx(α)·Rz(γ); the result is shown in (7), from which α = arcsin(R[2,3]), β = −arctan(R[1,3]/R[3,3]), and γ = −arctan(R[2,1]/R[2,2]):
$$R = \begin{bmatrix} \cos\beta\cos\gamma - \sin\alpha\sin\beta\sin\gamma & \cos\beta\sin\gamma + \sin\alpha\sin\beta\cos\gamma & -\cos\alpha\sin\beta \\ -\cos\alpha\sin\gamma & \cos\alpha\cos\gamma & \sin\alpha \\ \sin\beta\cos\gamma + \sin\alpha\cos\beta\sin\gamma & \sin\beta\sin\gamma - \sin\alpha\cos\beta\cos\gamma & \cos\alpha\cos\beta \end{bmatrix}, \tag{7}$$
The key to solving the coordinate transformations in (1)–(3) is to determine the rotation matrix R and the translation vector T. If we can obtain R and T by solving (1)–(3), we can estimate the position and orientation of the POS in the ICS. Thus, how to solve for the rotation matrix R and the translation vector T is the key point.
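A short numerical check of (4)–(7) and the angle-extraction formulas is sketched below. The paper's 1-based matrix indices become 0-based in the code, and arctan is implemented with atan2 so that yaw angles beyond ±90° (e.g., the −125° case in Table 6) are handled; the test angles are the P2 set values from Table 6.

```python
import numpy as np

def rot_x(a):   # pitch, Equation (4)
    return np.array([[1, 0, 0],
                     [0,  np.cos(a), np.sin(a)],
                     [0, -np.sin(a), np.cos(a)]])

def rot_y(b):   # roll, Equation (5)
    return np.array([[np.cos(b), 0, -np.sin(b)],
                     [0, 1, 0],
                     [np.sin(b), 0,  np.cos(b)]])

def rot_z(g):   # yaw, Equation (6)
    return np.array([[ np.cos(g), np.sin(g), 0],
                     [-np.sin(g), np.cos(g), 0],
                     [0, 0, 1]])

def compose(a, b, g):
    """R = Ry(beta) @ Rx(alpha) @ Rz(gamma), Equation (7)."""
    return rot_y(b) @ rot_x(a) @ rot_z(g)

def extract(R):
    """alpha = arcsin(R[2,3]), beta = -arctan(R[1,3]/R[3,3]),
    gamma = -arctan(R[2,1]/R[2,2]), written with 0-based indices here."""
    alpha = np.arcsin(R[1, 2])
    beta  = -np.arctan2(R[0, 2], R[2, 2])
    gamma = -np.arctan2(R[1, 0], R[1, 1])
    return alpha, beta, gamma

a, b, g = np.radians([30.0, -5.0, -45.0])      # P2 set values from Table 6
print(np.degrees(extract(compose(a, b, g))))   # -> [ 30.  -5. -45.]
```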

3.2. Rodrigues Coordinate Transformation Algorithm

In order to solve for R and T, at least three common points are required [27]. The 3D coordinate transformation is one of the most frequently encountered operations in geodesy, mapping, photogrammetry, computer vision, geographic information science, etc. [28]. In this paper, we elaborate the Rodrigues coordinate transformation algorithm to solve for R and T.
In (8), R is the rotation matrix and T is the translation vector; these are the parameters to be estimated. According to the mathematical transformation model, R is computed first, followed by T:
$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = R \cdot \begin{bmatrix} x \\ y \\ z \end{bmatrix} + T, \tag{8}$$
Let the antisymmetric matrix S be defined as in (9), in which a, b, c are independent parameters.
$$S = \begin{bmatrix} 0 & -c & b \\ c & 0 & -a \\ -b & a & 0 \end{bmatrix}, \tag{9}$$
R is composed of the antisymmetric matrix S and the 3 × 3 identity matrix I, as shown in (10).
$$R = (I - S)^{-1}(I + S), \tag{10}$$
From (8), each anchor yields three equations. Subtracting the equations of the first common point from the corresponding equations of the second common point eliminates the translation vector T and gives Equation (11).
$$\begin{bmatrix} X_2 - X_1 \\ Y_2 - Y_1 \\ Z_2 - Z_1 \end{bmatrix} = R \begin{bmatrix} x_2 - x_1 \\ y_2 - y_1 \\ z_2 - z_1 \end{bmatrix}, \tag{11}$$
Substituting Equation (10) into (11) gives (12).
$$(I - S)\begin{bmatrix} X_2 - X_1 \\ Y_2 - Y_1 \\ Z_2 - Z_1 \end{bmatrix} = (I + S)\begin{bmatrix} x_2 - x_1 \\ y_2 - y_1 \\ z_2 - z_1 \end{bmatrix}, \tag{12}$$
We substitute (9) into (12), expand the equation, and extract a, b, c in vector form as shown in (13), in which X12 = X2 − X1, Y12 = Y2 − Y1, Z12 = Z2 − Z1 and x12 = x2 − x1, y12 = y2 − y1, z12 = z2 − z1:
$$\begin{bmatrix} 0 & z_{12} + Z_{12} & -(y_{12} + Y_{12}) \\ -(z_{12} + Z_{12}) & 0 & x_{12} + X_{12} \\ y_{12} + Y_{12} & -(x_{12} + X_{12}) & 0 \end{bmatrix}\begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} X_{12} - x_{12} \\ Y_{12} - y_{12} \\ Z_{12} - z_{12} \end{bmatrix}, \tag{13}$$
Obviously, the coefficient matrix on the left of (13) is singular and contains only two independent equations, so the parameters a, b, c cannot be solved with only two anchors. With the first and the third common points, we obtain a similar equation, as shown in (14).
$$\begin{bmatrix} 0 & z_{13} + Z_{13} & -(y_{13} + Y_{13}) \\ -(z_{13} + Z_{13}) & 0 & x_{13} + X_{13} \\ y_{13} + Y_{13} & -(x_{13} + X_{13}) & 0 \end{bmatrix}\begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} X_{13} - x_{13} \\ Y_{13} - y_{13} \\ Z_{13} - z_{13} \end{bmatrix}, \tag{14}$$
Combining (13) and (14), we get (15):
$$\begin{bmatrix} 0 & z_{12} + Z_{12} & -(y_{12} + Y_{12}) \\ -(z_{12} + Z_{12}) & 0 & x_{12} + X_{12} \\ y_{12} + Y_{12} & -(x_{12} + X_{12}) & 0 \\ 0 & z_{13} + Z_{13} & -(y_{13} + Y_{13}) \\ -(z_{13} + Z_{13}) & 0 & x_{13} + X_{13} \\ y_{13} + Y_{13} & -(x_{13} + X_{13}) & 0 \end{bmatrix}\begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} X_{12} - x_{12} \\ Y_{12} - y_{12} \\ Z_{12} - z_{12} \\ X_{13} - x_{13} \\ Y_{13} - y_{13} \\ Z_{13} - z_{13} \end{bmatrix}, \tag{15}$$
Solving Equation (15), we obtain the three parameters a, b, c. From a, b, and c, the rotation matrix R follows from (10). Then, by substituting R into (8), we can calculate the translation vector T. After this procedure, we have R and T, which contain the orientation angles and position of the POS in the ICS.
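A minimal sketch of the three-point procedure of this subsection is given below; the function name and array layout are ours. It stacks the system (15), solves for (a, b, c), forms R via (9) and (10), and recovers T by substituting R back into (8) with the first common point.

```python
import numpy as np

def rodrigues_rt(ics_pts, poscs_pts):
    """Estimate R, T with ICS = R @ POSCS + T from three common points.

    ics_pts, poscs_pts: (3, 3) arrays of matching points (rows are points).
    """
    P = np.asarray(ics_pts, float)     # (X, Y, Z) in ICS
    p = np.asarray(poscs_pts, float)   # (x, y, z) in POSCS
    rows, rhs = [], []
    for i in (1, 2):                   # differences to the first point, (13)/(14)
        dX, dx = P[i] - P[0], p[i] - p[0]
        u = dX + dx                    # sums that multiply (a, b, c)
        rows += [[0.0,  u[2], -u[1]],
                 [-u[2], 0.0,  u[0]],
                 [u[1], -u[0], 0.0]]
        rhs += list(dX - dx)
    a, b, c = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)[0]  # (15)
    S = np.array([[0.0, -c,  b],
                  [ c, 0.0, -a],
                  [-b,  a, 0.0]])                       # Equation (9)
    I = np.eye(3)
    R = np.linalg.inv(I - S) @ (I + S)                  # Equation (10)
    T = P[0] - R @ p[0]                                 # substitute R into (8)
    return R, T
```

Applied to the A1, A2, A7 pairs of Tables 2 and 4 in Section 4, this sketch should reproduce the P1 set values (position [500, 500, 500] mm and yaw −25°) up to the rounding of the tabulated coordinates.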

3.3. Multiple Common Points in Rodrigues Coordinate Transformation Algorithm

If the POS detects more than three common points within its FOV at a given position, estimating the rotation matrix and the translation vector becomes a least-squares problem, and the error Equation (16) is formed according to Equation (15).
$$V_{3(n-1)\times 1} = A_{3(n-1)\times 3}\, X_{3\times 1} - L_{3(n-1)\times 1}, \tag{16}$$
In error Equation (16), X3×1 = [a, b, c]^T, and the matrices A3(n−1)×3 and L3(n−1)×1 are shown in Equations (17) and (18).
$$A_{3(n-1)\times 3} = \begin{bmatrix} 0 & z_{12} + Z_{12} & -(y_{12} + Y_{12}) \\ -(z_{12} + Z_{12}) & 0 & x_{12} + X_{12} \\ y_{12} + Y_{12} & -(x_{12} + X_{12}) & 0 \\ \vdots & \vdots & \vdots \\ 0 & z_{1n} + Z_{1n} & -(y_{1n} + Y_{1n}) \\ -(z_{1n} + Z_{1n}) & 0 & x_{1n} + X_{1n} \\ y_{1n} + Y_{1n} & -(x_{1n} + X_{1n}) & 0 \end{bmatrix}, \tag{17}$$
$$L_{3(n-1)\times 1} = \begin{bmatrix} X_{12} - x_{12} & Y_{12} - y_{12} & Z_{12} - z_{12} & \cdots & X_{1n} - x_{1n} & Y_{1n} - y_{1n} & Z_{1n} - z_{1n} \end{bmatrix}^{T}, \tag{18}$$
According to the principle of least squares, the optimal solution of Equation (16) can be obtained, as shown in Equation (19):
$$X = (A^{T}A)^{-1}A^{T}L, \tag{19}$$
According to Equation (19), the three independent parameters of the Rodrigues coordinate transformation algorithm can be obtained in the case of multiple common points. Based on a, b, and c, the rotation matrix R is obtained from Equations (9) and (10).
When the measurement errors of the common points are considered, the error Equation (20) is obtained according to Equation (8):
$$E = \begin{bmatrix} X \\ Y \\ Z \end{bmatrix} - R\begin{bmatrix} x \\ y \\ z \end{bmatrix} - \begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix}, \tag{20}$$
For brevity, suppose ae, be, and Te are as shown in Equation (21):
$$a_e = \begin{bmatrix} X \\ Y \\ Z \end{bmatrix}, \quad b_e = \begin{bmatrix} x \\ y \\ z \end{bmatrix}, \quad T_e = \begin{bmatrix} x_0 \\ y_0 \\ z_0 \end{bmatrix}, \tag{21}$$
Then Equation (20) can be simplified to Equation (22):
$$E = a_e - R\,b_e - T_e, \tag{22}$$
Following the least-squares formulation, the error matrix E is minimized under condition (23), where ‖·‖₂ denotes the L2 norm:
$$\min\left(\left\| a_e - R\,b_e - T_e \right\|_2\right)^2, \tag{23}$$
Equation (23) is expanded as shown in Equation (24):
$$\left(\left\| a_e - R\,b_e - T_e \right\|_2\right)^2 = (a_e - R\,b_e - T_e)^{T}(a_e - R\,b_e - T_e), \tag{24}$$
Taking the partial derivative of Equation (24) with respect to the translation vector Te and setting it to zero, the error matrix E reaches its minimum, and the optimal solution for the translation vector Te is given in Equation (25):
$$T_e = \frac{1}{n}\sum_{i=1}^{n} a_e - R\,\frac{1}{n}\sum_{i=1}^{n} b_e, \tag{25}$$
According to Equation (25), we first calculate the centroids of the multiple common points in the two coordinate systems, respectively. Then, the optimal translation vector Te is calculated from the rotation matrix R and these centroids.
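For n > 3 common points, only two steps change relative to the three-point sketch in Section 3.2: (a, b, c) come from the normal Equations (19) built over all point pairs, and Te comes from the centroid form (25). A compact sketch under the same conventions (names are ours):

```python
import numpy as np

def rodrigues_rt_lsq(ics_pts, poscs_pts):
    """Least-squares estimate of R, T (ICS = R @ POSCS + T) from n >= 3 points."""
    P = np.asarray(ics_pts, float)     # (n, 3) points in ICS
    p = np.asarray(poscs_pts, float)   # (n, 3) matching points in POSCS
    blocks, rhs = [], []
    for i in range(1, len(P)):         # build A (17) and L (18) against point 1
        dX, dx = P[i] - P[0], p[i] - p[0]
        u = dX + dx
        blocks += [[0.0,  u[2], -u[1]],
                   [-u[2], 0.0,  u[0]],
                   [u[1], -u[0], 0.0]]
        rhs += list(dX - dx)
    A, L = np.array(blocks), np.array(rhs)
    a, b, c = np.linalg.solve(A.T @ A, A.T @ L)          # Equation (19)
    S = np.array([[0.0, -c, b], [c, 0.0, -a], [-b, a, 0.0]])
    R = np.linalg.inv(np.eye(3) - S) @ (np.eye(3) + S)   # Equation (10)
    T = P.mean(axis=0) - R @ p.mean(axis=0)              # Equation (25), centroids
    return R, T
```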

4. Performance Evaluation

We evaluated the proposed method in simulation. The simulation scenario uses an indoor system model, and Figure 5 illustrates the scene of the indoor positioning and orientation measurement system.
As shown in Figure 5, the dimensions of the system model are 5000 mm × 5000 mm × 3000 mm. Thirty-six infrared LEDs are fixed on the ceiling as anchors, and their coordinates in the ICS are listed in Table 2.
The POS moves beneath these anchors from position P1 to P10, as shown in Figure 5. We set P1 at [500 mm, 500 mm, 500 mm] in the ICS, and the rotation angles of the POS about the x, y, and z axes of the ICS are α = 0°, β = 0°, γ = −25°, so the translation vector is T = [500 mm, 500 mm, 500 mm]^T. According to the coordinates of the intersection points on linear CCD1, CCD2, CCD3, and CCD4, the anchors A1, A2, and A7 are located in the FOV of the POS.
According to the parameters of the POS in Table 1, the intersection point of A1 on CCD1 is m1 = [78.79 mm, 0 mm, 0 mm]^T in the POSCS, on CCD2 it is m2 = [0 mm, 77.35 mm, 0 mm]^T, on CCD3 it is m3 = [−74.27 mm, 0 mm, 0 mm]^T, and on CCD4 it is m4 = [0 mm, −75.70 mm, 0 mm]^T. Similarly, the intersection points of A2 and A7 on CCD1–CCD4 are n1, n2, n3, n4 and p1, p2, p3, p4 in the POSCS, as shown in Table 3.
Comparing the coordinates of the calculated projection points m1–m4, n1–n4, and p1–p4 with the simulation parameters in Table 1, only p4 is not within the measurable range of CCD4 in the POSCS. However, three projection points on the four linear CCDs are enough to reconstruct the 3D coordinates of an anchor in the POSCS, and the additional projection point serves as redundancy. Although p4 is outside the measurable range of CCD4, this does not prevent the POS from estimating the coordinates of A7 in the POSCS. This means that even if one of the ODIUs malfunctions, the normal operation of the entire system is not affected. Thus, we use m1, m2, m3, n1, n2, n3, p1, p2, p3 to estimate the coordinates of A1, A2, and A7 in the POSCS.
Based on the coordinates of m1, m2, and m3, the estimated coordinates of A1 in the POSCS are a1 = [−110.75 mm, −40.31 mm, 2499.99 mm]^T. Similarly, from n1, n2, n3 and p1, p2, p3, the estimated coordinates of A2 and A7 in the POSCS are a2 = [644.50 mm, −392.49 mm, 2499.99 mm]^T and a7 = [241.43 mm, 714.95 mm, 2499.99 mm]^T, as shown in Table 4.
According to the 3D coordinates of the three anchors A1, A2, A7 in ICS and POSCS, as shown in Table 2 and Table 4, we can use the Rodrigues coordinate transformation algorithm in Section 3 to estimate the position and orientation of the POS at the P1 position.
Similarly, following the steps used at P1 with the set values from P2 to P10, we can acquire all measurable anchors at each position, and the results are shown in Table 5.
According to Table 5, we can estimate the coordinates of the measurable anchors in the POSCS, and the estimated position and orientation from P1 to P10 are listed in Table 6.
The maximum and average measurement errors of the x, y, and z axes are calculated from Table 6, and the corresponding standard deviations are also computed, as shown in Table 7.
Similarly, the maximum measurement error, average measurement error, and standard deviation of pitch, roll, and yaw angle are shown in Table 8.
According to the evaluation results of the measurement errors in Table 7 and Table 8, the method proposed in this paper is compared with [15], as shown in Table 9. The measurement error of our method is smaller than that of the off-the-shelf method, which indicates the feasibility and high accuracy of the indoor position and orientation measurement method proposed in this paper.

5. Discussion

Indoor navigation is critical for developing intelligent carriers. We design a novel POS using four pairs of linear CCDs and cylindrical lenses for indoor navigation without the assistance of an IMU or other costly devices. This design has the following advantages: (1) The POS is designed for indoor position and attitude measurement over a wide range, and achieves sufficient accuracy in the structured indoor environment. (2) Our method does not need to treat the 2D and 3D working situations separately: we can estimate the position and orientation in a 3D situation, or the position and direction in a 2D situation, as long as a minimum of three anchors is within the field of view of the designed POS. Moreover, once three LED anchors are detected, we can estimate the position and orientation simultaneously, whereas many indoor positioning methods can only estimate the position. (3) Once the LED's control unit and the POS's control unit are triggered simultaneously, they share an identical time sequence; therefore, the number of POS units used indoors is theoretically unlimited. (4) The structure of the proposed POS is redundant: even if a linear CCD breaks down, it does not affect the normal operation of the entire system, as long as the remaining three linear CCDs work normally.
Of course, there is still some idealization in the simulation design. In order to demonstrate the feasibility and effectiveness of the whole system, we ignore the partial distortion and nonlinear effects of the cylindrical lens. Handling these effects depends entirely on the quality of the cylindrical lens design and on cost, which is beyond the scope of this paper [29].

6. Conclusions

In this paper, a high-precision three-dimensional indoor position and orientation measurement method is proposed, which can be used for indoor navigation. We design a new indoor position orientation sensor using four pairs of linear CCDs and cylindrical lenses, and analyze the field of view of the position orientation sensor. The proposed indoor POS can estimate the 3D coordinates of the infrared LED anchors; after acquiring the coordinates of at least three anchors sequentially in the POSCS, the Rodrigues coordinate transformation algorithm is used to estimate the position and orientation of the POS simultaneously. The position and orientation measurement method is evaluated via simulation. Thirty-six infrared LEDs are employed in a simulation model space with dimensions of 5000 mm × 5000 mm × 3000 mm. The maximum position error and orientation error in the simulation are 0.06 mm and 0.01°, respectively. The simulation results indicate the feasibility and high accuracy of the proposed method. The POS proposed in this paper is robust and can tolerate certain system failures, which makes it an alternative for indoor navigation.
There are two fundamental steps for the whole system to work properly. In this paper, we mainly elaborate the POS operating principle and demonstrate the feasibility and high accuracy of the designed POS. In order to achieve wide-range measurement of indoor position and orientation, many anchor nodes will be fixed on the ceiling. However, the designed POS can only measure one anchor at a time in its FOV, so in the second phase we will study how to design wireless high-speed synchronous exposure between the infrared LED array and the POS. Meanwhile, we will conduct experiments to verify the proposed methods. Additionally, since the architecture of the proposed POS is symmetrical and redundant, we will also study how to realize indoor position and orientation measurement without synchronization.
Due to the fast scanning, high-speed data processing, and precise coordinate measurement characteristics of the linear CCD, this system is an appropriate candidate for indoor position and orientation estimation, especially for fast-moving objects such as indoor flying agents.

Author Contributions

C.W. conceived the idea and developed the proposed approaches; L.X. gave advice on the research and helped edit the paper; X.T. improved the quality of the manuscript and completed the revision. All authors have read and agreed to the published version of the manuscript.

Funding

This work is supported in part by grants from the National Natural Science Foundation of China (NSFC, No. 51775325), the National Youth Science Foundation of China (No. 61903241), and the Innovation Fund Project of the National Commercial Aircraft Manufacturing Engineering Technology Research Center (COMAC-SFGS-2017-36741).

Acknowledgments

The authors would like to thank the reviewers for their insightful comments and helpful suggestions.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Heibmeyer, S.; Overmetyer, L.; Muller, A. Indoor positioning of vehicles using an active optical infrastructure. In Proceedings of the 2012 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Sydney, NSW, Australia, 13–15 November 2012.
2. Lee, H.; Tak, J.; Choi, J. Wearable antenna integrated into military berets for indoor/outdoor positioning system. IEEE Antennas Wirel. Propag. Lett. 2017, 16, 1912–1922.
3. Perez, L.; Rodriguez, I.; Rodriguez, N.; Usamentiaga, R.; Garcia, D.F. Robot guidance using machine vision techniques in industrial environments: A comparative review. Sensors 2016, 16, 335.
4. Ashraf, I.; Hur, S.; Park, Y. Indoor positioning on disparate commercial smartphones using Wi-Fi access points coverage area. Sensors 2019, 19, 4351.
5. Fan, Q.; Sun, B.; Sun, Y.; Wu, Y.; Zhuang, X. Data fusion for indoor mobile robot positioning based on tightly coupled INS/UWB. J. Navig. 2017, 70, 1079–1097.
6. Kuang, J.; Niu, X.; Chen, X. Robust pedestrian dead reckoning based on MEMS-IMU for smartphones. Sensors 2018, 18, 1391.
7. Deng, Z.; Yu, Y.; Yuan, X.; Wan, N.; Yang, L. Situation and development tendency of indoor positioning. China Commun. 2013, 10, 42–45.
8. Zafari, F.; Gkelias, A.; Leung, K.K. A survey of indoor localization systems and technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599.
9. He, S.; Chan, S.G. Wi-Fi fingerprint-based indoor positioning: Recent advances and comparisons. IEEE Commun. Surv. Tutor. 2016, 18, 465–490.
10. Mazhar, F.; Khan, M.G.; Sallberg, B. Precise indoor positioning using UWB: A review of methods, algorithms and implementations. Wirel. Pers. Commun. 2017, 97, 4467–4491.
11. Qi, J.; Liu, G.P. A robust high-accuracy ultrasound indoor positioning system based on a wireless sensor network. Sensors 2017, 17, 2554.
12. Faragher, R.; Harle, R. Location fingerprinting with bluetooth low energy beacons. IEEE J. Sel. Areas Commun. 2015, 23, 2418–2428.
13. Kiraci, E.; Franciosa, P.; Turley, G.A.; Olifent, A.; Attridge, A.; Williams, M.A. Moving towards in-line metrology: Evaluation of a Laser Radar system for in-line dimensional inspection for automotive assembly systems. Int. J. Adv. Manuf. Tech. 2017, 91, 69–78.
14. Soe, N. Feature Based Design for Jigless Assembly. Ph.D. Thesis, Cranfield University, Cranfield, UK, 2004.
15. Huang, Z.; Zhu, J.; Yang, L.; Xue, B.; Wu, J.; Zhao, Z. Accurate 3-D position and orientation method for indoor mobile robot navigation based on photoelectric scanning. IEEE Trans. Instrum. Meas. 2015, 64, 2518–2529.
16. Hijikata, S.; Terabayashi, K.; Umeda, K. A simple indoor self-localization system using infrared LEDs. In Proceedings of the 2009 Sixth International Conference on Networked Sensing Systems (INSS), Pittsburgh, PA, USA, 17–19 June 2009.
17. Tian, R.; Li, Q. Research on the application of rectangle object constraint in active vision of mobile robot. In Proceedings of the 2016 International Conference on Robotics and Automation Engineering (ICRAE), Jeju, South Korea, 27–29 August 2016.
18. Gu, D.; Chen, K.S. Design and performance evaluation of wiimote-based two-dimensional indoor localization systems for indoor mobile robot control. Measurement 2015, 66, 95–108.
19. Kohoutek, T.K.; Mautz, R.; Wegner, J.D. Fusion of building information and range imaging for autonomous location estimation in indoor environments. Sensors 2013, 13, 2430–2446.
20. Paredes, J.; Alvare, F.; Aguilera, T.; Villadangos, J. 3D indoor positioning of UAVs with spread spectrum ultrasound and time-of-flight cameras. Sensors 2018, 18, 89–104.
21. Merriaux, P.; Dupuis, Y.; Boutteau, R.; Vasseur, P.; Savatier, X. A study of Vicon system positioning performance. Sensors 2017, 17, 1591.
22. Ai, L.; Yuan, F.; Ding, Z. Measurement of spatial object's exterior attitude based on linear CCD. Chin. Opt. Lett. 2008, 6, 505–509.
23. Kumar, A.; Bentzvi, P. Spatial object tracking system based on linear optical sensor arrays. IEEE Sens. J. 2016, 16, 7933–7940.
24. Zhang, Y.; Liu, C.; Fu, L.; Liu, H. A design of cylindrical lens for linear CCD used in dynamic envelope curve measurement of high-speed train. In Proceedings of the 2015 International Conference on Optical Instruments and Technology, Beijing, China, 7 August 2015.
25. Wu, J.; Ding, H.; Wang, G. Aberration analysis and adjustment of non-spherical lens in the linear CCDs three-dimensional measurement system. In Proceedings of the Optical Fabrication, Testing, and Metrology, St Etienne, France, 26 February 2004; pp. 478–486.
26. Wu, J.; Wen, Q. The method of realizing the three-dimension positioning based on linear CCD sensor in general DSP chip. In Proceedings of the 2008 30th Annual International Conference of the IEEE Engineering in Medicine and Biology Society, Vancouver, BC, Canada, 20–25 August 2008; pp. 2302–2305.
27. Yang, F.; Dai, H.; Xing, H. Least squares based on Rodrigues matrix and its application in similar material model of mining. In Proceedings of the 2015 IEEE International Conference on Mechatronics and Automation (ICMA), Beijing, China, 2–5 August 2015; pp. 1686–1690.
28. Felus, Y.A.; Burtch, R.C. On symmetrical three-dimensional datum conversion. GPS Solut. 2009, 13, 65.
29. Liu, H.; Yang, L.; Guo, Y.; Guan, R.; Zhu, J. Precise calibration of linear camera equipped with cylindrical lenses using a radial basis function-based mapping technique. Opt. Express 2015, 23, 3412–3426.
Figure 1. Projection mapping model of the one-dimensional imaging unit.
Figure 2. Schematic of the POS.
Figure 3. Layout design of the POS.
Figure 4. The FOV of the POS.
Figure 5. System model.
Table 1. Simulation Parameters.

Parameter       Specification (mm)   Parameter   Specification (mm)
CCD1_In         [60, 0, 0]           a1          [75, 30, 50]
CCD1_Out        [90, 0, 0]           b1          [75, −30, 50]
CCD2_In         [0, 60, 0]           a2          [−30, 75, 50]
CCD2_Out        [0, 90, 0]           b2          [30, 75, 50]
CCD3_In         [−60, 0, 0]          a3          [−75, −30, 50]
CCD3_Out        [−90, 0, 0]          b3          [−75, 30, 50]
CCD4_In         [0, −60, 0]          a4          [30, −75, 50]
CCD4_Out        [0, −90, 0]          b4          [−30, −75, 50]
focal length    50                   d           60
sensing length  30
Table 2. The coordinates of anchors in ICS.

Anchor  Coordinate (mm)              Anchor  Coordinate (mm)
A1      [416.66, 416.66, 3000.00]    A19     [416.66, 2916.66, 3000.00]
A2      [1250.00, 416.66, 3000.00]   A20     [1250.00, 2916.66, 3000.00]
A3      [2083.33, 416.66, 3000.00]   A21     [2083.33, 2916.66, 3000.00]
A4      [2916.66, 416.66, 3000.00]   A22     [2916.66, 2916.66, 3000.00]
A5      [3750.00, 416.66, 3000.00]   A23     [3750.00, 2916.66, 3000.00]
A6      [4583.33, 416.66, 3000.00]   A24     [4583.33, 2916.66, 3000.00]
A7      [416.66, 1250.00, 3000.00]   A25     [416.66, 3750.00, 3000.00]
A8      [1250.00, 1250.00, 3000.00]  A26     [1250.00, 3750.00, 3000.00]
A9      [2083.33, 1250.00, 3000.00]  A27     [2083.33, 3750.00, 3000.00]
A10     [2916.66, 1250.00, 3000.00]  A28     [2916.66, 3750.00, 3000.00]
A11     [3750.00, 1250.00, 3000.00]  A29     [3750.00, 3750.00, 3000.00]
A12     [4583.33, 1250.00, 3000.00]  A30     [4583.33, 3750.00, 3000.00]
A13     [416.66, 2083.33, 3000.00]   A31     [416.66, 4583.33, 3000.00]
A14     [1250.00, 2083.33, 3000.00]  A32     [1250.00, 4583.33, 3000.00]
A15     [2083.33, 2083.33, 3000.00]  A33     [2083.33, 4583.33, 3000.00]
A16     [2916.66, 2083.33, 3000.00]  A34     [2916.66, 4583.33, 3000.00]
A17     [3750.00, 2083.33, 3000.00]  A35     [3750.00, 4583.33, 3000.00]
A18     [4583.33, 2083.33, 3000.00]  A36     [4583.33, 4583.33, 3000.00]
Table 3. The intersection points on CCD1–CCD4 in POSCS.

Intersection  Coordinate (mm)         Intersection  Coordinate (mm)
m1            [78.79, 0.00, 0.00]     m2            [0.00, 77.35, 0.00]
m3            [−74.27, 0.00, 0.00]    m4            [0.00, −75.70, 0.00]
n1            [63.37, 0.00, 0.00]     n2            [0.00, 84.54, 0.00]
n3            [−89.68, 0.00, 0.00]    n4            [0.00, −68.52, 0.00]
p1            [71.60, 0.00, 0.00]     p2            [0.00, 61.93, 0.00]
p3            [−81.45, 0.00, 0.00]    p4            [0.00, −91.12, 0.00]
Table 4. The coordinates of anchors in POSCS.

Anchor   Coordinate (mm)
a1 (A1)  [−110.75, −40.31, 2499.99]
a2 (A2)  [644.50, −392.49, 2499.99]
a7 (A7)  [241.43, 714.95, 2499.99]
Table 5. Measurable anchors from P1 to P10.

Position  Anchors in FOV
P1        A1, A2, A7
P2        A15, A16, A21, A22, A23, A28
P3        A20, A21, A26
P4        A24, A29, A30
P5        A17, A18, A23, A24
P6        A16, A22, A23, A29
P7        A1, A7, A8, A13, A14, A15, A20
P8        A15, A20, A21, A26, A27
P9        A25, A26, A31, A32
P10       A15, A16, A21, A22, A23, A27, A28
Table 6. The position and orientation of POS.

                     Position (mm)                 Orientation (°)
                                                   Pitch (α)  Roll (β)  Yaw (γ)
P1 set value         [500.00, 500.00, 500.00]      0.00       0.00      −25.00
P1 estimated value   [500.00, 499.99, 499.98]      0.00       0.00      −24.99
P2 set value         [2450.00, 1180.00, 610.00]    30.00      −5.00     −45.00
P2 estimated value   [2449.99, 1179.99, 609.98]    29.99      −5.00     −45.00
P3 set value         [2917.00, 2345.00, 1234.00]   16.00      27.00     90.00
P3 estimated value   [2917.00, 2344.98, 1233.99]   16.00      26.99     89.99
P4 set value         [4763.00, 2406.00, 1816.00]   37.00      16.00     −125.00
P4 estimated value   [4762.99, 2405.99, 1816.00]   37.00      15.99     −125.00
P5 set value         [4686.00, 2758.00, 1201.00]   −8.00      15.00     −33.00
P5 estimated value   [4686.01, 2758.00, 1200.98]   −8.01      15.00     −33.00
P6 set value         [4700.00, 4716.00, 1768.00]   −35.00     35.00     0.00
P6 estimated value   [4699.97, 4716.00, 1767.99]   −35.00     34.99     0.00
P7 set value         [2615.00, 2934.00, 436.00]    −18.00     30.00     −15.00
P7 estimated value   [2615.00, 2934.01, 435.99]    −18.00     30.00     −14.99
P8 set value         [700.00, 4612.00, 1300.00]    −33.00     −28.00    0.00
P8 estimated value   [699.98, 4612.00, 1299.99]    −33.00     −28.00    0.00
P9 set value         [600.00, 2500.00, 968.00]     40.00      0.00      90.00
P9 estimated value   [599.98, 2499.94, 967.99]     40.00      0.00      89.99
P10 set value        [1500.00, 1300.00, 712.00]    28.00      −23.00    42.00
P10 estimated value  [1499.99, 1299.97, 712.00]    27.99      −23.00    42.00
Table 7. Evaluation of position measurement.

Axis  Maximum Error (mm)  Average Error (mm)  Standard Deviation (mm)
x     0.030               0.011               0.0099
y     0.060               0.015               0.0184
z     0.020               0.011               0.0074
Table 8. Evaluation of orientation measurement.

Angle      Maximum Error (°)  Average Error (°)  Standard Deviation (°)
Pitch (α)  0.010              0.003              0.0048
Roll (β)   0.010              0.003              0.0048
Yaw (γ)    0.010              0.004              0.0052
Table 9. Error comparison between two measurement methods.

Method             Position Maximum Error (mm)  Orientation Maximum Error (°)
Paper [15] method  3.80                         0.104
Our method         0.06                         0.010
