Spacecraft Pose Estimation Based on Different Camera Models

Abstract

Spacecraft pose estimation is an important technology for maintaining or changing the spacecraft orientation in space. When two spacecraft are relatively distant, the depth of the space points is small compared with the measuring distance, so the camera can be treated as a weak perspective projection model. In this paper, a spacecraft pose estimation algorithm based on four symmetrical points on the spacecraft outline is proposed. An analytical solution of the spacecraft pose is obtained by solving the weak perspective projection model, which satisfies the measurement requirements when the measuring distance is large. The solution is then refined from the weak perspective projection model to the perspective projection model to obtain the optimal solution, which meets the measurement requirements when the measuring distance is small. The simulation results show that the proposed algorithm obtains good results even when the noise is large.

1 Introduction

With the development of space technology, an increasing number of space missions involve the relative position measurement of two spacecraft [1,2,3,4], such as space assembly, space satellite repair, fuel injection, satellite capture and tracking, and space interception. The measurement of the spacecraft’s relative position is very important to maintain or change the spacecraft orientation in space to complete a space mission.

For relative position measurement, vision systems have advantages in weight, volume, power consumption, and equipment cost [5,6,7,8]. In the SMART-OLEV mission [9, 10], the SMART-1 platform uses stereo cameras and lighting equipment to provide accurate measurement data within 5 m, but only pointing data at longer distances. The Argon vision system is divided into long-range and short-range vision sensors [11, 12], which select different fields of view for different distances. The natural feature image recognition system developed by the Johnson Space Center uses a 3D model of the target under test to calculate the relative pose [13, 14]; its measurement accuracy is proportional to the relative distance. The measurement system is required to provide the relative pose of the two spacecraft to the control system or other systems at different distances, and the relative distance between the spacecraft varies significantly, by a factor of more than 20. Therefore, spacecraft pose estimation has the following characteristics.

(1) When the two spacecraft are relatively distant, the depths of the feature points on the target spacecraft and the distances between the feature points are much smaller than the relative distance between the two spacecraft.

(2) Because the focal length of the camera is fixed, the accuracy of the feature point extraction decreases with an increase in the relative distance between the two spacecraft.

For these reasons, when the two spacecraft are far apart, the pose measurement accuracy is reduced. At present, two main classes of methods are used for pose estimation.

(1) Cooperative space measurement. The first class consists of analytical algorithms based on the perspective projection camera model, such as perspective-n-point [15,16,17] and direct linear transformation [18,19,20]. Using these algorithms, the spacecraft pose can be solved for directly; however, the accuracy of the resulting pose is often unsatisfactory. The second class consists of optimization algorithms based on nonlinear camera models, such as the Gauss–Newton, Levenberg–Marquardt, and orthogonal iteration algorithms [21,22,23]. These algorithms require a good initial solution for the optimization. Therefore, we aim to obtain high-precision analytical solutions using analytical algorithms. (2) Noncooperative space measurement, in which the pose transformation is calculated using pattern matching and 3D point cloud techniques [24,25,26,27,28].

In this study, a spacecraft pose estimation algorithm based on the geometric constraints of the target spacecraft outline is proposed. To reduce the influence of distance on measurement accuracy, the camera measurement model is simplified. The simulation results show that the proposed algorithm performs well with image feature errors of 0.1 pixel to 1 pixel over distances from 1 m to 20 m.

2 Pose Estimation Algorithm

Spacecraft pose estimation is based on the relationship between the target spacecraft points and the corresponding image points. The relative pose of the target spacecraft coordinate system and camera coordinate system is calculated by using the multipoint correspondence relationship (Figure 1).

Figure 1  Target spacecraft and camera coordinate system

The mapping relations between the target spacecraft point and the image point can be described by two mathematical mappings: 1) rigid transformation and 2) camera model. In the former, the space points in the space coordinate system and camera coordinate system follow a rigid body transformation, namely, rotation and translation transformations. Because the camera is installed on the tracker spacecraft, the relative pose relationship between the tracker spacecraft and the target spacecraft can be described by the pose relationship between the target spacecraft and the camera coordinate system. In the latter, the relationship between 3D space points in the camera coordinates and the projection 2D image points on the camera image plane is considered.

2.1 Algorithm Model

To construct the spacecraft pose estimation model, four coplanar symmetric points are used to calculate the spacecraft pose. The target spacecraft coordinate system is Os-xyz, and the four points are denoted \({\varvec{P}}_{i}^{s}\), \(i = 1, \cdots ,4\). In the target spacecraft coordinate system, the four points are

$${\varvec{P}}_{1}^{s} = \left[ {\begin{array}{*{20}c} a \\ 0 \\ d \\ \end{array} } \right],\quad{\varvec{P}}_{2}^{s} = \left[ {\begin{array}{*{20}c} { - a} \\ 0 \\ d \\ \end{array} } \right],\quad{\varvec{P}}_{3}^{s} = \left[ {\begin{array}{*{20}c} c \\ b \\ d \\ \end{array} } \right],\quad{\varvec{P}}_{4}^{s} = \left[ {\begin{array}{*{20}c} { - c} \\ b \\ d \\ \end{array} } \right],$$
(1)

where a, b, c, and d are known values. The relationship between \({\varvec{P}}_{i}^{s}\) and the corresponding points \({\varvec{P}}_{i}^{c}\) in the camera coordinate system is given by

$${\varvec{P}}_{i}^{c} = {\varvec{RP}}_{i}^{s} + {\varvec{T}},$$
(2)

where

$${\varvec{R}}_{3 \times 3} = \left[ {\begin{array}{*{20}c} I \\ J \\ K \\ \end{array} } \right] = \left[ {\begin{array}{*{20}c} {r_{11} } & {r_{12} } & {r_{13} } \\ {r_{21} } & {r_{22} } & {r_{23} } \\ {r_{31} } & {r_{32} } & {r_{33} } \\ \end{array} } \right],\quad{\varvec{T}}_{3 \times 1} = \left[ {\begin{array}{*{20}c} {t_{x} } \\ {t_{y} } \\ {t_{z} } \\ \end{array} } \right],$$
(3)

where I is the first row of the rotation matrix of \({\varvec{R}}_{3 \times 3}\), J is the second row, and K is the third row.

2.2 Camera Model

The fixed-focus-lens camera model can be simplified to a single-lens model. According to the principles of optics, a space point \(P_{i}^{c} (x_{i}^{c} ,\;y_{i}^{c} ,\;z_{i}^{c} )\), its image point \(p_{i} (u_{i} ,\;v_{i} )\), and the camera origin \(O_{{\text{c}}}\) lie on the same line. This camera model is therefore called the pinhole camera model, also known as the perspective projection model:

$$u_{i} = \frac{{fx_{i}^{c} }}{{z_{i}^{c} }},\quad v_{i} = \frac{{fy_{i}^{c} }}{{z_{i}^{c} }},$$
(4)

where f is the camera focal distance. The spacecraft pose estimation model is

$$\left[ {\begin{array}{*{20}c} {{{u_{i} z_{i}^{c} } \mathord{\left/ {\vphantom {{u_{i} z_{i}^{c} } f}} \right. \kern-0pt} f}} \\ {{{v_{i} z_{i}^{c} } \mathord{\left/ {\vphantom {{v_{i} z_{i}^{c} } f}} \right. \kern-0pt} f}} \\ {z_{i}^{c} } \\ \end{array} } \right] = \left[ {\begin{array}{*{20}c} {IP_{i}^{s} } \\ {JP_{i}^{s} } \\ {KP_{i}^{s} } \\ \end{array} } \right] + \left[ {\begin{array}{*{20}c} {t_{x} } \\ {t_{y} } \\ {t_{z} } \\ \end{array} } \right].$$
(5)
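
For concreteness, the following sketch (in Python with NumPy; all function names and numeric values are illustrative assumptions, not taken from the paper) maps the four target-frame points of Eq. (1) to image coordinates using the rigid transform of Eq. (2) and the pinhole projection of Eq. (4).

```python
import numpy as np

def project_points(P_s, R, T, f):
    """Rigid transform (Eq. (2)) followed by pinhole projection (Eq. (4))."""
    P_c = (R @ P_s.T).T + T                    # P_i^c = R P_i^s + T
    u = f * P_c[:, 0] / P_c[:, 2]              # u_i = f x_i^c / z_i^c
    v = f * P_c[:, 1] / P_c[:, 2]              # v_i = f y_i^c / z_i^c
    return np.column_stack([u, v])

# Illustrative values for a, b, c, d of Eq. (1) and an arbitrary pose
a, b, c, d = 0.075, 0.040, 0.030, 0.075        # metres (assumed)
P_s = np.array([[a, 0, d], [-a, 0, d], [c, b, d], [-c, b, d]])
uv = project_points(P_s, np.eye(3), np.array([0.1, 0.1, 5.0]), f=0.012)
print(uv)                                      # image coordinates of the four points
```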

From Eq. (2), we can obtain the relationship between \(z_{i}^{c}\) and \(t_{z}\):

$$z_{i}^{c} = t_{z} (1 + \varepsilon_{i} ),\quad \varepsilon_{i} = {{KP_{i}^{s} } \mathord{\left/ {\vphantom {{KP_{i}^{s} } {t_{z} }}} \right. \kern-0pt} {t_{z} }}.$$
(6)

Finally, we have

$$\begin{aligned}& \left[ {\begin{array}{*{20}c} {\frac{{u_{1} }}{f}t_{z} (1 + \varepsilon_{1} )} & {\frac{{u_{2} }}{f}t_{z} (1 + \varepsilon_{2} )} & {\frac{{u_{3} }}{f}t_{z} (1 + \varepsilon_{3} )} & {\frac{{u_{4} }}{f}t_{z} (1 + \varepsilon_{4} )} \\ {\frac{{v_{1} }}{f}t_{z} (1 + \varepsilon_{1} )} & {\frac{{v_{2} }}{f}t_{z} (1 + \varepsilon_{2} )} & {\frac{{v_{3} }}{f}t_{z} (1 + \varepsilon_{3} )} & {\frac{{v_{4} }}{f}t_{z} (1 + \varepsilon_{4} )} \\ {t_{z} (1 + \varepsilon_{1} )} & {t_{z} (1 + \varepsilon_{2} )} & {t_{z} (1 + \varepsilon_{3} )} & {t_{z} (1 + \varepsilon_{4} )} \\ \end{array} } \right] \hfill \\ & \quad= \left[ {\begin{array}{*{20}c} { - ar_{11} + dr_{13} } & {ar_{11} + dr_{13} } & {cr_{11} + br_{12} + dr_{13} } & { - cr_{11} + br_{12} + dr_{13} } \\ { - ar_{21} + dr_{23} } & {ar_{21} + dr_{23} } & {cr_{21} + br_{22} + dr_{23} } & { - cr_{21} + br_{22} + dr_{23} } \\ { - ar_{31} + dr_{33} } & {ar_{31} + dr_{33} } & {cr_{31} + br_{32} + dr_{33} } & { - cr_{31} + br_{32} + dr_{33} } \\ \end{array} } \right] \hfill \\& \qquad+ \left[ {\begin{array}{*{20}c} {t_{x} } \\ {t_{y} } \\ {t_{z} } \\ \end{array} } \right]\;. \hfill \\ \end{aligned}$$
(7)

According to the symmetry properties of points, we have

$$\left\{ {\begin{array}{*{20}c} {r_{11} = k_{1} t_{z} ,} \\ {r_{21} = k_{2} t_{z} ,} \\ \end{array} } \right.\left\{ {\begin{array}{*{20}c} {r_{12} = k_{3} t_{z} ,} \\ {r_{22} = k_{4} t_{z} ,} \\ \end{array} } \right.\left\{ {\begin{array}{*{20}c} {r_{13} = k_{5} t_{z} - {{t_{x} } \mathord{\left/ {\vphantom {{t_{x} } d}} \right. \kern-0pt} d},} \\ {r_{23} = k_{6} t_{z} - {{t_{y} } \mathord{\left/ {\vphantom {{t_{y} } d}} \right. \kern-0pt} d},} \\ \end{array} } \right.$$
(8)

where

$$\left\{ \begin{aligned} &\begin{array}{*{20}c} {k_{1} = \frac{{(u_{2} (1 + \varepsilon_{2} ) - u_{1} (1 + \varepsilon_{1} ))}}{2af},} \\ {k_{2} = \frac{{(v_{2} (1 + \varepsilon_{2} ) - v_{1} (1 + \varepsilon_{1} ))}}{2af},} \\ \end{array} \hfill \\& \begin{array}{*{20}c} {k_{3} = \frac{{(u_{3} (1 + \varepsilon_{3} ) + u_{4} (1 + \varepsilon_{4} ) - u_{2} (1 + \varepsilon_{2} ) - u_{1} (1 + \varepsilon_{1} ))}}{2bf},} \\ {k_{4} = \frac{{(v_{3} (1 + \varepsilon_{3} ) + v_{4} (1 + \varepsilon_{4} ) - v_{2} (1 + \varepsilon_{2} ) - v_{1} (1 + \varepsilon_{1} ))}}{2bf},} \\ \end{array} \hfill \\& \begin{array}{*{20}c} {k_{5} = \frac{{(u_{2} (1 + \varepsilon_{2} ) + u_{1} (1 + \varepsilon_{1} ))}}{2df},} \\ {k_{6} = \frac{{(v_{2} (1 + \varepsilon_{2} ) + v_{1} (1 + \varepsilon_{1} ))}}{2df}.} \\ \end{array} \hfill \\ \end{aligned} \right.$$
(9)

2.3 Simplified Model

When the two spacecraft are relatively distant, the accuracy of image feature extraction is low, and the depths of the feature points relative to the target spacecraft origin can be neglected. The camera model can then be approximated by a simplified (weak) perspective projection model [29,30,31]. Consequently, we obtain

$$z_{i}^{c} = t_{z} ,\quad i = 1,2,3,4.$$
(10)

Simplified perspective projection projects the target points onto a plane that passes through the origin of the target spacecraft and is parallel to the imaging plane; it therefore ignores the depth of each point relative to the target spacecraft origin. When the two spacecraft are relatively distant, the error introduced by this approximation is insignificant. Substituting Eq. (10) into Eq. (9), we have

$$\begin{gathered} k_{1} = \frac{{(u_{2} - u_{1} )}}{2af}{, }\,\,k_{2} = \frac{{(v_{2} - v_{1} )}}{2af}{, }\,\,k_{3} = \frac{{(u_{3} + u_{4} - u_{2} - u_{1} )}}{2bf}, \hfill \\ k_{4} = \frac{{(v_{3} + v_{4} - v_{2} - v_{1} )}}{2bf}{, }\,\,k_{5} = \frac{{(u_{2} + u_{1} )}}{2df}{, }\,\,k_{6} = \frac{{(v_{2} + v_{1} )}}{2df}, \hfill \\ \end{gathered}$$
(11)

where the \(k_{i}\) can be calculated from the image points. Equation (8) contains nine unknowns but only six equations; thus, it cannot be solved directly. The rotation matrix R provides the following constraints:

$$\left\{ \begin{aligned} &r_{11}^{2} + r_{12}^{2} + r_{13}^{2} = 1, \hfill \\ &r_{21}^{2} + r_{22}^{2} + r_{23}^{2} = 1, \hfill \\ &r_{31}^{2} + r_{32}^{2} + r_{33}^{2} = 1, \hfill \\ &r_{11} r_{21} + r_{12} r_{22} + r_{13} r_{23} = 0, \hfill \\ &r_{31} r_{21} + r_{32} r_{22} + r_{33} r_{23} = 0, \hfill \\& r_{11} r_{31} + r_{12} r_{32} + r_{13} r_{33} = 0. \hfill \\ \end{aligned} \right.$$
(12)

From the first, second, and fourth equations of Eq. (12), we can obtain

$$(r_{11} r_{22} - r_{12} r_{21} )^{2} - (r_{11}^{2} + r_{12}^{2} + r_{21}^{2} + r_{22}^{2} ) + 1 = 0.$$
(13)

From Eqs. (8) and (13), we obtain

$$(k_{1} k_{4} - k_{2} k_{3} )^{2} t_{z}^{4} - (k_{1}^{2} + k_{2}^{2} + k_{3}^{2} + k_{4}^{2} )t_{z}^{2} + 1 = 0.$$
(14)

Equation (14) is a quartic equation in \(t_{z}\) and therefore has four roots. The two negative roots are discarded according to the relationship between roots and coefficients, and the two positive roots satisfy the following conditions:

$${\text{Condition 1}}: t_{z1}^{2} \le \frac{{k_{1}^{2} + k_{3}^{2} }}{{(k_{1} k_{4} - k_{2} k_{3} )^{2} }},$$
(15)
$${\text{Condition 2}}:t_{z2}^{2} \ge \frac{{k_{2}^{2} + k_{4}^{2} }}{{(k_{1} k_{4} - k_{2} k_{3} )^{2} }}.$$
(16)

Condition 2 can only be satisfied when the rotation angle is greater than 60°; therefore, the result of applying Condition 1 is selected.
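
A minimal numerical sketch of this step is given below (Python with NumPy). It assumes the sign-corrected form of \(k_{6}\) in Eq. (11) and the coefficient \((k_{1} k_{4} - k_{2} k_{3})^{2}\) in Eq. (14) as written above; the function and variable names are ours, not the paper's.

```python
import numpy as np

def weak_perspective_k(uv, a, b, d, f):
    """k_1 ... k_6 of Eq. (11) with eps_i = 0 (simplified projection).
    uv is a 4x2 array of image points ordered as P_1 ... P_4."""
    u, v = uv[:, 0], uv[:, 1]
    k1 = (u[1] - u[0]) / (2 * a * f)
    k2 = (v[1] - v[0]) / (2 * a * f)
    k3 = (u[2] + u[3] - u[1] - u[0]) / (2 * b * f)
    k4 = (v[2] + v[3] - v[1] - v[0]) / (2 * b * f)
    k5 = (u[1] + u[0]) / (2 * d * f)
    k6 = (v[1] + v[0]) / (2 * d * f)
    return k1, k2, k3, k4, k5, k6

def solve_tz(k1, k2, k3, k4):
    """Solve the biquadratic Eq. (14) for t_z and keep the root that
    satisfies Condition 1 (Eq. (15))."""
    A = (k1 * k4 - k2 * k3) ** 2          # coefficient of t_z^4
    B = k1 ** 2 + k2 ** 2 + k3 ** 2 + k4 ** 2
    disc = np.sqrt(B ** 2 - 4.0 * A)      # Eq. (14) is quadratic in x = t_z^2
    x_small = (B - disc) / (2.0 * A)
    x_large = (B + disc) / (2.0 * A)
    bound = (k1 ** 2 + k3 ** 2) / A       # Condition 1 threshold
    x = x_small if x_small <= bound else x_large
    return np.sqrt(x)
```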

Rotation matrix R can be described by four quaternion parameters \((q_{0} ,q_{1} ,q_{2} ,q_{3} )\):

$${\varvec{R}} = \left[ {\begin{array}{*{20}c} {q_{0}^{2} + q_{1}^{2} - q_{2}^{2} - q_{3}^{2} } & {2(q_{1} q_{2} + q_{3} q_{0} )} & {2(q_{1} q_{3} - q_{2} q_{0} )} \\ {2(q_{1} q_{2} - q_{3} q_{0} )} & {q_{0}^{2} - q_{1}^{2} + q_{2}^{2} - q_{3}^{2} } & {2(q_{2} q_{3} + q_{1} q_{0} )} \\ {2(q_{1} q_{3} + q_{2} q_{0} )} & {2(q_{2} q_{3} - q_{1} q_{0} )} & {q_{0}^{2} - q_{1}^{2} - q_{2}^{2} + q_{3}^{2} } \\ \end{array} } \right] \, {.}$$
(17)

Assume that

$$\begin{aligned} \beta &= \frac{1}{2}(r_{32} - (r_{12} r_{31} - r_{11} r_{32} )) \hfill \\ &= \frac{1}{2}(k_{4} t{}_{y} - k_{3} k_{2} t_{z}^{2} - k_{1} k_{4} t_{z}^{2} ) = 2q_{1} q_{2} . \hfill \\ \end{aligned}$$
(18)

From Eqs. (17) and (18), we obtain

$$\left\{ \begin{aligned} &q_{1}^{2} + q_{2}^{2} = \frac{1}{2}(1 + r_{11} ), \hfill \\ &q_{1}^{2} - q_{2}^{2} = - \frac{1}{2}\sqrt {(1 + r_{11} )^{2} - 4\beta^{2} } , \hfill \\ &q_{1} r_{12} + q_{2} r_{31} = 2q_{4} (q_{2}^{2} - q_{1}^{2} ), \hfill \\& q_{2} r_{12} + q_{1} r_{31} = 2q_{3} (q_{2}^{2} - q_{1}^{2} ). \hfill \\ \end{aligned} \right.$$
(19)

As a result, we have

$$\left\{ \begin{aligned} &q_{1} = \frac{1}{2}\sqrt {1 + k_{1} t_{z} - \sqrt {(1 + k_{1} t_{z} )^{2} - 4\beta^{2} } } , \hfill \\ &q_{2} = \frac{\beta }{{2q_{1} }}, \hfill \\& q_{3} = \frac{{q_{2} k_{3} + q_{1} k_{2} }}{{2(q_{2}^{2} - q_{1}^{2} )}}t_{z} , \hfill \\ &q_{4} = \frac{{q_{1} k_{3} + q_{2} k_{2} }}{{2(q_{2}^{2} - q_{1}^{2} )}}, \hfill \\ \end{aligned} \right.$$
(20)
$$\left\{ \begin{aligned} t_{x} = \frac{{t_{z} }}{2f}(u_{1} (1 + \varepsilon_{1} ) + u_{2} (1 + \varepsilon_{2} )) - dr_{13} , \hfill \\ t_{y} = \frac{{t_{z} }}{2f}(v_{1} (1 + \varepsilon_{1} ) + v_{2} (1 + \varepsilon_{2} )) - dr_{23} . \hfill \\ \end{aligned} \right.$$
(21)
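
Once the quaternion components and the translation components have been recovered, the rotation matrix is assembled from Eq. (17). A minimal sketch of that conversion follows (the function name and the sample quaternion are ours):

```python
import numpy as np

def quaternion_to_R(q0, q1, q2, q3):
    """Rotation matrix of Eq. (17) for the quaternion (q0, q1, q2, q3)."""
    return np.array([
        [q0**2 + q1**2 - q2**2 - q3**2, 2 * (q1*q2 + q3*q0),           2 * (q1*q3 - q2*q0)],
        [2 * (q1*q2 - q3*q0),           q0**2 - q1**2 + q2**2 - q3**2, 2 * (q2*q3 + q1*q0)],
        [2 * (q1*q3 + q2*q0),           2 * (q2*q3 - q1*q0),           q0**2 - q1**2 - q2**2 + q3**2],
    ])

# Any unit quaternion yields an orthonormal rotation matrix
q = np.array([0.9, 0.2, 0.3, 0.27])
q /= np.linalg.norm(q)                    # normalise to a unit quaternion
R = quaternion_to_R(*q)
assert np.allclose(R @ R.T, np.eye(3))    # sanity check: R is orthonormal
```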

2.4 Optimization Algorithm

The accuracy of spacecraft pose estimation based on the simplified perspective projection alone is limited. Therefore, an iterative optimization algorithm is constructed to improve the solution accuracy. The algorithm is shown in Figure 2.

Figure 2  Optimal algorithm to improve the accuracy of spacecraft pose estimation

In the algorithm, \(R^{j} = \left[ {\begin{array}{*{20}c} {I^{j} } & {J^{j} } & {K^{j} } \\ \end{array} } \right]^{{\text{T}}}\) and \(T^{j} = \left[ {\begin{array}{*{20}c} {t_{x}^{j} } & {t_{y}^{j} } & {t_{z}^{j} } \\ \end{array} } \right]^{{\text{T}}}\),

$$I^{j} = \frac{{I^{\prime}}}{{\left\| {I^{\prime}} \right\|}},\quad K^{j} = \frac{{K^{\prime}}}{{\left\| {K^{\prime}} \right\|}},\quad J^{j} = I^{j} \times K^{j} ,$$
$$t_{z}^{j} = \frac{1}{2}\left(\frac{1}{{\left\| {I^{\prime}} \right\|}} + \frac{1}{{\left\| {K^{\prime}} \right\|}}\right),\quad t_{x}^{j} = dt_{z}^{j} t^{\prime}_{x} ,\quad t_{y}^{j} = dt_{z}^{j} t^{\prime}_{y} .$$
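
Figure 2 is not reproduced in this text, so the loop below is only one plausible reading of the iteration: the weak perspective solution initializes (R, T), the depth terms \(\varepsilon_{i}\) of Eq. (6) are re-estimated from the current pose, and the rigid transform is re-fitted under the full perspective model. An SVD (Procrustes) update is used here in place of the paper's closed-form expressions for \(I^{j}\), \(K^{j}\), and \(t^{j}\); all function and variable names are ours.

```python
import numpy as np

def refine_pose(P_s, uv, f, R, T, n_iter=10):
    """Illustrative refinement loop, not the authors' exact Figure 2 procedure."""
    for _ in range(n_iter):
        K = R[2, :]                                  # third row of R
        eps = (P_s @ K) / T[2]                       # eps_i = K P_i^s / t_z, Eq. (6)
        z = T[2] * (1.0 + eps)                       # z_i^c = t_z (1 + eps_i)
        Q_c = np.column_stack([uv[:, 0] * z / f,     # left-hand side of Eq. (7)
                               uv[:, 1] * z / f,
                               z])
        # Re-fit R, T from the 3D-3D correspondences P_s -> Q_c (Kabsch/Procrustes)
        mu_s, mu_c = P_s.mean(axis=0), Q_c.mean(axis=0)
        H = (P_s - mu_s).T @ (Q_c - mu_c)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T                           # proper rotation (det = +1)
        T = mu_c - R @ mu_s
    return R, T
```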

3 Experimental Section

The simulation experiment parameters were set as follows. The focal length of the camera was 12 mm, and the pixel size was 7.4 μm × 7.4 μm. The Euler angles and the translation vector were \([\varphi ,\;\theta ,\;\psi ] = [30^\circ ,\;30^\circ ,\;30^\circ ]\) and \(T = [0.5t_{z} ,\;0.5t_{z} ,\;t_{z} ]\) (m), respectively, where \(t_{z}\) ranges from 1 m to 20 m. The four points in the target spacecraft coordinates were

$${\varvec{P}}_{1}^{s} = \left[ {\begin{array}{*{20}c} {75} \\ 0 \\ {75} \\ \end{array} } \right],\quad {\varvec{P}}_{2}^{s} = \left[ {\begin{array}{*{20}c} { - 75} \\ 0 \\ {75} \\ \end{array} } \right],\quad{\varvec{P}}_{3}^{s} = \left[ {\begin{array}{*{20}c} {30} \\ {40} \\ {75} \\ \end{array} } \right],\quad {\varvec{P}}_{4}^{s} = \left[ {\begin{array}{*{20}c} { - 30} \\ {40} \\ {75} \\ \end{array} } \right].$$

Simulation experiments verified the proposed algorithm in the following three aspects: 1) the optimization algorithm was analyzed without noise; 2) the relationship between estimation accuracy and distance was analyzed with zero-mean Gaussian noise with a standard deviation of 0.1 pixel; and 3) the same relationship was analyzed with zero-mean Gaussian noise with a standard deviation of 1 pixel.
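
A minimal sketch of one such simulation run is given below (Python with NumPy). The target coordinates are assumed to be in millimetres, and the Euler-angle convention is our assumption, since the paper does not state which one it uses; the noisy image points would then be fed to the weak perspective solution and the iterative refinement of Section 2.

```python
import numpy as np

rng = np.random.default_rng(0)

f_mm, pixel_mm = 12.0, 7.4e-3                        # focal length and pixel size
P_s = np.array([[75, 0, 75], [-75, 0, 75],
                [30, 40, 75], [-30, 40, 75]], float) # target points, assumed in mm

def euler_to_R(phi, theta, psi):
    """A Z-Y-X Euler convention (assumed)."""
    cz, sz = np.cos(psi), np.sin(psi)
    cy, sy = np.cos(theta), np.sin(theta)
    cx, sx = np.cos(phi), np.sin(phi)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

for tz_m in (1.0, 10.0, 20.0):
    tz = tz_m * 1000.0                               # mm
    R = euler_to_R(*np.radians([30.0, 30.0, 30.0]))
    T = np.array([0.5 * tz, 0.5 * tz, tz])
    P_c = (R @ P_s.T).T + T                          # Eq. (2)
    uv = f_mm * P_c[:, :2] / P_c[:, 2:3]             # ideal image points, mm
    uv += rng.normal(0.0, 0.1 * pixel_mm, uv.shape)  # 0.1-pixel Gaussian noise
    print(f"t_z = {tz_m} m, noisy image points (mm):\n{uv}")
```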

The simulation results are shown in Figures 3 and 4. The spacecraft pose estimation error is large when only the simplified perspective projection is used, and the optimization based on the perspective projection camera model effectively reduces the measurement error. After 10 iterations, the attitude errors are less than 0.42°, and the position errors are less than 4 mm.

Figure 3  Attitude accuracy analysis using optimization

Figure 4  Position accuracy analysis using optimization

Figures 5 and 6 show the estimation accuracy with a mean value of 0 and a standard deviation of 0.1 pixel noise. When \(t_{z}\) is 10 m, the attitude error is less than 0.36°, and the position error is less than 19.5 mm. When \(t_{z}\) is 20 m, the attitude error is less than 0.65°, and the position error is less than 117 mm. The maximum pose error occurs when \(t_{z} = 1\) m. This is mainly because the initial relative position accuracy is low based on the simplified perspective projection model.

Figure 5  Attitude accuracy analysis with 0 mean and 0.1 pixel standard deviation noise

Figure 6  Position accuracy analysis with 0 mean and 0.1 pixel standard deviation noise

Figures 7 and 8 show the estimation accuracy with a mean value of 0 and a standard deviation of 1 pixel noise. When \(t_{z}\) is 10 m, the attitude errors are less than 3°, and the position errors are less than 0.35 m. When \(t_{z}\) is 20 m, the attitude errors are less than 7.5°, and the position errors are less than 1 m.

Figure 7  Attitude accuracy analysis with 0 mean and 1 pixel standard deviation noise

Figure 8  Position accuracy analysis with 0 mean and 1 pixel standard deviation noise

4 Conclusions

To meet the pose estimation accuracy requirements as the relative distance between spacecraft changes from far to near, we propose a method based on two different camera models. The initial value of the spacecraft pose is calculated with a simplified perspective projection model, and the result is then refined with the perspective projection model. The simulation results show that the pose estimation errors are less than 0.8° and 117 mm when the image points are corrupted by zero-mean noise with a standard deviation of 0.1 pixel, and less than 7.5° and 1 m when the standard deviation is 1 pixel. This estimation accuracy can satisfy the requirements of spacecraft missions.

References

  1. M Balch, D Tandy. A pose and position measurement system for the Hubble Space Telescope servicing mission. Proceedings of SPIE - The International Society for Optical Engineering, 2007: 65550F.

  2. M Xu, Q Qu, Y Dong, et al. Capturing a spacecraft around a flyby asteroid using hamiltonian-structure-preserving control. Communications in Nonlinear Science and Numerical Simulation, 2020: 105500.

  3. A M Zou, K D Kumar, A Ruiter. Fixed-time attitude tracking control for rigid spacecraft. Automatica, 2020, 113: 108792.

  4. C Yin. Multi-loop attitude tracking control for capturing spacecraft. Aerospace Control, 2018, 36(01): 42-49. (in Chinese)

  5. J Peng, W Xu, B Liang, et al. Virtual stereo-vision pose measurement of non-cooperative space targets for a dual-arm space robot. IEEE Transactions on Instrumentation & Measurement, 2019: 1-13.

  6. S Sharma, J Ventura, S D'Amico. Robust model-based monocular pose initialization for noncooperative spacecraft rendezvous. Journal of Spacecraft and Rockets, 2018, 55(6): 1414-1429.

  7. G Liu, C Xu, Y Zhu, et al. Monocular vision-based pose determination in close proximity for low impact docking. Sensors, 2019, 19(15): 3261.

  8. L Zhang, F Zhu, Y Hao, et al. Rectangular-structure-based pose estimation method for non-cooperative rendezvous. Applied Optics, 2018, 57(21): 6164-6173.

  9. C Kaiser, F Sjoeberg, J M Delcura, et al. SMART-OLEV-An orbital life extension vehicle for servicing commercial spacecrafts in GEO. Acta Astronautica, 2008, 63(1-4): 400-410.

  10. F Sellmaier, T Boge, J Spurmann, et al. On-orbit servicing missions: Challenges and solutions for spacecraft operations. Spaceops Conference, 2010, 2: 1816-1826.

  11. J M Galante, J V Eepoel, M Strube, et al. Pose measurement performance of the Argon relative navigation sensor suite in simulated flight conditions. AIAA Guidance, Navigation, and Control Conference, Minneapolis, Minnesota, 2012. https://doi.org/10.2514/6.2012-4927.

  12. B J Naasz, E J Van, S Z Queen, et al. Flight results from the HST SM4 relative navigation sensor system. 33rd Annual AAS Rocky Mountain Guidance and Control Conference, Breckenridge, CO, 2010, 137: 723–744.

  13. X Peng, Z Sun, M Chen, et al. Robust noncooperative attitude tracking control for rigid bodies on rotation matrices subject to input saturation constraint. International Journal of Robust and Nonlinear Control, 2021, 32(3): 1583-1603.

  14. R Volpe, M Sabatini, G B Palmerini. Pose and shape reconstruction of a noncooperative spacecraft using camera and range measurements. International Journal of Aerospace Engineering, 2017, 2017(2): 1-13.

  15. V Lepetit, F Moreno-Noguer, P Fua. EPnP: An accurate O(n) solution to the PnP problem. International Journal of Computer Vision, 2009, 81(2): 155-166.

  16. P Chen, G Hu, J Cui. Extended gravitational pose estimation. Optik - International Journal for Light and Electron Optics, 2014, 125(20): 6106-6112.

  17. Q Yu, G Xu, W Dong, et al. Solving the perspective-three-point problem with linear combinations: An accurate and efficient method. Optik - International Journal for Light and Electron Optics, 2020, 228(3): 165740.

  18. L Quan, Z Lan. Linear N-point camera pose determination. IEEE Trans. Pattern Anal. Mach. Intell., 1999, 21(8): 774-780.

  19. R Galego, A Ortega, R Ferreira, et al. Uncertainty analysis of the DLT-Lines calibration algorithm for cameras with radial distortion. Computer Vision & Image Understanding, 2015, 140: 115-126.

  20. A Nagano. Three-dimensional videography using omnidirectional cameras: An approach inspired by the direct linear transformation method. Journal of Biomechanics, 2021, 128: 110722.

  21. W Zhang, G Xu, Y Cheng, et al. Research on orthogonal iteration algorithm of visual pose estimation for UAV landing. 2018 IEEE CSAA Guidance, Navigation and Control Conference (GNCC), Xiamen, China, 2018: 1-6.

  22. G Hu, Z Zhou, J Cao, et al. Non-linear calibration optimisation based on the Levenberg–Marquardt algorithm. IET Image Processing, 2020, 14(7): 1402-1414.

  23. C Sun, H Dong, B Zhang, et al. An orthogonal iteration pose estimation algorithm based on an incident ray tracking model. Measurement Science & Technology, 2018. https://doi.org/10.1088/1361-6501/aad014.

  24. J Li, Y Zhuang, Q Peng, et al. Pose estimation of non-cooperative space targets based on cross-source point cloud fusion. Remote Sensing, 2021, 13(21): 4239.

  25. L Zhang, F Zhu, Y Hao, et al. Optimization-based non-cooperative spacecraft pose estimation using stereo cameras during proximity operations. Applied Optics, 2017, 56(15): 4522.

  26. G Zhao, S Xu, Y Bo. LiDAR-based non-cooperative tumbling spacecraft pose tracking by fusing depth maps and point clouds. Sensors, 2018, 18(10): 3432.

  27. Q Wei, Z Jiang, H Zhang. Robust spacecraft component detection in point clouds. Sensors, 2018, 18(4): 933.

  28. L Liu, G Zhao, Y Bo. Point cloud based relative pose estimation of a satellite in close range. Sensors, 2016, 16(6): 824

  29. F Dornaika, C Garcia. Pose estimation using point and line correspondences. Real-Time Imaging, 1999, 5(3): 215-230.

  30. I Shimshoni, R Basri, E Rivlin . A geometric interpretation of weak-perspective motion. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1999, 21(3): 252-257

  31. J H Byun, T D Han. PPAP: Perspective projection augment platform with Pan–Tilt actuation for improved spatial perception. Sensors, 2019, 19(12): 2652.

Acknowledgements

The authors sincerely thank Professor Naiming Qi of Harbin Institute of Technology for his critical discussion and reading during manuscript preparation.

Funding

Supported by National Natural Science Foundation of China (Grant No. 12272104).

Author information

Contributions

LM was in charge of the whole algorithm and wrote the manuscript; NQ provided suggestions and rectification ideas; ZZ assisted with simulation analysis. All authors read and approved the final manuscript.

Authors’ Information

Lidong Mo, is currently a PhD candidate at School of Astronautics, Harbin Institute of Technology, China.

Naiming Qi, is currently a professor at School of Astronautics, Harbin Institute of Technology, China. He received his PhD degree from Harbin Institute of Technology, China, in 2001.

Zhenqing Zhao, is currently a teacher at Northeast Agricultural University, China.

Corresponding author

Correspondence to Lidong Mo.

Ethics declarations

Competing Interests

The authors declare no competing financial interests.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.

Cite this article

Mo, L., Qi, N. & Zhao, Z. Spacecraft Pose Estimation Based on Different Camera Models. Chin. J. Mech. Eng. 36, 63 (2023). https://doi.org/10.1186/s10033-023-00884-8
