Journal of Systems Engineering and Electronics ›› 2025, Vol. 36 ›› Issue (3): 825-834. doi: 10.23919/JSEE.2025.000067
• CONTROL THEORY AND APPLICATION •
Ruihu ZHOU, Mengqi TONG, Yongxin GAO
Received: 2023-11-22
Online: 2025-06-18
Published: 2025-07-10
Contact: Yongxin GAO
E-mail: zhouruihu@stu.xjtu.edu.cn; tm1920321091@stu.xjtu.edu.cn; yxgao@xjtu.edu.cn
Ruihu ZHOU, Mengqi TONG, Yongxin GAO. Vision-aided inertial navigation for low altitude aircraft with a downward-viewing camera[J]. Journal of Systems Engineering and Electronics, 2025, 36(3): 825-834.
Table 1
Parameter settings for the simulation
Parameter | Value |
IMU angle random walk coefficient | 1e-6 |
IMU angular rate random walk coefficient | 1e-8 |
IMU velocity random walk coefficient | 1e-4 |
IMU acceleration random walk coefficient | 1e-5 |
IMU sample rate/Hz | 100 |
Image processing rate/Hz | 1 |
Feature error standard deviation/pixel | 1 |
Number of camera poses in state | 23 |
Average depth of feature/m | 2 000 |
Minimum number of tracking frames | 18 |
Maximum number of tracking frames | 23 |
Maximum number of features | 40 |
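For readers reproducing the simulation, the sketch below shows one common way such random-walk coefficients are assembled into a continuous-time IMU noise covariance and discretized at the 100 Hz sample rate. This is a minimal illustration, not the paper's implementation; the function name and the mapping of Table 1's four coefficients to gyro/accelerometer noise terms are assumptions.

```python
import numpy as np

def imu_noise_cov(sig_arw=1e-6, sig_rrw=1e-8, sig_vrw=1e-4, sig_acc_rw=1e-5):
    """Diagonal continuous-time IMU noise covariance built from the four
    Table 1 coefficients (assumed mapping): angle random walk = gyro white
    noise, angular rate random walk = gyro bias drift, velocity random
    walk = accel white noise, acceleration random walk = accel bias drift."""
    return np.diag(np.repeat([sig_arw**2, sig_rrw**2,
                              sig_vrw**2, sig_acc_rw**2], 3))

Q_c = imu_noise_cov()   # 12x12 continuous-time noise covariance
dt = 1.0 / 100.0        # 100 Hz IMU sample rate (Table 1)
Q_d = Q_c * dt          # crude first-order discretization per IMU step
```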
Table 2
RMSEs at 3 000 s
State | SINS-RMSE | VIN-RMSE | Improvement/% |
Latitude | 1.005e-03 | 1.451e-05 | 98.56 |
Longitude | 1.010e-03 | 1.559e-04 | 84.57 |
Height | 2.040e+03 | 2.926e+00 | 99.86 |
East velocity | 5.336e+00 | 4.493e-01 | 91.58 |
North velocity | 5.711e+00 | 8.244e-02 | 98.56 |
Up velocity | 2.346e+00 | 9.303e-03 | 99.60 |
Pitch | 2.223e-04 | 4.676e-05 | 78.97 |
Roll | 3.361e-04 | 5.032e-05 | 85.03 |
Yaw | 1.222e-03 | 8.192e-04 | 32.98 |
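The improvement column follows directly from the SINS and VIN columns. The snippet below verifies the arithmetic for the latitude row; the `rmse` helper is illustrative (the estimate and truth arrays would come from the simulation), not the authors' code.

```python
import numpy as np

def rmse(estimate, truth):
    """Root-mean-square error of an estimated state history."""
    err = np.asarray(estimate) - np.asarray(truth)
    return float(np.sqrt(np.mean(err ** 2)))

def improvement_pct(rmse_sins, rmse_vin):
    """Percentage reduction of the VIN RMSE relative to pure SINS."""
    return 100.0 * (rmse_sins - rmse_vin) / rmse_sins

# Latitude row of Table 2: 1.005e-03 -> 1.451e-05 gives 98.56%
print(f"{improvement_pct(1.005e-3, 1.451e-5):.2f}")  # 98.56
```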
Table 3
Parameter settings for the real-world data experiment
Parameter | Value |
IMU angle random walk coefficient | 1e-5 |
IMU angular rate random walk coefficient | 1e-6 |
IMU velocity random walk coefficient | 1e-4 |
IMU acceleration random walk coefficient | 1e-5 |
IMU sample rate/Hz | 100 |
Image processing rate/Hz | 1 |
Number of camera poses in state | 25 |
Minimum number of tracking frames | 22 |
Maximum number of tracking frames | 25 |
Maximum number of features | 40 |
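The window and tracking-frame settings interact: in an MSCKF-style filter, a feature is only consumed in an update once its track ends or reaches the window length. The sketch below illustrates that bookkeeping under the Table 3 settings; the `FeatureTrack` type and the selection rule are assumptions for illustration, not the authors' implementation.

```python
from dataclasses import dataclass, field

MIN_TRACK = 22     # minimum number of tracking frames (Table 3)
MAX_TRACK = 25     # maximum number of tracking frames (= window size)
MAX_FEATURES = 40  # maximum number of features per update

@dataclass
class FeatureTrack:
    observations: list = field(default_factory=list)  # pixel coords per frame
    lost: bool = False  # set once the tracker loses the feature

def select_update_features(tracks):
    """Return features whose tracks are ready to form a multi-state
    constraint update: either the track ended with enough observations,
    or it reached the maximum allowed length (illustrative rule only)."""
    ready = [t for t in tracks
             if (t.lost and len(t.observations) >= MIN_TRACK)
             or len(t.observations) >= MAX_TRACK]
    return ready[:MAX_FEATURES]
```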
Table 4
Performance metrics at 250 s
State |  |  |  |
Latitude | 2.682e-10 | 4.16e-10 | 56.08 |
Longitude | 3.21e-10 | 1.08e-09 | 61.89 |
Height |  |  | 83.28 |
East velocity |  |  | 69.92 |
North velocity |  |  | 75.44 |
Up velocity |  |  | 88.27 |
Pitch | 1.88e-07 | 1.04e-07 | 82.59 |
Roll | 2.18e-07 | 3.94e-07 | 82.48 |
Yaw | 4.00e-06 | 1.089e-05 | 10.15 |