
Monocular camera based trajectory tracking of 3-DOF helicopter. (English) Zbl 1302.93155

Summary: The vision-based flight control of unmanned Micro Aerial Vehicles (MAVs) has attracted much attention in recent years. This paper presents a new solution to the trajectory tracking problem of a 3-Degrees-Of-Freedom (3-DOF) helicopter that uses only an onboard monocular camera, without any artificial markers. First, the Parallel Tracking And Mapping (PTAM) algorithm, a well-known solution to the visual simultaneous localization and mapping (vSLAM) problem, is employed to estimate the attitude angles of the 3-DOF helicopter. The calibration of the mapping between the onboard camera and helicopter coordinate systems is then formulated as an optimization problem, and a robust cost function is applied to obtain an accurate estimate of this mapping. Finally, to alleviate the effects of nonlinearity and coupling between channels, the Feedback Linearization (FL) and Linear Quadratic Regulation (LQR) techniques are employed to design the controller. Experimental results show that the proposed method enables the helicopter to hover without drift and achieve good tracking performance.
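The FL+LQR design described above can be illustrated with a minimal sketch. Assuming (as is standard, not taken from the paper) that feedback linearization reduces each channel of the 3-DOF helicopter to a double integrator, the LQR gain follows from the continuous-time algebraic Riccati equation; the weights below are illustrative choices only.

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# After feedback linearization, a single channel (e.g. elevation) is
# modeled approximately as a double integrator:
#   x = [angle, angular_rate],  x_dot = A x + B v
A = np.array([[0.0, 1.0],
              [0.0, 0.0]])
B = np.array([[0.0],
              [1.0]])

# Illustrative LQR weights (assumed, not the paper's values).
Q = np.diag([10.0, 1.0])   # penalize angle error more than rate
R = np.array([[1.0]])      # penalize control effort

# Solve the continuous-time algebraic Riccati equation for P,
# then form the optimal state-feedback gain K = R^{-1} B^T P.
P = solve_continuous_are(A, B, Q, R)
K = np.linalg.solve(R, B.T @ P)

# The closed loop A - B K must be Hurwitz (all eigenvalues in the
# open left half-plane), i.e. the regulated channel is stable.
eigs = np.linalg.eigvals(A - B @ K)
print(bool(np.all(eigs.real < 0)))  # → True
```

With the linearized channels decoupled in this way, one LQR gain per channel suffices; the coupling handled by the feedback-linearizing inner loop never reaches the linear regulator.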

MSC:

93C85 Automated systems (robots, etc.) in control theory
68T45 Machine vision and scene understanding
94A08 Image processing (compression, reconstruction, etc.) in information and communication theory
68T40 Artificial intelligence for robotics
93B18 Linearizations
49N10 Linear-quadratic optimal control problems

Software:

PIXHAWK; MonoSLAM
