
Object traversing by monocular UAV in outdoor environment. (English) Zbl 07886926

Summary: In search and monitoring tasks, object traversing scenarios are more challenging than avoidance and tracking. In this paper, we propose an object perception and path planning algorithm that detects hollow objects and computes a traversing path in real time. In addition, a general vision-based unmanned aerial vehicle (UAV) system framework is designed to implement the proposed algorithm. A series of fundamental but effective algorithms is combined to detect hollow objects, and triangulation is then applied to calculate the 3D position of the object. To improve detection accuracy, a path planning and motion control algorithm is designed that also enhances safety and stability during traversal. Real-world experimental results show that the UAV system is robust and feasible.
© 2020 Chinese Automatic Control Society and John Wiley & Sons Australia, Ltd
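
The summary states that triangulation is applied to recover the 3D position of the detected hollow object; the paper's exact formulation is not reproduced here. As an illustrative sketch only, the following Python fragment (using numpy, with hypothetical intrinsics and poses) shows the standard linear (DLT) two-view triangulation that such a step could build on: the detected object center is observed from two camera poses and the 3D point is solved for.

import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2 : 3x4 camera projection matrices (intrinsics times [R | t]).
    x1, x2 : pixel coordinates (u, v) of the same object point in each view.
    Returns the 3D point in inhomogeneous world coordinates.
    """
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Solve A X = 0 by SVD; the solution is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Hypothetical example: intrinsics K and two poses, with P = K @ [R | t].
K = np.array([[500.0, 0.0, 320.0],
              [0.0, 500.0, 240.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = K @ np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])
X = triangulate_point(P1, P2, (320.0, 240.0), (295.0, 240.0))
print(X)  # approximately (0, 0, 10): the object center 10 m ahead

In a monocular setting, the two views would come from the same camera at two different UAV positions, with the relative pose taken from the flight controller or visual odometry; the metric scale of the result then depends on how accurately that baseline is known.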

MSC:

93-XX Systems theory; control
