Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
DANAE++: A Smart Approach for Denoising Underwater Attitude Estimation
Received: 15 January 2021 / Approved: 18 January 2021 / Online: 18 January 2021 (14:22:52 CET)
A peer-reviewed article of this Preprint also exists.
Russo, P.; Di Ciaccio, F.; Troisi, S. DANAE++: A Smart Approach for Denoising Underwater Attitude Estimation. Sensors 2021, 21, 1526.
Abstract
One of the main issues in underwater robot navigation is accurate vehicle positioning, which heavily depends on the attitude estimation phase. The systems employed for this purpose are affected by different types of noise, mainly related to the sensors themselves and to the irregular noise of the underwater environment. Filtering algorithms can reduce these effects if properly configured, but this process usually requires careful tuning and time. This paper presents DANAE++, an improved denoising autoencoder based on DANAE, which is able to recover Kalman Filter IMU/AHRS orientation estimates from any kind of noise, independently of its nature. The original deep learning-based architecture already proved to be robust and reliable, but the enhanced implementation achieves significant improvements in both results and performance. In fact, DANAE++ is able to denoise the three attitude angles simultaneously, and this is also verified on the estimates provided by the better-performing Extended Kalman Filter. Further tests could make this method suitable for real-time navigation applications.
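The abstract refers to Kalman Filter attitude estimates as the input that DANAE++ denoises. As a minimal illustration of that first stage, the following sketch applies a scalar Kalman filter with a random-walk model to a noisy roll-angle signal; the signal, noise level, and tuning parameters (`q`, `r`) are hypothetical values chosen for the example, not taken from the paper.

```python
import numpy as np

def kalman_1d(measurements, q=1e-4, r=0.05**2):
    """Scalar Kalman filter smoothing a noisy angle signal.

    q: process-noise variance (random-walk model), r: measurement-noise variance.
    Both are illustrative values, not the paper's tuning.
    """
    x, p = measurements[0], 1.0  # initial state estimate and covariance
    out = []
    for z in measurements:
        p += q                   # predict: state assumed to drift slowly
        k = p / (p + r)          # Kalman gain
        x += k * (z - x)         # update with innovation
        p *= (1 - k)
        out.append(x)
    return np.array(out)

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 500)
true_roll = 0.3 * np.sin(t)                      # synthetic ground-truth roll (rad)
noisy = true_roll + rng.normal(0, 0.05, t.size)  # simulated sensor noise
filtered = kalman_1d(noisy)

# The filtered signal should track the ground truth more closely
# than the raw measurements (lower mean squared error).
mse_raw = np.mean((noisy - true_roll) ** 2)
mse_filtered = np.mean((filtered - true_roll) ** 2)
```

In the paper's pipeline, such KF (or EKF) estimates still contain residual, environment-dependent noise; DANAE++ is the learned post-processing stage that removes it.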
Keywords
attitude estimation; autoencoders; deep learning; denoising; Kalman filter; underwater environment
Subject
Computer Science and Mathematics, Algebra and Number Theory
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.