International Conference on System-Integrated Intelligence - SysInt 2025
Lecture
04.06.2025
Explainable Residual Attention U-Net for Advancing Damage Diagnostics in Fibre Metal Laminates

Sanjeev Kumar

Universität Bremen

24 min., subtitles (CC)

Fibre metal laminates (FMLs) have proved to be a transformative material for crucial industries such as aerospace, automotive, and marine engineering, owing to their exceptional mechanical properties. However, developing a cost-effective and time-efficient damage diagnostics method remains a challenge given the complex multi-layered structure of FMLs, especially for damage caused by low-energy impacts of less than 20 joules. Automated damage detection and segmentation in computed tomography (CT) data of FMLs using a state-of-the-art deep learning model, the residual attention U-Net, effectively addresses this problem. By leveraging an attention mechanism within a U-Net architecture, the model achieves an F1 score exceeding 75% with a minimal dataset of only 28 training images (300 × 950 pixels) containing 121 unique features. To improve the interpretability of the segmentation results, a Gradient-weighted Class Activation Mapping (Grad-CAM) algorithm adapted for segmentation tasks, known as Seg-Grad-CAM, is used. It provides insight into the model's decision-making process and supports its reliability in critical applications. Because the lack of absolute ground truth hinders a reliable quantitative analysis of the automatic segmentation, a synthetic CT dataset was used to address this problem. The findings of this study contribute to an efficient, interpretable, and scalable solution for automatic damage detection and segmentation in FML plates, enabling broader application in industrial non-destructive testing and structural health monitoring.
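To illustrate the Seg-Grad-CAM idea mentioned in the abstract, the following is a minimal sketch, not the speaker's implementation: channel weights are obtained by global-average-pooling the gradients of the segmentation score with respect to a convolutional layer's activation maps, and the explanation heatmap is the ReLU of the weighted sum of those maps. The function name and the tiny toy tensors are illustrative assumptions.

```python
def seg_grad_cam(activations, gradients):
    """Sketch of the Grad-CAM weighting adapted to segmentation.

    activations, gradients: K x H x W nested lists, where K is the number
    of feature-map channels at the chosen layer (toy example, not real CT data).
    """
    K = len(activations)
    H, W = len(activations[0]), len(activations[0][0])
    # Global-average-pool each channel's gradient map to a scalar weight
    weights = [sum(sum(row) for row in gradients[k]) / (H * W) for k in range(K)]
    # Weighted sum of activation maps, then ReLU, gives the heatmap
    return [[max(0.0, sum(weights[k] * activations[k][i][j] for k in range(K)))
             for j in range(W)] for i in range(H)]


# Toy usage: two 2x2 channels; channel 0 supports the class, channel 1 opposes it
acts = [[[1.0, 2.0], [3.0, 4.0]], [[0.0, 1.0], [1.0, 0.0]]]
grads = [[[1.0, 1.0], [1.0, 1.0]], [[-1.0, -1.0], [-1.0, -1.0]]]
heatmap = seg_grad_cam(acts, grads)  # → [[1.0, 1.0], [2.0, 4.0]]
```

In practice the activations and gradients would be hooked out of the trained residual attention U-Net, and the score being differentiated would be summed over the pixels of the predicted damage region rather than a single class logit.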

