Abstract
In urban low-altitude wireless network (LAWN) scenarios, radar signals used for aerial drone monitoring are often corrupted by building occlusions and noise, impairing reliable perception. To address this, we propose the Deep Time-Frequency Inverse Reconstruction Network (DTFIRNet), a deep learning framework for robust radar perception under occluded conditions. DTFIRNet integrates an enhanced UNet with residual connections and spatio-temporal attention, together with a trainable iSTFT module guided by STFT-derived time-frequency representations (TFRs), enabling signal restoration even under severe occlusion. Experimental results show that DTFIRNet significantly outperforms existing methods in waveform recovery and delay estimation, achieving an average range error of 0.611 m. We also construct and release a dataset of corrupted LFM radar echoes via IEEE DataPort (https://dx.doi.org/10.21227/qw6h-ns61), supporting further research in radar-based LAWN sensing.
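The STFT/iSTFT round trip underlying the TFR-guided reconstruction can be sketched as follows. This is an illustrative example only, not code from the paper: the chirp parameters, window length, and use of scipy's fixed overlap-add iSTFT (in place of DTFIRNet's trainable iSTFT module) are all assumptions.

```python
import numpy as np
from scipy.signal import chirp, stft, istft

# Simulate a linear frequency-modulated (LFM) radar echo, the waveform
# type in the released dataset (parameters here are illustrative).
fs = 1e6                        # sample rate, Hz (assumed)
t = np.arange(0, 1e-3, 1 / fs)  # 1 ms pulse
x = chirp(t, f0=0, t1=t[-1], f1=2e5, method="linear")

# Forward STFT: the time-frequency representation (TFR) that guides
# the reconstruction. A denoising network (e.g. the paper's enhanced
# UNet) would operate on this complex spectrogram.
f, tau, Zxx = stft(x, fs=fs, nperseg=256)

# Inverse STFT back to the time domain. In DTFIRNet this stage is
# trainable; scipy's fixed iSTFT is used here only to show the round trip.
_, x_rec = istft(Zxx, fs=fs, nperseg=256)

err = np.max(np.abs(x - x_rec[: len(x)]))
print(f"max round-trip error: {err:.2e}")
```

With the default Hann window and 50% overlap the COLA condition holds, so the unmodified TFR inverts back to the original waveform to near machine precision; any residual error after occlusion repair would come from the network's edits to the TFR, not from the transform pair itself.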
| Original language | English |
|---|---|
| Pages (from-to) | 3223-3234 |
| Number of pages | 12 |
| Journal | IEEE Transactions on Cognitive Communications and Networking |
| Volume | 12 |
| Early online date | 28 Oct 2025 |
| DOIs | |
| Publication status | Published - 2026 |
Bibliographical note
Publisher Copyright: © 2015 IEEE.
Funding
This work was partially supported by the Peng Cheng Laboratory 2025 Cross-Disciplinary Research Project, Grant No. 2025QYB001 ("Research on 6G AI-Empowered Near-Field MIMO and Movable Antenna Key Technologies"). Part of this research was previously presented at IEEE Int. Conf. Commun. China 2025.
Keywords
- Low-altitude wireless networks (LAWN)
- aerial drone range estimation
- occlusion-aware signal recovery
- radar waveform restoration
Fingerprint
Dive into the research topics of 'DTFIRNet: Deep learning-based Radar Perception for Urban Low-Altitude Wireless Networks'.