University of Southampton Institutional Repository

Robust Bayesian attention belief network for radar work mode recognition

Du, Mingyang, Zhong, Ping, Cai, Xiaohao, Bi, Daping and Jing, Aiqi (2023) Robust Bayesian attention belief network for radar work mode recognition. Digital Signal Processing, 133, [103874]. (doi:10.1016/j.dsp.2022.103874).

Record type: Article

Abstract

Understanding and analyzing radar work modes plays a key role in electronic support measure systems. Many classifiers, for example those based on convolutional neural networks (CNNs) and recurrent neural networks (RNNs), are available for recognizing radar work modes, as well as emitter types, from their waveform parameters. However, the performance of these methods may degrade significantly when confronted with different types of signal degradation, e.g., measurement error, lost pulses and spurious pulses. To tackle this issue, in this paper we develop a Bayesian attention belief network (BABNet) built on Bayesian neural networks, in which the probability distribution over the weights helps to enhance the model's robustness to corrupted data. In particular, we adopt a pre-trained CNN as the prior for Bayesian inference. This not only accelerates convergence, but also prevents the training process from getting stuck in bad local minima. Meanwhile, instead of using RNNs, which are difficult to parallelize, the combination of a padding operation and an attention module in the proposed BABNet enables the CNN backbone to process sequential data of variable length. Extensive experiments are conducted to demonstrate the recognition capability and robustness of the BABNet in different environments.
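For orientation, the abstract describes two architectural ideas: keeping a probability distribution over the network weights (with a pre-trained CNN supplying the prior/starting point) and combining a padding operation with an attention module so that a CNN backbone can handle pulse sequences of variable length. The following is a minimal, hypothetical PyTorch sketch of those two ideas only, not the authors' BABNet implementation; the names BayesianLinear and masked_attention_pool, the Gaussian weight posterior, and the toy attention scoring are assumptions made for illustration.

```python
# Minimal sketch (assumptions, not the paper's code): a weight-distribution layer
# initialised from pre-trained weights, and attention pooling over padded sequences.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BayesianLinear(nn.Module):
    """Linear layer with a Gaussian distribution over its weights."""

    def __init__(self, in_features, out_features, pretrained_weight=None):
        super().__init__()
        # Posterior mean: start from pre-trained weights if available,
        # mimicking the idea of using a pre-trained network as the prior.
        if pretrained_weight is not None:
            self.weight_mu = nn.Parameter(pretrained_weight.clone())
        else:
            self.weight_mu = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        # Posterior scale, parameterised through softplus to stay positive.
        self.weight_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.bias = nn.Parameter(torch.zeros(out_features))

    def forward(self, x):
        # Reparameterisation trick: sample a fresh weight matrix each forward pass.
        sigma = F.softplus(self.weight_rho)
        eps = torch.randn_like(sigma)
        weight = self.weight_mu + sigma * eps
        return F.linear(x, weight, self.bias)


def masked_attention_pool(features, lengths):
    """Attention pooling over zero-padded per-pulse feature sequences.

    features: (batch, max_len, dim) features from a CNN backbone.
    lengths:  (batch,) true sequence lengths before padding.
    """
    max_len = features.shape[1]
    scores = features.mean(dim=-1)                       # (batch, max_len) toy scoring
    mask = torch.arange(max_len)[None, :] < lengths[:, None]
    scores = scores.masked_fill(~mask, float("-inf"))    # ignore padded positions
    weights = torch.softmax(scores, dim=-1)              # attention weights
    return (weights.unsqueeze(-1) * features).sum(dim=1)  # (batch, dim)


if __name__ == "__main__":
    # Toy usage: 3 padded pulse sequences, each pulse described by 8 features.
    feats = torch.randn(3, 10, 8)
    lens = torch.tensor([4, 10, 7])
    pooled = masked_attention_pool(feats, lens)
    head = BayesianLinear(8, 5, pretrained_weight=torch.randn(5, 8) * 0.01)
    logits = head(pooled)   # stochastic: varies between forward passes
    print(logits.shape)     # torch.Size([3, 5])
```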

Text
Robust Bayesian Attention Belief Network for Radar Work Mode Recognition - Accepted Manuscript
Restricted to Repository staff only until 6 December 2024.

More information

e-pub ahead of print date: 7 December 2022
Published date: 1 March 2023
Additional Information: Funding Information: Ping ZHONG (Senior Member, IEEE) received the M.S. degree in applied mathematics and the Ph.D. degree in information and communication engineering from the National University of Defense Technology (NUDT), Changsha, China, in 2003 and 2008, respectively. Dr. Zhong was a recipient of the National Excellent Doctoral Dissertation Award of China in 2011 and the New Century Excellent Talents in the University of China in 2013. From March 2015 to February 2016, he was a Visiting Scholar with the Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge, U.K. He is currently a Professor with the National Key Laboratory of Science and Technology on ATR, NUDT. He has authored more than 40 peer-reviewed articles in international journals, such as the IEEE Transactions and Letters. His research interests include computer vision, machine learning and pattern recognition. Funding Information: This document is the result of a research project funded by the National Natural Science Foundation of China (Grant no. 61971428). Publisher Copyright: © 2022 Elsevier Inc.
Keywords: Attention mechanism, Bayesian neural network, Pulse descriptor word, Radar work mode, Recognition, Robustness

Identifiers

Local EPrints ID: 475142
URI: http://eprints.soton.ac.uk/id/eprint/475142
ISSN: 1051-2004
PURE UUID: 1cb3b43d-16e4-4fd0-b323-620d4ce08565
ORCID for Xiaohao Cai: orcid.org/0000-0003-0924-2834

Catalogue record

Date deposited: 10 Mar 2023 17:43
Last modified: 17 Mar 2024 04:01

Contributors

Author: Mingyang Du
Author: Ping Zhong
Author: Xiaohao Cai
Author: Daping Bi
Author: Aiqi Jing

