ENWAR: a RAG-empowered multi-modal LLM framework for wireless environment perception
Large language models (LLMs) hold significant promise in advancing network management and orchestration in sixth-generation (6G) and beyond networks. However, existing LLMs are limited in domain-specific knowledge and their ability to handle multi-modal sensory data, which is critical for real-time situational awareness in dynamic wireless environments. This article addresses this gap by introducing Enwar, an ENvironment-aWARe retrieval-augmented generation (RAG)-empowered multi-modal LLM framework. Enwar seamlessly integrates multi-modal sensory inputs to perceive, interpret, and cognitively process complex wireless environments to provide human-interpretable situational awareness. Enwar is evaluated on the global positioning system (GPS), light detection and ranging (LiDAR), and camera modality combinations of the DeepSense6G dataset with state-of-the-art LLMs such as Mistral-7b/8x7b and LLaMa3.1-8/70/405b. Compared to the general and often superficial environmental descriptions of these vanilla LLMs, Enwar delivers richer spatial analysis, accurately identifies positions, analyzes obstacles, and assesses line-of-sight (LoS) between vehicles. Results show that Enwar achieves key performance indicators of up to 70% relevancy, 55% context recall, 80% correctness, and 86% faithfulness, demonstrating its efficacy in multi-modal perception and interpretation.
Nazar, Ahmad M.
08c49739-566d-4afa-8aaf-bcae430fbead
Celik, Abdulkadir
f8e72266-763c-4849-b38e-2ea2f50a69d0
Selim, Mohamed Y.
34252a1d-1a3b-448c-b5c1-52d1428bab4b
Abdallah, Asmaa
86b80268-48be-4bc8-9577-c989e496e459
Qiao, Daji
08190337-6fc1-4e91-9108-9037ba7d69e5
Eltawil, Ahmed M.
5eb9e965-5ec8-4da1-baee-c3cab0fb2a72
Nazar, Ahmad M., Celik, Abdulkadir, Selim, Mohamed Y., Abdallah, Asmaa, Qiao, Daji and Eltawil, Ahmed M. (2026) ENWAR: a RAG-empowered multi-modal LLM framework for wireless environment perception. IEEE Communications Magazine. (doi:10.1109/MCOM.001.2500261).
Text: ENWAR_A_RAG-Empowered_Multi-Modal_LLM_Framework_for_Wireless_Environment_Perception - Version of Record
More information
e-pub ahead of print date: 29 January 2026
Identifiers
Local EPrints ID: 510550
URI: http://eprints.soton.ac.uk/id/eprint/510550
ISSN: 0163-6804
PURE UUID: 4ac07577-f290-4225-b0ae-3487b8f168f6
Catalogue record
Date deposited: 13 Apr 2026 16:45
Last modified: 14 Apr 2026 02:17
Contributors
Author:
Ahmad M. Nazar
Author:
Abdulkadir Celik
Author:
Mohamed Y. Selim
Author:
Asmaa Abdallah
Author:
Daji Qiao
Author:
Ahmed M. Eltawil