Utilizing a Transformer-Based Model with Self-Attention Mechanism as the feature selector for the Detection of DDoS and Man-in-the-Middle Attacks in IoT Environments: An Application on the Edge-IIoTSet Dataset 


Vol. 2, No. 1, pp. 0-0, Jan. 2025
10.23246/AAIRJ.2025.02.01.02


  Abstract

As cyber threats in IoT networks grow increasingly sophisticated, detecting Man-in-the-Middle (MitM) and Distributed Denial of Service (DDoS) attacks with high precision is critical. This study applies a Transformer-based model with self-attention mechanisms as an intrusion detection system, using it for feature selection and sequence classification on IoT network traffic data. Preprocessing transforms the data into sequential representations, enabling the model to capture temporal relationships and contextual structure. The Transformer, tailored for sequence classification, automatically learns salient features from the processed data, and multi-headed self-attention lets it attend to different aspects of the network traffic, improving detection capability. The model is trained with a cross-entropy loss on a labeled dataset comprising MitM and DDoS instances as well as benign traffic. During inference, it assigns an anomaly score to each sequence and flags high-scoring sequences as likely attacks. Trained and evaluated on the recently released Edge-IIoTset dataset, the model achieved an accuracy of 98.50% and an F1 score of 99.39%, outperforming traditional classifiers such as Support Vector Machines, Long Short-Term Memory networks, and Deep Recurrent Neural Networks, and demonstrating a strong balance between false positives and false negatives, an essential property in network security. These results highlight the potential of Transformer-based models for intrusion detection and point toward more proactive, adaptive network defense mechanisms. Future research could optimize the model's architecture and extend its application to other cybersecurity domains.
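To make the described pipeline concrete, the following is a minimal sketch of a Transformer encoder used as a sequence classifier over preprocessed traffic records, written in PyTorch. The architecture details (model dimension, number of heads and layers, sequence length, and the 48-feature input size) and the benign/DDoS/MitM label mapping are illustrative assumptions, not specifics taken from the paper.

# Hypothetical sketch of a Transformer-based traffic-sequence classifier.
# All hyperparameters below are assumptions for illustration only.
import torch
import torch.nn as nn

class TrafficTransformerClassifier(nn.Module):
    def __init__(self, num_features, d_model=64, n_heads=4, n_layers=2, num_classes=3):
        super().__init__()
        # Project each per-record feature vector into the model dimension.
        self.input_proj = nn.Linear(num_features, d_model)
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        # Multi-headed self-attention lets each head attend to different
        # aspects of the traffic sequence.
        self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.classifier = nn.Linear(d_model, num_classes)  # benign / DDoS / MitM

    def forward(self, x):                      # x: (batch, seq_len, num_features)
        h = self.encoder(self.input_proj(x))   # (batch, seq_len, d_model)
        h = h.mean(dim=1)                      # pool over the sequence
        return self.classifier(h)              # raw class logits

model = TrafficTransformerClassifier(num_features=48)
criterion = nn.CrossEntropyLoss()              # cross-entropy loss, as described
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One hypothetical training step on a mini-batch of labeled sequences.
x = torch.randn(32, 20, 48)                    # 32 sequences of 20 records, 48 features
y = torch.randint(0, 3, (32,))                 # 0 = benign, 1 = DDoS, 2 = MitM
loss = criterion(model(x), y)
loss.backward()
optimizer.step()

# At inference, one way to derive an anomaly score is 1 minus the benign-class
# probability; sequences with high scores are flagged as likely attacks.
with torch.no_grad():
    probs = torch.softmax(model(x), dim=1)
    anomaly_score = 1.0 - probs[:, 0]

This sketch is only meant to show how sequence-shaped traffic data, multi-head self-attention, cross-entropy training, and anomaly scoring fit together; the paper's actual architecture and training setup may differ.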



  Cite this article

[IEEE Style]

I. M. Atalabi, "Utilizing a Transformer-Based Model with Self-Attention Mechanism as the feature selector for the Detection of DDoS and Man-in-the-Middle Attacks in IoT Environments: An Application on the Edge-IIoTSet Dataset," AAIRJ, vol. 2, no. 1, pp. 0-0, 2025. DOI: 10.23246/AAIRJ.2025.02.01.02.

[ACM Style]

Ifedayo Michael Atalabi. 2025. Utilizing a Transformer-Based Model with Self-Attention Mechanism as the feature selector for the Detection of DDoS and Man-in-the-Middle Attacks in IoT Environments: An Application on the Edge-IIoTSet Dataset. AAIRJ, 2, 1, (2025), 0-0. DOI: 10.23246/AAIRJ.2025.02.01.02.

[KICS Style]

Ifedayo Michael Atalabi, "Utilizing a Transformer-Based Model with Self-Attention Mechanism as the feature selector for the Detection of DDoS and Man-in-the-Middle Attacks in IoT Environments: An Application on the Edge-IIoTSet Dataset," AAIRJ, vol. 2, no. 1, pp. 0-0, 1. 2025. (https://doi.org/10.23246/AAIRJ.2025.02.01.02)