Long Boosted Memory Algorithm for Intelligent Spectrum Sensing in 5G and Beyond Systems

Publisher
Springer Journals
Copyright
Copyright © The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature 2022
ISSN
1064-7570
eISSN
1573-7705
DOI
10.1007/s10922-022-09652-w

Abstract

Forthcoming wireless generations, namely the fifth generation (5G) and beyond, are experiencing roll-out, planning, and implementation issues due to spectrum insufficiency. This spectrum shortage arises from the growing number of wireless subscribers, significant traffic demands, inefficient spectrum distribution, and coexistence problems. Identifying free spectrum for wireless communication services is therefore a critical requirement. Free spectrum can be predicted and modelled using the spectrum sensing functionality of cognitive radio in the potential sub-THz band (0.1–1 THz) for beyond-fifth-generation networks. Owing to the excellent prediction and classification capabilities of deep learning, this research applies deep learning to spectrum sensing. The spectrum sensing data is a time-series sequence of binary 1 (busy slots) and binary 0 (free slots). To model it, a novel Long Boosted Memory Algorithm (LBMA) is proposed. Individual Long Short-Term Memory (LSTM) networks act as weak predictors: they struggle to model long-term dependencies, such as predicting future primary-user presence from past time stamps, and are prone to overfitting. Multiple weak LSTM predictors are therefore combined into a strong predictor, resistant to overfitting, using the AdaBoost technique to obtain robust spectrum predictions. LBMA trains on input vectors such as the RSSI, the distance between the cognitive radio user and the gateways, and energy vectors. LBMA is compared with existing deep learning methods on metrics including training time, accuracy, sensitivity, specificity, detection probability, cross-validation, and time complexity under different SNR scenarios (0 to 20 dB). The simulated results indicate that the proposed LBMA outperforms the existing algorithms, with an accuracy of 99.3%, a sensitivity of 93.1%, a specificity of 92.9%, a sensing time of 1.7599 s with the lowest time complexity, and a training time of 56 s.
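To make the boosting idea in the abstract concrete, the sketch below (not taken from the paper) combines several small LSTM classifiers with AdaBoost-style sample re-weighting over windows of [RSSI, distance, energy] features. All names, network sizes, and hyperparameters are illustrative assumptions, and the authors' exact weighting scheme may differ.

```python
# Minimal sketch, assuming PyTorch and AdaBoost.M1-style re-weighting:
# several small LSTMs are trained as weak predictors of slot occupancy
# (1 = busy, 0 = free) and combined by a weighted vote.
import numpy as np
import torch
import torch.nn as nn

class WeakLSTM(nn.Module):
    """A deliberately small LSTM classifier acting as one weak predictor."""
    def __init__(self, n_features=3, hidden=8):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                      # x: (batch, time, features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :]).squeeze(-1)   # logit for "slot busy"

def boost_lstms(X, y, n_rounds=5, epochs=30, lr=1e-2):
    """X: ndarray (n, time_steps, 3); y: ndarray of 0/1 occupancy labels."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # AdaBoost sample weights
    learners, alphas = [], []
    X_t = torch.tensor(X, dtype=torch.float32)
    y_t = torch.tensor(y, dtype=torch.float32)
    for _ in range(n_rounds):
        model = WeakLSTM(n_features=X.shape[-1])
        opt = torch.optim.Adam(model.parameters(), lr=lr)
        for _ in range(epochs):
            opt.zero_grad()
            logits = model(X_t)
            # sample-weighted binary cross-entropy on the current weights
            loss = (torch.tensor(w, dtype=torch.float32)
                    * nn.functional.binary_cross_entropy_with_logits(
                        logits, y_t, reduction='none')).sum()
            loss.backward()
            opt.step()
        with torch.no_grad():
            pred = (torch.sigmoid(model(X_t)) > 0.5).numpy().astype(int)
        err = np.clip((w * (pred != y)).sum(), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak learner
        w *= np.exp(alpha * np.where(pred != y, 1.0, -1.0))
        w /= w.sum()                           # re-normalise sample weights
        learners.append(model)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    """Weighted vote of the boosted LSTMs: 1 = busy slot, 0 = free slot."""
    X_t = torch.tensor(X, dtype=torch.float32)
    with torch.no_grad():
        score = sum(a * (2 * (torch.sigmoid(m(X_t)) > 0.5).float() - 1)
                    for m, a in zip(learners, alphas))
    return (score > 0).int().numpy()
```

A call such as boost_lstms(windows, labels), where each window holds successive [RSSI, distance, energy] readings, would produce the ensemble, and predict() would give the occupancy decision for new windows; this only illustrates the weighted-vote principle the abstract describes, not the paper's implementation.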

Journal

Journal of Network and Systems Management, Springer Journals

Published: Jul 1, 2022

Keywords: Deep learning; CNN; LSTM; AdaBoost; Spectrum sensing; Cognitive radio; Beyond 5G
