Figure - available from: Transactions on Emerging Telecommunications Technologies
Multitier cloud radio access network HetNets structure

Source publication
Article
Full-text available
Networks have evolved rapidly over the last few years in order to meet user requirements; for example, 5G offers most of the available spectrum under one umbrella. In this work, we address the resource allocation problem in fifth-generation (5G) networks, to be exact in Cloud Radio Access Networks (C-RANs). The radio access n...

Similar publications

Conference Paper
Full-text available
Massive machine-type communication (mMTC) is expected to play a pivotal role in emerging 5G networks. Considering the dense deployment of small cells and the existence of heterogeneous cells, an MTC device can discover multiple cells for association. Under traditional cell association mechanisms, MTC devices are typically associated with an eNodeB...

Citations

... Network virtualization is in place because of a massive influx of devices coming with IoT and 5G applications [15]. Hence, there is tremendous pressure on next-generation infrastructure. ...
Article
Full-text available
The network transformation from physical network functions to Virtual Network Functions (VNFs) requires a fundamental design change in the way applications and services are tested and assured in a hybrid virtual network. Once VNFs are onboarded in a cloud network infrastructure, operators need to test them automatically, in real time, at the moment of instantiation. This paper analyses the problem of adaptive self-healing of a Virtual Machine (VM) allocated by a VNF using a Deep Reinforcement Learning (DRL) approach. The DRL-based big data collection and analytics engine aggregates probe data and analyzes it for troubleshooting and performance management, helping to determine corrective (self-healing) actions such as scaling or migrating VNFs. Hence, we propose a Deep Q-Learning (DQL)-based Deep Q-Network (DQN) mechanism for self-healing VNFs in the virtualized infrastructure manager. Virtual network probes in a closed-loop orchestration automate the VNF and provide analytics for real-time, policy-driven orchestration in an open networking automation platform, using the stochastic gradient descent method for VNF service assurance and network reliability. The proposed DQN/DDQN mechanism optimizes pricing and lowers resource-usage cost by 18% without disrupting the Quality of Service (QoS) provided by the VNF. The adaptive self-healing of VNFs enhances computational performance by 27% compared with other state-of-the-art algorithms.
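The self-healing loop described in this abstract pairs a learned policy with corrective actions such as scaling or migrating a VM. As a rough illustration only, here is a tabular Q-learning sketch (a simplified stand-in for the paper's DQN/DDQN, which would use a neural network) in which an agent learns which corrective action to take for a VM; the states, actions, and reward function are invented assumptions, not the paper's model.

```python
import random

# Hypothetical sketch of DQL-style self-healing: a tabular Q-learning agent
# learns which corrective action (no-op, scale, migrate) to take for a VM
# given its discretized load state. All states, actions, and rewards are
# illustrative assumptions.

ACTIONS = ["no_op", "scale", "migrate"]
STATES = ["low", "medium", "high"]  # discretized VM load

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2

def reward(state, action):
    # Toy reward: scaling/migrating under high load heals the VM;
    # unnecessary actions under low load waste resources.
    if state == "high":
        return 1.0 if action in ("scale", "migrate") else -1.0
    if state == "low":
        return 1.0 if action == "no_op" else -0.5
    return 0.5 if action == "no_op" else 0.0

def train(episodes=2000, seed=0):
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
    for _ in range(episodes):
        s = rng.choice(STATES)
        # Epsilon-greedy action selection.
        if rng.random() < EPSILON:
            a = rng.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: q[(s, x)])
        r = reward(s, a)
        s2 = rng.choice(STATES)  # toy transition: load drifts randomly
        best_next = max(q[(s2, x)] for x in ACTIONS)
        q[(s, a)] += ALPHA * (r + GAMMA * best_next - q[(s, a)])
    return q

def policy(q, state):
    return max(ACTIONS, key=lambda a: q[(state, a)])
```

After training, the greedy policy picks a healing action only when the (toy) load state warrants it.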
... RA has attracted a lot of attention from several authors in the past; however, it is mainly for single-tier networks, as seen in the seminal work by Dahrouj et al. [8] and references therein. Recently, authors have investigated RA in multi-tier networks, such as in [12]; their RA optimization problem was similar to ours in the sense that it was geared towards achieving spectral efficiency, but the methodology used to actualize it differs. This work differs from the aforementioned reviewed works because they do not properly address the significant inter-cell interference problem encountered in 5G heterogeneous networks, and hence cannot be used effectively in them. ...
Article
Full-text available
The fifth-generation (5G) mobile communication network is believed to outperform its predecessors in terms of improved spectral efficiency, throughput, energy efficiency, quality of experience, etc. However, if these gains are to be realized, an inter-cell interference management scheme must be designed to tackle the inter-cell interference that is prevalent in systems using frequency reuse-one deployment. In this work, we first design a user-centric clustering scheme that identifies the base stations providing strong inter-cell interference links to a given user; afterward, a resource allocation scheme is devised by the user's base station and all the other base stations in the cluster to mitigate the inter-cell interference and improve the aggregate spectral efficiency of the system. Results obtained show that the aggregate spectral efficiency of the system is improved and the inter-cell interference mitigated when the devised clustering scheme and coordinated beamforming vectors are utilized in the system.
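The user-centric clustering step described above can be sketched as follows. This is not the paper's exact algorithm, just an illustrative heuristic in which a user's cluster contains every base station whose received power falls within a threshold of the serving link; the log-distance path-loss model, transmit power, and 10 dB threshold are all assumptions.

```python
import math

# Illustrative user-centric clustering: the user is served by its strongest
# base station (BS), and any other BS whose received power is within
# threshold_db of the serving link is treated as a strong inter-cell
# interferer that joins the coordination cluster.

def rx_power_dbm(tx_dbm, d_m, exponent=3.5):
    # Simple log-distance path-loss model (an assumption, not the paper's).
    return tx_dbm - 10 * exponent * math.log10(max(d_m, 1.0))

def user_centric_cluster(user_xy, bs_xy, tx_dbm=43.0, threshold_db=10.0):
    powers = []
    for i, (x, y) in enumerate(bs_xy):
        d = math.hypot(user_xy[0] - x, user_xy[1] - y)
        powers.append((rx_power_dbm(tx_dbm, d), i))
    powers.sort(reverse=True)
    serving_p, serving = powers[0]
    # Every BS within threshold_db of the serving link is a strong interferer.
    cluster = [i for p, i in powers if serving_p - p <= threshold_db]
    return serving, cluster
```

For a user near the midpoint of two close base stations, both land in the cluster, while a distant third station is excluded.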
... The growth of demanding mobile applications as well as an increase in daily traffic are driving up the use of mobile devices [3]. When lockdowns were established because of COVID-19 concerns, this behaviour was especially evident [4][5][6]. The increasing volume of traffic poses new technical hurdles in terms of handling mobile devices and satisfying a variety of user needs [7]. ...
Article
Full-text available
In recent times, the advancement in network devices has focused entirely on the miniaturization of services that should ensure better connectivity between them via fifth generation (5G) technology. The 5G network communication aims to improve Quality of Service (QoS). However, the allocation of resources is a core problem that increases the complexity of packet scheduling. In this paper, a resource allocation model is developed using a novel deep learning algorithm for optimal resource allocation. The novel deep learning is formulated using the constraints associated with optimal radio resource allocation. The objective function design aims at reducing the system delay. The study predicts the traffic in a complex environment and allocates resources accordingly. The simulation was conducted to test the scheduling efficacy and the results showed an improved rate of allocation than the other methods.
... Risks are still quite minimal in comparison with qualitative ones (Failed 2022; Bashir et al. 2019;Sangaiah et al. 2019;Jayaraman et al. 2023). ...
Article
Full-text available
The widespread use of networks in industrial control systems has led to a number of problems, one of the most pressing being cyber security, or the protection of information with the goal of preventing cyberattacks. This work provides a model that combines fault tree analysis, decision theory, and fuzzy theory to (i) identify the current causes of cyberattack-prevention failures and (ii) assess the vulnerability of a cybersecurity system. The fuzzy-based modified MCDM-TOPSIS model was used to analyse the cybersecurity risks associated with attacks on websites, e-commerce platforms, and enterprise resource planning (ERP) systems, as well as the potential effects of such attacks. We evaluate these effects, which include data dissemination, data alteration, data loss or destruction, and service disruption, in terms of criteria linked to monetary losses and restoration time. The model application's findings show how effective it is and how much more susceptible e-commerce is to cybersecurity threats than websites or ERP, in part because of frequent operator access, credit transactions, and user authentication issues that are exclusive to e-commerce.
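For orientation, a minimal classical (crisp) TOPSIS routine is sketched below. The paper's variant is fuzzy-based and modified, so this only shows the shared backbone of the method: vector normalization, weighted ideal/anti-ideal solutions, and closeness coefficients. The alternatives, criteria scores, and weights used in the test are invented.

```python
import math

# Minimal crisp TOPSIS (not the paper's fuzzy-modified variant).
# matrix[i][j]: score of alternative i on criterion j.
# benefit[j]:   True if larger is better for criterion j.

def topsis(matrix, weights, benefit):
    n_alt, n_crit = len(matrix), len(matrix[0])
    # Vector-normalize each column, then apply criterion weights.
    norms = [math.sqrt(sum(matrix[i][j] ** 2 for i in range(n_alt)))
             for j in range(n_crit)]
    v = [[weights[j] * matrix[i][j] / norms[j] for j in range(n_crit)]
         for i in range(n_alt)]
    # Ideal (best) and anti-ideal (worst) virtual alternatives per criterion.
    ideal = [max(v[i][j] for i in range(n_alt)) if benefit[j]
             else min(v[i][j] for i in range(n_alt)) for j in range(n_crit)]
    worst = [min(v[i][j] for i in range(n_alt)) if benefit[j]
             else max(v[i][j] for i in range(n_alt)) for j in range(n_crit)]
    scores = []
    for i in range(n_alt):
        d_pos = math.dist(v[i], ideal)   # distance to ideal solution
        d_neg = math.dist(v[i], worst)   # distance to anti-ideal solution
        scores.append(d_neg / (d_pos + d_neg))  # closeness coefficient
    return scores
```

An alternative that dominates on every criterion receives a closeness coefficient of 1.0, matching the intuition that it is the most exposed (or best-ranked) option.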
... Efficient resource allocation is of critical importance for 5G networks to ensure optimal performance. Various techniques, such as cognitive radio, network virtualization, and software-defined networking, have been proposed to improve resource allocation in 5G networks [5]. These techniques ensure the available resources are used optimally to maximize these networks' performance. ...
... Bashir et al. [5] discussed how optimal multitier resource allocation for the cloud radio access network (C-RAN) in 5G can be achieved using a deep learning framework. This framework can be designed to learn the characteristics of the radio environment within the 5G cloud RAN. ...
Article
Full-text available
An intelligent decision model for efficient resource allocation in 5G broadband communication networks is essential for ensuring the most efficient use of available resources. This model considers several factors, such as traffic demand, network topology, and radio access technology, to make the most efficient resource-allocation decisions. It is based on intelligent algorithms and advanced analytics, which allow the network to quickly and accurately identify the optimal resource allocation for a given situation, and it can reduce costs, improve network performance, and increase customer satisfaction. In addition, the model can help operators reduce the complexity and cost of managing a 5G network. The model combines artificial intelligence (AI) and optimization techniques: AI identifies patterns in traffic and user behavior, while optimization techniques maximize resource utilization and reduce latency in the network. The model can also leverage predictive analytics and machine learning algorithms to determine the most efficient allocation of resources, and use AI to detect and mitigate potential security threats and malicious activities in the network. The proposed IDM reaches 91.85% accuracy, 90.05% precision, 90.96% recall, and 91.33% F1-score.
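The reported figures (accuracy, precision, recall, F1) are standard confusion-matrix metrics. The helper below merely shows how the four quantities relate, with illustrative tp/fp/fn/tn counts; it says nothing about how the IDM itself is built.

```python
# Standard binary-classification metrics from confusion-matrix counts:
#   tp: true positives, fp: false positives,
#   fn: false negatives, tn: true negatives.
# The counts passed in are illustrative, not the paper's data.

def classification_metrics(tp, fp, fn, tn):
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)            # of predicted positives, how many real
    recall = tp / (tp + fn)               # of real positives, how many found
    f1 = 2 * precision * recall / (precision + recall)  # harmonic mean
    return accuracy, precision, recall, f1
```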
... However, the communication channels must also ensure a very high data throughput and minimal latency. Although connecting billions of devices and sensors to the network simultaneously is difficult, this goal can be realized by utilizing solutions like massive MIMO, D2D technology, and cloud RAN [45]. ...
Article
Full-text available
Fifth-generation (5G) cellular networks are state-of-the-art wireless technologies revolutionizing all wireless systems. The fundamental goals of 5G are to increase network capacity, improve data rates, and reduce end-to-end latency. Therefore, 5G can support many devices connected to the Internet and realize the Internet of Things (IoT) vision. Although 5G offers valuable capabilities for mobile wireless networks, specific issues still need to be resolved. This article thoroughly introduces 5G technology, detailing its needs, infrastructure, features, and difficulties. In addition, it summarizes all the requirements and specifications of the 5G network based on the 3rd Generation Partnership Project (3GPP) Releases 15–17. Finally, this study discusses the key specification challenges of 5G wireless networks.
... The ITU-R P series of recommendations contains valuable data on radio-wave propagation, and ITU-R reports highlight the issues at the higher frequencies, including the millimeter-wave frequencies [48]. For instance, IoT nodes are ordinarily low-complexity devices with limited on-board power [49,50] (see Table 2). ...
Article
Mobile communication is viewed as one of the essential problems in the area of cellular networks. A mobile network implements the appropriate public model to deliver packet data services at high speed, and the packet data service model is required to improve quality of service (QoS). This research effort addresses the improvement of QoS and radio access in 5G mobile networks. To improve built-in services, a resource allocation protocol is applied at the Radio Access Network (RAN) layer to improve QoS in 5G. QoS labeling is a form of organizational communication that enables a local hub or exit station to signal its neighbors and request special treatment for favored traffic, giving traffic a way to adjust. ISPs perform well and show the full potential of the pervasive network by utilizing the simulation model with 5G's ultra-low latency, real-time connectivity, significantly increased capacity, and high speed. This will help make device connections faster, more efficient, and less prone to delays. 5G can be categorized into low, medium, and high frequency bands. The mid-band enables better cellular broadband connections and greater machine-to-machine communication, and provides faster connectivity than 4G at close range.
... Therefore, to address the different challenges of fog computing, many ML-based caching techniques have been developed to enhance fog networks. For example, in [124,125], an ML-based caching scheme named online proactive caching is developed to predict time-series requests for contents and update the network edge cache. In this scheme, Bidirectional Deep-recurrent Neural Network (BRNN) and convolutional neural network models are used to predict content popularity and reduce computational costs. ...
Article
Full-text available
The massive growth of diversified smart devices and continuous data generation poses a challenge to communication architectures. To deal with this problem, communication networks consider fog computing one of the promising technologies that can improve overall communication performance. It brings on-demand services proximate to the end devices and delivers the requested data in a short time. Fog computing faces several issues, such as latency, bandwidth, and link utilization, due to limited resources and the high processing demands of end devices. To this end, fog caching plays an imperative role in addressing data dissemination issues. This study provides a comprehensive discussion of fog computing, the Internet of Things (IoT), and the critical issues related to data security and dissemination in fog computing. Moreover, we examine fog-based caching schemes and how they address the existing issues of fog computing. Besides, this paper presents a number of caching schemes with their contributions, benefits, and challenges in overcoming the problems and limitations of fog computing. We also identify machine learning-based approaches for cache security and management in fog computing, as well as several prospective future research directions in caching, fog computing, and machine learning.
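Among the surveyed schemes, popularity-driven caching is the simplest to illustrate. The toy sketch below replaces a learned popularity predictor (such as the BRNN/CNN models in [124,125]) with a plain request counter; the cache admits a new item only when it is more popular than the least popular cached item. The capacity and request trace are illustrative assumptions.

```python
from collections import Counter

# Toy popularity-driven fog cache: a request counter stands in for the
# learned popularity predictor used by the surveyed proactive-caching
# schemes. Admission: a new item displaces the least popular cached item
# only if it has become strictly more popular.

class PopularityCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self.counts = Counter()   # stand-in for a popularity predictor
        self.cache = set()

    def request(self, content):
        """Record a request; return True on cache hit, False on miss."""
        self.counts[content] += 1
        hit = content in self.cache
        if not hit:
            if len(self.cache) < self.capacity:
                self.cache.add(content)
            else:
                # Evict the least popular cached item if the new one is hotter.
                victim = min(self.cache, key=lambda c: self.counts[c])
                if self.counts[content] > self.counts[victim]:
                    self.cache.discard(victim)
                    self.cache.add(content)
        return hit
```

On a skewed request trace, frequently requested content stays cached while one-off requests never displace it.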