Figure 3 - available via license: CC BY
Directional communication with multiple beams in 5G network.


Source publication
Article
Full-text available
5G is expected to deal with high data rates for different types of wireless traffic. To enable these high data rates, 5G employs a beam-searching operation to align the best beam pairs. The beam-searching operation, along with the high-order modulation techniques used in 5G, exhausts the battery power of user equipment (UE). The LTE network uses discontinuous reception (DRX...

Context in source publication

Context 1
... The LTE-DRX mechanism does not include a beam-searching operation [19,20]. Figure 3 shows the concept of directional communication with multiple beams in a 5G network. The gNB transmits K beams and a UE has L beams. ...
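The alignment over the K gNB beams and L UE beams described above can be sketched as an exhaustive sweep over all K x L pairs. This is only a minimal illustration: the gain values and the additive-dB link model are invented for the example, not taken from the paper.

```python
import itertools

def best_beam_pair(gnb_gains, ue_gains):
    """Exhaustively sweep all K x L transmit/receive beam pairs and
    return the (k, l) index pair with the highest combined gain."""
    best, best_pair = float("-inf"), None
    for k, l in itertools.product(range(len(gnb_gains)), range(len(ue_gains))):
        combined = gnb_gains[k] + ue_gains[l]  # gains in dB add
        if combined > best:
            best, best_pair = combined, (k, l)
    return best_pair, best

# K = 4 gNB beams, L = 2 UE beams (illustrative dB gains)
pair, gain = best_beam_pair([3.0, 7.5, 5.2, 1.1], [2.0, 4.4])
print(pair, gain)  # pair (1, 1) has the highest combined gain
```

The sweep costs K x L measurements, which is exactly why the source publication treats beam searching as a major drain on UE battery power.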

Similar publications

Article
Full-text available
Fifth-generation (5G) networks deal with high data rates, ultra-low latency, higher reliability, massive network capacity, greater availability, and a more uniform user experience. To achieve these high data rates, 5G networks engage in a beam-searching operation. By adopting a beam-searching state between the short and long sleep, one can redu...

Citations

... Contemporary smartphones have highly advanced image-processing systems and smart integrated assistants, and offer gigabit speeds over their radios. All of these features, and many more, are enabled by artificial intelligence [1], [2], [28]. ...
... Less obviously, artificial intelligence techniques themselves can also be used to reduce mobile energy consumption, for example by optimizing data transmission [28] or location services [29]. Hence, artificial intelligence has two faces in this problem: it is both 1) a key enabler of desired (efficient) mobile features and 2) a major power draw on these devices, playing a part in both the solution and the problem. ...
Preprint
Artificial intelligence is bringing ever new functionalities to the realm of mobile devices that are now considered essential (e.g., camera and voice assistants, recommender systems). Yet, operating artificial intelligence takes up a substantial amount of energy. However, artificial intelligence is also being used to enable more energy-efficient solutions for mobile systems. Hence, artificial intelligence has two faces in that regard: it is both a key enabler of desired (efficient) mobile functionalities and a major power draw on these devices, playing a part in both the solution and the problem. In this paper, we present a review of the literature of the past decade on the usage of artificial intelligence within the realm of green mobile computing. From the analysis of 34 papers, we highlight the emerging patterns and map the field into 13 main topics that are summarized in detail. Our results show that the field has been slowly growing in recent years, more specifically since 2019. Regarding the double impact AI has on mobile energy consumption, the energy consumption of AI-based mobile systems is under-studied in comparison to the usage of AI for energy-efficient mobile computing, and we argue for more exploratory studies in that direction. We observe that although most studies are framed as solution papers (94%), the large majority do not make those solutions publicly available to the community. Moreover, we also show that most contributions are purely academic (28 out of 34 papers) and that we need to promote the involvement of the mobile software industry in this field.
... Meanwhile, the connection via the LTE network is maintained. In [13], artificial intelligence (AI) is used to predict the arrival of packets from wireless links. All these methods can improve energy efficiency in 5G radio networks, but, on the other hand, they may reduce energy efficiency in the LTE or other wireless networks. ...
Article
Full-text available
The evolution of mobile technology and computation systems enables User Equipment (UE) to manage tremendous amounts of data transmission. With current 5G technology, several types of wireless traffic in millimeter-wave bands can be transmitted at high data rates with ultra-reliable and low-latency communications. 5G networks rely on directional beamforming to overcome mmWave propagation and penetration losses. To align the best beam pairs and achieve high data rates, beam-search operations are used in 5G. This, combined with multibeam reception and high-order modulation techniques, drains the battery power of the UE. In the previous 4G radio mobile system, Discontinuous Reception (DRX) techniques were successfully used to save energy. To reduce the energy consumption and latency of multiple-beam 5G radio communications, this paper proposes the DRX Beam Measurement technique (DRX-BM). Based on power-saving factor analysis and delayed response, DRX-BM is modeled as a semi-Markov process to reduce the tracking time. MATLAB simulations are used to assess the effectiveness of the proposed model and avoid unnecessary time spent on beam search. Furthermore, the simulations indicate that the proposed technique improves on existing approaches and saves 14% of energy with minimal delay.
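The power-saving factor mentioned in the abstract above is, loosely, the fraction of time the UE spends in sleep states. A minimal duty-cycling sketch follows; the state names, durations, and power draws are invented for illustration and are not the paper's semi-Markov parameters.

```python
def power_saving_factor(state_durations, state_powers, sleep_states):
    """Return (fraction of the cycle spent in sleep states,
    duty-cycled average power draw over one cycle)."""
    total = sum(state_durations.values())
    sleep = sum(state_durations[s] for s in sleep_states)
    avg_power = sum(state_durations[s] * state_powers[s]
                    for s in state_durations) / total
    return sleep / total, avg_power

# Illustrative per-cycle durations (ms) and power draws (mW)
durations = {"active": 10, "beam_search": 5, "light_sleep": 25, "deep_sleep": 60}
powers = {"active": 200, "beam_search": 150, "light_sleep": 20, "deep_sleep": 5}
psf, avg = power_saving_factor(durations, powers, {"light_sleep", "deep_sleep"})
print(psf, avg)  # sleep fraction and mean power (mW) for these toy numbers
```

With these toy numbers the UE sleeps 85% of the cycle; lengthening the deep-sleep share is exactly what DRX-style schemes trade against added wake-up latency.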
... Memon et al. [105] published a paper on the use of AI to save energy in 5G networks. The study used a recurrent LSTM artificial neural network to extract the packet arrival time pattern from network traffic analysis [105]. The results in the discussed article indicate that, compared to the LTE-DRX method, the proposed algorithm shows 69% higher energy efficiency on trace 1 and 55% on trace 2 [105]. The presented research results are crucial for the entire mobile industry in the context of the future development of next-generation networks. ...
Article
Full-text available
The digital transformation of the energy sector toward the Smart Grid paradigm, intelligent energy management, and distributed energy integration poses new requirements for computer science. Issues related to the automation of power grid management, multidimensional analysis of data generated in Smart Grids, and optimization of decision-making processes require urgent solutions. The article aims to analyze the use of selected artificial intelligence (AI) algorithms to support the abovementioned issues. In particular, machine learning methods, metaheuristic algorithms, and intelligent fuzzy inference systems were analyzed. Examples of the analyzed algorithms were tested in crucial domains of the energy sector. The study analyzed cybersecurity, Smart Grid management, energy saving, power loss minimization, fault diagnosis, and renewable energy sources. For each domain of the energy sector, specific engineering problems were defined, for which the use of artificial intelligence algorithms was analyzed. Research results indicate that AI algorithms can improve the processes of energy generation, distribution, storage, consumption, and trading. Based on conducted analyses, we defined open research challenges for the practical application of AI algorithms in critical domains of the energy sector.
... In this context, accurate traffic prediction mechanisms are appealing, especially those relying on AI, as they can potentially capture the inherent dynamics and non-linearities of the system [52]-[54]. Interestingly, long short-term memory (LSTM) is a popular type of recurrent NN (RNN) [55] that is specifically designed to learn long-term dependencies of a sequence; thus, predictions are made based on long sequences of previous input values rather than on a single previous input value [52], [56]. Different from classical solutions, e.g., those relying on dynamic programming [57] and reinforcement learning [58], LSTM-based techniques [53], [59], [60] provide autonomous decision-making and relatively fast learning, especially in problems with large state and action spaces. ...
... Readers interested in the details of LSTM networks are advised to review [56], [61]. ...
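As a toy illustration of the gating that lets an LSTM carry information across long sequences (the property the excerpt above highlights for traffic prediction), here is a single-unit forward step in plain Python. The scalar weights are hand-picked and purely illustrative; a real predictor learns them and uses vector-valued states.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, w):
    """One forward step of a single-unit LSTM cell (scalar state)."""
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])    # input gate
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])    # forget gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])    # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate value
    c = f * c_prev + i * g   # cell state: old memory kept by f, new info by i
    h = o * math.tanh(c)     # hidden state exposed to the next layer
    return h, c

# Hand-picked illustrative weights (a trained model would learn these)
w = dict(wi=0.5, ui=0.1, bi=0.0, wf=0.5, uf=0.1, bf=1.0,
         wo=0.5, uo=0.1, bo=0.0, wg=0.8, ug=0.2, bg=0.0)
h, c = 0.0, 0.0
for x in [0.2, 0.9, 0.1, 0.7]:   # a short sequence, e.g. inter-arrival times
    h, c = lstm_step(x, h, c, w)
print(h, c)  # hidden/cell state after processing the whole sequence
```

The forget gate's bias of 1.0 biases the cell toward retaining its memory, which is why the state after the loop reflects the whole sequence rather than only the last input.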
Preprint
Full-text available
Reducing energy consumption is a pressing issue in low-power machine-type communication (MTC) networks. In this regard, Wake-up Signal (WuS) technology, which aims to minimize the energy consumed by the radio interface of machine-type devices (MTDs), stands as a promising solution. However, state-of-the-art WuS mechanisms use static operational parameters, so they cannot efficiently adapt to the system dynamics. To overcome this, we design a simple but efficient neural network to predict MTC traffic patterns and configure WuS accordingly. Our proposed forecasting WuS (FWuS) leverages an accurate long short-term memory (LSTM)-based traffic prediction that allows extending the sleep time of MTDs by avoiding frequent page monitoring occasions in the idle state. Simulation results show the effectiveness of our approach. Traffic prediction errors are shown to be below 4%, with false-alarm and miss-detection probabilities below 8.8% and 1.3%, respectively. In terms of energy consumption reduction, FWuS can outperform the best benchmark mechanism by up to 32%. Finally, we certify the ability of FWuS to dynamically adapt to traffic density changes, promoting low-power MTC scalability.
... Furthermore, deploying the LSTM model in NR could capture the beam misalignment problem [84]. Another online learning algorithm, trained with data from the eNB, is also useful for predicting the next packet arrival time. ...
Article
Full-text available
Discontinuous Reception (DRX) is the most effective timer-based mechanism for User Equipment (UE) power saving. In Long Term Evolution (LTE) systems, the DRX mechanism enormously extends UE battery life. With the DRX mechanism, a UE is allowed to enter a dormant state. Given a DRX cycle, the UE needs to wake up periodically during dormancy to check whether it has received new downlink packets. The UE can achieve a high sleeping ratio by skipping most channel monitoring occasions. As mobile networks evolved to 5G, the battery life requirement increased to support various new services. The 3rd Generation Partnership Project (3GPP) also enhances the DRX mechanism and adds new DRX-related features in the New Radio (NR) Release 16 standard. In addition to the timer-based design, 3GPP proposed two signaling-based mechanisms: the power-saving signal and UE assistance information. This survey paper introduces the latest DRX mechanism in the 3GPP NR standard and summarizes the state-of-the-art research. Researchers have investigated the DRX mechanism in various use cases, such as web browsing services and heterogeneous networks. They focus on the UE sleep ratio and packet delay and propose corresponding analytical models. New DRX architectures are also discussed to address the power-saving problem in specific schemes, especially in 5G NR networks. This paper categorizes and presents the papers according to the target services and network scenarios in detail. We also survey work focusing on new challenges (such as beamforming and thermal issues) in the NR network and introduce future research directions in the 6G era.
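The sleeping ratio discussed in the survey abstract above can be estimated with a back-of-the-envelope model of timer-based DRX: after the inactivity timer expires, the UE runs a few short cycles and then settles into long cycles, waking once per cycle for the on-duration. The formula and parameters below are a simplified illustration, not the 3GPP-defined timers.

```python
def drx_sleep_ratio(on_ms, short_cycle_ms, long_cycle_ms, n_short):
    """Sleep fraction across n_short short cycles followed by one long
    cycle, assuming one on-duration of on_ms awake time per cycle
    (simplified steady-state view)."""
    awake = (n_short + 1) * on_ms
    total = n_short * short_cycle_ms + long_cycle_ms
    return 1.0 - awake / total

# Illustrative values: 10 ms on-duration, four 40 ms short cycles,
# then a 320 ms long cycle.
ratio = drx_sleep_ratio(10, 40, 320, 4)
print(ratio)  # sleep ratio close to 0.9 for these toy numbers
```

Lengthening the long cycle pushes the ratio toward 1 but delays downlink packets by up to a full cycle, which is the energy/latency trade-off the surveyed analytical models quantify.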
... An extensive overview of several solutions for 5G infrastructures, including but not limited to massive MIMO, millimeter-wave (mmWave), and NOMA, as well as the latest achievements in simulator capabilities, is clearly outlined. Also, an artificial intelligence (AI)-based discontinuous reception (DRX) technique for greening 5G-enabled devices has been proposed in [57]. The proposed mechanism significantly outperforms the conventional long-term evolution (LTE)-DRX technique in efficient energy savings. ...
Article
Full-text available
In recent times, the rapid growth in mobile subscriptions and the associated demand for high data rates fuels the need for a robust wireless network design to meet the required capacity and coverage. Deploying massive numbers of cellular base stations (BSs) over a geographic area to fulfill high-capacity demands and broad network coverage is quite challenging due to inter-cell interference and significant rate variations. Cell-free massive MIMO (CF-mMIMO), a key enabler for 5G and 6G wireless networks, has been identified as an innovative technology to address this problem. In CF-mMIMO, many irregularly scattered single access points (APs) are linked to a central processing unit (CPU) via a backhaul network that coherently serves a limited number of mobile stations (MSs) to achieve high energy efficiency (EE) and spectral gains. This paper presents key areas of application of CF-mMIMO in the ubiquitous 5G and the envisioned 6G wireless networks. First, a foundational background on massive MIMO solutions (cellular massive MIMO, network MIMO, and CF-mMIMO) is presented, focusing on the application areas and associated challenges. Additionally, CF-mMIMO architectures, design considerations, and system modeling are discussed extensively. Furthermore, the key areas of application of CF-mMIMO, such as simultaneous wireless information and power transfer (SWIPT), channel hardening, hardware efficiency, power control, non-orthogonal multiple access (NOMA), spectral efficiency (SE), and EE, are discussed exhaustively. Finally, the research directions, open issues, and lessons learned to stimulate cutting-edge research in this emerging domain of wireless communications are highlighted.
... Several DRX approaches for VoIP communications can be found in the literature, for instance [22][23][24][27][28][29]. In [27], the authors propose a modified power-saving mechanism (PSM) that reversely applies the state transitions of legacy LTE PSM by considering network propagation delay, whereas [28] considers an adaptive DRX method that utilizes service and terminal acknowledgement provided by a deep packet inspection mechanism and modifies the DRX settings on the fly. ...
... However, for that sake, it is of paramount importance to statistically characterize the behaviour of silence periods. Dynamic DRX mechanisms that predict packet arrivals using a simple neural network, in response to traffic conditions, are proposed in [23,24]. An algorithm for multiple-beam communications, which enables dynamic short and long sleep cycles, is proposed in [23]. In [24], a neural network is trained to predict the next packet arrival time based on real wireless traffic. ...
Article
Full-text available
The continuous traffic increase of mobile communication systems has the collateral effect of higher energy consumption, affecting battery lifetime in the user equipment (UE). An effective solution for energy saving is to implement a discontinuous reception (DRX) mode. However, guaranteeing a desired quality of experience (QoE) while simultaneously saving energy is a challenge; but undoubtedly both energy efficiency and the QoE have been essential aspects for the provision of real-time services, such as voice over Internet protocol (VoIP), voice over LTE, and mobile broadband in 4G networks and beyond. This paper focuses on human voice communications and proposes a Gaussian process regression algorithm that is capable of recognizing patterns of silence and predicts its duration in human conversations, with a prediction error as low as 1.87%. The proposed machine learning mechanism saves energy by switching OFF/ON the radio frequency interface, in order to extend the UE autonomy without harming QoE. Simulation results validate the effectiveness of the proposed mechanism compared with the related literature, showing improvements in energy savings of more than 30% while ensuring a desired QoE level with low computational cost. © 2021 The Authors. IET Communications published by John Wiley & Sons Ltd on behalf of The Institution of Engineering and Technology
... Considering, for example, the data exchange between two users who are geographically close to each other, which represents an ever-increasing reality, having such data exchanged directly between the devices in a D2D fashion can bring several enhancements in power consumption reduction. D2D can be used as a mechanism to decrease power consumption [59,60] by reducing the network hops to only one, with the immediate advantages of lower latency, better quality of service and experience, and, from the core network perspective, decreased signaling and overall backbone traffic. Additionally, and probably most obviously, both devices will need less power to transmit the same amount of data, thereby directly increasing EE by reducing energy consumption. ...
Article
Full-text available
Fifth generation (5G) and Beyond-5G (B5G) will be characterized by highly dense deployments, both on network plane and user plane. Internet of Things, massive sensor deployments and base stations will drive even more energy consumption. User behavior towards mobile service usage is witnessing a paradigm shift with heavy capacity, demanding services resulting in an increase of both screen time and data transfers, which leads to additional power consumption. Mobile network operators will face additional energetic challenges, mainly related to power consumption and network sustainability, starting right in the planning phase with concepts like energy efficiency and greenness by design coming into play. The main contribution of this work is a two-tier method to address such challenges leading to positively-offset carbon dioxide emissions related to mobile networks using a novel approach. The first tier contributes to overall power reduction and optimization based on energy efficient methods applied to 5G and B5G networks. The second tier aims to offset the remaining operational power usage by completely offsetting its carbon footprint through geosequestration. This way, we show that the objective of minimizing overall networks’ carbon footprint is achievable. Conclusions are drawn and it is shown that carbon sequestration initiatives or program adherence represent a negligible cost impact on overall network cost, with the added value of greener and more environmentally friendly network operation. This can also relieve the pressure on mobile network operators in order to maximize compliance with environmentally neutral activity.
... Most energy-efficient techniques leverage network intelligence to achieve a more efficient result. With many cognitive-network-based EE applications proposed in the literature, artificial intelligence is expected to play a crucial role in EE for future networks [74], including efficient adaptive resource allocation, discontinuous reception [75], channel learning for power management [76,77], traffic offloading for energy efficiency in small cells [78], node device authentication for security [79], and intermittent energy management for energy-harvested applications [80,81]. We present a list of prediction-based techniques for WCNs in Table 3. Predictive approaches are particularly heavy on processors, raising the question of how much processing a network can tolerate. ...
Article
Full-text available
The projected rise in wireless communication traffic has necessitated the advancement of energy-efficient (EE) techniques for the design of wireless communication systems, given the high operating costs of conventional wireless cellular networks, and the scarcity of energy resources in low-power applications. The objective of this paper is to examine the paradigm shifts in EE approaches in recent times by reviewing traditional approaches to EE, analyzing recent trends, and identifying future challenges and opportunities. Considering the current energy concerns, nodes in emerging wireless networks range from limited-energy nodes (LENs) to high-energy nodes (HENs) with entirely different constraints in either case. In view of these extremes, this paper examines the principles behind energy-efficient wireless communication network design. We then present a broad taxonomy that tracks the areas of impact of these techniques in the network. We specifically discuss the preponderance of prediction-based energy-efficient techniques and their limits, and then discuss the trends in renewable energy supply systems for future networks. Finally, we recommend more context-specific energy-efficient research efforts and cross-vendor collaborations to push the frontiers of energy efficiency in the design of wireless communication networks.
... Discontinuous reception can also contribute to improving energy efficiency. The authors in [23] introduced an artificial intelligence (AI) approach, i.e., a recurrent neural network (RNN), to adapt the sleep cycles of user terminals. ...