LTE radio frame structure

Source publication
Article
Full-text available
Machine-to-Machine (M2M) communication refers to autonomous communication among devices, aiming to support a massive number of connected devices. M2M communication can enable ubiquitous communication and full mechanical automation, and it will change everything from industry to everyday life. Recent developments in communication technology make Long Term E...

Context in source publication

Context 1
... communication uses OFDMA in the downlink and SC-FDMA in the uplink channel. Data in both the uplink and downlink is transmitted in frames of 10 ms duration, as shown in Fig. 7. Each frame is divided into 10 subframes of 1 ms each; the duration of a subframe is known as the transmission time interval (TTI). Each subframe is further divided into two slots of 0.5 ms each. Resource units are allocated in units of one slot (0.5 ms) in the time domain and 180 kHz of bandwidth in the frequency domain [89]- ...
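The time-frequency grid described above can be sketched numerically. This is an illustrative snippet (constant and function names are my own, not from any LTE library); the 90% occupancy rule of thumb for usable bandwidth is a common LTE approximation, stated here as an assumption.

```python
# LTE time-frequency grid, numbers taken from the text above.
FRAME_MS = 10.0                                  # radio frame duration
SUBFRAMES_PER_FRAME = 10
SUBFRAME_MS = FRAME_MS / SUBFRAMES_PER_FRAME     # 1 ms = one TTI
SLOTS_PER_SUBFRAME = 2
SLOT_MS = SUBFRAME_MS / SLOTS_PER_SUBFRAME       # 0.5 ms slot
RB_BANDWIDTH_KHZ = 180                           # one resource block = 180 kHz

def num_resource_blocks(system_bandwidth_mhz: float,
                        guard_fraction: float = 0.1) -> int:
    """Approximate usable resource blocks: roughly 90% of the channel
    carries subcarriers, the rest is guard band (common LTE rule of thumb)."""
    usable_khz = system_bandwidth_mhz * 1000 * (1 - guard_fraction)
    return int(usable_khz // RB_BANDWIDTH_KHZ)

print(num_resource_blocks(10))   # 10 MHz channel -> 50 resource blocks
```

With these constants, a 10 MHz LTE channel yields 50 resource blocks and a 20 MHz channel yields 100, matching the standard LTE configurations.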

Citations

... If a packet's metric is the highest, the UE gets the top priority and can transmit its data first when resources are available. These metric functions can be categorized according to a number of criteria, including input parameters, objectives, service targets, and optimization goals (see [43][44][45][46][47]). The various prioritization systems are expressed as follows: ...
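The highest-metric-first rule described in this context can be sketched as follows. This is a generic illustration, not the scheduler of any cited paper; the metric values and UE names are hypothetical, and real metric functions would be computed from the input parameters the text lists (delay, rate, service target).

```python
# Metric-based prioritization sketch: resources go to the UEs with the
# highest metric first, as described in the surrounding text.
def schedule(ues, available_rbs):
    """ues: list of dicts with 'id' and a precomputed 'metric'.
    Returns UE ids in service order, limited by available resources."""
    ranked = sorted(ues, key=lambda u: u["metric"], reverse=True)
    return [u["id"] for u in ranked[:available_rbs]]

ues = [{"id": "ue1", "metric": 0.4},
       {"id": "ue2", "metric": 0.9},
       {"id": "ue3", "metric": 0.7}]
print(schedule(ues, 2))   # ['ue2', 'ue3'] -- highest metrics served first
```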
Preprint
Full-text available
The explosive growth of Internet of Things (IoT) applications has driven the development of specialized communication technologies to cater to diverse device needs. Narrowband IoT (NB-IoT) is a promising low-power wide-area network (LPWAN) technology designed to support massive machine-type communication (MTC) with delay-tolerant transmission. However, MTC presents new challenges for NB-IoT, especially in meeting very low latency and high reliability requirements. It is important to understand which techniques, such as scheduling and resource allocation, enable low-delay applications to meet their goals. To accomplish these tasks, this paper uses realistic simulations within a variety of traffic scenarios, focusing on uplink data scheduling and resource allocation management. We (1) discuss the components contributing to the Quality of Service (QoS) for NB-IoT devices, (2) evaluate the performance of prioritized scheduling schemes and resource unit (RU) configurations for uplink transmission modes, pointing out potential improvements to reduce latency for delay-sensitive devices, and (3) implement and evaluate a novel hybrid scheduling solution called the Gap Aware Hybrid Scheduling (GAHUS) Algorithm. We have developed a solution to enhance network efficiency and prioritize delay-sensitive devices to meet their QoS requirements. Using simulations, comparisons, and performance evaluations, we have demonstrated the effectiveness of our approach in assigning free resource gaps created during long uplink transmissions. GAHUS enables handling massive MTC traffic while ensuring reliable performance for delay-sensitive devices by minimizing rejections and delays.
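The gap-filling idea the abstract describes, namely assigning free resource gaps created during long uplink transmissions, can be sketched with a simple greedy first-fit pass. This is an illustrative reconstruction of the general idea only, not the published GAHUS algorithm; device names and gap sizes are hypothetical.

```python
# Greedy first-fit sketch of the gap-assignment idea: long uplink
# transmissions leave idle resource gaps, and delay-sensitive devices
# (assumed pre-sorted to the front of `requests`) are fitted into the
# first gap large enough for their request.
def assign_gaps(gaps, requests):
    """gaps: list of (start, length) idle regions; requests: list of
    (device, length). Returns {device: start} for devices that fit."""
    assignments = {}
    for device, need in requests:
        for i, (start, length) in enumerate(gaps):
            if length >= need:
                assignments[device] = start
                gaps[i] = (start + need, length - need)  # shrink the used gap
                break
    return assignments

gaps = [(0, 2), (5, 4)]
print(assign_gaps(gaps, [("d1", 3), ("d2", 2)]))  # {'d1': 5, 'd2': 0}
```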
... 11 Hence, these serve as critical facilitators of the IoT. 12 Indeed, by 2026, the IoT market will be worth trillions due to the connectivity of an exponential number of MT devices for communication and data sharing purposes. 13 With the increasing adoption of the IoT and its requirement to accommodate a diverse and sometimes competing list of goals, 14 MTC is projected to occupy a dominating role in networks. ...
Article
Full-text available
Mobile communication has proliferated across many intelligent digital applications, and machine-type communication (MTC) services have afforded flexible communication facilities for them. However, high energy consumption and poor data rates have made MTC a complex system. Resource allocation strategies have therefore been introduced through different procedures such as neural models, optimization, and mathematical models. But, in some cases, these algorithms have recorded high complexity and high resource requirements. Hence, the present research article develops a novel Chimp-based Extreme Neural Model (CbENM) for allocating the optimal resources to each machine user. Moreover, the resources were assigned based on the task completion deadline. Before the resource allocation process, the active-state users were predicted by analyzing their data rates and deadlines. Finally, the desired resources were allocated to each user, and the performance was validated. The presented model has a lower delay-bound rate and a higher aggregate data rate.
... Traditional cellular networks cannot meet these KPIs. As IoT devices started to be deployed in large numbers, various surveys were conducted to address these issues (Langat and Musyoki 2022;Singh et al. 2021a;Amodu and Othman 2018). Numerous surveys have identified significant inefficiencies in LPWA networks and emphasized critical research directions. ...
Chapter
Full-text available
A wireless backhaul optimization approach based on delay jitter is suggested to handle the wireless backhaul issue in 5G dynamic heterogeneous scenarios. First, the delay and delay-jitter issues in 5G dynamic heterogeneous scenarios are carefully evaluated, optimization indicators are defined, and the fundamental backhaul model is built. Then, considering the optimization requirements, delay constraints are included to create improved model 1; network overload is examined and the channel-number allocation variables are relaxed to construct improved model 2; and a matching hierarchical method is presented for a fast solution. The simulation results reveal that the suggested approach improves delay-jitter performance compared with three existing wireless backhaul optimization algorithms.
Chapter
Full-text available
Previous-generation networks such as 4G were widely used for Internet of Things (IoT) devices, and the constant need to grow and develop the network so that it can fulfill the requirements of IoT devices is still ongoing. The exponential growth of data services has substantially challenged the security and networking of IoT, because these services run over the mobile internet and require high bit rates, low latency, high availability, and consistent performance across various networks. The IoT integrates several sensors and data sources to provide services over a common communication standard. The Fifth Generation Communication System (5G) enables IoT devices to achieve seamless connectivity among billions of interconnected devices. Cellular connections have become a central part of the society that powers our daily lives. Numerous security issues have come to light because of the exponential expansion of 5G technologies and the adaptation of their slower IoT counterparts. Network services without security and privacy pose a threat to the infrastructure and sometimes endanger human lives. Analyzing security threats and their mitigation is a crucial and fundamental part of the IoT ecosystem. Data authorization, confidentiality, trust, and privacy of 5G-enabled IoT devices are the most challenging parts of the system, and to address them we need a robust system that handles cyberattacks and prevents vulnerabilities through countermeasures. This paper includes a comprehensive discussion of 5G, IoT fundamentals, the layered architecture of 5G IoT, security attacks and their mitigation, current research, and future directions for 5G-enabled IoT infrastructure.
Chapter
Full-text available
Security of data is very important when communicating over either a wired or a wireless medium. It is a challenging issue in general, and wireless mobile networks make it more challenging still. A wireless mobile network is a cluster of self-contained, self-organized nodes that form a temporary multi-hop peer-to-peer radio network without any pre-determined organization. As these networks are mobile and connected to each other by wireless links, they are often capable of self-managing, self-defining, and self-configuring. Due to their dynamic nature, wireless mobile networks lack a fixed infrastructure, which makes them more vulnerable to many types of hostile attacks. The paper describes the different kinds of security attacks present in wireless mobile networks along with their detection and prevention techniques. Furthermore, the paper discusses the various types of mobile networks and their numerous challenges and issues. Moreover, the paper defines the needs and goals of security in wireless mobile networks as well as many security attacks along with their detection or prevention methods.
Chapter
Full-text available
The significant expansion of cellular networks has increased their potential to support a wide range of use cases beyond their original purpose of providing broadband access. One such development is using cellular networks to support the Internet of Things (IoT), called Cellular IoT (CIoT). The growth of CIoT is an important trend in the evolution of cellular networks, leading to a broader and more comprehensive ecosystem. The extensive evolution of the IoT business is transforming diverse sectors, including health, smart cities, security, and agriculture. Nevertheless, large-scale deployments with very different characteristics and use cases struggle with connectivity challenges due to the unique traffic features of massive IoT and the tremendous density of IoT devices. This study aims to identify the critical obstacles that hinder the widespread deployment of IoT over cellular networks and suggests an innovative algorithm to mitigate them effectively. We discovered that the primary challenges revolve around three specific areas: connection setup, network resource management, and energy consumption. In this regard, we investigate the integration of massive Machine-Type Communication (mMTC) into cellular networks, focusing on the performance of Narrowband IoT (NB-IoT) in supporting mMTC.
... As a result, the destination must have signal categorization methods in place to ensure that the information is demodulated securely. SC technology has been integrated into numerous existing wireless technical regulations, including those for mobile phones, wireless local area networks, and microwave schemes [22,23]. Radio surveillance using SC algorithms is also utilized by government entities to verify compliance with spectrum management legislation. ...
Article
Full-text available
Categorization of space time block code (STBC) signals has become widely acknowledged as a crucial foundational mechanism for creating intelligent wireless transmissions in both the governmental and business sectors. The use of multiple antennas at a broadcaster complicates a signal categorization task because assumptions about the number and transmission matrix of the sent antennas must be made. STBC categorization has only been investigated in the context of non-relaying environments, and no methods for relaying transmissions have been reported. This work proposes a revolutionary strategy for categorizing STBC signals that can be implemented in amplify-and-forward (AF) relaying networks. Time-domain characteristics of the STBC waveforms provide the basis of the mathematical ingredients used in the offered categorization process. The employed STBC waveform is reflected in the spikes observed in the fast Fourier transform of the second-order lag product of the collected waveforms. This creates the foundation for an effective discriminating feature. Advantages of the described strategy include not requiring any prior awareness of the modulation type, channel conditions, signal-to-noise ratio (SNR), or the block timing synchronization of the STBC waveforms. The indicated strategy has been shown through simulation experiments to be capable of providing appropriate categorization accuracy despite the existence of transmission faults, even at relatively low SNR levels.
Article
Full-text available
Amplify-and-forward (AF) orthogonal frequency division multiplexing (OFDM) transmissions encounter a significant difficulty in the form of in-phase and quadrature-phase (IQ) mismatch. Previous reports on this problem have considered only uncoded transmissions, and in these precedent studies IQ equalization must be conducted after the estimation stage for accurate detection of data symbols. This research delves into the issue of IQ mismatch between transmission and reception in AF-OFDM systems in the context of channel coding. We design a code-aided approach to predict the overall channel impulse responses (CIRs), which encompass the actual CIRs and the IQ mismatch originating at the source, relay, and destination. Instead of using a collection of algorithms, the proposed approach can be utilized to estimate nine parameters simultaneously. Because the exact maximum-likelihood (ML) strategy is impractical in this situation, we instead utilize an expectation-maximization (EM) process as a low-complexity strategy to predict the parameters under consideration. The suggested estimation approach uses an iterative process to improve predictions by exploiting the a priori knowledge gained from the soft information supplied by the channel decoder. In addition, we demonstrate how to carry out data detection using the estimated parameters. The simulation results verify the effectiveness of the proposed estimator and detector for usage in real-world settings, with superiority over conventional ones.
... First, as pointed out in [9]- [13], the practical deployment and implementation of mMTC networks on a large scale faces a number of challenges due to the intensive communications between MTCDs, including congestion, coverage, energy self-sufficiency, scalability, and modelling. Consequently, advanced multiple access techniques, such as non-orthogonal multiple access (NOMA), have been used to alleviate congestion caused by massive signalling overhead when a large number of MTCDs are attempting to access the spectrum [9], [12], [13]. Second, because MTCDs rely heavily on short transmissions for instantaneous communication, the network sum-throughput can be drastically reduced without efficient resource allocation algorithms. ...
Article
Full-text available
In this paper, we aim to improve the connectivity, scalability, and energy efficiency of machine-type communication (MTC) networks with different types of MTC devices (MTCDs), namely Type-I and Type-II MTCDs, which have different communication purposes. To this end, we propose two transmission schemes called connectivity-oriented machine-type communication (CoM) and quality-oriented machine-type communication (QoM), which take into account the stochastic geometry-based deployment and the random active/inactive status of MTCDs. Specifically, in the proposed schemes, the active Type-I MTCDs operate using a novel Bernoulli random process-based simultaneous wireless information and power transfer (SWIPT) architecture. Next, utilizing multiuser power-domain non-orthogonal multiple access (PD-NOMA), each active Type-I MTCD can simultaneously communicate with another Type-I MTCD and a scalable number of Type-II MTCDs. In the performance analysis of the proposed schemes, we prove that the true distribution of the received power at a Type-II MTCD in the QoM scheme can be approximated by the Singh-Maddala distribution. Exploiting this unique statistical finding, we derive approximate closed-form expressions for the outage probability (OP) and sum-throughput of massive MTC (mMTC) networks. Through numerical results, we show that the proposed schemes provide a considerable sum-throughput gain over conventional mMTC networks.
... The 3GPP and ETSI have standardized reference architectures for M2M. The 3GPP focuses on recommendations for the communication of machine-type communication devices, while ETSI mainly focuses on MTC applications [20]. This article follows the ETSI reference architecture. ...
Article
Machine-to-Machine (M2M) communication in the Long Term Evolution (LTE) network has grown exponentially as the volume of connected devices has increased rapidly in the last decade. M2M traffic can be characterized by parameters such as packet length, packet generation frequency, delay, and data rate requirements, and it typically flows in the uplink direction. Primarily, the LTE network design is optimized for Human-to-Human (H2H) communication. As a result, designing uplink scheduling in LTE networks is fraught with difficulties that restrict the use of potential capacity. In existing approaches, focusing solely on QCI priority degrades resource utilization and cell throughput. Therefore, a scheduling mechanism needs to optimize system performance with priority support to use LTE for M2M communication. This paper highlights existing flaws in the optimization process and proposes a scalable priority-based resource allocation scheme for M2M communication in the LTE/LTE-Advanced network. The proposed resource allocation scheme strikes a balance between resource utilization and application priority support. According to the results, the proposed scheduling algorithm outperforms the standard algorithms concerning resource-sharing fairness, average resource utilization, QCI priority support, and delay budget violation.
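The balance the abstract argues for, priority support without starving resource utilization, can be illustrated with a weighted allocation sketch. This is a minimal hypothetical example, not the paper's proposed scheme: the inverse-QCI weighting and the minimum-share guarantee are my own illustrative choices.

```python
# Priority-weighted allocation sketch: lower QCI value = higher priority,
# but every device is guaranteed a minimum share so low-priority M2M
# traffic is not starved (the utilization/priority balance described above).
def allocate(devices, total_rbs, min_share=1):
    """devices: {name: qci_priority}; returns resource blocks per device.
    The inverse-QCI weight is illustrative, not the paper's formula."""
    weights = {d: 1.0 / p for d, p in devices.items()}  # low QCI -> big weight
    wsum = sum(weights.values())
    return {d: max(min_share, int(total_rbs * w / wsum))
            for d, w in weights.items()}

# Hypothetical cell: one H2H flow (QCI 1) and two M2M flows (QCI 5 and 9).
print(allocate({"h2h": 1, "m2m_a": 5, "m2m_b": 9}, 50))
```

Even the lowest-priority device receives at least `min_share` resource blocks, while higher-priority flows still dominate the allocation.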