A Comprehensive Survey of Machine Learning Techniques in Next-Generation Wireless
Networks and the Internet of Things
Mohammad Aftab Alam Khan1* , Hazilah Mad Kaidi2
1 Malaysia-Japan International Institute of Technology, Universiti Teknologi Malaysia, Kuala Lumpur 54100, Malaysia
2 Razak Faculty of Technology and Informatics, Universiti Teknologi Malaysia, Kuala Lumpur 54100, Malaysia
Corresponding Author Email: hazilah.kl@utm.my
https://doi.org/10.18280/isi.280416
ABSTRACT
Received: 10 April 2023
Revised: 12 June 2023
Accepted: 23 July 2023
Available online: 31 August 2023
The advent of next-generation wireless networks and the Internet of Things (IoT) has
introduced numerous challenges in terms of quality of service (QoS), user data rates,
throughput, and security. These challenges necessitate innovative solutions to optimize
performance and ensure robust security. Machine Learning (ML) has emerged as an
influential tool in this regard, offering the potential to fully harness the capabilities of next-
generation wireless networks and the IoT. With an ever-increasing number of connected
devices and the commensurate data proliferation, ML presents an effective means of
analyzing and processing this data. One significant challenge addressed by ML is network
optimization. Through the analysis of network traffic patterns, congestion points are
identified, and potential network performance issues are predicted. Security, a critical
concern in next-generation wireless networks and the IoT, is another facet where ML proves
instrumental by detecting and mitigating security breaches. This is achieved by analyzing
data to identify anomalous behaviour and potential threats. Moreover, ML facilitates
informed decision-making in IoT systems. By scrutinizing real-time data generated by IoT
devices, ML algorithms reveal valuable insights, trends, and correlations. This capability
enables IoT-enabled systems to make data-driven decisions, thus enhancing the efficiency
of various applications such as smart cities, industrial automation, healthcare, and
environmental monitoring. This study undertakes a systematic review of the impact of ML
techniques, such as reinforcement learning, deep learning, transfer learning, and federated
learning, on next-generation wireless networks, placing a particular emphasis on the IoT.
The literature is reviewed systematically and studies are categorized based on their
implications. The aim is to highlight potential challenges and opportunities, providing a
roadmap for researchers and scholars to explore new approaches, overcome challenges, and
leverage potential opportunities in the future.
Keywords: Internet of Things (IoT), Machine Learning (ML), quality of service, deep learning, reinforcement learning, 5G and beyond, next-generation wireless networks
1. INTRODUCTION
The landscape of wireless communication systems has
undergone substantial evolution over the past decades,
primarily fuelled by technological advancements and the
escalating demand for mobile computing and pervasive
connectivity. The genesis of this evolution can be traced back
to the establishment of commercial mobile telephony in the
1980s, followed by the widespread adoption of Wi-Fi in the
1990s, and subsequently, the expansion of mobile broadband
and the Internet of Things (IoT) in the 21st century [1, 2]. The
present study embarks on a comprehensive and systematic
review of Machine Learning-based methodologies applied in
various facets of next-generation telecommunication networks,
with a particular emphasis on the IoT paradigm. The
amalgamation of IoT and next-generation networks unveils a
multitude of potential arenas where Machine Learning (ML)
can make significant contributions. For instance, the spatial
and temporal big data that IoT continuously generates requires
transmission over the network for processing, analysis, and
inference by edge, fog, and cloud computing servers. This
scenario presents a myriad of research gaps and challenges that
call for effective solutions from the ML research community.
The most recent milestone in the evolution of wireless
communication is the advent of fifth-generation (5G)
technology, which stands poised to usher in a new era of
connectivity and performance. 5G networks promise higher data rates, lower latency, and enhanced reliability compared to their predecessors, enabling previously impossible applications and services. Furthermore, 5G networks
have been engineered to cater to the growing connectivity
demands of IoT devices, which are anticipated to proliferate
exponentially due to the introduction of smart and wearable
devices, wireless sensor networks (WSNs), and mobile ad-hoc
networks (MANETs). As such, the evolution of wireless
communication has revolutionized our lifestyle and work
culture, providing constant connectivity and real-time access
to information [3].
The IoT, constituting a network of interconnected physical
devices, vehicles, buildings, and other entities embedded with
sensors, software, and network connectivity, facilitates data
collection and exchange. The role of IoT devices in wireless
communication systems is pivotal in providing a seamless
communication experience for users [4]. Given the
exponential growth in the number of connected devices and
the escalating demand for higher data rates, IoT-enabled
wireless communication systems are becoming indispensable
to contemporary society.
These systems are expected to provide reliable and secure
communication links between IoT devices and other
communication networks, thereby facilitating the delivery of
innovative services and applications. The integration of IoT
devices with wireless communication systems is anticipated to
stimulate the growth of smart cities, the industrial Internet, and
other IoT-based applications, thus paving the path towards a
more connected and intelligent future [5-7].
IoT devices are designed to generate vast volumes of data
from diverse sensors continuously. However, the local
computational resources required to process this data are
typically lacking in IoT networks. As a result, the data is
transmitted via robust communication links to edge, fog, or
cloud servers for further processing. It is in this context that
Machine Learning (ML) plays a crucial role, helping to discern
hidden patterns and trends in this big data. The insights derived
from ML algorithms are subsequently used to inform decision
support and expert systems.
The remainder of this paper is structured as follows: Section
2 introduces the taxonomy of the literature review, followed
by a comprehensive review of the literature in Section 3.
Section 4 presents challenges and opportunities for ML in IoT
and next-generation wireless communication systems. Finally,
Section 5 provides the conclusion.
2. TAXONOMY OF LITERATURE REVIEW
The architecture of this systematic literature review is
depicted in a hierarchical flow that is further classified by the
taxonomy presented in this section. The review commences
with an exploration of the evolution of wireless
communication systems, detailing the progression of their
generations, the data rates supported, and the temporal
developments. This forms the base of the review's pyramid
structure.
In the subsequent generations of wireless communication
systems, particularly in 5G and beyond, the integration of IoT
forms the second tier of the pyramid. This incorporation marks
a significant milestone in the evolution of wireless
communication systems.
The final tier of the pyramid comprises the examination of
the implications and applications of various Machine Learning
approaches within the IoT and next-generation wireless
communication systems. This phase of the literature review is
pivotal in understanding the role and impact of Machine
Learning in these advanced systems.
Following this systematic review, a compilation of
challenges and opportunities is presented, based on the
insights gleaned from the reviewed literature. Concluding
remarks are then provided, summarizing the key findings and
implications of the review. The schematic representation of
this literature review is illustrated in Figure 1.
Table 1 elaborates on the taxonomy, focusing on the critical
domains within next-generation wireless networks where ML
has been deployed. These areas include adaptive
communication, Non-Orthogonal Multiple Access (NOMA),
and radio resource allocation. The subsequent sections present
a series of thematic tables, where relevant studies are
catalogued and discussed in relation to their respective topics.
This structured approach to the literature review ensures a
comprehensive understanding of the role of Machine Learning
in next-generation wireless communication systems and IoT.
Figure 1. Flow of literature review
Table 1. Topic-wise taxonomy mapping

Table ID | Topics
2 | ML in Resource Management
3 | ML in Adaptive Communication
4 | ML in NOMA Systems
3. SYSTEMATIC LITERATURE REVIEW
This section presents a systematic review of the literature
on Machine Learning implications in IoT and next-generation
wireless communication systems.
3.1 Evolution in wireless communication
Wireless communication has evolved significantly over the
past few decades and has played a critical role in shaping the
modern world as we know it today. The evolution of wireless
communication can be divided into several generations, each
characterized by advances in technology and the development
of new standards.
First Generation (1G) Wireless Communication (1980s):
1G wireless communication refers to the first generation
of mobile telephony, which was analog-based and
primarily used for voice communication. It was based on
Frequency Division Multiple Access (FDMA). The
available system bandwidth was divided equally among the active users, which could result in poor radio resource utilization, crosstalk, and a limited number of supported users [27].
Second Generation (2G) Wireless Communication
(1990s): 2G wireless communication introduced digital
communication, which improved the quality of voice
communication and enabled the transmission of data over
cellular networks. The first 2G network was launched in
1991, and by the end of the decade, 2G networks had been
deployed globally. The underlying technology combines FDMA with Time Division Multiple Access (TDMA): each user's signal is assigned a sequence of time and frequency channel slots for communication. It is also known as the Global System for Mobile Communications (GSM), one of the most widely used networks across the globe and a land-mobile alternative to the public switched telephone network (PSTN). The number of supported users increased significantly, with the main limitations being time synchronization and poor utilization of time and frequency slots [28].
Third Generation (3G) Wireless Communication (2000s):
3G wireless communication brought faster data rates,
increased network capacity, and improved multimedia
support, enabling a new range of data-intensive
applications, such as mobile internet access, video
conferencing, and multimedia messaging. It is mainly based on code division multiple access (CDMA) technology and its various variants and standards. Users were well separated in the code domain; hence a much larger number of users could be supported. The observed limitations were maintaining code orthogonality and the complexity of the RAKE receiver [28].
Fourth Generation (4G) Wireless Communication (2010s):
4G wireless communication brought further
improvements in data rate and network capacity, enabling
high-speed mobile broadband access, and enabling new
applications, such as online gaming, video streaming, and
real-time multimedia communication. The technology is mainly based on multicarrier (MC) CDMA, which combines CDMA, FDMA, and TDMA, along with enhanced space division multiple access (SDMA) support. It is also known as ultra-wideband (UWB) technology [2].
Fifth Generation (5G) Wireless Communication (2020s-
Present): 5G wireless communication promises to bring
even faster data rates, lower latency, and increased
network capacity, enabling a new range of applications,
such as autonomous vehicles, virtual reality, and the IoT.
The main motivation behind the advent and promotion of
5G was enhanced connectivity, data rates, low latency,
and the overwhelming number of devices that appeared
due to mobile and cloud computing technologies. It is
based on non-orthogonal multiple access (NOMA) techniques and millimeter-wave (mmWave) communication [2].
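To ground the power-domain NOMA concept, here is a minimal Python sketch, comparing a two-user downlink NOMA pair with successive interference cancellation (SIC) against a TDMA-style orthogonal baseline. All numeric values (the power split, channel gains, and noise level) are illustrative assumptions, not figures from any cited study.

```python
import numpy as np

# Toy two-user power-domain NOMA downlink (assumed example values).
P = 1.0                   # total transmit power (normalized)
alpha = 0.8               # fraction of power for the weak (far) user
g1, g2 = 0.3, 1.0         # channel power gains, g1 < g2 (assumed)
N0 = 0.05                 # noise power (assumed)

# Weak user decodes its own signal, treating the strong user's as noise.
sinr1 = (alpha * P * g1) / ((1 - alpha) * P * g1 + N0)
# Strong user removes the weak user's signal via SIC, then decodes its own.
sinr2 = ((1 - alpha) * P * g2) / N0

r1, r2 = np.log2(1 + sinr1), np.log2(1 + sinr2)   # rates in bits/s/Hz

# Orthogonal baseline: each user gets half the time at full power.
r1_oma = 0.5 * np.log2(1 + P * g1 / N0)
r2_oma = 0.5 * np.log2(1 + P * g2 / N0)

print(f"NOMA: user1={r1:.3f}, user2={r2:.3f}, sum={r1 + r2:.3f}")
print(f"OMA : user1={r1_oma:.3f}, user2={r2_oma:.3f}, sum={r1_oma + r2_oma:.3f}")
```

Under these assumed gains, serving both users simultaneously in the power domain yields a higher sum rate than time-sharing, which is the core capacity argument for NOMA in 5G.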
The evolution of wireless communication has been driven
by the increasing demand for mobile connectivity and the need
to support new and more demanding applications. As
technology continues to evolve, future generations of wireless
communication will likely bring even more advanced
capabilities, enabling new and innovative applications and
transforming the way we live and work.
3.2 Mobile computing, IoT and wireless communication
The IoT refers to the interconnected network of physical
devices, vehicles, home appliances, and other items embedded
with electronics, software, sensors, and network connectivity,
allowing these objects to collect and exchange data. IoT
devices can communicate and collaborate with the
surrounding environment, enabling them to collect and
analyze data, make decisions, and perform tasks without
human intervention. The IoT has a wide range of applications
across various industries and sectors, some of which are:
Smart Homes: IoT devices are used to automate and
control various home appliances, such as lighting, heating,
air conditioning, and security systems, from a single
device, such as a smartphone.
Healthcare: IoT devices are used in healthcare to monitor
patients remotely, track vital signs, and collect medical
data for analysis, helping to improve patient care and
reduce hospital stays.
Manufacturing: IoT devices are used in manufacturing to
improve efficiency, monitor production processes, and
optimize supply chain management.
Agriculture: IoT devices are used in agriculture to monitor
crops, soil, and weather conditions, helping farmers to
optimize crop production and improve yield.
Transportation: IoT devices are used in transportation to
improve traffic management, monitor vehicle
performance, and reduce fuel consumption.
Energy: IoT devices are used in energy to monitor and
manage energy consumption, reduce waste, and improve
sustainability.
Retail: IoT devices are used in retail to track inventory,
monitor customer behaviour, and improve customer
experience.
These are just a few examples of the many areas where IoT
is being used to improve efficiency, reduce costs, and create
new and innovative solutions. As technology continues to
evolve, IoT will likely have an even greater impact on our
daily lives and the way we work and interact with the world
around us. All of these technological advances require, and make room for, 5G and beyond networks, mainly owing to the inherent need for connectivity, speed, high data rates, and low latency, together with support for a rapidly growing number of connected devices.
The wireless sensor network (WSN) is a specific type of IoT
technology that uses wireless communication to connect many small, low-power sensors and devices. The WSNs are
designed for applications in which many small sensors are
deployed to collect data and send it to a central location for
processing and analysis. WSNs can be used in a wide range of
applications, including environmental monitoring, industrial
process control, healthcare, military, and many others [4, 29].
The WSNs are a crucial part of the IoT ecosystem and are used
in many IoT applications.
The WSNs have several advantages over traditional wired
sensor networks. They are easy to install and maintain, as there
are no wires to run or cables to connect. They can be deployed
in remote or hard-to-reach locations, and they can be
reconfigured on-the-fly to adapt to changing conditions.
WSNs are also energy-efficient, as they are designed to use
low power, allowing the sensors to run for long periods on
batteries. However, WSNs also have some challenges,
including limited bandwidth, interference, security, and
scalability. Researchers and engineers are constantly working
to address these challenges and improve the performance of
WSNs. As technology continues to evolve, WSNs are
becoming increasingly popular and widely used for a variety
of applications [4, 30].
Wearables are also a specific type of IoT device that can be
worn on the body and are designed to be used near the user.
Examples of wearables include smartwatches, fitness trackers,
and smart glasses. Wearables are equipped with sensors,
computing power, and wireless connectivity, which enables
them to collect and exchange data with other devices.
Wearables are a growing part of the IoT ecosystem and are
used in many applications, including health and fitness
tracking, mobile payments, and hands-free control of other
devices. They allow users to access and control information
and perform tasks without having to physically interact with a
device. Mobile computing technologies play a crucial role in enabling the IoT and its various applications; it is under such communication systems that these technologies become affordable at the user level. Some of the mobile computing technologies used in IoT include, but are not limited to:
Wireless Connectivity: IoT devices require wireless
connectivity to exchange data and communicate with
each other. Mobile computing technologies such as Wi-
Fi, cellular networks, and Bluetooth provide the
wireless connectivity infrastructure needed for IoT
devices to function.
Mobile Operating Systems: Mobile operating systems,
such as Android and iOS, are used to create and manage
IoT devices. These operating systems provide a
platform for developers to create and deploy IoT
applications, and for users to interact with and control
their devices.
Cloud Computing: Cloud computing technologies,
such as Amazon Web Services (AWS) and Microsoft
Azure, provide a scalable and secure platform for
storing, processing, and analyzing large amounts of
data generated by IoT devices.
Mobile Devices as Controllers: Mobile devices such as
smartphones and tablets can serve as controllers for IoT
devices. Users can use their mobile devices to monitor
and control their IoT devices, as well as to access and
analyze the data generated by these devices.
Mobile Data Analytics: Mobile computing
technologies, such as mobile data analytics, provide the
tools needed to analyze the vast amounts of data
generated by IoT devices. These tools can help
organizations to uncover new insights, make better
decisions, and improve their operations.
The impact of mobile computing technologies on wireless
communication systems has been significant in recent years.
The integration of mobile computing technologies into
wireless communication systems has led to the creation of
smart and connected devices that can communicate with each
other, resulting in new opportunities for innovation and growth
in various industries [31, 32]. The rise of the IoT has been a
key driver for this development, as more and more devices are
being connected to the Internet to collect, process, and
exchange data. This has created the need for wireless
communication systems that can support many devices, have
low power consumption, and low latency [33, 34]. To meet
these requirements, new mobile computing technologies, such
as 5G, have been developed to improve wireless
communication capabilities and support the growth of IoT
applications. The combination of IoT and mobile computing
technologies has enabled the development of new use cases,
such as smart homes, smart cities, and connected vehicles,
among others [35]. Overall, the impact of mobile computing
technologies on wireless communication systems has been
transformative, providing new opportunities and driving
innovation in a variety of industries.
The integration of IoT devices, wireless communication
systems, and mobile computing technologies is expected to
play a major role in enabling the next generation of smart and
connected applications [36]. IoT devices can collect and
exchange data with each other and with other communication
networks through wireless communication systems. These
systems are designed to provide reliable and secure
communication links between IoT devices, enabling the
delivery of innovative services and applications. Mobile
computing technologies, on the other hand, provide the
computational power and storage needed to process and
analyze the vast amounts of data generated by IoT devices.
The combination of IoT devices, wireless communication
systems, and mobile computing technologies is expected to
drive the growth of various applications, including smart cities,
industrial Internet, and wearable technologies. These
applications require low-latency, high-bandwidth, and reliable
communication links, which can only be provided by the
integration of IoT devices, wireless communication systems,
and mobile computing technologies, thus paving the way for a
more connected and intelligent future [29].
In conclusion, mobile computing technologies play a key role in enabling the IoT and its various applications and will continue to shape the future of IoT and other emerging technologies [37, 38].
3.3 ML in IoT-enabled wireless communication
Traditional optimization techniques have been investigated in the literature for optimizing wireless communication systems. However, they have limitations: techniques such as convex optimization are complex, and a closed-form solution is usually unavailable. Moreover, traditional techniques were applied to earlier communication systems with relatively few variables and more degrees of freedom. For modern wireless communication systems, traditional optimization techniques are therefore often impractical or nearly impossible to apply or rely upon.
A sample optimization problem is depicted in Eq. (1): the overall system data rate is maximized subject to the fulfilment of two constraints, namely the bit error rate and the transmit power. The constrained optimization problem for the communication system can be given as:

$$\max \sum_{i=1}^{N_{sc}} R_i(c_i, m_i) \qquad (1)$$

subject to

$$\sum_{i=1}^{N_{sc}} P_i \le P_T \quad \text{and} \quad \overline{BER} \le BER_T \qquad (2)$$

where $R_i(c_i, m_i)$ is the bit rate of the $i$th subcarrier, which is a product of the code rate and the modulation order used, $P_i$ is the power assigned to the $i$th subcarrier, $P_T$ is the available power, $BER_T$ is the target BER that depends upon the quality of service (QoS) or application requirements, and $N_{sc}$ is the number of subcarriers in NOMA.
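As a hedged illustration of how a problem of this form can be attacked numerically, the Python sketch below implements a classical greedy (Hughes-Hartogs-style) bit-loading heuristic: at every step it grants one extra bit to the subcarrier with the lowest incremental power cost, until the budget P_T is exhausted. The SNR-gap formula used to encode the BER target and all numeric values are illustrative assumptions, not the formulation of any surveyed work; the poor scaling of such searches as the number of variables grows is precisely what motivates the ML alternatives discussed next.

```python
import numpy as np

def greedy_bit_loading(gains, P_T, ber_target, max_bits=10):
    """Greedy (Hughes-Hartogs-style) bit-loading sketch.

    gains:      per-subcarrier channel-gain-to-noise ratios
    P_T:        total transmit power budget
    ber_target: target BER, mapped to an SNR gap (uncoded QAM approximation)
    """
    gamma = -np.log(5.0 * ber_target) / 1.5   # SNR-gap approximation (assumed)
    bits = np.zeros(len(gains), dtype=int)
    budget = P_T
    while True:
        # Power needed for one more bit: P(b) = gamma*(2^b - 1)/g,
        # so the increment from b to b+1 is gamma * 2^b / g.
        delta = gamma * (2.0 ** bits) / gains
        delta[bits >= max_bits] = np.inf      # cap the modulation order
        i = int(np.argmin(delta))
        if delta[i] > budget:
            break                             # power budget exhausted
        bits[i] += 1
        budget -= delta[i]
    return bits

rng = np.random.default_rng(0)
gains = rng.exponential(scale=5.0, size=16)   # assumed fading gains
bits = greedy_bit_loading(gains, P_T=10.0, ber_target=1e-4)
print("bits per subcarrier:", bits, "| sum rate:", bits.sum(), "bits/symbol")
```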
This is where ML, a subfield of artificial intelligence (AI), comes in: it provides systems with the ability to automatically improve their performance through experience. In the context
of IoT-enabled wireless communication, ML has the potential
to greatly enhance the performance and efficiency of these
networks by optimizing various aspects such as network
management, resource allocation, and security [39]. This can
be achieved using ML algorithms such as supervised and
unsupervised learning, deep learning, and reinforcement
learning. In supervised learning, the algorithms are trained on
a labelled dataset and the goal is to make predictions on new,
unseen data. In unsupervised learning, the algorithms work
with an unlabeled dataset and the goal is to discover patterns
or structures in the data. In reinforcement learning, the
algorithms learn through trial-and-error interactions with an
environment [40].
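As a minimal example of the reinforcement-learning case, the sketch below applies the stateless (multi-armed bandit) special case of tabular Q-learning to channel selection: a transmitter learns, through trial and error, which of several channels delivers packets most reliably. The per-channel success probabilities are assumed toy values that remain unknown to the agent.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy environment: 4 channels with unknown packet-success probabilities.
success_prob = np.array([0.2, 0.5, 0.8, 0.35])  # assumed; hidden from the agent
n_channels = len(success_prob)

Q = np.zeros(n_channels)       # one state, so one Q-value per channel (action)
alpha, eps = 0.1, 0.1          # learning rate and exploration rate

for step in range(5000):
    # Epsilon-greedy action selection: mostly exploit, sometimes explore.
    a = rng.integers(n_channels) if rng.random() < eps else int(np.argmax(Q))
    reward = float(rng.random() < success_prob[a])   # 1 if the packet got through
    Q[a] += alpha * (reward - Q[a])                  # stateless Q-learning update

print("learned Q-values:", np.round(Q, 2))
print("preferred channel:", int(np.argmax(Q)))       # converges to channel 2 here
```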
These algorithms can be applied to various problems
encountered in IoT wireless communication systems such as
energy efficiency, data accuracy, and network reliability. By
leveraging ML, IoT-enabled wireless communication systems
can better handle the large amounts of data generated by IoT
devices and provide more robust and efficient communication
services. ML is used in a wide range of applications, including
image and speech recognition, natural language processing,
recommendation systems, and predictive modeling. It has
revolutionized many industries, including healthcare, finance,
marketing, and transportation, and continues to play a crucial
role in the development of AI and of IoT variants such as the Internet of Medical Things (IoMT) [41, 42].
ML has a significant role in wireless communication,
particularly in the design and optimization of communication
systems. ML techniques are used to tackle complex problems
in wireless communication, such as interference management,
resource allocation, and network optimization. ML has also
played a crucial role in the development of 5G wireless
communication networks. With the increase in the number of
connected devices and the growing demand for high-speed
data transfer, 5G networks require sophisticated techniques to
optimize performance and ensure efficient utilization of
resources. ML algorithms are used in 5G networks to perform
functions such as network slicing, traffic management, and
congestion control [43-45].
ML has a key role in wireless communication, providing
new and innovative solutions to complex problems in the field.
It is expected to continue to play a crucial role in the evolution
of wireless communication networks, especially with the
advent of the IoT and the growing demand for high-speed data
transfer. ML is a subfield of AI that focuses on the
development of algorithms and models that can learn from and
make predictions on data. ML algorithms use statistical
techniques to model and understand the relationships between
the input data and output predictions, enabling the models to
improve over time with experience [46].
Moreover, ML can also help in reducing the complexity of
network management by enabling autonomous decision-
making, reducing the need for manual intervention, and
enabling real-time responses to changing network conditions.
Additionally, ML can also be used to detect and prevent
security threats by analyzing the behaviour of IoT devices and
detecting anomalies in real-time. This helps to ensure the
security and privacy of sensitive data and protects the network
from potential cyber-attacks [47].
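A minimal sketch of this anomaly-detection idea, assuming scikit-learn is available and substituting synthetic per-device traffic features (packet rate and mean payload size) for real IoT traces, might look as follows:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(2)

# Synthetic 'normal' IoT traffic: [packets/s, mean payload bytes] per window.
normal = rng.normal(loc=[20.0, 120.0], scale=[3.0, 15.0], size=(500, 2))
# A few anomalous windows, e.g., a flooding device (assumed values).
attacks = rng.normal(loc=[200.0, 40.0], scale=[20.0, 5.0], size=(5, 2))

# Fit on normal behaviour only; flag deviations at inference time.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

X = np.vstack([normal[:3], attacks])
print(model.predict(X))   # +1 = normal, -1 = flagged as anomalous
```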
Another important aspect where ML can play a crucial role
is in optimizing the utilization of network resources. ML
algorithms can be used to predict network congestion and
dynamically allocate network resources such as bandwidth and
power to the devices that need it most. This results in improved
network performance and reduces the need for manual
intervention. ML has the potential to significantly improve the
performance and efficiency of IoT-enabled wireless
communication systems. The integration of ML with IoT and
wireless communication technologies is a rapidly growing
area of research, and numerous advancements have been made
in recent years. Though, there is still much room for
improvement, and ongoing research efforts are aimed at
further refining the use of ML in these systems and developing
new ML-based solutions for the various challenges faced by
IoT-enabled wireless communication [48].
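To make the congestion-prediction and resource-allocation pattern above concrete, the following hedged sketch trains a ridge regressor on a synthetic load series to forecast the next traffic window from recent history, then splits a bandwidth budget across cells in proportion to forecast demand. The data, lag length, and proportional-split rule are illustrative assumptions, not a method from the surveyed literature.

```python
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(3)

# Synthetic per-cell load series: a daily-like cycle plus noise (assumed data).
t = np.arange(2000)
load = 50 + 30 * np.sin(2 * np.pi * t / 288) + rng.normal(0, 5, t.size)

# Supervised framing: predict the next window's load from the last 6 windows.
lags = 6
X = np.stack([load[i:i + lags] for i in range(t.size - lags)])
y = load[lags:]
model = Ridge().fit(X[:-200], y[:-200])          # hold out the last 200 windows

pred = model.predict(X[-200:])
print("mean absolute forecast error:",
      round(float(np.mean(np.abs(pred - y[-200:]))), 2))

# Proportional bandwidth split across cells based on predicted demand.
forecasts = np.array([40.0, 80.0, 20.0])         # e.g., three cells (assumed)
total_bw = 100.0                                  # MHz available (assumed)
print("allocated bandwidth (MHz):",
      np.round(total_bw * forecasts / forecasts.sum(), 1))
```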
3.4 Summary of literature review on ML in wireless
communication
This study has a specific aim to evaluate the impact of ML
techniques in the upcoming generation of wireless networks
while considering the IoT as a crucial factor. To achieve this
aim, a systematic review of the existing literature is conducted.
This study will provide an in-depth analysis of the role of ML
in next-generation wireless networks and its potential impact
on the design and optimization of these networks, particularly
in the context of IoT devices. The systematic review of the
literature enables us to gain a comprehensive understanding of
the current state of the art in ML-based wireless
communication and the opportunities and challenges in this
area. The results of this study will provide valuable insights
for researchers, engineers, and practitioners in the field of
wireless communication and help in future research and
development in this area. From the literature review, the
following can be summarized.
(1) The IoT has become an essential component of information and communication technologies because of the tremendous growth in popularity of wearable and sensory devices.
(2) Communication systems have evolved rapidly to meet the expectations of the demanding information and communication technologies of the current era.
(3) ML plays a significant role in optimizing communication system utilization and in meeting enhanced speed and data rate requirements.
(4) Together, IoT and communication systems obtain far-reaching benefits from ML in various aspects, whether radio resource utilization or elasticity of demand.
Moreover, Table 2 summarizes the selected literature on ML in wireless communication systems such as device-to-device (D2D) communication, with an emphasis on radio resource optimization for spectrum and energy efficiency (EE).
Likewise, Table 3 summarizes the studies involving ML
such as Gaussian Radial Basis Function (GRBF) neural
networks, Fuzzy Rule-Based System (FRBS) and Genetic
algorithms (GA) in adaptive communication.
Similarly, Table 4 summarizes ML in the NOMA-based
systems in the power domain (PD) as well as code domain (CD)
for various radio networks.
Table 2. Machine Learning based resource management

Ref | Objective | Method | Conclusion
[8] | Utilize a power allocation scheme to optimize the D2D transmit power and maximize the EE | ML-based power control algorithm | The spectrum and energy efficiency of a network can be enhanced by maximizing EE and optimizing the transmit power
[9] | Maximize the sum throughput of D2D links while ensuring QoS | Deep reinforcement learning (DRL) | Simulation results reveal that POPS outperforms DDPG, DDQN, and DQN by 16.67%, 24.98%, and 59.09%, respectively
[10] | Optimize system spectral efficiency | Joint utility and strategy estimation-based learning | The proposal achieves near-optimal performance in a distributed manner
[11] | Optimize a long-term utility related to task execution delay, task queuing delay, and so on | DRL | Significantly improved computation offloading performance compared to several baseline policies
[12] | Optimize system sum rate | K-nearest neighbours (KNN) | Raises system performance compared to a state-of-the-art approach
[13] | BDMA can overcome the scarcity of time and frequency for spectrum sharing; OFDM is used as the 5G modulation technique | Beam Division Multiple Access (BDMA) | Does not support adaptivity, which is a feature of OFDM
Table 3. Machine Learning in adaptive communication

Ref | Objective | Method | Conclusion
[14] | ACM using product codes and QAM | FRBS | Flat power distribution
[15] | Adaptive modulation | Fuzzy logic | Only adaptive modulation
[16] | ACM and power | FRBS and GA | Complex system
[17] | ACM and power using GA and product codes with QAM, compared to the water-filling principle | FRBS, water-filling principle, GA | Huge complexity in decoding product codes
[18] | Adaptive communication | GRBF neural network | For satellite communication only
[19] | Adaptive communication | FRBS and differential evolution algorithm | For satellite communication only
Table 4. Machine Learning in NOMA systems

Ref | Objective | Method | Conclusion
[20] | Optimal power allocations for m-user uplink/downlink NOMA systems | Evolutionary computing | CD not considered
[21] | An overview of the latest NOMA research, innovations, and applications | Survey of various ML NOMA applications | Surveys challenges and trends but does not cover ACM
[22] | PD-multiplexing NOMA with an emphasis on combinations of multiple-antenna (MIMO) techniques and NOMA | Survey of various ML methods in PD-multiplexing-aided NOMA | Focused mainly on MIMO and NOMA; ACM missing
[23] | CD-NOMA performance was found to be better than classical ALOHA | Compares the performance of CD-NOMA with the classical ALOHA protocol | Focused on CD-NOMA; PD-NOMA not considered
[24] | Power allocation policies are discussed for the proposed scheme | ML in NOMA for centralized radio networks | Studies the application of FD in NOMA
[25] | The superiority of NOMA-enabled F-RANs over conventional OMA-enabled F-RANs is verified | ML used as a monotonic optimization approach | Limited to adaptive power
[26] | Investigated the error performance of NOMA schemes under channel estimation errors and imperfect SIC | Derives exact bit error probabilities (BEPs) in closed form; the analysis is validated via simulations | Limited to optimum power allocation
4. CHALLENGES AND OPPORTUNITIES
The role of ML in IoT-enabled next-generation wireless
communication systems is a complex and dynamic field, with
several challenges that need to be addressed. Some of the main
challenges include:
Data privacy and security: IoT devices generate a
large amount of sensitive data, and preserving the
privacy and security of this data is critical.
Limited computation and storage: Many IoT
devices have limited computational and storage
resources, making it challenging to deploy ML
algorithms on these devices.
Network heterogeneity: IoT devices are connected
to the network through different communication
technologies such as Wi-Fi, Zigbee, and others,
leading to network heterogeneity. This can make it
challenging to deploy ML algorithms seamlessly.
Heterogeneous data sources: IoT devices can
generate data from various sources, making it
challenging to integrate and analyze data from
multiple sources.
Lack of standardization: There is currently a lack
of standardization in the deployment of ML
algorithms for IoT devices, making it challenging
to deploy ML algorithms in a scalable manner.
IoT and big data: The huge volumes of data continuously generated by IoT devices need to be processed; however, the devices themselves cannot do this owing to limited resources, and some consist only of sensors and transmitters. The data must therefore be transmitted to an edge, fog, or cloud server for processing and analysis with the help of ML-based algorithms.
Real-time processing: IoT devices generate real-
time data, and there is a need to process this data
in real time to enable real-time decision-making.
This can be challenging for ML algorithms that
require significant computational resources.
Lack of data: ML algorithms require massive
amounts of data to train accurate models. There
may be a lack of adequate data from IoT devices to
develop ML solutions.
Concept drift: The relationships between input
data and outputs may change over time, reducing
the accuracy of ML models. ML solutions for IoT
and wireless networks need to adapt to concept
drift.
Lack of explainability: ML models are often
opaque and complex, lacking explainability. This
can reduce the trust and adoption of ML solutions.
Interpretable ML is needed.
Despite these challenges, the opportunities for applying ML to the IoT in next-generation wireless communication systems are numerous and include:
Improved network performance: ML can be used to
optimize network resources, manage traffic, and
improve network efficiency and coverage.
Predictive maintenance: ML algorithms can analyze
data from IoT devices to predict potential failures,
allowing for proactive maintenance to prevent
outages.
Increased security: ML algorithms can identify and
mitigate potential security threats, such as cyber-
attacks, in real time.
Customization and personalization: ML can be used
to personalize services for individual users based on
their preferences and behaviour.
Enhanced user experience: ML algorithms can
provide real-time recommendations and personalized
services, improving the overall user experience.
Efficient resource allocation: ML algorithms can
optimize the use of network resources, reducing
energy consumption and increasing the overall
efficiency of the system.
Real-time data analysis: ML algorithms can analyze
large amounts of data from IoT devices in real time,
providing valuable insights and enabling better
decision-making.
Automated network optimization: ML can
automatically optimize network configurations and
parameters to maximize performance. This reduces
manual effort and improves network efficiency.
Predictive modeling: ML models can predict future
network demands, traffic patterns and failures. This
enables a proactive rather than reactive approach to
network monitoring and management.
Personalized QoS: ML techniques can predict the
QoS needs of different IoT applications and allocate
network resources to meet those needs. This results
in improved QoS for end users.
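As one concrete instance of the predictive-maintenance and predictive-modeling opportunities listed above, the sketch below trains a random forest on synthetic device telemetry to flag IoT devices likely to fail soon. The features (temperature, battery level, error count) and the labeling rule are assumptions invented for this toy dataset, not a method from the surveyed works.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

# Synthetic telemetry: [temperature (C), battery (%), recent error count].
n = 1000
X = np.column_stack([
    rng.normal(45, 8, n),      # temperature
    rng.uniform(5, 100, n),    # battery level
    rng.poisson(2, n),         # recent error count
])
# Assumed failure rule for the toy labels: hot low-battery or error-prone devices.
y = (((X[:, 0] > 55) & (X[:, 1] < 30)) | (X[:, 2] > 6)).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print("holdout accuracy:", round(clf.score(X_te, y_te), 3))
```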
5. CONCLUSION
In conclusion, the integration of Machine Learning techniques with IoT and wireless communication systems has the potential to bring about significant advancements in the next generation of wireless networks. However, to fully realize this potential, several challenges must be addressed, including the development of algorithms that can effectively process large amounts of data in real time, the creation of secure and reliable communication infrastructure, and the design of efficient and effective ML models. Further challenges associated with applying ML in IoT-based systems include data privacy and security, computational complexity, and interpretability; these, too, must be overcome. Continued research in this area can help to expand the capabilities of ML and usher in a next generation of wireless communication systems that are more efficient, secure, and scalable.
It can further be concluded that next-generation networks are among the most active areas of research in which ML can be incorporated to solve complex problems more adequately. Nonetheless, ML techniques exhibit inherent complexity in the training phase, and more research is needed to address this problem and make such solutions real-time. With the inclusion of IoT, fog and cloud computing become more prominent, and consequently the problems become more diversified and multifaceted. In this case, different variants of ML can be investigated, such as transfer learning and federated and fusion-based learning [49, 50]. ML-based optimization in next-generation wireless networks, especially in the IoT and cloud computing paradigms, remains an open area of research. In particular, smart applications such as wearables, smart homes, smart cities, and industrial automation, including Industry 4.0 and Healthcare 5.0, are a few among many potential application areas.
REFERENCES
[1] Shah, A.S., Qasim, A.N., Karabulut, M.A., Ilhan, H.,
Islam, M.B. (2021). Survey and performance evaluation
of multiple access schemes for next-generation wireless
communication systems. IEEE Access, 9: 113428-
113442. https://doi.org/10.1109/access.2021.3104509
[2] Budhiraja, I., Kumar, N., Tyagi, S., Tanwar, S., Han, Z.,
Piran, M.J., Suh, D.Y. (2021). A systematic review on
NOMA variants for 5G and beyond. IEEE Access, 9:
85573-85644.
https://doi.org/10.1109/access.2021.3081601
[3] Khanh, Q.V., Hoai, N.V., Manh, L.D., Le, A.N., Jeon, G.
(2022). Wireless communication technologies for IoT in
5G: Vision, applications, and challenges. Wireless
Communications and Mobile Computing, 2022: 1-12.
https://doi.org/10.1155/2022/3229294.
[4] Borza, P.N., Machedon-Pisu, M., Hamza-Lup, F. (2019).
Design of wireless sensors for IoT with energy storage
and communication channel heterogeneity. Sensors,
19(15): 3364. https://doi.org/10.3390/s19153364
[5] ALRikabi, H.T.S., Alaidi, A.H., Nasser, K. (2020). The
Application of wireless communication in IOT for saving
electrical energy. International Journal of Interactive
Mobile Technologies (IJIM), 14(1): 152-160.
https://doi.org/10.3991/ijim.v14i01.11538
[6] Celebi, H. B., Pitarokoilis, A., Skoglund, M. (2020).
Wireless communication for the industrial IoT. Industrial
IoT: Challenges, Design Principles, Applications, and
Security, Springer, Cham, 57-94.
https://doi.org/10.1007/978-3-030-42500-5_2
[7] Singh, R.K., Berkvens, R., Weyn, M. (2020). Energy
efficient wireless communication for IoT enabled
greenhouses. In International Conference on
Communication Systems and Networks, pp. 885-887.
https://doi.org/10.1109/comsnets48256.2020.9027392
[8] Khan, M.A.A., Kaidi, H.M., Ahmad, N. (2021). Joint
time and power control scheme for NOMA-enabled D2D
users with energy harvesting in IoT environment. In 2021
IEEE Symposium on Future Telecommunication
Technologies (SOFTT), Bandung, Indonesia, pp. 28-34.
https://doi.org/10.1109/SOFTT54252.2021.9673146
[9] Khan, M.A.A., Kaidi, H.M., Ahmad, N., Rehman, M.U.
(2023). Sum throughput maximization scheme for
NOMA-Enabled D2D groups using deep reinforcement
learning in 5G and beyond networks. IEEE Sensors
Journal, pp. 1-1.
https://doi.org/10.1109/JSEN.2023.3276799
[10] Sun, Y., Peng, M., Poor, H.V. (2018). A distributed
approach to improving spectral efficiency in uplink
device-to-device-enabled cloud radio access networks.
IEEE Transactions on Communications, 66(12): 6511-
6526. https://doi.org/10.1109/TCOMM.2018.2855212
[11] Chen, X., Zhang, H., Wu, C., Mao, S., Ji, Y., Bennis, M.
(2018). Optimized computation offloading performance
in virtual edge computing systems via deep
reinforcement learning. IEEE Internet of Things Journal,
6(3): 4005-4018.
https://doi.org/10.1109/JIOT.2018.2876279
[12] Wang, J.B., Wang, J., Wu, Y., Wang, J. Y., Zhu, H., Lin,
M., Wang, J. (2018). A Machine Learning framework for
resource allocation assisted by cloud computing. IEEE
Network, 32(2): 144-151.
https://doi.org/10.1109/MNET.2018.1700293
[13] Dalela, P.K., Bhave, P., Yadav, P., Yadav, A., Tyagi, V.
(2018). Beam Division Multiple Access (BDMA) and
modulation formats for 5G: Heir of OFDM? In 2018
International Conference on Information Networking
(ICOIN) Chiang Mai, Thailand, pp. 450-455.
https://doi.org/10.1109/ICOIN.2018.8343158
[14] Qureshi, I.M., Muzaffar, M.Z. (2011). Adaptive coding
and modulation for OFDM systems using product codes
and fuzzy rule base system. International Journal of
Computer Applications, 35(4): 41-48.
https://doi.org/10.5120/4393-6099
[15] Qureshi, I.M., Naseem, M.T., Muzaffar, M.Z. (2012). A
fuzzy rule base system aided rate enhancement scheme
for OFDM systems. In 2012 International Conference on
Emerging Technologies Islamabad, Pakistan, pp. 1-6.
https://doi.org/10.1109/INMIC.2012.6511444
[16] Qureshi, I.M., Salam, M.H., Naseem, M.T. (2013).
Efficient link adaptation in OFDM systems using a
hybrid intelligent technique. In 13th International
Conference on Hybrid Intelligent Systems, Gammarth,
Tunisia, pp. 12-17.
https://doi.org/10.1109/HIS.2013.6920471
[17] Islam, S.R., Avazov, N., Dobre, O.A., Kwak, K.S. (2016).
Power-domain non-orthogonal multiple access (NOMA)
in 5G systems: Potentials and challenges. IEEE
Communications Surveys & Tutorials, 19(2): 721-742.
https://doi.org/10.1109/COMST.2016.2621116
[18] Rahman, A. (2023). GRBF-NN based ambient aware
realtime adaptive communication in DVB-S2. Journal of
Ambient Intelligence and Humanized Computing, 14(5):
5929-5939. https://doi.org/10.1007/s12652-020-02174-
w
[19] Rahman, A.U., Dash, S., Luhach, A.K. (2021). Dynamic
MODCOD and power allocation in DVB-S2: A hybrid
intelligent approach. Telecommunication Systems, 76:
49-61. https://doi.org/10.1007/s11235-020-00700-x
[20] Ali, M.S., Tabassum, H., Hossain, E. (2016). Dynamic
user clustering and power allocation for uplink and
downlink non-orthogonal multiple access (NOMA)
systems. IEEE Access, 4: 6325-6343.
https://doi.org/10.1109/ACCESS.2016.2604821
[21] Ding, Z., Lei, X., Karagiannidis, G.K., Schober, R., Yuan,
J., Bhargava, V.K. (2017). A survey on non-orthogonal
multiple access for 5G networks: Research challenges
and future trends. IEEE Journal on Selected Areas in
Communications, 35(10): 2181-2195.
https://doi.org/10.1109/JSAC.2017.2725519
[22] Liu, Y., Qin, Z., Elkashlan, M., Ding, Z., Nallanathan, A.,
Hanzo, L. (2017). Nonorthogonal multiple access for 5G
and beyond. Proceedings of the IEEE, 105(12): 2347-
2381. https://doi.org/10.1109/JPROC.2017.2768666
[23] Duchemin, D., Gorce, J.M., Goursaud, C. (2018). Code
domain non orthogonal multiple access versus aloha: A
simulation-based study. In 2018 25th International
Conference on Telecommunications (ICT) Saint-Malo,
France, pp. 445-450.
https://doi.org/10.1109/ICT.2018.8464836
[24] Chen, X., Liu, G., Ma, Z., Zhang, X., Fan, P., Chen, S.,
Yu, F.R. (2019). When full duplex wireless meets non-
orthogonal multiple access: Opportunities and challenges.
IEEE Wireless Communications, 26(4): 148-155.
https://doi.org/10.48550/arXiv.1905.13605
[25] Liu, B., Liu, C., Peng, M., Liu, Y., Yan, S. (2020).
Resource allocation for non-orthogonal multiple access-
enabled fog radio access networks. IEEE Transactions on
Wireless Communications, 19(6): 3867-3878.
https://doi.org/10.1109/TWC.2020.2978843
[26] Ferdi, K.A.R.A., Hakan, K.A.Y.A. (2020). Error
probability analysis of non-orthogonal multiple access
with channel estimation errors. In 2020 IEEE
International Black Sea Conference on Communications
and Networking (BlackSeaCom) Odessa, Ukraine, pp. 1-
5.
https://doi.org/10.1109/BlackSeaCom48709.2020.9234
956
[27] Akbar, A., Jangsher, S., Bhatti, F.A. (2021). NOMA and
5G emerging technologies: A survey on issues and
solution techniques. Computer Networks, 190: 107950.
https://doi.org/10.1016/j.comnet.2021.107950
[28] Agrawal, R. (2022). Comparison of different mobile
wireless technology (From 0G to 6G). ECS Transactions,
107(1): 4799-4839.
https://doi.org/10.1149/10701.4799ecst
[29] Eeshwaroju, S., Jakkula, P., Ganesan, S. (2020). IoT
based empowerment by smart health monitoring, smart
education and smart jobs. In 2020 International
conference on computing and information technology
(ICCIT-1441), Tabuk, Saudi Arabia, pp. 1-5.
https://doi.org/10.1109/iccit-144147971.2020.9213754
[30] Ali, M.H., Kamardin, K., Maidin, S.N., Hlaing, N.W.,
Kaidi, H.M., Ahmed, I.S., Taj, N. (2022). Agricultural
production system based on IOT. International Journal of
Integrated Engineering, 14(6): 313-328.
https://doi.org/10.30880/ijie.2022.14.06.028
[31] Frank, M., Drikakis, D., Charissis, V. (2020). Machine-
learning methods for computational science and
engineering. Computation, 8(1): 15.
https://doi.org/10.3390/computation8010015
[32] Zhong, S., Zhang, K., Bagheri, M., Burken, J.G., Gu, A.,
Li, B.K., Ma, X.M., Marrone, B.L., Ren, Z.J., Schrier, J.,
Shi, W., Tan, H.Y., Wang, T.B., Wang, X., Wong, B.M.,
Xiao, X.S., Yu, X., Zhu, J.J., Zhang, H.C. (2021).
Machine Learning: New ideas and tools in environmental
science and engineering. Environmental Science &
Technology, 55(19): 12741-12754.
https://doi.org/10.1021/acs.est.1c01339
Alqahtani, A., Aldhafferi, N., Atta-ur-Rahman,
Sultan, K., Khan, M.A.A. (2018). Adaptive
communication: A systematic review. Journal of
Communications, 13(7): 357-367.
https://doi.org/10.12720/jcm.13.7.357-367
[34] Atta-ur-Rahman, Musleh, D., Aldhafferi, N., Alqahtani,
A., Alfifi, H. (2018). Adaptive communication for
capacity enhancement: A hybrid intelligent approach.
Journal of Computational and Theoretical Nanoscience,
15(4): 1182-1191.
https://doi.org/10.1166/jctn.2018.7191
[35] Kaur, M.J., Mishra, V.P., Maheshwari, P. (2020). The
convergence of digital twin, IoT, and machine learning:
Transforming data into action. Digital twin technologies
and smart cities, Springer, Cham, 3-17.
https://doi.org/10.1007/978-3-030-18732-3_1
[36] Daya Sagar, K.V. (2019). Implement of smart health care
monitoring system using mobile IoT and cloud
computing technologies, International Journal of
Innovative Technology and Exploring Engineering,
9(2S5): 67-72.
https://doi.org/10.35940/ijitee.b1015.1292s519
[37] Khan, M.A.A., AlAyat, M., AlGhamdi, J., AIOtaibi,
S.M., ALZahrani, M., AIQahtani, M., ur Rahman, A.,
Altassan, M., Jan, F. (2022). WeScribe: An intelligent
meeting transcriber and analyzer application. In
Proceedings of Third International Conference on
Computing, Communications, and Cyber-Security: IC4S
2021, Singapore: Springer Nature Singapore, pp. 755-
766. https://doi.org/10.1007/978-981-19-1142-2_59
[38] Khan, M.A.A., Alsawwaf, M., Arab, B., ALHashim, M.,
Almashharawi, F., Hakami, O., Olatunji, S.O., Farooqui,
M., ur Rahman, A. (2022). Road damages detection and
classification using deep learning and UAVs. In 2022
2nd Asian Conference on Innovation in Technology
(ASIANCON), Ravet, India, pp. 1-6.
https://doi.org/10.1109/ASIANCON55314.2022.990904
3
[39] Khan, M.A.A., Kaidi, H.M., Alam, N., Rehman, M.U.
(2021). Sum throughput maximization scheme for
NOMA-Enabled D2D groups using deep reinforcement
learning in 5G and beyond networks. IEEE Sensors
Journal, 23(13): 15048-15057. https://doi.org/10.1109/JSEN.2023.3276799
[40] Dash, S., Biswas, S., Banerjee, D., Rahman, A.U. (2019).
Edge and fog computing in healthcare-A review.
Scalable Computing: Practice and Experience, 20(2):
191-206. https://doi.org/10.12694/scpe.v20i2.1504
[41] Nasir, M.U., Khan, S., Mehmood, S., Khan, M.A.,
Rahman, A.U., Hwang, S.O. (2022). IoMT-based
osteosarcoma cancer detection in histopathology images
using transfer learning empowered with blockchain, fog
computing, and edge computing. Sensors, 22(14): 5444.
https://doi.org/10.3390/s22145444
[42] Rahman, A.U., Nasir, M.U., Gollapalli, M., Alsaif, S.A.,
Almadhor, A.S., Mehmood, S., Khan, M.A., Mosavi, A.
(2022). IoMT-based mitochondrial and multifactorial
genetic inheritance disorder prediction using Machine
Learning. Computational Intelligence and Neuroscience,
Article ID: 2650742,
https://doi.org/10.1155/2022/2650742
[43] Alhaidari, F., Rahman, A., Zagrouba, R. (2020). Cloud
of Things: Architecture, applications and challenges.
Journal of Ambient Intelligence and Humanized
Computing, 14: 5957-5975.
https://doi.org/10.1007/s12652-020-02448-3
[44] Dash, S., Ahmad, M., Iqbal, T. (2021). Mobile cloud
computing: A green perspective. In Intelligent Systems:
Proceedings of ICMIB 2020, Springer, Singapore, 185:
523-533. https://doi.org/10.1007/978-981-33-6081-5_46
[45] Rahman, A., Sultan, K., Dash, S., Khan, M.A. (2018).
Management of resource usage in mobile cloud
computing. International Journal of Pure and Applied
Mathematics, 119(16): 255-261.
[46] Basheer Ahmed, M.I., Zaghdoud, R., Ahmed, M.S.,
Sendi, R., Alsharif, S., Alabdulkarim, J., Saad, B.A.A.,
Alsabt, R., Rahman, A., Krishnasamy, G. (2023). A real-
time computer vision-based approach to detection and
classification of traffic incidents. Big Data and Cognitive
Computing, 7(1): 22.
https://doi.org/10.3390/bdcc7010022
[47] Bhati, B.S., Rai, C.S., Balamurugan, B., Al-Turjman, F.
(2020). An intrusion detection scheme based on the
ensemble of discriminant classifiers. Computers &
Electrical Engineering, 86: 106742.
https://doi.org/10.1016/j.compeleceng.2020.106742
[48] Chettri, L., Bera, R. (2019). A comprehensive survey on
Internet of Things (IoT) toward 5G wireless systems.
IEEE Internet of Things Journal, 7(1): 16-32.
https://doi.org/10.1109/jiot.2019.2948888
[49] Gafni, T., Shlezinger, N., Cohen, K., Eldar, Y.C., Poor,
H.V. (2022). Federated learning: A signal processing
perspective. IEEE Signal Processing Magazine, 39(3):
14-41. https://doi.org/10.1109/MSP.2021.3125282
[50] Asif, R. N., Abbas, S., Khan, M. A., Sultan, K., Mahmud,
M., Mosavi, A. (2022). Development and validation of
embedded device for electrocardiogram arrhythmia
empowered with transfer learning. Computational
Intelligence and Neuroscience, Article ID: 5054641.
https://doi.org/10.1155/2022/5054641
ResearchGate has not been able to resolve any citations for this publication.
Article
Full-text available
To constructively ameliorate and enhance traffic safety measures in Saudi Arabia, a prolific number of AI (Artificial Intelligence) traffic surveillance technologies have emerged, including Saher, throughout the past years. However, rapidly detecting a vehicle incident can play a cardinal role in ameliorating the response speed of incident management, which in turn minimizes road injuries that have been induced by the accident's occurrence. To attain a permeating effect in increasing the entailed demand for road traffic security and safety, this paper presents a real-time traffic incident detection and alert system that is based on a computer vision approach. The proposed framework consists of three models, each of which is integrated within a prototype interface to fully visualize the system's overall architecture. To begin, the vehicle detection and tracking model utilized the YOLOv5 object detector with the DeepSORT tracker to detect and track the vehicles' movements by allocating a unique identification number (ID) to each vehicle. This model attained a mean average precision (mAP) of 99.2%. Second, a traffic accident and severity classification model attained a mAP of 83.3% while utilizing the YOLOv5 algorithm to accurately detect and classify an accident's severity level, sending an immediate alert message to the nearest hospital if a severe accident has taken place. Finally, the ResNet152 algorithm was utilized to detect the ignition of a fire following the accident's occurrence; this model achieved an accuracy rate of 98.9%, with an automated alert being sent to the fire station if this perilous event occurred. This study employed an innovative parallel computing technique for reducing the overall complexity and inference time of the AI-based system to run the proposed system in a concurrent and parallel manner.
Article
The Internet of Things (IoT) is not a single technology; it has gathered billions of devices onto the same lane. IoT has given things a life of their own: machines now sense much as humans do and work remotely according to the programs embedded in their chips, making systems smart and reliable. Agriculture, meanwhile, is a main strength of any country: the more agricultural production increases, the further the world moves from food shortage. Agricultural production can be increased when IoT systems are fully implemented in the agricultural sector. This paper reviews most of the approaches to IoT-based agriculture, interpreting and critically analyzing their architectures and methodologies in light of researchers' previous related work. It thereby provides a complete picture of the architectures and methodologies in the field of IoT-based agriculture, and discusses the challenges of agricultural IoT alongside the methods researchers have proposed.
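As a rough illustration of the sensing layer such agricultural IoT architectures share, the sketch below publishes periodic soil readings from a field node over MQTT. The broker address, topic scheme, and simulated sensor values are all invented for the example.

    import json
    import time
    import random
    import paho.mqtt.client as mqtt  # common IoT messaging library

    BROKER = 'broker.example.org'    # hypothetical farm gateway
    TOPIC = 'farm/field1/soil'       # hypothetical topic naming scheme

    client = mqtt.Client()
    client.connect(BROKER, 1883)
    while True:
        # Stand-in for a real soil-moisture/temperature sensor driver.
        reading = {'moisture': round(random.uniform(10, 60), 1),
                   'temp_c': round(random.uniform(15, 35), 1),
                   'ts': time.time()}
        client.publish(TOPIC, json.dumps(reading), qos=1)
        time.sleep(60)               # one sample per minute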
Article
With the emergence of the Internet of Things (IoT), the investigation of different diseases in healthcare has improved, and cloud computing has helped to centralize data and to access patient records throughout the world. In this context, the electrocardiogram (ECG) is used to diagnose heart diseases and abnormalities. Machine learning techniques have been used previously, but they are feature-based and not as accurate as transfer learning; the proposed development and validation of an embedded device for ECG arrhythmia empowered with transfer learning (DVEEA-TL) model addresses this gap. The model combines hardware, software, and two datasets that are augmented and fused, and it achieves higher accuracy than previous work and research. In the proposed model, a new dataset is built by combining a Kaggle dataset with real-time healthy and unhealthy recordings, and the AlexNet transfer learning approach is then applied to obtain more accurate readings of the ECG signals. The DVEEA-TL model diagnoses heart abnormality with accuracies of 99.9% and 99.8% during the training and validation stages, respectively, making it a more reliable approach than previous research in this field.
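A minimal sketch of the AlexNet transfer-learning step follows, assuming the ECG recordings are rendered as 3x224x224 images and the task is binary (normal vs. arrhythmia); the frozen layers and learning rate are illustrative choices, not the paper's exact configuration.

    import torch
    import torch.nn as nn
    from torchvision import models

    # Load AlexNet pretrained on ImageNet and keep its convolutional features frozen.
    model = models.alexnet(weights=models.AlexNet_Weights.DEFAULT)
    for p in model.features.parameters():
        p.requires_grad = False

    # Replace the final layer for an assumed binary task (normal vs. arrhythmia).
    model.classifier[6] = nn.Linear(4096, 2)

    optimizer = torch.optim.Adam(model.classifier.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()

    def train_step(images, labels):
        # images: a batch of ECG recordings rendered as 3x224x224 tensors (an assumption).
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()
        return loss.item()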
Article
A genetic disorder is a serious disease that affects a large number of individuals around the world. There are various types of genetic illnesses; however, we focus on mitochondrial and multifactorial genetic disorders for prediction. Genetic illness is caused by a number of factors, including a defective maternal or paternal gene, excessive abortions, a lack of blood cells, and a low white blood cell count. For premature or teenage life development, early detection of genetic diseases is crucial. Although it is difficult to forecast genetic disorders ahead of time, this prediction is very critical, since a person's life progress depends on it. Machine learning algorithms are used to diagnose genetic disorders with high accuracy, utilizing datasets collected and constructed from a large number of patient medical reports. Many studies have recently employed genome sequencing for illness detection, but fewer have been presented using patient medical history, and the accuracy of those existing studies is restricted. The internet of medical things (IoMT) based model for genetic disease prediction proposed in this article uses two separate machine learning algorithms: support vector machine (SVM) and K-nearest neighbor (KNN). Experimental results show that SVM outperformed KNN and existing prediction methods in terms of accuracy, achieving 94.99% and 86.6% for training and testing, respectively.
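A minimal scikit-learn sketch of the SVM-versus-KNN comparison the paper runs is shown below; synthetic data stands in for the patient medical histories, and the hyperparameters are illustrative defaults rather than the paper's tuned values.

    from sklearn.datasets import make_classification
    from sklearn.svm import SVC
    from sklearn.neighbors import KNeighborsClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score

    # Synthetic placeholder for tabular features extracted from medical reports.
    X, y = make_classification(n_samples=500, n_features=20, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)

    for name, clf in [('SVM', SVC(kernel='rbf')),
                      ('KNN', KNeighborsClassifier(n_neighbors=5))]:
        clf.fit(X_train, y_train)
        print(name, accuracy_score(y_test, clf.predict(X_test)))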
Article
Bone tumors, such as osteosarcomas, can occur anywhere in the bones, though they usually occur in the extremities of long bones near metaphyseal growth plates. Osteosarcoma is a malignant lesion caused by a malignant osteoid growing from primitive mesenchymal cells. In most cases, osteosarcoma develops as a solitary lesion within the most rapidly growing areas of the long bones in children. The distal femur, proximal tibia, and proximal humerus are the most frequently affected bones, but virtually any bone can be affected. Early detection can reduce mortality rates. Manual detection of osteosarcoma requires expertise and can be tedious. With the assistance of modern technology, medical images can now be analyzed and classified automatically, which enables faster and more efficient data processing. This paper presents a deep learning-based system that automatically detects osteosarcoma from whole slide images (WSIs). Experiments conducted on a large dataset of WSIs yielded up to 99.3% accuracy. The model ensures the privacy and integrity of patient information through the implementation of blockchain technology, and by utilizing edge computing and fog computing it reduces the load on centralized servers and improves efficiency.
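Because a whole slide image is far too large for a single forward pass, pipelines of this kind typically tile it into patches and score each patch independently. The sketch below classifies one patch with a pretrained ResNet152; the two-class head and the preprocessing are assumptions, not the paper's exact setup.

    import torch
    from torchvision import models, transforms
    from PIL import Image

    # Pretrained ResNet152 backbone; the two-class output head is an assumption.
    model = models.resnet152(weights=models.ResNet152_Weights.DEFAULT)
    model.fc = torch.nn.Linear(model.fc.in_features, 2)
    model.eval()

    prep = transforms.Compose([transforms.Resize((224, 224)),
                               transforms.ToTensor()])

    def classify_patch(patch: Image.Image) -> int:
        # Score a single WSI tile; 0 = benign, 1 = tumor (illustrative labels).
        with torch.no_grad():
            logits = model(prep(patch).unsqueeze(0))
        return int(logits.argmax(dim=1))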
Article
Communication technologies are developing very rapidly and achieving many breakthrough results. The advent of 5th generation mobile networks, the so-called 5G, has become one of the most exciting and challenging topics in the wireless research area. The power of 5G enables it to connect hundreds of billions of devices with extremely high throughput and extremely low latency, realizing a true digital society where everything can be connected via the Internet, well known as the Internet of Things (IoT). IoT is a technology of technologies in which humans, devices, software, solutions, and platforms connect over the Internet. The formation of IoT technology has led to the birth of a series of applications and solutions serving humanity, such as smart cities, smart agriculture, smart retail, intelligent transportation systems, and IoT ecosystems. Although IoT is considered a revolution in the evolution of the Internet, it still faces a series of challenges such as energy saving, security, performance, and QoS support. In this study, we provide a vision of the Internet of Things as the main force driving the comprehensive digital revolution of the future. The communication technologies in IoT systems are discussed comprehensively and in detail. Furthermore, we identify in-depth challenges of existing common communication technologies in IoT systems and future research directions for IoT. We hope the results of this work can provide a vital guide for future studies on communication technologies for IoT in 5G.
Article
Device-to-device (D2D) communication underlaying a cellular network is a promising technique for improving spectrum efficiency. In this setting, however, D2D generates cross-channel and co-channel interference for cellular and other D2D users, which creates a substantial technical challenge for spectrum allocation. Massive connectivity is a further issue that 5G and beyond networks must address. To overcome these problems, non-orthogonal multiple access (NOMA) is integrated with the D2D groups (DGs). In this paper, the target is to maximize the sum throughput of the overall network while maintaining the signal-to-interference-plus-noise ratio (SINR) of the cellular and D2D users. To achieve this, a spectrum distribution framework based on multi-agent deep reinforcement learning (MADRL), termed deep deterministic policy gradient (DDPG), is proposed, in which agents share global historical states, actions, and policies during centralized training. Furthermore, a proximal online policy scheme (POPS) is used to decrease the computational complexity of training; it utilizes a clipped surrogate technique to reduce complexity at the training stage. Simulation results demonstrate that the proposed POPS scheme attains 16.67%, 24.98%, and 59.09% higher performance than the DDPG, Deep Dueling, and deep Q-network (DQN) schemes, respectively.
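The "clipped surrogate" idea that POPS borrows can be illustrated with the PPO-style clipped objective; this sketch is a generic form of that loss, not the paper's exact POPS implementation.

    import torch

    def clipped_surrogate_loss(logp_new, logp_old, advantages, eps=0.2):
        # Probability ratio between the updated policy and the behaviour policy.
        ratio = torch.exp(logp_new - logp_old)
        # Clipping the ratio to [1 - eps, 1 + eps] keeps each policy update small,
        # which is what cuts the cost and instability of training.
        unclipped = ratio * advantages
        clipped = torch.clamp(ratio, 1 - eps, 1 + eps) * advantages
        return -torch.min(unclipped, clipped).mean()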
Article
Mobile wireless technology has evolved from the zero generation (0G) through first-, second-, and third-generation systems over the last several decades to increase quality of service (QoS), efficiency, and performance, and the shift to fourth-generation (4G) systems is now under way; shortly, mobile devices will have access to both 4G and 5G networks. 1G technology made mass mobile wireless communication possible. Since analog technology was superseded by digital in the 1990s, second-generation (2G) wireless communication has evolved tremendously. With 3G technology, new converged networks that simultaneously handle voice and data transmission are emerging. "5G" is not yet used in official settings; 5G research aims to expand the WWW, dynamic ad hoc wireless networks, and more. Users have moved with mobile wireless technology from one generation to the next.
Article
The dramatic success of deep learning is largely due to the availability of data. Data samples are often acquired on edge devices, such as smartphones, vehicles, and sensors, and in some cases cannot be shared due to privacy considerations. Federated learning is an emerging machine learning paradigm for training models across multiple edge devices holding local data sets, without explicitly exchanging the data. Learning in a federated manner differs from conventional centralized machine learning and poses several core unique challenges and requirements, which are closely related to classical problems studied in the areas of signal processing and communications. Consequently, dedicated schemes derived from these areas are expected to play an important role in the success of federated learning and the transition of deep learning from the domain of centralized servers to mobile edge devices.
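A minimal FedAvg-style sketch of the server-side aggregation step that such schemes build on follows; weighting each client by its local dataset size is the standard choice and is assumed here, and the function names are illustrative.

    import copy
    import torch

    def federated_average(global_model, client_states, client_sizes):
        # Weighted FedAvg: each client's parameters count in proportion to its
        # local dataset size; raw data never leaves the edge device.
        total = sum(client_sizes)
        avg = copy.deepcopy(client_states[0])
        for key in avg:
            avg[key] = sum(state[key] * (n / total)
                           for state, n in zip(client_states, client_sizes))
        global_model.load_state_dict(avg)
        return global_model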
Conference Paper
Energy efficiency in underlay wireless communication systems is challenging. Consider several device-to-device (D2D) groups lying within a cellular network, where each group contains one transmitter and two receivers. The downlink (DL) transmission allows the transmitter of a D2D group (DG) to perform energy harvesting (EH); the harvested and stored energy is then used to transfer information to the respective D2D receivers (DRX) over the same spectrum, simultaneously. In this paper, a power allocation scheme is utilized to optimize the D2D transmitter (DTX) power and to maximize the DG's energy efficiency (EE). Within a DG, the user with poor channel conditions suffers intra-user interference from the user with strong channel conditions, which reduces the EE of the group. Therefore, a non-orthogonal multiple access (NOMA) technique is employed at the DG's transmitter to reduce intra-user interference. Numerical results demonstrate that the spectrum efficiency and EE of a network can be enhanced by maximizing the EE and optimizing the power of transmitters in DGs using successive convex approximation.
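For intuition, the sketch below computes the two-user downlink NOMA sum rate and the energy efficiency that the optimization targets: the weak-channel user decodes while treating the strong user's signal as interference, and the strong user applies successive interference cancellation. The channel gains, noise power, and circuit power are illustrative numbers, and the successive convex approximation itself is not reproduced.

    import numpy as np

    def noma_energy_efficiency(p, h, sigma2=1e-9, p_circuit=0.1):
        # p: transmit powers [weak user, strong user] in watts (illustrative).
        # h: channel gains [weak user, strong user] (illustrative).
        sinr_weak = (p[0] * h[0]) / (p[1] * h[0] + sigma2)   # interference-limited
        sinr_strong = (p[1] * h[1]) / sigma2                 # after SIC
        sum_rate = np.log2(1 + sinr_weak) + np.log2(1 + sinr_strong)  # bits/s/Hz
        return sum_rate / (np.sum(p) + p_circuit)            # bits/Joule/Hz

    print(noma_energy_efficiency(p=np.array([0.8, 0.2]), h=np.array([1e-7, 1e-6])))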