An Energy-Efficient Load Balancing
Approach for Fog Environment Using
Scientific Workflow Applications
Mandeep Kaur and Rajni Aron
Abstract Fog computing has attracted the attention of researchers by bringing a revolution in the Internet of Things (IoT). Fog computing emerged as a complement to cloud computing. It extends cloud services to the network edge and processes large and complex tasks near end users. Furthermore, fog computing can process workflow tasks on its own nodes rather than sending them to the cloud, which reduces the time consumed in requesting and processing at the cloud layer. Scientific workflows are used to represent data flow in scientific applications, which are very time-critical. This paper proposes an energy-efficient load balancing approach for fog computing to reduce energy consumption in scientific workflow applications. The proposed algorithm reduces energy consumption in fog nodes by distributing the workload equally across fog resources. The Genome and SIPHT workflow applications have been considered for evaluation in iFogSim.
Keywords Energy-efficient · Fog computing · Load balancing · Resource utilization · Scientific workflows
1 Introduction
Fog computing contains sensors, actuators, gateways, and other computing devices in its layered structure, which helps store and process end-user requests at the edge itself. CISCO introduced fog computing to support end users facing obstacles while accessing cloud data centers [1].

M. Kaur (B)
Lovely Professional University, Jalandhar, India
e-mail: k.mandeep@chitkara.edu.in
Present Address: Chitkara University Institute of Engineering and Technology, Chitkara University, Rajpura, Punjab, India

R. Aron
SVKM's Narsee Monjee Institute of Management Studies (NMIMS) University, Mumbai, Maharashtra, India
e-mail: rajni@nmims.edu

© The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd. 2022
S. Majhi et al. (eds.), Distributed Computing and Optimization Techniques, Lecture Notes in Electrical Engineering 903, https://doi.org/10.1007/978-981-19-2281-7_16

As its name indicates, fog lies near the end devices, where all the Internet of Things devices communicate and generate data. The
amount of data increases daily and needs proper storage and processing; hence, fog computing provides a layer close to end devices to cope with the high-latency problem. Fog computing also helps execute scientific workflow tasks. Workflow systems help manage different resources, and they can be used in fog computing to manage scientific workflow tasks. Workflows are defined as Directed Acyclic Graphs (DAGs), which contain vertices and edges, where vertices denote the tasks to be executed and edges show the dependencies between these tasks [2].
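Since workflows are DAGs, any topological order of the task graph is a valid execution order. A minimal sketch of this view, using the standard-library `graphlib` and four hypothetical task names (not drawn from any benchmark workflow):

```python
from graphlib import TopologicalSorter  # Python 3.9+

# Hypothetical four-task workflow: each task maps to the set of tasks
# it depends on (its predecessors), as TopologicalSorter expects.
workflow = {
    "extract": set(),                # entry task, no dependencies
    "align":   {"extract"},
    "filter":  {"extract"},
    "merge":   {"align", "filter"},  # exit task, joins both branches
}

order = list(TopologicalSorter(workflow).static_order())
print(order)  # dependencies always precede their dependents
```

Scheduling a workflow onto fog nodes then amounts to assigning each task in such an order to a resource, subject to its predecessors having finished.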
Workflow applications such as scientific tasks, face recognition, and sentiment analysis are complex and increase the load on the fog environment. Workflows contain dependent tasks: first, available resources are found, and then tasks are assigned to them for execution. Due to the complexity of workflow tasks, resources can be wasted, resulting in higher energy consumption [3]. Workflow scheduling is considered an NP-complete problem, which weighs time and cost parameters while running the tasks [4]. Owing to the distributed nature of fog computing, fog nodes are deployed near the end devices. A few examples of scientific workflows are CyberShake, LIGO, SIPHT, Genome, and Montage. The SIPHT and Genome workflows have been considered for evaluating the proposed approach. SIPHT searches for small untranslated RNAs (sRNAs) across the bacterial replicons catalogued by the National Center for Biotechnology Information and helps to collect biological information [5, 6]. The Genome workflow can process any data related to antimicrobial resistance, a pathogen's identity, or genetic information [7, 8].
1.1 Load Balancing at Fog Layer
Implementing load balancing for workflow tasks also means conserving energy in fog resources. If workflow tasks are unevenly distributed across fog nodes, some nodes may be overloaded while other resources sit underutilized. So, to conserve the energy consumed by fog resources, load balancing is a must in the fog computing layer. The task of the load balancer at the fog layer is to distribute the workload equally across all fog resources. Even workload distribution helps in the efficient utilization of resources [3, 5, 9].
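The balancing policy described above can be sketched as a greedy least-loaded assignment: each incoming task goes to the fog node with the smallest accumulated load, so no node sits idle while another is overloaded. The node count and task sizes below are invented for illustration:

```python
import heapq

def balance(task_sizes, num_nodes):
    """Greedy least-loaded assignment: each task goes to the fog
    node with the smallest accumulated load so far."""
    heap = [(0.0, n) for n in range(num_nodes)]  # (load, node id)
    heapq.heapify(heap)
    assignment = {n: [] for n in range(num_nodes)}
    for task, size in enumerate(task_sizes):
        load, node = heapq.heappop(heap)   # least-loaded node
        assignment[node].append(task)
        heapq.heappush(heap, (load + size, node))
    return assignment

# Ten equal-sized tasks spread over five nodes -> two tasks each.
result = balance([1.0] * 10, 5)
print([len(v) for v in result.values()])  # [2, 2, 2, 2, 2]
```

With heterogeneous task sizes, the same loop keeps the heaviest per-node load as low as a one-pass greedy policy can.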
1.2 Our Contribution
This article contributes the following:
1. It proposes a fog computing architecture for maximum resource utilization in scientific workflow applications.
2. It proposes a load balancing approach for the fog computing environment that aims to reduce energy consumption in fog resources while executing large and complex scientific workflow tasks.
The remainder of the article is organized as follows. Section 2 reviews the existing literature on fog computing. Section 3 proposes a fog computing architecture implementing load balancing for maximum resource utilization; it also contains the proposed EE-LB algorithm, whose simulation results are presented in Sect. 4. Section 5 concludes the article and provides future scope.
2 Literature Review
This section reviews the literature on dynamic resource allocation and load balancing in workflows. Many research works have provided different techniques for scheduling workflows, but load balancing still needs to be explored. The literature review has been classified into the two categories described below.
2.1 Resource Allocation in Fog Computing
Li et al. [3] proposed a load-balancing-based workflow scheduling model for resource allocation in the cloud environment. Their system model reduces response time and energy consumption in executing scientific workflow applications, and their workflow scheduling approach is based on the shortest-path technique. To implement their scenario, the authors developed a social media application and considered a live video workflow application. Naha et al. [5] proposed a linear-regression method for energy-aware resource allocation that minimizes failures occurring due to energy constraints in the fog environment. Along with this, an energy-aware framework has been proposed to execute different applications in fog. The proposed approach has been compared with other existing techniques and reduces execution and processing time.
Rehman et al. [10] proposed a "Dynamic Energy Efficient Resource Allocation strategy (DEER)" for maximum resource utilization by implementing load balancing in the fog environment. The approach has been executed in a simulation environment, and the obtained results were compared with another approach in terms of cost and energy consumption: energy consumption improved by 8.67%, and computational cost by 16.77%. Xu et al. [11] proposed a "Dynamic Resource Allocation Strategy (DRAM)" for fog computing to obtain maximum load balancing in the fog environment. The major steps involved in DRAM are fog operation partitioning, detection of available nodes, and dynamic resource allocation for local and global users. Maximum resource utilization has been obtained by implementing load balancing, but the energy consumption of nodes in the fog environment is not given much attention.
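The three DRAM steps listed above can be illustrated with a rough sketch; the capacity figures, node names, and tie-breaking rule are invented here, and [11] gives the actual strategy:

```python
def dram_like_allocate(tasks, capacities):
    """Rough DRAM-style flow: the workload arrives in parts, nodes with
    spare capacity are detected, and tasks are allocated dynamically."""
    free = dict(capacities)              # step 2: available-node view
    allocation = {}
    for task, demand in tasks.items():   # step 1: partitioned workload
        # step 3: dynamic allocation to the node with the most spare room
        node = max(free, key=free.get)
        if free[node] < demand:
            raise RuntimeError("no node can host " + task)
        allocation[task] = node
        free[node] -= demand
    return allocation

alloc = dram_like_allocate({"t1": 2, "t2": 2, "t3": 1},
                           {"fogA": 3, "fogB": 3})
print(alloc)  # {'t1': 'fogA', 't2': 'fogB', 't3': 'fogA'}
```

Even this toy version spreads the load across both nodes rather than filling one first, which is the behavior the DRAM comparison in Sect. 4 targets.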
2.2 Load Balancing in Fog Computing
Rizvi and Ramesh [4] reduced computational cost and execution time with a workflow scheduling policy, i.e., a fair budget policy. The policy has been evaluated using different workflow applications, and its execution time and energy consumption are compared with other approaches; the authors also performed an ANOVA test on their proposed strategies. Kaur and Aron [12] proposed an equal-workload-distribution-based load balancing approach for the fog computing environment. The authors used the Cloud Analyst tool to evaluate their system and compared the obtained results with the existing round-robin and throttled load balancing approaches. The main motive of the algorithm is to enhance resource utilization and reduce implementation cost.
Kaur and Aron [16] proposed an energy-aware approach for load balancing in the fog computing environment. The authors considered scientific workflow applications (Genome, CyberShake) for evaluation, and simulation results were obtained using the iFogSim environment. However, they considered only a few fog nodes: since energy consumption grows with the number of fog nodes, an approach validated on so few nodes cannot be expected to handle larger workloads. Table 1 shows the comparison of existing approaches.
3 Proposed Fog Computing Architecture for Maximum Resource Utilization

Fog computing acts as a middle layer between the IoT and cloud layers. The traditional fog computing architecture brings cloud services from the core to the network edge [2]. The architecture provided in this article adds load balancing to this middle layer so that workflow tasks are evenly distributed across all fog resources. In the distributed fog environment, fog nodes are deployed near the end users. Figure 1 shows the proposed fog computing architecture.
Figure 1 shows the fog computing architecture containing three layers: end users, the fog layer, and the cloud layer. The end-user layer is connected to the nearby deployed fog nodes, and users submit their workflow tasks to these nodes. Fog nodes have nano data centers, which store and process user requests locally. Each fog node is connected to the central controller, which controls all fog nodes. The central controller first schedules the tasks in the fog nodes' local queues; the tasks can be ordered in any manner, e.g., First Come First Serve (FCFS) or shortest path first. The tasks are then forwarded to the load balancer, which keeps track of all available and utilized fog nodes. Energy is assigned in the form of electric power consumed by various resources, and idle VMs consume energy along with the overloaded
Table 1 Comparison of various existing approaches

Naha et al. [5], 2021 — Purpose: energy-aware approach based on multiple linear regression for load balancing in fog. Network: Fog. Tool: CloudSim. Application: time-sensitive applications. Research gap: fog nodes can be clustered to enhance system performance.

Mokni et al. [7], 2021 — Purpose: hybrid multi-agent approach for the cloud-fog environment to schedule IoT task workflows. Network: Cloud-Fog. Tool: CloudSim. Application: IoT application. Research gap: energy consumption in cloud-fog is not considered.

Davami et al. [15], 2021 — Purpose: high-level architecture for scheduling multiple workflows. Network: Fog. Tool: architecture tradeoff analysis method (ATAM). Application: scientific workflows. Research gap: load balancing for equal workload distribution is not considered.

Kaur and Aron [16], 2020 — Purpose: energy-aware load balancing approach. Network: Fog-Cloud. Tool: iFogSim. Application: scientific workflows. Research gap: only a small number of fog nodes is considered.

Hameed et al. [17], 2021 — Purpose: dynamic clustering approach for a vehicular system. Network: vehicular ad hoc network (fog). Tool: NS2. Application: realistic vehicular network. Research gap: security of fog nodes can also be considered.
VMs. Hence, to optimize energy consumption in fog nodes, proper load balancing is required so that no VM remains underutilized and no VM becomes overloaded. The load balancer distributes the tasks equally among the VMs. Workflow tasks are executed by fog nodes, and load balancing helps to improve the utilization of all the resources in the fog layer. With enhanced resource utilization, the energy consumption of fog nodes can be reduced, which in turn reduces implementation cost. The fog layer is connected to the cloud layer above it. The cloud layer has large data centers with ample storage and computing capacity; it takes data from the fog layer and stores it for future use.
Fig. 1 Fog computing architecture implementing load balancing for maximum resource-utilization
3.1 Energy-Efficient Load Balancing Approach (EE-LB)
In the fog computing environment, executing large computational workflow applications increases the demand for resources, and with the increased resource requirement comes higher energy consumption. Efficient energy consumption in the fog environment therefore calls for an energy-efficient load balancing approach, so that no energy is wasted and resource utilization is increased. This section presents the proposed energy-efficient load balancing approach for the fog computing environment: a hybrid approach based on Simulated Annealing and water cycle optimization. The optimization approaches used in this work are described as follows.

Simulated Annealing Algorithm (SAA). SAA has been used for intra-cluster mapping of tasks onto fog nodes, and the energy consumption in fog clusters has been analyzed using it. SAA has a large margin for error control, so it has been used to find a global solution for scientific workflows.

Water Cycle Optimization (WCO). WCO works on the basis of the natural water cycle process and is invoked in this work when an optimized solution has not been found with SAA. WCO reduces the energy consumption and cost of intra-cluster resource mapping.

Both optimization techniques work in hybrid form to enhance performance and reduce the energy consumption and computational cost of fog nodes.
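The published pseudocode appears as a figure in the original chapter; as a stand-in, here is only an illustrative sketch of the simulated-annealing half of the hybrid. The load figures, cooling schedule, and stall threshold are invented, and a plain random restart stands in for the WCO hand-over, so this is not the published EE-LB algorithm:

```python
import math
import random

def makespan(mapping, task_load, num_nodes):
    """Balance proxy: the heaviest per-node load (lower means more even)."""
    loads = [0.0] * num_nodes
    for task, node in enumerate(mapping):
        loads[node] += task_load[task]
    return max(loads)

def sa_map(task_load, num_nodes, steps=2000, t0=5.0, seed=1):
    """Simulated-annealing task-to-node mapping minimising the proxy."""
    rng = random.Random(seed)
    cur = [rng.randrange(num_nodes) for _ in task_load]
    cur_e = makespan(cur, task_load, num_nodes)
    best, best_e = cur[:], cur_e
    stall = 0
    for step in range(steps):
        t = t0 * (1.0 - step / steps) + 1e-9   # linear cooling schedule
        cand = cur[:]                          # move one task to a random node
        cand[rng.randrange(len(cand))] = rng.randrange(num_nodes)
        cand_e = makespan(cand, task_load, num_nodes)
        if cand_e <= cur_e or rng.random() < math.exp((cur_e - cand_e) / t):
            cur, cur_e = cand, cand_e          # Metropolis acceptance
        if cur_e < best_e:
            best, best_e = cur[:], cur_e
            stall = 0
        else:
            stall += 1
        if stall >= 500:                       # stand-in for the WCO hand-over
            cur = [rng.randrange(num_nodes) for _ in task_load]
            cur_e = makespan(cur, task_load, num_nodes)
            stall = 0
    return best, best_e

best, best_e = sa_map([1.0] * 10, num_nodes=5)
print(best_e)  # ideally 2.0: ten unit tasks spread two per node
```

The accept-worse moves early in the schedule are what let the search escape poor local mappings, which is the property the chapter relies on SAA for.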
4 Result Analysis and Discussion
This section provides the results obtained by evaluating proposed approach in
iFogSim environment. This section is divided into subsections i.e., parameter consid-
ered for comparison of obtained results, experimental requirements, and results are
shown in graph form in the later subsection.
4.1 Parameters Considered
The proposed approach has been evaluated based on two performance parameters, i.e., computational cost and energy consumption. These parameters are explained as follows:

Computational Cost: Computational cost can be calculated in terms of the maintenance cost of fog nodes as well as cloud nodes. Sometimes only a few nodes are utilized and the others remain idle, but the idle nodes also require maintenance. Hence the maintenance cost can be calculated using the following equation:

Cost = C_r^fog + C_r^cloud + R_(c+f)    (1)

Equation (1) calculates the computational cost in the fog environment as the sum of the total cost of resources at the fog layer C_r^fog, the total cost of resources at the cloud layer C_r^cloud, and the total available resources at the fog and cloud layers R_(c+f).
Energy Consumption: When tasks are assigned to resources for processing, the resources consume energy while executing them. Sometimes, in the case of large computational tasks, a few resources receive more tasks to execute while the others remain idle, yet all resources consume energy whether they are in execution mode or idle [7]. Hence energy consumption can be calculated as follows:

Ec = E_idle^(fog,cloud) + E_utilized^(fog,cloud) + R_max    (2)

Equation (2) calculates the energy consumption (Ec) in the fog environment as the sum of the energy consumed by all idle and all utilized fog and cloud resources, E_idle^(fog,cloud) and E_utilized^(fog,cloud), plus the maximum number of available resources (R_max).
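Both metrics are plain sums of their terms, so they can be sketched directly; every number below is made up purely for illustration:

```python
def computational_cost(c_fog, c_cloud, r_total):
    """Eq. (1): fog-layer resource cost + cloud-layer resource cost
    + the total-available-resources term R_(c+f)."""
    return c_fog + c_cloud + r_total

def energy_consumption(e_idle, e_utilized, r_max):
    """Eq. (2): energy of idle resources + energy of utilised resources
    + the maximum-available-resources term R_max."""
    return e_idle + e_utilized + r_max

print(computational_cost(1200.0, 800.0, 50))  # 2050.0
print(energy_consumption(120.0, 430.0, 50))   # 600.0
```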
4.2 Experimental Requirements
Table 2 describes the experimental requirements used for executing the proposed approach. The proposed EE-LB approach has been evaluated using the iFogSim simulation tool with the Genome and SIPHT workflow applications. All the requirements are listed in Table 2.
4.3 Experimental Results
The experimental results obtained after evaluating the proposed EE-LB approach are shown as graphs, comparing it with other existing approaches, i.e., DEER [10], DRAM [11], EA-LB [16], and SRFog [18], on the basis of the parameters described in Table 3.

Figure 2 shows the simulation results obtained by executing the proposed energy-efficient approach (EE-LB) and comparing it with the other existing approaches. It can be seen from the graphs that EE-LB outperforms the existing approaches, reducing cost and energy consumption when evaluating the considered workflows.
Table 2 Experimental requirements

Simulator: iFogSim
Operating system: Windows 10, 64-bit
Fog nodes: 20–140
Workload: 100–1000 tasks
Table 3 Comparison of proposed approach (EE-LB) with other approaches

Simulation environment — DEER: CloudSim; DRAM: CloudSim; EA-LB: iFogSim; SRFog: Kubernetes; EE-LB: iFogSim
Number of nodes — DEER: 500–2000 resources; DRAM: 20–140; EA-LB: 20 fog nodes; SRFog: 2–7 nodes; EE-LB: 140 fog nodes
Number of workloads — DEER: N numbers; DRAM: 500–1000; EA-LB: 100–200; SRFog: 6 user requests; EE-LB: 100–1000
Network type — Fog for all five approaches
Energy consumption (kJ) — DEER: 1002.514; DRAM: 756.345; EA-LB: 935.321; SRFog: 1134.453; EE-LB: 675.66
Computational cost ($) — DEER: 7000–8000; DRAM: 8000–9000; EA-LB: 5500–6000; SRFog: 4000–4500; EE-LB: 2500–3000
[Figure 2 plots cost in dollars (0–10,000) against the number of resources per cluster (30–210) for EA-LB, DEER, SRFog, DRAM, and the proposed EE-LB.]
Fig. 2 Cost analysis of EE-LB by comparing with other approaches
Fig. 3 Energy consumption analysis of EE-LB by comparing with other approaches
Figure 3 shows the simulation results obtained by evaluating the considered workflows in the proposed framework. The graphs show that as the number of fog nodes increases, energy consumption also increases; even so, the proposed EE-LB approach consumes less energy than the other approaches.
5 Conclusion and Future Scope
Load balancing in scientific workflows is necessary to fully utilize the resources at the fog layer. This article provides an architecture for fog computing that implements load balancing for scientific workflows. Furthermore, it reviews the existing load balancing and scheduling techniques for workflows and provides a comparison between them. Different existing scientific workflows have been evaluated in iFogSim by applying the proposed EE-LB approach, and the results are compared with EA-LB, DEER, SRFog, and DRAM. It has been observed that EE-LB reduces computational cost by 28% and energy consumption by 35% compared to the other approaches. In future, QoS parameters in the fog environment need to be explored further.
References
1. Bonomi F, Milito R, Zhu J, Addepalli S (2012) Fog computing and its role in the internet of
things. In: Proceedings of the first edition of the MCC workshop on Mobile cloud computing,
pp 13–16
2. Ding R, Li X, Liu X, Xu J (2018) A cost-effective time-constrained multi-workflow scheduling
strategy in fog computing. In: International conference on service-oriented computing.
Springer, Cham, pp 194–207
3. Li C, Tang J, Ma T, Yang X, Luo Y (2020) Load balance based workflow job scheduling
algorithm in distributed cloud. J Netw Comput Appl 152:102518
4. Rizvi N, Ramesh D (2020) Fair budget constrained workflow scheduling approach for
heterogeneous clouds. Clust Comput 23(4):3185–3201
5. Naha RK, Garg S, Battula SK, Amin MB, Georgakopoulos D (2021) Multiple linear regression-
based energy-aware resource allocation in the fog computing environment. arXiv preprint
arXiv:2103.06385
6. De Maio V, Kimovski D (2020) Multi-objective scheduling of extreme data scientific workflows
in Fog. Future Gener Comput Syst 106:171–184
7. Mokni M et al (2021) Cooperative agents-based approach for workflow scheduling on fog-cloud computing. J Ambient Intell Humaniz Comput 1–20
8. Ahmad Z et al (2021) Scientific workflows management and scheduling in cloud computing:
taxonomy, prospects, and challenges. IEEE Access 9:53491–53508
9. Singh SP (2021) An energy efficient hybrid priority assigned laxity algorithm for load balancing in fog computing. Sustain Comput Inform Syst 100566
10. Rehman AU et al (2020) Dynamic energy efficient resource allocation strategy for load
balancing in fog environment. IEEE Access 8:199829–199839
11. Xu X et al (2018) Dynamic resource allocation for load balancing in fog environment. Wirel
Commun Mob Comput 2018
12. Kaur M, Aron R (2020) Equal distribution based load balancing technique for fog-based cloud
computing. In: International conference on artificial intelligence: advances and applications
2019. Springer, Singapore, pp 189–198
13. Shahid MH, Hameed AR, Islam S, Khattak HA, Din IU, Rodrigues JJPC (2020) Energy and
delay efficient fog computing using caching mechanism. Comput Commun 154:534–541
14. Kaur A et al (2020) Deep-Q learning-based heterogeneous earliest finish time scheduling
algorithm for scientific workflows in cloud. Softw Pract Exp
15. Davami F et al (2021) Fog-based architecture for scheduling multiple workflows with high
availability requirement. Computing 1–40
16. Kaur M, Aron R (2020) Energy-aware load balancing in fog cloud computing. Mater Today
Proc
17. Hameed AR et al (2021) Energy- and performance-aware load-balancing in vehicular fog computing. Sustain Comput Inform Syst 30:100454
18. dos Santos P, Pedro J et al (2021) SRFog: a flexible architecture for virtual reality content
delivery through fog computing and segment routing. In: IM2021, the IFIP/IEEE symposium
on integrated network and service management