Fig. 1: Popular Hadoop software logo (available from: SN Computer Science)

Source publication
Article
Full-text available
Advancements in communication technologies have also made a positive impact by increasing computation. Cloud computing is an internet-based computational utility that has reduced the cost of computation and cut down on large investments. The cloud is a service-oriented architecture with decentralized computation. The SWOT analysis of the cloud...

Context in source publication

Context 1
... as TSACS aimed to solve the issues that arise at the scheduling level. They used a clustering stage, a converting stage, and a scheduling stage. The first two stages prepare the workload for assignment to the appropriate virtual machine. It is expected to overcome the problem of oscillations, which remains a major issue today, as sketched below (Fig. ...
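As a rough illustration of the cluster-then-schedule idea described above, the following Python sketch buckets tasks by length and then maps each bucket to a tier of VMs. All names, thresholds, and the round-robin rule inside each tier are assumptions for illustration; this is not the TSACS authors' code.

```python
# Hypothetical sketch of a cluster-then-schedule flow (not the TSACS authors' algorithm).
from dataclasses import dataclass

@dataclass
class Task:
    task_id: int
    length_mi: float        # task length in million instructions (assumed unit)

@dataclass
class VM:
    vm_id: int
    mips: float             # processing capacity of the VM

def cluster_tasks(tasks, boundaries=(1000.0, 10000.0)):
    """Clustering/converting stages: bucket tasks into small/medium/large groups by length."""
    clusters = {"small": [], "medium": [], "large": []}
    for t in tasks:
        if t.length_mi < boundaries[0]:
            clusters["small"].append(t)
        elif t.length_mi < boundaries[1]:
            clusters["medium"].append(t)
        else:
            clusters["large"].append(t)
    return clusters

def schedule(clusters, vms):
    """Scheduling stage: send each cluster to the VM tier whose capacity rank matches it."""
    vms_sorted = sorted(vms, key=lambda v: v.mips)
    third = len(vms_sorted) // 3
    tiers = {"small": vms_sorted[:third] or vms_sorted[:1],
             "medium": vms_sorted[third:2 * third] or vms_sorted[:1],
             "large": vms_sorted[2 * third:] or vms_sorted[-1:]}
    plan = []
    for name, bucket in clusters.items():
        tier = tiers[name]
        for i, task in enumerate(bucket):
            plan.append((task.task_id, tier[i % len(tier)].vm_id))  # rotate inside the tier
    return plan

tasks = [Task(0, 500), Task(1, 5000), Task(2, 20000)]
vms = [VM(0, 500), VM(1, 1000), VM(2, 4000)]
print(schedule(cluster_tasks(tasks), vms))
```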

Similar publications

Article
Full-text available
With Internet of Things (IoT) technologies developing rapidly, various kinds of IoT devices are connected over the Internet. Therefore, how to meet the requirements of executing IoT applications is becoming a critical issue. Generally speaking, offloading IoT applications to the public cloud is an efficient approach to processing them....

Citations

... (Sambit Kumar Mishra, 2020a). [(Gundu, 2020)] ...
Article
Cloud computing is a new model that permits clients and associations to buy the required services according to their requirements. It is used to upload their data and retrieve data according to their needs over the internet. This model offers several services, such as data storage and easy, convenient web services. In this era, cloud applications provide different services, some of which are Platform-as-a-Service (PaaS), Software-as-a-Service (SaaS), and Infrastructure-as-a-Service (IaaS). Cloud computing improves its services day by day, and clients also demand reliable new services for efficiency and reliability. Load balancing is a major challenge in cloud computing. The technique called load balancing is used to distribute the data load over the cloud network. It is also used to minimize resource usage.
... The existing resource management techniques and taxonomy of cloud computing are discussed in detail [9-13]. These studies in various paradigms need to be brought under one umbrella to compare and generalize the techniques and algorithms used in specific situations. This will help in observing the trade-offs of these algorithms and possible limitations to work on. ...
Article
Full-text available
With the advent of the Internet of Things (IoT) paradigm, the cloud model is unable to offer satisfactory services for latency-sensitive and real-time applications due to high latency and scalability issues. Hence, an emerging computing paradigm named fog/edge computing has evolved to offer services close to the data source and to optimize quality-of-service (QoS) parameters such as latency, scalability, reliability, energy, privacy, and security of data. This article presents the evolution of the computing paradigm from the client-server model to edge computing, along with their objectives and limitations. A state-of-the-art review of Cloud Computing and Cloud of Things (CoT) is presented that addresses the techniques, constraints, limitations, and research challenges. Further, we discuss the role and mechanism of fog/edge computing and Fog of Things (FoT), along with the need for their amalgamation with CoT. We review the architectures, features, applications, and existing research challenges of fog/edge computing. The comprehensive survey of these computing paradigms offers in-depth knowledge about their various aspects, trends, motivation, vision, and integrated architectures. In the end, experimental tools and future research directions are discussed, with the hope that this study will serve as a stepping stone in the field of emerging computing paradigms.
... (4) Machine-type communication in 6G: machine-type communication, which includes both mission-critical and massive-connectivity characteristics, is expected to be a crucial cornerstone of 6G development, driven by a desire to supply vertical-specific wireless network solutions [26]. ...
Article
Full-text available
The exchange of information from one person to another is called communication. Telecommunication makes it possible with electronic devices and their tools. Alexander Graham Bell invented the basic telephone in the USA in 1876. Telephones now take a new form as mobile phones, which are the primary media for communicating and transmitting data. We are currently using 5th-generation mobile network standards, yet users still have requirements that are expected to be met by the 6th-generation mobile network standards. By 2030, people are expected to be using 6G. The cloud computing model does not depend on a location or any specific device to provide the service; it is an on-demand, computational, service-oriented mechanism. Combining these two technologies as mobile cloud computing provides customized options with more flexible implementations. Artificial intelligence is being used in devices in many fields. AI can be used in mobile network services (MNS) to provide more reliable and customized services to users, such as network operation monitoring, network operation management, fraud detection and reduction in mobile transactions, and security for cyber devices. Combining the cloud with AI in mobile network services in the 6th generation would improve human lives, with outcomes such as zero road accidents, advanced specialized health care, and zero crime rates in society. However, the most vital needs for sixth-generation standards are the capability to manage large volumes of data and high-data-rate connectivity per device. The sixth-generation mobile network is under development, and this generation has many exciting features. Security is the central issue, which needs to be sorted out using appropriate forensic mechanisms. There is a need for high-performance computing to provide improved services to the end-user. Considering three-dimensional research methodologies (the technical dimension, the organizational dimension, and applications hosted on the cloud) in a high-performance computing environment leads to two different cases: real-time stream processing, and remote desktop connection and performance testing. By 'narrowing the targeted worldwide audience with a wide range of experiential opportunities,' this paper aims to deliver dynamic and varied resource allocation for reliable and justified on-demand services.
... Different task scheduling algorithms for cloud computing have been proposed by many researchers. The researchers in [5] introduced a SWOT analysis of cloud computing, which can be used in virtually every industry to improve service delivery. The work in [6] empirically compares and offers insight into the performance of some renowned state-of-the-art task scheduling heuristics with respect to throughput, average resource-utilization ratio, and makespan. ...
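Since the comparison above is framed around throughput, average resource-utilization ratio, and makespan, the minimal sketch below shows one common way to compute these metrics from a finished schedule. The per-VM busy-time layout and the assumption that all VMs start at time zero are simplifications, not the cited paper's setup.

```python
# Hypothetical metric computation for a completed schedule (assumed data layout).
def schedule_metrics(vm_busy_time, num_tasks):
    """vm_busy_time: dict mapping VM id -> total busy seconds on that VM (all VMs start at t=0)."""
    makespan = max(vm_busy_time.values())                  # time when the last VM finishes
    throughput = num_tasks / makespan                      # completed tasks per second
    avg_utilization = sum(vm_busy_time.values()) / (len(vm_busy_time) * makespan)
    return {"makespan": makespan,
            "throughput": throughput,
            "avg_resource_utilization": avg_utilization}

print(schedule_metrics({"vm0": 120.0, "vm1": 90.0, "vm2": 150.0}, num_tasks=30))
# -> makespan 150 s, throughput 0.2 tasks/s, utilization (120+90+150)/(3*150) = 0.8
```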
... This new wireless communication will require ultra-reliable, low-latency communication networks. In addition, upcoming devices should support terabit-per-second speeds [14]. This requires much more advancement in the field of electronics. ...
Article
Full-text available
Advancements in computation have driven mankind to new destinations. This sophistication has opened a new space of combinations of multiple technologies and their interconnections. The amalgamation of such interoperability has opened up a wide range of customized services to end-users. The Fifth Generation mobile network standard has just launched in a few countries, such as China, and research on the Sixth Generation standard is underway. People are now much more aware of the cloud and its data centers, artificial intelligence, and machine learning techniques. 6G is an upcoming technology: the latest cellular broadband technology standard, which is going to replace 5G in the coming years. This paper provides some glimpses of 6G.
... In a task scheduling method, tasks are allocated to the best-suited resource for execution. In load balancing, task scheduling is an NP-hard problem: because the number of tasks and their lengths vary rapidly, it is difficult to enumerate the possible mappings of tasks to resources and evaluate an optimal mapping [6], [7]. ...
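To make the size of that mapping space concrete: with n tasks and m VMs there are m^n possible assignments, which is why heuristics are used in practice. The greedy minimum-completion-time rule below is a generic illustration of such a heuristic, not the specific method of [6] or [7].

```python
# Illustrative greedy heuristic for the task-to-VM mapping problem (generic, not from [6] or [7]).
def greedy_min_completion(task_lengths, vm_mips):
    """Assign each task to the VM that would finish it earliest, given current backlogs."""
    finish_time = [0.0] * len(vm_mips)      # current estimated finish time per VM
    mapping = []
    # Placing the longest tasks first tends to balance load better under this rule.
    for tid, length in sorted(enumerate(task_lengths), key=lambda x: -x[1]):
        candidate = [finish_time[v] + length / vm_mips[v] for v in range(len(vm_mips))]
        best_vm = min(range(len(vm_mips)), key=lambda v: candidate[v])
        finish_time[best_vm] = candidate[best_vm]
        mapping.append((tid, best_vm))
    return mapping, max(finish_time)        # mapping plus the resulting makespan estimate

mapping, makespan = greedy_min_completion([4000, 1200, 2500, 800], vm_mips=[1000, 500])
print(mapping, makespan)
```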
Article
Full-text available
In cloud computing, load balancing among resources is required to schedule tasks, which is a key challenge. This paper proposes a dynamic degree memory balanced allocation (D2MBA) algorithm, which allocates a virtual machine (VM) to the best-suited host based on the availability of random-access memory (RAM) and microprocessor without interlocked pipelined stages (MIPS) on the host, and allocates a task to the best-suited VM by considering the balanced condition of the VMs. The proposed D2MBA algorithm has been simulated using the CloudSim simulation tool, varying the number of tasks while keeping the number of VMs constant, and vice versa. The D2MBA algorithm is compared with other load-balancing algorithms, viz. Round Robin (RR) and dynamic degree balance CPU-based (D2B_CPU based), with respect to performance parameters such as execution cost, degree of imbalance, and makespan time. It is found that the D2MBA algorithm achieves a large reduction in execution cost, degree of imbalance, and makespan time compared with the RR and D2B_CPU based algorithms.
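The abstract describes the selection criteria only at a high level, so the sketch below is one plausible reading of the idea: pick the host with the most free RAM and MIPS headroom for VM placement, and route each task to the least-loaded VM. The scoring rules and data layout are assumptions, not the published D2MBA algorithm.

```python
# Loose sketch inspired by the D2MBA description above; the scoring rules are assumptions.
def place_vm(hosts, vm_ram, vm_mips):
    """Choose the host with the largest combined free-RAM and free-MIPS headroom."""
    feasible = [h for h in hosts if h["free_ram"] >= vm_ram and h["free_mips"] >= vm_mips]
    if not feasible:
        return None
    return max(feasible, key=lambda h: h["free_ram"] / vm_ram + h["free_mips"] / vm_mips)

def assign_task(vms, task_length_mi):
    """Choose the VM whose load would stay lowest after taking the task, to keep VMs balanced."""
    return min(vms, key=lambda v: (v["queued_mi"] + task_length_mi) / v["mips"])

hosts = [{"id": 0, "free_ram": 8192, "free_mips": 4000},
         {"id": 1, "free_ram": 4096, "free_mips": 6000}]
vms = [{"id": 0, "mips": 1000, "queued_mi": 3000},
       {"id": 1, "mips": 2000, "queued_mi": 3000}]
print(place_vm(hosts, vm_ram=2048, vm_mips=1000)["id"], assign_task(vms, task_length_mi=500)["id"])
```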
... Azure [2] can be used in several different areas, such as: (1) application development: anybody can create a web application using the Azure platform; (2) testing: after an application has been developed successfully on the platform, it can be tested. ...
... A pipeline is a logical grouping of activities that performs a unit of work. Pipelines (which are data-driven workflows) in Azure Data Factory typically perform the following four steps: (1) connect and collect, (2) transform and enrich, (3) publish, and (4) monitor. ...
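As a generic sketch of that four-step, data-driven workflow, the functions below mirror the connect-and-collect, transform-and-enrich, publish, and monitor stages. This is not the Azure Data Factory API; the stage names and data shapes are illustrative assumptions only.

```python
# Generic four-stage workflow sketch (NOT Azure Data Factory code; names mirror the text above).
def connect_and_collect(sources):
    """Step 1: pull raw records from the configured sources."""
    return [record for source in sources for record in source()]

def transform_and_enrich(records):
    """Step 2: clean and augment the collected records."""
    return [{**r, "valid": r.get("value", 0) >= 0} for r in records]

def publish(records, sink):
    """Step 3: push the processed records to the destination store."""
    sink.extend(records)
    return len(records)

def monitor(published_count, expected_count):
    """Step 4: report whether the pipeline run delivered everything it collected."""
    return {"published": published_count, "expected": expected_count,
            "succeeded": published_count == expected_count}

sink = []
raw = connect_and_collect([lambda: [{"value": 3}, {"value": -1}]])
print(monitor(publish(transform_and_enrich(raw), sink), expected_count=len(raw)))
```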
Article
Full-text available
The new era of computation has given new directions for advancement and sophistication. The meaning of computation has made a paradigm shift from device and location orientation to the distributed level. Storage has become cheaper. Multiple technologies, their interconnections, and their wide range of services have made computing on this planet more sophisticated and more flexible. Service-oriented cloud computing has achieved huge success. Data centers already existed even before the emergence of cloud technology, but the cloud makes these data centers communicate with one another, allocates resources properly, functions appropriately over a network, and uses and maintains these resources properly. Most cloud service providers have their own data centers. 'Azure', the online cloud platform provided by Microsoft, offers cloud services and resources to the end-user. Its pricing mechanism is economical and flexible. One can learn Azure very easily, and Microsoft provides Azure certifications for those who qualify to international standards.
... In the 1980s, the concept of the real-time operating system was introduced. To satisfy the response-time requirements of external events, real-time scheduling is performed [2]. For external events, execution time and response time play an important role. ...
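As a generic illustration of deadline-driven real-time scheduling (not the specific mechanism of the cited work [2]), the sketch below orders pending events by deadline and flags any whose response time would exceed its requirement; all names are hypothetical.

```python
# Generic earliest-deadline-first sketch for response-time-driven scheduling (illustrative only).
def edf_schedule(events, now=0.0):
    """events: list of (name, execution_time, deadline). Returns run order and missed deadlines."""
    order, missed, clock = [], [], now
    for name, exec_time, deadline in sorted(events, key=lambda e: e[2]):   # earliest deadline first
        clock += exec_time                      # the event runs to completion
        order.append(name)
        if clock > deadline:                    # response time exceeded the requirement
            missed.append(name)
    return order, missed

print(edf_schedule([("sensor", 2.0, 5.0), ("actuator", 1.0, 3.0), ("log", 4.0, 20.0)]))
```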
Article
Full-text available
In the present-day scenario, cloud computing is an attractive subject for IT and non-IT personnel. It is a service-oriented, pay-per-use computational model. The cloud has working models with a service-oriented delivery mechanism as well as a deployment-oriented infrastructure mechanism. Data centers are the backbone of cloud computing. Massive public participation has also increased the load on cloud servers, so proper scheduling of resources is always needed, and quality of service is to be provided as per the service-level agreement. The virtualization technique is the main reason behind the huge success of the cloud. Regarding multi-cloud exchanges to optimize connectivity: today, multi-cloud exchanges offer the next level in direct connectivity, allowing organizations to safely and easily expand multi-cloud capabilities. Exchanges eliminate the added worries that the open Internet can bring, as well as the tedious provisioning and configuring that comes with connecting to the public Internet. Importantly, multi-cloud exchanges allow organizations to establish a single connection to multiple cloud providers at the same time through an Ethernet switching platform, rather than wrestling with multiple individual connections to cloud providers.
Article
Full-text available
The article analyzes selected software solutions for balancing traffic in a server cluster: Traefik, HAProxy, and NGINX. For the demonstration, the system model consists of a cluster of three servers connected to management servers running load-balancing solutions that share a public IP address. The application servers use the same database in a multi-master configuration, and the management servers are connected via the BGP protocol. User requests reach the server cluster through redundant traffic-balancing subsystems. After testing the designed system, it was found that HAProxy is the best among the selected load-balancing solutions and ensures high cluster availability. While setting up HAProxy, it is recommended to choose the dynamic Least-Connections load-balancing algorithm. Keywords: server cluster, load balancing, Round Robin, Least Connections, HAProxy
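To make the Round-Robin versus Least-Connections comparison in the abstract concrete, here is a minimal Python sketch of both selection rules. It models only the choice of back-end server, not HAProxy's actual implementation; the server names are invented.

```python
# Minimal sketch of the two selection rules compared above (not HAProxy internals).
import itertools

class RoundRobin:
    def __init__(self, servers):
        self._cycle = itertools.cycle(servers)   # static rotation, ignores current load
    def pick(self, active_connections):
        return next(self._cycle)

class LeastConnections:
    def pick(self, active_connections):
        # Dynamic rule: route to the server currently holding the fewest connections.
        return min(active_connections, key=active_connections.get)

active = {"app1": 12, "app2": 3, "app3": 7}
rr = RoundRobin(list(active))
print(rr.pick(active), LeastConnections().pick(active))   # e.g. app1 vs app2
```

The difference shown here is exactly why a dynamic policy tends to keep cluster utilization more even: Round Robin keeps rotating regardless of load, while Least-Connections reacts to the current connection counts.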
Chapter
Cloud computing, as an advanced technology in the IT infrastructure, is nowadays a major concern of research. It is no longer just a matter of successful on-demand delivery of computing resources: throughput, performance, server response time, and cost have become the metrics that underpin the quality-of-service agreement. Technically, the cloud service provider guarantees to deliver computing resources (storage, servers, and applications) through a back-end data center, which consists of several geographically distributed hosts that answer client requests. To ensure the service-level agreement between clients and providers, the cloud infrastructure software needs to schedule and optimally manage the workload of several demands. Here, load-balancing technology enters as a major key, with a set of algorithms that handle the allocation and scheduling of computational resources as effectively and fairly as possible in order to serve the large number of incoming jobs. This review presents a comparative and comprehensive study that covers the principal concepts of cloud computing and the well-known algorithms used for load balancing, which are classified into static and dynamic sets. The objectives of this survey are to (1) mention, explain, compare, and analyze developed methods for load balancing by systematically reviewing papers from the years 2018 to 2021, (2) analyze the level of maturity of the solutions proposed in the literature, and (3) present an insight into the actual solutions, which may help with future improvements. Keywords: Cloud computing model, Internet-based SW, Load balancing, Virtualization, Host/VM migration, QoS measurements