Fig 1: Remote Data Backup Server and its Architecture

Source publication
Article
Today, in the electronic world, large amounts of data require data recovery services. Cloud computing introduces a new type of computing platform in today's world, and this type of computing generates a large amount of private data on the main cloud. Therefore, the necessity of data recovery services is growing day by day, and it requires...

Context in source publication

Context 1
... cost for implementing the recovery solution, and can easily recover the data after any disaster. The need for backup and recovery techniques in cloud computing therefore arises from the heavy storage demands of its clients. A Remote Data Backup server is a server that stores the main cloud's entire data as a whole and is located at a remote place (far away from the cloud). If the central repository loses its data, the system uses the information held in the remote repository. The purpose is to help clients retrieve information from the remote repository when network connectivity is unavailable or the main cloud is unable to provide the data. As shown in Fig 1, if clients find that data is not available in the central repository, they are allowed to access the files from the remote repository (i.e. indirectly).

Remote backup services should cover the following issues: 1) privacy and ownership; 2) relocation of servers to the cloud; 3) data security; 4) reliability; 5) cost effectiveness; 6) appropriate timing.

Privacy and ownership: Different clients access the cloud with their own logins or after an authentication process, and they are free to upload their private and essential data to the cloud. Hence the privacy and ownership of data must be maintained: only the owner of the data should be able to access his private data and perform read, write, or any other operation on it. The remote server must preserve this privacy and ownership.

Relocation of servers: For data recovery there must be relocation of the server to the cloud. Relocation means transferring the main server's data to another server while the new location remains unknown to the client. Clients get their data in the same way as before, without any intimation of the relocation of the main server, so the scheme provides location transparency of the relocated server to clients and other third parties while data is being shifted to the remote server.

Data security: The client's data is stored in the central repository with complete protection, and the same security must be enforced in the remote repository. There, the data should be fully protected so that no access to, or harm of, the remote cloud's data can occur, either intentionally or unintentionally, by a third party or any other client.

Reliability: The remote cloud must be reliable. In cloud computing the main cloud stores the complete data, and every client depends on it for even the smallest amount of data; therefore both the main cloud and the remote backup cloud must play a trustworthy role. That is, each server must be able to provide data to the client immediately whenever it is required, whether from the main cloud or the remote server.

Cost effectiveness: The cost of implementing the remote server and its recovery and backup technique also plays an important role when designing the structure of the main cloud and its corresponding remote cloud. The cost of establishing the remote setup and implementing its technique must be kept to a minimum, so that small businesses can afford such a system and large businesses spend as little as possible.

Appropriate timing: Data recovery takes some time, since the remote repository is far away from the main cloud and its clients. The retrieval time must therefore be as short as possible, so that clients get their data quickly regardless of how far away the remote repository is. Many techniques have focused on these issues.
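The fallback path just described (clients read from the remote repository only when the central repository cannot serve them) can be illustrated with a minimal Python sketch. This is an assumption-laden illustration, not part of the surveyed paper: the endpoint names, the use of the requests library, and the fetch_file helper are all hypothetical.

```python
import requests  # assumed HTTP client; any equivalent would do

MAIN_CLOUD_URL = "https://main-cloud.example.com"        # hypothetical endpoint
REMOTE_BACKUP_URL = "https://remote-backup.example.com"  # hypothetical endpoint

def fetch_file(path: str, timeout: float = 5.0) -> bytes:
    """Try the central repository first; fall back to the remote backup
    server when the main cloud is unreachable or cannot serve the file."""
    try:
        resp = requests.get(f"{MAIN_CLOUD_URL}/{path}", timeout=timeout)
        resp.raise_for_status()
        return resp.content
    except requests.RequestException:
        # Main cloud down or file unavailable: read indirectly from the
        # remote repository, as in Fig 1.
        resp = requests.get(f"{REMOTE_BACKUP_URL}/{path}", timeout=timeout)
        resp.raise_for_status()
        return resp.content
```

The sketch also makes the timing issue concrete: every fallback pays an extra round trip to a distant server, which is why retrieval time from the remote repository must be kept as low as possible.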
In the forthcoming section, we discuss some recent backup and recovery techniques in the cloud computing domain. In our literature survey, we found many techniques, each with its own unique way of creating backups and performing recovery. Broadly speaking, all of them focus on three aspects: cost control, data duplication, and security. Each technique concentrates fully on its own aim for backup and recovery. Below we detail a few recent techniques, HS-DRT [1], PCS [2], ERGOT [3], Linux Box [5], cold and hot backup [6], SBBR [10], and REN [17], that have addressed the aforesaid issues. HS-DRT [1] is an innovative file backup concept that makes use of an effective ultra-widely distributed data-transfer mechanism and a high-speed encryption technology. It consists of three components: first, the Data Centre, which carries the main functions; second, the supervisory server; and third, various client nodes specified by the administrator. The client nodes are composed of PCs, smart phones, Network Attached Storage and ...
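To make the HS-DRT description above concrete, here is a minimal Python sketch of the scatter/gather idea: encrypt, split into ordered fragments, and spread the fragments over client nodes, with the supervisory server keeping the key and reassembling on recovery. It omits the duplication and scrambling steps of the real scheme, models nodes as plain dictionaries, and uses the Fernet cipher as a stand-in for the paper's high-speed encryption, so treat it as an assumption-heavy illustration only.

```python
from cryptography.fernet import Fernet  # stand-in cipher, not HS-DRT's own

def scatter(data: bytes, nodes: list, frag_size: int = 1024) -> bytes:
    """Encrypt, split into ordered fragments, and distribute them
    round-robin over the client nodes. Returns the key that the
    supervisory server must retain for recovery."""
    key = Fernet.generate_key()
    blob = Fernet(key).encrypt(data)
    for i in range(0, len(blob), frag_size):
        seq = i // frag_size
        nodes[seq % len(nodes)][seq] = blob[i:i + frag_size]
    return key

def gather(nodes: list, key: bytes) -> bytes:
    """Recovery: collect fragments from every node, restore their
    order, reassemble, and decrypt."""
    frags = {}
    for node in nodes:
        frags.update(node)
    blob = b"".join(frags[seq] for seq in sorted(frags))
    return Fernet(key).decrypt(blob)

nodes = [{}, {}, {}]  # three client nodes (PCs, phones, NAS) as dicts
key = scatter(b"backup payload " * 100, nodes)
assert gather(nodes, key) == b"backup payload " * 100
```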

Similar publications

Article
Federated learning (FL) is a collaborative, decentralized, privacy-preserving method to tackle the challenges of storing data and preserving data privacy. Artificial intelligence, machine learning, smart devices, and deep learning have strongly marked recent years. Two challenges arose in data science as a result. First, the regulation protected the data by...
Conference Paper
In light of the trend towards cloud-based applications, privacy enhancing technologies are becoming increasingly important. Searchable encryption (SE) allows to outsource data to the cloud in a secure way, whilst permitting search functionality on that encrypted data: the host is able to perform search queries on behalf of the user, but without hav...
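The division of labour sketched in this abstract (the host matches queries over encrypted data without seeing plaintext) can be illustrated with a toy token-matching scheme in Python. This is a generic deterministic-token illustration, not the paper's construction, and unlike real SE schemes it leaks search patterns; the key and keywords are made up.

```python
import hashlib
import hmac

def keyword_token(key: bytes, word: str) -> bytes:
    """Deterministic token: the host can match it without learning the word."""
    return hmac.new(key, word.encode(), hashlib.sha256).digest()

# client side: index a document under tokens instead of plaintext keywords
key = b"client-secret-key"  # hypothetical client key
index = {keyword_token(key, w) for w in ("backup", "recovery")}

# host side: test a query token against the stored index
def search(index: set, token: bytes) -> bool:
    return token in index

assert search(index, keyword_token(key, "backup"))     # match
assert not search(index, keyword_token(key, "cloud"))  # no match
```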
Article
The advances of cloud computing, fog computing and Internet of Things (IoT) make the industries more prosperous than ever. A wide range of industrial systems such as transportation systems and manufacturing systems have been developed by integrating cloud computing, fog computing and IoT successfully. Security and privacy issues are a major concern...
Article
With the rapid development of mobile communication and the sharp increase of smart mobile devices, wireless data traffic has experienced explosive growth in recent years, thus injecting tremendous traffic into the network. Fog Radio Access Network (F-RAN) is a promising wireless network architecture to accommodate the fast growing data traffic and...
Article
With the rapid growth and adoption of cloud computing, more sensitive information is centralized onto the cloud every day. For protecting this sensitive information, it must be encrypted before being outsourced. Current search schemes allow the user to query encrypted data using keywords, but these schemes do not guarantee the privacy of queries (i...

Citations

... A vital aspect for businesses moving operations to the cloud. Cloud providers are developing solutions for quick recovery from disruptions like natural disasters or cyberattacks [25], [26]. ...
Article
Cloud computing is a method of delivering IT services through a network of interconnected servers, forming a unified system known as the "Cloud." This virtualized ecosystem integrates interconnected networks, servers, applications, storage, and services, enabling users to conveniently access them as needed with minimal administrative intervention. Focusing on virtualization and containerization as foundational pillars, this review article explores their revolutionary impact on resource management and deployment efficiency. It further delves into the impending trends and challenges set to shape the cloud computing landscape from 2025 to 2030. It reviews the anticipated adoption of hybrid and multi-cloud strategies, offering organizations tailored solutions while mitigating risks of vendor lock-in. The rise of edge computing is highlighted as a pivotal solution to address latency issues and foster a competitive IoT ecosystem. Additionally, the integration of AI and machine learning within cloud frameworks is poised to unlock new realms of innovation and optimization, propelling digital transformation forward. The article underscores the imperative for enhanced security measures to safeguard sensitive data and ensure user privacy. Ongoing price competition among cloud providers and increased regulatory scrutiny are also discussed, highlighting the dynamic landscape of cloud computing. The article provides valuable insights into the past, present, and future trajectory of cloud computing, and concludes by affirming the pivotal role of cloud computing in driving digital innovation and empowering organizations to thrive in an increasingly interconnected world, offering recommendations for businesses to leverage emerging technologies and navigate evolving challenges effectively. Keywords: Cloud computing, Hybrid and multi-cloud strategies, IoT ecosystem, AI and machine learning integration, Security measures, etc.

The "Cloud" is a virtualized environment comprising networks, servers, applications, storage, and services, accessible to users on-demand with minimal management involvement. It offers resources and services without requiring users to have in-depth system knowledge, providing a wide range of applications and scalable services tailored to users and businesses [1]. Cloud computing is a computing technique that delivers IT services through interconnected low-cost computing units via IP networks. Emerging from search-engine platform architecture, it possesses five pivotal technical traits: expansive resource capacity, exceptional scalability, shared resource pools encompassing both virtual and physical assets, dynamic resource allocation, and versatile applicability across various purposes [2]. Cloud computing gives services to users, allowing them to store and process data without needing their own hardware. It is a delivery model for computing services that facilitates the real-time development of various services; examples include Google Accounts and Amazon Elastic Compute Cloud (EC2).

Table-1: Types of cloud computing services [3], [4], [5].
1. Public Cloud: Provides open access to services over the internet, managed by dedicated service providers.
2. Infrastructure as a Service (IaaS): Generally offers network, storage, and software systems, replacing traditional data-center functions.
3. Platform as a Service (PaaS): Generally provides virtualized servers for application development and deployment, minimizing server maintenance.
... disaster recovery and backup technologies has been conducted to ensure data security and business sustainability [2,3]. In contrast to the above research, research on location privacy protection technology aims to ensure that the user's location and query cannot be accurately recognized even when the system has been intruded upon by an attacker, which in turn safeguards the user's privacy. ...
Article
With the development of mobile applications, location-based services (LBSs) have been incorporated into people’s daily lives and created huge commercial revenues. However, when using these services, people also face the risk of personal privacy breaches due to the release of location and query content. Many existing location privacy protection schemes with centralized architectures assume that anonymous servers are secure and trustworthy. This assumption is difficult to guarantee in real applications. To solve the problem of relying on the security and trustworthiness of anonymous servers, we propose a Geohash-based location privacy protection scheme for snapshot queries. It is named GLPS. On the user side, GLPS uses Geohash encoding technology to convert the user’s location coordinates into a string code representing a rectangular geographic area. GLPS uses the code as the privacy location to send check-ins and queries to the anonymous server and to avoid the anonymous server gaining the user’s exact location. On the anonymous server side, the scheme takes advantage of Geohash codes’ geospatial gridding capabilities and GL-Tree’s effective location retrieval performance to generate a k-anonymous query set based on user-defined minimum and maximum hidden cells, making it harder for adversaries to pinpoint the user’s location. We experimentally tested the performance of GLPS and compared it with three schemes: Casper, GCasper, and DLS. The experimental results and analyses demonstrate that GLPS has a good performance and privacy protection capability, which resolves the reliance on the security and trustworthiness of anonymous servers. It also resists attacks involving background knowledge, regional centers, homogenization, distribution density, and identity association.
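As a concrete illustration of the Geohash step the abstract relies on (a short code that names a rectangular cell rather than an exact point), here is a minimal encoder sketch in Python. It follows the standard public Geohash algorithm and is not taken from the GLPS implementation; shorter precision values yield larger, more privacy-preserving cells.

```python
BASE32 = "0123456789bcdefghjkmnpqrstuvwxyz"  # standard Geohash alphabet

def geohash(lat: float, lon: float, precision: int = 6) -> str:
    """Encode a point as a Geohash: bits alternately halve the longitude
    and latitude ranges, and every 5 bits become one base-32 character."""
    lat_rng, lon_rng = [-90.0, 90.0], [-180.0, 180.0]
    code, even, bit_count, idx = [], True, 0, 0  # even bit -> longitude
    while len(code) < precision:
        rng, val = (lon_rng, lon) if even else (lat_rng, lat)
        mid = (rng[0] + rng[1]) / 2
        idx <<= 1
        if val >= mid:
            idx |= 1
            rng[0] = mid
        else:
            rng[1] = mid
        even = not even
        bit_count += 1
        if bit_count == 5:  # 5 bits per base-32 character
            code.append(BASE32[idx])
            bit_count, idx = 0, 0
    return "".join(code)

assert geohash(57.64911, 10.40744) == "u4pruy"  # well-known test point
```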
... According to some studies, the relationship between RTO and cost is non-linear. Another paper written by Sharma and Singh [3] says that nowadays, data acts as fuel for different organizations. There is a lot of important information created by companies every day, so it is essential to save this information in a secure place. ...
Conference Paper
Cloud computing is the collection of excellent services (storage, databases, etc.) provided to customers and organizations over the internet to fulfill their requirements efficiently. Everything depends on the internet, and you can access your data anywhere. On the other hand, cloud service clients must ensure that the outsourced backup is secure enough to be entrusted with their data. This research highlights how the cloud has benefited people by providing unlimited, secure storage without any high cost of investment in hardware. Cloud storage also aids in the security of your data backup, allowing you to access your files and documents even if your hardware is lost or damaged; this is also known as disaster recovery. Google Photos and Google Drive are examples of adequately encrypted backup storage. This paper also discusses the core functionalities of Google Photos and Google Drive and their advantages in a more comprehensive form. This research's main aim is to make users aware of data loss and recovery. Furthermore, threats related to modern technology are also part of this research. In addition, the article includes some graphs and figures to explain the results and facts in detail.
... To ensure confidentiality, integrity, and availability of institutional data, it is important to have regular and reliable offsite backups through appropriate processes and procedures, which must be well defined within policies to avoid any information system security incidents. Other researchers, including Sharma and Singh (2012). These disasters should be documented and well studied so that measures are in place to prevent them from repeating in the future, and to give other institutions a chance to learn from them and protect themselves from facing the same challenges (Asgary, 2016). ...
Article
Information and data in our organisations must be protected from illegal access and associated threats. Traditionally, a comprehensive information systems security policy sets the required foundation for protecting data by directing system users on how they must behave and offering necessary controls. With this understanding, the current study determined the quality of security policies adopted by learning institutions in guiding users on the prudent use of information systems. The following institutions were included in the analysis: the Institute of Accountancy Arusha (IAA), the Institute of Finance Management (IFM), the College of Business Education (CBE), the University of Dar es Salaam (UDSM), Ardhi University (ARU), Arusha Technical College (ATC), the Open University of Tanzania (OUT), and the Eastern and Southern African Management Institute (ESAMI). This study was qualitative and descriptive. It compared the key themes of an information security policy with the actual policies of the selected organisations. The purpose was to determine whether these critical policy elements are considered in the sampled policies of learning institutions based in Tanzania. The comparison themes include password management, email use principles, disaster handling and recovery, hardware and software management, and information handling. The study found that higher learning institutions in Tanzania have poor information system security policies. A harmonised policy framework is necessary to improve the quality of policies used in learning institutions in Tanzania.
... Among all backup techniques, Parity Cloud Service (PCS) is the most reliable and cost-effective. It uses a virtual disk in the user's system for data backup and also stores parity information with the data in order to recover from errors (Sharma et al., 2012); a generic parity sketch follows these excerpts. ...
... After this, the data is duplicated and sent to different client nodes for storage, along with the keys required for decryption. For data recovery, the supervisory server gathers the data from all clients, combines it in the proper order, and decrypts it (Sharma et al., 2012). ...
... For secure transmission of data from the cloud to the Linux Box, encryption and a secured channel interface are used between the two. The Linux Box also checks the cloud for data updates and updates its local storage accordingly (Sharma et al., 2012). ...
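The parity sketch promised above: a generic single-erasure XOR parity scheme in Python, illustrating the recovery idea behind PCS without claiming to be PCS itself (which generates parity on a virtual disk). Any one lost block can be rebuilt from the surviving blocks plus the parity block.

```python
def xor_blocks(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def make_parity(blocks: list) -> bytes:
    """Parity block = XOR of all (equal-sized) data blocks."""
    parity = blocks[0]
    for blk in blocks[1:]:
        parity = xor_blocks(parity, blk)
    return parity

def recover(blocks: list, lost: int, parity: bytes) -> bytes:
    """Rebuild the one lost block by XOR-ing the parity with the survivors."""
    rebuilt = parity
    for i, blk in enumerate(blocks):
        if i != lost:
            rebuilt = xor_blocks(rebuilt, blk)
    return rebuilt

blocks = [b"AAAA", b"BBBB", b"CCCC"]
parity = make_parity(blocks)
assert recover(blocks, lost=1, parity=parity) == b"BBBB"
```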
Article
Today, with the use of mobile devices everywhere and with the success of cloud computing, the concept of mobile cloud computing (MCC) has been introduced. MCC is an incorporation of cloud computing into the mobile environment. In MCC, computing resources such as memory, processing, and storage are not actually present at the device of the user. Instead, these resources are moved to a remote location known as the cloud and are owned by some service provider. The user accesses these resources through the internet. The advantages associated with MCC are low initial investment by a user and low cost of operation and maintenance. There are also several problems related to MCC, such as limited battery life, storage and bandwidth, diverse operating systems of mobile devices, and security. Various solutions for MCC problems have been developed, with their own advantages and disadvantages. This paper presents a survey of MCC, its architecture, its problems, existing solutions to those problems, and the advantages and disadvantages of the different solutions.
... In order to address this problem, constructing a remote data backup service is an effective method. It is a server that stores the leading cloud's entire data and is located at a remote place; it is intended to help clients access data from the remote server when a network or connection failure occurs [21]. ...
... When a local system is severely damaged, traditional local-only data backup is insufficient to meet user needs for network service system backup; thus, cloud-based data recovery and security backup technologies must be used [9]. The proposed work has been implemented on AWS and a MacBook Air with an Intel Core i5 processor (1.6 GHz) and 8 GB of RAM. ...
Article
Cloud computing offers several services to its clients in sectors such as education, healthcare, military, government, and sports. One of these services is storage, which keeps a huge amount of electronic data in a cloud database. This type of computing processes massive amounts of private data in the main cloud. As a result, information retrieval solutions that are both effective and efficient are required. When the server has deleted the lost data and cannot transmit any information to the client, the recovery method's main goal is to assist the user in collecting data from each server backup. Users can benefit from cloud computing by storing data, accessing data from anywhere, and retrieving data at any time. As data on the cloud is generated at such a quick rate, speed is one of the most important aspects of cloud computing. It is simple to save files in the cloud and then retrieve them again; the difficult problem is to store and retrieve data from the cloud safely. In this work we propose a way of securely storing data in the cloud using encryption and decryption methods, and we also employ a compression module to compress data before putting it in the cloud.
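The store path this abstract describes (compress, then encrypt before placing data in the cloud) can be sketched as below. The zlib compressor and Fernet cipher are stand-ins chosen for the sketch, not the paper's modules; compression must precede encryption because good ciphertext is effectively incompressible.

```python
import zlib

from cryptography.fernet import Fernet  # assumed crypto dependency

def prepare_for_upload(data: bytes, key: bytes) -> bytes:
    """Compress, then encrypt: the blob to be stored in the cloud."""
    return Fernet(key).encrypt(zlib.compress(data))

def restore_after_download(blob: bytes, key: bytes) -> bytes:
    """Reverse path on retrieval: decrypt, then decompress."""
    return zlib.decompress(Fernet(key).decrypt(blob))

key = Fernet.generate_key()
blob = prepare_for_upload(b"cloud record " * 1000, key)
assert restore_after_download(blob, key) == b"cloud record " * 1000
```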
... Sharma and Singh [11] presented a review of backup techniques as a disaster recovery strategy for cloud computing environments. The backup is one strategy among many that can be implemented to provide disaster recovery capabilities. ...
Chapter
A technology disaster can result in a long-term interruption, a loss of reputation, monetary losses, and the incapacity to continue doing business. Although there is much information regarding disaster recovery and business continuity, there is little research on how firms might use other technology frameworks to help with information technology disaster recovery, given the maturity level of existing disaster recovery processes. There is also an inadequate understanding of organizational recovery from unforeseen events with little or no interruption to business continuity. Disruptions in the business world can happen anywhere and at any moment, and it is hard to predict what will happen and when. It is now a legal requirement for businesses to be prepared for disasters and recovery scenarios. With an ever-increasing reliance on business processes using electronic channels, it has practically become a requirement for any company to build a framework that suits it for Business Continuity planning in case of an IT catastrophe, and this is what the chapter seeks to address. Keywords: Disaster, Recovery, Backup, Continuity, Management, Emergency
... In the literature survey of [3], the authors found many techniques that have their own unique ways of creating backups and performing recovery. We illustrate these techniques in Tab 4. The experimental results of [4] show that many organizations and companies have utilized disaster recovery solutions to minimize the downtime and data loss incurred when catastrophes take place. ...
Article
Context: Digital data is being stored in large quantities in the Cloud, requiring data backup and recovery services. Due to many factors, such as disasters and other disruptive events, the risk of data loss is huge. Therefore, backup and data recovery are essential and effective in improving system availability and maintaining business continuity. Nevertheless, the process of keeping business uninterrupted faces many challenges regarding data security, integrity, and failure prediction. Objective: This paper has the following goals: systematically analyzing the current published research and presenting the most common factors leading to the need for a disaster recovery and backup plan; investigating and identifying the adopted solutions and techniques to prevent data loss; and lastly, investigating the influence data recovery and backup have on business continuity and identifying the privacy and security issues in the disaster recovery process. Method: A systematic mapping study was conducted, in which 45 papers dated from 2010 to 2020 were evaluated. Results: A set of 45 papers was selected from an initial search of 250 papers, including 10 papers from snowball sampling, following the references of some papers of interest. These results are categorized based on the relevant research questions, such as causes of disasters, data loss, business continuity, and security and privacy issues. Conclusion: An overview of the topic is presented by investigating and identifying the following features: challenges, issues, solutions, techniques, factors, and effects regarding the backup and recovery process.
... We did not focus on data recovery here as it is out of the scope of this paper. Many previous works [53] study this problem and their techniques can be integrated in our system as well. In addition, the malicious Cloud Server may also affect the accessibility of the data query process, our recent work [54] proposed a mechanism towards a misbehaving Cloud Server which also can be applied in this centralized setting if necessary. ...
Article
The application of blockchain to Vehicular Edge Computing (VEC) has attracted significant interest. As the Internet of Things plays an essential and fundamental role in data collection, data analysis, and data management in VEC, it is vital to guarantee the security of the data. However, the resource-constrained nature of edge nodes makes it challenging to maintain long-life-cycle IoT data as vast volumes of IoT data quickly accumulate. In this paper, we propose Acce-chain, a storage-elastic blockchain based on the different storage capacities at the edge. Acce-chain supports a re-write operation that replaces a historical block with a newly generated block without breaking the hash links between blocks. As a result, Acce-chain ensures that hot data can be efficiently accessed at the edge without incurring much communication cost or increasing the total size of the chain. To guarantee the security of the re-write process, we propose a new cryptographic primitive named Dynamic Threshold Trapdoor Chameleon Hash (DTTCH). To guarantee the verifiability of query operations, we design a novel storage structure named HybridStore to ensure verifiable queries over on-chain/off-chain IoT data. As a result, Acce-chain achieves both authorized re-writes and verifiable queries simultaneously. We provide security analysis for the DTTCH scheme and the IoT data query algorithms. We evaluate Acce-chain through experiments, and the results show that the performance of the re-write operation is feasible in real-world VEC settings, while query efficiency can be up to several orders of magnitude better than that of the baseline. The results also demonstrate that Acce-chain can provide high service quality for latency-sensitive VEC systems.
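The re-write-without-breaking-links property rests on a chameleon hash. As a rough intuition only, here is the classic discrete-log chameleon hash in Python with toy parameters; it is not DTTCH (which adds dynamic thresholds and trapdoor sharing), and the numbers are educational, far too small for any security.

```python
# Toy chameleon hash over a small prime-order group (educational sizes only).
p, q, g = 23, 11, 4   # q divides p - 1; g generates the order-q subgroup
x = 7                 # trapdoor, held by authorized re-writers
h = pow(g, x, p)      # public key

def chash(m: int, r: int) -> int:
    """H(m, r) = g^m * h^r mod p: collision-resistant without the trapdoor."""
    return (pow(g, m, p) * pow(h, r, p)) % p

def collide(m: int, r: int, m_new: int) -> int:
    """With the trapdoor, choose r_new so the hash of the re-written
    message is unchanged: m + x*r = m_new + x*r_new (mod q)."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r, m_new = 5, 3, 9
r_new = collide(m, r, m_new)
assert chash(m, r) == chash(m_new, r_new)  # hash link to next block intact
```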