Figure 1. Data Migration Layers architecture.
Figure 1 shows the Data Migration Layers architecture. This is the traditional way of migrating data to data marts and data warehouses. However, the data generated by business logic today is no longer traditional data that can simply be stored in data marts and data warehouses: this traditional architecture now generates a big volume of data. Big data offers a huge number of prospects, such as improving business by analysing user behaviour and obtaining more accurate forecasts in scientific research by analysing large numbers of previous results. Along with these opportunities come many challenges in handling big data [1]. The challenges lie mainly in areas such as data capture, data storage, analysis, and representation. To handle these problems we have many powerful tools, such as Hadoop [2]. Migration involves the complete transfer of data from one place to another at minimum cost. The major approaches proposed in the literature follow; first, a rough sketch of the layered flow is given below.
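As a rough, illustrative sketch only of the layered flow in Figure 1 (the record fields, transformation rules, and in-memory source and warehouse are invented stand-ins, not part of the original architecture), a staged extract-transform-load migration in Python might look like this:

```python
# Minimal sketch of a layered data-migration pipeline
# (extract -> transform -> load), mirroring the staged flow of
# Figure 1. All sources and sinks here are in-memory stand-ins.

from typing import Iterable

def extract(source_rows: Iterable[dict]) -> Iterable[dict]:
    """Source layer: pull raw records from the operational system."""
    for row in source_rows:
        yield row

def transform(rows: Iterable[dict]) -> Iterable[dict]:
    """Staging layer: cleanse and conform records for the warehouse."""
    for row in rows:
        yield {
            "customer_id": int(row["id"]),
            "amount_usd": round(float(row["amount"]), 2),
        }

def load(rows: Iterable[dict], warehouse: list) -> int:
    """Warehouse layer: append conformed records to the target store."""
    count = 0
    for row in rows:
        warehouse.append(row)
        count += 1
    return count

if __name__ == "__main__":
    operational_db = [{"id": "1", "amount": "19.999"},
                      {"id": "2", "amount": "5.5"}]
    data_warehouse: list = []
    n = load(transform(extract(operational_db)), data_warehouse)
    print(f"migrated {n} rows -> {data_warehouse}")
```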

Source publication
Article
Big Data Analytics is a big term nowadays. With ever more demanding and scalable data generation capabilities, data acquisition and storage have become crucial issues. Cloud storage is a widely used platform, and the technology will become crucial to executives handling data powered by analytics. Nowadays the trend towards "big data-as-a-service" is...

Context in source publication

Context 1
... of the solution given to avoid the huge data transfer cost comes from Amazon Web Services [3] and its new CloudFront service. Figure 1 shows the Data Migration Layers architecture. ...

Similar publications

Article
Data has become an essential part of every economy, production process, organization, business function and individual. The amount of data in the world is growing day by day because of the use of the internet, smartphones, social networks, the fine tuning of ubiquitous computing and many other technological advancements. The power of Big Data delivers cost reduction, faster,...
Article
Communicating by using information technology in various ways produces large amounts of data. Such data requires processing and storage. The cloud is an online storage model in which data is stored on multiple virtual servers. Big data processing represents a new challenge in computing, especially in cloud computing. Data processing involves data acquis...

Citations

... On the other hand, Fletcher et al. (2019) employed weighted joint likelihoods in their data integration model as a means to highlight data sources according to various criteria (e.g. sample size). The notion of big data integration in the cloud blends data manipulation technologies and cloud computing in a new generation of data analytics platforms (Kune et al. 2016; Manekar and Pradeepini 2017). Users today require new big data integration cloud services, such as data collection from many sources via cloud-deployed APIs. ...
Article
With the rapid expansion of social media users and the ever-increasing data exchange between them, the era of big data has arrived. Integration of big data generates enormous benefits, making it a hotspot for research. However, big data exhibits the heterogeneity brought on by multiple data sources, and big data integration is constrained by multi-source heterogeneous data. Moreover, the rise in the volume of social media data is affecting the efficiency of data integration. This study develops a novel framework for a data integration system that can manage the heterogeneity of massive social media data. The framework comprises four layers: a data source layer, an application layer, a resource layer, and a visualization layer. It establishes correlations between data stored in distributed data sources. We used RESTful APIs to offer end-users reliable and effective web-based access to data using unique queries. The framework was evaluated based on the firsthand impressions of test users, who answered a standardized set of questions after testing real-world inputs.
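The abstract above describes RESTful, web-based access to data integrated from distributed sources. As a loose sketch under assumed details (the /api/users route, the query parameter, and the two in-memory "sources" are illustrative inventions, not the authors' implementation), such an integration endpoint might look like this in Flask:

```python
# Minimal sketch of a RESTful integration endpoint that merges
# records from two heterogeneous in-memory "sources". The route,
# parameter and source contents are illustrative assumptions only.
# Requires: pip install flask

from flask import Flask, jsonify, request

app = Flask(__name__)

# Stand-ins for distributed data sources with different schemas.
SOURCE_A = [{"user": "alice", "posts": 12}, {"user": "bob", "posts": 3}]
SOURCE_B = [{"username": "alice", "followers": 90}]

@app.route("/api/users")
def users():
    """Merge per-user records from both sources into one view."""
    name = request.args.get("name")            # optional filter
    merged: dict[str, dict] = {}
    for rec in SOURCE_A:
        merged.setdefault(rec["user"], {})["posts"] = rec["posts"]
    for rec in SOURCE_B:
        merged.setdefault(rec["username"], {})["followers"] = rec["followers"]
    if name:
        merged = {k: v for k, v in merged.items() if k == name}
    return jsonify(merged)

if __name__ == "__main__":
    app.run(port=5000)
```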
... But all of the information for that is in the system and does not depend on any distributed information on local devices. As shown in several studies, legacy data challenges and their storage during migration are key concerns that project stakeholders have in the planning and design phases of a migration (Gholami et al. 2017; Manekar and Gera 2017). In the direst of circumstances, even a different team familiar with how the platform works in general could pick the work up and continue it after some time of orientation. ...
... In addition, AI solutions can be key to the data protection and security of the critical data that an organization holds, as shown in a recent study (Diener et al. 2016). Collecting and storing big data before starting any migration is a hot topic, and the integration of AI approaches can help in handling data challenges, extracting data from different resources, and designing data lakes in a migration project (Manekar and Gera 2017). ...
Chapter
In a networked world, companies depend on fast and smart decisions, especially when it comes to reacting to external change. With the wealth of data available today, smart decisions can increasingly be based on data analysis and be supported by IT systems that leverage AI. A global pandemic brings external change to an unprecedented level of unpredictability and severity of impact. Resilience therefore becomes an essential factor in most decisions when aiming at making and keeping them smart. In this chapter, we study the characteristics of resilient systems and test them with four use cases in a wide-ranging set of application areas. In all use cases, we highlight how AI can be used for data analysis to make smart decisions and contribute to the resilience of systems.
... From this point of view, for the analysis of the functions in our case, the dataset must conform to the openEHR standard, and the existing information needs to be linked to the generated XML or JSON outputs. Here, the managing computer is expected to maintain and manipulate openEHR data [13]. Recent EMRs and EHRs rely on database management systems based entirely on SQL; as the healthcare setting advances, we face a whole host of constraints associated with the structure of the data, so the use of NoSQL becomes essential in this regard [15,17]. ...
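To make the point above concrete, here is a minimal, assumption-laden sketch (the field names and the dict-backed "document store" are invented; real openEHR archetypes are far richer) of serialising a clinical record to JSON for a NoSQL-style document store:

```python
# Sketch: serialise a clinical record to JSON and keep it in a
# dict-backed document store standing in for a NoSQL database.
# Field names below are illustrative, not the openEHR schema.

import json
import uuid

document_store: dict[str, str] = {}   # doc_id -> JSON document

def save_record(record: dict) -> str:
    """Store one record as a JSON document and return its id."""
    doc_id = str(uuid.uuid4())
    document_store[doc_id] = json.dumps(record)
    return doc_id

record = {
    "patient_id": "p-001",
    "observation": {"code": "blood-pressure", "systolic": 120, "diastolic": 80},
}
doc_id = save_record(record)
print(doc_id, json.loads(document_store[doc_id])["observation"]["code"])
```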
Conference Paper
The adoption of electronic health records has recently grown at a remarkable pace: health bodies now store, manage and route their data electronically. However, the existence of such vast and detailed record sets poses new challenges and problems for health professionals. Although the most essential aim of electronic health documentation is to extract valuable population-level information from healthcare workflow data, practitioners face multiple structures and steps, which clearly discourages wider use. We identify the appropriate adaptation of analytical tools to electronic health records in order to improve their use by health professionals. A case study is presented on a deployment approach for EHRs based on openEHR and on the adoption of health analytics.
... Finally, in the experiment section, experimental results are discussed with enhanced performance. A recent study shows that 2.5 QB (quintillion bytes) of data are generated every day. Such bulk data is complex to capture, store, search, produce and analyse using a traditional database system [3,4]. Ninety percent of today's data was created within just the last couple of years. ...
... An ANN framework was used in [3]; it reduces the infrastructure cost and also offers speedy delivery of results. However, QoS requirements were not considered. ...
Chapter
Data deduplication is a trending and flexible data compression mechanism for reducing the storage space and data in the cloud. In this mechanism, the entire data in a file is divided into a number of small chunks, and a unique hash code is generated for each during the file uploading process and stored in the cloud for later comparison. Along with file storage, securing the file is also an important task, which is done here with the enhanced convergent encryption technique proposed in this paper. Traditional encryption algorithms do not support deduplication of outsourced files, because different algorithms and keys generate different cipher texts even when the contents of the files are identical, so the deduplication process cannot reduce the data efficiently. The proposed paper targets this crucial cloud-storage functionality so that encrypted files from different entities can still save bandwidth and storage overhead. With the advancement of communication trends, data sharing on social media is increasing day by day. To share data securely in the cloud, one popular mechanism is "attribute-based encryption", which allows users holding common credentials to access the shared files. In this paper, attribute-based encryption is performed by generating a larger number of convergent keys to share data across multiple entities such as mobiles, desktops, tablets, and other cloud servers.
Keywords: Enhanced convergent encryption technique; Cipher text; Attribute-based encryption; Deduplication; Cloud storage; Hash values
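As a minimal sketch of the chunk-and-hash scheme this abstract describes (the 4 KiB chunk size and SHA-256 are illustrative choices, and a real system would encrypt each chunk with AES under its convergent key rather than store it in plaintext), deduplicated upload could look like this:

```python
# Sketch of chunk-level deduplication with convergent keying:
# identical chunks hash to the same id, so each unique chunk is
# stored once, and a key derived from the content is identical
# for identical plaintexts, whoever uploads them. SHA-256 and the
# 4 KiB chunk size are illustrative choices; a real system would
# store AES(chunk, convergent_key) instead of the raw chunk.

import hashlib

CHUNK_SIZE = 4096
chunk_store: dict[str, bytes] = {}    # chunk_id -> stored chunk

def convergent_key(chunk: bytes) -> bytes:
    """Key derived from content: equal plaintexts -> equal keys."""
    return hashlib.sha256(b"key|" + chunk).digest()

def upload(data: bytes) -> list[str]:
    """Split into fixed-size chunks; store each unique chunk once."""
    recipe = []                       # ordered chunk ids rebuild the file
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        chunk_id = hashlib.sha256(chunk).hexdigest()
        if chunk_id not in chunk_store:           # otherwise: dedup hit
            chunk_store[chunk_id] = chunk         # would be the ciphertext
        recipe.append(chunk_id)
    return recipe

file_a = b"A" * 8192                  # two identical chunks
file_b = b"A" * 4096 + b"B" * 4096    # shares its first chunk with file_a
upload(file_a)
upload(file_b)
print(len(chunk_store), "unique chunks stored")   # 2, although 4 were sent
assert convergent_key(b"A" * 4096) == convergent_key(b"A" * 4096)
```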
Chapter
Cloud computing is among the most powerful and in-demand technologies for businesses in this decade. "Data is the future oil" can be proved in many ways, as most business and corporate giants are very concerned about their business data. To accommodate and process this data, a very expensive platform that works efficiently is required. Researchers and many professionals have proved and standardized some cloud computing standards, but some modifications, and major research toward big data processing in multi-cloud infrastructure, still need to be investigated. Reliance on a single cloud provider is challenging with respect to services such as latency, QoS and monetary costs that application providers cannot afford. We propose an effective deadline-aware resource management scheme built on novel algorithms for job tracking, resource estimation and resource allocation. In this paper we discuss two of these algorithms in detail and run experiments in a multi-cloud environment: first we evaluate the job tracking algorithm, and then the resource estimation algorithm. Utilizing multiple cloud service providers is a promising route to an affordable class of services and QoS.
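The chapter's own job-tracking and resource-estimation algorithms are not reproduced here; as a generic, hedged sketch of deadline-aware estimation across providers (all numbers, provider names and the feasibility rule are invented), one might compute the VMs needed to meet a deadline and then pick the cheapest provider able to supply them:

```python
# Generic sketch of deadline-aware resource estimation across
# multiple cloud providers: estimate the VMs needed to finish the
# remaining work by the deadline, then pick the cheapest provider
# that can supply them. Numbers and provider names are invented.

import math

def vms_needed(remaining_tasks: int, tasks_per_vm_hour: float,
               hours_to_deadline: float) -> int:
    """VMs required so the remaining work finishes by the deadline."""
    capacity_per_vm = tasks_per_vm_hour * hours_to_deadline
    return math.ceil(remaining_tasks / capacity_per_vm)

def cheapest_provider(need: int, offers: dict) -> str:
    """offers maps name -> (available_vms, price_per_vm_hour)."""
    feasible = {n: p for n, (avail, p) in offers.items() if avail >= need}
    if not feasible:
        raise RuntimeError("deadline not achievable with any provider")
    return min(feasible, key=feasible.get)

need = vms_needed(remaining_tasks=900, tasks_per_vm_hour=30,
                  hours_to_deadline=2.0)                # -> 15 VMs
offers = {"cloudA": (20, 0.12), "cloudB": (10, 0.08)}   # hypothetical
print(need, cheapest_provider(need, offers))            # 15 cloudA
```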
Chapter
The complexity in which supply chain interactions are framed is one of the main difficulties for organizational management. The amount of structured and unstructured data emerging from these interactions increases exponentially and is constantly and rapidly transformed; its usefulness lies in the integrity with which it is handled and the speed with which it is processed. This article proposes a new archetype for the supply chain based on decentralized information management, whose operation rests on managing Big Data through the application of Blockchain technologies. It is concluded that the technological gap to reach this new archetype can only be closed by redefining the way transactions are carried out, integrating Big Data management with the Blockchain application.