Access control mechanism.

Source publication
Article
Full-text available
In the era of big data, modern data marketplaces have received much attention, as they allow not only large enterprises but also individuals to trade their data. This new paradigm makes the data prone to various threats, including piracy, illegal reselling, tampering, illegal redistribution, false ownership claims, forgery, theft, misappropriation, etc....

Context in source publication

Context 1
... data owners have the complete right to decide and grant access permission to buyers. The access control mechanism is depicted in Algorithm 4 and shown pictorially in Figure 6. Let us discuss it in detail. ...
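
The referenced Algorithm 4 and Figure 6 are not reproduced on this page. Purely as a hedged illustration of the idea in this excerpt (owners deciding who may access their data), here is a minimal Python sketch; all names (AccessControl, register, grant, revoke, can_access) are hypothetical and not taken from the article.

```python
# Hypothetical sketch only; the article's Algorithm 4 is not reproduced here.

class AccessControl:
    """Owner-controlled access-permission table for traded datasets."""

    def __init__(self):
        self._owners = {}   # dataset_id -> owner identity
        self._grants = {}   # dataset_id -> set of buyers granted access

    def register(self, dataset_id: str, owner: str) -> None:
        """A data owner registers a dataset under their identity."""
        self._owners[dataset_id] = owner
        self._grants[dataset_id] = set()

    def grant(self, dataset_id: str, caller: str, buyer: str) -> None:
        """Only the registered owner may grant a buyer access."""
        if self._owners.get(dataset_id) != caller:
            raise PermissionError("only the data owner can grant access")
        self._grants[dataset_id].add(buyer)

    def revoke(self, dataset_id: str, caller: str, buyer: str) -> None:
        """Only the registered owner may revoke a buyer's access."""
        if self._owners.get(dataset_id) != caller:
            raise PermissionError("only the data owner can revoke access")
        self._grants[dataset_id].discard(buyer)

    def can_access(self, dataset_id: str, buyer: str) -> bool:
        """Check performed before releasing the data to a buyer."""
        return buyer in self._grants.get(dataset_id, set())
```

In the article's setting this bookkeeping would presumably live in a smart contract rather than an in-memory object, with grant and revoke issued as owner-signed transactions.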

Similar publications

Preprint
Full-text available
People are capable of manipulating and copying digital images, so intellectual property must be protected and secured against piracy and illegal copies. To safeguard the intellectual property of digital images, digital watermarking is used as a solution. The watermarking technique is prevalent as an effective copyright protection method, i.e., only...
Article
Full-text available
Nowadays, with the rapid development of advanced technologies, illegal copies of digital documents can be easily generated. The Portable Document Format (PDF) is the most common and widely used text document format on the internet. The copyright protection of these documents is a challenging task. Advanced techniques have been proposed in the past but have not...
Article
Full-text available
In the current digital information age, the PDF417 code has become an indispensable tool in daily life because it is extensively used. With the universal adoption of PDF417 code applications, mobile code scanning has become an integral part of people's lives. Before scanning a PDF417 code, it is extremely important to confirm the security of the two-dime...
Article
Full-text available
This article delves into the intricate field of digital steganography, a pivotal method for ensuring the confidentiality and integrity of hidden data within digital files. As digital communication continues to evolve, the necessity for robust security measures to protect sensitive information has never been more critical. This research primarily fo...
Article
Full-text available
Data hiding, also known as information hiding and digital watermarking, refers to the technology of hiding secret information in publicly available media and making it difficult for people to perceive its existence. According to whether the hidden information survives when the stego media is processed, this technology can be divided into robust,...

Citations

... Some evidence on the performance of these prototypes has been reported, but only for emulated scenarios [16] with no real blockchain networks. In some cases, authors have replicated the behavior of blockchain networks using programming interfaces [17], but still no real information is provided about how these marketplaces would behave in a real ecosystem. ...
Article
Full-text available
The data economy has arisen in most developed countries. Instruments and tools to extract knowledge and value from large collections of data are now available and enable new industries, business models, and jobs. However, the current data market is asymmetric and prevents companies from competing fairly. On the one hand, only very specialized digital organizations can manage complex data technologies such as Artificial Intelligence and obtain great benefits from third-party data at a very reduced cost. On the other hand, datasets are produced by regular companies as valueless sub-products that assume great costs. These companies have no mechanisms to negotiate a fair distribution of the benefits derived from their industrial data, which are often transferred for free. Therefore, new digital data-driven marketplaces must be enabled to facilitate fair data trading among all industrial agents. In this paper, we propose a blockchain-enabled solution to monetize industrial data. Industries can upload their data to an Inter-Planetary File System (IPFS) using a web interface, where the data are randomized through a privacy-preserving algorithm. In parallel, a blockchain network creates a Non-Fungible Token (NFT) to represent the dataset. So, only the NFT owner can obtain the required seed to derandomize and extract all data from the IPFS. Data trading is then represented by NFT trading and is based on fungible tokens, so it is easier to adapt prices to the real economy. Auctions and purchases are also managed through a common web interface. Experimental validation based on a pilot deployment is conducted. The results show a significant improvement in the data transactions and quality of experience of industrial agents.
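
The abstract above does not specify the privacy-preserving randomization algorithm, only that a seed held by the NFT owner inverts it. As a hedged sketch of that seed-based round trip, the following Python snippet uses a hash-chained XOR keystream purely as a stand-in for the unspecified algorithm; keystream and randomize are hypothetical names.

```python
import hashlib

def keystream(seed: bytes, n: int) -> bytes:
    """Derive n pseudo-random bytes from a seed by hash chaining."""
    out, block = b"", seed
    while len(out) < n:
        block = hashlib.sha256(block).digest()
        out += block
    return out[:n]

def randomize(data: bytes, seed: bytes) -> bytes:
    """XOR the data with a seed-derived keystream before IPFS upload.
    The same call inverts the transform, so the NFT owner who holds
    the seed can derandomize the downloaded bytes."""
    ks = keystream(seed, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

# Round trip: only a holder of `seed` recovers the original data.
seed = b"seed-held-by-the-NFT-owner"
blob = randomize(b"industrial dataset", seed)
assert randomize(blob, seed) == b"industrial dataset"
```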
... Reference [14] argues that transactional data is becoming increasingly popular as a commodity and shows through research studies that the current data market is dynamic and that the data in the market is evolving toward higher quality. Reference [15] proposes a novel big data watermarking technique that provides transparent, immutable audit trails for data movement in big data scenarios, addressing the threats of piracy, illegal resale, and tampering faced by modern data markets. This paper first compares the differences between traditional factors of production and data production factors, summarizes the six basic economic characteristics of data factors, and studies the market-oriented classification of metadata, comparing the similarities and differences of business, technical, management, security, and audit metadata. ...
Article
Full-text available
Studying regional differences in the market-based allocation of data factors can help promote the development of data factor markets in China. This paper establishes a market-based allocation framework for data factors and analyzes the economic characteristics and market-based classification of data factors to explain the framework. On this basis, a measurement model of market-based allocation efficiency for data factors is established based on network DEA with additional intermediate inputs, and is used to calculate the stage-level and overall market-based allocation efficiency of the 30 Chinese provinces and cities with available data from 2019 to 2020; the Malmquist index, which reflects inter-period dynamic changes, is also analyzed. The measured average values of market-based allocation efficiency of data factors in 2020 in the eastern, central, and western regions of China are 0.714, 0.515, and 0.362, all exceeding the 2019 averages and improving by 12.43%, 15.67%, and 17.38% year-on-year, respectively. The average market-based allocation efficiency of data factors in 2019-2020 is 0.80 or above in six provincial administrative districts. In 2020, 19 provinces have a Malmquist index of market-based allocation of data factors greater than 1, accounting for 63.37%. In the era of the digital economy, provinces should actively formulate high-standard plans for cultivating and developing data factor markets to ensure that data factors become basic and strategic resources for each province, empowering high-quality economic development.
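
For readers unfamiliar with the Malmquist index mentioned above, the standard definition is the geometric mean of two frontier-relative efficiency ratios, with a value above 1 indicating inter-period productivity improvement. The sketch below shows only that textbook form; the paper's actual model is a network DEA with additional intermediate inputs, which is not reproduced here.

```python
from math import sqrt

def malmquist(e_t_t, e_t_t1, e_t1_t, e_t1_t1):
    """Standard Malmquist productivity index between periods t and t+1.
    e_a_b = efficiency of period-b inputs/outputs measured against the
    period-a frontier. A result > 1 indicates productivity improvement."""
    return sqrt((e_t_t1 / e_t_t) * (e_t1_t1 / e_t1_t))

# Toy example: a region whose efficiency improves across periods.
print(malmquist(e_t_t=0.515, e_t_t1=0.58, e_t1_t=0.50, e_t1_t1=0.60))  # > 1
```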
... Dai [20] proposed a data trading ecosystem based on blockchain technology and conducted an in-depth analysis and discussion of how the specific process of data trading is managed and controlled. Sahoo et al. [21] studied the watermarking problem in the context of big data trading based on the Ethereum blockchain. Xiang [22] put forward a decentralized data trading scheme based on smart contracts and digital watermarks and designed algorithms to prevent data resale and ensure the fairness of data sharing. ...
Article
Full-text available
With the growth of the data trading market, risks related to identity authentication and authority management are intensifying. Aiming at the problems of centralized identity authentication, dynamically changing identities, and ambiguous trading authority in data trading, a two-factor dynamic identity authentication scheme for data trading based on a consortium (alliance) chain, BTDA, is proposed. First, the use of identity certificates is simplified to address the problems of heavy computation and difficult storage. Second, a two-factor dynamic authentication strategy is designed that uses a distributed ledger to achieve dynamic identity authentication throughout the data trading process. Finally, a simulation experiment is carried out on the proposed scheme. Theoretical comparison and analysis with similar schemes show that the proposed scheme has lower cost, higher authentication efficiency and security, and easier authority management, and can be widely used in various data trading scenarios.
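
The BTDA construction itself is not detailed on this page. As a hedged, purely illustrative Python sketch of the two-factor idea (a long-lived credential plus a per-trade dynamic token, with successful authentications appended to a shared ledger), the following uses hypothetical names (dynamic_token, authenticate) and a plain list as a stand-in for the distributed ledger.

```python
import hashlib
import hmac
import time

# Hypothetical sketch only: a long-lived credential (factor 1) plus a
# per-trade dynamic token (factor 2), with each successful check
# appended to a shared ledger. A plain list stands in for the
# distributed ledger used in the BTDA scheme.
ledger = []

def dynamic_token(shared_key: bytes, trade_id: str) -> str:
    """Factor 2: a fresh HMAC bound to one specific trade."""
    return hmac.new(shared_key, trade_id.encode(), hashlib.sha256).hexdigest()

def authenticate(user: str, password: str, stored_hash: str,
                 shared_key: bytes, trade_id: str, presented: str) -> bool:
    """Both factors must pass; successes are logged to the ledger."""
    factor1 = hashlib.sha256(password.encode()).hexdigest() == stored_hash
    factor2 = hmac.compare_digest(presented, dynamic_token(shared_key, trade_id))
    if factor1 and factor2:
        ledger.append({"user": user, "trade": trade_id, "ts": time.time()})
        return True
    return False

# Example: a buyer authenticates for one trade.
key = b"per-user-shared-key"
stored = hashlib.sha256(b"buyer-password").hexdigest()
token = dynamic_token(key, "trade-42")
assert authenticate("buyer-1", "buyer-password", stored, key, "trade-42", token)
```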
... Its decentralization improves transparency without the need for third-party involvement in data sharing [130][131][132][133]. Meanwhile, quantum encryption, while still in its early stages of development, promises to take security to a higher level [134]. The Health Sector Cybersecurity Coordination Center, an HHS body, is already recommending the development of a working group to evaluate the posture and long-term objectives of the HHS towards quantum cryptography. ...
Research
Full-text available
As technology advances rapidly, health systems are finding new and innovative ways to deploy digital solutions to address health challenges. The integration of digital health has demonstrated benefits in terms of population health and improved efficiency in the delivery of care; however, in some cases, it has also exposed and exacerbated various inequities that have long existed in health systems and wider society. Adverse health outcomes have disproportionately impacted at-risk populations as a result of health inequities. Digital health can play an important role in addressing these inequities. A new Economist Impact program, The Intersection of Digital Health and Equity, aims to define the implications, gaps and opportunities at the intersection of digital health and health equity as a critical component of progress toward effective, high-value care for all. We bring together an evidence-based program with valuable insights from industry experts. The analysis of this report is underpinned by four core pillars of digital health equity: 1) empowerment and access, 2) accountability and justice, 3) community and leadership, 4) metrics. Although more attention is being paid to creating a more equitable health system, much work still needs to be done. Our study analysis finds that: 1) It is essential to increase the diverse representation of underserved populations in positions of decision-making, ranging from healthcare executives to community health leaders. 2) Inclusion of underserved groups and respective community leaders in the technology design phase is critical to increasing engagement. 3) The Health Insurance Portability and Accountability Act of 1996 (HIPAA) needs to be revisited and policymakers should consider supporting legislation that focuses on expanding access and enhancing protections for underserved populations, such as the Digital Equity Act of 2021. Further research is necessary to quantify the direct impacts of improvements in digital health equity on each social determinant of health. This project was conducted and published by Economist Impact and supported by the Veterans Health Administration Innovation Ecosystem. Report available at: https://impact.economist.com/perspectives/health/intersection-digital-health-and-equity
... This provides a secure and trusted linkage between the data and its modification or ownership history. This approach has been explored in some recent works [18]-[20] that share a common interest in improving data ownership traceability. However, these solutions do not address issues such as ambiguous data ownership, undisclosed data reselling, and the dispersal of data ownership across multiple marketplaces. ...
... In [18], the authors propose a watermarking technique for big data by leveraging the power of blockchain technology and smart contracts. An owner stores the watermarked data in the IPFS and registers it to the system. ...
... A comparison of the most relevant works [18]-[20], [27] is summarized in Table I. This comparison demonstrates the need for a holistic architecture that can address complex requirements to provide effective data ownership traceability in the multi-marketplace data trading scenario. ...
Article
Today massive amounts of data are generated from Internet-of-Things (IoT) sensors that can be streamed in real-time and utilized for building valuable services. As the demand for data sharing has increased, a new business model of data marketplace has emerged that allows individuals to sell their data to buyers for monetary gain. However, these data marketplaces are prone to various threats such as unauthorized data redistribution/reselling, tampering of data, dishonest data ownership claims, and trade of bogus data. The existing solutions related to data ownership traceability are unable to address the above issues due to ambiguous data ownership, undisclosed data reselling, and dispersal of data ownership across multiple marketplaces. In order to solve the above problems, we propose a novel blockchain framework, TrailChain, that uses watermarking to generate a trusted trade trail for tracking the data ownership spanning across multiple decentralized marketplaces. Our solution includes mechanisms for detecting any unauthorized data reselling within and across marketplaces. We also propose a fair resell payment sharing scheme that ensures the resell revenue is shared with the data owners over authorized reselling. We present a prototype implementation of the system using Ethereum. We perform extensive simulations to demonstrate TrailChain’s feasibility by benchmarking performance metrics including execution gas costs, execution time, latency and throughput.
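
TrailChain's Ethereum contracts are not reproduced here; as a hedged illustration of the trade-trail idea in the abstract (tracking ownership across marketplaces and rejecting unauthorized reselling), a minimal in-memory Python sketch follows, with hypothetical names (TradeTrail, register, sell).

```python
# Minimal sketch of a cross-marketplace trade trail, in the spirit of
# the TrailChain idea above. Hypothetical structure, not the paper's
# Ethereum contracts.

class TradeTrail:
    def __init__(self):
        self._trail = {}  # data_id -> list of (seller, buyer, marketplace)

    def register(self, data_id: str, owner: str, marketplace: str) -> None:
        """Record the original owner as the start of the trail."""
        self._trail[data_id] = [(None, owner, marketplace)]

    def current_owner(self, data_id: str) -> str:
        """The buyer of the most recent trade owns the data."""
        return self._trail[data_id][-1][1]

    def sell(self, data_id: str, seller: str, buyer: str,
             marketplace: str) -> None:
        """Append a trade; reject a resell by anyone who is not the
        current owner -- the unauthorized-reselling check."""
        if seller != self.current_owner(data_id):
            raise PermissionError(f"{seller} does not own {data_id}")
        self._trail[data_id].append((seller, buyer, marketplace))
```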
... A so-called escrow contract can be used, in which a third-party escrow agent receives, holds, and disburses assets according to a predefined contract or agreement on behalf of the agreement participants [125]. This approach can be adapted to smart contracts, where an autonomous clearinghouse receives and holds the payment for a data asset, or a token granting access to a data product, and makes sure these are exchanged according to the agreement between the data provider and the data consumer [70], [115], [126]. In some data markets, the data itself is even stored on a blockchain in encrypted form [90], [127]. ...
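
As a hedged sketch of the escrow/clearinghouse pattern just described, the following Python snippet models an agent that holds the buyer's payment and the seller's access token and releases both only once the agreement is fulfilled; Escrow and its methods are hypothetical names, not an API from the cited works.

```python
# Hedged sketch of the escrow/clearinghouse pattern: the agent holds
# the buyer's payment and the seller's access token, and releases both
# only when both deposits satisfy the agreement.

class Escrow:
    def __init__(self, seller: str, buyer: str, price: int):
        self.seller, self.buyer, self.price = seller, buyer, price
        self.payment = 0     # funds deposited by the buyer
        self.token = None    # access token deposited by the seller

    def deposit_payment(self, who: str, amount: int) -> None:
        """Only the buyer's deposits count toward the price."""
        if who == self.buyer:
            self.payment += amount

    def deposit_token(self, who: str, token: str) -> None:
        """Only the seller may deposit the access token."""
        if who == self.seller:
            self.token = token

    def settle(self):
        """Atomic swap: pay the seller and hand the token to the buyer,
        but only once both sides have fulfilled the agreement."""
        if self.payment >= self.price and self.token is not None:
            return {"to_seller": self.payment, "to_buyer": self.token}
        raise RuntimeError("agreement not yet fulfilled; nothing released")
```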
... Aspects from the service-level facet can be readily monitored and logged on the blockchain so that other data consumers can see the historical performance of the data product [70], [128]. The content- and context-based facets can be stored on the blockchain by asking previous data consumers to write reviews, which are then stored on the blockchain [91], [126]. ...
... Firstly, manual data brokers search for highly specialised data products that may or may not be publicly offered on the data market registry [8], [35] or, alternatively, for requests for data products that they can provide [130], [131]. Secondly, the most common manual actors in the literature review are the data transformers, who create new data products from existing ones; either by adding new value or insights, standardising data products or by aggregating different data products into new ones [35], [84], [126]. Finally, manual data quality assessment relies on human actors to assess the quality of data products in scenarios where the data quality is hard to address in a standardised way [97], [101]. ...
Article
Full-text available
Data markets are platforms that provide the necessary infrastructure and services to facilitate the exchange of data products between data providers and data consumers from different environments. Over the last decade, many data markets have sprung up, capitalising on the increased appreciation of the value of data and catering to different domains. In this work, we analyse the existing body of scientific literature on data markets to provide the first comprehensive overview of research into the design of data markets, regardless of scientific background or application domain. In doing so, we contribute to the field in several ways: 1) We present an overview of the state of the art in academic research on data markets and compare this with existing market trends to identify potential gaps. 2) We identify important application domains and contexts where data markets are being put into practice. 3) Finally, we provide taxonomies of both design problems for data markets and the solutions that are being investigated to address them. We conclude our work by identifying common types of data markets and corresponding best practices for designing them. The outcome of this work is intended to serve as a starting point for software architects and engineers looking to design data markets.
... Fast advancement of vector map copyright protection techniques has been witnessed in recent decades; these techniques can be divided into two types: accountability and precaution. Accountability includes digital watermarking [22][23][24], digital fingerprinting [25][26][27], and blockchain [28][29][30]. Digital watermarking is used to identify the copyright of the vector map, digital fingerprinting performs well in tracing the original pirate, and blockchain integrates several core technologies, such as cryptographic hashes, digital signatures (based on asymmetric cryptography), and distributed consensus mechanisms, and can be applied to protect data copyright and manage patents [31]. ...
Article
Full-text available
Encryption of vector maps, used for copyright protection, is of importance to the geographic information sciences community. However, some studies adopt a one-to-one mapping to scramble vertices and permute the coordinates one by one according to their positions in the plain map. An attacker can easily obtain the key values by analyzing the relationship between the cipher vector map and the plain vector map, rendering the scrambling operation ineffective. To solve this problem, a vector map encryption algorithm based on a double random position permutation strategy is proposed in this paper. First, the secret key sequence is generated using a four-dimensional quadratic autonomous hyperchaotic system. Then, all coordinates of the vector map are encrypted using the strategy of double random position permutation. Lastly, the encrypted coordinates are reorganized according to the vector map structure to obtain the cipher map. Experimental results show that: (1) one-to-one mapping between the plain vector map and the cipher vector map is prevented; (2) scrambling encryption between different map objects is achieved; (3) attackers cannot obtain the permutation key value by analyzing pairs of plain and cipher maps.
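
As a hedged illustration of the double random position permutation described above, the Python sketch below scrambles coordinate positions with two successive key-driven permutations; a seeded PRNG stands in for the paper's four-dimensional quadratic autonomous hyperchaotic key-sequence generator, and all function names are hypothetical.

```python
import random

# Hedged sketch of coordinate scrambling by double random position
# permutation. A seeded PRNG stands in for the paper's hyperchaotic
# key-sequence generator; the cipher position of a vertex depends only
# on the key, not on the coordinate values.

def double_permute(coords: list, key: int) -> list:
    """Apply two successive key-driven position permutations."""
    rng = random.Random(key)
    n = len(coords)
    p1 = list(range(n)); rng.shuffle(p1)
    p2 = list(range(n)); rng.shuffle(p2)
    once = [coords[p1[i]] for i in range(n)]   # first permutation
    return [once[p2[i]] for i in range(n)]     # second permutation

def double_unpermute(scrambled: list, key: int) -> list:
    """Invert both permutations by regenerating them from the key."""
    rng = random.Random(key)
    n = len(scrambled)
    p1 = list(range(n)); rng.shuffle(p1)
    p2 = list(range(n)); rng.shuffle(p2)
    once = [None] * n
    for i in range(n):
        once[p2[i]] = scrambled[i]   # undo second permutation
    plain = [None] * n
    for i in range(n):
        plain[p1[i]] = once[i]       # undo first permutation
    return plain

# Round trip over a toy set of vertices.
pts = [(0.0, 0.0), (1.5, 2.0), (3.1, 4.2), (5.0, 6.3)]
assert double_unpermute(double_permute(pts, key=1234), key=1234) == pts
```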
... However, they do not consider using the blockchain to reduce the role of the TTP. Sahoo et al. [18] introduced a blockchain-based technique that uses the idea of controlling the number of users who can access the underlying data. However, this scheme does not itself focus on making the watermarking robust against malicious attacks. ...
... On the other hand, our proposed technique itself uses blockchain technology to enhance the security of the database in a way that helps to: (i) identify the owner of a particular piece of data in case more than one user has watermarked the same data; and (ii) facilitate the version control process by using a different watermark when the data is updated by the insertion of a new block of data. Similarly, the technique presented in [18] recommends controlling the number of users for robustness of the scheme. However, our proposed scheme does not impose any limitation on the number of users and is still robust. ...
Article
With the widespread use of relational databases in various real-life applications, maintaining integrity and providing copyright protection are attracting keen interest from researchers. For this purpose, watermarking has been used for quite a long time. Watermarking requires a trusted third party and a mechanism to extract digital signatures (watermarks) to prove the ownership of data under dispute. This is often inefficient, as much processing is required. Moreover, certain malicious attacks, such as additive attacks, can give rise to a situation in which more than one party can claim ownership of the same data by inserting and detecting their own set of watermarks in the same data. To solve this problem, we propose to use blockchain technology, as the trusted third party, along with watermarking to provide a means of rights protection for relational databases. Writing the copyright information to a blockchain alongside watermarking helps secure the watermark, as changing the blockchain is very difficult. In this way, we combine the resilience of our watermarking scheme with the strength of blockchain technology in protecting digital rights information from alteration, to design and implement a robust scheme for digital rights protection of relational databases. We also discuss how the proposed scheme can be used for version control. The proposed technique works with nonnumeric features of the relational database and, unlike most existing techniques, does not target only selected tuples or a portion (subset) of the database for watermark embedding; as a result, the chance of a selected subset containing no watermark decreases automatically. The proposed technique employs a zero-watermarking approach, and hence no intentional error (watermark) is added to the original dataset. The experimental results prove the effectiveness of the proposed scheme.
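
As a hedged sketch of the zero-watermarking idea in this abstract, the Python snippet below derives a signature from a nonnumeric column over all tuples (no data is modified) and registers each version's signature on an append-only list standing in for the blockchain; all names are hypothetical.

```python
import hashlib
import json
import time

# Hedged sketch: no mark is embedded in the relation itself; instead a
# signature is derived from a nonnumeric attribute and the (owner,
# signature) pair is written to an append-only chain. Re-registering
# after inserting a new block of tuples gives simple version control.
chain = []  # stand-in for the blockchain

def zero_watermark(rows: list, text_column: int) -> str:
    """Derive a signature from a nonnumeric column over ALL tuples,
    so no subset of the relation is left uncovered."""
    h = hashlib.sha256()
    for row in rows:
        h.update(str(row[text_column]).encode())
    return h.hexdigest()

def register(owner: str, rows: list, text_column: int) -> None:
    """Append the current version's watermark to the chain."""
    chain.append({"owner": owner,
                  "watermark": zero_watermark(rows, text_column),
                  "version": len(chain) + 1,
                  "ts": time.time()})

rows = [(1, "alice", "NY"), (2, "bob", "LA")]
register("data-owner", rows, text_column=1)
rows.append((3, "carol", "SF"))              # new block of data inserted
register("data-owner", rows, text_column=1)  # new version, new watermark
print(json.dumps(chain, indent=2))
```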
Article
Full-text available
Machine Learning is being used worldwide in the deployment of API's (Application Programming Interface). The development of machine learning presents: techniques, algorithms, sequences, logic based on facts, and predictions of future errors in various processes of organizations such as the process of deployment of API's/functionalities/software. A systematic literature review (SLR) was conducted on machine learning for the process of API/functionality deployment/error detection. The search strategy identified 176378 papers in digital libraries such as: Scopus, ProQuest, ScienceDirect, IEEE Xplore, Taylor & Francis Online, Web of Science, Wiley Online Library and ACM Digital Library; which were filtered by exclusion and quality criteria obtaining as final result, for review and analysis, 85 papers. The results of the systematic review have focused on machine learning papers recently published in recent years regarding the deployment of API's, software, monitoring and control tools, error detection where machine learning offers alternatives to improve and be more efficient in those processes that fail regularly today. The RSL has allowed a broad view on the studies and findings presented in this study.