Figure 8. Architecture of the proposed multi-agent system model

Source publication
Article
Full-text available
Recently, data storage has become one of the most important services in Cloud Computing. The cloud provider should ensure two major requirements: data integrity and storage efficiency. The blockchain data structure and efficient data deduplication represent possible solutions to address these exigencies. Several approaches have been propos...

Context in source publication

Context 1
... In this work, the authors apply data deduplication on the cloud server side (Figure 8), using a MAS to model several autonomous intelligent agents: an Interface Agent, a Mediator Agent, an Analysis Agent, a Control Agent and a Data Agent. Each agent has specific tasks to achieve, starting with receiving files, going through data deduplication and ending with storing these files. ...
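To make the division of labour concrete, here is a minimal sketch of how such an agent pipeline could be wired together, assuming in-memory stores and SHA-256 fingerprints; beyond the agent roles named above, all class and method names are illustrative, not the authors' implementation:

```python
import hashlib

# Illustrative sketch of the agent pipeline described above; method names
# and the in-memory stores are assumptions, not the authors' implementation.

class DataAgent:
    """Stores unique files, keyed by their SHA-256 fingerprint."""
    def __init__(self):
        self.store = {}                    # fingerprint -> file bytes

    def save(self, fingerprint, data):
        self.store[fingerprint] = data

class AnalysisAgent:
    """Detects duplicates by comparing file fingerprints."""
    def __init__(self):
        self.known = set()

    def is_duplicate(self, fingerprint):
        return fingerprint in self.known

    def register(self, fingerprint):
        self.known.add(fingerprint)

class MediatorAgent:
    """Routes an uploaded file through analysis and storage."""
    def __init__(self, analysis, data_agent):
        self.analysis = analysis
        self.data_agent = data_agent

    def handle_upload(self, data):
        fingerprint = hashlib.sha256(data).hexdigest()
        if self.analysis.is_duplicate(fingerprint):
            return fingerprint, False      # deduplicated: nothing stored
        self.analysis.register(fingerprint)
        self.data_agent.save(fingerprint, data)
        return fingerprint, True

# An Interface Agent would receive the file from the client and hand it to
# the mediator; a Control Agent would log the operation (e.g. on a
# blockchain) for later integrity auditing.
upload = b"report-2019.pdf contents"
mediator = MediatorAgent(AnalysisAgent(), DataAgent())
print(mediator.handle_upload(upload))      # stored the first time
print(mediator.handle_upload(upload))      # detected as a duplicate
```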

Similar publications

Conference Paper
Full-text available
In the cyber world today, managing threats dynamically is challenging; moreover, the generation of knowledge and the enhancement of knowledge management (KM) capabilities are becoming crucial due to the dynamics of technological advancements. In this context, a regulatory mechanism is essential; the required resilience is achieved through data analytics, which bri...
Article
Full-text available
A Large-Scale Heterogeneous Network (LS-HetNet) integrates different networks into one uniform network system to provide seamless one-world network coverage. In LS-HetNet, various devices use different technologies to access heterogeneous networks and generate a large amount of data. For dealing with a large number of access requirements, these dat...
Article
Full-text available
Most ghost imaging reconstruction algorithms require a large measurement time to retrieve the object information clearly. But not all groups of data play a positive role in reconstructing the object image. Abandoning some redundant data can not only enhance the quality of reconstruction images but also speed up the computation process. Here, we pro...
Article
Full-text available
Electronic medical records have the advantages of large storage capacity, convenience, improved efficiency, etc. However, since electronic medical records contain a large amount of patients' private information, if they are stored directly in the cloud server, patients may face the risk of privacy data leakage and electronic medical record data tam...
Article
Full-text available
The massive redundant data storage and communication in network 4.0 environments suffer from low integrity, high cost, and easy tampering. To address these issues, a secure data storage and recovery scheme in the blockchain-based network is proposed by improving the decentralization, tamper-proofing, real-time monitoring and management of storage sy...

Citations

... This ensures protection of the author's exclusive right to use the work. There is a severe need to restrict the usage of any artistic work, which protects the author's rightful claim to ownership [19], [20]. ...
Article
Full-text available
Preserving the integrity of ancient aesthetic works has become a greater matter of concern in the modern digital age. As the internet streams data collected from multiple sources, it poses a severe threat to the integrity and security of the original aesthetic of many craft works. This paper proposes a novel framework that attempts to preserve Li Yu's craft aesthetics, which can be explored through his wide range of literary collections. As the author is renowned for his excellence in many art forms, such as gardening and theatre arts, unauthorized recreation of his work is a major concern. The framework first pre-processes the corpus to extract keywords, which are then mapped to semantic vectors to find closely related as well as differing words. These vectors are subjected to geometric perturbations to impart versatility to the training phase. The text under screening is fed to the classification phase to determine whether it is an altered or modified version of Li Yu's work, thus preventing the author's work from being reproduced in ways that degrade its aesthetics. As a future extension, the work can be trained with different classifiers to develop a defrauding system.
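As an illustration of the augmentation step, the sketch below perturbs stand-in keyword embeddings with a near-identity random rotation plus small noise; interpreting "geometric perturbation" this way, and the embeddings themselves, are assumptions for illustration, not the paper's actual method:

```python
import numpy as np

# Sketch of geometric augmentation: apply a near-identity rotation and
# small additive noise to keyword embeddings before classifier training.
rng = np.random.default_rng(0)

def small_rotation(dim, eps=0.01):
    """Near-identity orthogonal matrix from the QR of I + eps * noise."""
    q, _ = np.linalg.qr(np.eye(dim) + eps * rng.normal(size=(dim, dim)))
    return q

def augment(embeddings, copies=3, eps=0.01):
    """Return the originals plus `copies` geometrically perturbed versions."""
    out = [embeddings]
    for _ in range(copies):
        rot = small_rotation(embeddings.shape[1], eps)
        out.append(embeddings @ rot + rng.normal(0, eps, embeddings.shape))
    return np.vstack(out)

keywords = rng.normal(size=(20, 8))   # stand-in keyword embeddings
train_x = augment(keywords)           # 4x the data for the training phase
print(train_x.shape)                  # (80, 8)
```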
... Junyi Guo (2018), on the construction of a supply chain information sharing platform, proposes starting from the perspective of supply chain information sharing and argues that blockchain scenario applications in the supply chain field should be reconstructed [5]. Ghazouani (2019) and other scholars propose a secure cloud data deduplication scheme built on a blockchain and a multi-agent system: data uploaded to the cloud by users is securely deduplicated by the multi-agent system, while blockchain technology ensures that the file information is difficult to tamper with [6]. Wei (2020) and other scholars mainly design cloud applications that combine the advantages of cloud computing and blockchain to achieve the security and completeness of data in the cloud [7]. ...
... Although a TPA with sufficient computing resources can handle auditing tasks more efficiently than resource-constrained users, dependency on a trusted TPA introduces a single point of failure (SPOF) risk. Blockchain-based DIA (BDIA) schemes [6][7][8][9] have been proposed to mitigate this single-point risk of TPA-based RIDA by replacing the trusted TPA with a blockchain. ...
... In BDIA, instead of relying on the TPA, the prover publishes the proof answering the verifier's auditing request on the blockchain. The verifier either leverages a cryptographic tag already stored in the immutable blockchain to verify this proof [6][7][8], or the blockchain nodes verify the proof in place of the TPA and the verifier checks the result of their consensus [9]. ...
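A deliberately simplified sketch of the first flavour of BDIA follows: per-block hash tags are published to a ledger (a plain dict stands in for the blockchain), the prover answers a random challenge with the requested block, and the verifier recomputes the tag. Real schemes use homomorphic tags so whole blocks never travel; this toy version only illustrates the trust shift away from the TPA:

```python
import hashlib
import secrets

BLOCK_SIZE = 4096

def split_blocks(data):
    return [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]

def publish_tags(ledger, file_id, blocks):
    """Owner stores one hash tag per block on the ledger before outsourcing."""
    ledger[file_id] = [hashlib.sha256(b).hexdigest() for b in blocks]

def prove(stored_blocks, index):
    """Prover (cloud server) answers a challenge with the requested block."""
    return stored_blocks[index]

def verify(ledger, file_id, index, proof):
    """Verifier recomputes the tag and compares with the immutable copy."""
    return hashlib.sha256(proof).hexdigest() == ledger[file_id][index]

ledger = {}                                  # stand-in for the blockchain
blocks = split_blocks(b"some outsourced file content" * 500)
publish_tags(ledger, "file-42", blocks)

challenge = secrets.randbelow(len(blocks))   # random spot-check
assert verify(ledger, "file-42", challenge, prove(blocks, challenge))
```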
Chapter
CCP data integrity auditing in South Korea's smart HACCP system is currently enabled only through centralized access control, so its reliability cannot be assured. Since simply logging the history of data access does not provide a way to effectively verify the integrity of the data, reliable HACCP systems must include appropriate data integrity auditing schemes. Much research has been proposed to verify the integrity of data stored on semi-honest cloud servers, but to the best of our knowledge, no research has taken a smart HACCP environment into account. Therefore, in this paper, we propose a blockchain-based data integrity auditing protocol appropriate for contexts with a smart HACCP system. Moreover, we conduct a qualitative analysis to show that the computational overhead of our protocol is reasonable. The analysis results show that our protocol satisfies the requirements of the smart HACCP system regardless of the blockchain platform on which it is implemented. Keywords: Smart HACCP, Cloud storage service, Remote data integrity auditing, Blockchain
... It meets the requirements of learners with different needs and at different levels. It also provides users with decision-making support and reduces the probability of risk [4]. The process of data mining generally goes through four stages: obtaining data, preparing data, mining data, and expressing and explaining the mining results, as shown in Figure 1. ...
Article
Full-text available
Data integrity verification means that the user can detect any inconsistency in the data uploaded to the cloud: apart from the user's own updates, any external factor, including the cloud service provider, that destroys, tampers with, loses, or fails to update the data in a timely manner can be detected by the user. This article aims to study cloud data integrity verification algorithms based on data mining and accounting informatization. It proposes using data mining technology and accounting informatization to help business managers with management work. The article uses Company H as an example to illustrate the accounting information system and proposes a new information management strategy based on it. The CBF algorithm and a data integrity verification algorithm are used to study the cloud storage data integrity verification protocol; the cloud data integrity verification model is constructed, the program flow design is analyzed, and the time consumed by file data insertion is analyzed. The experiment in this paper uses 16 standard mathematical calculation formulas to strengthen the analysis. The results show that studying cloud data integrity verification algorithms based on data mining and accounting informatization is beneficial to the integrity and protection of data. When the number of documents added increases from 0 to 400, the protocol's overhead shows an upward trend, and the protocol in this paper basically fluctuates between 10 and 80.
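The abstract does not expand "CBF"; in cloud-storage integrity work it usually denotes a Counting Bloom Filter, so the following sketch is written under that assumption. Counters rather than bits allow entries to be removed as well as inserted, which suits files that are added and deleted over time:

```python
import hashlib

# Counting Bloom Filter sketch: k hash positions per item, each backed by
# a counter so deletions are possible. False positives remain possible.

class CountingBloomFilter:
    def __init__(self, size=1024, hashes=3):
        self.size = size
        self.hashes = hashes
        self.counters = [0] * size

    def _positions(self, item):
        for i in range(self.hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).hexdigest()
            yield int(digest, 16) % self.size

    def add(self, item):
        for pos in self._positions(item):
            self.counters[pos] += 1

    def remove(self, item):
        if item in self:                   # avoid underflow on misses
            for pos in self._positions(item):
                self.counters[pos] -= 1

    def __contains__(self, item):
        return all(self.counters[pos] > 0 for pos in self._positions(item))

cbf = CountingBloomFilter()
cbf.add("invoice-001.xlsx")
print("invoice-001.xlsx" in cbf)   # True (may rarely be a false positive)
cbf.remove("invoice-001.xlsx")
print("invoice-001.xlsx" in cbf)   # False
```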
... smart contract-enabled service). Ghazouani et al. [27] proposed adopting a blockchain as a database for storing the metadata of client files. This database serves as a logging database that enables data integrity auditing. ...
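A minimal sketch of that logging idea follows, assuming hash-chained metadata records (all field names are illustrative, not from [27]): each record embeds the hash of its predecessor, so any later tampering with stored file metadata breaks the chain:

```python
import hashlib
import json
import time

# Hash-chained log of file metadata, standing in for a blockchain ledger.

def append_record(chain, file_id, file_hash):
    prev = chain[-1]["record_hash"] if chain else "0" * 64
    body = {"file_id": file_id, "file_hash": file_hash,
            "timestamp": time.time(), "prev_hash": prev}
    body["record_hash"] = hashlib.sha256(
        json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def chain_is_intact(chain):
    prev = "0" * 64
    for rec in chain:
        body = {k: v for k, v in rec.items() if k != "record_hash"}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if rec["prev_hash"] != prev or rec["record_hash"] != expected:
            return False
        prev = rec["record_hash"]
    return True

chain = []
append_record(chain, "f1", hashlib.sha256(b"client file").hexdigest())
assert chain_is_intact(chain)
chain[0]["file_hash"] = "forged"         # tampering with the metadata...
assert not chain_is_intact(chain)        # ...is detected
```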
Article
Full-text available
Cloud storage is an ideal platform to accommodate massive data. However, with the increasing number of various devices and improved processing power, the amount of generated data is becoming gigantic. Therefore, this calls for a cost‐effective way to outsource massively generated data to a remote server. Cloud service providers utilise a deduplication technique that eliminates redundant data by aborting identical upload requests and deleting redundant files. However, current deduplication mechanisms mainly focus on the storage savings of the server and ignore the sustainable, long‐term financial interests of servers and users. This is not helpful for expanding outsourcing and deduplication services. Blockchain is an ideal solution to achieve an economical and incentive‐driven deduplication system. Though some current research studies have integrated deduplication with blockchain, they did not utilise blockchain as a financial tool. Meanwhile, there is no arbitration mechanism to settle disputes between the server and the user, especially in a Bitcoin payment where the payment is not confirmed immediately and a dispute may occur. This makes it harder to achieve a fair and transparent incentive‐based deduplication service. In this work, we construct a deduplication system with financial incentives for the server and the user based on Bitcoin. The data owner pays the server via Bitcoin for outsourcing the file, but this fee can be compensated by charging deduplication users fees to acquire the deduplication service. The server and the user can receive revenues through the deduplication service. Disputes over the fair distribution of incentives can be settled by our arbitration protocol with chameleon hashes as arbitration tags. We give a concrete construction and security requirements for our proposed BDAI. The security analysis shows that our BDAI is theoretically secure. The performance evaluation shows that our proposed BDAI is acceptably efficient for deduplication. Meanwhile, we evaluate and conclude that 1% of the outsourcing fee (or less) is a reasonable and preferable price for each deduplication user to pay as compensation to the data owner.
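To illustrate how a chameleon hash can serve as an arbitration tag, here is a toy discrete-log construction: anyone can verify the tag, but only the trapdoor holder (the arbiter) can open it to a different message. The group parameters are far too small for real use, and this is a generic chameleon hash, not necessarily the exact construction used in BDAI:

```python
import secrets

# Toy discrete-log chameleon hash. Parameters are illustrative only;
# real deployments use cryptographically sized groups.
q, p, g = 83, 167, 4                 # order-q subgroup of Z_p*, generator g

x = secrets.randbelow(q - 1) + 1     # trapdoor (arbiter's secret key)
y = pow(g, x, p)                     # public key

def ch(m, r):
    """Chameleon hash CH(m, r) = g^m * y^r mod p."""
    return (pow(g, m, p) * pow(y, r, p)) % p

def collide(m, r, m_new):
    """Trapdoor holder finds r' with CH(m_new, r') == CH(m, r)."""
    return (r + (m - m_new) * pow(x, -1, q)) % q

m, r = 17, secrets.randbelow(q)
tag = ch(m, r)                       # published as the arbitration tag

m_new = 42                           # e.g. an amended settlement record
r_new = collide(m, r, m_new)
assert ch(m_new, r_new) == tag       # same tag opens to the new message
```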
... Thereafter, these adaptive control systems were implemented in many countries to manage traffic control in metropolitan areas, and others have been developed, such as RHODES [17] and TUC [18]. The MAS is rapidly growing as one of the most powerful and popular technologies proposed to solve complicated problems in different fields, such as electrical engineering, cloud security [19] [20], data storage [21], civil engineering, and transportation systems. Computer technologies, including MAS, have been widely proposed to deal with traffic control and management [22]. ...
Article
Full-text available
To favor emergency vehicles and promote collective modes of transport in Moroccan cities, we propose in this paper a control system to manage traffic at signalized intersections with priority links in urban settings. This system combines multi-agent technology and fuzzy logic to regulate traffic flows. The traffic flow is divided into two types of vehicles: priority and regular. Regular vehicles can use only the regular links, while priority vehicles may use both the priority and the regular links. To favor emergency vehicles and promote collective modes of transport, the approach acts on the length and order of the traffic light phases to control all traffic flows. We propose a decentralized regulation system based on real-time monitoring to develop a local intersection state, with intelligent coordination between neighboring intersections to build an overview of the traffic state. Regulation and prioritization decisions are made through cooperation, communication, and coordination between the different agents. The performance of the proposed system is investigated and instantiated in the ANYLOGIC simulator, using a section of the Marrakesh road network that contains priority links. The results indicate that the designed system can significantly improve the efficiency of traffic regulation.
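A minimal sketch in the spirit of that fuzzy regulation follows, assuming triangular membership functions over queue length and a rule that priority vehicles force a green extension; all membership bounds and rule weights are illustrative, not the paper's calibrated values:

```python
# Fuzzy sketch: queue length and the presence of a priority vehicle
# determine how long the green phase is extended.

def tri(x, a, b, c):
    """Triangular membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def green_extension(queue_len, priority_vehicle):
    # Fuzzify queue length (vehicles waiting on the approach).
    low = tri(queue_len, -1, 0, 10)
    medium = tri(queue_len, 5, 15, 25)
    high = tri(queue_len, 20, 40, 61)
    # Rule base: longer queues and priority vehicles earn more green time.
    rules = [(low, 5.0), (medium, 15.0), (high, 30.0)]
    if priority_vehicle:
        rules.append((1.0, 25.0))      # emergency vehicle forces extension
    # Defuzzify with a weighted average (centre-of-gravity style).
    total = sum(w for w, _ in rules)
    return sum(w * v for w, v in rules) / total if total else 0.0

print(green_extension(queue_len=8, priority_vehicle=False))   # short green
print(green_extension(queue_len=35, priority_vehicle=True))   # long green
```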
... It is the responsibility of the CSP to implement a variety of measures to ensure data integrity. The CSP informs the client about the kind of data being stored in the cloud [5]. As a result, the CSP must maintain records about the data, such as the type of data (public or private), when and where it is required, the type of virtual memory accumulated, and the periods of time when it is accessed, in order to protect the data from unauthorized access and maintain data confidentiality. ...
... NN classifiers have posterior probabilities that are higher than those of the DT classifiers, as represented by the maximum posterior probability formula (5). The probabilistic model, as shown in equation (6), is used to identify the input character picture with the highest likelihood of being recognized: ...
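For reference, the maximum-posterior decision rule behind this comparison reduces to choosing argmax_c P(c)P(x|c); a minimal sketch with a stand-in Gaussian likelihood (not the paper's actual NN or DT model) follows:

```python
import math

# Minimal maximum-a-posteriori (MAP) decision rule: pick the class that
# maximizes P(c) * P(x | c), computed in log space for numerical safety.

def gaussian(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def map_classify(x, priors, likelihood):
    return max(priors,
               key=lambda c: math.log(priors[c]) + math.log(likelihood(x, c)))

means = {"zero": 0.0, "one": 1.0}            # stand-in class models
priors = {"zero": 0.6, "one": 0.4}
likelihood = lambda x, c: gaussian(x, means[c], 0.25)

print(map_classify(0.8, priors, likelihood))  # -> "one"
```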
Article
Full-text available
The Zika virus presented an extraordinary public health hazard after spreading from Brazil to the Americas. In the absence of credible forecasts of the outbreak's geographic scope and infection frequency, international public health agencies were unable to plan and allocate surveillance resources efficiently. An RNA test is performed on subjects found to be infected with the Zika virus. By training on the specified characteristics, the proposed hybrid optimization algorithm, a multilayer perceptron with a probabilistic optimization strategy, yields a greater accuracy rate. The MATLAB program incorporates numerous machine learning algorithms and artificial intelligence methodologies. It reduces forecast time while retaining excellent accuracy. The predicted classes are encrypted and sent to patients. The Advanced Encryption Standard (AES) and the Triple Data Encryption Standard (Triple DES) are combined to make this possible. The experimental outcomes improve the accuracy of patient results communication. Cryptosystem processing achieves a minimal timing of 0.15 s with 91.25 percent accuracy.
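The abstract combines AES with Triple DES without giving the construction, so the sketch below assumes simple cascade encryption: Triple DES first, then AES-256 over the inner ciphertext, using Python's `cryptography` package (note that in recent versions TripleDES is deprecated and also available under the `decrepit` module). Key and IV handling is illustrative only:

```python
import os
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

def encrypt_layered(plaintext, des3_key, aes_key):
    iv3, iv_aes = os.urandom(8), os.urandom(16)
    # Inner layer: Triple DES in CBC mode (64-bit blocks, PKCS7 padding).
    padder = padding.PKCS7(64).padder()
    padded = padder.update(plaintext) + padder.finalize()
    enc3 = Cipher(algorithms.TripleDES(des3_key), modes.CBC(iv3)).encryptor()
    inner = enc3.update(padded) + enc3.finalize()
    # Outer layer: AES-256 in CBC mode over the 3DES ciphertext.
    padder = padding.PKCS7(128).padder()
    padded = padder.update(inner) + padder.finalize()
    enc = Cipher(algorithms.AES(aes_key), modes.CBC(iv_aes)).encryptor()
    return iv3, iv_aes, enc.update(padded) + enc.finalize()

des3_key, aes_key = os.urandom(24), os.urandom(32)
iv3, iv_aes, ct = encrypt_layered(b"predicted class: positive",
                                  des3_key, aes_key)
print(len(ct), "ciphertext bytes")
```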
... Data cannot be modified after it has been recorded [23]. • The blockchain network provides transparency to all participants and adds a layer of trust [15]. ...
... Hyperledger's multichannel design isolates data across channels, while private data collections allow personal information to be separated between businesses within the same network [38]. The prominent built-in features of blockchain technology are trust, security, and transparency, which resolve marketing issues in an effective way [23]. The integration of blockchain technology in digital marketing is still in development (Fig. 4: Application of blockchain in digital marketing), but this review of the literature shows that it has enormous potential to improve or even disrupt multiple areas in marketing. ...
Chapter
Full-text available
Blockchain technology is the fastest-developing technology of recent years, and it has had a significant impact on a wide range of industries and companies in the Industry 4.0 era. Privacy, security, and trust are three of the most pressing issues facing digital marketing today. The purpose of this study is to investigate security, privacy, and trust challenges in digital marketing and how blockchain technology can be used to increase consumer trust and security. This was done via a systematic review of literature published between 2017 and 2021. The review identified that the use of blockchain technology in digital marketing and marketing management will continue to grow, and it has proven effective in providing solutions to both existing and upcoming company challenges in Industry 4.0. Blockchain can influence digital marketing by removing intermediaries and delivering trusted cybersecurity services with a high level of transparency and accountability. The possible problems and limitations of blockchain-enabled digital marketing are also discussed. Keywords: Blockchain, Digital marketing, Industry 4.0, Cybersecurity, Privacy, Trust, Transparency
... More and more studies are focusing on how to utilize blockchain technology to enhance the security of data stored in cloud storage systems. Ghazouani et al. (Ghazouani, Kiram, and Errajy, 2019) proposed a multi-agent system to perform data deduplication, which used the blockchain as a logging database to store information about the data and ensure its integrity. However, it cannot ensure the confidentiality of the data due to the lack of an effective encryption algorithm. ...
Article
Deduplication schemes based on convergent encryption (CE) are widely used in cloud storage systems to eliminate redundant data. However, an adversary can obtain the data by brute-force attack if the data belongs to a predictable set. In addition, previous works usually introduce third-party auditors to execute the data integrity verification, suffering from data disclosure by the auditors. In this paper, we propose a secure authorized deduplication scheme based on blockchain, which can ensure the confidentiality and security of users' data stored on cloud servers. In our scheme, users can utilize the smart contract to create a tamper-proof ledger, which protects the data from illegal modification. Meanwhile, users can execute integrity audit protocols to check their data integrity via the smart contract. Moreover, a hierarchical role hash tree (HRHT) is constructed to create the role key when users upload their data to the CSP, allowing authorized users to access the data. Security analysis and performance evaluation demonstrate that our proposed scheme is resilient against the brute-force attack and the collusion attack, while incurring limited computation overhead.
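For context, convergent encryption derives the key from the message itself, which is what makes ciphertexts deduplicable and, equally, what exposes predictable plaintexts to brute force. A minimal sketch follows, with a fixed zero IV to keep encryption deterministic (an illustrative choice, not this paper's authorized scheme):

```python
import hashlib
from cryptography.hazmat.primitives import padding
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# Convergent encryption: K = H(M), C = E_K(M), tag = H(C). Identical files
# yield identical ciphertexts and tags, so the server can deduplicate
# without seeing plaintext. Determinism is also the brute-force weakness:
# an attacker can encrypt every candidate of a predictable plaintext set
# and compare tags.

def convergent_encrypt(message):
    key = hashlib.sha256(message).digest()          # K = H(M)
    padder = padding.PKCS7(128).padder()
    padded = padder.update(message) + padder.finalize()
    enc = Cipher(algorithms.AES(key), modes.CBC(b"\x00" * 16)).encryptor()
    ciphertext = enc.update(padded) + enc.finalize()
    tag = hashlib.sha256(ciphertext).hexdigest()    # deduplication tag
    return key, ciphertext, tag

_, c1, t1 = convergent_encrypt(b"quarterly-report.docx contents")
_, c2, t2 = convergent_encrypt(b"quarterly-report.docx contents")
assert c1 == c2 and t1 == t2    # same file -> same tag -> deduplicable
```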