Figure - available from: Wireless Communications and Mobile Computing
Framework of our proposed scheme.

Source publication
Article
Full-text available
Group activities on social networks are increasing rapidly with the development of mobile devices and IoT terminals, creating a huge demand for group recommendation. However, group recommender systems face an important problem: privacy leakage of users' historical data and preferences. Existing solutions always pay attention to protect the h...

Similar publications

Preprint
Full-text available
Modern recommender systems operate in a fully server-based fashion. To cater to millions of users, frequent model maintenance and high-speed processing of concurrent user requests are required, which comes at the cost of a huge carbon footprint. Meanwhile, users need to upload their behavior data even including the immediate environmental...
Article
Full-text available
With the deep integration of "AI + medicine", AI-assisted technology has been of great help to human beings in the medical field, especially in the area of predicting and diagnosing diseases based on big data, because it is faster and more accurate. However, concerns about data security seriously hinder data sharing among medical institutions. To f...
Preprint
Full-text available
Unmanned aerial vehicle (UAV)-enabled edge federated learning (FL) has sparked a rise in research interest as a result of the massive and heterogeneous data collected by UAVs, as well as the privacy concerns related to UAV data transmissions to edge servers. However, due to the redundancy of UAV collected data, e.g., imaging data, and non-rigorous...
Article
Full-text available
As Internet of Things (IoT) technology continues to advance at a rapid pace, smart devices have permeated daily life. Service providers are actively collecting copious amounts of user data, with the aim of refining machine learning models to elevate service quality and accuracy. However, this practice has sparked apprehensions amongst users concern...
Preprint
Full-text available
Private set intersection is an important problem with implications in many areas, ranging from remote diagnostics to private contact discovery. In this work, we consider the case of two-party PSI in the honest-but-curious setting. We propose a protocol that solves the server-aided PSI problem using delegated blind quantum computing. More specifical...

Citations

... Existing efforts to address data privacy concerns in federated recommender systems primarily rely on differential privacy [21]-[23] and homomorphic encryption [24], [25]. Differential privacy achieves privacy preservation by introducing random noise into updated model parameters [26]-[28]. ...
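The snippet above describes the standard recipe of perturbing updated model parameters before they leave the client. The sketch below (plain NumPy, not taken from any of the cited systems) illustrates one common way such a perturbation can be applied; the clipping bound and noise scale are illustrative assumptions, not calibrated privacy parameters.

```python
import numpy as np

def privatize_update(update, clip_norm=1.0, noise_std=0.1, rng=None):
    """Clip a model-parameter update and add Gaussian noise before upload.

    Generic differential-privacy-style perturbation for illustration only;
    the actual noise calibration (epsilon, delta, sensitivity) depends on
    the scheme used by each cited system and is not reproduced here.
    """
    rng = rng or np.random.default_rng()
    update = np.asarray(update, dtype=float)
    # Bound the influence of any single user by clipping the L2 norm.
    norm = np.linalg.norm(update)
    if norm > clip_norm:
        update = update * (clip_norm / norm)
    # Add zero-mean Gaussian noise to mask the exact values.
    return update + rng.normal(0.0, noise_std, size=update.shape)

# Example: a 4-dimensional gradient/parameter delta from one user.
noisy = privatize_update([0.8, -0.3, 1.9, 0.05])
```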
Preprint
Federated recommender systems have been crucially enhanced through data sharing and continuous model updates, attributed to the pervasive connectivity and distributed computing capabilities of Internet of Things (IoT) devices. Given the sensitivity of IoT data, transparent data processing in data sharing and model updates is paramount. However, existing methods fall short in tracing the flow of shared data and the evolution of model updates. Consequently, data sharing is vulnerable to exploitation by malicious entities, raising significant data privacy concerns, while excluding data sharing will result in sub-optimal recommendations. To mitigate these concerns, we present LIBERATE, a privacy-traceable federated recommender system. We design a blockchain-based traceability mechanism, ensuring data privacy during data sharing and model updates. We further enhance privacy protection by incorporating local differential privacy in user-server communication. Extensive evaluations with a real-world dataset corroborate LIBERATE's capabilities in ensuring data privacy during data sharing and model updates while maintaining efficiency and performance. Results underscore the blockchain-based traceability mechanism as a promising solution for privacy preservation in federated recommender systems.
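LIBERATE's own local differential privacy mechanism is not detailed in the abstract; as a generic illustration of the LDP idea in user-server communication, the classic randomized-response perturbation of a binary interaction signal might look like the following, where the epsilon value is an assumed placeholder privacy budget.

```python
import math
import random

def randomized_response(bit, epsilon=1.0, rng=random):
    """Report a binary interaction (e.g. clicked / not clicked) under LDP.

    Illustrative only: with probability e^eps / (e^eps + 1) the true bit is
    sent, otherwise it is flipped, so the server never learns the true
    value with certainty.
    """
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if rng.random() < p_truth else 1 - bit

# Example: perturb one implicit-feedback bit before upload.
report = randomized_response(1, epsilon=0.5)
```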
... Now that this paper suggests combining both approaches, that drawback arises again: how to overcome the fact that user-specific nudges rely on a large amount of personal data and observations of the sharing behaviour? We propose the use of algorithms such as differential privacy, which has been widely relied on by personalized recommender systems to protect users' privacy [40]. To the best of our knowledge, this concept has not been adapted to nudges, but it has the potential to greatly benefit the proposed enhanced privacy-preserving nudges (Figure 3: Overview of the enhanced hybrid privacy agent). ...
... Users' data contains sensitive information about the individual which needs to be protected. To achieve this, several studies have applied various privacy-preserving techniques such as homomorphic encryption [14,15], differential privacy [16,17], anonymization [18,19], and even the game-theory method [20]. Among them, Badsha et al. [14] proposed a privacy-preserving recommender system that allows all computation to be done in a privately distributed manner using ElGamal homomorphic encryption, without compromising recommendation accuracy and efficiency. ...
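Badsha et al.'s full protocol is not reproduced in the snippet above; the toy sketch below only demonstrates the multiplicative homomorphic property of textbook ElGamal that such privacy-preserving recommenders build on. The group parameters here are deliberately tiny and insecure, and real constructions often use an additive, exponent-encoded variant instead.

```python
import random

# Toy ElGamal over a small prime group -- purely illustrative parameters,
# far too small for real security.
p, g = 23, 5                      # public group parameters
x = random.randrange(2, p - 1)    # private key
h = pow(g, x, p)                  # public key h = g^x mod p

def encrypt(m, rng=random):
    r = rng.randrange(2, p - 1)
    return pow(g, r, p), (m * pow(h, r, p)) % p

def decrypt(c):
    c1, c2 = c
    s = pow(c1, x, p)
    return (c2 * pow(s, p - 2, p)) % p   # multiply by s^{-1} mod p

def ct_multiply(ca, cb):
    # Multiplicative homomorphism: Enc(a) * Enc(b) decrypts to a*b mod p.
    return (ca[0] * cb[0]) % p, (ca[1] * cb[1]) % p

assert decrypt(ct_multiply(encrypt(3), encrypt(4))) == 12
```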
Article
User privacy in recommender systems has received much attention over the years. However, much of this attention has been on privacy protection in single-domain recommender systems rather than on cross-domain recommender systems. Privacy-preserving cross-domain recommender systems not only encourage collaboration on data between different domains to solve the problem of data sparsity but also ensure users' privacy and the secure transfer of auxiliary information between domains. However, existing studies are not suitable for privacy protection in a cross-domain scenario. To this end, we propose a novel privacy-preserving framework for cross-domain recommender systems that provides a generic template for other secure cross-domain recommender systems. Employing a homomorphic encryption scheme, the framework consists of two protocols for users' privacy in cross-domain recommender systems. We mathematically described every step involved in each protocol, proved that the two protocols are secure against a semi-honest adversary, and compared the complexity of the protocols.
Chapter
Group Recommender Systems (GRS) combine large amounts of data from various user behaviour signals (likes, views, purchases) and contextual information to provide groups of users with accurate suggestions (e.g. rating prediction, rankings). To handle those large amounts of data, GRS can be extended to use distributed processing and storage solutions (e.g. MapReduce-like algorithms and NoSQL databases). As such, privacy has always been a core issue since most recommendation algorithms rely on user behaviour signals and contextual information that may contain sensitive information. However, existing work in this domain mostly distributes data processing tasks without addressing privacy, and the solutions that address privacy for GRS (e.g. k-anonymisation and local differential privacy) remain centralised. In this paper, we identify and analyse privacy concerns in GRS and provide guidelines on how decentralised techniques can be used to address them. Keywords: Group recommender systems, Decentralisation, Privacy
Article
The adoption of protection routing guarantees the existence of a loop-free alternate path for packet forwarding when a single link or node failure occurs. By Tapolcai's method, the presence of two completely independent spanning trees (CISTs for short) suffices to configure a protection routing. This article extends the idea of protection routing to involve more CISTs and attaches a secure mechanism to the configuration, which we call the secure multi-protection routing scheme (SMPR-scheme). We then use the SMPR-scheme to handle privacy-preserving data transmissions, such as downloading personal medical records, tax bills, or other private information. To evaluate the effectiveness of the SMPR-scheme, we develop a probabilistic model that allows some malicious nodes to collect information illegally through neighboring access. We experimented with the SMPR-scheme on the BCube (i.e., the generalized hypercube) datacenter network. Simulation results show that data transmission using the SMPR-scheme ensures confidentiality (i.e., no node other than the recipient can receive the complete message) and effectively resists privacy collection even under malicious infringement.
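The SMPR-scheme itself is not specified in the abstract; the toy sketch below merely illustrates the confidentiality intuition of splitting a message into XOR shares and sending each share over a different completely independent spanning tree, so that no intermediate node sitting on only some trees ever sees the complete message. The helper names are hypothetical.

```python
import os

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_message(message: bytes, n_trees: int):
    """Split a message into one XOR share per independent spanning tree.

    Illustrative only: any n-1 shares are uniformly random, so only the
    recipient, which receives every share, can recombine the message.
    """
    shares = [os.urandom(len(message)) for _ in range(n_trees - 1)]
    last = message
    for s in shares:
        last = _xor(last, s)
    return shares + [last]

def reconstruct(shares):
    out = shares[0]
    for s in shares[1:]:
        out = _xor(out, s)
    return out

msg = b"private record"
assert reconstruct(split_message(msg, 3)) == msg
```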
Article
Nowadays, human genomics is widely applied to health-related problems and to achieving time- and cost-efficient healthcare. With the advancement of genomics research, further development of privacy protections is needed for querying, accessing, storing, and computing on genomic data. While genomic data is widely accessible, privacy issues may arise from untrusted third parties (adversaries/researchers), who may reveal information or strategic plans regarding an individual's genome data when it is requested for research purposes. To mitigate this problem, many privacy-preserving techniques, used alongside cryptographic methods, are briefly discussed. Furthermore, efficiency and accuracy in secure and private genomic data computation need to be researched in the future.
Article
Context-aware recommendation systems are increasingly popular in the digital era for recommending personalized items to users. However, ensuring user data privacy while maintaining high recommendation accuracy is widely considered a challenge. In this work, we propose a privacy-preserving method for context-aware recommendation systems in the two-cloud model. In particular, we first adjust the standard additive secret sharing scheme to support secure computation on negative integers, based on which we design secure comparison and division protocols that enjoy desirable security and efficiency. Using these new protocols, we propose a secure and efficient context-aware recommendation system that also supports offline users. Compared with the state-of-the-art, our scheme achieves stronger data privacy preservation by further protecting the intermediate data calculated during system training. Experimental results on real-world datasets indicate that our scheme is efficient. Notably, our system could achieve more significant performance improvement by running the underlying schemes in parallel.
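The paper's adjusted secret sharing scheme is not given in the abstract; a minimal sketch of the standard trick it alludes to, representing signed integers in the ring Z_{2^k} so that additive shares can carry negative values, could look like the following. The ring size and two-party layout are assumptions, not the paper's actual protocols.

```python
import random

K = 32
MOD = 1 << K            # work in the ring Z_{2^32}

def to_signed(v):
    # Map a ring element back to a signed integer (two's-complement style).
    return v - MOD if v >= MOD // 2 else v

def share(x):
    # Additively share a (possibly negative) integer between two clouds.
    r = random.randrange(MOD)
    return r, (x - r) % MOD

def reconstruct(s0, s1):
    return to_signed((s0 + s1) % MOD)

# Illustrative sketch: shares can be added locally by each cloud, and the
# sum of the local results reconstructs the sum of the secrets, even when
# that sum is negative.
a0, a1 = share(-7)
b0, b1 = share(3)
assert reconstruct((a0 + b0) % MOD, (a1 + b1) % MOD) == -4
```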