Fig. 4. Initialization for group_1.

Source publication
Article
Full-text available
Web services via wireless technologies, mobile services (M-services), HTTP, and XML have become important for conducting business. The W3C XML Protocol Working Group has been developing standard techniques such as the Web Services Description Language (WSDL), the Simple Object Access Protocol (SOAP), and Universal Description, Discovery and Integration (UDDI). Howe...

Context in source publication

Context 1
... data ( , , ) are valid when the equation holds, in which  and  are computed by the trusted_role; otherwise the ( , , ) are invalid. The credential_role publishes in a public directory the pair ( , ) for the signer with the public key . The initialization processes of the system are shown in Fig. 4. ...

Similar publications

Conference Paper
Full-text available
The lack of an effective and accepted method for implementing a semantic web services infrastructure capable of executing automatic tasks, such as service discovery or orchestration, leads developer communities to face a non-standardized and disorganized backlog of published Aeronautical Web Services. Developers must hard-code the conne...
Article
Full-text available
ESO Reflex is a prototype graphical workflow system, based on Taverna, and primarily intended to be a flexible way of running ESO data reduction recipes along with other legacy applications and user-written tools. ESO Reflex can also readily use the Taverna Web Services features that are based on the Apache Axis SOAP implementation. Taverna is a ge...
Article
Full-text available
Web mining is the application of data mining techniques to discover patterns from the Web. Web services define a set of standards such as WSDL (Web Service Description Language), SOAP (Simple Object Access Protocol), and UDDI (Universal Description, Discovery and Integration) to support service description, discovery, and invocation in a uniform interchangea...
Conference Paper
Full-text available
Web services are an emerging distributed computing technology that offers an interoperable means of integrating loosely coupled Web applications. The current basic Web services standards SOAP, WSDL, and UDDI are not sufficient to fully support the complete business process. To support the business process, several specifications related to the Web services co...
Article
Full-text available
Traditional distributed database transaction applications within large organizations often involve a large number of resources. In this case, people and DDBMSs distributed over a wide geographic area may introduce conflicts between heterogeneous systems. Web services (WS) provide a solution to this problem since WS have an independent platform, inde...

Citations

... However, technological advancement also brings forth concerns, particularly as the volume of sensitive data and system complexity increase, prompting a growing awareness and emphasis on data privacy issues and system security protection [2][3][4]. Access control serves as the first line of defense, mitigating the risk of unauthorized resource access or data breaches [5][6][7][8]. In an era where information is a valuable asset, effective access control strategies contribute significantly to organizations' overall security posture, fostering trust among stakeholders and ensuring compliance with regulatory requirements [9][10][11]. ...
... To compare the performance of the proposed method with other algorithms, we introduce an F1 score defined as (7). ...
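The exact form of equation (7) is not reproduced in this snippet; for reference, the standard F1 score combines precision and recall as follows (the usual definition, not necessarily the paper's exact equation):

F1 = 2 * Precision * Recall / (Precision + Recall), where Precision = TP / (TP + FP) and Recall = TP / (TP + FN).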
Article
Full-text available
For modern information systems, robust access control mechanisms are vital in safeguarding data integrity and ensuring the entire system’s security. This paper proposes a novel semi-supervised learning framework that leverages heterogeneous graph neural network-based embedding to encapsulate both the intricate relationships within the organizational structure and interactions between users and resources. Unlike existing methods focusing solely on individual user and resource attributes, our approach embeds organizational and operational interrelationships into the hidden layer node embeddings. These embeddings are learned from a self-supervised link prediction task based on a constructed access control heterogeneous graph via a heterogeneous graph neural network. Subsequently, the learned node embeddings, along with the original node features, serve as inputs for a supervised access control decision-making task, facilitating the construction of a machine-learning access control model. Experimental results on the open-sourced Amazon access control dataset demonstrate that our proposed framework outperforms models using original or manually extracted graph-based features from previous works. The preprocessed data and code are available on GitHub, facilitating reproducibility and further research endeavors.
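As a rough illustration of the second stage described above (concatenating learned node embeddings with the original node features and training a supervised access-control classifier), the following Python sketch uses placeholder arrays and scikit-learn; the array names, dimensions, and the choice of logistic regression are illustrative assumptions, not the authors' implementation:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Placeholder inputs: in the paper these come from a heterogeneous-graph
# link-prediction model and the original user/resource attributes.
node_embeddings = rng.normal(size=(1000, 64))    # learned embeddings (assumed shape)
node_features = rng.normal(size=(1000, 20))      # original node features (assumed shape)
labels = rng.integers(0, 2, size=1000)           # 1 = access granted, 0 = denied (synthetic)

# Concatenate embeddings with raw features, then train a supervised decision model.
X = np.concatenate([node_embeddings, node_features], axis=1)
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2, random_state=0)

clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))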
... This scheme [2] used the Schnorr [4] signature scheme and Zheng's [5] signcryption scheme to propose a strong designated verifier signature scheme, which achieves signer identity privacy by avoiding the use of encryption algorithms and further improves the efficiency of signing and verification. In the same year, a secure and flexible access control scheme and protocol for M-services based on role-based access control (RBAC) was proposed [6]. In 2004, Laguillaumie et al. [7] provided the first formal description of the concept of designated verifier signatures and a formal definition of the signer identity privacy property in strong designated verifier signatures. ...
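For reference, the Schnorr signature scheme mentioned in this context works over a prime-order subgroup; the toy Python sketch below uses deliberately small, insecure parameters purely to illustrate signing and verification, and is not the construction of [2] itself:

import hashlib, random

# Toy parameters: g = 2 has order q = 11 in Z_23* (insecure, illustration only).
p, q, g = 23, 11, 2

def H(msg, r):
    return int.from_bytes(hashlib.sha256(f"{msg}|{r}".encode()).digest(), "big") % q

x = random.randrange(1, q)      # private key
y = pow(g, x, p)                # public key

def sign(msg):
    k = random.randrange(1, q)
    r = pow(g, k, p)
    e = H(msg, r)
    s = (k + x * e) % q
    return e, s

def verify(msg, e, s):
    # g^s * y^(-e) = g^(k + x*e) * g^(-x*e) = g^k = r, so recompute e from it.
    r = (pow(g, s, p) * pow(y, -e, p)) % p
    return H(msg, r) == e

e, s = sign("ticket request")
print(verify("ticket request", e, s))   # True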
Article
Full-text available
In an attribute-based strong designated verifier signature, a signer who satisfies the access structure signs the message and assigns it to a verifier who satisfies the access structure to verify it, which enables fine-grained access control for signers and verifiers. Such signatures are used in scenarios where the identity of the signer needs to be protected, or where public verifiability of the signature must be avoided so that only the designated recipient can verify the validity of the signature. To address the problem that the overall overhead of traditional attribute-based strong designated verifier signature schemes is relatively large, an efficient attribute-based strong designated verifier signature scheme based on elliptic curve cryptography is proposed, together with a security analysis of the new scheme in the standard model under the hardness of the elliptic curve discrete logarithm problem (ECDLP). On the one hand, the proposed scheme is based on elliptic curve cryptography and uses computationally lighter scalar multiplication on elliptic curves instead of the bilinear pairings that impose a higher computational overhead in traditional attribute-based signature schemes. This reduces the computational overhead of signing and verification in the system, improves the efficiency of the system, and makes the scheme more suitable for resource-constrained cloud end-user scenarios. On the other hand, the proposed scheme uses an LSSS (Linear Secret Sharing Scheme) access structure with stronger access policy expressiveness, which is more efficient than "And"-gate or access-tree access structures, making the computational efficiency of the proposed scheme meet the needs of resource-constrained cloud end-users.
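To make the "scalar multiplication instead of bilinear pairing" point concrete, here is a minimal double-and-add scalar multiplication over a tiny curve y^2 = x^3 + 2x + 3 (mod 97); the curve and base point are textbook toy values, not parameters from the proposed scheme:

# Tiny illustrative curve y^2 = x^3 + a*x + b over F_p (not cryptographically secure).
p, a, b = 97, 2, 3
G = (3, 6)  # on the curve: 6^2 = 3^3 + 2*3 + 3 (mod 97)

def ec_add(P, Q):
    if P is None: return Q          # None represents the point at infinity
    if Q is None: return P
    (x1, y1), (x2, y2) = P, Q
    if x1 == x2 and (y1 + y2) % p == 0:
        return None                  # P + (-P) = infinity
    if P == Q:
        lam = (3 * x1 * x1 + a) * pow(2 * y1, -1, p) % p
    else:
        lam = (y2 - y1) * pow(x2 - x1, -1, p) % p
    x3 = (lam * lam - x1 - x2) % p
    return (x3, (lam * (x1 - x3) - y1) % p)

def scalar_mult(k, P):
    # Double-and-add: cost grows with the bit length of k, no pairing computation needed.
    R = None
    while k:
        if k & 1:
            R = ec_add(R, P)
        P = ec_add(P, P)
        k >>= 1
    return R

print(scalar_mult(20, G))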
... In the present era, data assumes a critical role in the daily lives of individuals [1][2][3][4][5][6]. The dissemination and utilization of data [7][8][9][10][11][12][13] have created enormous opportunities for decision-making and knowledge exploration [5,[14][15][16][17][18][19][20][21][22][23]. For instance, in 2006, Netflix released a dataset comprising 100 million movie ratings to enhance its recommendation system's performance [24]. ...
Article
Full-text available
The privacy-preserving data publishing (PPDP) problem has gained substantial attention from research communities, industries, and governments due to the increasing requirements for data publishing and concerns about data privacy. However, achieving a balance between preserving privacy and maintaining data quality remains a challenging task in PPDP. This paper presents an information-driven distributed genetic algorithm (ID-DGA) that aims to achieve optimal anonymization through attribute generalization and record suppression. The proposed algorithm incorporates various components, including an information-driven crossover operator, an information-driven mutation operator, an information-driven improvement operator, and a two-dimensional selection operator. Furthermore, a distributed population model is utilized to improve population diversity while reducing the running time. Experimental results confirm the superiority of ID-DGA in terms of solution accuracy, convergence speed, and the effectiveness of all the proposed components.
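As a toy illustration of the generalization and suppression operations that such an algorithm searches over (the grouping criterion below is a simple k-anonymity-style check, an assumption for illustration rather than the paper's information-driven fitness function):

from collections import Counter

records = [
    {"age": 34, "zip": "47677", "disease": "flu"},
    {"age": 36, "zip": "47678", "disease": "cold"},
    {"age": 52, "zip": "47905", "disease": "flu"},
]

def generalize(rec, age_width=10, zip_digits=3):
    # Attribute generalization: coarsen age into a range and truncate the zip code.
    lo = (rec["age"] // age_width) * age_width
    return (f"{lo}-{lo + age_width - 1}", rec["zip"][:zip_digits] + "*" * (5 - zip_digits))

def anonymize(recs, k=2):
    groups = Counter(generalize(r) for r in recs)
    out = []
    for r in recs:
        qi = generalize(r)
        if groups[qi] >= k:                      # quasi-identifier group is large enough
            out.append({"qi": qi, "disease": r["disease"]})
        # else: record suppression - the record is dropped entirely
    return out

print(anonymize(records))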
... Despite their growing popularity, detecting and mitigating insider threats present significant challenges due to limited access to real-world insider threat datasets caused by privacy concerns [23]. Moreover, the complex interplay of activities within large organizations makes identifying insider threats challenging [24]. Furthermore, the effectiveness of machine learning models in generating predictions is compromised when dealing with imbalanced data [25,26]. ...
Article
Full-text available
Insider threats refer to abnormal actions taken by individuals with privileged access, compromising system data’s confidentiality, integrity, and availability. They pose significant cybersecurity risks, leading to substantial losses for several organizations. Detecting insider threats is crucial but difficult due to the imbalance in their datasets. Moreover, the performance of existing works has been evaluated on various datasets and problem settings, making it challenging to compare the effectiveness of different algorithms and offer recommendations to decision-makers. Furthermore, no existing work investigates the impact of changing hyperparameters. This paper aims to objectively assess the performance of various supervised machine learning algorithms for detecting insider threats under the same setting. We precisely evaluate the performance of various supervised machine learning algorithms on a balanced dataset using the same feature extraction method. Additionally, we explore the impact of hyperparameter tuning on performance within the balanced dataset. Finally, we investigate the performance of different algorithms in the context of imbalanced datasets under various conditions. We conduct all the experiments on the publicly available CERT r4.2 dataset. The results show that supervised learning with RF on a balanced dataset obtains the best accuracy and F1-score of 95.9%, compared with existing works such as DNN, LSTM Autoencoder, and User Behavior Analysis.
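A minimal scikit-learn sketch of the kind of experiment described (a random forest evaluated by F1 score on a synthetically imbalanced dataset); the dataset, features, and parameters are placeholders, not the CERT r4.2 pipeline:

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for an insider-threat dataset: about 5% positive (malicious) class.
X, y = make_classification(n_samples=5000, n_features=20, weights=[0.95, 0.05], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# class_weight='balanced' is one common way to compensate for class imbalance.
rf = RandomForestClassifier(n_estimators=200, class_weight="balanced", random_state=0)
rf.fit(X_train, y_train)
print("F1:", f1_score(y_test, rf.predict(X_test)))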
... Figure 2 represents the straggling device scenario where Device 3 takes more time, delaying global model aggregation. Research on preserving privacy has been a hot topic for many years [32][33][34]; it has also been discussed in the edge framework [35]. In EFL, the straggler issue can intensify the degradation of model performance, as the participating devices number in the millions and are uncertain about their prolonged participation as well as their complete time dedication to model training. ...
Article
To fully exploit the enormous data generated by intelligent devices in edge computing, edge federated learning (EFL) is envisioned as a promising solution. The distributed collaborative training in EFL deals with the delay and privacy issues of traditional centralized model training. However, the existence of straggling devices, which respond slowly to servers, degrades model performance. We consider data heterogeneity from two aspects: high-dimensional data generated at edge devices, where the number of features is greater than the number of observations, and the heterogeneity caused by partial device participation. With a large number of features, computation overhead on the devices increases, causing edge devices to become stragglers. Incorporating partial training results causes gradients to diverge, an effect that is further exaggerated when more training is performed to reach local optima. In this paper, we introduce elastic optimization methods for stragglers due to data heterogeneity in edge federated learning. Specifically, we define the problem of stragglers in EFL. Then, we formulate an optimization problem to be solved at edge devices. We customize a benchmark algorithm, FedAvg, to obtain a new elastic optimization algorithm (FedEN), which is applied in the local training of edge devices. FedEN mitigates stragglers by balancing lasso and ridge penalization, thereby generating sparse model updates and enforcing parameters to remain close to local optima. We have evaluated the proposed model on the MNIST and CIFAR-10 datasets. Simulated experiments demonstrate that our approach improves run-time training performance by achieving average accuracy with fewer communication rounds. The results confirm the improved performance of our approach over benchmark algorithms.
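The "balance between lasso and ridge penalization" can be sketched as an elastic-net term added to each device's local objective; the numpy sketch below penalizes the deviation of the local weights from the last global model, which is one plausible reading of how sparse, bounded local updates would be encouraged, not the exact FedEN formulation:

import numpy as np

def local_update(w_global, X, y, lr=0.01, l1=0.001, l2=0.001, epochs=50):
    """One device's local training for a linear model with an elastic-net
    penalty on (w - w_global), encouraging sparse, bounded local updates."""
    w = w_global.copy()
    for _ in range(epochs):
        grad_loss = 2 * X.T @ (X @ w - y) / len(y)           # squared-error gradient
        delta = w - w_global
        grad_pen = l1 * np.sign(delta) + 2 * l2 * delta       # lasso + ridge subgradient
        w -= lr * (grad_loss + grad_pen)
    return w

rng = np.random.default_rng(0)
X, y = rng.normal(size=(100, 5)), rng.normal(size=100)
print(local_update(np.zeros(5), X, y))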
... Xie and Sun analyzed the impact of different service stages on customer satisfaction by characterizing perceived quality [13]. Focusing on services, studies have researched cloud services [14], investigated secure M-services [15], and designed a generic Internet of Things architecture for scalable service cooperation [16]. Recommendation services for privacy-preserving tasks have also been studied [17,18]. ...
Article
Full-text available
With the rapid development of global technology, many firms have entered the product-sharing market, where business-to-consumer (B2C), peer-to-peer (P2P), and hybrid rental services are the three most common business models. The difference between these models lies in the product provider. The product provider in B2C mode is enterprises, while in P2P mode, individuals provide the products, and both enterprises and individuals provide the products in hybrid mode. We employ a game model to analyze the influence of a B2C platform introducing P2P sharing service by comparing the profit of the platform and the original equipment manufacturer (OEM) under each of the three models. Our findings indicate that introducing P2P service always benefits the B2C platform but sometimes harms the OEM. Our findings also indicate that as the quality of the platform’s products improves, the profit improvement brought to the B2C platform by introducing P2P sharing decreases. We find that the impact of introducing P2P service also varies with consumer behaviors. For example, with increasing consumer usage level, the B2C platform’s profit improvement brought by introducing P2P sharing increases. Furthermore, we determine the optimal pricing strategies of the OEM and sharing platform in B2C mode and in hybrid mode.
... In the area of enterprise architecture and enterprise application integration, much work on similar topics related to platforms for eCommerce had already been done before Industry 4.0. These works can be categorized into web services-based approaches [62,63] and various competing XML schema-based standards such as Universal Business Language (UBL), Open Applications Group Integration Specification (OAGIS), eXtensible Business Reporting Language (XBRL), XML Common Business Library (xCBL), and commerce eXtensible Markup Language (cXML) [64,65,66,67,68,69,70,71]. ...
Article
Full-text available
Industry 4.0 architecture has been studied in a large number of publications in the fields of Industrial Internet of Things, Cyber Physical Production Systems, Enterprise Architectures, Enterprise Integration and Cloud Manufacturing. A large number of architectures have been proposed, but none of them has been adopted by a large number of research groups. Two major Industry 4.0 reference architectures have been developed by industry-driven initiatives, namely the German Industry 4.0 and the US-led Industrial Internet Consortium. These are the Reference Architecture Model Industry 4.0 and Industrial Internet Reference Architecture, which are being standardized by the International Electrotechnical Commission and the Object Management Group, respectively. The first research goal of this article is to survey the literature on Industry 4.0 architectures in a factory context and assess awareness and compatibility with Reference Architecture Model Industry 4.0 and Industrial Internet Reference Architecture. The second research goal is to adapt a previously proposed advanced manufacturing concept to Reference Architecture Model Industry 4.0. With respect to the first research goal, it was discovered that only a minority of researchers were aware of the said reference architectures and that in general authors offered no discussion about the compatibility of their proposals with any internationally standardized reference architecture for Industry 4.0. With respect to the second research goal, it was discovered that Reference Architecture Model Industry 4.0 was mature with respect to communication and information sharing in the scope of the connected world, that further standardization enabling interoperability of different vendors’ technology is still under development and that technology standardization enabling executable business processes between networked enterprises was lacking.
... Web Service (WS) is a vital technology for implementing SOA to provide interoperability between heterogeneous systems and to integrate inter-organization applications [4]. WS are flexible, self-defining, and loosely coupled program code running on a remote server that can be published, discovered, and invoked by using a set of standard protocols over the Internet [5]. ...
Article
Full-text available
In service computing, the same target functions can be achieved by multiple Web services from different providers. Due to these functional similarities, the client needs to consider non-functional criteria. However, the Quality of Service values provided by the developer suffer from scarcity and a lack of reliability. In addition, the reputation of the service providers, especially those with little experience, is an important factor in selecting a service. Most previous studies focused on users' feedback to justify the selection. Unfortunately, not all users provide feedback unless they have had an extremely good or bad experience with the service. In this vision paper, we propose a novel architecture for web service discovery and selection. The core component is a machine learning based methodology to predict the QoS properties using source code metrics. The credibility value and previous usage count are used to determine the reputation of the service.
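A rough sketch of the core idea (regressing a QoS property on source-code metrics); the metric names, the response-time target, and the random-forest regressor are illustrative assumptions rather than the architecture proposed in the paper:

import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic table of per-service source-code metrics (columns are illustrative):
# [lines of code, cyclomatic complexity, coupling, number of operations]
metrics = rng.normal(loc=[500, 10, 5, 8], scale=[200, 4, 2, 3], size=(300, 4))
response_time = 50 + 0.05 * metrics[:, 0] + 3 * metrics[:, 1] + rng.normal(0, 10, 300)

model = RandomForestRegressor(n_estimators=100, random_state=0)
scores = cross_val_score(model, metrics, response_time, scoring="r2", cv=5)
print("cross-validated R^2:", scores.mean())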
... Chen, Chen, & Jan, 2007; B. Lee, Kim, & Kang; Lei, Quintero, & Pierre, 2009; Patel & Crowcroft; H. Wang, Huang, & Dodda; H. Wang, Zhang, Cao, & Kambayashi, 2004; H. Wang, Zhang, Cao, & Varadharajan, 2003). For example, Patel and Crowcroft (Patel & Crowcroft) proposed in 1997 a homeless mechanism based on the notion of tickets. ...
Article
Full-text available
Mobile authentication is an essential service to ensure the security of engaging parties in a ubiquitous wireless network environment. Several solutions have been proposed mainly based on both centralised and distributed authentication models to allow ubiquitous mobile access authentication; however, limitations still exist in these approaches, namely flexibility, security and performance issues, and vulnerabilities. These shortcomings are influenced by the resource limitations of both wireless networks and the mobile devices together with inter-technology and inter-provider challenges. In this paper, the authors reviewed the major techniques in the field of ubiquitous mobile access authentication, which has attracted many researchers in the past decade. After investigating existing mobile authentication models and approaches, the common challenges are summarised to serve as the solution key requirements. The identified key solution requirements allow analysing and evaluating mobile authentication approaches.
... Due to the rapid growth of the Internet and developments in communication, electronic commerce has become much more popular and widely used than ever [1][2][3][4][5][6][7][8]. Mobile telecommunications have developed from 2G to 3.5G. Furthermore, LTE Advanced, 4G, and 5G have been brought to market in recent years. ...
Article
Full-text available
Electronic cash (e-cash) is definitely one of the most popular research topics in the e-commerce field. It is very important that e-cash be able to maintain anonymity and accuracy in order to preserve the privacy and rights of customers. There are two types of e-cash in general, online e-cash and offline e-cash. Both systems have their own pros and cons, and they can be used to construct various applications. In this paper, we are the first to propose a provably secure and efficient offline e-cash scheme with date attachability based on the blind signature technique, where an expiration date and a deposit date can be embedded in an e-cash simultaneously. With the help of the expiration date, the bank can manage its huge database much more easily against unlimited growth, and the deposit date cannot be forged, so users are able to correctly calculate the amount of interest they will receive in the future. Furthermore, we offer security analysis and formal proofs for all essential properties of offline e-cash, which are anonymity control, unforgeability, conditional traceability, and no-swindling.
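The blind-signature idea underlying such e-cash schemes can be illustrated with textbook RSA blinding; the toy key below is far too small to be secure, and this is a generic illustration rather than the scheme proposed in the paper:

import hashlib, math

# Toy RSA key for the "bank" (illustration only - real keys are 2048+ bits).
p_, q_ = 1009, 1013
n, e = p_ * q_, 65537
d = pow(e, -1, (p_ - 1) * (q_ - 1))

def h(msg):
    return int.from_bytes(hashlib.sha256(msg.encode()).digest(), "big") % n

# 1. Customer blinds the coin so the bank cannot link it to the withdrawal.
coin = "serial-42|expiry-2025-12"
r = next(x for x in range(2, n) if math.gcd(x, n) == 1)   # blinding factor coprime to n
blinded = (h(coin) * pow(r, e, n)) % n

# 2. Bank signs the blinded value without seeing the coin.
signed_blinded = pow(blinded, d, n)

# 3. Customer unblinds; the result is an ordinary RSA signature on h(coin).
signature = (signed_blinded * pow(r, -1, n)) % n

# 4. Anyone (e.g. a merchant) can verify with the bank's public key.
print(pow(signature, e, n) == h(coin))   # True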