Figure 1. Computer Performance Measurement System [25].

Source publication
Article
Full-text available
The importance of process monitoring applications continues to grow, and many of the developments in process monitoring are driven by access to more and more data. Process monitoring is important for understanding the variation in a process and for assessing its current state. Process monitoring and controlling an organization is of high importance...
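
As a minimal, hypothetical illustration of the kind of process monitoring the abstract describes (assessing the current state of a process and detecting abnormal variation), the sketch below computes three-sigma control limits over a stream of measurements and flags out-of-control points. The data and function names are illustrative, not taken from the paper.

```python
# A minimal sketch of statistical process monitoring: an individuals
# control chart that flags observations outside three-sigma limits.
# The data and limits here are illustrative, not from the paper.
import numpy as np

def control_limits(samples: np.ndarray) -> tuple[float, float, float]:
    """Return (center line, lower limit, upper limit) for a 3-sigma chart."""
    center = samples.mean()
    sigma = samples.std(ddof=1)
    return center, center - 3 * sigma, center + 3 * sigma

def out_of_control(samples: np.ndarray) -> np.ndarray:
    """Indices of points that fall outside the control limits."""
    _, lcl, ucl = control_limits(samples)
    return np.where((samples < lcl) | (samples > ucl))[0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    process = rng.normal(loc=10.0, scale=0.5, size=100)
    process[42] = 14.0  # injected fault to demonstrate detection
    print("Out-of-control points:", out_of_control(process))
```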

Citations

... Receiving orders, creating invoices, and verifying stock levels are just a few examples of the kinds of jobs that might be included in this phase [34]. A company's activities must nearly always be carried out in accordance with a defined set of rules and procedures in order to ensure that everything runs smoothly. ...
Article
Full-text available
The use of an Enterprise Architecture Framework (EAF), which describes each of the software development processes as well as the ways in which they are integrated with one another, can assist a company in achieving the goals it has set for itself, such as maximizing profits, enhancing customer satisfaction, and increasing employee productivity. In other words, it helps companies detect defects within their systems and determine the degree to which such issues are present. Several Enterprise Architecture Frameworks have been developed, and many of them are in use today. Some of these frameworks focus on very particular aspects, while others give users greater leeway in how they are applied. The outcomes of this study present a broad range of options from which to choose an Enterprise Architecture Framework that is able to fulfill the fundamental criteria. This research presents a literature review of publications associated with the Enterprise Architecture Framework that have appeared across a range of academic fields over the last few years.
... Both of these protocols are used in the backend infrastructure that supports the World Wide Web. The software and hardware components that together constitute a web server are responsible for handling and fulfilling these requests as they come in [17]. ...
Article
Full-text available
Web servers and service-oriented architectures have become critical technologies in the growth of loosely coupled and distributed applications. Web services present new research problems because they are used to execute mission-critical and complex enterprise process structures. Architecture and infrastructure approaches are needed in this context to ensure service quality. This paper presents a current survey of different web service frameworks, such as REST web services, the standard DICOM archive, a web pathology viewer, GeoFog4Health, an E-mobility architecture, and so on. These services have been used to address issues such as performance, fog availability, privacy, multimedia services, and web service efficiency. In addition, each surveyed work is described in terms of its methods and tools, goals, challenges, field of application, and significant outcomes.
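
The survey covers REST web services among other frameworks. As a minimal sketch of the request-handling pattern the excerpt and abstract describe, the example below serves one JSON resource over HTTP using only the Python standard library; the /status endpoint and its payload are hypothetical.

```python
# A minimal sketch of a REST-style web service using only the standard
# library; the /status endpoint and its payload are hypothetical.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class StatusHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/status":
            body = json.dumps({"service": "up", "load": 0.42}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404, "unknown resource")

if __name__ == "__main__":
    # Serves on localhost:8000; stop with Ctrl+C.
    HTTPServer(("127.0.0.1", 8000), StatusHandler).serve_forever()
```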
... The performance of detection methods during the classification process can be improved (whether the classifier is human or software). In the case of end-users, their classification skills may be improved by raising their awareness of phishing attacks, whether through independent study of their own online experience or through formal training programmes [25][26][27][28]. In the case of software classification, this may be done during the training of a deep learning model or by improving a rule-based detection scheme. ...
Article
Full-text available
The COVID-19 pandemic has obliged citizens to follow the "work from home" scheme, and the Internet has become a powerful channel for social connections. People's heavy dependency on digital media opens doors to fraud. Phishing is a form of cybercrime used to rob users of their passwords for online banking, e-commerce, online schools, digital markets, and other services. Phishers create bogus websites that imitate the originals and send users spam mail. When an online user visits a fake web page via spam, phishers steal their credentials. As a result, it is important to identify such fraudulent websites before they do any harm to victims. Motivated by the ever-changing nature of phishing websites, this paper reviews work on phishing attack detection and aims to examine techniques that mainly detect and help prevent phishing attacks rather than mitigate them. We offer a general overview of the most effective phishing attack detection strategies based on deep learning.
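
As a hedged sketch of the feature-based phishing-URL detection this review surveys, the example below extracts a few commonly cited lexical URL features and trains a small neural classifier; the features, training URLs, and labels are illustrative toys, not the authors' method or data.

```python
# A minimal sketch of feature-based phishing-URL classification; the
# features, thresholds, and training data are illustrative.
from urllib.parse import urlparse
from sklearn.neural_network import MLPClassifier

def url_features(url: str) -> list[float]:
    parsed = urlparse(url)
    host = parsed.netloc
    return [
        float(len(url)),                        # long URLs are suspicious
        float(host.count(".")),                 # many subdomains
        float("@" in url),                      # '@' tricks the browser
        float(any(c.isdigit() for c in host)),  # digits in the host name
        float(parsed.scheme != "https"),        # no TLS
    ]

# Tiny illustrative training set: 1 = phishing, 0 = legitimate.
urls = [
    "http://paypa1-secure.login.example-verify.com/update",
    "http://192.168.0.1@evil.example/bank",
    "https://www.wikipedia.org/",
    "https://github.com/",
]
labels = [1, 1, 0, 0]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit([url_features(u) for u in urls], labels)
print(clf.predict([url_features("http://secure-update.bank0.example/login")]))
```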
... This is because semantic similarity refers to the degree to which two things share the same meaning (for example, string format). By making use of statistical techniques, it is possible to offer a numerical representation of the semantic links that exist between language units and the concepts or occurrences they refer to [33][34][35]. ...
Article
There has been a meteoric rise in the total amount of digital text as a direct result of the proliferation of internet access. Document clustering has consequently evolved into a crucial method for extracting relevant information from big document collections. In document clustering, documents are automatically sorted into groups whose members have a high degree of similarity to one another. Because traditional clustering approaches do not take into account the semantic linkages that exist between texts, they are unable to provide an acceptable description of a collection of texts. Document clusters in which texts are organized according to their meaning rather than their use of keywords have been widely adopted as a means of overcoming these challenges, made possible by incorporating semantic information so that related texts can be grouped together. In this investigation, we looked at 27 distinct papers published over the previous five years that cluster documents based on semantic similarity. A detailed literature evaluation is included for each of the selected publications. Comparative research is carried out on a wide variety of evaluation strategies, including algorithms, similarity metrics, instruments, and processes. An extended discussion then analyzes the similarities and differences between the works.
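
To make the clustering pipeline concrete, the sketch below groups a handful of documents with TF-IDF vectors, cosine similarity, and k-means. Note that TF-IDF is a purely lexical baseline; the surveyed semantic approaches would substitute embedding-based representations. All documents and parameters are illustrative.

```python
# A minimal sketch of document clustering; TF-IDF is a lexical baseline
# here, where the surveyed papers would substitute semantic embeddings.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

docs = [
    "phishing emails steal banking credentials",
    "malware infects mobile phones and tablets",
    "neural networks approximate physical systems",
    "deep learning accelerates scientific computing",
]

vectors = TfidfVectorizer().fit_transform(docs)
print(cosine_similarity(vectors))  # pairwise similarity matrix

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(vectors)
for doc, cluster in zip(docs, kmeans.labels_):
    print(cluster, doc)
```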
... Public clouds are accessible to anybody, whereas private clouds are accessible only to a select few. The service model, on the other hand, comprises offerings for infrastructure as a service, platform as a service, and software as a service [22]. ...
Article
Full-text available
Cloud computing has swiftly established itself as the norm in its field as a result of its advantages. In an attempt to reduce the amount of time spent on infrastructure upkeep, an increasing number of businesses are moving their operations to the cloud. As a result, maintaining a cloud environment has proved exceedingly difficult, and an efficient cloud monitoring system is necessary to reduce the administrative workload and improve cloud operation. A cloud monitoring service is beneficial because it has the potential to improve performance and simplify administration. The administration of Quality of Service (QoS) parameters for cloud-hosted, virtualized, and physical services and applications is one of the most important responsibilities of cloud monitoring. To this end, cloud management software retains a record of both actions and services, and it also performs dynamic configuration of the cloud to increase operational effectiveness. This article examines the performance of businesses as a whole, the influence that cloud-ready programs and tools have on that performance, and the advantages that may be obtained from adopting such programs and products.
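
As a minimal sketch of the QoS-monitoring loop described above (sample a metric, compare it against a service-level threshold, record violations), the example below polls CPU utilization with psutil; the metric choice, threshold, and sampling period are illustrative assumptions, not the article's design.

```python
# A minimal sketch of a QoS-monitoring loop: sample a metric, compare it
# against a threshold, and record violations. The metric source (psutil)
# and the 80% threshold are illustrative choices.
import psutil

QOS_CPU_LIMIT = 80.0   # percent; a hypothetical service-level threshold
SAMPLE_PERIOD = 5.0    # seconds between samples

def monitor(samples: int) -> list[float]:
    violations = []
    for _ in range(samples):
        cpu = psutil.cpu_percent(interval=SAMPLE_PERIOD)
        if cpu > QOS_CPU_LIMIT:
            violations.append(cpu)
            print(f"QoS alert: CPU at {cpu:.1f}% exceeds {QOS_CPU_LIMIT}%")
    return violations

if __name__ == "__main__":
    monitor(samples=12)  # roughly one minute of observation
```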
... The emergence of multiteraflop machines with thousands of processors for scientific computing combined with advanced sensory-based experimentation has heralded an explosive growth of structured and unstructured heterogeneous data in science and engineering fields. ML and DL approaches were initially introduced to scientific computing to solve the lack of efficient data modeling procedures, preventing scientists from rapidly interacting with heterogeneous complex data [4]. These approaches show transformative potential because they enable the exploration of vast design spaces, the identification of multidimensional connections, and the management of ill-posed issues [5,6,7]. ...
Preprint
Full-text available
Recent breakthroughs in computing power have made it feasible to use machine learning and deep learning to advance scientific computing in many fields, such as fluid mechanics, solid mechanics, materials science, etc. Neural networks, in particular, play a central role in this hybridization. Due to their intrinsic architecture, conventional neural networks cannot be successfully trained and scoped when data is sparse; a scenario that is true in many scientific fields. Nonetheless, neural networks offer a strong foundation to digest physical-driven or knowledge-based constraints during training. Generally speaking, there are three distinct neural network frameworks to enforce underlying physics: (i) physics-guided neural networks (PgNN), (ii) physics-informed neural networks (PiNN) and (iii) physics-encoded neural networks (PeNN). These approaches offer unique advantages to accelerate the modeling of complex multiscale multi-physics phenomena. They also come with unique drawbacks and suffer from unresolved limitations (e.g., stability, convergence, and generalization) that call for further research. This study aims to present an in-depth review of the three neural network frameworks (i.e., PgNN, PiNN, and PeNN) used in scientific computing research. The state-of-the-art architectures and their applications are reviewed; limitations are discussed; and future research opportunities in terms of improving algorithms, considering causalities, expanding applications, and coupling scientific and deep learning solvers are presented. This critical review provides a solid starting point for researchers and engineers to comprehend how to integrate different layers of physics into neural networks.
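
To illustrate the physics-informed idea in the simplest possible setting, the sketch below trains a small PyTorch network so that both the residual of a toy ODE, u'(x) + u(x) = 0, and the initial condition u(0) = 1 enter the loss. The equation, architecture, and hyperparameters are illustrative and far simpler than the multiscale, multi-physics settings the review addresses.

```python
# A minimal physics-informed neural network (PiNN) sketch in PyTorch:
# the network u(x) is trained so that the ODE residual u'(x) + u(x) = 0
# and the initial condition u(0) = 1 are both minimized. The toy ODE and
# hyperparameters are illustrative, not taken from the review.
import torch

net = torch.nn.Sequential(
    torch.nn.Linear(1, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

x = torch.linspace(0.0, 2.0, 64).reshape(-1, 1).requires_grad_(True)
x0 = torch.zeros(1, 1)  # collocation point for the initial condition

for step in range(2000):
    u = net(x)
    # du/dx via automatic differentiation: the "physics" term.
    du = torch.autograd.grad(u, x, torch.ones_like(u), create_graph=True)[0]
    residual = du + u  # enforce u' + u = 0
    loss = (residual ** 2).mean() + (net(x0) - 1.0).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# The exact solution is u(x) = exp(-x), so u(0) should approach 1.
print(f"u(0) = {net(x0).item():.3f} (target 1.0)")
```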
... They all communicate, though, in different ways and for particular reasons. Given this variation, the various modes of interaction have become essential to recognize and describe [4][5][6]. A thesis by [7], based on an analysis of semantic annotation, makes an argument for using a cloud computing framework to overcome some of its problems. ...
Article
Semantic web and cloud technology systems have become critical components in creating and deploying applications in various fields. Although they are self-contained, they can be combined in various ways to create solutions, something that has recently been discussed in depth. Recent years have seen a dramatic increase in new cloud providers, applications, facilities, management systems, data, and so on, reaching a level of complexity that indicates the need for new technology to address such vast, shared, and heterogeneous services and resources. As a result, issues with the portability, interoperability, security, selection, negotiation, discovery, and definition of cloud services and resources may arise. Semantic technologies, which have enormous potential for cloud computing, are a vital way of reexamining these issues. This paper explores and examines the role of semantic-web technology in the cloud from a variety of sources. In addition, a "cloud-driven" mode of interaction illustrates how we can construct the semantic web and provide automated semantic annotations to web applications on a large scale by leveraging the properties and advantages of cloud computing.
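
As a small, hypothetical sketch of the semantic annotation the paper discusses, the example below uses rdflib to attach machine-readable RDF triples to a cloud service description; the ex: vocabulary and resource names are invented for illustration.

```python
# A minimal sketch of machine-readable semantic annotation with rdflib;
# the vocabulary and resource names below are hypothetical examples.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

EX = Namespace("http://example.org/cloud#")
g = Graph()
g.bind("ex", EX)

service = URIRef("http://example.org/cloud#StorageService")
g.add((service, RDF.type, EX.CloudService))
g.add((service, RDFS.label, Literal("Object storage service")))
g.add((service, EX.deliveryModel, Literal("IaaS")))

# Annotations like these let brokers discover and compare services
# automatically rather than by parsing human-oriented descriptions.
print(g.serialize(format="turtle"))
```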
... Although edge computing applications generally demand low-energy tasks with real-time guarantees, the time complexity of recognizing MEPs and changing voltages frequently prohibits scheduling tasks in real time [43]. ...
Article
Full-text available
The term "Real-Time Operating System (RTOS)" refers to systems wherein the time component is critical. For example, one or more of a computer's peripheral devices send a signal, and the computer must respond appropriately within a specified period of time. Examples include: the monitoring system in a hospital care unit, the autopilot in the aircraft, and the safety control system in the nuclear reactor. Scheduling is a method that ensures that jobs are performed at certain times. In the real-time systems, accuracy does not only rely on the outcomes of calculation, and also on the time it takes to provide the results. It must be completed within the specified time frame. The scheduling strategy is crucial in any real-time system, which is required to prevent overlapping execution in the system. The paper review classifies several previews works on many characteristics. Also, strategies utilized for scheduling in real time are examined and their features compared.
... Researchers have also set up numerous tools and methods to support the analysis of Android malware. DroidScope is an example of a platform built in a virtualized environment to analyse malware [25]. ...
... The updates may include new registry keys, IP addresses, domain names, and file path locations [48]. Dynamic analysis will also indicate whether the infection is connecting to an external server controlled by a hacker [25]. It is a helpful but time-consuming analytical approach. ...
Article
Full-text available
Mobile malware is malicious software that targets mobile phones or wireless-enabled personal digital assistants (PDAs), causing the collapse of the system and the loss or leakage of confidential information. As wireless phone and PDA networks have become more common and have grown in complexity, it has become increasingly difficult to ensure their safety and security against electronic attacks in the form of viruses or other malware. Android is now the world's most popular OS, and more and more malware attacks are taking place in Android applications. Many security detection techniques based on Android apps are now available. Android applications are developing rapidly across the mobile ecosystem, but Android malware is also emerging in an endless stream. Many researchers have studied the problem of Android malware detection and have put forward theories and methods from different perspectives. Existing research suggests that machine learning is an effective and promising way to detect Android malware, and several reviews have surveyed issues related to machine-learning-based Android malware detection. The open nature of the Android environment has given it extensive appeal in recent years, and the growing number of mobile devices are incorporated into many aspects of our everyday lives. In today's digital world, most anti-malware tools are signature-based, which is ineffective for detecting advanced unknown malware. Android OS, the most prevalent operating system (OS), has enjoyed immense popularity on smartphones over the past few years. Seizing this opportunity, cybercrime occurs in the form of piracy and malware, and traditional detection does not suffice to combat newly created advanced malware. There is therefore a need for smart malware detection systems to reduce the risk of malicious activities. The present paper includes a thorough comparison that summarizes and analyses the various detection techniques.
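
As a hedged sketch of the machine-learning detection approach the survey describes, the example below one-hot encodes the sensitive permissions an app requests and trains a random forest on a toy labelled set; the permission list, samples, and labels are invented for illustration and are not from any surveyed dataset.

```python
# A minimal sketch of permission-based machine-learning malware
# detection; the permission list, feature encoding, and training samples
# are illustrative, not drawn from any real dataset.
from sklearn.ensemble import RandomForestClassifier

PERMISSIONS = [
    "SEND_SMS", "READ_CONTACTS", "INTERNET",
    "RECEIVE_BOOT_COMPLETED", "ACCESS_FINE_LOCATION",
]

def encode(app_permissions: set[str]) -> list[int]:
    """One-hot encode which sensitive permissions an app requests."""
    return [int(p in app_permissions) for p in PERMISSIONS]

# Toy training data: 1 = malware, 0 = benign.
apps = [
    {"SEND_SMS", "READ_CONTACTS", "INTERNET", "RECEIVE_BOOT_COMPLETED"},
    {"SEND_SMS", "INTERNET"},
    {"INTERNET"},
    {"INTERNET", "ACCESS_FINE_LOCATION"},
]
labels = [1, 1, 0, 0]

clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit([encode(a) for a in apps], labels)
print(clf.predict([encode({"SEND_SMS", "READ_CONTACTS", "INTERNET"})]))
```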