Chapter

Maximizing Profits and Efficiency: The Intersection of AI, Machine Learning, and Supply Chain Financial Management


Abstract

Efficiency and profitability are at the core of effective supply chain financial management. In the contemporary business landscape, the intersection of financial acumen and operational excellence is a critical driver of success. This exploration delves into the significance of maximizing profits and efficiency in supply chain financial management, guided by the transformative influence of artificial intelligence (AI) and machine learning. Supply chain financial management is defined by its critical role in allocating financial resources strategically, optimizing working capital, analyzing costs, and mitigating risks.
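As a rough illustration of the abstract's theme, a machine-learning forecast feeding a classical working-capital decision, consider the sketch below. Everything in it is invented for illustration (the demand history, the cost figures, and the choice of a simple trend model); it is not taken from the chapter.

```python
# A toy pipeline: forecast demand with a simple ML model, then size
# orders with the classical economic order quantity (EOQ) formula.
# All figures are assumptions made up for this sketch.
import numpy as np
from sklearn.linear_model import LinearRegression

months = np.arange(1, 25).reshape(-1, 1)          # 24 months of history
rng = np.random.default_rng(0)
demand = 500 + 12 * months.ravel() + rng.normal(0, 25, 24)

model = LinearRegression().fit(months, demand)    # trend-only forecaster
d_next = model.predict([[25]])[0]                 # next month's demand

D = 12 * d_next   # annualized forecast demand (units/year)
S = 150.0         # assumed fixed cost per order
H = 2.5           # assumed holding cost per unit per year
eoq = np.sqrt(2 * D * S / H)                      # EOQ = sqrt(2DS/H)
print(f"forecast {d_next:.0f} units/month -> order size {eoq:.0f} units")
```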




References
Article
Full-text available
The electrocardiogram (ECG) is one of the most common techniques for analyzing and detecting cardiac arrhythmia, adopted for its cost efficiency and simplicity. In clinical routine, ECG databases are collected daily and reviewed manually. Alongside conventional methods, various machine learning approaches have been proposed in the past few years, but these require in-depth knowledge of several parameters and pre-processing techniques in the specific domain. This study aims to implement a more reliable deep learning model with the capacity to diagnose arrhythmia from a database of 109,446 samples in 5 categories. In the proposed work, deep learning methodologies are used for the automatic diagnosis and detection of cardiac arrhythmia. The model is developed after balancing the class bias in the waveforms from the MIT-BIH arrhythmia database, whose ECG waveforms promise good accuracy. The automated predictions of the CNN and ResNet-18 architectures are compared in terms of accuracy: approximately 97.86% for the CNN and 98.14% for the improved ResNet-18. A comparative analysis between the proposed model and existing techniques is also provided, and several limitations and future opportunities are reviewed. We believe the approach can be used widely for cardiac arrhythmia prediction. Based on the results obtained, the ResNet-18 architecture is an efficient procedure that reduces the burden of training a deep convolutional neural network from scratch, resulting in a technique that is simple to use.
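For orientation, a minimal 1-D convolutional classifier of the kind compared above could look like the PyTorch sketch below. The beat length of 187 samples and the layer sizes are assumptions for illustration, not the authors' architecture.

```python
# Minimal 1-D CNN for beat-level ECG classification (5 classes).
import torch
import torch.nn as nn

class ECGCNN(nn.Module):
    def __init__(self, n_classes: int = 5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(1, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),           # global pooling over time
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):                       # x: (batch, 1, 187)
        return self.classifier(self.features(x).squeeze(-1))

beats = torch.randn(8, 1, 187)                  # 8 synthetic beats
print(ECGCNN()(beats).shape)                    # torch.Size([8, 5])
```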
Article
Full-text available
This research article proposes a wavelength-tunable absorber based on a periodic elliptical graphene metamaterial array. The recent isolation of graphene represents a substantial advancement in the field of plasmon-based electronics, spanning design, fabrication, wave distribution, inherent adaptability, and affordable prototyping. The finite difference time domain (FDTD) method is used to study the effects of different parameters. The article shows a distinct absorption peak, with a maximum absorption of 49.6% at 55.12 µm, compared to existing metamaterial absorbers. Extending the ellipse's major axis, increasing the X polarization of the incidence angle within a specific range, and raising graphene's chemical potential all increase the absorption rate of the elliptical graphene arrays. Furthermore, to evaluate the sensing capabilities of the structure accurately, the spectra are modeled under varying environmental refractive indexes, yielding a design with a sensitivity (S) of up to 15,169 nm/RIU and a corresponding figure of merit (FOM). This research facilitates the design of environmental monitors, biosensors, and optoelectronic devices based on graphene.
Article
Full-text available
The quality and quantity of medical data produced by digital devices have improved significantly in recent decades, making data generation cheap and easy. This has increased interest in Big Data and machine learning, which have found wide application in the healthcare sector. Using machine learning to train machines to classify medical cases from historical data can be a boon to medical studies. In this paper, we analyze several machine learning algorithms and classifiers used to predict diabetes based on chosen features and attributes of the dataset. The implementations of the algorithms are compared in terms of accuracy; we also use soft-voting ensemble techniques on the standardized PIMA diabetes data, for which the highest accuracy is achieved.
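The soft-voting step mentioned above is straightforward to reproduce in scikit-learn. The sketch below uses a synthetic stand-in dataset (the PIMA data is not bundled with the library), and the choice of base classifiers is an assumption.

```python
# Soft-voting ensemble: average the predicted class probabilities
# of several base classifiers and pick the argmax.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=768, n_features=8, random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

ensemble = VotingClassifier(
    estimators=[("lr", LogisticRegression(max_iter=1000)),
                ("rf", RandomForestClassifier(random_state=42)),
                ("nb", GaussianNB())],
    voting="soft",                  # average probabilities, not hard votes
)
ensemble.fit(X_tr, y_tr)
print(f"test accuracy: {ensemble.score(X_te, y_te):.3f}")
```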
Article
Full-text available
The increasing number of private cars, public transportation vehicles, and pedestrians, together with the absence of adequate space for these ground amenities, is among the primary causes of traffic congestion and accidents in the Kathmandu Valley. Investigations indicate that the Kathmandu Valley has the highest rate of traffic accidents despite the heavy presence of the government and its agencies there, and most injuries involve teens and young adults using motor vehicles. The study's primary objective is to foresee and prevent such complications by planning sufficient subsurface infrastructure (a cut-and-cover rectangular tunnel) for the Kathmandu Valley's transportation network. The overlying pressure, lateral earth pressure, live load, uplift pressure, and live surcharge are some of the forces acting on the tunnel, creating distinct stress and moment zones. The tunnel meets the following geometric requirements: each of the tunnel's two cells has a clear span of 10 m and a clear height of 5.5 m; the side walls, inner walls, top slab, and bottom slab are all 700 mm thick; and soil has built up to a height of 4 m over the tunnel's roof. The analytical method is used in the analysis of the tunnel segment. Furthermore, the designed tunnel has been evaluated for stability, considering deflection and shear resistance; the analysis indicates that the tunnel meets the stability requirements, implying the structure can withstand the applied forces without excessive deflection. Non-linear dynamic time history analyses were computed for the El Centro and Gorkha earthquakes: the maximum displacement of the modeled structures was 23.63 mm at 10.59 s for the El Centro earthquake and 16 mm at 0.19 s for the Gorkha earthquake.
Article
Full-text available
Climate change (CC) is one of the greatest threats to human health, safety, and the environment. Given its current and future impacts, numerous studies have employed computational tools (e.g., machine learning, ML) to understand, mitigate, and adapt to CC. This paper therefore comprehensively analyzes the research and publication landscape of machine learning for climate change (MLCC) based on documents published in Scopus. The high productivity and research impact of MLCC have produced highly cited works ranging from science, technology, and engineering to the arts, humanities, and social sciences. The most prolific author is Shamsuddin Shahid (based at Universiti Teknologi Malaysia), whereas the Chinese Academy of Sciences is the most productive affiliation in MLCC research. The most influential countries are the United States and China, which is attributed to the funding activities of the National Science Foundation and the National Natural Science Foundation of China (NSFC), respectively. Collaboration through co-authorship in high-impact journals such as Remote Sensing was also identified as an important factor in the high productivity of the most active stakeholders researching MLCC topics worldwide. Keyword co-occurrence analysis identified four major research hotspots/themes in MLCC research, covering ML techniques, potentially at-risk sectors, remote sensing, and the sustainable-development dynamics of CC. In conclusion, the paper finds that MLCC research has significant socio-economic, environmental, and research impact, pointing to increased discoveries, publications, and citations in the near future.
Article
Full-text available
The publication trends and bibliometric landscape of research on the applications of machine and deep learning in energy storage (MDLES) were examined in this study based on documents published in the Elsevier Scopus database between 2012 and 2022. The PRISMA technique, employed to identify, screen, and filter related publications on MDLES research, recovered 969 documents comprising articles, conference papers, and reviews published in English. The results showed that the publication count on the topic increased from 3 to 385 (a 12,733.3% increase) along with citations between 2012 and 2022. The high publication and citation rate was ascribed to the research impact of MDLES, co-authorships/collaborations, and the source journals' reputation, multidisciplinary nature, and research funding. The most prolific researcher, institution, country, and funding body in MDLES research are Yan Xu, Tsinghua University, China, and the National Natural Science Foundation of China, respectively. Keyword occurrence analysis revealed three clusters or hotspots based on machine learning, digital storage, and energy storage. Further analysis of the research landscape showed that MDLES research is currently and largely focused on the application of machine/deep learning for predicting, operating, and optimising energy storage, as well as the design of energy storage materials for renewable energy technologies such as wind and solar PV. However, future research will presumably focus on advanced energy materials development, operational systems monitoring and control, and techno-economic analysis to address challenges in energy efficiency analysis and in the pricing, trading, and revenue prediction of renewable electricity.
Article
Full-text available
The paper proposes microwave metamaterial-based sensors for coronavirus infection detection. A sensor has been developed to calculate the electromagnetic wave's coefficients of transmission (S21) and reflection (S11) at the resonance frequency. Computer simulation technology (CST) tools have been used to create the system and evaluate its behavior. The sensor relies on electromagnetic interaction with a blood sample from a person who has COVID-19, watching for a shift in the resonant frequency that serves as a sign of infection. Blood samples from infected people showed a 740-MHz frequency shift compared with samples from healthy people, a promising way to detect COVID-19. Extensive parametric analyses have been performed. In this work, various tools such as CST, HFSS, and ADS are used to support the findings, and their results align well with each other. Lastly, the field distribution of the indicated arrangement is also examined.
Chapter
Full-text available
AI has had a substantial influence on image processing, enabling cutting-edge methods and applications. This chapter covers the foundations of image processing, including representation, formats, enhancement methods, and filtering. It digs into machine learning methods, neural networks, optimization strategies, digital watermarking, picture security, cloud computing, image augmentation, and data pre-treatment methods. The impact of cloud computing on platforms, performance, privacy, and security is also covered. The chapter's consideration of future trends and applications emphasises the substantial contributions AI has made to image processing, as well as the ethical and societal ramifications of this technology.
Article
Full-text available
In this work, two different combinations of materials are prepared, and the effects of dual doping in Ca1−x−yGdxSrxMnO3 and Ca1−x−yCexSrxMnO3 (x = 0, 0.025, 0.05; y = 0, 0.025, 0.05) materials are evaluated; their parameters are optimized and predicted with the Box-Behnken design in the RSM method. The activation energy was measured for different thermoelectric material concentrations. The RSM design is validated using a hybrid DBN-RSO. The results show that increasing the temperature and the amount of doping decreases the thermal conductivity (k) and increases the electrical conductivity (σ) and power factor (PF). A higher figure of merit was also reached by increasing the amount of doping and the temperature. At 1000 K, the Ca0.95Gd0.05Sr0.05MnO3 material has low thermal conductivity and the highest figure of merit (ZT) value of 0.24, which exceeds that of Ca0.95Ce0.05Sr0.05MnO3. The values predicted by the DBN-RSO method are close to the experimental observations, with the highest predicted ZT being 0.26. A regression value of 99% between experimental and predicted values shows the confidence and fitness of the model, and the DBN-RSO achieves results closest to the experimental design with the lowest error value.
Article
Full-text available
Humanity’s quest for safe, resilient, and liveable cities has prompted research into the application of computational tools in the design and development of sustainable smart cities. Thus, the application of artificial intelligence in sustainable smart cities (AISC) has become an important research field with numerous publications, citations, and collaborations. However, scholarly works on publication trends and the research landscape on AISC remain lacking. Therefore, this paper examines the current status and future directions of AISC research. The PRISMA approach was selected to identify, screen, and analyse 1,982 publications on AISC from Scopus between 2011 and 2022. Results showed that the number of publications and citations rose from 2 to 470 and 157 to 1,540, respectively. Stakeholder productivity analysis showed that the most prolific author and affiliation are Tan Yigitcanlar (10 publications and 518 citations) and King Abdulaziz University (23 publications and 793 citations), respectively. Productivity was attributed to national interests, research priorities, and national or international funding. The largest funder of AISC research is the National Natural Science Foundation of China (126 publications or 6.357 percent of the total publications). Keyword co-occurrence and cluster analyses revealed 6 research hotspots on AISC: digital innovation and technologies; digital infrastructure and intelligent data systems; cognitive computing; smart sustainability; smart energy efficiency; nexus among artificial intelligence, Internet of Things, data analytics and smart cities. Future research would likely focus on the socio-economic, ethical, policy, and technical aspects of the topic. It is envisaged that global scientific interest in AISC research and relevant publications, citations, products, and services will continue to rise in the future.
Article
Full-text available
Logistic regression is a commonly used classification algorithm in machine learning. It categorizes data into discrete classes by learning from a given set of labeled data: it learns a linear relationship from the dataset and then introduces nonlinearity through an activation function to determine a hyperplane that separates the learning points into two subclasses. For logistic regression, the logistic function is the most widely used activation function for binary classification, a choice justified by its ability to transform any real number into a probability between 0 and 1. This study provides, through two different approaches, a rigorous statistical answer to a crucial question: where does the logistic function, on which most neural network algorithms are based, come from? It also determines the computational cost of logistic regression using theoretical and experimental approaches.
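The two objects studied here, the logistic function and the cost of fitting it, are small enough to show directly. The sketch below is a generic from-scratch gradient-descent fit, not the paper's derivation; the data, learning rate, and epoch count are made up.

```python
# Logistic function and batch gradient descent on the log-loss.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))         # maps any real to (0, 1)

def fit_logistic(X, y, lr=0.1, epochs=500):
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        p = sigmoid(X @ w)                   # predicted probabilities
        w -= lr * X.T @ (p - y) / len(y)     # gradient of the log-loss
    return w

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)    # linearly separable labels
print("learned weights:", fit_logistic(X, y).round(3))
```

Each epoch costs O(n·d) for n samples and d features, which is the kind of computational accounting the paper formalizes.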
Article
Full-text available
Twitter and Facebook are popular among college educators, and the use of social media in institutions of higher learning has been the subject of study. Social media has opened new avenues of contact, collaboration, and participation between students and teachers. Understanding whether students and educators will adopt such tools requires insight into the factors that shape their propensity to use them. Using the Technology Acceptance Model (TAM) framework, which highlights perceived ease of use, perceived usefulness, and behavioral intention to use new technologies, this paper investigates the extent to which Nigerians are adopting social networking media for e-learning. The quantitative study used surveys, with teachers and students from four different Nigerian schools participating. Structural equation modeling (SEM) was used to test the suggested model factors. Intentions of students and faculty at Nigerian institutions to utilize social media for e-learning were shown to be impacted by perceived ease of use and perceived usefulness. The research is limited in that it offers no insight into interactive factors such as interaction with research group members and peers, interaction with supervisors or lecturers, engagement, or active collaborative learning. Index Terms: technology acceptance model (TAM), perceived ease of use, higher education, social media networking.
Chapter
Full-text available
Software-defined networking (SDN) is a modern networking architecture that employs software-based controllers to control and interact with the underlying hardware devices that direct traffic on a network. It differs from conventional networking by centralizing control over the routing of data packets. Ad hoc networks (AHNs) are widely used networks in which spontaneous connectivity among nodes is needed to communicate useful information quickly to the target audience. Nodes in AHNs, built to function in an infrastructure-less environment, can freely form groups among themselves and launch wireless multi-hop communication without any centralized access point; every node can communicate directly with the others and participate in relaying data packets. Routing in AHNs is difficult and faces specific constraints of wireless transmission, such as frequently changing topology, self-organizing behavior, wireless link fluctuation, and the resource-constrained nature of nodes. Applying SDN technology when designing routing protocols for the various application needs of upcoming AHN scenarios is crucial for improved network management and reduced overall communication cost. SDN-based routing protocols shift routing choices from the basic network elements to the controller, which helps identify the shortest route with minimum latency and rapidly reduce control-packet exchange. This article first proposes the various network structures that rely on SDN technology for competent message transmission in mobile AHNs, and then surveys SDN-based routing protocols from different branches of AHNs, with the methodology used and the advantages and disadvantages of each. This helps researchers enhance them further to meet the requirements of various application scenarios.
Article
Full-text available
Tourism is a main source of economic growth and a creator of employment of various kinds, and Rajasthan ranks fifth among the top 10 domestic tourist destinations. The main focus of this paper is to investigate the influence of international tourists' satisfaction on future behavioral intentions, with special reference to the desert triangle of Rajasthan. The holiday satisfaction (HOLSAT) model was used to construct the survey method applied in this study. According to the findings, tourist holiday satisfaction is particularly high for attraction qualities at numerous tourist sites in Rajasthan's desert triangle. The HOLSAT model was used to calculate the average means of experience and expectation, and the results show that the attributes classed under the attractions, accessibility, and activities components are those that foreign visitors value most. International tourists, on the other hand, give the lowest scores to qualities classed under the amenities and accommodation components, indicating that international tourism there still has a long way to go. The tourism sector will therefore need to guide human resources to curate and deliver these experiences.
Article
Full-text available
Alongside many other countries around the world, India has fought voraciously against the coronavirus pandemic of 2019 (COVID-19). Of particular interest in this paper are agriculture supply chains in India, where the harvesting period of rabi crops coincided with the COVID-19 outbreak. The plentiful harvest during the pandemic caused many states in India to face unprecedented challenges in the harvesting and post-harvesting tasks of agriculture supply chains. Customarily, harvesting utilises transient labour from different locations in India, yet during the lockdown many workers in the agriculture industry relocated back to their hometowns, leading to labour shortages and halting harvesting activities. Besides the shortage of labour, this paper also identifies pertinent challenges relating to climate change, occupational health and safety, pricing and revenue, and transportation. The legislature is a significant stakeholder that needs to proactively devise frameworks to overcome agriculture supply chain challenges, and suggestions for agriculture supply chain management during global pandemics are offered herein.
Chapter
Full-text available
COVID-19 poses a significant threat to healthcare, especially for disadvantaged populations, given the inadequate condition of public health services and people's lack of financial means to obtain healthcare. The primary intention of this research was to analyze trends in total daily confirmed cases of the novel coronavirus (COVID-19) in countries of Africa and Asia. The study utilized daily recorded time series (52 observations) obtained from the World Health Organization (WHO) and the Worldometer website. Univariate ARIMA models were employed, with STATA 14.2 and Minitab 14 statistical software used for the analysis and hypotheses tested at the 5% significance level. Over the time frame studied, all four series are non-stationary in level and become stationary after first differencing. The results revealed that the appropriate time series (ARIMA) models for Ethiopia, Pakistan, India, and Nigeria were a moving average of order 2, ARIMA(1, 1, 1), ARIMA(2, 1, 1), and ARIMA(1, 1, 2), respectively.
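Fitting one of the named models takes a few lines with statsmodels. The sketch below fits an ARIMA(1, 1, 1), the order reported for Pakistan, to a synthetic daily-count series; the WHO data itself is not reproduced here.

```python
# ARIMA(p=1, d=1, q=1): one autoregressive term, one differencing
# pass to remove the trend, one moving-average term.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(1)
cases = np.cumsum(rng.poisson(5, size=52)).astype(float)  # 52 daily totals

fit = ARIMA(cases, order=(1, 1, 1)).fit()
print(fit.summary().tables[1])               # coefficient estimates
print("next 7 days:", fit.forecast(steps=7).round(1))
```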
Article
Full-text available
Epidemic outbreaks are a special case of supply chain (SC) risks, distinctively characterized by long-term disruption existence, disruption propagation (i.e., the ripple effect), and high uncertainty. We present the results of a simulation study that opens new research tensions on the impact of COVID-19 (SARS-CoV-2) on global SCs. First, we articulate the specific features that frame epidemic outbreaks as a unique type of SC disruption risk. Second, we demonstrate how a simulation-based methodology can be used to examine and predict the impacts of epidemic outbreaks on SC performance, using the example of COVID-19 and the anyLogistix simulation and optimization software. We offer an analysis for observing and predicting both short-term and long-term impacts of epidemic outbreaks on SCs, along with managerial insights. A set of sensitivity experiments for different scenarios illustrates the model's behavior and its value for decision-makers. The major observation from the simulation experiments is that the timing of the closing and opening of facilities at different echelons, rather than an upstream disruption duration or the speed of epidemic propagation, might become the major factor determining the epidemic outbreak's impact on SC performance. Other important factors are lead time, the speed of epidemic propagation, and the upstream and downstream disruption durations in the SC. The outcomes of this research can be used by decision-makers to predict the operative and long-term impacts of epidemic outbreaks on SCs and to develop pandemic SC plans. Our approach can also help identify the successful and flawed elements of risk mitigation/preparedness and recovery policies in case of epidemic outbreaks. The paper concludes by summarizing the most important insights and outlining a future research agenda.
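The central observation, that closure timing can outweigh closure duration, can be reproduced qualitatively with a far simpler model than anyLogistix. The toy simulation below, with invented demand and capacity figures, compares the same 30-day upstream closure placed at a demand peak versus a demand trough.

```python
# Discrete-time toy: one supplier feeding one retailer under
# seasonal demand; report the service level (demand actually met).
import numpy as np

def simulate(close_start, close_len=30, days=180, capacity=120):
    rng = np.random.default_rng(7)
    t = np.arange(days)
    demand = 80 + 40 * np.sin(2 * np.pi * t / 90) + rng.normal(0, 5, days)
    demand = demand.clip(min=0)
    inventory, served = 200.0, 0.0
    for day in range(days):
        if not (close_start <= day < close_start + close_len):
            inventory += capacity            # supplier shipment arrives
        sales = min(inventory, demand[day])  # lost sales if short
        inventory -= sales
        served += sales
    return served / demand.sum()

print(f"closure at demand trough: {simulate(close_start=55):.2%}")
print(f"closure at demand peak:   {simulate(close_start=10):.2%}")
```

With these invented numbers the same closure length produces a far larger service-level loss when it coincides with the demand peak, echoing the paper's point about timing.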
Article
Full-text available
The impact of digitalisation and Industry 4.0 on the ripple effect and disruption risk control analytics in the supply chain (SC) is studied. The research framework combines the results from two isolated areas, i.e. the impact of digitalisation on SC management (SCM) and the impact of SCM on the ripple effect control. To the best of our knowledge, this is the first study that connects business, information, engineering and analytics perspectives on digitalisation and SC risks. This paper does not pretend to be encyclopedic, but rather analyses recent literature and case-studies seeking to bring the discussion further with the help of a conceptual framework for researching the relationships between digitalisation and SC disruptions risks. In addition, it emerges with an SC risk analytics framework. It analyses perspectives and future transformations that can be expected in transition towards cyber-physical SCs. With these two frameworks, this study contributes to the literature by answering the questions of (1) what relations exist between big data analytics, Industry 4.0, additive manufacturing, advanced trace & tracking systems and SC disruption risks; (2) how digitalisation can contribute to enhancing ripple effect control; and (3) what digital technology-based extensions can trigger the developments towards SC risk analytics.
Conference Paper
Full-text available
Screening the existing literature on Supply Chain Risk Management (SCRM) shows that only sporadic attention is paid to real data-driven SCRM; most tools and approaches lead to expert-knowledge-based SCRM. With the rise of digitalization in supply chains, leading to Industry 4.0 (I4.0), there is huge potential in building a data-driven, smart SCRM (SSCRM). To speed up research in this direction, it is worthwhile to define a new research framework that gives direction. To create a consistent framework and define smart SCRM in more detail, a literature review is conducted to select appropriate dimensions, such as SCRM phases, readiness stages of digitalization/I4.0, and SC perspectives describing the degree of SC collaboration. The SCRM and I4.0 dimensions are then put into focus, describing what impact I4.0 will have on SCRM and the resulting future requirements. The new framework serves as a basis for future SSCRM research: it helps categorize research projects along multiple dimensions and identify potential research gaps. The developed SSCRM requirements framework is also a practical tool guiding requirement specification when designing a company-specific SSCRM system.
Article
Full-text available
This paper provides an input-output method to estimate worldwide economic impacts generated by supply chain disruptions. The method is used to analyse global economic effects due to the disruptions in the automotive industry that followed the Japanese earthquake and the consequent tsunami and nuclear crisis of March 2011. By combining a mixed multi-regional input-output model, the World Input-Output Database and data at the factory level, the study quantifies the economic impacts of the disruptions broken down by country and industry. The results show that the global economic effect (in terms of value added) of this disruption amounted to US$139 billion. The most affected (groups of) countries were Japan (39%), the USA (25%), China (8%) and the European Union (7%). The most strongly affected industries were transport equipment (37%), other business activities (10%), basic and fabricated metals (8%), wholesale trade (7%) and financial intermediation (4%).
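The workhorse behind such estimates is the Leontief inverse: with a technical-coefficient matrix A and a final-demand change Δd, the induced change in gross output is Δx = (I - A)^(-1) Δd. A three-sector toy version, with a coefficient matrix and shock invented purely for scale:

```python
# Leontief input-output: propagate a final-demand shock through
# inter-industry linkages. A and the shock are illustrative only.
import numpy as np

A = np.array([[0.10, 0.30, 0.05],     # input shares between 3 sectors
              [0.20, 0.05, 0.10],
              [0.05, 0.15, 0.10]])
d_shock = np.array([-10.0, 0.0, 0.0])  # final demand drops in sector 1

L = np.linalg.inv(np.eye(3) - A)       # Leontief inverse
dx = L @ d_shock                       # induced gross-output change
print("output change by sector:", dx.round(2))
print("total output change:", round(dx.sum(), 2))
```

The paper's mixed multi-regional model scales this same mechanics up to the country-by-industry detail of the World Input-Output Database.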
Article
Full-text available
Bioinformatics algorithms have a wide range of implementations. In the push for program speed, many applications take heuristic approaches to reduce running time; one of the most critical shortcomings of this technique is the loss of optimality, i.e. the desired results may not always be found. To overcome this problem, many different hardware architectures have been tried for bioinformatics algorithms, such as the Cell Broadband Engine, clusters, and the compute unified device architecture (CUDA), where the main technique for obtaining high performance is to parallelize the task to run simultaneously on multiple vector execution units (single instruction, multiple data) and on multiple processors (multiple instruction, multiple data). This paper presents a survey of data-intensive bioinformatics applications on a variety of parallel architectures available for accelerating the processing of large biological datasets.
Article
Full-text available
Purpose The purpose of this paper is to review the literature to describe the current practices and research trends in managing supply chains in crisis. This paper also provides directions for future research in supply chain crisis management. Design/methodology/approach Articles published prior to August 2008 are analyzed and classified. Findings A unique five‐dimensional framework to classify the literature is provided. The study reveals that there has been extensive research done in this area in recent years. Much of the research is focused on proactive approaches to crisis in supply chains. Management during various internal crises such as supplier bankruptcy or loss of key clients is a new, challenging area that requires further investigation. Research limitations/implications This paper does not include articles that are not peer‐reviewed. Practical implications This paper will serve as a guide to supply chain managers who would like to know how crises, disasters, and disruptions in supply chains have been handled in existing academic literature. Originality/value To the best of the authors' knowledge, this is the first literature review in the area of managing supply chains during crisis that looks at both SCM and operations research/management science journals. This paper identifies the various methods that have been used to handle crisis situations and provides a framework to classify the literature. Additionally, this paper identifies gaps in the literature that can provide ideas for future research in this area.
Article
Full-text available
Supply managers must manage many risks in their increasingly competitive environments. Traditionally this meant buffering against uncertainties, which sub-optimized operational performance. Risk management can be a more effective approach to dealing with these uncertainties by identifying potential losses. This conceptual study proposes that situational factors (the degree of product technology, security needs, the relative importance of the supplier, and the purchaser's prior experience with the situation) should be taken into consideration when determining the level of risk management in the supply chain. Doing so can avoid unforeseen losses and lead to better anticipation of risks.
Article
Full-text available
In recent years the issue of supply chain risk has been pushed to the fore, initially by fears related to possible disruptions from the much publicised “millennium bug”. Y2K passed seemingly without incident, though the widespread disruptions caused by fuel protests and then Foot and Mouth Disease in the UK, and by terrorist attacks on the USA have underlined the vulnerability of modern supply chains. Despite increasing awareness among practitioners, the concepts of supply chain vulnerability and its managerial counterpart supply chain risk management are still in their infancy. This paper seeks to identify an agenda for future research and to that end the authors go on to clarify the concept of supply chain risk management and to provide a working definition. The existing literature on supply chain vulnerability and risk management is reviewed and compared with findings from exploratory interviews undertaken to discover practitioners' perceptions of supply chain risk and current supply chain risk management strategies.
Article
Full-text available
When major disruptions occur, many supply chains tend to break down and take a long time to recover. However, not only can some supply chains continue to function smoothly, they also continue to satisfy their customers before and after a major disruption. Some key differentiators of these supply chains are cost-effective and time-efficient strategies. In this paper, certain "robust" strategies are presented that possess two properties. First, these strategies will enable a supply chain to manage the inherent fluctuations efficiently regardless of the occurrence of major disruptions. Second, these strategies will make a supply chain become more resilient in the face of major disruptions. While there are costs for implementing these strategies, they provide additional selling points for acquiring and retaining apprehensive customers before and after a major disruption.
Article
In Industry 4.0, integrating IIoT and smart manufacturing is crucial for high-quality, efficient, and cost-effective production. However, the performance of IIoT systems can be hindered by unevenly distributed ESPs. To tackle this challenge, we propose an optimized embedded system for edge intelligence and smart manufacturing, leveraging digital twin technology. Our approach employs a digital twin-assisted alliance game resource optimization strategy to jointly optimize multi-dimensional resource allocation, including bandwidth, computing, and caching resources, while considering constraints like maximum delay. The optimization problem maximizes edge terminal utility and ESP utility, transformed into a convex optimization problem with linear constraints. An approximate optimal solution is obtained through an alternating iterative method. Simulation results demonstrate significant enhancements in resource utilization efficiency compared to baseline schemes like Nash equilibrium and large coalition. The proposed scheme is ideal for large-scale edge intelligence and smart manufacturing systems, with benefits increasing alongside the number of ESPs.
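The convex step in this abstract can be miniaturized: allocate a single shared resource across terminals to maximize a concave utility. The sketch below solves a weighted log-utility bandwidth split with SciPy; the weights, budget, and utility form are stand-in assumptions, not the paper's game-theoretic model.

```python
# Concave utility maximization under a budget: maximize
# sum_i w_i * log(x_i)  subject to  sum_i x_i = budget, x_i > 0.
import numpy as np
from scipy.optimize import minimize

w = np.array([1.0, 2.0, 0.5, 1.5])      # terminal priority weights
budget = 10.0                            # shared bandwidth units

res = minimize(
    lambda x: -np.sum(w * np.log(x)),    # negate: minimize convex objective
    x0=np.full(4, budget / 4),
    bounds=[(1e-6, None)] * 4,
    constraints=[{"type": "eq", "fun": lambda x: x.sum() - budget}],
)
print("allocation:", res.x.round(3))     # proportional to the weights w
```

The optimum lands at the weighted-proportional split, which is the closed-form answer for log utilities; an alternating iterative method like the paper's would matter once multiple coupled resources (bandwidth, compute, caching) are optimized jointly.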
Article
Algeria is one of the Maghreb countries most affected by wildfires, whose economic, environmental, and societal consequences can last several years. Such disasters can often be avoided if detection of the outbreak is fast, reliable, and early. The lack of datasets has limited wildfire prediction methods in Algeria to risk-area mapping, which is updated annually. This study builds on a recently available dataset recording the history of forest fires in the cities of Bejaia and Sidi Bel-Abbes during 2012. Because the dataset is small, we used principal component analysis to reduce the number of variables to 6 while retaining 96.65% of the total variance. We then developed an artificial neural network (ANN) with two hidden layers to predict wildfires in these cities, and trained and compared its performance with Logistic Regression, K Nearest Neighbors, Support Vector Machine, and Random Forest classifiers using 10-fold stratified cross-validation. The experiment shows a slight superiority of the ANN classifier over the others in terms of accuracy, precision, and recall: it achieves an accuracy of 0.967±0.026 and an F1-score of 0.971±0.023. The SHAP technique revealed the importance of the features (RH, DC, ISI) in the ANN model's predictions.
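This pipeline (standardize, reduce to 6 principal components, train a two-hidden-layer network, score with 10-fold stratified cross-validation) maps directly onto scikit-learn. The data below is a synthetic stand-in for the Bejaia/Sidi Bel-Abbes records, and the layer widths are assumptions.

```python
# PCA + two-hidden-layer MLP, evaluated with stratified 10-fold CV.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import StratifiedKFold, cross_val_score
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=244, n_features=10, random_state=0)

clf = make_pipeline(
    StandardScaler(),
    PCA(n_components=6),                        # keep 6 components
    MLPClassifier(hidden_layer_sizes=(8, 4),    # two hidden layers
                  max_iter=2000, random_state=0),
)
scores = cross_val_score(clf, X, y, cv=StratifiedKFold(n_splits=10))
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```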
Article
The current milieu encourages rapid growth of wireless communication, multimedia applications, robotics, and graphics, demanding efficient utilization of resources with high-throughput, low-power digital signal processing (DSP) systems. In DSP systems ranging from audio/video signal processing to wireless sensor networks, floating-point matrix multiplication is used at wide scale in most fundamental processing units. Hardware implementation of floating-point matrix multiplication demands a colossal number of arithmetic operations, which limits speed and consumes more area and power. DSP systems essentially use two techniques to reduce dynamic power consumption, pipelining and parallel processing, which need high-performance processing elements with less area and low power across diverse scientific computing applications. However, the number of adders and multipliers used in the design of the floating-point unit increases correspondingly, and these are the most area-, delay-, and power-consuming data-path elements in the processing unit. Arithmetic-level reduction of delay, power, and area in the processing element is achieved by selecting appropriate adders and multipliers. This article proposes a parallel multiplication architecture using Strassen and Urdhva Tiryagbhyam multipliers, involving the design of efficient parallel matrix multiplication with a flexible FPGA (Field Programmable Gate Array) implementation to analyse computation and area. The design incorporates scheduling of blocks, operations on processing elements, block size determination, parallelization, and double buffering for storage of matrix elements.
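Strassen's trick, seven recursive sub-multiplications instead of eight, is easiest to see in software before worrying about FPGA mapping. Below is a minimal recursive version for power-of-two sizes; the article's hardware scheduling and buffering are out of scope here.

```python
# Strassen multiplication: 7 sub-products per recursion level
# instead of the naive 8, giving O(n^2.807) arithmetic complexity.
import numpy as np

def strassen(A, B):
    n = A.shape[0]
    if n == 1:
        return A * B
    m = n // 2
    a, b, c, d = A[:m, :m], A[:m, m:], A[m:, :m], A[m:, m:]
    e, f, g, h = B[:m, :m], B[:m, m:], B[m:, :m], B[m:, m:]
    p1 = strassen(a, f - h); p2 = strassen(a + b, h)
    p3 = strassen(c + d, e); p4 = strassen(d, g - e)
    p5 = strassen(a + d, e + h); p6 = strassen(b - d, g + h)
    p7 = strassen(a - c, e + f)
    top = np.hstack([p5 + p4 - p2 + p6, p1 + p2])
    bottom = np.hstack([p3 + p4, p1 + p5 - p3 - p7])
    return np.vstack([top, bottom])

A, B = np.random.rand(4, 4), np.random.rand(4, 4)
print(np.allclose(strassen(A, B), A @ B))   # True
```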
Chapter
Rapid technological development has brought a boom in autonomous systems, which until recently could not adapt to real-life conditions without human interaction. Such systems free people from added activities that are mundane, repetitive, and low-value. Unmanned aerial vehicles (UAVs) can be used in boring, dusty, and hazardous environments where human health and welfare are at significant risk, and autonomous systems deliver efficiencies in the supply chain, tracking, and hazardous climate control. To improve the UAV's effectiveness, this chapter proposes combining the compression and track architecture; in addition, the UAV and pipe can increase the friction of the track wheel. The cleaning process starts when the sensors detect dirt; if the sensors do not detect dirt, the washing mechanism is not triggered. The focus of this chapter is the automation of a drainage cleaning system. Device automation is required to address the mobility and space challenge, and this chapter proposes the effective use of this method to manage waste disposal and routine waste filtration.
Chapter
This paper describes how morphological component analysis, deep learning, and steganography can be used to safeguard the dissemination, recognition, and verification of text-based pictures over an Internet of Things-based link. Morphological component analysis is used to extract features from text-based images, and each of these features may have different morphological components. The morphological component technique reduces redundant and statistically independent features, bringing the bit-level dimensionality of a text-based picture down while preserving pictorial performance. Rather than relying on a single temporal feature descriptor, the ability to obtain detailed text-based picture texture characteristics (Guo et al., Proceedings of the Ninth IEEE International Conference on Computer Vision (ICCV), Nice, France, 2003) proves to be an essential characteristic of the indicated methodology. For constructing the morphological components, four different texture aspects (including horizontal and vertical) are postulated throughout the paper. Before being transmitted over Internet of Things-based networks, the morphological portions of a concealed text-based picture are scattered and embedded into the least significant bit (LSB) of cover pixels using spatial obfuscation techniques. To fetch the message from the stego images, a concealed key from the embedding process is shared with the text retrieval method at the recipient's edge. Finally, a combined convolutional neural network approach is used to recognize the embedded text-based message, and an optimization technique is employed to improve the effectiveness of the hybrid convolutional neural network (HCNN). The findings of the analyses reveal that the visual and statistical measures of the cover image remain similar before and after embedding the morphological components of a textual image, not only removing the possibility of a highly confidential message being found but also strengthening the encrypted system. An experiment demonstrated that the recommended technique, which integrates morphological component analysis with Daubechies filters and steganography, outperforms Haar filtering methods in terms of peak signal-to-noise ratio, structural similarity index, and accuracy.
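The spatial embedding step named above, least-significant-bit insertion, is compact enough to show on its own. In the sketch below the morphological decomposition and the HCNN recognizer are omitted, and the cover image is random noise rather than a real picture.

```python
# LSB steganography: each message bit overwrites the lowest bit of
# one cover pixel, changing pixel values by at most 1.
import numpy as np

def embed_lsb(cover, message: bytes):
    bits = np.unpackbits(np.frombuffer(message, dtype=np.uint8))
    flat = cover.ravel().copy()
    flat[:bits.size] = (flat[:bits.size] & 0xFE) | bits
    return flat.reshape(cover.shape)

def extract_lsb(stego, n_bytes: int):
    bits = stego.ravel()[: n_bytes * 8] & 1
    return np.packbits(bits).tobytes()

cover = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
stego = embed_lsb(cover, b"hidden")
print(extract_lsb(stego, 6))                 # b'hidden'
```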
Article
This paper develops an effective encryption- and steganography-based text extraction in IoT using a deep learning method. Initially, the input text and cover images are pre-processed separately, and the discrete cosine transform (DCT) is utilized to transfer the image from the spatial domain to the frequency domain. The original text is then encrypted using a new optimized equilibrium-based homomorphic encryption (OEHE) approach. Next, the extended wavelet convolutional transient search (EWCTS) optimizer with quotient multi-pixel value differencing (QMPVD) is developed to embed the secret text in cover images. At the receiver side, the reverse process for encryption and steganography is executed with the secret key provided by the sender, and the accurate text is extracted using a steganalysis process. The developed approach is implemented in MATLAB, and various evaluation metrics are used to validate its effectiveness. Simulation outcomes show that the suggested technique provides better results than existing approaches.
Chapter
IoT solutions enable customers to automate, analyze, and integrate their systems to a greater extent, broadening the breadth and accuracy of these capabilities. The Internet of Things includes sensors, networks, and robotics, and it employs both old and new technology, taking advantage of software breakthroughs, lower hardware costs, and a contemporary approach to technology. While several studies in the IoT field have built smart applications, this research focuses on IoT-based smart applications for security enhancement in industries as well as homes. In other words, it introduces smart applications to maintain security against threats such as theft, fire, and other unexpected events that may result in financial loss.
Chapter
The goal of this research is to see how well a fast primary screening method for COVID-19 performs when it relies only on cough sounds, using 2,200 samples clinically verified by laboratory molecular testing (1,100 COVID-19 positive and 1,100 COVID-19 negative). Clinical labels were applied to the results, and the severity of the samples may be judged from quantitative RT-PCR (qRT-PCR), cycle threshold, and patient lymphocyte counts. The fast spread of the COVID-19 virus poses a significant danger of serious pulmonary disease and has caused grievous harm to humanity. As a result, a quick and clear disease classification model to distinguish between normal and COVID-19 infected individuals is critical. In this article, we describe the various machine learning and other models that have been used to predict COVID-19 in patients. Keywords: SARS-CoV-2, COVID-19, respiratory diseases, cough, convolutional neural network, machine learning models, classification.
Chapter
In this digital world, information about real-world entities is collected and stored in a common place for extraction. Raw data that carry no meaning on their own are converted into meaningful information with a set of rules, and data must be converted from one form to another based on the attributes under which they were generated. Storing such high volumes of data in one place and retrieving them from the repository introduces complications. To overcome the extraction problem, standards bodies and researchers have framed sets of rules and algorithms. Mining data from a repository according to such principles is called data mining, and it offers many algorithms and rules for extraction from data warehouses. Even when data is stored under a common structure in the repository, deriving values from such a huge volume is complicated. Computing statistical data using data mining provides accurate information about real-world applications such as population, weather reporting, and probability of occurrence.
Article
Purpose The purpose of this paper is to determine the impact of standardized management systems (ISO 9001, ISO 14001, ISO 22000 and ISO 28000) on minimizing selected aspects of risk in the supply chain. Design/methodology/approach A questionnaire was used to explore this topic. Respondents were divided into two groups by organization type: logistic service providers and focal companies. Basic data analysis was based on descriptive statistics and analysis of variance with organization type as a stratification factor. Deeper data analysis was based on factor analysis with principal component analysis and varimax rotation with Kaiser normalization. Findings The research shows that standardized management systems are useful in supply chain risk management (SCRM) regardless of the role the organization plays in the supply chain, though the strength of their positive impact varies. Few respondents among the logistic operators rated the legitimacy of implementing standardized management systems in the examined context as low. With this in mind, representatives of logistic operators with a limited budget in particular should consider implementing standardized management systems. Practical implications The research results may help managers who are considering implementing standardized management systems to obtain guidelines for developing procedures that improve supply chain management and ensure the repeatability of ongoing processes. Originality/value Although the number of studies on SCRM is increasing, the literature still lacks research addressing the impact of standardized management systems on SCRM (especially from the perspective of organizations with various functions in supply chains, such as focal companies and logistic service providers). There is therefore a need for comprehensive research in this area, and according to the authors, this study at least partially fills that gap.
Article
In this paper, the proposed topology of a fully automated control (FAC) battery charger that provides turbo-boost operation and fully automated detection without any feedback from the system is discussed. It uses closed-loop control for automatic system load detection and automatically adjusts the battery discharge current. The advantages include shortened battery discharge time and maximum power provided by the adapter. Therefore, the FAC technique is flexible for any load system that is limited to battery chargers with the turbo boost mode support. In addition, to avoid over discharge, a battery protection circuitry is also included. The test chip is fabricated using a 0.25-μm CMOS process, with a maximum efficiency of 94.58% when the charger operates in the turbo boost mode, and achieves 97% maximum efficiency when the charger operates in buck charge mode.
Article
Supply chain risk management (SCRM) is a nascent area emerging from a growing appreciation for supply chain risk by practitioners and by researchers. However, there is diverse perception of research in supply chain risk because these researchers have approached this area from different domains. This paper presents our study of this diversity from the perspectives of operations and supply chain management scholars: First, we reviewed the researchers’ output, i.e., the recent research literature. Next, we surveyed two focused groups (members of Supply Chain Thought Leaders and International Supply Chain Risk Management groups) with open-ended questions. Finally, we surveyed operations and supply chain management researchers during the 2009 INFORMS meeting in San Diego. Our findings characterize the diversity in terms of three “gaps”: a definition gap in how researchers define SCRM, a process gap in terms of inadequate coverage of response to risk incidents, and a methodology gap in terms of inadequate use of empirical methods. We also list ways to close these gaps as suggested by the researchers.
Article
Supply chain risk management (SCRM) is of growing importance, as the vulnerability of supply chains increases. The main thrust of this article is to describe how Ericsson, after a fire at a sub-supplier had a huge impact on the company, implemented a new organization along with new processes and tools for SCRM. The approach described tries to analyze, assess and manage risk sources along the supply chain, partly by working closely with suppliers but also by placing formal requirements on them. This explorative study also indicates that insurance companies might be a driving force for improved SCRM, as they are starting to understand the vulnerability of modern supply chains. The article concludes with a discussion of risk related to traditional logistics concepts (time, cost, quality, agility and leanness), arguing that supply chain risks should also enter the trade-off analysis when evaluating new logistics solutions, not with the purpose of minimizing risk, but to find the efficient level of risk and prevention.
Article
This article presents a secondary analysis of the literature, supplemented by case studies to determine if large companies increase their exposure to risk by having small- and medium-size enterprises (SMEs) as partners in business critical positions in the supply chain, and to make recommendations concerning best practice. A framework defining the information systems (IS) environment is used to structure the review. The review found that large companies’ exposure to risk appeared to be increased by inter-organisational networking. Having SMEs as partners in the supply chain further increased the risk exposure. SMEs increased their own exposure to risk by becoming partners in a supply chain. These findings indicate the importance of undertaking risk assessments and considering the need for business continuity planning when a company is exposed to inter-organisational networking.
Article
Global supply chains face a multitude of risks. A review of the recent literature reveals a few structured and systematic approaches for assessing risks in supply chains. However, there is no conceptual framework that ties together this literature. The purpose of this paper is to integrate literature from several disciplines - including logistics, supply chain management, operations management, strategy, and international business - to develop a model of global supply chain risk management. The implications for stakeholders and how future research could bring more insights to the phenomenon of global supply chain risk management are also discussed.
Chapter
Since the start of the new century the world at large has experienced escalating uncertainty as a result of climate changes, epidemics, terrorist threats and an increasing amount of economic upheaval. These uncertainties create risks for the proper functioning of supply chains. This chapter provides an insight into developing a proactive approach to predict risks and manage uncertainties that may potentially disrupt the supply chain. The aim of the chapter is to present a holistic perspective regarding supply chain risk management and incorporate a methodology to manage supply chain risks proactively. When discussing supply chain risk issues with industry personnel it was noticed that post 9/11, the issue of supply disruption had gained importance within the industry. But the focus on managing these disruptions and sources of these disruptions has been primarily reactive. Supply chain personnel in some instances have remarked that they have in the past researched and presented to their top management proactive risk management solutions which had been subsequently rejected and no investment provided. There is now, however, an increasing interest regarding proactive tools and hence this chapter seeks to present a framework for implementing proactive risk management. The chapter also suggests some tools which may prove useful in predicting supply chain risks.
Fusion of Datamining and Artificial Intelligence. In: Machine Learning and IoT for Intelligent Systems and Smart Applications. B. Khan, A. Hasan, D. Pandey, R. J. M. Ventayen, B. K. Pandey, G. Gowwrii.
A Supply Chain Risk Management Process. W. Kersten.