Article

Recent Development in Big Data Analytics for Business Operations and Risk Management

Abstract

"Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.


... Its potential is to improve decision-making, promote efficiency, and enhance customer engagement [9]. The studies showed that the implementation of BDA has resulted in more effective targeting and segmentation strategies, personalized customer interactions, and overall improved marketing outcomes [4,8,10]. ...
... Previous studies have provided insights into the relationships between big data analytics, innovation, and e-commerce [4,8,10]. They stated that dynamic capability is an organization's aptitude to build and reconfigure its external and internal competencies to tackle rapid environmental changes. ...
... This perspective suggests that digital marketing can promptly adjust its resources and processes to address environmental threats and opportunities, thereby maintaining a competitive edge. Companies can innovate by managing various types and sources of information, and big data analytics can serve as an invaluable tool in this regard [4,8,10]. ...
Conference Paper
Full-text available
This paper delineates how telecommunications companies in Jordan can harness the power of big data analytics to enhance their digital marketing strategies. Its role is substantial in promoting creative businesses built on intelligent technologies, with which the telecommunications industry is fundamentally associated. Dynamic capability, an organization's aptitude to build and reconfigure its external and internal competencies to tackle rapid environmental changes, is unavoidable. This perspective suggests that digital marketing can promptly adjust its resources and processes to address environmental threats and opportunities. The dynamic capability framework is a persuasive instrument in the digital economy for producing, employing, and transforming business models. On the other hand, digital marketing enhances business performance, focusing on customer retention, brand loyalty, and customer satisfaction. Primary data were collected through questionnaires for Zain Jordan, Umniah, and Orange Jordan and analyzed in SPSS using regression analysis. The curve analysis shows that the p-values for Zain Jordan and Orange Jordan are 0.576 and 0.339, respectively. Although subscribers showed confidence in Zain Jordan, the company needs to reduce complexity to enhance performance.
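The study's conclusions rest on per-operator regression analysis run in SPSS. As a rough illustration of that kind of analysis (not the authors' SPSS workflow), the sketch below fits an ordinary least squares regression per operator in Python; the file name and the column names (operator, bda_usage, marketing_performance) are hypothetical stand-ins for the questionnaire variables.

```python
# Minimal sketch (not the authors' SPSS workflow): an OLS regression relating
# big data analytics usage to digital marketing performance per operator.
# Column names ("bda_usage", "marketing_performance", "operator") are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("questionnaire_responses.csv")  # hypothetical survey export

for operator, group in survey.groupby("operator"):   # e.g. Zain, Umniah, Orange
    model = smf.ols("marketing_performance ~ bda_usage", data=group).fit()
    print(operator,
          round(model.params["bda_usage"], 3),
          round(model.pvalues["bda_usage"], 3))
```

The per-term p-values printed here play the same role as the significance values the abstract reports for Zain Jordan and Orange Jordan.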
... Choi, Lee, and Irani [44,45] point out the drawbacks of traditional data analytic tools and offer an FCM technique as a Big Data analytics tool (BDA) that will aid in the prioritization of public-sector decision-making. ...
Article
Full-text available
Globalization has become increasingly intense in recent years, necessitating accurate forecasting. Traditional supply chains have evolved into transnational networks that grow with time, becoming more vulnerable. These dangers have the potential to disrupt the flow of goods or several planned actions. For this reason, increased resilience against the various types of risks that threaten the viability of an organization is of major importance. One of the ways to determine the magnitude of the risk an organization runs is to measure how popular it is with the buying public. Although risk is impossible to eliminate, effective forecasting and supply chain risk management can help businesses identify, assess, and reduce it. As a result, good supply chain risk management, including forecasting, is critical for every company. To measure the popularity of an organization, there are some discrete values (bounce rate, global ranking, organic traffic, non-branded traffic, branded traffic), known as KPIs. The paper presents some hypotheses that affect these values and a model of the way in which these values interact with each other. As a result of the research, it is clear how important it is for an organization to increase its popularity, to increase promotion in the shareholder community, and to be in a position to predict its future requirements.
... Influenced by Big Data analytics, continuous data monitoring has encouraged the adoption of dynamic approaches in risk management. Organizations adapt to real-time strategies to improve their response to emerging risks, yielding risk management practices better suited to evolving business environments (Choi et al., 2017). ...
... This approach includes harnessing the potential of Big Data, which helps organizations move beyond traditional models toward proactive and data-driven approaches. This shift reflects a modern, evolving landscape of diverse, dynamic, and unpredictable risks (Choi et al., 2017). Machine learning has consequently strengthened predictive capabilities through accurate risk forecasts; effective mitigation models are thus essential for enhancing organizational resilience and improving decision-making processes. ...
Article
Full-text available
The integration of big data analytics into Enterprise Risk Management (ERM) is having a transformative impact on modern risk assessment models. The advent of Big Data and advanced analytics has revolutionized how organizations identify, assess, and mitigate risk, shifting from static approaches to dynamic, data-driven models. This research paper examines theoretical frameworks such as Risk Analytics Frameworks, Data-driven Decision-Making models, and Cybersecurity Risk models that underpin enhanced risk assessment. The findings depict data integration as fundamental to risk identification, predictive capabilities, and dynamic risk management, alongside a multi-dimensional perspective on risk factors. The study concludes by illustrating that integrating Big Data analytics is fundamental to modern risk management practices, strengthening organizations' approaches to addressing emerging threats. These approaches enhance decision-making, resource allocation, and organizational resilience. Hence, the research underlines the significance of adaptive, data-driven approaches in navigating the complex business world.
... Moreover, the concept of big data refers to how to handle very large data files and which computing frameworks can store and process them effectively [7]. The main characteristics of big data include complexity, constant change over time, and huge volume [8]. As the size of data files increases, conventional techniques run into the limits of a computer's memory; moreover, such techniques are often implemented to run on a single computer. ...
... Our simulation experiment is configured by a standalone Spark cluster with three worker nodes. Each node has 6 cores, 8 ...
Article
Full-text available
Big data analytics is an emerging topic in academic and industrial engineering fields, where the large-scale data issue is the most attractive challenge. It is crucial to design an effective large-scale data processing model to handle big data. In this paper, we aim to improve the accuracy of the classification task and reduce the execution time for large-scale data within a small cluster. In order to overcome these challenges, this paper presents a novel ensemble-based paradigm that consists of the procedure of splitting large-scale data files and developing ensemble models. Two different splitting methods are first developed to partition large-scale data into small data blocks without overlapping. Then we propose two ensemble-based methods with high accuracy and low execution time: bagging-based and boosting-based methods. Finally, the proposed paradigm can be implemented by four predictive models, which are combinations of the two splitting methods and the two ensemble-based methods. A series of persuasive experiments was conducted to evaluate the effectiveness of the proposed paradigm with the four different combinations. Overall, the proposed paradigm with the boosting-based method is the best in terms of the accuracy metric compared with existing methods. In addition, boosting-based methods achieve 91.6% accuracy, compared with the 52% accuracy of the baseline model, for a big data file with 10 million samples. However, the paradigm with the bagging-based method takes the least execution time to yield results. This paper also reveals the effectiveness of the Spark computing cluster for large-scale data and points out the weaknesses of RDDs (Resilient Distributed Datasets).
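The paradigm above splits a large file into non-overlapping blocks and trains an ensemble over the blocks on a Spark cluster. The sketch below is a simplified single-machine illustration of the bagging-style variant only; the block count, base learner, and synthetic data are assumptions, and the Spark distribution and boosting-based variant are omitted.

```python
# Minimal single-machine sketch of the split-then-ensemble idea (the paper runs
# this on a Spark cluster; block count and the base learner here are assumptions).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=100_000, n_features=20, random_state=0)

n_blocks = 10
blocks = np.array_split(np.random.permutation(len(X)), n_blocks)  # non-overlapping blocks

# "Bagging-based" variant: one base model per block, majority vote at prediction time.
models = [DecisionTreeClassifier(max_depth=8).fit(X[idx], y[idx]) for idx in blocks]

def predict(models, X_new):
    votes = np.stack([m.predict(X_new) for m in models])  # (n_models, n_samples)
    return (votes.mean(axis=0) >= 0.5).astype(int)         # majority vote for binary labels

print("train accuracy:", (predict(models, X) == y).mean())
```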
... Additionally, when firms use big data analytics in developing products and enhancing business processes for e-commerce, they might be stimulated to become innovative businesses (Bharadwaj et al. 2013;Choi et al. 2017;Teece et al. 2016). In fact, the framework of dynamic capabilities essentially focuses on the need to methodically undertake entrepreneurial innovation as an integral part of long-term corporate strategy (Erevelles et al. 2016). ...
... Meanwhile, Yoo et al. (2012) indicated that innovation with universal digital technologies shows features like heavy reliance on digital technology platforms, the presence of distributed innovations, and the use of combinatorial innovations. Relevantly, Choi et al. (2017) and Kwon et al. (2014) mentioned the need to correctly manage high data structuring because, aside from stimulating creativity and innovation, cognitive model creation can also promote homogenization and constraint (Dobusch and Kapeller, 2018; Wamba 2017). ...
Article
Full-text available
Big data analytics (BDA), as a new innovation tool, played an important role in helping businesses to survive and thrive during great crises and mega disruptions like COVID-19 by transitioning to and scaling e-commerce. Accordingly, the main purpose of the current research was to provide a meaningful, comprehensive overview of BDA and innovation in e-commerce research published in journals indexed by the Scopus database. In order to describe, explore, and analyze the evolution of publication (co-citation, co-authorship, bibliographic coupling, etc.), the bibliometric method has been utilized to analyze 541 documents from the international Scopus database using different programs such as VOSviewer and RStudio. The results of this paper show that many researchers in the e-commerce area focused on and applied data analytical solutions to fight the COVID-19 disease and establish preventive actions against it in various innovative manners. In addition, BDA and innovation in e-commerce is an interdisciplinary research field that could be explored from different perspectives and approaches, such as technology, business, commerce, finance, sociology, and economics. Moreover, the research findings are considered an invitation to data analysts and innovators to contribute more to the body of the literature through high-impact industry-oriented research, which can improve the adoption process of big data analytics and innovation in organizations. Finally, this study proposes a future research agenda and guidelines to be explored further.
... The apparel retailing and manufacturing (ARM) industry has entered the digital age, in which data analytics [1] and other disruptive technologies have become crucial [3]. There is no doubt that intelligent technologies are playing a critical role. ...
... Artificial intelligence (AI), 3-D printing (also called additive manufacturing) [2], blockchain, platforms, mobile technologies, etc., are all playing important roles in ARM. McKinsey reports that fashion companies on average invest up to 1.8% of their incomes in technologies, and this expense is predicted to "double" and reach around 3+% by 2030. The recent COVID-19 pandemic has also enticed apparel companies to better utilize different technological tools, and consumers are more technologically ready. For example, the fashion retailer Timberland has used the WhatsApp platform as a way to keep its stores operating and continue to sell products during the COVID-19 pandemic [5]. ...
Article
The papers in this special issue focus on the role of intelligent technologies in the apparel, retailing, and manufacturing (ARM) industry and assess their impact on technology management. ARM businesses have entered the digital age, in which data analytics and other disruptive technologies have become crucial. There is no doubt that intelligent technologies are playing a critical role. Artificial intelligence (AI), 3-D printing (also called additive manufacturing), blockchain, platforms, mobile technologies, etc., are all playing important roles in ARM. The recent COVID-19 pandemic has also enticed apparel companies to better utilize different technological tools, and consumers are more technologically ready.
... Siloed risk management reflects a fragmented approach that establishes only limited levels of communication, data, and information sharing between organizations, which might result in isolated risk evaluations and conflicting risk mitigation efforts. These downsides can cause oversights, duplication of effort, and a stunted overall perception of risk [8]. ...
Article
Full-text available
This report aims to provide a general description of the various methods that can be used to build a hybrid Machine Learning or Artificial Intelligence model for identifying and mitigating financial risks. By combining different types of AI and ML technologies, it is possible to create newer techniques and tools that can readily assist enthusiasts and professionals.
... These risks can be internal to institutions (e.g., inadequate or failing internal processes, human errors, or system failures) or can originate from external events (such as fraud, vulnerable computer systems, control failures, operational errors, neglected procedures, or natural disasters). The increase in the quantity, variety, and complexity of operational risk exposures, particularly for financial firms, has led to the adoption of AI and machine learning-based solutions [52]. AI can assist institutions at various stages of the risk management process, from identifying risk exposure to measuring, estimating, and evaluating their effects [53]. ...
Article
This article delves into the transformative effects of Artificial Intelligence (AI) and Machine Learning (ML) on the realm of risk management. AI and ML technologies have revolutionized risk assessment, mitigation, and management across various sectors by offering advanced analytical capabilities and automated decision-making processes. In the financial sector, for instance, these technologies have facilitated improvements in loan decision processes, fraud detection, and compliance. Partnerships like ZestFinance and Baidu exemplify the successful application of AI in enhancing loan decisions based on vast data analysis. Despite the evident benefits, challenges such as model-related risks, data availability and protection, and the need for skilled personnel persist. This article aims to provide a comprehensive overview of the current applications of AI and ML in risk management while identifying opportunities for further research and development in this rapidly evolving field.
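Loan decisioning is one of the applications the article highlights. The sketch below is a generic logistic-regression credit-scoring example of the kind of model involved, not ZestFinance's, Baidu's, or any bank's actual system; the data file and feature names are hypothetical.

```python
# Minimal credit-scoring sketch of the kind of ML model discussed (illustrative
# only); the data file and feature names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

loans = pd.read_csv("loan_history.csv")                      # hypothetical dataset
X = loans[["income", "debt_ratio", "past_defaults", "age"]]  # assumed features
y = loans["defaulted"]                                       # 1 = default, 0 = repaid

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("AUC:", roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))
```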
... Traditional operational risk management emphasises internal controls, personnel training, and optimisation. An avenue toward artificial intelligence and machine learning-based solutions has been opened due to the growth in operational risk exposures in terms of number, variety, and complexity, particularly for financial institutions (Choi et al., 2017). Concerns about a bank's capacity to fulfil its short-term financial obligations are known as liquidity risks. ...
... Choi, T. M., et al. stated in their article that: "Big data analytics plays a pivotal role in transforming business operations and managing risks in modern supply chains." [8] Big data can reveal correlations between different risk factors. This is crucial for understanding how complex supply chains react under the combined effect of various risk factors. ...
Article
Full-text available
This paper explores the transformative impact of big data technology on cross-border supply chain inventory management. In the era of globalization, supply chains face increased complexities and risks, particularly in cross-border logistics. Challenges include transportation uncertainties, delays due to long-distance transport, infrastructure disparities, and transparency issues. Integrating big data analytics offers a solution to these challenges by enabling predictive analytics for demand forecasting, inventory optimization, and risk management. This study highlights the role of big data in enhancing supply chain transparency, reducing uncertainties, and improving decision-making processes. Examples from JD E-commerce and NongFu Spring demonstrate the practical application of big data in optimizing inventory management and mitigating risks. JD E-commerce employs artificial intelligence and big data analytics for inventory management, leading to reduced turnover days and cost efficiency. NongFu Spring, on the other hand, uses big data for scenario marketing and supply chain optimization. The paper concludes that big data technology not only revolutionizes inventory management but also plays a crucial role in addressing risks in the supply chain, thus leading to more efficient, transparent, and resilient supply chains in the face of globalization challenges.
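The paper credits big data analytics with demand forecasting and inventory optimization at JD E-commerce and NongFu Spring. As a minimal, generic illustration of forecast-driven inventory parameters (not either company's system), the sketch below computes a safety stock and reorder point from historical daily demand; the service level, lead time, and moving-average forecast are assumptions.

```python
# Minimal sketch of forecast-driven inventory parameters (service level, lead time,
# and the moving-average forecast are illustrative assumptions).
import numpy as np
from scipy.stats import norm

daily_demand = np.random.poisson(lam=120, size=365)   # stand-in for historical sales data

lead_time_days = 14                                    # assumed cross-border lead time
service_level = 0.95                                   # assumed target service level

forecast = daily_demand[-28:].mean()                   # simple moving-average forecast
sigma = daily_demand[-28:].std(ddof=1)                 # demand variability estimate

safety_stock = norm.ppf(service_level) * sigma * np.sqrt(lead_time_days)
reorder_point = forecast * lead_time_days + safety_stock
print(round(reorder_point, 1), round(safety_stock, 1))
```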
... They classified the research interest in BD into five main divisions: techniques and processes, information governance, security and risk, professional roles and competencies, and new applications such as trading signals, fraud prevention, the Internet of Things (IoT), and customer insights. In particular, Choi et al. (2016) indicated that BD improves risk management by providing several resources for data collection (e.g. wireless sensor networks). ...
Article
Purpose: This study aims to scrutinize the relationship between the perception of big data (BD) features and the primary outcomes of financial accounting. Likewise, it explores whether financial accounting practices moderate the relationship between BD features and firm sustainability.
Design/methodology/approach: The study used a questionnaire survey based on the Likert scale for two distinct groups of participants: academic scholars and industry practitioners operating in the BD era within the energy sector.
Findings: The results reveal significant positive associations between BD features and firm performance, reporting quality, earnings determinants, fair value measurements, risk management, firm value, the efficiency of the decision-making process, narrative disclosure and firm sustainability. Besides, the path analysis indicates an indirect impact of BD on firm sustainability via financial accounting practices. The results suggest that energy firms should consider incorporating BD analysis into their financial accounting processes to improve their sustainability performance and create long-term value for their stakeholders.
Practical implications: The findings are particularly interesting to academics in accounting and business to improve the accounting curriculums to fit the technological revolution, especially in the field of BD analytics. Practitioners within energy industries must also refine their skills and knowledge to meet the challenges of BD in the foreseeable future. The results provide important implications for policy setters to revise current financial accounting standards to cope with technological innovation.
Originality/value: The study makes a valuable contribution by critically examining the impact of BD on various financial accounting practices neglected in prior research. It highlights the transformative power of BD in the domain of financial accounting and provides insights into its potential implications for energy firms.
... Visibility capability relies heavily on data connectivity (Brandon-Jones et al., 2014). Most importantly, data can be used to evaluate vulnerabilities (Choi & Lambert, 2017) and improve SCR (Choi et al., 2016). Data accuracy is fundamental to SCR and ignorance of this can lead to disastrous outcomes (Ivanov & Dolgui, 2021). ...
Article
Full-text available
The unprecedented global impact of the COVID-19 pandemic has heightened the critical significance of supply chain resilience (SCR) within the contemporary supply chain landscape. The body of literature dedicated to SCR has significantly expanded since the early 2000s, with numerous scholars delving into the construction of SCR frameworks based on empirical studies and literature reviews. Despite the usefulness of these frameworks, there has been a notable absence of a generic framework that transcends industry and national boundaries, particularly in light of the disruptive events triggered by the COVID-19 crisis. This study employs the narrative literature review method to intricately integrate itself into the existing SCR literature, conducting a comprehensive analysis, identifying theoretical foundations and empirical discoveries, and synthesizing this knowledge into a cohesive and all-encompassing structure to formulate a conceptual framework for SCR. This generic framework is designed to accommodate the unique characteristics of various supply chains. While the empirical validation of this innovative framework remains pending, it presents a valuable opportunity for scholars to engage in scientific investigations on SCR, building upon the collective insights of their predecessors. Moreover, practitioners can leverage this framework to scrutinize and construct resilient supply chains capable of withstanding future disruptions.
... Different social media channels provide different benefits in producing brand-related content for consumers and businesses. These channels have also become important intermediaries in businesses' decision-making processes (Choi, Chan, and Yue 2017). While these channels have similar characteristics regarding basic function, each has its strengths. ...
Article
Full-text available
This study examines the extent to which companies operating within technology parks ranked in the top 3 in Turkey according to the performance index utilise social media marketing. The research explores the level of their engagement, the platforms they leverage, and their relative activity across these platforms. Furthermore, a profile has been delineated specifically for companies in technology-intensive industrial markets. Of the 1121 companies whose websites were accessed, 55% had added links to their social media accounts. Among the 615 companies with accessible social media accounts, 102 had profiles on Facebook, Instagram, LinkedIn, YouTube, and X. On the other hand, the percentage of companies whose most recent social media post is at least one year old is noteworthy (33.3% of firms using YouTube, 5.9% using Facebook, and 33.7% using X). This study offers perspectives for academic circles and practitioners through analyses of the involvement levels of firms operating in a business-to-business context in social media marketing and the platforms they utilise.
... Big data technology is an important tool for the overall improvement of enterprises' ability to process various types of information materials [5][6][7][8]. The volume of data and information technology are the main features of the big data era [9][10][11][12]. There are differences in the way big data technology is used in various fields, especially in the field of resource management, which helps to improve the ability of enterprises to collect information and data about resources and employees of the enterprise and provides effective data support for the decision making of the enterprise resource department [13][14][15][16]. ...
Article
Full-text available
With the boom in big data technology, a sustainable development path is an important strategic resource for enterprises. In this paper, a vector is set up by introducing a multidimensional algorithm; state paths are then connected and an intergenerational criterion is defined. Maximal and minimal functions are generated according to the criterion. The welfare function is set up to obtain the usual expression for the discount rate. Finally, a bivariate function is set up to obtain the modified rule, and the implied interest rate is used to define the rate of return, based on which the sustainability model is constructed. In the experiments, data are collected from enterprises using big data technology, the research is conducted on the measured objectives, and the non-standardized coefficients are calculated statistically. Among them, the standard error of management resources is 61%, the smallest error value compared to the other groups.
... In the era of big data, cloud computing has dynamic and reliable computing capabilities that make efficient and massive enterprise business data analysis possible [1]- [4]. The group intelligence in the cloud accounting environment provides a better operating environment for internal control and risk management decision support, and the scalability and versatility of cloud accounting can help enterprises improve their internal control and risk management architecture [5]- [6]. The data generated by the current market economic development is relatively large and difficult to estimate; through cloud accounting technology, enterprises can better collect, integrate and share data and information, and enterprises can apply data more conveniently and effectively in the implementation of budget management [7]- [10]. ...
Article
Full-text available
To solve the financial risk problems of enterprise financing difficulties and large financing amounts in the era of big data, this paper constructs a financial risk assessment model and evaluates the indexes of enterprise financial risk in combination with cloud accounting products. Firstly, we assign and determine the weights, use evidence theory to develop the enterprise financial risk evaluation, and determine the weight of each risk evaluation index. Secondly, we set up a fuzzy evaluation language set, calculate the maximum value of the fuzzy membership function for the risk indicators of state-owned enterprises under cloud accounting, pre-process the obtained data, determine the specific indicator weights with the help of fuzzy hierarchical analysis, and carry out consistency tests on them. Finally, the relative importance of each indicator is calculated among the model factors to obtain the overall risk score of cloud accounting enterprise finance. The experimental results, with risk levels measured according to the steps of the cloud accounting risk assessment system, show a data confidentiality risk score of 76.14, a cloud accounting product risk score of 85.96, and a service risk score of 72.88, with only one item rated as higher risk. This shows that the use of cloud accounting financial risk control measures can improve the quality of enterprises' financial risk management.
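The abstract describes weighting risk indicators and aggregating fuzzy membership grades into an overall risk score. The sketch below is a minimal fuzzy comprehensive evaluation of that general form; the weights, membership matrix, and grade scores are illustrative numbers, not the paper's calibrated values, and the evidence-theory and fuzzy AHP steps are omitted.

```python
# Minimal fuzzy comprehensive evaluation sketch (weights, membership matrix, and
# grade scores are illustrative numbers, not the paper's calibrated values).
import numpy as np

# Assumed index weights for three risk indicators (e.g. data confidentiality,
# cloud product risk, service risk), summing to 1.
weights = np.array([0.4, 0.35, 0.25])

# Membership degrees of each indicator in the grades {low, medium, high}.
membership = np.array([
    [0.2, 0.5, 0.3],
    [0.1, 0.3, 0.6],
    [0.3, 0.5, 0.2],
])

grade_scores = np.array([90, 75, 55])        # score attached to each grade (assumed)

fuzzy_eval = weights @ membership            # weighted membership vector over grades
overall_score = fuzzy_eval @ grade_scores    # defuzzified overall risk score
print(fuzzy_eval.round(3), round(float(overall_score), 2))
```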
... In business and supply chain management, AI is used for predictive analytics, demand forecasting, and process optimization. AI algorithms analyze large datasets to identify patterns and predict future trends, thereby enhancing decision-making processes [7]. ...
Article
Full-text available
This paper examines the integration of Blockchain and Artificial Intelligence (AI) in enhancing inventory management. It highlights how these technologies synergistically improve accuracy and efficiency, significantly reducing fraud and errors. The study explores Blockchain's secure ledger and AI's predictive analytics, emphasizing their practical applications in various industries. Challenges such as technical complexities and ethical considerations, including data privacy and regulatory compliance, are also addressed. The paper concludes by discussing the implications for businesses and researchers, underscoring the transformative impact of these technologies in inventory management and the necessity for ongoing innovation and ethical vigilance.
... Financial Risk Management is now a crucial aspect, aiming to mitigate threats arising from market volatility, economic changes, and rapid industrial dynamics [3]. The application of Data Analytics and Predictive Algorithms is emerging as a tool that has the potential to strengthen risk management capabilities, providing intelligent solutions to identify, evaluate and manage risks with greater precision [4][5] [6]. Financial Risk Management in the banking industry requires a holistic and adaptive approach to the changing business environment. ...
Article
Full-text available
This research delves into the realm of financial risk management within the Indonesian banking sector, with a focus on leveraging Data Analytics and Predictive Algorithms. Amidst the global financial market's complexities and the evolving nature of banking risks, this study aims to provide a comprehensive understanding of how advanced technological tools can enhance risk identification, evaluation, and management. Utilizing extensive datasets from the Indonesian Banking Statistics, Central Statistics Agency, and Bank Indonesia, the research explores the intricate relationship between various banking risks and macroeconomic factors. The study employs sophisticated predictive models to analyze data, focusing on credit and operational risks. The findings highlight the significant impact of macroeconomic variables on banking risks and the effectiveness of predictive models in risk assessment. The research contributes to the existing literature by offering a detailed analysis of the integration of machine learning and big data analytics in banking risk management. It also provides strategic insights for banks to adopt more dynamic, data-driven risk management strategies in the face of economic and industrial changes. The study underlines the importance of continuous innovation in technological applications to meet the evolving demands of the banking sector.
... Furthermore, big data also has the potential to reduce operating costs associated with risk management while ensuring compliance with legal obligations. (Choi, Chan, & Yue, 2016) The field of weather forecasting is currently experiencing a revolution due to big data. With constantly increasing computing power, technological advancements, and the utilization of more complete data sets, predicting the weather has become much more reliable. ...
Article
Full-text available
Innovative and sophisticated technologies have been rapidly developing in recent years. These cutting-edge advancements encompass a wide spectrum of devices like mobile phones, PCs, and social media trackers. As a consequence of their widespread usage, these technologies have engendered the generation of vast volumes of unstructured data in diverse formats, spanning terabytes (TB) to petabytes (PB). This vast and varied data is called big data. It holds great promise for both public and private industries. Many organizations utilize big data to uncover useful insights, whether for marketing choices, monitoring specific actions, or identifying potential threats. This kind of data processing is made possible using different methods known as Big Data Analytics. It allows you to gain significant advantages by handling large amounts of unorganized, organized, and partially organized information quickly, which would be impossible with traditional database techniques. While big data offers considerable benefits for businesses and decision-makers, it also puts consumers at risk. This risk results from the use of analytics technologies, which need the preservation, administration, and thorough analysis of enormous volumes of data gathered from many sources. Consequently, individuals face the risk of their personal information being compromised as a result of the collection and revelation of behavioral data. Put simply, the excessive accumulation of data may lead to multiple breaches in security and privacy. The realm of big data therefore raises concerns pertaining to security and privacy, and scholars from various disciplines are actively engaged in addressing these concerns. The study concentrates on big data applications, substantial security hurdles, and privacy concerns. We discuss potential methods for enhancing confidentiality and safety in problematic big data scenarios, and we also analyze present security practices.
... The perspective of online banking, the reduction of physical branches, and the improvement of the cost-income ratio due to a decrease in operating costs have changed banks' business models (Alonso, Berg, Kothari, Papageorgiou, & Rehman, 2020). Online and mobile banking have improved thanks to the evolution of banking technology (Singh, 2020; Thisarani & Fernando, 2021) and have led to a fully digitalized reality in which banks' services and operations are fully substituted by online experiences for opening and using current accounts, receiving loans, and consumer credit (Choi, Chan, & Yue, 2017; Figini, Bonelli, & Giovannini, 2017). ...
Article
Full-text available
The digital revolution is influencing many economic sectors, and for a few years the banking sector has been undergoing a great transformation, mainly due to the development and use of new technologies. The most recent of these is artificial intelligence (AI), with recourse to advanced algorithms. The main banking services, their offer, and above all customer relations have been significantly influenced by this. The recourse to new channels, the monitoring of risks, and the control of fraud are only some of the applications of machine learning (ML). To manage the increase in financial and non-financial risks, AI and ML seem to provide great help to banks. The survey conducted from December 2022 to May 2023 with a sample of Italian banks of different sizes shows the level of awareness in the recourse to these technologies. Moreover, it aims to assess the maturity and the future perspectives in the adoption of AI in the financial system. The analysis is divided into different investigation areas that show how banks can mitigate the risks involved in the implementation of AI and how it affects the risk management process. The paper covers a gap in the literature, where AI and ML are mainly considered as separate tools for specific banking projects; Italian banks, albeit with differences due to size, are aware of the relevance of these new technologies. The research is a contribution to the discussion about the application of AI and ML in a holistic dimension.
... In the backdrop of carbon neutrality, models predicting financial risks require modifications and enhancements. Traditional models, such as the Credit Risk VaR model, statistical-based risk models, and data mining-based risk models [55], though effective in financial risk measurement, lack dimensions accounting for carbon emissions and environmental impact under a carbon-neutral paradigm. Light et al. [56] posited that, viewed from the lens of carbon neutrality, enterprise risks can be categorized into traditional financial risks and risks associated with carbon emissions. ...
Article
Full-text available
Climate change is widely acknowledged as the paramount global challenge of the 21st century, bringing economic, social, and environmental impacts due to rising global temperatures, more frequent extreme weather events, and ecosystem disturbances. To combat this, many countries target net-zero carbon emissions by 2050, reshaping both the financial system and consumption patterns. This transition has sharpened the financial sector’s focus on climate-related risks, making the carbon footprint, environmental benefits of investments, and sustainability of financial products critical to investors’ decisions. However, conventional risk prediction methods may not fully capture these climate-associated risks in a carbon-neutral setting. Emerging from this context is the need for innovative predictive tools. Recently, Long Short-Term Memory networks (LSTM) have gained prominence for their efficacy in time-series forecasting. Singular Spectrum Analysis (SSA), effective for extracting time series patterns, combined with LSTM as SSA-LSTM, offers a potentially superior approach to financial risk prediction. Our study, focusing on a case study of the wind energy sector in China, situates itself within the growing body of research focusing on the integration of environmental sustainability and financial risk management. Leveraging the capabilities of SSA-LSTM, we aim to bridge the gap in the current literature by offering a nuanced approach to financial risk prediction in the carbon-neutral landscape. This research not only reveals the superiority of the SSA-LSTM model over traditional methods but also contributes a robust framework to the existing discourse, facilitating a more comprehensive understanding and management of financial risks in the evolving carbon-neutral global trend.
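The SSA-LSTM approach first decomposes the series with Singular Spectrum Analysis and then forecasts with an LSTM. The sketch below implements only the SSA step (trajectory matrix, SVD, reconstruction of leading components) in NumPy; the window length, component count, and synthetic series are assumptions, and the LSTM forecasting stage is omitted for brevity.

```python
# Minimal sketch of the SSA step used in SSA-LSTM (window length and component
# count are assumptions; the LSTM forecasting stage is omitted here for brevity).
import numpy as np

def ssa_reconstruct(series, window=30, n_components=3):
    n = len(series)
    k = n - window + 1
    # Trajectory (Hankel) matrix: each column is a length-`window` lagged segment.
    X = np.column_stack([series[i:i + window] for i in range(k)])
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # Keep the leading components, then average anti-diagonals (Hankelization).
    X_hat = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    recon = np.zeros(n)
    counts = np.zeros(n)
    for j in range(k):
        recon[j:j + window] += X_hat[:, j]
        counts[j:j + window] += 1
    return recon / counts

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(size=500)) + 100   # stand-in for a financial risk series
trend = ssa_reconstruct(prices)                  # smoothed signal to feed a forecaster (e.g. an LSTM)
print(trend[:5].round(2))
```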
... Indeed, the application of digital technologies to increase the resilience of a supply chain in a pandemic context is important, particularly with a focus on data analytics, AI, and machine learning (Choi et al., 2017). This significance extends beyond the eatery industry, as highlighted by Talapatra et al. (2022), where the two principles identified can be replicated in the field of SCRE. ...
Article
The main objective of this research was to examine the instrumental role played by interpretable learning systems, specifically artificial intelligence (AI) technologies, in enhancing supply chain viability and resilience. It seeks to contribute to our understanding of the critical role played by interpretable learning systems in supporting decision-making during emergencies and crises. The research employs an empirical approach to address the research gaps in the application and impact of interpretable learning systems in supply chain management by utilizing the case of COVID-19 vaccine deliveries in France as a descriptive study. The findings highlight the ability to develop a learning system that adeptly predicts vaccine deliveries and vaccination rates. It emphasizes the importance of interpretable learning systems in optimizing supply chain management, navigating the complex landscape of vaccine distribution, establishing effective prioritization strategies, and maximizing the efficient utilization of resources.
... One of the most important factors that has changed our lives in the "information age" is that technology, and therefore the new opportunities it provides, has become a part of our daily lives and routines over time (Choi et al., 2016). Many technologies, such as wearable technologies we once only dreamed of seeing in science fiction or social media platforms that allow simultaneous content sharing, have surrounded our lives in all aspects. ...
Article
Full-text available
This study presents a bibliometric analysis of brand-related content on social media. By filtering by topic in the WOS database, publications between 2000 and 2021 on two types of social media content, firm-generated content (FGC) and user-generated content (UGC), are examined. For FGC, 47 articles in the database are reviewed, while 3502 articles are included in the analysis for UGC. The results show that while FGC studies were mainly categorized under "Business," UGC articles were predominantly under "Computer Science Information Systems." In addition, the journal that devotes the most space to FGC studies is the Journal of Marketing. On the other hand, the journals New Media & Society and Sustainability published 46 articles each on UGC. As a result of the co-word network analysis, there were five themes in the map of the FGC articles, whereas more than ten themes were found in the map of the UGC articles. The research results are expected to shed light for researchers who will work on brand-related social media content in the following years.
... Even though the relationship between DTs and OR has not been clarified, some specific emerging DTs have been associated with SCR. Big data analytics is suitable for improving the SCR of business operations and risk management (Spieske and Birkel, 2021;Choi et al., 2016). Bag et al. (2022) examined the role of big data and predictive analytics (BDPA) in developing a resilient supply chain network in the South African mining industry. ...
Article
Climate change has resulted in the increased frequency and intensity of extreme weather events. One example is flooding, which has driven logistics firms to reconsider their resilience. Although logistics firms can employ digital techniques (DTs) to improve organisational resilience (OR), the interplay of DTs and OR in disruptive events is under-researched. In OR research, few studies have considered the roles of two critical components of DTs, namely digital orientation (DO) and digital competency (DC). This study uses multiple research methods to examine the role of DTs, including DO and DC, in logistics firms' OR during floods. In Phase 1, managers in logistics firms shared their views on the role of DTs for OR during floods through semi-structured interviews, which revealed that DO and DC might affect OR and firm performance through thematic analysis. This study further develops a conceptual framework and the associated hypotheses by combining the findings of Phase 1 with those reported in the literature. In Phase 2, survey data was collected via a self-administrated questionnaire survey, and structural equation modelling was then conducted to test the conceptual model. The results show that DC can positively affect OR and firm performance. DO can directly and positively affect firm performance and indirectly affect OR through the mediation role of DC. This study promotes the development of OR research and bridges the theory-practice gap by clarifying how DTs strengthen OR towards disruptive events.
... This is especially crucial in the field of financial management, where precise forecasts are essential for making prudent investment choices. In this research paper, we will examine the numerous ways in which AI may be applied to predictive analytics in financial management. Following a discussion of the benefits of utilizing AI for predictive analytics, we will investigate several particular uses of AI in financial management, such as credit risk analysis, portfolio management, and fraud detection [3]. Credit risk analysis is the process of evaluating a borrower's creditworthiness. ...
Conference Paper
Full-text available
The use of artificial intelligence (AI) in financial management for predictive analytics is a rapidly emerging topic. This research study investigates the numerous ways in which artificial intelligence (AI) might be utilized to enhance financial forecasting and decision-making. The article opens by addressing the benefits of utilizing AI for predictive analytics, such as the capacity to manage vast volumes of data, find patterns and trends, and produce high-accuracy forecasts. The study then delves into numerous particular uses of artificial intelligence in financial management, such as credit risk analysis, portfolio management, and fraud detection. Finally, the study discusses the problems and limits of employing AI for predictive analytics in financial management, as well as future research objectives in this field. Overall, this study article indicates how artificial intelligence (AI) has the potential to change financial management by delivering more accurate and efficient decision-making tools.
... Interestingly, another group of researchers concluded in 2017 that big data research was still in its infancy, and its focus was rather unclear and related studies were not well amalgamated (Choi et al., 2017). However, they believed that big data analytics would definitely lead to valuable knowledge for many organizations, and business operations and risk management could benefit greatly from big data and analytics. ...
Article
Full-text available
As the data-analyst job market grows, many colleges and universities have started offering a data analytics curriculum. However, there are potential gaps between the skills business organizations expect of data analysts and the skills universities and colleges teach their students. This study collected 2500+ data-analyst job ads posted on LinkedIn and analyzed them using distribution analysis, cross-tabulation analysis, and cluster analysis. Among many findings, this study identified the five most essential nontechnical skills and the five most essential areas of technical skills. In addition, of the 90+ computer programs business organizations expect data analysts to use, this study identified SQL, Microsoft Excel, Tableau, Python, and Microsoft Power BI as the five most essential for potential data analysts to master.
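The skill rankings come from distribution and cross-tabulation analysis of scraped job ads. The sketch below shows only the distribution-analysis step of counting tool mentions with pandas; the job-ads file and the keyword list are assumptions, and the study's cross-tabulation and cluster analyses are not reproduced.

```python
# Minimal sketch of the distribution-analysis step (the job-ads file and keyword
# list are assumptions; the study also ran cross-tabulation and cluster analysis).
import pandas as pd

ads = pd.read_csv("data_analyst_job_ads.csv")          # hypothetical export of scraped ads
tools = ["sql", "excel", "tableau", "python", "power bi"]

text = ads["description"].str.lower().fillna("")       # assumed free-text ad column
counts = {tool: int(text.str.contains(tool, regex=False).sum()) for tool in tools}
print(pd.Series(counts).sort_values(ascending=False))
```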
... Further, big data analytics can be utilized to enhance resilience of SC operations by making use of the organization's database. Large volumes of data can be analyzed to predict and model risks and assess the vulnerability and risk mitigation capacity of SCs (Choi and Lambert, 2017;Choi et al., 2016). Ivanov et al. (2018) investigate how big data analytics can be effectively used during the SC planning stages to identify potential supplier risk exposures, which will further help to monitor and identify disruptions proactively. ...
Article
During an emergency, many activities in a pharmaceutical supply chain (PSC) are rendered ineffective. This study aims to propose a holistic approach for a Big Data Driven Pharmaceutical Supply Chain (BDDPSC) during medical emergencies. During unprecedented situations, maintaining the quality of healthcare services with a traditional supply chain becomes a challenge. The current study aims to generate a model wherein multiple entities, such as hospitals, clinical trials, medical practitioners, and drug manufacturing companies, contribute data entries during COVID-19. The study considered certified medical practitioners as the experts and based the proposed theoretical framework on their responses, deploying an E-Delphi qualitative data analysis approach. By critically examining the experts' responses and comments, the study formulated four major themes and ten sub-themes for the smooth functioning of BDDPSCs during an emergency. The E-Delphi was conducted in two rounds to reach a final consensus and to find a balance for the PSC in terms of efficiency and quality. This research is novel in that a big-data-enabled PSC theoretical model has been formulated using a qualitative approach for handling COVID-19. The proposed framework provides an enriched way to capture data from the important link, viz. the "health officials" of PSCs.
... The data [39] were accumulated on servers on a daily basis, which creates huge issues if the data must be preserved [17,22,24] for weeks, as the size of the accumulated data is large. Such long-term data preservation and the searching of particular events is a complex and costly process [7]. Moreover, manual assessment of live surveillance suffers from various drawbacks: dependency on and the subjectivity of human attention and cognitive capabilities make it a risky option. ...
Article
Full-text available
The expansion of growth in the generation of video data in various organizations creates an urgent requirement for effective video summarization methods. This paper devises a novel optimization-driven deep learning technique for video summarization. The aim is to provide automated video summarization. Initially, the video data is extracted from the database. Then, representative frame selection is done using Bayesian fuzzy clustering (BFC). After that, the frames are temporally segmented, wherein each segment is modelled by a representative frame, generated by clustering the temporal segment into clusters and selecting, from each cluster, the frame closest to the cluster center. The next step is fine refining, which is performed using a deep convolutional neural network (Deep CNN) and helps to refine the final frame set. The Deep CNN is trained using the proposed Lion deer hunting (LDH) algorithm. The LDH algorithm is the integration of the Deer hunting optimization algorithm (DHOA) and the Lion optimization algorithm (LOA). Thus, the final frames obtained by the proposed LDH-based Deep CNN are employed for video summarization. Here, the final frames are adapted to play as a continuous output video. The developed LDH-based Deep CNN offered better performance than other techniques, with the highest precision of 0.841, the highest recall of 0.810, and the highest F1-score of 0.888.
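The summarization pipeline clusters temporally segmented frames and keeps, from each cluster, the frame closest to the cluster center. The sketch below is a simplified stand-in for that selection step only: k-means on flattened pixel features replaces Bayesian fuzzy clustering, the Deep CNN refinement and LDH training are omitted, and the synthetic frames and cluster count are assumptions.

```python
# Simplified sketch of representative-frame selection (k-means on raw pixel
# features stands in for the paper's Bayesian fuzzy clustering + Deep CNN stages).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
frames = rng.random((300, 32, 32, 3))                 # stand-in for decoded video frames
features = frames.reshape(len(frames), -1)            # flatten each frame to a feature vector

n_keyframes = 8
km = KMeans(n_clusters=n_keyframes, n_init=10, random_state=0).fit(features)

# For each cluster, keep the frame closest to its center as the representative frame.
keyframes = []
for c in range(n_keyframes):
    idx = np.where(km.labels_ == c)[0]
    dists = np.linalg.norm(features[idx] - km.cluster_centers_[c], axis=1)
    keyframes.append(int(idx[dists.argmin()]))
print(sorted(keyframes))
```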
... In this context, data-driven supply chain management has become a practice that is highly sought after by many firms (Sanders and Ganeshan 2018). This trend is supported by the previous literature, which has consistently demonstrated the value of data in improving firms' operations, including supply chain configuration (Wang et al. 2016), safety stock allocation (Hamister et al. 2018), risk management (Choi et al. 2016), and promotion planning (Cohen et al. 2017). With the advances in artificial intelligence in recent years, techniques such as deep learning have been increasingly adopted to improve several aspects related to retail operations, such as procurement (Cui et al. 2022), demand forecasting (Birim et al. 2022), inventory replenishment (Qi et al. 2020), and risk management (Wu and Chien 2021). ...
Article
Full-text available
The COVID-19 pandemic has triggered panic-buying behavior around the globe. As a result, many essential supplies were consistently out-of-stock at common point-of-sale locations. Even though most retailers were aware of this problem, they were caught off guard and are still lacking the technical capabilities to address this issue. The primary objective of this paper is to develop a framework that can systematically alleviate this issue by leveraging AI models and techniques. We exploit both internal and external data sources and show that using external data enhances the predictability and interpretability of our model. Our data-driven framework can help retailers detect demand anomalies as they occur, allowing them to react strategically. We collaborate with a large retailer and apply our models to three categories of products using a dataset with more than 15 million observations. We first show that our proposed anomaly detection model can successfully detect anomalies related to panic buying. We then present a prescriptive analytics simulation tool that can help retailers improve essential product distribution in uncertain times. Using data from the March 2020 panic-buying wave, we show that our prescriptive tool can help retailers increase access to essential products by 56.74%.
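The framework's core step is detecting demand anomalies as they occur. The sketch below is a much simpler stand-in than the paper's AI models: a rolling z-score on a synthetic daily sales series, where the window, threshold, and injected spike are assumptions and external data sources are not used.

```python
# Minimal rolling z-score sketch of demand-anomaly detection (threshold, window,
# and the synthetic sales series are assumptions; the paper's model is richer).
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
sales = pd.Series(rng.poisson(200, size=180).astype(float))   # daily unit sales
sales.iloc[150:155] *= 4                      # inject a panic-buying style spike

rolling_mean = sales.rolling(28).mean().shift(1)   # use only past data at each day
rolling_std = sales.rolling(28).std().shift(1)
z = (sales - rolling_mean) / rolling_std

anomalies = z[z > 3].index.tolist()           # days flagged as demand anomalies
print(anomalies)
```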
... Clustering methods are thought to be necessary for both data collection and analysis. Their goal is to pinpoint subsets, structural elements, or objects that represent uniformity and regularity in the material while capturing essential and important attributes [1]. This problem was initially brought up in relation to market basket analysis, which sought to identify typical collections of products that are bought concurrently. ...
Article
Pattern analysis is one of the most important tasks for extracting strong and useful information from unstructured data. The item-sets for this work must exhibit not only the aforementioned homogeneity but also frequency in the data. Despite the fact that numerous efficient algorithms have been developed in this field, the rising demand for information has led to a deterioration in the effectiveness of present pattern mining techniques. The goal of this study is to provide fresh, efficient methods for mining huge data patterns. To accomplish this, a variety of methods based on the MapReduce algorithm, as well as the Apache implementation, have been proposed. The recommended algorithms may be divided into three main groups. The Attribute Mapper (AprioriMR) and recurrent AprioriMR techniques, which eliminate any previous objects in the data without employing a pruning methodology, are the first two described. Second, two recommended approaches (space-cutting AprioriMR and top AprioriMR) leverage certain well-generalized properties to improve the search space. The last approach, for mining compressed versions of frequent patterns, is the maximal AprioriMR algorithm. To evaluate the efficacy of the offered strategies, datasets with up to 3 × 10^18 transactions and more than 5 million distinct items have been examined. Evaluations of well-known and efficient pattern mining algorithms are part of the testing process. The results show both the benefit of employing the Dremel versions when taking into consideration intricate issues and the insufficiency of this strategy when dealing with small amounts of data.
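The algorithms above follow the Apriori idea expressed as MapReduce map and reduce phases. The sketch below is a single-process illustration of that counting pattern for size-2 itemsets, not the paper's AprioriMR implementations; the toy transactions and support threshold are assumptions.

```python
# Minimal single-process illustration of the map/reduce counting phase behind
# Apriori-on-MapReduce approaches (not the paper's AprioriMR implementation).
from collections import Counter
from itertools import combinations

transactions = [
    {"milk", "bread", "butter"},
    {"milk", "bread"},
    {"bread", "butter"},
    {"milk", "butter"},
]
min_support = 2

def map_phase(transaction, k):
    # Emit (itemset, 1) pairs for every size-k candidate in one transaction.
    return [(frozenset(c), 1) for c in combinations(sorted(transaction), k)]

def reduce_phase(pairs):
    # Sum the emitted counts per candidate itemset.
    counts = Counter()
    for itemset, one in pairs:
        counts[itemset] += one
    return counts

pairs = [p for t in transactions for p in map_phase(t, 2)]
frequent = {s: c for s, c in reduce_phase(pairs).items() if c >= min_support}
print(frequent)
```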
... mainly focuses on disruption, uncertainty, and risk themes. Some of the contributing authors include Singh (Lamba & Singh, 2019; Mishra & Singh, 2020), Choi (Choi et al., 2016; Araz et al., 2020), and Ivanov (Ivanov & Rozhkov, 2020). Cluster 3 (represented by blue lines) mainly focuses on firm performance and predictive analytics themes. ...
Article
Full-text available
Supply chain management has evolved from local and regional purchasing and supply activities prior to the industrial revolution to the current form of technology-led, data-driven, collaborative, and global supply network. Data-driven technologies and applications in supply chain management enable supply chain planning, performance, coordination, and decision-making. Although the literature on procurement, production, logistics, distribution, and other areas within the supply chain is rich in their respective areas, systematic analyses of supply chain analytics are relatively few. Our objective is to examine supply chain analytics research to discover its intellectual core through a detailed bibliometric analysis. Specifically, we adopt citation, cocitation, co-occurrence, and centrality analysis using data obtained from the Web of Science to identify key research themes constituting the intellectual core of supply chain analytics. We find that there has been increasing attention in research circles relating to the relevance of analytics in supply chain management and implementation. We attempt to discover the themes and sub-themes in this research area. We find that the intellectual core of SCA can be classified into three main themes (i) introduction of big data in the supply chain, (ii) adoption of analytics in different functions of operations management like logistics, pricing, location, etc., and (iii) application of analytics for improving performance and business value. The limitations of the current study and related future research directions are also presented.
... As for RM expertise, its ability and effectiveness in DFI can be improved by combining it with DA and SM competencies. By adding DA, companies can improve and optimize RM processes (Choi, Chan, & Yue, 2017; Dicuonzo, Galeone, Zappimbulso, & Dell'Atti, 2019). ...
Article
Full-text available
Due to rapid technological advancement, the financial industry is now transitioning from traditional to digitally based financial services. In Indonesia in particular, this transformation is being carried out in accordance with the idea of digital finance innovation. However, within the Indonesian financial industry, there is a competency gap that hinders the rate of progress of this transformation. Accordingly, this study aims to create a curriculum designed explicitly for digital finance innovation for Indonesian higher education institutions in order to address the existing competency gap. Through the application of multivariate regression analysis, eight required competencies and their respective subjects were identified in this study. The identified competencies can provide insights for higher education institutions in creating a curriculum to ensure that their future graduates can fill the competency gap within the Indonesian financial industry. The results of the study state that: (i) an effective and appropriate relationship between DA, RM, BM, and SM competencies can improve the ability of financial institutions to identify and manage risks in their environment; (ii) the relationship between FIK knowledge and PRO and BM competencies can help financial institutions ensure that their DFI products, such as applications, meet the standards set by regulators and improve management practices; (iii) the relationship between FL and PRO competencies can improve the organization's ability to reduce the possibility of miscommunication and misperceptions surrounding DFI ideas. Hence, these findings have implications for the university curricula that implement them, bolstering the output of graduates who can help accelerate the actualization of the digital financial innovation agendas of the Indonesian financial industry. Keywords: digital finance innovation; financial technology; higher education institutions
Article
Full-text available
Big Data is still gaining attention as a fundamental building block of the Artificial Intelligence and Machine Learning world. Therefore, a lot of effort has been pushed into Big Data research in the last 15 years. The objective of this Systematic Literature Review is to summarize the current state of the art of the previous 15 years of research about Big Data by providing answers to a set of research questions related to the main application domains for Big Data analytics; the significant challenges and limitations researchers have encountered in Big Data analysis, and emerging research trends and future directions in Big Data. The review follows a predefined procedure that automatically searches five well-known digital libraries. After applying the selection criteria to the results, 189 primary studies were identified as relevant, of which 32 were Systematic Literature Reviews. Required information was extracted from the 32 studies and summarized. Our Systematic Literature Review sketched the picture of 15 years of research in Big Data, identifying application domains, challenges, and future directions in this research field. We believe that a substantial amount of work remains to be done to align and seamlessly integrate Big Data into data-driven advanced software solutions of the future.
Chapter
The rise of cloud computing, the internet of things, and information technology has made big data technology a common concern for many professionals and researchers. A financial risk control model, known as the MSHDS-RS model, was creatively proposed in response to the present state of inappropriate feature data design in big data risk control technology. The concept is built on a multi-source heterogeneous data structure (MSHDS) and random subspace (RS). This model is novel in that it uses a normalized sparse model for feature fusion optimization to create integrated features after extracting the hard and soft features from loan customer information sources. Subsequently, a base classifier is trained on each feature subset acquired via probability sampling, and the outputs are combined and refined by the application of evidence reasoning principles. The accuracy improvement rate of the MSHDS-RS method is approximately 3.0% and 3.6% higher than that of the current PMB-RS methods under the conditions of soft feature indicators and integrated feature indicators, respectively, based on the observed results of the MSHDS-RS model under various feature sets. As a result, the suggested optimization fusion approach is trustworthy and workable. This study has helped to reduce financial risks associated with the internet and may be useful in helping lenders make wise judgments.
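The random subspace idea itself can be illustrated with a few lines of generic code: train several base classifiers on randomly chosen feature subsets and combine them by majority vote. The sketch below does only that, on synthetic data; it does not reproduce the MSHDS-RS feature fusion, probability sampling, or evidence-reasoning combination described above, and all data and parameter values are invented.

```python
# Generic random-subspace ensemble sketch (illustrative; it does not reproduce
# the MSHDS-RS feature fusion or evidence-reasoning combination steps).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Synthetic "loan customer" data: 200 samples, 12 features, binary default label.
X = rng.normal(size=(200, 12))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=200) > 0).astype(int)

n_models, subspace_size = 15, 5
models = []
for _ in range(n_models):
    feats = rng.choice(X.shape[1], size=subspace_size, replace=False)
    clf = LogisticRegression().fit(X[:, feats], y)
    models.append((feats, clf))

# Majority vote across base classifiers trained on different feature subsets.
votes = np.stack([clf.predict(X[:, feats]) for feats, clf in models])
ensemble_pred = (votes.mean(axis=0) >= 0.5).astype(int)
print("training accuracy:", (ensemble_pred == y).mean())
```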
Chapter
The study explores the role of big data analytics (BDA) in business processes by presenting a comprehensive review of listed literature in the Scopus database. 1122 studies are analyzed keyword-wise to identify the trend and future scope of BDA in business processes. The study suggests that no aspect of the business process is untouched by big data analytics. From planning to execution, manufacturing to customer satisfaction, costing to performance management, and corporate governance to public relations, all aspects of the business process have employed big data analytics tools in one way or another. BDA has also established its presence in different business domains such as the supply chain, the service industry, industry 4.0, and sustainable business practices. Internet of Things (IoT), Artificial Intelligence (AI), deep learning, decision support systems, neural networks, predictive analysis, cloud computing, and machine learning are the most commonly used big data analytics tools. Keyword analysis also provides insights into currently researched topics and the future scope of under-researched topics.
Chapter
Given the intense competition and ever-increasing demands, SMEs find themselves obligated to undertake continuous improvement actions to enhance quality, optimize costs, and reduce cycle times. The BPR method emerges today as the most effective tool to address these needs. However, it is a highly risky operation and requires proper implementation to achieve the desired outcomes. This study highlights the importance of continuous risk management in BPR projects and proposes an agile framework for practitioners for its implementation.
Preprint
Full-text available
(To appear in a revised form on Journal of Big data - Springer Nature) Big Data is still gaining attention as a fundamental building block of the Artificial Intelligence and Machine Learning world. Therefore, a lot of effort has been pushed into Big Data research in the last 15 years. The objective of this Systematic Literature Review is to summarise the current state of the art of the previous 15 years of research about Big Data by providing answers to a set of research questions related to the main application domains for Big Data analytics; the significant challenges and limitations researchers have encountered in Big Data analysis, and emerging research trends and future directions in Big Data. The review follows a predefined procedure that automatically searches five well-known digital libraries. After applying the selection criteria to the results, 189 primary studies were identified as relevant, of which 32 were Systematic Literature Reviews. Required information was extracted from the 32 studies and summarised. Our Systematic Literature Review sketched the picture of 15 years of research in Big Data, identifying application domains, challenges, and future directions in this research field. We believe that a substantial amount of work remains to be done to align and seamlessly integrate Big Data into data-driven advanced software solutions of the future.
Article
Bundling sales exhibit significant advantages and popularity on e-commerce platforms. However, information asymmetry between the platform and upstream suppliers dramatically affects bundling sales, but the influencing mechanism remains unclear. In this article, we introduce information sharing into bundling sales and focus on the interaction between information sharing decisions and bundling sales strategies. Our findings show that under the reselling mode, the platform exhibits a reluctance to share information with suppliers, and the inclination toward bundling diminishes as demand uncertainty rises. Conversely, under the agency mode, the platform tends to engage in information sharing when faced with high commission rates, while leaning toward bundling sales when the degree of substitution between individual products and bundled products, as well as commission rates, is moderate. Specifically, we find that in the agency mode, the platform's information sharing decisions inhibit its bundling sales strategies. Furthermore, we observe a gradual shift in the platform's distribution mode preference toward the reselling mode as demand uncertainty escalates. These findings offer valuable insights for platform management in optimizing the operational efficiency of the entire supply chain.
Article
Full-text available
The conducted research substantiates the relevance of, and the need to develop, adaptive software to solve the problems of applying IT in the construction industry. The basic principles of reflexive adaptation, as well as the use of software product line engineering (SPLE) technology to create adaptive software systems, are considered. A generalized architecture of an adaptive software system based on variability models and levels of reflection is proposed. A formal description of the reflexive software system for implementing variants of the adaptive behavior of application programs is given. The practical significance of the research is that endowing the applied software system with adaptive properties will enable it to modify its structure at a qualitatively new level, which in turn will increase its reliability and resistance to failures, improve its flexibility, reduce the cost of maintenance, and extend the life of the system. Such a system will be able to expand the class of solvable tasks throughout its entire life cycle, as well as perform operations that are currently considered part of the duties of certain specialists; for example, it will be able to perform self-administration functions.
Preprint
Full-text available
Intelligent and agile risk management is a proactive and adaptive approach to managing risks in organizations, including healthcare settings. It combines advanced technologies, data analytics, and agile practices to enhance risk identification, assessment, response, and monitoring. This approach leverages artificial intelligence, machine learning, and automation to gather and analyze data, predict potential risks, and support decision-making processes. By adopting intelligent and agile risk management, organizations can benefit from proactive risk identification, enhanced risk assessment accuracy, agile response strategies, improved decision-making, and optimized resource allocation. It also promotes continuous learning, stakeholder engagement, and a culture of continuous improvement. However, implementing intelligent and agile risk management comes with challenges, such as complexity, data security and privacy concerns, financial considerations, skill requirements, and the need for effective change management. To mitigate these risks, organizations should carefully plan and prepare for implementation, prioritize data security and privacy, conduct thorough cost-benefit analysis, address skill gaps, balance technology reliance with human judgment, ensure smooth integration, and establish ethical frameworks. By addressing these considerations, organizations can unlock the advantages of intelligent and agile risk management while mitigating potential risks and optimizing risk management outcomes.
Article
Full-text available
Students are the future of a nation. Personalizing student interests in higher education courses is one of the biggest challenges in higher education. Various AI and ML approaches have been used to study student behaviour. Existing AI and ML algorithms are used to identify features in various fields, such as behavioural analysis, economic analysis, image processing, and personalized medicine. However, there are major concerns about the interpretability and understandability of the decisions made by a model, because most AI algorithms are black-box models. In this study, explainable AI (XAI) is used to break the black-box nature of an algorithm and to identify engineering students' interests, and BRB and SP-LIME are used to explain which attributes are critical to their studies. We also used principal component analysis (PCA) for feature selection to identify the student cohort. Clustering the cohort helps to analyse the influential features in terms of engineering discipline selection. The results show that there are some valuable factors that influence their study and, ultimately, the future of a nation.
Article
Full-text available
The purpose of this paper is to investigate potential links between blockchain technology (BCT) and big data analytics (BDA) with supply chain risk management (SCRM) and supply chain performance (SCP) in the Jordanian Chemical and Cosmetic Industries Sector. Additionally, the paper tests a conceptual model in which SCRM carries the indirect effects. To test our proposition, data were collected from 364 employees working in the Jordanian Chemical and Cosmetic Industries Sector. The data were analyzed using structural equation modeling with the aid of the Lavaan R package. The results show that blockchain technology and big data analytics influence supply chain performance both directly and indirectly through supply chain risk management.
Chapter
This chapter considers knowledge management relative to healthcare. Information technology has been widely applied, in the form of electronic healthcare data systems and other forms of business intelligence. Data is available not only from these electronic data records, but also from social media platforms and machine-to-machine sources. Medical data comes in many forms, including biometric and text. Automated knowledge management systems can sometimes provide real-time responses to decision-making in fast-moving crises. Keywords: Knowledge management; Applications; Blockchain
Article
Compared with the traditional brick-and-mortar retail industry, e-commerce has numerous advantages in the collection of consumer demand information, and big data technology can help online platforms forecast accurate demand information from the collected data, which can then be used to improve business operations. To investigate the incentives for sharing the forecast information with or without platform encroachment, this paper establishes stylized analytical models in an e-commerce setting consisting of a manufacturer, an online platform and a third-party retailer (as a reseller). Depending on whether platform encroachment happens or not, we consider two different platform operation modes, namely, the mode without platform encroachment where only the third-party retailer sells products, and the mode with platform encroachment where both the online platform and the third-party retailer sell products. Our analysis shows that when the online platform does not encroach, it has no incentives to share forecast information with the manufacturer, but always has incentives to share forecast information with the third-party retailer. Interestingly, when the online platform encroaches, it may share forecast information, depending on the level of substitutability between products and the commission rate it charges, either only with the third-party retailer, or with both the manufacturer and the third-party retailer, or with none of them.
Article
Full-text available
A hybrid model integrates a first‐principles model with a data‐driven model which predicts certain unknown dynamics of the process, resulting in higher accuracy than first‐principles model. Additionally, a hybrid model has better extrapolation capabilities compared with a data‐based model, which is useful for process control and optimization purposes. Nonetheless, the domain of applicability (DA) of a hybrid model is finite and should be taken into account when developing a hybrid model‐based predictive controller in order to maximize its performance. To this end, a Control Lyapunov–Barrier Function‐based model predictive controller (CLBF‐based MPC) is developed which utilizes a deep hybrid model (DHM), that is, a deep neural network (DNN) combined with a first‐principles model. Additionally, theoretical guarantees are provided on stability as well as on system states to stay within the DA of the DHM. The efficacy of the proposed control framework is demonstrated on a continuous stirred tank reactor.
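A minimal sketch of the hybrid-model pattern itself (a first-principles prediction corrected by a data-driven residual model) is given below. It uses a toy scalar system and a polynomial residual fit rather than the paper's deep neural network, and it does not implement the CLBF-based MPC or its stability guarantees; all model parameters are assumed.

```python
# Minimal hybrid-model sketch: a simple first-principles step corrected by a
# data-driven residual model. This only illustrates the modelling pattern; it
# is not the paper's deep hybrid model or CLBF-based MPC.
import numpy as np

def physics_step(x, u, dt=0.1, k=1.0):
    # Assumed simplified first-principles model: dx/dt = -k*x + u
    return x + dt * (-k * x + u)

def true_plant(x, u, dt=0.1, k=1.0):
    # Pretend the real plant has an unmodelled nonlinearity.
    return x + dt * (-k * x + u + 0.3 * np.sin(x))

# Collect data and fit a polynomial residual model (the data-driven part).
rng = np.random.default_rng(1)
xs = rng.uniform(-2, 2, 300)
us = rng.uniform(-1, 1, 300)
residuals = np.array([true_plant(x, u) - physics_step(x, u) for x, u in zip(xs, us)])
coeffs = np.polyfit(xs, residuals, deg=3)   # residual depends mainly on x here

def hybrid_step(x, u):
    return physics_step(x, u) + np.polyval(coeffs, x)

x = 1.5
print("physics only:", physics_step(x, 0.2))
print("hybrid      :", hybrid_step(x, 0.2))
print("true plant  :", true_plant(x, 0.2))
```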
Article
Full-text available
This paper presents a safety-based route planner that exploits vehicle-to-cloud-to-vehicle (V2C2V) connectivity. Time and road risk index (RRI) are considered as metrics to be balanced based on user preference. To evaluate road segment risk, a road and accident database from the highway safety information system is mined with a hybrid neural network model to predict RRI. Real-time factors such as time of day, day of the week, and weather are included as correction factors to the static RRI prediction. With real-time RRI and expected travel time, route planning is formulated as a multiobjective network flow problem and further reduced to a mixed-integer programming problem. A V2C2V implementation of our safety-based route planning approach is proposed to facilitate access to real-time information and computing resources. A real-world case study, route planning through the city of Columbus, Ohio, is presented. Several scenarios illustrate how the "best" route can be adjusted to favor time versus safety metrics.
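The time-versus-safety trade-off can be sketched with a tiny weighted-graph example: combine travel time and a road risk index into one edge cost and run a shortest-path search for different weightings. The network, risk values, and scaling factor below are hypothetical, and the sketch does not reproduce the paper's neural-network RRI prediction, V2C2V architecture, or mixed-integer formulation.

```python
# Small sketch of balancing travel time against a road risk index (RRI) by
# weighting edge costs and running Dijkstra. Network data are invented.
import heapq

# (u, v): (travel_time_minutes, risk_index) for a tiny hypothetical road network.
edges = {
    ("A", "B"): (10, 0.8), ("A", "C"): (15, 0.2),
    ("B", "D"): (10, 0.9), ("C", "D"): (12, 0.3),
}

def build_graph(alpha):
    """alpha in [0, 1]: 0 = pure time, 1 = pure safety."""
    graph = {}
    for (u, v), (t, r) in edges.items():
        cost = (1 - alpha) * t + alpha * 30 * r   # 30: assumed risk-to-minutes scale
        graph.setdefault(u, []).append((v, cost))
    return graph

def dijkstra(graph, source, target):
    best, heap = {source: (0.0, [source])}, [(0.0, source, [source])]
    while heap:
        cost, node, path = heapq.heappop(heap)
        if node == target:
            return cost, path
        for nxt, w in graph.get(node, []):
            new = cost + w
            if nxt not in best or new < best[nxt][0]:
                best[nxt] = (new, path + [nxt])
                heapq.heappush(heap, (new, nxt, path + [nxt]))
    return float("inf"), []

for alpha in (0.0, 0.7):
    print(alpha, dijkstra(build_graph(alpha), "A", "D"))
# alpha = 0 favours the faster A-B-D route; alpha = 0.7 switches to the safer A-C-D.
```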
Article
Full-text available
Clustering of big data has received much attention recently. In this paper, we present a new clusiVAT algorithm and compare it with four other popular data clustering algorithms. Three of the four comparison methods are based on the well known, classical batch k-means model. Specifically, we use k-means, single pass k-means, online k-means, and clustering using representatives (CURE) for numerical comparisons. clusiVAT is based on sampling the data, imaging the reordered distance matrix to estimate the number of clusters in the data visually, clustering the samples using a relative of single linkage (SL), and then noniteratively extending the labels to the rest of the data-set using the nearest prototype rule. Previous work has established that clusiVAT produces true SL clusters in compact-separated data. We have performed experiments to show that k-means and its modified algorithms suffer from initialization issues that cause many failures. On the other hand, clusiVAT needs no initialization, and almost always finds partitions that accurately match ground truth labels in labeled data. CURE also finds SL type partitions but is much slower than the other four algorithms. In our experiments, clusiVAT proves to be the fastest and most accurate of the five algorithms; e.g., it recovers 97% of the ground truth labels in the real world KDD-99 cup data (4,292,637 samples in 41 dimensions) in 76 s.
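A stripped-down sketch of the sample-then-extend idea behind clusiVAT is shown below: cluster a small random sample with single linkage, build one prototype per cluster, and assign every remaining point to its nearest prototype. The VAT/iVAT reordered distance-matrix image used to choose the number of clusters is omitted, so the number of clusters is simply assumed, and the data are synthetic.

```python
# Sketch of "cluster a small sample, then extend labels to the full data set
# with a nearest-prototype rule". The VAT/iVAT imaging step of clusiVAT that
# estimates the number of clusters visually is omitted here.
import numpy as np
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(7)
# Three compact, separated blobs.
data = np.vstack([rng.normal(c, 0.3, size=(1000, 2)) for c in ((0, 0), (4, 0), (2, 4))])

# 1) Cluster a small random sample with single linkage.
sample_idx = rng.choice(len(data), size=150, replace=False)
sample = data[sample_idx]
labels_sample = AgglomerativeClustering(n_clusters=3, linkage="single").fit_predict(sample)

# 2) Build one prototype (mean) per cluster and extend labels non-iteratively.
prototypes = np.array([sample[labels_sample == k].mean(axis=0) for k in range(3)])
dists = np.linalg.norm(data[:, None, :] - prototypes[None, :, :], axis=2)
labels_full = dists.argmin(axis=1)
print(np.bincount(labels_full))   # roughly 1000 points per cluster
```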
Article
Full-text available
Scheduling of dynamic and multitasking workloads for big-data analytics is a challenging issue, as it requires a significant amount of parameter sweeping and iterations. Therefore, real-time scheduling becomes essential to increase the throughput of many-task computing. The difficulty lies in obtaining a series of optimal yet responsive schedules. In dynamic scenarios, such as virtual clusters in cloud, scheduling must be processed fast enough to keep pace with the unpredictable fluctuations in the workloads to optimize the overall system performance. In this paper, ordinal optimization using rough models and fast simulation is introduced to obtain suboptimal solutions in a much shorter timeframe. While the scheduling solution for each period may not be the best, ordinal optimization can be processed fast in an iterative and evolutionary way to capture the details of big-data workload dynamism. Experimental results show that our evolutionary approach compared with existing methods, such as Monte Carlo and Blind Pick, can achieve higher overall average scheduling performance, such as throughput, in real-world applications with dynamic workloads. Furthermore, performance improvement is seen by implementing an optimal computing budget allocating method that smartly allocates computing cycles to the most promising schedules.
Article
Full-text available
In the big data era, systems reliability is critical to effective systems risk management. In this paper, a novel multiobjective approach, with hybridization of a known algorithm called NSGA-II and an adaptive population-based simulated annealing (APBSA) method is developed to solve the systems reliability optimization problems. In the first step, to create a good algorithm, we use a coevolutionary strategy. Since the proposed algorithm is very sensitive to parameter values, the response surface method is employed to estimate the appropriate parameters of the algorithm. Moreover, to examine the performance of our proposed approach, several test problems are generated, and the proposed hybrid algorithm and other commonly known approaches (i.e., MOGA, NRGA, and NSGA-II) are compared with respect to four performance measures: 1) mean ideal distance; 2) diversification metric; 3) percentage of domination; and 4) data envelopment analysis. The computational studies have shown that the proposed algorithm is an effective approach for systems reliability and risk management.
Article
Full-text available
Internet of Things (IoT) has provided a promising opportunity to build powerful industrial systems and applications by leveraging the growing ubiquity of radio-frequency identification (RFID), and wireless, mobile, and sensor devices. A wide range of industrial IoT applications have been developed and deployed in recent years. In an effort to understand the development of IoT in industries, this paper reviews the current research of IoT, key enabling technologies, major IoT applications in industries, and identifies research trends and challenges. A main contribution of this review paper is that it summarizes the current state-of-the-art IoT in industries systematically.
Article
Full-text available
Cloud computing introduces flexibility in the way an organization conducts its business. On the other hand, it is advisable for organizations to select cloud service partners based on how prepared they are, owing to the uncertainties present in the cloud. This study is conceptual research that investigates the impact of some of these uncertainties and flexibilities embedded in the cloud. First, we look at the assessment of security and how it can impact supply chain operations, using entropy as an assessment tool. Based on queuing theory, we look at how scalability can moderate the relationship between cloud service and the purported benefits. We aim to show that cloud service can only prove beneficial to supply partners under a highly secured, highly scalable computing environment, and hope to lend credence to the need for systems thinking as well as strategic thinking when making cloud service adoption decisions.
Article
Full-text available
Combining ideas from evolutionary algorithms, decomposition approaches, and Pareto local search, this paper suggests a simple yet efficient memetic algorithm for combinatorial multiobjective optimization problems: memetic algorithm based on decomposition (MOMAD). It decomposes a combinatorial multiobjective problem into a number of single objective optimization problems using an aggregation method. MOMAD evolves three populations: 1) population $P_{L}$ for recording the current solution to each subproblem; 2) population $P_{P}$ for storing starting solutions for Pareto local search; and 3) an external population $P_{E}$ for maintaining all the nondominated solutions found so far during the search. A problem-specific single objective heuristic can be applied to these subproblems to initialize the three populations. At each generation, a Pareto local search method is first applied to search a neighborhood of each solution in $P_{P}$ to update $P_{L}$ and $P_{E}$. Then a single objective local search is applied to each perturbed solution in $P_{L}$ for improving $P_{L}$ and $P_{E}$, and reinitializing $P_{P}$. The procedure is repeated until a stopping condition is met. MOMAD provides a generic hybrid multiobjective algorithmic framework in which problem-specific knowledge, well-developed single-objective local search heuristics, and Pareto local search methods can be hybridized. It is a population based iterative method and thus an anytime algorithm. Extensive experiments have been conducted in this paper to study MOMAD and compare it with some other state-of-the-art algorithms on the multiobjective traveling salesman problem and the multiobjective knapsack problem. The experimental results show that our proposed algorithm outperforms or performs similarly to the best-so-far heuristics on these two problems.
Article
Full-text available
The outputs of upstream individual processes (members) become the inputs of downstream members in supply chains. When multiple inputs and outputs are present, data envelopment analysis has been widely applied to assess efficiency. In cooperative groups, such as supply chains, a maximin decision approach can reflect not only overall system efficiency, but also the efficiency of system elements. This paper discusses a maximin efficiency multistage supply chain model capable of measuring supply chain members' performance as well as overall supply chain performance.
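For readers unfamiliar with how DEA efficiency scores are computed, the sketch below solves the standard input-oriented CCR multiplier model for each decision-making unit as a linear program. The input/output data are invented, and the paper's maximin multistage supply chain formulation is not reproduced.

```python
# Sketch of the standard input-oriented CCR (DEA) multiplier model, solved as
# a linear program for each decision-making unit (DMU). Data are hypothetical;
# the cited paper's maximin multistage formulation is not reproduced.
import numpy as np
from scipy.optimize import linprog

# rows = DMUs (e.g., supply chain members); columns = inputs / outputs.
X = np.array([[2.0, 3.0], [4.0, 1.0], [3.0, 2.0]])   # inputs
Y = np.array([[5.0], [4.0], [6.0]])                   # outputs

def ccr_efficiency(j, X, Y):
    n, m = X.shape
    s = Y.shape[1]
    # Decision variables: output weights u (length s), input weights v (length m).
    c = np.concatenate([-Y[j], np.zeros(m)])           # maximize u . y_j
    A_eq = np.array([np.concatenate([np.zeros(s), X[j]])])
    b_eq = [1.0]                                       # normalize v . x_j = 1
    A_ub = np.hstack([Y, -X])                          # u . y_k - v . x_k <= 0 for all k
    b_ub = np.zeros(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * (s + m))
    return -res.fun

for j in range(len(X)):
    print(f"DMU {j}: efficiency = {ccr_efficiency(j, X, Y):.3f}")
```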
Article
Full-text available
The advances in cloud computing and internet of things (IoT) have provided a promising opportunity to resolve the challenges caused by the increasing transportation issues. We present a novel multilayered vehicular data cloud platform by using cloud computing and IoT technologies. Two innovative vehicular data cloud services, an intelligent parking cloud service and a vehicular data mining cloud service, for vehicle warranty analysis in the IoT environment are also presented. Two modified data mining models for the vehicular data mining cloud service, a Naïve Bayes model and a Logistic Regression model, are presented in detail. Challenges and directions for future work are also provided.
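As a generic illustration of the two model families mentioned above (not the paper's modified models or the vehicular data cloud services), the sketch below fits a Gaussian Naive Bayes and a logistic regression classifier to synthetic warranty-claim data; all feature names and coefficients are hypothetical.

```python
# Generic illustration of Naive Bayes vs. logistic regression for warranty
# claim prediction on synthetic data. The paper's modified models and cloud
# services are not reproduced here.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
# Hypothetical features: mileage (10k km), vehicle age (years), fault-code count.
X = np.column_stack([rng.uniform(0, 20, 1000),
                     rng.uniform(0, 10, 1000),
                     rng.poisson(2, 1000)])
# Hypothetical label: 1 = warranty claim filed.
logit = 0.15 * X[:, 0] + 0.3 * X[:, 1] + 0.5 * X[:, 2] - 4.0
y = (rng.random(1000) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
for model in (GaussianNB(), LogisticRegression(max_iter=1000)):
    acc = model.fit(X_tr, y_tr).score(X_te, y_te)
    print(type(model).__name__, "test accuracy:", round(acc, 3))
```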
Article
Full-text available
This article focuses on a scalable software platform for the Smart Grid cyber-physical system using cloud technologies. Dynamic Demand Response (D2R) is a challenge-application to perform intelligent demand-side management and relieve peak load in smart power grids. The platform offers an adaptive information integration pipeline for ingesting dynamic data; a secure repository for researchers to share knowledge; scalable machine-learning models trained over massive datasets for agile demand forecasting; and a portal for visualizing consumption patterns. It has been validated at the University of Southern California's campus microgrid. The article examines the role of clouds and their tradeoffs for use in the Smart Grid cyber-physical system.
Article
Full-text available
Extracting useful knowledge from huge digital datasets requires smart and scalable analytics services, programming tools, and applications.
Article
Full-text available
Finding data governance practices that maintain a balance between value creation and risk exposure is the new organizational imperative for unlocking competitive advantage and maximizing value from the application of big data. In the first Web extra (http://youtu.be/B2RlkoNjrzA), author Paul Tallon expands on his article "Corporate Governance of Big Data: Perspectives on Value, Risk, and Cost." In the second Web extra (http://youtu.be/g0RFa4swaf4), he discusses the article's supplementary material and how projection models can help individuals responsible for data handling plan for and understand big data storage issues.
Article
Full-text available
Electromagnetic interference, equipment noise, multi-path effects and obstructions in harsh smart grid environments make quality-of-service (QoS) communication a challenging task for WSN-based smart grid applications. To address these challenges, a cognitive-communication-based cross-layer framework has been proposed. The proposed framework exploits the emerging cognitive radio technology to mitigate noisy and congested spectrum bands, yielding reliable and high-capacity links for wireless communication in smart grids. To meet the QoS requirements of diverse smart grid applications, it differentiates the traffic flows into different priority classes according to their QoS needs and maintains three-dimensional service queues attributing delay, bandwidth and reliability of data. The problem is formulated as a Lyapunov drift optimization with the objective of maximizing the weighted service of the traffic flows belonging to different classes. A suboptimal distributed control algorithm (DCA) is presented to efficiently support QoS through channel control, flow control, scheduling and routing decisions. In particular, the contributions of this paper are threefold: employing dynamic spectrum access to mitigate channel impairments, defining multi-attribute priority classes, and designing a distributed control algorithm for data delivery that maximizes the network utility under QoS constraints. Performance evaluations in ns-2 reveal that the proposed framework achieves the required QoS communication in smart grids.
Article
Full-text available
Risk management has become a vital topic both in academia and practice during the past several decades. Most business intelligence tools have been used to enhance risk management, and the risk management tools have benefited from business intelligence approaches. This introductory article provides a review of the state-of-the-art research in business intelligence in risk management, and of the work that has been accepted for publication in this issue.
Article
This paper demonstrates that an important role of intermediaries in supply chains is to reduce the financial risk faced by retailers. It is well known that risk averse retailers when faced by the classical single-period inventory (newsvendor) problem will order less than the expected value maximizing (newsboy) quantity. We show that in such situations a risk neutral distributor can offer a menu of mutually beneficial contracts to the retailers. We show that a menu can be designed to simultaneously: (i) induce every risk averse retailer to select a unique contract from it; (ii) maximize the distributor's expected profit; and (iii) raise the order quantity of the retailers to the expected value maximizing quantity. Thus inefficiency created due to risk aversion on part of the retailers can be avoided. We also investigate the influence of product/market characteristics on the offered menu of contracts.
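The expected-value-maximizing (newsvendor) benchmark that the contracts above are designed to restore can be computed in a few lines; the sketch below uses assumed price, cost, salvage, and normal-demand parameters, and it does not model risk aversion or the contract menu itself.

```python
# Worked example of the classical newsvendor (expected-value-maximizing)
# quantity under normally distributed demand. Parameters are illustrative;
# the contract-menu design of the cited paper is not reproduced.
from scipy.stats import norm

price, cost, salvage = 10.0, 4.0, 1.0          # assumed illustrative parameters
mu, sigma = 100.0, 20.0                        # demand ~ Normal(mu, sigma)

underage = price - cost                         # profit lost per unit of unmet demand
overage = cost - salvage                        # loss per leftover unit
critical_ratio = underage / (underage + overage)
q_star = mu + sigma * norm.ppf(critical_ratio)

print(f"critical ratio = {critical_ratio:.3f}, q* = {q_star:.1f}")
# A risk-averse retailer would typically order less than q*; the menu of
# contracts described above is designed to restore this quantity.
```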
Chapter
Researchers work with sensitive information every day and are often the ones responsible for managing how confidential data gets stored in the database and files. Risk management is involved in preventing our organization and/or our donors from harm, and it's the "or" that separates risk from ethics. Ethics is related to risk, but where risk is affected by external factors, ethics are internal: they reside within each individual and within an organizational culture. Ethics and risk guidelines aren't just fuzzy feel-good notions to uphold; they protect you, your board, your organization, and your donors from real harm. Professional associations for frontline fundraisers and prospect researchers such as the Association of Fundraising Professionals (AFP), the Association for Healthcare Philanthropy (AHP), and the Association of Professional Researchers for Advancement (APRA) keep current on guidelines as they evolve, and are great resources if you have questions or concerns about your organization and compliance. Keywords: Business ethics; Nonprofit organizations; Philanthropy; Prospect research; Risk management
Article
Data mining techniques identify relationships, patterns, trends, and predictive information from large and complex databases. This study demonstrates the use of a data mining technique to assess the risk of management fraud. We use a data mining tool to analyze management fraud data, namely the presence or absence of red flags in fraud and no-fraud cases, collected by a Big Six firm. The ensuing results compare favorably with the statistical and neural network results obtained by other studies. The study illustrates the ease of using data mining techniques by demonstrating the rapid development of models, querying capabilities, and ease of encoding statistical models in audit decision making.
Article
Every day, a constant stream of data is generated as a result of social interactions, the Internet of things, e-commerce, and other business processes. This vast amount of data should be collected, stored, transformed, monitored, and analyzed in a relatively brief period of time, because it may contain the answers to business insights and new ideas that foster competitiveness and innovation. Big Data technologies and methodologies have emerged as the solution to this need. However, as a relatively new trend, much about them remains unknown. This study, based on a risk and benefits perspective, uses the theory of planned behavior to develop a model that predicts the intention to adopt Big Data technologies.
Article
The development of the theory and application of Markov chain Monte Carlo methods, vast improvements in computational capabilities, and emerging software alternatives have made it possible for more frequent use of Bayesian methods in reliability applications. Bayesian methods, however, remain controversial in reliability (and some other applications) because of the concern about where the needed prior distributions should come from. On the other hand, there are many applications where engineers have solid prior information on certain aspects of their reliability problems based on physics of failure or previous experience with the same failure mechanism. For example, engineers often have useful but imprecise knowledge about the effective activation energy in a temperature-accelerated life test or about the Weibull shape parameter in the analysis of fatigue-failure data. In such applications, the use of Bayesian methods is compelling, as it offers an appropriate compromise between assuming that such quantities are known and assuming that nothing is known. In this paper, we compare the use of Bayesian methods with the traditional maximum-likelihood methods for a group of examples, including the analysis of field data with multiple censoring, accelerated life-test data, and accelerated degradation-test data.
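A minimal, self-contained example of the Bayesian-versus-maximum-likelihood comparison is possible for an exponential lifetime model with right censoring, where a gamma prior is conjugate; the failure data and prior below are invented, and the paper's Weibull and accelerated life/degradation test analyses are not reproduced.

```python
# Minimal Bayesian-vs-ML comparison for an exponential lifetime model with
# right censoring (a gamma prior is conjugate). Data and prior are hypothetical.
import numpy as np
from scipy.stats import gamma

# Failure times (hours) and right-censored run times from a hypothetical test.
failures = np.array([120.0, 340.0, 515.0, 80.0])
censored = np.array([600.0, 600.0, 600.0])

r = len(failures)                       # number of observed failures
T = failures.sum() + censored.sum()     # total time on test

lambda_mle = r / T                      # maximum-likelihood failure rate

# Gamma(a0, b0) prior on the failure rate encodes imprecise prior knowledge,
# e.g. "roughly one failure per 500 hours, weakly held".
a0, b0 = 1.0, 500.0
posterior = gamma(a=a0 + r, scale=1.0 / (b0 + T))

print(f"MLE rate            : {lambda_mle:.5f} per hour")
print(f"Posterior mean rate : {posterior.mean():.5f} per hour")
print("95% credible interval:", posterior.interval(0.95))
```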
Article
A framework of comprehensive defense architecture for power system security and stability is presented. From the general safety concept, the comprehensive defense architecture should include the power system security assurance system, i.e. active safety system, and power system stability control system, i.e. passive safety system. Power system security assurance system (active safety) refers to the measures to improve power system security and controllability. Power system stability control system (passive safety) refers to the measures to maintain power system security and stability during disturbances. The power system security assurance system (active safety) can be set to three-defense lines. The first defense line is a strong power grid structure, to lay a solid foundation for the power system security; the second defense line is the optimized automatic control system, to enhance the safe operation of power system; the third defense line is a safe operation planning to ensure that the power system operates at a safe level. Power system stability control system (passive safety) is the traditional three-defense lines for power system security and stability. The first defense line is rapid removal of faulty components, to prevent the failure to expand; the second defense line is the control measures to maintain power system stability; the third defense line is the measures to prevent widespread blackouts when power system loses stability.
Article
Motivated by the presence of loss-averse decision making behavior in practice, this article considers a supply chain consisting of a firm and strategic consumers who possess an S-shaped loss-averse utility function. In the model, consumers decide the purchase timing and the firm chooses the inventory level. We find that the loss-averse consumers' strategic purchasing behavior is determined by their perceived gain and loss from strategic purchase delay, and the given rationing risk. Thus, the firm that is cognizant of this property tailors its inventory stocking policy based on the consumers' loss-averse behavior such as their perceived values of gain and loss, and their sensitivity to them. We also demonstrate that the firm's equilibrium inventory stocking policy reflects both the economic logic of the traditional newsvendor inventory model, and the loss-averse behavior of consumers. The equilibrium order quantity is significantly different from those derived from models that assume that the consumers are risk neutral and homogeneous in their valuations. We show that the firm that ignores strategic consumer's loss-aversion behavior tends to keep an unnecessarily high inventory level that leads to excessive leftovers. Our numerical experiments further reveal that in some extreme cases the firm that ignores strategic consumer's loss-aversion behavior generates almost 92% more leftovers than the firm that possesses consumers’ loss-aversion information and takes it into account when making managerial decisions. To mitigate the consumer's forward-looking behavior, we propose the adoption of the practice of agile supply chain management, which possesses the following attributes: (i) procuring inventory after observing real-time demand information, (ii) enhanced design (which maintains the current production mix but improves the product performance to a higher level), and (iii) customized design (which maintains the current performance level but increases the variety of the current production line to meet consumers’ specific demands). We show that such a practice can induce the consumer to make early purchases by increasing their rationing risk, increasing the product value, or diversifying the product line. © 2015 Wiley Periodicals, Inc. Naval Research Logistics, 2015
Article
As massive data acquisition and storage becomes increasingly affordable, a wide variety of enterprises are employing statisticians to engage in sophisticated data analysis. In this paper we highlight the emerging practice of Magnetic, Agile, Deep (MAD) data analysis as a radical departure from traditional Enterprise Data Warehouses and Business Intelligence. We present our design philosophy, techniques, and experience providing MAD analytics for one of the world's largest advertising networks at Fox Audience Network, using the Greenplum parallel database system. We describe database design methodologies that support the agile working style of analysts in these settings. We present data-parallel algorithms for sophisticated statistical techniques, with a focus on density methods. Finally, we reflect on database system features that enable agile design and flexible algorithm development using both SQL and MapReduce interfaces over a variety of storage mechanisms.
Article
Within the past few years, organizations in diverse industries have adopted MapReduce-based systems for large-scale data processing. Along with these new users, important new workloads have emerged which feature many small, short, and increasingly interactive jobs in addition to the large, long-running batch jobs for which MapReduce was originally designed. As interactive, large-scale query processing is a strength of the RDBMS community, it is important that lessons from that field be carried over and applied where possible in this new domain. However, these new workloads have not yet been described in the literature. We fill this gap with an empirical analysis of MapReduce traces from six separate business-critical deployments inside Facebook and at Cloudera customers in e-commerce, telecommunications, media, and retail. Our key contribution is a characterization of new MapReduce workloads which are driven in part by interactive analysis, and which make heavy use of query-like programming frameworks on top of MapReduce. These workloads display diverse behaviors which invalidate prior assumptions about MapReduce such as uniform data access, regular diurnal patterns, and prevalence of large jobs. A secondary contribution is a first step towards creating a TPC-like data processing benchmark for MapReduce.
Article
Big data systems can use different deployment paradigms based on workloads and access patterns. Opportunities for tighter integration will enable big data systems to leverage public cloud environments more effectively. As both public cloud and big data systems see more adoption and new usage paradigms evolve, new features and enhancements will help make the intersection of the two worlds broader and more mature.
Article
This paper studies the newsvendor problem in the presence of consumer behavior, specifically social interaction. We show that deterministic consumer valuation of products derived from social interaction can be an advantage for firms. This paper also examines the implications of random consumer product valuation and of a lower threshold number of subscribers for the proposed deal. Several managerial implications are derived.
Article
The trend of globalization and outsourcing makes supply unreliable and companies begin to have supplier diversity embedded into their procurement departments. Traditionally, contract suppliers are a major supply channel for many companies, while the effectiveness of reactive supply sources, such as spot markets, is often ignored. Spot markets have negligible lead times and higher average prices comparing with contract suppliers. In our research, procurement utilizes the combinatorial benefits of proactive supply (a contract supplier) and reactive supply (a spot market). The uncertainties of yields, spot prices, and demand, and the correlations among them are also taken into consideration when designing procurement plans. The objectives of this paper were to evaluate the effectiveness of dealing with uncertain supply using the spot market along with the contract supplier and to model the dependences among all the potential uncertainties. This research also seeks high expected profits without overlooking the associated variances. The analytical expression to determine the optimal order quantity is obtained under the most general situations where commodities can be both bought and sold via the spot market. Some properties are derived to provide useful managerial insights. In addition, reference scenarios, such as pure contract sourcing and the spot market restricted for buying or selling only, are included for comparison purposes.
Article
The two biggest trends in the data center today are cloud computing and big data. This column will examine the intersection of the two. Industry hype has resulted in nebulous definitions for each, so I'll start by defining terms.
Conference Paper
Big Data has arrived with great haste and is a key enabler for the social business: Big Data presents an opportunity to create extraordinary business advantage and better service delivery, and it is bringing a positive change to the decision-making processes of various business organizations. Alongside these offerings, Big Data raises several issues and challenges related to its management, processing, and analysis. Big Data has challenges related to volume, velocity, and variety, the 3Vs: Volume means a large amount of data, Velocity means data arrives at high speed, and Variety means data comes from heterogeneous sources. In the Big Data definition, "Big" means a dataset that grows so much that it becomes difficult to manage using existing data management concepts and tools. MapReduce plays a very significant role in the processing of Big Data. This paper gives a brief overview of Big Data and its related issues and emphasizes the role of MapReduce in Big Data processing. MapReduce is elastically scalable, efficient, and fault tolerant for analysing large sets of data, and the paper highlights the features of MapReduce in comparison with other design models, which makes it a popular tool for processing large-scale data. An analysis of the performance factors of MapReduce shows that eliminating their inverse effects by optimization improves the performance of MapReduce.
Conference Paper
The concept of disassembly-to-order (DTO) has recently become popular. The goal of DTO is to determine the optimum number of end-of-life (EOL) products to be disassembled in order to fulfill the demand for components and materials such that some desired criteria of the system are satisfied. However, the outcome of this problem is fraught with errors. This is due to the unpredictable circumstances of the EOL products, which stem from many sources such as the operating environment, different usage patterns, and customer upgrades. If one could get advance information about the status of the products, it could prove to be quite invaluable in making EOL management decisions. Advance product information consists of two types of data, viz., static and dynamic. The static data consists of the product name, the brand name, the model type, etc. The dynamic data consists of cumulative data covering the circumstances to which the product was subjected during its useful life. Capturing these data has become an important goal of many manufacturers. Numerous technological advances and the availability of various monitoring devices, embedded in products, offer us many product monitoring and data collection alternatives. In this paper, an integer program is developed to model and solve the DTO problem that utilizes the captured data from EOL products. A numerical example is considered to illustrate the use of this methodology.
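The decision itself can be illustrated with a tiny brute-force search rather than the paper's integer program: pick how many units of each EOL product type to disassemble so that component demand is covered at minimum cost. All yields, costs, and demands below are hypothetical.

```python
# Tiny brute-force illustration of the disassembly-to-order (DTO) decision:
# choose how many units of each EOL product type to disassemble so that
# component demand is met at minimum cost. Yields, costs, and demands are
# hypothetical; the cited paper formulates this as an integer program instead.
from itertools import product

# yields[p][c]: good components of type c recovered per unit of product p.
yields = {"phone_A": {"screen": 1, "battery": 1, "board": 1},
          "phone_B": {"screen": 1, "battery": 2, "board": 0}}
disassembly_cost = {"phone_A": 5.0, "phone_B": 3.0}
demand = {"screen": 40, "battery": 70, "board": 25}
max_units = 80   # search bound per product type

best = None
for qa, qb in product(range(max_units + 1), repeat=2):
    plan = {"phone_A": qa, "phone_B": qb}
    supply = {c: sum(q * yields[p][c] for p, q in plan.items()) for c in demand}
    if all(supply[c] >= demand[c] for c in demand):
        cost = sum(q * disassembly_cost[p] for p, q in plan.items())
        if best is None or cost < best[0]:
            best = (cost, plan)

print("minimum cost:", best[0], "plan:", best[1])
```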
Article
Purpose: Revenue maximization through improving click-throughs is of great importance for price comparison shopping services (PCSSs), whose revenues directly depend on the number of click-throughs of items in their itemsets. The purpose of this paper is to present an approach aiming to maximize the revenue of a PCSS by proposing effective itemset construction methods that can maximize the click-throughs. Design/methodology/approach: The authors suggest three itemset construction methods, namely the naïve method (NM), the exhaustive method (EM), and the local update method (LM). Specifically, NM searches for the best itemset for an item in terms of textual similarity between an item and an itemset, while EM produces the best itemset for each item for maximizing click-throughs by considering all the possible memberships of the item. Finally, through combining NM and EM, the authors propose LM, which attempts to improve click-throughs by locally updating the memberships of items according to their ranks in each itemset. Findings: Through evaluation of the proposed methods based on a real-world dataset, it has been found that improvement of click-throughs is small when itemsets are constructed by using the textual similarity alone. However, significant improvement in the number of click-throughs was achieved when considering items' membership updates dynamically. Originality/value: Unlike previous studies that mainly focus on textual similarity, the authors attempt to maximize the revenue through constructing itemsets that can result in more click-throughs. By using the proposed methods, it is expected that PCSSs will be able to automatically construct itemsets that can maximize their revenues without the need for manual work.
Conference Paper
Microfinance institutions are established with the mission and goal of serving poor people who lack financial services from conventional financial institutions but are capable of engaging in small-scale economic activities. These institutions are found to be quite relevant in the Ethiopian context, and in their current state they were established very recently. To date, there are more than 20 recognized microfinance institutions in Ethiopia. Wisdom Microfinance is one of these accredited institutions. This study was aimed at exploring the potential application of data mining techniques for supporting key operational and strategic decisions of microfinance institutions. Qualitative and quantitative data were gathered and used in the study. The qualitative data revealed facts regarding the information requirements of the company and helped the investigator to understand the business rules. The quantitative data were used for conducting model-building experiments. The WEKA data mining software was used for model building and analysis. Three sets of classifier algorithms were used in the experiment. It was found that the J48 classifier delivered good results in terms of classifying instances correctly. The study recommends that future researchers experiment with model building using multiple algorithms and larger datasets.
Article
Internet of Things (IoT) software is required not only to dispose of huge volumes of real-time and heterogeneous data, but also to support different complex applications for business purposes. Using an ontology approach, a Configurable Information Service Platform is proposed for the development of IoT-based applications. Based on an abstract information model, information encapsulating, composing, decomposing, transferring, tracing, and interacting in Product Lifecycle Management can be carried out. Combining ontology and representational state transfer (REST)-ful services, the platform provides an information support base both for data integration and intelligent interaction. A case study is given to verify the platform. It is shown that the platform provides a promising way to realize IoT applications at the semantic level.
Conference Paper
In contemporary society, with supply chains becoming more and more complex, the data in supply chains increases in volume, variety, and velocity. Big data has arisen at the proper time and under the right conditions to offer advantages to the nodes in supply chains for solving previously difficult problems. For any big data project to succeed, it must first depend on high-quality data and not merely on quantity. Further, it will become increasingly important in many big data projects to add external data to the mix, and companies will eventually turn from only looking inward to also looking outward into the market, which means the use of big data must be broadened considerably. Hence the data supply chains, both internal and external, become of prime importance. ICT (information and communications technology) supply chain management is especially important, as supply chains link the world closely and the ICT supply chain is the base of all supply chains in today's world. Though many initiatives for supply chain security have been developed and put into practice, most of them emphasize the physical supply chain, which is concerned with transporting cargo. Research on ICT supply chain security is still at a preliminary stage. The use of big data can promote the normal operation of the ICT supply chain, as it greatly improves data collecting and processing capacity; in turn, the ICT supply chain is a necessary carrier of big data, as it produces all the software, hardware, and infrastructure for big data's collection, storage, and application. The close relationship between big data and the ICT supply chain makes it an effective way to do research on big data security through analysis of ICT supply chain security. This paper first analyzes the security problems that the ICT supply chain is facing in information management, system integrity, and cyberspace, and then introduces several well-known international models for both the physical supply chain and the ICT supply chain. After that, the authors describe a case of communication equipment with big data in the ICT supply chain and propose a series of recommendations conducive to developing a secure big data supply chain from five dimensions.
Article
Reliability growth testing becomes difficult to implement as the product development cycle continues to shrink. As a result, the new design is prone to latent failures due to design immaturity and uncertain operating condition. Reliability growth planning emerged as a new methodology to drive the reliability across the product lifetime. We propose a multiphase reliability growth model that sequentially determines and implements corrective actions (CAs) against surfaced and latent failure modes. Such a holistic approach enables the manufacturer to attain the reliability goal while ensuring the product time to market. We devise a CA effectiveness function to assess the tradeoff between the failure removal rate and the required resources. Rosen's gradient projection algorithm is used to determine the optimal resource allocation in each phase. The applicability and performance of the reliability growth model are demonstrated on a fleet of semiconductor testing equipment.
Article
Reliability analysis of phased-mission systems (PMS) must consider the statistical dependences of element states across different phases as well as changes in system configuration, success criteria, and component behavior. This paper proposes a recursive method for the exact reliability evaluation of PMS consisting of nonidentical independent nonrepairable multistate elements. The method is based on conditional probabilities and the branch-and-bound principle. It is invariant to changes in system structure, demand, and the elements' state transition rates among phases. The main advantage of this method is that it does not require the composition of decision diagrams and can be fully automated. Both analytical and numerical examples are presented to illustrate the application and advantages of the proposed method. The computational performance of the proposed algorithm is illustrated through comprehensive experimentation on the CPU running time of the algorithm.
Article
Cloud computing, rapidly emerging as a new computation paradigm, provides agile and scalable resource access in a utility-like fashion, especially for the processing of big data. An important open issue is how to efficiently move data, generated at different geographical locations over time, into a cloud for effective processing. The de facto approach of hard drive shipping is neither flexible nor secure. This work studies timely, cost-minimizing upload of massive, dynamically generated, geo-dispersed data into the cloud for processing using a MapReduce-like framework. Targeting a cloud encompassing disparate data centers, we model a cost-minimizing data migration problem and propose two online algorithms: an online lazy migration (OLM) algorithm and a randomized fixed horizon control (RFHC) algorithm, for optimizing at any given time the choice of the data center for data aggregation and processing, as well as the routes for transmitting data there. Careful comparisons among these online and offline algorithms in realistic settings are conducted through extensive experiments, which demonstrate close-to-offline-optimum performance of the online algorithms.
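As a rough intuition for the "lazy" idea, the toy policy below keeps aggregating data at the current data center and migrates only once the accumulated extra cost, relative to the currently cheapest center, exceeds a multiple of the one-off migration cost. The cost model, the beta threshold, and the example data are illustrative assumptions; this does not reproduce the OLM or RFHC algorithms of the paper.

def lazy_migration(per_step_costs, migration_cost, beta=1.0, start=0):
    # per_step_costs[t][d]: cost of aggregating time step t's data at data center d.
    current, regret, schedule, total = start, 0.0, [], 0.0
    for costs in per_step_costs:
        best = min(range(len(costs)), key=costs.__getitem__)
        regret += costs[current] - costs[best]        # extra cost of staying put
        if best != current and regret > beta * migration_cost:
            current, regret = best, 0.0
            total += migration_cost                   # pay once to move the aggregation point
        total += costs[current]
        schedule.append(current)
    return schedule, total

# Example: three data centers with time-varying upload-plus-processing costs.
costs = [[4, 6, 9], [5, 6, 8], [7, 3, 8], [8, 2, 9], [8, 2, 9], [3, 7, 9]]
schedule, total = lazy_migration(costs, migration_cost=4.0, beta=1.0)
print("aggregation schedule:", schedule, "total cost:", total)

The threshold structure mirrors the classic ski-rental argument: by delaying a switch until the accumulated regret is comparable to the switching cost, a lazy policy can avoid both excessive migrations and unbounded losses from staying put.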
Article
Government snooping, recently publicized, is now using the same data sources that corporations use to watch us. The same records used by federal agencies to search for terrorists are used, with fewer controls, by corporations searching for customers.
Article
In this paper, a novel knowledge-based global operation approach is proposed to minimize the effect on production performance caused by unexpected variations in the operation of a mineral processing plant subject to uncertainties. For this purpose, a feedback compensation and adaptation signal discovered from process operational data is employed to construct a closed-loop dynamic operation strategy. The signal regulates the outputs of the existing open-loop, steady-state-based system so as to compensate for the uncertainty in steady-state operation at the plant-wide level. The utilization mechanism of operational data through constructing increment association rules is first described. Then, a rough set based rule extraction approach is developed to generate the compensation rules. This involves two steps: determining the variables to be compensated based on the significance of attributes in rough set theory, and extracting the compensation rules from process data. Based on the operational data of the mineral processing plant, relevant rules are obtained. Both simulation and industrial experiments are carried out for the proposed global operation strategy, and the effectiveness of the approach is clearly demonstrated.
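To illustrate the attribute-significance step, the minimal rough-set sketch below ranks condition attributes by how much the dependency degree of the decision attribute drops when each attribute is removed. The toy operational data table (feed rate, ore grade, reagent dose, quality class) is invented for illustration and is not the plant data or the rule-extraction procedure of the cited study.

from collections import defaultdict

def positive_region(rows, attrs, decision):
    # Objects whose equivalence class (w.r.t. attrs) is consistent in the decision.
    blocks = defaultdict(list)
    for idx, row in enumerate(rows):
        blocks[tuple(row[a] for a in attrs)].append(idx)
    pos = set()
    for members in blocks.values():
        if len({rows[i][decision] for i in members}) == 1:
            pos.update(members)
    return pos

def dependency(rows, attrs, decision):
    # Dependency degree gamma = |POS_attrs(decision)| / |U|.
    return len(positive_region(rows, attrs, decision)) / len(rows)

# Toy table: discretized feed rate / ore grade / reagent dose -> product quality class.
rows = [
    {"feed": "high", "grade": "low",  "dose": "high", "quality": "poor"},
    {"feed": "high", "grade": "high", "dose": "high", "quality": "good"},
    {"feed": "low",  "grade": "high", "dose": "low",  "quality": "good"},
    {"feed": "low",  "grade": "low",  "dose": "low",  "quality": "poor"},
    {"feed": "high", "grade": "low",  "dose": "low",  "quality": "poor"},
    {"feed": "low",  "grade": "high", "dose": "high", "quality": "good"},
]
conditions = ["feed", "grade", "dose"]
gamma_all = dependency(rows, conditions, "quality")
for a in conditions:
    reduced = [c for c in conditions if c != a]
    print(a, "significance:", round(gamma_all - dependency(rows, reduced, "quality"), 3))

In this toy table the ore grade carries essentially all of the discriminating power, so it would be the natural first candidate among the variables to be compensated; on real plant data the ranking would of course come from the measured operational records.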
Article
Automated visual surveillance systems are attracting extensive interest due to public security concerns. In this paper, we attempt to mine semantic context information, including object-specific context information and scene-specific context information (learned from object-specific context information), to build an intelligent system with robust object detection, tracking, classification, and abnormal event detection. Using object-specific context information, a cotrained classifier, which takes advantage of the multiview information of objects and reduces the number of labeled training samples required, is learned to classify objects into pedestrians or vehicles with high classification performance. For each kind of object, we learn its corresponding semantic scene-specific context information: motion pattern, width distribution, paths, and entry/exit points. Based on this information, object detection, tracking, and abnormal event detection can be improved efficiently. Experimental results demonstrate the effectiveness of our semantic context features for multiple real-world traffic scenes.
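The following sketch shows the general shape of a co-training loop on two synthetic feature views (standing in for, say, appearance and motion cues): each view's classifier pseudo-labels the unlabeled samples it is most confident about, and those labels augment the training pool for the next round. The data, view construction, confidence threshold, and use of logistic regression are all illustrative assumptions rather than the classifier of the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 400
truth = rng.integers(0, 2, n)                              # 0 = pedestrian, 1 = vehicle
view_a = rng.normal(truth[:, None] * 1.5, 1.0, (n, 5))     # appearance-like view
view_b = rng.normal(truth[:, None] * 1.5, 1.0, (n, 4))     # motion-like view

labeled = np.arange(40)                                    # small labeled pool
unlabeled = np.arange(40, n)
y = np.full(n, -1)
y[labeled] = truth[labeled]

for _ in range(5):                                         # co-training rounds
    clf_a = LogisticRegression().fit(view_a[labeled], y[labeled])
    clf_b = LogisticRegression().fit(view_b[labeled], y[labeled])
    if unlabeled.size == 0:
        break
    new = []
    for clf, view in ((clf_a, view_a), (clf_b, view_b)):
        proba = clf.predict_proba(view[unlabeled])
        confident = unlabeled[proba.max(axis=1) > 0.95][:20]   # up to 20 confident samples
        y[confident] = clf.predict(view[confident])            # pseudo-label them
        new.append(confident)
    added = np.unique(np.concatenate(new))
    labeled = np.union1d(labeled, added)
    unlabeled = np.setdiff1d(unlabeled, added)

final = LogisticRegression().fit(view_a[labeled], y[labeled])
print("agreement with ground truth:", round((final.predict(view_a) == truth).mean(), 3))

The appeal of this scheme, and the reason the abstract highlights it, is that the two views correct each other: confident predictions in one feature space become cheap labels for the other, reducing the manual labeling burden.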
Article
Risk aversion typically erodes the value of an investment opportunity, often increasing the incentive to delay investment. Although this may be true when the decision maker has discretion only over the timing of investment, any additional discretion over the capacity of a project may lead to different results. In this paper, we extend the traditional real options approach by allowing for discretion over capacity while incorporating risk aversion and operational flexibility in the form of suspension and resumption options. In contrast to a project without scalable capacity, we find that increased risk aversion may actually facilitate investment because it decreases the optimal capacity of the project. Finally, we illustrate how the relative loss in the value of the investment opportunity due to an incorrect capacity choice may become less pronounced with increasing risk aversion and uncertainty.
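As a purely numerical illustration of the capacity effect (not the real-options model of the paper), the toy calculation below lets a decision maker with CARA utility choose project capacity under an uncertain output price; raising the risk-aversion coefficient shrinks the certainty-equivalent-maximizing capacity. All functional forms and parameter values are assumptions made for this sketch.

import numpy as np

rng = np.random.default_rng(1)
prices = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)   # uncertain output price
c = 0.4                                                     # convex capacity cost coefficient

def certainty_equivalent(K, gamma):
    # Profit is K * P - c * K^2; CARA utility u(w) = -exp(-gamma * w).
    profit = K * prices - c * K ** 2
    if gamma == 0:                                          # risk-neutral benchmark
        return profit.mean()
    u = -np.exp(-gamma * profit)
    return -np.log(-u.mean()) / gamma                       # invert the utility

capacities = np.linspace(0.1, 3.0, 60)
for gamma in (0.0, 0.5, 1.0, 2.0):
    ce = [certainty_equivalent(K, gamma) for K in capacities]
    print(f"gamma={gamma}: optimal capacity ~ {capacities[int(np.argmax(ce))]:.2f}")

Because the variance of profit grows with the square of capacity, a more risk-averse decision maker trades expected profit for a smaller, less volatile project; this is the mechanism behind the abstract's observation that risk aversion can lower the optimal capacity and thereby change the investment incentive.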
Article
Business intelligence (BI) is the process of transforming raw data into useful information for more effective strategic and operational insight and decision making, so that it yields real business benefits. This emerging technique can not only improve applications in enterprise systems and in industrial informatics but also play an important role in bridging the two. This paper is intended as a short introduction to BI, with emphasis on fundamental algorithms and recent progress. In addition, we point out the challenges and opportunities in smoothly connecting industrial informatics to enterprise systems for BI research.
Article
The need to store, process, and analyze large amounts of data is finally driving enterprise customers to adopt cloud computing at scale. Understanding the economic drivers behind enterprise customers is key to designing next-generation cloud services.