Figure 1 - uploaded by Bogdan Grechuk
Sample autocorrelation function (ACF) for the FTSE 100 index for the period 1-April-2015 to 1-April-2016.

Source publication
Article
Full-text available
In a typical one-period decision making model under uncertainty, unknown consequences are modeled as random variables. However, accurately estimating probability distributions of the involved random variables from historical data is rarely possible. As a result, decisions made may be suboptimal or even unacceptable in the future. Also, an agent may...

Contexts in source publication

Context 1
... for example, the FTSE 100 index is such an asset. Figure 1 depicts the sample autocorrelation function (ACF) of daily prices of the index from 1-April-2015 to 1-April-2016 with lags up to 80, taken from [1]. For a lag longer than 80 days, the autocorrelation is negligible, so that T = 80, and weights q_1, . . . ...
Context 2
... a lag longer than 80 days, the autocorrelation is negligible, so that T = 80, and weights q_1, . . . , q_80 are then proportional to the ACF in Figure 1 and satisfy ∑_{t=1}^{T} q_t = 1. Behavioral evidence supporting the notion of time profile includes, but is not limited to, the following: (a) The effect of fading memory and emotions [18, Part 6]: ...
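The weighting scheme described above can be sketched as follows. This is a minimal illustration with a synthetic persistent series, not the paper's actual FTSE 100 data; the function name and the clipping of negative correlations are our assumptions.

```python
import numpy as np

def acf_weights(prices, T=80):
    """Sample ACF of a series up to lag T, normalized into weights
    q_1, ..., q_T with sum(q) == 1. Negative correlations are clipped
    to zero so negligible lags receive no weight (an assumption here,
    not necessarily the paper's convention)."""
    x = np.asarray(prices, dtype=float)
    x = x - x.mean()
    var = np.dot(x, x) / len(x)
    # sample autocorrelation at lags 1..T
    acf = np.array([np.dot(x[:-t], x[t:]) / len(x) / var
                    for t in range(1, T + 1)])
    acf = np.clip(acf, 0.0, None)
    return acf / acf.sum()   # normalize so that sum_t q_t = 1

# toy usage: a synthetic random-walk "price" series with strong persistence
rng = np.random.default_rng(0)
prices = 100.0 + np.cumsum(rng.normal(size=500))
q = acf_weights(prices, T=80)
print(len(q), round(q.sum(), 10))  # 80 1.0
```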
Context 3
... particular, (3) are weights in exponential smoothing [9]. (b) The ACF decreases with time and almost vanishes after 80 days (see Figure 1). (c) In mean-variance portfolio selection, the optimal portfolios with time profiles based on the geometric progression (3) with various q outperform the optimal portfolio in which asset rates of return are modeled by an ARIMA time-series model, to be discussed in Example 9 (Figure 5). ...
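The exponential-smoothing time profile mentioned in (a) can be sketched as a normalized geometric progression; the function below is our illustration (we assume 0 < q < 1, and the name is not the paper's notation).

```python
def geometric_weights(q, T=80):
    """Time-profile weights proportional to the geometric progression
    q, q^2, ..., q^T, normalized to sum to one -- the exponential
    smoothing profile, where weight fades with lag."""
    w = [q ** t for t in range(1, T + 1)]
    s = sum(w)
    return [wt / s for wt in w]

w = geometric_weights(0.9, T=80)
print(round(sum(w), 10))    # 1.0
print(w[0] > w[1] > w[-1])  # True: recent observations weigh most
```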
Context 4
... one may obtain a monotone preference relation if the standard deviation in (26) is replaced by a general deviation measure [44,45,46]. Weights are chosen according to the ACF of the FTSE 100 index in Figure 1. We select the n = 70 most actively traded assets from the FTSE 100 index and use the assets' daily rates of return from 1-April-2015 to 1-April-2016. ...
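Evaluating a portfolio under such a time profile amounts to treating the T historical return scenarios as having probabilities q_1, ..., q_T rather than being equally likely. A rough sketch with synthetic data (the function names, shapes, and the random returns are our assumptions, not the paper's setup):

```python
import numpy as np

def portfolio_stats(R, q, x):
    """Mean and standard deviation of portfolio x when the T historical
    scenarios in R (shape T x n) are weighted by a time profile q with
    sum(q) == 1, instead of uniform weights 1/T."""
    r = R @ x                       # portfolio return in each scenario
    m = q @ r                       # q-weighted mean
    sd = np.sqrt(q @ (r - m) ** 2)  # q-weighted standard deviation
    return m, sd

# toy usage: 80 scenarios, 3 assets, uniform profile as a baseline
rng = np.random.default_rng(1)
T, n = 80, 3
R = rng.normal(0.0005, 0.01, size=(T, n))
q = np.full(T, 1.0 / T)
m, sd = portfolio_stats(R, q, np.array([0.5, 0.3, 0.2]))
print(sd >= 0.0)  # True
```

Replacing the standard deviation line with a general deviation measure of `r` gives the monotone variants discussed in the text.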

Similar publications

Preprint
Full-text available
This article introduces a new method for eliciting prior distributions from experts. The method models an expert decision-making process to infer a prior probability distribution for a rare event $A$. More specifically, assuming there exists a decision-making process closely related to $A$ which forms a decision $Y$, where a history of decisions ha...

Citations

... To include data quality into the decision-making process, results of the data quality assessment were used to define probabilistic distributions of uncertainty. The work of Grechuk and Zabarankin (2017) inspired the approach adopted. The main uncertain parameter considered is the supply volume for raw materials. ...
Article
Planning decisions are generally subject to some level of uncertainty. In forestry, data describing the available resources have a major impact on operations performance and productivity. This paper presents a method to improve decision-making in the forest supply chain by taking supply uncertainty into account using the results of data quality assessments. The case study describes the operations planning process of a Canadian forest products company dealing with an uncertain volume of wood supply. Three approaches to constructing probability distributions based on data quality are tested, each offering a different level of precision: (1) a frequency distribution of accuracy, (2) a normal distribution based on average accuracy, and (3) a normal distribution based on data quality classification. Using stochastic programming to plan transport and production shows that lower costs can be achieved with a general characterisation of data accuracy. Not considering uncertainty when planning operations leads to significant transportation replanning costs. Using classes of data quality to include uncertainty in operations planning reduces the transportation cost from $15.90/m³ to $15.32/m³, a 3.6% reduction.
... Based on the existing decision data, we use scientific data processing technology to objectively analyze and evaluate them, transforming them into effective decision indicators and knowledge that can provide reliable and reasonable suggestions and decision support for decision-makers. This data-driven decision-making has become a new trend in modern decision-making [5][6][7]. It is of significance to discuss three-way decisions under a hybrid multi-attribute environment, especially when attributes are represented by intuitionistic fuzzy numbers or IVIFNs carrying more fuzzy information. ...
Article
Full-text available
In the era of internet connection and IOT, data-driven decision-making has become a new trend of decision-making and shows the characteristics of multi-granularity. Because three-way decision-making considers the uncertainty of decision-making for complex problems and the cost sensitivity of classification, it is becoming an important branch of modern decision-making. In practice, decision-making problems usually have the characteristics of hybrid multi-attributes, which can be expressed in the forms of real numbers, interval numbers, fuzzy numbers, intuitionistic fuzzy numbers and interval-valued intuitionistic fuzzy numbers (IVIFNs). Since other forms can be regarded as special forms of IVIFNs, transforming all forms into IVIFNs can minimize information distortion and effectively set expert weights and attribute weights. We propose a hybrid multi-attribute three-way group decision-making method and give detailed steps. Firstly, we transform all attribute values of each expert into IVIFNs. Secondly, we determine expert weights based on interval-valued intuitionistic fuzzy entropy and cross-entropy and use interval-valued intuitionistic fuzzy weighted average operator to obtain a group comprehensive evaluation matrix. Thirdly, we determine the weights of each attribute based on interval-valued intuitionistic fuzzy entropy and use the VIKOR method improved by grey correlation analysis to determine the conditional probability. Fourthly, based on the risk loss matrix expressed by IVIFNs, we use the optimization method to determine the decision threshold and give the classification rules of the three-way decisions. Finally, an example verifies the feasibility of the hybrid multi-attribute three-way group decision-making method, which provides a systematic and standard solution for this kind of decision-making problem.
... Authors in (Aien et al., 2014, 2016; Grechuk & Zabarankin, 2018; Jordehi, 2018; Soroudi & Amraee, 2013; Zhao et al., 2015) carried out reviews of different aspects of uncertainty modeling of DGs. The applied methods can be summarized as in Figure 4. ...
Article
Full-text available
Numerous potential benefits to the adequacy and effectiveness of the supplied electricity can be achieved by installing distributed generation units. To take full advantage of these benefits, it is essential to place Distributed Generation (DG) units in appropriate locations; otherwise, their installation may adversely affect energy quality and system operation. Several optimization techniques have been developed over the years to optimize distributed generation integration, and such techniques continue to evolve and have been the focus of much recent research. This article evaluates state-of-the-art techniques for optimizing the positioning and sizing of distributed generation units from renewable energy sources, based on recent papers applied to distribution system optimization. Furthermore, the article points out the environmental, economic, technological and regulatory drivers behind the rapidly growing interest in renewable-based DG systems. Popular meta-heuristic optimization tools are summarized in table form, with their merits and demerits, to open fresh prospective paths toward multi-method approaches that have not yet been studied.
... In the context of asset management, many papers assume a finite (but possibly very large) number of scenarios for the future excess return R (which may, for example, correspond to historical time series of returns of the corresponding portfolio at specified times in the past; see Grechuk and Zabarankin (2018)), and this is the case we study in this paper. Although the question of existence of optimal solutions has been settled, the problem of uniqueness for a finite number of scenarios has not been analyzed carefully enough. ...
Preprint
Full-text available
We consider the issue of solution uniqueness for the portfolio optimization problem and its inverse, for asset returns with a finite number of possible scenarios. Risk is assessed by the deviation measures introduced by Rockafellar et al. (2006) instead of variance as in the Markowitz optimization problem. We prove that, in general, one can expect uniqueness neither in the forward nor in the inverse problem. We discuss the consequences of that non-uniqueness for several problems in risk analysis and portfolio optimization, including capital allocation, risk sharing, cooperative investment, and the Black-Litterman methodology. In all cases, the issue of non-uniqueness is closely related to the fact that the subgradient of a convex function is non-unique at points of non-differentiability. We suggest a methodology to resolve this issue by identifying a unique "special" subgradient satisfying some natural axioms. This "special" subgradient happens to be the Steiner point of the subdifferential set.
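The non-uniqueness claimed above is easy to see in a degenerate toy case (our illustration, not an example from the paper): if two assets have identical scenario returns, every convex combination of them yields the same mean and the same deviation, so the optimizer cannot be unique.

```python
import numpy as np

# 3 scenarios, 2 assets with identical scenario returns
R = np.array([[ 0.02,  0.02],
              [-0.01, -0.01],
              [ 0.03,  0.03]])

def mean_std(x):
    """Equally weighted scenario mean and standard deviation of portfolio x."""
    r = R @ x
    return r.mean(), r.std()

for a in (0.0, 0.25, 0.5, 1.0):        # different portfolios ...
    m, s = mean_std(np.array([a, 1 - a]))
    print(round(m, 6), round(s, 6))    # ... identical (mean, std) pairs
```

The same degeneracy arises, less visibly, whenever scenario return vectors are linearly dependent, which is common when the number of assets exceeds the number of scenarios.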
... Using past experience and historical data to predict the probabilities of these variables is often impossible. Judgments provided based on past experience and historical data might be inaccurate and unacceptable (Grechuk & Zabarankin, 2018). Decision-making tools aim to improve the general process of decision-making. ...
... Unsuccessful or inappropriate decisions should be reassessed and the process started again. Grechuk and Zabarankin (2018) modelled the general decision-making process under uncertainty in four stages, as shown in Figure 1. They noted that the historical and experimental data available to decision makers and analysts are often insufficient. Statistical treatment of the data under various assumptions, depending on the nature of the problem, may provide a better understanding of the risk and uncertainty associated with it. ...
... Data → Uncertainty modelling → Risk preference modelling → Choice/Decision (Figure 1: a model of a general decision-making process under uncertainty; Grechuk and Zabarankin, 2018). The impact of the choice of a method on actual decisions is well known, as are the consequences of poor decisions (Kornyshova & Salinesi, 2007). Eldarandaly et al. (2009) asserted that applying different MCDM methods to the same decision problem can often generate different outcomes. ...
Article
This paper presents a new methodology to recommend the most suitable Multi-Criteria Decision Making (MCDM) method from a subset of candidate methods when risk and uncertainty are anticipated. A structured approach has been created based on an analysis of the characteristics of MCDM problems and methods. Outcomes of this analysis provide decision makers with a suggested group of candidate methods for their problem. Sensitivity analysis is applied to this group of candidate methods to analyze the robustness of their outputs when risk and uncertainty are anticipated, and the MCDM method that delivers the most robust outcome is selected automatically. Only MCDM methods dealing with discrete sets of alternatives are considered. Numerical examples are presented in which MCDM methods are compared and recommended by calculating the minimum percentage change in criteria weights and performance measures required to alter the ranking of any two alternatives; the recommendation is based on the best compromise in the minimum percentage change of inputs required to alter that ranking. Different cases are considered, and some new propositions are presented based on potential generalized scenarios of MCDM problems.
Article
This work describes the creation of a new method to choose a suitable Multi-Criteria Decision Making (MCDM) method for a Boeing strategic decision. The decision involved ranking four global market regions based on their market attractiveness and competitive strength when risk and uncertainty were anticipated. Following an analysis of MCDM problems and methods, a new organized approach was created to provide a decision maker with a sub-group of suitable MCDM methods. Sensitivity analysis was used to investigate the robustness of the outputs from the various candidate methods, and an MCDM method was recommended automatically: the candidate that provided the most robust output (solution to the problem). Only methods dealing with a discrete set of choices were considered. In the Boeing strategic decision presented in this paper, two MCDM methods were compared, and a recommendation was made after calculating the minimum percentage change in performance measures and criteria weights required to change the ranking of any two alternatives. The MCDM method was recommended based on a compromise over the minimum percentage changes in inputs required to change the ranking of alternatives. Some propositions are discussed based on general scenarios concerning MCDM problems.
Article
We advance a mathematical representation of Command and Control as distributed decision makers using the Kuramoto model of networked phase oscillators. The phase represents a continuous Perception-Action cycle of agents at each network node; the network the formal and informal communications of human agents and information artefacts; coupling the strength of relationship between agents; native frequencies the individual decision speeds of agents when isolated; and stochasticity temporal noisiness in intrinsic agent behaviour. Skewed heavy-tailed noise captures that agents may randomly ‘jump’ forward (rather than backwards) in their decision state under time stress; there is considerable evidence from organisational science that experienced decision-makers behave in this way in critical situations. We present a use-case for the model using data for military headquarters staff tasked to drive a twenty-four hour ‘battle-rhythm’. This serves to illustrate how such a mathematical model may be used realistically. We draw on a previous case-study where headquarters’ networks were mapped for routine business and crisis scenarios to provide advice to a military sponsor. We tune the model using the first data set to match observations that staff performed synchronously under such conditions. Testing the impact of the crisis scenario using the corresponding network and heavy-tailed stochasticity, we find increased probability of decision incoherence due to the high information demand of some agents in this case. This demonstrates the utility of the model to identify risks in headquarters design, and potential means of identifying points to change. We compare to qualitative organisational theories to initially validate the model.