Article

Foundations of Statistics


Abstract

Diagrams and Tables. Measures of Location. Measures of Dispersion and Skewness. Basic Ideas of Probability. Random Variables and Their Probability Distribution. Some Standard Discrete and Continuous Probability Distributions. Approximations to the Binomial and Poisson Distributions. Linear Functions of Random Variables and Joint Distributions. Samples, Populations and Point Estimation. Interval Estimation. Hypothesis Tests for the Mean and Variance of Normal Distributions. Hypothesis Tests for the Binomial Parameter p. Hypothesis Tests for Independence and Goodness-of-Fit. Non-Parametric Hypothesis Tests. Correlation. Regression. Elements of Experimental Design and Analysis. Quality Control Charts and Acceptance Sampling.


... Assessing HEPs is difficult, and is related to two concepts: human error and probability. The theory of probability is discussed, for instance, by Savage (1972). Human error is widely discussed in psychology literature, e.g. by Reason (1990), Woods et al. (1988) and Hollnagel & Marsden (1996). ...
Article
Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks.
... The connection between dynamic consistency and Bayesian updating has been studied in several decision theoretic models. Ghirardato (2002) studies a model in which the decision maker, for any possible event, holds a conditional preference relation over Savage-acts (Savage (1954)), and imposes axioms which guarantee that these conditional preferences can be represented by subjective expected utility functions. The paper shows that the conditional preferences are dynamically consistent if and only if the decision maker uses the same utility function for every observable event, and the induced belief revision function satisfies Bayesian updating. ...
Article
Full-text available
In the literature there are at least two models for probabilistic belief revision: Bayesian updating and imaging [D. K. Lewis, Counterfactuals. Cambridge/MA: Harvard Univ. Press (1973; Zbl 0989.03003); P. Gärdenfors, Knowledge in flux: Modeling the dynamics of epistemic states. Cambridge, MA: MIT Press (1988)]. In this paper we focus on imaging rules that can be described by the following procedure: (1) Identify every state with some real-valued vector of characteristics, and accordingly identify every probabilistic belief with an expected vector of characteristics; (2) For every initial belief and every piece of information, choose the revised belief which is compatible with this information and for which the expected vector of characteristics has minimal Euclidean distance to the expected vector of characteristics of the initial belief. This class of rules thus satisfies an intuitive notion of minimal belief revision. The main result in this paper is to provide an axiomatic characterization of this class of imaging rules.
... The principle that humans were constantly violating is the Sure Thing Principle. It is a concept widely used in game theory and was originally introduced by [64]. This principle is fundamental in Bayesian probability theory and states that if one prefers action A over B under state of the world X, and also prefers A over B under the complementary state of the world (not X), then one should always prefer action A over B even when the state of the world is unspecified. ...
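For reference, the Sure Thing Principle can be stated compactly. The display below is a standard textbook formulation in generic notation (acts f and g, event E, conditional preference relations), not a quotation from the cited works.

```latex
% Sure Thing Principle (standard formulation; notation is illustrative).
% f, g are acts, E an event, E^c its complement, \succsim_E preference conditional on E.
\[
\bigl(f \succsim_{E} g\bigr) \;\wedge\; \bigl(f \succsim_{E^{c}} g\bigr)
\;\Longrightarrow\; f \succsim g .
\]
```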
Article
Full-text available
Probabilistic graphical models such as Bayesian Networks are one of the most powerful structures known by the Computer Science community for deriving probabilistic inferences. However, modern cognitive psychology has revealed that human decisions do not follow the rules of classical probability theory, because humans cannot process large amounts of data in order to make judgements. Consequently, the inferences performed are based on limited data coupled with several heuristics, leading to violations of the law of total probability. This means that probabilistic graphical models based on classical probability theory are too limited to fully simulate and explain various aspects of human decision making. Quantum probability theory was developed in order to accommodate the paradoxical findings that the classical theory could not explain. Recent findings in cognitive psychology revealed that quantum probability can fully describe human decisions in an elegant framework. Their findings suggest that, before taking a decision, human thoughts are seen as superposed waves that can interfere with each other, influencing the final decision. In this work, we propose a new Bayesian Network based on the psychological findings of cognitive scientists. We ran experiments with two well-known Bayesian Networks from the literature. The results obtained revealed that the quantum-like Bayesian Network can drastically affect the probabilistic inferences, especially when the levels of uncertainty of the network are very high (no pieces of evidence observed). When the levels of uncertainty are very low, then the proposed quantum-like network collapses to its classical counterpart.
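The violation of the law of total probability mentioned above is usually modelled in quantum-like networks by adding an interference term to the classical mixture. The form shown below is the one commonly quoted in the quantum-cognition literature and is given here only as an illustration; the interference angles θ are free parameters, and this is not necessarily the exact expression used in the paper.

```latex
% Classical law of total probability versus a quantum-like version with interference.
\[
\Pr(B=b) \;=\; \sum_{i} \Pr(A=a_i)\,\Pr(B=b \mid A=a_i)
\]
\[
\Pr_{q}(B=b) \;=\; \sum_{i} \Pr(A=a_i)\,\Pr(B=b \mid A=a_i)
\;+\; 2\sum_{i<j}\sqrt{\Pr(A=a_i)\Pr(B=b\mid a_i)\,\Pr(A=a_j)\Pr(B=b\mid a_j)}\,\cos\theta_{ij}
\]
% When all interference terms vanish (cos(theta_ij) = 0), the classical law is recovered.
```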
... Many social decisions have to be made in uncertain environments where individuals have both different tastes over the possible outcomes and different beliefs over the possible states of the world. As Hylland and Zeckhauser (1979) and Mongin (1995, 1998) have shown, for Subjective Expected Utility (SEU) preferences (Savage, 1954; Anscombe and Aumann, 1963), simultaneous ...
Article
We provide possibility results on the aggregation of beliefs and tastes for Monotone, Bernoullian and Archimedean preferences of Cerreia-Vioglio, Ghirardato, Maccheroni, Marinacci, and Siniscalchi (2011). We propose a new axiom, Unambiguous Pareto Dominance, which requires that if the unambiguous part of individuals' preferences over a pair of acts agree, then society should follow them. We characterize the resulting social preferences and show that it is enough that individuals share a prior to allow non-dictatorial aggregation. A further weakening of this axiom on common-taste acts, where cardinal preferences are identical, is also characterized. It gives rise to a set of relevant priors at the social level that can be any subset of the convex hull of the individuals' sets of relevant priors. We then apply these general results to the Maxmin Expected Utility model, the Choquet Expected Utility model and the Smooth Ambiguity model. We end with a characterization of the aggregation of ambiguity attitudes.
... The integration of preferences in medicine is discussed in different works [1,2,3,4], and preference was defined as the desirability of a health-related outcome, process, or treatment choice [4]. Two approaches have been developed in the literature: 1) a quantitative approach [5,6], where preferences are expressed by means of a utility function and the option with the maximal utility is considered the best one; 2) a qualitative approach [7,8,9], where relative preferences are expressed by ordering the options. However, quantitative approaches are more complex because they require preferences to be expressed as numerical values, which is often difficult in medicine. ...
Article
Full-text available
Medical decision making, such as choosing which drugs to prescribe, requires considering mandatory constraints, e.g. absolute contraindications, but also preferences that may not be satisfiable, e.g. guideline recommendations or patient preferences. The major problem is that these preferences are complex, numerous and come from various sources. The considered criteria are often conflicting and the number of decisions is too large to be explicitly handled. In this paper, we propose a framework for encoding medical preferences using a new connective, called ordered disjunction, symbolized by ~×. Intuitively, the preference "Diuretic ~× Betablocker" means: "Prescribe a Diuretic if possible, but if this is not possible, then prescribe a Betablocker". We give an inference method for reasoning about the preferences and we show how this framework can be applied to a part of a guideline for hypertension.
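As an illustration of how such an ordered-disjunction preference might be evaluated, the following minimal Python sketch simply returns the first option in the ranking that is not ruled out by a mandatory constraint. The drug names, the `contraindicated` set and the helper function are hypothetical examples, not part of the guideline or inference method described in the paper.

```python
# Minimal sketch of evaluating an ordered-disjunction preference such as
# "Diuretic ~x Betablocker": prescribe the first option in the ranking that is
# not ruled out by a mandatory constraint. Names and constraints are illustrative.

def best_satisfiable(ordered_options, contraindicated):
    """Return the most preferred option that is still admissible, else None."""
    for option in ordered_options:          # options listed from most to least preferred
        if option not in contraindicated:   # mandatory constraints filter options out
            return option
    return None

preference = ["Diuretic", "Betablocker"]     # encodes Diuretic ~x Betablocker
print(best_satisfiable(preference, contraindicated={"Diuretic"}))  # -> Betablocker
print(best_satisfiable(preference, contraindicated=set()))          # -> Diuretic
```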
... To appreciate how this decision approach helps make sense of choice under risk and uncertainty, we begin by analysing two well-demonstrated violations of Savage's [20] ... If Allais' [1] two pairs of choice problems are represented, for convenience, using only two dimensions (the best and the worst possible outcome dimensions), the equate-to-differentiate account can most easily be understood with the aid of a graphical representation, as shown in Figs. 1 and 2. We can see that the construction of the gambles will render the equating of difference on the "best possible outcome" dimension easier than that on the "worst possible outcome" dimension for the first pair of choices, but vice versa for the second, assuming a negatively accelerated (concave) utility function. In other words, regardless of the fact that the gambles in the second pair are just the gambles in the first pair, each minus a constant common to the gambles being considered, the gamble parameters are designed to encourage participants to differentiate the subjectively greater difference (the difference between the bad-outcome ($0) of Alternative B and the certain outcome ($1m) of Alternative A) on the worst possible outcome dimension in the first pair of choices, but to differentiate the subjectively greater difference (the difference between the good-outcome ($5m) of Alternative D and the good-outcome ($1m) of Alternative C) on the best possible outcome dimension in the second pair. ...
Article
Full-text available
A generalised weak dominance process was used to derive a model for pairwise choice where each alternative involves consequences measurable on only two conflicting dimensions. This approach models much human choice behaviour as a process in which people seek to equate a less significant difference between alternatives on one dimension, thus leaving the greater one-dimensional difference to be differentiated as the determinant of the final choice. These aspects of the equate-to-differentiate model were tested in three judgment problems, where features shared by the offered alternatives were experimentally manipulated. The cancellation-and-focus model and the transparency hypothesis concerning violations of the cancellation axiom were compared with the equate-to-differentiate hypothesis. The findings for various judging tasks favour the equate-to-differentiate explanation: whether shared and unique features are subfeatures of a single main feature is the factor most likely responsible for a resulting cancellation.
... Although exploitation means forgoing the benefits of exploration that can be enjoyed in solitary situations, those who seize the first-mover advantage do better than those who do not in many (but, of course, not all) competitive situations. It is a proverbial truth that you should "look before you leap" (see also Savage, 1954/1972, p. 16). In our competitive environment, it emerged that a quick peek before leaping was very helpful—but that more extensive looking permitted the competitor to leap first and gain an edge. ...
... The optimal model can be successfully applied only when a decision maker possesses perfect knowledge of all aspects of a situation. Following Savage (1954) and Binmore (2009), perfect knowledge of an environment is possible if one resides in a so-called small world. Examples of a small world are a controlled laboratory experiment, a lottery, and certain games. ...
Conference Paper
Full-text available
The critical step facing every decision maker is when to stop collecting evidence and proceed with the decision act. This is known as the stopping rule. Over the years, several unconnected explanations have been proposed that suggest nonoptimal approaches can account for some of the observable violations of the optimal stopping rule. The current research proposes a unifying explanation for these violations based on a new stopping rule selection (SRS) theory. The main innovation here is the assumption that a decision maker draws from a large set of different kinds of stopping rules and is not limited to using a single one. The SRS theory hypothesizes that there is a storage area for stopping rules—the so-called decision operative space (DOS)—and a retrieval mechanism that is used to select stopping rules from the DOS. The SRS theory has shown good fit to challenging data published in the relevant literature. One of the most important steps of decision making is determining when to stop collecting evidence and proceed with the final decision. This is defined as the stopping rule and it is thought to be an irreplaceable component of almost all cognitive models of decision making. Take, for example, a patient who is facing a risky medical treatment. The treatment can have a good outcome—that is, the patient will benefit from it—or it can have a bad outcome—that is, the patient will suffer serious side effects. To the patient's surprise, doctors don't have a unanimous opinion on whether the treatment is beneficial or harmful. Thus, the patient decides to ask for several doctors' opinions. The patient collects either positive opinions (+1) in favor of the risky treatment or negative opinions (-1) against the risky treatment. The total sum of evidence is defined as the critical difference, d. But how many opinions should he collect to reduce the risk of making the wrong decision? To help the patient with the decision, his best friend, a statistician, tells him that the number of opinions can be calculated based on the optimal solution.
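To make the patient example concrete, one member of the family of rules discussed here is a simple critical-difference threshold rule: keep collecting ±1 opinions until their running sum reaches +k or -k. The Python sketch below simulates that rule and checks it against the standard gambler's-ruin accuracy formula for independent opinions that are correct with probability p; it is only an illustration of this class of rules, not the SRS theory or the paper's data.

```python
import random

def run_threshold_rule(p_correct, k, rng):
    """Collect +1/-1 opinions until the running sum d hits +k or -k.

    Returns (decision_correct, number_of_opinions). Opinions independently
    point toward the correct option with probability p_correct.
    """
    d, n = 0, 0
    while abs(d) < k:
        d += 1 if rng.random() < p_correct else -1
        n += 1
    return d > 0, n

def accuracy(p_correct, k):
    """Analytic accuracy of the rule: P(hit +k before -k), a gambler's-ruin result."""
    r = (1 - p_correct) / p_correct
    return 1.0 / (1.0 + r ** k)

p, k = 0.6, 5  # illustrative opinion reliability and threshold
sims = [run_threshold_rule(p, k, random.Random(i)) for i in range(10_000)]
print("simulated accuracy:", sum(c for c, _ in sims) / len(sims))
print("analytic accuracy :", accuracy(p, k))
print("mean sample size  :", sum(n for _, n in sims) / len(sims))
```

Raising the threshold k buys accuracy at the cost of collecting more opinions, which is exactly the trade-off a stopping rule has to resolve.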
... That is, formally identical life-death questions would yield the same choice preference. According to the independence axiom of expected utility theory (Savage, 1954), if one prefers the sure option, with utility U(1/3 × 6), to the gamble, with expected utility 1/3·U(6) + 2/3·U(0), then the person should also prefer the sure option U(1/3 × 600) to the gamble 1/3·U(600) + 2/3·U(0). Note that the second pair of options can be reduced to the first pair by multiplying the outcomes by a common ratio of 1/100. ...
... The standard approach to decision making, going back to Savage [21], suggests that an agent should maximize expected utility. But computing the relevant probabilities might be difficult, as might computing the relevant utilities. ...
Article
There have been two major lines of research aimed at capturing resource-bounded players in game theory. The first, initiated by Rubinstein, charges an agent for doing costly computation; the second, initiated by Neyman, does not charge for computation, but limits the computation that agents can do, typically by modeling agents as finite automata. We review recent work on applying both approaches in the context of decision theory. For the first approach, we take the objects of choice in a decision problem to be Turing machines, and charge players for the "complexity" of the Turing machine chosen (e.g., its running time). This approach can be used to explain well-known phenomena like first-impression-matters biases (i.e., people tend to put more weight on evidence they hear early on) and belief polarization (two people with different prior beliefs, hearing the same evidence, can end up with diametrically opposed conclusions) as the outcomes of quite rational decisions. For the second approach, we model people as finite automata, and provide a simple algorithm that, on a problem that captures a number of settings of interest, provably performs optimally as the number of states in the automaton increases.
... If an appropriate utility is assigned to each possible consequence and the expected utility of each alternative is calculated, then the best action is to choose the alternative with the highest expected utility (which can be the smallest expected value). Different axioms that imply the existence of utilities with the property that expected utility is an appropriate guide to consistent decision making are presented in [14] [12] [10] [11] [6]. The choice of an adequate utility function for a specific type of problem can be made using direct methods presented in Keeney and Raiffa's book. ...
Conference Paper
Full-text available
Suppose we want to obtain the optimal route in a directed random network, where the parameters associated with the arcs are real random variables following discrete distributions. The criterion we have chosen to decide which route is optimal is the one that minimizes the expected value of a utility function over the considered network. The topology of the network is not known in advance, which means that we only have knowledge of the incoming (outgoing) arcs, and their parameters, of some specific node once we reach it. Thus the optimal path is determined in a dynamic way. If the parameters associated with the arcs were deterministic, the solution to this problem would be very easy to obtain. Since they are real random variables, in order to obtain the optimal dynamic route from a source to a sink node we present an algorithm that uses a straight generalization of Bellman's equation, used in the known algorithms to determine the shortest path in directed deterministic networks, plus some techniques that improve the running times without jeopardizing memory. We also present computational results for networks with 100 up to 10000 nodes and densities 2, 5 and 10.
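For intuition, the Bellman-style recursion being generalized is easy to write down in a simplified special case: a linear (risk-neutral) utility, a known DAG topology, and independent discrete arc costs. The sketch below covers only that special case, on a made-up network, and is not the authors' dynamic algorithm.

```python
# Risk-neutral special case of the route problem: minimize expected cost on a known
# DAG whose arc costs are discrete random variables. V(sink) = 0 and
# V(u) = min over arcs (u, v) of E[cost(u, v)] + V(v). Network data are made up.

def expected(dist):
    """Expected value of a discrete distribution given as [(value, prob), ...]."""
    return sum(v * p for v, p in dist)

def min_expected_cost(arcs, topo_order, sink):
    """arcs: {u: [(v, dist), ...]}; topo_order: nodes in topological order."""
    V = {sink: 0.0}
    for u in reversed(topo_order):
        if u == sink:
            continue
        V[u] = min(expected(dist) + V[v] for v, dist in arcs[u])
    return V

arcs = {
    "s": [("a", [(1, 0.5), (3, 0.5)]), ("b", [(2, 1.0)])],
    "a": [("t", [(4, 0.2), (1, 0.8)])],
    "b": [("t", [(2, 0.5), (3, 0.5)])],
}
print(min_expected_cost(arcs, ["s", "a", "b", "t"], "t")["s"])  # -> 3.6
```

With a nonlinear utility over the accumulated cost, the value function can no longer be indexed by the node alone, which is where the paper's generalization and the dynamic (online) setting come in.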
... The optimal investment problem of choosing the best way to allocate an investor's capital is often formulated as the problem of maximizing, over admissible investment strategies, the expected utility of terminal wealth. The formulation relies on the axiomatic foundation developed by von Neumann and Morgenstern [52] and Savage [46]. In continuous time optimal portfolio selection, the study dates back to the seminal contributions of Merton [38] [39]. ...
Article
Full-text available
Motivated by recent axiomatic developments, we study the risk- and ambiguity-averse investment problem where trading takes place over a fixed finite horizon and terminal payoffs are evaluated according to a criterion defined in terms of a quasiconcave utility functional. We extend to the present setting certain existence and duality results established for the so-called variational preferences by Schied (2007). The results are proven by building on existing results for the classical utility maximization problem.
... In their experiments on the Trolley Problem, Greene et al. (2008) showed how task-sensitive individuals are in the moral decision making process and that the task environment also has a hand in directing individuals to use cognitive or emotional cues when making moral judgments. On the other hand, the perspective of risk as an emotion is a relatively new concept, considering that the traditional notion of risk has always been a cognitive exercise of assigning probabilities to states of the world (e.g., Savage, 1954; von Neumann and Morgenstern, 1944). Slovic (1987) played a critical role in developing the idea of risk as feeling in the literature, arguing that social, cultural, economic and political factors inform the way we develop our perceptions of risk. ...
Thesis
Full-text available
The dominant perspective in economics and psychology has been choice as information, where ex post behavioral data, i.e., utilities and satisfaction levels, are used to infer human behavior. But the increasing importance of affect—automatic, perceptive, emotional, and intuitive processes—in judgment and decision making can no longer be overlooked. We propose an alternative perspective where choice of information is used as behavioral data. We conducted an experiment using affect-driven choice problems under ambiguous negative task scenarios—moral dilemmas involving loved ones and risk-as-feeling scenarios. We offer three pieces of evidence to support our case. First, ex ante, most individuals facing risky choices attempted to confirm the threat of the task environment (confirmation bias) by asking for more information, but the same individuals facing moral dilemmas avoided information (confirmation avoidance). Ex post, individuals rated high on moral dilemmas but very low on risky dilemmas. Second, response time correlations showed that individuals made fast, principled decisions, showing that affective processes were in motion in the decision process. Third, ex ante choice of information predicted a wider range of moral, religious, and political variables than did ex post preference scores.
... So-called frequentist probability is sometimes confused with stable propensities of chance setups (e.g., Peirce, 1923/2010; Popper, 1959/2002) but usually means hypothetical long-running frequencies (e.g., Ellis, 1842, 1854; Neyman, 1937; Venn, 1866/2006; von Mises, 1957; see Maher, 2010). Alternatively, "Bayesian" probability can be a degree of rational knowledge or inductive, logical, or evidential support for a proposition or state of affairs (e.g., Bayes, 1763; Carnap, 1950; Jaynes, 2003; Jeffreys, 1939/1998; Keynes, 1921/2008; Laplace, 1774/1986; Shannon, 1948), or it can be a degree of belief in a proposition or state of affairs (e.g., De Finetti, 1931/1989; Jeffrey, 2004; Ramsey, 1926/1990; Savage, 1954; see Galavotti, 2011). We are interested in how the ontologies and epistemologies of each can affect the way management researchers conceptualize organizations and their members, as well as intervene in existing organizational practices and enact different organizational realities (e.g., Hacking, 1983; Latour, 2004; Law, 2004; Shapin & Schaffer, 1985). ...
Article
Full-text available
Special Issue Purpose This special issue is focused on how a Bayesian approach to estimation, inference, and reasoning in organizational research might supplement—and in some cases supplant—traditional frequentist approaches. Bayesian methods are well suited to address the increasingly complex phenomena and problems faced by 21st-century researchers and organizations, where very complex data abound and the validity of knowledge and methods are often seen as contextually driven and constructed. Traditional modeling techniques and a frequentist view of probability and method are challenged by this new reality.
... Essentially, he isolated the relevant properties that hold of conditional probabilities, and of the common prior assumption (which drive the original result), and imposed them as independent conditions on general decision functions in partitional information structures. As such, he was able to isolate and interpret the underlying assumptions of the original result as (i) an assumption of "like-mindedness", which requires agents to take the same action given the same information, and (ii) an assumption that he claimed is analogous to requiring the agents' decision functions to satisfy Savage's Sure-Thing Principle ([9]). This principle is intended to capture the intuition that "if an agent takes the same action in ...
Article
Moses & Nachum ([7]) identify conceptual flaws in Bacharach's generalization ([3]) of Aumann's seminal "agreeing to disagree" result ([1]). Essentially, Bacharach's framework requires agents' decision functions to be defined over events that are informationally meaningless for the agents. In this paper, we argue that the analysis of the agreement theorem should be carried out in information structures that can accommodate counterfactual states. We therefore develop a method for constructing such "counterfactual structures" (starting from partitional structures), and prove a new agreement theorem within such structures. Furthermore, we show that our approach also resolves the conceptual flaws in the sense that, within our framework, decision functions are always only defined over events that are informationally meaningful for the agents.
... The research questions prompted an extensive literature review that converged on three disciplines the authors felt essential to any methodology for enterprise PMS design: disciplines of systems science, operational test and evaluation (OT&E), and multicriteria decision analysis (MCDA). Specifically, the review compelled developers to: Consider a link to systems science as one essential to the design of systems intended to measure performance of Rouse's [1] goal-directed organizations, or enterprises, that necessarily feature the complexity and related concepts central to systems science [2]; Recognize the utility to enterprise measurement of OT&E concepts of critical operational issues (COIs), measures of effectiveness (MOEs), and measures of performance (MOPs), as defined by Sproles [3] [4] [5] and others [6] [7] [8] [9] [10]; and to Recognize the criticality of MCDA-related concerns such as utility [11] [12] [13] and measurement theory [14] to the validity and usefulness of results achieved with an enterprise PMS. Figure 1 depicts those three disciplines, select components, and the relationships among all within a methodology the authors were ultimately led to develop and label as Enterprise AID, or simply AID, for enterprise assessment, improvement, and design. Ensuing subsections more fully explain the importance of each discipline to PMS design. ...
Article
Full-text available
The Enterprise AID (for assessment, improvement, and design) methodology is a systems science-, operational test and evaluation-, and multicriteria decision analysis-based approach to design and deployment of performance measurement systems (PMSs) tailored to specific enterprises pursuing any or all of enterprise assessment, improvement, or design. Its two phases of design and deployment sprang from designers’ inductively generated and now prototyped response to a gap they recognized between performance measurement capabilities required by contemporary enterprises and those offered by contemporary PMSs. This paper illustrates key concepts underlying AID, while a companion document, The Enterprise AID methodology: Application, draws from a prototyping effort to identify value to be gained by stakeholders from PMSs designed and deployed with methodology application.
... In this environment, where calculations were laborious, grand and simple theories of statistics were inevitable. The books by Fisher (1956), Savage (1954) and Wald (1950) illustrate this. Historically, such a philosophical phase is typical of the early stages of a discipline, before technological advances enable relevant experimentation. ...
... 49–50). Chambers and Quiggin (1954, p. 77), citing de Finetti (1974) and Savage (1954), state that "probabilities used in economic decision-making are inherently subjective. Consequently, probabilities can only be inferred from the observed behaviour of decision makers, and are inseparably tied to beliefs and preferences." ...
... In contrast to the optimistic attitude in PO analysis, the minimax regret analysis first proposed by Savage [16,17] is a conservative approach. The term "regret" means the discrepancy between the actual payoff of an alternative and the best payoff that could have been obtained with a different choice. ...
Article
The traditional approach for multiple attribute decision analysis with incomplete information on alternative values and attribute weights is to identify alternatives that are potentially optimal. However, the results of potential optimality analysis may be misleading as an alternative is evaluated under the best-case scenario of attribute weights only. Robust optimality analysis is a conservative approach that is concerned with an assured level of payoff for an alternative across all possible scenarios of weights. In this study, we introduce two measures of robust optimality that extend the robust optimality analysis approach and classify alternatives in consideration into three groups: strong robust optimal, weak robust optimal and robust non-optimal. Mathematical models are developed to compute these measures. It is claimed that robust optimality analysis and potential optimality analysis together provide a comprehensive picture of an alternative's variable payoff.
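The regret computation behind both Savage's criterion and this robust view is mechanical once a payoff table over weight scenarios is available. The sketch below uses invented numbers and a plain minimax-regret choice; it is only a caricature of the setting, not the robust-optimality measures or models proposed in the paper.

```python
# Minimax-regret choice over a payoff table: rows are alternatives, columns are
# scenarios (e.g. admissible attribute-weight vectors). All numbers are made up.

payoff = {
    "A1": [7, 4, 6],
    "A2": [5, 5, 5],
    "A3": [9, 2, 4],
}

scenarios = range(3)
best_per_scenario = [max(payoff[a][s] for a in payoff) for s in scenarios]

# Regret of alternative a in scenario s = best achievable payoff in s minus payoff of a in s.
regret = {a: [best_per_scenario[s] - payoff[a][s] for s in scenarios] for a in payoff}
max_regret = {a: max(r) for a, r in regret.items()}

minimax_choice = min(max_regret, key=max_regret.get)
print("max regret per alternative:", max_regret)   # {'A1': 2, 'A2': 4, 'A3': 3}
print("minimax-regret alternative:", minimax_choice)  # A1
```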
Article
Full-text available
We review the progress of naturalistic decision making (NDM) in the decade since the first conference on the subject in 1989. After setting out a brief history of NDM we identify its essential characteristics and consider five of its main contributions: recognition-primed decisions, coping with uncertainty, team decision making, decision errors, and methodology. NDM helped identify important areas of inquiry previously neglected (e.g. the use of expertise in sizing up situations and generating options), it introduced new models, conceptualizations, and methods, and recruited applied investigators into the field. Above all, NDM contributed a new perspective on how decisions (broadly defined as committing oneself to a certain course of action) are made. NDM still faces significant challenges, including improvement of the quantity and rigor of its empirical research, and confirming the validity of its prescriptive models. Copyright © 2001 John Wiley & Sons, Ltd.
Article
We consider the natural combination of two strands of recent statistical research, i.e., that of decision making with uncertain utility and that of Nonparametric Predictive Inference (NPI). In doing so we present the idea of Nonparametric Predictive Utility Inference (NPUI), which is suggested as a possible strategy for the problem of utility induction in cases of extremely vague prior information. An example of the use of NPUI within a motivating sequential decision problem is also considered for two extreme selection criteria, i.e., a rule that is based on an attitude of extreme pessimism and a rule that is based on an attitude of extreme optimism.
Article
Full-text available
This is a recently extended, better version of a paper that is to appear soon! Any other changes to the paper (which, if there are any, will be very minor) will be incorporated into the electronic version of the paper available at, for example, <dl.dropboxusercontent.com/u/50966294/documents/Publications/ComparativeExpectations_Pedersen.pdf> or my website (arthurpaulpedersen.org). Abstract: I introduce a mathematical account of expectation based on a *qualitative criterion of coherence* for *qualitative comparisons* between gambles (or random quantities). The qualitative comparisons may be interpreted as an agent's comparative preference judgments over options or more directly as an agent's comparative expectation judgments over random quantities. The criterion of coherence is reminiscent of de Finetti's quantitative criterion of coherence for betting, yet it does not impose an Archimedean condition on an agent's comparative judgments, it does not require the binary relation reflecting an agent's comparative judgments to be *reflexive,* *complete,* or even *transitive,* and it applies to an absolutely arbitrary collection of gambles, free of structural conditions (e.g., closure, measurability, etc.). Moreover, unlike de Finetti's criterion of coherence, the qualitative criterion respects the *principle of weak dominance,* a standard of rational decision making that obliges an agent to reject a gamble that is possibly worse and certainly no better than another gamble available for choice. Despite these weak assumptions, I establish a qualitative analogue of de Finetti's Fundamental Theorem of Prevision, from which it follows that any coherent system of *comparative expectations* can be extended to a weakly ordered coherent system of comparative expectations over any collection of gambles containing the initial set of gambles of interest. The extended weakly ordered coherent system of comparative expectations satisfies familiar additivity and scale invariance postulates (i.e., independence) when the extended collection forms a linear space. In the course of these developments, I recast de Finetti's quantitative account of coherent prevision in the qualitative framework adopted in this article. I show that *comparative previsions* satisfy qualitative analogues of de Finetti's famous bookmaking theorem and his Fundamental Theorem of Prevision. The results of this article complement those of another article [Pedersen, 2013]. I explain how those results entail that any coherent weakly ordered system of comparative expectations over a unital linear space can be represented by an *expectation function* taking values in a (possibly non-Archimedean) totally ordered field extension of the system of real numbers. The ordered field extension consists of *formal power series* in a *single* infinitesimal, a natural and economical representation that provides a relief map tracing numerical non-Archimedean features to qualitative non-Archimedean features. The technical developments of these articles contain key ingredients of an unconditionally full account of subjective expected utility in the style of Savage (1954) and Anscombe and Aumann (1963). Additional Remarks: The numerical representation of qualitative comparisons in terms of formal power series in a single infinitesimal shows how (lexicographic) ordered vectorial representations (following the tradition of Hausner, Thrall, Chipman, Fishburn, Blume et al., etc.)
may be understood instead quite simply as ordered field representations with multiplication of vectors naturally defined by means of the familiar operation of multiplication of power series. Such a field representation sheds light not only upon how to multiply vectorial utilities and vectorial probabilities in agreement with traditional accounts in the expected utility paradigm but also upon how to define conditional lexicographic probability and expectation according to the standard ratio formula. Thus the formal developments of these articles address basic questions posed by Halpern (2010), serving to close what may appear to be a gap between ordered vectorial representations and ordered field representations. As I explain in the article, existing accounts of non-Archimedean probability and expectation proposed by philosophers and non-philosophers alike deliver mathematically deficient formalisms and rest on inadequate theoretical foundations.
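The representation described above can be pictured with a small display: expectations take values in formal power series in a single infinitesimal ε and are compared by their coefficients. The notation below is a generic illustration of that idea, not an excerpt from the paper.

```latex
% Expectations as formal power series in one infinitesimal epsilon,
% ordered lexicographically by coefficients (illustrative notation).
\[
E(X) \;=\; a_0 + a_1\varepsilon + a_2\varepsilon^2 + \cdots , \qquad
E(Y) \;=\; b_0 + b_1\varepsilon + b_2\varepsilon^2 + \cdots
\]
\[
E(X) > E(Y) \iff a_k > b_k \ \text{for the smallest } k \text{ such that } a_k \neq b_k .
\]
% Because the series form an ordered field, ratios (and hence conditional
% expectations defined by the usual ratio formula) are also available.
```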
Article
Full-text available
This paper presents an axiomatic framework for the priority heuristic, a model of bounded rationality in Selten’s (in: Gigerenzer and Selten (eds.) Bounded rationality: the adaptive toolbox, 2001) spirit of using empirical evidence on heuristics. The priority heuristic predicts actual human choices between risky gambles well. It implies violations of expected utility theory such as common consequence effects, common ratio effects, the fourfold pattern of risk taking and the reflection effect. We present an axiomatization of a parameterized version of the heuristic which generalizes the heuristic in order to account for individual differences and inconsistencies. The axiomatization uses semiorders (Luce, Econometrica 24:178–191, 1956), which have an intransitive indifference part and a transitive strict preference component. The axiomatization suggests new testable predictions of the priority heuristic and makes it easier for theorists to study the relation between heuristics and other axiomatic theories such as cumulative prospect theory.
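For readers unfamiliar with the heuristic being axiomatized, the sketch below states the unparameterized priority heuristic for simple two-outcome gain gambles. The 1/10-of-maximum-gain aspiration level and the 0.1 probability threshold are the values usually quoted in the heuristics literature, and the example gambles are invented; the parameterized, semiorder-based version studied in the paper is more general.

```python
# Sketch of the priority heuristic for two-outcome gain gambles represented as
# (min_gain, p_min, max_gain), i.e. min_gain with probability p_min, else max_gain.
# Thresholds follow commonly quoted values; gambles are illustrative.

def priority_heuristic(g1, g2):
    """Return the preferred gamble, checking reasons in a fixed priority order."""
    (min1, pmin1, max1), (min2, pmin2, max2) = g1, g2
    aspiration = 0.1 * max(max1, max2)          # one tenth of the maximum gain
    if abs(min1 - min2) >= aspiration:          # 1) compare minimum gains
        return g1 if min1 > min2 else g2
    if abs(pmin1 - pmin2) >= 0.1:               # 2) compare probabilities of the minimum gains
        return g1 if pmin1 < pmin2 else g2
    return g1 if max1 > max2 else g2            # 3) compare maximum gains

sure_thing = (3000, 1.0, 3000)                  # 3000 for sure
risky      = (0, 0.2, 4000)                     # 4000 with probability 0.8, else 0
print(priority_heuristic(sure_thing, risky))    # -> (3000, 1.0, 3000)
```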
Data
Recent empirical studies in the Behavioral Agency Model (see Pepper and Gore, 2012) on executive compensation provide evidence of how the agent's attitude to risk influences the subjectively perceived incentive value. The paper sets out a compensation schedule matching multiple goals: (1) aligning the incentives with the executive's subjectively perceived fair and equitable compensation; (2) discouraging excessive risk-taking by the executive; (3) providing an approach to calculate the certainty equivalent of the uncertain compensation. To hit the first goal we suggest using the target-oriented decision approach (see Bordley and LiCalzi, 2000), able to guide the agent in eliciting her subjective value function through the assessment of the (uncertain) target to hit. The proposed approach is compatible with prospect theory (see Kahneman and Tversky, 1979). With reference to the second goal, involving the problem of how to prevent moral hazard phenomena, we suggest inserting an event-linked option. That warranty ties the compensation payment to the outcome of a set of given performance indicators taken as benchmarks. The third goal is achieved using the notion of the actuarial zero-utility premium principle extended to prospect theory (see Kaluszka and Krzeszowiec, 2012, 2013). To make the agent's subjective value function explicit we suggest an interactive graphical method proposed by Goldstein and Rothschild (2014) based on the Distribution Builder (see Sharpe et al., 2000). Our approach generalizes the Pepper and Gore (2012, 2013) compensation formula and provides a normative foundation for constructing compensation schemes which are coherent with Savage's (1954) rationality axioms and prospect theory as well.
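To illustrate one ingredient of such a schedule, the sketch below computes a certainty equivalent of an uncertain bonus under a Kahneman-Tversky style value function with the target compensation as the reference point. The parameter values, the bonus lottery, and the omission of probability weighting are simplifying assumptions of this illustration, not elements of the formula developed in the paper.

```python
# Certainty equivalent of an uncertain compensation under a prospect-theory style
# value function, taking a target (reference) compensation as the zero point.
# Parameters and the lottery are illustrative; probability weighting is omitted.

ALPHA, BETA, LAMBDA = 0.88, 0.88, 2.25   # curvature for gains/losses, loss aversion

def value(x):
    """Piecewise power value function relative to the reference point."""
    return x ** ALPHA if x >= 0 else -LAMBDA * (-x) ** BETA

def value_inverse(v):
    return v ** (1 / ALPHA) if v >= 0 else -((-v / LAMBDA) ** (1 / BETA))

def certainty_equivalent(lottery, target):
    """lottery: [(compensation, prob), ...]; returns the sure amount judged equivalent."""
    expected_v = sum(p * value(x - target) for x, p in lottery)
    return target + value_inverse(expected_v)

bonus = [(150_000, 0.5), (60_000, 0.5)]   # uncertain bonus outcomes (illustrative)
print(certainty_equivalent(bonus, target=100_000))  # below the 105,000 expected value
```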
Article
Full-text available
The present two studies examine how the participants (i.e., 150 managers) make trust-based employee selection in hypothetical situations, based on five cues of trustworthiness derived from previous surveys. In Study 1, each executive participant is presented with a pair of candidates with different cue profiles so that the choice would favor one of them based upon each of the four following heuristics: Franklin's rule, likelihood expectancy, take-the-best (TTB), and minimum requirement (MR). Study 2 adopting a within-subject design jointly compares the four heuristics. The results show that simple heuristics (MR and TTB) outperform the more complex strategies (Franklin's rule and likelihood expectancy) in their predictive accuracy. The MR heuristic, a heuristic tallying the frequency of passes against a set of minimal rather than optimal or satisfactory requirements, performs even better than the TTB heuristic, particularly when the number of the cues identified as MRs is small.
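A compact way to see why the simple strategies can compete with the weighted one is to write them down. The cue names, validity order, weights and minimum requirements below are hypothetical, and the functions only sketch three of the strategies as they are commonly described in the heuristics literature, not the exact implementations of the study.

```python
# Sketch of selection strategies on binary trustworthiness cues (1 = pass).
# Cue names, ordering, weights and requirements are hypothetical.

CUES = ["integrity", "benevolence", "competence", "track_record", "transparency"]

def franklins_rule(candidate, weights):
    """Weighted additive score over all cues."""
    return sum(weights[c] * candidate[c] for c in CUES)

def take_the_best(cand_a, cand_b, cues_by_validity):
    """Decide on the first cue (in validity order) that discriminates."""
    for c in cues_by_validity:
        if cand_a[c] != cand_b[c]:
            return "A" if cand_a[c] > cand_b[c] else "B"
    return "tie"

def minimum_requirement(candidate, required):
    """Tally how many of the minimal requirements the candidate passes."""
    return sum(candidate[c] for c in required)

a = {"integrity": 1, "benevolence": 0, "competence": 1, "track_record": 1, "transparency": 1}
b = {"integrity": 1, "benevolence": 1, "competence": 0, "track_record": 0, "transparency": 1}
weights = {"integrity": 5, "benevolence": 4, "competence": 3, "track_record": 2, "transparency": 1}

print("Franklin A vs B:", franklins_rule(a, weights), franklins_rule(b, weights))
print("TTB picks:", take_the_best(a, b, CUES))
print("MR tallies:", minimum_requirement(a, CUES[:3]), minimum_requirement(b, CUES[:3]))
```

Even on this toy pair the strategies can disagree, which is what makes their relative predictive accuracy an empirical question.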
Article
Vigorous debate in phonetics and phonology has focused on the structure and cognitive foundation of distinctive feature theory, as well as on the definition and representation of features themselves. In particular, we show in Section 1 that, although vowel height has long been the object of close scrutiny, research on the three- or more-tiered height representation of vowels in the phonology of English remains inconclusive. Section 2 reports on the rationale and methodology of a sound-symbolism experiment designed to evaluate the implicit phonological knowledge that English native speakers have of vowel height differences. In Section 3 we tabulate results and argue in Section 4 that their intuitive understanding of such differences is best accounted for in terms of a three-tiered height axis. KEYWORDS: binarism, distinctive feature, sound symbolism, phonological perception, vowel height.
Article
We describe and analyze the asset pricing, technology and regulatory changes in the banking sector. The material is structured as follows: 1. Asset pricing. We start with the benchmark of perfect and complete markets. We first extend to incomplete and imperfect markets and second, to asset pricing with counterparty risk in OTC derivatives, with collateralization and funding costs. 2. Financial Crisis 2007. Taxation and regulation, in particular we focus on FATCA. Consumer protection and an overview of regulatory initiatives. Systemic risk, balance sheet regulation, leverage and regulation of OTC markets are then considered in more detail. 3. Technological progress and its impact on banking. 4. Innovation. We discuss some recent innovations, the innovation life cycle, and the impact of innovation on society.
Article
Full-text available
The study of covert networks is plagued by the fact that individuals conceal their attributes and associations. To address this problem, we develop a technology for eliciting this information from qualitative subject-matter experts to inform statistical social network analysis. We show how the information from the subjective probability distributions can be used as input to Bayesian hierarchical models for network data. In the spirit of “proof of concept,” the results of a test of the technology are reported. Our findings show that human subjects can use the elicitation tool effectively, supplying attribute and edge information to update a network indicative of a covert one.
Article
Full-text available
This paper attempts to disentangle the difference between our own self-assessment of moral judgment and intentional judgment. Using a Knobe-effect version of the Trolley Problem in our experiment, we used reactant variables—magnitude of victims, outcome probabilities, and expected outcome of lives saved—to find points where these two forms of moral judgment diverge. Outcome probabilities provided the most salient and consistent pattern, indicating that participants used outcome probabilities as their reference point in their moral decision making. We saw that, with respect to their own self-assessed moral judgments, individuals tend to be morally optimistic when the task dilemma emphasizes the lives saved, but become morally pessimistic when the same dilemma emphasizes the killing of the one other victim. Under negative probability frames, individuals tend to overestimate their attribution of positive intentions and underestimate their attribution of negative intentions.
Article
The paper considers an environmental policy decision in which the appropriate approach for discounting future costs and benefits is unknown. Uncertainty about the discount rate is formulated as a decision under Knightian uncertainty. To solve this, we employ minimax regret, a decision criterion that is much less conservative than the related criterion maximin—in particular, it can be shown to implement a “proportional response” in that it equally balances concern about the mistake of doing too little with that of doing too much. Despite the criterion's balanced nature, the minimax regret solution mimics a policy that maximizes the present discounted value of future net benefits with an effective (certainty-equivalent) discount rate that declines over time to the lowest possible rate. In addition to reinforcing Weitzman's (1998) original limiting result, the approach generates concrete policy advice when decision makers are unable to specify a prior over possible discount rates. We apply it to the Stern–Nordhaus discounting debate and find that the effective discount rate converges to the Stern rate in just under 200 years.
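The declining certainty-equivalent rate can be illustrated with Weitzman-style averaging of discount factors over candidate rates. The rates and equal weights below are illustrative, and this simple average is a cruder device than the minimax-regret solution analysed in the paper; it does, however, show the same qualitative decline toward the lowest rate.

```python
import math

# Certainty-equivalent discount rate R(t) implied by averaging discount factors
# over candidate rates: R(t) = -ln( sum_i w_i * exp(-r_i * t) ) / t.
# Rates and equal weights are illustrative only.

rates = [0.014, 0.03, 0.045]              # candidate annual discount rates
weights = [1 / len(rates)] * len(rates)

def effective_rate(t):
    factor = sum(w * math.exp(-r * t) for w, r in zip(weights, rates))
    return -math.log(factor) / t

for t in (1, 50, 100, 200, 400):
    print(f"t = {t:>3} years: effective rate = {effective_rate(t):.4f}")
# As t grows, the effective rate declines toward min(rates) = 0.014.
```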
Article
Full-text available
We identify and explain the mechanisms that account for the emergence of fairness preferences and altruistic punishment in voluntary contribution mechanisms by combining an evolutionary perspective together with an expected utility model. We aim at filling a gap between the literature on the theory of evolution applied to cooperation and punishment, and the empirical findings from experimental economics. The approach is motivated by previous findings on other-regarding behavior, the co-evolution of culture, genes and social norms, as well as bounded rationality. Our first result reveals the emergence of two distinct evolutionary regimes that force agents to converge either to a defection state or to a state of coordination, depending on the predominant set of self- or other-regarding preferences. Our second result indicates that subjects in laboratory experiments of public goods games with punishment coordinate and punish defectors as a result of an aversion against disadvantageous inequitable outcomes. Our third finding identifies disadvantageous inequity aversion as evolutionary dominant and stable in a heterogeneous population of agents endowed initially only with purely self-regarding preferences. We validate our model using previously obtained results from three independently conducted experiments of public goods games with punishment.
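The "aversion against disadvantageous inequitable outcomes" invoked above is usually formalized with a Fehr-Schmidt style utility. The sketch below states that standard form with invented payoffs and parameters, as a point of reference rather than the paper's exact specification.

```python
# Fehr-Schmidt style inequity-averse utility for player i in a group:
#   U_i = x_i - alpha/(n-1) * sum_j max(x_j - x_i, 0)   (disadvantageous inequity)
#             - beta /(n-1) * sum_j max(x_i - x_j, 0)   (advantageous inequity)
# Payoffs and parameters are illustrative.

def inequity_averse_utility(i, payoffs, alpha, beta):
    n = len(payoffs)
    x_i = payoffs[i]
    disadvantage = sum(max(x_j - x_i, 0) for j, x_j in enumerate(payoffs) if j != i)
    advantage = sum(max(x_i - x_j, 0) for j, x_j in enumerate(payoffs) if j != i)
    return x_i - alpha * disadvantage / (n - 1) - beta * advantage / (n - 1)

payoffs = [10, 18, 6]   # material payoffs after contributions and punishment (made up)
print(inequity_averse_utility(0, payoffs, alpha=0.9, beta=0.25))  # -> 5.9
```

A sufficiently large alpha makes costly punishment of defectors worthwhile, which is the channel through which coordination on contribution can become stable.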
Article
We combine forward investment performance processes and ambiguity averse portfolio selection. We introduce the notion of robust forward criteria which addresses the issues of ambiguity in model specification and in preferences and investment horizon specification. It describes the evolution of time-consistent ambiguity averse preferences. We first focus on establishing dual characterizations of the robust forward criteria. This offers various advantages as the dual problem amounts to a search for an infimum whereas the primal problem features a saddle-point. Our approach is based on ideas developed in Schied (2007) and Zitkovic (2009). We then study in detail non-volatile criteria. In particular, we solve explicitly the example of an investor who starts with a logarithmic utility and applies a quadratic penalty function. The investor builds a dynamical estimate of the market price of risk $\hat \lambda$ and updates her stochastic utility in accordance with the so-perceived elapsed market opportunities. We show that this leads to a time-consistent optimal investment policy given by a fractional Kelly strategy associated with $\hat \lambda$. The leverage is proportional to the investor's confidence in her estimate $\hat \lambda$.
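The fractional Kelly strategy mentioned in the abstract takes a simple form in a one-risky-asset diffusion market. The sketch below states that standard form in my own notation, with hypothetical numbers; it is not the paper's derivation, only the shape of the resulting policy: the risky-asset weight is proportional to the estimated market price of risk, scaled by a confidence factor.

    # Sketch of a fractional Kelly allocation (standard textbook form, not the paper's derivation):
    # in a one-asset diffusion market the full Kelly (log-utility) weight is lambda_hat / sigma,
    # where lambda_hat = (mu_hat - r) / sigma is the estimated market price of risk;
    # a confidence factor c in (0, 1] scales it down.
    def fractional_kelly_weight(mu_hat, r, sigma, confidence):
        lambda_hat = (mu_hat - r) / sigma          # estimated market price of risk
        return confidence * lambda_hat / sigma     # fraction of wealth in the risky asset

    # Example with hypothetical numbers: 8% drift estimate, 2% rate, 20% volatility, half confidence.
    print(fractional_kelly_weight(0.08, 0.02, 0.20, confidence=0.5))  # 0.75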
Conference Paper
In previous decision-theoretic rough set (DTRS) models, the values of the loss functions are constant. This paper extends the constant loss functions to a more realistic dynamic environment. Considering the dynamic change of loss functions in DTRS over time, an extension of DTRS, dynamic decision-theoretic rough sets (DDTRS), is proposed in this paper. An empirical study of climate policy making validates the reasonableness and effectiveness of the proposed model.
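In the standard DTRS formulation, the acceptance and rejection thresholds of the three-way decision are derived from the loss function. The sketch below recomputes those standard thresholds as the losses drift over time, which is the basic idea behind making DTRS dynamic; the threshold formulas are the usual ones and the time-varying losses are hypothetical, not taken from this paper.

    # Sketch: standard decision-theoretic rough set thresholds derived from a loss function
    # (l_xy = loss of taking action x when the object is / is not in the concept),
    # recomputed as the losses drift over time. The drift below is hypothetical.
    def dtrs_thresholds(l_pp, l_bp, l_np, l_nn, l_bn, l_pn):
        alpha = (l_pn - l_bn) / ((l_pn - l_bn) + (l_bp - l_pp))  # accept if Pr >= alpha
        beta = (l_bn - l_nn) / ((l_bn - l_nn) + (l_np - l_bp))   # reject if Pr <= beta
        return alpha, beta

    for t in range(3):
        l_pn = 8.0 + 2.0 * t   # cost of wrongly accepting grows over time (hypothetical)
        print(t, dtrs_thresholds(l_pp=0.0, l_bp=2.0, l_np=6.0,
                                 l_nn=0.0, l_bn=3.0, l_pn=l_pn))
    # As the loss of a wrong acceptance grows, the acceptance threshold alpha rises.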
Article
Full-text available
Lower and upper attainable bounds for the stability radius of an efficient solution to the multicriteria Boolean portfolio optimization problem with Savage’s minimax risk criterion are obtained.
Article
Full-text available
Based on the classical Markowitz model, we formulate a vector (multicriteria) Boolean problem of portfolio optimization with bottleneck criteria under risk. We obtain lower and upper attainable bounds for a quantitative characteristic of the stability of the problem; the type of stability considered is a discrete analog of the Hausdorff upper semicontinuity of the multivalued mapping that defines Pareto optimality.
Article
In the typical analysis of a data set, a single method is selected for statistical reporting even when equally applicable methods yield very different results. Examples of equally applicable methods can correspond to those of different ancillary statistics in frequentist inference and of different prior distributions in Bayesian inference. More broadly, choices are made between parametric and nonparametric methods and between frequentist and Bayesian methods. Rather than choosing a single method, it can be safer, in a game-theoretic sense, to combine those that are equally appropriate in light of the available information. Since methods of combining subjectively assessed probability distributions are not objective enough for that purpose, this paper introduces a method of distribution combination that does not require any assignment of distribution weights. It does so by formalizing a hedging strategy in terms of a game between three players: nature, a statistician combining distributions, and a statistician refusing to combine distributions. The optimal move of the first statistician reduces to the solution of a simpler problem of selecting an estimating distribution that minimizes the Kullback–Leibler loss maximized over the plausible distributions to be combined. The resulting combined distribution is a linear combination of the most extreme of the distributions to be combined that are scientifically plausible. The optimal weights are close enough to each other that no extreme distribution dominates the others. The new methodology is illustrated by combining conflicting empirical Bayes methods in the context of gene expression data analysis.
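The optimization described above reduces to choosing an estimating distribution that minimizes the Kullback–Leibler loss maximized over the plausible distributions. The toy sketch below does this by grid search over convex combinations of two made-up discrete distributions, under one reading of "KL loss" (divergence from each plausible distribution to the combination); it is an illustration of the idea, not the paper's exact setup or its empirical-Bayes application.

    # Toy sketch of minimax-KL combination: search convex combinations q of two plausible
    # discrete distributions and keep the one minimizing the worst-case divergence KL(p_i || q).
    # Inputs and the choice of KL direction are illustrative assumptions.
    import numpy as np

    def kl(p, q):
        return float(np.sum(p * np.log(p / q)))

    p1 = np.array([0.7, 0.2, 0.1])   # hypothetical plausible distribution 1
    p2 = np.array([0.3, 0.3, 0.4])   # hypothetical plausible distribution 2

    best_w, best_worst = None, np.inf
    for w in np.linspace(0.0, 1.0, 1001):
        q = w * p1 + (1.0 - w) * p2
        worst = max(kl(p1, q), kl(p2, q))
        if worst < best_worst:
            best_w, best_worst = w, worst

    print(best_w, best_worst)   # typically an interior weight: neither extreme distribution dominates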
Article
We review several of de Finetti’s fundamental contributions where these have played and continue to play an important role in the development of imprecise probability research. Also, we discuss de Finetti’s few, but mostly critical remarks about the prospects for a theory of imprecise probabilities, given the limited development of imprecise probability theory as that was known to him.
Article
We review de Finetti’s two coherence criteria for determinate probabilities: coherence₁, defined in terms of previsions for a set of events that are undominated by the status quo (previsions immune to a sure loss), and coherence₂, defined in terms of forecasts for events undominated in Brier score by a rival forecast. We propose a criterion of IP-coherence₂ based on a generalization of Brier score for IP-forecasts that uses one-sided (lower and upper) probability forecasts. However, whereas Brier score is a strictly proper scoring rule for eliciting determinate probabilities, we show that there is no real-valued strictly proper IP-score. Nonetheless, with respect to either of two decision rules, Γ-maximin or (Levi’s) E-admissibility + Γ-maximin, we give a lexicographic strictly proper IP-scoring rule that is based on Brier score.
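The Brier score of a precise forecast q for an event E is (1_E − q)², and it is strictly proper: the expected score under a true probability p is uniquely minimized by reporting q = p. The sketch below checks this numerically for the determinate case only; the paper's lexicographic IP scoring rule is not reproduced here.

    # Brier score for a precise probability forecast q of an event E: (1_E - q)^2.
    # Expected score under true probability p is p*(1-q)^2 + (1-p)*q^2, minimized at q = p.
    def expected_brier(p, q):
        return p * (1 - q) ** 2 + (1 - p) * q ** 2

    p = 0.3
    best_q = min((q / 1000 for q in range(1001)), key=lambda q: expected_brier(p, q))
    print(best_q)   # 0.3: truthful reporting minimizes the expected Brier score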
Article
This paper develops the first model of probabilistic choice under subjective uncertainty (when probabilities of events are not objectively known). The model is characterized by seven standard axioms (probabilistic completeness, weak stochastic transitivity, nontriviality, event-wise dominance, probabilistic continuity, existence of an essential event, and probabilistic independence) as well as one new axiom. The model has an intuitive econometric interpretation as a Fechner model of (relative) random errors. The baseline model is extended from binary choice to decisions among m > 2 alternatives using a new method, which is also applicable to other models of binary choice.
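A Fechner model of random errors turns a deterministic utility difference into a choice probability through a noise distribution. The sketch below is a generic probit-style illustration of that idea, with toy utilities and an assumed noise scale; it is not the paper's axiomatized model under subjective uncertainty.

    # Generic Fechner-style binary choice sketch: the probability of choosing act a over
    # act b is a monotone function of the utility difference, here a normal CDF (probit).
    # Utilities and the noise scale are illustrative assumptions.
    from math import erf, sqrt

    def normal_cdf(x):
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    def choice_probability(u_a, u_b, noise_scale=1.0):
        return normal_cdf((u_a - u_b) / noise_scale)

    print(choice_probability(1.2, 1.0, noise_scale=0.5))   # about 0.66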
Article
This article operationalizes the non-empty relation that is implied when strict preference and indifference jointly do not completely order the choice set. Specifically, indecision is operationalized as a positive preference for delegating choice to a least predictable device.
Article
In this paper we extend the results on ex-ante agreeable trade of Kajii and Ui [Kajii, A., Ui, T., 2006. Agreeable bets with multiple priors. Journal of Economic Theory 128, 299–305] to the case of non-convex Choquet preferences. We discuss the economic relevance of the main result for the existence of speculative trade under ambiguity. Following Haller [Haller, H., 1990. Non-market reopening, time-consistent plans and the structure of intertemporal preferences. Economics Letters 32, 1–5] we also elaborate the relevance of updating preferences upon arrival of public information for the potential of re-contracting initial contingent claims.
Article
Existing decision models have been successfully applied to many decision problems in management, business, economics and other fields, but there is now a need to develop more realistic decision models. The main drawback of existing utility theories, from von Neumann–Morgenstern expected utility to the advanced non-expected models, is that they are designed for laboratory examples with simple, well-defined gambles which do not adequately reflect real decision situations. In real-life decision-making problems, preferences are vague and decision-relevant information is imperfect, being described in natural language (NL). Vagueness of preferences and imperfect decision-relevant information require a suitable utility model that is fundamentally different from the existing precise utility models. Precise utility models cannot reflect vagueness of preferences, vagueness of objective conditions and outcomes, imprecise beliefs, etc. The time has come for a new generation of decision theories. In this study, we propose a decision theory which is capable of dealing with vague preferences and imperfect information. The theory discussed here is based on a fuzzy-valued non-expected utility model representing linguistic preference relations and imprecise beliefs.
Article
The attempt to reinterpret the common tests of significance used in scientific research as though they constituted some kind of acceptance procedure and led to “decisions” in Wald's sense, originated in several misapprehensions and has led, apparently, to several more. The three phrases examined here, with a view to elucidating the fallacies they embody, are: “Repeated sampling from the same population”, Errors of the “second kind”, “Inductive behaviour”. Mathematicians without personal contact with the Natural Sciences have often been misled by such phrases. The errors to which they lead are not always only numerical.
Article
Suppose that we have a series of observations, given to have been drawn independently from a common chance distribution of specified form depending on a single unknown parameter θ. Suppose the observations have been taken according to some sampling rule such that the eventual number of observations does not depend on θ except possibly through the observations themselves. All recognized kinds of sequential rule, including fixed-sample-size rules, satisfy this condition. (A type of sampling rule to be excluded would be one where the number of observations depended on other observations subsequently suppressed.) It is well known that the likelihood function of θ, given the observations, is independent of the sampling rule. For in any expression for the chance of observing what has actually been observed, the sampling rule only enters as a factor independent of θ, which is therefore irrelevant when the chances corresponding to different values of θ are compared. It follows that the posterior probability distribution for θ derived by Bayes's theorem from some given prior distribution is also independent of the sampling rule. (I shall refer to this use of Bayes's theorem as 'the Bayesian argument', for short.) On the other hand, in order to make a significance test (for instance, a test of whether the specified form of parent distribution is reasonably compatible with the observations), we need to take the sampling rule into account, in general. The same is true of other common statistical procedures; for examples and references, see Anscombe (1954). The purpose of this note is to show that in order to apply Fisher's fiducial argument (as recently re-expounded by him, 1956) we need to consider the sampling rule, just as for significance tests. This is interesting for two reasons. First, Fisher has stressed that a fiducial distribution is to be interpreted in exactly the same way as if it had been derived by a Bayesian argument, and indeed a fiducial distribution may be taken as prior distribution for incorporation in a subsequent Bayesian argument. Moreover, when the fiducial argument is for mathematical reasons not available, Fisher suggests that the likelihood function may be considered directly instead. Thus, what the fiducial argument is likened to and intimately associated with, and what is suggested as a substitute for it when it is not available, both differ from it in regard to dependence on the sampling rule. Secondly, Lindley (1957) has shown that, if the common chance distribution of the observations referred to in the opening sentence above is a member of the exponential family, then the fiducial argument does not have a property of consistency evidently desired for it by Fisher unless the resulting distribution for θ is actually a posterior Bayes distribution. The examples below show that, in so far as fiducial distributions can be identified with posterior Bayes distributions, the corresponding prior distributions depend on the
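The point made in the first half of this note can be checked numerically with a minimal sketch, assuming illustrative data of 9 successes and 3 failures: binomial sampling (fixed number of trials) and inverse binomial sampling (sample until a fixed number of failures) give proportional likelihoods, so Bayesian posteriors agree, while the significance levels attached to the same observation differ.

    # Numerical check (illustrative data): 9 successes and 3 failures, obtained either
    # with n = 12 trials fixed (binomial rule) or by sampling until the 3rd failure
    # (inverse binomial rule). The likelihoods in theta are proportional, so Bayes
    # posteriors coincide; the one-sided significance levels of H0: theta = 0.5 do not.
    import numpy as np
    from scipy.stats import binom, nbinom

    theta = np.linspace(0.05, 0.95, 10)
    lik_binomial = binom.pmf(9, 12, theta)          # C(12,9) * theta^9 * (1-theta)^3
    lik_inverse = nbinom.pmf(9, 3, 1 - theta)       # successes observed before the 3rd failure
    print(np.round(lik_binomial / lik_inverse, 6))  # constant ratio: proportional likelihoods

    # Significance tests of H0: theta = 0.5 depend on the sampling rule:
    p_binomial = binom.sf(8, 12, 0.5)    # P(>= 9 successes in 12 trials), about 0.073
    p_inverse = nbinom.sf(8, 3, 0.5)     # P(>= 9 successes before the 3rd failure), about 0.033
    print(p_binomial, p_inverse)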
Article
We would like to call attention to a few problems raised by but not solved in this paper: 1) find a necessary and sufficient condition that $\hat p$ be the unique unbiased estimate for p; 2) suggest criteria for selecting one unbiased estimate when more than one is possible; 3) evaluate the variance of $\hat p$. In this connection, in a forthcoming paper by M. A. Girshick, it will be shown for certain regions, for example those of the sequential probability ratio test, that the variance of $\hat p(\alpha)$ satisfies $$\sigma_{\hat p}^2 \geqslant pq/E(x + y),$$ where $E(x + y)$ is the expected number of observations required to reach a boundary point.
Article
Significance testing, as described in most textbooks, consists in fixing a standard significance level α such as .01 or .05 and rejecting the hypothesis θ = θ₀ if a suitable statistic Y exceeds C, where Pθ₀{Y > C} = α. Such a procedure controls the probability of false rejection (error of the first kind) at the desired level α but leaves the power of the test, and hence the probability of an error of the second kind, to the mercy of the experiment. It seems more natural when deciding on a significance level (and this suggestion is certainly not new) to take into account also what power can be achieved with the given experiment. In Section 3 a specific suggestion will be made as to how to balance α against the power β obtainable against the alternatives of interest.
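One simple way to act on this suggestion is sketched below for a one-sided test of a normal mean with known variance: compute the power attainable at the alternative of interest for a range of significance levels, then pick a level at which α and the type II error probability are of comparable size. The numbers and the particular balancing rule are hypothetical, not the paper's specific proposal.

    # Sketch: balancing the significance level alpha against attainable power for a
    # one-sided test of H0: mu = 0 vs mu = mu1 > 0, normal data with known sigma and
    # sample size n. Numbers are hypothetical; balancing alpha against beta is just
    # one possible choice.
    from scipy.stats import norm

    def power(alpha, mu1, sigma, n):
        z_crit = norm.ppf(1 - alpha)                        # reject if the standardized mean exceeds z_crit
        return 1 - norm.cdf(z_crit - mu1 * n ** 0.5 / sigma)

    mu1, sigma, n = 0.5, 1.0, 20
    for alpha in (0.01, 0.05, 0.10, 0.20):
        beta = 1 - power(alpha, mu1, sigma, n)
        print(alpha, round(beta, 3))
    # Choose an alpha whose beta is of comparable size, rather than fixing 0.05 blindly.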
Statistical aspects of ESP
  • W Feller
W. FELLER, "Statistical aspects of ESP," J. Parapsych., Vol. 4 (1940), pp. 271–298.
The axioms and algebra of intuitive probability / The bases of probability
  • B O Koopman
B. O. KOOPMAN, "The axioms and algebra of intuitive probability," Ann. of Math., Vol. 41 (1940), pp. 269–292; "The bases of probability," Bull. Amer. Math. Soc., Vol. 46 (1940), pp. 763–774.
Logical Foundations of Probability / The Continuum of Inductive Methods
  • R Carnap
R. CARNAP, Logical Foundations of Probability, Chicago, University of Chicago Press, 1950; The Continuum of Inductive Methods, Chicago, University of Chicago Press, 1952.
The Foundations of Statistics / La probabilità soggettiva nei problemi pratici della statistica / Subjective probability and statistical practice
  • L J Savage
L. J. SAVAGE, The Foundations of Statistics, New York, Wiley, 1954; "La probabilità soggettiva nei problemi pratici della statistica," Induzione e Statistica, Rome, Istituto Matematico dell'Università, 1959; "Subjective probability and statistical practice," to be published.
La prévision: ses lois logiques, ses sources subjectives / La probabilità e la statistica nei rapporti con l'induzione, secondo i diversi punti di vista
  • B De Finetti
B. DE FINETTI, "La prévision: ses lois logiques, ses sources subjectives," Ann. Inst. H. Poincaré, Vol. 7 (1937), pp. 1–68; "La probabilità e la statistica nei rapporti con l'induzione, secondo i diversi punti di vista," Induzione e Statistica, Rome, Istituto Matematico dell'Università, 1959.
Statistical inference
  • D V Lindley
D. V. LINDLEY, "Statistical inference," J. Roy. Statist. Soc., Ser. B, Vol. 15 (1953), pp. 30–76.