Article

Dragon-Kings, Black Swans and the Prediction of Crises

Authors:
Didier Sornette

Abstract

We develop the concept of “dragon-kings” corresponding to meaningful outliers, which are found to coexist with power laws in the distributions of event sizes under a broad range of conditions in a large variety of systems. These dragon-kings reveal the existence of mechanisms of self-organization that are not apparent otherwise from the distribution of their smaller siblings. We present a generic phase diagram to explain the generation of dragon-kings and document their presence in six different examples (distribution of city sizes, distribution of acoustic emissions associated with material failure, distribution of velocity increments in hydrodynamic turbulence, distribution of financial drawdowns, distribution of the energies of epileptic seizures in humans and in model animals, and distribution of earthquake energies). We emphasize the importance of understanding dragon-kings as being often associated with a neighborhood of what can be called equivalently a phase transition, a bifurcation, a catastrophe (in the sense of René Thom), or a tipping point. The presence of a phase transition is crucial to learn how to diagnose in advance the symptoms associated with a coming dragon-king. Several examples of predictions using the derived log-periodic power law method are discussed, including material failure predictions and the forecasts of the end of financial bubbles.
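To make the log-periodic power law (LPPL) signature concrete, here is a minimal Python sketch that fits the standard LPPL form for the expected log-price before a critical time tc to synthetic data. This is an illustration only: the parameter values, the data, and the fitting setup are assumptions for the sketch, not taken from the paper.

```python
import numpy as np
from scipy.optimize import curve_fit

def lppl(t, tc, m, omega, A, B, C, phi):
    """Log-periodic power law for the expected log-price before critical time tc."""
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

# Synthetic "bubble" log-price with noise (illustrative parameters).
rng = np.random.default_rng(0)
t = np.linspace(0.0, 9.5, 500)                  # observation window; tc = 10 lies ahead
true = (10.0, 0.5, 8.0, 1.0, -0.3, 0.05, 1.0)   # tc, m, omega, A, B, C, phi
y = lppl(t, *true) + 0.005 * rng.standard_normal(t.size)

# Fit with bounds that keep tc beyond the observed window and 0 < m < 1.
p0 = (10.2, 0.6, 7.0, 1.0, -0.2, 0.1, 0.5)
lo = (9.6, 0.1, 2.0, -np.inf, -np.inf, -np.inf, -np.pi)
hi = (12.0, 0.9, 20.0, np.inf, np.inf, np.inf, np.pi)
popt, _ = curve_fit(lppl, t, y, p0=p0, bounds=(lo, hi))
print(f"estimated critical time tc ≈ {popt[0]:.2f} (true value 10.0)")
```

In real applications the fit is notoriously delicate (many local minima), which is why practitioners typically use multiple restarts and filtering conditions on the fitted parameters.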


... Detection of outliers is in itself a challenging task, regardless of the distribution, and requires various statistical tests [4][5][6]. One such type of outlier, where the data deviate strongly upward from the above-mentioned straight line, is dubbed "Dragon Kings" and is also encountered across a multitude of disciplines and phenomena [7][8][9][10]. In financial markets, power-law tails can lead to what could theoretically be arbitrarily large gains or losses ("Black Swans" in the latter case), while Dragon Kings could indicate even far more distressing events. ...
... While there are some indications of outlier behavior in financial markets [7,[11][12][13][14], the main goal of this paper is to investigate the realized volatility (a market quantity commonly used by market practitioners and economists alike) for the presence of any potential outliers. This is motivated by the intuition that the most catastrophic market downturns and spikes in volatility (Black Monday, the Tech Bubble, the Financial Crisis and the COVID pandemic) would be plausible candidates for Dragon Kings. ...
... If the largest RV values fall on the straight line, they can be classified as Black Swans (BS). If, however, they show statistically significant deviations upward or downward from this straight line, they can be classified as Dragon Kings (DK) [7,8] or negative Dragon Kings (nDK), respectively [5]. ...
Article
Full-text available
In this study, we undertake a systematic study of historic market volatility spanning roughly five preceding decades. We focus specifically on the time series of the realized volatility (RV) of the S&P500 index and its distribution function. As expected, the largest values of RV coincide with the largest economic upheavals of the period: Savings and Loan Crisis, Tech Bubble, Financial Crisis and Covid Pandemic. We address the question of whether these values belong to one of the three categories: Black Swans (BS), that is, they lie on scale-free, power-law tails of the distribution; Dragon Kings (DK), defined as statistically significant upward deviations from BS; or Negative Dragon Kings (nDK), defined as statistically significant downward deviations from BS. In analyzing the tails of the distribution with RV>40, we observe the appearance of “potential” DK, which eventually terminate in an abrupt plunge to nDK. This phenomenon becomes more pronounced with the increase in the number of days over which the average RV is calculated—here from daily, n=1, to “monthly”, n=21. We fit the entire distribution with a modified Generalized Beta (mGB) distribution function, which terminates at a finite value of the variable but exhibits a long power-law stretch prior to that, as well as a Generalized Beta Prime (GB2) distribution function, which has a power-law tail. We also fit the tails directly with a straight line on a log-log scale. In order to ascertain BS, DK or nDK behavior, all fits include their confidence intervals and p-values are evaluated for the data points to check whether they can come from the respective distributions.
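The tail diagnostic described above can be sketched in miniature: build the empirical CCDF, fit the tail with a straight line on log-log axes, and flag points sitting far above the fit as Dragon-King candidates. The data below are synthetic, and the two-sigma rule is a toy stand-in for the study's confidence-interval and p-value machinery.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.pareto(2.5, 20_000) + 1.0          # synthetic power-law-tailed sample
x[-3:] = [60.0, 75.0, 90.0]                # plant a few suspiciously large events

# Empirical CCDF: P(X > x) at each sorted sample point.
xs = np.sort(x)
ccdf = 1.0 - np.arange(1, xs.size + 1) / xs.size

# Straight-line fit of the tail on log-log axes (drop the last point, CCDF = 0).
tail = (xs > 10.0) & (ccdf > 0)
slope, intercept = np.polyfit(np.log(xs[tail]), np.log(ccdf[tail]), 1)
pred = np.exp(intercept + slope * np.log(xs[tail]))

# Points far above the fitted line are Dragon-King candidates; far below,
# negative Dragon Kings; on the line, Black Swans.
resid = np.log(ccdf[tail]) - np.log(pred)
sigma = resid.std()
for xi, r in zip(xs[tail][resid > 2 * sigma], resid[resid > 2 * sigma]):
    print(f"DK candidate: x = {xi:.1f}, {r / sigma:.1f} sigma above the tail fit")
print(f"fitted tail exponent ≈ {-slope:.2f}")
```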
... Numerical simulations reveal that this coupled BTW-Kuramoto model dynamics leads to a long-time oscillatory behavior dominated by a desirable phase where oscillators are fully synchronized and the total load builds up, while cascades are mostly avoided. However, ultimately the system reaches a tipping point where a self-amplifying mechanism kicks in and, in contrast to the BTW model, a large cascade triggers an even larger cascade, leading to a "cascade of cascades," which can be classified as a Dragon King event [68,69]. This causes a system-wide discharge of load, reminiscent of running sandpiles [70]. ...
... Dragon King (DK) events are massive events which occur significantly more often than what extrapolation from smaller events would suggest. Often, Dragon Kings are related to tipping points, phase transitions, or bifurcations and are generated by a self-amplifying, nonlinear, endogenous mechanism that is different from the mechanism behind smaller events [68,69]. DKs are often easy to predict, and they can be contrasted with black swans, which are tail events from a power-law distribution, such as in SOC. ...
... Recall, DK events are the massive events which occur significantly more often than what extrapolation from smaller events would suggest. Often, Dragon Kings are related to tipping points, phase transitions, or bifurcations and are generated by a self-amplifying, nonlinear, endogenous mechanism that is different from the mechanism behind smaller events [68,69]. DKs are also expected to be much more predictable than black swans [68,69]. ...
Article
Full-text available
Cascading failures abound in complex systems and the Bak–Tang–Wiesenfeld (BTW) sandpile model provides a theoretical underpinning for their analysis. Yet, it does not account for the possibility of nodes having oscillatory dynamics, such as in power grids and brain networks. Here, we consider a network of Kuramoto oscillators upon which the BTW model is unfolding, enabling us to study how the feedback between the oscillatory and cascading dynamics can lead to new emergent behaviors. We assume that the more out-of-sync a node is with its neighbors, the more vulnerable it is, and we lower its load-carrying capacity accordingly. Also, when a node topples and sheds load, its oscillatory phase is reset at random. This leads to novel cyclic behavior at an emergent, long timescale. The system spends the bulk of its time in a synchronized state where load builds up with minimal cascades. Yet, eventually, the system reaches a tipping point where a large cascade triggers a “cascade of larger cascades,” which can be classified as a dragon king event. The system then undergoes a short transient back to the synchronous, buildup phase. The coupling between capacity and synchronization gives rise to endogenous cascade seeds in addition to the standard exogenous ones, and we show their respective roles. We establish the phenomena from numerical studies and develop the accompanying mean-field theory to locate the tipping point, calculate the load in the system, determine the frequency of the long-time oscillations, and find the distribution of cascade sizes during the buildup phase.
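Below is a loose Python sketch of a BTW-style sandpile coupled to Kuramoto phases on a 2D lattice, in the spirit of the abstract above. The capacity law (a baseline of 4 plus a synchrony bonus), the parameters, and the driving are simplifying assumptions for illustration; the authors' exact rules differ.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K, dt = 20, 2.0, 0.1                    # lattice size, coupling, Euler step
theta = rng.uniform(0, 2 * np.pi, (N, N))  # oscillator phases
omega = rng.normal(0.0, 0.1, (N, N))       # natural frequencies
z = np.zeros((N, N), dtype=int)            # sandpile load

def nb_sum(f):
    """Sum over the four lattice neighbors (open boundaries dissipate)."""
    s = np.zeros_like(f)
    s[1:, :] += f[:-1, :]; s[:-1, :] += f[1:, :]
    s[:, 1:] += f[:, :-1]; s[:, :-1] += f[:, 1:]
    return s

n_nb = nb_sum(np.ones((N, N)))             # 2, 3 or 4 neighbors per site

def capacities(theta):
    # Local synchrony in [0, 1]; in-sync sites hold more load (capacity 4..6),
    # so out-of-sync sites are the vulnerable ones (assumed rule).
    S, C = nb_sum(np.sin(theta)), nb_sum(np.cos(theta))
    return 4 + np.rint(2 * np.hypot(S, C) / n_nb).astype(int)

cascades = []
for step in range(20_000):
    # Kuramoto drift toward lattice neighbors, then slow driving by one grain.
    S, C = nb_sum(np.sin(theta)), nb_sum(np.cos(theta))
    theta += dt * (omega + (K / 4) * (np.cos(theta) * S - np.sin(theta) * C))
    z[rng.integers(N), rng.integers(N)] += 1

    cap, size = capacities(theta), 0
    while True:                            # relaxation: topple until stable
        unstable = z >= cap
        k = int(unstable.sum())
        if k == 0:
            break
        size += k
        z[unstable] -= 4                   # shed one grain to each neighbor
        z += nb_sum(unstable.astype(z.dtype))
        theta[unstable] = rng.uniform(0, 2 * np.pi, k)  # toppling resets phase
        cap = capacities(theta)            # resets desynchronize, weakening sites
    if size:
        cascades.append(size)

print(f"{len(cascades)} cascades, largest size = {max(cascades)}")
```

Histogramming `cascades` exposes the avalanche statistics; the self-amplifying "cascade of cascades" would show up as rare events far beyond the power-law bulk.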
... For those who rely solely on gauged records fit by a probability function from which extrapolations are made to estimate unrecorded extremes, it is becoming increasingly clear that the best function for a wide range of phenomena, for both social and natural systems, is the power law (Sornette, 2009), including for flood peaks (Malamud & Turcotte, 2006). The theoretical basis of this function has been established by Thurner et al. (2018) as an expression of a driven critical phenomenon (not a self-organized critical phenomenon, for which the input is constant but the outputs are power law distributed; Turcotte, 1999), with statistical behaviour as if it were at a phase transition or critical point (Thurner et al., 2018). ...
... Power laws indicate that extremes are not exceptional events because of their fat-tailed properties; that is, power laws, unlike thin-tailed distributions, provide estimates of extremes that are larger than those of other distributions and have an equal number of events in each size category, because they are scale-free, with significant implications for risk assessments. The existence of power laws also suggests that extremes and smaller events are produced by the same processes (Sornette, 2009), implying that large events are a result of amplification of the causes of smaller events. ...
... Such surprises are called Black Swans by Taleb (2007). Sornette (2009) goes further to introduce the idea of the "Dragon King": events that are even more extreme than those lying in the tail of a power law and that may be considered outliers and therefore, unwisely, disregarded as errors of some kind. Dragon Kings point to mechanisms of generation that differ from those in a power law and can coexist with power laws. ...
Chapter
Full-text available
The sociologist of risk, Ulrich Beck (1992), reached the following profoundly important conclusion (as translated by Power, 1997): “…a paradigm shift in the understanding of risk is needed: a shift from the problem of knowing risk to the problem of the risks inherent in ways of knowing.”
... In this work, tweets are classified as being related to an urban area and/or the earthquake by performing successive automated look-ups of the terms included in the lists from Tables 3.3. It should be noted that, as discussed later in this thesis, extremely populous urban areas, sometimes referred to as dragon kings [Sornette, 2009], can be considered as statistical outliers due to their special socioeconomic role in the urban dynamics of a country. Hence, they could be affecting the underlying scaling models followed by the rest of urban areas under consideration [Arcaute et al., 2015]. ...
... Indeed, as Figure 5 suggests, if Greater London were taken out of the urban system corresponding to the United Kingdom, the result of any analysis would be somewhat incomplete. In order to specify what is meant by extremely populous cities, the statistical concept of dragon king [Sornette, 2009] is introduced here. A 'dragon king' is defined as an event that, due to its large size, is statistically and mechanistically considered to be an outlier of the underlying heavy-tailed probability distribution followed by the rest of events. ...
... Sornette, who introduced the concept of dragon king, verifies once again that the empirical distribution of the population size of the urban areas in a country is, in general, compatible with Zipf's law [Sornette, 2009]. The lognormal or the Pareto distributions can then be regarded as very general models for the distribution of urban population sizes, but caution should be exercised when interpreting particular deviations from these models. ...
Thesis
Full-text available
The world is undergoing a rapid urbanisation process such that the majority of people now live in urban areas. In this context, it is crucial to understand the behaviour that emerges in cities as a result of complex interactions between environmental, social, economic and political factors. To improve our knowledge, different techniques are used in this thesis in order to quantitatively model how one city compares with another. Owing to the present-day ease of access to information, most of the results in the following pages have been obtained via assessment of real-world data, made available by different public organisations. Urban scaling is used as the main modelling framework. This approach concerns the relationship between the population size of an urban area and some other urban characteristic. The work is applied to two specific topics of interest. Firstly, the amount of coverage given by the media to Mexican urban areas, before and after the 2017 Puebla earthquake, which affected several regions in Mexico. Secondly, the number of road traffic accidents per person in urban areas from several European countries for different degrees of accident severity or different definitions for the urban areas. The thesis also contains methodological contributions regarding the problem of accounting for urban areas with extremely large population in urban scaling models. Finally, this work explores the impact of the findings presented here to support the creation of new policies involving urban areas.
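A minimal rank-size (Zipf) check of the kind used to motivate the dragon-king treatment of primate cities: fit the rank-size relation on all but the top-ranked city and ask how far that city sits above the fit. The populations below are hypothetical stand-ins, with the top entry playing a Greater-London-like role.

```python
import numpy as np

# Hypothetical national city populations (thousands); all numbers are invented.
pop = np.array([9800, 1150, 1090, 980, 720, 650, 560, 540, 480, 430,
                410, 390, 350, 320, 300, 280, 260, 255, 240, 230])
rank = np.arange(1, pop.size + 1)

# Zipf's law predicts pop ~ C / rank, i.e. a slope of about -1 on log-log axes.
# Fit on ranks 2..n, then compare the rank-1 city to the fitted line.
slope, intercept = np.polyfit(np.log(rank[1:]), np.log(pop[1:]), 1)
predicted_top = np.exp(intercept)          # fitted size at rank 1 (log rank = 0)
print(f"Zipf slope ≈ {slope:.2f}")
print(f"rank-1 city: observed {pop[0]}, rank-size fit predicts ≈ {predicted_top:.0f}")
```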
Preprint
The BTW sandpile model of cascading dynamics forms a cornerstone for our understanding of failures in systems ranging from avalanches and forest fires to power grids and brain networks. The latter two are examples of oscillator networks, yet the BTW model does not account for this. Here we establish that the interplay between the oscillatory and sandpile dynamics can lead to emergent new behaviors by considering the BTW sandpile model on a network of Kuramoto oscillators. Inspired by high-level objectives in the power grids, we aim to leverage this interaction to maximize synchronization, maximize load, and minimize large cascades. We assume that the more out-of-sync a node is with its neighbors, the more vulnerable it is to failure so that a node's capacity is a function of its local level of synchronization. And when a node topples, its phase is reset at random. This leads to a novel long-time oscillatory behavior at an emergent timescale. The bulk of an oscillatory cycle is spent in a build-up phase where oscillators are fully synchronized, and cascades are largely avoided while the overall load in the system increases. Then the system reaches a tipping point where, in contrast to the BTW model, a large cascade triggers an even larger cascade, leading to a "cascade of cascades," which can be classified as a Dragon King event, after which the system has a short transient dynamic that restores full synchrony. This coupling between capacity and synchronization gives rise to endogenous cascade seeds in addition to exogenous ones, and we show their respective roles in the Dragon King events. We establish the phenomena from numerical studies and develop the accompanying mean-field theory to locate the tipping point, calculate the amount of load in the system, determine the frequency of the emergent long-time oscillations and find the distribution of cascade sizes during the build-up phase.
... DKs are fluctuations so extreme that they exceed those described by a power law distribution, i.e., they are of a different nature. DK fluctuations differ from the rest of the events because they are produced by a cumulative feedback mechanism that may leave predictable traces [15], thereby creating a potential predictability scenario. Further studies of virtual stick balancing bolstered this possibility [16]. ...
... Further studies of virtual stick balancing bolstered this possibility [16]. Thus, a rationale seems to arise to implement Machine Learning (ML) to forecast the wild escaping process in HSB, and, in principle, it is not different from the case of financial fluctuations, where DKs are associated with crashes [15] and ML techniques are used [17][18][19]. ...
... If the largest RV values fall on the straight line, they can be classified as Black Swans (BS). If, however, they show statistically significant deviations upward or downward from this straight line, they can be classified as Dragon Kings (DK) [11,12] or negative Dragon Kings (nDK), respectively [13]. ...
... We fit the CCDF of the full RV distribution, for the entire time span discussed in Sec. 2, using mGB (7) and GB2 (11). The fits are shown on a log-log scale in Figs. 4-13, together with the linear fit (LF) of the tails with RV > 40. ...
Preprint
Full-text available
We undertake a systematic study of historic market volatility spanning roughly five preceding decades. We focus specifically on the time series of realized volatility (RV) of the S&P500 index and its distribution function. As expected, the largest values of RV coincide with the largest economic upheavals of the period: Savings and Loan Crisis, Tech Bubble, Financial Crisis and Covid Pandemic. We address the question of whether these values belong to one of the three categories: Black Swans (BS), that is, they lie on scale-free, power-law tails of the distribution; Dragon Kings (DK), defined as statistically significant upward deviations from BS; or Negative Dragon Kings (nDK), defined as statistically significant downward deviations from BS. In analyzing the tails of the distribution with RV > 40, we observe the appearance of "potential" DK which eventually terminate in an abrupt plunge to nDK. This phenomenon becomes more pronounced with the increase in the number of days over which the average RV is calculated -- here from daily, n=1, to "monthly," n=21. We fit the entire distribution with a modified Generalized Beta (mGB) distribution function, which terminates at a finite value of the variable but exhibits a long power-law stretch prior to that, as well as a Generalized Beta Prime (GB2) distribution function, which has a power-law tail. We also fit the tails directly with a straight line on a log-log scale. In order to ascertain BS, DK or nDK behavior, all fits include their confidence intervals and p-values are evaluated for the data points to check whether they can come from the respective distributions.
... The main purpose of such generalization is to obtain a distribution which has long power-law tails above the first scale, only to be eventually terminated, via a drop-off to zero, at the second scale. The need for such a distribution was motivated by our looking into the possibility of Dragon Kings (DK) [23,24] in the realized volatility (RV) [25]. In particular, it is of great interest to glean whether the most calamitous events in the stock market (Savings and Loan Crisis, Tech Bubble, Great Recession and Covid Pandemic) follow the Black Swan (BS) behavior and stay on the power-law tails, or deviate strongly higher and thus are DK. ...
... • We identified an SDE (35), which produces a modified Generalized Beta distribution as its steady-state solution, whose PDF and CDF are given, respectively, by (36) and (37). • A similar yet simpler modified Generalized Beta distribution (30)-(31) was constructed via a change of variable (4) from a modified Beta distribution (26)-(27), which is a steady-state distribution of the α = 1 SDE, (23). • We showed that an expanded version of (23), namely (21), allows for a physically appealing explanation of the Generalized Beta hierarchy (24), as well as of the natural link between (Generalized) Beta 1 and (Generalized) Beta 2 from (25). ...
Article
Full-text available
We approach the Generalized Beta (GB) family of distributions using a mean-reverting stochastic differential equation (SDE) for a power of the variable, whose steady-state (stationary) probability density function (PDF) is a modified GB (mGB) distribution. The SDE approach allows for a lucid explanation of the Generalized Beta Prime (GB2) and Generalized Beta (GB1) limits of the GB distribution and, further down, of the Generalized Inverse Gamma (GIGa) and Generalized Gamma (GGa) limits, as well as a description of the transition between the latter two. We provide an alternative form to the “traditional” GB PDF to underscore that a great deal of the usefulness of the GB distribution lies in its allowing a long-range power-law behavior to be ultimately terminated at a finite value. We derive the cumulative distribution function (CDF) of the “traditional” GB, which belongs to the family generated by the regularized beta function and is crucial for analysis of the tails of the distribution. We analyze fifty years of historical data on realized market volatility, specifically for the S&P500, as a case study of the use of GB/mGB distributions and show that its behavior is consistent with that of negative Dragon Kings.
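For orientation, a small self-contained check of the plain GB2 density and its power-law tail (standard McDonald parameterization; parameter values are illustrative, not fitted to any data). The paper's mGB additionally terminates this tail at a finite scale, which is not reproduced here.

```python
import numpy as np
from scipy.special import beta as beta_fn

def gb2_pdf(x, a, b, p, q):
    """GB2 (Generalized Beta Prime) density, standard parameterization;
    its tail decays as a power law, f(x) ~ x**-(a*q + 1)."""
    return (a * x**(a * p - 1)) / (b**(a * p) * beta_fn(p, q)
                                   * (1 + (x / b)**a)**(p + q))

a, b, p, q = 2.0, 1.0, 1.5, 1.25           # illustrative parameters
x = np.logspace(1, 3, 5)                    # probe the deep tail
# Local log-log slope of the pdf should approach -(a*q + 1) = -3.5.
slopes = np.diff(np.log(gb2_pdf(x, a, b, p, q))) / np.diff(np.log(x))
print("local tail slopes:", np.round(slopes, 3), "-> target", -(a * q + 1))
```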
... Hydrofractures reaching the surface of the model domain transport the entire fluid volume with few but large events (Fig. 10). These large events can be called 'dragon kings' (Sornette, 2009), which are outliers coexisting with power-law distributions, but taking on very high values far beyond those of the power-law distribution. They are usually associated with the development of a tipping point or a bifurcation (Sornette, 2009). ...
... These large events can be called 'dragon kings' (Sornette, 2009), which are outliers coexisting with power-law distributions, but taking on very high values far beyond those of the power-law distribution. They are usually associated with the development of a tipping point or a bifurcation (Sornette, 2009). They are linked to the small hydrofractures that self-organize the fluid-pressure distribution, trigger avalanches and therefore initiate large fluid-escape events (de Riese et al., 2020). ...
Article
Full-text available
Hydrofractures, or hydraulic fractures, are fractures where a significantly elevated fluid pressure played a role in their formation. Natural hydrofractures are abundant in rocks and are often preserved as magmatic dykes or sills, and mineral-filled fractures or mineral veins. However, we focus on the formation and evolution of non-igneous hydrofractures. Here we review the basic theory of the role of fluid pressure in rock failure, showing that both Terzaghi’s and Biot’s theories can be reconciled if the appropriate boundary conditions are considered. We next discuss the propagation of hydrofractures after initial failure, where networks of hydrofractures may form or hydrofractures may ascend through the crust as mobile hydrofractures. As fractures can form as a result of both tectonic stresses and an elevated fluid pressure, we address the question of how to ascertain whether a fracture is a hydrofracture. We argue that extensional or dilational fractures that formed below c. 2–3 km depth are, under normal circumstances, hydrofractures, but at shallower depths they may, but need not, be hydrofractures. Since veins and breccias are often the products of hydrofractures that are left in the geological record, we discuss these and critically assess which vein structures can, and which do not necessarily, indicate hydrofracturing. Hydrofracturing can suddenly and locally change the permeability in a rock by providing new fluid pathways. This can lead to highly dynamic self-organization of crustal-scale fluid flow.
... For example, the heavy-tailed distributions of natural hazards indicate that very large (i.e. several orders of magnitude greater than average) disasters are bound to happen, with catastrophic consequences [31,32]. In human society, power-law tails of wealth and income distributions suggest a very large disparity and inequity [33,34]. ...
Article
Full-text available
We revisit the sandpile model and examine the effect of introducing site-dependent thresholds that increase over time based on the generated avalanche size. This is inspired by the simplest means of introducing stability into a self-organized system: the locations of collapse are repaired and reinforced. Statistically, for the case of finite driving times, we observe that the site-dependent reinforcements decrease the occurrence of very large avalanches, leading to an effective global stabilization. Interestingly, however, long simulation runs indicate that the system will persist in a state of self-organized criticality (SOC), recovering the power-law distributions with a different exponent than the original sandpile. These results suggest that tipping the heavy-tailed power laws into more equitable and normal statistics may require unrealistic scales of intervention for real-world systems, and that, in the long run, SOC mechanisms still emerge. This may help explain the robustness of power-law statistics for many complex systems.
... This fact is as well known as it is consciously ignored. Even the differences and similarities in the behavior of small and large earthquakes are still not definitively understood (Pacheco et al., 1992; Sornette, 2009). ...
Article
Full-text available
Plain Language Summary Large earthquakes can be preceded by a wide range of different seismic anomalies. Among these, seismicity has been reported to increase both in magnitude and frequency, but, on the other hand, it can also undergo a short period of reduced intensity before major events. The first pattern corresponds to the occurrence of foreshocks, that is, small to moderate quakes forewarning an upcoming larger one, while the second behavior is called seismic quiescence. In our research, we focus on foreshock activity. We perform an analysis of seismicity in Southern California, for which a rich relocated earthquake catalog is available. While several studies have examined in retrospect what happens before large earthquakes, and there are also some works on foreshock discrimination, a systematic analysis comparing the properties of clusters of “swarms” (seismicity without a major event) and “foreshocks” (seismicity preceding a mainshock) has been missing. Are foreshocks different from swarms before the occurrence of the main event? Are foreshocks fore‐shocks? Our results suggest that foreshocks can hardly be distinguished from swarms until the largest event takes place. On the basis of this analysis and theoretical modeling, we think that foreshocks have limited reliability, if considered alone, for short-term forecasts.
... The devastating nature of DKs is usually alleviated by their predictability, allowing preparation and planning [14,15,29,43]. The analytic treatment developed in this Letter could be useful for predicting and controlling DKs. ...
Article
Full-text available
The spontaneous emergence of scale invariance, called self-organized criticality (SOC), is often attributed to a second-order absorbing-state phase transition (ASPT). Many real-world systems display SOC, yet with extreme events overrepresented, called dragon kings (DKs) and causing significant disruption. We show analytically that the trade-off between driving impulse and dissipation rate can create DKs in a second-order ASPT. This establishes that DKs exist in SOC systems, reveals a taxonomy of DKs, and shows that larger dissipation and smoother driving lower the risk of extreme events.
... In our previous research (Westphal & Sornette, 2020), we investigated the impact of a "dragon-rider", i.e. a third class of investors in addition to fundamentalists and noise traders, who exploit their ability to diagnose financial bubbles from the endogenous price history to determine optimal entry and exit trading times. The name "dragon-rider" is based on the empirical observation that crashes that follow bubbles are exceptional events, outliers of strong significance (Johansen & Sornette, 2002), which are named "dragon-kings" to emphasise their special status and specific amplifying mechanisms (Sornette, 2009; Sornette & Ouillon, 2012) (for a pedagogical introduction, see https://en.wikipedia. ...
Article
Full-text available
Using a previously validated agent-based model with fundamentalists and chartists, we investigate the usefulness and impact of direct market intervention. The policy maker diagnoses bubbles by forming an expectation of future returns, then invests in burgeoning bubbles to develop a sufficient inventory of the risky asset in order to be able to sell adequate amounts of the overpriced asset later, countercyclically, to fight market exuberance. Preventing bubbles and crashes, this market intervention improves all analysed market return metrics (volatility, skewness, kurtosis and VaR) without affecting long-term growth. This increases the Sharpe ratios of noise traders and of fundamentalists by approximately 28% and 45%, respectively. The results are robust even for substantially miscalibrated long-term expected returns.
... However, the pre- and post-TMI distributions become indistinguishable for the very large CCDP/ΔCDP values. This aligns with the often-reported observation that an excessive focus on controlling small events in complex technical systems can backfire, either by increasing the risk of extreme events (Sornette and Ouillon, 2012; Sornette, 2009) or by failing to modify them while contributing to complacency. ...
Article
The opposition in some countries to including nuclear power in future sustainable energy portfolios—in part due to “nuclear dread”—often has a limited quantitative scientific foundation regarding the real benefits and risks. This has been amplified by the lack of sound estimates of operational risk due to the scarcity of the relevant empirical data. In order to address this gap, we use the largest open database on accident precursors, along with our in-house generic probabilistic safety assessment models, to conduct a comprehensive statistical study of operational risks in the civil nuclear sector. We find that the distribution of precursor severities follows a Pareto distribution, and we observe a runaway Dragon Kings regime for the most significant events. Based on our findings, we determine that exogenous factors account for 95% of the risk associated with nuclear power. By addressing these factors in new reactor designs, we estimate that the frequency of accidents at the Fukushima Daiichi level can be reduced to about one every 300 years for the global fleet. Finally, our study highlights the importance of and need for international cooperation focused on constructing comprehensive blockchains of accident precursors.
... In human dynamics, herd behavior ("herding and mob" mentality) occurs. That is, panic during a stampede (crowds, stock markets, etc.) will cause more panic, which will make the cattle run more and more, and so forth [2,4]. ...
... It is even believed that crashes form different statistical populations with extreme properties (also called "dragon-kings"). They may be predictable to some degree [30]. ...
Article
Full-text available
The general framework for quantitative technical analysis of market prices is revisited and extended. The concept of a global time-translation invariance and its spontaneous violation and restoration is introduced and discussed. We find that different temporal patterns leading to some famous crashes (e.g., bubbles, hockey sticks, etc.) exhibit analogous probabilistic distributions found only in the time series for the stock market indices. A number of examples of crashes are presented. We stress that our goal here is to study the crash as a particular phenomenon created by spontaneous time-translation symmetry breaking/restoration. We ask only “how to calculate and interpret the probabilistic pattern which we encounter in the days preceding a crash, and how to calculate the typical market reactions to a shock?”
... Dragon kings [16]: meaningful outliers that are found to coexist with power laws in the distributions of event sizes under a broad range of conditions in a wide variety of systems. ...
Article
Full-text available
Background: Unexpected events or major supply chain disruptions have demonstrated the vulnerability in which supply chains operate. While supply chains are usually prepared for operational disruptions, unexpected or black swan events are widely disregarded, as there is no reliable way to forecast them. However, this kind of event can rapidly and seriously deteriorate supply chain performance, and ignoring that possibility could lead to devastating consequences. Methods: In this paper, definitions of major disruptions and the methods to cope with them are studied. Additionally, a methodology to develop supply chain resilience roadmaps is conceptualised by analysing the existing literature to help plan for unexpected events. Results: The methodology introduced to create roadmaps comprises several stages, including supply chain exploration, scenario planning, system analysis, definition of strategies, and signal monitoring. Each roadmap contains the description of a plausible future in terms of supply chain disruptions and the strategies to implement to help mitigate negative impacts. Conclusions: The creation of roadmaps calls for an anticipatory mindset from all members along the supply chain. The roadmap development establishes the foundations for holistic supply chain disruption preparation and analysis.
... This can therefore be generated by trade, financial intra- and international currency valuation and devaluation theory and mechanics, or political/diplomatic or military dominance vis-a-vis Core countries, for a Dragon-king or Black-swan breakout model [7] from a given socioeconomic macrodynamics of automata sector-landscapes, with a power-law tail exponent formula P(x) = Cμ/x^(1+μ) [Sornette] [8], via the Gumbel law distribution, G(x) = exp(−exp(−x)), x ≥ 0, because of macroeconomic entropy factors such as higher interest rates and moribundity due to slowing growth rates, proposed due to an exponential rise and decay. ...
... Feed recommendation regimes that allow preferential attachment to already popular content and users can lead to disproportionately dominant agents [6]. These superstars (also known as "dragon-kings" [7,8]) of platforms are often able to hold onto influence (and subsequent prominence in feeds) even despite major fluctuations in rank within platforms [9]. To mitigate this rich-get-richer regime, feeds need to focus on promoting low-popularity content as well, but this is not sufficient: you can't just show people low-quality content, or users stop engaging. ...
Preprint
Full-text available
The physical environment you navigate strongly determines which communities and people matter most to individuals. These effects drive both personal access to opportunities and the social capital of communities, and can often be observed in the personal mobility traces of individuals. Traditional social media feeds underutilize these mobility-based features, or do so in a privacy exploitative manner. Here we propose a consent-first private information sharing paradigm for driving social feeds from users' personal private data, specifically using mobility traces. This approach designs the feed to explicitly optimize for integrating the user into the local community and for social capital building through leveraging mobility trace overlaps as a proxy for existing or potential real-world social connections, creating proportionality between whom a user sees in their feed, and whom the user is likely to see in person. These claims are validated against existing social-mobility data, and a reference implementation of the proposed algorithm is built for demonstration. In total, this work presents a novel technique for designing feeds that represent real offline social connections through private set intersections requiring no third party, or public data exposure.
... "Worst cases", likelihoods, and frequencies are based on historic or perceived data but are not treated holistically. Neither is the recurrence nor the changing structure of extremes (e.g. the impact of heat on drought persistence) fully understood (Sornette, 2009). Secondly, this approach limits the ability to understand, address and flexibly respond to disruptions and their underlying vulnerabilities, including the identification of points at which they could escalate into cascading dynamics (Pescaroli and Alexander, 2016). ...
Article
Full-text available
Purpose This paper applies the theory of cascading, interconnected and compound risk to the practice of preparing for, managing, and responding to threats and hazards. Our goal is to propose a consistent approach for managing major risk in urban systems by bringing together emergency management, organisational resilience, and climate change adaptation. Design/methodology/approach We develop a theory-building process using an example from the work of the Greater London Authority in the United Kingdom. First, we explore how emergency management approaches systemic risk, including examples from exercises, contingency plans and responses to complex incidents. Secondly, we analyse how systemic risk is integrated into strategies and practices of climate change adaptation. Thirdly, we consider organisational resilience as a cross-cutting element between the approaches. Findings London has long been a champion of resilience strategies for dealing with systemic risk. However, this paper highlights the potential for better integrating the understanding of common points of failure in society and organisations, especially where they relate to interconnected domains and where they are driven by climate change. Originality/value The paper suggests shifting toward the concept of operational continuity to address systemic risk and gaps between Emergency Management, Organisational Resilience and Climate Change Adaptation.
... The perceived low probability of occurrence and cost-benefit criteria can be the reasons behind this. However, these low-probability, high-severity events, known as "Black Swans", are not that rare in the real world (Sornette, 2009). This can be due to the inadequacies pointed out in Section 2.2, such as probative blindness. ...
Article
Airport operators need to use several domain-specific risk management systems to comply with regulations and protect their business interests. These risk management systems are simple and focused but handicapped by their deterministic view of hazards and risks. They cannot capture a hazard’s behaviour in a complex socio-technical system such as an airport. They also suffer from probative blindness during risk assessment, accord prominence to high-severity and high-frequency risks, and tend to neglect low-frequency but high-severity risks, known as “Black Swans”, as improbable. Moreover, they view risks from the regulators’ perspectives rather than from the airport operators’, and often discount business continuity risks. To overcome these limitations, this work proposed an integrated risk management method that can help an airport operator manage complex risks at the organisational level. It uses subject matter experts’ brainstorming to identify compound hazards and associated risks. It prioritises risks based on the existing mitigations and their effectiveness. The results section demonstrates this method’s usefulness by applying it to day-to-day scenarios in an airport. In conclusion, this proposed method can help an airport operator identify complex risks and manage them effectively. It is a simple-to-use, cost-effective, and user-friendly method. Airport operators can implement it to manage risks with a holistic view, with no significant changes to their existing organisational structure. Since the method has a generalised process, any organisation can use it, not just airports. The method’s usefulness depends on the availability of qualified subject matter experts. A field trial with an airport could help this method gain acceptability with operators and regulators.
... However, there are very rare cases in which the values deviate and exceed the base distribution in the large-value area. These unique events are termed "dragons" based on the terminology introduced by D. Sornette [18]. ...
Article
Full-text available
Precipitation extrema over the Barents Sea and the neighbouring locations in Europe were analysed using data obtained from station observations and a highly detailed ERA5 re-analysis dataset. These data did not always spatially coincide (on average, coincidence was ~50%). Daily amounts of precipitation were typically higher in the observation data, although the reverse picture also occurs. The analysis revealed that at several stations and in many of the ERA5 grids, the set of precipitation extremes exists as a mixture of two different subsets. The cumulative distribution functions (CDF) of the largest population, in both the re-analysis and observational data, are well described by Pareto’s law. However, very rare cases exist in which the values deviate from and exceed this base distribution in the large-value region. These super-large anomalies do not obey the statistical law common to all other extremes. However, this does not mean that the extremes can be arbitrarily large: they do not exceed the marginal values that are typical for this type of climate and season. The analysis confirms that extreme precipitation in the western sector of the Arctic is caused by the penetration of moist air masses from the Atlantic in the circulation systems of intense cyclones. At certain times, mesoscale convective systems are embedded in atmospheric fronts and can significantly contribute to the formation of precipitation. Intensification of such cyclones with global warming should lead to a transformation of the typical CDF, as modern outliers become regular components of the Pareto law. This change in the statistics of extreme events reflects the nonstationarity of the climate state. The influence of polar lows on the formation of large daily precipitation amounts is not felt.
... Figure 1. Predictability based on interaction and diversity in a system (Sornette, 2009). ... Education: When it came to education, the COVID-19 pandemic resulted in challenges that administrators, parents, students, and teachers were not prepared to meet. ...
Article
When JPCS published “A Complexity Context to Classroom Interactions and Climate Impact on Achievement” in 2017, the article was a cutting-edge application of ABM to classroom dynamics. Five years later, though, there have been dramatic changes to education as a result of the COVID-19 pandemic. While the technology of ABM has advanced sufficiently that reexamining the topic may be justified, the trauma caused by the pandemic should make us question whether any such model would accurately reflect the real world. Given the isolating nature of COVID-19 and online learning, the purpose of this article is to remind us that in a classroom environment, “every interaction matters.” Effective action steps can easily be taken to dramatically strengthen interactions and thus strengthen learning networks, which will lead to higher levels of achievement. This can be done by means of simple strategies that increase positive climate behavioral markers in the classroom: using student names, checking in with students, smiling, using polite language, laughing, and clapping. In contrast, negative behavioral markers such as anger, sarcasm, irritability, a harsh voice, yelling, exclusion of students, bad language, physical control of students, teasing, and bullying must be eradicated.
Article
One of the main insights of prospect theory is that investment decisions are often irrational, following certain trends, and this is particularly true for individual investment decisions. The theory’s main proponents and its developers have described the phenomenon of risk seeking over losses and risk aversion over gains, mainly by looking at stock market trends. One of the hypotheses of our paper is that investors become risk averse in times of crisis. The other hypothesis is that the hummingbird effect can be detected in stock market trading. Using linear regression, we have been able to show that investors become more risk averse in times of crisis, which can also be seen in stock market trading, through the shift in the index. In addition, we have also been able to show that the hummingbird effect can also be detected in stock market trading.
Article
Full-text available
Humanity faces a myriad of existential technological, geopolitical, and ecological risks. The paper analyzes the possibility that negative shocks superimpose and multiply their effects, leading to catastrophic macro-dynamics. Methodologically, this paper presents a rare, quantitative scenario model superimposed upon narrative scenarios, where the cascading economic effects of 19 quantitative indicators of growth or decline are projected into 2075. These indicators map onto five narrative scenarios and are subsequently re-combined to study the effects of plausible cascading risk events coming to pass in the 50-year period between 2025 and 2075. Results indicate that even in the case of deeply catastrophic singular events, the world would eventually recover within 25 years, as has historically been the case. The exception is that, in the event of several catastrophic events in short order around the midpoint of the 50-year scenario timeline, the cascading risk escalation would create formidable negative cascades. The result is the possibility of a protracted depression with no visible recovery within 25 years. However, if we assume a modest interaction effect, even with just 3-5 co-occurring catastrophes, the result becomes a path towards humanity’s extinction based on economic decline alone. The implications are that humanity either needs to avoid significant cascading effects at all costs or needs to identify novel ways to recover compared to previous depressions. Given the number of model assumptions they rely upon, these projections carry a degree of uncertainty. Further study should therefore be conducted with a larger set of indicators and impacts, including mortality modeling, to assess how much worse plausible real-world outcomes might be compared to the simplified economic model deployed here.
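A toy version of the compounding arithmetic behind such scenario models: baseline growth, a set of shock magnitudes, and a multiplicative interaction factor applied when several shocks co-occur. All numbers are invented for illustration and do not reflect the paper's 19 indicators or its calibration.

```python
import numpy as np

years = np.arange(2025, 2076)
growth = np.full(years.size, 0.025)            # 2.5% baseline annual growth
shocks = {2050: [-0.08, -0.06, -0.05]}         # three co-occurring catastrophes
interaction = 1.5                               # modest amplification when shocks overlap

gdp = np.empty(years.size)
gdp[0] = 100.0                                  # index level at the start
for k in range(1, years.size):
    g = growth[k]
    if years[k] in shocks:
        g += interaction * sum(shocks[years[k]])  # superimposed, amplified shocks
    gdp[k] = gdp[k - 1] * (1 + g)

print(f"GDP index in 2075: {gdp[-1]:.0f} (no-shock path: {100 * 1.025**50:.0f})")
```

Raising `interaction` or adding further shock years shows how quickly superimposed shocks can flip a recovery path into a protracted decline.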
Article
Aiming to assess the progress and current challenges on the formidable problem of the prediction of solar energetic events since the COSPAR / International Living With a Star (ILWS) Roadmap paper of Schrijver et al. (2015), we attempt an overview of the current status of global research efforts. By solar energetic events we refer to flares, coronal mass ejections (CMEs), and solar energetic particle (SEP) events. The emphasis, therefore, is on the prediction methods of solar flares and eruptions, as well as their associated SEP manifestations. This work complements the COSPAR International Space Weather Action Teams (ISWAT) review paper on the understanding of solar eruptions by Linton et al. (2023) (hereafter, ISWAT review papers are conventionally referred to as 'Cluster' papers, given the ISWAT structure). Understanding solar flares and eruptions as instabilities occurring above the nominal background of solar activity is a core solar physics problem. We show that effectively predicting them stands on two pillars: physics and statistics. With statistical methods appearing at an increasing pace over the last 40 years, the last two decades have brought the critical realization that data science needs to be involved as well, as volumes of diverse ground- and space-based data give rise to a Big Data landscape that cannot be handled, let alone processed, with conventional statistics. Dimensionality reduction in immense parameter spaces, with the dual aim of both interpreting and forecasting solar energetic events, has brought artificial intelligence (AI) methodologies, in variants of machine and deep learning, developed particularly for tackling Big Data problems. With interdisciplinarity firmly present, we outline an envisioned framework in which statistical and AI methodologies should be verified in terms of performance and validated against each other. We emphasize that a homogenized and streamlined (i.e., readily performed) method validation is another open challenge. The performance of the plethora of methods is typically far from perfect, with physical reasons to blame, besides practical shortcomings: imperfect data, data gaps, and a lack of multiple, and meaningful, vantage points of solar observations. We briefly discuss these issues, too, as they shape our desired short- and long-term objectives for an efficient future predictive capability. A central aim of this article is to trigger meaningful, targeted discussions that will compel the community to adopt standards for performance verification and validation, which could be maintained and enriched by institutions such as NASA’s Community Coordinated Modeling Center (CCMC) and the community-driven COSPAR/ISWAT initiative.
Article
Extreme events are ubiquitous in nature and social society, including natural disasters, accident disasters, crises in public health (such as Ebola and the COVID-19 pandemic), and social security incidents (wars, conflicts, and social unrest). These extreme events will heavily impact financial markets and lead to the appearance of extreme fluctuations in financial time series. Such extreme events lack statistics and are thus hard to predict. Recurrence interval analysis provides a feasible solution for risk assessment and forecasting. This Element aims to provide a systemic description of the techniques and research framework of recurrence interval analysis of financial time series. The authors also provide perspectives on future topics in this direction.
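The basic bookkeeping of recurrence interval analysis is simple enough to sketch: pick a threshold, record the times between exceedances, and use the interval statistics for risk estimates. The series below is synthetic i.i.d. Student-t noise; real financial series add clustering and memory, which is where the method earns its keep.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic heavy-tailed "returns" standing in for a financial time series.
r = rng.standard_t(df=3, size=100_000)

q = np.quantile(np.abs(r), 0.99)           # extreme-event threshold (top 1%)
events = np.flatnonzero(np.abs(r) > q)     # times of threshold exceedances
tau = np.diff(events)                      # recurrence intervals

print(f"mean recurrence interval ≈ {tau.mean():.1f} (≈ 100 for a 1% threshold)")
# Empirical probability that the next extreme arrives within t steps.
for t in (10, 50, 100):
    print(f"P(tau <= {t}) ≈ {(tau <= t).mean():.2f}")
```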
Article
Large-amplitude rolling motions, also regarded as extreme oscillations, are a great threat to marine navigation, which may lead to capsizing in ship motion. Therefore, it is important to quantify extreme oscillations, assess reliability of ship systems, and establish a suitable indicator to characterize extreme oscillations in ship systems. In this work, extreme events are investigated in a ship model considering a complex ocean environment, described by a single-degree-of-freedom nonlinear system with stochastic harmonic excitation and colored Gaussian noise. The stationary probability density function (PDF) of the system is derived through a probabilistic decomposition-synthesis method. Based on this, we infer the classical damage rate of the system. Furthermore, a new indicator, independent of the PDF, is proposed to quantify the damage related only to the fourth-order moment of the system and the threshold for extreme events. It is more universal and easier to determine as compared with the classical damage rate. A large damping ratio, a large noise intensity, or a short correlation time can reduce the damage rate and the value of the indicator. These findings provide new insights and theoretical guidance to avoid extreme oscillations and assess the reliability of practical ship movements.
Book
The key issue of the book is the uncertainty associated with the operation and maintenance of homo-centric complex systems, known as Cyber-Physical-Social Systems. They are subject to disturbances and disruptions from a variable environment which are difficult to predict due to the high level of structural, spatial and temporal complexity. In such cases, we are dealing with so-called deep uncertainty, for which statistical methods are not applicable. However, Cognitive Systems Engineering, built on the concept of imperfect knowledge, allows the use of unconventional sources of information, such as expert intuition and other not fully conscious cognitive processes. On this basis, a new transdisciplinary concept of Cognitive Dependability Engineering was proposed as a foundation for effectively, efficiently, and safely managing the operation and maintenance of Cyber-Physical-Social Systems under conditions of deep uncertainty. Of practical benefit to readers may be a framework for solving complex problems in Cyber-Physical-Social Systems operating under deep uncertainty. It is based on the synergistic cooperation of a human expert and their cognitive digital twin, the precautionary principle, and the extended resilience matrix. The ways of using this framework are illustrated by three examples based on real disasters that have happened around the world in recent decades.
Chapter
Disaster response planning has progressed substantially; however, it remains frustratingly bogged down because some major impacts were not anticipated. The traditional disaster response method of predicting risks in advance and executing plans in an emergency has progressed substantially. However, recent catastrophic events, such as the 2011 East Japan earthquake and the ongoing COVID-19 pandemic, imply that modern society is becoming more vulnerable to disaster due to increasing complexity, interconnectedness, and uncertainty, which often result in unexpected or emerging risks. To respond well to these unexpected events, this study explored the emergence of systemic disaster risk resulting from the risk environment and its change through intersection with response measures. As a result, the authors proposed a multi-dimensional systemic response framework, consisting of a risk management flow, a strategic response map, and multi-dimensional emergency planning, which can help countries respond effectively to the systemic disaster risk that threatens the safety of modern society. The response framework adopts notions of risk types from a response perspective (anticipated risk, emerging risk, amplified risk, lingering risk, and mitigated risk) and applies the risk management flow to address various risks in an emergency.
Chapter
Resilience is a word that is gaining increasing currency in the field of strategic management (Cascio, Foreign Policy 172:82–95, 2009) although not without some criticism (Rose, Environmental Hazards: Human and Policy Dimensions 6:1–16, 2007). Use of the word is evolving from its classical etymology and narrow engineering definition as bounce-back to the status quo ante. In life sciences resilience is taken to be evolutionary in nature. This understanding accords with the reality of living in dynamic networks, where our ‘bounded rationality’ (Simon, H.A. [1956] 1982. Reply: Surrogates for uncertain decision problems. In Models of bounded rationality, vol. 1: Economic analysis and public policy. Cambridge, MA: The MIT Press.) is increasingly dangerous to ignore. On these terms resilience is a realist concept for enabling bodies to bounce forward, innovating appropriately through learning from a past overtaken by events and exploration of the uncertainties ahead.
Article
Bitcoin (BTC) and Ethereum (ETH), pioneering public blockchain implementations, are two fundamental levers to register and transfer digital value. This article studies the structure of their daily price volatility time series following a multifaceted approach: first, it examines the existence of chaoticity and fractality in the time series. The results confirm that the BTC and ETH price volatility series present signs of chaoticity, persistent long-term correlation, and multifractality. Second, it analyses the corresponding visibility graphs associated with these time series using complex network theory. The undirected, connected complex networks spawned by their natural visibility graphs (VGs) and horizontal visibility graphs (HVGs) present a hierarchical structure. These networks, especially the HVGs, confirm the fractality of the originating time series. The study of HVGs also confirms a lack of uncorrelated randomness in the originating BTC and ETH price series. This paper validates the value of visibility graphs as useful proxies to better understand complex time series, in this case related to public blockchain implementations.
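As an illustration (not the paper's code), the horizontal visibility graph has a one-line rule: two observations are linked when every value between them is strictly lower than both. A minimal Python sketch using networkx, applied to a synthetic stand-in series:

```python
import numpy as np
import networkx as nx

def horizontal_visibility_graph(x):
    """Build the HVG of a series: nodes i and j (i < j) are linked when
    every intermediate value is strictly below min(x[i], x[j])."""
    g = nx.Graph()
    g.add_nodes_from(range(len(x)))
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            if all(x[k] < min(x[i], x[j]) for k in range(i + 1, j)):
                g.add_edge(i, j)
            if x[j] >= x[i]:
                break  # a bar at least as tall blocks i's view further right
    return g

series = np.abs(np.random.default_rng(1).normal(size=200))  # proxy for |volatility|
g = horizontal_visibility_graph(series)
print(g.number_of_edges(), nx.degree_histogram(g)[:6])
```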
Article
Full-text available
Purpose: This study examines the nature and direction of the relationship between stock market movements, particularly market declines, and market liquidity in 14 selected developed and emerging economies (G8+5 and Pakistan) from January 2001 through December 2017, applying the Autoregressive Distributed Lag (ARDL) bounds test and the Granger-causality test. Trading value and turnover ratio are employed to measure market liquidity. Methodology: The study covers a sample of 14 economies (the G8, 5 emerging economies, and Pakistan) for January 2001 through December 2017. Daily data for all variables are collected from data stream and the Economic Indicator website. Market liquidity is measured by trading value and turnover ratio. Findings: The trading-value Granger-causality tests show no causality in Germany and India. Bi-directional causality exists only in Pakistan. Uni-directional causality from trading value to market return holds only in Russia, at the 10% significance level. From market return to trading value, however, the results show uni-directional causality at the 5% significance level for Brazil, Japan, Canada, China, France, Italy, the UK, the USA, South Africa, and Mexico. Negative returns are used to represent the notion of market decline. Implications: The study summarizes the stock market movements of emerging and developed countries, which will be helpful for future researchers and policy makers.
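For readers unfamiliar with the bivariate test used above, statsmodels provides it directly; the snippet below is a self-contained toy run on synthetic series (the study's actual data are not reproduced), testing whether returns Granger-cause trading value:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.stattools import grangercausalitytests

# Synthetic stand-ins: trading value depends on the previous day's return.
rng = np.random.default_rng(2)
ret = rng.normal(0, 1, 500)
val = 0.4 * np.concatenate(([0.0], ret[:-1])) + rng.normal(0, 1, 500)

data = pd.DataFrame({"trading_value": val, "market_return": ret})
# H0: the SECOND column does not Granger-cause the FIRST column.
res = grangercausalitytests(data[["trading_value", "market_return"]], maxlag=4)
print("lag-1 ssr F-test p-value:", res[1][0]["ssr_ftest"][1])  # small -> reject H0
```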
Article
Despite over twenty years of political resistance, globalisation endures, both as a discourse and as a political project. While many works have established how discourses of globalisation serve to constrain and/or guide macroeconomic policy, the dynamic nexus between microeconomic decision-making processes and macroeconomic conditions is a less explored matter. Epistemologically, this paper offers a means of gaining analytical purchase over the logics, motivations, and thought processes of international bankers, and of the financial press that chronicles their actions. This is accomplished via a rigorous discursive-content analysis that gauges how these agents understood globalisation, the role of finance, the state, and even how market sentiment was factored into their ontological worldview. Ultimately, the goal is to establish a historical and spatiotemporally heterogeneous analysis of whether, how, and which ideas of globalisation and neoliberalism influenced and rationalized the financialization of Argentine banks during the 1990s, and how these evolved en route to Argentina's 2001 collapse.
Chapter
We propose a Systematic Risk Indicator derived from a polymodel estimation. Polymodels allow us to measure the strength of the links that a stock market maintains with its economic environment. We show that these links tend to be more extreme before a market crisis, confirming the well-known increase of correlations while proposing a more subtle perspective. A fully automated and successful trading strategy is implemented to assess the interest of the signal, which is shown to be strongly significant, both from an economic and a statistical point of view. Results are robust across different time periods, for various sets of explanatory variables, and among 12 different stock markets. Keywords: Polymodel theory, Systematic risk, Financial crisis, Speculative bubble, Stock market crash, Trading strategy, LNLM modeling, Market timing, Drawdown prediction
Article
Full-text available
We trace the evolution of research on extreme solar and solar-terrestrial events from the 1859 Carrington event to the rapid development of the last twenty years. Our focus is on the largest observed/inferred/theoretical cases of sunspot groups, flares on the Sun and Sun-like stars, coronal mass ejections, solar proton events, and geomagnetic storms. The reviewed studies are based on modern observations, historical or long-term data including the auroral and cosmogenic radionuclide record, and Kepler observations of Sun-like stars. We compile a table of 100- and 1000-year events based on occurrence frequency distributions for the space weather phenomena listed above. Questions considered include the Sun-like nature of superflare stars and the existence of impactful but unpredictable solar "black swans" and extreme "dragon king" solar phenomena that can involve different physics from that operating in events which are merely large.
Article
What are black swans? And why are mainstream economists so often surprised by financial crises? Furthermore, have supply-side responses to crises merely deepened economic inequality and instability? Rather than simply comparing theories and methods, economics' 'ontological turn' posits that what is foremost required is an audit of prevailing theories' metaphysical commitments, that is, an interrogation of what economics presumes there is to observe, interact with, and know. Seeking to garner greater analytical purchase on the 'real world', the ontological turn has also birthed several multidisciplinary syntheses, namely Austrian-materialism (Lewis, P., 2005. Boettke, the Austrian school and the reclamation of reality in modern economics. The Review of Austrian Economics, 18 (1), 83–108.) and Keynesian-critical realism (Lawson, T., 2003. Reorienting economics. New York: Routledge.). Alternatively, and in support of the constructivist claim that social dynamics ultimately drive economics, this paper resituates and extends Wesley Widmaier's (2004) vision of a Keynesian-constructivism within the broader ontological turn. It then interrogates the concept of black swans vis-à-vis post-Keynesian understandings of business cycles, demand-side logics, and Modern Money, en route to establishing new analytical frameworks.
Article
Full-text available
Recognition of anomalous events is a challenging but critical task in many scientific and industrial fields, especially when the properties of anomalies are unknown. In this paper, we introduce a new anomaly concept called the "unicorn", or unique event, and present a new, model-free, unsupervised detection algorithm. Its key component is the Temporal Outlier Factor (TOF), which measures the uniqueness of events in continuous data sets from dynamic systems. The concept of unique events differs significantly from traditional outliers in many aspects: while repeated outliers are no longer unique events, a unique event is not necessarily an outlier; it does not necessarily fall outside the distribution of normal activity. The performance of our algorithm was examined on different types of simulated data sets with anomalies and compared with the Local Outlier Factor (LOF) and discord discovery algorithms. TOF outperformed LOF and discord detection even in recognizing traditional outliers, and it also detected unique events that those algorithms did not. The benefits of the unicorn concept and the new detection method are illustrated with example data sets from very different scientific fields. Our algorithm successfully retrieved unique events where they were already known, such as the gravitational waves of a binary black hole merger in LIGO detector data and the signs of respiratory failure in an ECG data series. Furthermore, unique events were found in the LIBOR data set of the last 30 years.
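A heavily simplified sketch of the TOF idea (our reading, not the authors' implementation): after a time-delay embedding, a state whose nearest state-space neighbours all sit in its own temporal vicinity recurs nowhere else, so a low mean temporal distance to neighbours flags a unicorn. Helper names and parameter values below are illustrative:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def temporal_outlier_factor(x, dim=3, lag=1, k=5):
    """Simplified TOF sketch: time-delay embed the series, find each
    state's k nearest neighbours in state space, and score the state by
    the mean temporal distance to them. Unique ("unicorn") events have
    neighbours only in their own temporal vicinity, hence a LOW score."""
    n = len(x) - (dim - 1) * lag
    emb = np.column_stack([x[i * lag : i * lag + n] for i in range(dim)])
    _, idx = NearestNeighbors(n_neighbors=k + 1).fit(emb).kneighbors(emb)
    t = np.arange(n)
    return np.abs(idx[:, 1:] - t[:, None]).mean(axis=1)  # drop the self-match

rng = np.random.default_rng(7)
x = np.sin(np.linspace(0, 120, 2000)) + 0.05 * rng.normal(size=2000)
x[1000:1020] += np.linspace(0, 1, 20)  # inject a one-off transient
tof = temporal_outlier_factor(x)
print(tof[1000:1020].mean(), np.median(tof))  # transient scores well below the median
```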
Article
Full-text available
This paper uses multivariate Hawkes processes to model the transactions behavior of the US stock market as measured by the 30 Dow Jones Industrial Average individual stocks before, during and after the 36-min May 6, 2010, Flash Crash. The basis for our analysis is the excitation matrix, which describes a complex network of interactions among the stocks. Using high-frequency transactions data, we find strong evidence of self- and asymmetrically cross-induced contagion and the presence of fragmented trading venues. Our findings have implications for stock trading and corresponding risk management strategies as well as stock market microstructure design.
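The multivariate estimation above is beyond a short example, but the self-excitation mechanism it rests on is easy to demonstrate. Below is a minimal univariate Hawkes simulation with an exponential kernel via Ogata's thinning algorithm (a sketch under assumed parameter values, not the paper's fitted excitation matrix):

```python
import numpy as np

def simulate_hawkes(mu, alpha, beta, horizon, seed=0):
    """Univariate Hawkes process with exponential kernel,
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)),
    simulated by Ogata's thinning method. A one-dimensional stand-in
    for the paper's 30-stock multivariate estimation."""
    rng = np.random.default_rng(seed)
    events, t = [], 0.0
    while True:
        past = np.array(events)
        lam_bar = mu + alpha * np.exp(-beta * (t - past)).sum()  # valid bound: lambda decays until the next event
        t += rng.exponential(1.0 / lam_bar)
        if t >= horizon:
            return np.array(events)
        lam_t = mu + alpha * np.exp(-beta * (t - past)).sum()
        if rng.uniform() < lam_t / lam_bar:
            events.append(t)  # accepted: each event raises future intensity

ts = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=200.0)
print(f"{len(ts)} events, mean rate {len(ts)/200.0:.2f} vs mu/(1-alpha/beta) = {0.5/(1-0.8/1.2):.2f}")
```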
Chapter
The basic elements needed for defining and finding bubbles and crashes are now in place and this chapter collates, extends, and summarizes the concepts that have earlier been developed. The result is a practical empirical method that enables extreme events to be statistically defined, detected, and tested by applying the elasticity of variance approach to real-world data. Given the fractal, scaled nature of financial markets, all bubbles and crashes are comprised of what can be called microbubbles and microcrashes. Predictability and forecasting aspects are then covered as extensions.
Article
Full-text available
Palaeoseismological data for the Wasatch and San Andreas fault zones have led to the formulation of the characteristic earthquake model, which postulates that individual faults and fault segments tend to generate essentially same-size, or characteristic, earthquakes with a relatively narrow range of magnitudes near the maximum. Analysis of scarp-derived colluvium in trench exposures across the Wasatch fault provides estimates of the timing and displacement associated with individual surface-faulting earthquakes. The characteristic earthquake appears to be a fundamental aspect of the behavior of the Wasatch and San Andreas faults and may apply to many other faults as well.
Article
Full-text available
Paleoearthquake and fault slip-rate data are combined with the CIT-USGS catalog for the period 1944 to 1992 to examine the shape of the magnitude-frequency distribution along the major strike-slip faults of southern California. The resulting distributions for the Newport-Inglewood, Elsinore, Garlock, and San Andreas faults accord with the characteristic earthquake model of fault behavior. The distribution observed along the San Jacinto fault satisfies the Gutenberg-Richter relationship. The Gutenberg-Richter distribution observed for the entirety of the San Jacinto may reflect the sum of seismicity along a number of distinct fault segments, each of which displays a characteristic earthquake distribution.
Article
Full-text available
We investigate a 1D dynamical version of the Burridge-Knopoff model for earthquakes with a velocity-weakening friction law. Such a system exhibits two types of solution: chaotic motion and solitary-wave propagation. The latter can be seen as propagating localized macrodislocations whose shape, size, and velocity exhibit very small fluctuations. This property leads to "resonances" in the mean friction force as a function of the control parameter Θ, defined as the product of the driving displacement rate and the size of the system. The number of solitary waves is proportional to the parameter Θ. Such solitary-wave solutions can be related to recent self-healing crack models for the dynamics of earthquake rupture.
Book
Full-text available
One of the broadly accepted universal laws of complex systems, particularly relevant in social sciences and economics, is that proposed by Zipf (1949). Zipf's law usually refers to the fact that the probability P(s) = Pr{S > s} that the value S of some stochastic variable, usually a size or frequency, is greater than s decays with the growth of s as P(s) ∼ s⁻¹. This in turn means that the probability density function p(s) exhibits the power-law dependence p(s) ∼ s⁻².
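The final step, left implicit above, is just differentiation of the survival function (assuming a smooth distribution):

```latex
P(s) = \Pr\{S > s\} \sim s^{-1}
\quad\Longrightarrow\quad
p(s) = -\frac{\mathrm{d}P(s)}{\mathrm{d}s} \sim s^{-2}.
```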
Article
Full-text available
To account quantitatively for many reported "natural" fat-tail distributions in nature and economy, we propose the stretched exponential family as a complement to the often-used power law distributions. It has many advantages, among which is economy: only two adjustable parameters with clear physical interpretation. Furthermore, it derives from a simple and generic mechanism in terms of multiplicative processes. We show that stretched exponentials describe very well the distributions of radio and light emissions from galaxies, of US GOM OCS oilfield reserve sizes, of world, US and French agglomeration sizes, of country population sizes, of daily Forex US-Mark and Franc-Mark price variations, of Vostok (near the south pole) temperature variations over the last 400 000 years, of the Raup-Sepkoski kill curve, and of citations of the most cited physicists in the world. We also discuss its potential for the distribution of earthquake sizes and fault displacements. We suggest physical interpretations of the parameters and provide a short toolkit of the statistical properties of the stretched exponentials. We also provide a comparison with other distributions, such as the shifted linear fractal, the log-normal, and the recently introduced parabolic fractal distributions.
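For reference, the two-parameter stretched-exponential family is usually written as a complementary cumulative distribution, with c the stretching exponent and s_0 the characteristic scale; the density follows by differentiation:

```latex
P(s) = \Pr\{S > s\} = \exp\!\bigl[-(s/s_0)^{c}\bigr],
\qquad
p(s) = \frac{c}{s_0}\Bigl(\frac{s}{s_0}\Bigr)^{c-1}\exp\!\bigl[-(s/s_0)^{c}\bigr],
```

where 0 < c < 1 gives tails fatter than an exponential but thinner than any power law.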
Article
Full-text available
We present an analysis of the economic, political and social factors that underlay the Apollo program, one of the most exceptional and costly projects ever undertaken by the United States in peacetime that culminated in 1969 with the first human steps on the Moon. This study suggests that the Apollo program provides a vivid illustration of a societal bubble, defined as a collective over-enthusiasm as well as unreasonable investments and efforts, derived through excessive public and/or political expectations of positive outcomes associated with a general reduction of risk aversion. We show that economic, political and social factors weaved a network of reinforcing feedbacks that led to widespread over-enthusiasm and extraordinary commitment by those involved in the project as well as by politicians and by the public at large. We propose the general concept of “pro-bubbles”, according to which bubbles are an unavoidable development in technological and social enterprise that benefits society by allowing exceptional niches of innovation to be explored.
Article
Full-text available
We report an empirical determination of the probability density functions P_data(r) (and its cumulative version) of the number r of earthquakes in finite space-time windows for the California catalog, over fixed spatial boxes 5×5 km², 20×20 km², and 50×50 km² and time intervals τ = 10, 100, and 1000 days. The data can be represented by asymptotic power law tails together with several cross-overs. These observations are explained by a simple stochastic branching process previously studied by many authors, the ETAS (epidemic-type aftershock sequence) model, which assumes that each earthquake can trigger other earthquakes ("aftershocks"). An aftershock sequence results in this model from the cascade of aftershocks of each past earthquake. We develop the full theory in terms of generating functions for describing the space-time organization of earthquake sequences and develop several approximations to solve the equations. The calibration of the theory to the empirical observations shows that it is essential to augment the ETAS model by taking account of the pre-existing frozen heterogeneity of spontaneous earthquake sources. This seems natural in view of the complex multi-scale nature of fault networks, on which earthquakes nucleate. Our extended theory is able to account for the empirical observations, but some discrepancies, especially for the shorter time windows, point to limits of both our theoretical approach and the ETAS model.
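A minimal temporal ETAS cascade shows the branching construction directly (our sketch: magnitudes and the paper's generating-function machinery are omitted, and parameter values are illustrative):

```python
import numpy as np

def simulate_etas(mu=0.2, n=0.8, c=0.01, theta=0.2, horizon=1000.0, seed=3):
    """Minimal temporal ETAS cascade: background events arrive as a
    Poisson process of rate mu, and each event independently triggers
    Poisson(n) direct aftershocks with delays drawn from the normalized
    Omori law phi(t) = theta * c**theta / (c + t)**(1 + theta).
    Subcritical for branching ratio n < 1."""
    rng = np.random.default_rng(seed)
    bg = rng.uniform(0, horizon, rng.poisson(mu * horizon))
    queue, events = list(bg), list(bg)
    while queue:
        parent = queue.pop()
        for _ in range(rng.poisson(n)):
            delay = c * (rng.uniform() ** (-1.0 / theta) - 1.0)  # inverse-CDF sample
            child = parent + delay
            if child < horizon:
                queue.append(child)
                events.append(child)
    return np.sort(np.array(events))

ts = simulate_etas()
# Upper bound mu*T/(1-n) = 1000, ignoring offspring lost past the horizon.
print(len(ts), "events")
```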
Article
Full-text available
The probability distribution of stock price changes is studied by analyzing a database (the Trades and Quotes Database) documenting every trade for all stocks in three major US stock markets, for the two-year period January 1994 - December 1995. A sample of 40 million data points is extracted, which is substantially larger than studied hitherto. We find an asymptotic power-law behavior for the cumulative distribution with an exponent α ≈ 3, well outside the Lévy regime (0 < α < 2).
Article
Full-text available
At what level should governments or companies support research? This complex, multi-faceted question encompasses such qualitative benefits as satisfying natural human curiosity, the quest for knowledge, and the impact on education and culture, but one of its most scrutinized components reduces to the assessment of economic performance and wealth creation derived from research. In certain areas such as biotechnology, semiconductor physics, and optical communications, the impact of basic research is direct, while in other disciplines the path from discovery to applications is full of surprises. As a consequence, there are persistent uncertainties in the quantification of the exact economic returns of public expenditure on basic research. Here, we suggest that these uncertainties have a fundamental origin in the interplay between the intrinsic "fat tail" power law nature of the distribution of economic returns, characterized by a mathematically diverging variance, and the stochastic character of discovery rates. In the regime where the cumulative economic wealth derived from research is expected to exhibit a long-term positive trend, we show that strong fluctuations blur out the short time scales significantly: a few major unpredictable innovations may provide a finite fraction of the total creation of wealth. In such a scenario, any attempt to assess the economic impact of research over a finite time horizon encompassing only a small number of major discoveries is bound to be highly unreliable. New tools, developed in the theory of self-similar and complex systems to tackle similar extreme fluctuations in nature, can be adapted to measure the economic benefits of research, which are intimately associated with this large variability.
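The mechanism is easy to reproduce numerically: over many simulated horizons, each containing a modest number of discoveries whose returns follow a power law with diverging variance, the single largest discovery typically carries a sizeable share of the cumulative wealth, and totals fluctuate wildly from horizon to horizon. A short illustration (our parameter choices, not the paper's calibration):

```python
import numpy as np

# Discovery returns from a Pareto law with tail exponent alpha in (1, 2):
# finite mean but diverging variance, as assumed in the abstract above.
rng = np.random.default_rng(4)
alpha, n_discoveries, horizons = 1.5, 100, 2000

r = (1.0 - rng.uniform(size=(horizons, n_discoveries))) ** (-1.0 / alpha)  # inverse-CDF Pareto(x_min=1)
share_of_largest = r.max(axis=1) / r.sum(axis=1)
totals = r.sum(axis=1)
print(f"median share of the single largest discovery: {np.median(share_of_largest):.1%}")
print(f"total wealth across horizons, 5th-95th percentile: "
      f"{np.percentile(totals, 5):.0f} .. {np.percentile(totals, 95):.0f}")
```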
Article
Full-text available
Amid the current financial crisis, there has been one equity index beating all others: the Shanghai Composite. Our analysis of this main Chinese equity index shows clear signatures of a bubble build up and we go on to predict its most likely crash date: July 17-27, 2009 (20%/80% quantile confidence interval).
Article
Full-text available
By combining (i) the economic theory of rational expectation bubbles, (ii) behavioral finance on imitation and herding of investors and traders, and (iii) the mathematical and statistical physics of bifurcations and phase transitions, the log-periodic power law (LPPL) model has been developed as a flexible tool to detect bubbles. The LPPL model considers the faster-than-exponential (power law with finite-time singularity) increase in asset prices decorated by accelerating oscillations as the main diagnostic of bubbles. It embodies a positive feedback loop of higher return anticipations competing with negative feedback spirals of crash expectations. We use the LPPL model in one of its incarnations to analyze two bubbles and subsequent market crashes in two important indexes in the Chinese stock markets between May 2005 and July 2009. Both the Shanghai Stock Exchange Composite index (US ticker symbol SSEC) and the Shenzhen Stock Exchange Component index (SZSC) exhibited such behavior in two distinct time periods: (1) from mid-2005, bursting in October 2007, and (2) from November 2008, bursting in the beginning of August 2009. We successfully predicted time windows for both crashes in advance (Sornette, 2007; Bastiaensen et al., 2009) with the same methods used to successfully predict the peak in mid-2006 of the US housing bubble (Zhou and Sornette, 2006b) and the peak in July 2008 of the global oil bubble (Sornette et al., 2009). The more recent bubble in the Chinese indexes was detected, and its end or change of regime was predicted independently by two groups with similar results, showing that the model is well documented and can be replicated by industrial practitioners. Here we present a more detailed analysis of the individual Chinese index predictions and of the methods used to make and test them. We complement the detection of log-periodic behavior with Lomb spectral analysis of detrended residuals and the (H, q)-derivative of logarithmic indexes for both bubbles.
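For concreteness, one common parametrization of the LPPL log-price trajectory can be fitted with scipy; this is a sketch on synthetic data (the initial guess, bounds, and data are ours, and real LPPL fits are notoriously sensitive to these choices):

```python
import numpy as np
from scipy.optimize import curve_fit

def lppl(t, tc, m, omega, A, B, C, phi):
    """ln p(t) = A + B*(tc-t)**m + C*(tc-t)**m * cos(omega*ln(tc-t) - phi)"""
    dt = tc - t
    return A + B * dt**m + C * dt**m * np.cos(omega * np.log(dt) - phi)

# Synthetic bubble with critical time tc = 1.0 (stand-in for SSEC/SZSC data).
t = np.linspace(0.0, 0.95, 400)
y = lppl(t, 1.0, 0.4, 8.0, 2.0, -0.8, 0.05, 1.0)
y += np.random.default_rng(5).normal(0.0, 0.005, t.size)

p0     = (1.05, 0.5,  7.0,   2.0, -1.0,  0.0, 0.5)
bounds = ([0.96, 0.1,  2.0, -10.0, -5.0, -1.0, 0.0],
          [1.50, 0.9, 15.0,  10.0,  5.0,  1.0, 6.3])
popt, _ = curve_fit(lppl, t, y, p0=p0, bounds=bounds, maxfev=20000)
print(f"estimated critical time tc = {popt[0]:.3f}")  # typically lands near 1.0
```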
Article
Full-text available
We present a self-consistent model for explosive financial bubbles, which combines a mean-reverting volatility process and a stochastic conditional return which reflects nonlinear positive feedbacks and continuous updates of the investors' beliefs and sentiments. The conditional expected returns exhibit faster-than-exponential acceleration decorated by accelerating oscillations, called "log-periodic power law." Tests on residuals show a remarkably low rate (0.2%) of false positives when applied to a GARCH benchmark. When tested on the S&P500 US index from Jan. 3, 1950 to Nov. 21, 2008, the model correctly identifies the bubbles ending in Oct. 1987, in Oct. 1997, in Aug. 1998, and the ITC bubble ending in the first quarter of 2000. Different unit-root tests confirm the high relevance of the model specification. Our model also provides a diagnostic for the duration of bubbles: applied to the period before the Oct. 1987 crash, there is clear evidence that the bubble started at least 4 years earlier. We confirm the validity and universality of the volatility-confined LPPL model on seven other major bubbles that have occurred in the world in the last two decades. Using Bayesian inference, we find a very strong statistical preference for our model compared with a standard benchmark, in contradiction with Chang and Feigenbaum (2006), who used a unit-root model for residuals.
Article
Full-text available
We study a rational expectation model of bubbles and crashes. The model has two components: (1) our key assumption is that a crash may be caused by local self-reinforcing imitation between noise traders. If the tendency for noise traders to imitate their nearest neighbors increases up to a certain point called the "critical" point, all noise traders may place the same order (sell) at the same time, thus causing a crash. The interplay between the progressive strengthening of imitation and the ubiquity of noise is characterized by the hazard rate, i.e. the probability per unit time that the crash will happen in the next instant if it has not happened yet. (2) Since the crash is not a certain deterministic outcome of the bubble, it remains rational for traders to remain invested provided they are compensated by a higher rate of growth of the bubble for taking the risk of a crash. Our model distinguishes between the end of the bubble and the time of the crash: the rational expectation constraint has the specific implication that the date of the crash must be random. The theoretical death of the bubble is not the time of the crash, because the crash could happen at any time before, even though this is not very likely. The death of the bubble is the most probable time for the crash. There also exists a finite probability of attaining the end of the bubble without a crash. Our model has specific predictions about the presence of certain critical log-periodic patterns in pre-crash prices, associated with the deterministic components of the bubble mechanism. We provide empirical evidence showing that these patterns were indeed present before the crashes of 1929, 1962 and 1987 on Wall Street and the 1997 crash on the Hong Kong Stock Exchange. These results are compared with statistical tests on synthetic data.
Book
Continuous Gibrat's Law and Gabaix's Derivation of Zipf's Law.- Flow of Firm Creation.- Useful Properties of Realizations of the Geometric Brownian Motion.- Exit or "Death" of Firms.- Deviations from Gibrat's Law and Implications for Generalized Zipf's Laws.- Firm's Sudden Deaths.- Non-stationary Mean Birth Rate.- Properties of the Realization Dependent Distribution of Firm Sizes.- Future Directions and Conclusions.
Article
The scientific study of complex systems has transformed a wide range of disciplines in recent years, enabling researchers in both the natural and social sciences to model and predict phenomena as diverse as earthquakes, global warming, demographic patterns, financial crises, and the failure of materials. In this book, Didier Sornette boldly applies his varied experience in these areas to propose a simple, powerful, and general theory of how, why, and when stock markets crash. Most attempts to explain market failures seek to pinpoint triggering mechanisms that occur hours, days, or weeks before the collapse. Sornette proposes a radically different view: the underlying cause can be sought months and even years before the abrupt, catastrophic event in the build-up of cooperative speculation, which often translates into an accelerating rise of the market price, otherwise known as a "bubble." Anchoring his sophisticated, step-by-step analysis in leading-edge physical and statistical modeling techniques, he unearths remarkable insights and some predictions, among them that the "end of the growth era" will occur around 2050. Sornette probes major historical precedents, from the decades-long "tulip mania" in the Netherlands that wilted suddenly in 1637 to the South Sea Bubble that ended with the first huge market crash in England in 1720, to the Great Crash of October 1929 and Black Monday in 1987, to cite just a few. He concludes that most explanations other than cooperative self-organization fail to account for the subtle bubbles by which the markets lay the groundwork for catastrophe. Any investor or investment professional who seeks a genuine understanding of looming financial disasters should read this book. Physicists, geologists, biologists, economists, and others will welcome Why Stock Markets Crash as a highly original "scientific tale," as Sornette aptly puts it, of the exciting and sometimes fearsome, but no longer quite so unfathomable, world of stock markets.
Article
Dissipative dynamical systems with many degrees of freedom naturally evolve to a critical state with fluctuations extending over all length- and time-scales. It is suggested, and supported by simulations on simple toy-model systems, that turbulence, earthquakes, “1/f” noise, and economics may operate at the self-organized critical state.
Article
We analyze evolving stress and seismicity generated by three realizations of a discrete model of a strike-slip fault in a three-dimensional (3-D) elastic half space using five functions of stress and five functions of seismicity. The first model (F) has realistic dynamic weakening (static minus dynamic frictions), the second (FC) has zero critical dynamic weakening, and the third (SYS) is constrained to produce only system size events. The results for model F show cyclical development, saturation, and destruction of fluctuations and long-range correlations on the fault, punctuated by the system size events. The development stage involves evolution of stress and seismicity to distributions having broad ranges of scales, evolution of response functions toward scale-invariant behavior, increasing seismicity rate and event sizes, and increasing hypocenter diffusion. Most functions reach asymptotically stable values around 2/3 of the cycle and then fluctuate until one event cascades to become the next large earthquake. In model FC the above evolution is replaced by scale-invariant statistical fluctuations, while in model SYS the signals show simple cyclic behavior. The results suggest that large earthquake cycles on heterogeneous faults with realistic positive dynamic weakening are associated with intermittent criticality, produced by spontaneous evolution of stress heterogeneities toward a critical level of disorder having a broad range of scales. The stress evolution and development of large earthquake cycles may be tracked with seismicity functions.
Article
Statistical methods are used to test the characteristic earthquake hypothesis. Several distributions of earthquake size (seismic moment-frequency relations) are described. Drawing also on the results of other researchers, the apparent evidence for the characteristic earthquake hypothesis can be explained as either statistical bias or statistical artifact. Since other distributions of earthquake size provide a simpler explanation for the available information, the hypothesis cannot be regarded as proven.
Article
"If I were a brilliant scientist, I would be working on earthquake prediction." This is a statement from a Los Angeles radio talk show I heard just after the Northridge earthquake of January 17, 1994. Five weeks later, at a monthly meeting of the Southern California Earthquake Center (SCEC), where more than two hundred scientists and engineers gathered to exchange notes on the earthquake, a distinguished French geologist who works on earthquake faults in China envied me for working now in southern California. This place is like northeastern China 20 years ago, when high seismicity and research activities led to the successful prediction of the Haicheng earthquake of February 4, 1975 with magnitude 7.3. A difficult question still haunting us [Aki, 1989] is whether the Haicheng prediction was founded on the physical reality of precursory phenomena or on the wishful thinking of observers subjected to the political pressure which encouraged precursor reporting. It is, however, true that a successful life-saving prediction like the Haicheng prediction can only be carried out by the coordinated efforts of decision makers and physical scientists.
Article
Crack propagation in a graphite sheet is investigated with million-atom molecular-dynamics simulations based on Brenner's reactive empirical bond-order potential. For certain crystalline orientations, multiple crack branches with nearly equal spacing sprout as the crack tip reaches a critical speed of 0.6 V_R, where V_R is the Rayleigh wave speed. This results in a fracture surface with secondary branches and overhangs. Within the same branch the crack-front profile is characterized by a roughness exponent α = 0.41 ± 0.05. However, for interbranch fracture surface profiles the return probability yields α = 0.71 ± 0.10. Fracture toughness is estimated from Griffith analysis and local stress distributions.
Article
Can the time, location, and magnitude of future earthquakes be predicted reliably and accurately? In their Perspective, Geller et al.'s answer is "no." Citing recent results from the physics of nonlinear systems ("chaos theory"), they argue that any small earthquake has some chance of cascading into a large event. According to research cited by the authors, whether or not this happens depends on unmeasurably fine details of conditions in Earth's interior. Earthquakes are therefore inherently unpredictable. Geller et al. suggest that controversy over prediction lingers because prediction claims are not stated as objectively testable scientific hypotheses, and because of overly optimistic reports in the mass media.
Article
The concept of self-organized criticality was introduced to explain the behaviour of the sandpile model. In this model, particles are randomly dropped onto a square grid of boxes. When a box accumulates four particles they are redistributed to the four adjacent boxes or lost off the edge of the grid. Redistributions can lead to further instabilities with the possibility of more particles being lost from the grid, contributing to the size of each avalanche. These model 'avalanches' satisfied a power-law frequency-area distribution with a slope near unity. Other cellular-automata models, including the slider-block and forest-fire models, are also said to exhibit self-organized critical behaviour. It has been argued that earthquakes, landslides, forest fires, and species extinctions are examples of self-organized criticality in nature. In addition, wars and stock market crashes have been associated with this behaviour. The forest-fire model is particularly interesting in terms of its relation to the critical-point behaviour of the site-percolation model. In the basic forest-fire model, trees are randomly planted on a grid of points. Periodically in time, sparks are randomly dropped on the grid. If a spark drops on a tree, that tree and adjacent trees burn in a model fire. The fires are the 'avalanches' and they are found to satisfy power-law frequency-area distributions with slopes near unity. This forest-fire model is closely related to the site-percolation model, which exhibits critical behaviour. In the forest-fire model there is an inverse cascade of trees from small clusters to large clusters; trees are lost primarily from model fires that destroy the largest clusters. This quasi steady-state cascade gives a power-law frequency-area distribution for both clusters of trees and smaller fires. The site-percolation model is equivalent to the forest-fire model without fires. In this case there is a transient cascade of trees from small to large clusters and a power-law distribution is found only at a critical density of trees.
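The sandpile rules quoted above fit in a few lines of code; this sketch (grid size and drop count are our choices) collects avalanche sizes, whose histogram should display the near-unity power-law slope mentioned:

```python
import numpy as np

def sandpile_avalanches(L=30, drops=50_000, seed=6):
    """Bak-Tang-Wiesenfeld sandpile as described above: grains land on
    random boxes; a box holding 4 grains topples, sending one grain to
    each of its 4 neighbours (grains falling off the edge of the grid
    are lost). Returns the size (number of topplings) of every avalanche."""
    rng = np.random.default_rng(seed)
    grid = np.zeros((L, L), dtype=int)
    sizes = []
    for _ in range(drops):
        i, j = rng.integers(L), rng.integers(L)
        grid[i, j] += 1
        size, unstable = 0, [(i, j)]
        while unstable:
            a, b = unstable.pop()
            if grid[a, b] < 4:
                continue                       # stale entry, already relaxed
            grid[a, b] -= 4
            size += 1
            if grid[a, b] >= 4:                # can stay unstable after pile-ups
                unstable.append((a, b))
            for x, y in ((a - 1, b), (a + 1, b), (a, b - 1), (a, b + 1)):
                if 0 <= x < L and 0 <= y < L:  # off-grid grains are lost
                    grid[x, y] += 1
                    if grid[x, y] >= 4:
                        unstable.append((x, y))
        if size:
            sizes.append(size)
    return np.array(sizes)

sizes = sandpile_avalanches()
print(f"{sizes.size} avalanches, largest = {sizes.max()}")
```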
Article
Since the discovery of the renormalization group theory in statistical physics, the realm of applications of the concepts of scale invariance and criticality has pervaded several fields of natural and social sciences. This is the leitmotiv of Didier Sornette's book, who in Critical Phenomena in Natural Sciences reviews three decades of developments and applications of the concepts of criticality, scale invariance and power law behaviour from statistical physics, to earthquake prediction, ruptures, plate tectonics, modelling biological and economic systems and so on. This strongly interdisciplinary book addresses students and researchers in disciplines where concepts of criticality and scale invariance are appropriate: mainly geology from which most of the examples are taken, but also engineering, biology, medicine, economics, etc. A good preparation in quantitative science is assumed but the presentation of statistical physics principles, tools and models is self-contained, so that little background in this field is needed. The book is written in a simple informal style encouraging intuitive comprehension rather than stressing formal derivations. Together with the discussion of the main conceptual results of the discipline, great effort is devoted to providing applied scientists with the tools of data analysis and modelling necessary to analyse, understand, make predictions and simulate systems undergoing complex collective behaviour. The book starts from a purely descriptive approach, explaining basic probabilistic and geometrical tools to characterize power law behaviour and scale invariant sets. Probability theory is introduced by a detailed discussion of interpretative issues warning the reader on the use and misuse of probabilistic concepts when the emphasis is on prediction of low probability rare---and often catastrophic---events. Then, concepts that have proved useful in risk evaluation, extreme value statistics, large limit theorems for sums of independent variables with power law distribution, random walks, fractals and multifractal formalisms, etc, are discussed in an immediate and direct way so as to provide ready-to-use tools for analysing and representing power law behaviour in natural phenomena. The exposition then continues discussing the main developments, allowing the reader to understand theoretically and model strongly correlated behaviour. After a concise, but useful, introduction to the fundamentals of statistical physics a discussion of equilibrium critical phenomena and the renormalization group is proposed to the reader. With the centrality of the problem of non-equilibrium behaviour in mind, a discussion is devoted to tentative applications of the concept of temperature in the off-equilibrium context. Particular emphasis is given to the development of long range correlation and of precursors of phase transitions, and their role in the prediction of catastrophic events. Then, basic models such as percolation and rupture models are described. A central position in the book is occupied by a chapter on mechanisms for power laws and a subsequent one on self-organized criticality as a general paradigm for critical behaviour as proposed by P Bak and collaborators. The book concludes with a chapter on the prediction of fields generated by a random distribution of sources. The book maintains the promise of the title of providing concepts and tools to tackle criticality and self-organization. 
The second edition, while retaining the structure of the first edition, considerably extends the scope with new examples and applications of a research field which is constantly growing. Any scientific book has to solve the dichotomy between the depth of discussion, the pedagogical character of exposition and the quantity of material discussed. In general the book, which evolved from a graduate student course, favours these last two aspects at the expense of the first one. This makes the book very readable and means that, while complicated concepts are always explained by means of simple examples, important results are often mentioned but not derived or discussed in depth. Most of the time this style of exposition manages to successfully convey the essential information, other times unfortunately, e.g. in the case of the chapter on disordered systems, the presentation appears rather superficial. This is the price we pay for a book covering an impressively vast subject area and the huge bibliography (more than 1000 references) furnishes a necessary guide for acquiring the working knowledge of the subject covered. I would recommend it to teachers planning introductory courses on the field of complex systems and to researchers wanting to learn about an area of great contemporary interest.
Article
It has been suggested that distributed seismicity is an example of self-organized criticality. If this is the case, the Earth's crust in an active tectonic zone is in a near-critical state and faults can interact over large distances. Observed seismicity and earthquake statistics are consequences of the dynamical interaction of seismic faulting over a wide range of scales. We address this problem by considering a two-dimensional array of slider blocks with static/dynamic friction. The two-dimensional system is treated as a cellular automaton such that only one slider block is allowed to slip at a given time and interacts only with its nearest neighbours through connecting springs. Because of this treatment, the amount of slip for each failed block can be obtained analytically, and the system is deterministic with no stochastic inputs or spatial heterogeneities. In many cases, the slip of one block induces the slip of adjacent blocks. The size of an event is specified by the number of blocks that participate in the event. The number of small events with a specified size has a power-law dependence on the size, with a power close to -1.36. The distributions of normalized recurrence times for most events are close to a Poisson process and gradually deviate towards periodicity for large events. The recurrence time statistics are generally insensitive to parameter variations. Large events may occur at stress levels considerably lower than the failure strength of an individual block, and the stress drops associated with large events are generally small. This may provide an explanation for observed low stress levels in tectonically active areas.
Article
We present a strategy for estimating the recurrence times between large earthquakes and the associated seismic hazard on a given fault section. The goal of the analysis is to address two fundamental problems: (1) the lack of sufficient direct earthquake data, and (2) the existence of 'subgrid' processes that cannot be accounted for in any model. We deal with the first problem by using long simulations (some 10 000 yr) of a physically motivated 'coarse-grain' model that reproduces the main statistical properties of seismicity on individual faults. We address the second problem by adding stochasticity to the macroscopic model parameters. A small number N of observational earthquake times (2 ≤ N ≤ 10) can be used to determine the values of model parameters which are most representative for the fault. As an application of the method, we consider a model set-up that produces the characteristic earthquake distribution, and where the stress drops are associated with some uncertainty. Using several model realizations with different values of stress drops, we generate a set of corresponding synthetic earthquake catalogues. The recurrence time distributions in the simulated catalogues are fitted approximately by a gamma distribution. A superposition of appropriately scaled gamma distributions is then used to construct a distribution of recurrence intervals that incorporates the assumed uncertainty of the stress drops. Combining such synthetic data with observed recurrence times between the observational ∼M6 earthquakes on the Parkfield segment of the San Andreas fault allows us to constrain the distribution of recurrence intervals and to estimate the average stress drop of the events. Based on this procedure, we calculate for the Parkfield region the expected recurrence time distribution, the hazard function, and the mean waiting time to the next ∼M6 earthquake. Using five observational recurrence times from 1857 to 1966, the recurrence time distribution has a maximum at 22.2 yr and decays rapidly for longer intervals. The probability for the post-1966 large event to occur on or before 2004 September 28 is 94 per cent. The average stress drop of ∼M6 Parkfield earthquakes is in the range Δτ = (3.04 ± 0.27) MPa.
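The gamma fit and hazard function above are straightforward to reproduce with scipy. The intervals below are derived from the commonly quoted Parkfield ~M6 event dates (1857, 1881, 1901, 1922, 1934, 1966) as a stand-in; they are not the paper's exact inputs, and the stress-drop superposition is omitted:

```python
import numpy as np
from scipy import stats

# Five recurrence intervals (years) implied by the commonly quoted
# Parkfield ~M6 event dates 1857, 1881, 1901, 1922, 1934, 1966.
intervals = np.array([24.0, 20.0, 21.0, 12.0, 32.0])

a, loc, scale = stats.gamma.fit(intervals, floc=0.0)  # location pinned at zero
dist = stats.gamma(a, loc=loc, scale=scale)

t = np.linspace(1.0, 60.0, 600)
hazard = dist.pdf(t) / dist.sf(t)          # h(t) = f(t) / (1 - F(t))
mode = t[dist.pdf(t).argmax()]
print(f"shape = {a:.1f}, scale = {scale:.2f} yr, modal recurrence ≈ {mode:.1f} yr")
```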
Article
A general theory of innovation and progress in human society is outlined, based on the combat between two opposite forces (conservatism/inertia and speculative herding "bubble" behavior). We contend that human affairs are characterized by ubiquitous "bubbles", which involve huge risks that would not otherwise be taken under standard cost/benefit analysis. Bubbles result from self-reinforcing positive feedbacks. This leads to the exploration of uncharted territories and niches whose rare successes lead to extraordinary discoveries and provide the base for the observed accelerating development of technology and of the economy. But the returns are very heterogeneous and very risky. In other words, bubbles, which are characteristic of human activity, allow huge risks to be taken in pursuit of huge returns over large scales. We refer to and summarize a large bibliography covering our research efforts in the last decade, which presents the relevant underlying mathematical tools and a few results involving positive feedbacks, emergence, heavy-tailed power laws, outliers/kings, the problem of predictability, and the illusion of control.
Article
The fracture of materials is a catastrophic phenomenon of considerable technological and scientific importance. Here, we analyse experiments designed for industrial applications in order to test the concept that, in heterogeneous materials such as fiber composites, rocks, concrete under compression, and materials with large distributed residual stresses, rupture is a genuine critical point, i.e. the culmination of a self-organization of damage and cracking characterized by power law signatures. Specifically, we analyse the acoustic emissions recorded during the pressurisation of spherical tanks of Kevlar or carbon fibers pre-impregnated in a resin matrix wrapped around a thin metallic liner (steel or titanium), fabricated and instrumented by Aérospatiale-Matra Inc. These experiments are performed as part of a routine industrial procedure which tests the quality of the tanks prior to shipment. We find that the acoustic emission recordings of the seven pressure tanks that were brought to rupture exhibit a clear acceleration, in agreement with the power law "divergence" expected from critical point theory. In addition, we find strong evidence of log-periodic corrections that quantify the intermittent succession of accelerating bursts and quiescent phases of the acoustic emissions on the approach to rupture. An improved model accounting for the cross-over from the non-critical to the critical region close to the rupture point exhibits interesting predictive potential.
Article
We present an analysis of oil prices in USD and in other major currencies that diagnoses unsustainable faster-than-exponential behavior. This supports the hypothesis that the recent oil price run-up was amplified by speculative behavior of the type found during a bubble-like expansion. We also attempt to unravel the information hidden in the oil supply-demand data reported by two leading agencies, the US Energy Information Administration (EIA) and the International Energy Agency (IEA). We suggest that the found increasing discrepancy between the EIA and IEA figures provides a measure of the estimation errors. Rather than a clear transition to a supply restricted regime, we interpret the discrepancy between the IEA and EIA as a signature of uncertainty, and there is no better fuel than uncertainty to promote speculation! Our post-crash analysis confirms that the oil peak in July 2008 occurred within the expected 80% confidence interval predicted with data available in our pre-crash analysis.
Article
We present creep experiments on fiber composite materials. Recorded strain rates and acoustic emission (AE) rates exhibit both a power-law relaxation in the primary creep regime and a power-law acceleration before global failure. In particular, we observe time-to-failure power-laws in the tertiary regime for acoustic emissions over four decades in time. We also discover correlations between some characteristics of the primary creep (exponent of the power-law and duration) and the time to failure of the samples. This result indicates that the tertiary regime is dependent on the relaxation and damage processes that occur in the primary regime and suggests a method for predicting the time to failure based on the early time recording of the strain rate or AE rate. We consider a simple model of representative elements, interacting via democratic load sharing, with a large heterogeneity of strengths. Each element consists of a non-linear dashpot in parallel with a spring. This model recovers the experimental observations of the strain rate as a function of time.
Article
We call attention to what seems to be a widely held misconception, according to which large crashes are simply the largest events in distributions of price variations with fat tails. We demonstrate on the Dow Jones Industrial index that, with high probability, the three largest crashes of the twentieth century are outliers. This result supports the suggestion that large crashes result from specific amplification processes that might lead to observable precursory signatures.
Article
The financial crisis of 2008, which started with an initially well-defined epicenter focused on mortgage-backed securities (MBS), has been cascading into a global economic recession, whose increasing severity and uncertain duration have led, and continue to lead, to massive losses and damage for billions of people. Heavy central bank interventions and government spending programs have been launched worldwide, especially in the USA and Europe, with the hope of unfreezing credit and bolstering consumption. Here, we present evidence and articulate a general framework that allows one to diagnose the fundamental cause of the unfolding financial and economic crisis: the accumulation of several bubbles and their interplay and mutual reinforcement has led to an illusion of a "perpetual money machine" allowing financial institutions to extract wealth from an unsustainable artificial process. Taking stock of this diagnosis, we conclude that many of the interventions to address the so-called liquidity crisis and to encourage more consumption are ill-advised and even dangerous, given that precautionary reserves were not accumulated in the "good times" but huge liabilities were. The most "interesting" present times constitute unique opportunities but also great challenges, for which we offer a few recommendations.
Article
Based on the idea that the rupture of heterogeneous systems is similar to a critical point, we show how to predict the failure stress with good reliability and precision (≈5%) from acoustic emission measurements at constant stress rate, up to a maximum load 15-20% below the failure stress. The basis of our approach is to fit the experimental signals to a mathematical expression deduced from a new scaling theory for rupture in terms of complex fractal exponents. The method is tested successfully on an industrial application, namely high-pressure spherical tanks made of various fiber-matrix composites. As a by-product, our results constitute the first observation in a natural context of the universal periodic corrections to scaling in the renormalization-group framework. Our method could be usefully applied to other similar prediction problems in the natural sciences (earthquakes, volcanic eruptions, etc.).
Article
We perform a systematic parameter space study of the seismic response of a large fault with different levels of heterogeneity, using a 3-D elastic framework within the continuum limit. The fault is governed by rate-and-state friction and simulations are performed for model realizations with frictional and large scale properties characterized by different ranges of size scales. We use a number of seismicity and stress functions to characterize different types of seismic responses and test the correlation between hypocenter locations and the employed distributions of model parameters. The simulated hypocenters are found to correlate significantly with small L values of the rate-and-state friction. The final sizes of earthquakes are correlated with physical properties at their nucleation sites. The obtained stacked scaling relations are overall self-similar and have good correspondence with properties of natural earthquakes.
Article
Relevant and timely questions, such as the predictability of seizures and their capacity to trigger further seizures, remain the subject of debate in epileptology. The present study endeavors to gain insight into these dynamic issues by adopting a non-reductionist approach and using mathematical tools. Probability distribution functions of seizure energies and inter-seizure intervals, and the probability of seizure occurrence conditional upon the time elapsed from the previous seizure, were estimated from prolonged recordings of subjects with pharmaco-resistant seizures undergoing surgical evaluation, on reduced doses of or on no medications. The energy and inter-seizure interval distributions for pharmaco-resistant seizures, under the prevailing study conditions, are governed by power laws ('scale-free' behavior). Pharmaco-resistant seizures tend to occur in clusters, and the time to the next seizure increases with the duration of the seizure-free interval since the last one. However, characteristic-size energy probability density functions were found in a few subjects. These findings suggest that: (i) pharmaco-resistant seizures have an inherent self-triggering capacity; (ii) their time of occurrence and intensity may be predictable, in light of the existence of power law distributions and of their self-triggering capacity; and (iii) their lack of typical size and duration (scale-free behavior), features upon which their classification into ictal or interictal is largely based, may make such classifiers inadequate or insufficient.