Article

A Test For Normality of Observations and Regression Residuals

Authors: Carlos M. Jarque and Anil K. Bera

Abstract

Using the Lagrange multiplier procedure, or score test, on the Pearson family of distributions, we obtain tests for normality of observations and regression disturbances. The tests suggested have optimum asymptotic power properties and good finite-sample performance. Due to their simplicity, they should prove to be useful tools in statistical analysis.
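The score test described in the abstract reduces to the statistic built from sample skewness and kurtosis, JB = (n/6)[S^2 + (1/4)(K - 3)^2], which is asymptotically chi-squared with 2 degrees of freedom under the null of normality. A minimal sketch in Python (NumPy and SciPy assumed; function and variable names are illustrative, not from the paper):

```python
import numpy as np
from scipy import stats  # assumed available for the chi-squared tail probability

def jb_test(x):
    """Jarque-Bera statistic JB = n/6 * (S^2 + (K - 3)^2 / 4) and its
    asymptotic chi-squared(2) p-value, using biased moment estimators."""
    x = np.asarray(x, dtype=float)
    n = x.size
    z = x - x.mean()
    s2 = (z ** 2).mean()                 # biased sample variance
    S = (z ** 3).mean() / s2 ** 1.5      # sample skewness
    K = (z ** 4).mean() / s2 ** 2        # sample kurtosis (3 under normality)
    jb = n / 6.0 * (S ** 2 + 0.25 * (K - 3.0) ** 2)
    return jb, stats.chi2.sf(jb, df=2)

rng = np.random.default_rng(1)
jb_norm, p_norm = jb_test(rng.normal(size=2000))           # Gaussian sample
jb_heavy, p_heavy = jb_test(rng.standard_t(4, size=2000))  # heavy-tailed sample
```

On the heavy-tailed Student-t sample the statistic is large and the p-value collapses toward zero, while the Gaussian sample typically yields a small statistic; the test jointly penalizes skewness and excess kurtosis.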


... or the probability plot (Section 1.1.2). An accurate way of verifying normality is to use statistical tests, for example, the Anderson-Darling test [30] or the nonparametric Kolmogorov-Smirnov goodness-of-fit test [31], which are based on the empirical distribution function, the Shapiro-Wilk test [32], the correlation-based Ryan-Joiner test [33], or the Jarque-Bera test [34] based on the measures of skewness and kurtosis. Different tests of normality are based on different principles and have different strengths. ...
... The Jarque-Bera test [34] is a goodness-of-fit test that determines whether the data's skewness (S) and kurtosis (K) conform to a normal distribution. The test statistic (JB) can be calculated via Formula (30). ...
Article
Full-text available
This study aims to highlight the importance of a systematic approach to process capability assessment and the importance of following a sequence of steps. Statistical process control provides several different ways of assessing process capability. This study evaluates the process capability of crown cap manufacturing through capability indices. In addition to calculating the indices, the evaluation involves extensive data analysis. Before calculating the capability indices, the assumptions for their correct selection and use were also verified. Several statistical tests were used to verify each assumption. The research value of the study lies in pointing out that not all tests led to the same conclusions. It highlights the importance of selecting the appropriate test type for the evaluated process quality characteristics.
... For Gaussian distribution testing, a widely used option is the Shapiro-Wilk test [33] and its several extensions, see [34,35]. Other commonly used tests for Gaussianity are based on skewness and kurtosis, namely the Jarque-Bera test [36] and the D'Agostino-Pearson test [37]. Several approaches assessing the empirical distribution function of the random sample have been introduced to test for Gaussian distribution, e.g. ...
... Moreover, we compare the results of the proposed test for Gaussianity with the tests known in the literature. We selected tests thoroughly investigated in [45] and commonly used in various applications, namely the Kolmogorov-Smirnov (KS) test [38], the Anderson-Darling (AD) test [42], the Shapiro-Wilk (SW) test [33] and the Jarque-Bera (JB) test [36]. In addition, we also compare the results with the Lilliefors (LF) test [43], and in the class of the α-stable distributions with the likelihood ratio (LR) test [32]. ...
Preprint
Full-text available
In this paper, we explore the modified Greenwood statistic, which, in contrast to the classical Greenwood statistic, is properly defined for random samples from any distribution. The classical Greenwood statistic, extensively examined in the existing literature, has found diverse and interesting applications across various domains. Furthermore, numerous modifications to the classical statistic have been proposed. The modified Greenwood statistic, as proposed and discussed in this paper, shares several key properties with its classical counterpart. Emphasizing its stochastic monotonicity within three broad classes of distributions - namely, generalized Pareto, $\alpha-$stable, and Student's t distributions - we advocate for the utilization of the modified Greenwood statistic in testing scenarios. Our exploration encompasses three distinct directions. In the first direction, we employ the modified Greenwood statistic for Gaussian distribution testing. Our empirical results compellingly illustrate that the proposed approach consistently outperforms alternative goodness-of-fit tests documented in the literature, particularly exhibiting superior efficacy for small sample sizes. The second considered problem involves testing the infinite-variance distribution of a given random sample. The last proposition suggests using the modified Greenwood statistic for testing of a given distribution. The presented simulation study strongly supports the efficiency of the proposed approach in the considered problems. Theoretical results and power simulation studies are further validated by real data analysis.
... This research examines the distribution characteristics of variables using skewness and kurtosis as standard metrics. The Jarque and Bera (1987) normality test was used to assess the normal distribution of the variables in Eq. (9). ...
... The study analyzes the normality statistics of the data and finds that the skewness and kurtosis values suggest a deviation from a normal distribution. The Jarque and Bera (1987) test was used to address the issue of data normality. All variables exhibit non-linear behavior, leading to the (17) ...
Article
Full-text available
Digital infrastructure has the potential to help achieve global carbon neutrality by promoting the use of renewable energy (RE) in economic operations. Therefore, it aids in the growth of a sustainable economy and society. Digital infrastructure and efficient budget allocation are the determinants of a modern and digital society. In this context, public debt plays an important role if it is invested in digital infrastructure projects. Taking this into account, we examine the relationship between carbon (CO2) emissions and digital infrastructure moderated by public debt in G20 countries for the period 2003–2021. The study employed advanced econometric techniques for panel data that are robust to solving the problem of cross-sectional dependency. As a result, cross-sectionally augmented autoregressive distributed lag (CS-ARDL) estimation technique was employed. The results indicate that digital infrastructure has a negative impact on CO2 emissions. Conversely, we find that public debt has a positive impact on CO2 emissions. Furthermore, the study confirms that the interaction between digital infrastructure and public debt has a negative effect on CO2 emissions. This implies that public debt, if used in digital infrastructure projects, leads to a decrease in CO2 emissions. To reduce CO2 emissions, it is recommended that G20 nations give priority to upgrading digital infrastructure while maintaining a manageable level of public debt.
... Do the principal components of political development exhibit any anomalies in the cross-country distribution over time? To gauge the normality assumption behind the distribution of the scores on the first and second components, in Table 6 the normality of the world distribution of formal checks and balances and strength of non-elites is tested using the distribution tests by Kolmogorov (1933), Smirnov (1933), Shapiro and Wilk (1965), and Jarque and Bera (1987), and the more recent tests by Henze and Zirkler (1990) and Doornik and Hansen (2008). The null hypothesis in each test is that the observed distribution of both principal components is Gaussian. ...
Article
Full-text available
In this article, latent variable analysis is used to construct hybrid measure of political development based on the plausible common variation between objective and subjective indicators of political institutions. For a sample of 167 countries for the period 1810–2018, we chart long-term paths of political development. Our empirical strategy attempts to overcome the existing potential bias in the measures of democracy in the long run by extracting the institutional characteristics of political regimes, voter turnout, expert-based assessments and electoral outcomes into two latent indices of political development that can be compared both across space and time. The evidence reveals the remarkable persistence of multiple peaks in the world distribution of political development and uncovers contrasting long-term trajectories across countries traditionally featured in the political economy literature. Our findings add to the current debate about measurement of democratic backsliding.
... The outcomes of the unit root tests indicated that the null hypothesis was rejected at the 0.001 level for the analyzed financial assets' returns (refer to Table 1). In addition, we conducted the Jarque-Bera test, which assesses whether a given dataset adheres to the normal distribution by examining its skewness and kurtosis (Jarque and Bera 1987). The test statistic incorporates the skewness S and kurtosis K and is expressed as JB = (n/6)[S^2 + (1/4)(K - 3)^2], where n denotes the number of data values. ...
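The JB formula quoted in that excerpt can be cross-checked numerically against SciPy's implementation, which uses the same biased central-moment estimators (a sketch; `scipy.stats.jarque_bera` is assumed available):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
x = rng.normal(size=1000)

# Manual JB = (n/6) * [S^2 + (1/4)(K - 3)^2] with biased central moments
n = x.size
z = x - x.mean()
s2 = (z ** 2).mean()
S = (z ** 3).mean() / s2 ** 1.5   # sample skewness
K = (z ** 4).mean() / s2 ** 2     # sample kurtosis
jb_manual = n / 6.0 * (S ** 2 + (K - 3.0) ** 2 / 4.0)

jb_scipy, p_scipy = stats.jarque_bera(x)  # library statistic and p-value
```

The two statistics agree to floating-point precision, confirming that the quoted formula is the one implemented in standard libraries.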
Article
Full-text available
The aim of this study is to enhance the understanding of volatility dynamics in commodity returns, such as gold and cocoa, as well as the financial market index S&P500. It provides a comprehensive overview of each model’s efficacy in capturing volatility clustering, asymmetry, and long-term memory effects in asset returns. By employing models like sGARCH, eGARCH, gjrGARCH, and FIGARCH, the research offers a nuanced understanding of volatility evolution and its impact on asset returns. Using the Skewed Generalized Error Distribution (SGED) in model optimization shows how important it is to understand asymmetry and fat-tailedness in return distributions, which are common in financial data. Key findings include the sGARCH model being the preferred choice for Gold Futures due to its lower AIC value and favorable parameter estimates, indicating significant volatility clustering and a slight positive skewness in return distribution. For Cocoa Futures, the FIGARCH model demonstrates superior performance in capturing long memory effects, as evidenced by its higher log-likelihood value and lower AIC value. For the S&P500 Index, the eGARCH model stands out for its ability to capture asymmetry in volatility responses, showing superior performance in both log-likelihood and AIC values. Overall, identifying superior modeling approaches like the FIGARCH model for long memory effects can enhance risk management strategies by providing more accurate estimates of Value-at-Risk (VaR) and Expected Shortfall (ES). Additionally, the out-of-sample evaluation reveals that Support Vector Regression (SVR) outperforms traditional GARCH models for short-term forecasting horizons, indicating its potential as an alternative forecasting tool in financial markets. These findings underscore the importance of selecting appropriate modeling techniques tailored to specific asset classes and forecasting horizons. 
Furthermore, the study highlights the potential of advanced techniques like SVR in enhancing forecasting accuracy, thus offering valuable implications for portfolio management and risk assessment in financial markets.
... 2 - Descriptive statistics of variables. Notes: JB refers to the Jarque and Bera (1987) normality test. BB refers to the Born and Breitung (2016) serial correlation test. ...
Article
Full-text available
Since 2005, the mandatory adoption of International Financial Reporting Standards (IFRS) in the European Union has played a pivotal role in reducing financing costs, which has positively influenced economic growth across member states. Thus, this study examines the effect of the cost of capital on economic growth under mandatory IFRS adoption in 17 European countries between 1994 and 2021, using Pooled Mean Group Autoregressive Distributed Lag (PMG-ARDL) and System Generalized Method of Moments (GMM-system) methods. The findings reveal a positive correlation between the cost of capital and economic growth under IFRS adoption. Specifically, the model estimates indicate that the cost of capital contributes to a 0.58% increase in economic growth in the PMG-ARDL framework. Moreover, the GMM-system model underscores the significance of IFRS adoption in reducing the cost of capital, leading to a 0.52% increase in economic growth. These results provide insights into the benefits of adopting international accounting standards and highlight the importance of institutional and financial factors in shaping the economic impact of adopting accounting standards.
... Table 1 provides an overview of the pricing series. It is evident from the findings of the Jarque-Bera (1980, 1987) and Anderson and Darling (1954) tests that the price series does not fit a normal distribution. This result would not be surprising considering the characteristics of various financial and economic indices. ...
Article
Full-text available
Predictions of prices for a wide variety of commodities have been relied upon by governments and investors over the course of history. The purpose of this study is to investigate the difficult challenge of predicting daily palladium prices for the United States by utilizing time series data ranging from January 5, 1977, to March 26, 2024. When it comes to this crucial evaluation of commodity prices, estimates have not been given sufficient consideration in earlier research. In this context, price predictions are generated by the utilization of Gaussian process regression algorithms, which are estimated through the utilization of cross-validation processes and Bayesian optimization approaches. With a relative root mean square error of 0.4598%, our empirical prediction approach produces price estimates that are generally accurate for the out-of-sample phase that spans from March 24, 2017, to March 26, 2024. In order to make educated choices about the palladium industry, governments and investors can utilize price prediction models to get the information they need.
... JB is a goodness-of-fit test that determines whether or not sample data have skewness and kurtosis that match a normal distribution. This test statistic is defined by Jarque and Bera (1987) and Thadewald and Büning (2007): (1) ...
Article
Full-text available
The global energy narrative is pushing for the deep decarbonisation of energy-consuming systems. This paper seeks to assess the production potential of cassava (a staple food in West Africa) in Nigeria, to technically evaluate the garri production process, and to study the effect of storage duration on the quality of the garri. Available data indicate that there was wide variation in cassava production potential in Nigeria between 1961 and 2020, with an appreciable increase in cassava production occurring within the period 1981 to 2000. Identified factors such as government policies and interventions and cassava viruses and bacteria are responsible for this variation. Garri has several value chains, with roasting playing a dominant role in energy consumption and quality. Garri was produced using three fermentation pathways, with each product stored in three different types of storage containers (plastic container with cover, airtight polythene bag and ordinary polythene bag) for 15 weeks and monitored regularly for quality purposes. Results obtained indicated that fermentation duration, with the exception of acid content, does not significantly affect the moisture content, swelling index and bulk density of garri within the period under investigation. These results will be useful for designing an improved, environmentally friendlier and climate-responsive garri roasting process unit powered by solar energy in the context of energy efficiency and quality control.
... To evaluate the validity, reliability and applicability of the regression model, we employed several diagnostic tests to address econometric problems, as presented in Table 7. The Durbin-Watson test (1950, 1951) for first-order autocorrelation and the Breusch-Godfrey tests (Breusch 1978; Godfrey 1978) for higher-order autocorrelation were conducted, and the study established no problem of autocorrelation in the regression model. The Breusch and Pagan (1979) test for heteroscedasticity and the Jarque and Bera (1987) test for normality were also employed, and the findings revealed no problem of heteroscedasticity and that the residuals are normally distributed, respectively. Also, the model misspecification test using the Ramsey Regression Specification Error Test (RESET) by Ramsey (1969) and the model stability tests using the cumulative sum (CUSUM) and cumulative sum of squares (CUSUMSQ) plots of recursive residuals developed by Brown et al. (1975) were employed, and the plots fall within the 5 per cent significance level, as presented in Figure 1 and Figure 2 respectively. ...
Article
Full-text available
Finance is a crucial component to entrepreneurial success and the dearth of financial resources can be detrimental to entrepreneurs and affect entrepreneurial activities and growth opportunities. This study investigates the financial determinants of entrepreneurship in Nigeria from 1991 to 2021 using the autoregressive distributed lag model. The findings of the study revealed that while foreign direct investments and financial development negatively affect self-employment rate in Nigeria, access to finance increases the rate of self-employment in Nigeria. This study, therefore, recommends that policymakers need to make access to financial resources easier and at a lesser cost to individuals who want to be self-employed in order to encourage self-employment and entrepreneurial activities for economic growth and stability.
... Now the p-value for the considered data is 0.0360 based on the Shapiro-Wilk test, and hence the hypothesis of normality is rejected. Moreover, we apply the Jarque-Bera test (Jarque and Bera 1987) to check whether the considered data have skewness and kurtosis matching a normal distribution. The computed p-value is 0.0328, indicating rejection of the hypothesis of normality. ...
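The two tests used in that excerpt can be reproduced in a few lines (a sketch with SciPy assumed; the synthetic sample below is deliberately non-normal and is not the paper's data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
data = rng.exponential(size=200)  # strongly skewed, so both tests should reject

sw_stat, sw_p = stats.shapiro(data)      # Shapiro-Wilk test
jb_stat, jb_p = stats.jarque_bera(data)  # Jarque-Bera test

reject_sw = sw_p < 0.05
reject_jb = jb_p < 0.05
```

Running both tests on the same sample, as the cited study does, guards against the case where one test is sensitive to a departure from normality that the other misses.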
Article
Full-text available
Outlier-prone data sets are of immense interest in diverse areas including economics, finance, statistical physics, signal processing, telecommunications and so on. Stable laws (also known as α-stable laws) are often found to be useful in modeling outlier-prone data containing important information and exhibiting heavy-tailed phenomena. In this article, an asymptotic distribution of an unbiased and consistent estimator of the stability index α is proposed based on the jackknife empirical likelihood (JEL) and adjusted JEL methods. Next, using the sum-preserving property of stable random variables and exploiting U-statistic theory, we have developed a goodness-of-fit test procedure for α-stable distributions where the stability index α is specified. Extensive simulation studies are performed in order to assess the finite sample performance of the proposed test.
Finally, two appealing real life data examples related to the daily closing price of German Stock Index and Bitcoin cryptocurrency are analysed in detail for illustration purposes.
... R> plot(dax_model_3t, plot_type = "pr") Alternatively, the residuals can be extracted from the model object via the residuals() method for normality tests, for example a Jarque-Bera test (Jarque and Bera 1987). Here, the test is unable to reject the null hypothesis that the data are normally distributed. ...
Article
Full-text available
Hidden Markov models constitute a versatile class of statistical models for time series that are driven by hidden states. In financial applications, the hidden states can often be linked to market regimes such as bearish and bullish markets or recessions and periods of economics growth. To give an example, when the market is in a nervous state, corresponding stock returns often follow some distribution with relatively high variance, whereas calm periods are often characterized by a different distribution with relatively smaller variance. Hidden Markov models can be used to explicitly model the distribution of the observations conditional on the hidden states and the transitions between states, and thus help us to draw a comprehensive picture of market behavior. While various implementations of hidden Markov models are available, a comprehensive R package that is tailored to financial applications is still lacking. In this paper, we introduce the R package fHMM, which provides various tools for applying hidden Markov models to financial time series. It contains functions for fitting hidden Markov models to data, conducting simulation experiments, and decoding the hidden state sequence. Furthermore, functions for model checking, model selection, and state prediction are provided. In addition to basic hidden Markov models, hierarchical hidden Markov models are implemented, which can be used to jointly model multiple data streams that were observed at different temporal resolutions. The aim of the fHMM package is to give R users with an interest in financial applications access to hidden Markov models and their extensions.
... and represent the sample skewness and kurtosis of the h fitted "good" residuals, respectively. It is worth noting that the statistic (3) essentially reduces to the Jarque-Bera normality test (Jarque and Bera, 1987) for the h "good" residuals. ...
Preprint
Full-text available
This paper presents a nonparametric bootstrap strategy for estimating the proportions of inliers and outliers in robust regression models. Our strategy is founded on the concept of stability, making it robust to distributional assumptions and eliminating the need for a pre-specified confidence level. Numerical experiments suggest that the proposed strategy offers many empirical benefits over existing alternatives. Moreover, the proposed procedure naturally extends to generalized linear models. In this context, we find that variance-stabilizing transformations lead to residuals well suited to outlier detection. Finally, two real-world datasets reinforce the strategy's practical value in outlier detection.
... To test for heteroskedasticity, White's test (White, 1980) with no cross terms and the ARCH test are used. To check the normality of the data, the Jarque-Bera test (Jarque and Bera, 1987) is applied. Every test in the battery has the null hypothesis that the corresponding problem is absent. ...
... Therefore, the normality of the series is investigated using the Jarque-Bera test (Jarque & Bera, 1980), a widely accepted method among econometricians. As highlighted by various researchers, including Jarque and Bera (1987) and Urzúa (1996), the Jarque-Bera (JB) test demonstrates more efficient performance compared to other tests for normality. The JB test statistic is derived from measures of skewness and kurtosis computed from the sample data. ...
Article
In 2023, global temperatures witnessed an alarming escalation, reaching an unprecedented 1.46 °C above preindustrial levels, marking it as the hottest year on record. Simultaneously, atmospheric carbon dioxide surpassed 420 ppm, exceeding a stability maintained for over 6000 years by more than double. This troubling surge in CO2 intensifies global warming, leading to an increased frequency of extreme weather events and contributing to 24% of global deaths attributed to environmental concerns. These alarming environmental challenges demand urgent attention and the implementation of innovative policies. Responding to this imperative, the study examines the impact of artificial intelligence-based industrial robotics (AIIR) and other control variables such as green energy, green finance, and green energy investment on CO2 emissions in economies supporting green initiatives, including Canada, Denmark, China, Japan, New Zealand, Norway, Sweden, and Switzerland. Using monthly data from 2008 to 2021 and a novel nonlinear autoregressive distributed lag approach, the results indicate that AIIR significantly reduces CO2 emissions in the sample economies. Additionally, green energy, green finance, and green energy investment also significantly decrease CO2 emissions. The study's outcomes bear policy implications for decision-makers in the sampled economies, offering tangible insights for effective environmental management.
... To examine variable volatility, this study favors the standard deviation, accommodating significant range variations. Additionally, the study examines distributional features through skewness and kurtosis; the standard Jarque and Bera (1987) statistic (Eq. 7) is used to rigorously assess variable normality via a typical normality test formulation. ...
Article
Full-text available
International policymakers have been diligently working towards advancing sustainable development and restoring the green environment, as evidenced by the Paris Agreement and COP27. This study, situated within a comprehensive policy framework, examines the dynamic effects of digital financial inclusion and mineral resources on green economic growth in mineral resources endowment countries from 2000 to 2021. To address slope heterogeneity and cross-sectional dependency, the novel (MMQR) model is applied. We also conduct an asymmetric analysis to detect the moderating and mediating roles of economic governance in the linkage between digital financial inclusion, mineral resources, and green economic growth. Our findings reveal the following: (1) Enhancing digital financial inclusion plays a crucial role in fostering green economic growth, and the synergy between digital financial inclusion and effective economic governance can amplify this positive impact in different quantiles; (2) mineral resources exhibit a sensitive association with GEG, such that excessive mineral resource extraction damages green economic growth. However, the interaction of economic governance with mineral resources transforms the negative influence into a positive one on green economic growth; (3) technological innovation and human development display an asymmetric correlation with green economic growth in different quantiles. The nonparametric panel Granger causality test establishes a significant causal relationship, demonstrating that digital financial inclusion and mineral resources can enhance green economic growth through improved economic governance. This finding holds significant importance for most mineral-rich nations, and we propose policy recommendations to strengthen digital inclusive finance and enhance economic governance.
... Jarque-Bera method: Finally, the Jarque-Bera test [62] (expression 11) was applied, where the skewness and kurtosis were used (expressions 12-13), where n is the sample size, and S and K are respectively the skewness and kurtosis coefficients, which are defined below: ...
Preprint
Full-text available
The goal was to model the irrigated (IBY) and rainfed (RBY) bean yields, as a function of essential climatic variables (ECVs), in the center (Culiacán) and south (Rosario) of Sinaloa. For Sinaloa and the period 1982–2013 (October–March), the following were calculated: a) temperatures, b) average degree days for the bean, c) cumulative potential evapotranspiration, and d) cumulative effective precipitation. For the ECVs, e) daily soil moisture was taken from the European Space Agency, and f) IBY and RBY from the Agrifood and Fisheries Information Service. Multiple linear regressions were applied, which modeled IBY–RBY (dependent variables) as a function of the ECVs (independent variables). Then, to establish each Pearson correlation (PC) as significantly different from zero, a hypothesis test was applied: PC vs the critical Pearson correlation (CPC). The four models obtained were significantly predictive: IBY–Culiacán (PC = 0.590 > CPC = |0.349|), RBY–Culiacán (PC = 0.734 > CPC = |0.349|), IBY–Rosario (PC = 0.621 > CPC = |0.355|) and RBY–Rosario (PC = 0.532 > CPC = |0.349|). This study is the first in Sinaloa to predict IBY and RBY based on ECVs, contributing to the production of sustainable food.
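The PC-versus-CPC decision rule in that abstract compares the sample Pearson correlation with the critical correlation implied by a two-sided Student's t test on n − 2 degrees of freedom. A sketch in Python (NumPy and SciPy assumed; function and variable names are illustrative; for n = 32 and α = 0.05 the critical value comes out near 0.349, consistent with the |0.349| thresholds quoted):

```python
import numpy as np
from scipy import stats

def pearson_vs_critical(x, y, alpha=0.05):
    """Compare |r| with the critical correlation r_crit = t / sqrt(t^2 + n - 2),
    where t is the two-sided critical value of Student's t with n - 2 df."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = x.size
    r = np.corrcoef(x, y)[0, 1]
    t_crit = stats.t.ppf(1.0 - alpha / 2.0, df=n - 2)
    r_crit = t_crit / np.sqrt(t_crit ** 2 + n - 2)
    return r, r_crit, abs(r) > r_crit

rng = np.random.default_rng(3)
x = rng.normal(size=32)
y = 0.8 * x + rng.normal(size=32)   # strongly correlated synthetic pair
r, r_crit, significant = pearson_vs_critical(x, y)
```

Declaring |PC| > CPC is algebraically equivalent to rejecting the usual t-test of zero correlation at the same level; the critical-correlation form is just more convenient when many PCs are compared against one tabulated threshold.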
... The standard deviation of the different independent variables reflects that all these variables are volatile for the period 2000–2020. According to the Jarque and Bera (1987) [74] test, all three variables' data series obey normality [75]. Additionally, the skewness must lie within ±1 to satisfy the normality requirements. ...
Article
Full-text available
The current research project investigates the correlation between economic growth, government spending, and public revenue in seventeen Indian states spanning the years 1990 to 2020. An analysis of the relationship between key fiscal policy variables and economic growth was conducted utilising a panel data approach, the Generalised Method of Moments (GMM), and fully modified Ordinary Least Squares (FMOLS & DOLS) estimation. In our investigation, we assessed the impacts of non-tax revenue, development plan expenditure, tax revenue, and development non-plan expenditure on (i) the net state domestic product (NSDP) and (ii) the NSDP per capita. The findings indicate that the selected fiscal variables are significantly related. The results indicate that expeditious expansion of the fiscal sector is obligatory to stimulate economic growth in India and advance the actual development of the economies of these states.
... Thus, the normality of residuals is necessary to validate t-tests (De Medeiros & Matsumoto, 2006). In the light of this, the study utilised the Jarque-Bera Normality test (Bera & Jarque, 1981) to test for the normality of the abnormal returns. The statistic was computed as follows: ...
Article
Full-text available
On the 11th of May, 2016 the federal government of Nigeria announced the removal of fuel subsidy which generated a lot of arguments and several reactions in the midst of observers and market analysts as to the likely consequences of such an action to the Nigerian economy on the market value of listed firms in Nigeria. This study aims to empirically investigate the reaction of the stock prices to the sudden announcement of the removal of fuel subsidy in the Nigerian Stock market. Data for the study was collected from the Nigerian Stock Exchange daily official list covering both the 190 days estimation period and the 35 days event window. The standard event study methodology was used on a sample of 76 firms that cut across all the various sectors of listed firms in the Nigerian Stock Exchange. The study found the event to have a significantly positive impact on the stock market as positive cumulative abnormal returns were recorded in the market on the event day and a negative significantly cumulative abnormal return after the announcement of the fuel subsidy removal. Consequent upon the findings, this study recommends, among other things that policy makers should avoid sudden announcement of policies that bothers on sensitive sectors of the economy. Such actions should be preceded by the release of information that adequately justifies such action by policymakers before it is made public. The release of such information is useful as it will help the stock market to correctly interpret the philosophy underlying the action by policy makers and will further drastically reduce information asymmetry in the market.
... This study confirmed the validity of the estimated ARDL model by using the Durbin-Watson autocorrelation test (Durbin and Watson, 1971), the Breusch-Godfrey serial correlation LM test (Breusch, 1978; Godfrey, 1978), the ARCH heteroscedasticity test (Engle et al., 1985), the White heteroscedasticity test (White, 1980), the Jarque-Bera normality test (Bera and Higgins, 1992; Jarque and Bera, 1987), and the Ramsey RESET specification test (Ramsey, 1969). Together, these tests were used to verify the specification of the estimated model and the distribution of its residuals. ...
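Of the diagnostics listed, the Durbin-Watson statistic is simple enough to sketch directly. A minimal illustration (the helper name is ours, not from the cited study): DW = Σ(e_t − e_{t−1})² / Σe_t², which is near 2 when the residuals show no first-order autocorrelation, below 2 under positive autocorrelation, and above 2 under negative autocorrelation.

```python
def durbin_watson(resid):
    """Durbin-Watson statistic on a residual sequence:
    DW = sum((e_t - e_{t-1})**2) / sum(e_t**2)."""
    num = sum((resid[t] - resid[t - 1]) ** 2 for t in range(1, len(resid)))
    den = sum(e ** 2 for e in resid)
    return num / den

# Strongly alternating residuals (negative autocorrelation) push DW
# toward 4; smooth, trending residuals push it toward 0.
```

The other tests in the battery (Breusch-Godfrey, ARCH, White, RESET) all require auxiliary regressions and are usually taken from an econometrics package rather than computed by hand.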
Article
Full-text available
This research paper explores the presence of the Environmental Kuznets Curve (EKC) in Bangladesh, delving into the intricate relationships among GDP, industrialization, renewable energy utilization, and CO2 emissions. Utilizing the Stochastic Impacts by Regression on Population, Affluence, and Technology (STIRPAT) model, our investigation spans the 1992 to 2021 timeframe, offering a comprehensive analysis of the interplay between these crucial factors in the context of Bangladesh's environmental sustainability. In this study, the F-bound test is used to establish the cointegration relationship among the variables, and the short-run and long-run elasticities of the explanatory variables are investigated by employing the Autoregressive Distributed Lag (ARDL) approach. The research also employs a pairwise Granger causality test to explore the direction of causality between them. Empirical results confirm the existence of a cointegration relationship among CO2 emissions, GDP per capita, a quadratic form of GDP per capita, renewable energy, industrialization, and population. The findings provide strong support for the presence of the EKC in the case of Bangladesh. A significant inverse relationship was also found between CO2 emissions and renewable energy consumption. Additionally, the study found detrimental long-term effects of industrialization on the environment. Under the EKC hypothesis, Bangladesh can achieve sustainable economic development; to attain this goal, the country should adopt appropriate government policies and ensure their implementation. This study strongly advocates using sustainable energy sources and implementing regulations on polluting industries.
... The (Jarque & Bera, 1987) normality test was used to determine if the residuals were normally distributed. The histogram should be bell-shaped if the residuals are normally distributed, and the statistics should not be significant. ...
Article
Full-text available
Tax revenue is the main source of revenue for governments in advanced and emerging economies, funding government spending. The main goal of this research is to examine the factors affecting tax revenue in Ethiopia from 1996 to 2020 using time series data. The impact of agriculture-to-GDP, services-to-GDP, inflation, corruption, political stability, and tax reform on the ratio of tax revenue to GDP was investigated in this study. The short-run and long-run associations between the variables were determined using the autoregressive distributed lag (ARDL) co-integration approach. The study's outcomes reveal that inflation has a positive relationship with tax revenue; however, agricultural GDP negatively impacted tax revenue in the short run over the study period. Political stability, services-to-GDP, and inflation, on the other hand, have a positive long-run impact on tax revenue, while corruption has a negative impact. We recommend that policymakers and governments combat corruption, maintain political stability, broaden tax bases to include more service-oriented businesses, and reduce reliance on agricultural sectors.
... The Jarque-Bera test [55] (Expression (11)) was applied, in which the skewness and kurtosis were assessed (Expressions (12) and (13)): ...
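The skewness and kurtosis referred to as Expressions (12) and (13) are commonly estimated from central moments; the cited paper's exact expressions are not reproduced in the snippet, so the following is a hedged sketch of the standard moment-based estimators that feed the Jarque-Bera statistic:

```python
def sample_skewness(x):
    """Moment-based skewness m3 / m2**1.5 (0 for a normal distribution)."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m3 = sum((v - m) ** 3 for v in x) / n
    return m3 / m2 ** 1.5

def sample_kurtosis(x):
    """Moment-based kurtosis m4 / m2**2 (3 for a normal distribution)."""
    n = len(x)
    m = sum(x) / n
    m2 = sum((v - m) ** 2 for v in x) / n
    m4 = sum((v - m) ** 4 for v in x) / n
    return m4 / m2 ** 2
```

Some texts instead report excess kurtosis (K − 3) or apply small-sample bias corrections; the asymptotic test is unaffected by these conventions.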
Article
Full-text available
The goal was to model irrigated (IBY) and rainfed (RBY) bean yields in central (Culiacán) and southern (Rosario) Sinaloa state as a function of the essential climate variables soil moisture, temperature, reference evapotranspiration, and precipitation. For Sinaloa, for the period 1982-2013 (October-March), the following were calculated: (a) temperatures, (b) average degree days for the bean, (c) cumulative reference evapotranspiration, and (d) cumulative effective precipitation. For essential climate variables, (e) daily soil moisture obtained from the European Space Agency and (f) IBY and RBY from the Agrifood and Fisheries Information Service were used. Multiple linear regressions were significant for predicting IBY-RBY (dependent variables) as a function of essential climate variables (independent variables). The four models obtained were significantly predictive: IBY-Culiacán (Pearson correlation (PC) = 0.590 > Pearson critical correlation (CPC) = |0.349|), RBY-Culiacán (PC = 0.734 > CPC = |0.349|), IBY-Rosario (PC = 0.621 > CPC = |0.355|), and RBY-Rosario (PC = 0.532 > CPC = |0.349|). Due to the lack of irrigation depth data, many studies only focus on modeling RBY; this study is the first in Sinaloa to predict IBY and RBY based on essential climate variables, contributing to the production of sustainable food.
... Four normality tests were applied to the geotechnical parameter data to check if they follow a normal distribution. These tests were performed according to Shapiro and Wilk, Anderson and Darling, Lilliefors, and Jarque and Bera (1987). It appears that the data for the LL, IP, Pm, PP, εs, SAIR, AIR, IRCP, CIA, MIA, and MIA[O] parameters do not follow a normal distribution (Table 9). ...
Article
Lateritic gravels are extensively used in road construction and other civil engineering structures. Depending on various parameters, genetic characteristics vary from one locality to another. Geological analyses were conducted to investigate their influence on the geotechnical properties of lateritic gravels developed on gneiss for their better use in road construction. Petrographical, mineralogical, and geochemical results show the existence of two types of lateritic gravels in the study area: yellowish to brownish lateritic gravels of humid savannah (LGHS) and reddish lateritic gravels of dry savannah (LGDS). These materials mostly contain quartz, kaolinite, hematite, goethite, muscovite, gibbsite, anatase, Fe2O3, Al2O3, and SiO2. The hematite, goethite, Fe2O3, and Al2O3 content has a positive influence on the geotechnical behavior of the studied lateritic gravels, while the content of quartz, muscovite, and SiO2 has the opposite effect. The evaluation of alterologic parameters shows that the degree of lateritization has a positive influence on CBR, maximum dry density, and specific gravity. Three CBR models with determination coefficients R2 = 0.85, 0.87, and 0.88 were established by associating the genetic characteristics of the materials with geotechnical parameters. The outcomes of the present study show that the geotechnical properties of lateritic gravels are significantly influenced by their genetic characteristics. It is therefore important to perform geological studies for the better use of lateritic gravels in road construction.
... where the test statistic is estimated from the sample size and the sample correlation between residuals. Jarque and Bera (1987) proposed a test for non-normality of observations, with the test statistic given as: ...
Article
Full-text available
In this paper, we introduce a new hybrid model, namely the Exponential Autoregressive-Fractional Integrated Generalized Autoregressive Conditional Heteroscedasticity (ExpAR-FIGARCH) model, and apply it to financial data. The daily Nigeria All Share Stock Index, which exhibits nonlinearity, volatility, and long-memory effects, was analyzed in the study. The existing ExpAR-Generalized Autoregressive Conditional Heteroscedasticity (ExpAR-GARCH) model was estimated and compared with the proposed ExpAR-FIGARCH model. Results showed that the new hybrid model is better based on parameter efficiency, serial correlation analysis, and forecast accuracy measures. The current study therefore indicates that the ExpAR-FIGARCH model performs better than the ExpAR-GARCH hybrid model and is a better option for modeling nonlinear, volatile, and long-memory characteristics of time series. Future studies should focus on applying the developed hybrid ExpAR-FIGARCH model to health, meteorological, and economic data.
Article
This study empirically examines the nexus between natural resource rent and financial development in the context of the developing economy of Nigeria between 1990 and 2021, considering the important role of corruption control under an asymmetric approach. The study also examines the influence of information technology and renewable energy on financial development. The bound test result confirms the existence of a long-term relationship among the variables. The study first uses the nonlinear autoregressive distributed lag (NARDL) model to capture the asymmetry that arises from positive or negative components of natural resource rent. The empirical evidence of the NARDL estimation shows that natural resource rent negatively influences financial development; meanwhile, corruption control boosts financial development and positively moderates this relationship in the Nigerian context. This confirms the existence of a natural resource curse. The results further show that information technology, renewable energy, and corruption control all enhance financial development. Furthermore, the causality test reveals a bidirectional causal relationship between financial development and the scrutinized variables. These findings offer valuable policy recommendations for policymakers.
Article
This study estimates the relationship between digital banking transactions and bank net profit through regression analysis using the data of deposit banks in Turkey between 2011 and 2021. The data from 11 Turkish banks are analyzed with the least squares method (OLS) in the Stata package program. For this purpose, the relationship between digital banking transactions and bank net profit is estimated by constructing three models. In the first model, online banking transactions have a positive effect on bank profitability. The number of internet banking customers, the number of Automated Teller Machine (ATM), and the number of Point of Sale (POS) are found to have a negative effect on net profit, but these coefficients are not statistically significant. In the second model, mobile banking transactions have a positive effect on bank net profit. ATM has a positive and statistically significant effect on bank net profit, while the number of POS has a negative coefficient on bank net profit, but this coefficient is not statistically significant. In the third and last model, the digital banking transactions have a positive effect on the net profit of the bank. ATM and POS numbers have a negative effect on bank net profit, but these coefficients are not statistically significant. In conclusion, both internet and mobile banking transactions have a positive effect on bank profits in Turkey. Digital banking transaction, which is a combination of internet and mobile banking transactions, is also found to have a positive effect on bank profits. Additionally, it has been revealed that internet banking services contribute more to bank profits than both mobile and digital banking services.
Preprint
Full-text available
We present a comprehensive account of Rao's Score (RS) test, starting from the first principles of testing and covering several applications and recent extensions. The paper is intended as a one-stop shop for the history of the RS test.
Article
Purpose This study aims to investigate the dynamics of return connectedness of the Standard & Poor’s (S&P) Gulf Cooperation Council (GCC) composite index with five regional equity indices, three global equity indices and other different asset classes during the COVID-19 pandemic period. Design/methodology/approach This study uses daily data spanning from January 2, 2018, to December 23, 2021. A subsample analysis is conducted to determine the role of uncertainty in modifying the connectedness structure during the ongoing pandemic period. Findings The results of this study show that the nature of connectedness is time-frequent, with clear evidence for a higher level of connectedness during stress periods, especially after the onset of the pandemic. The GCC index is found to be a net receiver of shocks to other assets, with an increase in magnitude during the COVID period. Research limitations/implications This study is limited by the use of only daily data, and future research could consider using higher frequency data. Practical implications The results of this study confirm the disturbing effects of the pandemic on the GCC index and its connectedness with other assets, which matters for policymakers and investors. Originality/value This study provides new insights into the dynamics of return connectedness of the GCC index with other assets during the COVID-19 pandemic period, which has not been previously explored.
Article
I use GARCH-X specifications for stock index returns to investigate the connections between conditional skewness and variance shocks computed from realized variances. The evidence indicates that the conditional skewness of index returns is strongly influenced by variance shocks and displays a remarkable degree of persistence. Indeed, variance shocks not only drive changes in conditional skewness, but also act as a common factor that generates substantial negative correlation between contemporaneous changes in the conditional volatility and conditional skewness of index returns. The resultant linkages between these conditional moments could have important implications in asset pricing, portfolio selection, and risk management applications.
Article
Since the end of the 1980s, most developing countries, particularly those in the WAEMU, have resorted to foreign capital, notably Foreign Direct Investment (FDI), as a new source of financing. The general observation is that WAEMU countries have difficulty mobilizing and taking advantage of the opportunities offered by foreign capital to initiate their development. The aim of this research is to show the contribution of FDI to human capital accumulation in WAEMU countries, so the effects of foreign direct investment and public education spending on human capital are analysed. The data used cover the period 2000-2020, during which the countries implemented structural adjustment programs. To highlight the effects of foreign direct investment on human capital in the WAEMU, after a theoretical and empirical discussion, an Autoregressive Distributed Lag (ARDL) model was constructed. The results show that foreign direct investment has a positive and significant impact on human capital which, in terms of economic policy implications, means that FDI is a means of improving the accumulation of human capital in the WAEMU.
Article
This article analyses the dependence structure between traditional energy price indices and renewable energy stock indices employing a Canonical Vine (C-Vine) copula-based GARCH approach. The analysis is conducted for the periods before and during the COVID-19 pandemic. Renewable energy stocks are represented by the SPGCE and CLEANTECH indices, whereas traditional energy is represented by WTI crude oil and the OVX. The empirical analysis indicated increasing asymmetric dependence between WTI crude and the renewable energy indices during COVID-19. However, WTI crude oil is not found to be the primary determinant influencing renewable energy stocks. In fact, we find a strong association between SPGCE and CLEANTECH, and the degree of dependence escalated during the COVID-19 period. Besides this, the dependence structure between WTI crude and CLEANTECH changes from upper tail dependence to lower tail dependence, signifying the consequences of low economic activity for the technology sector. Furthermore, the dependence between the OVX and SPGCE conditional on WTI reveals scope for diversification, as the dependence parameters are insignificant. These findings are robust to alternative GARCH specifications and alternative oil futures benchmarks and renewable energy stock indices. The results provide useful implications for energy policymakers and investors.
Article
Full-text available
This study examines the effects of the exchange rate on foreign reserves in Nigeria, using time series data from 1986 to 2021. The study employed the autoregressive distributed lag (ARDL) approach for its statistical analysis and concludes that the exchange rate has a positive and significant effect on foreign reserves in the short run, while in the long run the exchange rate has a negative and significant effect on foreign reserves. Thus, based on the inverse relationship between the exchange rate and foreign reserves in the long run, it is recommended that policymakers adopt a prudent policy to deal with the problem of exchange rate fluctuation and maintain stability in Nigeria's foreign reserves.
Article
International trade in services has grown rapidly in recent years. In the 1990s, the World Trade Organization introduced regulations and rules that contributed to the development of the service sector. Barriers to international trade in services were removed or minimized. Service sectors were also defined: health, education, tourism, consultancy, insurance, information technology, and similar services became subjects of trade. This study aims to investigate whether international health services (the field known as health tourism or health service exports), which have recently become an attractive sector worldwide, are related to the exchange rate and economic growth. In this context, Turkey's health service export revenues were analysed together with the real effective exchange rate, economic growth, the current account balance, tourism revenues, and service expenditures using Johansen cointegration analysis. The test was applied using data for the period 2002:Q1-2019:Q4. The results show a long-run relationship among the variables. Economic growth and tourism revenues affect health service exports positively, while the real effective exchange rate, service expenditures, and the current account balance affect them negatively. The study found that changes in the exchange rate may be decisive in health tourists' choice of destination country. In addition, the error correction model applied in the analysis showed that deviations from the long-run equilibrium can be corrected.
Article
Addressing human anatomical and physiological variability is a crucial component of human health risk assessment of chemicals. Experts have recommended probabilistic chemical risk assessment paradigms in which distributional adjustment factors are used to account for various sources of uncertainty and variability, including variability in the pharmacokinetic behavior of a given substance in different humans. In practice, convenient assumptions about the distribution forms of adjustment factors and human equivalent doses (HEDs) are often used. Parameters such as tissue volumes and blood flows are likewise often assumed to be lognormally or normally distributed without evaluating empirical data for consistency with these forms. In this work, we performed dosimetric extrapolations using physiologically based pharmacokinetic (PBPK) models for dichloromethane (DCM) and chloroform that incorporate uncertainty and variability to determine if the HEDs associated with such extrapolations are approximately lognormal and how they depend on the underlying distribution shapes chosen to represent model parameters. We accounted for uncertainty and variability in PBPK model parameters by randomly drawing their values from a variety of distribution types. We then performed reverse dosimetry to calculate HEDs based on animal points of departure (PODs) for each set of sampled parameters. Corresponding samples of HEDs were tested to determine the impact of input parameter distributions on their central tendencies, extreme percentiles, and degree of conformance to lognormality. This work demonstrates that the measurable attributes of human variability should be considered more carefully and that generalized assumptions about parameter distribution shapes may lead to inaccurate estimates of extreme percentiles of HEDs.
Article
ISTAT has recently released an updated version of short-term statistics on hours worked in Italy, which are used in labor input estimates by the Quarterly National Accounts (QNA). The coverage of these statistics has been expanded from firms with more than ten workers to the entire universe of Italian private firms. To include the updated indicator within QNA estimates, the series must be reconstructed back to the first quarter of 1995 (1995q1) owing to methodological requirements of the QNA. In this paper, we first reconstruct the updated indicator using the Kalman filter and smoother algorithms applied to a state-space representation of a multivariate structural model (SUTSE). Next, we comparatively assess the performance of the new indicator against the non-updated one. This assessment is based on estimates of quarterly per-employee hours worked using temporal disaggregation methods for seven economic sections spanning the non-agricultural private business economy over the period 1995q1 to 2020q4. Compared to the previous indicator, the reconstructed indicator (i) improves temporal disaggregation model fit in the majority of the economic sections considered; (ii) returns smaller forecast errors in 64.3% of the estimations, based on MAE; and (iii) ensures a higher correlation between the estimated quarterly series and the indicator in 71.4% of the estimates.
Article
Purpose This study aims to analyze the convergence of environmental degradation clubs in the Association of Southeast Asian Nations (ASEAN). In addition, this study also analyzes the influence of renewable energy and foreign direct investment (FDI) on each club as an intervention to change the convergence pattern in each club. Design/methodology/approach This study analyzes the club convergence of environmental degradation in an effort to find out the distribution of environmental degradation reduction policies. This study uses club convergence with the Phillips and Sul (PS) convergence methodology because it considers multiple steady-states and is robust. This study uses annual panel data from 1998 to 2020 and ASEAN country units with ecological footprints as proxies for environmental degradation. After obtaining the club results, the analysis continued by analyzing the impact of renewable energy and FDI on each club using panel data regression and the Stochastic Impacts by Regression on Population, Affluence and Technology model specification. Findings Based on club convergence, ASEAN countries can be grouped into three clubs with two divergent countries. Club 1 has an increasing pattern of environmental degradation, while Club 2 and Club 3 show no increase. Club 1 can primarily apply renewable energy to reduce environmental degradation, while Club 2 requires more FDI. The authors expect policymakers to take into account the clubs established to formulate collaborative policies among countries. The result that FDI reduces environmental degradation in this study is in line with the pollution halo hypothesis. This study also found that population has a significant effect on environmental degradation, so policies to regulate population need to be considered. On the other hand, increasing income has no effect on reducing environmental degradation. 
Therefore, the use of renewable energy and FDI toward green investment is expected to intensify within ASEAN countries to reduce environmental degradation. Originality/value This research is by far the first to apply PS Club convergence to environmental degradation in ASEAN. In addition, this study is also the first to analyze the influence of renewable energy and FDI on each club formed, considering the need for renewable energy use that has not been maximized in ASEAN.
Article
Full-text available
This study examines the intricate interplay of political stability, natural resource rent, industrialization, globalization, economic growth, and carbon emissions in nine Arab resource-abundant countries (ANRAC) from 1996 to 2019. It applies advanced statistical approaches, with the Method of Moments Quantile Regression (MMQREG) as the baseline estimation approach, along with PCSE, FGLS, and FMOLS to enhance the reliability and stability of the obtained results. The study results suggest that globalization, coupled with the interplay between political stability and economic growth, fosters advancements in environmental conditions and the pursuit of sustainable practices. In contrast, political stability, abundant natural resources, sustained economic expansion, and widespread industrialization are associated with increased CO2 emissions, posing detrimental effects on the environment. Notably, there appears to be a correlation between the concurrent improvement of political stability and economic growth and a reduction in CO2 emissions.
Article
Full-text available
The existing stream of empirical literature on regional inequalities has always adopted a retrospective look by analyzing past evolution. We depart from the mainstream by adopting a future perspective: Will regional inequalities shrink over time? How will the shape of the income distribution evolve? Will spatial dependency increase? In this paper, we forecast the long-term trajectory of per capita real personal income for U.S. states using the ARIMA model. We estimate the future disparity level (for 2050 and 2090), the shape and spatial pattern of the income distribution, the convergence trend, and spatial dependence with the help of inequality indexes (Atkinson, Coefficient of Variation, Theil), kernel probability density distributions, explorative maps, and Moran's I test. The dataset includes 48 coterminous U.S. states over the period 1929–2022. Several important results emerge from the empirical analyses. First, income disparities are expected to increase over the long term, which implies a divergence pattern. Second, the forecasted shape of the income distribution is bi-modal and polarized, therefore pointing to a widening of the inequalities. Third, the geography of prosperity is projected to change, with the locations of high- and low-income areas shifting. Fourth, spatial dependence in per capita income is expected to fade away in the future. From a political standpoint, additional resources should be devoted to the states that are expected to fall behind (some states in the Northeast and Southwest) in order to maintain territorial cohesion.
Article
The problem of dynamic relationships among the price indices of 10 major steel products – rebar, steel wire, plate, hot rolled coil, cold rolled plate, galvannealed sheet, seamless tube, welded tube, section, and narrow strip – is addressed in the present work for the Chinese market from 2011M7 to 2021M4. To examine contemporaneous causal links among the 10 series, we use daily data and combine the vector error correction model with directed acyclic graphs. The analysis is carried out using both the Peter and Clark (PC) and linear non-Gaussian acyclic model (LiNGAM) algorithms. With the exception of the price series for steel wire, each of the series is part of cointegration relationships according to the vector error correction model, and all save the price series for narrow strip respond to long-run equilibrium disturbances. The LiNGAM method yields causal routes that allow for innovation accounting, whereas the PC algorithm did not produce an acyclic graph. Based on impulse responses, we characterise the complex dynamics among price adjustment processes after shocks, in which the price indices of plate, seamless tube, and narrow strip predominate over the other seven items. Our findings show that these three goods should receive the most consideration when designing long-term strategies for steel prices.
Article
This study aims to examine the impact of fintech investments and resource efficiency on sustainable development in OECD countries between 2010 and 2019. Various estimation techniques, including the Method of Moments Quantile Regression (MMQREG), machine learning-based Kernel Regularized Least Squares (KRLS), and the Generalized Method of Moments (GMM), have been utilized in this study. MMQREG and KRLS are both estimators that examine the relationship between variables at several quantiles, increasing the reliability of the findings. The research results indicate that fintech investments and resource efficiency support sustainable development. Additionally, institutional quality, environmentally friendly technologies, and foreign trade are found to have a positive impact on sustainable development. These findings suggest that financial technology and resource management can play a significant role in promoting both economic and environmental sustainability.
Article
Full-text available
In modern times, developed and developing economies use different means and strategies to attain economic growth and financial development, yet environmental recovery instruments are not yet empirically explored in developed regions. The prime objective of this study is to unveil the nexus between electricity use, environmental policies, and financial development and to report novel approaches through the lens of sustainable development. The present research examines the heterogeneous impacts of climate policies and ecological taxes on financial development. In doing so, the study considers green energy and institutional quality variables as policy factors for financial development. The authors employ data on 29 OECD economies from 1994 to 2020, gathered from authentic sources such as the OECD, the World Bank, and the ICRG. The pre-estimation diagnostic tests (residual cross-section dependence, unit root, and cointegration) asserted cross-section dependence between countries, stationarity of the variables, and cointegration between the variables. Owing to the asymmetrical behaviour of the data shown by the Jarque and Bera (Int Stat Rev 55:163–172, 1987) test, this study uses non-parametric panel quantile regression. The analysis shows that environmental policies and green electricity use have a substantial yet mixed influence on financial development, while trade openness and GDP are significant factors in the region. Overall, environmental taxes adversely affect financial development in developed countries across quantiles. This study suggests promoting and improving green investment, trade, and efficient environmental policies to encourage financial development in developed countries without affecting ecological quality.
Chapter
This chapter provides an overview of robust estimation. It is recognized that outliers, which arise from heavy tailed distributions or are simply bad data points because of errors, have an unusually large influence on the least squares estimators. That is, the outliers pull the least squares fit toward them too much; a resulting examination of the residuals is misleading because then they look more like normal ones. Accordingly, robust methods have been created to modify least squares schemes so that the outliers have much less influence on the final estimates. One of the most satisfying robust procedures is that given by a modification of the principle of maximum likelihood. Robust methods have consequently been used successfully in many applications. There has been some evidence that adaptive procedures are of value. The basic idea of adapting is the selection of the estimation procedure after observing the data.
Article
There has been increasing concern recently over the use of the simple first order Markov form to model error autocorrelation in regression analysis. The consequence of misspecifying the error model will be especially serious when the regressors include lagged values of the dependent variable. The purpose of this paper is to develop Lagrange multiplier tests of the assumed error model against specified ARMA alternatives. It is shown that all of the tests can be regarded as asymptotic tests of the significance of a coefficient of determination, and a table is provided which gives details of two general tests and several special cases.
Article
Shapiro and Wilk's (1965) W statistic arguably provides the best omnibus test of normality, but is currently limited to sample sizes between 3 and 50. W is extended up to n = 2000 and an approximate normalizing transformation suitable for computer implementation is given. A novel application of W in transforming data to normality is suggested, using the three-parameter lognormal as an example.
Article
In econometrics, specification tests have been constructed to verify the validity of one specification at a time. It is argued that most of these tests are not, in general, robust in the presence of other misspecifications, so their application may result in misleading conclusions. Using the Lagrange Multiplier principle we develop efficient test procedures that are capable of testing a number of specifications simultaneously. These tests will ‘confirm’ the validity (or invalidity) of a general model requiring the estimates of the restricted model only. Through an extensive Monte Carlo experiment we study the performance of these tests and some commonly used one-directional tests. We also suggest a Multiple Comparison Procedure, to identify different sources of errors. This, we hope, will lead to a better specification of econometric models.
Article
In this article we establish under fairly general conditions that the F tests used in the linear model and the correlation model are asymptotically valid in the presence of nonnormality, in that their sizes are unaffected, asymptotically, by this nonnormality. Similar results could be derived for Scheffé-type simultaneous confidence intervals as well as the one-sided t tests used in these models. Finally, we find the asymptotic distribution of the sample variance and show why the size of a χ² test about the variance for the linear model is not asymptotically valid in the presence of nonnormal errors.
Article
We present a test of normality based on a statistic D which is up to a constant the ratio of Downton's linear unbiased estimator of the population standard deviation to the sample standard deviation. For the usual levels of significance Monte Carlo simulations indicate that Cornish-Fisher expansions adequately approximate the null distribution of D if the sample size is 50 or more. The test is an omnibus test, being appropriate to detect deviations from normality due either to skewness or kurtosis. Simulation results of powers for various alternatives when the sample size is 50 indicate that the test compares favourably with the Shapiro-Wilk W test, √b1, b2 and the ratio of range to standard deviation.
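As a hedged illustration of the statistic this abstract describes, the sketch below computes D in its standard form D = T/(n²√m2) with T = Σᵢ (i − (n+1)/2) x₍ᵢ₎, where x₍ᵢ₎ are the order statistics; the function name and NumPy implementation are this note's own, not the paper's:

```python
import numpy as np

def dagostino_d(x):
    """D'Agostino's D: up to a constant, Downton's linear unbiased
    estimator of sigma divided by the sample standard deviation."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    t = np.sum((i - (n + 1) / 2.0) * x)   # Downton-type linear combination
    m2 = np.mean((x - x.mean())**2)       # biased sample variance
    return t / (n**2 * np.sqrt(m2))
```

Under normality D converges to 1/(2√π) ≈ 0.2821; marked deviations in either direction suggest skewness or non-normal kurtosis.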
Article
This paper is a preliminary to a detailed survey of the relative powers of a number of omnibus and directional tests of nonnormality. The probability integrals of √b1 and b2, the standardized third and fourth moment statistics, are found for random samples from a normal distribution. Main attention is given to b2. Extensive computer simulation and curve fitting have been used to provide charts of probability levels out to the 0.1% point, for 20 ≤ n ≤ 200. For √b1, the parameters of Johnson's symmetrical SU approximation are tabled for values of n between 8 and 1000. An illustration is given of two ‘omnibus’ tests applying the charts and table, involving the joint use of √b1 and b2.
Article
OLS and BLUS residuals in the classical linear regression model are compared with respect to testing the normality of regression disturbances. It is shown that in theory both types of residuals suffer from the common problem of lack of independence under the alternative hypothesis of nonnormal disturbances. Further, Monte Carlo experiments appear to show the superiority of OLS over BLUS residuals, as well as the superiority of the Shapiro-Wilk test over other tests studied, when one is testing for normality.
Article
The test statistic X²(√b1) + X²(b2), where X(√b1) and X(b2) are standardized normal equivalents to the sample skewness, √b1, and kurtosis, b2, statistics, is considered in normal sampling. Using the Johnson system, SU and SB, as approximate normalizing distributions, contours in the (√b1, b2) plane of the test statistic are set up for sample sizes ranging from 20 to 1000.
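The omnibus statistic X²(√b1) + X²(b2) is what SciPy exposes as `scipy.stats.normaltest` (the D'Agostino-Pearson K² test). A short sketch, assuming SciPy is available, showing that K² is exactly the sum of the squared normalized skewness and kurtosis statistics (the simulated data are illustrative only):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = rng.normal(loc=0.0, scale=1.0, size=200)

z_skew, _ = stats.skewtest(x)       # X(sqrt(b1)): normalized skewness
z_kurt, _ = stats.kurtosistest(x)   # X(b2): normalized kurtosis
k2, p = stats.normaltest(x)         # K^2 = z_skew^2 + z_kurt^2
```

Under the null, K² is referred to a χ²(2) distribution, which is how the p-value is obtained.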
Article
Previous work on testing multivariate normality is reviewed. Coordinate-dependent and invariant procedures are distinguished. The arguments for concentrating on tests of linearity of regression are indicated and such tests, both coordinate-dependent and invariant, are developed.
Article
SUMMARY A comparative study of approximations to the null distribution of √b1 is given for small sample sizes, i.e. n ≤ 35. The approximations studied are the approximations by the normal distribution, the Johnson SU distributions, the t or Pearson Type VII distributions, the Cornish-Fisher expansions using moments up to the eighth, with and without Geary's adjustment, and, finally, Monte Carlo simulations. For the usual levels of significance all the approximations, except the normal approximation, appear satisfactory for n ≥ 8. Because the SU and t distributions allow approximations to the probability integrals, they can be used to combine the results of independently drawn samples.
Article
A number of statistical procedures involve the comparison of a ‘regression’ mean square with a ‘residual’ mean square using the normal-theory F distribution for reference. The use of the procedure for the analysis of actual data implies that the distribution of the mean-square ratio is insensitive to moderate non-normality. Many investigators, in particular Pearson (1931), Geary (1947), Gayen (1950), have considered the sensitivity of this distribution to parent non-normality for important special cases and a very general investigation was carried out by David & Johnson (1951a, b). The principal object of this paper is to demonstrate the overriding influence which the numerical values of the regression variables have in deciding sensitivity to non-normality and to demonstrate the essential nature of this dependency. We first obtain a simple approximation to the distribution of the regression F statistic in the non-normal case. This shows that it is ‘the extent of non-normality’ in the regression variables (the x's), which determines sensitivity to non-normality in the observations (the y's). Our results are illustrated for certain familiar special cases. In particular the well-known robustness of the analysis of variance test to compare means of equal-sized groups and the notorious lack of robustness of the test to compare two estimates of variance from independent samples are discussed in this context. We finally show that it is possible to choose the regression variables so that, to the order of approximation we employ, non-normality in the y's is without effect on the distribution of the test statistic. Our results demonstrate the effect which the choice of experimental design has in deciding robustness to non-normality.
Article
Results are given of an empirical sampling study of the sensitivities of nine statistical procedures for evaluating the normality of a complete sample. The nine statistics are W (Shapiro and Wilk, 1965), √b1 (standard third moment), b2 (standard fourth moment), KS (Kolmogorov-Smirnov), CM (Cramer-Von Mises), WCM (weighted CM), D (modified KS), CS (chi-squared) and u (Studentized range). Forty-five alternative distributions in twelve families and five sample sizes were studied. Results are included on the comparison of the statistical procedures in relation to groupings of the alternative distributions, on means and variances of the statistics under the various alternatives, on dependence of sensitivities on sample size, on approach to normality as measured by the W statistic within some classes of distribution, and on the effect of misspecification of parameters on the performance of the simple hypothesis test statistics. The general findings include: (i) The W statistic provides a generally superior omnibus measure of non-normality; (ii) the distance tests (KS, CM, WCM, D) are typically very insensitive; (iii) the u statistic is excellent against symmetric, especially short-tailed, distributions but has virtually no sensitivity to asymmetry; (iv) a combination of both √b1 and b2 usually provides a sensitive judgment but even their combined performance is usually dominated by W; (v) with sensitive procedures, good indication of extreme non-normality (e.g., the exponential distribution) can be achieved with samples of size less than 20.
Article
This article presents a modification of the Shapiro-Wilk W statistic for testing normality which can be used with large samples. Shapiro and Wilk gave coefficients and percentage points for sample sizes up to 50. These coefficients required obtaining an approximation to the covariance matrix of the normal order statistics. The proposed test uses coefficients which depend only on the expected values of the normal order statistics which are generally available. Results of an empirical sampling study to compare the sensitivity of the test statistic to the W test statistic are briefly discussed.
Article
The main intent of this paper is to introduce a new statistical procedure for testing a complete sample for normality. The test statistic is obtained by dividing the square of an appropriate linear combination of the sample order statistics by the usual symmetric estimate of variance. This ratio is both scale and origin invariant and hence the statistic is appropriate for a test of the composite hypothesis of normality. Testing for distributional assumptions in general and for normality in particular has been a major area of continuing statistical research, both theoretically and practically. A possible cause of such sustained interest is that many statistical procedures have been derived based on particular distributional assumptions, especially that of normality. Although in many cases the techniques are more robust than the assumptions underlying them, still a knowledge that the underlying assumption is incorrect may temper the use and application of the methods. Moreover, the study of a body of data with the stimulus of a distributional test may encourage consideration of, for example, normalizing transformations and the use of alternate methods such as distribution-free techniques, as well as detection of gross peculiarities such as outliers or errors. The test procedure developed in this paper is defined and some of its analytical properties described in §2. Operational information and tables useful in employing the test are detailed in §3 (which may be read independently of the rest of the paper). Some examples are given in §4. Section 5 consists of an extract from an empirical sampling study of the comparison of the effectiveness of various alternative tests. Discussion and concluding remarks are given in §6. 2. THE W TEST FOR NORMALITY (COMPLETE SAMPLES) 2.1. Motivation and early work This study was initiated, in part, in an attempt to summarize formally certain indications of probability plots.
In particular, could one condense departures from statistical linearity of probability plots into one or a few 'degrees of freedom' in the manner of the application of analysis of variance in regression analysis? In a probability plot, one can consider the regression of the ordered observations on the expected values of the order statistics from a standardized version of the hypothesized distribution, the plot tending to be linear if the hypothesis is true. Hence a possible method of testing the distributional assumption is by means of an analysis of variance type procedure. Using generalized least squares (the ordered variates are correlated) linear and higher-order
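In practice the W test this abstract introduces is available as `scipy.stats.shapiro`; a brief usage sketch, assuming SciPy is available (the data here are simulated, not from the paper):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
x = rng.normal(size=50)

# W close to 1 supports normality; small W (with small p-value) rejects it.
w, p = stats.shapiro(x)
```

The statistic W lies in (0, 1] by construction, since it is a squared correlation-type ratio of a linear estimate of scale to the sample variance.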
Article
Percentage points of Shapiro & Francia's (1972) W' approximate test of normality are shown to be well approximated by the percentage points of Shapiro & Wilk's (1965) W test for normality.
Article
In the present paper we present results of a large scale simulation study of the power of various tests of normality. We follow the procedure adopted by Shapiro, Wilk & Chen of applying a considerable variety of tests to samples drawn from a wide range of nonnormal populations. We, however, consider some additional tests, emphasize the difference between ‘omnibus’ and ‘directional’ tests and show the difficulty of applying tests based on ordered observations, such as the Shapiro-Wilk test, when in practice the sample data may contain ‘ties’ resulting from grouping or rounding.
Article
It was shown by Pierce & Kopecky (1979) that tests of normality appropriate for the identically distributed case are asymptotically valid when computed from residuals in regression situations. Numerical results based on simulation are given here to study the adequacy of this result for sample sizes of 20 and 40. The Shapiro-Wilk and Anderson-Darling tests are found to be quite adequate for n = 20 in typical simple linear regression situations, and for n = 40 in certain multiple regression settings.
Article
The parameters of a system of demand equations are estimated subject to the prior information of classical demand theory. The equations are estimated as a system using a variant of generalized least squares, the parametric restrictions being imposed by Lagrange multipliers. Tests of significance are given, both for individual restrictions and for the restrictions applied collectively. The method is applied to Barten's sixteen commodity consumer expenditure data for Holland. The work was done independently of R. H. Court's [6] similar treatment; however, there are significant differences in the method which warrant further discussion and the application is itself of some interest.
Article
The objectives of this article are: 1. To show that, in large samples under fairly general conditions, several well-known and easily computable tests for departures from normality may be modified for use in the classical linear regression framework; 2. To examine the consequences of replacing the unobserved disturbances with regression residuals in moderate and large samples; 3. To assess the power of each test with respect to alternatives to normality likely to be encountered empirically; and 4. To compare power across tests.
References
Ord, J.K. (1972). Families of Frequency Distributions. Griffin's Statistical Monographs and Courses 30. London: Griffin.
Pearson, K. (1905). On the general theory of skew correlation and non-linear regression. Biometrika 4, 172-212.
Pierce, D.A. & Gray, R.J. (1982). Testing normality of errors in regression models. Biometrika 69, 233-236.