Figure 2
Covariance matrix for the pointwise capture cross section.

Source publication
Article
Full-text available
In this paper, we describe techniques used to determine realistic and appropriate uncertainties and correlations (or, equivalently, covariances) for multigroup cross sections in the resolved-resonance region, starting from fundamental principles. The entire process is described, with emphasis on the propagation of uncertainties through each step of...

Contexts in source publication

Context 1
... resonance parameter values found in the previous section using the PUP method were used to construct capture and total pointwise cross sections and the associated CMs. Figure 1 shows the CM for the reduced capture data; Fig. 2 shows the very different CM for the pointwise capture data, using the same energy scale. Figure 3 shows the CM that would be found for the pointwise capture data if the off-diagonal elements of the RPCM were not used; note the marked absence of the secondary peaks seen in Fig. 2. ...
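As an illustration of the propagation step described above, the following minimal Python sketch (not the SAMMY implementation) applies the first-order "sandwich rule" C_sigma = S C_p S^T to map a resonance parameter covariance matrix (RPCM) onto a pointwise cross-section covariance matrix; sigma_of_params is a hypothetical function that evaluates the pointwise cross section on a fixed energy grid from a resonance-parameter vector.

import numpy as np

def pointwise_covariance(params, rpcm, sigma_of_params, rel_step=1e-4):
    # Finite-difference sensitivities S_ij = d(sigma_i)/d(p_j) about the best-fit parameters.
    sigma0 = sigma_of_params(params)
    sens = np.empty((sigma0.size, params.size))
    for j in range(params.size):
        dp = rel_step * (abs(params[j]) if params[j] != 0.0 else 1.0)
        perturbed = params.copy()
        perturbed[j] += dp
        sens[:, j] = (sigma_of_params(perturbed) - sigma0) / dp
    # Sandwich rule with the full RPCM; using np.diag(np.diag(rpcm)) instead would
    # discard the off-diagonal elements, as in the comparison of Fig. 3 with Fig. 2.
    return sens @ rpcm @ sens.T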
Context 3
... this simple example, we somewhat arbitrarily choose two energy groups for averaging; the flux Φ(E) is taken to be independent of energy. Table 2 shows the multigroup averages, uncertainties, and correlation coefficients between the two groups when the full RPCM is used (as shown for the pointwise capture cross section in Fig. 2), and when off-diagonal components of the RPCM are neglected (as shown for the pointwise capture cross section in Fig. 3). While the values and uncertainties for the multigroup averaged cross sections do not differ too much for the two cases, the correlations are extremely different. ...
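A small Python sketch of this flat-flux, two-group collapse (the energy grid and group boundaries are illustrative assumptions): the group averages are weighted sums of the pointwise values, so the same weight matrix propagates the pointwise covariance to the group covariance, C_g = W C_sigma W^T.

import numpy as np

def collapse(energies, sigma_pt, cov_pt, edges):
    # Equal weights within each group approximate a flat flux on a uniform energy grid.
    n_groups = len(edges) - 1
    w = np.zeros((n_groups, energies.size))
    for g in range(n_groups):
        in_g = (energies >= edges[g]) & (energies < edges[g + 1])
        w[g, in_g] = 1.0 / in_g.sum()
    sigma_g = w @ sigma_pt             # multigroup averages
    cov_g = w @ cov_pt @ w.T           # multigroup covariance matrix
    std = np.sqrt(np.diag(cov_g))      # group uncertainties
    corr = cov_g / np.outer(std, std)  # group-to-group correlation coefficients
    return sigma_g, std, corr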

Similar publications

Technical Report
Full-text available
Transmission measurements have been performed at the time-of-flight facility GELINA to determine the total cross section for neutron induced reactions in 197Au. The measurements have been carried out at a 50 m transmission station of GELINA with the accelerator operating at 800 Hz. This report provides the experimental details required to deliver t...
Technical Report
Full-text available
Transmission measurements have been performed at the time-of-flight facility GELINA using metallic discs of natural tungsten. The measurements have been carried out at 25 m and 50 m stations using Li-glass scintillators with the accelerator operating at 800 Hz. This report describes the experimental details required to deliver the data to the EXFOR...
Technical Report
Full-text available
Transmission measurements have been performed at the time-of-flight facility GELINA to determine neutron resonance parameters for 63Cu and 65Cu. The experiments have been carried out at a 50 m transmission station at a moderated neutron beam using a Li-glass scintillator with the accelerator operating at 800 Hz. Measurements were performed with met...

Citations

... The so-called "evaluated nuclear data" are shown as the solid line in Fig. 1.1, as a result of combining experimental measurements, theoretical nuclear models, and statistical analysis (when multiple experimental measurements are available for the same quantity of interest). Evaluated nuclear data reflect the best representation of the true cross sections [7]. They are organized into the "ENDF-6" format [8], which is highly ordered and computer-readable. ...
... The uncertainties in experimentally measured nuclear data follow a similar route: they propagate through several stages of processing and become particular types of nuclear data uncertainties, often known as the "multigroup covariance matrices" used in this PhD work and related studies. In the previously mentioned time-of-flight experiment for nuclear data measurement, the count rate, i.e., the number of counts (of outgoing particles) per (energy, angle) channel per unit time, is the raw data, and it obeys Poisson statistics [7]. Performing consistent normalization (a ± ∆a) and background removal (b ± ∆b) on the experimental raw data (r_i, r_j) introduces correlations among the experimentally measured cross sections (d_i, d_j). ...
... Performing consistent normalization (a ± ∆a) and background removal (b ± ∆b) on the experimental raw data (r_i, r_j) introduces correlations among the experimentally measured cross sections (d_i, d_j). The experimental covariance is [7]: ...
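The excerpt cuts off before the expression itself. As a hedged illustration only (the exact form in the cited reference may differ), if each evaluated point is formed as d_i = a·r_i − b from independent raw counts, first-order propagation gives

cov(d_i, d_j) = a^2 (∆r_i)^2 δ_ij + r_i r_j (∆a)^2 + (∆b)^2,

so the shared normalization and background uncertainties fill the off-diagonal elements and correlate the measured cross sections.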
Article
The goal of the present PhD research is to establish a methodology of nuclear data uncertainty quantification (NDUQ) for MCNPX, the continuous-energy Monte-Carlo (M-C) code. The high fidelity (continuous-energy treatment and flexible geometry modelling) of MCNPX makes it the choice of routine criticality safety calculations at PSI/LRS, but also raises challenges for NDUQ by conventional sensitivity/uncertainty (S/U) methods. The methodology developed during this PhD research is fundamentally different from the conventional S/U approach: nuclear data are treated as random variables and sampled in accordance to presumed probability distributions. When sampled nuclear data are used in repeated model calculations, the output variance is attributed to the collective uncertainties of nuclear data. The NUSS (Nuclear data Uncertainty Stochastic Sampling) tool based on this sampling approach, is implemented to work with MCNPX's ACE format of nuclear data, which also gives NUSS compatibility with MCNP and SERPENT M-C codes. In contrast, multigroup uncertainties are used for the sampling of ACE-formatted pointwise-energy nuclear data in a groupwise manner due to the more limited quantity and quality of nuclear data uncertainties. Conveniently, the usage of multigroup nuclear data uncertainties allows consistent comparison between NUSS and other methods (both S/U and sampling-based) that employ the same nuclear data uncertainty format. The first stage of NUSS development focuses on applying simple random sampling algorithm for uncertainty quantification. The effect of combining multigroup and ACE format on the propagated nuclear data uncertainties is assessed. It is found that the number of energy groups has minor impact on the precision of the multiplication factor (k-eff) uncertainty as long as the group structure reflects the neutron flux spectrum. Successful verification of the NUSS tool for propagating nuclear data uncertainties through MCNPX and quantifying MCNPX output parameter uncertainties is obtained. The second stage of NUSS development is motivated by the need for an efficient sensitivity analysis methodology based on global sampling and coupled with MCNPX. For complex systems, the computing time for obtaining a breakdown of the total uncertainty contribution by individual inputs becomes prohibitive when many MCNPX runs are required. The capability of determining simultaneously the total uncertainty and individual nuclear data uncertainty contributions is thus researched and implemented into the NUSS-RF tool. It is based on the Random Balance Design algorithm and is validated by three mathematical test cases for both linear and nonlinear models and correlated inputs. NUSS-RF is then applied to demonstrate the efficient decomposition of total uncertainty by individual nuclear data. However an attempt to decompose total uncertainty into individual contributions using the conventional S/U method shows different decomposition results when the inputs are correlated. The investigation and findings of this PhD work are valuable because of the introduction of global sensitivity analysis into the existing repertoire of nuclear data uncertainty quantification methods. The NUSS tool is expected to be useful for expanding the types of MCNPX-related applications, such as an upgrade to the current PSI criticality safety assessment methodology for Swiss application, for which nuclear data uncertainty contributions can be quantified.
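A rough Python sketch of the groupwise sampling idea summarized above (not the actual NUSS implementation; the group edges, relative multigroup covariance matrix, and ACE-like energy/cross-section arrays are illustrative assumptions): correlated relative perturbation factors are drawn from a multivariate normal defined by the multigroup covariance, and each group's factor multiplies every pointwise value falling in that group.

import numpy as np

rng = np.random.default_rng(2024)

def sample_perturbed_xs(energies, xs, group_edges, rel_cov_mg, n_samples=100):
    # Draw relative perturbation factors (mean 1) with the multigroup correlations.
    mean = np.ones(len(group_edges) - 1)
    factors = rng.multivariate_normal(mean, rel_cov_mg, size=n_samples)
    # Map each pointwise energy to its group and apply that group's factor.
    group_idx = np.clip(np.searchsorted(group_edges, energies, side="right") - 1,
                        0, len(group_edges) - 2)
    # Each row is one perturbed pointwise cross section, ready for a repeated M-C run.
    return xs[None, :] * factors[:, group_idx]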
... [5] ...
Technical Report
Full-text available
This report has been issued by WPEC Subgroup 33, whose mission was to study methods and issues of the combined use of integral experiments and covariance data, with the objective of recommending a set of best and consistent practices in order to improve evaluated nuclear data files. In a first step, the subgroup reviewed and assessed the existing nuclear data adjustment methodologies. The outcome is presented in this intermediate report.
... The covariance matrix is symmetric; its diagonal elements represent the uncorrelated statistical uncertainty, while the off-diagonal elements represent the correlated systematic uncertainty (Stanga and Muntele, 2000; Winkler, 1998). Although the accuracy of nuclear data has significantly improved, little information exists on the various components of uncertainties and their correlations (Leal et al., 2005; Larson et al., 2006). A thorough knowledge of the correlations between uncertainties is important, as they greatly influence the final uncertainty in the resultant parameters of interest (Smith, 1991). ...
Article
Covariance matrix elements depict the statistical and systematic uncertainties in reactor parameter measurements. Efforts have so far been devoted mainly to minimising the statistical uncertainty by repeated measurements, but the dominant systematic uncertainty has either been neglected or randomized. In recent years, efforts have been devoted to simulating the resonance parameter uncertainty information through covariance matrices in the code SAMMY. However, the code does not have any provision to check the reliability of the simulated covariance data. We propose a new approach, called entropy-based information theory, to reduce the systematic uncertainty in the correlation matrix elements so that resonance parameters with minimum systematic uncertainty can be modelled. We apply our information theory approach to generating the resonance parameters of (156)Gd with reduced systematic uncertainty and demonstrate the superiority of our technique over the principal component analysis method.
... It is pertinent to note that systematic uncertainty pervades all types of physical measurement and is affected by errors due to instrumentation, environment and personnel (Coates et al., 1983; Leinweber et al., 2006). It is always recommended to spend the maximum possible effort on identification and minimization of correlated uncertainties (Massart et al., 1988). ... in the evaluation of the experimental data in the resolved and unresolved resonance region (Larson et al., 2006; Arbanas et al., 2006). SAMMY takes into consideration ... Let p_1, p_2, ...
Article
Due to the complex nature of resonance region interactions, significant effort has been devoted to quantifying the resonance parameter uncertainty information through covariance matrices. Statistical uncertainties arising from measurements contribute only to the diagonal elements of the covariance matrix, but the off-diagonal contributions arise from multiple sources, such as systematic errors in cross-section measurement, correlations due to the nuclear reaction formalism, etc. Efforts have so far been devoted to minimizing the statistical uncertainty by repeated measurements, but systematic uncertainty cannot be reduced by mere repetition. The computer codes developed so far to generate resonance parameter covariances, such as SAMMY and KALMAN, have no provision to improve upon the highly correlated experimental data and hence reduce the systematic uncertainty. We propose a new approach, called entropy-based information theory, to reduce the systematic uncertainty in the covariance matrix element-wise so that resonance parameters with minimum systematic uncertainty can be simulated. Our simulation approach will aid both experimentalists and evaluators in designing experimental facilities with minimum systematic uncertainty and thus improve the quality of measurement and the associated instrumentation. We demonstrate the utility of our approach in simulating the resonance parameters of Uranium-235 and Plutonium-239 with reduced systematic uncertainty.
... These expressions contain resonance parameters determined by regression of the theoretical equations with one or more differential measurements. Converting the measured results into an evaluated cross section introduces a number of correlated uncertainties into the differential data, caused by background corrections, normalization, etc. [8] In this paper the term "integral parameter" refers to an intrinsic nuclear data parameter, as opposed to an extrinsic integral parameter such as the criticality factor k_eff that inherently depends upon the facility where the measurement is done. Intrinsic integral parameters are measured experimentally and, in theory, should be independent of the measurement facility; thus, they are "fundamental" quantities in the same sense as differential nuclear data. ...
Article
Full-text available
Computational tools are available to utilize sensitivity and uncertainty (S/U) methods for a wide variety of applications in reactor analysis and criticality safety. S/U analysis generally requires knowledge of the underlying uncertainties in evaluated nuclear data, as expressed by covariance matrices; however, only a few nuclides currently have covariance information available in ENDF/B-VII. Recently new covariance evaluations have become available for several important nuclides, but a complete set of uncertainties for all materials needed in nuclear applications is unlikely to be available for several years at least. Therefore if the potential power of S/U techniques is to be realized for near-term projects in advanced reactor design and criticality safety analysis, it is necessary to establish procedures for generating approximate covariance data. This paper discusses an approach to create applications-oriented covariance data by applying integral uncertainties to differential data within the corresponding energy range.
Article
Full-text available
In SCALE 6, the Tools for Sensitivity and UNcertainty Analysis Methodology Implementation (TSUNAMI) modules calculate the sensitivity of k_eff or reactivity differences to the neutron cross-section data on an energy-dependent, nuclide-reaction-specific basis. These sensitivity data are useful for uncertainty quantification, using the comprehensive neutron cross-section-covariance data in SCALE 6. Additional modules in SCALE 6 use the sensitivity and uncertainty data to produce correlation coefficients and other relational parameters that quantify the similarity of benchmark experiments to application systems for code validation purposes. Bias and bias uncertainties are quantified using parametric trending analysis or data adjustment techniques, providing detailed assessments of sources of biases and their uncertainties and quantifying gaps in experimental data available for validation. An example application of these methods is presented for a generic burnup credit cask model.
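For orientation, a minimal Python sketch of the sandwich-rule quantities referred to here (not the TSUNAMI implementation; the sensitivity vectors and relative covariance matrix are assumed to share the same group/nuclide/reaction ordering): the relative k_eff variance is S C S^T, and a correlation-type similarity coefficient between an application and a benchmark follows from their shared covariance term.

import numpy as np

def keff_rel_std(s, cov):
    # Relative standard deviation of k_eff from the sandwich rule S C S^T.
    return float(np.sqrt(s @ cov @ s))

def similarity_ck(s_app, s_bench, cov):
    # Correlation-type similarity coefficient between an application and a benchmark.
    shared = float(s_app @ cov @ s_bench)
    return shared / (keff_rel_std(s_app, cov) * keff_rel_std(s_bench, cov))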