(a) Histogram of temperature values occurring at each grid point in the 20-year average temperature. (b) Same as (a) except each value is weighted by the fractional volume corresponding to the particular grid point. (c) Variance with depth of the weighted temperatures in (b) before (solid line) and after (dashed line) removal of the lowest singular vector pair. (d) Cumulative sum of squared singular vectors for the time-mean temperature field. (e) Histogram of values in the residual after removal of the q = 1 singular value pair, which is unimodal without a dominating skewness.


Source publication
Article
Full-text available
Lower-bounds on uncertainties in oceanic data and a model are calculated for the 20-year time means and their temporal evolution for oceanic temperature, salinity, and sea surface height, during the data-dense interval 1994–2013. The essential step of separating stochastic from systematic or deterministic elements of the fields is explored by suppr...

Contexts in source publication

Context 1
... temperature, removal of only the first pair, q = 1, reduces the temperature norm of N′_q(r_a, z_k) by over 90%, and the histogram of volume-weighted values (Fig. 2) is now unimodal. Over 20 years, the response of the ocean is dominantly linear, producing a normal stochastic field, as supported, e.g., by the discussion of Gebbie and Huybers (2018) and the known physics of short-time-scale adjustment. At the risk of over-estimating the uncertainty, q = 1 is chosen, and the elements of N′_q(r_a, z_k) are assumed to be iid and thus suitable for computing a bootstrap mean and standard error. Fifty samples of bootstrap reconstruction produce a two-standard-deviation uncertainty of 1.9 × 10⁻³ °C owing to the stochastic elements, or ⟨T⟩ = 3.531 ± 0.002 °C, the 'formal' error. (For a Gaussian process, two standard deviations is approximately 95% confidence ...
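The bootstrap step described above can be sketched in a few lines; the residual values here are a hypothetical stand-in for the iid elements of the volume-weighted residual field, not the state-estimate data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical iid residuals, standing in for the elements of the
# volume-weighted field after removal of the q = 1 singular-vector pair.
residuals = rng.normal(loc=0.0, scale=0.05, size=10_000)

# Fifty bootstrap resamples of the mean, as in the text.
n_boot = 50
boot_means = np.array([
    rng.choice(residuals, size=residuals.size, replace=True).mean()
    for _ in range(n_boot)
])

mean_estimate = boot_means.mean()
two_sigma = 2.0 * boot_means.std(ddof=1)  # ~95% interval for a Gaussian process
```

The iid assumption is what licenses the resampling: each bootstrap draw treats the residual values as exchangeable, which is why the spatially correlated (deterministic) structure must be removed first.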
Context 2
... the problem of determining the 20-year global ocean average temperature and its corresponding uncertainty. A 20-year average, computed 50 years in the future, might usefully be compared with the present 20-year average. Hourly values of the state estimate, averaged over 20 years, 1994-2013, produce point-wise calculated mean potential temperatures, T_i, at each grid point. Mean temperature at one depth can be seen in Fig. 1, displaying the classical large-scale features that are clearly deterministic over 20 years with superimposed stochastic elements. A histogram of the gridded mean temperatures can be seen in Fig. 2 and its heavily skewed behavior is ...
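The volume weighting described here is a plain weighted average over grid points; the temperatures and cell volumes below are synthetic illustrations, not the ECCO fields:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic gridded mean temperatures (heavily skewed, as in the Fig. 2
# histogram) and fractional cell volumes normalized to sum to 1.
T_i = rng.exponential(scale=3.5, size=100_000)
vol = rng.random(100_000)
w_i = vol / vol.sum()

# Volume-weighted global mean temperature.
T_mean = np.sum(w_i * T_i)
```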
Context 3
... quoted above. A histogram of the similarly skewed weighted values can also be seen in Fig. 2. A standard deviation computed from the volume-weighted values has no meaning as: (a) much of the field is effectively deterministic and, (b) the probability density of the stochastic elements is ...
Context 4
... a residual with a very much reduced spatially correlated structure. The variance with depth of both h_i and h′_i (Fig. 2) shows that removal of the first pair alone (q = 1) leaves a variance much closer to being depth-independent and with a histogram that is unimodal and not heavily skewed, so that a standard deviation has some simple meaning. (A chart of u_2, not shown, shows it to have a spatially complex structure with a plausibly stochastic behavior, which could be tested in ...
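Removing the first (largest-singular-value) pair from a depth-by-position matrix can be sketched with an ordinary SVD; the field below is synthetic, built to mimic a dominant rank-1 deterministic structure plus small stochastic noise:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic field: depth x horizontal position. A rank-1 deterministic
# structure (temperature decreasing with depth) plus weak noise.
depth_profile = np.linspace(20.0, 2.0, 30)
horizontal = 1.0 + 0.1 * rng.standard_normal(500)
T = np.outer(depth_profile, horizontal) + 0.05 * rng.standard_normal((30, 500))

U, s, Vt = np.linalg.svd(T, full_matrices=False)

# Remove the q = 1 pair: subtract the rank-1 reconstruction.
T_residual = T - s[0] * np.outer(U[:, 0], Vt[0, :])

# Fractional reduction of the Frobenius norm; in this synthetic example
# it exceeds 90%, analogous to the temperature field in the text.
reduction = 1.0 - np.linalg.norm(T_residual) / np.linalg.norm(T)
```

What remains after the subtraction is the weakly structured residual whose depth variance and histogram can then be examined, as in panels (c) and (e) of the figure.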

Similar publications

Conference Paper
Full-text available
Uncertainty quantification metrics are critical in the campaign of stochastic model updating, by providing an elaborate measurement of the uncertainty in both simulations and experiments. In this work, the Bhattacharyya distance is proposed as a comprehensive model updating metric for two samples considering their probabilistic properties. The updati...
Article
Full-text available
This study discusses an application of the stochastic collocation method for the solution of a nonlinear magnetoquasistatic interface problem that is constrained by a partial differential equation with random input data. Special attention is given to finding the robust design of a rotor and a stator of an electrical machine for the reduction of ele...
Preprint
Full-text available
Conservation laws in the form of elliptic and parabolic partial differential equations (PDEs) are fundamental to the modeling of many applications such as heat transfer and flow in porous media. Based on the relation between stochastic diffusion processes and PDEs, Monte Carlo (MC) methods are available to solve these PDEs in cases where the conduc...
Presentation
Full-text available
Keywords: Uncertainty Quantification, Effective Elastic Modulus, Midpoint Approximation, Random Field Theory, Stochastic Finite Element Method

Citations

... An alternative method for calculating S essentially amounts to monitoring changes in the weight of the ocean, which represent the net exchange of freshwater with the land, atmosphere and cryosphere (assuming negligible changes in salt content). Expressing changes in freshwater as an equivalent water thickness change δh fw , the fractional change in S is approximately equal and of opposite sign to the fractional change in ocean volume or mean depth (Munk, 2003; Wunsch, 2018), that is, ...
... How consistent are the in situ estimates of S in Figure 1 and those that can be inferred from δh fw values in Figure 2? We examine separately the mean seasonal cycle, interannual variability, and long-term trends. Values of δh fw are converted to changes in S using Equation 1 with H₀ = 3,682 m (Charette & Smith, 2010) and S₀ ≈ 34.7 g/kg (Wunsch, 2018). ...
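The relation described here (fractional change in S equal and opposite to the fractional change in mean ocean depth) can be checked numerically; the δh fw value below is illustrative only:

```python
# Illustrative check of the relation described in the text:
#   delta_S / S0 ≈ -delta_h_fw / H0
H0 = 3682.0        # mean ocean depth, m (Charette & Smith, 2010)
S0 = 34.7          # mean ocean salinity, g/kg (Wunsch, 2018)
delta_h_fw = 0.01  # hypothetical 1 cm equivalent freshwater thickness gain

delta_S = -S0 * delta_h_fw / H0
# A 1 cm freshwater gain lowers mean salinity by roughly 1e-4 g/kg,
# illustrating why long-term trends in S are so hard to detect.
```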
Article
Full-text available
Global ocean mean salinity S is a key indicator of the Earth's hydrological cycle and the exchanges of freshwater between land and ocean, but its determination remains a challenge. Aside from traditional methods based on gridded salinity fields derived from in situ measurements, we explore estimates of S based on liquid freshwater changes derived from space gravimetry data corrected for sea ice effects. For the 2005-2019 period analyzed, the different S series show little consistency in seasonal, interannual, and long-term variability. In situ estimates show sensitivity to choice of product and unrealistic variations. A suspiciously large rise in S since ∼2015 is enough to measurably affect halosteric sea level estimates and can explain recent discrepancies in the global mean sea level budget. Gravimetry-based S estimates are more realistic, inherently consistent with estimated freshwater contributions to global mean sea level, and provide a way to calibrate the in situ estimates.
... In the absence of better information, we resort to the simplified representation of the error covariances, consistent with existing state estimation efforts (e.g., Forget, Campin et al., 2015;Mazloff, Heimbach, & Wunsch, 2010). These error estimates play an important role in any least squares optimization (both ECCO-related and other data assimilation efforts), and their improved estimation is itself an important area of ongoing research (Wunsch, 2018). We will discuss these further below. ...
Article
Full-text available
A description and assessment of the first release of the Arctic Subpolar gyre sTate Estimate (ASTE_R1), a data‐constrained ocean‐sea ice model‐data synthesis, is presented. ASTE_R1 has a nominal resolution of 1/3° and spans the period 2002–2017. The fit of the model to an extensive (O(10⁹)) set of satellite and in situ observations was achieved through adjoint‐based nonlinear least squares optimization. The improvement of the solution compared to an unconstrained simulation is reflected in misfit reductions of 77% for Argo, 50% for satellite sea surface height, 58% for the Fram Strait mooring, 65% for Ice Tethered Profilers, and 83% for sea ice extent. Exact dynamical and kinematic consistency is a key advantage of ASTE_R1, distinguishing the state estimate from existing ocean reanalyses. Through strict adherence to conservation laws, all sources and sinks within ASTE_R1 can be accounted for, permitting meaningful analysis of closed budgets at the grid‐scale, such as contributions of horizontal and vertical convergence to the tendencies of heat and salt. ASTE_R1 thus serves as the biggest effort undertaken to date of producing a specialized Arctic ocean‐ice estimate over the 21st century. Transports of volume, heat, and freshwater are consistent with published observation‐based estimates across important Arctic Mediterranean gateways. Interannual variability and low frequency trends of freshwater and heat content are well represented in the Barents Sea, western Arctic halocline, and east subpolar North Atlantic. Systematic biases remain in ASTE_R1, including a warm bias in the Atlantic Water layer in the Arctic and deficient freshwater inputs from rivers and Greenland discharge.
... Ocean models can be used to generate idealized or realistic configurations that allow us to explore fundamental ocean mechanisms. However, in order to build a physically consistent estimate of the ocean state, its evolution, and ultimately reanalysis products that can describe past evolution, ocean models are combined with data assimilation techniques to extrapolate in space and time the sparse oceanic observations (Ferry et al., 2012;Carton et al., 2018;Wunsch, 2018). The same data assimilation techniques are used to generate the initial state that is used perform short-term forecasts (Chassignet and Verron, 2006;Dombrowsky et al., 2009;Schiller and Brassington, 2011;Bell et al., 2015;Chassignet et al., 2018). ...
Chapter
Full-text available
Ocean circulation models allow us to identify and characterize complex and diverse physical mechanisms in the ocean. The field of ocean modeling has matured, in large part, because of strong cross-disciplinary collaborations and the consolidation of efforts that took place among different groups involved in the various applications and dimensions of ocean circulation and earth system models. In particular, ocean forecasting, as a component of operational oceanography, combines ocean models and near-real-time collection of ocean observations to generate forecasts on a variety of timescales of ocean currents, sea level, temperature, salinity, sea ice, surface waves, and concentrations of tracers relevant to environmental or biogeochemical processes. The multidisciplinary nature of operational oceanography makes it an exciting scientific career path, particularly for those interested in research or the assimilation of new technological advances and methods.
... latitude, salinity and temperature), but these also strongly covary with other predictors, and on a global scale lack dynamic range (i.e. 98% of the ocean is between 0-5 °C and 99% is between 33.5 and 35.5‰ salinity, Wunsch, 2018). ...
Article
Full-text available
Aim The emergence of pattern in the natural world can carry important messages about underlying processes. For example, collections of broadly similar terrestrial ecosystems have historically been categorized as biomes – groupings of systems that sort along energetic and structural process axes. In marine systems however, a similar classification of biomes has not emerged. The aim here is to develop an effective classification scheme for marine biomes and communities. Approach Candidate predictor variables that could explain pattern and process in differentiating marine communities, such as light, nutrients, depth, etc., were collected from the existing literature with a systematic review. The candidate predictors were then evaluated in an inductive process, allowing community level observations to demonstrate patterns across marine biomes. Marine biomes and communities were identified a priori, and emergent patterns were evaluated quantitatively, via principal component ordination based on the vectors of explanatory predictors, and qualitatively via mapping. Conclusions Gross primary production and substrate mobility not only effectively sort marine biomes, but also work on finer scales discriminating communities within biomes. As a result, these predictors were more effective in classifying marine communities than other scales, such as available light and nutrients. The richness of this classification is also demonstrated in revealing other patterns, such as the distribution of human impacts. The effectiveness of this mapping provides support for, but is not a test of, the hypothesis that primary production and substrate mobility are important underlying processes that interact to structure marine ecosystems.
... Following Eq. 3, we calculated the uncertainties of the global and regional averages as the product of the simple sum of the diagonal terms of the error covariance matrix and an inflation factor, which probably resulted in overestimates. In fact, DIVA may be used to more accurately estimate the spatial covariances as described by the non-diagonal terms, albeit at a much higher computational cost (Troupin et al., 2012;Troupin et al., 2019, Section 4.5; see also Wunsch, 2018). Therefore we decided to 11±0.14) ...
Preprint
Full-text available
Abstract. We present a climatology of the sea-surface temperature (SST) anomaly and the sea-ice extent during the Last Glacial Maximum (LGM, 23 000–19 000 years before present) mapped on a global regular 1° × 1° grid. It is an extension of the Glacial Atlantic Ocean Mapping (GLAMAP) reconstruction of the Atlantic SST based on the results of the Multiproxy Approach for the Reconstruction of the Glacial Ocean Surface (MARGO) project and several recent estimates of the LGM sea-ice extent. Such a gridded climatology is highly useful for the visualization of the LGM climate, calculation of global and regional SST averages and estimation of the equilibrium climate sensitivity, as well as a boundary condition for atmospheric general circulation models. The gridding of the sparse SST reconstruction was done in an optimal way using the Data-Interpolating Variational Analysis (DIVA) software, which takes into account the uncertainty on the reconstruction and includes the calculation of an error field. The resulting Glacial Ocean Map (GLOMAP) confirmed the previous findings by the MARGO project regarding longitudinal and meridional SST differences that were greater than today in all oceans and an equilibrium climate sensitivity at the lower end of the currently accepted range.
... The averaging time, s avg , required for that to be rigorously true, is likely much longer than 26 years, but whether it is 100 or 1000 years, or if it is valid at all, remains obscure and awaits either much longer observational records, or analysis of demonstrably skillful high resolution circulation models. Wunsch (2018) and Meyssignac et al. (2019) address some of the sampling issues associated with persistent spatial patterns in thermal fields. These consequences are not described here. ...
Article
Full-text available
Numerous indicators show that multi-annual and longer oceanic baroclinic variability retains a complicated spatial structure out to decades and longer. With time-averaging, the sub-basin scales connected to abyssal topography and meteorological structures emerge in the fields. Here, using 26-years of an oceanic state estimate (ECCO), an attempt is made to extract simpler patterns from the vertical average (whole water column) annual mean temperature anomalies and, separately, the vertical structures at each horizontal position. Singular vectors (SVs)/empirical orthogonal functions (EOFs) successfully simplify vertical and horizontal fields, but principal observation patterns (POPs) do not do so. About 3 horizontal spatial patterns account for more than 95% of the interannual and longer variances. A breakdown of the purely vertical structure at each grid point leads in contrast to an intricate variability with depth. Results have implications both for future sampling strategies, and for estimates, e.g. of the accuracy of any mean oceanic scalar.
... In conjunction with data assimilation, they are used to extrapolate in space and in time the available discrete oceanic observations to build a physically consistent estimate of the ocean state and its evolution. This, in turn allows for the creation of reanalysis products that can describe past evolution (Ferry et al., 2012;Carton et al., 2018;Wunsch, 2018), or initial state to perform forecasts (Chassignet and Verron, 2006;Dombrowsky et al., 2009;Schiller and Brassington, 2011;Bell et al., 2015;Chassignet et al., 2018). Observational data via data assimilation sets the stage for model state estimates and forecasts , with the quality of the estimates and the forecast being strongly dependent upon the ability of an ocean numerical model to faithfully represent the resolved dynamics of the ocean and the parameterized subgrid scale physics. ...
Article
Full-text available
Operational oceanography can be described as the provision of routine oceanographic information needed for decision-making purposes. It is dependent upon sustained research and development through the end-to-end framework of an operational service, from observation collection to delivery mechanisms. The core components of operational oceanographic systems are a multi-platform observation network, a data management system, a data assimilative prediction system, and a dissemination/accessibility system. These are interdependent, necessitating communication and exchange between them, and together provide the mechanism through which a clear picture of ocean conditions, in the past, present, and future, can be seen. Ocean observations play a critical role in all aspects of operational oceanography, not only for assimilation but as part of the research cycle, and for verification and validation of products. Data assimilative prediction systems are advancing at a fast pace, in tandem with improved science and the growth in computing power. To make best use of the system capability these advances would be matched by equivalent advances in operational observation coverage. This synergy between the prediction and observation systems underpins the quality of products available to stakeholders, and justifies the need for sustained ocean observations. In this white paper, the components of an operational oceanographic system are described, highlighting the critical role of ocean observations, and how the operational systems will evolve over the next decade to improve the characterization of ocean conditions, including at finer spatial and temporal scales.
... ECCO requires dynamical and kinematical consistency of its products, in particular, conservation of mass, heat, and salt throughout the estimation period. Avoiding shortcomings identified in atmospheric reanalysis (e.g., Bengtsson et al., 2004, 2007) and making optimal use of the sparse observational coverage calls for the use of smoothing methods from optimal estimation theory (Wunsch and Heimbach, 2007, 2013; Stammer et al., 2016). The ECCO method exploits information contained in observations both forward and backward in time, while avoiding unphysical perturbations of the time-evolving state that is being constrained. ...
... The analysis is set against the larger backdrop of full-depth ocean heat content changes over the last few decades. The latest ECCOv4 estimate produces a global mean heating rate of 0.48 ± 0.16 W m⁻², which includes a 0.095 W m⁻² geothermal flux (Wunsch, 2018). All uncertainties quoted are likely at lower bounds as they do not account for systematic errors. ...
Article
Full-text available
In 1999, the consortium on Estimating the Circulation and Climate of the Ocean (ECCO) set out to synthesize the hydrographic data collected by the World Ocean Circulation Experiment (WOCE) and the satellite sea surface height measurements into a complete and coherent description of the ocean, afforded by an ocean general circulation model. Twenty years later, the versatility of ECCO's estimation framework enables the production of global and regional ocean and sea-ice state estimates, that incorporate not only the initial suite of data and its successors, but nearly all data streams available today. New observations include measurements from Argo floats, marine mammal-based hydrography, satellite retrievals of ocean bottom pressure and sea surface salinity, as well as ice-tethered profiled data in polar regions. The framework also produces improved estimates of uncertain inputs, including initial conditions, surface atmospheric state variables, and mixing parameters. The freely available state estimates and related efforts are property-conserving, allowing closed budget calculations that are a requisite to detect, quantify, and understand the evolution of climate-relevant signals, as mandated by the Coupled Model Intercomparison Project Phase 6 (CMIP6) protocol. The solutions can be reproduced by users through provision of the underlying modeling and assimilation machinery. Regional efforts have spun off that offer increased spatial resolution to better resolve relevant processes. Emerging foci of ECCO are on a global sea level changes, in particular contributions from polar ice sheets, and the increased use of biogeochemical and ecosystem data to constrain global cycles of carbon, nitrogen and oxygen. 
Challenges in the coming decade include provision of uncertainties, informing observing system design, globally increased resolution, and moving toward a coupled Earth system estimation with consistent momentum, heat and freshwater fluxes between the ocean, atmosphere, cryosphere and land.
Article
Full-text available
Recent estimates of the global warming rates suggest that approximately 9% of Earth’s excess heat has been cumulated in the deep and abyssal oceans (below 2000-m depth) during the last two decades. Such estimates assume stationary trends deducted as long-term rates. To reassess the deep ocean warming and potentially shed light on its interannual variability, we formulate the balance between Earth’s energy imbalance (EEI), the steric sea level, and the ocean heat content (OHC), at yearly time scales during the 2003–18 period, as a variational problem. The solution is achieved through variational minimization, merging observational data from top-of-atmosphere EEI, inferred from Clouds and the Earth’s Radiant Energy System (CERES), steric sea level estimates from altimetry minus gravimetry, and upper-ocean heat content estimates from in situ platforms (mostly Argo floats). Global ocean reanalyses provide background-error covariances for the OHC analysis. The analysis indicates a 2000-m–bottom warming of 0.08 ± 0.04 W m ⁻² for the period 2003–18, equal to 13% of the total ocean warming (0.62 ± 0.08 W m ⁻² ), slightly larger than previous estimates but consistent within the error bars. The analysis provides a fully consistent optimized solution also for the steric sea level and EEI. Moreover, the simultaneous use of the different heat budget observing networks is able to decrease the analysis uncertainty with respect to the observational one, for all observation types and especially for the 0–700-m OHC and steric sea level (more than 12% reduction). The sensitivity of the analysis to the choice of the background time series proved insignificant. Significance Statement Several observing networks provide complementary information about the temporal evolution of the global energy budget. 
Here, satellite observations of Earth’s energy imbalance (EEI) and steric sea level and in situ–derived estimates of ocean heat content anomalies are combined in a variational analysis framework, with the goal of assessing the deep ocean warming. The optimized solution accounts for the uncertainty of the different observing networks. Furthermore, it provides fully consistent analyses of global ocean heat content, steric sea level, and EEI, which show smaller uncertainty than the original observed time series. The deep ocean (below 2000-m depth) exhibits a significant warming of 0.08 ± 0.04 W m ⁻² for the period 2003–18, equal to the 13% of the total ocean warming.