105 results for Ensemble doublement résolvant


Relevance:

20.00%

Publisher:

Abstract:

This paper investigates the effect on balance of a number of Schur product-type localization schemes designed primarily to reduce spurious far-field correlations in forecast error statistics. The localization schemes studied comprise a non-adaptive scheme (where the moderation matrix is decomposed in a spectral basis) and two adaptive schemes, namely a simplified version of SENCORP (Smoothed ENsemble COrrelations Raised to a Power) and ECO-RAP (Ensemble COrrelations Raised to A Power). The paper shows, we believe for the first time, how the degree of balance (geostrophic and hydrostatic) implied by the error covariance matrices localized by these schemes can be diagnosed. An effective localization scheme is taken to be one that adequately reduces spurious correlations while minimizing the disruption of balance (the 'correct' degree of balance or imbalance being assumed to be that possessed by the unlocalized ensemble). By varying the free parameters that describe each scheme (e.g. the degree of truncation in the schemes that use the spectral basis, the 'order' of each scheme, and the degree of ensemble smoothing), it is found that a particular configuration of the ECO-RAP scheme is best suited to the convective-scale system studied. According to our diagnostics this ECO-RAP configuration still weakens geostrophic and hydrostatic balance, but it does so less than the other schemes.
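
As an illustration of the Schur product-type localization referred to above, the sketch below applies an element-wise (Schur) product between a raw ensemble covariance and a moderation matrix. This is a generic minimal example in Python/NumPy, not the SENCORP or ECO-RAP schemes themselves, and the function and argument names are illustrative.

```python
import numpy as np

def schur_localize(perturbations, moderation_matrix):
    """Schur (element-wise) product localization of an ensemble covariance.

    perturbations     : (n_state, n_members) deviations from the ensemble mean.
    moderation_matrix : (n_state, n_state) localization weights in [0, 1],
                        e.g. built from a compactly supported correlation
                        function, with ones on the diagonal so that the
                        ensemble variances are preserved.
    """
    n_members = perturbations.shape[1]
    # Raw (sample) forecast error covariance estimated from the ensemble.
    raw_cov = perturbations @ perturbations.T / (n_members - 1)
    # The element-wise product damps spurious far-field correlations.
    return raw_cov * moderation_matrix
```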

Relevance:

20.00%

Publisher:

Abstract:

This study has investigated serial (temporal) clustering of extra-tropical cyclones simulated by 17 climate models that participated in CMIP5. Clustering was estimated by calculating the dispersion (ratio of variance to mean) of 30 December-February counts of Atlantic storm tracks passing near each grid point. Results from single historical simulations for 1975-2005 were compared to those from the ERA40 reanalysis for 1958-2001 and to single future model projections for 2069-2099 under the RCP4.5 climate change scenario. Models were generally able to capture the broad features reported previously in reanalyses: underdispersion/regularity (i.e. variance less than mean) in the western core of the Atlantic storm track surrounded by overdispersion/clustering (i.e. variance greater than mean) to the north and south and over western Europe. Regression of counts onto North Atlantic Oscillation (NAO) indices revealed that much of the overdispersion in the historical reanalyses and model simulations can be accounted for by NAO variability. Future changes in dispersion were generally found to be small and not consistent across models. The overdispersion statistic, for any 30-year sample, is prone to large sampling uncertainty that obscures the climate change signal. For example, the projected increase in dispersion for storm counts near London in the CNRM-CM5 model is 0.1, compared to a standard deviation of 0.25. Projected changes in the mean and variance of the NAO are insufficient to create changes in overdispersion that are discernible above natural sampling variations.
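
The dispersion statistic described above is simply the ratio of the variance to the mean of the seasonal counts at each grid point. A minimal NumPy sketch follows; the choice of the unbiased variance (ddof=1) is an assumption.

```python
import numpy as np

def dispersion_index(counts):
    """Variance-to-mean ratio of seasonal cyclone counts.

    counts : (n_seasons, n_points) array of December-February storm-track
             counts at each grid point (30 seasons in the study above).
    Values below 1 indicate underdispersion (regularity); values above 1
    indicate overdispersion (clustering) relative to a Poisson process.
    """
    counts = np.asarray(counts, dtype=float)
    return counts.var(axis=0, ddof=1) / counts.mean(axis=0)
```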

Relevance:

20.00%

Publisher:

Abstract:

A statistical-dynamical downscaling method is used to estimate future changes of the wind energy output (Eout) of a benchmark wind turbine across Europe at the regional scale. With this aim, 22 global climate models (GCMs) of the Coupled Model Intercomparison Project Phase 5 (CMIP5) ensemble are considered. The downscaling method uses circulation weather types and regional climate modelling with the COSMO-CLM model. Future projections are computed for two time periods (2021–2060 and 2061–2100) following two scenarios (RCP4.5 and RCP8.5). The CMIP5 ensemble mean response reveals a more likely than not increase of mean annual Eout over Northern and Central Europe and a likely decrease over Southern Europe. There is some uncertainty with respect to the magnitude and the sign of the changes. Higher robustness in the future changes is observed for specific seasons. Except for the Mediterranean area, an ensemble mean increase of Eout is simulated for winter and a decrease for summer, resulting in a strong increase of the intra-annual variability over most of Europe. The latter is particularly probable during the second half of the 21st century under the RCP8.5 scenario. In general, the signals are stronger for 2061–2100 than for 2021–2060 and for RCP8.5 than for RCP4.5. Regarding changes in the inter-annual variability of Eout for Central Europe, the future projections vary strongly between individual models and also between future periods and scenarios within single models. This study has shown, for an ensemble of 22 CMIP5 models, that changes in the wind energy potential over Europe may take place in future decades. However, given the uncertainties identified in this research, further investigations with multi-model ensembles are needed to provide a better quantification and understanding of the future changes.

Relevance:

20.00%

Publisher:

Abstract:

The regional climate modelling system PRECIS was run at 25 km horizontal resolution for 150 years (1949-2099) using global driving data from a five-member perturbed-physics ensemble (based on the coupled global climate model HadCM3). Output from these simulations was used to investigate projected changes in tropical cyclones (TCs) over Vietnam and the South China Sea due to global warming (under SRES scenario A1B). Thirty-year climatological mean periods were used to examine projected changes in future (2069-2098) TCs relative to a 1961-1990 baseline. Present-day results were compared qualitatively with IBTrACS observations and found to be reasonably realistic. Future projections show a 20-44 % decrease in TC frequency, although the spatial patterns of change differ between the ensemble members, and a 27-53 % increase in the amount of TC-associated precipitation. No statistically significant changes in TC intensity were found; however, the occurrence of more intense TCs (defined as those with a maximum 10 m wind speed above 35 m/s) was found to increase by 3-9 %. The projected increases in TC-associated precipitation are likely caused by increased evaporation and availability of atmospheric water vapour, owing to increased sea surface and atmospheric temperatures. The mechanisms behind the projected changes in TC frequency are difficult to link explicitly; the changes are most likely due to the combination of increased static stability, increased vertical wind shear and decreased upward motion, which together suggest a weakening of the tropical overturning circulation.

Relevance:

20.00%

Publisher:

Abstract:

This study has explored the prediction errors of tropical cyclones (TCs) in the European Centre for Medium-Range Weather Forecasts (ECMWF) Ensemble Prediction System (EPS) for the Northern Hemisphere summer period in five recent years. Results for the EPS are contrasted with those for the higher-resolution deterministic forecasts. Various metrics of location and intensity error are considered and contrasted for verification against IBTrACS and against the numerical weather prediction (NWP) analysis (NWPa). Motivated by the aim of exploring extended TC life cycles, location and intensity measures based on lower-tropospheric vorticity are introduced and contrasted with traditional verification metrics. Results show that location errors are almost identical whether verified against IBTrACS or the NWPa. However, intensity, in the form of mean sea level pressure (MSLP) minima and 10-m wind speed maxima, is significantly underpredicted relative to IBTrACS. Using the NWPa for verification results in much better consistency between the different intensity error metrics and indicates that lower-tropospheric vorticity provides a good indication of vortex strength, with errors showing relationships similar to those based on MSLP and 10-m wind speeds for the different forecast types. The interannual variation in forecast errors is discussed in relation to changes in the forecast and NWPa systems, and variations in forecast errors between different ocean basins are discussed in terms of the propagation characteristics of the TCs.
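
The abstract does not spell out how the location errors are computed; a common choice is the great-circle distance between forecast and verifying positions, sketched below. The haversine form here is an illustrative assumption, not necessarily the metric used in the study.

```python
import numpy as np

EARTH_RADIUS_KM = 6371.0

def track_location_error(lat_fc, lon_fc, lat_obs, lon_obs):
    """Great-circle distance (km) between forecast and observed tropical
    cyclone positions, given in degrees (scalars or arrays)."""
    lat1, lon1, lat2, lon2 = map(np.radians, (lat_fc, lon_fc, lat_obs, lon_obs))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    a = np.sin(dlat / 2) ** 2 + np.cos(lat1) * np.cos(lat2) * np.sin(dlon / 2) ** 2
    return 2.0 * EARTH_RADIUS_KM * np.arcsin(np.sqrt(a))
```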

Relevance:

20.00%

Publisher:

Abstract:

A smoother introduced earlier by van Leeuwen and Evensen is applied to a problem in which real observations are used in an area with strongly nonlinear dynamics. The derivation is new, but it resembles an earlier derivation by van Leeuwen and Evensen. Again a Bayesian view is taken in which the prior probability density of the model and the probability density of the observations are combined to form a posterior density. The mean and the covariance of this density give the variance-minimizing model evolution and its errors. The assumption is made that the prior probability density is a Gaussian, leading to a linear update equation. Critical evaluation shows when the assumption is justified. This also sheds light on why Kalman filters, in which the same approximation is made, work for nonlinear models. By reference to the derivation, the impact of model and observational biases on the equations is discussed, and it is shown that Bayes's formulation can still be used. A practical advantage of the ensemble smoother is that no adjoint equations have to be integrated and that error estimates are easily obtained. The present application shows that for process studies a smoother will give superior results compared to a filter, not only owing to the smooth transitions at observation points, but also because the origin of features can be followed back in time. Also its preference over a strong-constraint method is highlighted. Furthermore, it is argued that the proposed smoother is more efficient than gradient descent methods or than the representer method when error estimates are taken into account.
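
For reference, the linear update that follows from the Gaussian-prior assumption is the standard variance-minimizing form; in generic notation (the symbols below are conventional, not necessarily those of the paper),

$$\psi^a = \psi^f + \mathbf{P}^f \mathbf{H}^{\mathrm T}\left(\mathbf{H}\mathbf{P}^f\mathbf{H}^{\mathrm T} + \mathbf{R}\right)^{-1}\left(\mathbf{d} - \mathbf{H}\psi^f\right),$$

where $\psi^f$ is the prior (model) estimate, $\mathbf{P}^f$ its error covariance, $\mathbf{H}$ the measurement operator, $\mathbf{R}$ the observation error covariance and $\mathbf{d}$ the observations; the posterior mean $\psi^a$ and its covariance give the variance-minimizing evolution and its errors.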

Relevance:

20.00%

Publisher:

Abstract:

It is formally proved that the general smoother for nonlinear dynamics can be formulated as a sequential method, that is, observations can be assimilated sequentially during a forward integration. The general filter can be derived from the smoother and it is shown that the general smoother and filter solutions at the final time become identical, as is expected from linear theory. Then, a new smoother algorithm based on ensemble statistics is presented and examined in an example with the Lorenz equations. The new smoother can be computed as a sequential algorithm using only forward-in-time model integrations. It bears a strong resemblance with the ensemble Kalman filter. The difference is that every time a new dataset is available during the forward integration, an analysis is computed for all previous times up to this time. Thus, the first guess for the smoother is the ensemble Kalman filter solution, and the smoother estimate provides an improvement of this, as one would expect a smoother to do. The method is demonstrated in this paper in an intercomparison with the ensemble Kalman filter and the ensemble smoother introduced by van Leeuwen and Evensen, and it is shown to be superior in an application with the Lorenz equations. Finally, a discussion is given regarding the properties of the analysis schemes when strongly non-Gaussian distributions are used. It is shown that in these cases more sophisticated analysis schemes based on Bayesian statistics must be used.
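
A minimal sketch of the kind of sequential smoother update described above: when a new observation set becomes available, the ensemble at an earlier time is corrected through its cross-covariance with the predicted observations at the current time. The function name, array shapes and the use of perturbed observations follow the stochastic ensemble Kalman filter convention and are illustrative, not the paper's exact algorithm.

```python
import numpy as np

def smoother_update(past_states, predicted_obs, obs, obs_cov):
    """Update an earlier-time ensemble with observations valid now.

    past_states   : (n_state, n_members) ensemble at an earlier time.
    predicted_obs : (n_obs, n_members) observation-space ensemble at the
                    current time (observation operator already applied).
    obs           : (n_obs,) observation vector at the current time.
    obs_cov       : (n_obs, n_obs) observation error covariance.
    """
    obs = np.asarray(obs, dtype=float)
    n = past_states.shape[1]
    A = past_states - past_states.mean(axis=1, keepdims=True)
    S = predicted_obs - predicted_obs.mean(axis=1, keepdims=True)
    # One perturbed copy of the observations per ensemble member.
    D = obs[:, None] + np.random.multivariate_normal(
        np.zeros(len(obs)), obs_cov, size=n).T
    # The cross-covariance carries the new information backwards in time.
    gain = (A @ S.T / (n - 1)) @ np.linalg.inv(S @ S.T / (n - 1) + obs_cov)
    return past_states + gain @ (D - predicted_obs)
```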

Relevance:

20.00%

Publisher:

Abstract:

This paper discusses an important issue related to the implementation and interpretation of the analysis scheme in the ensemble Kalman filter. It is shown that the observations must be treated as random variables at the analysis steps. That is, one should add random perturbations with the correct statistics to the observations and generate an ensemble of observations that then is used in updating the ensemble of model states. Traditionally, this has not been done in previous applications of the ensemble Kalman filter and, as will be shown, this has resulted in an updated ensemble with a variance that is too low. This simple modification of the analysis scheme results in a completely consistent approach if the covariance of the ensemble of model states is interpreted as the prediction error covariance, and there are no further requirements on the ensemble Kalman filter method, except for the use of an ensemble of sufficient size. Thus, there is a unique correspondence between the error statistics from the ensemble Kalman filter and the standard Kalman filter approach.
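
A minimal NumPy sketch of the analysis step with perturbed observations that the abstract advocates, assuming a linear observation operator H; the function and variable names are illustrative.

```python
import numpy as np

def enkf_analysis(prior_ens, obs, obs_cov, H):
    """Stochastic EnKF analysis step with perturbed observations.

    prior_ens : (n_state, n_members) forecast ensemble.
    obs       : (n_obs,) observation vector.
    obs_cov   : (n_obs, n_obs) observation error covariance R.
    H         : (n_obs, n_state) linear observation operator.
    """
    obs = np.asarray(obs, dtype=float)
    n_members = prior_ens.shape[1]
    # Treat the observations as random variables: one perturbed copy per member.
    obs_ens = obs[:, None] + np.random.multivariate_normal(
        np.zeros(len(obs)), obs_cov, size=n_members).T
    A = prior_ens - prior_ens.mean(axis=1, keepdims=True)
    Pf = A @ A.T / (n_members - 1)                         # prediction error covariance
    K = Pf @ H.T @ np.linalg.inv(H @ Pf @ H.T + obs_cov)   # Kalman gain
    # Omitting the perturbations yields an updated ensemble whose
    # variance is systematically too low, as the abstract notes.
    return prior_ens + K @ (obs_ens - H @ prior_ens)
```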

Relevance:

20.00%

Publisher:

Abstract:

The ring-shedding process in the Agulhas Current is studied using the ensemble Kalman filter to assimilate Geosat altimeter data into a two-layer quasigeostrophic ocean model. The properties of the ensemble Kalman filter are further explored, with a focus on the analysis scheme and the use of gridded data. The Geosat data consist of 10 fields of gridded sea-surface height anomalies, separated 10 days apart, that are added to a climatic mean field. This corresponds to a huge number of data values, and a data reduction scheme must be applied to increase the efficiency of the analysis procedure. Further, it is illustrated how one can resolve the rank problem that occurs when the dataset is too large or the ensemble too small.
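
The abstract does not detail the data reduction scheme; one widely used remedy for the rank problem it mentions is to invert the observation-space covariance through a truncated singular value decomposition, sketched below. The retained-variance threshold and the function name are illustrative assumptions.

```python
import numpy as np

def truncated_pseudo_inverse(matrix, variance_fraction=0.99):
    """Pseudo-inverse retaining only the leading singular values, a common
    way to handle rank deficiency when the number of (gridded) observations
    exceeds the effective rank set by the ensemble size."""
    U, s, Vt = np.linalg.svd(matrix)
    keep = int(np.searchsorted(np.cumsum(s) / s.sum(), variance_fraction)) + 1
    return (Vt[:keep].T / s[:keep]) @ U[:, :keep].T
```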

Relevance:

20.00%

Publisher:

Abstract:

Quantifying the effect of seawater density changes on sea level variability is of crucial importance for climate change studies, as the cumulative rise in sea level can be regarded as both an important climate change indicator and a possible danger for human activities in coastal areas. In this work, as part of the Ocean Reanalysis Intercomparison Project, global and regional steric sea level changes are estimated and compared across an ensemble of 16 ocean reanalyses and 4 objective analyses. These estimates are first compared with a satellite-derived (altimetry minus gravimetry) dataset over a short period (2003–2010). The ensemble mean exhibits a high and significant correlation at both global and regional scales, and the ensemble of ocean reanalyses outperforms that of the objective analyses, in particular in the Southern Ocean. The reanalysis ensemble mean thus represents a valuable tool for further analyses, although large uncertainties remain for the inter-annual trends. Over the extended intercomparison period spanning the altimetry era (1993–2010), the ensembles of reanalyses and objective analyses are in good agreement, and both detect a global steric sea level trend, of 1.0 and 1.1 ± 0.05 mm/year respectively. However, the spread among the products in the trend of the halosteric component exceeds the mean trend itself, calling into question the reliability of that estimate; this is related to the scarcity of salinity observations before the Argo era. Furthermore, the contribution of the deep ocean layers to steric sea level variability is non-negligible (22 and 12 % for the layers below 700 and 1500 m depth, respectively), although the small deep-ocean trends are not significant with respect to the product spread.
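
For context, steric sea level is conventionally diagnosed by vertically integrating density anomalies; a standard form (the notation is generic, not necessarily that used by the products compared above) is

$$\eta_{\mathrm{steric}}(t) = -\frac{1}{\rho_0}\int_{-H}^{0}\big(\rho(z,t) - \bar\rho(z)\big)\,dz,$$

where $\rho_0$ is a reference density and $\bar\rho(z)$ a time-mean density profile; the thermosteric and halosteric components follow by letting only temperature or only salinity vary in the equation of state.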

Relevance:

20.00%

Publisher:

Abstract:

Accurate knowledge of the location and magnitude of ocean heat content (OHC) variability and change is essential for understanding the processes that govern decadal variations in surface temperature, quantifying changes in the planetary energy budget, and developing constraints on the transient climate response to external forcings. We present an overview of the temporal and spatial characteristics of OHC variability and change as represented by an ensemble of dynamical and statistical ocean reanalyses (ORAs). Spatial maps of the 0–300 m layer show large regions of the Pacific and Indian Oceans where the interannual variability of the ensemble mean exceeds the ensemble spread, indicating that OHC variations are well constrained by the available observations over the period 1993–2009. At deeper levels, the ORAs are less well constrained by observations, with the largest differences across the ensemble mostly associated with areas of high eddy kinetic energy, such as the Southern Ocean and boundary current regions. Spatial patterns of OHC change for the period 1997–2009 show good agreement in the upper 300 m and are characterized by a strong dipole pattern in the Pacific Ocean. There is less agreement in the patterns of change at deeper levels, potentially linked to differences in the representation of ocean dynamics, such as water mass formation processes. However, the Atlantic and Southern Oceans are regions in which many ORAs show widespread warming below 700 m over the period 1997–2009. Annual time series of global and hemispheric OHC change for 0–700 m show the largest spread for the data-sparse Southern Hemisphere, and a number of ORAs seem to be subject to a large initialization 'shock' over the first few years. In agreement with previous studies, a number of ORAs exhibit enhanced ocean heat uptake below 300 and 700 m during the mid-1990s or early 2000s. The ORA ensemble mean (±1 standard deviation) of rolling 5-year trends in full-depth OHC shows a relatively steady heat uptake of approximately 0.9 ± 0.8 W m⁻² (expressed relative to Earth's surface area) between 1995 and 2002, which reduces to about 0.2 ± 0.6 W m⁻² between 2004 and 2006, in qualitative agreement with recent analyses of Earth's energy imbalance. There is a marked reduction in the ensemble spread of OHC trends below 300 m as the Argo profiling float observations become available in the early 2000s. In general, we suggest that ORAs should be treated with caution when employed to understand past ocean warming trends, especially for the deeper ocean where there is little in the way of observational constraints. The current work emphasizes the need to better observe the deep ocean, both to provide observational constraints for future ocean state estimation efforts and to develop improved models and data assimilation methods.
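
For reference, ocean heat content over a layer of depth $h$ is typically computed as

$$\mathrm{OHC}_{0\,\text{to}\,h}(t) = \rho_0\, c_p \int_{-h}^{0} T(z,t)\, dz,$$

with $\rho_0$ a reference density and $c_p$ the specific heat capacity of seawater; dividing the trend of the globally integrated OHC by Earth's surface area converts it to the W m⁻² heat-uptake values quoted above. The exact constants used by each ORA are not specified here.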

Relevance:

20.00%

Publisher:

Abstract:

A set of four eddy-permitting global ocean reanalyses produced in the framework of the MyOcean project has been compared over the altimetry period 1993–2011. The main differences among the reanalyses used here come from the data assimilation scheme implemented to control the ocean state by inserting reprocessed observations of sea surface temperature (SST), in situ temperature and salinity profiles, sea level anomaly and sea-ice concentration. A first objective of this work is to assess the interannual variability and trends of a series of parameters usually considered in the community as essential ocean variables: SST, sea surface salinity, temperature and salinity averaged over meaningful layers of the water column, sea level, transports across pre-defined sections, and sea-ice parameters. The eddy-permitting nature of the global reanalyses also allows eddy kinetic energy to be estimated. The results show that in general there is good consistency between the different reanalyses. An intercomparison against experiments without data assimilation was carried out during the MyOcean project, and we conclude that data assimilation is crucial for correctly simulating some quantities, such as regional trends of sea level and eddy kinetic energy. A second objective is to show that the ensemble mean of the reanalyses can be evaluated as one single system with regard to its reliability in reproducing the climate signals, where both variability and uncertainty are assessed through the ensemble spread and the signal-to-noise ratio. The main advantage of having access to several reanalyses that differ in the way data assimilation is performed is that it becomes possible to assess part of the total uncertainty. Given that we use very similar ocean models and atmospheric forcing, we can conclude that the spread of the ensemble of reanalyses is mainly representative of our ability to gauge uncertainty in the assimilation methods. This uncertainty varies considerably from one ocean parameter to another, especially in global indices. However, despite several caveats in the design of the multi-system ensemble, the main conclusion from this study is that an eddy-permitting multi-system ensemble approach has become mature, and our results provide a first step towards a systematic comparison of eddy-permitting global ocean reanalyses aimed at providing robust conclusions on the recent evolution of the oceanic state.
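
A minimal sketch of the ensemble-spread and signal-to-noise diagnostics mentioned above, for a set of co-located reanalysis time series; the particular definition of the signal-to-noise ratio used here (temporal standard deviation of the ensemble mean divided by the time-mean spread) is an assumption for illustration.

```python
import numpy as np

def ensemble_diagnostics(members):
    """Ensemble mean, inter-product spread and a simple signal-to-noise ratio.

    members : (n_products, n_times) array, one row per reanalysis.
    """
    ens_mean = members.mean(axis=0)
    spread = members.std(axis=0, ddof=1)   # inter-product spread at each time
    signal = ens_mean.std(ddof=1)          # temporal variability of the ensemble mean
    noise = spread.mean()                  # typical spread across the period
    return ens_mean, spread, signal / noise
```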

Relevance:

20.00%

Publisher:

Abstract:

The interannual-to-decadal variability of wintertime mixed layer depths (MLDs) over the North Pacific is investigated through an empirical orthogonal function (EOF) analysis of an ensemble of global ocean reanalyses. The first EOF mode represents interannual MLD anomalies centered in the eastern part of the central mode water formation region, in phase opposition with those in the eastern subtropics and the central Alaskan Gyre. This first mode is highly correlated with the Pacific decadal oscillation index on both interannual and decadal time scales. The second EOF mode represents the MLD variability in the subtropical mode water (STMW) formation region and is well correlated with the wintertime West Pacific (WP) index at a time lag of 3 years, suggesting the importance of the oceanic dynamical response to changes in the surface wind field associated with meridional shifts of the Aleutian Low. These MLD variabilities are broadly consistent with previous observational and modeling findings. Moreover, the reanalysis ensemble provides uncertainty estimates: the interannual MLD anomalies in the first and second EOF modes are consistently represented by the individual reanalyses, and the amplitudes of the variability generally exceed the ensemble spread of the reanalyses. In addition, the resulting MLD variability indices, spanning the 1948–2012 period, should be helpful for characterizing North Pacific climate variability. In particular, a 6-year oscillation involving the WP teleconnection pattern in the atmosphere and the oceanic MLD variability in the STMW formation region is detected for the first time.
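
The EOF analysis described above can be reproduced in outline with a singular value decomposition of the anomaly matrix; the sketch below is generic, and preprocessing choices the study may apply (area weighting, detrending) are omitted.

```python
import numpy as np

def eof_analysis(anomalies, n_modes=2):
    """Leading EOFs and principal components of an anomaly field.

    anomalies : (n_times, n_points) array of wintertime MLD anomalies with
                the time mean removed at each point.
    """
    U, s, Vt = np.linalg.svd(anomalies, full_matrices=False)
    eofs = Vt[:n_modes]                            # spatial patterns
    pcs = U[:, :n_modes] * s[:n_modes]             # associated time series
    explained = s[:n_modes] ** 2 / np.sum(s ** 2)  # fraction of variance per mode
    return eofs, pcs, explained
```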

Relevance:

20.00%

Publisher:

Abstract:

The Plant–Craig stochastic convection parameterization (version 2.0) is implemented in the Met Office Regional Ensemble Prediction System (MOGREPS-R) and is assessed against the standard convection scheme, whose only stochastic element comes from random parameter variation. A set of 34 ensemble forecasts, each with 24 members, is considered over the month of July 2009, and deterministic and probabilistic measures of the precipitation forecasts are assessed. The Plant–Craig parameterization is found to improve the probabilistic forecast measures, particularly at lower precipitation thresholds. The impact on deterministic forecasts at the grid scale is neutral, although the Plant–Craig scheme does deliver improvements when forecasts are made over larger areas. The improvements are greater in conditions of relatively weak synoptic forcing, for which convective precipitation is likely to be less predictable.
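
The abstract does not list the probabilistic measures used; the Brier score for exceedance of a precipitation threshold is one common example and is sketched below purely for illustration (it is not claimed to be the study's metric).

```python
import numpy as np

def brier_score(ensemble_precip, observed_precip, threshold):
    """Brier score for exceedance of a precipitation threshold.

    ensemble_precip : (n_members, n_cases) forecast precipitation.
    observed_precip : (n_cases,) verifying precipitation.
    """
    forecast_prob = (ensemble_precip > threshold).mean(axis=0)
    outcome = (observed_precip > threshold).astype(float)
    return float(np.mean((forecast_prob - outcome) ** 2))
```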