135 results for Ensemble dominant connexe
Abstract:
Providing probabilistic forecasts using Ensemble Prediction Systems has become increasingly popular in both the meteorological and hydrological communities. Compared to conventional deterministic forecasts, probabilistic forecasts may be more reliable at lead times from a few hours to a number of days ahead, and hence are regarded as better tools for taking uncertainties into consideration and hedging against weather risks. It is essential to evaluate the performance of raw ensemble forecasts and their potential value in forecasting extreme hydro-meteorological events. This study evaluates ECMWF's medium-range ensemble forecasts of precipitation over the period 2008/01/01-2012/09/30 for a selected mid-latitude large-scale river basin, the Huai river basin (ca. 270,000 km²) in central-east China. The evaluation unit is the sub-basin, in order to assess forecast performance in a hydrologically relevant way. The study finds that forecast performance varies with sub-basin properties, between flooding and non-flooding seasons, and with the forecast properties of aggregation time step and lead time. Although the study does not evaluate any hydrological application of the ensemble precipitation forecasts, its results have direct implications for hydrological forecasting should these ensemble precipitation forecasts be employed in hydrology.
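Sub-basin verification of ensemble precipitation forecasts of the kind described above rests on standard probabilistic scores. A minimal sketch of one such score, the Brier score for a threshold-exceedance event (the function name, ensemble values, and 10 mm threshold are illustrative assumptions, not taken from the study):

```python
import numpy as np

def brier_score(ens_forecasts, observations, threshold):
    """Brier score for the event 'precipitation exceeds threshold'.

    ens_forecasts: (n_cases, n_members) ensemble precipitation values
    observations:  (n_cases,) verifying observations
    """
    # Forecast probability = fraction of members exceeding the threshold
    prob = (ens_forecasts > threshold).mean(axis=1)
    # Binary observed outcome for the same event
    event = (observations > threshold).astype(float)
    # Mean squared difference between probability and outcome (0 = perfect)
    return float(np.mean((prob - event) ** 2))

# Hypothetical example: 4 cases, 5-member ensemble, 10 mm event threshold
ens = np.array([[12, 8, 15, 11, 9],
                [2, 1, 3, 0, 2],
                [20, 18, 25, 22, 19],
                [5, 12, 9, 11, 8]], dtype=float)
obs = np.array([14.0, 1.0, 21.0, 6.0])
print(brier_score(ens, obs, threshold=10.0))
```

Aggregating such scores per sub-basin, per aggregation time step, and per lead time yields the kind of hydrologically oriented comparison the abstract describes.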
Abstract:
In this paper ensembles of forecasts (of up to six hours) are studied from a convection-permitting model with a representation of model error due to unresolved processes. The ensemble prediction system (EPS) used is an experimental convection-permitting version of the UK Met Office’s 24-member Global and Regional Ensemble Prediction System (MOGREPS). The method of representing model error variability, which perturbs parameters within the model’s parameterisation schemes, has been modified and we investigate the impact of applying this scheme in different ways. These are: a control ensemble where all ensemble members have the same parameter values; an ensemble where the parameters are different between members, but fixed in time; and ensembles where the parameters are updated randomly every 30 or 60 min. The choice of parameters and their ranges of variability have been determined from expert opinion and parameter sensitivity tests. A case of frontal rain over the southern UK has been chosen, which has a multi-banded rainfall structure. The consequences of including model error variability in the case studied are mixed and are summarised as follows. The multiple banding, evident in the radar, is not captured for any single member. However, the single band is positioned in some members where a secondary band is present in the radar. This is found for all ensembles studied. Adding model error variability with fixed parameters in time does increase the ensemble spread for near-surface variables like wind and temperature, but can actually decrease the spread of the rainfall. Perturbing the parameters periodically throughout the forecast does not further increase the spread and exhibits “jumpiness” in the spread at times when the parameters are perturbed. Adding model error variability gives an improvement in forecast skill after the first 2–3 h of the forecast for near-surface temperature and relative humidity.
For precipitation skill scores, adding model error variability has the effect of improving the skill in the first 1–2 h of the forecast, but then of reducing the skill after that. Complementary experiments were performed where the only difference between members was the set of parameter values (i.e. no initial condition variability). The resulting spread was found to be significantly less than the spread from initial condition variability alone.
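The three perturbation strategies compared above (fixed parameters per member versus periodic re-drawing every 30 or 60 min) can be sketched as a parameter schedule. The parameter names and ranges below are purely illustrative assumptions, not the MOGREPS values:

```python
import numpy as np

# Hypothetical parameter ranges in the style of expert-specified bounds
PARAM_RANGES = {
    "entrainment_rate": (0.5, 2.0),
    "mixing_length":    (50.0, 500.0),
}

def draw_parameters(rng):
    """Draw one parameter set uniformly within each range."""
    return {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAM_RANGES.items()}

def perturbed_parameter_schedule(n_members, forecast_minutes, update_every, seed=0):
    """Return, per member, the parameter set valid on each update interval.

    update_every=None mimics the fixed-in-time ensemble: one draw per member.
    Otherwise parameters are re-drawn every `update_every` minutes, as in
    the 30/60 min experiments described above.
    """
    rng = np.random.default_rng(seed)
    schedules = []
    for _ in range(n_members):
        if update_every is None:
            schedules.append([draw_parameters(rng)])
        else:
            n_updates = forecast_minutes // update_every
            schedules.append([draw_parameters(rng) for _ in range(n_updates)])
    return schedules

# 24 members, 6 h forecast, parameters refreshed every 60 min
sched = perturbed_parameter_schedule(24, 360, 60)
```

The "jumpiness" in spread reported above occurs precisely at the boundaries between these update intervals, when all members switch parameter sets simultaneously.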
Abstract:
Useful probabilistic climate forecasts on decadal timescales should be reliable (i.e. forecast probabilities match the observed relative frequencies), but this is seldom examined. This paper assesses a necessary condition for reliability, that the ratio of ensemble spread to forecast error be close to one, for seasonal to decadal sea surface temperature retrospective forecasts from the Met Office Decadal Prediction System (DePreSys). Factors which may affect reliability are diagnosed by comparing this spread-error ratio for an initial condition ensemble and two perturbed physics ensembles for initialized and uninitialized predictions. At lead times less than 2 years, the initialized ensembles tend to be under-dispersed, and hence produce overconfident and unreliable forecasts. For longer lead times, all three ensembles are predominantly over-dispersed. Such over-dispersion is primarily related to excessive inter-annual variability in the climate model. These findings highlight the need to carefully evaluate simulated variability in seasonal and decadal prediction systems.
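The spread-error ratio used as the reliability condition above can be computed directly from hindcasts. A minimal sketch (function name and the synthetic check are illustrative assumptions):

```python
import numpy as np

def spread_error_ratio(ensemble, truth):
    """Ratio of average ensemble spread to RMS error of the ensemble mean.

    ensemble: (n_forecasts, n_members) forecast values
    truth:    (n_forecasts,) verifying values
    A ratio below one indicates under-dispersion (overconfidence);
    above one, over-dispersion.
    """
    ens_mean = ensemble.mean(axis=1)
    # Average intra-ensemble variance, then square root -> typical spread
    spread = np.sqrt(ensemble.var(axis=1, ddof=1).mean())
    # Root-mean-square error of the ensemble mean
    rmse = np.sqrt(np.mean((ens_mean - truth) ** 2))
    return spread / rmse

# Synthetic check: members and truth drawn from the same distribution
# should give a ratio near one (a statistically consistent ensemble)
rng = np.random.default_rng(1)
ens = rng.normal(size=(5000, 20))
obs = rng.normal(size=5000)
ratio = spread_error_ratio(ens, obs)
```

Computing this ratio as a function of lead time, separately for initialized and uninitialized ensembles, reproduces the kind of diagnosis described in the abstract.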
Abstract:
Modelling of disorder in organic crystals is highly desirable since it would allow thermodynamic stabilities and other disorder-sensitive properties to be estimated for such systems. Two disordered organic molecular systems are modeled using a symmetry-adapted ensemble approach, in which the disordered system is treated as an ensemble of the configurations of a supercell with respect to substitution of one disorder component for another. Computation time is kept manageable by performing calculations only on the symmetrically inequivalent configurations. Calculations are presented on a substitutionally disordered system, the dichloro/dibromobenzene solid solution, and on an orientationally disordered system, eniluracil, and the resultant free energies, disorder patterns, and system properties are discussed. The results are found to be in agreement with experiment following manual removal of physically implausible configurations from ensemble averages, highlighting the dangers of a completely automated approach to organic crystal thermodynamics which ignores the barriers to equilibration once the crystal has been formed.
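The symmetry-adapted averaging described above amounts to a Boltzmann-weighted sum over the symmetrically inequivalent configurations, each weighted by its multiplicity. A minimal sketch with illustrative inputs (the "manual removal" of implausible configurations corresponds to simply excluding them from these lists):

```python
import numpy as np

def ensemble_average(energies, multiplicities, properties, kT):
    """Boltzmann-weighted average over symmetrically inequivalent configurations.

    energies:       per-configuration energies (same units as kT)
    multiplicities: number of equivalent supercell configurations each
                    inequivalent configuration represents
    properties:     per-configuration value of the property of interest
    kT:             thermal energy
    """
    energies = np.asarray(energies, dtype=float)
    g = np.asarray(multiplicities, dtype=float)
    # Shift energies before exponentiating for numerical stability
    w = g * np.exp(-(energies - energies.min()) / kT)
    w /= w.sum()
    return float(np.dot(w, np.asarray(properties, dtype=float)))

# Two inequivalent configurations with equal energy: the average is
# weighted purely by multiplicity -> (1*1.0 + 3*2.0) / 4 = 1.75
avg = ensemble_average([0.0, 0.0], [1, 3], [1.0, 2.0], kT=1.0)
```

Restricting the sum to inequivalent configurations, while restoring the full count through the multiplicities, is what keeps the computation time manageable.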
Abstract:
The recent identification of multiple dominant mutations in the gene encoding β-catenin in both humans and mice has enabled exploration of the molecular and cellular basis of β-catenin function in cognitive impairment. In humans, β-catenin mutations that cause a spectrum of neurodevelopmental disorders have been identified. We identified de novo β-catenin mutations in patients with intellectual disability, carefully characterized their phenotypes, and were able to define a recognizable intellectual disability syndrome. In parallel, characterization of a chemically mutagenized mouse line that displays features similar to those of human patients with β-catenin mutations enabled us to investigate the consequences of β-catenin dysfunction through development and into adulthood. The mouse mutant, designated batface (Bfc), carries a Thr653Lys substitution in the C-terminal armadillo repeat of β-catenin and displays a reduced affinity for membrane-associated cadherins. In association with this decreased cadherin interaction, we found that the mutation results in decreased intrahemispheric connections, with deficits in dendritic branching, long-term potentiation, and cognitive function. Our study provides in vivo evidence that dominant mutations in β-catenin underlie losses in its adhesion-related functions, which lead to severe consequences, including intellectual disability, childhood hypotonia, progressive spasticity of lower limbs, and abnormal craniofacial features in adults.
Abstract:
Activating transcription factor 3 (Atf3) is rapidly and transiently upregulated in numerous systems, and is associated with various disease states. Atf3 is required for negative feedback regulation of other genes, but is itself subject to negative feedback regulation possibly by autorepression. In cardiomyocytes, Atf3 and Egr1 mRNAs are upregulated via ERK1/2 signalling and Atf3 suppresses Egr1 expression. We previously developed a mathematical model for the Atf3-Egr1 system. Here, we adjusted and extended the model to explore mechanisms of Atf3 feedback regulation. Introduction of an autorepressive loop for Atf3 tuned down its expression and inhibition of Egr1 was lost, demonstrating that negative feedback regulation of Atf3 by Atf3 itself is implausible in this context. Experimentally, signals downstream from ERK1/2 suppress Atf3 expression. Mathematical modelling indicated that this cannot occur by phosphorylation of pre-existing inhibitory transcriptional regulators because the time delay is too short. De novo synthesis of an inhibitory transcription factor (ITF) with a high affinity for the Atf3 promoter could suppress Atf3 expression, but (as with the Atf3 autorepression loop) inhibition of Egr1 was lost. Developing the model to include newly-synthesised miRNAs very efficiently terminated Atf3 protein expression and, with a 4-fold increase in the rate of degradation of mRNA from the mRNA/miRNA complex, profiles for Atf3 mRNA, Atf3 protein and Egr1 mRNA approximated to the experimental data. Combining the ITF model with that of the miRNA did not improve the profiles suggesting that miRNAs are likely to play a dominant role in switching off Atf3 expression post-induction.
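The autorepression experiment described above (an extra negative loop that tunes down Atf3 expression) can be mimicked with a two-variable mRNA/protein model. This is a deliberately minimal sketch with illustrative rate constants and a simple Hill-type repression term, not the paper's fitted model:

```python
def simulate_autorepression(k_tx, K, hours, k_deg_m=1.0, k_tl=1.0,
                            k_deg_p=0.5, dt=0.001):
    """Forward-Euler integration of a minimal autorepressive loop.

    mRNA:    dm/dt = k_tx / (1 + p/K) - k_deg_m * m
    protein: dp/dt = k_tl * m - k_deg_p * p
    Small K means strong repression of transcription by the protein;
    a very large K effectively removes the feedback loop.
    All rate constants are illustrative, in units of per hour.
    """
    m = p = 0.0
    for _ in range(int(hours / dt)):
        dm = k_tx / (1.0 + p / K) - k_deg_m * m
        dp = k_tl * m - k_deg_p * p
        m += dm * dt
        p += dp * dt
    return m, p

# Steady-state protein with and without the autorepressive loop
_, p_no_loop = simulate_autorepression(k_tx=1.0, K=1e6, hours=50)
_, p_loop = simulate_autorepression(k_tx=1.0, K=0.1, hours=50)
```

As in the modelling result above, adding the loop tunes expression down (p_loop is well below p_no_loop); the abstract's point is that such a blunted Atf3 profile no longer reproduces the observed inhibition of Egr1.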
Abstract:
For certain observing types, such as those that are remotely sensed, the observation errors are correlated and these correlations are state- and time-dependent. In this work, we develop a method for diagnosing and incorporating spatially correlated and time-dependent observation error in an ensemble data assimilation system. The method combines an ensemble transform Kalman filter with a method that uses statistical averages of background and analysis innovations to provide an estimate of the observation error covariance matrix. To evaluate the performance of the method, we perform identical twin experiments using the Lorenz ’96 and Kuramoto-Sivashinsky models. Using our approach, a good approximation to the true observation error covariance can be recovered in cases where the initial estimate of the error covariance is incorrect. Spatial observation error covariances where the length scale of the true covariance changes slowly in time can also be captured. We find that using the estimated correlated observation error in the assimilation improves the analysis.
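The innovation-statistics step (averaging products of background and analysis innovations over many assimilation cycles, in the style of the standard Desroziers diagnostic) can be sketched as follows. Names are illustrative, and the estimate is symmetrised because sampling noise breaks the symmetry of the raw average:

```python
import numpy as np

def estimate_obs_error_cov(background_innovations, analysis_innovations):
    """Innovation-based estimate of the observation error covariance R.

    Uses the statistical relation R ~ E[d_a d_b^T], with
    d_b = y - H(x_b) (background innovation) and
    d_a = y - H(x_a) (analysis residual), averaged over cycles.
    Both arrays have shape (n_cycles, n_obs).
    """
    db = np.asarray(background_innovations, dtype=float)
    da = np.asarray(analysis_innovations, dtype=float)
    # Time-averaged outer product of the two innovation series
    R = da.T @ db / db.shape[0]
    # Symmetrise: sampling noise makes the raw estimate asymmetric
    return 0.5 * (R + R.T)
```

In a cycling system this estimate can be fed back into the ensemble transform Kalman filter and refined over subsequent windows, which is the spirit of the combined method evaluated above.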
Abstract:
We present a novel method for retrieving high-resolution, three-dimensional (3-D) nonprecipitating cloud fields in both overcast and broken-cloud situations. The method uses scanning cloud radar and multiwavelength zenith radiances to obtain gridded 3-D liquid water content (LWC) and effective radius (re) and 2-D column mean droplet number concentration (Nd). By using an adaptation of the ensemble Kalman filter, radiances are used to constrain the optical properties of the clouds using a forward model that employs full 3-D radiative transfer while also providing full error statistics given the uncertainty in the observations. To evaluate the new method, we first perform retrievals using synthetic measurements from a challenging cumulus cloud field produced by a large-eddy simulation snapshot. Uncertainty due to measurement error in overhead clouds is estimated at 20% in LWC and 6% in re, but the true error can be greater due to uncertainties in the assumed droplet size distribution and radiative transfer. Over the entire domain, LWC and re are retrieved with average error 0.05–0.08 g m⁻³ and ~2 μm, respectively, depending on the number of radiance channels used. The method is then evaluated using real data from the Atmospheric Radiation Measurement program Mobile Facility at the Azores. Two case studies are considered, one stratocumulus and one cumulus. Where available, the liquid water path retrieved directly above the observation site was found to be in good agreement with independent values obtained from microwave radiometer measurements, with an error of 20 g m⁻².
Abstract:
Water scarcity severely impairs food security and economic prosperity in many countries today. Expected future population changes will, in many countries as well as globally, increase the pressure on available water resources. On the supply side, renewable water resources will be affected by projected changes in precipitation patterns, temperature, and other climate variables. Here we use a large ensemble of global hydrological models (GHMs) forced by five global climate models and the latest greenhouse-gas concentration scenarios (Representative Concentration Pathways) to synthesize the current knowledge about climate change impacts on water resources. We show that climate change is likely to exacerbate regional and global water scarcity considerably. In particular, the ensemble average projects that a global warming of 2 °C above present (approximately 2.7 °C above preindustrial) will confront an additional approximate 15% of the global population with a severe decrease in water resources and will increase the number of people living under absolute water scarcity (<500 m³ per capita per year) by another 40% (according to some models, more than 100%) compared with the effect of population growth alone. For some indicators of moderate impacts, the steepest increase is seen between the present day and 2 °C, whereas indicators of very severe impacts increase unabated beyond 2 °C. At the same time, the study highlights large uncertainties associated with these estimates, with both global climate models and GHMs contributing to the spread. GHM uncertainty is particularly dominant in many regions affected by declining water resources, suggesting a high potential for improved water resource projections through hydrological model development.
Abstract:
Climate change due to anthropogenic greenhouse gas emissions is expected to increase the frequency and intensity of precipitation events, which is likely to affect the probability of flooding into the future. In this paper we use river flow simulations from nine global hydrology and land surface models to explore uncertainties in the potential impacts of climate change on flood hazard at global scale. As an indicator of flood hazard we looked at changes in the 30-y return level of 5-d average peak flows under representative concentration pathway RCP8.5 at the end of this century. Not everywhere does climate change result in an increase in flood hazard: decreases in the magnitude and frequency of the 30-y return level of river flow occur at roughly one-third (20-45%) of the global land grid points, particularly in areas where the hydrograph is dominated by the snowmelt flood peak in spring. In most model experiments, however, an increase in flooding frequency was found in more than half of the grid points. The current 30-y flood peak is projected to occur in more than 1 in 5 y across 5-30% of land grid points. The large-scale patterns of change are remarkably consistent among impact models and even the driving climate models, but at local scale and in individual river basins there can be disagreement even on the sign of change, indicating large modeling uncertainty which needs to be taken into account in local adaptation studies.
Abstract:
Increasing concentrations of greenhouse gases in the atmosphere are expected to modify the global water cycle with significant consequences for terrestrial hydrology. We assess the impact of climate change on hydrological droughts in a multimodel experiment including seven global impact models (GIMs) driven by bias-corrected climate from five global climate models under four representative concentration pathways (RCPs). Drought severity is defined as the fraction of land under drought conditions. Results show a likely increase in the global severity of hydrological drought at the end of the 21st century, with systematically greater increases for RCPs describing stronger radiative forcings. Under RCP8.5, droughts exceeding 40% of analyzed land area are projected by nearly half of the simulations. This increase in drought severity has a strong signal-to-noise ratio at the global scale, and Southern Europe, the Middle East, the Southeast United States, Chile, and South West Australia are identified as possible hotspots for future water security issues. The uncertainty due to GIMs is greater than that from global climate models, particularly if including a GIM that accounts for the dynamic response of plants to CO2 and climate, as this model simulates little or no increase in drought frequency. Our study demonstrates that different representations of terrestrial water-cycle processes in GIMs are responsible for a much larger uncertainty in the response of hydrological drought to climate change than previously thought. When assessing the impact of climate change on hydrology, it is therefore critical to consider a diverse range of GIMs to better capture the uncertainty.
Abstract:
The UPSCALE (UK on PRACE: weather-resolving Simulations of Climate for globAL Environmental risk) project, using PRACE (Partnership for Advanced Computing in Europe) resources, constructed and ran an ensemble of atmosphere-only global climate model simulations, using the Met Office Unified Model GA3 configuration. Each simulation is 27 years in length for both the present climate and an end-of-century future climate, at resolutions of N96 (130 km), N216 (60 km) and N512 (25 km), in order to study the impact of model resolution on high-impact climate features such as tropical cyclones. Increased model resolution is found to improve the simulated frequency of explicitly tracked tropical cyclones, and correlations of interannual variability in the North Atlantic and North West Pacific lie between 0.6 and 0.75. Improvements in the deficit of genesis in the eastern North Atlantic as resolution increases appear to be related to the representation of African Easterly Waves and the African Easterly Jet. However, the intensity of the modelled tropical cyclones as measured by 10 m wind speed remains weak, and there is no indication of convergence over this range of resolutions. In the future climate ensemble, there is a reduction of 50% in the frequency of Southern Hemisphere tropical cyclones, while in the Northern Hemisphere there is a reduction in the North Atlantic, and a shift in the Pacific with peak intensities becoming more common in the Central Pacific. There is also a change in tropical cyclone intensities, with the future climate having fewer weak storms and proportionally more strong storms.
Abstract:
Incomplete understanding of three aspects of the climate system—equilibrium climate sensitivity, rate of ocean heat uptake and historical aerosol forcing—and the physical processes underlying them leads to uncertainties in our assessment of the global-mean temperature evolution in the twenty-first century1,2. Explorations of these uncertainties have so far relied on scaling approaches3,4, large ensembles of simplified climate models1,2, or small ensembles of complex coupled atmosphere–ocean general circulation models5,6 which under-represent uncertainties in key climate system properties derived from independent sources7–9. Here we present results from a multi-thousand-member perturbed-physics ensemble of transient coupled atmosphere–ocean general circulation model simulations. We find that model versions that reproduce observed surface temperature changes over the past 50 years show global-mean temperature increases of 1.4–3 K by 2050, relative to 1961–1990, under a mid-range forcing scenario. This range of warming is broadly consistent with the expert assessment provided by the Intergovernmental Panel on Climate Change Fourth Assessment Report10, but extends towards larger warming than observed in ensembles-of-opportunity5 typically used for climate impact assessments. From our simulations, we conclude that warming by the middle of the twenty-first century that is stronger than earlier estimates is consistent with recent observed temperature changes and a mid-range ‘no mitigation’ scenario for greenhouse-gas emissions.
Abstract:
Advanced forecasting of space weather requires simulation of the whole Sun-to-Earth system, which necessitates driving magnetospheric models with the outputs from solar wind models. This presents a fundamental difficulty, as the magnetosphere is sensitive to both large-scale solar wind structures, which can be captured by solar wind models, and small-scale solar wind “noise,” which is far below typical solar wind model resolution and results primarily from stochastic processes. Following similar approaches in terrestrial climate modeling, we propose statistical “downscaling” of solar wind model results prior to their use as input to a magnetospheric model. As magnetospheric response can be highly nonlinear, this is preferable to downscaling the results of magnetospheric modeling. To demonstrate the benefit of this approach, we first approximate solar wind model output by smoothing solar wind observations with an 8 h filter, then add small-scale structure back in through the addition of random noise with the observed spectral characteristics. Here we use a very simple parameterization of noise based upon the observed probability distribution functions of solar wind parameters, but more sophisticated methods will be developed in the future. An ensemble of results from the simple downscaling scheme is tested using a model-independent method and shown to add value to the magnetospheric forecast, both improving the best estimate and quantifying the uncertainty. We suggest a number of features desirable in an operational solar wind downscaling scheme.
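The two-step recipe above (smooth large-scale series, then re-inject small-scale variability) can be sketched with a much-simplified noise model that resamples observed residuals, standing in for the paper's PDF-based parameterisation; all names are illustrative:

```python
import numpy as np

def downscale_ensemble(model_series, residual_samples, n_members, seed=0):
    """Build an ensemble of plausible high-frequency solar wind series.

    model_series:     smooth, model-resolution series (the stand-in in
                      the paper is an 8 h smoothed observation series)
    residual_samples: pool of observed small-scale residuals whose
                      distribution the added noise should follow
    Returns an array of shape (n_members, len(model_series)).
    """
    rng = np.random.default_rng(seed)
    model_series = np.asarray(model_series, dtype=float)
    members = np.empty((n_members, model_series.size))
    for i in range(n_members):
        # Resample residuals (with replacement) to restore the
        # observed small-scale probability distribution
        noise = rng.choice(residual_samples, size=model_series.size)
        members[i] = model_series + noise
    return members
```

Each member can then drive the magnetospheric model separately, giving both a best estimate (from the ensemble of responses) and a quantified uncertainty, as in the model-independent test described above.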
Abstract:
Reasons for performing study: Metabonomics is emerging as a powerful tool for disease screening and investigating mammalian metabolism. This study aims to create a metabolic framework by producing a preliminary reference guide for the normal equine metabolic milieu. Objectives: To metabolically profile plasma, urine and faecal water from healthy racehorses using high resolution 1H-NMR spectroscopy and to provide a list of dominant metabolites present in each biofluid for the benefit of future research in this area. Study design: This study was performed using seven Thoroughbreds in race training at a single time-point. Urine and faecal samples were collected non-invasively and plasma was obtained from samples taken for routine clinical chemistry purposes. Methods: Biofluids were analysed using 1H-NMR spectroscopy. Metabolite assignment was achieved via a range of 1D and 2D experiments. Results: A total of 102 metabolites were assigned across the three biological matrices. A core metabonome of 14 metabolites was ubiquitous across all biofluids. All biological matrices provided a unique window on different aspects of systemic metabolism. Urine was the most populated metabolite matrix with 65 identified metabolites, 39 of which were unique to this biological compartment. A number of these were related to gut microbial-host co-metabolism. Faecal samples were the most metabolically variable between animals; acetate was responsible for the majority (28%) of this variation. Short chain fatty acids were the predominant features identified within this biofluid by 1H-NMR spectroscopy. Conclusions: Metabonomics provides a platform for investigating complex and dynamic interactions between the host and its consortium of gut microbes and has the potential to uncover markers for health and disease in a variety of biofluids. Inherent variation in faecal extracts, along with the relative abundance of microbial-mammalian metabolites in urine and the invasive nature of plasma sampling, implies that urine is the most appropriate biofluid for the purposes of metabonomic analysis.