961 results for Chance-constrained model


Relevance:

30.00%

Publisher:

Abstract:

Aeolian mineral dust aerosol is an important consideration in the Earth's radiation budget as well as a source of nutrients to oceanic and land biota. The modelling of aeolian mineral dust has been improving consistently despite the relatively sparse observations available to constrain such models. This study documents the development of a new dust emissions scheme in the Met Office Unified Model™ (MetUM) based on the Dust Entrainment and Deposition (DEAD) module. Four separate case studies are used to test and constrain the model output. Initial testing was undertaken on a large dust event over North Africa in March 2006, with the model constrained using AERONET data. The second case study tested the capability of the model to represent dust events in the Middle East without being re-tuned from the March 2006 case in the Sahara. While the model is unable to capture some of the daytime variation in AERONET AOD, there is good agreement between the model and observed dust events. In the final two case studies, new observations from in situ aircraft data during the Dust Outflow and Deposition to the Ocean (DODO) campaigns in February and August 2006 were used. These recent observations provided further data on dust size distributions and vertical profiles to constrain the model. The modelled DODO cases were also compared to AERONET data to ensure the radiative properties of the dust were comparable to observations. Copyright © 2009 Royal Meteorological Society and Crown Copyright

Relevance:

30.00%

Publisher:

Abstract:

Motivation: We compare phylogenetic approaches for inferring functional gene links. The approaches detect independent instances of the correlated gain and loss of pairs of genes from species' genomes. We investigate the effect on results of basing evidence of correlations on two phylogenetic approaches, Dollo parsimony and maximum likelihood (ML). We further examine the effect of constraining the ML model by fixing the rate of gene gain at a low value, rather than estimating it from the data. Results: We detect correlated evolution among a test set of pairs of yeast (Saccharomyces cerevisiae) genes, with a case study of 21 eukaryotic genomes and test data derived from known yeast protein complexes. If the rate at which genes are gained is constrained to be low, ML achieves by far the best results at detecting known functional links. The constrained model has fewer parameters, yet is more realistic because it prevents genes from being gained more than once. Availability: BayesTraits by M. Pagel and A. Meade, and a script to configure and repeatedly launch it by D. Barker and M. Pagel, are available at http://www.evolution.reading.ac.uk .
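
As a loose illustration of the constraint described above, the sketch below sets up a two-state (absent/present) Markov model of gene gain and loss in which the gain rate is fixed at a low value rather than estimated. The rates, branch length and use of scipy are illustrative assumptions; BayesTraits' actual ML machinery is not reproduced.

# Two-state gain/loss model with the gain rate fixed low (illustrative sketch).
import numpy as np
from scipy.linalg import expm

gain = 0.001   # fixed low rate of gene gain (the constraint discussed above)
loss = 0.4     # rate of gene loss, which would normally be estimated by ML

# Instantaneous rate matrix over states (0 = absent, 1 = present).
Q = np.array([[-gain, gain],
              [loss, -loss]])

t = 1.0                      # hypothetical branch length
P = expm(Q * t)              # transition probabilities along the branch
print("P(present -> absent) =", round(P[1, 0], 3))
print("P(absent  -> present) =", round(P[0, 1], 4))

With the gain rate pinned near zero, the probability of a gene being gained along a branch is negligible, which is how the constrained model effectively rules out multiple independent gains.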

Relevance:

30.00%

Publisher:

Abstract:

A novel type of tweezer molecule containing electron-rich 2-pyrenyloxy arms has been designed to exploit intramolecular hydrogen bonding in stabilising a preferred conformation for supramolecular complexation to complementary sequences in aromatic copolyimides. This tweezer-conformation is demonstrated by single-crystal X-ray analyses of the tweezer molecule itself and of its complex with an aromatic diimide model-compound. In terms of its ability to bind selectively to polyimide chains, the new tweezer molecule shows very high sensitivity to sequence effects. Thus, even low concentrations of tweezer relative to diimide units (<2.5 mol%) are sufficient to produce dramatic, sequence-related splittings of the pyromellitimide proton NMR resonances. These induced resonance-shifts arise from ring-current shielding of pyromellitimide protons by the pyrenyloxy arms of the tweezer-molecule, and the magnitude of such shielding is a function of the tweezer-binding constant for any particular monomer sequence. Recognition of both short-range and long-range sequences is observed, the latter arising from cumulative ring-current shielding of diimide protons by tweezer molecules binding at multiple adjacent sites on the copolymer chain.

Relevance:

30.00%

Publisher:

Abstract:

We have developed a model that allows players in the building and construction sector, as well as energy policy makers, to gauge investor interest in the Kingdom of Bahrain in undertaking Building Integrated Photovoltaic (BIPV) or Building Integrated Wind Turbine (BIWT) projects, i.e. partially sustainable or green buildings. The model allows calculation of a Sustainable Building Index (SBI), which ranges from 0.1 (lowest) to 1.0 (highest); the higher the figure, the greater the chance of launching BIPV or BIWT projects. The model was tested in Bahrain and the calculated SBI was found to be 0.47. This indicates that extensive effort is needed in renewable energy policy, renewable energy education, incentives for BIPV and BIWT projects, environmental awareness, and the promotion of clean and sustainable energy for building and construction projects. Our model can be used internationally to create a "Global SBI" database; the United Nations Sustainable Buildings and Construction Initiative (SBCI) could take on the task of establishing such a database using this model.
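
As a rough sketch of how an index of this kind might be computed, the snippet below forms a weighted average of per-criterion scores and rescales it to the stated 0.1 to 1.0 range. The criterion names, weights and scores are hypothetical; the paper's actual indicator set and weighting scheme are not given here.

# Hypothetical composite "sustainable building index" style score.
def sbi(scores, weights):
    """Weighted average of per-criterion scores in [0, 1], rescaled to [0.1, 1.0]."""
    total_w = sum(weights.values())
    weighted = sum(weights[k] * scores[k] for k in weights) / total_w
    return 0.1 + 0.9 * weighted  # map [0, 1] onto the 0.1-1.0 index range

# Illustrative inputs: moderate policy support, weak incentives, fair awareness.
example = sbi(
    scores={"policy": 0.6, "incentives": 0.2, "awareness": 0.6, "education": 0.4},
    weights={"policy": 0.3, "incentives": 0.3, "awareness": 0.2, "education": 0.2},
)
print(round(example, 2))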

Relevance:

30.00%

Publisher:

Abstract:

This paper describes the SIMULINK implementation of a constrained predictive control algorithm based on quadratic programming and linear state space models, and its application to a laboratory-scale 3D crane system. The algorithm is compatible with Real-Time Windows Target and, in the case of the crane system, it can be executed with a sampling period of 0.01 s and a prediction horizon of up to 300 samples, using a linear state space model with 3 inputs, 5 outputs and 13 states.
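
For readers unfamiliar with this class of controller, the sketch below poses a single step of linear, constrained model predictive control as a quadratic programme, here in Python with cvxpy rather than SIMULINK. The toy double-integrator model, horizon, weights and input bounds are placeholders; the crane model's 3 inputs, 5 outputs and 13 states are not reproduced.

# One receding-horizon step of constrained linear MPC posed as a QP (sketch).
import numpy as np
import cvxpy as cp

# Hypothetical double-integrator model: x+ = A x + B u, y = C x.
A = np.array([[1.0, 0.1], [0.0, 1.0]])
B = np.array([[0.005], [0.1]])
C = np.array([[1.0, 0.0]])

N = 20                      # prediction horizon (samples)
x0 = np.array([1.0, 0.0])   # current state estimate
r = 0.0                     # output set-point

x = cp.Variable((2, N + 1))
u = cp.Variable((1, N))

cost = 0
constraints = [x[:, 0] == x0]
for k in range(N):
    cost += cp.sum_squares(C @ x[:, k] - r) + 0.01 * cp.sum_squares(u[:, k])
    constraints += [x[:, k + 1] == A @ x[:, k] + B @ u[:, k],
                    u[:, k] <= 1.0, u[:, k] >= -1.0]   # input constraints

cp.Problem(cp.Minimize(cost), constraints).solve()
print("first control move:", u.value[:, 0])

Only the first move of the optimised sequence is applied before the problem is re-solved at the next sample; this receding-horizon structure is what lets the controller enforce constraints online.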

Relevance:

30.00%

Publisher:

Abstract:

Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward-constrained regression (FCR) manner. The proposed algorithm selects significant kernels one at a time, while the leave-one-out (LOO) test score is minimized subject to a simple positivity constraint in each forward stage. The model parameter estimation in each forward stage is simply the solution of a jackknife parameter estimator for a single parameter, subject to the same positivity constraint check. For each selected kernel, the associated kernel width is updated via the Gauss-Newton method with the model parameter estimate fixed. The proposed approach is simple to implement and the associated computational cost is very low. Numerical examples are employed to demonstrate the efficacy of the proposed approach.
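
The sketch below conveys the flavour of the method: kernels centred on data points are added one at a time, with non-negative weights, to approximate the full Parzen window estimate. For brevity it uses a plain squared-error fit (scipy's nnls) in place of the LOO score, jackknife estimator and Gauss-Newton width updates of the actual algorithm, and the data and kernel width are made up.

# Simplified forward selection of kernels against the full Parzen window estimate.
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(1, 1.0, 200)])  # toy sample
grid = np.linspace(-5, 4, 300)
h = 0.3  # common kernel width, held fixed here

def gauss(t, c, h):
    return np.exp(-0.5 * ((t - c) / h) ** 2) / (h * np.sqrt(2 * np.pi))

pw = gauss(grid[:, None], x[None, :], h).mean(axis=1)   # target: full PW estimate

selected, resid = [], pw.copy()
for _ in range(10):                                      # pick up to 10 kernels
    # choose the data point whose kernel best explains the current residual
    cand = max(set(range(len(x))) - set(selected),
               key=lambda i: gauss(grid, x[i], h) @ resid)
    selected.append(cand)
    Phi = gauss(grid[:, None], x[selected][None, :], h)  # design matrix
    w, _ = nnls(Phi, pw)                                 # positivity-constrained fit
    resid = pw - Phi @ w

print("kernels used:", len(selected),
      "max abs error:", float(np.abs(Phi @ w - pw).max()))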

Relevance:

30.00%

Publisher:

Abstract:

A new probabilistic neural network (PNN) learning algorithm based on forward constrained selection (PNN-FCS) is proposed. An incremental learning scheme is adopted such that at each step, new neurons, one for each class, are selected from the training samples and the weights of the neurons are estimated so as to minimize the overall misclassification error rate. In this manner, only the most significant training samples are used as the neurons. It is shown by simulation that the resultant networks of PNN-FCS have good classification performance compared to other types of classifiers, but much smaller model sizes than conventional PNN.
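
A rough sketch of the idea, under simplifying assumptions: training samples are added as neurons one pair (one per class) at a time, choosing the pair that most reduces the training misclassification rate of a kernel-density classifier. The data, kernel width and stopping rule are invented, and the paper's weight-estimation step is omitted.

# Greedy forward selection of training samples as PNN neurons (simplified sketch).
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (40, 2)), rng.normal(2.5, 1.0, (40, 2))])
y = np.array([0] * 40 + [1] * 40)
sigma = 0.8   # Parzen kernel width (assumed)

def predict(Xq, centres, labels):
    # Class densities as sums of Gaussian kernels centred on the chosen neurons.
    d2 = ((Xq[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    k = np.exp(-d2 / (2 * sigma ** 2))
    scores = np.stack([k[:, labels == c].sum(axis=1) for c in (0, 1)], axis=1)
    return scores.argmax(axis=1)

chosen = []
for _ in range(5):   # add one neuron per class at each forward step
    best = None
    for i in np.where(y == 0)[0]:
        for j in np.where(y == 1)[0]:
            if i in chosen or j in chosen:
                continue
            idx = chosen + [i, j]
            err = (predict(X, X[idx], y[idx]) != y).mean()
            if best is None or err < best[0]:
                best = (err, i, j)
    chosen += [best[1], best[2]]
    print(f"neurons: {len(chosen):2d}   training error: {best[0]:.3f}")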

Relevance:

30.00%

Publisher:

Abstract:

The budgets of seven halogenated gases (CFC-11, CFC-12, CFC-113, CFC-114, CFC-115, CCl4 and SF6) are studied by comparing measurements in polar firn air from two Arctic and three Antarctic sites with simulation results from two numerical models: a 2-D atmospheric chemistry model and a 1-D firn diffusion model. The former is used to calculate atmospheric concentrations from emission trends based on industrial inventories; the calculated concentration trends are then used by the latter to produce depth-concentration profiles in the firn. The 2-D atmospheric model is validated in the boundary layer by comparison with atmospheric station measurements, and vertically for CFC-12 by comparison with balloon and FTIR measurements. Firn air measurements provide constraints on historical atmospheric concentrations over the last century. Age distributions in the firn are discussed using a Green function approach. Finally, our results are used as input to a radiative model in order to evaluate the radiative forcing of our target gases. Multi-species and multi-site firn air studies allow atmospheric trends to be better constrained. The low concentrations of all studied gases at the bottom of the firn, and their consistency with our model results, confirm that their natural sources are small. Our results indicate that the emissions, sinks and trends of CFC-11, CFC-12, CFC-113, CFC-115 and SF6 are well constrained, whereas this is not the case for CFC-114 and CCl4. Significant emission-dependent changes in the lifetimes of halocarbons destroyed in the stratosphere were obtained. These result from the time needed for their transport from the surface, where they are emitted, to the stratosphere, where they are destroyed. Efforts should be made to update and reduce the large uncertainties on CFC lifetimes.
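
The Green function view mentioned above can be illustrated with a short sketch: the firn concentration at a given depth is the atmospheric history weighted by an age distribution. The age distribution and atmospheric trend below are invented placeholders, not outputs of the models used in the study.

# Firn air as atmospheric history convolved with an age distribution (illustrative).
import numpy as np

years = np.arange(1900, 2001)                 # atmospheric history, annual steps
atm = np.clip((years - 1950) * 2.0, 0, None)  # hypothetical growth from 1950 (ppt)

age = np.arange(0, 80)                        # possible air ages at this depth (yr)
G = np.exp(-0.5 * ((age - 25) / 8.0) ** 2)    # hypothetical age distribution
G /= G.sum()                                  # normalise so weights sum to 1

# Firn concentration in sampling year 2000: weighted mean of the past atmosphere.
firn_2000 = sum(G[a] * atm[years == 2000 - a][0] for a in age)
print(f"mean age {np.sum(G * age):.1f} yr, firn concentration {firn_2000:.1f} ppt")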

Relevance:

30.00%

Publisher:

Abstract:

This paper demonstrates that recent influential contributions to monetary policy imply an emerging consensus whereby neither rigid rules nor complete discretion is found optimal. Instead, middle-ground monetary regimes based on rules (operative under 'normal' circumstances) to anchor inflation expectations over the long run, but designed with enough flexibility to mitigate the short-run effect of shocks (with communicated discretion in 'exceptional' circumstances temporarily overriding these rules), are gaining support in theoretical models and in policy formulation and implementation. The opposition of 'rules versus discretion' has thus reappeared as the synthesis of 'rules cum discretion', in essence as inflation-forecast targeting. But such a synthesis is not without major theoretical problems, as we argue in this contribution. Furthermore, very recent real-world events have made it obvious that the inflation-targeting strategy of monetary policy, which rests upon the new consensus paradigm in modern macroeconomics, is at best a 'fair weather' model. In today's turbulent economic climate of highly unstable inflation, deep financial crisis and abrupt worldwide economic slowdown, this approach needs serious rethinking, to say the least, if not abandoning altogether.

Relevance:

30.00%

Publisher:

Abstract:

We describe a model-data fusion (MDF) inter-comparison project (REFLEX), which compared various algorithms for estimating carbon (C) model parameters consistent both with measured carbon fluxes and states and with a simple C model. Participants were provided with the model and with both synthetic net ecosystem exchange (NEE) of CO2 and leaf area index (LAI) data, generated from the model with added noise, and observed NEE and LAI data from two eddy covariance sites. Participants endeavoured to estimate model parameters and states consistent with the model for all cases over the two years for which data were provided, and to generate predictions for one additional year without observations. Nine participants contributed results using Metropolis algorithms, Kalman filters and a genetic algorithm. For the synthetic data case, parameter estimates compared well with the true values. The results of the analyses indicated that parameters linked directly to gross primary production (GPP) and ecosystem respiration, such as those related to foliage allocation and turnover, or temperature sensitivity of heterotrophic respiration, were best constrained and characterised. Poorly estimated parameters were those related to the allocation to and turnover of fine root/wood pools. Estimates of confidence intervals varied among algorithms, but several algorithms successfully located the true values of annual fluxes from synthetic experiments within relatively narrow 90% confidence intervals, achieving a >80% success rate and mean NEE confidence intervals <110 gC m−2 year−1 for the synthetic case. Annual C flux estimates generated by participants generally agreed with gap-filling approaches using half-hourly data. The estimation of ecosystem respiration and GPP through MDF agreed well with outputs from partitioning studies using half-hourly data. Confidence limits on annual NEE increased by an average of 88% in the prediction year compared to the previous year, when data were available. Confidence intervals on annual NEE increased by 30% when observed data were used instead of synthetic data, reflecting and quantifying the addition of model error. Finally, our analyses indicated that incorporating additional constraints, such as data on C pools (wood, soil and fine roots), would help to reduce uncertainties for model parameters poorly served by eddy covariance data.
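
As a minimal illustration of one of the approaches used (a Metropolis algorithm), the sketch below estimates a single parameter of a toy flux model from noisy synthetic observations. The model, prior bounds, proposal width and noise level are all hypothetical and far simpler than the REFLEX carbon model; only the accept/reject machinery is representative.

# Toy model-data fusion with a random-walk Metropolis sampler.
import numpy as np

rng = np.random.default_rng(42)
t = np.arange(365)
true_p = 0.02

def model(p):
    return 10 * np.exp(-p * t)                   # toy "flux" model

obs = model(true_p) + rng.normal(0, 0.5, t.size) # synthetic data with added noise

def log_like(p):
    return -0.5 * np.sum((obs - model(p)) ** 2 / 0.5 ** 2)

p, ll = 0.05, log_like(0.05)                     # initial guess
samples = []
for _ in range(5000):
    cand = p + rng.normal(0, 0.002)              # random-walk proposal
    if 0 < cand < 0.1:                           # flat prior bounds
        cand_ll = log_like(cand)
        if np.log(rng.random()) < cand_ll - ll:  # Metropolis accept/reject
            p, ll = cand, cand_ll
    samples.append(p)

post = np.array(samples[1000:])                  # discard burn-in
print(f"posterior mean {post.mean():.4f}, 90% CI "
      f"({np.percentile(post, 5):.4f}, {np.percentile(post, 95):.4f})")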

Relevance:

30.00%

Publisher:

Abstract:

We present the first climate prediction of the coming decade made with multiple models, initialized with prior observations. This prediction accrues from an international activity to exchange decadal predictions in near real-time, in order to assess differences and similarities, provide a consensus view to prevent over-confidence in forecasts from any single model, and establish current collective capability. We stress that the forecast is experimental, since the skill of the multi-model system is as yet unknown. Nevertheless, the forecast systems used here are based on models that have undergone rigorous evaluation and individually have been evaluated for forecast skill. Moreover, it is important to publish forecasts to enable open evaluation, and to provide a focus on climate change in the coming decade. Initialized forecasts of the year 2011 agree well with observations, with a pattern correlation of 0.62 compared to 0.31 for uninitialized projections. In particular, the forecast correctly predicted La Niña in the Pacific, and warm conditions in the north Atlantic and USA. A similar pattern is predicted for 2012 but with a weaker La Niña. Indices of Atlantic multi-decadal variability and Pacific decadal variability show no signal beyond climatology after 2015, while temperature in the Niño3 region is predicted to warm slightly by about 0.5 °C over the coming decade. However, uncertainties are large for individual years and initialization has little impact beyond the first 4 years in most regions. Relative to uninitialized forecasts, initialized forecasts are significantly warmer in the north Atlantic sub-polar gyre and cooler in the north Pacific throughout the decade. They are also significantly cooler in the global average and over most land and ocean regions out to several years ahead. However, in the absence of volcanic eruptions, global temperature is predicted to continue to rise, with each year from 2013 onwards having a 50 % chance of exceeding the current observed record. Verification of these forecasts will provide an important opportunity to test the performance of models and our understanding and knowledge of the drivers of climate change.
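
The pattern correlations quoted above (0.62 for initialized versus 0.31 for uninitialized forecasts) can be computed as a centred Pearson correlation over grid points, as in the sketch below. The two random fields are placeholders, not real forecast or observed data, and the cosine-of-latitude area weighting normally applied to global grids is omitted for brevity.

# Centred pattern correlation between a forecast anomaly field and observations.
import numpy as np

def pattern_correlation(forecast, observed):
    """Pearson correlation between two anomaly fields, computed over grid points."""
    f = forecast.ravel() - forecast.mean()
    o = observed.ravel() - observed.mean()
    return float(f @ o / np.sqrt((f @ f) * (o @ o)))

rng = np.random.default_rng(0)
obs = rng.normal(size=(36, 72))                       # placeholder 5-degree grid
fcst = 0.6 * obs + 0.8 * rng.normal(size=obs.shape)   # partially skilful "forecast"
print(f"pattern correlation: {pattern_correlation(fcst, obs):.2f}")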

Relevance:

30.00%

Publisher:

Abstract:

Radiative forcing and climate sensitivity have been widely used as concepts to understand climate change. This work performs climate change experiments with an intermediate general circulation model (IGCM) to examine the robustness of the radiative forcing concept for carbon dioxide and solar constant changes. This IGCM has been specifically developed as a computationally fast model, but one that allows an interaction between physical processes and large-scale dynamics; the model allows many long integrations to be performed relatively quickly. It employs a fast and accurate radiative transfer scheme, as well as simple convection and surface schemes and a slab ocean, to model the effects of climate change mechanisms on atmospheric temperatures and dynamics with a reasonable degree of complexity. The climatology of the IGCM, run at T-21 resolution with 22 levels, is compared to European Centre for Medium-Range Weather Forecasts reanalysis data. The response of the model to changes in carbon dioxide and solar output is examined when these changes are applied globally and when constrained geographically (e.g. over land only). The CO2 experiments have a roughly 17% higher climate sensitivity than the solar experiments. It is also found that a forcing at high latitudes causes a 40% higher climate sensitivity than a forcing applied only at low latitudes. It is found that, despite differences in the model feedbacks, climate sensitivity is roughly constant over a range of distributions of CO2 and solar forcings. Hence, in the IGCM at least, the radiative forcing concept is capable of predicting global surface temperature changes to within 30% for the perturbations described here. It is concluded that radiative forcing remains a useful tool for assessing the natural and anthropogenic impact of climate change mechanisms on surface temperature.
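
As a worked illustration of the radiative forcing concept being tested, the sketch below applies the standard approximation that the global-mean equilibrium temperature response scales with the forcing, dT = lambda * F. The sensitivity value is a round illustrative number, not the IGCM's; the forcing for doubled CO2 is the commonly quoted figure.

# dT = lambda * F: the relationship whose robustness the abstract examines.
F_co2 = 3.7        # W m-2, approximate forcing for doubled CO2
lam = 0.8          # K per W m-2, hypothetical climate sensitivity parameter

dT = lam * F_co2
print(f"predicted equilibrium warming: {dT:.1f} K")

# If one forcing agent yields a 17% higher sensitivity (as found for CO2 vs solar
# in the abstract), the same forcing gives a correspondingly larger response:
print(f"with 17% higher sensitivity: {1.17 * lam * F_co2:.1f} K")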

Relevance:

30.00%

Publisher:

Abstract:

14C-dated pollen and lake-level data from Europe are used to assess the spatial patterns of climate change between 6000 yr BP and present, as simulated by the NCAR CCM1 (National Center for Atmospheric Research, Community Climate Model, version 1) in response to the change in the Earth's orbital parameters during this period. First, reconstructed 6000 yr BP values of bioclimate variables obtained from pollen and lake-level data with the constrained-analogue technique are compared with simulated values. Then a 6000 yr BP biome map obtained from pollen data with an objective biome reconstruction (biomization) technique is compared with BIOME model results derived from the same simulation. Data and simulations agree in some features: warmer-than-present growing seasons in N and C Europe allowed forests to extend further north and to higher elevations than today, and warmer winters in C and E Europe prevented boreal conifers from spreading west. More generally, however, the agreement is poor. Predominantly deciduous forest types in Fennoscandia imply warmer winters than the model allows. The model fails to simulate winters cold enough, or summers wet enough, to allow temperate deciduous forests their former extended distribution in S Europe, and it incorrectly simulates a much expanded area of steppe vegetation in SE Europe. Similar errors have also been noted in numerous 6000 yr BP simulations with prescribed modern sea surface temperatures. These errors are evidently not resolved by the inclusion of interactive sea-surface conditions in the CCM1. Accurate representation of mid-Holocene climates in Europe may require the inclusion of dynamical ocean–atmosphere and/or vegetation–atmosphere interactions that most palaeoclimate model simulations have so far disregarded.
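
The analogue idea underlying the reconstruction technique can be sketched simply: for a fossil pollen sample, find the most similar modern samples and average their observed climate. The taxa, the squared-chord dissimilarity, the synthetic data and the five-analogue average below are illustrative choices; the constrained-analogue technique used in the study adds constraints not shown here.

# Nearest-analogue climate reconstruction from pollen assemblages (illustrative).
import numpy as np

rng = np.random.default_rng(3)
modern_pollen = rng.dirichlet(np.ones(8), size=500)   # 500 modern samples, 8 taxa
modern_temp = rng.normal(8, 5, 500)                   # their observed mean temp (degC)
fossil = rng.dirichlet(np.ones(8))                    # one 6000 yr BP sample

# Squared-chord distance, a common dissimilarity measure for pollen percentages.
dist = ((np.sqrt(modern_pollen) - np.sqrt(fossil)) ** 2).sum(axis=1)
best = np.argsort(dist)[:5]                           # five closest analogues
print(f"reconstructed temperature: {modern_temp[best].mean():.1f} degC")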

Relevance:

30.00%

Publisher:

Abstract:

Well-resolved air–sea interactions are simulated in a new ocean mixed-layer, coupled configuration of the Met Office Unified Model (MetUM-GOML), comprising the MetUM coupled to the Multi-Column K Profile Parameterization ocean (MC-KPP). This is the first globally coupled system which provides a vertically resolved, high near-surface resolution ocean at a computational cost comparable to running in atmosphere-only mode. As well as being computationally inexpensive, this modelling framework is adaptable (the independent MC-KPP columns can be applied selectively in space and time) and controllable (using temperature and salinity corrections, the model can be constrained to any ocean state). The framework provides a powerful research tool for process-based studies of the impact of air–sea interactions in the global climate system. MetUM simulations have been performed which separate the impact of introducing interannual variability in sea surface temperatures (SSTs) from the impact of having atmosphere–ocean feedbacks. The representation of key aspects of tropical and extratropical variability is used to assess the performance of these simulations. Coupling the MetUM to MC-KPP is shown, for example, to reduce tropical precipitation biases, improve the propagation of, and spectral power associated with, the Madden–Julian Oscillation, and produce closer-to-observed patterns of springtime blocking activity over the Euro-Atlantic region.

Relevance:

30.00%

Publisher:

Abstract:

Animal models of acquired epilepsies aim to provide researchers with tools for use in understanding the processes underlying the acquisition, development and establishment of the disorder. Typically, following a systemic or local insult, vulnerable brain regions undergo a process leading to the development, over time, of spontaneous recurrent seizures. Many such models make use of a period of intense seizure activity or status epilepticus, and this may be associated with high mortality and/or global damage to large areas of the brain. These undesirable elements have driven improvements in the design of chronic epilepsy models, for example the lithium-pilocarpine epileptogenesis model. Here, we present an optimised model of chronic epilepsy that reduces mortality to 1% whilst retaining features of high epileptogenicity and development of spontaneous seizures. Using local field potential recordings from hippocampus in vitro as a probe, we show that the model does not result in significant loss of neuronal network function in area CA3 and, instead, subtle alterations in network dynamics appear during a process of epileptogenesis, which eventually leads to a chronic seizure state. The model’s features of very low mortality and high morbidity in the absence of global neuronal damage offer the chance to explore the processes underlying epileptogenesis in detail, in a population of animals not defined by their resistance to seizures, whilst acknowledging and being driven by the 3Rs (Replacement, Refinement and Reduction of animal use in scientific procedures) principles.