97 results for Controlled Monte Carlo Data Generation


Relevance:

100.00%

Publisher:

Abstract:

Assessment is made of the effect of the assumed form of the ion velocity distribution function on estimates of three-dimensional ion temperature from one-dimensional observations. Incoherent scatter observations by the EISCAT radar at a variety of aspect angles are used to demonstrate features of ion temperature determination and to study the ion velocity distribution function. One form of the distribution function, which has recently been widely used in the interpretation of EISCAT measurements, is found to be consistent with the data presented here, in that no deviation from a Maxwellian can be detected for observations along the magnetic field line and the ion temperature and its anisotropy are accurately predicted. It is shown that theoretical predictions of the anisotropy by Monte Carlo computations are very accurate, the observed value being greater by only a few percent. It is also demonstrated, for the case studied, that errors of up to 93% are introduced into the ion temperature estimate if the anisotropy is neglected. Observations at an aspect angle of 54.7°, which are not subject to this error, have a much smaller uncertainty (less than 1%) due to the adopted form of the distribution of line-of-sight velocity.
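The special role of 54.7° follows from the geometry of a bi-Maxwellian distribution, for which the one-dimensional temperature seen along a beam at aspect angle θ to the magnetic field is T(θ) = T∥ cos²θ + T⊥ sin²θ; at θ = arccos(1/√3) ≈ 54.74° this reduces to the three-dimensional mean (T∥ + 2T⊥)/3. A minimal sketch of that geometry (the temperatures below are illustrative values, not EISCAT measurements):

```python
import numpy as np

# Line-of-sight ion temperature for a bi-Maxwellian distribution:
# T_los(theta) = T_par * cos^2(theta) + T_perp * sin^2(theta),
# where theta is the aspect angle between the radar beam and the
# magnetic field. Values below are illustrative, not EISCAT data.
def t_los(t_par, t_perp, theta_deg):
    theta = np.radians(theta_deg)
    return t_par * np.cos(theta) ** 2 + t_perp * np.sin(theta) ** 2

t_par, t_perp = 2000.0, 3000.0       # K, hypothetical anisotropic case
t_3d = (t_par + 2.0 * t_perp) / 3.0  # true three-dimensional mean

for theta in (0.0, 54.74, 90.0):
    bias = 100.0 * (t_los(t_par, t_perp, theta) - t_3d) / t_3d
    print(f"aspect angle {theta:5.2f} deg: "
          f"bias {bias:+.1f}% if anisotropy is ignored")
```

The zero bias at 54.74° (where cos²θ = 1/3) is why observations at that aspect angle are insensitive to the assumed anisotropy.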

Relevance:

100.00%

Publisher:

Abstract:

Background: Epidemiological data suggest inverse associations between citrus flavanone intake and cardiovascular disease (CVD) risk. However, insufficient randomized controlled trial (RCT) data limit our understanding of the mechanisms by which flavanones and their metabolites potentially reduce cardiovascular (CV) risk factors. Objective: We examined the effects of orange juice or a dose-matched hesperidin supplement on plasma concentrations of established and novel flavanone metabolites and their effects on CV risk biomarkers in men at moderate CVD risk. Methods: In an acute, randomized, placebo-controlled crossover trial, 16 fasted participants (aged 51-69 y) received orange juice or a hesperidin supplement (both providing 320 mg hesperidin) or control (all matched for sugar and vitamin C content). At baseline and 5 h post-intake, endothelial function (primary outcome), further CV risk biomarkers (i.e. blood pressure, arterial stiffness, cardiac autonomic function, platelet activation, and NADPH oxidase gene expression), and plasma flavanone metabolites were assessed. Prior to each intervention, participants followed a diet low in flavonoids, nitrate/nitrite, alcohol, and caffeine and consumed a standardized low-flavonoid evening meal. Results: Orange juice intake significantly elevated mean (± SEM) plasma concentrations of 8 flavanone metabolites (1.75 ± 0.35 µmol/L, P < 0.0001) and 15 phenolic metabolites (13.27 ± 2.22 µmol/L, P < 0.0001) compared with control at 5 h post-consumption. Despite the increased plasma flavanone and phenolic metabolite concentrations, CV risk biomarkers were unaltered. Following hesperidin supplement intake, flavanone metabolites did not differ from control, suggesting altered absorption/metabolism compared with the orange juice matrix. Conclusions: Following single-dose flavanone intake within orange juice, we detected circulating flavanone and phenolic metabolites collectively reaching a concentration of 15.20 ± 2.15 µmol/L but observed no effect on CV risk biomarkers. Longer-duration RCTs are required to further examine the previously reported associations between higher flavanone intakes and improved cardiovascular health and to ascertain the relative importance of the food matrix and flavanone-derived phenolic metabolites.

Relevance:

100.00%

Publisher:

Abstract:

The decomposition of soil organic matter (SOM) is temperature dependent, but its response to a future warmer climate remains equivocal. Enhanced rates of SOM decomposition under increased global temperatures might cause higher CO2 emissions to the atmosphere and could therefore constitute a strong positive feedback. The magnitude of this feedback, however, remains poorly understood, primarily because of the difficulty in quantifying the temperature sensitivity of stored, recalcitrant carbon that comprises the bulk (>90%) of SOM in most soils. In this study we investigated the effects of climatic conditions on soil carbon dynamics using the attenuation of the 14C ‘bomb’ pulse as recorded in selected modern European speleothems. These new data were combined with published results to further examine soil carbon dynamics and to explore the sensitivity of labile and recalcitrant organic matter decomposition to different climatic conditions. Temporal changes in 14C activity inferred from each speleothem were modelled using a three-pool soil carbon inverse model (applying a Monte Carlo method) to constrain soil carbon turnover rates at each site. Speleothems from sites characterised by semi-arid conditions, sparse vegetation, thin soil cover, and high mean annual air temperatures (MAATs) exhibit weak attenuation of the atmospheric 14C ‘bomb’ peak (a low damping effect, D, in the range 55–77%) and low modelled mean respired carbon ages (MRCA), indicating that decomposition is dominated by young, recently fixed soil carbon. By contrast, humid and high-MAAT sites characterised by a thick soil cover and dense, well-developed vegetation display the highest damping effect (D = c. 90%) and the highest MRCA values (in the range from 350 ± 126 years to 571 ± 128 years). This suggests that carbon incorporated into these stalagmites originates predominantly from the decomposition of old, recalcitrant organic matter. SOM turnover rates cannot be ascribed to a single climate variable (e.g. MAAT) but instead reflect a complex interplay of climate (e.g. MAAT and moisture budget) and vegetation development.
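As a rough illustration of what such a multi-pool inverse model does, the sketch below convolves a schematic atmospheric 14C bomb curve with exponential pool-response functions and runs a crude Monte Carlo search over turnover times to match an observed damping; the atmospheric curve, pool structure, and target damping value are hypothetical placeholders, not the paper's data or method details:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical atmospheric Delta14C record (per mil), annual steps.
years = np.arange(1900, 2001)
atm = np.where(years < 1955, 0.0,
               700.0 * np.exp(-(years - 1964) ** 2 / 50.0))

def pool_response(signal, tau):
    """14C of a pool with mean turnover time tau (years): an
    exponentially weighted average over past atmospheric values."""
    w = np.exp(-np.arange(len(signal)) / tau)
    w /= w.sum()
    return np.convolve(signal, w)[: len(signal)]

def speleothem_14c(fracs, taus):
    # Mixture of three pools (young, intermediate, old).
    return sum(f * pool_response(atm, t) for f, t in zip(fracs, taus))

# Crude Monte Carlo search for pool parameters that reproduce an
# observed damping of the bomb peak (here defined simply as the
# fractional reduction of the peak height; the target is illustrative).
target_damping = 0.90
best, best_err = None, np.inf
for _ in range(5000):
    taus = rng.uniform([1.0, 10.0, 100.0], [10.0, 100.0, 1000.0])
    fracs = rng.dirichlet([1.0, 1.0, 1.0])
    damping = 1.0 - speleothem_14c(fracs, taus).max() / atm.max()
    err = abs(damping - target_damping)
    if err < best_err:
        best, best_err = (fracs, taus), err
print("best-fit pool fractions and turnover times:", best)
```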

Relevance:

100.00%

Publisher:

Abstract:

In this paper, we study jumps in commodity prices. Contrary to what is assumed in existing models of commodity price dynamics, a simple analysis of the data reveals that the probability of tail events is not constant but depends on the time of year, i.e. it exhibits seasonality. We propose a stochastic volatility jump–diffusion model to capture this seasonal variation. Applying the Markov Chain Monte Carlo (MCMC) methodology, we estimate our model using 20 years of futures data from four different commodity markets. We find strong statistical evidence that our model with seasonal jump intensity outperforms models featuring a constant jump intensity. To demonstrate the practical relevance of our findings, we show that our model typically improves Value-at-Risk (VaR) forecasts.
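A minimal sketch of the kind of price process involved, simulating a jump-diffusion whose jump intensity varies seasonally (for brevity the diffusion volatility is held constant, whereas the paper's model adds stochastic volatility; all parameter values are illustrative, not the MCMC estimates):

```python
import numpy as np

rng = np.random.default_rng(1)

# Euler simulation of a jump-diffusion for the log futures price with
# a deterministic seasonal jump intensity. Parameters are placeholders.
dt = 1.0 / 252.0                    # daily steps, in years
n = int(20 / dt)                    # 20 years of data
mu, sigma = 0.02, 0.25              # drift and diffusion volatility
lam0, lam1 = 5.0, 4.0               # mean intensity and seasonal amplitude
jump_mu, jump_sigma = 0.0, 0.04     # jump-size distribution

t = np.arange(n) * dt
# Seasonal jump intensity: one peak per year (phase chosen arbitrarily).
lam = lam0 + lam1 * np.cos(2.0 * np.pi * t)

x = np.zeros(n)                     # log price
for i in range(1, n):
    n_jumps = rng.poisson(lam[i] * dt)
    jump_part = rng.normal(jump_mu, jump_sigma, n_jumps).sum()
    x[i] = (x[i - 1] + (mu - 0.5 * sigma**2) * dt
            + sigma * np.sqrt(dt) * rng.normal() + jump_part)

price = np.exp(x)
print("terminal price (initial price 1.0):", round(price[-1], 3))
```

Tail events cluster in the months where lam(t) is high, which is the seasonality in the probability of tail events that a constant-intensity model cannot reproduce.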

Relevance:

100.00%

Publisher:

Abstract:

Although the sunspot-number series have existed since the mid-19th century, they are still the subject of intense debate, with the largest uncertainty being related to the "calibration" of the visual acuity of individual observers in the past. The daisy-chain regression methods applied to inter-calibrate the observers may lead to significant bias and error accumulation. Here we present a novel method to calibrate the visual acuity of the key observers against the reference data set of Royal Greenwich Observatory sunspot groups for the period 1900-1976, using the statistics of the active-day fraction. For each observer we independently evaluate the observational threshold [S_S], defined such that the observer is assumed to miss all groups with an area smaller than S_S and to report all groups larger than S_S. Next, using a Monte Carlo method, we construct from the reference data set a correction matrix for each observer. The correction matrices are significantly non-linear and cannot be approximated by a linear regression or proportionality. We emphasize that corrections based on a linear proportionality between annually averaged data lead to serious biases and distortions of the data. The correction matrices are applied to the original sunspot group records for each day, and finally the composite corrected series is produced for the period since 1748. The corrected series displays secular minima around 1800 (Dalton minimum) and 1900 (Gleissberg minimum), as well as the Modern grand maximum of activity in the second half of the 20th century. The uniqueness of the grand maximum is confirmed for the last 250 years. It is shown that the adoption of a linear relationship between the data of Wolf and Wolfer results in grossly inflated group numbers in the 18th and 19th centuries in some reconstructions.
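The sketch below illustrates, with synthetic data, how a threshold S_S and a Monte Carlo pass over a reference data set can yield a non-linear correction matrix P(true count | observed count); the area distribution, threshold value, and counts are hypothetical stand-ins, not the Royal Greenwich Observatory data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic reference set: daily true group counts with random areas.
n_days, max_groups = 20000, 15
s_thresh = 50.0               # observer's threshold S_S, hypothetical units

true_counts = rng.poisson(4.0, n_days).clip(max=max_groups)
counts = np.zeros((max_groups + 1, max_groups + 1))  # [observed, true]
for g_true in true_counts:
    areas = rng.lognormal(mean=4.0, sigma=1.0, size=g_true)
    g_obs = int((areas >= s_thresh).sum())  # groups the observer reports
    counts[g_obs, g_true] += 1

# Correction matrix: row-normalise to get P(true count | observed count).
row = counts.sum(axis=1, keepdims=True)
correction = np.divide(counts, row, out=np.zeros_like(counts),
                       where=row > 0)

# Applying it: expected true count on a day the observer reports 3 groups.
expected_true = correction[3] @ np.arange(max_groups + 1)
print("observed 3 groups -> expected true count:", round(expected_true, 2))
```

Because the fraction of groups above the threshold varies with the day's group population, the resulting matrix is not a simple proportionality, consistent with the non-linearity emphasized above.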

Relevance:

100.00%

Publisher:

Abstract:

A truly variance-minimizing filter is introduced and its performance is demonstrated with the Korteweg–de Vries (KdV) equation and with a multilayer quasigeostrophic model of the ocean area around South Africa. It is recalled that Kalman-like filters are not variance minimizing for nonlinear model dynamics and that four-dimensional variational data assimilation (4DVAR)-like methods relying on perfect model dynamics have difficulty with providing error estimates. The new method does not have these drawbacks. In fact, it combines advantages from both methods in that it does provide error estimates while automatically having balanced states after analysis, without extra computations. It is based on ensemble or Monte Carlo integrations to simulate the probability density of the model evolution. When observations are available, the so-called importance resampling algorithm is applied. From Bayes's theorem it follows that each ensemble member receives a new weight dependent on its "distance" to the observations. Because the weights are strongly varying, a resampling of the ensemble is necessary. This resampling is done such that members with high weights are duplicated according to their weights, while low-weight members are largely ignored. In passing, it is noted that data assimilation is not an inverse problem by nature, although it can be formulated that way. Also, it is shown that the posterior variance can be larger than the prior if the usual Gaussian framework is set aside. However, in the examples presented here, the entropy of the probability densities is decreasing. The application to the ocean area around South Africa, governed by strongly nonlinear dynamics, shows that the method is working satisfactorily. The strong and weak points of the method are discussed and possible improvements are proposed.
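A minimal sketch of a single importance-resampling analysis step of the kind described, assuming a scalar observation of the first state variable with Gaussian errors (the state dimension, observation operator, and parameters are illustrative, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(3)

def importance_resampling_step(ensemble, obs, obs_std):
    """One importance-resampling analysis step. `ensemble` has shape
    (n_members, n_state); the observation is of state variable 0."""
    predicted = ensemble[:, 0]                 # H(x): observe x[0]
    # Bayes: each member is weighted by the observation likelihood,
    # i.e. by its "distance" to the observation.
    logw = -0.5 * ((obs - predicted) / obs_std) ** 2
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Resample with replacement: high-weight members are duplicated,
    # low-weight members are likely dropped.
    idx = rng.choice(len(ensemble), size=len(ensemble), p=w)
    return ensemble[idx]

# Toy usage: 100 members of a 3-variable state, one observation.
ens = rng.normal(0.0, 1.0, size=(100, 3))
ens = importance_resampling_step(ens, obs=0.8, obs_std=0.2)
print("posterior mean of observed variable:", ens[:, 0].mean().round(2))
```

Because each retained member is a full model state produced by the model dynamics, the analysis ensemble is automatically balanced, which is the advantage noted above.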

Relevance:

100.00%

Publisher:

Abstract:

Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model's output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at the large scales required for climate models, such spatial modelling can be difficult. Moreover, computer models often require as inputs land cover proportions at sites larger than the original map scale, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining the spatial information in the land cover map with its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
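One plausible reading of such a scheme, sketched below: treat each row of the confusion matrix as evidence for P(true class | mapped class), draw the class-conditional probabilities from a Dirichlet posterior, and resample pixel labels to build a distribution of site-level proportions. The three-class confusion matrix and pixel counts are hypothetical, and this is a sketch of the general idea rather than the article's exact algorithm:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical 3-class confusion matrix: rows are mapped classes,
# columns are reference ("true") classes.
confusion = np.array([[80,  15,  5],
                      [10, 120, 20],
                      [ 5,  10, 60]], dtype=float)

mapped_counts = np.array([400, 350, 250])  # pixels per mapped class at a site

def sample_proportions(n_draws=1000):
    """Monte Carlo realisations of the site's true class proportions."""
    draws = np.zeros((n_draws, 3))
    for k in range(n_draws):
        true_counts = np.zeros(3)
        for c, n_pix in enumerate(mapped_counts):
            # Dirichlet posterior over P(true | mapped = c), using the
            # confusion-matrix row as counts plus a uniform prior.
            p = rng.dirichlet(confusion[c] + 1.0)
            true_counts += rng.multinomial(n_pix, p)
        draws[k] = true_counts / mapped_counts.sum()
    return draws

draws = sample_proportions()
print("posterior mean proportions:", draws.mean(axis=0).round(3))
print("posterior std of proportions:", draws.std(axis=0).round(3))
```

The spread of the sampled proportions is exactly the input uncertainty that can then be propagated through a land surface model.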