46 results for Cumulative Distribution Function


Relevance: 80.00%

Abstract:

Land surface albedo, a key parameter for deriving Earth's surface energy balance, is used in the parameterization of numerical weather prediction, climate monitoring and climate change impact assessments. Changes in albedo due to fire have not been fully investigated on a continental and global scale. The main goal of this study, therefore, is to quantify the changes in instantaneous shortwave albedo produced by biomass burning activities and their associated radiative forcing. The study relies on the MODerate-resolution Imaging Spectroradiometer (MODIS) MCD64A1 burned-area product to create an annual composite of areas affected by fire and the MCD43C2 bidirectional reflectance distribution function (BRDF) snow-free albedo product to compute a bihemispherical reflectance time series. The approximate day of burning is used to calculate the instantaneous change in shortwave albedo. Using the corresponding National Centers for Environmental Prediction (NCEP) monthly mean downward solar radiation flux at the surface, the global radiative forcing associated with fire was computed. The analysis reveals a mean shortwave albedo change of −0.014 (1σ = 0.017), causing a mean positive radiative forcing of 3.99 Wm−2 (1σ = 4.89) over the 2002–2012 period in areas affected by fire. The largest mean drop in shortwave albedo occurs in 2002, which corresponds to the highest total area burned (378 Mha) observed in the same year and produces the highest mean radiative forcing (4.5 Wm−2). Africa is the main contributor in terms of burned area, but forests globally produce the highest radiative forcing per unit area and thus yield the most readily detectable changes in shortwave albedo. The global mean radiative forcing for the whole period studied (~0.0275 Wm−2) shows that the contribution of fires to the Earth system is not insignificant.
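
The radiative-forcing step described above reduces to multiplying the fire-driven albedo change by the downward shortwave flux at the surface. A minimal sketch, with an illustrative flux value rather than the MODIS/NCEP data:

```python
# Minimal sketch of the instantaneous radiative forcing implied by a fire-driven
# albedo change: a drop in surface albedo increases absorbed shortwave, so
# RF = -delta_albedo * downward_flux. The flux value below is illustrative only
# (chosen to reproduce the quoted mean), not an NCEP value.
delta_albedo = -0.014        # mean post-fire change in shortwave albedo
downward_flux = 285.0        # W m-2, assumed monthly-mean downward solar flux
radiative_forcing = -delta_albedo * downward_flux
print(f"instantaneous radiative forcing: {radiative_forcing:.2f} W m-2")  # ~3.99
```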

Relevance: 80.00%

Abstract:

Large waves pose risks to ships, offshore structures, coastal infrastructure and ecosystems. This paper analyses 10 years of in-situ measurements of significant wave height (Hs) and maximum wave height (Hmax) from the ocean weather ship Polarfront in the Norwegian Sea. During the period 2000 to 2009, surface elevation was recorded every 0.59 s during sampling periods of 30 min. The Hmax observations scale linearly with Hs on average. A widely used empirical Weibull distribution is found to estimate average values of Hmax/Hs and Hmax better than a Rayleigh distribution, but tends to underestimate both for all but the smallest waves. In this paper we propose a modified Rayleigh distribution which compensates for the heterogeneity of the observed dataset: the distribution is fitted to the whole dataset and improves the estimate of the largest waves. Over the 10-year period, the Weibull distribution approximates the observed Hs and Hmax well, and an exponential function can be used to predict the probability distribution function of the ratio Hmax/Hs. However, the Weibull distribution tends to underestimate the occurrence of extremely large values of Hs and Hmax. The persistence of Hs and Hmax in winter is also examined. Wave fields with Hs>12 m and Hmax>16 m do not last longer than 3 h. Low-to-moderate wave heights that persist for more than 12 h dominate the relationship of the wave field with the winter NAO index over 2000–2009. In contrast, the inter-annual variability of wave fields with Hs>5.5 m or Hmax>8.5 m and wave fields persisting over ~2.5 days is not associated with the winter NAO index.
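
As a reference point for the Hmax/Hs ratios discussed above, a hedged sketch of the classical narrow-band Rayleigh expectation for a record of N waves (the asymptotic Longuet-Higgins-type result); the record length of ~180 waves is an assumption, roughly a 30-minute record with a ~10 s mean period:

```python
# Expected Hmax/Hs for n_waves under the classical narrow-band Rayleigh
# assumption, using the standard asymptotic (Gumbel-limit) approximation.
import math

def rayleigh_hmax_over_hs(n_waves: int) -> float:
    """Asymptotic expectation of Hmax/Hs for n_waves Rayleigh-distributed heights."""
    mode = math.sqrt(0.5 * math.log(n_waves))     # most probable maximum
    euler_gamma = 0.5772156649
    return mode + euler_gamma / (4.0 * mode)      # mean correction to the mode

print(f"expected Hmax/Hs for 180 waves: {rayleigh_hmax_over_hs(180):.2f}")  # ~1.7
```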

Relevance: 80.00%

Abstract:

The co-polar correlation coefficient (ρhv) has many applications, including hydrometeor classification, ground clutter and melting layer identification, interpretation of ice microphysics and the retrieval of rain drop size distributions (DSDs). However, we currently lack the quantitative error estimates that are necessary if these applications are to be fully exploited. Previous error estimates of ρhv rely on knowledge of the unknown "true" ρhv and implicitly assume a Gaussian probability distribution function of ρhv samples. We show that frequency distributions of ρhv estimates are in fact highly negatively skewed. A new variable, L = -log10(1 - ρhv), is defined, which does have Gaussian error statistics and a standard deviation that depends only on the number of independent radar pulses. This is verified using observations of spherical drizzle drops, allowing, for the first time, the construction of rigorous confidence intervals in estimates of ρhv. In addition, we demonstrate how the imperfect co-location of the horizontal and vertical polarisation sample volumes may be accounted for. The possibility of using L to estimate the dispersion parameter (µ) in the gamma drop size distribution is investigated. We find that including drop oscillations is essential for this application, otherwise there could be biases in retrieved µ of up to ~8. Preliminary results in rainfall are presented. In a convective rain case study, our estimates show µ to be substantially larger than 0 (an exponential DSD). In this particular rain event, rain rate would be overestimated by up to 50% if a simple exponential DSD is assumed.
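
A minimal sketch, assuming nothing beyond the quoted transform, of how symmetric Gaussian confidence bounds on L translate into appropriately skewed bounds on ρhv; the value of σ_L, which the paper ties to the number of independent radar pulses, is left as a free input here:

```python
# Work in L = -log10(1 - rho_hv), whose sampling errors are approximately
# Gaussian, and map a symmetric interval in L back to bounds on rho_hv.
import numpy as np

def rho_to_L(rho):
    """Transform rho_hv to the approximately Gaussian variable L."""
    return -np.log10(1.0 - rho)

def L_to_rho(L):
    """Inverse transform from L back to rho_hv."""
    return 1.0 - 10.0 ** (-L)

def rho_confidence_interval(rho_est, sigma_L, n_sigma=2.0):
    """Map a +/- n_sigma interval in L-space back to bounds on rho_hv."""
    L = rho_to_L(rho_est)
    return L_to_rho(L - n_sigma * sigma_L), L_to_rho(L + n_sigma * sigma_L)

# Illustrative values only: an estimated rho_hv of 0.995 with sigma_L = 0.05.
print(rho_confidence_interval(0.995, sigma_L=0.05))
```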

Relevance: 40.00%

Abstract:

This investigation deals with the question of when a particular population can be considered to be disease-free. The motivation is the case of BSE, where specific birth cohorts may constitute distinct disease-free subpopulations. The specific objective is to develop a statistical approach suitable for documenting freedom from disease, in particular freedom from BSE in birth cohorts. The approach is based upon a geometric waiting time distribution for the occurrence of positive surveillance results and formalizes the relationship between design prevalence, cumulative sample size and statistical power. The simple geometric waiting time model is further modified to account for the diagnostic sensitivity and specificity associated with the detection of disease. This is exemplified for BSE using two different models for the diagnostic sensitivity. The model is furthermore modified so that a set of different values for the design prevalence in the surveillance streams can be accommodated (prevalence heterogeneity), and a general expression for the power function is developed. For illustration, numerical results for BSE suggest that currently (data status September 2004) a birth cohort of Danish cattle born after March 1999 is free from BSE with a probability (power) of 0.8746 or 0.8509, depending on the choice of model for the diagnostic sensitivity.
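
The core relationship formalised above can be sketched as follows: under a geometric waiting-time model, the probability of having observed at least one positive surveillance result (the power to detect disease at the design prevalence) grows with cumulative sample size, with diagnostic sensitivity scaling the per-sample detection probability. The sensitivity value and sample size below are illustrative assumptions, not the BSE-specific models of the study.

```python
# Power of a surveillance programme to detect disease at the design prevalence,
# assuming a geometric waiting time for the first positive result.
def surveillance_power(design_prevalence: float,
                       sensitivity: float,
                       cumulative_samples: int) -> float:
    """P(at least one detected positive) after cumulative_samples tests."""
    p_detect = design_prevalence * sensitivity
    return 1.0 - (1.0 - p_detect) ** cumulative_samples

# Illustrative values: design prevalence 1 in 10,000, assumed 70% sensitivity.
print(surveillance_power(1e-4, 0.7, 20_000))   # ~0.75
```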

Relevance: 30.00%

Abstract:

Substantial resources are used for surveillance of bovine spongiform encephalopathy (BSE) despite an extremely low detection rate, especially in healthy slaughtered cattle. We have developed a method based on the geometric waiting time distribution to establish and update the statistical evidence for freedom from BSE for defined birth cohorts using continued surveillance data. The results suggest that currently (data included up to September 2004) a birth cohort of Danish cattle born after March 1999 is free from BSE with a probability (power) of 0.8746 or 0.8509, depending on the choice of model for the diagnostic sensitivity. These results apply to an assumed design prevalence of 1 in 10,000 and account for prevalence heterogeneity. The age-dependent diagnostic sensitivity for the detection of BSE has been identified as a major determinant of the power. The incorporation of heterogeneity was deemed adequate on scientific grounds and led to improved power values. We propose our model as a decision tool for possible future modification of BSE surveillance and discuss public health and international trade implications.

Relevance: 30.00%

Abstract:

Sulphate-reducing bacteria (SRB) and methanogenic archaea (MA) are important anaerobic terminal oxidisers of organic matter. However, we have little knowledge about the distribution and types of SRB and MA in the environment or the functional role they play in situ. Here we have utilised sediment slurry microcosms amended with ecologically significant substrates, including acetate and hydrogen, and specific functional inhibitors, to identify the important SRB and MA groups in two contrasting sites on a UK estuary. Substrate and inhibitor additions had significant effects on methane production and on acetate and sulphate consumption in the slurries. By using specific 16S-targeted oligonucleotide probes we were able to link specific SRB and MA groups to the use of the added substrates. Acetate consumption in the freshwater-dominated sediments was mediated by Methanosarcinales under low-sulphate conditions and by Desulfobacter under the high-sulphate conditions that simulated a tidal incursion. In the marine-dominated sediments, acetate consumption was linked to Desulfobacter. Addition of trimethylamine, a non-competitive substrate for methanogenesis, led to a large increase in the Methanosarcinales signal in marine slurries. Desulfobulbus was linked to non-sulphate-dependent H2 consumption in the freshwater sediments. The addition of sulphate to freshwater sediments inhibited methane production and reduced the signal from probes targeted to Methanosarcinales and Methanomicrobiales, while the addition of molybdate to marine sediments inhibited Desulfobulbus and Desulfobacterium. These data complement our understanding of the ecophysiology of the organisms detected and make a firm connection between the capabilities of species, as observed in the laboratory, and their roles in the environment.

Relevance: 30.00%

Abstract:

The correlated k-distribution (CKD) method is widely used in the radiative transfer schemes of atmospheric models and involves dividing the spectrum into a number of bands and then reordering the gaseous absorption coefficients within each one. The fluxes and heating rates for each band may then be computed by discretizing the reordered spectrum into on the order of 10 quadrature points per major gas and performing a monochromatic radiation calculation for each point. Here it is shown that for clear-sky longwave calculations, sufficient accuracy for most applications can be achieved without the need for bands: reordering may be performed on the entire longwave spectrum. The resulting full-spectrum correlated-k (FSCK) method requires significantly fewer monochromatic calculations than standard CKD to achieve a given accuracy. The concept is first demonstrated by comparison with line-by-line calculations for an atmosphere containing only water vapor, in which it is shown that the accuracy of heating-rate calculations improves approximately in proportion to the square of the number of quadrature points. For more than around 20 points, the root-mean-squared error flattens out at around 0.015 K/day due to the imperfect rank correlation of absorption spectra at different pressures in the profile. The spectral overlap of m different gases is treated by considering an m-dimensional hypercube where each axis corresponds to the reordered spectrum of one of the gases. This hypercube is then divided into a number of volumes, each approximated by a single quadrature point, such that the total number of quadrature points is slightly smaller than the sum of the numbers that would be required to treat each of the gases separately. The gaseous absorptions for each quadrature point are optimized such that they minimize a cost function expressing the deviation of the heating rates and fluxes calculated by the FSCK method from line-by-line calculations for a number of training profiles. This approach is validated for atmospheres containing water vapor, carbon dioxide, and ozone, in which it is found that in the troposphere and most of the stratosphere, heating-rate errors of less than 0.2 K/day can be achieved using a total of 23 quadrature points, decreasing to less than 0.1 K/day for 32 quadrature points. It would be relatively straightforward to extend the method to include other gases.
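
The reordering at the heart of the k-distribution idea can be sketched as follows: sorting high-resolution absorption coefficients yields a smooth k(g) as a function of cumulative probability g, which can then be sampled at a handful of quadrature points instead of every spectral line. The synthetic spectrum and equal-weight quadrature below are illustrative simplifications, not the optimised FSCK scheme described above.

```python
# Reorder a (synthetic) absorption spectrum into a cumulative k-distribution
# and compare a band-mean transmittance computed line-by-line with one
# computed from ~10 quadrature points.
import numpy as np

rng = np.random.default_rng(1)
k_spectrum = 10 ** rng.normal(-1.0, 1.5, 10_000)   # synthetic absorption coefficients

# Reorder: sort k and associate each value with its cumulative probability g.
k_sorted = np.sort(k_spectrum)
g = (np.arange(k_sorted.size) + 0.5) / k_sorted.size

# Pick ~10 quadrature points in g (equal weights here; real schemes optimise them).
g_quad = np.linspace(0.05, 0.95, 10)
k_quad = np.interp(g_quad, g, k_sorted)

# Band-mean transmittance at absorber amount u: full spectrum vs quadrature.
u = 0.3
t_exact = np.mean(np.exp(-k_spectrum * u))
t_quad = np.mean(np.exp(-k_quad * u))
print(f"transmittance: line-by-line {t_exact:.4f}, 10-point k-distribution {t_quad:.4f}")
```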

Relevance: 30.00%

Abstract:

Background: The FFAR1 receptor is a long-chain fatty acid G-protein-coupled receptor that is expressed widely but is found at high density in the pancreas and central nervous system. It has been suggested that FFAR1 may play a role in insulin sensitivity and lipotoxicity and is associated with type 2 diabetes. Here we investigate the effect of three common SNPs of FFAR1 (rs2301151; rs16970264; rs1573611) on pancreatic function, BMI, body composition and plasma lipids. Methodology/Principal Findings: For this enquiry we used the baseline RISCK data, which provides a cohort of overweight subjects at increased cardiometabolic risk with detailed phenotyping. The key findings were that SNPs of the FFAR1 gene region were associated with differences in body composition and lipids, and that the combined effects of the three SNPs on BMI, body composition and total cholesterol were cumulative. The effects on BMI and body fat were predominantly mediated by rs1573611 (1.06 kg/m2 higher BMI (P = 0.009) and 1.53% higher body fat (P = 0.002) per C allele). Differences in plasma lipids were also associated with the BMI-increasing allele of rs2301151, including higher total cholesterol (0.2 mmol/L per G allele, P = 0.01), and with the variant A allele of rs16970264, which was associated with lower total (0.3 mmol/L, P = 0.02) and LDL (0.2 mmol/L, P<0.05) cholesterol but also with lower HDL cholesterol (0.09 mmol/L, P<0.05), although the difference was not apparent when controlling for multiple testing. There were no statistically significant effects of the three SNPs on insulin sensitivity or beta-cell function; however, the accumulated risk alleles were associated with lower beta-cell function with increasing plasma fatty acids of carbon chain length greater than six. Conclusions/Significance: Differences in body composition and lipids associated with common SNPs in the FFAR1 gene were apparently not mediated by changes in insulin sensitivity or beta-cell function.

Relevance: 30.00%

Abstract:

In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems. In this setting, recent works have shown how to obtain statistics of extremes in agreement with classical Extreme Value Theory. We pursue these investigations by giving analytical expressions for the Extreme Value distribution parameters for maps that have an absolutely continuous invariant measure. We compare these analytical results with numerical experiments in which we study the convergence to limiting distributions using the so-called block-maxima approach, pointing out in which cases we obtain robust estimates of the parameters. In regular maps for which mixing properties do not hold, we show that the fitting procedure to the classical Extreme Value distribution fails, as expected. However, we obtain an empirical distribution that can be explained starting from a different observable function for which Nicolis et al. (Phys. Rev. Lett. 97(21): 210602, 2006) have found analytical results.
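
A hedged sketch of the block-maxima procedure on a map with an absolutely continuous invariant measure; the fully chaotic logistic map and the observable φ(x) = −log|x − x₀| are stand-in choices, not necessarily those used in the paper.

```python
# Block-maxima approach: iterate a chaotic map, evaluate a distance-based
# observable, take maxima over blocks, and fit the Generalised Extreme Value
# family to those maxima.
import numpy as np
from scipy.stats import genextreme

def logistic_orbit(x0, n):
    """Iterate the fully chaotic logistic map x -> 4x(1-x)."""
    x = np.empty(n)
    x[0] = x0
    for i in range(1, n):
        x[i] = 4.0 * x[i - 1] * (1.0 - x[i - 1])
    return x

orbit = logistic_orbit(0.3141, 1_000_000)
target = 0.5
# Observable peaking at the target point; floored to avoid log(0).
phi = -np.log(np.maximum(np.abs(orbit - target), 1e-15))

block = 10_000
maxima = phi[: (phi.size // block) * block].reshape(-1, block).max(axis=1)

# Fit the GEV family to the block maxima; for this logarithmic observable the
# classical theory predicts a Gumbel-type (near-zero shape) limit.
shape, loc, scale = genextreme.fit(maxima)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.3f}")
```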

Relevance: 30.00%

Abstract:

In this paper we perform an analytical and numerical study of Extreme Value distributions in discrete dynamical systems that have a singular measure. Using the block-maxima approach described in Faranda et al. [2011], we show numerically that the Extreme Value distribution for these maps can be associated with the Generalised Extreme Value family, with parameters that scale with the information dimension. The numerical analyses are performed on a few low-dimensional maps. For the middle-third Cantor set and the Sierpinski triangle obtained using Iterated Function Systems, the experimental parameters show very good agreement with the theoretical values. For strange attractors such as the Lozi and Hénon maps, slower convergence to the Generalised Extreme Value distribution is observed. Even with large statistics, the observed convergence is slower than for maps that have an absolutely continuous invariant measure. Nevertheless, within the computed uncertainty range, the results are in good agreement with the theoretical estimates.

Relevance: 30.00%

Abstract:

A common procedure for studying the effects on cognition of repetitive transcranial magnetic stimulation (rTMS) is to deliver rTMS concurrent with task performance, and to compare task performance on these trials versus on trials without rTMS. Recent evidence that TMS can have effects on neural activity that persist longer than the experimental session itself, however, raises questions about the assumption of the transient nature of rTMS that underlies many concurrent (or "online") rTMS designs. To our knowledge, there have been no studies in the cognitive domain examining whether the application of brief trains of rTMS during specific epochs of a complex task may have effects that spill over into subsequent task epochs, and perhaps into subsequent trials. We looked for possible immediate spill-over and longer-term cumulative effects of rTMS in data from two studies of visual short-term delayed recognition. In 54 subjects, 10-Hz rTMS trains were applied to five different brain regions during the 3-s delay period of a spatial task, and in a second group of 15 subjects, electroencephalography (EEG) was recorded while 10-Hz rTMS was applied to two brain areas during the 3-s delay period of both spatial and object tasks. No evidence for immediate effects was found in the comparison of the memory probe-evoked response on trials that were vs. were not preceded by delay-period rTMS. No evidence for cumulative effects was found in analyses of behavioral performance, and of the EEG signal, as a function of task block. The implications of these findings, and their relation to the broader literature on acute vs. long-lasting effects of rTMS, are considered.

Relevance: 30.00%

Abstract:

We introduce a new methodology that allows the construction of wave frequency distributions due to growing incoherent whistler-mode waves in the magnetosphere. The technique combines the equations of geometric optics (i.e. raytracing) with the equation of transfer of radiation in an anisotropic lossy medium to obtain the spectral energy density as a function of frequency and wave normal angle. We describe the method in detail and then demonstrate how it could be used in an idealised magnetosphere during quiet geomagnetic conditions. For a specific set of plasma conditions, we predict that the wave power peaks off the equator at ~15 degrees magnetic latitude. The new calculations predict that wave power as a function of frequency can be adequately described by a Gaussian function, but as a function of wave normal angle it more closely resembles a skew normal distribution. The technique described in this paper yields the first known estimate of the parallel and oblique incoherent wave spectrum resulting from growing whistler-mode waves, and provides a means to incorporate self-consistent wave-particle interactions in a kinetic model of the magnetosphere over a large volume.
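
For illustration of the functional forms reported above, a hedged sketch that evaluates a Gaussian shape in frequency and a skew normal shape in wave normal angle; all parameter values are placeholders, not outputs of the raytracing/radiative-transfer calculation.

```python
# Illustrative spectral shapes: Gaussian in frequency, skew normal in angle.
import numpy as np
from scipy.stats import norm, skewnorm

freq = np.linspace(0.1, 0.6, 200)    # frequency, in units of the gyrofrequency
psi = np.linspace(0.0, 60.0, 200)    # wave normal angle (degrees)

power_vs_freq = norm.pdf(freq, loc=0.35, scale=0.08)            # symmetric in frequency
power_vs_angle = skewnorm.pdf(psi, a=4.0, loc=5.0, scale=15.0)  # skewed in angle

print(f"frequency-spectrum peak near {freq[np.argmax(power_vs_freq)]:.2f} f_ce")
print(f"angular-spectrum peak near {psi[np.argmax(power_vs_angle)]:.0f} degrees")
```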

Relevance: 30.00%

Abstract:

A primitive equation model is used to study the sensitivity of baroclinic wave life cycles to the initial latitude-height distribution of humidity. Diabatic heating is parametrized only as a consequence of condensation in regions of large-scale ascent. Experiments are performed in which the initial relative humidity is a simple function of model level, and in some cases latitude bands are specified which are initially relatively dry. It is found that the presence of moisture can either increase or decrease the peak eddy kinetic energy of the developing wave, depending on the initial moisture distribution. A relative abundance of moisture at mid-latitudes tends to weaken the wave, while a relative abundance at low latitudes tends to strengthen it. This sensitivity exists because competing processes are at work. These processes are described in terms of energy box diagnostics. The most realistic case lies on the cusp of this sensitivity. Further physical parametrizations are then added, including surface fluxes and upright moist convection. These have the effect of increasing wave amplitude, but the sensitivity to initial conditions of relative humidity remains. Finally, 'control' and 'doubled CO2' life cycles are performed, with initial conditions taken from the time-mean zonal-mean output of equilibrium GCM experiments. The attenuation of the wave resulting from reduced baroclinicity is more pronounced than any effect due to changes in initial moisture.

Relevance: 30.00%

Abstract:

We develop an orthogonal forward selection (OFS) approach to construct radial basis function (RBF) network classifiers for two-class problems. Our approach integrates several concepts in probabilistic modelling, including cross validation, mutual information and Bayesian hyperparameter fitting. At each stage of the OFS procedure, one model term is selected by maximising the leave-one-out mutual information (LOOMI) between the classifier's predicted class labels and the true class labels. We derive the formula of the LOOMI within the OFS framework so that it can be evaluated efficiently for model term selection. Furthermore, a Bayesian procedure of hyperparameter fitting is also integrated into each stage of the OFS to infer the l2-norm based local regularisation parameter from the data. Since each forward stage is effectively the fitting of a one-variable model, this task is very fast. The classifier construction procedure terminates automatically, without the need for an additional stopping criterion, yielding very sparse RBF classifiers with excellent classification generalisation performance, which is particularly useful for noisy data sets with highly overlapping class distributions. A number of benchmark examples are employed to demonstrate the effectiveness of our proposed approach.
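
A hedged, simplified sketch of greedy forward selection of RBF centres by a mutual-information criterion, in the spirit of the approach above. Simplifications not taken from the paper: a fixed ridge penalty in place of Bayesian local regularisation, and mutual information computed from in-sample predictions rather than the leave-one-out (LOOMI) formulation.

```python
# Greedy forward selection of RBF centres for a two-class problem, scoring each
# candidate by the mutual information between predicted and true class labels.
import numpy as np

def rbf_column(X, centre, width):
    """Gaussian RBF response of every sample to one candidate centre."""
    d2 = np.sum((X - centre) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * width ** 2))

def mutual_information(y_true, y_pred):
    """MI (in nats) between two binary label vectors via a 2x2 contingency table."""
    mi = 0.0
    for a in (0, 1):
        for b in (0, 1):
            p_ab = np.mean((y_true == a) & (y_pred == b))
            p_a, p_b = np.mean(y_true == a), np.mean(y_pred == b)
            if p_ab > 0:
                mi += p_ab * np.log(p_ab / (p_a * p_b))
    return mi

def forward_select_rbf(X, y, width=1.0, lam=1e-3, max_terms=20):
    """Greedily add RBF centres (drawn from the training points) that maximise
    the mutual information between predicted and true class labels."""
    candidates = list(range(len(X)))
    selected, Phi = [], np.empty((len(X), 0))
    best_mi = -np.inf
    for _ in range(max_terms):
        scores = []
        for c in candidates:
            Phi_c = np.column_stack([Phi, rbf_column(X, X[c], width)])
            # Ridge-regularised least-squares weights for the candidate model.
            w = np.linalg.solve(Phi_c.T @ Phi_c + lam * np.eye(Phi_c.shape[1]),
                                Phi_c.T @ (2 * y - 1))
            y_pred = (Phi_c @ w > 0).astype(int)
            scores.append(mutual_information(y, y_pred))
        best = int(np.argmax(scores))
        if scores[best] <= best_mi:          # no improvement: stop automatically
            break
        best_mi = scores[best]
        chosen = candidates.pop(best)
        selected.append(chosen)
        Phi = np.column_stack([Phi, rbf_column(X, X[chosen], width)])
    return selected, best_mi

# Toy usage: two overlapping Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (100, 2)), rng.normal(1, 1, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
centres, mi = forward_select_rbf(X, y)
print(f"selected {len(centres)} centres, MI = {mi:.3f}")
```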