927 results for "distributions to shareholders"


Relevance:

30.00%

Abstract:

We have used a fully coupled chemistry-climate model (CCM), which generates its own wind and temperature quasi-biennial oscillation (QBO), to study the effect of coupling on the QBO and to examine the QBO signals in stratospheric trace gases, particularly ozone. Radiative coupling of the interactive chemistry to the underlying general circulation model tends to prolong the QBO period and to increase the QBO amplitude in the equatorial zonal wind in the lower and middle stratosphere. The model ozone QBO agrees well with Stratospheric Aerosol and Gas Experiment II and Total Ozone Mapping Spectrometer satellite observations in terms of vertical and latitudinal structure. The model captures the ozone QBO phase change near 28 km over the equator and the column phase change near +/- 15 degrees latitude. Diagnosis of the model chemical terms shows that variations in NOx are the main chemical driver of the O3 QBO around 35 km, i.e., above the O3 phase change.


Much uncertainty in the value of the imaginary part of the refractive index of mineral dust contributes to uncertainty in the radiative effect of mineral dust in the atmosphere. A synthesis of optical, chemical and physical in-situ aircraft measurements from the DODO experiments during February and August 2006 is used to calculate the refractive index of mineral dust encountered over West Africa. Radiative transfer modeling and measurements of broadband shortwave irradiance at a range of altitudes are used to test and validate these calculations for a specific dust event on 23 August 2006 over Mauritania. Two techniques are used to determine the refractive index: firstly a method combining measurements of scattering, absorption, size distributions and Mie code simulations, and secondly a method using composition measured on filter samples to apportion the content of internally mixed quartz, calcite and iron oxide-clay aggregates, where the iron oxide is represented by either hematite or goethite and the clay by either illite or kaolinite. The imaginary part of the refractive index at 550 nm (ni550) is found to range between 0.0001i and 0.0046i, and where filter samples are available, the two methods agree, depending on the mineral combination assumed. The refractive indices are also found to agree well with AERONET data where comparisons are possible. ni550 is found to vary with dust source, which is investigated with the NAME model for each case. The effects of both size distribution and ni550 on the accumulation-mode single scattering albedo at 550 nm (ω0550) are examined: size distribution is found to have no correlation with ω0550, while ni550 shows a strong linear relationship with ω0550. Radiative transfer modeling was performed with different refractive-index models (Mie-derived values, and filter-sample compositions assuming both internal and external mixing). 
Our calculations indicate that Mie-derived values of ni550 and the externally mixed dust where the iron oxide-clay aggregate corresponds to the goethite-kaolinite combination result in the best agreement with irradiance measurements. The radiative effect of the dust is found to be very sensitive to the mineral combination (and hence refractive index) assumed, and to whether the dust is assumed to be internally or externally mixed.
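As a rough illustration of the internal-mixing assumption discussed above, the sketch below computes a volume-weighted complex refractive index at 550 nm from assumed mineral volume fractions. The component fractions and refractive indices are invented placeholders, not the DODO filter-sample values, and volume-weighted averaging is only one of several possible mixing rules.

```python
# Hypothetical sketch: volume-weighted ("internal") mixing of mineral
# refractive indices at 550 nm. All component values below are
# illustrative placeholders, not the DODO filter-sample results.
components = {
    # name: (volume_fraction, complex refractive index at 550 nm)
    "quartz":    (0.60, 1.55 + 0.0000j),
    "calcite":   (0.25, 1.60 + 0.0000j),
    "goethite":  (0.05, 2.10 + 0.0900j),
    "kaolinite": (0.10, 1.56 + 0.0001j),
}

def volume_mixed_index(components):
    """Volume-weighted average of complex refractive indices."""
    total = sum(f for f, _ in components.values())
    return sum(f * m for f, m in components.values()) / total

m_eff = volume_mixed_index(components)
print(round(m_eff.imag, 5))  # ni550 of the mixed index for these toy fractions
```

With these invented fractions, essentially all of the absorption (the imaginary part) comes from the small goethite fraction, which mirrors the abstract's point that the assumed iron-oxide representation dominates the radiative effect.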


Many models of immediate memory predict the presence or absence of various effects, but none have been tested to see whether they predict an appropriate distribution of effect sizes. The authors show that the feature model (J. S. Nairne, 1990) produces appropriate distributions of effect sizes for both the phonological confusion effect and the word-length effect. The model produces the appropriate number of reversals (cases in which participants are more accurate with similar items or long items), and also correctly predicts that participants performing less well overall demonstrate smaller and less reliable phonological similarity and word-length effects and are more likely to show reversals. These patterns appear within the model without the need to assume a change in encoding or rehearsal strategy or the deployment of a different storage buffer. The implications of these results and the wider applicability of the distribution-modeling approach are discussed.


Carbendazim-amended soil was placed above or below unamended soil. Control tests comprised two layers of unamended soil. Allolobophora chlorotica earthworms were added to either the upper or the unamended soil. After 72 h, vertical distributions of earthworms were compared between the control and carbendazim-amended experiments. Earthworm distributions in the carbendazim-amended test containers differed significantly from the ‘normal’ distribution observed in the control tests. In the majority of the experiments, earthworms significantly altered their burrowing behaviour to avoid carbendazim. However, when earthworms were added to an upper layer of carbendazim-amended soil they remained in this layer. This non-avoidance is attributed to (1) the earthworms’ inability to sense the lower layer of unamended soil and (2) the toxic effect of carbendazim inhibiting burrowing. Earthworms modified their burrowing behaviour in response to carbendazim in the soil. This may explain anomalous results observed in pesticide field trials when carbendazim is used as a control substance.


This study investigates the function of non-cropped field margins in arable farming systems for enhancing the biodiversity value of beetle communities. Three different sown seed mixtures were used to establish field margins: a Countryside Stewardship mix, a fine grass and forbs mix, and a tussock grass and forbs mix. The structure of beetle communities in the first full year of establishment showed no difference between the tussock grass and Countryside Stewardship margins. However, both differed from the fine grass margins, which supported lower overall abundance and species richness of beetles. This was attributed to small-scale architectural differences between species of fine and tussock grasses, rather than differences in plant composition. Body size distributions of beetles showed distinct similarities between the Countryside Stewardship and tussock margins. A greater abundance of large beetles was found in fine grass margins, although in all cases these body size distributions were attributed to a small number of species or a taxonomically distinct group. All three margin types included beetle species of conservation value. The importance of these results is discussed in the context of the value of these seed mixtures for invertebrate conservation. (c) 2004 Elsevier B.V. All rights reserved.


We consider the comparison of two formulations in terms of average bioequivalence using the 2 × 2 cross-over design. In a bioequivalence study, the primary outcome is a pharmacokinetic measure, such as the area under the plasma concentration by time curve, which is usually assumed to have a lognormal distribution. The criterion typically used for claiming bioequivalence is that the 90% confidence interval for the ratio of the means should lie within the interval (0.80, 1.25), or equivalently that the 90% confidence interval for the difference in the means on the natural log scale should be within the interval (-0.2231, 0.2231). We compare the gold-standard method for calculation of the sample size, based on the non-central t distribution, with methods based on the central t and normal distributions. In practice, the differences between the various approaches are likely to be small. Further approximations to the power function are sometimes used to simplify the calculations. These approximations should be used with caution, because the sample size required for a desirable level of power might be under- or overestimated compared to the gold standard method. However, in some situations the approximate methods produce very similar sample sizes to the gold standard method. Copyright © 2005 John Wiley & Sons, Ltd.
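The normal-approximation branch of this comparison can be sketched in a few lines. The code below computes approximate power for the two one-sided tests (TOST) procedure on the log scale, and searches for the smallest balanced sample size; the within-subject CV, target power, and even-n search are illustrative assumptions. The gold-standard calculation would replace the normal CDF with the non-central t distribution (e.g. scipy.stats.nct).

```python
# Hedged sketch: sample size for average bioequivalence in a 2x2
# cross-over using the *normal approximation* to the TOST power
# function. CV and target power are illustrative assumptions; the
# gold standard would use the non-central t distribution instead.
import math
from statistics import NormalDist

LIMIT = math.log(1.25)  # equivalence limits +/- 0.2231 on the log scale

def power_normal(n, cv, delta=0.0, alpha=0.05):
    """Approximate TOST power with n total subjects and within-subject CV."""
    z = NormalDist()
    sigma_w = math.sqrt(math.log(1.0 + cv**2))  # within-subject SD, log scale
    se = sigma_w * math.sqrt(2.0 / n)           # SE of the mean difference
    zcrit = z.inv_cdf(1 - alpha)
    # P(both one-sided tests reject), normal approximation
    upper = (LIMIT - delta) / se - zcrit
    lower = (-LIMIT - delta) / se + zcrit
    return max(0.0, z.cdf(upper) - z.cdf(lower))

def sample_size(cv, target=0.9, delta=0.0):
    """Smallest even total n (balanced sequences) reaching target power."""
    n = 4
    while power_normal(n, cv, delta) < target:
        n += 2
    return n

print(sample_size(0.30))  # -> 38 total subjects under these assumptions
```

As the abstract cautions, such approximations can slightly under- or overestimate the sample size relative to the non-central t calculation, so the result should be checked against the exact method before use.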


The problem of estimating the individual probabilities of a discrete distribution is considered. The true distribution of the independent observations is a mixture of a family of power series distributions. First, we ensure identifiability of the mixing distribution assuming mild conditions. Next, the mixing distribution is estimated by non-parametric maximum likelihood and an estimator for individual probabilities is obtained from the corresponding marginal mixture density. We establish asymptotic normality for the estimator of individual probabilities by showing that, under certain conditions, the difference between this estimator and the empirical proportions is asymptotically negligible. Our framework includes Poisson, negative binomial and logarithmic series as well as binomial mixture models. Simulations highlight the benefit in achieving normality when using the proposed marginal mixture density approach instead of the empirical one, especially for small sample sizes and/or when interest is in the tail areas. A real data example is given to illustrate the use of the methodology.
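A minimal version of the marginal-mixture idea can be sketched with an EM fit of Poisson mixing weights on a fixed grid of support points (the full NPMLE also optimizes the support locations, which is omitted here). The data, grid, and iteration count below are arbitrary choices for the demo.

```python
# Illustrative sketch, not the paper's estimator in full generality:
# non-parametric ML estimation of a Poisson mixing distribution on a
# fixed grid via EM, then individual probabilities from the fitted
# marginal mixture density.
import math
from collections import Counter

def poisson_pmf(x, lam):
    return math.exp(-lam) * lam**x / math.factorial(x)

def npmle_grid(data, grid, iters=300):
    """EM for mixing weights over a fixed grid of Poisson means."""
    w = [1.0 / len(grid)] * len(grid)
    for _ in range(iters):
        new = [0.0] * len(grid)
        for x in data:
            post = [w[j] * poisson_pmf(x, grid[j]) for j in range(len(grid))]
            s = sum(post)
            for j in range(len(grid)):
                new[j] += post[j] / s   # posterior membership of x in component j
        w = [v / len(data) for v in new]
    return w

def mixture_prob(x, grid, w):
    """Estimated individual probability P(X = x) under the fitted mixture."""
    return sum(wj * poisson_pmf(x, gj) for gj, wj in zip(grid, w))

data = [0]*30 + [1]*35 + [2]*20 + [3]*10 + [4]*5   # toy counts
grid = [0.5 * k for k in range(1, 13)]             # support points 0.5 .. 6.0
w = npmle_grid(data, grid)
emp = Counter(data)
for x in range(5):
    print(x, round(mixture_prob(x, grid, w), 3), emp[x] / len(data))
```

The printed comparison of fitted versus empirical proportions illustrates the abstract's point: the two estimators are close, with the smoothed mixture version expected to behave better in the tails and for small samples.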


Estimation of population size with a missing zero-class is an important problem that is encountered in epidemiological assessment studies. Fitting a Poisson model to the observed data by the method of maximum likelihood and estimation of the population size based on this fit is an approach that has been widely used for this purpose. In practice, however, the Poisson assumption is seldom satisfied. Zelterman (1988) has proposed a robust estimator for unclustered data that works well in a wide class of distributions applicable for count data. In the work presented here, we extend this estimator to clustered data. The estimator requires fitting a zero-truncated homogeneous Poisson model by maximum likelihood and thereby using a Horvitz-Thompson estimator of population size. This was found to work well when the data follow the hypothesized homogeneous Poisson model. However, when the true distribution deviates from the hypothesized model, the population size was found to be underestimated. In search of a more robust estimator, we focused on three models that use all clusters with exactly one case, those with exactly two cases, and those with exactly three cases to estimate the probability of the zero-class, and thereby use data collected on all the clusters in the Horvitz-Thompson estimator of population size. Loss in efficiency associated with gain in robustness was examined based on a simulation study. As a trade-off between gain in robustness and loss in efficiency, the model that uses data collected on clusters with at most three cases to estimate the probability of the zero-class was found to be preferred in general. In applications, we recommend obtaining estimates from all three models and making a choice considering the estimates from the three models, robustness and the loss in efficiency. (© 2008 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim)
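For the unclustered, homogeneous case, the zero-truncated Poisson fit and the Horvitz-Thompson step can be sketched as follows; the clustered extension and the more robust Zelterman-type variants described above are not reproduced. The toy counts are invented.

```python
# Hedged sketch: population-size estimation from a zero-truncated
# sample. MLE of lambda under a zero-truncated Poisson, then the
# Horvitz-Thompson estimate N_hat = n / (1 - exp(-lambda_hat)).
import math

def zt_poisson_mle(counts, tol=1e-10):
    """Solve lambda / (1 - exp(-lambda)) = mean(counts) by bisection."""
    xbar = sum(counts) / len(counts)  # mean of the observed (>= 1) counts
    lo, hi = 1e-9, xbar               # the root lies below the raw mean
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid / (1.0 - math.exp(-mid)) < xbar:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def horvitz_thompson(counts):
    """Estimated total population size, including the unseen zero-class."""
    lam = zt_poisson_mle(counts)
    p_seen = 1.0 - math.exp(-lam)     # probability a unit is observed at all
    return len(counts) / p_seen

counts = [1]*60 + [2]*25 + [3]*10 + [4]*5   # observed units only (no zeros)
print(round(horvitz_thompson(counts)))      # about 156 for these toy data
```

As the abstract notes, this estimator is well calibrated only when the homogeneous Poisson assumption holds; under overdispersion it tends to underestimate the population size, which motivates the more robust zero-class models.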


This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
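The reweighting that maps one model onto the other can be checked numerically in the Poisson case: multiplying each mixing weight by its component's detection probability 1 - p_j(0), then renormalizing, turns the truncated mixture into an identical mixture of truncated densities. The component means and weights below are arbitrary toy values.

```python
# Numerical check of the equivalence described above, for a
# zero-truncated two-component Poisson mixture. Weights and means
# are illustrative.
import math

def pois(x, lam):
    return math.exp(-lam) * lam**x / math.factorial(x)

lams = [0.8, 3.0]
w = [0.4, 0.6]           # mixing weights of the untruncated mixture

# Truncated mixture: mix first, then condition on X >= 1.
p0 = sum(wj * pois(0, lj) for wj, lj in zip(w, lams))
def trunc_mix(x):
    return sum(wj * pois(x, lj) for wj, lj in zip(w, lams)) / (1 - p0)

# Mixture of truncated: reweight by the detection probabilities
# 1 - p_j(0), renormalize, then mix the zero-truncated components.
v = [wj * (1 - pois(0, lj)) for wj, lj in zip(w, lams)]
s = sum(v)
v = [vj / s for vj in v]
def mix_trunc(x):
    return sum(vj * pois(x, lj) / (1 - pois(0, lj))
               for vj, lj in zip(v, lams))

assert all(abs(trunc_mix(x) - mix_trunc(x)) < 1e-12 for x in range(1, 10))
print("truncated mixture == mixture of truncated, after reweighting")
```

Because the renormalizing constant s equals 1 - p0, the two densities agree term by term, which is exactly the explicit mapping the abstract refers to.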


Phylogenetic methods hold great promise for the reconstruction of the transition from precursor to modern flora and the identification of underlying factors which drive the process. The phylogenetic methods presently used to address the question of the origin of the Cape flora of South Africa are considered here. The sampling requirements of each of these methods, which include dating of diversifications using calibrated molecular trees, sister pair comparisons, lineage-through-time plots and biogeographical optimizations, are reviewed. Sampling of genes, genomes and species is considered. Although further higher-level studies and increased sampling are required for robust interpretation, it is clear that much progress has already been made. It is argued that, despite its remarkable richness, the Cape flora is a valuable model system to demonstrate the utility of phylogenetic methods in determining the history of a modern flora.


Objectives: To clarify the role of growth monitoring in primary school children, including obesity, and to examine issues that might impact on the effectiveness and cost-effectiveness of such programmes. Data sources: Electronic databases were searched up to July 2005. Experts in the field were also consulted. Review methods: Data extraction and quality assessment were performed on studies meeting the review's inclusion criteria. The performance of growth monitoring to detect disorders of stature and obesity was evaluated against National Screening Committee (NSC) criteria. Results: In the 31 studies that were included in the review, there were no controlled trials of the impact of growth monitoring and no studies of the diagnostic accuracy of different methods for growth monitoring. Analysis of the studies that presented a 'diagnostic yield' of growth monitoring suggested that one-off screening might identify between 1:545 and 1:1793 new cases of potentially treatable conditions. Economic modelling suggested that growth monitoring is associated with health improvements (incremental cost per quality-adjusted life-year (QALY) of £9500) and indicated that monitoring was cost-effective 100% of the time over the given probability distributions for a willingness-to-pay threshold of £30,000 per QALY. Studies of obesity focused on the performance of body mass index against measures of body fat. A number of issues relating to human resources required for growth monitoring were identified, but data on attitudes to growth monitoring were extremely sparse. Preliminary findings from economic modelling suggested that primary prevention may be the most cost-effective approach to obesity management, but the model incorporated a great deal of uncertainty. Conclusions: This review has indicated the potential utility and cost-effectiveness of growth monitoring in terms of increased detection of stature-related disorders. 
It has also pointed strongly to the need for further research. Growth monitoring does not currently meet all NSC criteria. However, it is questionable whether some of these criteria can be meaningfully applied to growth monitoring given that short stature is not a disease in itself, but is used as a marker for a range of pathologies and as an indicator of general health status. Identification of effective interventions for the treatment of obesity is likely to be considered a prerequisite to any move from monitoring to a screening programme designed to identify individual overweight and obese children. Similarly, further long-term studies of the predictors of obesity-related co-morbidities in adulthood are warranted. A cluster randomised trial comparing growth monitoring strategies with no growth monitoring in the general population would most reliably determine the clinical effectiveness of growth monitoring. Studies of diagnostic accuracy, alongside evidence of effective treatment strategies, could provide an alternative approach. In this context, careful consideration would need to be given to target conditions and intervention thresholds. Diagnostic accuracy studies would require long-term follow-up of both short and normal children to determine sensitivity and specificity of growth monitoring.


This paper investigates the application of the Hilbert spectrum (HS), which is a recent tool for the analysis of nonlinear and nonstationary time-series, to the study of electromyographic (EMG) signals. The HS allows for the visualization of the energy of signals through a joint time-frequency representation. In this work we illustrate the use of the HS in two distinct applications. The first is for feature extraction from EMG signals. Our results showed that the instantaneous mean frequency (IMNF) estimated from the HS is a relevant feature to clinical practice. We found that the median of the IMNF reduces when the force level of the muscle contraction increases. In the second application we investigated the use of the HS for detection of motor unit action potentials (MUAPs). The detection of MUAPs is a basic step in EMG decomposition tools, which provide relevant information about the neuromuscular system through the morphology and firing time of MUAPs. We compared, visually, how MUAP activity is perceived on the HS with visualizations provided by some traditional (e.g. scalogram, spectrogram, Wigner-Ville) time-frequency distributions. Furthermore, an alternative visualization to the HS, for detection of MUAPs, is proposed and compared to a similar approach based on the continuous wavelet transform (CWT). Our results showed that both the proposed technique and the CWT allowed for a clear visualization of MUAP activity on the time-frequency distributions, whereas results obtained with the HS were the most difficult to interpret as they were extremely affected by spurious energy activity. (c) 2008 Elsevier Inc. All rights reserved.
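The Hilbert-transform step underlying the HS can be sketched as follows. A full Hilbert spectrum additionally requires empirical mode decomposition into intrinsic mode functions, which is omitted here; the amplitude-weighted mean frequency computed below is only an IMNF-like summary on a synthetic tone, not an EMG analysis.

```python
# Hedged sketch: instantaneous frequency via the analytic signal.
# EMD (required for the Hilbert spectrum proper) is omitted; this
# shows only the Hilbert-transform step on a synthetic 50 Hz tone.
import numpy as np
from scipy.signal import hilbert

fs = 1000.0                                    # sampling rate, Hz
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 50 * t)                 # 50 Hz test tone

analytic = hilbert(x)                          # x + i * H[x]
amp = np.abs(analytic)                         # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))          # instantaneous phase
inst_freq = np.diff(phase) * fs / (2 * np.pi)  # instantaneous frequency, Hz

# amplitude-weighted mean frequency over time (IMNF-like summary)
wgt = amp[:-1] ** 2
imnf = np.sum(inst_freq * wgt) / np.sum(wgt)
print(round(float(imnf), 1))  # close to 50.0 for a pure 50 Hz tone
```

For real EMG, this step would be applied to each intrinsic mode function, and the joint time-frequency energy distribution assembled from the per-mode amplitudes and instantaneous frequencies.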


In this paper the meteorological processes responsible for transporting tracer during the second ETEX (European Tracer EXperiment) release are determined using the UK Met Office Unified Model (UM). The UM predicted distribution of tracer is also compared with observations from the ETEX campaign. The dominant meteorological process is a warm conveyor belt which transports large amounts of tracer away from the surface up to a height of 4 km over a 36 h period. Convection is also an important process, transporting tracer to heights of up to 8 km. Potential sources of error when using an operational numerical weather prediction model to forecast air quality are also investigated. These potential sources of error include model dynamics, model resolution and model physics. In the UM a semi-Lagrangian monotonic advection scheme is used with cubic polynomial interpolation. This can predict unrealistic negative values of tracer which are subsequently set to zero, and hence results in an overprediction of tracer concentrations. In order to conserve mass in the UM tracer simulations it was necessary to include a flux-corrected transport method. Model resolution can also affect the accuracy of predicted tracer distributions. Low resolution simulations (50 km grid length) were unable to resolve a change in wind direction observed during ETEX 2; this led to an error in the transport direction and hence an error in tracer distribution. High resolution simulations (12 km grid length) captured the change in wind direction and hence produced a tracer distribution that compared better with the observations. The representation of convective mixing was found to have a large effect on the vertical transport of tracer. Turning off the convective mixing parameterisation in the UM significantly reduced the vertical transport of tracer. Finally, air quality forecasts were found to be sensitive to the timing of synoptic scale features. 
Errors in the position of the cold front relative to the tracer release location of only 1 h resulted in changes in the predicted tracer concentrations that were of the same order of magnitude as the absolute tracer concentrations.
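The clipping problem can be demonstrated with a toy 1-D periodic semi-Lagrangian scheme: cubic Lagrange interpolation of a sharp-edged tracer box undershoots below zero, and setting those values to zero adds spurious mass at every step. The grid, wind, and time step below are arbitrary and unrelated to the actual UM configuration.

```python
# Toy illustration of the mass-conservation issue described above:
# semi-Lagrangian advection with cubic interpolation overshoots near
# sharp gradients; clipping negatives to zero adds spurious mass.
import math

N, u, dt, dx = 100, 1.0, 0.3, 1.0        # grid points, wind, step, spacing

def cubic_interp(q, xd):
    """Cubic Lagrange interpolation of the periodic field q at point xd."""
    i = int(math.floor(xd))
    f = xd - i
    idx = [(i - 1) % N, i % N, (i + 1) % N, (i + 2) % N]
    wts = [-f * (f - 1) * (f - 2) / 6, (f + 1) * (f - 1) * (f - 2) / 2,
           -(f + 1) * f * (f - 2) / 2, (f + 1) * f * (f - 1) / 6]
    return sum(wk * q[j] for wk, j in zip(wts, idx))

q = [1.0 if 40 <= i < 60 else 0.0 for i in range(N)]   # sharp tracer "box"
mass0 = sum(q)
for _ in range(200):
    # interpolate the field at each arrival point's departure point
    q = [cubic_interp(q, (i - u * dt / dx) % N) for i in range(N)]
    q = [max(0.0, v) for v in q]         # clip negatives, as described above
print(sum(q) / mass0)                    # > 1: spurious mass gain
```

On a uniform periodic grid the unclipped scheme conserves mass exactly, so any growth in the ratio comes entirely from the clipping step, which is why a flux-corrected transport method is needed for conservation.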


A revised Bayesian algorithm for estimating surface rain rate, convective rain proportion, and latent heating profiles from satellite-borne passive microwave radiometer observations over ocean backgrounds is described. The algorithm searches a large database of cloud-radiative model simulations to find cloud profiles that are radiatively consistent with a given set of microwave radiance measurements. The properties of these radiatively consistent profiles are then composited to obtain best estimates of the observed properties. The revised algorithm is supported by an expanded and more physically consistent database of cloud-radiative model simulations. The algorithm also features a better quantification of the convective and nonconvective contributions to total rainfall, a new geographic database, and an improved representation of background radiances in rain-free regions. Bias and random error estimates are derived from applications of the algorithm to synthetic radiance data, based upon a subset of cloud-resolving model simulations, and from the Bayesian formulation itself. Synthetic rain-rate and latent heating estimates exhibit a trend of high (low) bias for low (high) retrieved values. The Bayesian estimates of random error are propagated to represent errors at coarser time and space resolutions, based upon applications of the algorithm to TRMM Microwave Imager (TMI) data. Errors in TMI instantaneous rain-rate estimates at 0.5°-resolution range from approximately 50% at 1 mm h−1 to 20% at 14 mm h−1. Errors in collocated spaceborne radar rain-rate estimates are roughly 50%–80% of the TMI errors at this resolution. The estimated algorithm random error in TMI rain rates at monthly, 2.5° resolution is relatively small (less than 6% at 5 mm day−1) in comparison with the random error resulting from infrequent satellite temporal sampling (8%–35% at the same rain rate). 
Percentage errors resulting from sampling decrease with increasing rain rate, and sampling errors in latent heating rates follow the same trend. Averaging over 3 months reduces sampling errors in rain rates to 6%–15% at 5 mm day−1, with proportionate reductions in latent heating sampling errors.
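The database-search idea can be sketched as a Gaussian-weighted composite: each candidate cloud profile is weighted by the likelihood of the observed radiances given that profile's simulated radiances, and the weighted profiles are averaged. The two-channel database and the observation-error standard deviation below are invented toy values, not TMI data or the algorithm's actual database.

```python
# Hedged sketch of the Bayesian retrieval idea: weight database
# profiles by a Gaussian radiance likelihood, then composite their
# rain rates. All numbers below are invented for illustration.
import math

# (simulated brightness temperatures in K, surface rain rate in mm/h)
database = [
    ((210.0, 250.0), 8.0),
    ((230.0, 262.0), 3.5),
    ((255.0, 271.0), 0.8),
    ((270.0, 278.0), 0.0),
]
sigma = 4.0                        # assumed radiance error (K)

def retrieve(obs):
    """Bayesian composite rain-rate estimate for observed radiances."""
    weights = []
    for tb, _ in database:
        d2 = sum((o - s) ** 2 for o, s in zip(obs, tb))
        weights.append(math.exp(-0.5 * d2 / sigma ** 2))
    wsum = sum(weights)
    return sum(wk * rr for wk, (_, rr) in zip(weights, database)) / wsum

print(round(retrieve((232.0, 263.0)), 2))  # dominated by the 3.5 mm/h entry
```

The same compositing applies to any property stored with each database profile (convective fraction, latent heating profile), which is how a single radiance match yields the full set of retrieved quantities.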


This paper describes the impact of changing the current imposed ozone climatology upon the tropical Quasi-Biennial Oscillation (QBO) in a high-top climate configuration of the Met Office U.K. general circulation model. The aim is to help distinguish between QBO changes in chemistry climate models that result from temperature-ozone feedbacks and those that might be forced by differences in climatology between previously fixed and newly interactive ozone distributions. Different representations of zonal mean ozone climatology under present-day conditions are taken to represent the level of change expected between acceptable model realizations of the global ozone distribution and thus indicate whether more detailed investigation of such climatology issues might be required when assessing ozone feedbacks. Tropical stratospheric ozone concentrations are enhanced relative to the control climatology between 20 and 30 km, reduced between 30 and 40 km, and enhanced above, impacting the model profile of clear-sky radiative heating, in particular warming the tropical stratosphere between 15 and 35 km. The outcome is consistent with a localized equilibrium response in the tropical stratosphere that generates increased upwelling between 100 and 4 hPa, sufficient to account for a 12-month increase in the modeled mean QBO period. This response has implications for analysis of the tropical circulation in models with interactive ozone chemistry because it highlights the possibility that plausible changes in the ozone climatology could have a sizable impact upon the tropical upwelling and QBO period that ought to be distinguished from other dynamical responses such as ozone-temperature feedbacks.