60 results for % of sample area

in CentAUR: Central Archive University of Reading - UK


Relevance:

100.00%

Abstract:

We consider the comparison of two formulations in terms of average bioequivalence using the 2 × 2 cross-over design. In a bioequivalence study, the primary outcome is a pharmacokinetic measure, such as the area under the plasma concentration by time curve, which is usually assumed to have a lognormal distribution. The criterion typically used for claiming bioequivalence is that the 90% confidence interval for the ratio of the means should lie within the interval (0.80, 1.25), or equivalently the 90% confidence interval for the differences in the means on the natural log scale should be within the interval (-0.2231, 0.2231). We compare the gold standard method for calculation of the sample size based on the non-central t distribution with those based on the central t and normal distributions. In practice, the differences between the various approaches are likely to be small. Further approximations to the power function are sometimes used to simplify the calculations. These approximations should be used with caution, because the sample size required for a desirable level of power might be under- or overestimated compared to the gold standard method. However, in some situations the approximate methods produce very similar sample sizes to the gold standard method. Copyright © 2005 John Wiley & Sons, Ltd.
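The approximate (normal-distribution) calculation contrasted with the gold standard above can be sketched in a few lines. This is an illustrative simplification, assuming the true mean ratio is exactly 1 (so the two one-sided tests are symmetric) and that σ_w denotes the within-subject standard deviation on the natural log scale; it is not the paper's non-central t calculation.

```python
import math
from statistics import NormalDist

def be_sample_size_normal(sigma_w, alpha=0.05, power=0.90):
    """Total subjects for a 2x2 cross-over average-bioequivalence study,
    normal approximation, assuming the true mean ratio is exactly 1.
    Limits (0.80, 1.25) give +/- ln(1.25) = +/- 0.2231 on the log scale."""
    z = NormalDist().inv_cdf
    delta = math.log(1.25)          # half-width of the equivalence region
    beta = 1.0 - power
    # In the symmetric case the type II error splits across the two one-sided tests.
    n = 2.0 * (z(1 - alpha) + z(1 - beta / 2)) ** 2 * sigma_w ** 2 / delta ** 2
    return math.ceil(n)

# e.g. within-subject SD of 0.30 on the log scale (roughly 30% CV):
# be_sample_size_normal(0.30)  -> 40 subjects in total
```

The gold-standard calculation replaces the normal quantiles with a power evaluation under the non-central t distribution; as the abstract notes, the resulting sample sizes are often, but not always, very close.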

Relevance:

100.00%

Abstract:

This paper presents a case study to illustrate the range of decisions involved in designing a sampling strategy for a complex, longitudinal research study. It is based on experience from the Young Lives project and identifies the approaches used to sample children for longitudinal follow-up in four less developed countries (LDCs). The rationale for the decisions made, and the resulting benefits and limitations of the approaches adopted, are discussed. Of particular importance is the choice of sampling approach to yield useful analysis; specific examples are presented of how this informed the design of the Young Lives sampling strategy.

Relevance:

100.00%

Abstract:

Data assimilation refers to the problem of finding trajectories of a prescribed dynamical model in such a way that the output of the model (usually some function of the model states) follows a given time series of observations. Typically, though, these two requirements cannot both be met at the same time: tracking the observations is not possible without the trajectory deviating from the proposed model equations, while adherence to the model requires deviations from the observations. Thus, data assimilation faces a trade-off. In this contribution, the sensitivity of the data assimilation with respect to perturbations in the observations is identified as the parameter that controls the trade-off. A relation between the sensitivity and the out-of-sample error is established, which allows the latter to be calculated under operational conditions. A minimum out-of-sample error is proposed as a criterion to set an appropriate sensitivity and to settle the discussed trade-off. Two approaches to data assimilation are considered, namely variational data assimilation and Newtonian nudging, also known as synchronization. Numerical examples demonstrate the feasibility of the approach.
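The trade-off described above can be illustrated with a toy scalar example of Newtonian nudging (this sketch is not from the paper): a deliberately uninformative model is relaxed toward observations by a term k·(obs − x), and the coupling strength k plays the role of the sensitivity that sets the balance between following the model and following the data.

```python
import math

def nudge(k, x0=0.0, dt=0.01, steps=2000):
    """Integrate a deliberately wrong model dx/dt = 0 (persistence) with a
    nudging term k*(obs - x) toward observations obs(t) = cos(t).
    Returns the mean absolute tracking error against the observations."""
    x, err = x0, 0.0
    for i in range(steps):
        t = i * dt
        obs = math.cos(t)
        x += dt * k * (obs - x)     # model tendency is zero; only nudging acts
        err += abs(math.cos((i + 1) * dt) - x)
    return err / steps

# Stronger coupling tracks the observations more closely:
# nudge(0.0) leaves the trajectory at the model solution (large error),
# nudge(5.0) pulls it close to the observed signal (small error).
```

With k = 0 the trajectory obeys the model exactly and ignores the data; as k grows it tracks the observations at the cost of departing from the model equations, which is precisely the trade-off the sensitivity parameter controls.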

Relevance:

100.00%

Abstract:

Cloud cover is conventionally estimated from satellite images as the observed fraction of cloudy pixels. Active instruments such as radar and Lidar observe in narrow transects that sample only a small percentage of the area over which the cloud fraction is estimated. As a consequence, the fraction estimate has an associated sampling uncertainty, which usually remains unspecified. This paper extends a Bayesian method of cloud fraction estimation, which also provides an analytical estimate of the sampling error. This method is applied to test the sensitivity of this error to sampling characteristics, such as the number of observed transects and the variability of the underlying cloud field. The dependence of the uncertainty on these characteristics is investigated using synthetic data simulated to have properties closely resembling observations of the spaceborne Lidar NASA-LITE mission. Results suggest that the variance of the cloud fraction is greatest for medium cloud cover and least when conditions are mostly cloudy or clear. However, there is a bias in the estimation, which is greatest around 25% and 75% cloud cover. The sampling uncertainty is also affected by the mean lengths of clouds and of clear intervals; shorter lengths decrease uncertainty, primarily because there are more cloud observations in a transect of a given length. Uncertainty also falls with increasing number of transects. Therefore a sampling strategy aimed at minimizing the uncertainty in transect-derived cloud fraction will have to take into account both the cloud and clear sky length distributions as well as the cloud fraction of the observed field. These conclusions have implications for the design of future satellite missions. This paper describes the first integrated methodology for the analytical assessment of sampling uncertainty in cloud fraction observations from forthcoming spaceborne radar and Lidar missions such as NASA's Calipso and CloudSat.
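A heavily simplified sketch of the flavour of such a Bayesian estimate (ignoring the spatial correlation of real cloud fields, which the paper's method handles): with a uniform Beta(1, 1) prior and k cloudy pixels out of N observed along a transect, the posterior for the cloud fraction is Beta(k+1, N−k+1), and its variance, which quantifies the sampling uncertainty, peaks near medium cloud cover, consistent with the result above.

```python
def cloud_fraction_posterior(k, n):
    """Posterior mean and variance of the cloud fraction given k cloudy pixels
    out of n observed, under a uniform Beta(1, 1) prior. Pixels are treated as
    i.i.d. -- a strong simplification, since real cloud fields are correlated."""
    a, b = k + 1, n - k + 1
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

# Sampling uncertainty is greatest at medium cloud cover:
# cloud_fraction_posterior(50, 100)[1] exceeds cloud_fraction_posterior(5, 100)[1]
```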

Relevance:

100.00%

Abstract:

Hunting foxes with hounds has been a countryside pursuit in Britain since the 17th Century, but its effect nationally on habitat management is little understood by the general public. A survey questionnaire was distributed to 163 mounted fox hunts of England and Wales to quantify their management practices in woodland and other habitat. Ninety-two hunts (56%), covering 75,514 km², returned details on woodland management motivated by the improvement of their sport. The management details were verified via on-site visits for a sample of 200 woodlands. Following verification, the area of woodlands containing the management was conservatively estimated at 24,053 (± 2,241) ha, comprising 5.9% of woodland area within the whole of the area hunted by the 92 hunts. Management techniques included: tree planting, coppicing, felling, ride and perimeter management. A case study in five hunt countries in southern England examined, through the use of botanical survey and butterfly counts, the consequences of the hunt management on woodland ground flora and butterflies. Managed areas had, within the last 5 years, been coppiced and rides had been cleared. Vegetation cover in managed and unmanaged sites averaged 86% and 64%, respectively, and managed areas held on average 4 more plant species and a higher plant diversity than unmanaged areas (Shannon index of diversity: 2.25 vs. 1.95). Both the average number of butterfly species (2.2 vs. 0.3) and individuals counted (4.6 vs. 0.3) were higher in the managed than unmanaged sites.
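The Shannon index quoted above (2.25 vs. 1.95) is computed from species proportions as H = −Σ pᵢ ln pᵢ; a minimal sketch with hypothetical count data (the counts below are illustrative, not from the study):

```python
import math

def shannon_index(counts):
    """Shannon diversity H = -sum(p_i * ln p_i) over species abundance counts."""
    total = sum(counts)
    return -sum(c / total * math.log(c / total) for c in counts if c > 0)

# Two species in equal abundance give H = ln 2 ~= 0.693.
# A more even community scores higher than a skewed one of the same richness:
# shannon_index([5, 5, 5, 5]) > shannon_index([17, 1, 1, 1])
```

Higher H reflects both more species and a more even spread of individuals among them, which is why the managed woodlands score above the unmanaged ones.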

Relevance:

100.00%

Abstract:

The applicability of the BET model for calculation of the surface area of activated carbons is checked by using molecular simulations. By calculating geometric surface areas for a simple model carbon slit-like pore of increasing width, and comparing the obtained values with those for the same systems from the VEGA ZZ package (adsorbate-accessible molecular surface), it is shown that the latter method provides correct values. For systems where a monolayer is created inside a pore, the ASA approach (GCMC, Ar, T = 87 K) underestimates the surface area for micropores (especially where only one layer is observed and/or two layers of adsorbed Ar are formed). Therefore, we propose a modification of this method based on searching for the relationship between the pore diameter and the number of layers in a pore. Finally, BET, original and modified ASA, and A-, B- and C-point surface areas are calculated for a series of virtual porous carbons using simulated Ar adsorption isotherms (GCMC and T = 87 K). The comparison of results shows that the BET method underestimates, rather than, as was usually postulated, overestimates, the surface areas of microporous carbons.
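The standard BET analysis referred to above linearises the isotherm: plotting x/(v(1−x)) against relative pressure x = p/p0 gives a straight line with slope (c−1)/(v_m·c) and intercept 1/(v_m·c), from which the monolayer capacity v_m (and hence the surface area) follows. A self-contained sketch on synthetic BET data — the parameter values are illustrative, not taken from the paper:

```python
def fit_bet(xs, vs):
    """Least-squares fit of the linearised BET equation
        x / (v (1 - x)) = 1/(vm c) + ((c - 1)/(vm c)) * x
    over relative pressures xs and adsorbed amounts vs.
    Returns (vm, c): monolayer capacity and BET constant."""
    ys = [x / (v * (1 - x)) for x, v in zip(xs, vs)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    vm = 1.0 / (slope + intercept)   # slope + intercept = 1/vm
    c = 1.0 + slope / intercept      # slope / intercept = c - 1
    return vm, c

def bet_v(x, vm=2.5, c=80.0):
    """Synthetic isotherm generated from the BET model itself."""
    return vm * c * x / ((1 - x) * (1 + (c - 1) * x))

xs = [0.05, 0.10, 0.15, 0.20, 0.25, 0.30]
vm_fit, c_fit = fit_bet(xs, [bet_v(x) for x in xs])
# Noise-free BET data are recovered essentially exactly: vm_fit ~= 2.5, c_fit ~= 80
```

The surface area then follows from v_m via the adsorbate's molecular cross-sectional area; the paper's point is that for microporous carbons this BET-derived area sits below, not above, the geometric one.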

Relevance:

100.00%

Abstract:

This paper presents practical approaches to the problem of sample size re-estimation in the case of clinical trials with survival data when proportional hazards can be assumed. When data on a full range of survival experiences across the recruited patients are readily available at the time of the review, it is shown that, as expected, performing a blinded re-estimation procedure is straightforward and can help to maintain the trial's pre-specified error rates. Two alternative methods for dealing with the situation where limited survival experiences are available at the time of the sample size review are then presented and compared. In this instance, extrapolation is required in order to undertake the sample size re-estimation. Worked examples, together with results from a simulation study, are described. It is concluded that, as in the standard case, use of either extrapolation approach successfully protects the trial error rates. Copyright © 2012 John Wiley & Sons, Ltd.
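Under proportional hazards, sample size for a survival trial is driven by the required number of events rather than the number of patients, which is what a re-estimation procedure ultimately revisits. A minimal sketch of the standard Schoenfeld approximation for the event count (illustrative background, not the paper's blinded procedure):

```python
import math
from statistics import NormalDist

def required_events(hazard_ratio, alpha=0.05, power=0.90, alloc=0.5):
    """Total events needed to detect a given hazard ratio with a two-sided
    log-rank test, by Schoenfeld's approximation; alloc is the proportion
    of patients allocated to one of the two arms."""
    z = NormalDist().inv_cdf
    num = (z(1 - alpha / 2) + z(power)) ** 2
    return math.ceil(num / (alloc * (1 - alloc) * math.log(hazard_ratio) ** 2))

# e.g. hazard ratio 0.7, 1:1 allocation, 90% power, two-sided 5% alpha:
# required_events(0.7)  -> 331 events
```

Blinded re-estimation then updates the projected recruitment or follow-up needed to reach this event target using pooled (treatment-blind) survival data, which is why extrapolation is needed when few events have accrued at review.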

Relevance:

100.00%

Abstract:

The impact of 1973–2005 land use–land cover (LULC) changes on near-surface air temperatures during four recent summer extreme heat events (EHEs) is investigated for the arid Phoenix, Arizona, metropolitan area using the Weather Research and Forecasting Model (WRF) in conjunction with the Noah Urban Canopy Model. WRF simulations were carried out for each EHE using LULC for the years 1973, 1985, 1998, and 2005. Comparison of measured near-surface air temperatures and wind speeds for 18 surface stations in the region shows a good agreement between observed and simulated data for all simulation periods. The results indicate consistent significant contributions of urban development and accompanying LULC changes to extreme temperatures for the four EHEs. Simulations suggest new urban developments caused an intensification and expansion of the area experiencing extreme temperatures but mainly influenced nighttime temperatures, with an increase of up to 10 K. Nighttime temperatures in the existing urban core showed changes of up to 2 K with the ongoing LULC changes. Daytime temperatures were not significantly affected where urban development replaced desert land (increase by 1 K); however, maximum temperatures increased by 2–4 K when irrigated agricultural land was converted to suburban development. According to the model simulations, urban landscaping irrigation contributed to cooling by 0.5–1 K in maximum daytime as well as minimum nighttime 2-m air temperatures in most parts of the urban region. Furthermore, urban development led to a reduction of the already relatively weak nighttime winds and therefore a reduction in advection of cooler air into the city.

Relevance:

100.00%

Abstract:

This paper proposes and tests a new framework for weighting recursive out-of-sample prediction errors according to their corresponding levels of in-sample estimation uncertainty. In essence, we show how to use the maximum possible amount of information from the sample in the evaluation of the prediction accuracy, by commencing the forecasts at the earliest opportunity and weighting the prediction errors. Via a Monte Carlo study, we demonstrate that the proposed framework selects the correct model from a set of candidate models considerably more often than the existing standard approach when only a small sample is available. We also show that the proposed weighting approaches result in tests of equal predictive accuracy that have much better sizes than the standard approach. An application to an exchange rate dataset highlights relevant differences in the results of tests of predictive accuracy based on the standard approach versus the framework proposed in this paper.

Relevance:

90.00%

Abstract:

Victoria Island lies at the north-western extremity of the region covered by the vast North American Laurentide Ice Sheet (LIS) in the Canadian Arctic Archipelago. This area is significant because it linked the interior of the LIS to the Arctic Ocean, probably via a number of ice streams. Victoria Island, however, exhibits a remarkably complex glacial landscape, with several successive generations of ice flow indicators superimposed on top of each other and often at abrupt (90°) angles. This complexity represents a major challenge to those attempting to produce a detailed reconstruction of the glacial history of the region. This paper presents a map of the glacial geomorphology of Victoria Island. The map is based on analysis of Landsat Enhanced Thematic Mapper Plus (ETM+) satellite imagery and contains over 58,000 individual glacial features, which include: glacial lineations, moraines (terminal, lateral, subglacial shear margin), hummocky moraine, ribbed moraine, eskers, glaciofluvial deposits, large meltwater channels, and raised shorelines. The glacial features reveal marked changes in ice flow direction and vigour over time. Moreover, the glacial geomorphology indicates a non-steady withdrawal of ice during deglaciation, with rapidly flowing ice streams focussed into the inter-island troughs and several successively younger flow patterns superimposed on older ones. It is hoped that detailed analysis of this map will lead to an improved reconstruction of the glacial history of this area, which will provide other important insights, for example, with respect to the interactions between ice streaming, deglaciation and Arctic Ocean meltwater events.

Relevance:

90.00%

Abstract:

A rapid capillary electrophoresis method was developed for the simultaneous determination of artificial sweeteners, preservatives and colours used as additives in carbonated soft drinks. Resolution between all additives occurring together in soft drinks was successfully achieved within a 15-min run-time by employing the micellar electrokinetic chromatography mode with a 20 mM carbonate buffer at pH 9.5 as the aqueous phase and 62 mM sodium dodecyl sulfate as the micellar phase. By using a diode-array detector to monitor the UV-visible range (190-600 nm), the identity of sample components, suggested by migration time, could be confirmed by spectral matching relative to standards.

Relevance:

90.00%

Abstract:

Maize silage nutritive quality is routinely determined by near infrared reflectance spectroscopy (NIRS). However, little is known about the impact of sample preparation on the accuracy of the calibration to predict biological traits. A sample population of 48 maize silages representing a wide range of physiological maturities was used in a study to determine the impact of different sample preparation procedures (i.e., drying regimes; the presence or absence of residual moisture; the degree of particle comminution) on resultant NIR prediction statistics. All silages were scanned using a total of 12 combinations of sample pre-treatments. Each sample preparation combination was subjected to three multivariate regression techniques to give a total of 36 predictions per biological trait. More extensive sample preparation, relative to scanning the unprocessed whole-plant (WP) material, always resulted in a numerical reduction of the model statistics. However, the ability of each of the treatments to significantly minimise the model statistics differed. Particle comminution was the most important factor, oven-drying regime was intermediate, and residual moisture presence was the least important. Models to predict various biological parameters of maize silage will be improved if material is subjected to a high degree of particle comminution (i.e., having been passed through a 1 mm screen) and developed on plant material previously dried at 60 °C. The extra effort in terms of time and cost required to remove sample residual moisture cannot be justified. (c) 2005 Elsevier B.V. All rights reserved.

Relevance:

90.00%

Abstract:

Genealogical data have been used very widely to construct indices with which to examine the contribution of plant breeding programmes to the maintenance and enhancement of genetic resources. In this paper we use such indices to examine changes in the genetic diversity of the winter wheat crop in England and Wales between 1923 and 1995. We find that, except for one period characterized by the dominance of imported varieties, the genetic diversity of the winter wheat crop has been remarkably stable. This agrees with many studies of plant breeding programmes elsewhere. However, underlying the stability of the winter wheat crop is accelerating varietal turnover without any significant diversification of the genetic resources used. Moreover, the changes we observe are more directly attributable to changes in the varietal shares of the area under winter wheat than to the genealogical relationship between the varieties sown. We argue, therefore, that while genealogical indices reflect how well plant breeders have retained and exploited the resources with which they started, these indices suffer from a critical limitation. They do not reflect the proportion of the available range of genetic resources which has been effectively utilized in the breeding programme: complex crosses of a given set of varieties can yield high indices, and yet disguise the loss (or non-utilization) of a large proportion of the available genetic diversity.