47 results for Accuracy.
in CentAUR: Central Archive University of Reading - UK
Abstract:
Using a flexible chemical box model with full heterogeneous chemistry, intercepts of chemically modified Langley plots have been computed for the 5 years of zenith-sky NO2 data from Faraday in Antarctica (65°S). By using these intercepts as the effective amount in the reference spectrum, drifts in zero of total vertical NO2 were much reduced. The error in zero of total NO2 is ±0.03 × 10¹⁵ molec cm⁻² from one year to another. This error is small enough to determine trends in midsummer and any variability in denoxification between midwinters. The technique also suggests a more sensitive method for determining N2O5 from zenith-sky NO2 data.
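The intercept retrieval described above can be illustrated with a straight-line fit. The sketch below uses synthetic numbers (not the Faraday data or the actual box model) to show how regressing measured differential slant columns against box-model slant columns yields, as the intercept, the effective NO2 amount in the reference spectrum.

```python
import numpy as np

# Minimal sketch of a chemically modified Langley plot, with invented inputs:
# 'modelled_scd' stands in for the slant columns predicted by a photochemical
# box model at each twilight measurement, and 'measured_dscd' for the
# differential slant columns retrieved against a fixed reference spectrum.
rng = np.random.default_rng(0)
true_ref_amount = 2.0e15                           # molec cm^-2 assumed in the reference spectrum
modelled_scd = rng.uniform(5e15, 5e16, size=200)   # hypothetical box-model slant columns
measured_dscd = modelled_scd - true_ref_amount + rng.normal(0, 5e14, size=200)

# Straight-line fit: measured_dscd = slope * modelled_scd + intercept
slope, intercept = np.polyfit(modelled_scd, measured_dscd, 1)

# The reference-spectrum amount is minus the intercept; using it as the
# effective amount in the reference removes drifts in the zero of the
# total vertical column.
estimated_ref_amount = -intercept
print(f"estimated reference amount: {estimated_ref_amount:.2e} molec cm^-2")
```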
Abstract:
Insect returns from the UK's Doppler weather radars were collected in the summers of 2007 and 2008, to ascertain their usefulness in providing information about boundary layer winds. Such observations could be assimilated into numerical weather prediction models to improve forecasts of convective showers before precipitation begins. Significant numbers of insect returns were observed during daylight hours on a number of days through this period, when they were detected at up to 30 km range from the radars, and up to 2 km above sea level. The range of detectable insect returns was found to vary with time of year and temperature. There was also a very weak correlation with wind speed and direction. Use of a dual-polarized radar revealed that the insects did not orient themselves at random, but showed distinct evidence of common orientation on several days, sometimes at an angle to their direction of travel. Observation-minus-model-background residuals of wind profiles showed greater bias and standard deviation than those of other wind measurement types, which may be due to the insects' headings/airspeeds and to imperfect data extraction. The method used here, similar to the Met Office's procedure for extracting precipitation returns, requires further development, as clutter contamination remained one of the largest error contributors. Wind observations derived from the insect returns would then be useful for data assimilation applications.
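The observation-minus-background comparison mentioned above can be sketched in a few lines. The values below are invented placeholders (not Met Office radar or model output); they simply show how the bias and standard deviation of the O − B wind residuals are computed for each wind component.

```python
import numpy as np

# Illustrative observation-minus-background (O - B) statistics used to judge
# insect-derived wind observations against an NWP model background.
obs_u = np.array([ 5.2, 3.8, 7.1, 6.0, 4.4])   # radar-derived u wind (m/s), invented
obs_v = np.array([-1.0, 0.5, 1.2, 0.3, -0.7])
bkg_u = np.array([ 4.6, 3.5, 6.2, 5.8, 4.9])   # model background u wind (m/s), invented
bkg_v = np.array([-0.6, 0.9, 0.7, 0.1, -0.2])

def o_minus_b_stats(obs, bkg):
    """Return bias and standard deviation of the O - B residuals."""
    residual = obs - bkg
    return residual.mean(), residual.std(ddof=1)

u_bias, u_std = o_minus_b_stats(obs_u, bkg_u)
v_bias, v_std = o_minus_b_stats(obs_v, bkg_v)
print(f"u: bias={u_bias:.2f} m/s, std={u_std:.2f} m/s")
print(f"v: bias={v_bias:.2f} m/s, std={v_std:.2f} m/s")
```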
Abstract:
With the rapid development in technology over recent years, construction, in common with many areas of industry, has become increasingly complex. It therefore seems important to develop and extend the understanding of complexity so that industry in general, and the construction industry in particular, can work with greater accuracy and efficiency to provide clients with a better service. This paper aims to generate a definition of complexity and a method for its measurement in order to assess its influence upon the accuracy of the quantity surveying profession in UK new build office construction. Quantitative data came from an analysis of twenty projects of varying size and value, and qualitative data came from interviews with professional quantity surveyors. The findings highlight the difficulty in defining and measuring project complexity. The correlation between accuracy and complexity was not straightforward, being subject to many extraneous variables, particularly the impact of project size. Further research is required to develop a better measure of complexity, in order to improve the response of quantity surveyors so that an appropriate level of effort can be applied to individual projects, permitting greater accuracy and enabling better resource planning within the profession.
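One way to allow for the confounding effect of project size noted above is a partial correlation between estimating accuracy and complexity with size partialled out. The sketch below uses invented scores, not the paper's twenty-project dataset, purely to show the mechanics.

```python
import numpy as np

# Hedged sketch: partial correlation between accuracy and complexity,
# removing the linear effect of project size from both. All values invented.
rng = np.random.default_rng(1)
size = rng.uniform(1, 20, 20)                      # e.g. project value, hypothetical units
complexity = 0.5 * size + rng.normal(0, 2, 20)     # hypothetical complexity score
accuracy = -0.3 * size + rng.normal(0, 1.5, 20)    # hypothetical estimating error (%)

def residuals(y, x):
    """Residuals of y after removing a straight-line fit on x."""
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

def partial_corr(a, b, control):
    """Correlation of a and b with the control variable partialled out."""
    ra, rb = residuals(a, control), residuals(b, control)
    return np.corrcoef(ra, rb)[0, 1]

print("raw correlation:    ", np.corrcoef(accuracy, complexity)[0, 1])
print("partial correlation:", partial_corr(accuracy, complexity, size))
```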
Abstract:
Although accuracy of digital elevation models (DEMs) can be quantified and measured in different ways, each is influenced by three main factors: terrain character, sampling strategy and interpolation method. These parameters, and their interaction, are discussed. The generation of DEMs from digitised contours is emphasised because this is the major source of DEMs, particularly within member countries of OEEPE. Such DEMs often exhibit unwelcome artifacts, depending on the interpolation method employed. The origin and magnitude of these effects and how they can be reduced to improve the accuracy of the DEMs are also discussed.
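The contour-derived artifacts mentioned above can be reproduced on a toy surface. The sketch below (an invented sloping plane, not an OEEPE dataset) interpolates a DEM profile from points digitised along two contours using inverse distance weighting, one of many possible interpolation methods, and shows how the profile flattens near the contours: the familiar "terracing" effect.

```python
import numpy as np

# Points sampled along two contours of a plane sloping in y (elevations 10 and 20)
contour_pts = np.array([[x, 2.0, 10.0] for x in np.linspace(0, 10, 11)] +
                       [[x, 8.0, 20.0] for x in np.linspace(0, 10, 11)])

def idw(px, py, points, power=2.0):
    """Inverse-distance-weighted elevation at (px, py) from contour points."""
    d = np.hypot(points[:, 0] - px, points[:, 1] - py)
    if d.min() < 1e-9:
        return points[d.argmin(), 2]
    w = 1.0 / d ** power
    return np.sum(w * points[:, 2]) / np.sum(w)

# Profile across the contours: the true surface is linear in y, but the
# interpolated DEM flattens near y = 2 and y = 8 (the contour-induced artifact).
for y in np.linspace(2, 8, 7):
    print(f"y={y:4.1f}  idw={idw(5.0, y, contour_pts):6.2f}  true={10 + (y - 2) * 10 / 6:6.2f}")
```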
Abstract:
We survey observations of the radial magnetic field in the heliosphere as a function of position, sunspot number, and sunspot cycle phase. We show that most of the differences between pairs of simultaneous observations, normalized using the square of the heliocentric distance and averaged over solar rotations, are consistent with the kinematic "flux excess" effect whereby the radial component of the frozen-in heliospheric field is increased by longitudinal solar wind speed structure. In particular, the survey shows that, as expected, the flux excess effect at high latitudes is almost completely absent during sunspot minimum but is almost the same as within the streamer belt at sunspot maximum. We study the uncertainty inherent in using the Ulysses result that the radial field is independent of heliographic latitude to compute the total open solar flux: we show that, once the kinematic correction for the flux excess effect has been made, this assumption causes errors smaller than 4.5%, with a most likely value of 2.5%. The importance of this result for understanding temporal evolution of the open solar flux is reviewed.
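The r²-normalised radial field and its use in estimating the total open solar flux can be written out in a few lines. The field strengths and distances below are invented, and the factor of 1/2 (counting flux of one magnetic polarity) is one common convention rather than necessarily the paper's; the kinematic flux excess correction itself is not reproduced here.

```python
import numpy as np

# Sketch: rotation-averaged |Br| scaled to 1 AU by the r^2 of the frozen-in
# field, then converted to an open solar flux under the Ulysses result that
# |Br| is independent of heliographic latitude. Sample values are invented.
AU = 1.496e11                                      # 1 AU in metres

br_nT = np.array([3.1, 2.8, 3.5, 4.0, 2.9, 3.3])   # |Br| samples over one rotation (nT)
r_au  = np.array([0.98, 0.99, 1.00, 1.01, 1.00, 0.99])   # heliocentric distances (AU)

# Normalise each sample to 1 AU using the r^2 scaling
br_1au_T = br_nT * 1e-9 * r_au**2

# Open flux assuming latitude independence; 1/2 counts flux of one polarity
open_flux_Wb = 0.5 * 4.0 * np.pi * AU**2 * br_1au_T.mean()
print(f"open solar flux ~ {open_flux_Wb:.2e} Wb")
```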
Abstract:
Stephens and Donnelly have introduced a simple yet powerful importance sampling scheme for computing the likelihood in population genetic models. Fundamental to the method is an approximation to the conditional probability of the allelic type of an additional gene, given those currently in the sample. As noted by Li and Stephens, the product of these conditional probabilities for a sequence of draws that gives the frequency of allelic types in a sample is an approximation to the likelihood, and can be used directly in inference. The aim of this note is to demonstrate the high level of accuracy of the "product of approximate conditionals" (PAC) likelihood when used with microsatellite data. Results obtained on simulated microsatellite data show that this strategy leads to a negligible bias over a wide range of the scaled mutation parameter theta. Furthermore, both the sampling variance of the likelihood estimates and the computation time are lower than those obtained with importance sampling over the whole range of theta. It follows that this approach represents an efficient substitute for importance sampling algorithms in computer-intensive (e.g. MCMC) inference methods in population genetics. (c) 2006 Elsevier Inc. All rights reserved.
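The PAC construction itself is easy to sketch. The approximate conditional used below is a deliberately simplified stepwise-mutation rule, not the Stephens and Donnelly or Li and Stephens conditional, and the repeat counts are hypothetical; the point is only to show how the product of conditionals is assembled and averaged over random orderings of the sample.

```python
import numpy as np

def step_prob(d, p=0.5):
    """Probability of a net mutational shift of d repeat units (zero for d == 0)."""
    return 0.0 if d == 0 else 0.5 * (1.0 - p) * p ** (abs(d) - 1)

def approx_conditional(allele, sampled, theta):
    """Simplified approximate probability of drawing `allele` given `sampled` alleles."""
    k = len(sampled)
    copy_term = (k / (k + theta)) * np.mean([a == allele for a in sampled])
    mut_term = (theta / (k + theta)) * np.mean([step_prob(allele - a) for a in sampled])
    return copy_term + mut_term

def pac_likelihood(alleles, theta, n_orders=20, seed=0):
    """Average the product of approximate conditionals over random orderings."""
    rng = np.random.default_rng(seed)
    alleles = np.asarray(alleles)
    estimates = []
    for _ in range(n_orders):
        seq = alleles[rng.permutation(len(alleles))]
        lik = 1.0
        for k in range(1, len(seq)):
            lik *= approx_conditional(seq[k], seq[:k], theta)
        estimates.append(lik)
    return float(np.mean(estimates))

# Hypothetical microsatellite repeat counts at one locus
sample = [12, 12, 13, 13, 13, 14, 15, 12]
for theta in (0.5, 1.0, 2.0, 5.0):
    print(f"theta={theta:>3}: PAC likelihood ~ {pac_likelihood(sample, theta):.3e}")
```

Used inside an MCMC sampler, such a PAC value would simply replace the importance-sampling likelihood estimate at each proposed value of theta.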
Abstract:
1. Suspension feeding by caseless caddisfly larvae (Trichoptera) constitutes a major pathway for energy flow, and strongly influences productivity, in streams and rivers. 2. Consideration of the impact of these animals on lotic ecosystems has been strongly influenced by a single study investigating the efficiency of particle capture of nets built by one species of hydropsychid caddisfly. 3. Using water sampling techniques at appropriate spatial scales, and taking greater consideration of local hydrodynamics than previously, we examined the size-frequency distribution of particles captured by the nets of Hydropsyche siltalai. Our results confirm that capture nets are selective in terms of particle size, and in addition suggest that this selectivity is for particles likely to provide the most energy. 4. By incorporating estimates of flow diversion around the nets of caseless caddisfly larvae, we show that capture efficiency (CE) is considerably higher than previously estimated, and conclude that more consideration of local hydrodynamics is needed to evaluate the efficiency of particle capture. 5. We use our results to postulate a mechanistic explanation for a recent example of interspecific facilitation, whereby a reduction of near-bed velocities seen in single species monocultures leads to increased capture rates and local depletion of seston within the region of reduced velocity.
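The effect of allowing for flow diversion on the estimated capture efficiency can be seen with back-of-envelope numbers. Every value below is invented for illustration (the paper's measurements are not reproduced): the naive estimate assumes all particles in the net's frontal cross-section reach the mesh, while the corrected estimate discounts the fraction of flow diverted around the net.

```python
# Sketch of why allowing for flow diversion raises the estimated capture
# efficiency (CE) of a capture net. All numbers are invented placeholders.
ambient_velocity = 0.30      # m/s, free-stream velocity near the net
net_area = 4.0e-4            # m^2, frontal area of the capture net
seston_conc = 2.0e6          # particles per m^3 in the water column
captured_per_s = 60.0        # particles retained by the net per second
flow_diversion = 0.65        # fraction of approaching water diverted around the net

# Naive estimate: every particle in the frontal cross-section reaches the mesh
delivered_naive = seston_conc * ambient_velocity * net_area
ce_naive = captured_per_s / delivered_naive

# Corrected estimate: only the non-diverted fraction actually passes through the mesh
delivered_through_net = delivered_naive * (1.0 - flow_diversion)
ce_corrected = captured_per_s / delivered_through_net

print(f"naive CE:     {ce_naive:.1%}")
print(f"corrected CE: {ce_corrected:.1%}")
```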
Abstract:
This study investigated the relationships between phonological awareness and reading in Oriya and English. Oriya is the official language of Orissa, an eastern state of India. The writing system is an alphasyllabary. Ninety-nine fifth-grade children (mean age 9 years 7 months) were assessed on measures of phonological awareness, word reading and pseudo-word reading in both languages. Forty-eight of the children attended Oriya-medium schools where they received literacy instruction in Oriya from grade 1 and learned English from grade 2. Fifty-one children attended English-medium schools where they received literacy instruction in English from grade 1 and in Oriya from grade 2. The results showed that phonological awareness in Oriya contributed significantly to reading Oriya and English words and pseudo-words for the children in the Oriya-medium schools. However, it only contributed to Oriya pseudo-word reading and English word reading for children in the English-medium schools. Phonological awareness in English contributed to English word and pseudo-word reading for both groups. Further analyses investigated the contribution of awareness of large phonological units (syllables, onsets and rimes) and small phonological units (phonemes) to reading in each language. The data suggest that cross-language transfer and facilitation of phonological awareness to word reading are not symmetrical across languages and may depend both on the characteristics of the different orthographies of the languages being learned and on whether the first literacy language is also the first spoken language.
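Analyses of the "contribution" of phonological awareness to reading are typically framed as incremental variance explained. The sketch below uses simulated scores (not the Oriya/English study data) to show how the incremental R² of phonological awareness is obtained once a control predictor is already in the model.

```python
import numpy as np

# Simulated placeholder data: a control predictor, a phonological awareness
# score, and a reading outcome partly driven by both.
rng = np.random.default_rng(2)
n = 99
control = rng.normal(0, 1, n)                       # e.g. a standardised control measure
pa = 0.4 * control + rng.normal(0, 1, n)            # phonological awareness score
reading = 0.3 * control + 0.5 * pa + rng.normal(0, 1, n)

def r_squared(y, predictors):
    """R^2 of an ordinary least-squares fit of y on the given predictors."""
    X = np.column_stack([np.ones(len(y))] + list(predictors))
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

r2_base = r_squared(reading, [control])
r2_full = r_squared(reading, [control, pa])
print(f"R^2 with control only:              {r2_base:.3f}")
print(f"R^2 adding phonological awareness:  {r2_full:.3f}")
print(f"incremental R^2:                    {r2_full - r2_base:.3f}")
```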
Abstract:
An increasing number of neuroscience experiments are using virtual reality to provide a more immersive and less artificial experimental environment. This is particularly useful to navigation and three-dimensional scene perception experiments. Such experiments require accurate real-time tracking of the observer's head in order to render the virtual scene. Here, we present data on the accuracy of a commonly used six-degrees-of-freedom tracker (Intersense IS900) when it is moved in ways typical of virtual reality applications. We compared the reported location of the tracker with its location computed by an optical tracking method. When the tracker was stationary, the root mean square error in spatial accuracy was 0.64 mm. However, we found that errors increased more than ten-fold (up to 17 mm) when the tracker moved at speeds common in virtual reality applications. We demonstrate that the errors we report here are predominantly due to inaccuracies of the IS900 system rather than the optical tracking against which it was compared. (c) 2006 Elsevier B.V. All rights reserved.
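The spatial accuracy figure quoted above is a root-mean-square error between simultaneous position reports. The short sketch below, with invented coordinates rather than IS900 or optical-tracker logs, shows how such an RMS error is computed.

```python
import numpy as np

# Invented simultaneous position samples from a head tracker and a reference
# optical tracking method, in metres.
tracker_xyz = np.array([[0.001, 0.002, 0.000],
                        [0.104, 0.051, 0.012],
                        [0.208, 0.101, 0.023]])
optical_xyz = np.array([[0.000, 0.002, 0.001],
                        [0.112, 0.048, 0.010],
                        [0.199, 0.105, 0.025]])

# Euclidean error for each simultaneous sample, then the RMS of those errors
errors = np.linalg.norm(tracker_xyz - optical_xyz, axis=1)
rmse_mm = np.sqrt(np.mean(errors ** 2)) * 1000.0
print(f"RMS spatial error: {rmse_mm:.2f} mm")
```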
Abstract:
In this paper, an improved stochastic discrimination (SD) is introduced to reduce the error rate of the standard SD in the context of a multi-class classification problem. The learning procedure of the improved SD consists of two stages. In the first stage, a standard SD, but with a shorter learning period, is carried out to identify an important space where all the misclassified samples are located. In the second stage, the standard SD is modified by (i) restricting sampling to the important space; and (ii) introducing a new discriminant function for samples in the important space. It is shown by mathematical derivation that the new discriminant function has the same mean but a smaller variance than that of the standard SD for samples in the important space. It is also shown analytically that the smaller the variance of the discriminant function, the lower the error rate of the classifier. Consequently, the proposed improved SD improves on the standard SD through its capability of achieving higher classification accuracy. Illustrative examples are provided to demonstrate the effectiveness of the proposed improved SD.
Abstract:
Stochastic discrimination (SD) depends on a discriminant function for classification. In this paper, an improved SD is introduced to reduce the error rate of the standard SD in the context of a two-class classification problem. The learning procedure of the improved SD consists of two stages. Initially, a standard SD, but with a shorter learning period, is carried out to identify an important space where all the misclassified samples are located. The standard SD is then modified by 1) restricting sampling to the important space, and 2) introducing a new discriminant function for samples in the important space. It is shown by mathematical derivation that the new discriminant function has the same mean but a smaller variance than that of the standard SD for samples in the important space. It is also shown analytically that the smaller the variance of the discriminant function, the lower the error rate of the classifier. Consequently, the proposed improved SD improves on the standard SD through its capability of achieving higher classification accuracy. Illustrative examples are provided to demonstrate the effectiveness of the proposed improved SD.
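The two-stage scheme described in this and the preceding abstract can be caricatured in code. The sketch below is a heavily simplified, hypothetical stand-in for stochastic discrimination: weak models are random axis-aligned boxes with Kleinberg-style normalised indicators, stage 1 is a short run used to locate misclassified training samples, and stage 2 merely restricts further box sampling to a bounding box around those samples; the papers' modified discriminant function for the important space is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(3)

def random_box(lo, hi):
    """A random axis-aligned box within the bounds lo, hi."""
    a, b = rng.uniform(lo, hi), rng.uniform(lo, hi)
    return np.minimum(a, b), np.maximum(a, b)

def build_sd(X1, X2, n_boxes, lo, hi, margin=0.1, max_tries=20000):
    """Collect weak models (boxes) whose class coverages differ by > margin."""
    boxes = []
    for _ in range(max_tries):
        if len(boxes) >= n_boxes:
            break
        bl, bh = random_box(lo, hi)
        p1 = np.all((X1 >= bl) & (X1 <= bh), axis=1).mean()
        p2 = np.all((X2 >= bl) & (X2 <= bh), axis=1).mean()
        if abs(p1 - p2) > margin:
            boxes.append((bl, bh, p1, p2))
    return boxes

def sd_score(x, boxes):
    """Average normalised indicator Y(x, S) = (1{x in S} - p2) / (p1 - p2)."""
    return np.mean([(np.all((x >= bl) & (x <= bh)) - p2) / (p1 - p2)
                    for bl, bh, p1, p2 in boxes])

def classify(X, boxes):
    return np.array([1 if sd_score(x, boxes) >= 0.5 else 2 for x in X])

# Toy two-class data (class means differ in the first feature)
X1 = rng.normal([0.0, 0.0], 1.0, size=(100, 2))
X2 = rng.normal([2.0, 0.0], 1.0, size=(100, 2))
X = np.vstack([X1, X2]); y = np.array([1] * 100 + [2] * 100)
lo, hi = X.min(axis=0), X.max(axis=0)

# Stage 1: short standard run to locate the misclassified ("important") samples
stage1 = build_sd(X1, X2, n_boxes=50, lo=lo, hi=hi)
wrong = X[classify(X, stage1) != y]

# Stage 2: keep sampling weak models, but only inside the important space
imp_lo, imp_hi = wrong.min(axis=0), wrong.max(axis=0)
stage2 = stage1 + build_sd(X1, X2, n_boxes=200, lo=imp_lo, hi=imp_hi)

print("stage-1 training accuracy:", np.mean(classify(X, stage1) == y))
print("stage-2 training accuracy:", np.mean(classify(X, stage2) == y))
```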
Abstract:
Background: This study was carried out as part of a European Union funded project (PharmDIS-e+) to develop and evaluate software aimed at assisting physicians with drug dosing. A drug that causes particular problems with dosing in primary care is digoxin, because of its narrow therapeutic range and low therapeutic index. Objectives: To determine (i) the accuracy of the PharmDIS-e+ software for predicting serum digoxin levels in patients who are taking this drug regularly; (ii) whether there are statistically significant differences between predicted digoxin levels and those measured by a laboratory; and (iii) whether there are differences between doses prescribed by general practitioners and those suggested by the program. Methods: We needed 45 patients to have 95% power to reject the null hypothesis that the mean serum digoxin concentration was within 10% of the mean predicted digoxin concentration. Patients were recruited from two general practices and had been taking digoxin for at least 4 months. Exclusion criteria were dementia, low adherence to digoxin and use of other medications known to interact to a clinically important extent with digoxin. Results: Forty-five patients were recruited. There was a correlation of 0.65 between measured and predicted digoxin concentrations (P < 0.001). The mean difference was 0.12 μg/L (SD 0.26; 95% CI 0.04, 0.19; P = 0.005). Forty-seven per cent of the patients were prescribed the same dose as recommended by the software, 44% were prescribed a higher dose and 9% a lower dose than recommended. Conclusion: PharmDIS-e+ software was able to predict serum digoxin levels with acceptable accuracy in most patients.
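The kind of prediction being assessed can be illustrated with a standard one-compartment steady-state formula, C_ss = (F × dose / τ) / CL. The bioavailability and clearance figures below are generic, textbook-style illustrations for a single hypothetical patient and are not the dosing model implemented in PharmDIS-e+.

```python
# One-compartment average steady-state concentration for an oral maintenance dose.
# All patient-specific values are invented for illustration.
dose_ug = 125.0          # maintenance dose, micrograms
tau_h = 24.0             # dosing interval, hours
bioavailability = 0.7    # oral tablet bioavailability (typical literature value)
clearance_l_per_h = 4.5  # illustrative total digoxin clearance for one patient

css_ug_per_l = (bioavailability * dose_ug / tau_h) / clearance_l_per_h
print(f"predicted steady-state level: {css_ug_per_l:.2f} ug/L")

# Comparison against a measured level, as in the study's accuracy assessment
measured_ug_per_l = 0.95
print(f"prediction error: {css_ug_per_l - measured_ug_per_l:+.2f} ug/L")
```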
Abstract:
Saccadic eye movements to a visual target are less accurate if there are distracters close to its location (local distracters). The addition of more distracters, remote from the target location (remote distracters), invokes an involuntary increase in the response latency of the saccade and attenuates the effect of local distracters on accuracy. This may be because the target and distracters compete directly (direct route) or because the remote distracters impair the ability to disengage from fixation (indirect route). To distinguish between these, we examined the development of saccade competition by recording the latency and accuracy of saccades made to a target and local distracter, compared with those made with the addition of a remote distracter. The direct route would predict that the remote distracter impacts on the developing competition between target and local distracter, while the indirect route would predict no change, as the accuracy benefit here derives from accessing the same competitive process but at a later stage. We found that the presence of the remote distracter did not change the pattern of accuracy improvement. This suggests that the remote distracter was acting along an indirect route that inhibits disengagement from fixation, slows saccade initiation, and enables more accurate saccades to be made.