40 results for "Error in essence"


Relevance:

90.00%

Publisher:

Abstract:

Basal ice samples were collected from ice exposures in a natural subglacial cavity beneath an outlet glacier of Øksfjordjøkelen, North Norway. Sediment and cation (Ca2+, Mg2+, Na+, K+) concentrations were then determined, and indicate stacking of basal ice units producing a repeating pattern of ‘clean firnification ice’ overlying sediment-rich ice. All measured cations correlate with sediment concentration, indicating that weathering reactions are the dominant contributor of cations. Regressions of specific sediment surface area per unit volume against cation concentration were performed and used to predict cation concentrations. These predicted values provide an indication of cation relocation within the basal ice sequence. The results suggest limited melting and refreezing, resulting in the relocation of predominantly monovalent cations downward through the profile. Exchange of cations into solution during the melting of sediment-rich ice samples has previously been suggested as a source of error in such investigations. Analyses of sediment-free regelation ice spicules formed at the bed show cation concentrations above firnification-ice levels and comparable, in many instances, to those of the basal ice samples.
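The regression step can be sketched numerically. The surface-area and Ca2+ values below are invented for illustration only; they are not the study's measurements.

```python
import numpy as np

# Hypothetical inputs: specific sediment surface area per unit volume
# and measured Ca2+ concentration for six basal ice samples (units and
# values are illustrative placeholders, NOT the study's data).
surface_area = np.array([0.5, 1.2, 2.0, 3.1, 4.4, 5.0])
ca_measured = np.array([12.0, 21.0, 33.0, 46.0, 62.0, 70.0])

# Least-squares regression of cation concentration on surface area.
slope, intercept = np.polyfit(surface_area, ca_measured, 1)

# Predicted concentrations; departures of measured from predicted values
# are the kind of signal used to infer cation relocation in the sequence.
ca_predicted = slope * surface_area + intercept
residuals = ca_measured - ca_predicted
print(f"slope = {slope:.2f}, intercept = {intercept:.2f}")
```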

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVE:

To estimate the prevalence of refractive errors in persons 40 years and older.

METHODS:

Counts of persons with phakic eyes with and without spherical equivalent refractive error in the worse eye of +3 diopters (D) or greater, -1 D or less, and -5 D or less were obtained from population-based eye surveys in strata of gender, race/ethnicity, and 5-year age intervals. Pooled age-, gender-, and race/ethnicity-specific rates for each refractive error were applied to the corresponding stratum-specific US, Western European, and Australian populations (years 2000 and projected 2020).

RESULTS:

Six studies provided data from 29,281 persons. In the US, Western European, and Australian year 2000 populations 40 years or older, the estimated crude prevalence for hyperopia of +3 D or greater was 9.9%, 11.6%, and 5.8%, respectively (11.8 million, 21.6 million, and 0.47 million persons). For myopia of -1 D or less, the estimated crude prevalence was 25.4%, 26.6%, and 16.4% (30.4 million, 49.6 million, and 1.3 million persons), respectively, of whom 4.5%, 4.6%, and 2.8% (5.3 million, 8.5 million, and 0.23 million persons), respectively, had myopia of -5 D or less. Projected prevalence rates in 2020 were similar.

CONCLUSIONS:

Refractive errors affect approximately one third of persons 40 years or older in the United States and Western Europe, and one fifth of Australians in this age group.
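The pooling arithmetic described in the methods can be sketched as follows. The stratum-specific rates and population counts here are invented placeholders, not the study's figures.

```python
# Sketch of applying pooled stratum-specific rates to census counts.
# Both dictionaries below are illustrative inventions.
rates = {            # prevalence of myopia of -1 D or less by (gender, age band)
    ("F", "40-44"): 0.30,
    ("M", "40-44"): 0.28,
    ("F", "45-49"): 0.26,
    ("M", "45-49"): 0.24,
}
population = {       # year-2000 population counts in the same strata
    ("F", "40-44"): 1_000_000,
    ("M", "40-44"): 950_000,
    ("F", "45-49"): 900_000,
    ("M", "45-49"): 880_000,
}

# Affected persons = sum over strata of rate x population; crude
# prevalence = affected / total population.
affected = sum(rates[s] * population[s] for s in rates)
crude_prevalence = affected / sum(population.values())
print(f"{affected:.0f} persons, crude prevalence {crude_prevalence:.1%}")
```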

Relevance:

90.00%

Publisher:

Abstract:

OBJECTIVES: To evaluate different refractive cutoffs for spectacle provision with regard to their impact on visual improvement and spectacle compliance. DESIGN: Prospective study of visual improvement and spectacle compliance. PARTICIPANTS: South African school children aged 6-19 years receiving free spectacles in a programme supported by Helen Keller International. METHODS: Refractive error, age, gender, urban versus rural residence, and presenting and best-corrected vision were recorded for participants. Spectacle wear was observed directly at an unannounced follow-up examination 4-11 months after initial provision of spectacles. The associations between five proposed refractive cutoff protocols and both visual improvement and spectacle compliance were examined in separate multivariate models. MAIN OUTCOMES: Refractive cutoffs for spectacle distribution that would effectively identify children with improved vision, and those more likely to comply with spectacle wear. RESULTS: Among 8520 children screened, 810 (9.5%) received spectacles, of whom 636 (79%) were aged 10-14 years, 530 (65%) were girls, 324 (40%) had vision improvement of ≥ 3 lines, and 483 (60%) were examined 6.4 ± 1.5 (range 4.6 to 10.9) months after spectacle dispensing. Among examined children, 149 (31%) were wearing or carrying their glasses. Children meeting cutoffs of ≤ -0.75 D of myopia, ≥ +1.00 D of hyperopia and ≥ +0.75 D of astigmatism had significantly greater improvement in vision than children failing to meet these criteria, when adjusting for age, gender and urban versus rural residence. None of the proposed refractive protocols discriminated between children wearing and not wearing spectacles. Presenting vision and improvement in vision were unassociated with subsequent spectacle wear, but girls (p ≤ 0.0006 for all models) were more likely to be wearing glasses than were boys.
CONCLUSIONS: To the best of our knowledge, this is the first suggested refractive cutoff for glasses dispensing validated with respect to key programme outcomes. The lack of association between spectacle retention and either refractive error or vision may have been due to the relatively modest degree of refractive error in this African population.
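A minimal sketch of how such cutoffs could be applied in a screening programme. The function, its name, and its input conventions (negative sphere = myopia, cylinder magnitude = astigmatism) are assumptions for illustration, not the programme's actual software.

```python
# Hypothetical helper applying the proposed refractive cutoffs:
# myopia <= -0.75 D, hyperopia >= +1.00 D, or astigmatism >= 0.75 D.
def meets_cutoff(sphere_d: float, cyl_d: float) -> bool:
    """Return True if spectacle provision is indicated under the cutoffs."""
    return sphere_d <= -0.75 or sphere_d >= 1.00 or abs(cyl_d) >= 0.75

print(meets_cutoff(-1.25, 0.0))   # myopia beyond -0.75 D
print(meets_cutoff(-0.50, 0.25))  # subthreshold on all three criteria
```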

Relevance:

90.00%

Publisher:

Abstract:

PURPOSE: To evaluate the prevalence and causes of visual impairment among Chinese children aged 3 to 6 years in Beijing. DESIGN: Population-based prevalence survey. METHODS: Presenting and pinhole visual acuity were tested using picture optotypes or, in children with pinhole vision < 6/18, a Snellen tumbling E chart. Comprehensive eye examinations and cycloplegic refraction were carried out for children with pinhole vision < 6/18 in the better-seeing eye. RESULTS: All examinations were completed on 17,699 children aged 3 to 6 years (95.3% of the sample). Subjects with bilateral correctable low vision (presenting vision < 6/18 correctable to ≥ 6/18) numbered 57 (0.322%; 95% confidence interval [CI], 0.237% to 0.403%), while 14 (0.079%; 95% CI, 0.038% to 0.120%) had bilateral uncorrectable low vision (best-corrected vision < 6/18 and ≥ 3/60), and 5 subjects (0.028%; 95% CI, 0.004% to 0.054%) were bilaterally blind (best-corrected acuity < 3/60). The etiology of the 76 cases of visual impairment included refractive error in 57 children (75%), hereditary factors (microphthalmos, congenital cataract, congenital motor nystagmus, albinism, and optic nerve disease) in 13 children (17.1%), amblyopia in 3 children (3.95%), and cortical blindness in 1 child (1.3%). The cause of visual impairment could not be established in 2 (2.63%) children. The prevalence of visual impairment did not differ by gender, but correctable low vision was significantly (P < .0001) more common among urban as compared with rural children. CONCLUSION: The leading causes of visual impairment among Chinese preschool-aged children are refractive error and hereditary eye diseases. A higher prevalence of refractive error is already present among urban as compared with rural children in this preschool population.
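The quoted confidence intervals are consistent with a simple normal approximation for a proportion, sketched below. This is an assumption about the computation; the study may have used a different interval method.

```python
import math

def prevalence_ci(cases: int, n: int, z: float = 1.96):
    """Normal-approximation 95% CI for a prevalence proportion."""
    p = cases / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Bilateral correctable low vision: 57 of 17,699 children examined.
p, lo, hi = prevalence_ci(57, 17699)
print(f"{p:.3%} ({lo:.3%} to {hi:.3%})")  # close to the quoted 0.237%-0.403%
```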

Relevance:

90.00%

Publisher:

Abstract:

Breast cancer is a heterogeneous disease, at both an inter- and intra-tumoural level. Appreciating heterogeneity through the application of biomarkers and molecular signatures adds complexity to tumour taxonomy but is key to personalising diagnosis, treatment and prognosis. The extent to which heterogeneity exists, and its interpretation remains a challenge to pathologists. Using HER2 as an exemplar, we have developed a simple reproducible heterogeneity index. Cell-to-cell HER2 heterogeneity was extensive in a proportion of both reported 'amplified' and 'non-amplified' cases. The highest levels of heterogeneity objectively identified occurred in borderline categories and higher ratio non-amplified cases. A case with particularly striking heterogeneity was analysed further with an array of biomarkers in order to assign a molecular diagnosis. Broad biological complexity was evident. In essence, interpretation, depending on the area of tumour sampled, could have been one of three distinct phenotypes, each of which would infer different therapeutic interventions. Therefore, we recommend that heterogeneity is assessed and taken into account when determining treatment options.

Relevance:

80.00%

Publisher:

Abstract:

The inclusion of collisional rates for He-like Fe and Ca ions is discussed with reference to the analysis of solar flare Fe XXV and Ca XIX line emission, particularly from the Yohkoh Bragg Crystal Spectrometer (BCS). The new data are a slight improvement on calculations presently used in the BCS analysis software in that the discrepancy in the Fe XXV y and z line intensities (observed larger than predicted) is reduced. Values of electron temperature from satellite-to-resonance line ratios are slightly reduced (by up to 1 MK) for a given observed ratio. The new atomic data will be incorporated in the Yohkoh BCS databases. The data should also be of interest for the analysis of high-resolution, non-solar spectra expected from the Constellation-X and Astro-E space missions. A comparison is made of a tokamak S XV spectrum with a synthetic spectrum using atomic data in the existing software and the agreement is found to be good, so validating these data for particularly high-n satellite wavelengths close to the S XV resonance line. An error in a data file used for analyzing BCS Fe XXVI spectra is corrected, so permitting analysis of these spectra.

Relevance:

80.00%

Publisher:

Abstract:

Cooling of mechanical resonators is currently a popular topic in many fields of physics including ultra-high precision measurements, detection of gravitational waves and the study of the transition between classical and quantum behaviour of a mechanical system. Here we report the observation of self-cooling of a micromirror by radiation pressure inside a high-finesse optical cavity. In essence, changes in intensity in a detuned cavity, as caused by the thermal vibration of the mirror, provide the mechanism for entropy flow from the mirror's oscillatory motion to the low-entropy cavity field. The crucial coupling between radiation and mechanical motion was made possible by producing free-standing micromirrors of low mass (m ≈ 400 ng), high reflectance (more than 99.6%) and high mechanical quality (Q ≈ 10,000). We observe cooling of the mechanical oscillator by a factor of more than 30; that is, from room temperature to below 10 K. In addition to purely photothermal effects we identify radiation pressure as a relevant mechanism responsible for the cooling. In contrast with earlier experiments, our technique does not need any active feedback. We expect that improvements of our method will permit cooling ratios beyond 1,000 and will thus possibly enable cooling all the way down to the quantum mechanical ground state of the micromirror.

Relevance:

80.00%

Publisher:

Abstract:

Effects of inappropriate installation can bias the measurements of flowmeters. For vortex flowmeters, a method is proposed to detect inappropriate installation of the flowmeter from the oscillatory signal of the vortex sensor. The method is based on assuming the process of vortex generation to be a generic, noisy, nonlinear oscillation, describable by a noisy Stuart-Landau equation, with a corresponding sensor signal that also contains higher harmonic excitations. By making use of the scaling properties of the Navier-Stokes equation, the method was designed to be robust with respect to uncertainties in the fluid properties. The diagnostic functionality is demonstrated on measurement data. In the experiments, installation effects that lead to more than 0.5% error in the output of the flowmeter could clearly be detected. © 2003 Elsevier Ltd. All rights reserved.
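A noisy Stuart-Landau oscillation of the kind the method assumes can be sketched with a simple Euler-Maruyama integration. All parameter values here are arbitrary choices for demonstration, not the flowmeter's.

```python
import random

# Euler-Maruyama integration of a noisy Stuart-Landau oscillator,
# dA/dt = (mu + i*omega)*A - |A|^2 * A + noise, the generic model of
# vortex shedding. Parameters are illustrative only.
mu, omega, sigma, dt = 1.0, 2.0 * 3.141592653589793, 0.1, 1e-3
a = 0.1 + 0.0j                      # complex oscillation amplitude
random.seed(0)
for _ in range(20000):
    noise = complex(random.gauss(0, 1), random.gauss(0, 1))
    a += ((mu + 1j * omega) * a - abs(a) ** 2 * a) * dt \
         + sigma * noise * dt ** 0.5
# Without noise the amplitude settles on the limit cycle |A| = sqrt(mu).
print(abs(a))
```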

Relevance:

80.00%

Publisher:

Abstract:

Three-dimensional reconstruction from volumetric medical images (e.g. CT, MRI) is a well-established technology used in patient-specific modelling. However, there are many cases where only 2D (planar) images may be available, e.g. if radiation dose must be limited or if retrospective data is being used from periods when 3D data was not available. This study aims to address such cases by proposing an automated method to create 3D surface models from planar radiographs. The method consists of (i) contour extraction from the radiograph using an Active Contour (Snake) algorithm, (ii) selection of a closest matching 3D model from a library of generic models, and (iii) warping the selected generic model to improve correlation with the extracted contour.

This method proved to be fully automated, rapid and robust on a given set of radiographs. Measured mean surface distance error values were low when comparing models reconstructed from matching pairs of CT scans and planar X-rays (2.57–3.74 mm) and within ranges of similar studies. Benefits of the method are that it requires a single radiographic image to perform the surface reconstruction task and it is fully automated. Mechanical simulations of loaded bone with different levels of reconstruction accuracy showed that an error in predicted strain fields grows proportionally to the error level in geometric precision. In conclusion, models generated by the proposed technique are deemed acceptable to perform realistic patient-specific simulations when 3D data sources are unavailable.
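The three-stage pipeline above can be outlined structurally. Every function below is a trivial stand-in (not the snake algorithm or a real warp), and the model names, contours and data layout are invented for illustration.

```python
# Structural sketch of the (i)-(iii) pipeline; contours are reduced to
# short lists of numbers purely for demonstration.

def extract_contour(radiograph):
    # Step (i): stand-in for Active Contour (snake) segmentation.
    return radiograph["contour"]

def closest_generic_model(contour, library):
    # Step (ii): choose the library model whose silhouette best matches.
    def mismatch(model):
        return sum((a - b) ** 2 for a, b in zip(model["silhouette"], contour))
    return min(library, key=mismatch)

def warp_model(model, contour):
    # Step (iii): stand-in warp; just records the target contour.
    return {"name": model["name"], "fitted_to": contour}

library = [{"name": "femur_small", "silhouette": [1.0, 2.0, 3.0]},
           {"name": "femur_large", "silhouette": [2.0, 4.0, 6.0]}]
xray = {"contour": [1.1, 2.1, 2.9]}

contour = extract_contour(xray)
result = warp_model(closest_generic_model(contour, library), contour)
print(result["name"])
```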

Relevance:

80.00%

Publisher:

Abstract:

This is the first in a two-part analysis of Northern Ireland’s engagement with the climate governance regime created by the UK Climate Change Act 2008. It contends that UK devolution has shaped this national regime and may itself be shaped by the national low carbon transition, particularly in the case of the UK’s most devolved region. In essence, while Northern Ireland’s consent to the application of the Act appeared to represent a long-term commitment to share power in the interests of present and future generations and thus to devolution itself, this first article argues that it was also potentially illusory. The second article argues that making an effective commitment to climate governance will require its devolved administration to allow constitutional arrangements designed for conflict resolution to mature. Failure to do so will have important implications for the UK’s putative ‘national’ low carbon transition and the longer term viability of devolution in the region.

Relevance:

80.00%

Publisher:

Abstract:

The finite element method plays an extremely important role in forging process design as it provides a valid means to quantify forging errors and thereby govern die shape modification to improve the dimensional accuracy of the component. However, this dependency on process simulation could raise significant problems and present a major drawback if the finite element simulation results were inaccurate. This paper presents a novel approach to assess the dimensional accuracy and shape quality of aeroengine blades formed from finite element hot-forging simulation. The proposed virtual inspection system uses conventional algorithms adopted by modern coordinate measurement processes as well as the latest free-form surface evaluation techniques to provide a robust framework for virtual forging error assessment. Established techniques for the physical registration of real components have been adapted to localise virtual models in relation to a nominal Design Coordinate System. Blades are then automatically analysed using a series of intelligent routines to generate measurement data and compute dimensional errors. The results of a comparison study indicate that the virtual inspection results and actual coordinate measurement data are highly comparable, validating the approach as an effective and accurate means to quantify forging error in a virtual environment. Consequently, this provides adequate justification for the implementation of the virtual inspection system in the virtual process design, modelling and validation of forged aeroengine blades in industry.
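Once a formed-blade model is localised in the nominal Design Coordinate System, dimensional error reduces to point-wise deviations between formed and nominal surfaces, sketched below. The coordinates are invented; real blades would use dense measured point clouds.

```python
import math

# Invented nominal vs formed surface points (mm), assumed already
# registered in the same Design Coordinate System.
nominal = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (10.0, 5.0, 0.0)]
formed = [(0.02, -0.01, 0.0), (10.05, 0.0, 0.03), (9.96, 5.02, -0.01)]

# Dimensional error as point-wise Euclidean deviation.
devs = [math.dist(p, q) for p, q in zip(nominal, formed)]
print(f"max {max(devs):.3f} mm, mean {sum(devs) / len(devs):.3f} mm")
```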

Relevance:

80.00%

Publisher:

Abstract:

Data obtained with any research tool must be reproducible, a concept referred to as reliability. Three techniques are often used to evaluate reliability of tools using continuous data in aging research: intraclass correlation coefficients (ICC), Pearson correlations, and paired t tests. These are often construed as equivalent when applied to reliability. This is not correct, and may lead researchers to select instruments based on statistics that may not reflect actual reliability. The purpose of this paper is to compare the reliability estimates produced by these three techniques and determine the preferable technique. A hypothetical dataset was produced to evaluate the reliability estimates obtained with ICC, Pearson correlations, and paired t tests in three different situations. For each situation two sets of 20 observations were created to simulate an intra-rater or inter-rater paradigm, based on 20 participants with two observations per participant. Situations were designed to demonstrate good agreement, systematic bias, or substantial random measurement error. In the situation demonstrating good agreement, all three techniques supported the conclusion that the data were reliable. In the situation demonstrating systematic bias, the ICC and t test suggested the data were not reliable, whereas the Pearson correlation suggested high reliability despite the systematic discrepancy. In the situation representing substantial random measurement error where low reliability was expected, the ICC and Pearson coefficient accurately illustrated this. The t test suggested the data were reliable. The ICC is the preferred technique to measure reliability. Although there are some limitations associated with the use of this technique, they can be overcome.
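The systematic-bias scenario can be reproduced in a few lines. The dataset below and the two-rater ICC(2,1) implementation are a minimal sketch of the general idea, not the paper's actual hypothetical data.

```python
import statistics as st

# Simulated two-rater data with a constant +5 systematic bias plus tiny
# jitter: the bias scenario described above. All values are invented.
r1 = [10, 11, 12, 13, 14, 15, 16, 17, 18, 19]
jitter = [0.1, -0.1, 0.2, -0.2, 0.1, -0.1, 0.2, -0.2, 0.1, -0.1]
r2 = [a + 5 + j for a, j in zip(r1, jitter)]

def pearson(x, y):
    mx, my = st.mean(x), st.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x)
           * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def icc_a1(x, y):
    """Two-way random-effects, absolute-agreement ICC(2,1), two raters."""
    n, k = len(x), 2
    grand = st.mean(x + y)
    ss_rows = k * sum((st.mean(p) - grand) ** 2 for p in zip(x, y))
    ss_cols = n * ((st.mean(x) - grand) ** 2 + (st.mean(y) - grand) ** 2)
    ss_err = sum((v - grand) ** 2 for v in x + y) - ss_rows - ss_cols
    msr, msc = ss_rows / (n - 1), ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

diffs = [b - a for a, b in zip(r1, r2)]
t = st.mean(diffs) / (st.stdev(diffs) / len(diffs) ** 0.5)

print(f"Pearson r = {pearson(r1, r2):.3f}")   # ~1.0: blind to the offset
print(f"ICC(2,1)  = {icc_a1(r1, r2):.3f}")    # ~0.42: penalises the bias
print(f"paired t  = {t:.1f}")                 # large: bias is significant
```

Pearson correlation rewards any linear relationship, so the offset is invisible to it; the agreement-form ICC and the paired t test both flag the discrepancy, matching the paper's argument.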

Relevance:

80.00%

Publisher:

Abstract:

Efficiently exploring exponential-size architectural design spaces with many interacting parameters remains an open problem: the sheer number of experiments required renders detailed simulation intractable. We attack this via an automated approach that builds accurate predictive models. We simulate sampled points, using the results to teach our models the function describing relationships among design parameters. The models can be queried and are very fast, enabling efficient design tradeoff discovery. We validate our approach via two uniprocessor sensitivity studies, predicting IPC with only 1–2% error. In an experimental study using the approach, training on 1% of a 250-K-point CMP design space allows our models to predict performance with only 4–5% error. Our predictive modeling combines well with techniques that reduce the time taken by each simulation experiment, achieving net time savings of three to four orders of magnitude.
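The sample-then-model idea can be illustrated on a toy design space. The "simulator" here is an invented analytic IPC function standing in for detailed simulation; the features, parameters and sample size are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_ipc(cache_kb, width):
    # Stand-in for one slow, detailed simulation run (noisy, invented).
    return 0.4 * np.log2(cache_kb) + 0.3 * width + 0.05 * rng.standard_normal()

caches = 2.0 ** np.arange(4, 12)          # 16 KB .. 2048 KB
widths = np.arange(1.0, 9.0)              # issue widths 1..8
space = np.array([(c, w) for c in caches for w in widths])   # 64 design points

def features(pts):
    return np.column_stack([np.log2(pts[:, 0]), pts[:, 1], np.ones(len(pts))])

# "Simulate" only a 25% sample, then fit a linear model to those results.
idx = rng.choice(len(space), size=16, replace=False)
sampled_ipc = np.array([simulate_ipc(c, w) for c, w in space[idx]])
coef, *_ = np.linalg.lstsq(features(space[idx]), sampled_ipc, rcond=None)

# Query the cheap model over the whole space and check it against truth.
truth = 0.4 * np.log2(space[:, 0]) + 0.3 * space[:, 1]
pred = features(space) @ coef
print(f"mean abs prediction error: {np.mean(np.abs(pred - truth)):.3f}")
```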

Relevance:

80.00%

Publisher:

Abstract:

In essence, optimal software engineering means creating the right product, through the right process, to the overall satisfaction of everyone involved. Adopting the agile approach to software development appears to have helped many companies make substantial progress towards that goal. The purpose of this paper is to clarify that contribution from comparative survey information gathered in 2010 and 2012. The surveys were undertaken in software development companies across Northern Ireland. The paper describes the design of the surveys and discusses optimality in relation to the results obtained. Both surveys aimed to achieve comprehensive coverage of a single region rather than rely on a voluntary sample. The main outcome from the work is a collection of insights into the nature and advantages of agile development, suggesting how further progress towards optimality might be achieved.

Relevance:

80.00%

Publisher:

Abstract:

Organismal metabolic rates influence many ecological processes, and the mass-specific metabolic rate of organisms decreases with increasing body mass according to a power law. The exponent in this equation is commonly thought to be the three-quarter-power of body mass, determined by fundamental physical laws that extend across taxa. However, recent work has cast doubt as to the universality of this relationship, the value of 0.75 being an interspecies 'average' of scaling exponents that vary naturally between certain boundaries. There is growing evidence that metabolic scaling varies significantly between even closely related species, and that different values can be associated with lifestyle, activity and metabolic rates. Here we show that the value of the metabolic scaling exponent varies within a group of marine ectotherms, chitons (Mollusca: Polyplacophora: Mopaliidae), and that differences in the scaling relationship may be linked to species-specific adaptations to different but overlapping microhabitats. Oxygen consumption rates of six closely related, co-occurring chiton species from the eastern Pacific (Vancouver Island, British Columbia) were examined under controlled experimental conditions. Results show that the scaling exponent varies between species (between 0.64 and 0.91). Different activity levels, metabolic rates and lifestyle may explain this variation. The interspecific scaling exponent in these data is not significantly different from the archetypal 0.75 value, even though five out of six species-specific values are significantly different from that value. Our data suggest that studies using commonly accepted values such as 0.75 derived from theoretical models to extrapolate metabolic data of species to population or community levels should consider the likely variation in exponents that exists in the real world, or seek to encompass such error in their models. 
This study, like numerous previous ones, demonstrates that scaling exponents show large, naturally occurring variation, and provides further evidence against the existence of a universal scaling law. © 2012 Elsevier B.V.
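The scaling exponent in the allometric relation B = a·M^b is typically recovered by ordinary least squares on log-transformed data. A minimal sketch with synthetic data generated at b = 0.8 (within the 0.64–0.91 range reported above):

```python
import math

# Synthetic mass/metabolic-rate pairs generated from B = 2 * M**0.8;
# these are illustrative, not the chiton measurements.
masses = [1.0, 2.0, 4.0, 8.0, 16.0, 32.0]
rates = [2.0 * m ** 0.8 for m in masses]

# Log-log regression: log B = log a + b * log M, so the slope is b.
logm = [math.log(m) for m in masses]
logb = [math.log(r) for r in rates]
n = len(masses)
mx, my = sum(logm) / n, sum(logb) / n
b = (sum((x - mx) * (y - my) for x, y in zip(logm, logb))
     / sum((x - mx) ** 2 for x in logm))
print(round(b, 3))  # recovers the generating exponent 0.8
```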