926 results for validation tests of PTO


Relevance:

100.00%

Publisher:

Abstract:

PURPOSE: To determine the objective measures of visual function that are most relevant to subjective quality of vision and perceived reading ability in patients with acquired macular disease. METHODS: Twenty-eight patients with macular disease underwent a comprehensive assessment of visual function. The patients also completed a vision-related quality-of-life questionnaire that included a section of general questions about perceived visual performance and a section with specific questions on reading. RESULTS: Results of all tests of vision correlated highly with reported vision-related quality-of-life impairment. Low-contrast tests explained most of the variance in self-reported problems with reading. Text-reading speed correlated highly with overall concern about vision. CONCLUSIONS: Reading performance is strongly associated with vision-related quality of life. High-contrast distance acuity is not the only relevant measure of visual function in relation to the perceived visual performance of a patient with macular disease. The results suggest the importance of print contrast, even over print size, in reading performance in patients with acquired macular disease.

Relevance:

100.00%

Publisher:

Abstract:

Context: Population-based screening has been advocated for subclinical thyroid dysfunction in the elderly because the disorder is perceived to be common, and health benefits may be accrued by detection and treatment. Objective: The objective of the study was to determine the prevalence of subclinical thyroid dysfunction and unidentified overt thyroid dysfunction in an elderly population. Design, Setting, and Participants: A cross-sectional survey of a community sample of participants aged 65 yr and older registered with 20 family practices in the United Kingdom. Exclusions: Exclusions included current therapy for thyroid disease, thyroid surgery, or treatment within 12 months. Outcome Measure: Tests of thyroid function (TSH concentration and free T4 concentration in all, with measurement of free T3 in those with low TSH) were conducted. Explanatory Variables: These included all current medical diagnoses and drug therapies, age, gender, and socioeconomic deprivation (Index of Multiple Deprivation, 2004). Analysis: Standardized prevalence rates were analyzed. Logistic regression modeling was used to determine factors associated with the presence of subclinical thyroid dysfunction. Results: A total of 5960 participants attended for screening. Using biochemical definitions, 94.2% [95% confidence interval (CI) 93.8-94.6%] were euthyroid. Unidentified overt hyper- and hypothyroidism were uncommon (0.3% and 0.4%, respectively). Subclinical hyperthyroidism and hypothyroidism were identified with similar frequency (2.1%, 95% CI 1.8-2.3%; 2.9%, 95% CI 2.6-3.1%, respectively). Subclinical thyroid dysfunction was more common in females (P < 0.001) and with increasing age (P < 0.001). After allowing for comorbidities, concurrent drug therapies, age, and gender, an association between subclinical hyperthyroidism and a composite measure of socioeconomic deprivation remained. Conclusions: Undiagnosed overt thyroid dysfunction is uncommon. The prevalence of subclinical thyroid dysfunction is 5%.
We have, for the first time, identified an independent association between the prevalence of subclinical thyroid dysfunction and deprivation that cannot be explained solely by the greater burden of chronic disease and/or consequent drug therapies in the deprived population. Copyright © 2006 by The Endocrine Society.
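By way of illustration of the confidence intervals quoted above, a normal-approximation (Wald) interval for a proportion can be sketched as follows. The abstract reports standardized prevalence rates, so this naive calculation will not reproduce the published bounds exactly, and the success count below is back-calculated from the reported 94.2%.

```python
from math import sqrt

def wald_ci(successes, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion."""
    p = successes / n
    se = sqrt(p * (1 - p) / n)  # standard error of the sample proportion
    return p, p - z * se, p + z * se

# Euthyroid proportion from the abstract: 94.2% of the 5960 screened.
p, lower, upper = wald_ci(round(0.942 * 5960), 5960)
```

For proportions very close to 0 or 1, a Wilson or exact interval behaves better than the Wald form sketched here.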

Relevance:

100.00%

Publisher:

Abstract:

The field evaporation literature has been carefully analysed and is shown to contain various confusions. After redefining consistent terminology, this thesis investigates the mechanisms of field evaporation, in particular the relevance of the theoretical mechanisms, by analysing the available experimental data. A new formalism, the 'extended image-hump formalism', is developed and used to devise several tests of whether the image-hump mechanism is operating. The general conclusion is that in most cases the Mueller mechanism is not operating and escape takes place via Gomer-type mechanisms.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: To develop a questionnaire that subjectively assesses near visual function in patients with 'accommodating' intraocular lenses (IOLs). Methods: A literature search of existing vision-related quality-of-life instruments identified all questions relating to near visual tasks. Questions were combined if repeated in multiple instruments. Further relevant questions were added and item interpretation confirmed through multidisciplinary consultation and focus groups. A preliminary 19-item questionnaire was presented to 22 subjects at their 4-week visit after first-eye phacoemulsification with 'accommodative' IOL implantation, and again 6 and 12 weeks post-operatively. Rasch analysis, frequency of endorsement, and tests of normality (skew and kurtosis) were used to reduce the instrument. Cronbach's alpha and test-retest reliability (intraclass correlation coefficient, ICC) were determined for the final questionnaire. Construct validity was obtained by Pearson's product moment correlation (PPMC) of questionnaire scores to reading acuity (RA) and to critical print size (CPS) reading speed. Criterion validity was obtained by receiver operating characteristic (ROC) curve analysis, and dimensionality of the questionnaire was assessed by factor analysis. Results: Rasch analysis eliminated nine items due to poor fit statistics. The final items have good separation (2.55), internal consistency (Cronbach's α = 0.97) and test-retest reliability (ICC = 0.66). PPMC of questionnaire scores with RA was 0.33, and with CPS reading speed was 0.08. Area under the ROC curve was 0.88, and factor analysis revealed one principal factor. Conclusion: The pilot data indicate that the questionnaire is an internally consistent, reliable and valid instrument that could be useful for assessing near visual function in patients with 'accommodating' IOLs. The questionnaire will now be expanded to include other types of presbyopic correction. © 2007 British Contact Lens Association.
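Cronbach's alpha, the internal-consistency statistic reported above, has a simple closed form. The sketch below uses hypothetical item scores, not the study's data.

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """item_scores: one list of scores per item, respondents aligned by index.
    alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    k = len(item_scores)
    item_var_sum = sum(pvariance(item) for item in item_scores)
    totals = [sum(scores) for scores in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))

# Hypothetical responses: three items answered by four respondents.
alpha = cronbach_alpha([[1, 2, 3, 4], [2, 2, 3, 4], [1, 3, 3, 4]])
```

Items that rank respondents identically drive alpha towards 1, which is why a high value (such as the 0.97 reported) indicates strong internal consistency.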

Relevance:

100.00%

Publisher:

Abstract:

The pathological lesions characteristic of Alzheimer's disease (AD), viz., senile plaques (SP) and neurofibrillary tangles (NFT), may not be randomly distributed with reference to each other but may exhibit a degree of spatial association or correlation. Information on the degree of association between SP and NFT, or between the lesions and normal histological features such as neuronal perikarya and blood vessels, may be valuable in elucidating the pathogenesis of AD. This article reviews the statistical methods available for studying the degree of spatial association in histological sections of AD tissue. These include tests of interspecific association between two or more histological features using chi-square contingency tables, measurement of 'complete' and 'absolute' association, and more complex methods that use grids of contiguous samples. In addition, analyses of association using correlation matrices and stepwise multiple regression methods are described. The advantages and limitations of each method are reviewed and possible future developments discussed.
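The chi-square contingency-table test of interspecific association mentioned above can be sketched for a 2x2 presence/absence table; the field counts here are hypothetical, not data from the review.

```python
def chi2_association(a, b, c, d):
    """2x2 chi-square for interspecific association between two features.
    a: fields with both features, b: first only, c: second only, d: neither."""
    n = a + b + c + d
    expected_a = (a + b) * (a + c) / n  # co-occurrence expected under independence
    # Standard shortcut formula for the chi-square statistic of a 2x2 table.
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return chi2, a > expected_a  # True means positive association

# Hypothetical counts: SP and NFT co-occur more often than chance predicts.
chi2, positive = chi2_association(a=30, b=10, c=10, d=30)
```

Comparing the statistic against the chi-square distribution with one degree of freedom (critical value 3.84 at P = 0.05) then gives the significance of the association.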

Relevance:

100.00%

Publisher:

Abstract:

Stereology and other image analysis methods have enabled rapid and objective quantitative measurements to be made on histological sections. These measurements may include total volumes, surfaces, lengths and numbers of cells, blood vessels or pathological lesions. Histological features, however, may not be randomly distributed across a section but may exhibit 'dispersion', a departure from randomness either towards regularity or towards aggregation. Information on population dispersion may be valuable not only in understanding the two- or three-dimensional structure but also in elucidating the pathogenesis of lesions in pathological conditions. This article reviews some of the statistical methods available for studying dispersion. These range from simple tests of whether the distribution of a histological feature departs significantly from random to more complex methods which can detect the intensity of aggregation and the sizes, distribution and spacing of the clusters.
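The simplest of the dispersion tests surveyed here is the variance-to-mean ratio (index of dispersion) for quadrat counts; a minimal sketch with made-up counts, not data from the article:

```python
from statistics import mean, pvariance

def dispersion_index(counts):
    """Variance-to-mean ratio for quadrat counts of a histological feature:
    ~1 suggests randomness (Poisson), >1 aggregation, <1 regularity."""
    return pvariance(counts) / mean(counts)

regular = dispersion_index([5, 5, 5, 5])     # identical counts: ratio 0, regular
clustered = dispersion_index([0, 0, 0, 20])  # all in one quadrat: strongly aggregated
```

The more complex methods the article reviews (e.g. analyses over grids of contiguous samples) extend this idea to estimate cluster size and spacing rather than merely detecting non-randomness.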

Relevance:

100.00%

Publisher:

Abstract:

Discrete, microscopic lesions develop in the brain in a number of neurodegenerative diseases. These lesions may not be randomly distributed in the tissue but may exhibit a spatial pattern, i.e., a departure from randomness towards regularity or clustering. The spatial pattern of a lesion may reflect its development in relation to other brain lesions or to neuroanatomical structures. Hence, a study of spatial pattern may help to elucidate the pathogenesis of a lesion. A number of statistical methods can be used to study the spatial patterns of brain lesions. They range from simple tests of whether the distribution of a lesion departs from random to more complex methods which can detect clustering and the size, distribution and spacing of clusters. This paper reviews the uses and limitations of these methods as applied to neurodegenerative disorders, and in particular to senile plaque formation in Alzheimer's disease.

Relevance:

100.00%

Publisher:

Abstract:

Some of the factors affecting colonisation of a colonisation sampler, the Standard Aufwuchs Unit (S. Auf. U.), were investigated, namely immersion period, whether anchored on the bottom or suspended, and the influence of riffles. It was concluded that a four-week immersion period was best. S. Auf. U. anchored on the bottom collected both more taxa and more individuals than suspended ones. Fewer taxa but more individuals colonised S. Auf. U. in the potamon zone compared to the rhithron zone, with a consequent reduction in the values of pollution indexes and in diversity. It was concluded that a completely different scoring system was necessary for lowland rivers. Macroinvertebrates colonising S. Auf. U. in simulated streams, lowland rivers and the R. Churnet reflected water quality. A variety of pollution and diversity indexes were applied to results from lowland river sites. Instead of these, it was recommended that an abbreviated species-relative abundance list be used to summarise biological data for use in lowland river surveillance. An intensive study of gastropod populations was made in simulated streams. Lymnaea peregra increased in abundance whereas Potamopyrgus jenkinsi decreased with increasing sewage effluent concentration. No clear-cut differences in reproduction were observed. The presence/absence of eight gastropod taxa was compared with concentrations of various pollutants in lowland rivers. On the basis of all field work it appeared that ammonia, nitrite, copper and zinc were the toxicants most likely to be detrimental to gastropods, and that P. jenkinsi and Theodoxus fluviatilis were the least tolerant taxa. 96-h acute toxicity tests on P. jenkinsi using ammonia and copper were carried out in a flow-through system after a variety of static range-finding tests. P. jenkinsi was intolerant of both toxicants compared to reports on other taxa, and the results suggested that these toxicants would affect the distribution of this species in the field.

Relevance:

100.00%

Publisher:

Abstract:

This work concerns the development of a proton-induced X-ray emission (PIXE) analysis system and a multi-sample scattering chamber facility. The characteristics of the beam pulsing system and its counting rate capabilities were evaluated by observing the ion-induced X-ray emission from pure thick copper targets, with and without beam pulsing operation. The characteristic X-rays were detected with a high resolution Si(Li) detector coupled to a multi-channel analyser. The removal of the pile-up continuum by the use of on-demand beam pulsing is clearly demonstrated in this work. This new on-demand pulsing system, with its counting rate capability of 25, 18 and 10 kPPS corresponding to 2, 4 and 8 μs main amplifier time constants respectively, enables thick targets to be analysed more readily. Reproducibility tests of the on-demand beam pulsing system operation were checked by repeated measurements of the system throughput curves, with and without beam pulsing. The reproducibility of the analysis performed using this system was also checked by repeated measurements of the intensity ratios from a number of standard binary alloys during the experimental work. A computer programme has been developed to evaluate the calculations of the X-ray yields from thick targets bombarded by protons, taking into account the secondary X-ray yield production due to characteristic X-ray fluorescence from an element whose characteristic X-ray energy is higher than the absorption edge energy of the other element present in the target. This effect was studied on metallic binary alloys such as Fe/Ni and Cr/Fe. The quantitative analysis of Fe/Ni and Cr/Fe alloy samples to determine their elemental composition, taking the enhancement into account, has been demonstrated in this work. Furthermore, the usefulness of the Rutherford backscattering (RBS) technique to obtain the depth profiles of the elements in the upper micron of the sample is discussed.

Relevance:

100.00%

Publisher:

Abstract:

Oxysterols (OS), the polyoxygenated sterols, represent a class of potent regulatory molecules with important biological actions. Cytotoxicity of OS is one of the most important aspects in studies of OS bioactivities. However, studies, the structure-activity relationship (SAR) study in particular, have been hampered by the limited availability of structurally diverse OS in numbers and amounts. The aim of this project was to develop robust synthetic methods for the preparation of polyhydroxyl sterols, evaluate their cytotoxicity and establish structure-activity relationships. First, after syntheses and tests of a number of 7-HC analogues against cancer cell lines, we found that hydrophobicity of the side chain is essential for 7-HC's cytotoxicity, and that a limited number of hydroxyl groups and a desired configuration on the A and B rings are required for potent cytotoxicity of an OS. Then polyoxygenation of the cholesterol A and B rings was explored. A preparative method for the synthesis of four diastereomerically pure cholest-4-ene-3,6-diols was developed. Epoxidation of these cholest-4-ene-3,6-diols showed that an allyl group exerts an auxiliary role in producing products with the desired configuration in syntheses of the eight diastereomerically pure 4,5-epoxycholestane-3,6-diols. Reduction of the eight 4,5-epoxycholestane-3,6-diols produced all eight isomers of the cytotoxic 5α-cholestane-3β,5,6β-triol (CT) for the first time. Epoxide ring opening with protic or Lewis acids on the eight 4,5-epoxycholestane-3,6-diols was carefully studied. The results demonstrated that the combination of an acid and a solvent affected the outcome of a reaction dramatically. Acyl group participation and migration play an important role with a number of substrates under certain conditions. All eight 4,5-trans-cholestane-3,4,5,6-tetrols were synthesised through manipulation of acyl participation. Furthermore, these reaction conditions were tested when a number of cholestane-3,4,5,6,7-pentols and other C3-C7 oxygenated sterols were synthesised for the first time. Introduction of an oxygenated functional group through cholest-2-ene derivatives was studied. The elimination of 3-(4-toluenesulfonate) esters showed that interaction between the existing hydroxyl or acyl groups and the reaction centre often resulted in different products. Allyl oxidation, epoxidation and epoxide ring-opening reactions were investigated with these cholest-2-enes.

Relevance:

100.00%

Publisher:

Abstract:

Surface deposition of dense aerosol particles is of major concern in the nuclear industry for safety assessment. This study presents theoretical investigations and computer simulations of single gas-borne U3O8 particles impacting with in-reactor surfaces, and of the fragmentation of small agglomerates. A theoretical model for elasto-plastic spheres has been developed and used to analyse the force-displacement and force-time relationships. The impulse equations, based on Newton's second law, are applied to govern the tangential bouncing behaviour. The theoretical model is then incorporated into the Distinct Element Method code TRUBAL in order to perform computer simulated tests of particle collisions. A comparison of simulated results with both theoretical predictions and experimental measurements is provided. For oblique impacts, the results in terms of the force-displacement relationship, coefficients of restitution, trajectory of the impacting particle, and distribution of kinetic energy and work done during the process of impact are presented. The effects of Poisson's ratio, friction, plastic deformation and initial particle rotation on the bouncing behaviour are also discussed. In the presence of adhesion an elasto-plastic collision model, which is an extension to the JKR theory, is developed. Based on an energy balance equation the critical sticking velocity is obtained. For oblique collisions computer simulated results are used to establish a set of criteria determining whether or not the particle bounces off the target plate. For impact velocities above the critical sticking value, computer simulated results for the coefficients of restitution and rebound angles of the particle are presented. Computer simulations of fracture/fragmentation resulting from agglomerate-wall impact have also been performed, where two randomly generated agglomerates (one monodisperse, the other polydisperse), each consisting of 50 primary particles, are used.
The effects of impact angle, local structural arrangements close to the impact point, and plastic deformation at the contacts on agglomerate damage are examined. The simulated results show a significant difference in agglomerate strength between the two assemblies. The computer data also shows that agglomerate damage resulting from an oblique impact is determined by the normal velocity component rather than the impact speed.
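The critical sticking velocity mentioned above follows from an energy balance. The sketch below uses a deliberately simplified balance (rebound kinetic energy e²·½mv² weighed against a lumped adhesion work W), not the thesis's full JKR-based elasto-plastic model, and all parameter values are hypothetical.

```python
from math import sqrt

def critical_sticking_velocity(mass, adhesion_work, restitution):
    """Impact speed below which the particle sticks: the kinetic energy
    surviving the bounce (e^2 * KE_in) cannot pay the adhesion work W
    needed to detach. Simplified balance, not the full JKR-based model."""
    return sqrt(2 * adhesion_work / (mass * restitution ** 2))

def bounces(mass, velocity, adhesion_work, restitution):
    """True if the particle rebounds rather than sticking to the wall."""
    return velocity > critical_sticking_velocity(mass, adhesion_work, restitution)

# Hypothetical particle: m = 1e-9 kg, W = 5e-10 J, e = 0.8.
v_crit = critical_sticking_velocity(1e-9, 5e-10, 0.8)
```

Note the sense of the trend: stronger adhesion or greater restitution losses (smaller e) raise the velocity needed to escape capture, which is the behaviour the full model also predicts.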

Relevance:

100.00%

Publisher:

Abstract:

A study was made of the corrosion behaviour of two commercial AISI type 304L steels, in the as-received condition and after various heat treatments, in the ASTM standard nitric acid and oxalic acid tests. Optical microscopy and SEM, TEM and STEM, in conjunction with energy-dispersive X-ray analysis, were used to correlate the corrosion behaviour of these steels with their microstructure. Some evidence of phosphorus segregation at grain boundaries was found. The corrosion behaviour at the microstructural level was studied by examining in the TEM thin foils of steel that had been exposed to boiling nitric acid. Banding attack in the nitric acid and oxalic acid tests was studied using SEM and EPMA and found to be due to the micro-segregation of chromium and nickel. Using two experimental series of 304L, one a 17% Cr, 9% Ni steel with phosphorus additions from 0.006% to 0.028%, the other a 20% Cr, 12% Ni steel with boron additions from 0.0011 to 0.00B51, the effect of these elements on corrosion in the nitric acid test was studied. The effect of different cooling rates and different solution treatment temperatures on the behaviour of these steels was examined. TEM and STEM in conjunction with energy-dispersive X-ray analysis were again used to study the microstructure of the steels. Phosphorus was found to affect the corrosion behaviour but no effect was found with boron.

Relevance:

100.00%

Publisher:

Abstract:

Many tests of financial contagion require a definition of the dates separating calm from crisis periods. We propose to use a battery of break search procedures for individual time series to objectively identify potential break dates in relationships between countries. Applied to the biggest European stock markets and combined with two well established tests for financial contagion, this approach results in break dates which correctly identify the timing of changes in cross-country transmission mechanisms. Application of break search procedures breathes new life into the established contagion tests, allowing for an objective, data-driven timing of crisis periods.
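One of the simplest break-search procedures that such a battery might contain is a least-squares search for a single mean shift. The sketch below is illustrative only, run on a hypothetical series rather than the paper's data or its actual procedures.

```python
def best_break(series):
    """Single mean-shift break search: choose the split point that minimizes
    the total within-segment sum of squared deviations from the segment means."""
    def sse(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs)

    best_t, best_cost = None, float("inf")
    for t in range(2, len(series) - 1):  # keep at least 2 observations per segment
        cost = sse(series[:t]) + sse(series[t:])
        if cost < best_cost:
            best_t, best_cost = t, cost
    return best_t

# Hypothetical cross-country correlation series jumping from ~0.2 to ~0.8 at t = 10.
break_at = best_break([0.2] * 10 + [0.8] * 10)
```

The estimated break date then defines the calm and crisis subsamples fed into the contagion tests, replacing an ad hoc choice of crisis dates with a data-driven one.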

Relevance:

100.00%

Publisher:

Abstract:

Purpose – This paper attempts to seek answers to four questions. Two of these questions have been borrowed (but adapted) from the work of Defee et al.: RQ1. To what extent is theory used in purchasing and supply chain management (P&SCM) research? RQ2. What are the prevalent theories to be found in P&SCM research? Following on from these questions an additional question is posed: RQ3. Are theory-based papers more highly cited than papers with no theoretical foundation? Finally, drawing on the work of Harland et al., the authors have added a fourth question: RQ4. To what extent does P&SCM meet the tests of coherence, breadth and depth, and quality necessary to make it a scientific discipline? Design/methodology/approach – A systematic literature review was conducted in accordance with the model outlined by Tranfield et al. for three journals within the field of "purchasing and supply chain management". In total 1,113 articles were reviewed. In addition, a citation analysis was completed covering 806 articles in total. Findings – The headline features from the results suggest that nearly a decade-and-a-half on from its development, the field still lacks coherence. Theory is absent from much of the work, and although theory-based articles achieved on average a higher number of citations than non-theoretical papers, there is no obvious contender as an emergent paradigm for the discipline. Furthermore, it is evident that P&SCM does not meet Fabian's test necessary to make it a scientific discipline and is still some way from being a normal science. Research limitations/implications – This study would have benefited from the analysis of further journals; however, the analysis of 1,113 articles from three leading journals in the field of P&SCM was deemed sufficient in scope. In addition, a further significant line of enquiry to follow is the rigour vs relevance debate.
Practical implications – This article is of interest to both an academic and a practitioner audience as it highlights the use of theories in P&SCM. Furthermore, this article raises a number of important questions: should research in this area draw more heavily on theory and, if so, which theories are appropriate? Social implications – The broader social implications relate to the discussion of how a scientific discipline develops, and build on the work of Fabian and Amundson. Originality/value – The data set for this study is significant and builds on a number of previous literature reviews. This review is both greater in scope than previous reviews and broader in its subject focus. In addition, the citation analysis (not previously conducted in any of the reviews) and statistical test highlight that theory-based articles are more highly cited than non-theoretically based papers. This could indicate that researchers are attempting to build on one another's work.

Relevance:

100.00%

Publisher:

Abstract:

Purpose: Phonological accounts of reading implicate three aspects of phonological awareness tasks that underlie the relationship with reading; a) the language-based nature of the stimuli (words or nonwords), b) the verbal nature of the response, and c) the complexity of the stimuli (words can be segmented into units of speech). Yet, it is uncertain which task characteristics are most important as they are typically confounded. By systematically varying response-type and stimulus complexity across speech and non-speech stimuli, the current study seeks to isolate the characteristics of phonological awareness tasks that drive the prediction of early reading. Method: Four sets of tasks were created; tone stimuli (simple non-speech) requiring a non-verbal response, phonemes (simple speech) requiring a non-verbal response, phonemes requiring a verbal response, and nonwords (complex speech) requiring a verbal response. Tasks were administered to 570 2nd grade children along with standardized tests of reading and non-verbal IQ. Results: Three structural equation models comparing matched sets of tasks were built. Each model consisted of two 'task' factors with a direct link to a reading factor. The following factors predicted unique variance in reading: a) simple speech and non-speech stimuli, b) simple speech requiring a verbal response but not simple speech requiring a non-verbal-response, and c) complex and simple speech stimuli. Conclusions: Results suggest that the prediction of reading by phonological tasks is driven by the verbal nature of the response and not the complexity or 'speechness' of the stimuli. Findings highlight the importance of phonological output processes to early reading.