15 results for MELTING CURVE ANALYSIS

in Aston University Research Archive


Relevance: 80.00%

Abstract:

Purpose: To develop a standardized questionnaire of near visual function and satisfaction to complement visual function evaluations of presbyopic corrections. Setting: Eye Clinic, School of Life and Health Sciences, Aston University, Midland Eye Institute and Solihull Hospital, Birmingham, United Kingdom. Design: Questionnaire development. Methods: A preliminary 26-item questionnaire of previously used near visual function items was completed by patients with monofocal intraocular lenses (IOLs), multifocal IOLs, accommodating IOLs, multifocal contact lenses, or varifocal spectacles. Rasch analysis was used for item reduction, after which internal and test–retest reliabilities were determined. Construct validity was determined by correlating the resulting Near Activity Visual Questionnaire (NAVQ) scores with near visual acuity and critical print size (CPS), which was measured using the Minnesota Low Vision Reading Test chart. Discrimination ability was assessed through receiver-operating characteristic (ROC) curve analysis. Results: One hundred fifty patients completed the questionnaire. Item reduction resulted in a 10-item NAVQ with excellent separation (2.92), internal consistency (Cronbach α = 0.95), and test–retest reliability (intraclass correlation coefficient = 0.72). Correlations of questionnaire scores with near visual acuity (r = 0.32) and CPS (r = 0.27) provided evidence of validity, and discrimination ability was excellent (area under ROC curve = 0.91). Conclusion: Results show the NAVQ is a reliable, valid instrument that can be incorporated into the evaluation of presbyopic corrections.
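
As a concrete illustration of the discrimination analysis described above, the sketch below computes an area under the ROC curve and a candidate cut-off for questionnaire scores split by a binary criterion. The scores, group sizes and grouping variable are invented, so this is only an analogue of the published analysis, not the authors' code.

```python
# Hedged sketch: ROC analysis of questionnaire scores against a binary outcome.
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

rng = np.random.default_rng(0)
scores = np.concatenate([rng.normal(-1.0, 1.0, 75),    # group reporting good near function (invented)
                         rng.normal(1.0, 1.0, 75)])    # group reporting poor near function (invented)
poor_function = np.concatenate([np.zeros(75), np.ones(75)]).astype(int)

auc = roc_auc_score(poor_function, scores)              # discrimination ability
fpr, tpr, thresholds = roc_curve(poor_function, scores)
best = np.argmax(tpr - fpr)                             # Youden's J as one way to pick a cut-off
print(f"AUC = {auc:.2f}, suggested cut-off score = {thresholds[best]:.2f}")
```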

Relevance: 80.00%

Abstract:

Purpose: To develop a questionnaire that subjectively assesses near visual function in patients with 'accommodating' intraocular lenses (IOLs). Methods: A literature search of existing vision-related quality-of-life instruments identified all questions relating to near visual tasks. Questions were combined if repeated in multiple instruments. Further relevant questions were added and item interpretation confirmed through multidisciplinary consultation and focus groups. A preliminary 19-item questionnaire was presented to 22 subjects at their 4-week visit post first eye phacoemulsification with 'accommodative' IOL implantation, and again 6 and 12 weeks post-operatively. Rasch Analysis, Frequency of Endorsement, and tests of normality (skew and kurtosis) were used to reduce the instrument. Cronbach's alpha and test-retest reliability (intraclass correlation coefficient, ICC) were determined for the final questionnaire. Construct validity was assessed by Pearson's product moment correlation (PPMC) of questionnaire scores to reading acuity (RA) and to Critical Print Size (CPS) reading speed. Criterion validity was assessed by receiver operating characteristic (ROC) curve analysis, and dimensionality of the questionnaire was assessed by factor analysis. Results: Rasch Analysis eliminated nine items due to poor fit statistics. The final items have good separation (2.55), internal consistency (Cronbach's α = 0.97) and test-retest reliability (ICC = 0.66). PPMC of questionnaire scores with RA was 0.33, and with CPS reading speed was 0.08. Area under the ROC curve was 0.88 and Factor Analysis revealed one principal factor. Conclusion: The pilot data indicate that the questionnaire is an internally consistent, reliable and valid instrument that could be useful for assessing near visual function in patients with 'accommodating' IOLs. The questionnaire will now be expanded to include other types of presbyopic correction. © 2007 British Contact Lens Association.
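
For two of the reliability statistics named above, the following sketch computes Cronbach's alpha and a one-way, single-measures intraclass correlation coefficient on an invented item-response matrix (22 subjects by 10 items). It illustrates the formulas only and makes no claim about the study's data or software.

```python
# Hedged sketch: internal consistency (Cronbach's alpha) and test-retest ICC.
import numpy as np

rng = np.random.default_rng(6)
ability = rng.normal(size=(22, 1))
responses = ability + rng.normal(scale=0.5, size=(22, 10))   # invented item scores

# Cronbach's alpha from item variances and the variance of the total score.
k = responses.shape[1]
alpha = (k / (k - 1)) * (1 - responses.var(axis=0, ddof=1).sum()
                         / responses.sum(axis=1).var(ddof=1))

# ICC(1,1) from a one-way ANOVA of two administrations of the total score.
test = responses.sum(axis=1)
retest = test + rng.normal(scale=1.0, size=22)                # invented second administration
pairs = np.column_stack([test, retest])
ms_between = 2 * pairs.mean(axis=1).var(ddof=1)               # between-subjects mean square
ms_within = ((pairs - pairs.mean(axis=1, keepdims=True)) ** 2).sum() / pairs.shape[0]
icc = (ms_between - ms_within) / (ms_between + ms_within)
print(f"Cronbach's alpha = {alpha:.2f}, ICC = {icc:.2f}")
```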

Relevance: 40.00%

Abstract:

Following a scene-setting introduction are detailed reviews of the relevant scientific principles, thermal analysis as a research tool and the development of the zinc-aluminium family of alloys. A recently introduced simultaneous thermal analyser, the STA 1500, its use for differential thermal analysis (DTA) being central to the investigation, is described, together with the sources of support information: chemical analysis, scanning electron microscopy, ingot cooling curves and fluidity spiral castings. The compositions of alloys tested were from the binary zinc-aluminium system, the ternary zinc-aluminium-silicon system at 30%, 50% and 70% aluminium levels, binary and ternary alloys with additions of copper and magnesium to simulate commercial alloys and five widely used commercial alloys. Each alloy was shotted to provide the smaller, 100 mg, representative sample required for DTA. The STA 1500 was characterised and calibrated with commercially pure zinc, and an experimental procedure established for the determination of DTA heating curves at 10°C per minute and cooling curves at 2°C per minute. Phase change temperatures were taken from DTA traces, most importantly, liquidus from a cooling curve and solidus from both heating and cooling curves. The accepted zinc-aluminium binary phase diagram was endorsed with the added detail that the eutectic is at 5.2% aluminium rather than 5.0%. The ternary eutectic trough was found to run through the points 70% Al, 7.1% Si, 545°C; 50% Al, 3.9% Si, 520°C; 30% Al, 1.4% Si, 482°C. The dendrite arm spacing in samples after DTA increased with increasing aluminium content, from 130 μm at 30% to 220 μm at 70%. The smallest dendrite arm spacing of 60 μm was in the 30% aluminium 2% silicon alloy. A 1 kg ingot of the 10% aluminium binary alloy, insulated with Kaowool, solidified at the same 2°C per minute rate as the DTA samples. A similar sized sand casting was solidified at 3°C per minute and a chill casting at 27°C per minute. During metallographic examination the following features were observed: heavily cored phase which decomposed into ' and '' on cooling; needles of the intermetallic phase FeAl4; copper-containing ternary eutectic and copper-rich T phase.

Relevance: 30.00%

Abstract:

In this paper we introduce and illustrate non-trivial upper and lower bounds on the learning curves for one-dimensional Gaussian Processes. The analysis is carried out emphasising the effects induced on the bounds by the smoothness of the random process described by the Modified Bessel and the Squared Exponential covariance functions. We present an explanation of the early, linearly-decreasing behavior of the learning curves and the bounds as well as a study of the asymptotic behavior of the curves. The effects of the noise level and the lengthscale on the tightness of the bounds are also discussed.
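
To make the quantity being bounded concrete, the sketch below estimates a one-dimensional GP learning curve by Monte Carlo: the generalization error is taken as the posterior variance of the latent function averaged over test inputs and random training sets, here for a squared exponential covariance. The lengthscale, noise level and uniform input density are arbitrary assumptions for illustration; the code does not reproduce the paper's bounds.

```python
# Hedged sketch: Monte Carlo learning curve for a 1-D GP with SE covariance.
import numpy as np

def se_cov(x1, x2, lengthscale=0.1, signal_var=1.0):
    d = x1[:, None] - x2[None, :]
    return signal_var * np.exp(-0.5 * (d / lengthscale) ** 2)

def learning_curve(ns, noise_var=0.01, trials=30, n_test=200, seed=0):
    rng = np.random.default_rng(seed)
    x_test = np.linspace(0.0, 1.0, n_test)
    curve = []
    for n in ns:
        errs = []
        for _ in range(trials):
            x_train = rng.uniform(0.0, 1.0, n)
            K = se_cov(x_train, x_train) + noise_var * np.eye(n)
            k_star = se_cov(x_test, x_train)
            # Posterior variance at each test input; averaging it gives the
            # expected generalization error when the prior matches the data.
            post_var = 1.0 - np.sum(k_star * np.linalg.solve(K, k_star.T).T, axis=1)
            errs.append(post_var.mean())
        curve.append(np.mean(errs))
    return np.array(curve)

print(learning_curve([1, 2, 5, 10, 20, 50]))   # decreases roughly linearly at first
```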

Relevance: 30.00%

Abstract:

1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry.
2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance and the objective may be to provide a 'curve of best fit' for predictive purposes. In such cases, the fitting of a general polynomial type curve may be the best approach.
3. An investigator may have a specific model in mind that relates Y to X and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves.
4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required (see the sketch below).
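
A minimal sketch of points 2-4 above, on invented data: a general polynomial 'curve of best fit', an exponential decay reduced to linear regression by a log transform, and the same model fitted directly by non-linear least squares.

```python
# Hedged sketch of the curve-fitting options discussed above (invented data).
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
x = np.linspace(0.5, 10, 40)
y = 5.0 * np.exp(-0.4 * x) + rng.normal(0, 0.05, x.size)

# 1. General polynomial curve of best fit, purely for prediction.
poly = np.polynomial.Polynomial.fit(x, y, deg=3)

# 2. Exponential decay reduced to linear regression: ln(y) = ln(a) - b*x.
mask = y > 0
slope, intercept = np.polyfit(x[mask], np.log(y[mask]), 1)
a_lin, b_lin = np.exp(intercept), -slope

# 3. The same model fitted by non-linear least squares estimation.
(a_nl, b_nl), _ = curve_fit(lambda x, a, b: a * np.exp(-b * x), x, y, p0=(1.0, 1.0))
print(a_lin, b_lin, a_nl, b_nl)
```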

Relevance: 30.00%

Abstract:

The aim of this research was to improve the quantitative support to project planning and control principally through the use of more accurate forecasting for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles, allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
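
As a toy illustration of growth-curve forecasting only (not the DHSS cumulative cubic model and not the new techniques developed in this work), the sketch below fits a cubic to a partially observed cumulative cost profile and projects it to completion; all cost figures are invented.

```python
# Hedged sketch: project a cumulative cost profile forward with a cubic fit.
import numpy as np

t_obs = np.linspace(0.0, 0.6, 13)                        # fraction of contract period elapsed
cost_obs = 100 * (3 * t_obs**2 - 2 * t_obs**3)           # cumulative cost, % of budget (invented)
cost_obs += np.random.default_rng(2).normal(0, 1.0, t_obs.size)

coeffs = np.polyfit(t_obs, cost_obs, 3)                  # fitted cubic growth curve
forecast_at_completion = np.polyval(coeffs, 1.0)
print(f"forecast cost at completion: {forecast_at_completion:.1f}% of budget")
```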

Relevance: 30.00%

Abstract:

The project set out to investigate the relative effectiveness of thermal conductive heating (from external resistance heaters) and viscous heating in the heating (and melting) of low density polyethylene. A model system was used in order to simplify the mathematical analysis. A theory was developed to describe both processes in the model apparatus. The results showed large differences between the experimental and predicted results at low melt temperatures (the predicted results were much greater than the experimental). Analysis of the results indicated that the apparatus was probably not producing the required shear rates in the sample. The theory appeared to be satisfactory, in that it did not overestimate the viscous heating to any significant extent. The theoretical results could therefore be considered to be a reasonable estimate of the viscous heating.

Relevance: 30.00%

Abstract:

The extent to which the surface parameters of Progressive Addition Lenses (PALs) affect successful patient tolerance was investigated. Several optico-physical evaluation techniques were employed, including a newly constructed surface reflection device which was shown to be of value for assessing semi-finished PAL blanks. Detailed physical analysis was undertaken using a computer-controlled focimeter and from these data, iso-cylindrical and mean spherical plots were produced for each PAL studied. Base curve power was shown to have little impact upon the distribution of PAL astigmatism. A power increase in reading addition primarily caused a lengthening and narrowing of the lens progression channel. Empirical measurements also indicated a marginal steepening of the progression power gradient with an increase in reading addition power. A sample of the PAL-wearing population was studied using patient records and questionnaire analysis (90% of questionnaires were returned). This subjective analysis revealed the reading portion to be the most troublesome lens zone and showed that patients with high astigmatism (> 2.00D) adapt more readily to PALs than those with spherical or low cylindrical (≤ 2.00D) corrections. The psychophysical features of PALs were then investigated. Both grating visual acuity (VA) and contrast sensitivity (CS) were shown to be reduced with an increase in eccentricity from the central umbilical line. Two sample populations (N = 20) of successful and unsuccessful PAL wearers were assessed for differences in their visual performance and their adaptation to optically induced distortion. The possibility of dispensing errors being the cause of poor patient tolerance amongst the unsuccessful wearer group was investigated and discounted. The contrast sensitivity of the successful group was significantly greater than that of the unsuccessful group. No differences in adaptation to or detection of curvature distortion were evinced between these presbyopic groups.

Relevance: 30.00%

Abstract:

The trend in modal extraction algorithms is to use all the available frequency response functions data to obtain a global estimate of the natural frequencies, damping ratio and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computer power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of transputers. The Rational Fraction Polynomial Method is a well known and robust frequency domain 'curve fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.

Relevance: 30.00%

Abstract:

The use of quantitative methods has become increasingly important in the study of neuropathology and especially in neurodegenerative disease. Disorders such as Alzheimer's disease (AD) and the frontotemporal dementias (FTD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This chapter reviews the advantages and limitations of the different methods of quantifying pathological lesions in histological sections including estimates of density, frequency, coverage, and the use of semi-quantitative scores. The sampling strategies by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are described. In addition, data analysis methods commonly used to analyse quantitative data in neuropathology, including analysis of variance (ANOVA), polynomial curve fitting, multiple regression, classification trees, and principal components analysis (PCA), are discussed. These methods are illustrated with reference to quantitative studies of a variety of neurodegenerative disorders.
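
Two of the simpler techniques listed above, quadrat-based density estimation and a polynomial curve fit to a density profile, can be sketched as follows; the counts, areas and distances are invented and purely illustrative.

```python
# Hedged sketch: lesion density from quadrat counts and a polynomial profile fit.
import numpy as np

quadrat_area_mm2 = 0.25
counts = np.array([3, 5, 2, 4, 6, 3, 5, 4])              # lesions per quadrat (invented)
density = counts.mean() / quadrat_area_mm2               # mean lesions per mm^2
print(f"estimated density: {density:.1f} lesions/mm^2")

# Lesion density at successive distances from the pia (invented values),
# summarised with a second-order polynomial curve fit.
distance_um = np.arange(0, 500, 50)
profile = np.array([4, 8, 14, 18, 20, 19, 16, 12, 7, 4], dtype=float)
print("fitted polynomial coefficients:", np.polyfit(distance_um, profile, 2))
```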

Relevance: 30.00%

Abstract:

The aim of this research was to investigate the molecular interactions occurring in the formulation of non-ionic surfactant-based vesicles composed of monopalmitoyl glycerol (MPG), cholesterol (Chol) and dicetyl phosphate (DCP). In the formulation of these vesicles, the thermodynamic attributes and surfactant interactions were investigated using molecular dynamics, Langmuir monolayer studies, differential scanning calorimetry (DSC), hot stage microscopy and thermogravimetric analysis (TGA). Initially the melting points of the components individually, and combined at a 5:4:1 MPG:Chol:DCP weight ratio, were investigated; the results show that lower (90 °C) than previously reported (120-140 °C) temperatures could be adopted to produce molten surfactants for the production of niosomes. This was advantageous for surfactant stability; whilst TGA studies show that the individual components were stable to above 200 °C, the 5:4:1 MPG:Chol:DCP mixture showed ∼2% surfactant degradation at 140 °C, compared to the 0.01% measured at 90 °C. Niosomes formed at this lower temperature offered comparable characteristics to vesicles prepared using higher temperatures commonly reported in literature. In the formation of niosome vesicles, cholesterol also played a key role. Langmuir monolayer studies demonstrated that intercalation of cholesterol in the monolayer did not occur in the MPG:Chol:DCP (5:4:1 weight ratio) mixture. This suggests cholesterol may support bilayer assembly, with molecular simulation studies also demonstrating that vesicles cannot be built without the addition of cholesterol, with higher concentrations of cholesterol (5:4:1 vs 5:2:1, MPG:Chol:DCP) decreasing the time required for niosome assembly. © 2013 Elsevier B.V.

Relevance: 30.00%

Abstract:

Purpose: To determine whether curve-fitting analysis of the ranked segment distributions of topographic optic nerve head (ONH) parameters, derived using the Heidelberg Retina Tomograph (HRT), provides a more effective statistical descriptor to differentiate the normal from the glaucomatous ONH. Methods: The sample comprised 22 normal control subjects (mean age 66.9 years; S.D. 7.8) and 22 glaucoma patients (mean age 72.1 years; S.D. 6.9) confirmed by reproducible visual field defects on the Humphrey Field Analyser. Three 10°-images of the ONH were obtained using the HRT. The mean topography image was determined and the HRT software was used to calculate the rim volume, rim area to disc area ratio, normalised rim area to disc area ratio and retinal nerve fibre cross-sectional area for each patient at 10°-sectoral intervals. The values were ranked in descending order, and each ranked-segment curve of ordered values was fitted using the least squares method. Results: There was no difference in disc area between the groups. The group mean cup-disc area ratio was significantly lower in the normal group (0.204 ± 0.16) compared with the glaucoma group (0.533 ± 0.083) (p < 0.001). The visual field indices, mean deviation and corrected pattern S.D., were significantly greater (p < 0.001) in the glaucoma group (-9.09 dB ± 3.3 and 7.91 ± 3.4, respectively) compared with the normal group (-0.15 dB ± 0.9 and 0.95 dB ± 0.8, respectively). Univariate linear regression provided the best overall fit to the ranked segment data. The equation parameters of the regression line manually applied to the normalised rim area-disc area and the rim area-disc area ratio data correctly classified 100% of normal subjects and glaucoma patients. In this study sample, the regression analysis of ranked segment parameters method was more effective than conventional ranked segment analysis, in which glaucoma patients were misclassified in approximately 50% of cases. Further investigation in larger samples will enable the calculation of confidence intervals for normality. These reference standards will then need to be investigated for an independent sample to fully validate the technique. Conclusions: Using a curve-fitting approach to fit ranked segment curves retains information relating to the topographic nature of neural loss. Such methodology appears to overcome some of the deficiencies of conventional ranked segment analysis, and subject to validation in larger scale studies, may potentially be of clinical utility for detecting and monitoring glaucomatous damage. © 2007 The College of Optometrists.
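
The ranked-segment curve-fitting idea can be sketched as follows: sector values are ranked in descending order and a straight line is fitted to the ranked curve, its slope and intercept then serving as candidate descriptors. The 10° sector data for the two example eyes are invented, and the classification step used in the study is not reproduced here.

```python
# Hedged sketch: linear fit to ranked 10-degree sector values of an ONH parameter.
import numpy as np

def ranked_segment_fit(sector_values):
    """Fit a line to sector values ranked in descending order."""
    ranked = np.sort(np.asarray(sector_values, dtype=float))[::-1]
    ranks = np.arange(1, ranked.size + 1)
    slope, intercept = np.polyfit(ranks, ranked, 1)      # univariate linear regression
    return slope, intercept

rng = np.random.default_rng(3)
normal_eye = 0.8 + 0.05 * rng.normal(size=36)                          # invented sector values
glaucoma_eye = np.linspace(0.7, 0.1, 36) + 0.05 * rng.normal(size=36)  # focal loss pattern (invented)

for label, eye in [("normal", normal_eye), ("glaucoma", glaucoma_eye)]:
    slope, intercept = ranked_segment_fit(eye)
    print(label, round(slope, 3), round(intercept, 3))
```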

Relevance: 30.00%

Abstract:

Consideration of the influence of test technique and data analysis method is important for data comparison and design purposes. The paper highlights the effects of replication interval, crack growth rate averaging and curve-fitting procedures on crack growth rate results for a Ni-base alloy. It is shown that an upper bound crack growth rate line is not appropriate for use in fatigue design, and that the derivative of a quadratic fit to the a vs N data looks promising. However, this type of averaging, or curve fitting, is not useful in developing an understanding of microstructure/crack tip interactions. For this purpose, simple replica-to-replica growth rate calculations are preferable. © 1988.
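
A small numerical illustration of the approach favoured above: fit a quadratic to crack length a versus cycles N and differentiate the fit to obtain a smoothed da/dN, alongside simple replica-to-replica (secant) rates; the a-N values are invented.

```python
# Hedged sketch: smoothed da/dN from a quadratic fit to a vs N (invented data).
import numpy as np

N = np.array([0, 10_000, 20_000, 30_000, 40_000, 50_000], dtype=float)  # cycles
a = np.array([0.20, 0.26, 0.35, 0.48, 0.66, 0.92])                      # crack length, mm

quad = np.polyfit(N, a, 2)
dadN_smooth = np.polyval(np.polyder(quad), N)       # smoothed growth rate from the fitted curve
dadN_secant = np.diff(a) / np.diff(N)               # replica-to-replica growth rates
print(dadN_smooth)
print(dadN_secant)
```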

Relevance: 30.00%

Abstract:

The relationship between sleep apnoea–hypopnoea syndrome (SAHS) severity and the regularity of nocturnal oxygen saturation (SaO2) recordings was analysed. Three different methods were proposed to quantify regularity: approximate entropy (AEn), sample entropy (SEn) and kernel entropy (KEn). A total of 240 subjects suspected of suffering from SAHS took part in the study. They were randomly divided into a training set (96 subjects) and a test set (144 subjects) for the adjustment and assessment of the proposed methods, respectively. According to the measurements provided by AEn, SEn and KEn, higher irregularity of oximetry signals is associated with SAHS-positive patients. Receiver operating characteristic (ROC) and Pearson correlation analyses showed that KEn was the most reliable predictor of SAHS. It provided an area under the ROC curve of 0.91 in two-class classification of subjects as SAHS-negative or SAHS-positive. Moreover, KEn measurements from oximetry data exhibited a linear dependence on the apnoea–hypopnoea index, as shown by a correlation coefficient of 0.87. Therefore, these measurements could be used for the development of simplified diagnostic techniques in order to reduce the demand for polysomnographies. Furthermore, KEn represents a convincing alternative to AEn and SEn for the diagnostic analysis of noisy biomedical signals.
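
Of the three regularity measures, sample entropy is the simplest to sketch. The implementation below follows the usual SEn recipe (template length m, absolute tolerance r in the units of the signal) and compares an invented, regular SaO2-like trace with one containing random desaturations; it is not the study's code, and kernel entropy is not shown.

```python
# Hedged sketch: sample entropy (SEn) of two invented oximetry-like traces.
import numpy as np

def sample_entropy(x, m=1, r=0.5):
    """SEn with template length m and absolute tolerance r (here in % SaO2)."""
    x = np.asarray(x, dtype=float)
    n_templates = len(x) - m                 # same number of templates for m and m + 1

    def similar_pairs(length):
        # Embed the signal and count template pairs that agree to within r
        # under the Chebyshev (maximum) distance, excluding self-matches.
        emb = np.array([x[i:i + n_templates] for i in range(length)]).T
        dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=-1)
        return np.sum(dist <= r) - n_templates

    B, A = similar_pairs(m), similar_pairs(m + 1)
    return -np.log(A / B)

rng = np.random.default_rng(5)
t = np.arange(600.0)
regular = 97 + 0.5 * np.sin(2 * np.pi * t / 150) + 0.05 * rng.normal(size=600)
irregular = 97 + 0.6 * rng.normal(size=600) - 4.0 * (rng.random(600) < 0.05)
# The more irregular trace (noise plus random desaturations) gives the higher SEn.
print(sample_entropy(regular), sample_entropy(irregular))
```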

Relevance: 30.00%

Abstract:

The representation of serial position in sequences is an important topic in a variety of cognitive areas including the domains of language, memory, and motor control. In the neuropsychological literature, serial position data have often been normalized across different lengths, and an improved procedure for this has recently been reported by Machtynger and Shallice (2009). Effects of length and a U-shaped normalized serial position curve have been criteria for identifying working memory deficits. We present simulations and analyses to illustrate some of the issues that arise when relating serial position data to specific theories. We show that critical distinctions are often difficult to make based on normalized data. We suggest that curves for different lengths are best presented in their raw form and that binomial regression can be used to answer specific questions about the effects of length, position, and linear or nonlinear shape that are critical to making theoretical distinctions. © 2010 Psychology Press.
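
The suggestion to analyse raw serial position data with binomial regression could look roughly like the sketch below, which models counts of correct recalls per position with list length plus linear and quadratic position terms as predictors (statsmodels formula API; all counts are invented).

```python
# Hedged sketch: binomial regression of raw serial position counts.
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rows = []
for length in (4, 5, 6):
    for pos in range(1, length + 1):
        # Invented counts with primacy/recency shape and a cost of longer lists.
        p = 0.9 - 0.05 * length + 0.15 * ((pos == 1) + (pos == length))
        correct = int(round(p * 40))                     # 40 trials per cell (invented)
        rows.append({"length": length, "position": pos,
                     "correct": correct, "errors": 40 - correct})
data = pd.DataFrame(rows)

# Two-column binomial response (successes + failures) with length, position
# and a quadratic position term to capture a U-shaped curve.
model = smf.glm("correct + errors ~ length + position + I(position ** 2)",
                data=data, family=sm.families.Binomial()).fit()
print(model.params)
```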