30 results for Curve fitting

in Aston University Research Archive


Relevance:

100.00%

Abstract:

Aims: Previous data suggest heterogeneity in laminar distribution of the pathology in the molecular disorder frontotemporal lobar degeneration (FTLD) with transactive response (TAR) DNA-binding protein of 43 kDa (TDP-43) proteinopathy (FTLD-TDP). To study this heterogeneity, we quantified the changes in density across the cortical laminae of neuronal cytoplasmic inclusions, glial inclusions, neuronal intranuclear inclusions, dystrophic neurites, surviving neurones, abnormally enlarged neurones, and vacuoles in regions of the frontal and temporal lobe. Methods: Changes in density of histological features across cortical gyri were studied in 10 sporadic cases of FTLD-TDP using quantitative methods and polynomial curve fitting. Results: Our data suggest that laminar neuropathology in sporadic FTLD-TDP is highly variable. Most commonly, neuronal cytoplasmic inclusions, dystrophic neurites and vacuolation were abundant in the upper laminae and glial inclusions, neuronal intranuclear inclusions, abnormally enlarged neurones, and glial cell nuclei in the lower laminae. TDP-43-immunoreactive inclusions affected more of the cortical profile in longer duration cases; their distribution varied with disease subtype, but was unrelated to Braak tangle score. Different TDP-43-immunoreactive inclusions were not spatially correlated. Conclusions: Laminar distribution of pathological features in 10 sporadic cases of FTLD-TDP is heterogeneous and may be accounted for, in part, by disease subtype and disease duration. In addition, the feedforward and feedback cortico-cortical connections may be compromised in FTLD-TDP. © 2012 The Authors. Neuropathology and Applied Neurobiology © 2012 British Neuropathological Society.
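As a rough illustration of the polynomial curve-fitting step described above, the following Python sketch (NumPy assumed; all counts hypothetical, not data from the study) fits a low-order polynomial to a density profile measured across the cortical laminae and locates the depth at which the fitted density peaks.

```python
import numpy as np

# Hypothetical counts of neuronal cytoplasmic inclusions (NCI) per sample field
# at successive distances from the pia mater (illustrative values only).
depth_um = np.arange(50, 1050, 50)                 # distance across the cortex (um)
nci_count = np.array([2, 5, 9, 12, 11, 9, 7, 5, 4, 3,
                      3, 2, 2, 1, 1, 1, 0, 1, 0, 0])

# Fit a low-order polynomial to the density profile; the location of its
# maximum indicates whether the pathology peaks in the upper or lower laminae.
coeffs = np.polyfit(depth_um, nci_count, 3)
fine_depth = np.linspace(depth_um.min(), depth_um.max(), 500)
fitted = np.polyval(coeffs, fine_depth)
peak_depth = fine_depth[np.argmax(fitted)]
print(f"fitted density peaks at ~{peak_depth:.0f} um from the pia mater")
```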

Relevance:

80.00%

Abstract:

Non-linear relationships are common in microbiological research and often necessitate the use of the statistical techniques of non-linear regression or curve fitting. In some circumstances, the investigator may wish to fit an exponential model to the data, i.e., to test the hypothesis that a quantity Y either increases or decays exponentially with increasing X. This type of model is straightforward to fit, as taking logarithms of the Y variable linearises the relationship, which can then be treated by the methods of linear regression.
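A minimal sketch of this approach, assuming Python with NumPy and SciPy and using entirely hypothetical count data: taking logarithms of Y turns the exponential decay into a straight line that can be fitted by ordinary linear regression.

```python
import numpy as np
from scipy import stats

# Hypothetical data: viable counts (Y) declining with time (X) - illustrative only.
time_h = np.array([0, 2, 4, 6, 8, 10, 12])
count = np.array([1.0e6, 3.9e5, 1.6e5, 6.1e4, 2.4e4, 9.5e3, 4.1e3])

# Taking logarithms of Y linearises the exponential model Y = a * exp(b * X),
# since ln(Y) = ln(a) + b * X, which can be fitted by linear regression.
log_count = np.log(count)
result = stats.linregress(time_h, log_count)

a = np.exp(result.intercept)   # estimated initial count
b = result.slope               # estimated rate constant (per hour); negative for decay
print(f"Y = {a:.3g} * exp({b:.3f} * X), r^2 = {result.rvalue**2:.4f}")
```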

Relevance:

60.00%

Abstract:

PURPOSE: To determine whether letter sequences and/or lens-presentation order should be randomized when measuring defocus curves and to assess the most appropriate criterion for calculating the subjective amplitude of accommodation (AoA) from defocus curves. SETTING: Eye Clinic, School of Life & Health Sciences, Aston University, Birmingham, United Kingdom. METHODS: Defocus curves (from +3.00 diopters [D] to -3.00 D in 0.50 D steps) for 6 possible combinations of randomized or nonrandomized letter sequences and/or lens-presentation order were measured in a random order in 20 presbyopic subjects. Subjective AoA was calculated from the defocus curves by curve fitting using various published criteria, and each was correlated to subjective push-up AoA. Objective AoA was measured for comparison of blur tolerance and pupil size. RESULTS: Randomization of lens-presentation order and/or letter sequences, or the lack of it, did not affect the measured defocus curves (P>.05, analysis of variance). The range of defocus that maintains the highest achievable visual acuity (allowing for variability of repeated measurement) was better correlated with (r = 0.84) and agreed better with (±0.50 D) subjective push-up AoA than any other relative or absolute acuity criterion used in previous studies. CONCLUSIONS: Nonrandomized letters and lens presentation on their own did not affect subjective AoA measured by defocus curves, although their combination should be avoided. Quantification of subjective AoA from defocus curves should be standardized to the range of defocus that maintains the best achievable visual acuity.
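The acuity-range criterion favoured in the conclusions can be computed directly from a defocus curve. Below is a minimal sketch in Python (NumPy assumed); the defocus curve values and the 0.05 logMAR repeatability tolerance are hypothetical illustrations, not data from the study.

```python
import numpy as np

# Hypothetical defocus curve: visual acuity (logMAR) measured from +3.00 D to
# -3.00 D in 0.50 D steps (illustrative values only).
defocus_d = np.arange(3.0, -3.5, -0.5)
acuity_logmar = np.array([0.62, 0.48, 0.30, 0.14, 0.02, -0.04, -0.06,
                          -0.02, 0.08, 0.22, 0.38, 0.52, 0.66])

# Criterion from the abstract: the range of defocus that maintains the best
# achievable acuity, allowing for the variability of repeated measurement
# (here assumed to be 0.05 logMAR as an illustrative tolerance).
tolerance = 0.05
best = acuity_logmar.min()
within = defocus_d[acuity_logmar <= best + tolerance]
subjective_aoa = within.max() - within.min()
print(f"subjective AoA = {subjective_aoa:.2f} D")
```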

Relevance:

60.00%

Abstract:

1. The techniques associated with regression, whether linear or non-linear, are some of the most useful statistical procedures that can be applied in clinical studies in optometry. 2. In some cases, there may be no scientific model of the relationship between X and Y that can be specified in advance and the objective may be to provide a ‘curve of best fit’ for predictive purposes. In such cases, the fitting of a general polynomial type curve may be the best approach. 3. An investigator may have a specific model in mind that relates Y to X and the data may provide a test of this hypothesis. Some of these curves can be reduced to a linear regression by transformation, e.g., the exponential and negative exponential decay curves. 4. In some circumstances, e.g., the asymptotic curve or logistic growth law, a more complex process of curve fitting involving non-linear estimation will be required.
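For curves that cannot be linearised by a transformation (point 4), non-linear estimation is needed. A minimal sketch, assuming Python with NumPy and SciPy and using hypothetical data, fits the logistic growth law by iterative least squares.

```python
import numpy as np
from scipy.optimize import curve_fit

# Logistic growth law: Y = K / (1 + exp(-r * (X - x0))); with K unknown it
# cannot be reduced to a linear regression, so non-linear estimation is used.
def logistic(x, K, r, x0):
    return K / (1.0 + np.exp(-r * (x - x0)))

# Hypothetical observations (illustrative only).
x = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10], dtype=float)
y = np.array([0.4, 0.7, 1.4, 2.8, 4.9, 6.8, 8.3, 9.1, 9.6, 9.8, 9.9])

# Initial guesses for K, r and x0 are required; poor guesses can prevent convergence.
popt, pcov = curve_fit(logistic, x, y, p0=[10.0, 1.0, 4.0])
perr = np.sqrt(np.diag(pcov))   # approximate standard errors of the estimates
print("K, r, x0 =", popt, "+/-", perr)
```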

Relevance:

60.00%

Abstract:

A procedure has been developed which measures the settling velocity distribution of particles within a complete sewage sample. The development of the test method included observations of particle and liquid interaction using both synthetic media and sewage. Comparison studies with two other currently used settling velocity test procedures were undertaken. The method is suitable for use with either DWF or storm sewage. Information relating to the catchment characteristics of 35 wastewater treatment works was collected from the privatised water companies in England and Wales. Twenty-nine of these sites were used in an experimental programme to determine the settling velocity grading of 33 sewage samples. The collected data were analysed in an attempt to relate the settling velocity distribution to the characteristics of the contributing catchment. Statistical analysis of the catchment data and the measured settling velocity distributions was undertaken. A curve-fitting exercise using an S-shaped curve which had the same physical characteristics as the settling velocity distributions was performed. None of these analyses found evidence that the settling velocity distribution of sewage had a significant relationship with the chosen catchment characteristics. The regression equations produced from the statistical analysis cannot be used to assist in the design of separation devices. However, a grading curve envelope was produced, the limits of which were clearly defined for the measured data set. There was no evidence of a relationship between settling velocity grading and the characteristics of the contributing catchment, particularly the catchment area. The present empirical approach to settling tank design therefore cannot be improved upon by considering the variation in catchment parameters. This study has provided a basis for future research into settling velocity measurement and should be of benefit to future workers in this field.

Relevance:

60.00%

Abstract:

The trend in modal extraction algorithms is to use all the available frequency response function data to obtain a global estimate of the natural frequencies, damping ratios and mode shapes. Improvements in transducer and signal processing technology allow the simultaneous measurement of many hundreds of channels of response data. The quantity of data available and the complexity of the extraction algorithms make considerable demands on the available computer power and require a powerful computer or dedicated workstation to perform satisfactorily. An alternative to waiting for faster sequential processors is to implement the algorithm in parallel, for example on a network of Transputers. Parallel architectures are a cost-effective means of increasing computational power, and a larger number of response channels would simply require more processors. This thesis considers how two typical modal extraction algorithms, the Rational Fraction Polynomial method and the Ibrahim Time Domain method, may be implemented on a network of Transputers. The Rational Fraction Polynomial method is a well-known and robust frequency-domain 'curve fitting' algorithm. The Ibrahim Time Domain method is an efficient algorithm that 'curve fits' in the time domain. This thesis reviews the algorithms, considers the problems involved in a parallel implementation, and shows how they were implemented on a real Transputer network.

Relevance:

60.00%

Abstract:

The slope of the two-interval, forced-choice psychometric function (e.g. the Weibull parameter, β) provides valuable information about the relationship between contrast sensitivity and signal strength. However, little is known about how or whether β varies with stimulus parameters such as spatiotemporal frequency and stimulus size and shape. A second unresolved issue concerns the best way to estimate the slope of the psychometric function. For example, if an observer is non-stationary (e.g. their threshold drifts between experimental sessions), β will be underestimated if curve fitting is performed after collapsing the data across experimental sessions. We measured psychometric functions for 2 experienced observers for 14 different spatiotemporal configurations of pulsed or flickering grating patches and bars on each of 8 days. We found β ≈ 3 to be fairly constant across almost all conditions, consistent with a fixed nonlinear contrast transducer and/or a constant level of intrinsic stimulus uncertainty (e.g. a square-law transducer and a low level of intrinsic uncertainty). Our analysis showed that estimating a single β from results averaged over several experimental sessions was slightly more accurate than averaging multiple estimates from several experimental sessions. However, the small levels of non-stationarity (SD ≈ 0.8 dB) meant that the difference between the estimates was, in practice, negligible.
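A minimal sketch of fitting the two-interval forced-choice Weibull function, assuming Python with NumPy and SciPy; the contrast levels and proportions correct are hypothetical, not the study's data. Performance rises from 0.5 (guessing) towards 1.0, with α the threshold and β the slope discussed above.

```python
import numpy as np
from scipy.optimize import curve_fit

# Two-interval forced-choice Weibull: proportion correct rises from 0.5
# (guessing) towards 1.0 as contrast increases; alpha is the threshold,
# beta the slope of the psychometric function.
def weibull_2ifc(c, alpha, beta):
    return 1.0 - 0.5 * np.exp(-(c / alpha) ** beta)

# Hypothetical proportions correct at each contrast level (illustrative only).
contrast = np.array([0.5, 1, 2, 4, 8, 16, 32])      # percent contrast
p_correct = np.array([0.52, 0.55, 0.64, 0.81, 0.94, 0.99, 1.00])

popt, _ = curve_fit(weibull_2ifc, contrast, p_correct, p0=[4.0, 3.0])
alpha, beta = popt
print(f"threshold alpha = {alpha:.2f}%, slope beta = {beta:.2f}")
```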

Relevance:

60.00%

Abstract:

The use of quantitative methods has become increasingly important in the study of neuropathology and especially in neurodegenerative disease. Disorders such as Alzheimer's disease (AD) and the frontotemporal dementias (FTD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This chapter reviews the advantages and limitations of the different methods of quantifying pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semi-quantitative scores. The sampling strategies by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are described. In addition, data analysis methods commonly used to analyse quantitative data in neuropathology, including analysis of variance (ANOVA), polynomial curve fitting, multiple regression, classification trees, and principal components analysis (PCA), are discussed. These methods are illustrated with reference to quantitative studies of a variety of neurodegenerative disorders.

Relevance:

60.00%

Abstract:

Purpose: To determine whether curve-fitting analysis of the ranked segment distributions of topographic optic nerve head (ONH) parameters, derived using the Heidelberg Retina Tomograph (HRT), provides a more effective statistical descriptor to differentiate the normal from the glaucomatous ONH. Methods: The sample comprised 22 normal control subjects (mean age 66.9 years; S.D. 7.8) and 22 glaucoma patients (mean age 72.1 years; S.D. 6.9) confirmed by reproducible visual field defects on the Humphrey Field Analyser. Three 10°-images of the ONH were obtained using the HRT. The mean topography image was determined and the HRT software was used to calculate the rim volume, rim area to disc area ratio, normalised rim area to disc area ratio and retinal nerve fibre cross-sectional area for each patient at 10°-sectoral intervals. The values were ranked in descending order, and each ranked-segment curve of ordered values was fitted using the least squares method. Results: There was no difference in disc area between the groups. The group mean cup-disc area ratio was significantly lower in the normal group (0.204 ± 0.16) compared with the glaucoma group (0.533 ± 0.083) (p < 0.001). The visual field indices, mean deviation and corrected pattern S.D., were significantly greater (p < 0.001) in the glaucoma group (-9.09 dB ± 3.3 and 7.91 ± 3.4, respectively) compared with the normal group (-0.15 dB ± 0.9 and 0.95 dB ± 0.8, respectively). Univariate linear regression provided the best overall fit to the ranked segment data. The equation parameters of the regression line, applied manually to the normalised rim area to disc area ratio and the rim area to disc area ratio data, correctly classified 100% of normal subjects and glaucoma patients. In this study sample, the regression analysis of ranked segment parameters was more effective than conventional ranked segment analysis, in which glaucoma patients were misclassified in approximately 50% of cases. Further investigation in larger samples will enable the calculation of confidence intervals for normality. These reference standards will then need to be investigated in an independent sample to fully validate the technique. Conclusions: Using a curve-fitting approach to fit ranked segment curves retains information relating to the topographic nature of neural loss. Such methodology appears to overcome some of the deficiencies of conventional ranked segment analysis and, subject to validation in larger-scale studies, may potentially be of clinical utility for detecting and monitoring glaucomatous damage. © 2007 The College of Optometrists.
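A minimal sketch of the ranked-segment curve-fitting idea, assuming Python with NumPy and entirely hypothetical sectoral values (not HRT output): the sectoral values are ranked in descending order and the ranked curve is fitted with a straight line by least squares, whose slope and intercept then act as descriptors.

```python
import numpy as np

# Hypothetical rim area to disc area ratios for 36 ONH sectors at 10-degree
# intervals (illustrative values only).
rng = np.random.default_rng(0)
sector_values = np.clip(rng.normal(0.75, 0.08, 36), 0, 1)

# Rank the sectoral values in descending order and fit the ranked-segment
# curve with a straight line by least squares; the slope and intercept can
# then serve as descriptors for classification (normal vs glaucomatous).
ranked = np.sort(sector_values)[::-1]
rank = np.arange(1, ranked.size + 1)
slope, intercept = np.polyfit(rank, ranked, 1)
print(f"ranked-segment fit: value = {intercept:.3f} + {slope:.4f} * rank")
```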

Relevance:

60.00%

Abstract:

Consideration of the influence of test technique and data analysis method is important for data comparison and design purposes. The paper highlights the effects of replication interval, crack growth rate averaging and curve-fitting procedures on crack growth rate results for a Ni-base alloy. It is shown that an upper bound crack growth rate line is not appropriate for use in fatigue design, and that the derivative of a quadratic fit to the a vs N data looks promising. However, this type of averaging, or curve fitting, is not useful in developing an understanding of microstructure/crack tip interactions. For this purpose, simple replica-to-replica growth rate calculations are preferable. © 1988.
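A minimal sketch of the two growth-rate calculations contrasted above, assuming Python with NumPy and hypothetical crack length data: the derivative of a quadratic fit to the a vs N data gives a smoothed da/dN for design purposes, whereas simple replica-to-replica differences retain the scatter needed to study microstructure/crack tip interactions.

```python
import numpy as np

# Hypothetical crack length a (mm) recorded from replicas at cycle counts N
# (illustrative values only).
N = np.array([0, 10000, 20000, 30000, 40000, 50000], dtype=float)
a = np.array([0.50, 0.58, 0.70, 0.86, 1.07, 1.34])

# Fit a quadratic to the a vs N data and differentiate it to obtain a
# smoothed crack growth rate da/dN.
c2, c1, c0 = np.polyfit(N, a, 2)
dadN_smoothed = 2 * c2 * N + c1             # derivative of the quadratic fit

# Simple replica-to-replica rates retain the microstructure-related scatter.
dadN_pointwise = np.diff(a) / np.diff(N)
print("smoothed da/dN:", dadN_smoothed)
print("replica-to-replica da/dN:", dadN_pointwise)
```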

Relevance:

60.00%

Abstract:

Familial frontotemporal lobar degeneration with transactive response (TAR) DNA-binding protein of 43 kDa (TDP-43) proteinopathy (FTLD-TDP) is most commonly caused by progranulin (GRN) gene mutation. To characterize cortical degeneration in these cases, changes in density of the pathology across the cortical laminae of the frontal and temporal lobe were studied in seven cases of FTLD-TDP with GRN mutation using quantitative analysis and polynomial curve fitting. In 50% of gyri studied, neuronal cytoplasmic inclusions (NCI) exhibited a peak of density in the upper cortical laminae. Most frequently, neuronal intranuclear inclusions (NII) and dystrophic neurites (DN) exhibited a density peak in lower and upper laminae, respectively, glial inclusions (GI) being distributed in low densities across all laminae. Abnormally enlarged neurons (EN) were distributed either in the lower laminae or were more uniformly distributed across the cortex. The distribution of all neurons present varied between cases and regions, but most commonly exhibited a bimodal distribution, density peaks occurring in upper and lower laminae. Vacuolation primarily affected the superficial laminae and density of glial cell nuclei increased with distance across the cortex from pia mater to white matter. The densities of the NCI, GI, NII, and DN were not spatially correlated. The laminar distribution of the pathology in GRN mutation cases was similar to previously reported sporadic cases of FTLD-TDP. Hence, pathological changes initiated by GRN mutation, and by other causes in sporadic cases, appear to follow a parallel course resulting in very similar patterns of cortical degeneration in FTLD-TDP.

Relevance:

60.00%

Abstract:

We have proposed a similarity matching method (SMM) to obtain the change of Brillouin frequency shift (BFS), in which the change of BFS is determined from the frequency difference between the detected spectrum and a selected reference spectrum by comparing their similarity. We have also compared three similarity measures in simulation, which showed that the correlation coefficient is the most accurate for determining the change of BFS. Compared with other methods of determining the change of BFS, the SMM is more suitable for complex Brillouin spectrum profiles. More precise results and much faster processing speeds have been verified in our simulations and experiments. The experimental results show that the measurement uncertainty of the BFS is improved to 0.72 MHz by using the SMM, which is almost one-third of that obtained by using the curve fitting method, and the speed of deriving the BFS change by the SMM is 120 times faster than that by the curve fitting method.
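A minimal sketch of the similarity matching idea, assuming Python with NumPy and simulated Lorentzian spectra (the centre frequencies, linewidth and shift grid are illustrative assumptions, not the paper's parameters): the reference spectrum is slid over candidate frequency offsets and the offset that maximises the correlation coefficient with the measured spectrum is taken as the change of BFS.

```python
import numpy as np

def lorentzian(f, f0, width):
    """Idealised Brillouin gain spectrum centred on f0 with the given linewidth."""
    return 1.0 / (1.0 + ((f - f0) / (width / 2)) ** 2)

# Simulated spectra (illustrative): a reference spectrum and a measured
# spectrum whose centre frequency has shifted by 20 MHz, plus some noise.
freq = np.arange(10600.0, 11000.0, 1.0)               # frequency grid in MHz
reference = lorentzian(freq, 10800.0, 30.0)
noise = 0.02 * np.random.default_rng(1).normal(size=freq.size)
measured = lorentzian(freq, 10820.0, 30.0) + noise

# Slide the reference over candidate shifts (1 MHz grid) and keep the shift
# that maximises the correlation coefficient with the measured spectrum.
best_shift, best_r = 0, -np.inf
for s in np.arange(-50, 51):
    r = np.corrcoef(measured, np.roll(reference, s))[0, 1]
    if r > best_r:
        best_shift, best_r = s, r
print(f"estimated BFS change: {best_shift} MHz (r = {best_r:.4f})")
```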

Relevance:

40.00%

Abstract:

In some circumstances, there may be no scientific model of the relationship between X and Y that can be specified in advance and indeed the objective of the investigation may be to provide a ‘curve of best fit’ for predictive purposes. In such an example, the fitting of successive polynomials may be the best approach. There are various strategies to decide on the polynomial of best fit depending on the objectives of the investigation.
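One common strategy is to fit successive polynomials and test whether each additional term gives a significant reduction in the residual sum of squares. A minimal sketch, assuming Python with NumPy and SciPy and using hypothetical data, applies the extra-sum-of-squares F test for this purpose.

```python
import numpy as np
from scipy import stats

# Hypothetical data with mild curvature (illustrative only).
x = np.linspace(0, 10, 25)
rng = np.random.default_rng(2)
y = 1.5 + 0.8 * x - 0.05 * x**2 + rng.normal(0, 0.3, x.size)

def rss(order):
    """Residual sum of squares for a polynomial fit of the given order."""
    fitted = np.polyval(np.polyfit(x, y, order), x)
    return np.sum((y - fitted) ** 2)

# Fit successive polynomials and test whether each extra term significantly
# reduces the residual sum of squares (extra-sum-of-squares F test).
for order in (2, 3, 4):
    rss_lo, rss_hi = rss(order - 1), rss(order)
    df_hi = x.size - (order + 1)
    F = (rss_lo - rss_hi) / (rss_hi / df_hi)
    p = stats.f.sf(F, 1, df_hi)
    print(f"order {order - 1} -> {order}: F = {F:.2f}, p = {p:.3f}")
```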

Relevance:

30.00%

Abstract:

1. Fitting a linear regression to data provides much more information about the relationship between two variables than a simple correlation test. A goodness-of-fit test of the line should always be carried out: r squared estimates the strength of the relationship between Y and X, ANOVA tests whether a statistically significant line is present, and the 't' test indicates whether the slope of the line is significantly different from zero. 2. Always check whether the data collected fit the assumptions for regression analysis and, if not, whether a transformation of the Y and/or X variables is necessary. 3. If the regression line is to be used for prediction, it is important to determine whether the prediction involves an individual y value or a mean. Care should be taken if predictions are made close to the extremities of the data; they are subject to considerable error if x falls beyond the range of the data. Multiple predictions require correction of the P values. 4. If several individual regression lines have been calculated from a number of similar sets of data, consider whether they should be combined to form a single regression line. 5. If the data exhibit a degree of curvature, then fitting a higher-order polynomial curve may provide a better fit than a straight line. In this case, a test of whether the data depart significantly from a linear regression should be carried out.
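A minimal sketch covering points 1 and 3, assuming Python with NumPy and statsmodels and entirely hypothetical clinical data: it reports r squared, the ANOVA F test for the line and the t test of the slope, and shows that the prediction interval for an individual y value is wider than that for the mean response.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical clinical data (illustrative only): Y measured over a range of X values.
x = np.array([20, 25, 30, 35, 40, 45, 50, 55, 60, 65], dtype=float)
y = np.array([1.1, 1.4, 1.3, 1.9, 2.1, 2.4, 2.2, 2.9, 3.1, 3.4])

X = sm.add_constant(x)                 # design matrix with an intercept column
fit = sm.OLS(y, X).fit()

# Point 1: r squared, ANOVA F test for the line, and t test of the slope.
print(f"r^2 = {fit.rsquared:.3f}, F = {fit.fvalue:.1f} (p = {fit.f_pvalue:.4f})")
print(f"slope = {fit.params[1]:.4f}, t = {fit.tvalues[1]:.2f} (p = {fit.pvalues[1]:.4f})")

# Point 3: prediction at a new x value; the interval for an individual y value
# (obs_ci) is wider than that for the mean response (mean_ci).
new_X = np.array([[1.0, 40.0]])        # [intercept, x] for prediction at x = 40
pred = fit.get_prediction(new_X).summary_frame(alpha=0.05)
print(pred[['mean', 'mean_ci_lower', 'mean_ci_upper', 'obs_ci_lower', 'obs_ci_upper']])
```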