21 results for THERMAL ANALYSIS METHODS

in Aston University Research Archive


Relevance:

100.00%

Publisher:

Abstract:

This article explains, first, why a knowledge of statistics is necessary and describes the role that statistics plays in an experimental investigation. Second, the normal distribution is introduced, which describes the natural variability shown by many measurements in optometry and vision sciences. Third, the application of the normal distribution to some common statistical problems is described, including how to determine whether an individual observation is a typical member of a population and how to determine the confidence interval for a sample mean.
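
As a minimal sketch of the two problems mentioned in the abstract (judging whether an individual observation is a typical member of a population, and computing a confidence interval for a sample mean), the following Python fragment is illustrative only: the sample values are invented and the use of scipy.stats is an assumption, not part of the article.

    # Illustrative sketch only: invented data, assuming numpy and scipy are available.
    import numpy as np
    from scipy import stats

    # Hypothetical sample of intraocular pressure readings (mmHg).
    sample = np.array([14.2, 15.1, 13.8, 16.0, 14.9, 15.5, 14.4, 15.8])

    # Is an individual observation (e.g. 19.0) a typical member of the population?
    # Express it as a z-score relative to the sample mean and standard deviation.
    z = (19.0 - sample.mean()) / sample.std(ddof=1)
    p_two_tailed = 2 * stats.norm.sf(abs(z))

    # 95% confidence interval for the sample mean, using the t distribution
    # because the population standard deviation is estimated from the sample.
    ci = stats.t.interval(0.95, len(sample) - 1,
                          loc=sample.mean(), scale=stats.sem(sample))

    print(f"z = {z:.2f}, two-tailed P = {p_two_tailed:.3f}")
    print(f"95% CI for the mean: {ci[0]:.2f} to {ci[1]:.2f}")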

Relevance:

100.00%

Publisher:

Abstract:

In this second article, statistical ideas are extended to the problem of testing whether there is a true difference between two samples of measurements. First, it will be shown that the difference between the means of two samples comes from a population of such differences which is normally distributed. Second, the 't' distribution, one of the most important in statistics, will be applied to a test of the difference between two means using a simple data set drawn from a clinical experiment in optometry. Third, in making a t-test, a statistical judgement is made as to whether there is a significant difference between the means of two samples. Before the widespread use of statistical software, this judgement was made with reference to a statistical table. Even if such tables are not used, it is useful to understand their logical structure and how to use them. Finally, the analysis of data, which are known to depart significantly from the normal distribution, will be described.
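
A minimal illustrative sketch of an unpaired t test of the difference between two sample means, with a non-parametric alternative for data that depart from the normal distribution; the data are invented and the use of scipy.stats is an assumption, not taken from the article.

    # Illustrative sketch only: invented data, assuming scipy is available.
    from scipy import stats

    # Hypothetical visual acuity scores for two independent patient groups.
    group_a = [1.02, 0.98, 1.10, 1.05, 0.95, 1.08]
    group_b = [0.90, 0.88, 0.97, 0.85, 0.93, 0.91]

    # Unpaired t test of the difference between the two sample means.
    t_stat, p_value = stats.ttest_ind(group_a, group_b)
    print(f"t = {t_stat:.2f}, P = {p_value:.4f}")

    # If the data depart markedly from the normal distribution, a
    # non-parametric alternative such as the Mann-Whitney U test can be used.
    u_stat, p_mw = stats.mannwhitneyu(group_a, group_b, alternative='two-sided')
    print(f"U = {u_stat:.1f}, P = {p_mw:.4f}")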

Relevance:

100.00%

Publisher:

Abstract:

In some studies, the data are not measurements but comprise counts or frequencies of particular events. In such cases, an investigator may be interested in whether one specific event happens more frequently than another or whether an event occurs with a frequency predicted by a scientific model.
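
A minimal sketch of a goodness-of-fit test on counts of events of the kind described above; the counts and the expected frequencies are invented and the use of scipy.stats is an assumption.

    # Illustrative sketch only: invented counts, assuming scipy is available.
    from scipy import stats

    # Observed frequencies of an event in four categories, and the frequencies
    # predicted by a hypothetical model (here, equal occurrence in each category).
    observed = [18, 25, 22, 15]
    expected = [20, 20, 20, 20]

    # Chi-square goodness-of-fit test: does the event occur with frequencies
    # different from those predicted by the model?
    chi2, p_value = stats.chisquare(f_obs=observed, f_exp=expected)
    print(f"chi-square = {chi2:.2f}, P = {p_value:.3f}")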

Relevance:

100.00%

Publisher:

Abstract:

In any investigation in optometry involving more than two treatment or patient groups, an investigator should be using ANOVA to analyse the results, assuming that the data conform reasonably well to the assumptions of the analysis. Ideally, specific null hypotheses should be built into the experiment from the start so that the treatment variation can be partitioned to test these effects directly. If 'post-hoc' tests are used, then an experimenter should examine the degree of protection offered by the test against the possibility of making either a Type 1 or a Type 2 error. All experimenters should be aware of the complexity of ANOVA. The present article describes only one common form of the analysis, viz., that which applies to a single classification of the treatments in a randomised design. There are many different forms of the analysis, each of which is appropriate to a specific experimental design. The uses of some of the most common forms of ANOVA in optometry are described in a further article. If in any doubt, an investigator should consult a statistician with experience of the analysis of experiments in optometry, since once embarked upon an experiment with an unsuitable design, there may be little that a statistician can do to help.
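
A minimal sketch of the single-classification (one-way) ANOVA in a randomised design described above, using invented data; the use of scipy.stats is an assumption, and in practice planned comparisons or a suitably protected post-hoc test would follow a significant F ratio.

    # Illustrative sketch only: invented data, assuming scipy is available.
    from scipy import stats

    # Hypothetical measurements from three treatment groups in a randomised design.
    treatment_1 = [12.1, 11.8, 12.5, 12.0, 11.9]
    treatment_2 = [13.0, 13.4, 12.8, 13.1, 13.3]
    treatment_3 = [12.4, 12.2, 12.7, 12.3, 12.6]

    # One-way (single-classification) analysis of variance.
    f_stat, p_value = stats.f_oneway(treatment_1, treatment_2, treatment_3)
    print(f"F = {f_stat:.2f}, P = {p_value:.4f}")

    # A protected post-hoc test (e.g. Tukey's HSD) could follow a significant
    # result if no specific comparisons were planned in advance.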

Relevance:

100.00%

Publisher:

Abstract:

1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant and the X variable may account for only a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in a discussion of the significance of r.
2. The use of r assumes that a bivariate normal distribution is present, and this assumption should be examined prior to the study. If Pearson's r is not appropriate, then a non-parametric correlation coefficient such as Spearman's rs may be used.
3. A significant correlation should not be interpreted as indicating causation, especially in observational studies in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables.
4. In studies of measurement error, there are problems in using r as a test of reliability and the ‘intra-class correlation coefficient’ should be used as an alternative.
A correlation test provides only limited information as to the relationship between two variables. Fitting a regression line to the data using the method known as ‘least squares’ provides much more information, and the methods of regression and their application in optometry will be discussed in the next article.
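
A minimal sketch of computing Pearson's r, r squared and Spearman's rs for a pair of variables; the data are invented and the use of scipy.stats is an assumption, not part of the article.

    # Illustrative sketch only: invented data, assuming scipy is available.
    from scipy import stats

    # Hypothetical paired observations on two variables X and Y.
    x = [1.0, 2.1, 2.9, 4.2, 5.1, 6.0, 7.2, 8.1]
    y = [2.3, 2.9, 3.8, 4.1, 5.2, 5.5, 6.9, 7.4]

    # Pearson's r tests only the fit to a linear model; r squared shows the
    # proportion of the variance in Y accounted for by X.
    r, p_value = stats.pearsonr(x, y)
    print(f"r = {r:.3f}, r squared = {r**2:.3f}, P = {p_value:.4f}")

    # If a bivariate normal distribution cannot be assumed, a non-parametric
    # coefficient such as Spearman's rs may be used instead.
    rs, p_s = stats.spearmanr(x, y)
    print(f"rs = {rs:.3f}, P = {p_s:.4f}")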

Relevance:

100.00%

Publisher:

Abstract:

Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, together with a knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
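
A minimal sketch of a multiple regression with two explanatory variables, reporting R squared; the data are simulated purely for illustration and the use of numpy and statsmodels is an assumption, not part of the article.

    # Illustrative sketch only: simulated data, assuming numpy and statsmodels.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Hypothetical outcome Y and two candidate explanatory variables.
    n = 40
    x1 = rng.normal(size=n)
    x2 = rng.normal(size=n)
    y = 2.0 + 1.5 * x1 + 0.5 * x2 + rng.normal(scale=1.0, size=n)

    # Ordinary least-squares multiple regression with an intercept term.
    X = sm.add_constant(np.column_stack([x1, x2]))
    model = sm.OLS(y, X).fit()

    # R squared indicates the proportion of variance in Y accounted for by
    # the explanatory variables; low values should be interpreted cautiously.
    print(model.summary())
    print(f"R squared = {model.rsquared:.3f}")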

Relevance:

100.00%

Publisher:

Abstract:

PCA/FA is a method of analyzing complex data sets in which there are no clearly defined X or Y variables. It has multiple uses, including the study of the pattern of variation between individual entities such as patients with particular disorders and the detailed study of descriptive variables. In most applications, variables are related to a smaller number of ‘factors’ or PCs that account for the maximum variance in the data and hence may explain important trends among the variables. An increasingly important application of the method is in the ‘validation’ of questionnaires that attempt to relate subjective aspects of a patient's experience to more objective measures of vision.
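
A minimal sketch of extracting principal components from a set of correlated variables and inspecting the variance each component accounts for; the data are simulated and the use of numpy and scikit-learn is an assumption, not part of the article.

    # Illustrative sketch only: simulated data, assuming numpy and scikit-learn.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)

    # Hypothetical data set: 50 patients scored on 6 questionnaire items,
    # with some correlation introduced between items.
    scores = rng.normal(size=(50, 6))
    scores[:, 1] += 0.8 * scores[:, 0]
    scores[:, 3] += 0.6 * scores[:, 2]

    # Standardise the variables, then extract principal components.
    standardised = StandardScaler().fit_transform(scores)
    pca = PCA(n_components=3)
    components = pca.fit_transform(standardised)

    # The proportion of variance accounted for by each component helps decide
    # how many 'factors' to retain; the loadings show how each original
    # variable contributes to each component.
    print(pca.explained_variance_ratio_)
    print(pca.components_)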

Relevance:

100.00%

Publisher:

Abstract:

The aim of this research was to investigate the molecular interactions occurring in the formulation of non-ionic surfactant based vesicles composed of monopalmitoyl glycerol (MPG), cholesterol (Chol) and dicetyl phosphate (DCP). In the formulation of these vesicles, the thermodynamic attributes and surfactant interactions were investigated by molecular dynamics, Langmuir monolayer studies, differential scanning calorimetry (DSC), hot stage microscopy and thermogravimetric analysis (TGA). Initially the melting points of the components, individually and combined at a 5:4:1 MPG:Chol:DCP weight ratio, were investigated; the results show that a lower temperature (90 °C) than previously reported (120-140 °C) could be adopted to produce molten surfactants for the production of niosomes. This was advantageous for surfactant stability: whilst TGA studies showed that the individual components were stable above 200 °C, the 5:4:1 MPG:Chol:DCP mixture showed ∼2% surfactant degradation at 140 °C, compared with 0.01% measured at 90 °C. Niosomes formed at this lower temperature offered comparable characteristics to vesicles prepared using the higher temperatures commonly reported in the literature. In the formation of niosome vesicles, cholesterol also played a key role. Langmuir monolayer studies demonstrated that intercalation of cholesterol in the monolayer did not occur in the MPG:Chol:DCP (5:4:1 weight ratio) mixture. This suggests cholesterol may support bilayer assembly, with molecular simulation studies also demonstrating that vesicles cannot be built without the addition of cholesterol, and with higher concentrations of cholesterol (5:4:1 vs 5:2:1, MPG:Chol:DCP) decreasing the time required for niosome assembly. © 2013 Elsevier B.V.

Relevance:

100.00%

Publisher:

Abstract:

The structure and thermal properties of yttrium alumino-phosphate glasses, of nominal composition (Y2O3)0.31-z(Al2O3)z(P2O5)0.69 with 0 ≲ z ≲ 0.31, were studied by using a combination of neutron diffraction, 27Al and 31P magic angle spinning nuclear magnetic resonance, differential scanning calorimetry and thermal gravimetric analysis methods. The Vickers hardness of the glasses was also measured. The data are compared to those obtained for pseudo-binary Al2O3-P2O5 glasses and the structure of all these materials is rationalized in terms of a generic model for vitreous phosphate materials in which Y3+ and Al3+ act as modifying cations that bind only to the terminal (non-bridging) oxygen atoms of PO4 tetrahedra. The results are used to help elucidate the phenomenon of rare-earth clustering in phosphate glasses, which can be reduced by substituting Al3+ ions for rare-earth R3+ ions at fixed modifier content.

Relevance:

90.00%

Publisher:

Abstract:

Aims: Characterization of surface carbohydrate exposure by the representative protozoan Acanthamoeba polyphaga using a novel combination of flow cytometry and ligand-receptor analysis. Methods and Results: Trophozoite and cyst morphological forms were exposed to a panel of FITC-lectins. Population fluorescence associated with FITC-lectin binding to acanthamoebal surface moieties was ascertained by flow cytometry. Increasing concentrations of representative FITC-lectins, saturation binding and determination of Kd and relative Bmax values were employed to characterize carbohydrate residue exposure. FITC-lectins specific for N-acetylglucosamine, N-acetylgalactosamine and mannose/glucose were readily bound by trophozoite and cyst surfaces. Minor incremental increases in FITC-lectin concentration resulted in significant differences in surface fluorescence intensity and supported the calculation of the ligand-binding determinants Kd and relative Bmax, which gave a trophozoite and cyst rank order of lectin affinity and surface receptor presence. Conclusions: Trophozoites and cysts expose similar surface carbohydrate residues, foremost amongst which is N-acetylglucosamine, in varying orientation and availability. Significance and Impact of the Study: The outlined versatile combination of flow cytometry and ligand-receptor analysis allowed the characterization of surface carbohydrate exposure by protozoan morphological forms and in turn will support a valid comparison of carbohydrate exposure by other single-cell protozoa and eukaryotic microbes analysed in the same manner.
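
The abstract does not state how Kd and relative Bmax were calculated; as a generic, hedged illustration, the sketch below fits a one-site saturation binding model B = Bmax*L/(Kd + L) to invented concentration-fluorescence data using scipy (an assumption, not the authors' procedure).

    # Illustrative sketch only: invented data and a generic one-site binding
    # model, assuming numpy and scipy are available.
    import numpy as np
    from scipy.optimize import curve_fit

    def one_site_binding(ligand, bmax, kd):
        # B = Bmax * L / (Kd + L)
        return bmax * ligand / (kd + ligand)

    # Hypothetical FITC-lectin concentrations and mean cell fluorescence values.
    concentration = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 16.0, 32.0])
    fluorescence = np.array([110, 190, 310, 430, 520, 580, 610])

    params, _ = curve_fit(one_site_binding, concentration, fluorescence,
                          p0=[650.0, 4.0])
    bmax, kd = params
    print(f"Bmax = {bmax:.0f} (relative units), Kd = {kd:.2f}")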

Relevance:

90.00%

Publisher:

Abstract:

The use of quantitative methods has become increasingly important in the study of neurodegenerative disease. Disorders such as Alzheimer's disease (AD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This article reviews the advantages and limitations of the different methods of quantifying the abundance of pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semiquantitative scores. The major sampling methods by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are also described. In addition, the data analysis methods commonly used to analyse quantitative data in neuropathology, including analyses of variance (ANOVA) and principal components analysis (PCA), are discussed. These methods are illustrated with reference to particular problems in the pathological diagnosis of AD and dementia with Lewy bodies (DLB).

Relevance:

90.00%

Publisher:

Abstract:

Stereology and other image analysis methods have enabled rapid and objective quantitative measurements to be made on histological sections. These measurements may include total volumes, surfaces, lengths and numbers of cells and blood vessels or pathological lesions. Histological features, however, may not be randomly distributed across a section but exhibit 'dispersion', a departure from randomness either towards regularity or aggregation. Information on population dispersion may be valuable not only in understanding the two- or three-dimensional structure but also in elucidating the pathogenesis of lesions in pathological conditions. This article reviews some of the statistical methods available for studying dispersion. These range from simple tests of whether the distribution of a histological feature departs significantly from random to more complex methods which can detect the intensity of aggregation and the sizes, distribution and spacing of the clusters.
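
As a hedged illustration of one simple test of departure from randomness of the kind mentioned above, the sketch below computes the variance-to-mean ratio (index of dispersion) of invented counts per sample field; the use of numpy and scipy is an assumption, not the article's own method.

    # Illustrative sketch only: invented quadrat counts, assuming numpy and scipy.
    # The variance-to-mean ratio is about 1 for a random (Poisson-like) pattern,
    # greater than 1 for aggregation and less than 1 for regularity.
    import numpy as np
    from scipy import stats

    # Hypothetical counts of a histological feature in 20 sample fields.
    counts = np.array([0, 2, 1, 5, 0, 0, 3, 6, 1, 0,
                       0, 4, 7, 0, 1, 0, 2, 5, 0, 1])

    ratio = counts.var(ddof=1) / counts.mean()

    # The index of dispersion (n - 1) * ratio is compared with the chi-square
    # distribution with n - 1 degrees of freedom.
    index = (len(counts) - 1) * ratio
    p_value = stats.chi2.sf(index, df=len(counts) - 1)
    print(f"variance/mean = {ratio:.2f}, P = {p_value:.4f}")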

Relevance:

90.00%

Publisher:

Abstract:

This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists. The availability of this software, however, makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex, with many different methods of approach, each of which applies in a particular experimental circumstance. Hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is to present the basic logic of statistics as clearly as possible and, therefore, to dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms and there are two appendices designed to aid the investigator in the selection of the most appropriate test.

Relevance:

90.00%

Publisher:

Abstract:

This thesis describes an experimental study of the abrasion resistance of concrete at both the macro and micro levels. This is preceded by a review related to friction and wear, methods of test for assessing abrasion resistance, and factors influencing the abrasion resistance of concrete. A versatile test apparatus was developed to assess the abrasion resistance of concrete. This could be operated in three modes and a standardised procedure was established for all tests. A laboratory programme was undertaken to investigate the influence, on abrasion resistance, of three major factors: finishing techniques, curing regimes and surface treatments. The results clearly show that abrasion resistance was significantly affected by these factors, and tentative mechanisms were postulated to explain these observations. To substantiate these mechanisms, the concrete specimens from the macro-study were subjected to micro-structural investigation, using such techniques as Mercury Intrusion Porosimetry, Microhardness, Scanning Electron Microscopy, Petrography and Differential Thermal Analysis. The results of this programme clearly demonstrated that the abrasion resistance of concrete is primarily dependent on the microstructure of the concrete nearest to the surface. The viability of indirectly assessing the abrasion resistance was investigated using three non-destructive techniques: Ultrasonic Pulse Velocity, Schmidt Rebound Hardness, and the Initial Surface Absorption Test. The Initial Surface Absorption Test was found to be most sensitive to the factors which were shown to have influenced the abrasion resistance of concrete. An extensive field investigation was also undertaken. The results were used to compare site and laboratory practices, and the performance in the accelerated abrasion test with the wear observed in service. From this study, criteria were developed for assessing the quality of concrete floor slabs in terms of abrasion resistance.

Relevance:

90.00%

Publisher:

Abstract:

The use of quantitative methods has become increasingly important in the study of neuropathology and especially in neurodegenerative disease. Disorders such as Alzheimer's disease (AD) and the frontotemporal dementias (FTD) are characterized by the formation of discrete, microscopic, pathological lesions which play an important role in pathological diagnosis. This chapter reviews the advantages and limitations of the different methods of quantifying pathological lesions in histological sections, including estimates of density, frequency, coverage, and the use of semi-quantitative scores. The sampling strategies by which these quantitative measures can be obtained from histological sections, including plot or quadrat sampling, transect sampling, and point-quarter sampling, are described. In addition, data analysis methods commonly used to analyse quantitative data in neuropathology, including analysis of variance (ANOVA), polynomial curve fitting, multiple regression, classification trees, and principal components analysis (PCA), are discussed. These methods are illustrated with reference to quantitative studies of a variety of neurodegenerative disorders.