36 results for STATISTICAL-METHOD

in Aston University Research Archive


Relevance: 100.00%

Abstract:

This paper introduces a method for the analysis of regional linguistic variation. The method identifies individual and common patterns of spatial clustering in a set of linguistic variables measured over a set of locations based on a combination of three statistical techniques: spatial autocorrelation, factor analysis, and cluster analysis. To demonstrate how to apply this method, it is used to analyze regional variation in the values of 40 continuously measured, high-frequency lexical alternation variables in a 26-million-word corpus of letters to the editor representing 206 cities from across the United States.
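
The pipeline described can be illustrated compactly. The following is a minimal Python sketch, assuming hypothetical `coords` (city locations) and `values` (variable measurements) arrays; the inverse-distance smoothing stands in for the local spatial autocorrelation statistics used in the paper, and the component counts are arbitrary.

```python
# A minimal sketch of the three-stage pipeline on hypothetical data:
# `coords` are city locations, `values` the measured variables; the
# inverse-distance smoothing is a simple stand-in for the local spatial
# autocorrelation statistics used in the paper.
import numpy as np
from scipy.spatial.distance import cdist
from sklearn.decomposition import FactorAnalysis
from sklearn.cluster import AgglomerativeClustering

rng = np.random.default_rng(0)
coords = rng.uniform(0, 100, size=(206, 2))   # placeholder city locations
values = rng.normal(size=(206, 40))           # placeholder variable values

# Stage 1: spatial autocorrelation -- smooth each variable with a
# row-standardised inverse-distance weight matrix.
d = cdist(coords, coords)
np.fill_diagonal(d, np.inf)
w = 1.0 / d
w /= w.sum(axis=1, keepdims=True)
smoothed = w @ values

# Stage 2: factor analysis -- extract common regional patterns.
scores = FactorAnalysis(n_components=5, random_state=0).fit_transform(smoothed)

# Stage 3: cluster analysis -- group cities by their factor scores.
labels = AgglomerativeClustering(n_clusters=4).fit_predict(scores)
print("cities per cluster:", np.bincount(labels))
```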

Relevance: 70.00%

Abstract:

This book is aimed primarily at microbiologists who are undertaking research and who require a basic knowledge of statistics to analyse their experimental data. Computer software employing a wide range of data analysis methods is widely available to experimental scientists, and this availability makes it essential that investigators understand the basic principles of statistics. Statistical analysis of data can be complex, with many different methods of approach, each of which applies in a particular experimental circumstance; hence, it is possible to apply an incorrect statistical method to data and to draw the wrong conclusions from an experiment. The purpose of this book, which has its origin in a series of articles published in the Society for Applied Microbiology journal ‘The Microbiologist’, is to present the basic logic of statistics as clearly as possible and thereby to dispel some of the myths that often surround the subject. The 28 ‘Statnotes’ deal with various topics that are likely to be encountered, including the nature of variables, the comparison of means of two or more groups, non-parametric statistics, analysis of variance, correlating variables, and more complex methods such as multiple linear regression and principal components analysis. In each case, the relevant statistical method is illustrated with examples drawn from experiments in microbiological research. The text incorporates a glossary of the most commonly used statistical terms, and there are two appendices designed to aid the investigator in the selection of the most appropriate test.

Relevance: 60.00%

Abstract:

A cluster analysis was performed on 78 cases of Alzheimer's disease (AD) to identify possible pathological subtypes of the disease. Data on 47 neuropathological variables, including features of the gross brain and the density and distribution of senile plaques (SP) and neurofibrillary tangles (NFT), were used to describe each case. Cluster analysis is a multivariate statistical method which groups together the AD cases with the most similar neuropathological characteristics. The majority of cases (83%) were clustered into five such groups. The analysis suggested that an initial division of the 78 cases could be made into two major groups: (1) a large group (68%) in which the distribution of SP and NFT was restricted to a relatively small number of brain regions, and (2) a smaller group (15%) in which the lesions were more widely disseminated throughout the neocortex. Each of these groups could be subdivided according to the degree of capillary amyloid angiopathy (CAA) present. In addition, those cases with a restricted development of SP/NFT and CAA could be divided further into an early and a late onset form. Familial AD cases did not cluster as a separate group but were either distributed between four of the five groups or were cases with unique combinations of pathological features not closely related to any of the groups. It was concluded that multivariate statistical methods may be of value in the classification of AD into subtypes. © 1994 Springer-Verlag.
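
As an illustration of the grouping step, here is a minimal sketch using SciPy's hierarchical clustering on a hypothetical `cases` matrix; the study's own algorithm and distance measure may differ.

```python
# A minimal sketch of this kind of grouping with SciPy's hierarchical
# clustering; `cases` is a hypothetical stand-in for the 78 x 47 data
# matrix, and the study's own algorithm may differ in detail.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

rng = np.random.default_rng(1)
cases = rng.normal(size=(78, 47))            # placeholder measurements

# Standardise so variables on different scales contribute equally.
z = (cases - cases.mean(axis=0)) / cases.std(axis=0)

# Ward-linkage clustering on Euclidean distances, cut into five groups
# as in the analysis described above.
tree = linkage(pdist(z), method="ward")
groups = fcluster(tree, t=5, criterion="maxclust")
print("cases per group:", np.bincount(groups)[1:])
```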

Relevance: 60.00%

Abstract:

Discriminant analysis (also known as discriminant function analysis or multiple discriminant analysis) is a multivariate statistical method of testing the degree to which two or more populations may overlap with each other. It was devised independently by several statisticians, including Fisher, Mahalanobis, and Hotelling. The technique has several possible applications in microbiology. First, in a clinical microbiological setting, if two different infectious diseases were defined by a number of clinical and pathological variables, it may be useful to decide which measurements were the most effective at distinguishing between the two diseases. Second, in an environmental microbiological setting, the technique could be used to study the relationships between different populations, e.g., to what extent do the properties of soils in which the bacterium Azotobacter is found differ from those in which it is absent? Third, the method can be used as a multivariate ‘t’ test, i.e., given a number of related measurements on two groups, the analysis can provide a single test of the hypothesis that the two populations have the same means for all the variables studied. This statnote describes one of the most popular applications of discriminant analysis: identifying the descriptive variables that can distinguish between two populations.
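
The second application above can be sketched with scikit-learn's LinearDiscriminantAnalysis; the soil measurements and presence/absence labels below are invented stand-ins for the Azotobacter example.

```python
# A minimal sketch of a two-group discriminant analysis; the soil
# measurements and presence/absence labels are invented.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
soils_present = rng.normal(loc=0.5, size=(30, 4))   # soils with Azotobacter
soils_absent = rng.normal(loc=0.0, size=(30, 4))    # soils without it
X = np.vstack([soils_present, soils_absent])
y = np.array([1] * 30 + [0] * 30)

lda = LinearDiscriminantAnalysis().fit(X, y)
# The discriminant coefficients indicate which soil properties contribute
# most to separating the two populations.
print("coefficients:", lda.coef_.ravel().round(2))
print("reclassification accuracy:", lda.score(X, y))
```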

Relevance: 60.00%

Abstract:

In Alzheimer's disease (AD), neurofibrillary tangles (NFT) occur within neurons in both the upper and lower cortical laminae. Using a statistical method that estimates the size and spacing of NFT clusters along the cortex parallel to the pia mater, two hypotheses were tested: 1) that the cluster size and distribution of the NFT in gyri of the temporal lobe reflect degeneration of the feedforward (FF) and feedback (FB) cortico-cortical pathways, and 2) that there is a spatial relationship between the clusters of NFT in the upper and lower laminae. In 16 temporal lobe gyri from 10 cases of sporadic AD, NFT were present in both the upper and lower laminae in 11/16 (69%) gyri and in either the upper or lower laminae in 5/16 (31%) gyri. Clustering of the NFT was observed in all gyri. A significant peak-to-peak distance was observed in the upper laminae in 13/15 (87%) gyri and in the lower laminae in 8/12 (67%) gyri, suggesting a regularly repeating pattern of NFT clusters along the cortex. The regularly distributed clusters of NFT were between 500 and 800 μm in size, the estimated size of the cells of origin of the FF and FB cortico-cortical projections, in the upper laminae of 6/13 (46%) gyri and in the lower laminae of 2/8 (25%) gyri. Clusters of NFT in the upper laminae were spatially correlated (in phase) with those in the lower laminae in 5/16 (31%) gyri. The clustering patterns of the NFT are consistent with their formation in relation to the FF and FB cortico-cortical pathways. In most gyri, NFT clusters appeared to develop independently in the upper and lower laminae.
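
The general idea of estimating a repeating cluster spacing can be sketched on simulated counts; the autocorrelation estimator below is a simplified stand-in for the statistical method used in the study.

```python
# A minimal sketch with simulated NFT counts in contiguous 50 µm fields
# along the cortex; the autocorrelation estimator here is a simplified
# stand-in for the method used in the study.
import numpy as np

rng = np.random.default_rng(3)
field = 50                                   # µm per sample field
n = 200
x = np.arange(n) * field
# Simulated counts with clusters repeating roughly every 600 µm.
counts = rng.poisson(4 + 3 * np.sin(2 * np.pi * x / 600))

# Autocorrelation of the de-meaned counts: the lag of the strongest
# positive peak (excluding lag 0) estimates the peak-to-peak spacing.
c = counts - counts.mean()
acf = np.correlate(c, c, mode="full")[n - 1:]
acf /= acf[0]
peak_lag = 1 + np.argmax(acf[1:n // 2])
print("estimated peak-to-peak spacing:", peak_lag * field, "µm")
```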

Relevance: 60.00%

Abstract:

In Statnotes 24 and 25, multiple linear regression, a statistical method that examines the relationship between a single dependent variable (Y) and two or more independent variables (X), was described. The principal objective of such an analysis was to determine which of the X variables had a significant influence on Y and to construct an equation that predicts Y from the X variables. ‘Principal components analysis’ (PCA) and ‘factor analysis’ (FA) are also methods of examining the relationships between different variables, but they differ from multiple regression in that no distinction is made between the dependent and independent variables, all variables being treated essentially alike. Originally, PCA and FA were regarded as distinct methods, but in recent times they have been combined into a single analysis, PCA often being the first stage of an FA. The basic objective of a PCA/FA is to examine the relationships between the variables, i.e., the ‘structure’ of the variables, and to determine whether these relationships can be explained by a smaller number of ‘factors’. This statnote describes the use of PCA/FA in the analysis of the differences between the DNA profiles of different MRSA strains introduced in Statnote 26.
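
A minimal sketch of a combined PCA/FA on standardised data, with a hypothetical `profiles` matrix standing in for the MRSA DNA profiles of Statnote 26:

```python
# A minimal sketch of a combined PCA/FA; `profiles` is a hypothetical
# stand-in for the MRSA DNA profile data.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA, FactorAnalysis

rng = np.random.default_rng(4)
profiles = rng.normal(size=(25, 10))      # 25 strains x 10 profile variables

z = StandardScaler().fit_transform(profiles)

# Stage 1 (PCA): how much variance do the leading components explain?
pca = PCA().fit(z)
print("variance explained:", pca.explained_variance_ratio_[:3].round(2))

# Stage 2 (FA): loadings relate each original variable to each factor.
fa = FactorAnalysis(n_components=2, random_state=0).fit(z)
print("factor loadings:\n", fa.components_.round(2))
```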

Relevance: 60.00%

Abstract:

The spatial patterns of discrete β-amyloid (Aβ) deposits in brain tissue from patients with Alzheimer's disease (AD) were studied using a statistical method based on linear regression, the results being compared with the more conventional variance/mean (V/M) method. Both methods suggested that Aβ deposits occurred in clusters (400 to <12,800 µm in diameter) in all but 1 of the 42 tissues examined. In many tissues, a regular periodicity of the Aβ deposit clusters parallel to the tissue boundary was observed. In 23 of 42 (55%) tissues, the two methods revealed essentially the same spatial patterns of Aβ deposits; in 15 of 42 (36%), the regression method indicated the presence of clusters at a scale not revealed by the V/M method; and in 4 of 42 (9%), there was no agreement between the two methods. Perceived advantages of the regression method are that there is a greater probability of detecting clustering at multiple scales, the dimension of larger Aβ clusters can be estimated more accurately, and the spacing between the clusters may be estimated. However, both methods may be useful, with the regression method providing greater resolution and the V/M method providing greater simplicity and ease of interpretation. Estimates of the distance between regularly spaced Aβ clusters were in the range 2,200-11,800 µm, depending on tissue and cluster size. The regular periodicity of Aβ deposit clusters in many tissues would be consistent with their development in relation to clusters of neurons that give rise to specific neuronal projections.
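
The V/M side of the comparison can be sketched by combining contiguous fields into progressively larger blocks and looking for a peak in the variance/mean ratio; the counts below are simulated with clusters spanning roughly eight 200 µm fields.

```python
# A minimal sketch of the V/M method on simulated deposit counts in
# contiguous 200 µm fields, with clusters spanning roughly eight fields
# (~1600 µm); real data would replace `counts`.
import numpy as np

rng = np.random.default_rng(5)
base = 200                                   # µm per field
x = np.arange(256)
counts = rng.poisson(2 + 2 * (np.sin(2 * np.pi * x / 16) > 0))

# Combine adjacent fields into blocks of increasing size; a peak in the
# variance/mean ratio at a given block size indicates clustering at
# roughly that scale (V/M ~ 1 for a random distribution).
for k in [1, 2, 4, 8, 16, 32]:
    blocks = counts[:(len(counts) // k) * k].reshape(-1, k).sum(axis=1)
    vm = blocks.var(ddof=1) / blocks.mean()
    print(f"field size {k * base:5d} µm  V/M = {vm:.2f}")
```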

Relevance: 60.00%

Abstract:

Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, together with a knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R² is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
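
These cautions are straightforward to check in practice. A minimal sketch with invented data, fitting a regression at roughly the 10:1 subjects-to-variables ratio and inspecting R²:

```python
# A minimal sketch of the cautions above, with invented data: fit a
# multiple regression at roughly the 10:1 subjects-to-variables ratio
# and inspect R-squared.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(6)
n_subjects, n_vars = 100, 10
X = rng.normal(size=(n_subjects, n_vars))
y = 2.0 * X[:, 0] + rng.normal(size=n_subjects)   # only X1 truly matters

model = LinearRegression().fit(X, y)
r2 = model.score(X, y)
# Per the rule of thumb above, an R-squared below ~0.5 suggests the
# regression has not identified substantively important variables.
print(f"R^2 = {r2:.2f}")
```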

Relevance: 60.00%

Abstract:

Aims: To determine the spatial pattern of β-amyloid (Aβ) deposition throughout the temporal lobe in Alzheimer's disease (AD). Methods: Sections of the complete temporal lobe from six cases of sporadic AD were immunolabelled with antibody against Aβ. Fourier (spectral) analysis was used to identify sinusoidal patterns in the fluctuation of Aβ deposition in a direction parallel to the pia mater or alveus. Results: Significant sinusoidal fluctuations in density were evident in 81/99 (82%) analyses. In 64% of analyses, two frequency components were present, with density peaks of Aβ deposits repeating every 500–1000 µm and at distances greater than 1000 µm. In 25% of analyses, three or more frequency components were present. The estimated period or wavelength (number of sample units to complete one full cycle) of the first and second frequency components did not vary significantly between gyri of the temporal lobe, but there was evidence that the fluctuations of the classic deposits had longer periods than the diffuse and primitive deposits. Conclusions: (i) Aβ deposits exhibit complex sinusoidal fluctuations in density in the temporal lobe in AD; (ii) fluctuations in Aβ deposition may reflect the formation of Aβ deposits in relation to the modular and vascular structure of the cortex; and (iii) Fourier analysis may be a useful statistical method for studying the patterns of Aβ deposition both in AD and in transgenic models of disease.
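
The spectral step can be sketched with a simple periodogram; the densities below are simulated with two built-in periodicities standing in for measured Aβ deposit densities in contiguous 50 µm fields, and the study's own Fourier procedure may differ in detail.

```python
# A minimal sketch of the spectral step; densities are simulated with two
# built-in periodicities (~640 µm and ~2560 µm), standing in for measured
# Aβ deposit densities in contiguous 50 µm fields.
import numpy as np

rng = np.random.default_rng(7)
field = 50                      # µm per sample field
n = 512
x = np.arange(n) * field
density = (rng.poisson(5, n)
           + 3.0 * np.sin(2 * np.pi * x / 640)
           + 2.0 * np.sin(2 * np.pi * x / 2560))

# Periodogram of the de-meaned series; peaks mark dominant frequencies.
spec = np.abs(np.fft.rfft(density - density.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=field)             # cycles per µm
top = np.argsort(spec)[::-1][:2]                # two strongest components
print("dominant wavelengths (µm):", np.sort(1.0 / freqs[top]))
```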

Relevance: 60.00%

Abstract:

The aim of this work was to investigate the feasibility of detecting and locating damage in large frame structures where visual inspection would be difficult or impossible. The method is based on a vibration technique for non-destructively assessing the integrity of structures using measurements of changes in the natural frequencies. Such measurements can be made at a single point in the structure. The method requires that a comprehensive theoretical vibration analysis of the structure is first undertaken, and from it predictions are made of the changes in dynamic characteristics that will occur if each member of the structure is damaged in turn. The natural frequencies of the undamaged structure are measured and then routinely remeasured at intervals. If a change in the natural frequencies is detected, a statistical method is used to find the best match between the measured changes in frequency and the family of theoretical predictions; this predicts the most likely damage site. The theoretical analysis was based on the finite element method. Many structures were extensively studied, and a computer model was used to simulate the effect of the extent and location of the damage on the natural frequencies. Only one such analysis is required for each structure to be investigated. The experimental study was conducted on small structures in the laboratory. Frequency changes were found from inertance measurements on various plane and space frames. The computational requirements of the location analysis are small, and a desktop microcomputer was used. The results of this work showed that the method was successful in detecting and locating damage in the test structures.
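
The matching step lends itself to a compact sketch: given a family of predicted frequency-change patterns (one per member) and a measured pattern, the best-matching damage site can be chosen by correlation, which makes the match insensitive to overall damage severity. All numbers below are invented.

```python
# A minimal sketch of the frequency-matching step (all numbers invented).
import numpy as np

rng = np.random.default_rng(8)
n_modes, n_members = 8, 20
# predicted[i]: theoretical pattern of natural-frequency changes if
# member i of the frame is damaged (from the finite element analysis).
predicted = rng.normal(size=(n_members, n_modes))
true_site = 7
# Measured changes: the true pattern scaled by damage severity plus noise.
measured = 0.9 * predicted[true_site] + 0.05 * rng.normal(size=n_modes)

# Damage severity scales the whole pattern, so compare *shapes* via the
# correlation between measured and predicted changes.
corr = [np.corrcoef(measured, p)[0, 1] for p in predicted]
print("most likely damaged member:", int(np.argmax(corr)))
```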

Relevance: 60.00%

Abstract:

Biomass-To-Liquid (BTL) is one of the most promising low-carbon processes available to support the expanding transportation sector. This multi-step process produces hydrocarbon fuels from biomass: the so-called “second generation biofuels” that, unlike first generation biofuels, can make use of a wider range of biomass feedstock than just plant oils and sugar/starch components. A BTL process based on gasification has yet to be commercialized. This work focuses on the techno-economic feasibility of nine BTL plants. The scope was limited to hydrocarbon products, as these can be readily incorporated and integrated into conventional markets and supply chains. The evaluated BTL systems were based on pressurised oxygen gasification of wood biomass or bio-oil, and they were characterised by different fuel synthesis processes, including Fischer-Tropsch synthesis, the Methanol to Gasoline (MTG) process and the Topsoe Integrated Gasoline (TIGAS) synthesis. This was the first time that these three fuel synthesis technologies were compared in a single, consistent evaluation. The selected process concepts were modelled using the process simulation software IPSEpro to determine mass balances, energy balances and product distributions. For each BTL concept, a cost model was developed in MS Excel to estimate capital, operating and production costs. An uncertainty analysis based on the Monte Carlo statistical method was also carried out to examine how uncertainty in the input parameters of the cost model could affect its output (i.e. production cost). This was the first time that an uncertainty analysis was included in a published techno-economic assessment study of BTL systems. It was found that bio-oil gasification cannot currently compete with solid biomass gasification, owing to the lower efficiencies and higher costs associated with the additional thermal conversion step of fast pyrolysis. Fischer-Tropsch synthesis was the most promising fuel synthesis technology for commercial production of liquid hydrocarbon fuels, since it achieved higher efficiencies and lower costs than TIGAS and MTG. None of the BTL systems were competitive with conventional fossil fuel plants. However, if the government tax take were reduced by approximately 33%, or a subsidy of £55/t dry biomass were available, transport biofuels could be competitive with conventional fuels. Large-scale biofuel production may be possible in the long term through subsidies, fuel price rises and legislation.
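
A Monte Carlo uncertainty analysis of a cost model can be sketched as follows; the toy cost model, parameter names and ranges are illustrative assumptions, not figures from the study.

```python
# A minimal sketch of a Monte Carlo uncertainty analysis: sample the
# uncertain cost-model inputs and propagate them to a production-cost
# distribution (the model, names and ranges are illustrative only).
import numpy as np

rng = np.random.default_rng(9)
n = 10_000
capital = rng.triangular(150, 200, 300, n)    # M£, total capital cost
feedstock = rng.uniform(40, 80, n)            # £/t dry biomass
output = rng.normal(120_000, 10_000, n)       # t/yr fuel output

def production_cost(capital, feedstock, output):
    # Toy cost model: annualised capital plus feed cost per tonne of fuel.
    annualised = capital * 1e6 * 0.13          # assumed 13% capital charge
    feed_cost = feedstock * 2.5 * output       # assumed 2.5 t biomass/t fuel
    return (annualised + feed_cost) / output   # £/t fuel

cost = production_cost(capital, feedstock, output)
print(f"median £{np.median(cost):.0f}/t, 90% interval "
      f"£{np.percentile(cost, 5):.0f}-{np.percentile(cost, 95):.0f}/t")
```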

Relevance: 60.00%

Abstract:

The purpose of this study is to investigate the impact of human resource (HR) practices on organizational performance through the mediating role of the psychological contract (expressed as the influence of employer promise fulfillment on employee promise fulfillment, through employee attitudes). The study is based on a national sample of 78 organizations from the public and private services sector in Greece, including education, health, and banking, and on data obtained from 348 employees. The statistical method employed is structural equation modeling, via LISREL and bootstrapping estimation. The findings of the study suggest that employee incentives, performance appraisal, and employee promotion are three major HR practices that must be extensively employed. Furthermore, the study suggests that the organization must primarily keep its promises about a pleasant and safe working environment, respectful treatment, and feedback on performance, in order for employees to largely keep their own promises about showing loyalty to the organization, maintaining high levels of attendance, and upholding company reputation. Additionally, the study argues that the employee attitudes of motivation, satisfaction, and commitment constitute the nested epicenter mediating construct in both the HR practices–performance and employer–employee promise fulfillment relationships, resulting in superior organizational performance. © 2012 Wiley Periodicals, Inc.
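
The bootstrapped mediation logic can be sketched compactly: estimate the indirect effect of a practice X on performance Y through an attitude M as the product of the X-to-M and M-to-Y (controlling for X) regression slopes, then bootstrap it for a confidence interval. This is a simplified stand-in for the full LISREL structural model, with invented data.

```python
# A simplified stand-in for the bootstrapped mediation estimate (not the
# full LISREL structural model); X, M and Y are invented data for an HR
# practice, a mediating attitude and performance.
import numpy as np

rng = np.random.default_rng(10)
n = 348
X = rng.normal(size=n)                        # HR practice score
M = 0.5 * X + rng.normal(size=n)              # employee attitude
Y = 0.4 * M + 0.1 * X + rng.normal(size=n)    # performance

def indirect(X, M, Y):
    a = np.polyfit(X, M, 1)[0]                # a-path: M ~ X
    design = np.column_stack([np.ones(len(X)), X, M])
    b = np.linalg.lstsq(design, Y, rcond=None)[0][2]  # b-path: Y ~ M | X
    return a * b

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)               # resample with replacement
    boot.append(indirect(X[idx], M[idx], Y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect, 95% bootstrap CI: [{lo:.3f}, {hi:.3f}]")
```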

Relevance: 60.00%

Abstract:

With its implications for vaccine discovery, the accurate prediction of T cell epitopes is one of the key aspirations of computational vaccinology. We have developed a robust multivariate statistical method, based on partial least squares, for the quantitative prediction of peptide binding to major histocompatibility complexes (MHC), the principal checkpoint on the antigen presentation pathway. As a service to the immunobiology community, we have made a Perl implementation of the method available via a World Wide Web server. We call this server MHCPred. Access to the server is freely available from the URL: http://www.jenner.ac.uk/MHCPred. We have exemplified our method with a model for peptides binding to the common human MHC molecule HLA-B*3501.
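
The underlying technique, partial least squares regression from peptide features to binding affinity, can be sketched with scikit-learn; this is not the MHCPred implementation, and the feature encoding and affinities are invented.

```python
# A minimal sketch of PLS regression from peptide features to binding
# affinity; not the MHCPred implementation, and the binary feature
# encoding and affinity values are invented.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(11)
n_peptides, positions, alphabet = 200, 9, 20
# Toy binary position/residue indicator features for 9-mer peptides.
X = rng.integers(0, 2, size=(n_peptides, positions * alphabet)).astype(float)
y = rng.normal(size=n_peptides)              # e.g. -log10(IC50) values

pls = PLSRegression(n_components=5).fit(X, y)
print("predicted affinity, first peptide:", pls.predict(X[:1]).ravel())
```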

Relevance: 40.00%

Abstract:

Microfluidics has recently emerged as a new method of manufacturing liposomes, allowing reproducible mixing in milliseconds on the nanoliter scale. Here we investigate microfluidics-based manufacturing of liposomes. The aim of these studies was to assess the parameters of a microfluidic process by varying the total flow rate (TFR) and the flow rate ratio (FRR) of the solvent and aqueous phases. Design of experiments and multivariate data analysis were used for increased process understanding and the development of predictive and correlative models. A high FRR led to the bottom-up synthesis of liposomes, with a strong correlation between FRR and vesicle size, demonstrating the ability to control liposome size in-process; liposomes of 50 nm were reproducibly manufactured. Furthermore, we demonstrate the potential of high-throughput manufacturing of liposomes using microfluidics, with a four-fold increase in the volumetric flow rate maintaining liposome characteristics. The efficacy of these liposomes was demonstrated in transfection studies and was modelled using predictive modelling. Mathematical modelling identified FRR as the key variable in the microfluidic process, with the highest impact on liposome size, polydispersity and transfection efficiency. This study demonstrates microfluidics as a robust and high-throughput method for the scalable and highly reproducible manufacture of size-controlled liposomes. Furthermore, the application of statistically based process control increases understanding and allows for the generation of a design space for controlled particle characteristics.
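
The kind of statistically based process model described can be sketched as a quadratic response surface predicting liposome size from TFR and FRR, fitted here to invented data whose trend (higher FRR, smaller vesicles) follows the text.

```python
# A minimal sketch of a response-surface model predicting liposome size
# from TFR and FRR; the data are invented, with the trend (higher FRR,
# smaller vesicles) following the text.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(12)
tfr = rng.uniform(5, 20, 40)                 # mL/min, total flow rate
frr = rng.uniform(1, 5, 40)                  # aqueous:solvent ratio
size = 120 - 15 * frr + 0.5 * tfr + rng.normal(0, 3, 40)  # nm, toy response

model = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
model.fit(np.column_stack([tfr, frr]), size)
print("predicted size at TFR=12, FRR=3:",
      round(model.predict([[12.0, 3.0]])[0], 1), "nm")
```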

Relevance: 30.00%

Abstract:

Low-density parity-check codes with irregular constructions have recently been shown to outperform the most advanced error-correcting codes to date. In this paper we apply methods of statistical physics to study the typical properties of simple irregular codes. We use the replica method to find a phase transition which coincides with Shannon's coding bound when appropriate parameters are chosen. The decoding by belief propagation is also studied using statistical physics arguments; the theoretical solutions obtained are in good agreement with simulation results. We compare the performance of irregular codes with that of regular codes and discuss the factors that contribute to the improvement in performance.
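
For concreteness, the Shannon bound that the phase transition is compared against can be stated directly: for a binary symmetric channel with flip probability p, reliable communication is possible only at code rates below the capacity C = 1 - H2(p). A minimal sketch:

```python
# A minimal sketch of the Shannon bound for a binary symmetric channel:
# reliable communication requires code rate R < C = 1 - H2(p).
import numpy as np

def h2(p):
    """Binary entropy (bits)."""
    return -p * np.log2(p) - (1 - p) * np.log2(1 - p)

for p in [0.01, 0.05, 0.10]:
    print(f"flip probability p = {p:.2f}: capacity C = {1 - h2(p):.3f}")
```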