952 results for Multivariate statistical method
Abstract:
Obesity is the most common nutritional problem in dogs and can cause various harmful effects on animal health. However, the effect of this condition on systolic blood pressure (SBP) in obese dogs is controversial. Indirect measurement of SBP is the approach most commonly used in veterinary medicine for diagnosing systemic hypertension, because it is practical and easy to access. There is little scientific information comparing the two non-invasive methods of blood pressure measurement in obese dogs. Therefore, the objective of this study was to evaluate SBP in obese dogs by comparing both indirect methods of blood pressure measurement, oscillometric and vascular Doppler, in order to verify not only the differences in blood pressure values but also the best method for assessing SBP in dogs in this body condition. The study comprised blood pressure measurements of 50 dogs, which were divided into obese dogs with a mean body condition score (BCS) of 8.42 +/- 0.50 (n = 25) and dogs with an ideal BCS of 4.56 +/- 0.51 (n = 25). On comparison of blood pressure values, SBP values obtained by the Doppler method (152 +/- 16 mmHg) were higher than those obtained by the oscillometric method (136 +/- 11 mmHg). Correspondence analysis, a multivariate statistical technique, showed a correlation between body condition and the SBP measurement procedure. These findings suggest that the indirect Doppler method may better reflect SBP in obese dogs.
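Agreement between two such indirect methods is commonly summarized with a Bland-Altman analysis (bias and 95% limits of agreement). The Python sketch below illustrates the computation on simulated paired readings; the group means and standard deviations echo the abstract, but the per-dog pairing and the spread of differences are assumptions, not the study's raw data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated paired SBP readings (mmHg) for 25 dogs. Means/SDs echo the
# abstract; the per-dog pairing is invented for illustration only.
doppler = rng.normal(152, 16, 25)
oscillometric = doppler - rng.normal(16, 8, 25)  # assume Doppler reads higher

# Bland-Altman agreement statistics for the two measurement methods.
diff = doppler - oscillometric
bias = diff.mean()                     # systematic offset between methods
half_width = 1.96 * diff.std(ddof=1)   # 95% limits-of-agreement half-width
print(f"bias = {bias:.1f} mmHg, limits of agreement = "
      f"[{bias - half_width:.1f}, {bias + half_width:.1f}] mmHg")
```

A bias well away from zero, as simulated here, indicates the two methods cannot be used interchangeably without correction.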
Abstract:
OBJECTIVE: To assess the presence of microbial contamination in the preservation solution of transplant organs (kidney/pancreas). METHODS: Between August 2007 and March 2008, 136 samples of preservation solution were studied prior to graft implantation. Variables related to the donor and to the presence of microorganisms in the organ preservation solution were evaluated, after which contamination was evaluated against the recipient culture variable. Univariate and multivariate statistical analyses were performed. RESULTS: The contamination rate of the preservation solution was 27.9%. Coagulase-negative Staphylococcus was the most frequently isolated microorganism. However, highly virulent agents, such as fungi and enterobacteria, were also isolated. In the univariate analysis, the variable donor antibiotic use was significantly associated with contamination of the preservation solution. In the multivariate analysis, the variables donor antibiotic use and donor infectious complications were statistically significant. CONCLUSIONS: In this study, 27.9% of the preservation solutions of transplant organs were contaminated. Infectious complications and non-use of antibiotics by the donor were significantly related to the presence of microorganisms in organ preservation solutions. Contamination of organ preservation solutions was not associated with infection in the recipient.
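The multivariate step in a study like this can be illustrated with a logistic regression fitted by gradient ascent on simulated donor data; the covariates, effect sizes, and contamination mechanism below are illustrative assumptions, not the study's data.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 136  # sample size matching the study

# Hypothetical binary donor covariates; contamination is made more likely
# when antibiotics were NOT used and when infectious complications occurred.
antibiotics = rng.integers(0, 2, n)
infection = rng.integers(0, 2, n)
true_logit = -1.5 + 1.2 * (1 - antibiotics) + 0.8 * infection
contaminated = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(float)

# Multivariate logistic regression via plain gradient ascent on the
# log-likelihood (no dependency beyond NumPy).
X = np.column_stack([np.ones(n), antibiotics, infection])
beta = np.zeros(3)
for _ in range(20000):
    p = 1 / (1 + np.exp(-X @ beta))
    beta += 0.05 * X.T @ (contaminated - p) / n

print("coefficients (intercept, antibiotic use, infection):", beta.round(2))
```

The antibiotic coefficient comes out negative by construction here, mirroring the abstract's finding that donor antibiotic use was protective against contamination.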
Abstract:
In this paper we describe how morphological castes can be distinguished using multivariate statistical methods combined with jackknife estimators of the allometric coefficients. Data from the polymorphic ant Camponotus rufipes produced two distinct patterns of allometric variation, and thus two morphological castes. Morphometric analysis distinguished different allometric patterns within the two castes, with overall variability being greater in the major workers. Caste-specific scaling variabilities were associated with the relative importance of the first principal component. The static multivariate allometric coefficients for each of 10 measured characters differed between castes, but their relative magnitudes within castes were similar. Multivariate statistical analysis of worker polymorphism in ants is a more complete descriptor of shape variation than the standard bivariate techniques commonly used, and offers statistical and conceptual advantages over them.
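Multivariate allometric coefficients are conventionally obtained, following Jolicoeur, as the first-principal-component loadings of the log-transformed characters, scaled to a mean of 1; leave-one-out jackknife pseudovalues then supply bias-corrected estimates and standard errors. A runnable sketch on simulated morphometric data (the four characters, slopes, and sample size are assumptions, not the paper's measurements):

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated log-transformed measurements: 40 workers x 4 characters that
# scale against a latent size factor with assumed slopes (> 1 = positive
# allometry, < 1 = negative allometry).
size = rng.normal(0, 0.3, 40)
slopes = np.array([1.4, 1.1, 0.9, 0.6])
logX = size[:, None] * slopes + rng.normal(0, 0.02, (40, 4))

def allometric_coefficients(Y):
    """First-PC loadings of the covariance matrix, scaled to mean 1."""
    vals, vecs = np.linalg.eigh(np.cov(Y, rowvar=False))
    pc1 = np.abs(vecs[:, -1])            # loadings of the largest eigenvalue
    return pc1 * len(pc1) / pc1.sum()

# Jackknife: pseudovalues give bias-corrected estimates and standard errors.
n = len(logX)
full = allometric_coefficients(logX)
pseudo = np.array([n * full - (n - 1) * allometric_coefficients(np.delete(logX, i, 0))
                   for i in range(n)])
est = pseudo.mean(0)
se = pseudo.std(0, ddof=1) / np.sqrt(n)
print("jackknife coefficients:", est.round(2))
print("standard errors:       ", se.round(3))
```

With the low measurement noise assumed here, the jackknifed coefficients recover the planted slopes closely.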
Abstract:
Background Several researchers seek methods for selecting homogeneous groups of animals in experimental studies, because homogeneity is an indispensable prerequisite for the randomization of treatments. The lack of robust methods that comply with statistical and biological principles leads researchers to use empirical or subjective methods, which can influence their results. Objective To develop a multivariate statistical model for the selection of a homogeneous group of animals for experimental research and to build a computational package implementing it. Methods The echocardiographic data set of 115 male Wistar rats with supravalvular aortic stenosis (AoS) was used as an example for model development. Initially, the data were standardized and thus made dimensionless. The variance matrix of the set was then submitted to principal component analysis (PCA), aiming to reduce the parametric space while retaining the relevant variability. That technique established a new Cartesian system into which the animals were allocated, and finally a confidence region (ellipsoid) was built for the profile of the animals' homogeneous responses. Animals located inside the ellipsoid were considered part of the homogeneous batch; those outside it were considered spurious. Results The PCA established eight descriptive axes that together explained 88.71% of the accumulated variance of the data set. The allocation of the animals in the new system and the construction of the confidence region revealed six spurious animals, leaving a homogeneous batch of 109 animals. Conclusion The biometric criterion presented proved effective, because it considers the animal as a whole, analyzing all measured parameters jointly, while keeping the discard rate small.
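The core of the selection procedure (standardization, PCA, confidence ellipsoid, discard of spurious animals) can be sketched as follows. The data are simulated, with four deliberately shifted rows standing in for spurious animals; the five-parameter setup and the chi-square cutoff for the 95% ellipsoid are assumptions, not the paper's exact protocol.

```python
import numpy as np

rng = np.random.default_rng(7)

# Simulated data: 115 animals x 5 echocardiographic parameters; the first
# four animals are shifted so that they fall outside the confidence region.
X = rng.normal(0, 1, (115, 5))
X[:4] += 5.0

# Standardize (dimensionless data), then rotate onto principal components.
Z = (X - X.mean(0)) / X.std(0, ddof=1)
vals, vecs = np.linalg.eigh(np.cov(Z, rowvar=False))
scores = Z @ vecs                      # animal coordinates in the new system

# 95% confidence ellipsoid: the squared Mahalanobis distance in PC space is
# compared against the chi-square 95th percentile with 5 df (= 11.07).
d2 = (scores ** 2 / vals).sum(1)
spurious = d2 > 11.07
print("animals flagged as spurious:", np.flatnonzero(spurious))
```

Animals inside the ellipsoid form the homogeneous batch; flagged rows are discarded before treatment randomization.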
Abstract:
Artificial neural networks (ANNs) have been widely applied to complex biological problems. An important feature of neural models is that their use is not restricted by the theoretical distribution shape of the data. The performance of ANNs over linear or non-linear regression-based statistical methods is frequently found to be significantly superior when suitable sample sizes are provided, especially in multidimensional and non-linear processes. The current work utilised three well-known neural network methods to evaluate whether these models could provide more accurate outcomes than a conventional regression method in predicting pupal weight of Chrysomya megacephala, a species of blowfly (Diptera: Calliphoridae), using larval density (i.e. the initial number of larvae), amount of available food, and pupal size as input data. The neural networks yielded more accurate predictions than the statistical model (multiple regression). Among the three types of networks utilised (Multi-layer Perceptron, Radial Basis Function and Generalised Regression Neural Network), no considerable differences were detected. The superiority of these neural models over a classical statistical method is important, because more accurate models may clarify several intricate aspects of the nutritional ecology of blowflies.
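As a toy illustration of why a neural network can outperform multiple regression when the response is non-linear in its inputs, the sketch below fits a least-squares plane and a small hand-rolled multilayer perceptron to simulated rearing data; the variable ranges and the exponential density effect are assumptions chosen for illustration, not the paper's dataset.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 300

# Hypothetical rearing data: pupal weight falls off non-linearly with larval
# density and rises with the amount of available food (assumed relationships).
density = rng.uniform(100, 1000, n)
food = rng.uniform(10, 100, n)
weight = 40 * np.exp(-density / 250) + 0.1 * food + rng.normal(0, 0.5, n)

X = np.column_stack([density, food])
X = (X - X.mean(0)) / X.std(0)                 # standardize inputs
y = ((weight - weight.mean()) / weight.std())[:, None]

# Baseline: multiple linear regression via least squares.
A = np.column_stack([np.ones(n), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
mse_lin = ((A @ coef - y) ** 2).mean()

# One-hidden-layer perceptron trained by full-batch gradient descent on MSE.
W1 = rng.normal(0, 0.5, (2, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)
for _ in range(10000):
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    g = 2 * (pred - y) / n                     # dMSE/dprediction
    gh = (g @ W2.T) * (1 - h ** 2)             # backpropagate through tanh
    W2 -= 0.05 * h.T @ g;  b2 -= 0.05 * g.sum(0)
    W1 -= 0.05 * X.T @ gh; b1 -= 0.05 * gh.sum(0)

mse_mlp = ((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2).mean()
print(f"linear regression MSE = {mse_lin:.3f}, MLP MSE = {mse_mlp:.3f}")
```

The plane cannot bend to follow the exponential density effect, so its residual error stays well above the network's.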
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Background: Large gene expression studies, such as those conducted using DNA arrays, often provide millions of different pieces of data. To address the problem of analyzing such data, we describe a statistical method, which we have called ‘gene shaving’. The method identifies subsets of genes with coherent expression patterns and large variation across conditions. Gene shaving differs from hierarchical clustering and other widely used methods for analyzing gene expression studies in that genes may belong to more than one cluster, and the clustering may be supervised by an outcome measure. The technique can be ‘unsupervised’, that is, the genes and samples are treated as unlabeled, or partially or fully supervised by using known properties of the genes or samples to assist in finding meaningful groupings. Results: We illustrate the use of the gene shaving method to analyze gene expression measurements made on samples from patients with diffuse large B-cell lymphoma. The method identifies a small cluster of genes whose expression is highly predictive of survival. Conclusions: The gene shaving method is a potentially useful tool for exploration of gene expression data and identification of interesting clusters of genes worth further investigation.
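In outline, one shaving sequence works like this: compute the first principal component of the current gene block, drop the fraction of genes least correlated with it, and repeat until a target size is reached (selection of the optimal cluster size, e.g. via a gap statistic, is omitted here). A minimal Python sketch on simulated expression data, with a coherent block planted by construction:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated expression matrix: 200 genes x 30 samples. The first 20 genes
# share a coherent, high-variance pattern; the rest are pure noise.
pattern = rng.normal(0, 1, 30)
X = rng.normal(0, 1, (200, 30))
X[:20] += 2.0 * pattern

def shave(X, target=20, alpha=0.1):
    """One gene-shaving sequence: repeatedly drop the alpha fraction of
    genes least correlated with the block's first principal component."""
    idx = np.arange(len(X))
    while len(idx) > target:
        block = X[idx] - X[idx].mean(1, keepdims=True)
        # Leading right-singular vector = first PC over samples.
        _, _, vt = np.linalg.svd(block, full_matrices=False)
        corr = np.abs(block @ vt[0]) / np.linalg.norm(block, axis=1)
        drop = max(1, int(alpha * len(idx)))
        idx = idx[np.sort(corr.argsort()[drop:])]
    return idx

cluster = shave(X)
print("genes retained:", cluster)
```

Because genes are ranked anew at each step, a gene may appear in several nested clusters along the sequence, which is what lets the full method assign genes to more than one cluster.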
Abstract:
This thesis describes studies on the development of physical analytical methods coupled with multivariate statistical techniques for assessing the quality and authenticity of vegetable oils and dairy products. The application of physical instruments cuts the cost and time required by classical analyses and can at the same time provide a different set of information, concerning both product quality and product authenticity. For such methods to work well, robust statistical models must be built on data sets that are correctly collected and representative of the intended field of application. In this thesis work, vegetable oils and several types of cheese were analysed (in particular Pecorino cheeses in two research studies and Parmigiano-Reggiano in another). Several analytical instruments (physical methods) were used, notably spectroscopy, differential thermal analysis, and the electronic nose, alongside traditional separative techniques. The data obtained from the analyses were processed with several statistical techniques, above all partial least squares, multiple linear regression, and linear discriminant analysis.
Abstract:
A critical point in the analysis of ground displacement time series is the development of data-driven methods that allow the different sources generating the observed displacements to be discerned and characterized. A widely used multivariate statistical technique is Principal Component Analysis (PCA), which reduces the dimensionality of the data space while retaining most of the explained variance of the dataset. However, PCA does not perform well on the so-called Blind Source Separation (BSS) problem, i.e., recovering and separating the original sources that generated the observed data. This is mainly due to the assumptions on which PCA relies: it looks for a new Euclidean space where the projected data are uncorrelated. Independent Component Analysis (ICA) is a popular technique for approaching this problem. However, the independence condition is not easy to impose, and it is often necessary to introduce some approximations. To work around this problem, I use a variational Bayesian ICA (vbICA) method, which models the probability density function (pdf) of each source signal using a mixture of Gaussian distributions. This technique allows for more flexibility in the description of the pdf of the sources, giving a more reliable estimate of them. Here I present the application of the vbICA technique to GPS position time series. First, I use vbICA on synthetic data that simulate a seismic cycle (interseismic + coseismic + postseismic + seasonal + noise) and a volcanic source, and I study the ability of the algorithm to recover the original (known) sources of deformation. Second, I apply vbICA to different tectonically active scenarios: the 2009 L'Aquila (central Italy) earthquake, the 2012 Emilia (northern Italy) seismic sequence, and the 2006 Guerrero (Mexico) Slow Slip Event (SSE).
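vbICA itself is not available in standard libraries, but the flavor of the source-separation step can be shown with a plain FastICA iteration (tanh contrast, symmetric decorrelation) on synthetic mixtures; the sources, mixing matrix, and noise level below are illustrative assumptions, not the thesis data.

```python
import numpy as np

rng = np.random.default_rng(11)

# Two synthetic sources standing in for deformation signals: a square wave
# (e.g. repeated transients) and a seasonal sinusoid, mixed into three
# observed position time series with additive noise.
t = np.linspace(0, 8, 1000)
S = np.vstack([np.sign(np.sin(3 * t)),        # square wave, non-Gaussian
               np.sin(2 * t)])                # sinusoid
A = np.array([[1.0, 0.5], [0.4, 1.2], [0.8, -0.7]])
X = A @ S + 0.02 * rng.normal(size=(3, 1000))

# Whitening: project onto the two leading PCs and rescale to unit variance.
Xc = X - X.mean(1, keepdims=True)
d, E = np.linalg.eigh(np.cov(Xc))
Z = (E[:, -2:] / np.sqrt(d[-2:])).T @ Xc

# FastICA fixed-point iteration with tanh contrast, symmetric decorrelation.
W = rng.normal(size=(2, 2))
for _ in range(200):
    G = np.tanh(W @ Z)
    W_new = G @ Z.T / Z.shape[1] - np.diag((1 - G ** 2).mean(1)) @ W
    U, _, Vt = np.linalg.svd(W_new)
    W = U @ Vt                                # re-orthonormalize the rows
recovered = W @ Z
corr = np.abs(np.corrcoef(np.vstack([S, recovered]))[:2, 2:])
print("|correlation| between true and recovered sources:\n", corr.round(2))
```

Up to sign and permutation, each recovered component tracks one source; vbICA replaces the fixed contrast function with a flexible mixture-of-Gaussians source model, as described above.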
Abstract:
The purpose of this study was to search the orthodontic literature and determine the frequency of reporting of confidence intervals (CIs) in orthodontic journals with an impact factor. The six latest issues of the American Journal of Orthodontics and Dentofacial Orthopedics, the European Journal of Orthodontics, and the Angle Orthodontist were hand searched and the reporting of CIs, P values, and implementation of univariate or multivariate statistical analyses were recorded. Additionally, studies were classified according to the type/design as cross-sectional, case-control, cohort, and clinical trials, and according to the subject of the study as growth/genetics, behaviour/psychology, diagnosis/treatment, and biomaterials/biomechanics. The data were analyzed using descriptive statistics followed by univariate examination of statistical associations, logistic regression, and multivariate modelling. CI reporting was very limited and was recorded in only 6 per cent of the included published studies. CI reporting was independent of journal, study area, and design. Studies that used multivariate statistical analyses had a higher probability of reporting CIs compared with those using univariate statistical analyses. Misunderstanding of the use of P values and CIs may have important implications in implementation of research findings in clinical practice.
Abstract:
Background: Research into methods of recovery from exercise-induced fatigue is a popular topic in sports medicine, kinesiology, and physical therapy. However, both the quantity and quality of studies are limited, and a clear solution for recovery is still lacking. An analysis of the statistical methods in the existing literature on performance recovery can enhance the quality of research and provide guidance for future studies. Methods: A literature review was performed using the SCOPUS, SPORTDiscus, MEDLINE, CINAHL, Cochrane Library, and Science Citation Index Expanded databases to extract studies on performance recovery from exercise in humans. Original studies and their statistical analyses of recovery methods, including Active Recovery, Cryotherapy/Contrast Therapy, Massage Therapy, Diet/Ergogenics, and Rehydration, were examined. Results: The review produced a Research Design and Statistical Method Analysis Summary. Conclusion: Research design and statistical methods can be improved by following the guidelines in the Research Design and Statistical Method Analysis Summary. This summary table lists potential issues and suggested solutions, such as sample size calculation, consideration of sport-specific and research design issues, selection of populations and measurement markers, statistical methods for different analytical requirements, equality of variance and normality of data, post hoc analyses, and effect size calculation.
Abstract:
The operator effect is a well-known methodological bias already quantified in some taphonomic studies. However, the replicability effect, i.e., the use of taphonomic attributes as a replicable scientific method, has not been examined until now. Here, we quantified this replicability bias for the first time using different multivariate statistical techniques, testing whether the operator effect is related to the replicability effect. We analyzed the results reported by 15 operators working on the same dataset. Each operator analyzed 30 biological remains (bivalve shells) from five different sites, considering the attributes fragmentation, edge rounding, corrasion, bioerosion, and secondary color. The operator effect followed the same pattern reported in previous studies, characterized by poorer correspondence for attributes having more than two levels of damage categories. However, the operator effect did not appear to be related to the replicability effect, because nearly all operators found differences among sites. Although the binary attribute bioerosion exhibited 83% correspondence among operators, it was also the taphonomic attribute that showed the highest dispersion among operators (28%). Therefore, we conclude that binary attributes, despite reducing the operator effect, diminish replicability, resulting in different interpretations of concordant data. We found that a variance of nearly 8% among operators was enough to generate a different taphonomic interpretation in a Q-mode cluster analysis. The results reported here show that the statistical method employed influences the level of replicability and comparability of a study, and that making results available may be a valid way to reduce this bias.
Abstract:
Biological productivity in the modern equatorial Pacific Ocean, a region with high nutrients and low chlorophyll, is currently limited by the micronutrient Fe. In order to test whether Fe was limiting in the past, and to identify potential pathways of Fe delivery that could drive Fe fertilization (i.e., dust delivery from eolian inputs vs. Fe supplied by the Equatorial Undercurrent), we chemically isolated the terrigenous material from sediment along a cross-equatorial transect in the central equatorial Pacific at 140°W and at Ocean Drilling Program Site 850 in the eastern equatorial Pacific. We quantified the contribution from each potential Fe-bearing terrigenous source using a suite of chemical and isotopic discrimination strategies as well as multivariate statistical techniques. We find that the distribution of the terrigenous sources (i.e., Asian loess, South American ash, Papua New Guinea, and ocean island basalt) varies with time, latitude, and climate. Regardless of which method is used to determine accumulation rate, there is also no relationship between the flux of any particular Fe source and climate. Moreover, there is no connection between a particular Fe source or pathway (eolian vs. Undercurrent) and total productivity during the Last Glacial Maximum, Pleistocene glacial episodes, or the Miocene "Biogenic Bloom". This suggests that an alternative process, such as an interoceanic reorganization of nutrient inventories, may be responsible for past changes in total export in the open ocean, rather than simply Fe supply from dust and/or Equatorial Undercurrent processes. Alternatively, a change in Fe source or flux may be related to a change in a particular component of total productivity (e.g., the production of organic matter, calcium carbonate, or biogenic opal).
Abstract:
We provide high-resolution sea surface temperature (SST) and paleoproductivity data focusing on Termination 1. We describe a new method for estimating SSTs based on multivariate statistical analyses performed on modern coccolithophore census data, and we present the first downcore reconstructions derived from coccolithophore assemblages at Ocean Drilling Program (ODP) Site 1233, located offshore Chile. We compare our coccolithophore SST record to alkenone-based SSTs as well as SST reconstructions based on dinoflagellates and radiolaria. All reconstructions generally show a remarkable concordance. As in the alkenone SST record, the Last Glacial Maximum (LGM, 19-23 kyr B.P.) is not clearly defined in our SST reconstruction. After the onset of deglaciation, three major warming steps are recorded: from 18.6 to 18 kyr B.P. (~2.6°C), from 15.7 to 15.3 kyr B.P. (~2.5°C), and from 13 to 11.4 kyr B.P. (~3.4°C). Consistent with the other records from Site 1233 and with Antarctic ice core records, we observe a clear Holocene Climatic Optimum (HCO) from ~8-12 kyr B.P. Combining the SST reconstruction with coccolith absolute abundances and accumulation rates, we show that colder temperatures during the LGM are linked to higher coccolithophore productivity offshore Chile, and warmer SSTs during the HCO to lower coccolithophore productivity, with indications of weak coastal upwelling. We interpret our data in terms of latitudinal displacements of the Southern Westerlies and the northern margin of the Antarctic Circumpolar Current system over the deglaciation and the Holocene.