884 results for Analysis of variance (ANOVA)


Relevance:

100.00%

Publisher:

Abstract:

The theory of nonlinear dynamic systems provides new methods for handling complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing measured signals. In recent years, researchers have been applying concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In modern industrialized countries, several hundred thousand people die every year from sudden cardiac death. The electrocardiogram (ECG) is an important bio-signal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insight into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm. A computer-based intelligent system for the analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are nonlinear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of nonlinear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an analysis of variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test.
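The ANOVA screening step described above, keeping only HOS features whose p-value falls below the 0.02 threshold, can be sketched as follows; the feature values and class groupings are illustrative stand-ins, not data from the thesis:

```python
from scipy.stats import f_oneway

# Illustrative values of one HOS feature, grouped by cardiac class
# (normal plus arrhythmia classes); real values would come from
# bispectrum/bicoherence computations on HRV segments.
groups = {
    "normal":      [0.91, 0.88, 0.93, 0.90],
    "arrhythmia1": [0.72, 0.70, 0.75, 0.69],
    "arrhythmia2": [0.55, 0.58, 0.52, 0.57],
}

# One-way ANOVA: does the feature's mean differ across classes?
f_stat, p_value = f_oneway(*groups.values())

# Keep the feature for classification only if it discriminates
# (p < 0.02, the threshold reported in the text).
keep_feature = p_value < 0.02
print(f"F = {f_stat:.2f}, p = {p_value:.4g}, keep = {keep_feature}")
```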
An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses data from 330 subjects spanning five different cardiac conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%, and the system is ready to run on larger data sets. In EEG analysis, the search for hidden information for the identification of seizures has a long history. Epilepsy is a pathological condition characterized by the spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. Automatic early detection of seizure onsets would help patients and observers take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by the higher order spectra (HOS) has been reported to be a promising approach for differentiating between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a support vector machine (SVM) classifier. Results show that the classifiers achieved 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study. This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns that are useful for visual interpretation by readers without a deep understanding of spectral analysis, such as medical practitioners.
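The sensitivity and specificity figures reported for the cardiac classifier follow directly from confusion counts; a minimal sketch with hypothetical counts (the per-class breakdown of the 330 subjects is not given in the text):

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical confusion counts for a diseased-vs-normal split;
# these are chosen to reproduce the reported 90% / 89% figures,
# not taken from the thesis.
sens, spec = sensitivity_specificity(tp=90, fn=10, tn=89, fp=11)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```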
The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy measures, in analyzing the statistical properties of these features on real data, and in automated classification using them with GMM and SVM classifiers.

Relevance:

100.00%

Publisher:

Abstract:

Longline hook rates of bigeye and yellowfin tunas in the eastern Pacific Ocean were standardized by maximum depth of fishing, area, and season, using generalized linear models (GLMs). The annual trends of the standardized hook rates differ from the unstandardized ones, and are more likely to represent the changes in abundance of tunas in the age groups most vulnerable to longliners in the fishing grounds. For both species, all of the interactions in the GLMs involving years, depths of fishing, areas, and seasons were significant. This means that the annual trends in hook rates depend on which depths, areas, and seasons are being considered. The overall average hook rates for each species were estimated by weighting each 5-degree quadrangle equally and each season by the number of months in it. Since the annual trends in hook rates for each fishing depth category are roughly the same for bigeye, total average annual hook rate estimates are possible with the GLM. For yellowfin, the situation is less clear because of a preponderance of empty cells in the model. The full models explained 55% of the variation in bigeye hook rate and 33% of that of yellowfin. (PDF contains 19 pages.)
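The weighting scheme described, each 5-degree quadrangle weighted equally and each season weighted by its number of months, amounts to a two-stage weighted mean; a sketch with invented hook rates:

```python
# Hypothetical standardized hook rates (fish per 100 hooks) by
# 5-degree quadrangle and season; values are illustrative only.
rates = {
    "quad_A": {"Q1": 1.2, "Q2": 0.8, "Q3": 1.0, "Q4": 0.6},
    "quad_B": {"Q1": 2.0, "Q2": 1.6, "Q3": 1.8, "Q4": 1.4},
}
# Months contained in each season (here four 3-month seasons).
season_months = {"Q1": 3, "Q2": 3, "Q3": 3, "Q4": 3}

def overall_hook_rate(rates, season_months):
    total_months = sum(season_months.values())
    # Within each quadrangle, weight seasons by their number of months...
    quad_means = [
        sum(r[s] * m for s, m in season_months.items()) / total_months
        for r in rates.values()
    ]
    # ...then weight every quadrangle equally.
    return sum(quad_means) / len(quad_means)

print(overall_hook_rate(rates, season_months))
```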

Relevance:

100.00%

Publisher:

Abstract:

A brief description is given of a program to carry out a two-way classification analysis of variance on the MICRO 2200, for use in fishery data processing.
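The kind of two-way classification ANOVA such a program performs can be sketched in a few lines; this is a generic textbook decomposition with invented catch data, not the MICRO 2200 program itself:

```python
def two_way_anova(table):
    """Two-way ANOVA without replication: sums of squares for a
    rows x columns table of observations (e.g. stations x months)."""
    r, c = len(table), len(table[0])
    grand = sum(map(sum, table)) / (r * c)
    row_means = [sum(row) / c for row in table]
    col_means = [sum(table[i][j] for i in range(r)) / r for j in range(c)]
    # Partition total variation into row, column and residual parts.
    ss_rows = c * sum((m - grand) ** 2 for m in row_means)
    ss_cols = r * sum((m - grand) ** 2 for m in col_means)
    ss_total = sum((x - grand) ** 2 for row in table for x in row)
    ss_error = ss_total - ss_rows - ss_cols
    return ss_rows, ss_cols, ss_error, ss_total

# Illustrative catch data: 3 stations (rows) x 4 sampling occasions.
catches = [[12.0, 14.0, 11.0, 13.0],
           [18.0, 20.0, 17.0, 19.0],
           [9.0, 11.0, 8.0, 10.0]]
print(two_way_anova(catches))
```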

Relevance:

100.00%

Publisher:

Abstract:

The variogram is essential for local estimation and mapping of any variable by kriging. The variogram itself must usually be estimated from sample data. The sampling density is a compromise between precision and cost, but it must be sufficiently dense to encompass the principal spatial sources of variance. A nested, multi-stage sampling scheme with separating distances increasing in geometric progression from stage to stage will do that. The data may then be analyzed by a hierarchical analysis of variance to estimate the components of variance for every stage, and hence lag. By accumulating the components starting from the shortest lag one obtains a rough variogram for modest effort. For balanced designs the analysis of variance is optimal; for unbalanced ones, however, these estimators are not necessarily the best, and analysis by residual maximum likelihood (REML) will usually be preferable. The paper summarizes the underlying theory and illustrates its application with data from three surveys, one in which the design had four stages and was balanced, and two implemented with unbalanced designs to economize when there were more stages. A Fortran program is available for the analysis of variance, and code for the REML analysis is listed in the paper. (c) 2005 Elsevier Ltd. All rights reserved.
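The accumulation step described, building a first-pass variogram by summing the components of variance from the shortest lag upward, is a one-liner in practice; the lags and components below are illustrative, not values from the three surveys:

```python
from itertools import accumulate

# Hypothetical components of variance from a hierarchical ANOVA of a
# nested survey, one per stage; lags are the separating distances (m),
# shortest first, in geometric progression.
lags = [1, 10, 100, 1000]
components = [0.8, 1.3, 2.1, 0.6]  # variance attributable to each stage

# Semivariance at each lag = components accumulated up to that lag.
semivariances = list(accumulate(components))
rough_variogram = dict(zip(lags, semivariances))
print(rough_variogram)
```

The accumulated values are non-decreasing by construction, which matches the expected shape of a variogram rising toward its sill.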

Relevance:

100.00%

Publisher:

Abstract:

An unbalanced nested sampling design was used to investigate the spatial scale of soil and herbicide interactions at the field scale. A hierarchical analysis of variance based on residual maximum likelihood (REML) was used to analyse the data and provide a first estimate of the variogram. Soil samples were taken at 108 locations at a range of separating distances in a 9 ha field to explore small and medium scale spatial variation. Soil organic matter content, pH, particle size distribution, microbial biomass, and the degradation and sorption of the herbicide isoproturon were determined for each soil sample. A large proportion of the spatial variation in isoproturon degradation and sorption occurred at sampling intervals of less than 60 m; the sampling design, however, did not resolve the variation present at scales greater than this. A sampling interval of 20-25 m should ensure that the main spatial structures are identified for isoproturon degradation rate and sorption without too great a loss of information in this field.
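The claim that most of the spatial variation occurs at intervals below 60 m can be checked directly from the estimated variance components; a sketch with hypothetical components (the REML fit itself is not reproduced here):

```python
# Hypothetical variance components for isoproturon degradation rate,
# keyed by the separating distance (m) of each nested stage; these are
# illustrative values, not the study's REML estimates.
components = {2: 0.30, 10: 0.25, 30: 0.20, 60: 0.15, 180: 0.10}

def variance_fraction_below(components, threshold):
    """Fraction of total variance at separating distances < threshold."""
    total = sum(components.values())
    below = sum(v for d, v in components.items() if d < threshold)
    return below / total

print(variance_fraction_below(components, 60))
```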

Relevance:

100.00%

Publisher:

Abstract:

The present study was designed to analyse the average depth of microporosity of a nickel-chromium (Ni-Cr) system alloy (Verabond II). The metal surface was subjected to one of the following surface treatments: (i) electrolytic etching in 0.5 N nitric acid at a current density of 250 mA cm(-2); (ii) chemical etching with CG-Etch etchant; or (iii) sandblasting with 50 μm alumina particles. Half of the samples were polished before the surface treatments. The depth of porosity was measured from photomicrographs (500x) with a profilometer, and the data were statistically analysed using analysis of variance (ANOVA) followed by Tukey's test. The conclusions were: (i) different surface treatments of the Ni-Cr system alloy lead to different depths of microporosity; (ii) the greatest depth of porosity was observed in the non-polished alloy; (iii) the greatest, and statistically identical, depths of microporosity were observed following electrolytic etching and chemical etching; (iv) the least, and statistically identical, depths of microporosity were observed with chemical etching and sandblasting with 50 μm alumina particles; and (v) chemical etching showed an intermediate depth.
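The ANOVA-followed-by-Tukey workflow used in the study is standard; a minimal sketch with invented porosity depths (SciPy's `tukey_hsd`, available from SciPy 1.8, stands in for whatever software the authors used):

```python
from scipy.stats import f_oneway, tukey_hsd

# Hypothetical microporosity depths (micrometres) per surface treatment;
# the real measurements are not reproduced in the abstract.
electrolytic = [4.1, 4.3, 4.0, 4.2]
chemical     = [3.8, 4.0, 3.9, 4.1]
sandblasted  = [2.1, 2.3, 2.0, 2.2]

# Global test first: do the treatment means differ at all?
f_stat, p_value = f_oneway(electrolytic, chemical, sandblasted)

# If so, Tukey's HSD identifies which pairs of treatments differ.
res = tukey_hsd(electrolytic, chemical, sandblasted)
print(p_value)
print(res.pvalue)  # 3x3 matrix of pairwise p-values
```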

Relevance:

100.00%

Publisher:

Abstract:

The aim of this study was to evaluate the effects of the autogenous demineralized dentin matrix (ADDM) on the third molar socket wound healing process in humans, using the guided bone regeneration technique and a polytetrafluoroethylene (PTFE) barrier. Twenty-seven dental sockets were divided into three groups: dental socket (Control), dental socket with PTFE barrier (PTFE), and dental socket with ADDM slices associated with a PTFE barrier (ADDM + PTFE). The dental sockets were subjected to radiographic bone densitometry analysis and statistical analysis on the 15th, 30th, 60th and 90th days using analysis of variance (ANOVA) and Tukey's test (p ≤ 0.05). The radiographic analysis of the ADDM + PTFE group showed greater homogeneity of bone radiopacity than the Control and PTFE groups at all observation times. The dentin matrix gradually disappeared from the dental socket during the course of the repair process, suggesting its resorption during bone remodeling. It was concluded that the radiographic bone density of the dental sockets treated with ADDM was similar to that of the surrounding normal bone on the 90th day. The ADDM was biocompatible with the bone tissue of the surgical wounds of human dental sockets. The radiographic analysis revealed that the repair process was slightly faster in the ADDM + PTFE group than in the Control and PTFE groups, although the difference was not statistically significant. In addition, the radiographic image of the ADDM + PTFE group suggested that its bone architecture was better than that of the Control and PTFE groups.

Relevance:

100.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

100.00%

Publisher:

Abstract:

Concentrations of 39 organic compounds were determined in three fractions (head, heart and tail) obtained from the pot still distillation of fermented sugarcane juice. The results were evaluated using analysis of variance (ANOVA), Tukey's test, principal component analysis (PCA), hierarchical cluster analysis (HCA) and linear discriminant analysis (LDA). According to the PCA and HCA, the experimental data led to the formation of three clusters. The head fractions gave rise to the most clearly defined group, while the heart and tail fractions showed some overlap, consistent with their acid composition. The predictive abilities of the LDA classification model for the three fractions were 90.5% in calibration and 100% in validation. This model recognized twelve of the thirteen commercial cachaças (92.3%) with good sensory characteristics as heart fractions, thus showing potential for guiding the process of making cuts.
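The PCA step that separates the three fractions can be sketched with plain NumPy; the two-compound concentration matrix below is invented for illustration (the study measured 39 compounds), and NumPy's SVD stands in for the authors' software:

```python
import numpy as np

# Invented concentrations (rows = distillate samples, columns = compounds);
# the first four rows mimic head fractions, the last four heart fractions.
X = np.array([
    [9.0, 1.0], [8.5, 1.2], [9.2, 0.9], [8.8, 1.1],   # "head"
    [3.0, 4.0], [3.2, 4.2], [2.8, 3.9], [3.1, 4.1],   # "heart"
])

# PCA: centre the data, then take the SVD; the principal components
# are the right singular vectors.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt.T               # samples projected onto the PCs
explained = s**2 / np.sum(s**2)  # fraction of variance per PC

print(explained)      # PC1 carries nearly all the variance here
print(scores[:, 0])   # and separates the two simulated fractions
```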

Relevance:

100.00%

Publisher:

Abstract:

Background: Specific research tools and designs can assist in identifying the efficiency of physical activity in elderly women. Objectives: To identify the effects of physical activity on the physical condition of older women. Method: A one-year physical activity program (123 sessions) was implemented for women aged 60 years or older. Four physical assessments were conducted, in which weight, height, BMI, blood pressure, heart rate, absences, grip strength, flexibility, VO2max, and static and dynamic balance were assessed. The statistical analyses included repeated-measures analyses, both inferential (analysis of variance, ANOVA) and of effect size (Cohen's d coefficient), as well as identification of the participants' efficiency (data envelopment analysis, DEA). Results: Although the differences observed depended on the analysis used, the results showed that physical activity adapted to older women can effectively counter the decline in physical ability associated with aging, depending on the purpose of the study. The 60-65 yrs group was the most capable of converting physical activity into health benefits in both the short and long term, whereas the >65 yrs group took less advantage of physical activity. Conclusions: Adherence to the program and the actual time spent on each type of exercise are the factors that determine which population can benefit from physical activity programs. The DEA allows the results to be assessed in terms of the health returns on time spent on physical activity. Article registered in ClinicalTrials.gov under number NCT01558401.
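Cohen's d, the effect-size measure used alongside the ANOVA here, is simple to compute; the before/after scores below are hypothetical:

```python
from statistics import mean, stdev

def cohens_d(a, b):
    """Cohen's d: mean difference scaled by the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * stdev(a) ** 2
                  + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(b) - mean(a)) / pooled_var ** 0.5

# Hypothetical scores from two assessments of one physical measure;
# the study's actual data are not reproduced in the abstract.
before = [2.0, 4.0, 6.0]
after  = [3.0, 5.0, 7.0]
print(cohens_d(before, after))  # 0.5: a "medium" effect by Cohen's rule of thumb
```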

Relevance:

100.00%

Publisher:

Abstract:

The electroencephalogram (EEG) is a physiological time series that measures electrical activity at different locations in the brain, and plays an important role in epilepsy research. Exploring the variance and/or volatility may yield insights for seizure prediction, seizure detection and seizure propagation/dynamics. Maximal overlap discrete wavelet transforms (MODWTs) and ARMA-GARCH models were used to determine the variance and volatility characteristics of 66 channels for different states of an epileptic EEG: sleep, awake, sleep-to-awake and seizure. The wavelet variances, changes in wavelet variances and volatility half-lives for the four states were compared for possible differences between seizure and non-seizure channels. The half-lives of two of the three seizure channels were found to be shorter than those of all of the non-seizure channels, based on 95% CIs for the pre-seizure and awake signals. No discernible patterns were found in the change points of the wavelet variances for the different signals.
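The volatility half-life compared across channels is conventionally derived from the GARCH(1,1) persistence α + β; a minimal sketch under that standard definition, with illustrative parameter values rather than the fitted EEG values:

```python
import math

def garch_half_life(alpha, beta):
    """Half-life of a volatility shock in a GARCH(1,1) model: the number
    of periods until the shock's effect decays to half its size,
    h = ln(0.5) / ln(alpha + beta), requiring alpha + beta < 1."""
    persistence = alpha + beta
    if not 0 < persistence < 1:
        raise ValueError("model must be stationary: 0 < alpha + beta < 1")
    return math.log(0.5) / math.log(persistence)

# Illustrative parameters: a channel with low persistence (volatility
# shocks die out quickly) vs one with high persistence.
print(garch_half_life(0.20, 0.30))   # persistence 0.5 -> half-life 1 period
print(garch_half_life(0.05, 0.90))   # persistence 0.95 -> much longer
```

Under this definition, the finding that seizure channels have shorter half-lives corresponds to lower GARCH persistence in those channels.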