418 results for ANOVA, analysis of variance
at Queensland University of Technology - ePrints Archive
Abstract:
The theory of nonlinear dynamic systems provides new methods for handling complex systems. Chaos theory offers new concepts, algorithms and methods for processing, enhancing and analyzing measured signals. In recent years, researchers have been applying concepts from this theory to bio-signal analysis. In this work, the complex dynamics of bio-signals such as the electrocardiogram (ECG) and electroencephalogram (EEG) are analyzed using the tools of nonlinear systems theory. In modern industrialized countries, several hundred thousand people die every year from sudden cardiac death. The electrocardiogram (ECG) is an important bio-signal representing the sum total of millions of cardiac cell depolarization potentials. It contains important insights into the state of health and the nature of the disease afflicting the heart. Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool for observing the heart's ability to respond to the normal regulatory impulses that affect its rhythm. A computer-based intelligent system for analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are nonlinear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of nonlinear systems and provides good noise immunity. In this work, we studied the HOS of HRV signals from normal heartbeats and four classes of arrhythmia. This thesis presents some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. Several features were extracted from the HOS and subjected to an analysis of variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification, with a number of features yielding a p-value < 0.02 in the ANOVA test. An automated intelligent system for the identification of cardiac health is very useful in healthcare technology. In this work, seven features were extracted from the heart rate signals using HOS and fed to a support vector machine (SVM) for classification. The performance evaluation protocol in this thesis uses 330 subjects covering five different cardiac disease conditions. The classifier achieved a sensitivity of 90% and a specificity of 89%. This system is ready to run on larger data sets. In EEG analysis, the search for hidden information for the identification of seizures has a long history. Epilepsy is a pathological condition characterized by the spontaneous and unforeseeable occurrence of seizures, during which the perception or behavior of patients is disturbed. Automatic early detection of seizure onset would help patients and observers take appropriate precautions. Various methods have been proposed to predict the onset of seizures based on EEG recordings. The use of nonlinear features motivated by higher order spectra (HOS) has been reported to be a promising approach for differentiating between normal, background (pre-ictal) and epileptic EEG signals. In this work, these features are used to train both a Gaussian mixture model (GMM) classifier and a support vector machine (SVM) classifier. Results show that the classifiers achieved 93.11% and 92.67% classification accuracy, respectively, with selected HOS-based features. About 2 hours of EEG recordings from 10 patients were used in this study.
This thesis introduces unique bispectrum and bicoherence plots for various cardiac conditions and for normal, background and epileptic EEG signals. These plots reveal distinct patterns that are useful for visual interpretation by those without a deep understanding of spectral analysis, such as medical practitioners. The thesis includes original contributions in extracting features from HRV and EEG signals using HOS and entropy, in analyzing the statistical properties of such features on real data, and in automated classification using these features with GMM and SVM classifiers.
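As an illustration of the kind of pipeline described above, the sketch below computes a segment-averaged bispectrum estimate for an HRV series, derives two simple HOS features (mean bispectral magnitude and a bispectral entropy), and trains an SVM. The feature set, the toy data and the function names are assumptions for illustration; this is not the thesis's exact feature extraction.

```python
# Minimal sketch (not the thesis pipeline): segment-averaged bispectrum
# estimate of an HRV series, two illustrative HOS features, and an SVM.
import numpy as np
from sklearn.svm import SVC

def bispectrum(x, nfft=128, seg_len=128):
    """Direct bispectrum estimate B(f1, f2) averaged over non-overlapping segments."""
    segs = [x[i:i + seg_len] for i in range(0, len(x) - seg_len + 1, seg_len)]
    B = np.zeros((nfft, nfft), dtype=complex)
    idx = np.arange(nfft)
    for s in segs:
        X = np.fft.fft(s - np.mean(s), nfft)
        # B(f1, f2) = X(f1) X(f2) conj(X(f1 + f2)), frequency indices taken modulo nfft
        B += np.outer(X, X) * np.conj(X[(idx[:, None] + idx[None, :]) % nfft])
    return B / max(len(segs), 1)

def hos_features(x):
    """Two illustrative features: mean bispectral magnitude and bispectral entropy."""
    mag = np.abs(bispectrum(x))
    p = mag / mag.sum()
    entropy = -np.sum(p * np.log(p + 1e-12))
    return [mag.mean(), entropy]

# Hypothetical data: hrv_signals is a list of RR-interval series, labels the class.
rng = np.random.default_rng(0)
hrv_signals = [rng.standard_normal(512) for _ in range(40)]
labels = rng.integers(0, 2, size=40)

X = np.array([hos_features(s) for s in hrv_signals])
clf = SVC(kernel="rbf").fit(X, labels)
print(clf.score(X, labels))
```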
Abstract:
Recently, mean-variance analysis has been proposed as a novel paradigm for modeling document ranking in Information Retrieval. The main merit of this approach is that it diversifies the ranking of retrieved documents. In its original formulation, the strategy considers both the mean of relevance estimates of retrieved documents and their variance. However, when this strategy has been empirically instantiated, the concepts of mean and variance are discarded in favour of a point-wise estimation of relevance (to replace the mean) and of a parameter to be tuned or, alternatively, a quantity dependent upon the document length (to replace the variance). In this paper we revisit this ranking strategy by going back to its roots: mean and variance. For each retrieved document, we infer a relevance distribution from a series of point-wise relevance estimations provided by a number of different systems. This is used to compute the mean and the variance of document relevance estimates. On the TREC ClueWeb collection, we show that this approach improves retrieval performance. This development could lead to new strategies for addressing the fusion of relevance estimates provided by different systems.
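The core of the revisited strategy, ranking by the per-document mean and variance of relevance estimates pooled from several systems, can be sketched as follows. The risk weight b and the score matrix are illustrative assumptions, and the paper's full formulation (which also accounts for dependence between documents) is not reproduced here.

```python
# Minimal sketch of mean-variance ranking: pool relevance scores from
# several systems, then order documents by mean minus b * variance.
import numpy as np

def mean_variance_rank(scores, b=0.5):
    """scores: array of shape (n_systems, n_documents)."""
    mean = scores.mean(axis=0)          # point-wise relevance estimate per document
    var = scores.var(axis=0, ddof=1)    # disagreement between systems
    objective = mean - b * var          # penalise uncertain (high-variance) documents
    return np.argsort(-objective)       # document indices, best first

# Illustrative scores from three systems for three documents.
scores = np.array([[0.9, 0.6, 0.4],
                   [0.5, 0.7, 0.4],
                   [0.1, 0.6, 0.5]])
print(mean_variance_rank(scores, b=0.5))
```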
Abstract:
Gene expression is arguably the most important indicator of biological function. Thus, identifying differentially expressed genes is one of the main aims of high-throughput studies that use microarray and RNAseq platforms to study deregulated cellular pathways. There are many tools for analysing differential gene expression from transcriptomic datasets. The major challenge of this topic is to estimate gene expression variance, given the high amount of 'background noise' generated by biological equipment and the lack of biological replicates. Bayesian inference has been widely used in the bioinformatics field. In this work, we reveal that the prior knowledge employed in the Bayesian framework also helps to improve the accuracy of differential gene expression analysis when using a small number of replicates. We have developed a differential analysis tool that uses Bayesian estimation of the variance of gene expression for use with small numbers of biological replicates. Our method is more consistent than the widely used Cyber-T tool, which successfully introduced the Bayesian framework to differential analysis. We also provide a user-friendly web-based graphical user interface for biologists to use with microarray and RNAseq data. Bayesian inference can compensate for the instability of variance estimates caused by a small number of biological replicates by using pseudo-replicates as prior knowledge. We also show that our new strategy for selecting pseudo-replicates improves the performance of the analysis.
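One common way to realise this idea, in the spirit of Cyber-T-style moderated tests, is to shrink each gene's observed variance toward a prior variance obtained from pseudo-replicates before computing a t-statistic. The sketch below illustrates that shrinkage; the weighting, the degrees of freedom and the values are assumptions for illustration, not the exact estimator of the tool described above.

```python
# Sketch of Bayesian variance shrinkage for a gene with few replicates:
# the observed variance is pulled toward a prior variance sigma0_sq
# (e.g. derived from pseudo replicates / genes with similar expression).
import numpy as np
from scipy import stats

def moderated_t(group_a, group_b, sigma0_sq, nu0=10):
    def shrunk_var(x):
        n, s2 = len(x), np.var(x, ddof=1)
        # weighted combination of prior and observed variance (illustrative form)
        return (nu0 * sigma0_sq + (n - 1) * s2) / (nu0 + n - 2)
    va, vb = shrunk_var(group_a), shrunk_var(group_b)
    se = np.sqrt(va / len(group_a) + vb / len(group_b))
    t = (np.mean(group_a) - np.mean(group_b)) / se
    df = len(group_a) + len(group_b) - 2   # classical df kept for simplicity
    return t, 2 * stats.t.sf(abs(t), df)

# Hypothetical expression values for one gene, 3 replicates per condition.
print(moderated_t([7.1, 7.4, 6.9], [8.2, 8.6, 8.1], sigma0_sq=0.05))
```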
Abstract:
The approach of generalized estimating equations (GEE) is based on the framework of generalized linear models but allows specification of a working correlation matrix for modeling within-subject correlations. The variance is often assumed to be a known function of the mean. This article investigates the impact of misspecifying the variance function on estimators of the mean parameters for quantitative responses. Our numerical studies indicate that (1) correct specification of the variance function can improve estimation efficiency even if the correlation structure is misspecified; (2) misspecification of the variance function has a much larger impact on estimators for within-cluster covariates than for cluster-level covariates; and (3) if the variance function is misspecified, correct choice of the correlation structure may not necessarily improve estimation efficiency. We illustrate the impact of different variance functions using a real data set on cow growth.
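A minimal way to see the role of the variance function is to fit the same mean model under two different variance assumptions while holding the working correlation fixed. The sketch below does this with statsmodels GEE on a simulated longitudinal data set; the data frame and formula are illustrative, not the cow-growth data of the article.

```python
# Sketch: the same mean model fitted under two variance functions with GEE,
# keeping an exchangeable working correlation in both cases.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
df = pd.DataFrame({
    "cow": np.repeat(np.arange(20), 5),   # 20 animals, 5 repeated measures each
    "time": np.tile(np.arange(5), 20),
})
df["weight"] = 150 + 8 * df["time"] + rng.normal(0, 5, len(df))

for family in (sm.families.Gaussian(),                                    # Var(y) constant
               sm.families.Gamma(link=sm.families.links.Identity())):     # Var(y) proportional to mu^2
    model = smf.gee("weight ~ time", groups="cow", data=df,
                    family=family, cov_struct=sm.cov_struct.Exchangeable())
    print(type(family).__name__, model.fit().params.round(3).to_dict())
```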
Abstract:
Heart rate variability (HRV) refers to the regulation of the sinoatrial node, the natural pacemaker of the heart, by the sympathetic and parasympathetic branches of the autonomic nervous system. Heart rate variability analysis is an important tool to observe the heart's ability to respond to normal regulatory impulses that affect its rhythm. A computer-based intelligent system for analysis of cardiac states is very useful in diagnostics and disease management. Like many bio-signals, HRV signals are nonlinear in nature. Higher order spectral analysis (HOS) is known to be a good tool for the analysis of nonlinear systems and provides good noise immunity. In this work, we studied the HOS of the HRV signals of normal heartbeat and seven classes of arrhythmia. We present some general characteristics for each of these classes of HRV signals in the bispectrum and bicoherence plots. We also extracted features from the HOS and performed an analysis of variance (ANOVA) test. The results are very promising for cardiac arrhythmia classification with a number of features yielding a p-value < 0.02 in the ANOVA test.
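A minimal illustration of the ANOVA screening step described above, using hypothetical values of a single HOS-derived feature across three classes:

```python
# One-way ANOVA on one HOS-derived feature across arrhythmia classes
# (the values are hypothetical, for illustration only).
from scipy import stats

feature_by_class = {
    "normal":  [0.41, 0.39, 0.44, 0.40],
    "class_a": [0.55, 0.58, 0.52, 0.57],
    "class_b": [0.33, 0.30, 0.35, 0.31],
}
f_stat, p_value = stats.f_oneway(*feature_by_class.values())
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")  # feature retained if p < 0.02
```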
Abstract:
This naturalistic study investigated the mechanisms of change in measures of negative thinking and in 24-h urinary metabolites of noradrenaline (norepinephrine), dopamine and serotonin in a sample of 43 depressed hospital patients attending an eight-session group cognitive behavior therapy program. Most participants (91%) were taking antidepressant medication throughout the therapy period according to their treating psychiatrists' prescriptions. The sample was divided into outcome categories (19 Responders and 24 Non-responders) on the basis of a clinically reliable change index [Jacobson, N.S., & Truax, P., 1991. Clinical significance: a statistical approach to defining meaningful change in psychotherapy research. Journal of Consulting and Clinical Psychology, 59, 12–19.] applied to the Beck Depression Inventory scores at the end of therapy. Results of repeated measures analyses of variance (ANOVA) indicated that all measures of negative thinking improved significantly during therapy, and significantly more so in the Responders, as expected. The treatment had a significant impact on urinary adrenaline and metadrenaline excretion; however, these changes occurred in both Responders and Non-responders. Acute treatment did not significantly influence the six other monoamine metabolites. In summary, changes in urinary monoamine levels during combined treatment for depression were not associated with self-reported changes in mood symptoms.
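The clinically reliable change index of Jacobson and Truax used above to define Responders is commonly computed as in the sketch below; the Beck Depression Inventory reliability value shown is a placeholder, not the figure used in the study.

```python
# Reliable change index (Jacobson & Truax, 1991), as commonly computed:
# RCI = (post - pre) / S_diff, with S_diff = SD_pre * sqrt(2 * (1 - r_xx)).
# |RCI| > 1.96 indicates reliable change. The reliability r_xx below is a
# placeholder, not the value used in this study.
import math

def reliable_change_index(pre, post, sd_pre, reliability):
    s_diff = sd_pre * math.sqrt(2 * (1 - reliability))
    return (post - pre) / s_diff

print(reliable_change_index(pre=32, post=14, sd_pre=9.0, reliability=0.90))
```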
Abstract:
The performance of an adaptive filter may be studied through the behaviour of the optimal and adaptive coefficients in a given environment. This thesis investigates the performance of finite impulse response adaptive lattice filters for two classes of input signals: (a) frequency modulated signals with polynomial phases of order p in complex Gaussian white noise (as nonstationary signals), and (b) impulsive autoregressive processes with alpha-stable distributions (as non-Gaussian signals). Initially, an overview is given of linear prediction and adaptive filtering. The convergence and tracking properties of the stochastic gradient algorithms are discussed for stationary and nonstationary input signals. It is explained that the stochastic gradient lattice algorithm has many advantages over the least-mean square algorithm. Among these advantages are a modular structure, easily guaranteed stability, less sensitivity to the eigenvalue spread of the input autocorrelation matrix, and easy quantization of the filter coefficients (normally called reflection coefficients). We then characterize the performance of the stochastic gradient lattice algorithm for frequency modulated signals through the optimal and adaptive lattice reflection coefficients. This is a difficult task due to the nonlinear dependence of the adaptive reflection coefficients on the preceding stages and the input signal. To ease the derivations, we assume that the reflection coefficients of each stage are independent of the inputs to that stage. The optimal lattice filter is then derived for frequency modulated signals by computing the optimal values of the residual errors, reflection coefficients, and recovery errors. Next, we show the tracking behaviour of the adaptive reflection coefficients for frequency modulated signals by computing the average tracking model of these coefficients under the stochastic gradient lattice algorithm. The second-order convergence of the adaptive coefficients is investigated by modeling the theoretical asymptotic variance of the gradient noise at each stage. The accuracy of the analytical results is verified by computer simulations. Using the previous analytical results, we show a new property, the polynomial-order reducing property of adaptive lattice filters, which may be used to reduce the order of the polynomial phase of input frequency modulated signals. Considering two examples, we show how this property may be used in processing frequency modulated signals. In the first example, a detection procedure is carried out on a frequency modulated signal with a second-order polynomial phase in complex Gaussian white noise. We show that using this technique a better probability of detection is obtained for the reduced-order phase signals than with the traditional energy detector. It is also empirically shown that the distribution of the gradient noise in the first adaptive reflection coefficient approximates the Gaussian law. In the second example, the instantaneous frequency of the same observed signal is estimated. We show that this technique achieves a lower mean square error for the estimated frequencies at high signal-to-noise ratios in comparison to the adaptive line enhancer. The performance of adaptive lattice filters is then investigated for the second type of input signals, i.e., impulsive autoregressive processes with alpha-stable distributions.
The concept of alpha-stable distributions is first introduced. We discuss how the stochastic gradient algorithm, which performs well for finite-variance input signals (such as frequency modulated signals in noise), does not converge quickly for infinite-variance stable processes, because it relies on the minimum mean-square error criterion. To deal with such problems, the minimum dispersion criterion, fractional lower-order moments, and recently developed algorithms for stable processes are introduced. We then study the possibility of using the lattice structure for impulsive stable processes. Accordingly, two new algorithms, the least-mean p-norm lattice algorithm and its normalized version, are proposed for lattice filters based on fractional lower-order moments. Simulation results show that the proposed algorithms achieve faster convergence for parameter estimation of autoregressive stable processes with low to moderate degrees of impulsiveness in comparison to many other algorithms. We also discuss the effect of the impulsiveness of stable processes on the misalignment between the estimated parameters and the true values. Due to the infinite variance of stable processes, the performance of the proposed algorithms is investigated only through extensive computer simulations.
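For orientation, the sketch below shows a basic least-mean-square-style gradient adaptive lattice update for a real-valued signal, i.e. the structure that the proposed least-mean p-norm lattice algorithm modifies. The p-norm and normalized updates themselves are not reproduced here, and the step size and test signal are illustrative assumptions.

```python
# Minimal gradient adaptive lattice (LMS-style) predictor for a real signal.
# The thesis's least-mean p-norm lattice replaces these squared-error updates
# with fractional lower-order moment (dispersion) criteria, not shown here.
import numpy as np

def gradient_adaptive_lattice(x, order=4, mu=0.01):
    k = np.zeros(order)             # reflection coefficients
    b_prev = np.zeros(order + 1)    # backward prediction errors at the previous sample
    history = []
    for n in range(len(x)):
        f = np.zeros(order + 1)
        b = np.zeros(order + 1)
        f[0] = b[0] = x[n]
        for m in range(1, order + 1):
            f[m] = f[m - 1] + k[m - 1] * b_prev[m - 1]
            b[m] = b_prev[m - 1] + k[m - 1] * f[m - 1]
            # stochastic gradient step on the stage-m squared prediction errors
            k[m - 1] -= mu * (f[m] * b_prev[m - 1] + b[m] * f[m - 1])
        b_prev = b
        history.append(k.copy())
    return np.array(history)

rng = np.random.default_rng(0)
x = np.sin(0.2 * np.pi * np.arange(2000)) + 0.1 * rng.standard_normal(2000)
print(gradient_adaptive_lattice(x)[-1])   # final reflection coefficients
```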
Abstract:
Purpose: To undertake rigorous psychometric testing of the newly developed contemporary work environment measure (the Brisbane Practice Environment Measure [B-PEM]) using exploratory factor analysis and confirmatory factor analysis. Methods: Content validity of the 33-item measure was established by a panel of experts. Initial testing involved 195 nursing staff using principal component factor analysis with varimax rotation (orthogonal) and Cronbach's alpha coefficients. Confirmatory factor analysis was conducted using data from a further 983 nursing staff. Results: Principal component factor analysis yielded a four-factor solution with eigenvalues greater than 1 that explained 52.53% of the variance. These factors were then verified using confirmatory factor analysis. Goodness-of-fit indices showed an acceptable fit overall with the full model, explaining 21% to 73% of the variance. Deletion of items took place throughout the evolution of the instrument, resulting in a 26-item, four-factor measure called the Brisbane Practice Environment Measure-Tested. Conclusions: The B-PEM has undergone rigorous psychometric testing, providing evidence of internal consistency and goodness-of-fit indices within acceptable ranges. The measure can be utilised as a subscale or total score reflective of a contemporary nursing work environment. Clinical Relevance: An up-to-date instrument to measure practice environment may be useful for nursing leaders to monitor the workplace and to assist in identifying areas for improvement, facilitating greater job satisfaction and retention.
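For reference, Cronbach's alpha, used above to assess internal consistency, can be computed from an item-response matrix as in the sketch below; the responses shown are illustrative, not B-PEM data.

```python
# Cronbach's alpha for a set of items:
# alpha = k / (k - 1) * (1 - sum(item variances) / variance of total score).
import numpy as np

def cronbach_alpha(items):
    """items: array of shape (n_respondents, n_items)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Illustrative Likert responses from four respondents to four items.
responses = np.array([[4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3]])
print(round(cronbach_alpha(responses), 3))
```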
Abstract:
The problem of delays in the construction industry is a global phenomenon, and the construction industry in Brunei Darussalam is no exception. The goal of all parties involved in construction projects – owners, contractors, engineers and consultants in either the public or private sector – is to successfully complete the project on schedule, within the planned budget, with the highest quality and in the safest manner. Construction projects are frequently influenced either by success factors that help project parties reach their goal as planned, or by delay factors that stifle or postpone project completion. The purpose of this research is to identify success and delay factors that can help project parties reach their intended goals with greater efficiency. This research extracted seven of the most important success factors according to the literature and seven of the most important delay factors identified by project parties, and then examined the correlations between them to determine which were the most influential in preventing project delays. A comprehensive literature review was used to design surveys investigating success and delay factors. A specific survey was distributed to owners, contractors and engineers to examine the most critical delay factors, and a general survey was distributed to examine the correlation between the identified delay factors and the seven most important critical success factors selected. A consensus of expert opinion obtained through the Delphi methodology was used to rank the most needed critical success factors for Brunei building construction. Data were collected and evaluated by statistical methods to identify the most significant causes of delay, to measure the strength and direction of the relationship between critical success factors and delay factors, to examine project parties' evaluation of projects' critical success and delay factors, and to evaluate the influence of critical success factors on critical delay factors. A relative importance index was used to determine the relative importance of the various causes of delay. One- and two-way analyses of variance (ANOVA) were used to examine how the groups evaluated the influence of the critical success factors in avoiding or preventing each of the delay factors, and which success factors were perceived as most influential in avoiding or preventing critical delay factors. Finally, the Delphi method, using consensus from an expert panel, was employed to identify the seven most critical success factors used to avoid the delay factors and thereby improve project performance.
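The relative importance index referred to above is commonly defined in construction-delay surveys as the sum of respondents' ratings divided by the product of the highest possible rating and the number of respondents; a minimal sketch with illustrative ratings follows.

```python
# Relative importance index as commonly defined in construction-delay surveys:
# RII = sum(W) / (A * N), where W are the respondents' ratings, A is the
# highest possible rating and N the number of respondents.
# The ratings below are illustrative, not survey data from this study.
def relative_importance_index(ratings, highest_rating=5):
    return sum(ratings) / (highest_rating * len(ratings))

delay_factor_ratings = [5, 4, 4, 3, 5, 4, 2, 5]   # e.g. Likert scores, 1-5
print(round(relative_importance_index(delay_factor_ratings), 3))
```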
Abstract:
Venous leg ulceration is a serious condition affecting 1–3% of the population. Decline in the function of the calf muscle pump is correlated with venous ulceration. Many previous studies have reported an improvement in the function of the calf muscle pump, endurance of the calf muscle and increased range of ankle motion after structured exercise programs. However, there is a paucity of published research assessing whether these improvements translate into improved healing rates of venous ulcers. The primary purpose of this pilot study was to establish the feasibility of a home-based progressive resistance exercise program and examine whether there was any clinical significance or trend toward healing. The secondary aims were to examine the benefit of a home-based progressive resistance exercise program on calf muscle pump function and physical parameters. The methodology used was a randomised controlled trial in which eleven participants were randomised into an intervention (n = 6) or control group (n = 5). Participants who were randomised to receive a 12-week home-based progressive resistance exercise program were instructed through weekly face-to-face consultations during their wound clinic appointment by the author. Control group participants received standard wound care and compression therapy. Changes in ulcer parameters were measured fortnightly at the clinic (number healed at 12 weeks, percentage change in area and pressure ulcer healing score). An air plethysmography test was performed at baseline and following the 12 weeks of training to determine changes in calf muscle pump function. Functional measures included maximum number of heel raises (endurance), maximal isometric plantar flexion (strength) and range of ankle motion (ROAM); these tests were conducted at baseline, week 6 and week 12. The sample for the study was drawn from the Princess Alexandra Hospital in Brisbane, Australia. Participants with venous leg ulceration who met the inclusion criteria were recruited. The participants were screened via duplex scanning and the ankle brachial pressure index (ABPI) to ensure they did not have any arterial complications. Participants were excluded if there was evidence of cellulitis. Demographic data were obtained from each participant, and details regarding medical history, quality of life and geriatric depression scores were collected at baseline. Both the intervention and control groups were required to complete a weekly exercise diary to monitor activity levels between groups. To test for the effect of the intervention over time, a repeated measures analysis of variance was conducted on the major outcome variables. Group (intervention versus control) was the between-subject factor and time (baseline, week 6, week 12) was the within-subject or repeated measures factor. Due to the small sample size, further tests were conducted to check the assumptions of the statistical test to be used. Mauchly's test showed that the sphericity assumption of repeated measures ANOVA was met, and further testing confirmed that the assumption of homogeneity of variance was also met. Data analysis was conducted using the software package SPSS for Windows Release 17.0. The pilot study proved feasible, with all of the intervention participants (n = 6) continuing with the resistance program for the 12-week duration and no deleterious effects noted.
Clinical significance was observed in the intervention group, with a 32% greater change in ulcer size (p = 0.26) than the control group and a 10% greater proportion of ulcers healed (p = 0.74) compared to the control group. Statistical significance was observed for the ejection fraction (p = 0.05), residual volume fraction (p = 0.04) and ROAM (p = 0.01), which all improved significantly in the intervention group over time. These results are encouraging; nevertheless, further investigation seems warranted to examine the effect exercise has on the healing rates of venous leg ulcers, with multiple study sites, a larger sample size and a longer follow-up period.
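The group-by-time repeated measures design described above was analysed in SPSS; an open-source equivalent could look like the sketch below, which assumes the pingouin package and an illustrative long-format data frame rather than the study's data.

```python
# Sketch of the group-by-time repeated measures design in open-source tools
# (the study itself used SPSS 17.0). Data are simulated for illustration.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(2)
rows = []
for pid in range(11):
    group = "intervention" if pid < 6 else "control"
    for week in (0, 6, 12):
        effect = 2.0 * week / 12 if group == "intervention" else 0.0
        rows.append({"id": pid, "group": group, "week": week,
                     "ejection_fraction": 50 + effect + rng.normal(0, 3)})
df = pd.DataFrame(rows)

# Sphericity check (Mauchly's test) and the mixed between/within ANOVA.
print(pg.sphericity(df, dv="ejection_fraction", within="week", subject="id"))
print(pg.mixed_anova(df, dv="ejection_fraction", within="week",
                     subject="id", between="group"))
```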
Abstract:
Background We investigated the geographical variation of water supply and sanitation (WS&S) indicators and their contribution to the risk of schistosomiasis and hookworm infection in school-age children in West Africa. The aim was to predict large-scale geographical variation in WS&S, quantify the attributable risk of S. haematobium, S. mansoni and hookworm infections due to WS&S, and identify communities where sustainable transmission control could be targeted across the region. Methods National cross-sectional household-based demographic health surveys were conducted in 24,542 households in Burkina Faso, Ghana and Mali in 2003–2006. We generated spatially explicit predictions of areas without piped water, toilet facilities and finished floors in West Africa, adjusting for household covariates. Using recently published helminth prevalence data, we developed Bayesian geostatistical models of S. haematobium, S. mansoni and hookworm infection in West Africa, including environmental covariates and the mapped outputs for WS&S. Using these models we estimated the effect of WS&S on parasite risk, quantified their attributable fraction of infection, and mapped the risk of infection in West Africa. Findings Our maps show that most areas in West Africa are very poorly served by water supply, except the major urban centers. There is better geographical coverage for toilet availability and improved household flooring. We estimated smaller attributable risks for water supply in S. mansoni (47%) compared to S. haematobium (71%), and 5% of hookworm cases could be averted by improving sanitation. Greater levels of inadequate sanitation increased the risk of schistosomiasis, and increased levels of unsafe water supply increased the risk of hookworm. The role of floor type for S. haematobium infection (21%) was comparable to that for S. mansoni (16%), but was significantly higher for hookworm infection (86%). The S. haematobium and hookworm maps accounting for WS&S show smaller clusters of maximal prevalence in areas bordering Burkina Faso and Mali. The map of S. mansoni shows that this parasite is much more widespread across the north of the Niger River basin than previously predicted. Interpretation Our maps identify areas where the Millennium Development Goal for water and sanitation is lagging behind. Our results show that WS&S are important contributors to the burden of major helminth infections of children in West Africa. Including information about WS&S as well as the “traditional” environmental risk factors in spatial models of helminth risk yielded a substantial gain both in model fit and in the proportion of spatial variance in helminth risk explained. Mapping the distribution of infection risk adjusted for WS&S allowed the identification of communities in West Africa where integrated preventive chemotherapy and engineering interventions will yield the greatest public health benefits.
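The attributable fractions quoted above correspond, in the standard epidemiological form, to the population attributable fraction for an exposure with prevalence p_e and relative risk RR; the paper's Bayesian model-based estimates may be computed differently, so the formula below is shown only for orientation.

```latex
% Population attributable fraction for an exposure (e.g. unsafe water supply)
% with exposure prevalence p_e and relative risk RR (standard form; the
% paper's model-based estimate may differ).
\[
  \mathrm{PAF} \;=\; \frac{p_e\,(RR - 1)}{1 + p_e\,(RR - 1)}
\]
```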
Abstract:
Modern technology now has the ability to generate large datasets over space and time. Such data typically exhibit high autocorrelations over all dimensions. The field trial data motivating the methods of this paper were collected to examine the behaviour of traditional cropping and to determine a cropping system that could maximise water use for grain production while minimising leakage below the crop root zone. They consist of moisture measurements made at 15 depths across 3 rows and 18 columns, in the lattice framework of an agricultural field. Bayesian conditional autoregressive (CAR) models are used to account for local site correlations. Conditional autoregressive models have not been widely used in analyses of agricultural data. This paper serves to illustrate the usefulness of these models in this field, along with the ease of implementation in WinBUGS, a freely available software package. The innovation is the fitting of separate conditional autoregressive models for each depth layer, the ‘layered CAR model’, while simultaneously estimating depth profile functions for each site treatment. Modelling interest also lay in how best to model the treatment-effect depth profiles, and in the choice of neighbourhood structure for the spatial autocorrelation model. The favoured model fitted the treatment effects as splines over depth, treated depth, the basis for the regression model, as measured with error, and fitted CAR neighbourhood models by depth layer. It is hierarchical, with separate conditional autoregressive spatial variance components at each depth, and the fixed terms, which involve an errors-in-measurement model, treat depth errors as interval-censored measurement error. The Bayesian framework permits transparent specification and easy comparison of the various complex models considered.
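The conditional autoregressive prior used for the spatial effects is commonly specified, for site i given its neighbours j with weights w_ij, as below; the per-depth variance notation is an assumption for illustration rather than the paper's own notation.

```latex
% Conditional specification of a CAR spatial effect S_i given its neighbours,
% with neighbourhood weights w_{ij} and a separate spatial variance
% sigma^2_d for each depth layer d (notation assumed, not from the paper).
\[
  S_i \mid S_{-i} \;\sim\; \mathcal{N}\!\left(
    \frac{\sum_{j \sim i} w_{ij}\, S_j}{\sum_{j \sim i} w_{ij}},\;
    \frac{\sigma^2_d}{\sum_{j \sim i} w_{ij}}
  \right)
\]
```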
Abstract:
Background Although the non-operative management of closed humeral midshaft fractures has been advocated for years, the increasing popularity of operative intervention has left the optimal treatment choice unclear. Objective To compare the outcomes of operative and non-operative treatment of traumatic closed humeral midshaft fractures in adult patients. Methods A multicentre prospective comparative cohort study across 20 centres was conducted. Patients with AO type 12 A2, A3 and B2 fractures were treated with a functional brace or a retrograde-inserted unreamed humeral nail. Follow-up measurements were taken at 6, 12 and 52 weeks after the injury. The primary outcome was fracture healing after 1 year. Secondary outcomes included sub-items of the Constant score, general patient satisfaction, complications and cost-effectiveness parameters. Functions of the uninjured extremity were used as reference parameters. Intention-to-treat analysis was applied with the use of t-tests, Fisher’s exact tests, Mann–Whitney U-tests and adjusted analysis of variance (ANOVA). Results Forty-seven patients were included. The patient sample consisted of 23 women and 24 men, with a mean age of 52.7 years (range 17–86 years). Of the 47 cases, 14 were treated non-operatively and 33 operatively. The follow-up rate at 1 year was 81%. After 1 year, 11 fractures (100%) healed in the non-operative group and at least 24 fractures (≥89%) healed in the operative group [1 non-union patient (4%) and no data for 2 patients (7%)]. There were no significant differences in pain, range of motion (ROM) of the shoulder and elbow, and return to work after 6 weeks, 12 weeks and 1 year. Although operatively treated patients showed significantly greater shoulder abduction strength (p = 0.036), elbow flexion strength (p = 0.021), functional hand positioning (p = 0.008) and return to recreational activities (p = 0.043) after 6 weeks, no statistically significant differences existed in any outcome measure at the 1-year follow-up. Conclusions Our findings indicate that the non-operative management of humeral midshaft fractures can be expected to have similar functional outcomes and patient satisfaction at 1 year, despite an early benefit of operative treatment. If no radiological evidence of fracture healing exists in non-operatively treated patients during early follow-up, a switch to surgical treatment results in good functional outcomes and patient satisfaction. Keywords: Humeral shaft fracture, Non-operative treatment, Functional brace, Operative treatment, Unreamed humeral nail (UHN), Prospective, Cohort study