974 results for Data Interpretation, Statistical
Abstract:
Hemichordates were traditionally allied to the chordates, but recent molecular analyses have suggested that hemichordates are a sister group to the echinoderms, a relationship that has important consequences for the interpretation of the evolution of deuterostome body plans. However, the molecular phylogenetic analyses to date have not provided robust support for the hemichordate + echinoderm clade. We use a maximum likelihood framework, including the parametric bootstrap, to reanalyze DNA data from complete mitochondrial genomes and nuclear 18S rRNA. This approach provides the first statistically significant support for the hemichordate + echinoderm clade from molecular data. This grouping implies that the ancestral deuterostome had features that included an adult with a pharynx and a dorsal nerve cord and an indirectly developing dipleurula-like larva.
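The parametric bootstrap used here follows a generic recipe: fit the null model to the data, simulate many replicate data sets under that fitted model, and compare the observed test statistic with its simulated null distribution. A minimal sketch with a toy normal-mean test (the study's actual statistic is a phylogenetic likelihood comparison, which this simplified example does not reproduce):

```python
import numpy as np

rng = np.random.default_rng(0)

def parametric_bootstrap_pvalue(data, n_boot=2000):
    """Parametric bootstrap test of H0: normal with mean 0.
    Nuisance parameters are fitted under H0; replicate data sets are then
    simulated from the fitted null model and the statistic recomputed."""
    t_obs = abs(data.mean())
    sigma0 = data.std(ddof=1)          # nuisance parameter fitted under H0
    n = len(data)
    t_null = np.array([abs(rng.normal(0.0, sigma0, n).mean())
                       for _ in range(n_boot)])
    return float((t_null >= t_obs).mean())

data = rng.normal(0.5, 1.0, 50)   # true mean 0.5, so H0 should fare badly
p_value = parametric_bootstrap_pvalue(data)
```

The same scheme applies to phylogenies by simulating sequence alignments under the null tree and recomputing the log-likelihood difference for each replicate.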
Abstract:
Phenylalanine hydroxylase (PAH) is the enzyme that converts phenylalanine to tyrosine, the rate-limiting step in phenylalanine catabolism and protein and neurotransmitter biosynthesis. Over 300 mutations have been identified in the gene encoding PAH that result in deficient enzyme activity and lead to the disorders hyperphenylalaninaemia and phenylketonuria. The crystal structure of PAH now allows the structural basis of mutations resulting in PAH deficiency to be determined. We present an analysis of the structural basis of 120 mutations with a 'classified' biochemical phenotype and/or available in vitro expression data. We find that the mutations can be grouped into five structural categories, based on the distinct expected structural and functional effects of the mutations in each category. Missense mutations and small amino acid deletions fall into three categories: 'active site mutations', 'dimer interface mutations', and 'domain structure mutations'. Nonsense mutations and splicing mutations form the category of 'proteins with truncations and large deletions'. The final category, 'fusion proteins', is caused by frameshift mutations. We show that the structural information helps formulate rules for predicting the likely effects of unclassified and newly discovered mutations: proteins with truncations and large deletions, fusion proteins and active site mutations generally cause severe phenotypes; domain structure mutations and dimer interface mutations spread over a range of phenotypes, but domain structure mutations in the catalytic domain are more likely to be severe than domain structure mutations in the regulatory domain or dimer interface mutations.
Abstract:
The problem of negative values of the interaction parameter in the Frumkin equation has been analyzed with respect to the adsorption of nonionic molecules on an energetically homogeneous surface. For this purpose, the adsorption states of a homologous series of ethoxylated nonionic surfactants at the air/water interface have been determined using four different models and literature data (surface tension isotherms). The results obtained with the Frumkin adsorption isotherm imply repulsion between the adsorbed species (corresponding to negative values of the interaction parameter), while the classical lattice theory for an energetically homogeneous surface (e.g., water/air) admits attraction alone. It appears that this serious contradiction can be overcome by assuming heterogeneity in the adsorption layer, that is, effects of partial condensation (formation of aggregates) on the surface. Such a phenomenon is suggested in the Fainerman-Lucassen-Reynders-Miller (FLM) 'aggregation model'. Despite the limitations of the latter model (e.g., monodispersity of the aggregates), we have been able to estimate the sign and the order of magnitude of Frumkin's interaction parameter and the range of the aggregation numbers of the surface species. (C) 2004 Elsevier B.V. All rights reserved.
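For reference, the Frumkin isotherm relates surface coverage theta to bulk concentration c via b*c = theta/(1-theta)*exp(-2*a*theta), where a is the interaction parameter. A small sketch solving the isotherm for theta by bisection, under the sign convention (assumed here) that a > 0 means attraction and a < 0 repulsion, and valid where the isotherm is single-valued (moderate |a|):

```python
import numpy as np

def frumkin_coverage(c, b=1.0, a=0.0, tol=1e-10):
    """Solve the Frumkin isotherm  b*c = theta/(1-theta)*exp(-2*a*theta)
    for the surface coverage theta by bisection on (0, 1)."""
    f = lambda th: th / (1.0 - th) * np.exp(-2.0 * a * th) - b * c
    lo, hi = 0.0, 1.0 - 1e-12
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

theta_ideal = frumkin_coverage(1.0)              # a = 0: Langmuir limit, 0.5
theta_repulsive = frumkin_coverage(1.0, a=-1.0)  # repulsion lowers coverage
```

With a = 0 the equation reduces to the Langmuir isotherm; a negative a (the puzzling case discussed above) suppresses coverage at the same concentration.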
Abstract:
Background: Randomized trials that studied clinical outcomes after percutaneous coronary intervention (PCI) with bare metal stenting versus coronary artery bypass grafting (CABG) are underpowered to properly assess safety end points such as death, stroke, and myocardial infarction. Pooling data from randomized controlled trials increases the statistical power and allows better assessment of the treatment effect in high-risk subgroups. Methods and Results: We performed a pooled analysis of 3051 patients in 4 randomized trials evaluating the relative safety and efficacy of PCI with stenting and CABG at 5 years for the treatment of multivessel coronary artery disease. The primary end point was the composite of death, stroke, or myocardial infarction. The secondary end points were major adverse cardiac and cerebrovascular events, death, stroke, myocardial infarction, and repeat revascularization. We tested for heterogeneity of treatment effect in patient subgroups. At 5 years, the cumulative incidence of death, myocardial infarction, and stroke was similar in patients randomized to PCI with stenting versus CABG (16.7% versus 16.9%, respectively; hazard ratio, 1.04; 95% confidence interval, 0.86 to 1.27; P = 0.69). Repeat revascularization, however, occurred significantly more frequently after PCI than CABG (29.0% versus 7.9%, respectively; hazard ratio, 0.23; 95% confidence interval, 0.18 to 0.29; P < 0.001). Major adverse cardiac and cerebrovascular events were significantly more frequent in the PCI than the CABG group (39.2% versus 23.0%, respectively; hazard ratio, 0.53; 95% confidence interval, 0.45 to 0.61; P < 0.001). No heterogeneity of treatment effect was found in the subgroups, including diabetic patients and those presenting with 3-vessel disease. Conclusions: In this pooled analysis of 4 randomized trials, PCI with stenting was associated with a long-term safety profile similar to that of CABG.
However, as a result of persistently lower repeat revascularization rates in the CABG patients, overall major adverse cardiac and cerebrovascular event rates were significantly lower in the CABG group at 5 years.
Wavelet correlation between subjects: A time-scale data driven analysis for brain mapping using fMRI
Abstract:
Functional magnetic resonance imaging (fMRI) based on the BOLD signal has been used to indirectly measure the local neural activity induced by cognitive tasks or stimulation. Most fMRI data analysis is carried out using the general linear model (GLM), a statistical approach that predicts the changes in the observed BOLD response from an expected hemodynamic response function (HRF). When the task is cognitively complex, or in disease, variations in the shape and/or delay of the response may reduce the reliability of the results. A novel exploratory method for fMRI data, which attempts to discriminate neurophysiological signals induced by the stimulation protocol from artifacts or other confounding factors, is introduced in this paper. This new method is based on the fusion of correlation analysis and the discrete wavelet transform, to identify similarities in the time course of the BOLD signal across a group of volunteers. We illustrate the usefulness of this approach by analyzing fMRI data from normal subjects presented with standardized human face pictures expressing different degrees of sadness. The results show that the proposed wavelet correlation analysis has greater statistical power than conventional GLM or time-domain intersubject correlation analysis. (C) 2010 Elsevier B.V. All rights reserved.
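The core idea, decompose each subject's BOLD series with a discrete wavelet transform and then correlate coefficients across subjects scale by scale, can be sketched as follows. This toy version uses a Haar DWT and Pearson correlation on synthetic data; the paper's actual wavelet family, preprocessing and inference procedure are not reproduced here:

```python
import numpy as np

def haar_dwt_details(x, levels=3):
    """Per-level detail coefficients of an orthonormal Haar DWT."""
    x = np.asarray(x, dtype=float)
    details = []
    for _ in range(levels):
        details.append((x[0::2] - x[1::2]) / np.sqrt(2.0))
        x = (x[0::2] + x[1::2]) / np.sqrt(2.0)   # approximation carried down
    return details

def scalewise_correlation(sig_a, sig_b, levels=3):
    """Pearson correlation between two subjects' series, per wavelet scale."""
    da = haar_dwt_details(sig_a, levels)
    db = haar_dwt_details(sig_b, levels)
    return [float(np.corrcoef(a, b)[0, 1]) for a, b in zip(da, db)]

# two synthetic 'subjects': shared task-locked oscillation plus private noise
rng = np.random.default_rng(1)
t = np.arange(256)
task = np.sin(2 * np.pi * t / 12.0)
subj1 = task + 0.3 * rng.standard_normal(256)
subj2 = task + 0.3 * rng.standard_normal(256)
r = scalewise_correlation(subj1, subj2)
```

Because the shared oscillation is concentrated in one frequency band, intersubject correlation peaks at the matching wavelet scale while noise-dominated scales stay near zero.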
Abstract:
The identification, modeling, and analysis of interactions between nodes of neural systems in the human brain have become the focus of many studies in neuroscience. The complex neural network structure and its correlations with brain functions have played a role in all areas of neuroscience, including the comprehension of cognitive and emotional processing. Indeed, understanding how information is stored, retrieved, processed, and transmitted is one of the ultimate challenges in brain research. In this context, in functional neuroimaging, connectivity analysis is a major tool for the exploration and characterization of the information flow between specialized brain regions. In most functional magnetic resonance imaging (fMRI) studies, connectivity analysis is carried out by first selecting regions of interest (ROI) and then calculating an average BOLD time series (across the voxels in each cluster). Some studies have shown that the average may not be a good choice and have suggested, as an alternative, the use of principal component analysis (PCA) to extract the principal eigen-time series from the ROI(s). In this paper, we introduce a novel approach called cluster Granger analysis (CGA) to study connectivity between ROIs. The main aim of this method is to employ multiple eigen-time series in each ROI to avoid the temporal information loss, inherent in averaging (e.g., to yield a single "representative" time series per ROI), during identification of Granger causality. Such information loss may lead to a lack of power in detecting connections. The proposed approach is based on multivariate statistical analysis and integrates PCA and partial canonical correlation in a framework of Granger causality for clusters (sets) of time series. We also describe an algorithm for statistical significance testing based on bootstrapping.
By using Monte Carlo simulations, we show that the proposed approach outperforms conventional Granger causality analysis (i.e., using representative time series extracted by signal averaging or first principal components estimation from ROIs). The usefulness of the CGA approach in real fMRI data is illustrated in an experiment using human faces expressing emotions. With this data set, the proposed approach suggested the presence of significantly more connections between the ROIs than were detected using a single representative time series in each ROI. (c) 2010 Elsevier Inc. All rights reserved.
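The conventional bivariate Granger test that CGA is compared against reduces to comparing a restricted autoregression (lags of y only) with a full one (lags of y and x) via an F-statistic. A minimal OLS sketch on synthetic series (the paper's cluster-level PCA/partial-canonical-correlation machinery and its bootstrap are not reproduced):

```python
import numpy as np

def granger_fstat(x, y, p=2):
    """F-statistic for 'x Granger-causes y' with p lags, via plain OLS:
    compare a restricted model (lags of y) with a full model (lags of y, x)."""
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])
    ones = np.ones((n - p, 1))
    Xr = np.hstack([ones, lags_y])            # restricted
    Xf = np.hstack([ones, lags_y, lags_x])    # full
    rss = lambda M: ((Y - M @ np.linalg.lstsq(M, Y, rcond=None)[0]) ** 2).sum()
    df1, df2 = p, n - p - Xf.shape[1]
    return float(((rss(Xr) - rss(Xf)) / df1) / (rss(Xf) / df2))

# synthetic pair: x drives y with one lag, but not the reverse
rng = np.random.default_rng(2)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.8 * x[t - 1] + 0.2 * rng.standard_normal()

f_xy = granger_fstat(x, y)   # large: past x helps predict y
f_yx = granger_fstat(y, x)   # small: past y does not help predict x
```

CGA generalizes this scalar comparison to sets of eigen-time series per ROI; the F-statistics above illustrate only the single-series baseline.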
Abstract:
Functional magnetic resonance imaging (fMRI) is currently one of the most widely used methods for studying human brain function in vivo. Although many different approaches to fMRI analysis are available, the most widely used methods employ so-called "mass-univariate" modeling of responses in a voxel-by-voxel fashion to construct activation maps. However, it is well known that many brain processes involve networks of interacting regions, and for this reason multivariate analyses might seem to be attractive alternatives to univariate approaches. The current paper focuses on one multivariate application of statistical learning theory: statistical discrimination maps (SDM) based on the support vector machine, and seeks to establish some possible interpretations when the results differ from those of univariate approaches. In fact, when two conditions differ not only in activation level but also in functional connectivity, SDM seems more informative. We addressed this question using both simulations and applications to real data. We have shown that the combined use of univariate approaches and SDM yields significant new insights into brain activations not available using univariate methods alone. In an application to visual working memory fMRI data, we demonstrated that the interaction among brain regions plays a role in SDM's power to detect discriminative voxels. (C) 2008 Elsevier B.V. All rights reserved.
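The SDM idea, train a linear support vector machine on multivoxel patterns and read its weight vector as a map of discriminative voxels, can be illustrated with a toy linear SVM fitted by subgradient descent on the hinge loss. This is a simplification: real SDM pipelines typically use established SVM solvers and cross-validation:

```python
import numpy as np

def linear_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Linear SVM fitted by subgradient descent on the regularized hinge loss.
    The weight vector w can be read as a 'discrimination map' over features."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1                     # margin violators
        grad_w = lam * w - (X[active] * y[active, None]).sum(axis=0) / n
        grad_b = -y[active].sum() / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# toy 'fMRI' data: 20 voxels, group effect confined to the first 3
rng = np.random.default_rng(3)
n, d = 80, 20
y = np.repeat([1.0, -1.0], n // 2)
X = rng.standard_normal((n, d))
X[y == 1, :3] += 1.5
w, b = linear_svm(X, y)
accuracy = np.mean(np.sign(X @ w + b) == y)
```

In an SDM, the per-voxel magnitudes |w| would be mapped back into brain space; here they concentrate on the three truly discriminative voxels.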
Abstract:
Objective: The aim of this article is to propose an integrated framework for extracting and describing patterns of disorders from medical images using a combination of linear discriminant analysis and active contour models. Methods: A multivariate statistical methodology was first used to identify the most discriminating hyperplane separating two groups of images (from healthy controls and patients with schizophrenia) contained in the input data. After this, the present work makes the differences found by the multivariate statistical method explicit by subtracting the discriminant models of controls and patients, weighted by the pooled variance between the two groups. A variational level-set technique was used to segment clusters of these differences. We obtained a label for each anatomical change using the Talairach atlas. Results: In this work all the data were analysed simultaneously rather than assuming a priori regions of interest. As a consequence, by using active contour models, we were able to obtain regions of interest that were emergent from the data. The results were evaluated using, as a gold standard, well-known facts about the neuroanatomical changes related to schizophrenia. Most of the items in the gold standard were covered in our result set. Conclusions: We argue that this investigation provides a suitable framework for characterising the high complexity of magnetic resonance images in schizophrenia, as the results obtained indicate a high sensitivity rate with respect to the gold standard. (C) 2010 Elsevier B.V. All rights reserved.
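The first step above, finding the most discriminating hyperplane between controls and patients, corresponds in its simplest form to Fisher's linear discriminant, w = S_pooled^{-1}(mu_1 - mu_2). A minimal sketch on synthetic feature vectors (the paper's full pipeline, including the level-set segmentation, is not reproduced):

```python
import numpy as np

def lda_direction(X1, X2):
    """Most discriminating direction: w = S_pooled^{-1}(mu1 - mu2), unit norm."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    n1, n2 = len(X1), len(X2)
    S1, S2 = np.cov(X1, rowvar=False), np.cov(X2, rowvar=False)
    S_pooled = ((n1 - 1) * S1 + (n2 - 1) * S2) / (n1 + n2 - 2)
    w = np.linalg.solve(S_pooled, mu1 - mu2)
    return w / np.linalg.norm(w)

# synthetic 'controls' and 'patients': group difference only in feature 0
rng = np.random.default_rng(4)
controls = rng.standard_normal((60, 5))
patients = rng.standard_normal((60, 5))
patients[:, 0] += 2.0
w = lda_direction(controls, patients)
separation = abs((patients @ w).mean() - (controls @ w).mean())
```

The weight vector concentrates on the genuinely different feature; in the imaging setting, those weights form the discriminant model that is then contrasted between groups.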
Abstract:
The determination of normal parameters is an important procedure in the evaluation of the stomatognathic system. We used the surface electromyography standardization protocol described by Ferrario et al. (J Oral Rehabil. 2000;27:33-40, 2006;33:341) to determine reference values of the standardized electromyographic indices for the assessment of muscular symmetry (left and right side, percentage overlapping coefficient, POC), potential lateral displacing components (unbalanced contractile activities of contralateral masseter and temporalis muscles, TC), relative activity (most prevalent pair of masticatory muscles, ATTIV) and total activity (integrated areas of the electromyographic potentials over time, IMPACT) in healthy Brazilian young adults, and the relevant data reproducibility. Electromyography of the right and left masseter and temporalis muscles was performed during maximum teeth clenching in 20 healthy subjects (10 women and 10 men, mean age 23 years, s.d. 3), free from periodontal problems, temporomandibular disorders and oro-facial myofunctional disorders, and with full permanent dentition (at least 28 teeth). Data reproducibility was computed for 75% of the sample. The values obtained were POC temporalis (88.11 +/- 1.45%), POC masseter (87.11 +/- 1.60%), TC (8.79 +/- 1.20%), ATTIV (-0.33 +/- 9.65%) and IMPACT (110.40 +/- 23.69 μV/μV·s %). There were no statistical differences between test and retest values (P > 0.05). The Technical Errors of Measurement (TEM) for 50% of subjects assessed during the same session were 1.5, 1.39, 1.06, 3.83 and 10.04. For 25% of the subjects assessed after a 6-month interval, the TEM were 0.80, 1.03, 0.73, 12.70 and 19.10. For all indices, there was good reproducibility.
These electromyographic indices could be used in the assessment of patients with stomatognathic dysfunction.
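As an illustration of the kind of symmetry index involved, a simple overlap coefficient between left- and right-side EMG envelopes can be computed as below. This is a hypothetical, simplified formulation (ratio of overlapped to total rectified activity), not Ferrario et al.'s exact POC computation:

```python
import numpy as np

def overlap_index(left, right):
    """Percentage overlap of two rectified EMG envelopes.
    100% = perfectly symmetric activity (simplified POC-like index)."""
    left, right = np.abs(left), np.abs(right)
    return 100.0 * np.minimum(left, right).sum() / np.maximum(left, right).sum()

# synthetic clench burst: right side slightly weaker than left
rng = np.random.default_rng(5)
t = np.linspace(0, 1, 500)
envelope = np.exp(-((t - 0.5) ** 2) / 0.02)
left = envelope * (1.0 + 0.05 * rng.standard_normal(500))
right = 0.9 * envelope * (1.0 + 0.05 * rng.standard_normal(500))
poc = overlap_index(left, right)
```

Identical left and right signals score exactly 100%; asymmetric contraction pulls the index down.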
Abstract:
The collection of spatial information to quantify changes in the state and condition of the environment is a fundamental component of conservation and sustainable utilization of tropical and subtropical forests. Age is an important structural attribute of old-growth forests influencing biological diversity in Australian eucalypt forests. Aerial photograph interpretation has traditionally been used for mapping the age and structure of forest stands. However, this method is subjective and cannot accurately capture the fine to landscape scale variation necessary for ecological studies. Identification and mapping of fine to landscape scale vegetative structural attributes will allow the compilation of information associated with Montreal Process indicators 1b and 1d, which seek to determine linkages between age structure and the diversity and abundance of forest fauna populations. This project integrated measurements of structural attributes derived from a canopy-height elevation model with results from a geometrical-optical/spectral mixture analysis model to map forest age structure at a landscape scale. The availability of multiple-scale data allows the transfer of high-resolution attributes to landscape scale monitoring. Multispectral image data were obtained from a DMSV (Digital Multi-Spectral Video) sensor over St Mary's State Forest in Southeast Queensland, Australia. Local scene variance levels for different forest types calculated from the DMSV data were used to optimize the tree density and canopy size output in a geometric-optical model applied to a Landsat Thematic Mapper (TM) data set. Airborne laser scanner data obtained over the project area were used to calibrate a digital filter to extract tree heights from a digital elevation model that was derived from scanned colour stereopairs. The modelled estimates of tree height, crown size, and tree density were used to produce a decision-tree classification of forest successional stage at a landscape scale. 
The results obtained (72% accuracy) were limited in validation, but demonstrate the potential of the multi-scale methodology to provide spatial information for forestry policy objectives (i.e., monitoring forest age structure).
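A decision-tree classification of successional stage from modelled structural attributes can be caricatured as a few threshold rules. The thresholds and class names below are illustrative placeholders, not values from the study:

```python
def successional_stage(height_m, crown_m, stems_per_ha):
    """Toy decision tree assigning a successional stage from stand structure.
    Thresholds are illustrative placeholders, not values from the study."""
    if height_m < 15:
        return "regrowth"
    if crown_m > 8 and stems_per_ha < 200:
        return "old-growth"   # tall, large-crowned, sparse stands
    return "mature"

# (tree height [m], crown size [m], stem density [stems/ha]) per stand
stands = [(10, 3, 900), (25, 10, 120), (22, 6, 450)]
stages = [successional_stage(*s) for s in stands]
```

A real classifier would learn such splits from calibration plots rather than hand-set them, but the decision structure is the same.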
Abstract:
The Eysenck Personality Questionnaire-Revised (EPQ-R), the Eysenck Personality Profiler Short Version (EPP-S), and the Big Five Inventory (BFI-V4a) were administered to 135 postgraduate students of business in Pakistan. Whilst the Extraversion and Neuroticism scales from the three questionnaires were highly correlated, it was found that Agreeableness was most highly correlated with Psychoticism in the EPQ-R and Conscientiousness was most highly correlated with Psychoticism in the EPP-S. Principal component analyses with varimax rotation were carried out. The analyses generally suggested that the five-factor model rather than the three-factor model was more robust and better for interpretation of all the higher-order scales of the EPQ-R, EPP-S, and BFI-V4a in the Pakistani data. Results show that the superiority of the five-factor solution stems from the inclusion of a broader variety of personality scales in the input data, whereas Eysenck's three-factor solution seems to be best when a less complete but possibly more important set of variables is input. (C) 2001 Elsevier Science Ltd. All rights reserved.
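Principal component analysis with varimax rotation, as used above, can be sketched with the classic Kaiser varimax algorithm applied to PCA loadings of a correlation matrix. The simulated 'questionnaire' data and the choice of two components are illustrative only:

```python
import numpy as np

def varimax(L, gamma=1.0, max_iter=100, tol=1e-8):
    """Classic Kaiser varimax rotation of a loading matrix L (p x k)."""
    p, k = L.shape
    R = np.eye(k)
    crit_old = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - (gamma / p) * Lr @ np.diag((Lr ** 2).sum(axis=0))),
            full_matrices=False)
        R = u @ vt
        crit = s.sum()
        if crit < crit_old * (1.0 + tol):
            break
        crit_old = crit
    return L @ R

# simulated 12-item questionnaire with one shared factor on items 0-5
rng = np.random.default_rng(6)
X = rng.standard_normal((200, 12))
X[:, :6] += rng.standard_normal((200, 1))
corr = np.corrcoef(X, rowvar=False)
evals, evecs = np.linalg.eigh(corr)
loadings = evecs[:, -2:] * np.sqrt(evals[-2:])   # top-2 PCA loadings
rotated = varimax(loadings)
```

Varimax applies an orthogonal rotation, so each item's communality (row sum of squared loadings) is unchanged; only the distribution of loading across factors becomes simpler to interpret.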
Abstract:
Background. Although digital and videotaped images are known to be comparable for the evaluation of left ventricular function, their relative accuracy for assessment of more complex anatomy is unclear. We sought to compare reading time, storage costs, and concordance of video and digital interpretations across multiple observers and sites. Methods. One hundred one patients with valvular disease (90 mitral, 48 aortic, 80 tricuspid) were selected prospectively, and studies were stored according to video and standardized digital protocols. The same reviewer interpreted video and digital images independently and at different times, using a standard report form to evaluate 40 items (eg, severity of stenosis or regurgitation, leaflet thickening, and calcification) as normal or mildly, moderately, or severely abnormal. Concordance between modalities was expressed as kappa. Major discordance (difference of >1 level of severity) was ascribed to the modality that gave the lesser severity. CD-ROM was used to store digital data (20:1 lossy compression), and super-VHS videotape was used to store video data. The reading time and storage costs for each modality were compared. Results. Measured parameters were highly concordant (ejection fraction was 52% +/- 13% by both). Major discordance was rare, and lesser values were reported with digital rather than video interpretation in the categories of aortic and mitral valve thickening (1% to 2%) and severity of mitral regurgitation (2%). Digital reading time was 6.8 +/- 2.4 minutes, 38% shorter than with video (11.0 +/- 3.0, range 8 to 22 minutes, P < .001). Compressed digital studies had an average size of 60 +/- 14 megabytes (range 26 to 96 megabytes). Storage cost for video was A$0.62 per patient (18 studies per tape, total cost A$11.20), compared with A$0.31 per patient for digital storage (8 studies per CD-ROM, total cost A$2.50). Conclusion. Digital and video interpretations were highly concordant; in the few cases of major discordance, the digital scores were lower, perhaps reflecting undersampling. Use of additional views and longer clips may be indicated to minimize discordance with video in patients with complex problems. Digital interpretation offers a significant reduction in reading times and in the cost of archiving.
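Concordance "expressed as kappa" refers to Cohen's kappa, which corrects observed agreement for the agreement expected by chance: kappa = (p_o - p_e) / (1 - p_e). A small sketch with hypothetical severity ratings (0 = normal to 3 = severely abnormal) from the two modalities:

```python
import numpy as np

def cohens_kappa(ratings_a, ratings_b, n_levels):
    """Cohen's kappa for two raters' categorical scores in {0..n_levels-1}."""
    cm = np.zeros((n_levels, n_levels))
    for a, b in zip(ratings_a, ratings_b):
        cm[a, b] += 1                                   # confusion matrix
    n = cm.sum()
    p_obs = np.trace(cm) / n                            # observed agreement
    p_exp = (cm.sum(axis=1) @ cm.sum(axis=0)) / n ** 2  # chance agreement
    return (p_obs - p_exp) / (1.0 - p_exp)

# hypothetical paired severity scores for ten items
video_scores = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
digital_scores = [0, 1, 2, 1, 3, 1, 0, 2, 3, 2]
kappa = cohens_kappa(video_scores, digital_scores, 4)
```

Here 8 of 10 items agree (p_o = 0.8) against a chance agreement of 0.26, giving kappa of about 0.73; perfect agreement gives kappa = 1.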
Abstract:
This article reports on the results of a study undertaken by the author together with her research assistant, Heather Green. The study collected and analysed data from all disciplinary tribunal decisions heard in Queensland since 1930 in an attempt to provide empirical information which has previously been lacking. This article will outline the main features of the disciplinary system in Queensland, describe the research methodology used in the present study and then report on some findings from the study. Reported findings include a profile of solicitors who have appeared before a disciplinary hearing, the types of matters which have attracted formal discipline and the types of orders made by the tribunal. Much of the data is then presented on a time scale so as to reveal any changes over time.
Abstract:
To determine the effect of slurry rheology on industrial grinding performance, 45 surveys were conducted on 16 full-scale grinding mills at five sites. Four operating variables - mill throughput, slurry density, slurry viscosity and feed fines content - were investigated. The rheology of the mill discharge slurries was measured either on-line or off-line, and the data were processed using a standard procedure to obtain a full range of flow curves. Multi-linear regression was employed as a statistical analysis tool to determine whether or not rheological effects exert an influence on industrial grinding, and to assess the influence of the four mill operating conditions on mill performance in terms of the Grinding Index, a criterion describing the overall breakage of particles across the mill. The results show that slurry rheology does influence industrial grinding. The trends of these effects on the Grinding Index depend upon the rheological nature of the slurry - whether the slurries are dilatant or pseudoplastic, and whether they exhibit a high or low yield stress. The interpretation of the regression results is discussed, the observed effects are summarised, and the potential for incorporating rheological principles into process control is considered. Guidelines are established to improve industrial grinding operations based on knowledge of the rheological effects. This study confirms some trends in the effect of slurry rheology on grinding reported in the literature, and extends these to a broader understanding of the relationship between slurry properties and rheology, and their effects on industrial milling performance. (C) 2002 Elsevier Science B.V. All rights reserved.
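Multi-linear regression of a grinding index on several operating variables can be sketched with ordinary least squares. The variable ranges and coefficients below are invented for illustration; they are not the study's data:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 45                                    # one row per mill survey
throughput = rng.uniform(100, 300, n)
density = rng.uniform(60, 80, n)
viscosity = rng.uniform(1, 50, n)
fines = rng.uniform(10, 40, n)
# hypothetical Grinding Index generated with assumed coefficients plus noise
gi = (20 + 0.05 * throughput - 0.1 * density
      - 0.2 * viscosity + 0.3 * fines + rng.normal(0, 1.0, n))

# design matrix with intercept; OLS fit via least squares
X = np.column_stack([np.ones(n), throughput, density, viscosity, fines])
beta, *_ = np.linalg.lstsq(X, gi, rcond=None)
fitted = X @ beta
r2 = 1 - ((gi - fitted) ** 2).sum() / ((gi - gi.mean()) ** 2).sum()
```

The fitted coefficients recover the assumed effects (e.g., the negative viscosity coefficient), and R-squared summarizes how much of the index variation the four operating variables explain.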
Abstract:
Observations of an insect's movement lead to theory on the insect's flight behaviour and the role of movement in the species' population dynamics. This theory leads to predictions of the way the population changes in time under different conditions. If a hypothesis on movement predicts a specific change in the population, then the hypothesis can be tested against observations of population change. Routine pest monitoring of agricultural crops provides a convenient source of data for studying movement into a region and among fields within a region. Examples of the use of statistical and computational methods for testing hypotheses with such data are presented. The types of questions that can be addressed with these methods and the limitations of pest monitoring data when used for this purpose are discussed. (C) 2002 Elsevier Science B.V. All rights reserved.